
WO2015031861A1 - Making a user's data, settings, and licensed content available in the cloud - Google Patents

Making a user's data, settings, and licensed content available in the cloud

Info

Publication number
WO2015031861A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
settings
account
data
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2014/053596
Other languages
French (fr)
Inventor
Derek P. MARTIN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
U-Me Holdings LLC
Original Assignee
U-Me Holdings LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/016,038 external-priority patent/US9118670B2/en
Priority claimed from US14/044,843 external-priority patent/US20150066941A1/en
Priority claimed from US14/132,104 external-priority patent/US9456164B2/en
Priority claimed from US14/153,630 external-priority patent/US20150066246A1/en
Priority claimed from US14/207,490 external-priority patent/US20150067099A1/en
Priority claimed from US14/462,523 external-priority patent/US20150066853A1/en
Application filed by U-Me Holdings LLC filed Critical U-Me Holdings LLC
Publication of WO2015031861A1 publication Critical patent/WO2015031861A1/en


Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60: Protecting data
    • G06F21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218: Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/10: Protecting distributed programs or content, e.g. vending or licensing of copyrighted material; Digital rights management [DRM]
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00: Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21: Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2111: Location-sensitive, e.g. geographical location, GPS
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00: Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21: Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2149: Restricted operating environment

Definitions

  • This disclosure generally relates to computer systems, and more specifically relates to making information relating to a user available in the cloud and to multiple devices used by the user.
  • Dropbox is an example of a cloud-based file service.
  • If a subscriber to Dropbox defines a file folder that is synchronized to the cloud, then all data written to the file folder will be automatically stored in the cloud, making that data automatically available to the user via any device that has an Internet connection.
  • While services like Dropbox are very useful, they have their drawbacks. For example, a Dropbox user must remember to store data in a Dropbox folder or sub-folder.
  • Many different software applications have default settings that save files to a folder that may not be a Dropbox folder. The user must know to change the default folder settings to a Dropbox folder if the data is to be available via Dropbox. But many users lack the knowledge or sophistication to realize all the changes that need to be made to a computer to assure all of the user's data is stored to Dropbox. As a result, if the user's hard drive crashes and data is not recoverable from the hard drive, the user may discover some of their data was not stored to a Dropbox folder or sub-folder, resulting in loss of that data when the hard drive crashed.
  • Universal remote controls have greatly reduced the number of remote controls the user must deal with to control the user's home electronics. These remote controls are called "universal remotes" because of the ability to program them to accommodate a large number of devices from many different vendors.
  • The universal remote control can include a database of vendor models of equipment and corresponding codes for controlling the equipment.
  • Using this database, the remote control can program itself for the set of codes to control the vendor's equipment. This process can be repeated for each piece of equipment the user wants to control with the universal remote control.
  • For example, a user could program a known universal remote control to control a Samsung television, a DirecTV digital video recorder (DVR), and a Sony DVD player.
  • The ability to program a universal remote control to support different equipment provides the capability for a user to customize the remote control.
  • The programming for the remote control typically does not change until the user adds a new piece of equipment or replaces an existing piece of equipment with different equipment.
  • Some universal remote controls use touch-screens that display graphical symbols called icons that may be selected by a user to perform certain functions.
  • For example, a CNN icon may be presented on a universal remote control with a touch screen, and when the user selects the CNN icon by pressing on it, the remote control will send the appropriate command to change the channel to CNN.
  • A cloud-based computer system referred to herein as "Universal Me" or "U-Me" changes the modern paradigm from being device-centric to being person-centric.
  • The system makes all user data, settings, and licensed content for a user available in the cloud.
  • The system includes a conversion mechanism that can convert information intended for one device type to a different device type.
  • For example, a user changing smart phone platforms can convert their current smart phone settings to equivalent settings on the new phone platform, and their new phone can then be configured using the user's converted settings stored in the cloud.
  • Because it is stored in the cloud, this information may be accessed by the user anywhere and may be used to configure a number of different devices according to the user's data and settings; a minimal sketch of such a conversion follows.
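The following sketch illustrates one way such a settings conversion could work, using a set of neutral "universal" keys as an intermediate representation. The mapping tables, key names, and function names are illustrative assumptions, not APIs defined by this disclosure:

```python
# Hypothetical sketch: device-specific setting names are mapped through
# neutral "universal" keys. The table contents are illustrative assumptions.
ANDROID_TO_UNIVERSAL = {
    "screen_timeout_ms": "display.timeout_ms",
    "ringer_volume": "sound.ringer_level",
}
IOS_FROM_UNIVERSAL = {
    "display.timeout_ms": "auto_lock_ms",
    "sound.ringer_level": "ringtone_volume",
}

def convert_settings(source: dict, to_universal: dict, from_universal: dict) -> dict:
    """Convert one device type's settings to another via universal keys."""
    universal = {to_universal[k]: v for k, v in source.items() if k in to_universal}
    return {from_universal[k]: v for k, v in universal.items() if k in from_universal}

old_phone = {"screen_timeout_ms": 30000, "ringer_volume": 5}
print(convert_settings(old_phone, ANDROID_TO_UNIVERSAL, IOS_FROM_UNIVERSAL))
# {'auto_lock_ms': 30000, 'ringtone_volume': 5}
```

Settings with no universal equivalent simply do not propagate, which mirrors the idea that only convertible settings carry over between device types.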
  • The U-Me system includes a photo processing mechanism that allows cataloging and storing a user's photos using relationships between people that allow the user's photos to be retrieved using a search engine.
  • A user enters people and specifies relationships, and may also enter locations, events, and other information.
  • Photos are then processed, and indexing info is generated for each photo that may include any or all of the following: user-defined relationships, system-derived relationships, user-defined locations, system-defined locations, user-defined events, system-derived events, and ages for the people in the photos.
  • The indexing info is used to catalog a photo for easy retrieval later.
  • The indexing info may be stored as metadata with the photo or may be stored separately from the photo.
  • The indexing info allows photos to be retrieved using a powerful search engine; a brief sketch of generating such indexing info follows.
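A brief sketch of how indexing info with user-defined and system-derived relationships could be generated. The data layout, derivation rule, and function names below are hypothetical, not the implementation described by this disclosure:

```python
from datetime import date

# Hypothetical data layout: user-defined people and relationships
# (cf. the data entry screens of FIGS. 165-168).
PEOPLE = {
    "Billy Jones": {"born": date(1980, 5, 1), "relation_to_user": "son"},
    "Sue Jones":   {"born": date(1982, 3, 9), "relation_to_user": None},
}
USER_DEFINED = {("Billy Jones", "wife"): "Sue Jones"}

def derive_relationships() -> None:
    # One system-derived rule: the user's son's wife is the user's daughter-in-law.
    for (person, rel), other in USER_DEFINED.items():
        if PEOPLE[person]["relation_to_user"] == "son" and rel == "wife":
            PEOPLE[other]["relation_to_user"] = "daughter-in-law"

def index_photo(people_in_photo: list, photo_date: date) -> list:
    """Generate indexing info: each person's relationship and approximate age."""
    derive_relationships()
    return [{
        "person": p,
        "relationship": PEOPLE[p]["relation_to_user"],
        "age_in_photo": photo_date.year - PEOPLE[p]["born"].year,  # approximate
    } for p in people_in_photo]

print(index_photo(["Billy Jones", "Sue Jones"], date(2014, 7, 4)))
```

The generated entries could then be written to the photo's metadata or stored separately, and later queried by relationship, age, or date.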
  • The U-Me system includes a universal remote control that allows dynamically programming the remote control according to location.
  • The remote control includes a communication interface that allows the remote control to communicate with a remote database.
  • A location is specified, and the remote control uses the location information to access corresponding programming information for the remote control.
  • The remote control is then dynamically reprogrammed according to the location information to make the remote control suitable to the location, as the sketch below illustrates.
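A sketch of this location-based reprogramming flow, with an assumed database layout; the code-set names and schema are hypothetical:

```python
# Hypothetical remote database: location -> equipment code sets (cf. FIG. 215).
REMOTE_DB = {
    "home":  {"tv": "samsung_codes", "receiver": "directv_codes"},
    "cabin": {"tv": "sony_codes",    "receiver": "dish_codes"},
}

class UniversalRemote:
    """Remote control with a dynamic location-based programming mechanism."""
    def __init__(self):
        self.codes = {}

    def reprogram_for_location(self, location: str) -> None:
        # Fetch the programming parameters stored for this location via the
        # communication interface (modeled here as a dictionary lookup).
        self.codes = REMOTE_DB[location]

remote = UniversalRemote()
remote.reprogram_for_location("cabin")
print(remote.codes)  # the remote is now suited to the cabin's equipment
```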
  • The U-Me system may make a user's information available in a vehicle.
  • The system includes a conversion mechanism that can convert information intended for one device type to a different device type.
  • For example, a user driving a Chevrolet can store the settings for the Chevrolet, which can then be converted to equivalent settings for any other vehicle, including vehicles from different manufacturers; see the sketch after this paragraph.
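One way such a conversion could work is to store a setting as a physical measurement (as suggested by FIG. 56) and map it into the target vehicle's achievable range. The numbers and ranges below are invented for illustration:

```python
def convert_seat_setting(distance_to_pedal_cm: float, target_range: tuple) -> float:
    """Map a stored seat-to-pedal distance into the target vehicle's range."""
    lo, hi = target_range
    return max(lo, min(hi, distance_to_pedal_cm))

chevrolet = {"seat_to_pedal_cm": 62.0}
# Assume the target vehicle's seat track only travels between 55 and 60 cm.
target_setting = convert_seat_setting(chevrolet["seat_to_pedal_cm"], (55.0, 60.0))
print(target_setting)  # 60.0, the closest achievable equivalent
```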
  • The U-Me system allows transferring user settings from a first device to a second device that has the same hardware architecture type and the same system software type as the first device.
  • A conversion mechanism also allows converting user settings for a first device to corresponding user settings for a second device that has a different hardware architecture type and/or different system software type.
  • The user settings for the first device can be transferred to an external device, which may then be connected to a second device, which can then use the user settings on the external device to program the second device.
  • A television receiver, such as a cable box, a digital video recorder (DVR), a satellite television receiver, etc., is one example of a device that can be programmed from settings of a different device.
  • Multiple templates provide mapping information from physical devices to a master template that serves as a central repository for all of a user's settings for all of a user's devices.
  • The templates also provide mapping information that allows mapping settings between different physical devices, between physical devices and other templates, and between templates.
  • A user settings mechanism uses the mapping information to propagate user settings stored in one template to other templates and to one or more physical devices, and to propagate user settings stored in a physical device to multiple templates, including the master template that serves as a central repository for all of a user's settings. A sketch of this propagation appears below.
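A sketch of how a user settings mechanism could use per-device mappings to propagate a setting through the master template; all names and table contents are assumptions for illustration:

```python
MASTER = {}  # central repository of all of the user's settings

# Per-device mappings: device-specific setting name -> master-template key.
MAPPINGS = {
    "phone_a": {"ring_vol": "sound.ringer_level"},
    "phone_b": {"ringer":   "sound.ringer_level"},
}

def propagate(device: str, setting: str, value) -> dict:
    """Write a device setting to the master template, then compute the
    updates to push to every other device that maps the same master key."""
    master_key = MAPPINGS[device][setting]
    MASTER[master_key] = value
    updates = {}
    for other, mapping in MAPPINGS.items():
        if other == device:
            continue
        for dev_key, m_key in mapping.items():
            if m_key == master_key:
                updates.setdefault(other, {})[dev_key] = value
    return updates

print(propagate("phone_a", "ring_vol", 7))  # {'phone_b': {'ringer': 7}}
```

Because every device maps to the same master key, a change made on any one device reaches the master template and every other device that shares the setting.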
  • FIG. 1 is a block diagram showing the Universal Me (U-Me) system
  • FIG. 2 is a block diagram showing additional details of the U-Me system
  • FIG. 3 is a block diagram showing a computer system that runs the U-Me system
  • FIG. 4 is a block diagram showing how a user using a physical device can access information in the U-Me system
  • FIG. 5 is a block diagram showing various features of the U-Me system
  • FIG. 6 is a block diagram showing examples of user data
  • FIG. 7 is a block diagram showing examples of user licensed content
  • FIG. 8 is a block diagram showing examples of user settings
  • FIG. 9 is a block diagram showing examples of universal templates
  • FIG. 10 is a block diagram showing examples of device-specific templates
  • FIG. 11 is a block diagram showing examples of phone templates
  • FIG. 12 is a block diagram showing examples of tablet templates
  • FIG. 13 is a block diagram showing examples of laptop templates
  • FIG. 14 is a block diagram showing examples of desktop templates
  • FIG. 15 is a block diagram showing examples of television templates
  • FIG. 16 is a block diagram showing examples of software templates
  • FIG. 17 is a block diagram showing examples of vehicle templates
  • FIG. 18 is a block diagram showing examples of home automation templates
  • FIG. 19 is a block diagram showing examples of gaming system templates
  • FIG. 20 is a block diagram showing examples of audio system templates
  • FIG. 21 is a block diagram showing examples of security system templates
  • FIG. 22 is a block diagram showing examples of device interfaces
  • FIG. 23 is a block diagram of a universal user interface
  • FIG. 24 is a flow diagram of a method for programming a physical device with user settings
  • FIG. 25 is a flow diagram of a first suitable method for performing step 2410 in FIG. 24 using a mapping between two physical devices;
  • FIG. 26 is a block diagram showing the generation of settings for Device2 from settings for Device1 as shown in the flow diagram in FIG. 25;
  • FIG. 27 is a flow diagram of a second suitable method for performing step 2410 in FIG. 24 using a universal template
  • FIG. 28 is a block diagram showing the generation of settings for Device2 from a universal template as shown in the flow diagram in FIG. 27;
  • FIG. 29 is a flow diagram of a third suitable method for performing step 2410 in FIG. 24 using settings from a first device and a universal template;
  • FIG. 30 is a block diagram showing the generation of settings for Device2 as shown in the flow diagram in FIG. 29;
  • FIG. 31 is a table showing mapping of some channel numbers for DirecTV to channel numbers for Dish Network
  • FIG. 32 is a table showing examples of user television settings
  • FIG. 33 is a flow diagram of a method for converting channel numbers for Dish Network to channel numbers for DirecTV;
  • FIG. 34 is a flow diagram of a method for reprogramming a remote control for a television
  • FIG. 35 is an example of a display of a television remote control
  • FIG. 36 is a flow diagram of a method for converting a channel number entered by a user to a corresponding different channel number for a target system
  • FIG. 37 is a flow diagram of a method for reprogramming a television remote control according to a target system at a location
  • FIG. 38 is a flow diagram of a method for defining an eReceipt template
  • FIG. 39 is a flow diagram of a method for sending and storing an eReceipt to a user's U-Me account;
  • FIG. 47 is a flow diagram of a method for handling a timed warranty link
  • FIG. 48 is a flow diagram of a method for prompting a user to purchase an extended warranty when a warranty is about to expire;
  • FIG. 49 is a flow diagram of a method for handling a warranty claim using an eReceipt
  • FIG. 50 shows an example of a screen for one suitable implementation of an eReceipt search engine
  • FIG. 51 shows examples of eReceipt queries that could be submitted via the eReceipt search engine screen shown in FIG. 50;
  • FIG. 52 is a flow diagram of a method for processing an e-mail receipt to generate an eReceipt
  • FIG. 53 is a flow diagram of a method for generating settings in a universal vehicle template based on settings from a vehicle;
  • FIG. 54 shows examples of items that could be included in a universal vehicle template
  • FIG. 55 is a flow diagram of a method for downloading user settings to a car from the user's U-Me account
  • FIG. 56 is a representation of a vehicle seat with respect to the vehicle floor, the vehicle accelerator pedal, and the vehicle steering wheel;
  • FIG. 57 is a block diagram of a system for using a phone hands-free in a prior art vehicle
  • FIG. 58 is a block diagram of a system for using a phone hands-free and also for accessing information in the vehicle's engine system;
  • FIG. 59 is a flow diagram of a method for prompting a user regarding scheduled maintenance for a vehicle
  • FIG. 60 is a flow diagram of a method for providing shop bids to a user for scheduled maintenance
  • FIG. 61 is a flow diagram of a method for notifying users of the U-Me system of manufacturer recalls and service actions for the manufacturer's vehicles;
  • FIG. 62 is a flow diagram of a method for providing vehicle service reminders to a user;
  • FIG. 63 is a flow diagram of a method for prompting the user when engine warning information is sent to the user's U-Me account;
  • FIG. 64 is an example of a photo system data entry screen
  • FIG. 65 is the example photo system entry screen in FIG. 64 filled with sample data
  • FIG. 66 is a flow diagram of a method for the U-Me system to construct family relationships from information entered in the photo system data entry screen;
  • FIG. 67 is a flow diagram of a method for generating indexing information for photos and for storing photos with the indexing information
  • FIG. 68 is a flow diagram of a method for adding a photographer's name to indexing information for a photo
  • FIG. 69 shows examples of photo metadata
  • FIG. 70 is a flow diagram of a method for adding location name to indexing information for a photo
  • FIG. 71 is a flow diagram of a method for storing photos that were scanned from hard copy photos with indexing information
  • FIG. 72 is a flow diagram of a method for a user to define indexing info for one or more photos at the same time
  • FIG. 73 shows examples of photo indexing info
  • FIG. 74 shows an example photo file stored in a user's U-Me account
  • FIG. 75 is an example of a screen for a user to generate indexing info for one or more photos as shown in the method in FIG. 72;
  • FIG. 76 shows a screen for an example of a photo search engine
  • FIG. 77 shows examples of photo queries that could be formulated in the photo search engine shown in FIG. 76;
  • FIG. 78 shows a screen for an example photo share engine
  • FIG. 79 is a flow diagram of a method for processing a photo just taken and storing the photo with automatically-generated indexing information in the user's U-Me account;
  • FIG. 80 shows an example of medical information for a user
  • FIG. 81 is a flow diagram of a method for a user to define semi-private medical info in the user's U-Me account;
  • FIG. 82 is a flow diagram of a method for uploading medical information to a user's U-Me account
  • FIG. 83 is a flow diagram of a method for a medical person to attempt to access medical information stored on a U-Me user's device;
  • FIG. 84 is a flow diagram of a method for determining whether the current location of the U-Me user's device is at a medical facility;
  • FIG. 85 is an example of a display on a U-Me user's smart phone showing a medical button that allows bypassing any security on the smart phone to access the semi-private medical information for the user;
  • FIG. 86 is a flow diagram of a method for a medical person to attempt to gain access to a patient's medical information
  • FIG. 87 shows an example of a screen for a medical information sharing engine
  • FIG. 88 is a flow diagram of a method for a user to share the user's medical information with one or more other users;
  • FIG. 89 is a flow diagram of a method for a user who was authorized to share medical information for a different user to share that medical information with one or more other users;
  • FIG. 90 is a flow diagram of a method for a user to revoke sharing of medical information by other users
  • FIG. 91 is a flow diagram of a method for the U-Me system to track when a user takes meds
  • FIG. 92 is a flow diagram of a method for the U-Me system to provide reminders to a user to take the user's meds;
  • FIG. 93 is a flow diagram of a method for a user to authenticate to the U-Me system
  • FIG. 94 shows examples of authentication types that could be used by a user to authenticate to the U-Me system;
  • FIG. 95 is a flow diagram of a method for assuring the U-Me system functions are available to a user on only one physical device at a time when the user authenticates using non-biometric authentication;
  • FIG. 96 is a flow diagram of a method for licensing licensed content to a user, not to a physical device, then making the licensed content available to the user on any device;
  • FIG. 97 is a flow diagram of a method for licensing music to a user, not to a physical device, then making the music available to the user on any device that can play music;
  • FIG. 98 is a flow diagram of a method for making a user's music settings in the user's U-Me account available on any suitable music player;
  • FIG. 99 shows examples of suitable music players
  • FIG. 100 shows license pricing that varies according to the length of the license
  • FIG. 101 is a flow diagram of a method for generating virtual devices in a user's U-Me account that correspond to physical devices used by the user;
  • FIG. 102 shows an example of a smart phone and corresponding example of a virtual smart phone that is stored in the user's U-Me account;
  • FIG. 103 is a flow diagram of a method for tracking all changes to a physical device and synchronizing all the changes to a corresponding virtual device in the user's U-Me account;
  • FIG. 104 is a flow diagram of a method for synchronizing all data changes between the user's physical devices and the user's U-Me account;
  • FIG. 105 is a flow diagram of a method for storing data to a user's U-Me account with indexing information that allows retrieving the data later via a search engine;
  • FIG. 106 shows examples of data attributes
  • FIG. 107 shows examples of data attributes that could be stored as indexing info to identify type of data stored
  • FIG. 108 shows examples of data attributes that could be stored as indexing info to identify location of where data was created
  • FIG. 109 shows examples of data attributes that could be stored as indexing info to identify time- related parameters for data
  • FIG. 110 shows an example of a data file format for data stored in the U-Me system
  • FIG. 111 shows an example of a data file that complies with the format shown in FIG. 110 and that includes examples of indexing info that helps to retrieve the data later via a search engine;
  • FIG. 112 shows an example of a data search engine that allows a user to query data stored in the user's U-Me account
  • FIG. 113 is a flow diagram of a method for configuring a new physical device from information in a user's U-Me account;
  • FIG. 114 is a flow diagram of a method for the U-Me system to host software that is licensed to the user;
  • FIG. 115 is a flow diagram of a method for the U-Me system to host software that is licensed to a device;
  • FIG. 116 is a block diagram of a virtual machine for the user.
  • FIG. 117 is a flow diagram of a method for selecting weather alerts for defined geographic regions
  • FIG. 118 is a table showing examples of weather alerts defined by the United States National Oceanic and Atmospheric Administration (NOAA);
  • FIG. 119 shows an example of an interface for a user to define weather alerts
  • FIG. 120 shows the interface in FIG. 119 with data that defines a weather alert for a tornado warning
  • FIG. 121 shows the interface in FIG. 119 with data that defines a weather alert for a flash flood watch
  • FIG. 122 shows the interface in FIG. 119 with data that defines a weather alert for a wind chill watch
  • FIG. 123 is a flow diagram of a method for the U-Me system to process weather alerts
  • FIG. 124 is a block diagram showing examples of home automation settings
  • FIG. 125 is a block diagram showing examples of appliance settings
  • FIG. 126 is a block diagram showing examples of HVAC settings
  • FIG. 127 is a block diagram showing examples of light settings
  • FIG. 128 is a block diagram showing examples of security settings
  • FIG. 129 is a block diagram showing examples of home theater settings
  • FIG. 130 is a block diagram showing one specific example of home automation settings
  • FIG. 131 is a flow diagram of a method for the U-Me system to track a user's software and license information
  • FIG. 132 is an example of a license management entry that stores a license key with the software
  • FIG. 133 is a block diagram showing examples of alerts a user can define in the user's U-Me account
  • FIG. 134 is a block diagram showing examples of periodic reminders a user can define in the user's U-Me account
  • FIG. 135 is a block diagram showing examples of seasonal reminders a user can define in the user's U-Me account
  • FIG. 136 is a flow diagram of a method for the U-Me system to automatically destroy data and/or licensed content and/or settings according to a defined retention/destruction policy;
  • FIG. 137 is a block diagram showing examples of retention/destruction criteria that could be defined in a retention/destruction policy
  • FIG. 138 is a block diagram showing examples of transfers that could be made within the U-Me system between users;
  • FIG. 139 is a flow diagram of a method for a user to transfer licensed content to a different user
  • FIG. 140 is a flow diagram of a method for the U-Me system to transfer upon the death of one user the user's licensed content to other user(s);
  • FIG. 141 is a flow diagram of a method for auditing the licensed content in a user's U-Me account
  • FIG. 142 is a flow diagram of a method for deleting content that is unlicensed from a user's U-Me account
  • FIG. 143 shows an example of a U-Me sub-account mechanism
  • FIG. 144 is a flow diagram of a method for defining and using sub-accounts
  • FIG. 145 is a flow diagram of a method for the U-Me system to track credit card usage by a user for online transactions
  • FIG. 146 is an example credit card log that shows three different credit cards and websites where the user used each credit card;
  • FIG. 147 is a flow diagram of a method for prompting a user regarding on which websites the user used a credit card when the credit card is about to expire;
  • FIG. 148 is a flow diagram of a method for a user to update credit card information on websites where the user has used the credit card;
  • FIG. 149 is a block diagram of an example of a macro/script mechanism
  • FIG. 150 is a flow diagram of a method for generating macros and/or scripts;
  • FIG. 151 is a flow diagram of a method for scheduling a macro or script to run;
  • FIG. 152 is a flow diagram of an example method for running a script to automatically retrieve a bank statement on the 5th of each month, and storing the bank statement to the user's U-Me account;
  • FIG. 153 is a flow diagram of a method for downloading settings from a user's U-Me account to a location
  • FIG. 154 shows examples of queries that could be formulated in the data search engine
  • FIG. 155 is a flow diagram of a method for a company to identify a person who is the licensee of software purchased by the company;
  • FIG. 156 is a flow diagram of a method for a company to revoke the license of a person to software purchased by the company;
  • FIG. 157 is a flow diagram of a method for converting physical items to electronic form and storing those items in a user's U-Me account;
  • FIG. 158 is a block diagram of a virtual machine image
  • FIG. 159 is a block diagram of a running virtual machine generated from the virtual machine image in FIG. 158, where the running virtual machine is not specific to any user;
  • FIG. 160 is a block diagram of the running virtual machine in FIG. 159 after a U-Me user descriptor file has been written to the U-Me generic user shell to create U-Me user-specific components that are running;
  • FIG. 161 is a block diagram representing aspects of a virtual phone
  • FIG. 162 is a block diagram representing one suitable example of a virtual phone representing the items shown in FIG. 161;
  • FIG. 163 is a screen display showing steps to configure a new phone using the virtual phone settings in FIG. 162;
  • FIG. 164 is a flow diagram of a method for generating indexing info for one or more photos
  • FIG. 165 is a data entry screen for entering info about people into the U-Me system
  • FIG. 166 shows the data entry screen in FIG. 165 after a person fills in information
  • FIG. 167 is a data entry screen for a person to enter family relationships
  • FIG. 168 shows the data entry screen in FIG. 167 after a person fills in information regarding family relationships
  • FIG. 169 is a block diagram showing different entries for a spouse and a wedding date to the spouse;
  • FIG. 170 is a block diagram showing user-defined relationships and system-derived relationships that are derived from the user-defined relationships
  • FIG. 171 is a flow diagram of a method for constructing relationships based on the photo system data entry;
  • FIG. 172 is a display of a family tree based on the information entered by a user in the data entry screen in FIG.168;
  • FIG. 173 is a block diagram showing the user-defined relationships entered by a user in the data entry screen in FIG. 168;
  • FIG. 174 is a display of the family tree in FIG. 172 after adding information relating to the wife and son of Billy Jones;
  • FIG. 175 is a block diagram showing both the user-defined relationships as well as the system-derived relationships for the family tree in FIG. 174;
  • FIG. 176 is a data entry screen for a person to enter locations
  • FIG. 177 shows a data entry screen that allows a person to define a location based on an address
  • FIG. 178 is a flow diagram of a method for defining a location using an app on a mobile device
  • FIG. 179 is a schematic diagram showing how method 17800 in FIG. 178 could be used for a user to define two different geographic regions that are stored as locations;
  • FIG. 180 is a block diagram showing user-defined locations and system-defined locations
  • FIG. 181 shows examples of photo metadata
  • FIG. 182 is a flow diagram of a method for adding location name to indexing information for a photo
  • FIG. 183 is a block diagram showing photo indexing info that could be generated for a photo
  • FIG. 184 is a block diagram showing examples of markup language tags that could be used as photo indexing info
  • FIG. 185 is a block diagram showing examples of user-defined events, system-derived events, and system-defined events selected by a user;
  • FIG. 186 is a flow diagram of a method for generating and storing indexing info for a photo
  • FIG. 187 is a flow diagram of a method for processing a photo for facial and feature recognition
  • FIG. 188 is a flow diagram of a method for generating indexing info for a photo
  • FIG. 189 is a flow diagram of a method for generating indexing information relating to one or more location(s) for a photo when a user defines a location for the photo;
  • FIG. 190 is a flow diagram of a method for generating indexing information relating to one or more location(s) for a photo based on geocode info in the photo metadata;
  • FIG. 191 is a flow diagram of a method for generating indexing information relating to one or more events for a photo based on a date or date range for the photo;
  • FIG. 192 is a flow diagram of a method for generating indexing information for a photographer's name based on the camera that took the photo;
  • FIG. 193 is a flow diagram of a method for automatically processing a photo using the U-Me system;
  • FIG. 194 is a flow diagram of a method for storing photos that were scanned from hard copy photos with corresponding indexing information;
  • FIG. 195 is a flow diagram of a method for a user to define indexing info for one or more photos at the same time;
  • FIG. 196 shows storing indexing info separate from a digital photo file
  • FIG. 197 shows storing the indexing info within the digital photo file
  • FIG. 198 is an example of a data entry screen for a user to generate indexing info for one or more photos as shown in the method in FIG. 44;
  • FIG. 199 shows a screen for an example of a photo search engine
  • FIG. 200 shows examples of photo queries that could be formulated in the photo search engine shown in FIG. 199;
  • FIG. 201 shows a screen for an example photo share engine
  • FIG. 202 is a flow diagram of a method for sharing photos in a user's U-Me account with another user;
  • FIG. 203 is a representation of a sample photo
  • FIG. 204 is sample indexing info that could be generated for the sample photo in FIG. 203;
  • FIG. 205 shows information in a user's U-Me account
  • FIG. 206 represents how a first user's people info, location info, and event info can be shared with a second user, and further shows the second user may have different names that correspond to the faces defined in the first user's account, and may have different indexing info for the photos in the first user's account;
  • FIG. 207 is a method for generating indexing info based on existing tags in a digital photo file
  • FIG. 208 is a flow diagram of a method for identifying duplicate photos
  • FIG. 209 is a flow diagram of a method for importing people and relationships from an external file
  • FIG. 210 is a flow diagram of a method for automatically propagating changes to a user's U-Me account to indexing info for the user's photos;
  • FIG. 211 is a flow diagram of a method for downloading settings from a user's U-Me account to a location;
  • FIG. 212 is a block diagram of a universal remote control that includes a dynamic location-based programming mechanism
  • FIG. 213 shows some examples of different types of remote controls
  • FIG. 214 is a flow diagram of a method for storing remote control programming parameters for a given location
  • FIG. 215 shows an example of remote control programming parameters for a given location
  • FIG. 216 is a flow diagram of a method for a remote control to program itself using programming parameters for a given location stored in an external database
  • FIG. 217 is a flow diagram of a method for converting user settings from a first vehicle to corresponding user settings for a second vehicle that are used to configure the second vehicle;
  • FIG. 218 is a block diagram of a television receiver that includes a user settings transfer mechanism for exporting user settings to an external device and for importing user settings from an external device;
  • FIG. 219 is a flow diagram of a method for storing user settings to an external device
  • FIG. 220 is a flow diagram of a method for programming user settings for a second device based on the user's settings for a first device
  • FIG. 221 is a flow diagram of a method for providing default settings for a device and for returning the device to its default settings after a user leaves;
  • FIG. 222 is a block diagram showing multiple levels of templates for user settings
  • FIG. 223 is a block diagram showing multiple levels of templates and mappings for user settings
  • FIG. 224 is a flow diagram of a method for propagating a user setting from a physical device to multiple templates
  • FIG. 225 is a flow diagram of a method for propagating a user setting from the master template to one or more other templates and to a physical device;
  • FIG. 226 is a flow diagram of a method for resolving an incompatibility between user settings in different devices
  • FIG. 227 is a block diagram showing multiple levels of templates and multiple physical devices.
  • FIG. 228 is a block diagram showing multiple levels of templates and mappings for user settings that include multiple levels of universal templates.
  • Apple's iPod was a revolutionary device that allowed storing a large number of songs, which the user may listen to at his or her convenience.
  • Apple provides the iTunes software application, which allows a user to purchase music that is stored on the user's computer system in the user's iTunes account.
  • This music may be copied from the computer system to a suitable Apple device, such as an iPod or iPad.
  • However, music from an iPod or iPad cannot be copied to the user's computer, because this would make illegal copying of music very easy.
  • Thus, all of a user's music is stored in the user's computer system in their iTunes software.
  • Dropbox is an online service that allows storing information to the cloud.
  • Dropbox is based on the folder/subfolder (or directory/subdirectory) paradigm.
  • When using Dropbox, the user must remember to store the data in a Dropbox folder or subfolder, and then must also store the data in a location and use a file name the user is likely to remember.
  • Relying on the memory of a user to remember where the user stored something on a computer system is very inefficient and error-prone.
  • Many users have experienced storing a file to their computer system, then having to search many files across many directories in an attempt to locate the file they stored.
  • Database systems provide very structured ways of storing information, which results in supporting very powerful ways of retrieving information in the database via queries.
  • However, these powerful database tools for storing and retrieving information have not been employed in helping most users to store and retrieve information on their computer systems or smart phones.
  • Photography is an area that has greatly benefitted from modern technology. Digital cameras and cell phones allow capturing very high-resolution photographs and video in digital form that can be easily stored to an electronic device. While photography itself has been revolutionized by technology, the technology for storing and retrieving photographs has lagged far behind. Many people who have used digital cameras for years have many directories or folders on a computer system that contain thousands of digital photos and videos.
  • When a person uses a digital camera or cell phone to take a photo, the device typically names the photo with a cryptic name that includes a sequential number. For example, a Nikon camera may name a photo file with a name such as "DSC_0012.jpg". The digital file for the next photo is the next number in sequence, such as "DSC_0013.jpg".
  • In addition, the digital camera or cell phone may reuse file names that were used previously.
  • Because of this, many users choose to create a new directory or folder each time photos are downloaded from a camera or cell phone. This results in two significant problems.
  • First, the file name for a photo may be shared by multiple photos in multiple directories.
  • Second, the names of digital photo files give the user no information regarding the photo.
  • To locate a desired photo, the user may have to navigate a large number of directories, searching thumbnails of the photos in each directory. This is grossly inefficient and relies on the memory of the user to locate a desired photo.
  • Of course, a user can more efficiently locate photos if the user takes the time to carefully name directories or folders and also takes the time to carefully name individual photo files. But this is very time-consuming, and most users don't take the time needed to name folders and photo files in a way that would make retrieval of the photos easier.
  • Most people who take digital photos have thousands of photos that have cryptic names in dozens or hundreds of different directories or folders that may also have cryptic names. The result is that finding a particular photo may be very difficult.
  • Photo tagging is a way to add tags, or identifiers, to the metadata in a photo.
  • Google's Picasa service includes face recognition and tagging.
  • The face recognition engine in Picasa can recognize a person's face in multiple photos, and can then create a tag for that person that is written to the metadata for each of the photos. This allows for more easily retrieving photos based on a search of tags.
  • However, current tagging technology is not very sophisticated. If a person tags some photos with the name Jim, and other photos with the name Jimmy for the same person, a search for Jim will identify the photos tagged with Jim but will not identify the photos tagged with Jimmy.
  • Known tagging allows placing simple labels in the metadata of a photo file. A person can then use a search engine to search for photos that have one or more specified tags. But current tags do not allow identifying relationships between people, do not allow storing ages of people, and lack the flexibility and power needed to catalog, store and retrieve photos in a powerful way.
  • The U-Me system includes a photo processing mechanism that allows cataloging and storing a user's photos using relationships between people that allow the user's photos to be retrieved using a search engine.
  • A user enters people and specifies relationships, and may also enter locations, events, and other information.
  • Photos are then processed, and indexing info is generated for each photo that may include any or all of the following: user-defined relationships, system-derived relationships, user-defined locations, system-defined locations, user-defined events, system-derived events, and ages for the people in the photos.
  • The indexing info is used to catalog a photo for easy retrieval later.
  • The indexing info may be stored as metadata with the photo or may be stored separately from the photo.
  • The indexing info allows photos to be retrieved using a powerful search engine.
  • The U-Me system provides multiple templates that provide mapping information from physical devices to a master template that serves as a central repository for all of a user's settings for all of a user's devices.
  • The templates also provide mapping information that allows mapping settings between different physical devices, between physical devices and other templates, and between templates.
  • A user settings mechanism uses the mapping information to propagate user settings stored in one template to other templates and to one or more physical devices, and to propagate user settings stored in a physical device to multiple templates, including a master template that serves as the central repository for all of a user's settings.
  • The Universal Me (U-Me) system 100 includes multiple user accounts 110, shown in FIG. 1 as 110A, ..., 110N.
  • Each user account includes data, licensed content, and settings that correspond to the user.
  • User1 account 110A includes corresponding data 120A, licensed content 130A, and settings 140A.
  • UserN account 110N includes corresponding data 120N, licensed content 130N, and settings 140N. Any or all of the user's data, licensed content and settings may be made available on any device 150 the user may use. Examples of suitable devices are shown in FIG. 1 to include a smart phone 150A, a tablet computer 150B, a laptop computer 150C, a desktop computer 150D, and other device 150N. A minimal data-model sketch of this account structure appears below.
  • The devices shown in FIG. 1 are examples of suitable devices the user could use to access any of the data, licensed content, or settings in the user's account.
  • The disclosure and claims herein expressly extend to using any type of device to access the user's data, licensed content, or settings, whether the device is currently known or developed in the future.
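A minimal data-model sketch of the account structure in FIGS. 1 and 2; the dataclass layout is an illustrative assumption, not a schema defined by the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class UserAccount:
    """One U-Me user account holding the three categories shown in FIG. 1."""
    user_id: str
    data: dict = field(default_factory=dict)              # e.g. files, photos
    licensed_content: dict = field(default_factory=dict)  # e.g. music, eBooks
    settings: dict = field(default_factory=dict)          # per-device settings

accounts = {"user1": UserAccount("user1")}
accounts["user1"].settings["smart_phone"] = {"ringer_volume": 5}
print(accounts["user1"])
```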
  • The U-Me system 100 may include virtual devices in a user's account. Referring to FIG. 2, the User1 account 110A is shown to include a virtual smart phone 250A that corresponds to the physical smart phone 150A; a virtual tablet computer 250B that corresponds to the physical tablet computer 150B; a virtual laptop computer 250C that corresponds to the physical laptop computer 150C; a virtual desktop computer 250D that corresponds to a physical desktop computer 150D; and a virtual other device 250N that corresponds to a physical other device 150N.
  • The virtual devices preferably include all information that makes a physical device function, including operating system software and settings, software applications (including apps) and their settings, and user settings. It may be impossible due to access limitations on the physical device to copy all the information that makes the physical device function. For example, the operating system may not allow for the operating system code to be copied.
  • The virtual devices therefore contain as much information as they are allowed to contain by the physical devices. In the most preferred implementation, the virtual devices contain all information that makes the physical devices function.
  • The user can purchase a new smart phone, and all the needed information to configure the new smart phone exactly as the old one is available in the virtual smart phone stored in the user's U-Me account.
  • The phone will connect to the user's U-Me account, authenticate the user, and the user will then have the option of configuring the new device exactly as the old device was configured using the information in the virtual smart phone in the user's U-Me account.
  • When information is missing, the U-Me account will prompt the user with a list of things to do before the new physical device can be configured using the data in the virtual device. For example, if the user had just applied an operating system update and the new phone did not include that update, the user will be prompted to update the operating system before continuing. If an app installed on the old phone cannot be copied to the user's U-Me account, the U-Me app could prompt the user to install the app before the rest of the phone can be configured.
  • Thus, the virtual device preferably contains as much information as possible for configuring the new device, but when information is missing, the U-Me system prompts the user to perform certain tasks as prerequisites. Once the tasks have been performed by the user, the U-Me system can take over and configure the phone using the information stored in the corresponding virtual device. A sketch of this prerequisite check follows.
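A sketch of the prerequisite check described above; the version strings, field names, and app names are hypothetical:

```python
def prerequisites(virtual_device: dict, new_device: dict) -> list:
    """Return the tasks the user must perform before auto-configuration."""
    tasks = []
    # Naive string comparison; real code would parse version numbers.
    if new_device["os_version"] < virtual_device["os_version"]:
        tasks.append("Update OS to " + virtual_device["os_version"])
    for app in virtual_device["uncopyable_apps"]:
        if app not in new_device["installed_apps"]:
            tasks.append("Install app: " + app)
    return tasks

virtual = {"os_version": "4.4", "uncopyable_apps": ["BankApp"]}
physical = {"os_version": "4.3", "installed_apps": []}
print(prerequisites(virtual, physical))
# ['Update OS to 4.4', 'Install app: BankApp']
```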
  • Referring to FIG. 3, a computer system 300 is an example of one suitable computer system that could host the Universal Me system 100.
  • Server computer system 300 could be, for example, an IBM System i computer system.
  • Computer system 300 comprises one or more processors 310, a main memory 320, a mass storage interface 330, a display interface 340, and a network interface 350. These system components are interconnected through the use of a system bus 360.
  • Mass storage interface 330 is used to connect mass storage devices, such as local mass storage device 355, to computer system 300.
  • One specific type of local mass storage device 355 is a readable and writable CD-RW drive, which may store data to and read data from a CD-RW 395.
  • Main memory 320 preferably contains data 321, an operating system 322, and the Universal Me System 100.
  • Data 321 represents any data that serves as input to or output from any program in computer system 300.
  • Operating system 322 is a multitasking operating system.
  • The Universal Me System 100 is the cloud-based system described in detail in this specification, and includes a user settings mechanism 324 and user settings mapping information 326.
  • The Universal Me System 100 as shown in FIG. 3 is a software mechanism that provides all of the functionality of the U-Me system.
  • FIG. 3 in conjunction with FIG. 1 thus shows a computer system comprising at least one processor, a memory coupled to the at least one processor, user data residing in the memory corresponding to a first user, first user settings corresponding to the first user for a plurality of software applications residing in the memory, second user settings corresponding to the first user for a plurality of hardware devices, and a software mechanism executed by the at least one processor that makes the user data, the first user settings, and the second user settings available to the first user on a first device used by the first user.
  • Computer system 300 utilizes well known virtual addressing mechanisms that allow the programs of computer system 300 to behave as if they only have access to a large, contiguous address space instead of access to multiple, smaller storage entities such as main memory 320 and local mass storage device 355. Therefore, while data 321, operating system 322, and Universal Me System 100 are shown to reside in main memory 320, those skilled in the art will recognize that these items are not necessarily all completely contained in main memory 320 at the same time. It should also be noted that the term "memory” is used herein generically to refer to the entire virtual memory of computer system 300, and may include the virtual memory of other computer systems coupled to computer system 300.
  • Processor 310 may be constructed from one or more microprocessors and/or integrated circuits. Processor 310 executes program instructions stored in main memory 320. Main memory 320 stores programs and data that processor 310 may access. When computer system 300 starts up, processor 310 initially executes the program instructions that make up the operating system 322. Processor 310 also executes the Universal Me System 100.
  • Although computer system 300 is shown to contain only a single processor and a single system bus, those skilled in the art will appreciate that the Universal Me system may be practiced using a computer system that has multiple processors and/or multiple buses.
  • The interfaces that are used preferably each include separate, fully programmed microprocessors that are used to off-load compute-intensive processing from processor 310.
  • However, these functions may be performed using I/O adapters as well.
  • Display interface 340 is used to directly connect one or more displays 365 to computer system 300. These displays 365, which may be non-intelligent (i.e., dumb) terminals or fully programmable workstations, are used to provide system administrators and users the ability to communicate with computer system 300. Note, however, that while display interface 340 is provided to support communication with one or more displays 365, computer system 300 does not necessarily require a display 365, because all needed interaction with users and other processes may occur via network interface 350.
  • Network interface 350 is used to connect computer system 300 to other computer systems or workstations 375 via network 370.
  • Network interface 350 broadly represents any suitable way to interconnect electronic devices, regardless of whether the network 370 comprises present-day analog and/or digital techniques or via some networking mechanism of the future.
  • Network interface 350 preferably includes a combination of hardware and software that allow communicating on the network 370.
  • Software in the network interface 350 preferably includes a communication manager that manages communication with other computer systems 375 via network 370 using a suitable network protocol.
  • Many different network protocols can be used to implement a network. These protocols are specialized computer programs that allow computers to communicate across a network. TCP/IP (Transmission Control Protocol/Internet Protocol) is an example of a suitable network protocol.
  • Aspects of the Universal Me system may be embodied as a system, method or computer program product. Accordingly, aspects of the Universal Me system may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the Universal Me system may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • The computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • A computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the Universal Me system may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • The remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • FIG. 4 shows another view of a configuration for running the U-Me system 100.
  • The U-Me system 100 preferably runs in a cloud, shown in FIG. 4 as cloud 410.
  • A user connects to the U-Me system 100 using some physical device 150 that may include a browser 430 and/or software 440 (such as an application or app) that allows the user to interact with the U-Me system 100.
  • The physical device 150 is connected to the U-Me system 100 by a network connection 420, which is representative of network 370 shown in FIG. 3, and which can include any suitable wired or wireless network or combination of networks.
  • The network connection 420 in the most preferred implementation is an Internet connection, which makes the U-Me system available to any physical device that has Internet access. Note, however, that other types of networks may be used, such as satellite networks and wireless networks.
  • The disclosure and claims herein expressly extend to any suitable network or connection for connecting a physical device to the U-Me system 100.
  • U-Me system 100 includes user data 120, user licensed content 130, and user settings 140, as the specific examples in FIGS. 1 and 2 illustrate.
  • U-Me system 100 further includes a universal user interface 142, universal templates 152, device- specific templates 154, device interfaces 156, a virtual machine mechanism 158, a conversion mechanism 160, a data tracker 162, a data search engine 164, an alert mechanism 166, a licensed content transfer mechanism 168, a retention/destruction mechanism 170, a macro/script mechanism 172, a sharing mechanism 174, a virtual device mechanism 176, an eReceipt mechanism 178, a vehicle mechanism 180, a photo mechanism 182, a medical info mechanism 184, a home automation mechanism 186, a license management mechanism 188, a sub-account mechanism 190, a credit card monitoring mechanism 192, and a user authentication mechanism 194.
  • FIG. 6 shows some specific examples of user data 120 that could be stored in a user's U-Me account, including personal files 610, contacts 615, e-mail 620, calendar 625, tasks 630, financial info 635, an electronic wallet 640, photos 645, reminders 650, eReceipts 655, medical information 660, and other data 665.
  • the user data shown in FIG. 6 are examples shown for the purpose of illustration.
  • the disclosure and claims herein extend to any suitable data that can be generated by a user, generated for a user, or any other data relating in any way to the user, including data known today as well as data developed in the future.
  • Personal files 610 can include any files generated by the user, including word processor files, spreadsheet files, .pdf files, e-mail attachments, etc.
  • Contacts 615 include information for a user's contacts, preferably including name, address, phone number(s), e-mail address, etc.
  • E-mail 620 is e-mail for the user.
  • E-mail 620 may include e-mail from a single e-mail account, or e-mail from multiple e-mail accounts.
  • E-mail 620 may aggregate e-mails from different sources, or may separate e-mails from different sources into different categories or views.
  • Calendar 625 includes an electronic calendar in any suitable form and format.
  • Tasks 630 include tasks that a user may set and tasks set by the U-Me system.
  • Financial info 635 can include any financial information relating to the user, including bank statements, tax returns, investment account information, etc.
  • Electronic wallet 640 includes information for making electronic payments, including credit card and bank account information for the user. Google has a product for Android devices called Google Wallet. The electronic wallet 640 can include the features of known products such as Google Wallet.
  • Photos 645 include digital electronic files for photographs and videos. While it is understood that a user may have videos that are separate from photographs, the term "photos" as used herein includes both photographs and videos for the sake of convenience in discussing the function of the U-Me system.
  • Reminders 650 include any suitable reminders for the user, including reminders for events on the calendar.
  • eReceipts 655 includes electronic receipts in the form of electronic files that may include warranty information and/or links that allow a user to make a warranty claim.
  • Medical info 660 includes any suitable medical information relating to the user, including semi-private medical information, private medical information, and information provided by medical service providers, insurance companies, etc.
  • Other data 665 can include any other suitable data for the user.
  • FIG. 7 shows some specific examples of user licensed content 130 that could be stored in a user's U-Me account, including purchased music 710, stored music 715, purchased movies 720, stored movies 725, eBooks 730, software 735, games 740, sheet music 745, purchased images 750, online subscriptions 755, and other licensed content 760.
  • the user licensed content shown in FIG. 7 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable user licensed content, including user licensed content known today as well as user licensed content developed in the future.
  • Purchased music 710 includes music purchased from an online source. Note the purchased music 710 could include entire music files, or could include license information that authorizes the user to stream a music file on-demand.
  • Stored music 715 includes music the user owns and which has been put into electronic format, such as music recorded (i.e., ripped) from a compact disc.
  • Purchased movies 720 include movies purchased from an online source. Note the purchased movies 720 could include an entire movie file, or could include license information that authorizes the user to stream a movie on-demand.
  • Stored movies 725 include movies the user owns and which have been put into electronic format, such as movies recorded from a digital video disc (DVD).
  • eBooks 730 include books for the Apple iPad, books for the Kindle Fire, and books for the Barnes & Noble Nook. Of course, eBooks 730 could include books in any suitable electronic format.
  • Software 735 includes software licensed to the user and/or to the user's devices. In the most preferred implementation, software is licensed to the user and not to any particular device, which makes the software available to the user on any device capable of running the software. However, software 735 may also include software licensed to a user for use on only one device, as discussed in more detail below. Software 735 may include operating system software, software applications, apps, or any other software capable of running on any device. In addition, software 735 may include a backup of all software stored on all devices used by the user. Games 740 include any suitable electronic games, including games for computer systems and any suitable gaming system. Known gaming systems include Sony Playstation, Microsoft Xbox, Nintendo Wii, and others.
  • Games 740 may include any games for any platform, whether currently known or developed in the future.
  • Sheet music 745 includes sheet music that has been purchased by a user and is in electronic form. This may include sheet music files that are downloaded as well as hard copy sheet music that has been scanned.
  • Some pianos now include an electronic display screen that is capable of displaying documents such as sheet music files. If a user owns such a piano, the user could access via the piano all of the user's stored sheet music 745 in the user's U-Me account. Purchased images 750 include images purchased or otherwise licensed by the user in electronic form.
  • Online subscriptions 755 include content provided to the user on a subscription basis by any suitable provider. For example, if a user subscribes to Time magazine online, the online subscriptions 755 could include electronic copies of Time magazine.
  • Other licensed content 760 can include any other licensed content for a user.
  • FIG. 8 shows some specific examples of user settings 140 that could be stored in a user's U-Me account, including universal interface settings 810, phone settings 815, tablet settings 820, laptop settings 825, desktop settings 830, television settings 835, software settings 840, vehicle settings 845, home automation settings 850, gaming system settings 855, audio system settings 860, security system settings 865, user authentication settings 870, and other settings 875.
  • the user settings shown in FIG. 8 are examples shown for the purpose of illustration.
  • the software settings 840, which include user settings for software applications, include user preferences for each software application.
  • software application is used herein to broadly encompass any software the user can use, whether it is operating system software, an application for a desktop, an app for a phone, or any other type of software.
  • User settings for physical devices include user settings for each physical device.
  • the term "physical device” is used herein to broadly include any tangible device, whether currently known or developed in the future, that includes any combination of hardware and software. The disclosure and claims herein extend to any suitable user settings, including user settings known today as well as user settings developed in the future.
  • Universal interface settings 810 include settings for a universal interface for the U-Me system that can be presented to a user on any suitable device, which allows the user to interact with the U-Me system using that device.
  • Phone settings 815 include settings for the user's phone, such as a smart phone. Apple iPhone and Samsung Galaxy S4 are examples of known smart phones.
  • Tablet settings 820 include settings for the user's tablet computer. Examples of known tablet computers include the Apple iPad, Amazon Kindle, Barnes & Noble Nook, Samsung Galaxy Tab, and many others.
  • Laptop settings 825 are settings for a laptop computer.
  • Desktop settings 830 are settings for a desktop computer.
  • Television settings 835 are settings for any suitable television device.
  • television settings 835 could include settings for a television, for a cable set-top box, for a satellite digital video recorder (DVR), for a remote control, and for many other television devices.
  • Software settings 840 include settings specific to software used by the user. Examples of software settings include the configuration of a customizable menu bar on a graphics program such as Microsoft Visio; bookmarks in Google Chrome or favorites in Internet Explorer; default file directory for a word processor such as Microsoft Word; etc.
  • Software settings 840 may include any suitable settings for software that may be defined or configured by a user.
  • Vehicle settings 845 include user settings relating to a vehicle, including such things as position of seats, position of mirrors, position of the steering wheel, radio presets, heat/cool settings, music playlists, and video playlists.
  • Home automation settings 850 include settings for a home automation system, and may include settings for appliances, heating/ventilation/air conditioning (HVAC), lights, security, home theater, etc.
  • Gaming system settings 855 include settings relating to any gaming system.
  • Audio system settings 860 include settings for any suitable audio system, including a vehicle audio system, a home theater system, a handheld audio player, etc.
  • the security system settings 865 may include settings for any suitable security system.
  • User authentication settings 870 include settings related to the user's authentication to the U-Me system. Other settings 875 may include any other settings for the user.
  • the U-Me system makes a user's data, licensed content, and settings available to the user on any device the user desires to use. This is a significant advantage for many reasons. First of all, even for people who are comfortable with technology, getting a device configured exactly as the user wants is time-consuming and often requires research to figure out how to configure the device. For example, let's assume a user installs the Google Chrome browser on a desktop computer. When the user downloads a file using Google Chrome, the downloaded file appears as a clickable icon on the lower left of the Google Chrome display. To open the file, the user clicks on the icon. Let's assume the user wants to always open .pdf files after they are downloaded. Figuring out how to configure such behavior takes time and research; with the U-Me system, once such a setting is made, it is stored in the user's U-Me account and need not be recreated on each new device.
  • the user could borrow an iPad from a friend, and have access to all the user's data, licensed content, and settings.
  • the power and flexibility of the U-Me system leads to its usage in many different scenarios, several of which are described in detail below.
  • While many different categories of user settings are shown in FIG. 8, these are shown by way of example.
  • a benefit of the U-Me system is that a user only has to configure a device once, and the configuration for that device is stored in the user's U-Me account. Replacing a device that is lost, stolen, or broken is a simple matter of buying a new similar device, then following the instructions provided by the U-Me system to configure the new device to be identical to the old device. In the most preferred implementation, the U-Me system will back up all user data, licensed content, and settings related to the device to the user's U-Me account, which will allow the U-Me system to configure the new device automatically with minimal input from the user.
  • features of the devices themselves may prevent copying all the relevant data, licensed content and settings to the user's U-Me account.
  • the U-Me system will provide instructions to the user regarding what steps the user needs to take before the U-Me system can configure the device with the information stored in the user's U-Me account.
  • universal templates 152 include phone templates 910, tablet templates 915, laptop templates 920, desktop templates 925, television templates 930, software templates 935, vehicle templates 940, home automation templates 945, gaming system templates 950, audio system templates 955, security system templates 960, eReceipt templates 965, medical information templates 970, master template 975, and other templates 980.
  • the universal templates shown in FIG. 9 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable universal templates, including universal templates related to devices known today as well as universal templates related to devices developed in the future.
  • the various universal templates in FIG. 9 include categories of devices that may include user settings.
  • One of the benefits of the U-Me system is the ability for a user to store settings for any device or type of device that requires configuration by the user. This allows a user to spend time once to configure a device or type of device, and the stored settings in the user's U-Me account will allow automatically configuring identical or similar devices.
  • the U-Me system expressly extends to storing any suitable user data and/or user licensed content and/or user settings for any suitable device in a user's U-Me account.
  • the universal templates 152 provide a platform-independent way of defining settings for a particular type of device.
  • a universal phone template may be defined by a user using the U-Me system without regard to which particular phone the user currently has or plans to acquire.
  • Because the universal templates are platform-independent, they may include settings that do not directly map to a specific physical device.
  • the universal templates may include information uploaded from one or more physical devices.
  • the universal template can thus become a superset of user data, user licensed content, and user settings for multiple devices.
  • the universal templates can also include settings that do not correspond to a particular setting on a particular physical device.
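  • By way of illustration only, the following Java sketch shows one possible in-memory representation of a universal template as a platform-independent superset of settings merged from several devices. The class name and setting keys are hypothetical assumptions, not part of the disclosure.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a platform-independent universal template: a keyed
// collection of settings that aggregates uploads from several devices.
// All names here are illustrative assumptions, not a U-Me specification.
public class UniversalTemplate {
    private final Map<String, String> settings = new HashMap<>();

    // Merge settings uploaded from a physical device; over time the template
    // grows into a superset of the settings of every device merged into it.
    public void merge(Map<String, String> deviceSettings) {
        settings.putAll(deviceSettings);
    }

    public String get(String key) {
        return settings.get(key);
    }

    public static void main(String[] args) {
        UniversalTemplate phone = new UniversalTemplate();
        phone.merge(Map.of("ringtone.default", "Chimes", "wallpaper", "beach.jpg"));
        phone.merge(Map.of("wifi.ssid", "HomeNet")); // upload from a second device
        System.out.println(phone.get("ringtone.default")); // prints Chimes
    }
}
```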
  • device-specific templates 154 include phone templates 1005, tablet templates 1010, laptop templates 1015, desktop templates 1020, television templates 1025, software templates 1030, vehicle templates 1035, home automation templates 1040, gaming system templates 1045, audio system templates 1050, security system templates 1055, and other templates 1060.
  • the device-specific templates shown in FIG. 10 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable device-specific templates, including device -specific templates for devices known today as well as device-specific templates for devices developed in the future.
  • the device-specific templates 154 provide platform-dependent templates.
  • the user data, user licensed content, and user settings represented in a device-specific template includes specific items on a specific device or device type.
  • the device-specific templates 154 may also include mapping information to map settings in a physical device to settings in a universal template.
  • FIGS. 11-21 are related to the device-specific templates 154.
  • phone templates 1005 may include iPhone templates 1110, Android templates 1120 and Windows phone templates 1130, which represent different phone types.
  • Phone templates 1005 may also include templates for a specific phone, such as iPhone 4 template 1140 and Samsung Galaxy S3 template 1150, as well as one or more other phone templates 1160 that may be for a phone type or for a specific phone.
  • Tablet templates 1010 are shown in FIG. 12 to include iPad templates 1210 and Nook templates 1220, which represent different tablet platforms. Tablet templates 1010 may also include templates for a specific tablet, such as a Kindle Fire HD template 1230 and an iPad mini 2 template 1240, as well as one or more other tablet templates 1250 that may be for a tablet type or for a specific tablet.
  • Laptop templates 1015 are shown in FIG. 13 to include Lenovo laptop templates 1310 and MacBook templates 1320, which represent different laptop computer types. Laptop templates 1015 may also include templates for a specific laptop, such as a Samsung Chromebook template 1330 and an HP Envy template 1340, as well as one or more other laptop templates 1350 that may be for a laptop type or for a specific laptop.
  • Desktop templates 1020 are shown in FIG. 14 to include HP desktop templates 1410 and Dell desktop templates 1420, which represent different desktop computer types.
  • Desktop templates 1020 may also include templates for a specific desktop computer, such as an HP Pavilion PS-2355 desktop template 1430 and an Asus M11BB-B05 desktop template 1440, as well as one or more other desktop templates 1450 that may be for a desktop type or for a specific desktop computer.
  • Television templates 1025 are shown in FIG. 15 to include a Sony TV template 1510 and a satellite TV template 1520, which represent different types of television devices.
  • Television templates 1025 may also include templates for a specific television device, such as a Mitsubishi WD-60638 template 1530, a Dish Network Hopper DVR template 1540, and an RCA RCU1010 remote template 1550, as well as one or more other television device templates 1560 that may be for a television device type or for a specific television-related device.
  • Software templates 1030 are shown in FIG. 16 to include a word processor template 1610 and an e- mail template 1620, which represent different types of software.
  • Software templates 1030 may also include templates for specific software, such as a Microsoft Word template 1630 and a Google Chrome template 1640, as well as one or more other software templates 1650 that may be for a type of software or for specific software.
  • Vehicle templates 1035 are shown in FIG. 17 to include a Chevrolet template 1710 and a Toyota template 1720, which represent different types of vehicles.
  • Vehicle templates 1035 may also include templates for specific vehicles, such as a Honda Civic LX template 1730 or a Ford F150 XLT template 1740, as well as one or more other vehicle templates 1750 that may be for a type of vehicle or for a specific vehicle.
  • the vehicle templates 1035 could include templates for any type of vehicle, including cars, trucks, boats, large semi trucks, planes, and other vehicles.
  • the "type" of the vehicle herein can also vary, and a single vehicle can correspond to many different types.
  • a 2012 Lexus RX350 could be categorized as a passenger vehicle, as a small SUV, as a Lexus, as a Lexus passenger vehicle, as a Lexus small SUV, etc.
  • One of the significant advantages of the U-Me system is the ability to convert settings from a vehicle of one type to a vehicle of a different type.
  • If a user normally drives a Ford F150 XLT pickup, for example, the user's settings for his Ford pickup can be converted to corresponding settings in a Toyota rental car.
  • Home automation templates 1040 are shown in FIG. 18 to include a refrigerator template 1810, an HVAC template 1820, and an energy usage template 1830, which represent different things that may be controlled by a home automation system.
  • Home automation templates 1040 may also include templates for specific home automation systems, such as a Home Automation Inc. (HAI) Omni template 1840, a Samsung refrigerator template 1850, and a lighting template 1860, as well as one or more other home automation templates 1870 that may be for a type of home automation controller, for a type of item controlled by a home automation controller, for a specific home automation controller, or for a specific controlled item.
  • Gaming system templates 1045 are shown in FIG. 19 to include Xbox templates 1910 and Playstation templates 1920, which represent different types of gaming systems. Gaming system templates 1045 may also include templates for specific gaming systems, such as a Nintendo Wii U template 1930 and an Xbox 360 template 1940, as well as one or more other gaming system templates 1950 that may be for a type of gaming system or for a specific gaming system.
  • Audio system templates 1050 are shown in FIG. 20 to include stereo receiver templates 2010, home theater templates 2020, and vehicle audio templates 2030, which represent different types of audio systems. Audio system templates 1050 may also include templates for specific audio systems, such as a Sony STR-DH130 template 2040 and a Yamaha RX-V375 template 2050, as well as one or more other audio system templates 2060 that may be for a type of audio system or for a specific audio system.
  • Security system templates 1055 are shown in FIG. 21 to include ADT templates 2110 and FrontPoint templates 2120, which represent different types of security systems from different manufacturers.
  • Security system templates 1055 may also include templates for specific security systems, such as a Fortress S02-B template 2130 and a Simplisafe2 template 2140, as well as one or more other security system templates 2150 that may be for a type of security system or for a specific security system.
  • While the templates disclosed herein may be of any suitable format, it is expected that industry experts will have to spend time brainstorming and meeting to arrive at an industry standard.
  • For example, the automotive industry may generate an industry-standard template for cars, while the personal computer industry may generate a very different industry-standard template for desktop computers. Generating and publishing standard templates will greatly accelerate the acceptance of the U-Me system.
  • the device-specific templates shown in FIGS. 10-21 could be provided by any suitable entity.
  • the U-Me system may provide some of the device-specific templates.
  • some device-specific templates will preferably be provided by manufacturers of devices.
  • the U-Me system includes the capability for device manufacturers to become "U-Me Certified", which means their devices have been designed and certified to appropriately interact with the U-Me system.
  • Part of the U-Me certification process for a device manufacturer could be for the manufacturer to provide a universal template for each category of devices the manufacturer produces, a device-specific template for each category of devices the manufacturer produces, as well as a device-specific template for each specific device the manufacturer produces.
  • device interfaces 156 preferably include phone interfaces 2205, tablet interfaces 2210, laptop interfaces 2215, desktop interfaces 2220, television interfaces 2225, software interfaces 2230, vehicle interfaces 2235, home automation interfaces 2240, gaming system interfaces 2245, audio system interfaces 2250, security system interfaces 2255, and other interfaces 2260.
  • the device interfaces shown in FIG. 22 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable device interfaces, including device interfaces for devices known today as well as device interfaces for devices developed in the future.
  • Each device interface provides the logic and intelligence to interact with a specific type of device or with a specific device.
  • phone interfaces 2205 could include an iPhone interface and an Android interface.
  • phone interfaces 2205 could include different interfaces for the same type of device.
  • phone interfaces 2205 could include separate phone interfaces for an iPhone 4 and an iPhone 5.
  • phone interfaces 2205 could be combined into a single phone interface that has the logic and intelligence to communicate with any phone.
  • a device interface is provided for each specific device that will interact with the U-Me system. Providing a device interface that meets U-Me specifications could be a requirement for a device to become U-Me certified.
  • the U-Me system preferably includes a universal user interface 142 shown in FIG. 5.
  • the universal user interface 2300 shown in FIG. 23 is one suitable example of a specific implementation for the universal user interface 142 shown in FIG. 5.
  • the universal user interface 2300 in FIG. 23 includes several icons the user may select to access various features in the U-Me system.
  • the icons shown in FIG. 23 include a data icon 2310, a licensed content icon 2320, a software icon 2330, a settings icon 2340, a devices icon 2350, and a templates icon 2360. Selecting the data icon 2310 gives the user access to the user data 120 stored in the user's U-Me account, including the types of data shown in FIG. 6.
  • Selecting the licensed content icon 2320 gives the user access to any and all of the user's licensed content 130, including the categories of licensed content shown in FIG. 7. Selecting the software icon 2330 gives the user access to software available in the user's U-Me account. While software is technically a category of licensed content, a separate icon 2330 is provided in the universal user interface 2300 in FIG. 23 because most users would not intuitively think to select the licensed content icon 2320 to run software. Selecting the software icon 2330 results in a display of the various software applications available in the user's U-Me account. The user may then select one of the software applications to run.
  • the display of software icons after selecting the software icon 2330 could be considered a "virtual desktop" that is available anywhere via a browser or other suitable interface.
  • Selecting the settings icon 2340 gives the user access to any and all of the user settings 140, including the categories of settings shown in FIG. 8.
  • Selecting the devices icon 2350 gives the user access to virtual devices, which are discussed in more detail below, where the virtual devices correspond to a physical device used by the user. The user will also have access to the device interfaces 156, including the device interfaces shown in FIG. 22. Accessing devices via the device interfaces allows the user to have remote control via the universal user interface over different physical devices.
  • Selecting the templates icon 2360 gives the user access to the templates in the user's U-Me account, including: universal templates, including the universal templates shown in FIG. 9; and device-specific templates, including those shown in FIGS. 10- 21.
  • the devices icon 2350 and the templates icon 2360 provide access to information in the user's U-Me account pertaining to devices and templates, which can be part of the settings in the user's U-Me account. While the Devices icon 2350 and Templates icon 2360 could be displayed as a result of a user selecting the Settings icon 2340, providing them as separate icons as shown in FIG. 23 makes using the universal user interface 2300 more intuitive for the user.
  • the universal user interface gives the user great flexibility in accessing a user's U-Me account.
  • the universal user interface is browser-based, which means it can be accessed on any device that has a web browser.
  • other configurations for the universal user interface are also possible, and are within the scope of the disclosure and claims herein. For example, a user on vacation in a foreign country can go into an Internet cafe, invoke the login page for the U-Me system, log in, and select an icon that causes the universal user interface (e.g., 2300 in FIG. 23) to be displayed. The user then has access to any and all information stored in the user's U-Me account.
  • the universal user interface allows a user to access the user's U-Me account on any device
  • the universal user interface also provides a way for a user to change settings on the user's devices.
  • Because the user's U-Me account includes virtual devices that mirror the configuration of their physical device counterparts, the user could use a laptop or desktop computer to define the settings for the user's phone. This can be a significant advantage, particularly for those who don't see well or who are not dexterous enough to use the tiny keypads on a phone.
  • a simple example will illustrate. Let's assume a U-Me user wants to assign a specific ringtone to her husband's contact info in her phone.
  • the user could sit down at a desktop computer, access the universal user interface 2300, select the Devices icon 2350, select a Phone icon, which then gives the user access to all of the settings in the phone.
  • the user can then navigate a menu displayed on a desktop computer system using a mouse and full-sized keyboard to change settings on the phone instead of touching tiny links and typing on a tiny keyboard provided by the phone.
  • the user could assign the ringtone to her husband's contact info in the settings in the virtual device in the U-Me account that corresponds to her phone. Once she makes the change in the virtual phone settings in the U-Me account, this change will be automatically propagated to her phone.
  • the universal user interface may thus provide access to the user to set or change the settings for all of the user's physical devices.
  • the universal user interface 142 can include any suitable interface type.
  • the universal user interface 142 can provide different levels of interfaces depending on preferences set by the user.
  • the universal user interface may provide simple, intermediate, and power interfaces that vary in how the information is presented to the user depending on the user's preferences, which could reflect the technical prowess and capability of the user.
  • Those who are the least comfortable with technology could select a simple interface, which could provide wizards and lots of help context to help a user accomplish a desired task.
  • Those more comfortable with technology could select the intermediate interface, which provides fewer wizards and less help, but allows a user to more directly interact with and control the U-Me system.
  • a method 2400 for programming a device called Device2 begins by determining settings for Device2 (step 2410), then programming the device with those settings (step 2420). There are different ways to determine the settings for Device2 in step 2410. Referring to FIG. 25, method 2500 shows one suitable implementation for step 2410 in FIG. 24. Settings for a device called Device1 are read (step 2510). A mapping from Device1 to Device2 is then read (step 2520).
  • the settings for Device1 are then converted to the settings for Device2 (step 2530).
  • FIGS. 25 and 26 show how to program a device by converting settings from one device to settings for a different device. For example, let's assume a user has been using an iPhone 4, then decides to change to a Samsung Galaxy S4 phone. Assuming there are device-specific templates 154 for both phones, the conversion mechanism 160 in FIG. 5 can convert the settings on the iPhone 4 to settings on the Samsung Galaxy S4, provided there is a mapping in the phone templates between the device-specific settings of the two devices.
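  • As a minimal sketch of this direct device-to-device conversion, the following Java code maps Device1 setting keys to Device2 setting keys through a mapping table. The key names are hypothetical; a real mapping would come from the device-specific templates.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of method 2500: read Device1 settings, read a Device1-to-Device2
// mapping, and produce Device2 settings. Key names are assumptions.
public class DirectConversion {
    public static Map<String, String> convert(Map<String, String> sourceSettings,
                                              Map<String, String> keyMapping) {
        Map<String, String> targetSettings = new HashMap<>();
        for (Map.Entry<String, String> e : sourceSettings.entrySet()) {
            String targetKey = keyMapping.get(e.getKey());
            if (targetKey != null) {           // skip settings with no counterpart
                targetSettings.put(targetKey, e.getValue());
            }
        }
        return targetSettings;
    }

    public static void main(String[] args) {
        Map<String, String> iphone = Map.of("ios.ringtone", "Chimes");
        Map<String, String> mapping = Map.of("ios.ringtone", "android.ringtone");
        System.out.println(convert(iphone, mapping)); // {android.ringtone=Chimes}
    }
}
```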
  • A second suitable implementation for step 2410 in FIG. 24 is shown in FIGS. 27 and 28.
  • Device2 is programmed from settings stored in the Universal Template corresponding to Device2.
  • the universal template settings are read (step 2710).
  • a mapping from the universal template to Device2 is read (step 2720).
  • the conversion mechanism then converts the settings from the universal template to the settings for Device2 (step 2730).
  • This is shown graphically in FIG. 28, where universal template settings 2810 are converted using the universal template to Device2 mapping 2820 to generate Device2 settings 2830.
  • This second implementation in FIGS. 27 and 28 varies from the first implementation in FIGS. 25 and 26 because the conversion is from the universal template settings to the Device2 settings, not from the settings of another device (such as Device1).
  • A third suitable implementation for step 2410 in FIG. 24 is shown in FIGS. 29 and 30.
  • Device1 settings are read (step 2910).
  • a mapping from Device1 to the universal template is also read (step 2920).
  • the Device1 settings are then converted to the universal template settings (step 2930).
  • a mapping from the universal template to Device2 is then read (step 2940).
  • the universal template settings are then converted to Device2 settings (step 2950).
  • This third implementation converts settings between two devices, similar to the first implementation shown in FIGS. 25 and 26, but instead of a direct mapping between the two devices, it uses a mapping to and from the universal template settings.
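  • A minimal sketch of this third implementation, assuming the hypothetical DirectConversion.convert() helper sketched above is available: Device1 settings are first mapped into the universal template, and the universal template is then mapped into Device2 settings.

```java
import java.util.Map;

// Sketch of method 2900 (FIGS. 29 and 30): two mapping passes through the
// universal template instead of one direct device-to-device mapping.
public class UniversalConversion {
    public static void main(String[] args) {
        Map<String, String> device1 = Map.of("ios.ringtone", "Chimes");
        Map<String, String> d1ToUniversal = Map.of("ios.ringtone", "universal.ringtone");
        Map<String, String> universalToD2 = Map.of("universal.ringtone", "android.ringtone");

        // step 2930: Device1 settings -> universal template settings
        Map<String, String> universal = DirectConversion.convert(device1, d1ToUniversal);
        // step 2950: universal template settings -> Device2 settings
        Map<String, String> device2 = DirectConversion.convert(universal, universalToD2);
        System.out.println(device2); // {android.ringtone=Chimes}
    }
}
```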
  • the conversion of settings from one device to another in FIGS. 25-30 can be performed by the conversion mechanism 160 shown in FIG. 5, which could include the user settings mechanism 324 and the user settings mapping information 326 shown in FIG. 3.
  • the examples in FIGS. 25-30 allow converting settings from one device to corresponding settings for a different device.
  • the different device may be of the same type or may be of a different type.
  • Type can be defined according to hardware architecture, system software (e.g., operating system), manufacturer, brand, or any other suitable characteristic of a device.
  • an iPhone and a Samsung Galaxy phone are devices of different types because they have a different hardware architecture type and run different system software.
  • a Chevrolet and a Toyota are devices of different types because they are made by different manufacturers.
  • An iPhone 4 and an iPhone 5 could be categorized as devices of the same type because they have the same hardware architecture type and run the same system software, even if the versions of the system software are not identical.
  • the disclosure and claims herein extend to any suitable definition or categorization for the "type" of a device.
  • the conversion mechanism allows converting settings between devices of the same type, between devices of similar type, and also between devices of different types. For example, devices may be of the same type when they have the same hardware architecture type and run the same system software. Devices may be of similar type when they have the same hardware architecture type and run different system software. Devices may be of different types when they have different hardware architecture type and different system software.
  • suitable television settings 835 are shown in FIG. 32 to include one or more favorite channels lists 3210, shows set to record 3220, blocked channels 3230, parental controls 3240, channel numbers for stations 3250, and one or more passwords 3260. These are all settings the user can define, for example, in a DVR for Dish Network.
  • For this specific example, we assume the user has Dish Network at the user's home, and programs the Dish Network DVR with some or all of the user television settings 835 shown in FIG. 32. We now assume the user travels to a new location during a vacation, such as a hotel room, a resort, a relative's house, etc., and we further assume the new location has DirecTV. Referring to FIG. 33, method 3300 begins by detecting the target system (at the new location) is a DirecTV system (step 3310). The user's Dish Network television settings are converted to equivalent or similar DirecTV settings in the user's U-Me account (step 3320).
  • the converted DirecTV settings from the user's U-Me account are then downloaded to the DirecTV target system (e.g., DVR) at the new location (step 3330).
  • the result is the user's Dish Network television settings are now available on the DirecTV DVR.
  • One part of the conversion in step 3320 is converting the channel numbers from Dish Network to the equivalent channel numbers in DirecTV. A sample mapping for ten channels is shown at 3100 in FIG. 31.
  • the indication of "local" in the channel mapping will indicate a need to determine the location of the target system, and determine the appropriate mapping to the target system using the channel numbers that are specific to the geographic region where the target system is located. This is a task easily accomplished by the U-Me system.
  • the mapping 3100 shown in FIG. 31 is one suitable example for user settings mapping information 326 shown in FIG. 3.
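  • The following Java sketch illustrates such channel-number mapping information. The TNT and Fox News channel pairs come from the examples in this description; the "local" entry and the region lookup are hypothetical stand-ins for provider data.

```java
import java.util.Map;

// Sketch of a Dish Network to DirecTV channel mapping like mapping 3100.
public class ChannelMapping {
    // Dish channel -> DirecTV channel, as Strings so "local" can be flagged
    static final Map<String, String> DISH_TO_DIRECTV = Map.of(
            "138", "245",    // TNT (pair from the example in the text)
            "205", "360",    // Fox News (pair from the example in the text)
            "8",   "local"); // hypothetical local station, region-dependent

    static String toDirecTv(String dishChannel, String region) {
        String mapped = DISH_TO_DIRECTV.get(dishChannel);
        if ("local".equals(mapped)) {
            return lookupLocalChannel(region); // resolve by target-system region
        }
        return mapped;
    }

    // Placeholder lookup; a real system would consult provider channel data.
    static String lookupLocalChannel(String region) {
        return "Chicago".equals(region) ? "5" : "unknown";
    }

    public static void main(String[] args) {
        System.out.println(toDirecTv("205", "Chicago")); // prints 360
    }
}
```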
  • each provider's DVR will need to be "U-Me Certified", meaning the DVR includes logic and intelligence that allows the DVR to interact with the U-Me system.
  • This certification process will also preferably provide a device-specific template for each DVR, along with information that allows mapping the settings from one provider to another provider.
  • a universal template for a DVR could be defined with required fields, and each device-specific template for each DVR will have to have the required fields specified in the universal DVR template.
  • Changing television settings in the new location would not be very helpful unless the user has a remote control that can accommodate the change.
  • a user has a remote control with a screen that displays channel icons, such as shown in FIG. 35.
  • Some remote controls, such as the Pronto touch-screen remote control and the RCA RCU1010 remote control, allow displaying channel icons.
  • Method 3400 in FIG. 34 can be used to reprogram a remote control to accommodate the change of location in the example above.
  • the remote control is "U-Me Certified" , meaning the remote control includes logic and intelligence that allows the remote control to interact with the U-Me system.
  • the settings for the remote control are read (step 3410), which includes determining the mapping of channel icons to channel numbers.
  • the settings are converted to equivalent or similar settings for the target system (step 3420). This means the channel numbers of the displayed icons in display 3500 in FIG. 35 for Dish Network are converted to the equivalent channel numbers using the mapping 3100 in FIG. 31.
  • the conversion of settings is preferably performed by the conversion mechanism 160 shown in FIG. 5.
  • the remote control is then reprogrammed for the target system (step 3430). This means the channel numbers that are sent by the remote control are now the channel numbers for DirecTV, not Dish Network. Thus, when the user is home and presses the Fox News icon, the remote control sends channel 205 to the Dish Network DVR. But after the remote control has been reprogrammed for the target system at the new location as shown in FIG. 34, when the user presses the Fox News icon, the remote control will now send channel 360 to the DirecTV DVR.
  • This reprogramming thus allows a user to use a remote control with icon-based channels by reprogramming the underlying channel numbers that are sent by the remote control when an icon is pressed. The user is thus able to travel with the user's home remote control, and have the remote control be automatically reprogrammed to accommodate the television system at the new location, assuming the television system at the new location is U-Me compliant.
  • the remote can also be reprogrammed to transmit different channel numbers than the channel numbers pressed by the user. This is shown in method 3600 in FIG. 36.
  • a user uses the numeric keypad on the remote control to key in a channel number for Device1 (step 3610).
  • the remote automatically converts the channel number entered by the user to the equivalent channel number in the target system (step 3620).
  • the remote then transmits the channel number for Device2 (step 3630).
  • For example, when the user presses channel 138 on the remote control keypad (step 3610), the remote control will detect the number and convert the number 138 for TNT in Dish Network to the number 245 for TNT in DirecTV, as shown in FIG. 31 (step 3620).
  • the remote control then transmits channel number 245 to the DirecTV DVR (step 3630). In this manner the user need not learn the new channel numbers at the new location, but can instead use the old channel numbers from home to access the same television channels on the system at the new location.
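  • A minimal Java sketch of this keypad translation, using the channel pairs from the example above; the class and method names are illustrative assumptions.

```java
import java.util.Map;

// Sketch of method 3600: the user keys in a familiar home channel number
// and the remote transmits the equivalent number for the target system.
public class KeypadTranslator {
    static final Map<Integer, Integer> HOME_TO_TARGET = Map.of(
            138, 245,  // TNT: Dish Network -> DirecTV
            205, 360); // Fox News: Dish Network -> DirecTV

    // step 3620: convert the keyed number (unknown channels pass through)
    static int translate(int keyedChannel) {
        return HOME_TO_TARGET.getOrDefault(keyedChannel, keyedChannel);
    }

    public static void main(String[] args) {
        // step 3630: the remote would transmit the translated number
        System.out.println(translate(138)); // prints 245
    }
}
```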
  • the reprogramming described in FIGS. 31-36 could be carried out by reprogramming a smart phone app that functions as a remote control. This will be incredibly convenient because the user will always travel with the user's smart phone, which means the user will always have a remote control that can be reprogrammed by the U-Me system to work on a target system at a new location. Of course, this scenario is many years into the future after such televisions are widely available and after manufacturers of televisions, television equipment, and remote controls all become U-Me certified.
  • Method 3700 in FIG. 37 shows another method for reprogramming a remote control.
  • a user selects a TV provider on the remote control (step 3710).
  • the remote control determines its location (step 3720).
  • the remote determines, from the detected location and the selected TV provider, the channel numbers for defined channel icons using a database of TV providers (step 3730).
  • the remote then reprograms itself for channel numbers for the selected TV provider at the detected location (step 3740).
  • a simple example will illustrate. Let's assume the same scenario discussed in detail above, where a user has Dish Network at home and travels to a location that has DirecTV. The user could press a button, icon, or selection from a drop-down list on the remote control that selects DirecTV in step 3710.
  • the remote control could detect its location in step 3720 in any suitable way, including an internal GPS device, a wireless network interface that detects an Internet Protocol (IP) address and determines a geographic location for that IP address, or in any other suitable way.
  • the remote then consults a database of channel numbers for various TV providers at that geographic location.
  • In one implementation, the database will be stored in the remote control itself.
  • In another implementation, the database will be stored external to the remote, such as at a website, and could be accessed by the remote control via a Wi-Fi connection.
  • the remote control reprograms itself for those channel numbers (step 3740). Note that method 3700 supports changing the underlying channel numbers for displayed channel icons, similar to that discussed with respect to FIG. 34.
  • the U-Me system provides a very powerful way for a user to use settings the user is accustomed to using at home while interacting with an entirely unfamiliar system at a new location.
  • the U-Me system introduces the concept of an eReceipt, which is simply a receipt in electronic form.
  • the eReceipt may include warranty information as well as a record of a purchase.
  • An eReceipt is processed by the eReceipt mechanism 178 shown in FIG. 5.
  • a method 3800 begins by defining an eReceipt template (step 3810).
  • the eReceipt template is then published (step 3820). Once published, any and all vendors may create eReceipts that conform to the published eReceipt template.
  • the eReceipt template can be defined in any suitable way.
  • One suitable way uses a markup language such as XML to define fields, some of which are mandatory and some of which are optional.
  • a seller can determine based on the fields in the eReceipt template how to format an eReceipt according to the definition of the eReceipt template.
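  • As an illustration of what such an XML-based eReceipt might look like, the following Java sketch embeds a sample eReceipt (field values taken from the FIG. 45 example) and performs a minimal mandatory-field check. Which fields are mandatory is an assumption here; the published template would govern.

```java
// Sketch of an eReceipt formatted per a hypothetical XML template definition;
// field names follow the sections shown in FIG. 44.
public class EReceiptExample {
    static final String SAMPLE = """
        <eReceipt>
          <seller>Best Buy</seller>          <!-- assumed mandatory -->
          <sellerId>14296</sellerId>         <!-- assumed mandatory -->
          <productId>WD-60735</productId>    <!-- assumed mandatory -->
          <serialNumber>166-4923</serialNumber>
          <price>1499.99</price>
          <gift>No</gift>                    <!-- assumed optional -->
        </eReceipt>
        """;

    // Minimal check; a real system would validate against the template schema.
    static boolean hasMandatoryFields(String xml) {
        return xml.contains("<seller>") && xml.contains("<sellerId>")
                && xml.contains("<productId>");
    }

    public static void main(String[] args) {
        System.out.println(hasMandatoryFields(SAMPLE)); // prints true
    }
}
```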
  • Method 3900 in FIG. 39 shows how eReceipts are used.
  • a user buys a product (step 3910).
  • the seller determines the warranty policy for the product (step 3920).
  • the seller formats an eReceipt according to the warranty policy and the eReceipt template (step 3930).
  • the seller sends the eReceipt to the user (step 3940), preferably to the user's U-Me account.
  • the eReceipt is then processed and stored (step 3950) in the user's U-Me account.
  • the result of method 3900 is an electronic copy of a receipt that is automatically stored in the user's U-Me account when the user makes a purchase.
  • a seller delivers eReceipts that are formatted according to the eReceipt template. This is easily done by attaching an eReceipt file to an e-mail.
  • the U-Me system monitors all incoming e-mail, and when an eReceipt is detected in an incoming e-mail, the U-Me system reads the eReceipt. The U-Me system will then process and store the eReceipt in the user's U-Me account. In the most preferred implementation, the eReceipt will be processed and stored in the user's U-Me account without any further input required by the user. However, in an alternative implementation, the user may be prompted to enter information related to the purchase before the eReceipt is stored in the user's U-Me account.
  • FIG. 40 shows a sample warranty policy 4000. If a warranty claim is less than 90 days from the date of purchase, the item will be returned to the store. If a warranty claim is 90 days or more from the date of purchase, the item will be returned to the manufacturer.
  • This warranty policy information may be included in the eReceipt. In one specific implementation, the warranty policy is included in the eReceipt in the form of a timed warranty link, discussed in more detail below.
  • FIG. 41 shows a method 4100 that illustrates a specific example for method 3900 in FIG. 39.
  • a user buys a television from manufacturer ABC from seller XYZ (step 4110).
  • the seller XYZ formats an eReceipt with a timed warranty link according to the eReceipt template (step 4120).
  • the e-Receipt with the timed warranty link is e-mailed to the user (step 4130).
  • the information in the eReceipt is then extracted and stored in fields in an eReceipt database in the user's U-Me account (step 4140), where the fields correspond to the fields defined in the eReceipt template.
  • the eReceipt can include a timed warranty link, which allows the user to submit a warranty claim by clicking on the timed warranty link.
  • a timed warranty link may be created and maintained using any suitable method. Two such methods are disclosed herein by way of example. The first method is shown in method 4200 in FIG. 42.
  • the U-Me system detects a timed warranty link (step 4210).
  • the timed warranty link is according to the warranty policy 4000 shown in FIG. 40.
  • the warranty link in the U-Me system is initially set to point to the seller's warranty return system (step 4220).
  • the U-Me system sets a timer according to the timed warranty link (step 4230).
  • A second method for defining a timed warranty link is shown in FIG. 43.
  • This timed warranty link appears to the user as a link in plain language, such as "Click here to make a warranty claim."
  • the logic shown in FIG. 43 underlies the warranty link.
  • the timed warranty link in FIG. 43 includes a date that is 90 days from the date of purchase (in accordance with the warranty policy 4000 in FIG. 40). If the current date is less than or equal to the set date at 90 days, selecting the link takes the user to the Best Buy warranty claim website. If the current date is greater than the set date at 90 days, selecting the link takes the user to the Mitsubishi warranty claim website. This is shown in method 4700 in FIG. 47. A user clicks on the timed warranty link (step 4710).
  • the system compares the current date to one or more dates in the timed warranty link (step 4720). The system then navigates to the link corresponding to the current date (step 4730).
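  • A minimal Java sketch of this date comparison, assuming the 90-day policy of FIG. 40; the URLs are illustrative placeholders, not actual warranty sites.

```java
import java.time.LocalDate;

// Sketch of the timed-warranty-link logic of FIGS. 43 and 47: on or before
// the 90-day boundary the link resolves to the seller's warranty site,
// afterwards to the manufacturer's.
public class TimedWarrantyLink {
    final LocalDate purchaseDate;
    final String sellerUrl;       // e.g., the seller's warranty claim page
    final String manufacturerUrl; // e.g., the manufacturer's warranty claim page

    TimedWarrantyLink(LocalDate purchaseDate, String sellerUrl, String manufacturerUrl) {
        this.purchaseDate = purchaseDate;
        this.sellerUrl = sellerUrl;
        this.manufacturerUrl = manufacturerUrl;
    }

    // steps 4720 and 4730: compare today to the boundary, pick the target
    String resolve(LocalDate today) {
        LocalDate boundary = purchaseDate.plusDays(90);
        return today.isAfter(boundary) ? manufacturerUrl : sellerUrl;
    }

    public static void main(String[] args) {
        TimedWarrantyLink link = new TimedWarrantyLink(
                LocalDate.of(2013, 8, 2), // purchase date from the FIG. 45 example
                "https://seller.example/warranty",
                "https://manufacturer.example/warranty");
        System.out.println(link.resolve(LocalDate.of(2013, 9, 1))); // seller URL
    }
}
```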
  • By providing timed warranty links, the user has an improved experience because the appropriate place to submit a warranty claim is automatically presented to the user when the user selects a timed warranty link.
  • an eReceipt for a lawnmower could include a link to a parts web page that would allow the user to order parts for the lawnmower, including blades, belts, or any other parts.
  • the advantage of providing a parts link with the eReceipt is the information in the eReceipt can be used to direct the user to the correct parts page automatically. The user no longer has to go to the garage and find the sticker on the lawnmower that indicates the name, model and serial number, because this information is preferably included in the eReceipt.
  • the eReceipt thus provides a very effective way for sellers and manufacturers to provide valuable information to customers.
  • An example of an eReceipt template 965 is shown in FIG. 44 to include multiple sections, including a Seller Information section, a Product Information section, a Transaction Information section, a Buyer Information section, and an Embedded Metadata section.
  • the Seller Information section includes fields for Seller, Seller ID and Location.
  • the Product Information section includes fields for Product Category, Product Type, Product Attribute, Manufacturer, Product ID, Serial Number, Price, Warranty Link, and Gift. Note the Product Information fields are preferably replicated for each item that is purchased.
  • the Transaction Information section includes fields for Date, Transaction ID, Tax, Shipping and Total.
  • the Buyer Information section includes fields for Buyer Name, Buyer Address, Buyer Phone, and Buyer E-mail.
  • the Embedded Metadata field includes data that is in the eReceipt but that is not visible when an eReceipt is viewed.
  • any suitable field could be included in an eReceipt.
  • the fields shown in FIG. 44 are by way of example, and are not limiting.
  • Embedded Metadata includes a unique identifier that allows uniquely identifying the eReceipt. Values stored in the Embedded Metadata field include constant values, or values generated using any suitable heuristic. For example, a seller could provide embedded metadata in the form of <SellerID.Date.ValidationCode>, where the SellerID and Date are taken from the values stored in the eReceipt and the ValidationCode is a unique code that is dynamically generated by the seller and assigned to this eReceipt.
  • the Embedded Metadata provides an electronic identifier that can identify this receipt as genuine.
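  • A minimal sketch of generating such an identifier in the <SellerID.Date.ValidationCode> form described above; using a random UUID as the validation code is an assumption, since any suitable unique code would do.

```java
import java.util.UUID;

// Sketch of embedded-metadata generation for an eReceipt.
public class EmbeddedMetadata {
    static String generate(String sellerId, String date) {
        String validationCode = UUID.randomUUID().toString(); // per-eReceipt code
        return sellerId + "." + date + "." + validationCode;
    }

    public static void main(String[] args) {
        // e.g., 14296.20130802.<random code>; the seller would store this
        // value to validate the eReceipt later (step 4920 in FIG. 49).
        System.out.println(generate("14296", "20130802"));
    }
}
```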
  • An example of an eReceipt 4500 formatted according to the eReceipt template 965 in FIG. 44 is shown in FIG. 45.
  • This particular eReceipt is for a Mitsubishi 60 inch television purchased at Best Buy.
  • the Seller is listed as Best Buy.
  • the Seller ID is shown as 14296, which could be a code that uniquely identifies Best Buy from other sellers.
  • the TV was purchased at Store 564, which is a code that tells Best Buy where the TV was purchased.
  • the Product Category is Home Electronics.
  • the Product Type is Flat Screen TV.
  • the Product Attribute is 60 inch DLP.
  • the manufacturer is Mitsubishi.
  • the Product ID is WD-60735.
  • the serial number is 166-4923.
  • the price is $1,499.99.
  • the Warranty Link shows ⁇ Click Here to make a Warranty Claim>, which is abbreviated in FIG. 45 due to space constraints.
  • the Warranty Link includes a timed warranty link as discussed above.
  • the Gift field has a value of No because this TV was not purchased as a gift.
  • the purchase date of the TV was 08/02/2013.
  • the transaction ID is 543921268.
  • the Sales Tax is $123.12.
  • the shipping is zero (because the customer purchased the TV at a store).
  • the total for the purchase is $1,623.11.
  • the Buyer Name is Jim Jones.
  • the Buyer Address is 21354 Dogwood, Carthage, MO 64836 (not shown in FIG. 45 due to space constraints).
  • the Buyer Phone is 417-555-3399.
  • the buyer e-mail is J29A@gmail.com.
  • the Embedded Metadata is data that uniquely identifies the eReceipt and can be used in the future to validate the eReceipt.
  • Some sellers could include the buyer information, and some may not.
  • When the seller includes it, the eReceipt received from the seller includes the Buyer Information.
  • Otherwise, the eReceipt could be e-mailed to the buyer without all of the Buyer Information filled in.
  • In that case, the eReceipt mechanism could automatically add the buyer information to the eReceipt, or could leave the buyer information incomplete.
  • method 4600 shows a seller sending an eReceipt to a manufacturer that provided the product sold in the eReceipt (step 4610).
  • the manufacturer can then register the product to the eReceipt. Note the product is registered to the eReceipt and not necessarily to the buyer, although the eReceipt includes the buyer's information. This is because many products are purchased as gifts, especially during the Christmas season.
  • Using eReceipts for gifts is a great advantage, because the eReceipts can be forwarded to gift recipients so they can have the purchase and warranty information for the gifts they received.
  • a user could do a search for all eReceipts for items purchased in November and December, where the eReceipt indicates the product was a gift.
  • the user who bought the gift could forward the eReceipt via e-mail to the gift recipient. If the gift recipient is a user of the U-Me system, the eReceipt will be processed and put into the user's eReceipt database.
  • the eReceipt mechanism 178 can also include the ability to delete price information when forwarding an eReceipt to a gift recipient.
  • a user could check on a "Gift Receipt" box which would delete all financial information from the gift receipt, such as price, sales tax, and shipping, before sending the eReceipt to the gift recipient.
  • the eReceipt mechanism 178 can thus provide an eGiftReceipt to the gift recipient for the product that was given as a gift, which includes all pertinent product information without including the financial information.
  • the method 4600 in FIG. 46 registers a product to an eReceipt received from the seller. Let's assume the eReceipt is then transferred from a first U-Me user to a second U-Me user who received the product as a gift from the first U-Me user. If the second U-Me user needs to make a warranty claim, the second U-Me user can click on the warranty link in the eReceipt, which we assume for this example directs the second U-Me user to the manufacturer's warranty claim website, where the second U-Me user submits the eReceipt to the manufacturer to identify the product.
  • the manufacturer can then search its database and locate the corresponding eReceipt to which the product was registered.
  • the manufacturer could check the embedded metadata in the eReceipt to verify it is the same as the eReceipt to which the product was originally registered.
  • the manufacturer can then provide the warranty service to the new user.
  • the eReceipt concept can also be extended to help a user report and potentially recover stolen goods. Let's assume a burglar steals a television, a computer, and a home theater audio system from a U-Me user's house. If the user has eReceipts for these stolen goods, the user can submit the eReceipts to the police, to an insurance company, and to the U-Me system to report the goods as stolen. The U-Me system could, in turn, contact the manufacturer and/or insurance company to inform them these goods were stolen. Because the eReceipt includes all the pertinent information for the product, including serial number, the eReceipt should contain all the information law enforcement and insurance companies need to identify the stolen property if it is recovered.
  • Method 4800 in FIG. 48 provides a way for the U-Me system to provide reminders of warranties that are about to expire, which provides an opportunity for the manufacturer to sell an extended warranty. Even when an extended warranty is not available, the user is still given notice that the warranty is about to expire.
  • a manufacturer can verify the validity of the eReceipt before processing the warranty claim, as shown in method 4900 in FIG. 49.
  • the manufacturer receives a warranty claim (step 4910).
  • the manufacturer then determines whether the eReceipt submitted with the claim is valid (step 4920).
  • when the eReceipt is valid, the warranty claim is processed (step 4940).
  • when the eReceipt is not valid, the warranty claim is rejected (step 4930).
  • the eReceipt can be validated in step 4920 in any suitable way.
  • the manufacturer can compare the eReceipt submitted by the user as part of the warranty claim process to the eReceipt received from the seller at the time of sale. If the two match, the sale is valid.
  • the disclosure and claims herein expressly extend to all suitable mechanisms and methods for determining how to validate an eReceipt to make sure it represents a valid sale. This check prevents a user from receiving warranty service when the user submits a bogus eReceipt.
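• By way of illustration only, the following minimal Python sketch shows one possible form of the validation check in step 4920; the record layout and the names (registered_receipts, validate_warranty_claim) are hypothetical and not part of the disclosure.

    # Hypothetical manufacturer-side eReceipt validation (illustrative only).
    # registered_receipts maps a product serial number to the eReceipt
    # captured when the product was registered at the time of sale.
    registered_receipts = {
        "SN-12345": {"seller": "Acme Store", "date": "2013-12-20",
                     "product_id": "TV-55X", "receipt_id": "R-998877"},
    }

    def validate_warranty_claim(claim_receipt):
        """Return True when the submitted eReceipt matches the eReceipt to
        which the product was registered (step 4920), else False."""
        registered = registered_receipts.get(claim_receipt.get("serial"))
        if registered is None:
            return False  # never registered -> reject the claim (step 4930)
        # Compare the fields that identify the sale; a real system might
        # instead compare embedded metadata or a cryptographic signature.
        fields = ("seller", "date", "product_id", "receipt_id")
        return all(claim_receipt.get(f) == registered.get(f) for f in fields)

    claim = {"serial": "SN-12345", "seller": "Acme Store", "date": "2013-12-20",
             "product_id": "TV-55X", "receipt_id": "R-998877"}
    print(validate_warranty_claim(claim))   # True -> process claim (step 4940)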
• An example of a display for an eReceipt search engine is shown at 5000 in FIG. 50.
• the eReceipts are stored in the user's U-Me account in an eReceipt database using the fields in the eReceipt as indexing information. This allows eReceipts to be searched using powerful database query techniques.
  • Examples of eReceipt queries that could be formulated using the eReceipt search engine 5000 are shown in FIG. 51.
  • the example queries 5100 in FIG. 51 include "All purchases this year", "Home Electronics purchased in the last 5 years", and "All products over $500.” While certain fields are shown in FIG. 50, these are shown by way of example.
  • the eReceipt search engine could include many fields not shown in FIG. 50.
  • the disclosure and claims herein extend to using any suitable fields or search criteria for searching for eReceipts.
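• As a rough sketch of how such queries might be run against the indexed eReceipt fields, consider the following Python fragment; the field names and the query function are assumptions for illustration, not a defined schema.

    from datetime import date

    # Illustrative eReceipt records keyed by indexed fields.
    ereceipts = [
        {"item": "Television", "category": "Home Electronics",
         "price": 899.00, "purchase_date": date(2013, 11, 29), "gift": True},
        {"item": "Blender", "category": "Kitchen",
         "price": 49.99, "purchase_date": date(2013, 3, 2), "gift": False},
    ]

    def query(receipts, category=None, min_price=None, year=None, gift=None):
        """Filter eReceipts on any combination of indexed fields."""
        results = receipts
        if category is not None:
            results = [r for r in results if r["category"] == category]
        if min_price is not None:
            results = [r for r in results if r["price"] >= min_price]
        if year is not None:
            results = [r for r in results if r["purchase_date"].year == year]
        if gift is not None:
            results = [r for r in results if r["gift"] == gift]
        return results

    # "All products over $500" purchased in 2013:
    print(query(ereceipts, min_price=500, year=2013))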
• the U-Me system contemplates users receiving eReceipts from sellers in a format defined by the eReceipt template, such as via an attachment to an e-mail.
  • sellers that do not provide eReceipts in the defined format may still e-mail receipts to users. This is particularly true for online sales.
  • the eReceipt mechanism 178 in the U-Me system can process receipt information in an e-mail and generate a corresponding eReceipt, as shown in method 5200 in FIG. 52, which is preferably performed by eReceipt mechanism 178 in FIG. 5.
  • the eReceipt mechanism processes the e-mail and generates an eReceipt (step 5260), filling in fields of the eReceipt with information in the e-mail.
  • the eReceipt mechanism proceeds to process the e-mail and generate an eReceipt (step 5260) without further input from the user, filling in fields of the eReceipt with information in the e-mail.
• Method 5200 thus provides a way to convert an ordinary e-mail that includes receipt information into an eReceipt that can be stored in the user's U-Me account and that can be searched using the eReceipt search engine.
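• One way such a conversion could work is sketched below in Python; the regular expressions and field names are illustrative and would have to be adapted to each seller's e-mail format.

    import re

    def email_to_ereceipt(email_body):
        """Extract receipt fields from e-mail text into eReceipt fields
        (compare step 5260); purely illustrative."""
        patterns = {
            "order_number": r"Order\s*#?:?\s*(\S+)",
            "total": r"Total:?\s*\$([0-9]+\.[0-9]{2})",
            "item": r"Item:?\s*(.+)",
        }
        ereceipt = {}
        for field, pattern in patterns.items():
            match = re.search(pattern, email_body, re.IGNORECASE)
            if match:
                ereceipt[field] = match.group(1).strip()
        return ereceipt

    body = "Order #: A1B2C3\nItem: Wireless Mouse\nTotal: $24.99"
    print(email_to_ereceipt(body))
    # {'order_number': 'A1B2C3', 'total': '24.99', 'item': 'Wireless Mouse'}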
  • the U-Me system can be used to store vehicle settings for a user and to download those settings to a different vehicle, even a vehicle the user has never driven before. Let's assume a user travels from Kansas City to Chicago via airplane for a business meeting. Upon arriving in Chicago, the user rents a rental car. Let's assume the rental car is U-Me certified. The user can identify the rental car to the U-Me system, which can then determine the type of car, convert any of the user's settings as required for that car, and download the settings to the car. The user thus benefits from having the U-Me system configure the rental car according to his or her settings stored in the user's U-Me account.
  • the various functions with respect to FIGS. 53-63 discussed below are preferably performed by the vehicle mechanism 180 in FIG. 5.
  • method 5300 begins by uploading settings from a vehicle (step 5310).
  • the vehicle settings may then be converted and stored in a universal vehicle template (step 5320).
  • the vehicle settings could also be stored in a device-specific template for the user's vehicle.
  • the conversion of settings in step 5320 may be performed by the conversion mechanism 160 shown in FIG. 5.
  • One suitable universal vehicle template is shown at 5400 in FIG. 54, which is an example of a suitable universal vehicle template 940.
  • the universal vehicle template 5400 includes settings for driver seat position, passenger seat position, driver mirror position, passenger mirror position, rearview mirror position, steering wheel position, audio presets, driver heat/cool settings, passenger heat/cool settings, music playlists, and video playlists.
• audio presets in FIG. 54 can include presets for a satellite radio receiver, and can additionally include presets on the receiver or other audio mechanism in the vehicle that correspond to a user's favorite songs.
  • a vehicle audio system allows a user to define three sets of six presets each for satellite radio stations. Let's further assume the vehicle audio system also allows a user to define three additional sets of presets that correspond to the user's favorite songs, which can be made available to the vehicle either by downloading or by streaming. If a user wants to listen to the '80s satellite radio station, the user can select a preset that is programmed for that station. If the user wants to listen to one of her favorite songs that she has purchased, she can select a preset that corresponds to the desired song.
  • the driver heat/cool settings can include heat and cool settings for the heating and air conditioning system for the driver's side of the car, and can additionally include heat/cool settings for the driver's seat.
  • the heat/cool settings can be programmed as a function of temperature exterior to the car and temperature interior to the car.
  • the user can define several different sets of desired heat/cool settings based on the outside temperature and/or based on the interior temperature of the car.
• a simple example for heating and cooling the driver's seat follows. Let's assume the user specifies that when the inside temperature of the car is less than 30 degrees Fahrenheit (-1 degree Celsius), the seat heater is set to high. When the inside temperature is between 30 and 40 degrees Fahrenheit (between -1 and 4 degrees Celsius), the seat heater is set to medium.
• When the inside temperature is between 40 and 50 degrees Fahrenheit (between 4 and 10 degrees Celsius), the seat heater is set to low. When the inside temperature is between 50 and 80 degrees Fahrenheit (between 10 and 27 degrees Celsius), the seat is neither heated nor cooled. When the inside temperature is between 80 and 90 degrees (between 27 and 32 degrees Celsius), the seat cooler is set to low. When the inside temperature is between 90 and 100 degrees (between 32 and 38 degrees Celsius), the seat cooler is set to medium. When the inside temperature is over 100 degrees (over 38 degrees Celsius), the seat cooler is set to high.
  • This simple example shows how the climate control system can vary according to the environmental conditions inside the vehicle. Of course, the outside temperature could also be taken into account.
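• The schedule above can be read directly as a lookup, as in this minimal Python sketch; the function name and the (mode, level) encoding are illustrative only.

    def seat_climate_setting(inside_temp_f):
        """Map the interior temperature (degrees Fahrenheit) to the seat
        heater/cooler setting from the example schedule above."""
        if inside_temp_f < 30:
            return ("heat", "high")
        elif inside_temp_f < 40:
            return ("heat", "medium")
        elif inside_temp_f < 50:
            return ("heat", "low")
        elif inside_temp_f < 80:
            return ("off", None)      # seat neither heated nor cooled
        elif inside_temp_f < 90:
            return ("cool", "low")
        elif inside_temp_f < 100:
            return ("cool", "medium")
        else:
            return ("cool", "high")

    print(seat_climate_setting(25))   # ('heat', 'high')
    print(seat_climate_setting(95))   # ('cool', 'medium')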
  • the universal vehicle template is preferably defined such that settings from different car manufacturers can all be converted to and from the settings in the universal vehicle template.
  • seat position can be expressed in numerous different ways.
  • the position of seat 5610 can be expressed in terms of the height A of the front of the seat above some reference point, such as the floor of the vehicle; height B of the rear of the seat above some reference point; distance C from the accelerator pedal 5640 to a front of the seat; angle D of the back portion with respect to the bottom portion; distance E from the center of the steering wheel to the seat back; and distance F from a reference point on the seat (such as the back) to some fixed reference point.
  • the way a car represents seat position may vary with the car manufacturer. For example, let's assume one car manufacturer allows adjusting the forward/backward position of the driver's seat over a ten inch span, and uses a stepper motor to do the adjusting. The position of the seat could be expressed as a numerical value for the stepper motor. A different manufacturer may allow adjusting the forward/backward position of the driver's seat over a twelve inch span using an analog motor and a position sensor, where the seat position is stored as a value of the position sensor.
  • the universal vehicle template 5400 preferably describes seat position in a way that is actually descriptive of the position of the seat itself with respect to one or more physical features in the vehicle, not based on some motor settings or sensor readings of any particular manufacturer.
  • the process of a car vendor becoming U-Me certified includes the car vendor providing a device-specific template for the car that includes mapping information for converting the car vendor's settings to the types of settings referenced in the universal vehicle template.
  • the device-specific template will be used to do the conversion in step 5320 in FIG. 53 from the vehicle settings to the equivalent settings in the universal vehicle template.
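• A minimal sketch of such a conversion, using the stepper-motor example above, follows in Python; the constants and function names are hypothetical, standing in for the mapping information a vendor would supply in its device-specific template.

    STEPS_FULL_TRAVEL = 2000   # hypothetical stepper counts over full travel
    TRAVEL_INCHES = 10.0       # physical forward/backward travel of the seat

    def vendor_to_universal(stepper_count):
        """Convert a vendor stepper count to inches from a fixed reference
        point, the kind of physical description the universal template uses."""
        return (stepper_count / STEPS_FULL_TRAVEL) * TRAVEL_INCHES

    def universal_to_vendor(inches):
        """Convert inches back to a stepper count, clamped to this vendor's
        physical range (a different vendor would supply its own mapping)."""
        count = round((inches / TRAVEL_INCHES) * STEPS_FULL_TRAVEL)
        return max(0, min(STEPS_FULL_TRAVEL, count))

    print(vendor_to_universal(1000))   # 5.0 inches from the reference point
    print(universal_to_vendor(5.0))    # 1000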
  • FIG. 55 shows a method 5500 that could be representative, for example, of the steps when a user rents a car that is U-Me certified.
  • the phone is paired to the car (step 5510). Pairing the phone to the car allows the user's phone to send the information identifying the car to the U-Me system, and to authenticate the user to the U-Me system via the user's phone.
• the user settings for this car are downloaded from the user's U-Me account to the car (step 5520).
  • all of the user's preferred settings are made available in the rental car.
  • the result is a car that is configured to the user's taste with minimal effort from the user.
  • a car or other vehicle could include a transceiver that allows the vehicle to directly interact with the U-Me system, instead of going through the user's phone.
  • FIG. 57 shows a block diagram of a phone 5710 coupled via Bluetooth interfaces 5730 and 5740 to a prior art vehicle 5720.
  • the Bluetooth interface 5740 of known cars provides a way to pair the phone 5710 to the vehicle 5720 so the user may use the phone hands-free while driving.
  • the Bluetooth interface 5740 thus communicates with a phone mechanism 5780, which controls the microphone 5750, speaker 5760 and controls of the audio system 5770 during a phone call.
  • the phone mechanism 5780 mutes the radio using the audio control 5770, announces via speaker 5760 the user has an incoming call, and when the user presses a button to answer the call, the phone mechanism 5780 then communicates with the phone 5710 to service the call, including playing the call audio on the speaker 5760 and receiving the user's voice input to the call via microphone 5750.
• a user cannot access any of the engine system 5790 via the Bluetooth interface 5740 that communicates with the user's phone.
• the engine system 5790 includes information in electronic form that could be useful to the user, including mileage 5791, error codes 5792, and warning lights 5793. Because prior art vehicles do not allow the phone to communicate with the engine system 5790, the user cannot use information that is generated in the engine system 5790.
  • the same phone 5710 with its Bluetooth interface 5730 communicates with the Bluetooth interface 5840 to service telephone calls using microphone 5850, speaker 5860, audio control 5870, and phone mechanism 5880, similar to what is done in the prior art system shown in FIG. 57.
  • the Bluetooth interface 5840 has access to the engine system 5890. This means information in the engine system 5890 can be communicated via the Bluetooth interface 5840 to the user's phone, and from there to the user's U-Me account.
  • Information such as mileage 5891, error codes 5892, warning lights 5893, scheduled maintenance 5894, collision detection 5895, and emergency response system 5896 can be made available to the U-Me system by a vehicle such as vehicle 5820 that has been U-Me certified.
  • the U-Me system can perform method 6000 in FIG. 60.
  • a notice can be sent to one or more U-Me certified shops of the needed scheduled maintenance (step 6010).
  • the notified shop(s) then return a bid for performing the scheduled maintenance to the U-Me system (step 6020).
  • the U-Me system then provides the bids to the user (step 6030). In this manner the user can automatically receive bids from one shop or from competing shops with the bids for doing the scheduled maintenance.
• While FIG. 60 shows the specific case of scheduled maintenance, a similar method could be performed when an error code or engine warning light comes on so the user can automatically receive one or more bids for performing the repair that is needed based on the error code or warning light.
  • the U-Me system also provides a central place for vehicle manufacturers to notify customers of recalls or service actions, as shown in method 6100 in FIG. 61.
  • the vehicle manufacturer sends the recall or service action information to the U-Me system (step 6110).
  • the U-Me system then notifies its users who are affected by the recall or service action (step 6120).
  • a method 6200 begins when a U-Me certified shop performs service for the U-Me user (step 6210).
  • the shop uploads to the user's U-Me account the service performed by the shop, with a recommended future reminder (step 6220). For example, if the shop changes the oil, the shop could upload a record of the oil change along with a recommendation that a reminder be set to change the oil in 5,000 miles.
• the U-Me system sets the reminder for the user (step 6230). When the reminder conditions are met (step 6240), the U-Me system provides a reminder to the user (step 6250).
  • Method 6200 is especially useful for service that needs to be performed at specified mileage and/or time intervals, such as oil changes and rotation of the tires.
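• A sketch of the reminder check, in Python, might look as follows; the field names are assumptions, and a reminder is treated as due when either the mileage interval or the time interval has elapsed.

    from datetime import date

    def reminder_due(reminder, current_mileage, today):
        """Return True when the reminder conditions are met (step 6240)."""
        return (current_mileage >= reminder["due_mileage"]
                or today >= reminder["due_date"])

    oil_change = {"service": "oil change",
                  "due_mileage": 55000,          # e.g., 50,000 + 5,000 miles
                  "due_date": date(2014, 3, 1)}  # or a time limit, if sooner

    print(reminder_due(oil_change, 55200, date(2013, 12, 1)))  # True (mileage)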
  • Engine warning information can include, for example, information from error codes 5892, warning lights 5893, a collision detection system 5895, or an emergency response system 5896.
• the engine warning information is provided to the user (step 6350).
• in some cases, the user may first be prompted to authorize additional payment for access to the engine warning information (step 6330); once the user authorizes payment, the engine warning information is provided to the user (step 6350).
• the engine produces an error code that indicates the fuel pump is failing. This could be indicated on the dash by a "service engine soon" light, but this does not give the user any meaningful information regarding what service is required. Having access to this engine warning information could cost a premium above the normal U-Me subscription, so the user could be prompted in step 6330 to authorize an additional charge of, say, $5, to access the information. If the user is on a long highway trip and the "service engine soon" light comes on, the user doesn't know whether the warning is minor or more serious. In the case of a fuel pump that is failing, knowing the fuel pump is failing may allow the user to stop at a repair shop in the next town. In this scenario, paying an extra $5 for the engine warning information is money well spent.
  • the U-Me system provides an improved way to manage photos, including photos that originated from a digital camera or other digital device, along with hard copy photos that have been digitized for electronic storage.
  • the U-Me system improves over the known art of software that adds metadata to photos by providing a people-centric approach to managing photos, as described in detail below with reference to FIGS. 64-79.
  • the methods discussed with respect to FIGS. 64-79 are preferably performed by the photo mechanism 182 shown in FIG. 5.
  • the U-Me system includes a photo system data entry screen, such as screen 6410 shown in FIG. 64 by way of example.
• the photo system data entry screen 6410, like all of the U-Me system, is person-centric. Thus, when a user decides to have the U-Me system manage the user's photos, the user starts by entering data for a particular person in the photo system data entry screen 6410. Fields in the photo system data entry screen 6410 include Name, Preferred Name, Birth Date, Father, Mother, Wedding Day, Spouse, Married Name, Child, Camera, Street, City, State, ZIP and address name. The user can provide a sample photo of the person's face at 6450 to help train the facial recognition engine in the U-Me photo system.
  • Child field includes an Add button 6420 that allows the user to add additional children.
  • Camera field includes an Add button 6430 that allows the user to enter all cameras the user uses to take digital photos.
• A sample photo system entry page 6510 with data filled in is shown in FIG. 65.
• the name of the person is Jim Jones, his preferred name is Jimmy, his birth date is 08/03/1957, his father is Fred Jones, his mother is Sally Jones, his wedding day was 06/21/1983, his spouse is Pat Jones, the Married Name field is empty indicating his married name is the same as what was entered above, he has two children Billy Jones and Sandy Jones, the camera he uses to take photos is a Nikon Coolpix S01, and his address is 21354 Dogwood, Carthage, MO 64836. The name chosen for this address is "Jim and Pat's House."
  • a photo is provided at 6550 that is a good photo of Jim's face.
• the GPS coordinates for the address are computed and displayed at 6540.
• method 6600 monitors the photo system data entry (step 6610) and constructs family relationships from the photo system data entry (step 6620). People naturally think along the lines of family relationships. While known software for adding metadata to a photo allows adding name labels such as "Katie" and performing facial recognition, these labels have no meaning within the context of other people in the photos. The U-Me system, in contrast, constructs family relationships that allow storing and retrieving photos much more effectively than in the prior art.
  • a method 6700 begins by uploading a photo to the user's U-Me account (step 6710). Facial and feature recognition is performed on the photo (step 6720). Facial recognition is known in the art, but the processing in step 6720 preferably also includes feature recognition. Feature recognition may recognize any suitable feature or features in the photo that could be found in other photos. Examples of features that could be recognized include a beach, mountains, trees, buildings, a ball, a birthday cake, a swing set, a car, a boat, etc. Any existing metadata in the photo is extracted (step 6730) and processed to generate indexing information for the photo (step 6740).
• when unrecognized faces and/or features remain (step 6750), the user may be prompted to identify the unrecognized faces and/or features (step 6760).
  • the photo is then stored with the indexing information generated in step 6740 (step 6770).
  • the result is a digital photo stored with indexing information that may be used to retrieve the digital photo later using a powerful database search engine, discussed in more detail below.
• method 6700 gives the user the chance to build up a library of faces and features that the system will have an easier time recognizing next time around.
  • step 6760 might display the photo with various different faces and regions defined. The user could select a face, then enter the name for the person, or if the person will appear in many photos, the user could enter some or all of the person's data in a photo system data entry screen, similar to that shown in FIG. 65. The user could also select various regions of the photo to define features that could be recognized in future photos.
• Let's assume a photo shows a couple on a beach with a cruise ship in the background. The user could click on each face to define information corresponding to those two people, and could also click on the sand on the beach and define this feature as "beach", click on the water and define this feature as "water", and click on the cruise ship and define this feature as "boat."
  • these features may be recognized in other photos, which allows adding indexing information that describes those features automatically when the photo is processed, as shown in method 6700 in FIG. 67.
• the indexing information generated in step 6740 preferably includes data that is not in the metadata for the photo, but is generated based on the metadata and information stored in the user's U-Me account.
• when the U-Me system recognizes a date in the photo metadata that corresponds to Jim & Pat's wedding anniversary, the U-Me system can generate indexing info for the photo that identifies the Event for the photo as Jim & Pat's Wedding Anniversary. Having dates, locations and relationships defined in the user's U-Me account provides a way to add indexing info to a photo that will help to retrieve the photo later using a powerful search engine, discussed in more detail below.
  • a method 6800 reads camera info from the metadata for a photo (step 6810), looks up the photographer name that corresponds to the camera info (step 6820), and adds the photographer's name to the indexing info (step 6830). In this manner, the metadata in the photo that identifies the camera is used to go a step further to identify the person who uses that camera so the photographer can be specified in the indexing information for the photo.
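• A minimal Python sketch of this lookup follows; the camera-to-owner table stands in for the cameras entered via the photo system data entry screen, and all names and values are illustrative.

    camera_owners = {
        ("NIKON", "COOLPIX S01"): "Jim Jones",
    }

    def add_photographer(metadata, indexing_info):
        """Steps 6810-6830: read camera info from the photo metadata, look up
        the corresponding photographer, and add the name to the indexing info."""
        key = (metadata.get("camera_make", "").upper(),
               metadata.get("camera_model", "").upper())
        owner = camera_owners.get(key)
        if owner is not None:
            indexing_info["photographer"] = owner
        return indexing_info

    meta = {"camera_make": "Nikon", "camera_model": "Coolpix S01"}
    print(add_photographer(meta, {}))   # {'photographer': 'Jim Jones'}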
  • FIG. 69 shows sample metadata 6900 that may exist in known digital photos.
• metadata is used herein to mean data that is not part of the visible image in the digital photo but that describes some attribute of the photo.
  • the metadata 6900 in FIG. 69 is shown to include fields for Camera Make, Camera Model, Camera Serial Number, Resolution of the photo, Image Size of the photo, Date/Timestamp, and Geocode Info.
  • the metadata shown in FIG. 69 is shown by way of example. Many other fields of metadata are known in the art, such as the metadata fields defined at the website photometadata.org.
  • the photo metadata disclosed herein expressly extends to any suitable data, whether currently known or developed in the future, that is placed in the digital photo file by the device that took the photo to describe some attribute that relates to the photo.
  • method 7000 in FIG. 70 reads this geocode info from the metadata (step 7010).
  • the geocode info can be in any suitable form such as GPS coordinates or other forms of geocode info that specifies location, whether currently known or developed in the future.
  • the location name is added to the indexing info for the photo (step 7030).
  • Jim Jones takes a photo with his cell phone of his daughter at his house.
  • the geocode info will reflect that the location corresponds to a stored location, namely, Jim & Pat's House. Jim & Pat's House can then be added to the indexing information, which makes retrieval of photos much easier using a photo search engine.
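• One plausible way to match geocode info to a stored, named location is a simple nearest-location test, sketched below in Python; the coordinates and the 100-meter threshold are made up for illustration.

    import math

    named_locations = [
        {"name": "Jim & Pat's House", "lat": 37.1764, "lon": -94.3102},
    ]

    def location_name(photo_lat, photo_lon, max_meters=100):
        """Return the stored location name near the photo's geocode info,
        if any (steps 7010-7030)."""
        for loc in named_locations:
            # Equirectangular approximation; adequate at these distances.
            x = (math.radians(photo_lon - loc["lon"])
                 * math.cos(math.radians(photo_lat)))
            y = math.radians(photo_lat - loc["lat"])
            if 6371000 * math.hypot(x, y) <= max_meters:
                return loc["name"]
        return None

    print(location_name(37.1765, -94.3101))   # "Jim & Pat's House"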
  • method 7100 begins by scanning a hard copy photo (step 7110). Facial and feature recognition is performed (step 7120). A wizard prompts the user to enter indexing information for the photo (step 7130).
  • the indexing information is appended to the scanned image data for the photo (step 7140).
  • the photo with its indexing info is then stored in the user's photo database (step 7150).
  • method 7200 begins by a user invoking a photo indexing info generator (step 7210). The user can then define indexing info for groups of photos or for individual photos (step 7220).
• examples of indexing info 7300 include fields for Recognized Person(s), Age(s) of Recognized Person(s), Recognized Feature(s), Location Name and Event. Note the Recognized Person(s), Age(s) of Recognized Person(s) and Recognized Feature(s) fields could be replicated for as many recognized persons or features as exist in the photo.
• a sample photo file 7400 is shown in FIG. 74.
  • An example of a photo indexing info generator screen 7500 is shown in FIG. 75 to include Date fields, a People field, an Event field, a Location field, and a display of thumbnails of photos. The user specifies a date or range of dates in the Date fields.
  • the user specifies one or more people in the People field.
  • the user specifies location in the Location field.
• An example will illustrate how a user might use the photo indexing info generator in FIG. 75 to generate indexing info for scanned hard copy photos. Let's assume Jim Jones has a stack of 163 wedding-related photos from when he married Pat, including some from the morning of their wedding day showing the wedding ceremony, some that were taken later on their wedding day at the reception, and some taken a week later at a second reception in Pat's hometown.
  • Jim could enter a date range that begins at the wedding day and extends to the date of the second reception, could define an event called "Jim & Pat's Wedding", and could select the 163 thumbnails that correspond to the wedding and reception photos.
  • the user selects the Save button 7560, which results in the photos being saved in Jim's photo database with the appropriate dates and event information as indexing information.
  • the Event and Location fields can include drop-down lists that list events and locations that have been previously defined, along with a selection to define a new event or location. If the user decides to abort entering the indexing info for photos, the user may select the Cancel button 7570.
• A key benefit of generating indexing info for photos is the ability to search for and retrieve photos using the indexing info. No longer must a user search through hundreds or thousands of thumbnails stored in dozens or hundreds of directories with cryptic names that mean nothing to a person! Instead, the user can use a photo search engine to retrieve photos based on people, their ages, family relationships both entered and computed, location, and dates.
• One example of a screen 7600 for a photo search engine is shown in FIG. 76.
• the example shown in FIG. 76 includes fields for Date(s), Event, Location, People, Relationship, and Photographer. Because of the family relationships generated by the U-Me system (e.g., in step 6620 in FIG. 66), searches or queries for photos can now be formulated based on those generated relationships. Examples of photo queries supported by the photo search engine 7600 in FIG. 76 are shown at 7700 in FIG. 77, and include: photos of grandchildren of Jim Jones between the ages of 6-18 months; photos of the wedding of Sandy Jones; and photos taken at the Lake House in 2010. These simple examples illustrate that adding indexing info that relates to people and locations allows for much more powerful querying and retrieving of photos than is known in the art.
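• The following Python fragment sketches how a relationship-aware query such as the grandchildren example could be answered from stored parent/child relationships; the family graph, photo records, and ages are invented for illustration.

    children = {
        "Jim Jones": ["Billy Jones", "Sandy Jones"],
        "Sandy Jones": ["Emma Smith"],
    }

    def grandchildren(person):
        """Derive grandchildren from the stored parent/child relationships
        (compare the relationships constructed in step 6620)."""
        return [gc for child in children.get(person, [])
                for gc in children.get(child, [])]

    photos = [
        {"people": {"Emma Smith"}, "ages_months": {"Emma Smith": 12}},
        {"people": {"Billy Jones"}, "ages_months": {"Billy Jones": 360}},
    ]

    def photos_of_grandchildren(person, min_months, max_months):
        wanted = set(grandchildren(person))
        return [p for p in photos
                if any(name in wanted
                       and min_months <= p["ages_months"][name] <= max_months
                       for name in p["people"])]

    print(photos_of_grandchildren("Jim Jones", 6, 18))  # the 12-month-old photo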
  • the user may want to share photos stored in the user's U-Me account. This can be done using a photo share engine, a sample display of which is shown at 7800 in FIG. 78.
  • the photo share engine is preferably provided as a feature of the sharing mechanism 174 shown in FIG. 5.
  • the user defines criteria for photos to share, then specifies other U-Me users with which to share the photos.
  • the user can also select whether to share the metadata and whether to share the indexing info.
  • the criteria for photos to share can include any suitable criteria, including any suitable criteria that could be entered into the photo search engine for retrieving a photo.
  • the "Share with” field could be a drop-down list with people in the U-Me system, could be a drop-down list of people the user has defined in the user's U-Me account, or could be an e-mail address or other unique identifier for the person.
  • a user could thus enter the e-mail address of a person who is not a U-Me member, and this could result in the U-Me system sending an e-mail to the person inviting the person to join U-Me to view the photos the user is trying to share with the person.
  • Method 7900 in FIG. 79 shows a method for storing a photo with corresponding indexing information.
  • the user takes the photo (step 7910).
  • the U-Me software or app sends the photo with metadata to the user's U-Me account (step 7920).
• the U-Me software or app can send the photo with metadata to the user's U-Me account in any suitable way, including a direct connection from the U-Me software or app to the U-Me system. In the alternative, the U-Me software or app can send one or more e-mails to the user's U-Me account.
  • the U-Me system monitors incoming e-mail, and when a photo is detected, embedded in an e-mail or as an attachment, the U-Me system recognizes the file as a photo.
  • the metadata is processed to generate indexing info (step 7930). Facial and feature recognition is performed (step 7940). Indexing information is generated for all recognized faces and features (step 7950).
  • the photo is then stored with its metadata and with the generated indexing info in the user's photo database (step 7960).
  • a flag is set to prompt the user for the needed input (step 7980). Setting a flag lets the user decide when to enter the needed input.
• Method 7900 could be carried out by a user taking a photo with a smart phone that is running the U-Me app, which results in the photo being automatically uploaded, processed, and stored in the user's U-Me account.
• the generation of location-based indexing info for photos may be done using any suitable heuristic and method. For example, let's assume Jim & Pat live on 40 acres of land. The GPS coordinates for their address may correspond to the mailbox at the road, which could be several hundred yards away from the actual house. Using the U-Me app on his smart phone, Jim could walk the perimeter of their 40 acres, identifying the corner points of the property by selecting a button on the app. When Jim arrives back at the point of origin, the U-Me app will recognize that the various points define a closed area, and will define a region that includes the entire area. Jim could repeat the procedure on the outside corners of the house.
  • Jim could then define the 40 acres as "Jim & Pat's Property” and the house as "Jim & Pat's House.” If Jim takes a photo of a grandson at a birthday party in his living room in his house, the U-Me system will recognize the location as Jim & Pat's House, and will store this location as indexing info with the photo. If Jim takes a photo of the grandson fishing at a pond on the property, the U-Me system will recognize the smart phone is not at the house but is on the property, and will recognize the location as "Jim & Pat's Property", and will store this location as indexing info with the photo. In addition, various heuristics could be defined to generate location descriptors.
• the disclosure herein extends to any suitable location information that could be generated and included as indexing information to describe the location where a photo was taken.
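• One concrete heuristic for the walked-perimeter regions above is a point-in-polygon test against the most specific region first, as in this Python sketch; the coordinates are fabricated for illustration.

    def point_in_polygon(lat, lon, polygon):
        """Standard ray-casting test; polygon is a list of (lat, lon) corners."""
        inside = False
        n = len(polygon)
        for i in range(n):
            lat1, lon1 = polygon[i]
            lat2, lon2 = polygon[(i + 1) % n]
            if (lon1 > lon) != (lon2 > lon):
                t = (lon - lon1) / (lon2 - lon1)
                if lat < lat1 + t * (lat2 - lat1):
                    inside = not inside
        return inside

    regions = [   # most specific region listed first
        ("Jim & Pat's House", [(37.10, -94.30), (37.10, -94.29),
                               (37.11, -94.29), (37.11, -94.30)]),
        ("Jim & Pat's Property", [(37.05, -94.35), (37.05, -94.25),
                                  (37.15, -94.25), (37.15, -94.35)]),
    ]

    def region_name(lat, lon):
        for name, polygon in regions:
            if point_in_polygon(lat, lon, polygon):
                return name
        return None

    print(region_name(37.105, -94.295))   # "Jim & Pat's House"
    print(region_name(37.12, -94.30))     # "Jim & Pat's Property"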
• technology has led to great advances in some areas of the medical field.
• modern medical equipment such as Magnetic Resonance Imaging (MRI) machines allows imaging the body in non-invasive ways with sufficient resolution to allow diagnoses based on graphical images.
• some areas of the medical field have lagged far behind technology.
  • One of these areas is how medical records are generated, stored, and retrieved.
  • Most doctors and other medical providers still use hard-copy files. This is grossly inefficient.
• the medical files are typically kept by each medical provider in each provider's respective offices. For example, let's assume a patient goes to her dentist, who examines the patient and believes the patient needs a root canal on a tooth.
  • the dentist could refer the patient to an Endodontist for the root canal. Because the files are in hard-copy form, the dentist could make a copy of the exam notes and the X-ray of the tooth, and provide these to the patient, who carries these hard copy records by hand to the Endodontist.
  • the medical profession needs to modernize and create electronic files instead of hard copy files.
• Various functions relating to medical info are shown in FIGS. 80-92 and are discussed in detail below. These functions are preferably performed by the medical info mechanism 184 shown in FIG. 5.
  • user medical info 8000 is one suitable example for user medical info 660 in FIG. 6.
  • User medical information 8000 includes semi-private medical info 8010 and private medical info 8020.
  • Semi-private medical info 8010 may include any information the user decides to make available to medical personnel in case of an emergency, while the private medical info 8020 includes the rest of the user's medical info, which can only be shared by the user granting access.
• semi-private medical info 8010 examples include blood type, medical conditions, medications, allergies, warnings, emergency contact info, a living will, a health care power of attorney, and other semi-private medical info.
• Private medical info 8020 is shown in FIG. 80 to include hospital records, doctor records, test results, lab results, diagnoses, treatments, surgeries, and other private medical information. In the most preferred implementation, all of the user's medical information is initially set to be private medical info 8020. The user may then select which of the private medical info 8020 to make available as semi-private medical info 8010.
  • the medical conditions can include any medical condition the user may have. Allergies can include allergies to medications as well as allergies to food, insects, or other items.
  • Warnings could include any suitable warning that medical personnel should have, such as severe allergies that could send the patient into anaphylactic shock, warnings about brittle bones, warning the patient only has one kidney, or any other suitable warning.
  • Emergency contact info can include the name and contact information in a hierarchical list for those who should be notified of the patient's condition.
  • the emergency contact info could include names, addresses, cell phone numbers, e-mail addresses, relationship to the user, etc.
  • a living will can give the medical person knowledge regarding the patient's wishes if the patient is in a vegetative state.
  • a health care power of attorney will inform the medical person to whom the user has given power of attorney for health care in case of the user's incapacity.
  • method 8100 begins by a user defining semi-private medical info (step 8110).
  • the semi-private medical information could be entered by the user, but could also be selected from the private medical info 8020.
• the user also defines the authorization needed to access the semi-private medical info (step 8120).
  • the semi-private medical info defined by the user can be accessed by medical personnel when the authorization defined by the user in step 8120 is satisfied, as discussed in more detail below.
• method 8200 in FIG. 82 shows how the U-Me system can store medical information for a user.
  • a U-Me certified medical person treats a U-Me user (step 8210).
  • All medical info related to the treatment is uploaded to the user's U-Me account (step 8220).
  • the result is the user has electronic copies of all the user's medical info.
  • the user can thus make these records available to a doctor if the user decides to switch doctors without having to request those records from the previous doctor.
  • By automatically storing the user's medical info in the user's U-Me account all of the user's medical info will be present in one place, in electronic form, in a searchable format, which can be easily shared as needed.
  • FIG. 85 shows a sample display of a smart phone 8500.
• Some smart phones, such as Android phones, include security displays that require a user to enter a password or perform some other action to access the functions of the phone.
  • One such display of a security screen 8510 is shown in FIG. 85 to include nine circles. The user may set a pattern of four circles in a path, and when the security screen 8510 is displayed, the user drags a finger over the four circles in the defined path, which then unlocks the phone for use. While this is an effective way to stop a stranger from using the smart phone, it can also prevent a medical person from accessing medical information for the user.
• Known Android smart phones, such as the Samsung Galaxy S3 phone, include an Emergency Call button 8520 on the security screen 8510 that allows someone to bypass the security screen 8510 and make an emergency call.
  • a similar function for accessing medical information could be provided by the phone's operating system or by a U-Me app running on the phone.
  • a Medical bypass button 8530 could be provided on the security screen 8510 that allows a medical person to access the user's medical info stored in the smart phone 8500.
  • the medical bypass button 8530 could have text, or could have a symbol such as a red cross.
• FIG. 83 shows a method 8300 for a medical person to access a user's semi-private medical info 8010.
• the U-Me user is in a car accident, is injured and unconscious, and arrives via ambulance at the emergency room of a hospital for treatment.
  • the user has a smart phone with the U-Me app running, and the smart phone provides a medical bypass button, such as 8530 shown in FIG. 85.
  • a nurse or doctor can press the medical bypass button 8530 on the security screen of the user device (step 8310).
  • the user's semi-private medical info is displayed on the user device to the medical person (step 8330).
  • the same method for the medical person to authenticate to the U-Me system can be performed when a user is at a medical facility. This would prevent someone from stealing the user's phone, driving to a parking lot of a hospital, and accessing the user's semi-private medical info.
  • the display of the user's semi-private medical info to a medical person could also trigger display of the user's emergency contact info to the medical person.
  • This could include a data input screen that allows the medical person to enter the user's condition and the medical person's contact information.
  • the relevant information could then be texted and sent via e-mail according to the information in the user's emergency contact info, or phone numbers could be called by the medical person.
  • the current location is determined (step 8410). This could be done, for example, using the Global Positioning System (GPS) in the user's smart phone.
  • the current location is then checked against a database of medical facilities (step 8420).
  • the database of medical facilities may be stored on the user device, or may be stored in the U-Me system in the cloud and be accessed via the user's device.
  • a method 8600 shows one suitable example for displaying a U-Me user's medical info when the U-Me user is not conscious or otherwise able to provide access to the user's medical info.
  • a medical information sharing engine 8700 in FIG. 87 allows easily sharing a user's medical info with others.
  • the medical info sharing engine is preferably provided as a feature of the sharing mechanism 174 shown in FIG. 5.
  • the sharing engine 8700 includes a data entry screen that allows the user to select medical info in the user's U-Me account, then to specify one or more users to share this medical info.
  • the user had a car accident on 05/14/2013, and was treated in an emergency room on that day. The user puts in a date or date range. In this case, to assure the user catches all relevant medical info pertaining to the injuries from the car accident, the user enters the date of the car accident as the beginning date of a date range and enters a date a week later as the end date of the date range.
• the U-Me system displays all medical information in the user's U-Me account for that date range.
• the example in FIG. 87 shows three items of medical information: X-rays taken on 05/14/2013; ER treatment on 05/14/2013; and a lab report dated 05/16/2013.
  • the user can select the "Share All" box, which will automatically select all three items to share.
  • the user has selected the X-rays and ER treatment for sharing and has not selected the lab report for sharing.
  • the user can then specify one or more parties with whom to share the selected medical information.
  • the user can share medical info with one or more other U-Me users, which may include individuals such as family members and doctors as well as organizations such as hospitals, insurance companies, etc.
• a user we call User1 invokes the medical info sharing engine (step 8810).
• User1 selects medical info to share (step 8820).
• User1 selects one or more U-Me users to share the selected info (step 8830).
• the U-Me system then grants access to the selected medical info for User1 by the selected users (step 8840). Granting access can mean the selected users are given permission by the U-Me system to access the selected medical info stored in User1's U-Me account.
• Granting access could also mean copying the selected medical info from User1's U-Me account to the U-Me account of the selected users.
• method 8900 begins by User1 sharing selected medical info with User2 (e.g., the first doctor) with authorization to allow User2 to share User1's medical info with others (step 8910).
• User2 can then select one or more other U-Me users (e.g., other doctors) to share User1's medical info (step 8920).
• the U-Me system then grants access to the medical info for User1 to the U-Me users selected by User2 (step 8930). Method 8900 is then done.
• method 9000 begins when User1 revokes sharing of User1's medical info to User2 (step 9010).
• the U-Me system revokes access to User1's medical info by User2 (step 9020).
• Revoking access can mean not allowing User2 to access the medical info in User1's U-Me account. Revoking access can also mean deleting any of User1's medical info that was copied to User2's U-Me account.
• the U-Me system also revokes access to medical info of User1 for all users to which User2 shared User1's medical info (step 9030). Method 9000 is then done. A user will be much more likely to share his or her medical info when the user retains control to revoke the access at a later time.
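• A compact Python sketch of this grant/revoke logic, including the cascading revocation of step 9030, follows; the data structures are illustrative, recording for each grant who made it so that re-shares can be traced and revoked.

    grants = {}   # owner -> {grantee: granted_by}

    def share(owner, granted_by, grantee):
        """Grant `grantee` access to `owner`'s medical info (step 8840)."""
        grants.setdefault(owner, {})[grantee] = granted_by

    def revoke(owner, grantee):
        """Revoke `grantee`'s access and, transitively, the access of every
        user `grantee` re-shared with (steps 9020 and 9030)."""
        chain = grants.get(owner, {})
        chain.pop(grantee, None)
        for downstream in [g for g, by in chain.items() if by == grantee]:
            revoke(owner, downstream)

    share("User1", "User1", "User2")   # User1 shares with the first doctor
    share("User1", "User2", "User3")   # User2 re-shares with another doctor
    revoke("User1", "User2")           # removes both User2 and User3
    print(grants["User1"])             # {}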
  • the U-Me system can also track when a U-Me user takes medication, as shown in method 9100 in FIG. 91.
• a user takes meds (step 9110).
• the user indicates to the U-Me system when the user took the meds (step 9120). This could be done, for example, via the U-Me app on the user's smart phone.
• the U-Me system logs the meds taken and the time to the user's U-Me account. This information of when meds are taken can be added to the user's semi-private medical info so medical personnel will know what medications the user took and when.
  • Method 9100 can track not only prescription medications, but nonprescription (over the counter) meds as well. In addition, method 9100 could be used to track the user's consumption of food, vitamins, supplements, herbs, etc.
  • the U-Me system can also provide reminders for a user to take meds at the appropriate time, as shown in method 9200 in FIG. 92.
  • a U-Me certified pharmacy dispenses prescription meds to a U-Me user (step 9210).
  • the pharmacy uploads the prescription med info to the user's U-Me account (step 9220).
• the U-Me system reminds the user to take meds at the prescribed times (step 9240).
  • the user's U-Me system will have an exact record of which prescriptions were filled and when.
  • the meds reminders may include any relevant information to taking the meds, such as "take with food”, “do not drive after taking this medication”, drug interaction warnings, etc.
  • the U-Me system includes a user authentication mechanism 194 shown in FIG. 5.
  • the user authentication mechanism 194 can perform suitable methods for authenticating a user to the U-Me system, including the methods shown in FIGS. 93 and 95. Referring to FIG. 93, method 9300 requires a user to authenticate to the U-Me system (step 9310). Once authenticated, the U-Me system functions are available to the user (step 9320).
• One suitable example of the user authentication mechanism 194 in FIG. 5 is shown as user authentication mechanism 9400 in FIG. 94.
• The user authentication mechanism 9400 can use biometric authentication 9410 as well as non-biometric authentication 9420. Suitable examples of biometric authentication 9410 shown in FIG. 94 include fingerprint, retina scan, voice print, DNA, and other biometric authentication.
  • Biometric authentication refers to authentication related to some aspect of a person's body that is unique for each person. Due to the large amount of sensitive data stored in the user's U-Me account, biometric authentication is preferred to assure unauthorized parties cannot access the user's U-Me account. Biometric authentication is performed by providing a sample, storing the sample as the reference, then comparing the reference sample to future samples submitted for authentication. Thus, a user could scan the fingerprint of the user's right index finger, and the scanned fingerprint could be processed and stored as a reference fingerprint.
  • the user When the user wants to authenticate to the U-Me system, the user scans the same fingerprint, and the newly scanned fingerprint is processed and compared to the stored reference fingerprint. If the new fingerprint scan matches the stored reference fingerprint, the U-Me system is assured the user trying to gain access to the user's U-Me account is, indeed, the user.
• Suitable examples of non-biometric authentication 9420 shown in FIG. 94 include username/password and other non-biometric authentication.
• Non-biometric authentication refers to authentication that is not necessarily unique to a person's body. Most online services today use the username/password paradigm for authenticating a user. The disadvantage of using non-biometric authentication is the possibility of somebody hacking a user's authentication information and accessing the user's U-Me account. The most preferred authentication for the U-Me system is biometric authentication because this assures only the user can access the user's U-Me account. In addition, biometric authentication can become an important way to address potential fears of licensing content such as music to a user instead of to a device.
• When the U-Me system requires biometric authentication, the U-Me system can be assured: 1) the user is who the user claims to be; and 2) the user can only be in one place at one time, so a user's licensed content can be provided to the user at this place and at this time. This should minimize pirating concerns because only the user can access the user's licensed content after the user authenticates to a location or device.
• Fingerprint scanners are becoming more and more common. Many laptop computers now include a small slit near the keyboard over which a person may slide a fingertip, causing the sensor under the slit to scan the person's fingerprint. These fingerprint scanners can be added to many systems without great cost to help the system interact in a secure way with users and with the U-Me system. For example, a fingerprint scanner could be added to vehicles to authenticate the driver to the vehicle. When a user rents a rental car, a slot-type fingerprint scanner on the rental car can be used for the user to authenticate to the car, which can, in turn, authenticate to the U-Me system. When the user is authenticated to the car, the U-Me system knows it can provide the user access to the user's music because the user has scanned a fingerprint to gain access to the U-Me system, and only the user has that fingerprint.
  • Method 9500 in FIG. 95 uses non-biometric authentication.
  • the user authenticates to the U-Me system using non-biometric authentication (step 9510).
  • the U-Me system functions are made available to the user only on one physical device at a time (step 9520). By restricting U-Me functions to one device at a time, this reduces the likelihood of a user sharing the user's username and password to provide access to the user's U-Me account by others, and also reduces the likelihood of a person who hacked a user's U-Me username and password gaining access to the user's U-Me account, at least while the user is logged in to the U-Me system.
  • the iTunes account is tied both to the computer and to the device to which the music may be copied.
• the iTunes account will only function with the user's iPad, not with other devices. This creates a real problem in the event of a crash of the hard disk on the computer system. If the person has not faithfully backed up their hard drive, the person risks losing the licensed content that was in the iTunes account. This is true even when Apple can verify the purchases were made by the user from the iTunes store. Having purchased music tied so tightly to a computer system is a real problem when the computer system fails.
• method 9600 in FIG. 96 shows licensing licensed content to a user (a human person), not to a physical device (step 9610).
  • the licensed content can then be made available to the user on any physical device (step 9620).
  • the piracy concerns go away when the user must use biometric authentication to gain access to licensed content in the user's U-Me account. From a logical perspective, when a user purchases a song, shouldn't the user be able to listen to that song regardless of what device the user may have access to?
• One suitable example for method 9600 in the context of licensed music is shown in method 9700 in FIG. 97.
  • a user purchases music (step 9710).
  • the license for the music is to the user, and is not connected to any physical device (step 9720).
  • the user's licensed music may then be made available by the U-Me system to any U-Me certified music player (step 9730).
  • Method 9700 is then done.
  • a user may define settings related to the user's music, as shown in method 9800 shown in FIG. 98.
  • the user defines music settings by organizing music into favorites, playlists, genres, etc. (step 9810). All of the user's music settings are then made available on any U-Me certified music player (step 9820).
  • Examples of physical music players 9900 are shown in FIG. 99 to include a phone, a tablet computer, a laptop computer, a desktop computer, a portable MP3 player, a car audio system, a home audio system, and other music player.
• FIG. 100 shows an example where a user can purchase a one-week license for $0.49; a one-year license for $0.89; a five-year license for $1.29; a license for the life of the purchaser for $1.59; and a perpetual license for $1.99.
  • the change of licensing to a person and not to any physical device gives rise to the concept of "digital estate planning" where a person may own perpetual rights to licensed content that may be transferred to someone upon the user's death.
  • Such transfers could be handled by the U-Me system automatically once proof of death is verified by a U-Me system administrator according to transfer-on-death rules defined by the user in the user's U-Me account, or according to a user's will or other estate planning documents.
  • One of the features of the U-Me system is the generation of virtual devices for the user in the user's U-Me account that correspond to physical devices the user uses. The goal is to have as much information as possible in the virtual device so if a physical device is lost, stolen, or malfunctions, a similar device may be easily configured using the information in the virtual device.
  • a physical device is scanned for all data, licensed content, and settings (step 10110).
• a virtual device is then generated in the user's U-Me account.
  • the virtual device will have virtually all information needed to configure a new device to have the exact same configuration as a previous device.
• An example will illustrate. Let's assume a user has a Samsung Galaxy S3 smart phone, and takes hours to get the phone configured exactly as he wants, with many apps installed and configured, with many different ringtones assigned to different contacts, with photos for many of his contacts, etc. Let's now assume the user registers this device with the U-Me system, which causes a process, such as method 10100 in FIG. 101, to create a virtual device that corresponds to the Samsung Galaxy S3 phone in the user's U-Me account, with all of the data, licensed content, software, and settings that define how the phone is configured and functions.
• Once the virtual device is created, if the user accidentally flushes his Samsung Galaxy S3 phone down a toilet, the user can go to his phone store, purchase a new Samsung Galaxy S3 phone, install the U-Me app, then log into his U-Me account.
• the U-Me system will ask if the user wants to configure this phone to match the stored configuration in the virtual device in the U-Me account.
  • the new phone is configured to exactly match the old phone, so the user can have the new phone up and running in very little time with the exact configuration on the new phone that the user spent so many hours defining on the old phone.
  • the U-Me app can log when copying something to the virtual device is not allowed, and can provide a list of instructions for the user to follow. For example, let's assume when the user creates a virtual device that corresponds to his Samsung Galaxy S3 phone (discussed above), the U-Me app cannot copy the operating system or two of the eighteen apps installed on the phone to the virtual device in the user's U-Me account. The U-Me app can then provide a list of instructions stored in the U-Me account for configuring a new device.
  • the U-Me system can then configure the phone using the data, licensed content and settings stored in the corresponding virtual device in the user's U-Me account.
  • Known apps may include features that prevent copying their software and settings. This problem can be addressed by providing a process for an app to become U-Me certified.
• a U-Me certified app will have defined interfaces and methods that allow completely copying the entire app, including all of its settings, to a virtual device. In this manner, U-Me certified apps will allow fully automatic copying of the app and its settings to a new device. Once customers start demanding that apps and devices are U-Me certified, the providers of these apps and devices will feel pressure to get their products U-Me certified, which will aid in the widespread proliferation of the U-Me system.
  • a method is needed to synchronize any changes in the device to the virtual device stored in the user's U-Me account.
• An example of such a method is shown at 10400 in FIG. 104, which is preferably performed by the data tracker 162 shown in FIG. 5. All data additions and deletions in all of the user's physical devices are tracked (step 10410).
  • Tracking data as shown in FIG. 104 requires identifying data attributes that describe the data.
• data as discussed in FIGS. 105-109 refers to user data, user licensed content, and user settings. Attributes of the added data are identified (step 10510). The added data is stored to the U-Me account with indexing info that includes the identified attributes (step 10520). The indexing info will help the U-Me system know how the data should be stored, retrieved and handled. Examples of suitable data attributes 10600 are shown in FIG. 106 to include what, where, when, and other data attributes. "What" could refer to the type of data. "Where" could refer to where the data was generated. "When" could refer to a time and date relating to the changed data.
• Referring to FIG. 107, examples of data type attributes 10700 include operating system data, application data, user input, source, licensed content, size, and other data type.
  • examples of location data attributes include a device, the user's U-Me account, and other location attribute.
  • examples of time/date attributes include time, date, expiration, and other time/date attribute.
  • a data file 11000 represents data that is stored in the user's U-Me account.
  • Each data file preferably includes an identifier ID, indexing info, and data.
  • One example of a suitable data file that conforms to the format shown in FIG. 110 is shown in data file 11100 in FIG. 111.
  • the ID is a numerical identifier that uniquely identifies this data record from other data records in the user's U-Me account.
• the indexing information in the example in FIG. 111 includes Jim Jones, .pdf file, TurboTax form, 2012 Federal Tax Return, 142KB, Desktop Computer, and 04/15/2013 at 21:46:23.
  • the data is the .pdf form data.
  • the indexing info shows Jim Jones created this file, a .pdf file, which is a TurboTax form that is part of his 2012 Federal Tax Return, the size of the file is 142 KB, and the file was generated on Jim's desktop computer on 04/15/2013 at 9:46:23 PM.
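  • By way of illustration only, the following sketch shows how a data record conforming to the format of FIGS. 110-111 might be represented and stored; the field names and the store_data function are hypothetical stand-ins, not part of the disclosure.

```python
import itertools

_next_id = itertools.count(1)
_data_store = {}   # hypothetical in-memory stand-in for U-Me account storage

def store_data(data, **attributes):
    """Store data with indexing info (what/where/when attributes) rather than
    a user-chosen directory and filename."""
    record = {
        "id": next(_next_id),               # unique ID, as in FIG. 110
        "indexing_info": dict(attributes),  # the who/what/where/when attributes
        "data": data,                       # the payload, e.g. the .pdf form data
    }
    _data_store[record["id"]] = record
    return record["id"]

# The example of FIG. 111: Jim Jones' 2012 Federal Tax Return form.
rid = store_data(
    b"%PDF-1.4 ...",                        # placeholder for the .pdf bytes
    created_by="Jim Jones",
    file_type=".pdf",
    description="TurboTax form, 2012 Federal Tax Return",
    size="142KB",
    device="Desktop Computer",
    timestamp="04/15/2013 21:46:23",
)
```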
  • the first desktop computers included a file system with a hierarchy of directories and subdirectories where files were stored. The same paradigm exists today more than thirty years later.
  • the directory structure is the same, and the end result is the same - the user must decide where to store data on a computer system, and how to name files being stored. Requiring the user to decide where to store data and what file names to use also requires a user to remember the directory and file name when the user wants to retrieve the data. Most computer users have had the experience of storing a file using a directory and filename the user selected, then having great difficulty locating the file later on because the user cannot remember what directory or subdirectory the user saved the file to, or what name the user gave the file when the file was stored.
  • the U-Me system provides a much easier way to store and retrieve data from a user's U-Me account. Instead of using a directory/subdirectory file system that requires a user to remember directory names and file names, the U-Me system allows a user and/or the U-Me system to add indexing info that describes the data, such as indexing info shown in data file 11100 in FIG. 111.
  • the indexing info may be used to retrieve the data as well using a suitable search engine, such as data search engine 164 in FIG. 5.
  • one suitable example of a data search engine screen 11200 is shown in FIG. 112. Screen 11200 includes fields for Data Created By, Data Type, Date and Device. Note each field includes an Add button to add more fields of that type.
  • Data search engine screen 11200 is an example of a screen that could be provided by the data search engine 164 in FIG. 5.
  • a screen for a search engine could include any suitable field that can be used as indexing info that can be used to locate the stored data.
  • the data search engine 164 in FIG. 5 can provide different screens, including the e-receipt search engine screen in FIG. 50, the photo search engine screen in FIG. 76, and the data search engine screen in FIG. 112.
  • the data search engine disclosed herein includes any suitable way to specify index information for retrieving data stored previously, whether currently known or developed in the future.
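  • A minimal sketch of how a search over such indexing info might work is shown below, reusing the hypothetical in-memory store from the earlier sketch; a production data search engine would of course use a real index rather than a linear scan.

```python
def search_data(store, **criteria):
    """Return records whose indexing info matches every supplied criterion,
    mirroring the fields of screen 11200 (Data Created By, Data Type, Date, Device)."""
    results = []
    for record in store.values():
        info = record["indexing_info"]
        if all(str(v).lower() in str(info.get(k, "")).lower()
               for k, v in criteria.items()):
            results.append(record)
    return results

# e.g. find all .pdf files Jim Jones created:
matches = search_data(_data_store, created_by="Jim Jones", file_type=".pdf")
```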
  • the U-Me system can provide a level of abstraction that hides the underlying file system. This can be done by creating "containers" for different item types. For example, when the user stores the first photo to the user's U-Me account, the U-Me system can recognize from the file type and the indexing info that this is a photo, and can create a container where all photos are stored.
  • the "container” is also a logical construct, and can be implemented using any suitable technology under -the-co vers.
  • a virtual machine that is provisioned to run a user's U-Me account could have a directory/subdirectory/filename file system, but this could be hidden from the U-Me user by the containers defined in the user's U-Me account.
  • the U-Me system can use any disk file system, including a directory/subdirectory structure.
  • the U-Me system preferably provides a layer of abstraction that hides the directory/subdirectory file structure from the U-Me user, and instead provides easy-to-use data entry screens for storing data and search screens for retrieving data.
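  • One way the container abstraction might be sketched is shown below; the type-to-container mapping is purely illustrative, and any suitable mapping could be substituted.

```python
import os

# Illustrative mapping from file type to container; containers are created on demand.
_TYPE_TO_CONTAINER = {".jpg": "Photos", ".png": "Photos",
                      ".mp3": "Music", ".pdf": "Documents"}
_containers = {}   # container name -> list of stored item IDs

def store_in_container(filename, item_id):
    """Hide the underlying file system: route an item to a container by its type,
    creating the container (e.g. "Photos") the first time that type is stored."""
    ext = os.path.splitext(filename)[1].lower()
    container = _TYPE_TO_CONTAINER.get(ext, "Other")
    _containers.setdefault(container, []).append(item_id)
    return container

store_in_container("IMG_0001.jpg", 1)   # first photo creates the "Photos" container
```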
  • a method 11300 is performed by the U-Me system when a new physical device needs to be configured.
  • the new physical device is registered with the U-Me system (step 11310).
  • the new device is configured as a clone of the stored virtual device (step 11322).
  • the new phone can be configured as a clone of the virtual device in the user's U-Me account.
  • Method 11300 configures the new device as closely as possible to the old device, hopefully leaving a minimum of manual configuration for the user to perform. Once the new device is configured, if there are settings on the new device that were not available on the old device, the U-Me system could display those settings to the user for configuration.
  • the U-Me system determines whether a device-specific template exists for the new device (step 11340).
  • the new device configuration is determined from the device-specific template and info in the user's U-Me account (step 11342).
  • the new device configuration is generated from info in the user's U-Me account (step 11344), such as from a universal template, or from converting settings between two device-specific templates.
  • the definition of "similar” in step 11330 in FIG. 113 can be related to whether a device is of the same type or of a different type.
  • the definition of "type” can be related to the physical characteristics of the device, the operational characteristics, and the manufacturer.
  • a Samsung Galaxy S4 phone can be deemed to be of the same type as a Samsung Galaxy S3 phone because they both come from the same manufacturer and run the same Android operating system, while an iPhone could be deemed to be of a different type because it has a different manufacturer and runs a different operating system.
  • an iPhone can be deemed to be of the same type as a Samsung Galaxy S3 phone when the definition of type includes smart phones.
  • the conversion mechanism 160 in FIG. 5 can convert settings between two different types of devices, regardless of how "type" is defined in any specific scenario.
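  • The decision flow of method 11300 described above might be sketched as follows, with the account represented as a plain dictionary; the structure and names are assumptions made for illustration only.

```python
def configure_new_device(new_device, account):
    """Sketch of method 11300: clone from a stored virtual device when a similar
    device exists, else fall back to a device-specific or universal template.

    account: dict with 'virtual_devices' (model -> settings), 'templates'
    (model -> template) and 'user_settings' (dict)."""
    model = new_device["model"]
    if model in account["virtual_devices"]:          # similar device (step 11330 = YES)
        new_device["settings"] = dict(account["virtual_devices"][model])   # step 11322
    elif model in account["templates"]:              # device-specific template (step 11340)
        template = account["templates"][model]       # step 11342
        new_device["settings"] = {k: account["user_settings"].get(k, default)
                                  for k, default in template.items()}
    else:                                            # step 11344: universal template/conversion
        new_device["settings"] = dict(account["user_settings"])
    return new_device

account = {"virtual_devices": {"Galaxy S3": {"ringtone": "classic"}},
           "templates": {}, "user_settings": {"ringtone": "classic"}}
print(configure_new_device({"model": "Galaxy S3"}, account))
```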
  • Method 11400 in FIG. 114 shows an example for handling software that is licensed to a user. The user purchases the software that is licensed to the user (step 11410). For this specific example in FIG. 114, we assume this means the software is not licensed to any physical device.
  • the user provides the download and license information to the U-Me system (step 11420).
  • an installer file could be the download information
  • the license information could be a license key received via e-mail after purchasing the software.
  • the U-Me system installs the software on a virtual machine for the user (step 11430).
  • the user then interacts with the software running on the virtual machine in the U-Me system (step 11440).
  • the functions of the software can be made available to the user on any device the user may be using.
  • the U-Me system then installs the software on a virtual machine for the user (step 11540). The user then interacts with the software running on the virtual machine in the U-Me system (step 11550).
  • A suitable example of a virtual machine 11600 is shown in FIG. 116.
  • the virtual machine 11600 hosts the user's data 120A, the user's licensed content 130A, the user's settings, a phone interface 11610, a tablet interface 11612, a laptop interface 11614, a desktop interface 11616, and the universal user interface 142.
  • These interfaces 11610, 11612, 11614 and 11616 are suitable examples of device interfaces 156 in FIG. 5 that allow the U-Me system to communicate with each of the user's physical devices. Note that any and all of the items shown in FIG. 5 could run on the user's virtual machine 11600, but some of these may execute on other systems that interact with the user's virtual machine 11600 using a defined interface or protocol. All of the functions on the virtual machine are provided by the virtual machine mechanism 158 shown in FIG. 5.
  • a user could interact with software running on a virtual machine in the U-Me system (step 11440 in FIG. 114 and step 11550 in FIG. 115) by invoking the universal user interface 2300 in FIG. 23, then clicking on the Software icon 2330.
  • the universal user interface 2300 could display a screen that shows icons corresponding to all software that is available to run in the user's U-Me account. This could be a sort of "virtual desktop" that provides icon-based display of software available to the user.
  • the universal user interface thus provides an interface to any software running on a virtual machine in the U-Me system.
  • This provides many advantages.
  • the user can access and use the software using any suitable physical device. Because the software runs on a virtual machine, the physical device need not run the software. The physical device merely needs to provide a universal user interface to the software running on a virtual machine in the U-Me system. Thus, a user could be on vacation in Italy, go into an Internet cafe, and use a standard web browser to invoke the login page to the U-Me system.
  • the user could then authenticate to the U-Me system, preferably by biometric authentication if the computer in the Internet cafe has that capability, or via a username and password or other non-biometric authentication if the computer does not have biometric authentication capability.
  • the universal user interface could be displayed in the web browser, which then provides access to all of the user's data, licensed content, and settings, including the user's licensed software, on the computer system in the Internet cafe.
  • Running software on a virtual machine via a universal user interface provides a very powerful platform that may be accessed using any suitable device in any suitable location.
  • a user rents a hotel room that is U-Me certified
  • the user could invoke the universal user interface via a web browser on the television, which would make all of the user's data, content and settings, including software, available in the hotel room.
  • the U-Me system can also give weather notifications to its users.
  • Weather notifications are referred to herein as weather alerts.
  • the weather alerts may be provided, for example, by the alert mechanism 166 shown in FIG. 5.
  • method 11700 allows a user to define one or more geographic regions (step 11710), and for each defined region, the user may select one or more weather alerts (step 11720).
  • The National Oceanic and Atmospheric Administration (NOAA) operates special radio stations in the United States that broadcast weather information continuously. These radio stations work in conjunction with special weather radios that can be activated by a weather alert from the NOAA radio station.
  • Known weather radios allow the user to select counties for which weather alerts are received. Some of the weather alerts defined by NOAA are shown by way of example in FIG. 118. However, known weather radios do not allow the user to customize weather alerts according to the user's preferences.
  • when a person lives in an area where tornados can occur, and wants a weather radio to sound the alarm in the event of a tornado watch or a tornado warning, the person typically sets the location on a weather radio to the county where the person lives. While this may result in tornado watches and warnings waking up the user in the middle of the night, which is desirable so the user can take cover in a shelter, this will also result in many other weather alerts waking up the user in the middle of the night. For example, if the user lives several miles from any creeks, streams or rivers, the user probably does not want to be awakened in the middle of the night by a flash flood watch.
  • a sample input screen 11900 for a weather alert interface is shown to include fields for Alert Type, Geographic Region, Time of Day, Priority, and Send Alert to.
  • the weather alert interface provides a way for a user to define many different types of weather alerts so the user is not repeatedly awakened in the night for weather alerts in which the user has no interest.
  • Sample weather alerts defined by a user are shown in FIGS. 120-122.
  • the weather alert in FIG. 120 is for a Tornado Alert, which is to sound when a tornado alert is issued for this county and neighboring counties, the time of day for the warning is Any, the priority is High, and the Send Alert To specifies to send the alert for the tornado warning to all of the user's registered devices using all message types.
  • FIG. 121 is a weather alert for a Flash Flood Watch.
  • the geographic region may be defined as “within 10 miles of my current location”; the Time of Day is Any; the priority is low; and the Send Alert To is e-mail and text. Because the priority is low, the user will not be awakened by a flash flood watch.
  • a user can define a geographic region in any suitable way, including by specifying defined regions like counties, or defining a radius from the user's current location. By defining a radius, the U-Me system can dynamically adjust as the user moves.
  • the U-Me system could convert the "10 mile radius of my current location" specified in FIG. 121 to include flash flood watches for both Jasper County as well as Newton County.
  • the conversion of settings is preferably performed by the conversion mechanism 160 shown in FIG. 5. Because the low-priority Flash Flood Watch weather alert is sent to the user via e-mail and text, the user will not be awakened by this alert, but the information will be available via e-mail and text when the user awakens.
  • FIG. 122 is a weather alert for a Wind Chill Watch.
  • the geographic region is set to a 50 mile radius of my current location, the time of day is Any, the priority is Low, and the alerts are sent via e-mail and text.
  • the ability to truly customize weather alerts allows the U-Me system to benefit from the weather alerts provided by the NOAA system while giving the user flexibility so the user will not be repeatedly awakened by weather alerts for which he has no interest.
  • a method 12300 shows how the U-Me system processes weather alerts.
  • the U-Me system receives a weather alert for a geographic region (step 12310).
  • the weather alert received in step 12310 could be, for example, a weather alert from NOAA, but could also include other weather alerts from other sources as well.
  • One of the users that has an alert set for this geographic region is selected (step 12330).
  • the U-Me system provides an additional layer of technology atop the inefficient NOAA weather radio system to allow the user to customize weather alerts to the user's liking. In this manner the user can be awakened for any weather alert he chooses, yet can remain asleep for other weather alerts for which the user does not care to be awakened.
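  • A sketch of the per-user filtering performed by method 12300 follows; the preference fields mirror the screen of FIG. 119, while the data structures themselves are hypothetical assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class AlertPreference:
    alert_type: str    # e.g. "Tornado Warning" (one of the NOAA types of FIG. 118)
    counties: set      # geographic region, resolved to counties by the conversion mechanism
    priority: str      # "High" wakes the user on all devices; "Low" goes to e-mail/text
    channels: list     # e.g. ["all devices"] or ["email", "text"]

def process_weather_alert(alert_type, county, user_prefs):
    """Method 12300 sketch: fan a regional alert out only to users whose
    preferences match the alert type and geographic region."""
    deliveries = []
    for user, prefs in user_prefs.items():      # step 12330: each user with an alert set
        for p in prefs:
            if p.alert_type == alert_type and county in p.counties:
                deliveries.append((user, p.priority, p.channels))
    return deliveries

prefs = {"Jim": [AlertPreference("Tornado Warning", {"Jasper", "Newton"},
                                 "High", ["all devices"]),
                 AlertPreference("Flash Flood Watch", {"Jasper"},
                                 "Low", ["email", "text"])]}
print(process_weather_alert("Flash Flood Watch", "Jasper", prefs))
```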
  • FIGS. 124-130 Various functions relating to home automation are shown in FIGS. 124-130 and discussed in detail below. These functions are preferably performed by the home automation mechanism 186 shown in FIG. 5.
  • FIG. 8 shows that user settings 140 may include home automation settings 850.
  • An example of suitable home automation settings are shown at 12400 in FIG. 124.
  • the example home automation settings shown in FIG. 124 include appliance settings 12410, Heating/Ventilation/Air Conditioning (HVAC) settings 12420, light settings 12430, security settings 12440, home theater settings 12450, programs 12460, and other home automation settings 12470.
  • suitable appliance settings 12410 are shown in FIG. 125 to include coffee pot settings 12510, refrigerator settings 12520, alarm clock settings 12530, and other appliance settings 12540. Some appliances already have IP addresses, and some people think all devices that plug in will have IP addresses someday. The U-Me system contemplates the future, when a user may want to define settings in the user's U-Me account for any and all of the user's appliances.
  • HVAC settings 12420 are shown in FIG. 126 to include thermostat settings 12610, heater settings 12620, air conditioning settings 12630, fan settings 12640, air cleaner settings 12650, and other HVAC settings 12660.
  • Thermostat settings 12610 may include settings for different thermostats in different zones.
  • Heater settings 12620 may include the heat temperature setting on a thermostat.
  • Air conditioning settings 12630 may include the cool temperature setting on a thermostat.
  • Fan settings 12640 may include turning various fans on or off, or varying the speed of fans, including fans on heaters, air conditioners, ceiling fans, stove exhaust fans, bathroom exhaust fans, etc.
  • fan settings 12640 could specify to run a bathroom exhaust fan when the bathroom light is turned on and to keep the exhaust fan running for ten minutes after the bathroom light is turned off.
  • Air cleaner settings 12650 could include settings to run the air cleaner for a specified continuous period at night, then intermittently during the day according to a defined schedule.
  • Examples of suitable light settings 12430 are shown in FIG. 127 to include kitchen light settings 12710, master bedroom light settings 12720, master bathroom light settings 12730, living room light settings, and others.
  • Light settings 12430 can include settings for any light or group of lights.
  • the user may have a motion sensor near the front door of the house to detect when somebody is approaching the front door.
  • Light settings 12430 could include a setting that turns on the porch light when the motion detector detects motion near the front door.
  • Light settings 12430 could include any suitable condition or trigger for turning lights on or off.
  • exterior security floodlights could be illuminated at dark and be kept on until dawn, or could be selectively turned on and off based on one or more motion sensors.
  • Examples of suitable security settings 12440 are shown in FIG. 128 to include an arm code 12810, a disarm code 12820, a bypass when arming condition 12830, a lock doors when arming condition 12840, and other security settings 12850.
  • Examples of home theater settings 12450 are shown in FIG. 129 to include news settings 12910, sporting events settings 12920, TV series settings 12930, movie settings 12940, and other home theater settings 12950.
  • the home theater settings 12450 allow a user to define a "scene" for various types of viewing experiences.
  • the settings shown in FIG. 129 could each include settings for the home theater audio system, for lights in the TV room and possibly adjacent rooms, for opening and closing drapes or blinds on one or more windows, etc.
  • the user can select one of these defined scenes, and the home automation controller then applies all of the settings for that scene.
  • Programs 12460 shown in FIG. 124 can include any suitable program or logic to cause different things in the home automation controller to occur based on some specific condition, event, or relationship.
  • An example of a suitable program for a home automation controller is: When arming alarm (ARM), set Thermostat1 to Day High Setting. When the user presses buttons on the home automation keypad to set the alarm as the user is leaving the house, the program above will cause the settings in Thermostat1 to change since nobody is home.
  • Another example of a suitable program is: When GARAGE DOOR opens, turn on Garage Lights.
  • the programs 12460 can include any suitable logic, setting, or combination of settings for home automation, whether currently known or developed in the future.
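  • The two example programs above might be encoded along the following lines; the event names, rule table, and controller callable are illustrative assumptions only.

```python
# Hypothetical event -> actions rule table for programs 12460.
RULES = {
    "ARM":              [("Thermostat1", "set", "Day High Setting")],
    "GARAGE_DOOR_OPEN": [("Garage Lights", "turn", "on")],
}

def on_event(event, controller):
    """Run every action programmed for an event; a real home automation
    controller would translate each action into a command to the equipment."""
    for device, command, value in RULES.get(event, []):
        controller(device, command, value)

# Arming the alarm changes the thermostat, since nobody is home:
on_event("ARM", lambda dev, cmd, val: print(f"{dev}: {cmd} {val}"))
```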
  • FIG. 130 shows a display of sample home automation settings 13000 that could reside in a user's U-Me account.
  • Home automation settings 13000 for the simple example shown in FIG. 130 include appliance settings 12410, HVAC settings 12420, and light settings 12430.
  • the appliance settings 12410 in FIG. 130 include "start the coffee pot brewing at 8:00 AM,” “turn on the refrigerator icemaker,” and “set the alarm clock for an 8:00 AM alarm.”
  • the HVAC settings 12420 in FIG. 130 include cool and heat settings from 8:00 AM to 10:00 PM; cool and heat settings from 10:00 PM to 8:00 AM; the furnace fan set to auto on the thermostats; and the heat/cool mode set to auto on the thermostats.
  • when a user stays at a rental condo that is U-Me certified, the U-Me system will have all of the templates that pertain to all of the U-Me certified equipment in that condo.
  • all of the user's home automation settings could be programmed into corresponding equipment at the rental condo. This could even extend to the arm/disarm code for a security system, so the user can use the same codes she uses at home to arm and disarm the security system at the rental condo. This can be done even if the security system at the condo is a totally different brand and type of security system.
  • the conversion mechanism 160 in the U-Me system can convert the user's settings at home to corresponding settings in the rental condo.
  • the U-Me system thus allows "me to be me, anywhere" by making a user's settings in the U-Me account available wherever the user happens to be, on whatever device or equipment the user is using.
  • the term “license key” means any information the user may enter to gain access to a computer program.
  • two pieces of information are needed: 1) the program itself; and 2) the license key that allows the user to install the program.
  • Most users today purchase software online, which results in an installation package being downloaded to the user's computer.
  • the license key is typically sent to the user via e-mail. Thus, when the user clicks on the installation package, the user is then prompted to enter the license key. Once the license key is entered, the software can be installed and run. In other scenarios, the software can be completely installed, but the license key (or activation code) must be entered after installation to enable all of the features of the software.
  • the license management mechanism 188 can perform a method such as method 13100 in FIG. 131.
  • the user downloads software (step 13110).
  • the user enters the license key to install or activate the software (step 13120).
  • the license management mechanism 188 has both crucial pieces of information that will be needed if the software ever needs to be installed or activated in the future.
  • a license management entry is thus created in the user's U-Me account for the software and the corresponding license key.
  • the license management entry shown in FIG. 132 includes a license key 13210 and the software 13220.
  • the software 13220 can include a fully downloaded computer program ready to install, or can include an installation package that includes a smaller piece of code that, in turn, downloads and installs the software from an online provider.
  • the license management mechanism 188 can track license information for all licensed content. For example, when music is purchased, the license management mechanism will create an entry that includes the music file as well as license information for the music file. The license management mechanism thus monitors and stores licensed content and corresponding license information so this information can be accessed in the future. This can be useful, for example, when performing an audit to assure the user has licenses for all licensed content, and when transferring licensed content to another user, as discussed in more detail below.
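  • A license management entry per FIG. 132 might look like the sketch below; the field names are hypothetical, and the same record shape is reused for music and other licensed content.

```python
# Each entry keeps the license key together with the software or content itself,
# so both crucial pieces are on hand for any future install or activation.
license_entries = []

def record_license(name, license_key, payload, kind="software"):
    entry = {"name": name,
             "license_key": license_key,   # e.g. license key 13210
             "payload": payload,           # e.g. software 13220, or a music file
             "kind": kind}
    license_entries.append(entry)
    return entry

record_license("TurboTax 2013", "ABCD-1234-EFGH", b"<installer bytes>")
record_license("Some Song.mp3", "purchase receipt #98765", b"<audio bytes>", kind="music")
```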
  • Examples of alerts that can be provided by the alert mechanism 166 in FIG. 5 include birthdays, anniversaries, periodic reminders, seasonal reminders, weather alerts, medication alerts, and other alerts.
  • Examples of periodic reminders are shown at 13400 in FIG. 134 to include a reminder for a user to take thyroid medication every day at 7:30 AM, a reminder to check oil on all vehicles on the 1st of each month, a reminder to pay the house payment on the 5th of each month, a reminder to check the air filter on the furnace each quarter, a reminder to pay estimated taxes on specified dates, a reminder to file income tax returns annually on April 15th, and other periodic reminders.
  • Periodic reminders can include any reminder for any type of information, event, or task.
  • Examples of seasonal reminders are shown at 13500 in FIG. 135 to include a reminder each October 1st to remove hoses from the hose bibs (so a freeze does not cause the hose bibs to burst), a reminder each April 1st to clean out the roof gutters, a reminder each March 15th to take the cover off the
  • Another function that can be provided by the U-Me account is the automated destruction of data, content or settings based on defined criteria in a retention/destruction policy by the retention/destruction mechanism 170 shown in FIG. 5.
  • a user defines criteria in the user's retention/destruction policy (step 13610).
  • the U-Me system then destroys the data and/or licensed content and/or settings that satisfy the criteria for destruction specified in the user's Retention/Destruction policy (step 13630).
  • a sample retention/destruction policy is shown at 13700 in FIG. 137. This shows to retain tax returns for five years; to destroy specified data upon my death; to destroy a virtual device that is more than two generations back; and other retention/destruction criteria. Note that destroying data in this context means deleting the data from the user's U-Me account. Instead of specifying to retain tax returns for five years, the user could instead specify to destroy tax returns after five years. These may accomplish the same end result in the U-Me system, or may provide completely different results.
  • the U-Me system could prevent the deletion and provide a message to the user that deletion of the tax return is not allowed due to the retention criteria the user specified in the Retention/Destruction policy.
  • the user specifies to delete tax returns after five years, this would not necessarily prevent the user from deleting a tax return that is less than five years old.
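  • A retention rule like the tax-return criterion of FIG. 137 could be enforced along these lines; the policy encoding and date handling are illustrative assumptions, not part of the disclosure.

```python
from datetime import date

# Hypothetical encoding of part of the policy of FIG. 137.
policy = [{"category": "tax return", "retain_years": 5}]

def deletion_allowed(item, today):
    """Enforce retention: refuse to delete a tax return less than five years old,
    so the U-Me system can instead notify the user that deletion is not allowed."""
    for rule in policy:
        if item["category"] == rule["category"]:
            age_days = (today - item["created"]).days
            if age_days < rule["retain_years"] * 365:
                return False
    return True

print(deletion_allowed({"category": "tax return", "created": date(2012, 4, 15)},
                       today=date(2014, 9, 2)))   # False: still within retention
```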
  • the U-Me system also provides a licensed content transfer mechanism 168 in FIG. 5.
  • a suitable example of the licensed content transfer mechanism 168 is shown as licensed content transfer mechanism 13800 in FIG. 138, which lists possible transfers of licensed content, including gift, sale, inheritance, and other transfers.
  • licensed content in the U-Me system is preferably licensed to a user and not to any physical device, the licensed content becomes digital personal property that can be transferred by gift, sale, inheritance, etc.
  • a user buys a perpetual license for a song, which is then downloaded to the user's U-Me account. Let's assume the user tires of the song and wants to sell the song to someone else.
  • the U-Me system receives a request from User1 to transfer licensed content to User2 (step 13910).
  • the U-Me system then transfers the licensed content from User1's U-Me account to User2's U-Me account (step 13920).
  • the U-Me system can include appropriate controls to verify User1's license to the licensed content before the transfer, and to transfer the license with the licensed content to User2.
  • a user that tires of a song could list the song for sale on eBay, and when the song sells to another U-Me user, the seller could request the transfer of the song to the buyer, which is then carried out by the U-Me system.
  • the second criteria in the Retention/Destruction policy 13700 shown in FIG. 137 specifies to delete specified data upon my death.
  • the Retention/Destruction mechanism 170 thus allows a user to specify certain information that will be destroyed when the U-Me system receives proof of the user's death. For example, if a user is single, upon the user's death, the user's tax returns are no longer relevant, and can be automatically destroyed. An imaginative reader can imagine many other scenarios where automatic destruction of data upon a user's death would be desirable.
  • the U-Me system supports the concept of digital estate planning. This means for content that is licensed to the user in perpetuity, the user can define transfer-on-death rules.
  • a method 14000 is shown in FIG. 140.
  • the U-Me system receives proof of the death of User1 (step 14010).
  • the U-Me system then reads the transfer-on-death rules the user may have defined (step 14020).
  • the U-Me system then transfers User1's licensed content to one or more other U-Me users according to User1's transfer-on-death rules (step 14030). In this manner, content for which the user has a perpetual license may be transferred automatically to one or more different users upon the user's death.
  • FIG. 141 shows a suitable method 14100 for performing an audit.
  • the U-Me system reads the licensed content in the user's U-Me account (step 14110).
  • when the user has a license for all of the licensed content (step 14120=YES), method 14100 reports the user has a license for all the user's licensed content (step 14130).
  • the display provided to the user in step 14140 may include controls to delete licensed content for which the user does not have a license, may include controls to acquire a license, and may allow the user to put off acquiring the needed license(s) for some period of time.
  • the U-Me system can enforce the license audit with a deadline.
  • the U-Me system provides to the user a deadline for acquiring the missing license(s) for licensed content (step 14210).
  • any unlicensed content in the user's U-Me account is then deleted (step 14230).
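  • The audit and deadline enforcement of methods 14100 and 14200 might be sketched as follows, with the account again represented as a plain dictionary for illustration; the grace period shown is an assumption.

```python
from datetime import date, timedelta

def audit_licensed_content(account, today, grace_days=30):
    """Report content lacking a license (method 14100) and, once the per-item
    deadline passes, delete the unlicensed content (method 14200)."""
    missing = [c for c in account["content"]
               if c["name"] not in account["licenses"]]
    if not missing:
        return "user has a license for all licensed content"   # step 14130
    for c in missing:
        c.setdefault("deadline", today + timedelta(days=grace_days))  # step 14210
        if today > c["deadline"]:
            account["content"].remove(c)                        # step 14230: delete
    return missing

acct = {"content": [{"name": "SongX"}], "licenses": {"AppY": "key"}}
print(audit_licensed_content(acct, today=date(2014, 9, 2)))
```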
  • the U-Me system includes a sub-account mechanism 190 that allows a user to set up sub-accounts to the user's U-Me account.
  • FIG. 143 shows a U-Me sub-account mechanism 14300 that is one suitable example of the sub-account mechanism 190 shown in FIG. 5, which includes one or more master accounts, one or more sub-accounts, access controls, and a social media monitor.
  • a Mom and Dad are set up as master accounts, and their kids are set up as sub-accounts.
  • the users who have master accounts may define access controls for the sub-accounts, and may further define parameters for the social media monitor.
  • parents can control how the kids use their U-Me sub-accounts.
  • This same scenario could be used in a classroom setting, where the teacher has a master account and the students all have sub-accounts.
  • the sub-account mechanism can define an account as a sub-account to establish a relationship with the master account without limiting access of the user in the user's account.
  • the sub-account mechanism can be used to establish a relationship between the U-Me users.
  • the teacher could post homework assignments to her U-Me account, and the homework assignment could then be made available to all sub-accounts.
  • Method 14400 in FIG. 144 starts by defining one or more master accounts (step 14410). For each master account, one or more sub-accounts may be defined (step 14420). Access controls may be defined for the sub-accounts (step 14430). In addition, social media activity of the sub-accounts may be reported to the master account(s) (step 14440).
  • the sub-account concept is a powerful tool for creating relationships between U-Me users and for potentially defining how users can access and use the U-Me system.
  • the U-Me system also includes a credit card monitoring mechanism 192 shown in FIG. 5.
  • the credit card monitoring mechanism 192 preferably monitors when a user makes a purchase with a credit card online, and creates a log of all websites where the user used each credit card.
  • the U-Me system detects when a user enters credit card info on a web page (step 14510).
  • Method 14500 then confirms the user completes the purchase (step 14520).
  • the website for the web page where the user made the purchase is determined (step 14530).
  • when an entry already exists in the credit card log for this credit card, the website is added to the entry (step 14550).
  • when no entry exists, an entry is created in the credit card log for this credit card (step 14560), and the website is added to the entry (step 14550).
  • One suitable example of a credit card log is shown at 14600 in FIG. 146 to include entries for each credit card that specify the credit card name, credit card number, and expiration date, with a list of websites where the credit card was used.
  • Log 14600 in FIG. 146 shows entries 14610, 14620 and 14630 for three credit cards, respectively, with each entry including a list of websites where the credit card was used.
  • “about to expire” can be defined in any suitable way, such as one month before expiration.
  • the user is prompted that the credit card is about to expire (step 14720).
  • step 14810 the user views the display of websites where the credit card was used (step 14810).
  • the user selects one of the websites (step 14820).
  • the user updates the credit card info on the selected website (step 14830).
  • the credit card monitoring mechanism in the U-Me system allows the user to update the credit card information on some or all of the websites where the credit card was used before the credit card expires. If the use of the credit card was for a one-time purchase at a website, the user will probably not want to update the credit card information for that website. But when the use was for a website the user uses often (e.g., PayPal, Amazon.com), or when the use is for a recurring bill (e.g., electric bill, phone bill), being prompted that a credit card is about to expire can provide important benefits, such as making sure a bill payment is not missed due to an expired credit card.
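  • The credit card log of FIG. 146 and the expiration check might be sketched as follows; the entry layout and the 30-day definition of “about to expire” are illustrative assumptions.

```python
from datetime import date

card_log = {}   # card number -> {"name", "expires", "websites"}

def record_card_use(number, name, expires, website):
    entry = card_log.setdefault(number, {"name": name, "expires": expires,
                                         "websites": set()})   # create entry (step 14560)
    entry["websites"].add(website)                             # add website (step 14550)
    return entry

def cards_about_to_expire(today, days=30):
    """Cards expiring within the window, so the user can be prompted to update
    the card info on recurring-bill websites before a payment is missed."""
    return [e for e in card_log.values() if 0 <= (e["expires"] - today).days <= days]

record_card_use("4111...1111", "Visa", date(2014, 10, 1), "amazon.com")
record_card_use("4111...1111", "Visa", date(2014, 10, 1), "paypal.com")
print(cards_about_to_expire(today=date(2014, 9, 15)))
```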
  • the U-Me system also includes a macro/script mechanism 172 shown in FIG. 5.
  • One suitable example of the macro/script mechanism 172 is shown at 14900 in FIG. 149 to include a user interface monitor, a macro record mechanism, a script record mechanism, a macro playback mechanism, a script playback mechanism, and scheduled tasks.
  • method 15000 starts the user interface monitor (step 15010).
  • the user interactions are monitored (step 15020).
  • the user interface monitor is stopped (step 15030).
  • a macro or script is then generated that performs the monitored user interactions (step 15040). For example, let's assume a U-Me user receives their bank statements from Wells Fargo via the user's online account with Wells Fargo. The user needs to retrieve the bank statement each month.
  • the user could start the user interface monitor in step 15010, then perform all of the actions to retrieve and store the bank statement to the user's account, which might include the steps of: the user clicks on a bookmark for the Wells Fargo website, which brings up a login page; the user enters the username in the username field; the user enters the password in the password field; the user selects a “login” button; the user selects the account of interest; the user clicks on the link to retrieve this month's bank statement for the account of interest; the bank statement is displayed in a separate window as a .pdf file; the user then saves the .pdf file to the user's U-Me account.
  • a task may be scheduled using that macro or script, as shown in method 15100 in FIG. 151.
  • a macro or script is selected (step 15110).
  • One or more times are scheduled to run the macro or script (step 15120).
  • the user could use method 15100 to schedule the "Wells Fargo Bank Statement" macro or script to run on the 4th of each month.
  • a method 15200 in FIG. 152 represents a particular example for the case of automatically retrieving a bank statement as discussed above.
  • a script is defined for retrieving an online bank statement (step 15210). The defined script is then run on the 4th of each month, which retrieves the bank statement for that month and stores the bank statement to the user's U-Me account (step 15220).
  • the macro/script mechanism 172 in the U-Me system can have more capabilities than merely repeating a user's functions.
  • a simple example will illustrate. Let's assume the user retrieves the bank statement for July 2013 on August 10th in defining the script in step 15210 in FIG. 152, and schedules the macro or script to run on the 4th of each month.
  • when the macro/script mechanism 172 goes to the user's online Wells Fargo account to retrieve the August bank statement on September 4th, the macro/script mechanism has to have the intelligence to retrieve the August bank statement, not to simply retrieve again the July bank statement.
  • the script/macro mechanism may thus include intelligence that allows detecting patterns and learning from those patterns.
  • the macro/script mechanism can recognize the pattern and know to retrieve the statement corresponding to the month previous to the month when the macro/script is run.
  • the user may provide additional input after the macro/script is recorded to direct how the macro/script performs its functions.
  • the macro/script mechanism allows automating routine tasks so the U-Me system can perform these routine tasks automatically for the user.
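  • The month-aware intelligence described above might be sketched as follows; the fetch and save callables stand in for the recorded login/navigate/download steps and are assumptions for illustration.

```python
from datetime import date

def previous_month(today):
    """The learned pattern: a script run on the 4th fetches the prior month's statement."""
    if today.month > 1:
        return today.year, today.month - 1
    return today.year - 1, 12

def bank_statement_script(fetch, save, today):
    year, month = previous_month(today)   # run on September 4th -> August statement
    pdf = fetch(year, month)              # recorded steps: login, pick account, download
    save(f"bank statement {year}-{month:02d}.pdf", pdf)

bank_statement_script(lambda y, m: b"%PDF...",
                      lambda name, data: print("stored", name),
                      today=date(2013, 9, 4))
```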
  • a method 15300 shows how the user may access the user's data and/or licensed content and/or settings that are stored in the user's U-Me account. The user authenticates to the U-Me system (step 15310).
  • the user identifies a location that is U-Me certified (step 15320).
  • the U-Me system reads the location settings and compares the location settings with the user settings (step 15330).
  • the conversion mechanism in the U-Me system converts the user settings to suitable location settings (step 15350).
  • the conversion of settings is preferably performed by the conversion mechanism 160 shown in FIG. 5.
  • the user settings that correspond to the location are then downloaded to devices at the location (step 15360).
  • the user settings in the user's U-Me account can be downloaded to devices at the location (step 15360).
  • Method 15300 could be representative of any suitable location, including a vehicle, a hotel room, a rental condo, etc.
  • the data tracker 162 in FIG. 5 tracks data changes and generates indexing information that is stored with a file to help retrieve the file using a search engine.
  • the data search engine 164 in the U-Me system allows formulating very powerful queries using drop-down lists, dates or date ranges, key words, dollar amounts, devices, etc. in a very intuitive, plain-English type of interface.
  • a U-Me user thus does not have to be a database expert who is familiar with Structured Query Language (SQL) in order to use the data search engine 164.
  • Some sample queries that could be submitted via the data search engine 164 are shown at 15400 in FIG. 154 to include: purchases over $100 this year; phones I have owned since 06/12/2012; electronic devices I currently own; total charges on credit card XYZ for the second quarter of this year; and warranties that will expire in the next six months.
  • the sample queries in FIG. 154 show that the data search engine 164 supports complex queries the user may formulate without being an expert at formulating database queries.
  • licensed content is licensed to a person and not to any particular device. In some circumstances there needs to be separation between ownership of the software and who is licensed to use the software. This is especially true in a company or corporate environment, where a company or corporation purchases licensed content for use by its employees.
  • a company purchases licensed content (step 15510).
  • An administrator within the company specifies a person who is the licensee for the licensed content (step 15520).
  • the licensee is then licensed to use the licensed content (step 15530). What happens if the licensee quits or is fired? This is shown in method 15600 in FIG. 156.
  • the person who is the licensee quits working for the company (step 15610).
  • the administrator within the company deletes the person from the licensing info for the licensed content (step 15620).
  • the person is no longer authorized to use the licensed content (step 15630).
  • the company owns the licensed content, and can authorize a person to use the licensed content, and can also remove authorization for the person to use the licensed content. This is especially handy when a company purchases software with a site license or with a license that covers many users.
  • the world envisioned by a fully-functioning Universal Me system as described in detail above is much different than the world in which we currently live. Implementing the U-Me system will take many years.
  • One aspect of evolving towards the U-Me system is the need to put existing information that does not exist in an electronic format into the user's U-Me account.
  • a method 15700 shown in FIG. 157 shows how existing physical items can be converted to electronic format and stored in a user's U-Me account.
  • An electronic file for a physical item is created (step 15710).
  • This can be any suitable physical item that is converted to any suitable electronic file format.
  • a hard copy medical record could be scanned to an electronic image file.
  • a DVD of a movie the user owns could be ripped to a .mov file.
  • Music from a CD the user owns could be ripped to a .mp3 file.
  • Hard copy photos could be scanned to an electronic image file.
  • the electronic file is then uploaded to the user's U-Me account with user-specified indexing info (step 15720).
  • the user may then access the electronic file in the user's U-Me account (step 15730).
  • method 15700 could be performed by a third-party provider who specializes in migrating physical items to the U-Me system.
  • a user may become a U-Me member, and may use the system initially to store photos the user takes from a smart phone.
  • the user could hire a third party U-Me migration specialist to help migrate the user's info to the user's U-Me account.
  • the user could take a box with photos, movies, CDs, medical records, tax returns, etc. to the U-Me migration specialist, who would have the user specify indexing info for each type of item being stored.
  • the user can decide how much or how little indexing info to provide initially, because the user will always have the option to pull up the information later and add additional indexing info.
  • FIG. 116 An example of a virtual machine is shown at 11600 in FIG. 116.
  • Those skilled in the art of cloud computing will recognize that cloud resources are deployed on virtual machines. Virtual machines can be run on logically-partitioned computer systems. Thus, a single server computer can provide many different virtual machines to different users.
  • when a virtual machine runs the U-Me system for a particular user, as shown by way of example in FIG. 116, the virtual machine provides all the needed memory, hard disk space and processor power the user requires. Note, however, that dedicating a virtual machine for a particular user to be always running would be a significant dedication of resources that is not needed if the virtual machine can be created when the user needs to access the U-Me system, and automatically destroyed once the user no longer needs to access the U-Me system.
  • a virtual machine image 15800 is shown to include U-Me system info 15810, which includes virtual machine provisioning info 15820, and a U-Me generic user shell 15830.
  • a virtual machine image can be a disk image that, once installed on a physical machine, can be executed.
  • the virtual machine image 15800 in FIG. 158 is provisioned, or instantiated, on a physical computer system, which results in a running virtual machine 15900 as shown in FIG. 159 that is running in the U-Me system.
  • the U-Me system components 15910 are running and a U-Me generic user shell 15920 is running.
  • the U-Me generic user shell is not specific to any particular user, because the U-Me user shell only becomes user-specific once a U-Me user descriptor file is written to the U-Me generic user shell 15920.
  • FIG. 160 Such an example is shown in FIG. 160, where the U-Me user descriptor file 16020 has been installed in the U-Me generic user shell to create the U-Me user-specific components 16010 for a particular user.
  • the virtual machine 16000 can then provide all the functions the user requires from the U-Me system.
  • the virtual machine will not necessarily perform all U-Me system functions. In the most preferred implementation, many of the U-Me system functions will be available on other virtual machines that are always running to service requests from many users simultaneously.
  • the virtual machine 15900 in FIG. 159 is running, and can be easily deployed to serve a user by writing the user's U-Me descriptor file to the U-Me user shell.
  • the time to customize a running virtual machine to a specific user will be a small fraction of the time to provision or instantiate a virtual machine from a virtual machine image.
  • the U-Me system can thus monitor user demand, and in anticipation of a spike in usage, the U-Me system can provision many user-generic virtual machines that are ready and waiting for the U-Me descriptor file to make these virtual machines user-specific.
  • the U-Me system can satisfy the demand by configuring each running virtual machine to a particular user. Because the user-generic virtual machines are already up and running, the time to configure a running virtual machine with the user's info will be much less than the time it would take to instantiate a virtual machine from a virtual machine image.
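  • The pool of user-generic virtual machines might be managed along these lines; the shell and descriptor structures below are hypothetical stand-ins for the items of FIGS. 158-160.

```python
from collections import deque

generic_pool = deque()   # running VMs containing a U-Me generic user shell (FIG. 159)

def provision_generic_vm():
    """Slow path: instantiate a VM from the virtual machine image 15800."""
    return {"u_me_components": "running", "user": None}

def assign_vm_to_user(descriptor):
    """Fast path: write the U-Me user descriptor file into an already-running
    generic shell, making the virtual machine user-specific (FIG. 160)."""
    vm = generic_pool.popleft() if generic_pool else provision_generic_vm()
    vm["user"] = descriptor
    return vm

# In anticipation of a spike in usage, keep generic VMs ready and waiting:
for _ in range(3):
    generic_pool.append(provision_generic_vm())
vm = assign_vm_to_user({"account": "Jim Jones"})
```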
  • Method 11300 in FIG. 113 was discussed above. An example is now presented to illustrate specifics of how method 11300 may perform some of its steps.
  • a virtual phone 16100 includes many items copied from a physical phone, such as operating system specifications 16110, operating system settings 16120, apps installed 16130, apps settings 16140, and user settings 16150.
  • Virtual phone 16100 also includes a list of items not copied to the virtual phone 16160.
  • a U-Me app running on the physical phone can preferably detect which items in the physical phone cannot be copied to the virtual phone, and provides those items to the virtual phone so the virtual phone knows what items on a physical phone do not exist in the corresponding virtual phone.
  • a virtual phone 16200 is shown as an example of a virtual phone that could exist for a user's Samsung Galaxy S3 phone.
  • Virtual phone 16200 includes operating system specs 16210 that specify the phone is running the Android operating system, version 4.1.2.
  • the operating system settings 16220 include all settings for the Android 4.1.2 operating system that can be copied to the virtual phone 16200.
  • the list of apps installed 16230 shows apps labeled A, B, C, D, H, I, J, K, L and M.
  • the settings for installed apps 16240 include all settings for all apps that can be copied to the virtual phone 16200.
  • the user settings 16250 include all of the user settings on the physical phone.
  • the items not copied to virtual phone 16260 lists the operating system 16262, App B 16264, App D 16266, and App J 16268. It is assumed these apps have features that prevent copying all of their information to the virtual phone in the user's U-Me account (i.e., are not U-Me certified). Based on the items that could not be copied to the virtual phone 16260, the user may be prompted with steps to take before the U-Me system can configure the phone. For example, let's assume the user flushes his phone down a toilet, purchases a new phone, and installs the U-Me app on the new phone.
  • the user will log in to his U-Me account, then register the new phone as one of the user's devices.
  • the user will select to configure the new device as a clone from the stored virtual device. But there still may be some missing configuration info for the new device (step 11350 in FIG. 113).
  • the user is prompted for the missing configuration info (step 11360).
  • the prompt to the user in step 11360 could be as shown at 16300 in FIG. 163.
  • the instructions to the user on how to setup the new Samsung Galaxy S3 phone include a list of steps the user must perform before the U-Me system can install all of the information from the user's virtual phone. First, the user makes sure Android 4.1.2 is installed. If an earlier release is installed, the user updates the operating system to Android 4.1.2.
  • the user installs Apps B, D and J manually. Once the operating system is up to date and apps B, D and J have been installed, the user can select the Continue Setup button 16310, which will cause the U-Me system to copy all of the information in the virtual phone 16200 to the physical phone. While apps B, D and J had to be manually installed, the settings for the apps could still be in the settings for installed apps 16240 in the virtual phone 16200. Installing these settings by the U-Me system after the apps are manually installed by the user can result in the apps being configured exactly as they were on the user's previous phone, even though the apps themselves could not be copied to the virtual phone 16200.
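  • Generating the setup prompt of FIG. 163 from a virtual phone record like FIG. 162 might look like the following sketch; the record layout is an assumption made for illustration.

```python
virtual_phone = {
    "os": ("Android", "4.1.2"),
    "not_copied": ["operating system", "App B", "App D", "App J"],   # items 16260
}

def setup_instructions(device):
    """List the manual steps the user must perform before the U-Me system can
    copy the rest of the virtual phone to the new physical phone."""
    steps = []
    if "operating system" in device["not_copied"]:
        name, version = device["os"]
        steps.append(f"Make sure {name} {version} is installed; update the "
                     "operating system if an earlier release is installed.")
    apps = [item for item in device["not_copied"] if item.startswith("App")]
    if apps:
        steps.append("Manually install: " + ", ".join(apps))
    steps.append("Then select Continue Setup to copy the remaining settings.")
    return steps

print("\n".join(setup_instructions(virtual_phone)))
```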
  • the U-Me system provides an improved way to manage photos, including photos that originated from a digital camera or other digital device, along with hard copy photos that have been digitized for electronic storage.
  • the U-Me system improves over the known art of software that adds metadata to photos by providing a people-centric approach to managing photos, as described in detail below.
  • the methods discussed with respect to FIGS. 164-210 are preferably performed by the photo mechanism 182 shown in FIG. 5.
  • a method 16400 generates and stores indexing information for a photo.
  • a user defines people and relationships in the U-Me system (step 16410).
  • the U-Me system derives relationships from the user-defined relationships (step 16420).
  • the user may also define one or more locations (step 16430).
  • the U-Me system may also provide defined locations (step 16440).
  • the user may also define one or more events (step 16450).
  • the U-Me system derives events from the user-defined events
  • the U-Me system then generates indexing info for a photo based on any or all of the user- defined relationships, system-derived relationships, user-defined locations, system-defined locations, user- defined events, and system-derived events (step 16470).
  • the U-Me system includes a photo system data entry screen for people, such as screen 16510 shown in FIG. 165 by way of example.
  • the photo system data entry screen 16510, like all of the U-Me system, is person-centric. Thus, when a user decides to have the U-Me system manage the user's photos, the user starts by entering data for a particular person in the photo system data entry screen 16510.
  • Fields in the photo system data entry screen 16510 include Name, Preferred Name, Birth Date and Camera.
  • the user can provide a sample photo of the person's face at 1480 to help train the facial recognition engine in the U-Me photo system.
  • the Camera field includes an Add button 16570 that allows the user to enter all cameras the user uses to take digital photos.
  • the data entry screen for people 16510 shown in FIG. 165 includes a People button 16520, a Locations button 16530, an Events button 16540, a Save button 16550, and a Cancel button 16560.
  • FIG. 166 shows the data entry screen for people 16510 after a user has entered information into the data entry screen.
  • the user is Jim Jones
  • screen 16510 in FIG. 166 shows the pertinent information relating to Jim Jones, including a Preferred Name of Jimmy, a Gender of Male, a birth Date of
  • Jim Jones has a son named Billy Jones with a preferred name of Bubba who is Jim's son by birth, a daughter named Sally Jones with a preferred name of Sally who is Jim's stepdaughter, a wife named Pat who is Jim's current wife, a father named Fred Jones with a preferred name of Dad Jones who is Jim's birth father, and a mother named Nelma Pierce with a preferred name of Mom Jones who is Jim's birth mother.
  • the user can select the Save button 16550, which results in saving all of the people as people in the user's U-Me database, and which results in saving all the relationships relating to these people.
  • entering information about a spouse can include a type of spouse, such as Current,
  • a Display Family Tree button 16810 is displayed that, when selected, will display a family tree of all the relationships for the user. Note that once a person is entered into the user's People database, the user can enter more information for that person by invoking the data entry screens 16510 and 16710.
  • the initial entry of photo system data for all the people in a user's immediate and extended family may take some time, but once this work is done the U-Me system can use this data in many ways that allow easily storing photos to and easily retrieving photos from the user's U-Me account.
  • this data relating to people can be shared with others, thus allowing a first user to provide a significant shortcut to a second user who is invited to share the first user's photos as well as people, locations, events, etc.
  • FIG. 170 shows user-defined relationships can include son, daughter, father, mother, brother, sister, stepson, stepdaughter, stepfather, stepmother, boss, manager, employee, co-worker, and others.
  • Examples of system-derived relationships in FIG. 170 include grandson, granddaughter, grandpa, grandma, uncle, aunt, nephew, niece, son-in-law, daughter-in-law, mother-in-law, father-in-law, great-grandson, great-granddaughter, great-grandpa, great-grandma, great-uncle, great-aunt, great-nephew, great-niece and others.
  • All of the relationships shown in FIG. 170 are for illustration only, and are not limiting. Other user-defined relationships and system-derived relationships not shown in FIG. 170 are within the scope of the disclosure and claims herein. For example, the system could derive any suitable relationship, such as second cousin twice removed, third-level employee, etc.
  • method 17100 monitors the photo system data entry (step 17110) and constructs relationships from the photo system data entry (step 17120). People naturally think along the lines of family relationships and other relationships between people. While known software for adding metadata to a photo allows adding name labels such as "Katie" and performing facial recognition, these labels have no meaning within the context of other people in the photos.
  • the U-Me system in contrast, constructs relationships between people, such as family relationships, that allow storing and retrieving photos much more effectively than in the prior art.
  • FIG. 172 shows a display of a family tree that could be displayed when the user clicks on the Display Family Tree button 16810 after saving the information in the data entry screen 16710 as shown in FIG. 168. Note there is a small “s” next to Sally Jones' name to indicate she is a step-daughter of Jim, not a daughter by birth.
  • the user-defined relationships for Jim Jones specified in FIG. 168 are shown in FIG. 173.
  • FIG. 175 shows the addition of Jenny Black and Todd Jones results in the creation of some system- derived relationships.
  • the U-Me system recognizes the wife of a son is a daughter-in-law, and thus derives from the fact that Jenny Black is listed as Billy Jones' wife that Jenny Black is the daughter-in-law of Jim Jones.
  • the U-Me system recognizes the son of a son is a grandson, and thus derives from the fact that Todd Jones is listed as Billy Jones' son that Todd Jones is the grandson of Jim Jones.
  • the U-Me system will monitor the additions and dynamically create more system-derived relationships that are derived from the user-defined relationships.
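  • Two of the derivations described above (daughter-in-law and grandson) might be encoded as follows; the triple format and rule table are illustrative assumptions, and FIG. 170 implies many more such rules.

```python
# Rule: (relationship of B to A, relationship of C to B) -> relationship of C to A.
DERIVATION_RULES = {
    ("son", "wife"): "daughter-in-law",   # the wife of a son is a daughter-in-law
    ("son", "son"):  "grandson",          # the son of a son is a grandson
}

def derive(user_defined):
    """user_defined: (person_a, relationship, person_b) triples, e.g.
    ('Jim Jones', 'son', 'Billy Jones') meaning Billy is Jim's son."""
    derived = []
    for a, rel1, b in user_defined:
        for b2, rel2, c in user_defined:
            if b2 == b and (rel1, rel2) in DERIVATION_RULES:
                derived.append((a, DERIVATION_RULES[(rel1, rel2)], c))
    return derived

facts = [("Jim Jones", "son", "Billy Jones"),
         ("Billy Jones", "wife", "Jenny Black"),
         ("Billy Jones", "son", "Todd Jones")]
print(derive(facts))   # Jenny is Jim's daughter-in-law; Todd is Jim's grandson
```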
  • a user can also define locations. For example, we assume selecting the Locations button 16530 in FIGS. 165-168 results in the display of a locations data entry page, of which an example 17610 is shown in FIG. 176.
  • the data entry screen for locations 17610 may include a button 17620 to enter a new location or a button 17630 to add a new location using a phone app.
  • the user selects the Enter a New Location button 17620, which could result, for example, in the display of the data entry screen 17710 shown in FIG. 177.
  • the location is named Jim & Pat's House
  • the street address is 21354 Dogwood
  • the city is Carthage
  • the state is Missouri (postal abbreviation of MO)
  • the ZIP code is 64836.
  • the U-Me system includes the capability of allowing a user to define any suitable location using the U-Me app on the user's smart phone or other mobile device, such as when the user selects button 17630 in FIGS. 176 or 177.
  • Method 17800 in FIG. 178 shows an example of a method for a user to define a location using the user's smart phone.
  • the user invokes the U-Me app on the user's smart phone, then activates a "location definition" mode on the U-Me app (step 17810).
  • the user selects the "Begin" button, which causes the current location of the smart phone to be stored as the beginning boundary point (step 17820).
• The user then travels to the next boundary point (step 17830), and selects "Store" on the U-Me app (step 17840) to store the current location of the smart phone as the next boundary point.
• the boundary points are then connected (step 17860), preferably using straight lines in connect-the-dot fashion.
  • a location is then defined from the region enclosed by the boundary points (step 17870).
  • the coordinates for the region are then sent from the U-Me app to the U-Me system, thereby defining a location for the user in the user's U-Me account.
  • the coordinates are GPS coordinates, but any suitable location coordinates could be used.
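• A minimal sketch of this boundary-walk approach is shown below: boundary points are stored as the user walks, and a standard ray-casting test later decides whether a geocode falls inside the enclosed region. The coordinates and names are made up, and a real implementation would use proper geodetic math rather than treating latitude/longitude as a flat plane.

```python
# Hypothetical sketch of method 17800: collect boundary points, then test
# later geocodes against the enclosed region.
boundary = []  # (lat, lon) points stored as the user walks (steps 17820/17840)

def store_point(lat, lon):
    boundary.append((lat, lon))

def point_in_region(lat, lon, polygon):
    """Ray-casting test: count crossings of an eastward ray (steps 17860/17870)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        y1, x1 = polygon[i]
        y2, x2 = polygon[(i + 1) % n]   # wrap-around closes the polygon
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# User walks four corners of a (hypothetical) rectangular property.
for pt in [(37.17, -94.31), (37.17, -94.30), (37.16, -94.30), (37.16, -94.31)]:
    store_point(*pt)

print(point_in_region(37.165, -94.305, boundary))  # True: inside the region
print(point_in_region(37.20, -94.305, boundary))   # False: outside
```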
  • FIG. 179 illustrates examples of how method 17800 could be used by a user to define locations in the user's U-Me system.
  • Jim & Pat's house is in a rural area on 40 acres of land that has an irregular shape.
• Jim, using the U-Me app on his smart phone, walks to a corner of his property shown at points 1,6 in FIG. 179, activates the "location definition" mode on the U-Me app (step 17810), and selects Begin on the U-Me app (step 17820), which stores point 1 as the Begin point.
  • Jim then walks to point 2 (step 17830), the next corner of his property, and selects Store on the U-Me app to store point 2 as the next boundary point (step 17840).
• Because point 2 is not back at the beginning point, Jim continues traveling to each remaining boundary point and storing it (steps 17830 and 17840) until he arrives back at the beginning point.
  • the U-Me app connects the boundary points (step 17860), and defines a location from the connected boundary points (step 17870). The geographical coordinates for this location can then be sent to the user's U-Me account, and the user can then name the location.
  • the location 17920 shown in FIG. 179 that was defined by the user is named "Jim & Pat's Property.”
  • the U-Me system computed latitude and longitude coordinates for that location based on a database of addresses with corresponding location coordinates.
  • the location coordinates for an address may not correspond very closely to the location of the house.
  • the location coordinates shown in FIG. 177 might correspond to the driveway entrance to Dogwood Road shown at 17930 in FIG. 179. If the house sits back from the road a substantial distance, the location coordinates of the address may not be accurate for the location of the house.
  • Jim could use the U-Me app to walk the boundary points of his house, shown at points 7, 8, 9, 10, 11, 12 and 13 in FIG. 179.
  • the U-Me app could then connect the boundary points and define a location 17910.
  • This user-defined location 17910 could be substituted for the system-derived location 17720 shown in FIG. 177 to provide a more accurate location of Jim & Pat's house.
  • a photo taken inside of Jim & Pat's house could include indexing information that includes both Jim & Pat's House and Jim & Pat's Property.
  • Jim & Pat's Property may be defined specifically to exclude Jim & Pat's house, so a photo taken in Jim & Pat's house will have indexing information generated that indicates the location as Jim & Pat's House, while a photo taken outside the house on the property (for example, of a grandson fishing in a pond) will have indexing information generated that indicates the location as Jim & Pat's Property.
  • FIG. 180 shows examples of user-defined locations and system-defined locations.
  • User-defined locations have a specified name and derived geocode information that defines the location.
  • the derived geocode information for Jim & Pat's Property defined by the user at 17920 in FIG. 179 is all geographical coordinates that fall within the defined location 17920.
  • the user-defined locations include Jim & Pat's House, Jim & Pat's Property, Jim's Office, Billy's House, Dad Jones' House, etc.
• the system-defined locations can include any location information available from any suitable source, such as online databases or websites, etc.
• System-defined locations may include, for example, city, county, state, country, city parks, state parks, national parks, tourist attractions, buildings, etc.
• the U-Me system can detect the location coordinates, check available databases of system-defined locations, detect that the location corresponds to the Grand Canyon, and add a location "Grand Canyon" to indexing info for the photo. The same could be done for tourist attractions such as Disney World, and for buildings such as the Empire State Building.
• a user could define many user-defined locations and the system could define any type and number of system-defined locations. Note that one location can correspond to both a user-defined location and a system-defined location. Thus, if a user owns a cabin in a state park, the user could define the location of the cabin as "Cabin", and photos could then include indexing information that specifies both the state park and "Cabin".
  • FIG. 181 shows sample metadata 18110 that may exist in known digital photo files.
• the term "metadata" is used herein to mean data in the digital photo file that is not part of the visible image and that describes some attribute of the photo.
  • the metadata 18110 in FIG. 181 is shown to include fields for Camera Make, Camera Model, Camera Serial Number, Resolution of the photo, Image Size of the photo, Date/Timestamp, and Geocode Info.
  • the metadata shown in FIG. 181 is shown by way of example. Many other fields of metadata are known in the art, such as the metadata fields defined at the website photometadata.org.
  • the photo metadata disclosed herein expressly extends to any suitable data, whether currently known or developed in the future, that is placed in the digital photo file by the device that took the photo to describe some attribute that relates to the photo.
• When photo metadata includes geocode info as shown in FIG. 181 that defines the geographical location of where the camera was when the photo was taken (as is common in smart phones), method 18200 in FIG. 182 reads this geocode info from the metadata (step 18210).
  • the geocode info can be in any suitable form such as GPS coordinates or other forms of geographical info that specifies location, whether currently known or developed in the future.
  • Jim Jones takes a photo with his cell phone of his daughter in his house.
  • the geocode info will reflect that the location corresponds to a stored location, namely, Jim & Pat's House. Jim & Pat's House can then be added to the indexing information, which makes retrieval of photos much easier using a photo search engine.
• photo indexing info 18310 is shown by way of example to include person info, location info, event info, and other info.
• Person info can include any information relating to a person, including relationship info, both user-defined and system-derived.
• Location info can include any information relating to a location, including user-defined and system-defined locations.
• Event info can include any information relating to a date or date range for the photo, including user-defined events, system-derived events, and system-defined events, as discussed in more detail below.
• the photo indexing info is generated using tags in a markup language such as extensible Markup Language (XML). Sample tags for the photo indexing info 18400 are shown in FIG. 184.
• the sample tags for Person Info shown in FIG. 184 include Person_FullName, Person_PreferredName, Person_Age and Person_Other.
• sample tags for Location Info are also shown in FIG. 184.
• the sample tags for Event Info shown in FIG. 184 include Event_Name, Event_Date, Event_Date_Range, Event_BeginDate, Event_EndDate, and Event_Other. Note the specific tags shown in FIG. 184 are shown by way of example, and are not limiting.
• events may include user-defined events, system-derived events that are derived from user-defined events, or system-defined events that are selected by the user.
• user-defined events include birth dates, wedding dates, event date ranges entered by a user, and labels entered by a user that correspond to a date or date range, and others.
  • system-derived events include Jim's 56th Birthday, Jim & Pat's 30th Anniversary, Jim's Age, and others. Note that a person's age is not an "event” in a generic sense of the word, but the term "event” as used in the disclosure and claims herein includes anything that can be determined based on a date or date range for a photo, including ages of the people in the photo.
• Examples of events that are system-defined and selected by a user may include fixed-date holidays, variable-date holidays and holiday ranges.
• Examples of fixed-date holidays in the United States include New Year's Eve, New Year's Day, Valentine's Day, April Fool's Day, Flag Day, and others.
• Examples of variable-date holidays in the United States include Martin Luther King, Jr. Day, President's Day, Easter, Memorial Day, Labor Day and Thanksgiving. Of course, there are many other fixed-date holidays and variable-date holidays that have not been listed here. Holiday ranges could be defined by the system, and could be selected or modified by the user. For example, an event called "Memorial Day Weekend" could be defined by the system to be the Saturday and Sunday before Memorial Day as well as Memorial Day itself. The user could select this system definition for "Memorial Day Weekend", or could modify the definition. For example, the user could change the definition of "Memorial Day Weekend" to include the Friday before Memorial Day as well.
  • Similar holiday ranges could be defined for Labor Day Weekend, Thanksgiving Weekend and Christmas Holidays.
• a user can accept a system-defined holiday range or could modify the system-defined holiday range to the user's liking.
  • the system could define "Christmas Holidays" to include December 20th to January 1st.
  • the user could then modify the system definition of "Christmas Holidays” to include December 15th to January 2nd.
• the system-defined holidays may include holidays for a number of different countries, nationalities, ethnic groups, etc., which allows a user to select which holidays the user wants to include in indexing information for the user's photos.
  • a Jewish user could select to include Jewish holidays while excluding Christian holidays.
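• The sketch below illustrates one plausible way to match a photo's date against fixed-date and range-style events such as those described above; the event definitions and names are illustrative, not the actual U-Me schema.

```python
# Hypothetical sketch of matching a photo's date against selected events.
from datetime import date

fixed_date_events = {
    "Christmas": (12, 25),       # (month, day): same every year
    "New Year's Day": (1, 1),
}

range_events = {
    # user-modifiable system definition, per the example above
    "Christmas Holidays": ((12, 20), (1, 1)),  # Dec 20 .. Jan 1 (wraps year end)
}

def events_for(photo_date):
    hits = []
    md = (photo_date.month, photo_date.day)
    for name, day in fixed_date_events.items():
        if md == day:
            hits.append(name)
    for name, (start, end) in range_events.items():
        if start <= end:                  # range contained in one year
            if start <= md <= end:
                hits.append(name)
        elif md >= start or md <= end:    # range wraps past December 31
            hits.append(name)
    return hits

print(events_for(date(2012, 12, 25)))  # ['Christmas', 'Christmas Holidays']
```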
  • a photo is selected (step 18610).
  • a photo is a digital photo file.
  • a unique ID and checksum are generated for the photo (step 18620).
  • the unique ID is a simple numerical designator (like a serial number) that uniquely identifies the photo to the U-Me system.
  • the checksum is computed to facilitate detecting duplicate photos, as discussed in more detail below. Facial and feature recognition are performed on the photo (step 18630). Indexing info is generated for recognized faces and features (step 18640).
  • Indexing info is also generated for recognized locations (step 18650) based on the location of the photo.
  • the location may be entered by a user. This could be useful, for example, when a user scans a hard-copy photo to generate a digital photo file that does not include location info. The user could then specify a location, which would be a recognized location for the photo.
  • the location is determined from geocode info embedded in the metadata of the digital photo file.
• Indexing info is also generated for recognized events (step 18660). Recognized events may include anything relating to a date or date range for the photo, including user-defined events, system-derived events (including ages of people in the photo), and system-defined and user-selected events, as described above with reference to FIG. 185. Indexing info for other photo metadata may also be generated (step 18670). The photo is stored (step 18680).
  • the indexing info for the photo is also stored (step 18690).
• in one implementation, the indexing info is stored separately from the photo.
• in another implementation, the indexing info is stored as metadata in the digital photo file.
  • the indexing information generated in steps 18640, 18650, 18660 and 18670 may include data that is not in the metadata for the photo, but is generated based on the metadata for the photo in conjunction with information stored in the user's U-Me account. For example, when the U-Me system recognizes a date in the photo metadata that corresponds to Jim & Pat's wedding anniversary, the U-Me system can generate indexing info for the photo that identifies the Event for the photo as Jim & Pat's Wedding Anniversary. Having dates, locations and relationships defined in the user's U-Me account provides a way to add indexing info to a photo that will help to retrieve the photo later using a powerful search engine, discussed in more detail below.
• One suitable implementation for step 18630 in FIG. 186 is shown as method 18700 in FIG. 187.
  • step 18730 might display the photo with various different faces and regions defined.
  • the user could select a face and then enter the name for the person, or if the person will appear in many photos, the user could enter some or all of the person's data in a photo system data entry screen, similar to that shown in FIG. 165.
  • the user could also select various regions of the photo to define features that could be recognized in future photos.
  • a photo shows a couple on a beach with a cruise ship in the background
  • the user could click on each face to define information corresponding to those two people, and could also click on the sand on the beach and define this feature as "beach”, click on the water and define this feature as "water”, and click on the cruise ship and define this feature as "boat.”
  • these features may be recognized in other photos, which allows adding indexing information that describes those features automatically when the photo is processed, as shown in method 18700 in FIG. 187.
  • a method 18800 is one specific implementation for step 18640 in FIG. 186.
• Indexing info is generated for recognized faces and/or features based on the user-defined relationships (step 18810). Indexing info is also generated for the recognized faces and/or features based on the system-derived relationships (step 18820).
  • Method 18900 in FIG. 189 is one suitable implementation for step 18650 in FIG. 186 for a digital photo file that does not include any geocode info.
  • a user defines the location for the photo (step 18910).
  • the user may specify the location using geographical coordinates, or by selecting a name of a geographical location that has already been defined by the user or by the system. This would be the logical approach when a digital photo file has been created from a hard-copy photo, and no geocode info is available for the photo.
  • indexing info is generated for the system -defined location(s) (step 18930).
  • indexing info is generated for the user-defined location(s) (step 18950).
  • a single photo can include indexing info that relates to multiple user-defined locations and multiple system-defined locations, all based on the one location where the photo was taken. For example, if we assume region 17910 in FIG. 179 is defined as Jim & Pat's House, and region 17920 is defined as Jim & Pat's Property to include Jim & Pat's House, and assuming the house and property are in Jasper County, Missouri, a photo taken in Jim & Pat's House could include indexing info that specifies the location as Jim & Pat's House, Jim & Pat's Property, Jasper County, Missouri, USA.
  • a method 19000 is one suitable implementation for step 18650 in FIG. 186 for a digital photo file that includes geocode info, such as a photo taken with a smart phone.
  • the geocode info is read from the photo metadata (step 19010).
  • indexing info is generated for the user-defined location (step 19050).
  • the location is added to the user-defined locations (step 19070).
• indexing info for a photo can specify many different locations that all apply to the photo, both user-defined and system-defined.
• the generation of location-based indexing info for photos may be done using any suitable heuristic and method. For example, if Jim Jones takes a photo of a grandson at a birthday party in his living room in his house, the U-Me system will recognize the location as Jim & Pat's House, and will store this location as indexing info with the photo. If Jim takes a photo of the grandson fishing at a pond on the property, the U-Me system will recognize the smart phone is not at the house but is on the property, and will recognize the location as "Jim & Pat's Property", and will store this location as indexing info with the photo. In addition, various heuristics could be defined to generate location descriptors. For example, anything within 100 yards of a defined location but not at the defined location could be "near" the defined location. The disclosure and claims herein expressly extend to any suitable location information that could be generated and included as indexing information to describe the location where a photo was taken.
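• One way such a "near" heuristic could be computed is sketched below using the haversine great-circle distance; the 100-yard threshold follows the example above, and all names and coordinates are hypothetical.

```python
# Hypothetical sketch of a "near a defined location" descriptor.
import math

def haversine_yards(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in yards."""
    r_yards = 6_371_000 / 0.9144  # mean Earth radius in meters, then yards
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_yards * math.asin(math.sqrt(a))

def location_descriptor(photo_pt, center_pt, name, at_location, threshold=100):
    if at_location:                       # photo geocode falls inside the region
        return name
    dist = haversine_yards(photo_pt[0], photo_pt[1], center_pt[0], center_pt[1])
    return f"near {name}" if dist <= threshold else None

print(location_descriptor((37.1651, -94.3052), (37.1650, -94.3050),
                          "Jim & Pat's House", at_location=False))
# "near Jim & Pat's House" (roughly 23 yards away)
```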
  • Method 19100 in FIG. 191 is one suitable implementation for step 18660 in FIG. 186.
  • a date or date range is determined for the photo (step 19110).
  • the user could specify a date or date range for the photo. Why would the user specify a date range instead of an exact date? One reason is when the user is not sure of the specific date the photo was taken, but can narrow it down to a date range. Another reason is a date range could apply to many photos to make it easier to generate indexing info for those photos.
• When the digital photo file includes a date that indicates when the photo was taken, determining the date in step 19110 will include reading the date from the metadata in the digital photo file.
• When a recognized person is in the photo image, step 19120 will be YES, and step 19130 will be performed to compute the age of any and all recognized persons in the photo. Note that age can be computed differently for infants than for older children and adults. When asked how old a baby or a toddler is, the mother will typically reply in months because this is much more informative than telling years. The U-Me system could recognize this, and in addition to generating indexing info that indicates years for a recognized person, when that person is less than, say, three years old, the indexing info generated in step 19130 could additionally include the age of the recognized person in months.
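• A hedged sketch of that age computation follows: ages are reported in years, with an additional months value for children under an assumed three-year threshold. The tag names are illustrative, not the actual U-Me format.

```python
# Hypothetical sketch of step 19130: compute ages for indexing info.
from datetime import date

def age_index_entries(birth, photo_date, threshold_years=3):
    months = (photo_date.year - birth.year) * 12 + (photo_date.month - birth.month)
    if photo_date.day < birth.day:
        months -= 1                       # last partial month does not count
    entries = [f"Person_Age:{months // 12}"]
    if months < threshold_years * 12:     # young child: also report months
        entries.append(f"Person_AgeMonths:{months}")  # hypothetical tag
    return entries

print(age_index_entries(date(2010, 5, 14), date(2012, 12, 25)))
# ['Person_Age:2', 'Person_AgeMonths:31']
```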
• When the date or date range for the photo corresponds to one or more user-defined events (step 19140 = YES), indexing info is generated for the corresponding user-defined event(s) (step 19150).
  • a method 19200 reads camera info from the metadata for a photo (step 19210), looks up the photographer name that corresponds to the camera info (step 19220), and adds the photographer's name to the indexing info (step 19230). In this manner, the metadata in the photo that identifies the camera is used to go a step further to identify the person who uses that camera so the photographer can be specified in the indexing information for the photo.
  • Method 19200 in FIG. 192 is one suitable implementation for step 18670 in FIG. 186.
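• A minimal sketch of this camera-to-photographer lookup is shown below; the serial-number table and field names are invented for illustration and would, in practice, live in the user's U-Me account.

```python
# Hypothetical sketch of method 19200: map camera metadata to a photographer.
camera_owners = {"SN-83921A": "Jim Jones", "SN-10457Q": "Pat Jones"}

def add_photographer(indexing_info, metadata):
    serial = metadata.get("camera_serial_number")    # step 19210: read camera info
    name = camera_owners.get(serial)                 # step 19220: look up the name
    if name:
        indexing_info.append(f"Photographer:{name}") # step 19230: add to index
    return indexing_info

print(add_photographer([], {"camera_serial_number": "SN-83921A"}))
# ['Photographer:Jim Jones']
```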
  • method 19300 is a method for storing a photo with corresponding indexing information.
  • the user takes the photo (step 19310).
  • the U-Me software or app sends the photo with metadata (i.e., the digital photo file) to the user's U-Me account (step 19320).
  • the U-Me software or app can send the photo with metadata to the user's U-Me account in any suitable way, including a direct connection from the U-Me software or app to the U-Me system.
• in the alternative, the U-Me software or app can send the photo to the user's U-Me account in one or more e-mails.
  • the U-Me system monitors incoming e-mail, and when a photo is detected, embedded in an e-mail or as an attachment, the U-Me system recognizes the file as a photo. Facial and feature recognition is performed (step 19330). Indexing information is generated for all recognized faces and features (step 19340), for all recognized locations (step 19350), for all recognized events (step 19360), and for any other metadata (step 19370).
  • the digital photo file including its metadata, is stored with the generated indexing info in the user's photo database (step 19380).
• When user input is needed to complete the indexing info, a flag is set to prompt the user for the needed input (step 19390). Setting a flag lets the user decide when to enter the needed input.
  • Method 19300 could be carried out by a user taking a photo with a smart phone that is running the U- Me app, which results in the photo being automatically uploaded, processed, and stored in the user's U-Me account.
  • method 19400 begins by scanning a hard copy photo (step 19410). Facial and feature recognition is performed (step 19420). A wizard prompts the user to enter indexing information for the photo (step 19430). The photo with its indexing info is then stored in the user's photo database (step 19440). Note the indexing info can be stored in the metadata in the digital photo file, or can be stored separately from the digital photo file.
  • method 19500 begins by a user invoking a photo indexing info generator (step 19510). The user can then define indexing info for groups of photos or for individual photos (step 19520).
  • a sample digital photo file 19620 is shown in FIG. 196 to include an identifier (ID), Metadata, and the Image. While the indexing information is "metadata" in a general sense, the term “metadata” as used herein relates to data generated by the camera that describes some attribute related to the image, while “indexing info” as used herein relates to data that was not included in the metadata for the image but was generated by the U-Me system to facilitate retrieval of photos using a powerful search engine.
• the indexing info 19610 can be stored separately from the digital photo file 19620 by simply using the same unique identifier for the photo to correlate the indexing info to the photo, as shown in FIG. 196. In the alternative, the indexing info can be stored as part of the digital photo file, as shown in FIG. 197.
• An example of a photo indexing info generator screen 19800 is shown in FIG. 198 to include Date fields, a People field, an Event field, a Location field, and a display of thumbnails of photos.
  • the user specifies a date or range of dates in the Date fields.
  • the user specifies one or more people in the People field.
  • the user specifies location in the Location field.
  • An example will illustrate how a user might use the photo indexing info generator in FIG. 198 to generate indexing info for scanned hard copy photos.
• Jim Jones has a stack of 163 wedding-related photos from when he married Pat, including some taken on the morning of their wedding day showing the wedding ceremony, some taken later on their wedding day at the reception, and some taken a week later at a second reception in Pat's hometown.
  • Jim could enter a date range that begins at the wedding day and extends to the date of the second reception, could define an event called "Jim & Pat's Wedding", and could select the 163 thumbnails that correspond to the wedding and reception photos.
  • the user selects the Save button 19810, which results in the photos being saved in Jim's photo database with the appropriate dates and event information as indexing information.
  • People, Event and Location fields can include drop-down lists that list people, events and locations that have been previously defined, along with a selection to define a new event or location. If the user decides to abort entering the indexing info for photos, the user may select the Cancel button 19820.
• a significant benefit of generating indexing info for photos is the ability to search for and retrieve photos using the indexing info. No longer must a user search through hundreds or thousands of thumbnails stored in dozens or hundreds of directories with cryptic names that mean nothing to a person! Instead, the user can use a photo search engine to retrieve photos based on people, their ages, family relationships both entered and derived, location, dates, and events.
• One example of a screen 19900 for a photo search engine is shown in FIG. 199.
  • the example shown in FIG. 199 includes fields for Date(s), Event, Location, People, Relationship, and Photographer. Because of the relationships entered by the user and derived by the U-Me system, searches or queries for photos can now be formulated based on those relationships. Examples of photo queries supported by the photo search engine 19900 in FIG. 199 are shown at 20000 in FIG. 200, and include: photos of grandchildren of Jim Jones between the ages of 2 and 4; photos of the wedding of Sandy Jones; and photos taken at the Lake House in 2010. These simple examples illustrate that adding indexing info that relates to people, locations and events allows for much more powerful querying and retrieving of photos than is known in the art.
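• The sketch below shows how one query from FIG. 200 (photos of grandchildren of Jim Jones between the ages of 2 and 4) might be evaluated against the indexing info; the data layout is a simplified stand-in, not the actual photo database schema.

```python
# Hypothetical sketch of a relationship-based photo query.
photos = [
    {"id": "A1F3", "people": [
        {"name": "Todd Jones", "age": 2,
         "relationships": [("grandson", "Jim Jones")]}]},
    {"id": "B2C4", "people": [
        {"name": "Sally Jones", "age": 7,
         "relationships": [("granddaughter", "Jim Jones")]}]},
]

def grandchildren_of(photos, grandparent, min_age, max_age):
    grandchild_rels = {"grandson", "granddaughter"}
    hits = []
    for photo in photos:
        # match any person in the photo who is a grandchild in the age range
        if any(min_age <= p["age"] <= max_age and
               any(rel in grandchild_rels and who == grandparent
                   for rel, who in p["relationships"])
               for p in photo["people"]):
            hits.append(photo["id"])
    return hits

print(grandchildren_of(photos, "Jim Jones", 2, 4))  # ['A1F3']
```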
• the user may want to share photos stored in the user's U-Me account. This can be done using a photo share engine, a sample display of which is shown at 20100 in FIG. 201.
  • the photo share engine could be provided as a feature of the sharing mechanism 174 shown in FIG. 5, or could be provided by the photo mechanism 182.
  • the user defines criteria for photos to share, then specifies contact information for people with whom the user wants to share the photos. The user can also select whether to share the user's faces, people, locations, events, metadata, and indexing info.
  • the criteria for photos to share can include any suitable criteria, including any suitable criteria that could be entered into the photo search engine for retrieving a photo.
• the "Share with" field could be a drop-down list with people in the U-Me system, could be a drop-down list of people the user has defined in the user's U-Me account, or could be an e-mail address or other unique identifier for the person. A user could thus enter the e-mail address of a person who is not a U-Me user.
  • the Representative Photo could designate a photo that includes many family members so the person invited to share the photos can see how the people in the representative photo are identified by the person sharing the photo.
  • a method 20200 is one example of how a U-Me user could share information with other U-Me users.
• P1 denotes a first U-Me user who wants to share photos and related information with a second U-Me user denoted P2.
• P1 designates P2 to share photos with the "Share All" option (step 20210).
• selecting Yes to the Share All option causes all of the user's photo-related information to be shared with another user, including faces, people, locations, events, metadata, and indexing info.
• the U-Me system sends an invitation to P2 to share P1's photos (step 20220).
• P2 will sign up as a U-Me user.
• P2 logs in to the U-Me system (step 20230).
• P1's defined people are displayed to P2 (step 20240).
• P2 may select one of P1's people (step 20250) and update info for that person (step 20260).
• P2 updating the info for that person does not change the info for that person in P1's account. Instead, the info for that person in P1's account is copied to P2's account so changes made by P2 do not affect the person info in P1's account. For example, let's assume Jim Jones invites his son Billy to share some family photos.
• P1's representative photo may be displayed to P2 showing the recognized faces (step 20270). P2 may then verify the identities of the recognized faces in the representative photo to assure they are correct. If the user updated info for P1's people in step 20260, P2's preferred names for those people can now be displayed for the recognized faces in step 20270.
• P2's people are correlated to P1's faces (step 20290).
• P2 may make any needed corrections or changes to P1's faces (step 20280) before correlating P2's people to P1's faces (step 20290).
• An example is now presented to illustrate the generation of indexing information for a photo by the U-Me system.
• a sample photo 20300 is represented in FIG. 203 that includes Jim Jones and Todd Jones on Christmas Day 2012.
• Based on information entered by the user in the data entry screen 16710, the indexing information shown in FIG. 204 could represent examples of possible indexing information for the photo in FIG. 203 generated by the photo mechanism in the U-Me system.
  • the first tag at 20410 in FIG. 204 is a photo_id tag that provides a unique identifier that identifies the photo to the U-Me system.
• the photo_id value shown in FIG. 204 is in hexadecimal format.
• the second tag 20420 is photo_checksum, which provides a checksum for the image that is computed based on the information in the image portion of the photo file. While the photo_checksum value could include information in the metadata and indexing info for a photo, in the most preferred implementation the photo_checksum would include only the image data in the checksum computation so changes to the metadata or the indexing info will not affect the value of the photo_checksum.
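• A plausible way to compute such an image-only checksum is sketched below; Pillow and SHA-256 are assumptions for illustration, since the actual checksum algorithm is not specified in the disclosure.

```python
# Hypothetical sketch of an image-only checksum: hash only the decoded pixel
# data so that edits to metadata or indexing info never change the value.
import hashlib
from PIL import Image  # Pillow, used here purely for illustration

def image_checksum(path):
    with Image.open(path) as img:
        pixel_bytes = img.convert("RGB").tobytes()  # decoded image only,
    return hashlib.sha256(pixel_bytes).hexdigest()  # no EXIF or indexing info

# Two files with identical images but different metadata produce the same
# checksum, which is what makes this useful for duplicate detection below.
```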
  • Indexing info for Jim Jones is shown in the person tag at 20430.
• Each person may be given a unique face identifier, such as Person_FaceID shown in FIG. 204, that uniquely identifies a face recognized by the facial recognition software in the U-Me system.
• the indexing info for Jim Jones shown at 20430 includes a Person_FaceID of 4296, a Person_FullName of Jim Jones, a Person_PreferredName of Jimmy, a Person_Age of 53, and four Person_Relationship entries that specify one or more relationships in the U-Me system, including a relationship that specifies Jim Jones is the spouse of Pat Jones, is the father of Bill Jones, is the father-in-law of Jenny Jones, and is a grandpa of Todd Jones.
  • the indexing info for a person preferably includes all relationships for that person, both user-defined and system-derived. Indexing info for Todd Jones is shown in the person tag at 20440. Note the age of Todd Jones is shown as 2, and Todd's family relationships are also indicated. As stated above, Todd's age could additionally be shown in a number of months since he is a young child under some specified age, such as three.
  • the indexing info at 20450 is due to the facial and feature recognition engine recognizing a Christmas tree in the photo.
  • the indexing info at 20460 includes all location info for the photo. We assume the photo was taken in Jim's house. Thus, the location information includes the name Jim & Pat's House with the address, city, state, zip and country. In an alternative implementation, a separate location tag could be created for each item that specifies a location.
• Indexing info 20470 and 20480 are also shown, which correspond to two defined events. Indexing info 20470 identifies the event as Christmas, and a date of 2012/12/25, while indexing info 20480 identifies the event as Christmas Holidays with a corresponding date range (for example, December 20, 2012 through January 1, 2013, using the system definition discussed above).
  • the indexing info shown in FIG. 204 is used by the U-Me system to create indexes that allow easily identifying many different pieces of information that can be used in formulating a sophisticated search.
• If Pat Jones does a search for photos of all her grandchildren, this photo would be returned in the search because of the tag Person_Relationship:Grandson:Pat Jones.
• If Jim does a search for all photos taken during the Christmas Holidays for the years 2008-2012, this picture will also be returned in the search because of the tag that defines the event as Christmas Holidays for the specified date range.
• If Pat does a search for all photos of Todd taken at Jim & Pat's house, this photo would also be returned.
• the "tags" used with respect to the indexing info are different from the kind of "tag" known in the art for tagging photos. This potential confusion is caused by the use of the label "tag" for two different things in the art.
  • a tag in the indexing info described herein could be a markup language tag that identifies some attribute of the photo.
• Known tags for photos, in contrast, are simple labels. Thus, a user can use Google's Picasa software and service to perform facial recognition on photos and to tag photos with the names of the recognized people. These simple tags are labels that are searchable, but do not contain any relationship information, ages of people in the photos, events, etc.
• FIGS. 205 and 206 illustrate how a first user called P1 can share photo-related information with another user called P2.
• features of P1's U-Me account related to the photo mechanism are shown in FIG. 205 to include People Info, Location Info, Event Info, Face Info, and Photo Info, which is a database of P1's photos that includes photos and associated indexing info.
• the indexing info could be stored separate from the digital photo file as shown in FIG. 196, or could be stored as part of the digital photo file as shown in FIG. 197.
• P1 wants to share all of P1's photos with P2, as shown in method 20200 in FIG. 202.
• Because P1 selected "Share All" when sharing P1's photos with P2, when P2 creates P2's U-Me account, all of the People Info, Location Info, and Event Info are copied to P2's account, as shown by the dotted lines in FIG. 206. Once the info has been copied, it can be edited by P2 in any suitable way. For example, let's assume Jim Jones enters a preferred name of Cookie for his wife Pat because that is the nickname he always uses to refer to Pat. If Jim shares his photo info with his son Bill, it is likely that Bill will want to change the preferred name of Cookie for Pat to something else, like Mom. In addition, if Jim defined locations for Jim & Pat's House and Jim & Pat's Property as illustrated in FIG. 179, Bill could change the location names to Mom & Dad's House and Mom & Dad's Property.
• P2 could thus use the copied definitions for People Info, Location Info and Event Info from P1's account as a shortcut to defining people, locations and events in P2's account.
• the sharing of photos between P1 and P2 is done in a way similar to the sharing of face info.
• the photos remain in P1's account, and are not copied to P2's account.
• P2's account includes photo info that is indexing info for the photos that can be separate from the digital photo files.
• P1's account has photos Photo1, . . ., PhotoN and corresponding indexing info for P1, labeled P1_II1, . . ., P1_IIN.
• the indexing info can be copied to P2's account, and the indexing info can then be modified as desired by P2.
• This results in P2 having its own set of indexing info P2_II1, . . ., P2_IIN for the photos stored in P1's account.
  • the U-Me system thus allows users to share photos and related information in a very efficient manner while still providing customization for each user, and while providing an incredibly powerful search engine that can retrieve photos based on relationships, locations and events.
• the U-Me photo mechanism may include a way for a person such as P2 in FIG. 206 to decide to share some of P1's photos but not all. For example, a woman P1 might take seventy photos of P1's daughter at her birthday party, and may share all seventy of those photos with her mom P2. Her mom may not want or need to share all seventy of those photos.
• the U-Me photo mechanism could display thumbnails for P1's photos, then allow P2 to select which of those P2 wants to share. This could become an issue regarding how a user is billed for using the U-Me system.
  • Pat Jones decides to become a U-Me subscriber, takes the time to enter all the information for the people in her family, including birth dates, wedding dates, events, locations, etc. and has the U-Me system generate the indexing info for all of her photos.
  • This significant amount of work can be to the benefit of other users who Pat invites to share her photos, such as her husband and her children. While some of the preferred names may change, the vast majority of information entered by Pat will apply to her children and spouse as well. This allows one enthused family member to do the majority of the work in defining people and relationships in the U-Me system, and creates an incredibly powerful shortcut for others that are invited to share that family member's photos and related info.
  • Method 20700 assumes the existing tags are stored as part of the photo metadata in a digital photo file.
  • the photo metadata is processed for existing tags (step 20710).
  • a list of existing tags is displayed to the user (step 20720).
  • the user is then allowed to correlate existing tags with defined people, locations and events in the user's U-Me account (step 20730).
  • Indexing info is then generated based on the people, locations and events corresponding to the existing tags (step 20740).
  • Identifiers for the photos are compared (step 20810).
  • One suitable example for identifiers that could be used to compare photos is a checksum that is computed over all the data in the image portion of a digital photo file. When two checksums match, it is very, very likely the photos are duplicates.
  • Photos that have the same identifiers are marked as possible duplicates (step 20820).
  • a list of possible duplicates is then displayed to the user (step 20830). The user can then identify duplicates from the list and delete any duplicates (step 20840). By detecting and deleting duplicates as shown in method 20800, the U-Me photo mechanism avoids needless storage and processing of duplicate photos.
  • the U-Me photo mechanism thus includes the capability of importing a file that specifies people and relationships (step 20910), such as a .GEDCOM file generated by most popular genealogy software, such as Roots Magic. Photo system data is generated in the user's U-Me account for the people in the imported file (step 20920). System-derived relationships are then derived based on the relationships in the imported file (step 20930).
  • a genealogy file such as a .GEDCOM file can thus provide a significant shortcut for importing data regarding people and family relationships into a user's U-Me account.
  • the file need not necessarily be a genealogy file, but could be any type of file that represents people and/or relationships between people, such as an organization chart for a business.
• a significant advantage of the U-Me system is the ability to dynamically update indexing info for a person when information is added or changed. For example, let's assume the photo mechanism recognizes a person's face in 49 photos, but the user has not yet correlated the face to a defined person. The face is represented by a face ID, as shown by way of example in FIG. 204. Once the face is correlated by the user to a defined person, the attributes for that person may be dynamically added to the indexing info.
  • a method 21000 illustrates how additions or changes are automatically propagated to indexing info for a user's photos.
  • An addition or change to a person's people, relationships, locations or events is detected (step 21010).
  • This change is then propagated to the indexing info for all affected photos (step 21020).
  • Jim & Pat occasionally spend Christmas at a cabin in the mountains that they rent.
• the owners offer to sell the cabin to Jim & Pat, who accept and purchase the cabin.
  • a location is defined for "Our Cabin”
  • all pictures that include the geographic information for the cabin will be updated to include the location name "Our Cabin.”
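• The propagation step might look something like the sketch below, which re-tests each stored photo's geocode against the newly defined region; the polygon test is passed in as a function (for example, the ray-casting test from the earlier location sketch), and all other structures are illustrative.

```python
# Hypothetical sketch of step 21020: propagate a new location definition
# to the indexing info of all affected photos.

def propagate_location(photos, location_name, region, point_in_region):
    for photo in photos:
        geocode = photo.get("geocode")            # (lat, lon) or None
        if geocode and point_in_region(geocode[0], geocode[1], region):
            locs = photo.setdefault("locations", [])
            if location_name not in locs:
                locs.append(location_name)        # update the photo's index

# After Jim & Pat buy the cabin and define its region:
# propagate_location(all_photos, "Our Cabin", cabin_region, point_in_region)
```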
• the U-Me system lends itself to some great features for the photo mechanism. For example, creating a person in a user's U-Me account can result in automatically creating a "container" corresponding to that person in the user's U-Me account.
  • the user can select any container corresponding to any person, which will result in displaying or listing all photos that person is in. Note the containers do not "contain" the photos in a literal sense, but contain pointers to the photos stored in the user's photo database. Note the displayed photos for a person can be organized in any suitable way, such as chronologically, alphabetically according to location, etc.
  • a method 21100 shows how the user may access the user's data and/or licensed content and/or settings that are stored in the user's U-Me account.
  • the user authenticates to the U- Me system (step 21110).
  • the user identifies a location that is U-Me certified (step 21120).
  • the U-Me system reads the location settings and compares the location settings with the user settings (step 21130).
  • the conversion of settings is preferably performed by the conversion mechanism 160 shown in FIG. 5.
  • the user settings that correspond to the location are then downloaded to devices at the location (step 21160).
  • a universal remote control 21210 is shown to include a processor 21220 coupled to a memory 21230, a touch-screen display 21240, an equipment communication interface 21250, an external database communication interface 21260, and a code reader 21281 via a system bus 21280.
  • Batteries 21272 preferably provide power to a power supply 21270, which provides power to the various components in the universal remote control 21210.
  • a different power source than batteries could be used, such as power from a wall-plug DC adapter. Batteries 21272 are preferred so the universal remote control 21210 can be used without a power cord.
  • Memory 21230 preferably contains a display programming mechanism 21232, a dynamic location- based programming mechanism 21234, and an internal database 21238.
• the display programming mechanism 21232 allows dynamically programming the touch-screen display 21240 so the graphical icons 21242 are programmed to the correct channel numbers for a specified location. If any of the channels represented by graphical icons 21242 are not available in the specified location, the display programming mechanism 21232 could delete those graphical icons, or could show those graphical icons "grayed out", meaning they still show up but are not selectable by the user.
  • the dynamic location-based programming mechanism 21234 functions according to a specified location 21236.
• the specified location 21236 can be specified in any suitable way. For example, the user could enter a numeric identifier that identifies the specified location. The user could use the code reader 21281 to read a suitable machine-readable code or identifier, such as text, a QR code, a barcode, or any other machine-readable identifier.
  • the specified location 21236 could be determined from a GPS device internal to the universal remote control 21210.
  • the specified location 21236 could be determined by communicating any suitable location-specific information, such as IP address, to an external database such as 21290, which could return a location based on the IP address.
  • Processor 21220 may be constructed from one or more microprocessors and/or integrated circuits. Processor 21220 executes program instructions stored in memory 21230. Memory 21230 stores programs and data that processor 21220 may access.
• While the universal remote control 21210 is shown to contain only a single processor and a single system bus, those skilled in the art will appreciate that a universal remote control as disclosed and claimed herein may have multiple processors and/or multiple buses.
  • the interfaces that are used preferably each include separate, fully programmed microprocessors that are used to off-load compute- intensive processing from processor 21220.
  • these functions may be performed using I/O adapters as well.
  • Touch-screen display 21240 includes graphical icons 21242 that can be selected by a user touching the icon on the display 21240.
• Equipment communication interface 21250 is used to transmit commands to equipment or devices. For the specific example shown in FIG. 212, the equipment communication interface 21250 communicates with a TV 21252, a DVR 21254, a DVD player 21256, and an audio receiver 21258.
• the equipment communication interface 21250 can send commands in any suitable format or combination of formats. For example, let's assume the television 21252 is Wi-Fi enabled, which means it can be controlled via commands sent via the Wi-Fi network. Let's further assume the DVR 21254, DVD 21256 and audio receiver 21258 are not Wi-Fi enabled, but are controlled by infrared signals.
  • the equipment communication interface 21250 would include a Wi-Fi interface that communicates via a Wi-Fi network with the TV 21252, and an infrared interface that communicates with the DVR 21254, DVD 21256 and audio receiver 21258. Note the equipment communication interface 21250 could also communicate with external hardware 21259, which then communicates with equipment.
  • the equipment interface 21250 could be a Wi-Fi interface that communicates directly with the TV and with external hardware 21259 that includes an infrared transmitter so the commands sent via the Wi-Fi interface to the external hardware 21259 can be converted to corresponding commands on the infrared transmitter.
  • the equipment communication interface 21250 can include any suitable interface or combination of interfaces to control any suitable equipment or device.
  • the external database communication interface 21260 provides an interface to an external database 21290 that includes location-specific programming parameters 21292. Any suitable interface 21260 can be used for communicating with any suitable type of external database 21290.
  • the external database communication interface 21260 could be a wireless interface that connects via Wi-Fi to a website that provides the external database 21290.
  • the location-specific programming parameters 21292 include all information needed to program the universal remote control 21210 for the specified location 21236.
  • the location-specific parameters 21292 may include location-specific information, such as which devices are at the specified location, while the programming codes for the devices are stored in the internal database 21238.
  • the universal remote control 21210 can thus be programmed automatically to control equipment (devices) at the specified location by the universal remote control interacting with the external database 21290 to determine the location- specific programming parameters 21292 that correspond to the specified location.
  • the universal remote control 21210 is thus more "universally remote” than known universal remote controls, because it can be easily and automatically programmed to suit different locations using location-specific programming parameters.
  • the universal remote control 21210 is thus universal across locations, not just universal in being programmable to control a large number of devices, as is known in the art.
  • aspects of the disclosed universal remote control may be embodied as a system, method or computer program product. Accordingly, aspects of the universal remote control may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit,” “module” or “system.” Furthermore, aspects of the universal remote control may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the universal remote control may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • the universal remote control 21210 shown in FIG. 212 can be implemented in a number of different ways. Examples shown in FIG. 213 include a smart phone with an app; a smart phone with external hardware and an app; a tablet computer with an app; a tablet computer with external hardware and an app; and a dedicated universal remote control. Of course, other implementations are also possible within the scope of the disclosure and claims herein.
  • a method 21400 shows how to setup location-specific programming parameters 21292 in the external database 21290 shown in FIG. 212.
  • a location is specified (step 21410).
  • a TV provider at the specified location is specified (step 21420).
  • a channel map for the specified TV provider is determined (step 21430).
  • Devices at the specified location are specified (step 21440).
  • Devices in step 21440 refers to equipment at the specified location that will be controlled by the universal remote control.
• Programming codes are then specified for each device at the specified location (step 21450). Note that method 21400 could be repeated for each location specified in the external database 21290. Note also that step 21450 could be optional if the programming codes are included in the internal database 21238 shown in FIG. 212.
• a sample entry 21500 in the external database 21290 in FIG. 212 is shown in FIG. 215.
  • entry 21500 could be generated using method 21400 in FIG. 214.
  • Entry 21500 includes a location identifier 124987 that uniquely identifies a location.
  • the entry 21500 specifies the TV provider at that location, which is DirecTV for this location.
  • the channel map is the DirecTV channel map, which will correlate channels provided by DirecTV at the specified location with the corresponding channel numbers.
  • the devices in entry 21500 include a DirecTV DVR, a Samsung TV, and a Sony Blu Ray player, with their corresponding code sets. Because the entry 21500 includes all location-specific information, the entry 21500 can be used to program the universal remote control for the specified location.
  • entry 21500 is one suitable implementation for the location-specific programming parameters 21292 shown in FIG. 212.
  • the external database 21290 includes many entries similar to 21500 in FIG. 215 that each specifies programming parameters for a different location.
  • the code sets shown in FIG. 215 need not be in the entries in the external database, but could optionally be stored in the internal database 21238 shown in FIG. 212.
• a method 21600 is preferably performed by the dynamic location-based programming mechanism 21234 shown in FIG. 212.
  • a location is specified (step 21610).
  • the location can be specified in any suitable manner.
  • One suitable manner for specifying a location is for the user to enter a unique identifier for the location.
• Another suitable manner is for the remote control to use its code reader to read a machine-readable representation of a unique identifier, such as a text identifier, a QR code, a barcode, or any other machine-readable identifier.
  • the external database is then accessed by the universal remote control to determine the TV provider, channel map, devices and corresponding programming codes for the specified location (step 21620).
• the remote control then programs itself for the specified location using the TV provider, channel map, devices and corresponding programming codes (step 21630). While the most preferred implementation includes all of the information shown in entry 21500 in FIG. 215 for each location, in an alternative implementation the code sets could be stored in the internal database 21238 of the universal remote control as shown in FIG. 212, while the remainder of the information is stored in the entry in the external database.
  • location-specific information could be distributed across multiple databases.
  • an entry could include the location, TV provider and channel map fields shown in FIG. 215, along with a link to a different entry in a different database that specifies the devices at the location.
• the disclosure and claims herein expressly extend to storing location-specific programming parameters in any suitable number of locations, and accessing the location-specific programming parameters in any suitable number of locations to program the universal remote control 21210 for a specified location.
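• Putting the pieces of method 21600 together, a hedged sketch follows in which the remote fetches an entry mirroring FIG. 215 and programs itself; the database layout, class, and values are illustrative only.

```python
# Hypothetical sketch of method 21600: program a universal remote for a
# specified location from location-specific programming parameters.
external_database = {
    "124987": {                                      # entry mirroring FIG. 215
        "tv_provider": "DirecTV",
        "channel_map": {"ESPN": 206, "HBO": 501},    # station -> channel number
        "devices": {"DirecTV DVR": "code_set_A",
                    "Samsung TV": "code_set_B",
                    "Sony Blu Ray": "code_set_C"},
    },
}

class UniversalRemote:
    def program_for_location(self, location_id):
        entry = external_database[location_id]       # step 21620: access database
        self.provider = entry["tv_provider"]         # step 21630: program itself
        self.channel_map = entry["channel_map"]
        self.device_codes = entry["devices"]

    def channel_button(self, station):
        # graphical icons resolve to location-correct channel numbers;
        # None would mean the icon is deleted or grayed out
        return self.channel_map.get(station)

remote = UniversalRemote()
remote.program_for_location("124987")   # identifier e.g. from a scanned QR code
print(remote.channel_button("ESPN"))    # 206 at this location
```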
• Method 21700 in FIG. 217 is a method for converting user settings from a first vehicle to a second vehicle and for configuring the second vehicle using those settings.
  • the user settings for the first vehicle are stored (step 21710).
  • the user settings for the first vehicle are then converted to corresponding user settings for a second vehicle (step 21720).
• the second vehicle could be the same type as the first vehicle, could be a similar type to the first vehicle, or could be a different type than the first vehicle.
  • the user settings for the second vehicle are then downloaded to the second vehicle (step 21730).
  • the second vehicle is then configured using the downloaded user settings (step 21740).
  • Method 21700 allows a user to rent a rental car of a type the user has never before driven, and have the rental car automatically configured according to the user's settings on a different car.
  • the "type" of vehicle can vary.
  • a second vehicle could be considered the “same type” as the first vehicle when the first and second vehicles have the exact same set of settings. This could happen, for example, when the two vehicles are the same vehicle just one model year apart.
  • "same set of settings” means all of the corresponding settings are expressed in the same units or in the same manner.
• a setting for horizontal driver seat position measured in distance from the floor is the same setting as a setting in a different vehicle that also specifies horizontal driver seat position measured in distance from the floor.
  • a setting for horizontal seat position measured in distance from the ceiling is a different setting than a setting for horizontal seat position measured in distance from the floor.
  • a second vehicle could be considered a "similar type” as the first vehicle when the first and second vehicles share most of the same settings, but there are differences as well. This could happen, for example, between different models from the same manufacturer.
  • a second vehicle could be considered a "different type” as the first vehicle when the first and second vehicles do not share most of the same settings. This could happen, for example, between different models from different manufacturers.
  • the disclosure and claims herein expressly extend to any suitable way to define type of a vehicle, and the principles herein apply regardless of whether the settings are being converted between vehicles of the same type, between vehicles of a similar type, or between vehicles of different types.
  • the process for converting settings from the first vehicle to corresponding settings for the second vehicle varies according to whether the vehicles are of the same type, of a similar type, or of a different type.
  • the conversion in step 21720 may produce user settings for the second vehicle that are identical to the user settings for the first vehicle. In this case, conversion means simply using the same settings for the second vehicle as used for the first vehicle.
  • the conversion in step 21720 may produce user settings for the second vehicle, some of which may be the same as the settings for the first vehicle, and some or all of which may be different than the user settings for the first vehicle. One way such a conversion could look is sketched below.
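As a rough sketch of the conversion in step 21720, the code below copies settings shared by both vehicles and converts a setting expressed differently, using the seat-position example above. The setting keys, units, and cabin-height parameter are invented for illustration.

```python
# Hypothetical conversion for method 21700, step 21720. A shared setting
# copies through unchanged; a setting expressed differently (here, seat
# position measured from the floor versus from the ceiling) is re-derived.
def convert_vehicle_settings(first: dict, target_keys: set,
                             cabin_height_cm: float = 120.0) -> dict:
    second = {}
    for key, value in first.items():
        if key in target_keys:
            # Same setting in the same units: copy (same or similar type).
            second[key] = value
        elif key == "seat_pos_from_floor_cm" and "seat_pos_from_ceiling_cm" in target_keys:
            # Different type: re-express the measurement relative to the ceiling.
            second["seat_pos_from_ceiling_cm"] = cabin_height_cm - value
    return second

first_car = {"seat_pos_from_floor_cm": 45.0, "mirror_tilt_deg": 12.0}

# Same type: the second vehicle uses the exact same set of settings.
print(convert_vehicle_settings(first_car,
                               {"seat_pos_from_floor_cm", "mirror_tilt_deg"}))

# Different type: the second vehicle measures seat position from the ceiling.
print(convert_vehicle_settings(first_car,
                               {"seat_pos_from_ceiling_cm", "mirror_tilt_deg"}))
```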
  • a television receiver 21800 includes one or more processors 21810, a memory 21820, one or more television receivers 21850A, ..., 21850N that receive input from a TV signal input 21860, which receives a TV signal from a TV signal source, a TV signal output 21870 that is output to a television, a network connection 21880, and an external device interface 21890 that allows transferring user settings to and from an external device 21892.
  • Receivers 21850 could be, for example, tuners that can each receive a single television station.
  • the memory 21820 preferably includes system code 21822, system settings 21824, user settings 21826, a user settings transfer mechanism 21830, and recorded shows 21840 (assuming the television receiver 21800 is a digital video recorder (DVR)).
  • System code 21822 includes code executed by the one or more processors 21810 to make the television receiver 21800 function.
  • System settings 21824 are settings that are required for the system to function correctly. For example, when the television receiver 21800 is a satellite television receiver, the system settings 21824 could include all system information required for the receiver to connect to one or more satellites. While this information may be entered by a human installer, these are different than "user settings" because the system settings are the settings needed for the television receiver 21800 to function properly.
  • the user settings 21826 are settings such as those configured by the end user (e.g., satellite TV subscriber) that specify preferences the user can set on the receiver 21800. Examples of suitable user television settings are shown in FIG. 32. While the channel numbers for stations 3250 are typically defined by the system and not by the user, they may be included in user television settings 835 so suitable translation of channel numbers can be performed, as described above.
  • the user settings transfer mechanism 21830 includes a user settings external write mechanism 21832, a user settings external read mechanism 21834, and a user settings conversion mechanism 21836.
  • the user settings external write mechanism 21832 allows writing the user settings 21826, and possibly some of the system settings 21824, to an external device, such as external device 21892 coupled to external device interface 21890.
  • the user settings written to the external device 21892 are represented in FIG. 218 as user settings 21894.
  • the user settings external read mechanism 21834 allows reading user settings from an external device, such as reading user settings 21894 from external device 21892.
  • the user settings external write mechanism 21832 thus writes user settings 21826 to the user settings 21894 in the external device 21892, while the user settings external read mechanism 21834 reads user settings 21894 from the external device 21892.
  • the external device 21892 could be any device capable of storing the user settings.
  • a thumb drive with a universal serial bus (USB) interface is one suitable example of an external device 21892, which can be used by plugging the thumb drive into a suitable USB port on the television receiver 21800.
  • Such a USB port is one suitable example for the external device interface 21890.
  • an external device 21892 is a smart phone that could be coupled to the television receiver in any suitable way, including via a Wi-Fi connection to the network connection 21880, via a Bluetooth interface (one specific implementation for external device interface 21890), or via a direct cable-type connection (e.g., USB) to a USB port, which is another suitable implementation of the external device interface 21890.
  • an external device 21892 is the U-Me system, which could be accessed via the network connection 21880.
  • the disclosure and claims herein expressly extend to any suitable external device capable of storing user settings, whether currently known or developed in the future, which can communicate with the television receiver 21800 in any suitable way, whether currently known or developed in the future.
  • the user settings conversion mechanism 21836 includes logic to convert user settings from one type of television receiver to another. This logic can include direct conversion between device settings, conversion to and from a universal template, and conversion from one device to a universal template followed by conversion from the universal template to the second device, as discussed in detail above with respect to FIGS. 24-30.
  • the DirecTV settings could be read by the user settings external read mechanism 21834, and could then be converted to equivalent Dish Network user settings by the user settings conversion mechanism 21836.
  • the converted settings may then be written to the user settings 21826, thereby programming the Dish Network receiver with settings similar to those used on the DirecTV system, represented by the user settings 21894.
  • the user settings conversion mechanism 21836 could include the logic to convert in both directions, both from the television receiver 21800 to one or more different receivers, and from one or more different receivers to the television receiver 21800.
  • the user could specify to convert the user settings in the Dish Network receiver to equivalent DirecTV settings, which could then be written to a thumb drive. Because the settings on the thumb drive are then DirecTV settings, the user could then plug the thumb drive into a USB port on the DirecTV receiver at his parents' cabin, and the DirecTV receiver could then program itself from those settings on the thumb drive. A rough sketch of this export/import flow appears below.
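The thumb-drive scenario above might look roughly like the sketch below, assuming JSON files and an invented mapping between the two providers' setting names; neither the file format nor the key names come from the disclosure.

```python
# Illustrative write -> convert -> read flow between two receiver types.
# The key mapping and settings are invented stand-ins for real schemas.
import json

DISH_TO_DIRECTV_KEYS = {"favorites_list": "favorite_channels",
                        "timers": "series_recordings"}

def export_to_thumb_drive(dish_settings: dict, path: str) -> None:
    # User settings conversion mechanism 21836 followed by the external
    # write mechanism 21832: convert the key names, then write the file.
    directv = {DISH_TO_DIRECTV_KEYS.get(k, k): v for k, v in dish_settings.items()}
    with open(path, "w") as f:
        json.dump(directv, f)

def import_from_thumb_drive(path: str) -> dict:
    # External read mechanism 21834 on the DirecTV receiver: the settings
    # were converted before export, so they can be applied directly.
    with open(path) as f:
        return json.load(f)

export_to_thumb_drive({"favorites_list": ["CNN", "ESPN"], "timers": ["News"]},
                      "settings.json")
print(import_from_thumb_drive("settings.json"))
```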
  • the ability to store user settings external to the television receiver is a great advantage in many different scenarios.
  • One such scenario is when a user upgrades to a new television receiver. For example, let's assume a Dish Network customer decides to upgrade from his current DVR that can record two channels at a time to a newer DVR that can record four channels at a time. There is currently no known way to transfer settings between the old DVR and the new DVR. The user is stuck with having to manually enter all the user settings, including those shown in FIG. 32, which can take a long time to do. With the ability to store user settings to an external device as disclosed herein, the user could store the user's settings from the old DVR to the external device, and the new DVR could then read those settings.
  • If any conversion is needed between the old settings and the new settings, this can be handled by the user settings conversion mechanism 21836.
  • the result is that most or all of the user's settings are available on the new DVR, thus greatly enhancing the ease and convenience of upgrading to a newer DVR.
  • Another scenario is described above, where a user wants to take his settings with him to program his parents' DVR at their cabin.
  • Another scenario is when a user wants to program a DVR at a hotel or vacation rental.
  • the ability to store user settings external to the television receiver thus provides a very powerful tool that enhances the convenience of the user.
  • a method 21900 represents one suitable method that could be performed by the television receiver 21800 in FIG. 218.
  • the settings for Device 1 are read (step 21910). These settings are then stored to an external device (step 21920). Method 21900 in FIG. 219 could be performed, for example, by the user settings external write mechanism 21832.
  • a method 22000 represents one suitable method that could be performed by the television receiver 21800 in FIG. 218.
  • the settings for Device 1 are read from the external device (step 22010).
  • the settings for Device 2 are determined from the settings for Device 1 (step 22020).
  • Device 2 is programmed with the settings for Device 1 (step 22030).
  • the settings are read from the external device in step 22010, for example, by the user settings external read mechanism 21834, while determining the settings for Device 2 from the settings for Device 1 could be performed, for example, by the user settings conversion mechanism 21836.
  • the user settings conversion mechanism 21836 could convert the settings in any suitable way, including the ways discussed in detail above with reference to FIGS. 24-30.
  • FIG. 218 shows the user settings conversion mechanism 21836 residing in the memory 21820 of a television receiver 21800.
  • the function of the user settings conversion mechanism 21836 in FIG. 218 could additionally be performed or could alternatively be performed by the conversion mechanism 160 in the U-Me system 100 shown in FIG. 5.
  • the user settings external write mechanism 21832 would write the user settings 21826 to the U-Me system, which could then convert those settings to corresponding settings for a second device.
  • the settings for the second device could then be read from the U-Me system by a user settings external read mechanism in the second device, which can then be programmed with those settings.
  • a user's settings might only be needed for a defined period of time, such as a temporary stay in a hotel or rental condo. In such a scenario it would be desirable to be able to clear out the user's settings once the user's stay is over.
  • Method 22100 in FIG. 221 shows one suitable method for doing this.
  • Default settings are defined for a television receiver Device 1 (step 22110). Default settings can be any suitable set of user settings, which could include a lack of settings as when Device 1 was new, or any set of user settings the hotel manager or condo owner may want to define.
  • the programming of Device 1 with external user settings is enabled (step 22120).
  • in step 22120, the hotel manager or condo owner sends a signal to Device 1 that allows the user to program Device 1 with his or her settings stored on an external device, such as the U-Me system, a thumb drive, a smart phone, etc.
  • the user programs Device 1 with the user's settings read from the external device (step 22130).
  • the user can then use Device 1 with the user's settings.
  • the reset to default settings in step 22140 can be made in any suitable way. For example, the user could enter into the DVR the date the user is checking out, and the DVR could then reset itself on that date at the appropriate checkout time.
  • the hotel manager or condo owner could send a message to the DVR that instructs the DVR to reset itself to the default settings.
  • Another option is for the hotel manager or condo owner to use a privileged mode in the DVR to define the time period when the user will be staying.
  • the disclosure and claims herein expressly extend to any suitable way to allow a user to program the user's settings to a device, followed by the device being reset to its default settings at some later time.
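A compact sketch of method 22100 follows, using the privileged-mode option in which the owner defines the guest's checkout time. The class, its methods, and the polling-style reset are hypothetical stand-ins for the disclosed steps.

```python
# Hypothetical rental DVR that restores defaults after checkout (method 22100).
from datetime import datetime

class RentalDvr:
    def __init__(self, default_settings: dict):
        self.default_settings = dict(default_settings)  # step 22110
        self.settings = dict(default_settings)
        self.reset_at = None

    def enable_guest_programming(self, checkout: datetime) -> None:
        self.reset_at = checkout                        # step 22120

    def apply_guest_settings(self, guest_settings: dict) -> None:
        self.settings.update(guest_settings)            # step 22130

    def tick(self, now: datetime) -> None:
        # Step 22140: once checkout time passes, restore the defaults.
        if self.reset_at and now >= self.reset_at:
            self.settings = dict(self.default_settings)
            self.reset_at = None

dvr = RentalDvr({"favorites": []})
dvr.enable_guest_programming(datetime(2015, 6, 1, 11, 0))
dvr.apply_guest_settings({"favorites": ["CNN"]})
dvr.tick(datetime(2015, 6, 1, 11, 1))
print(dvr.settings)  # -> {'favorites': []}
```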
  • FIG. 34 shows a suitable hierarchy of templates related to physical devices.
  • A master template 140A shown in FIG. 34 is one suitable implementation for the user settings shown by way of example as 140A in FIGS. 1 and 2.
  • the master template 140A in FIG. 34 could be one suitable implementation for master template 975 shown in FIG. 9.
  • Master template 140A includes settings for multiple physical devices for one user.
  • the master template 140A is a superset that includes all settings for all of the user's devices, and thus serves as a central repository for storing all of a user's settings.
  • the master template 140A includes settings from a single user.
  • the U-Me system could have a repository for settings from many different users, with a master template corresponding to each user.
  • the master template 140A may include all of the user's settings, which may include, for example, phone settings, tablet settings, laptop settings, desktop settings, TV settings, software settings, vehicle settings, home automation settings, gaming settings, audio settings, and security settings. Because the U-Me system is intended to include any settings a user may have, the U-Me system could include settings for all of a user's devices, even those that are developed in the future.
  • the master template 140A as shown in FIG. 34 may include any and all settings for a user.
  • the U-Me system may optionally include one or more universal templates 152 as discussed above with reference to FIGS. 5 and 9.
  • the universal templates 152 may include, for example, one or more phone templates, one or more tablet templates, one or more laptop templates, one or more desktop templates, one or more TV templates, one or more software templates, one or more vehicle templates, one or more home automation templates, one or more gaming templates, one or more audio templates, and one or more security templates.
  • the U-Me system also includes one or more device-specific templates 154 as discussed above with reference to FIGS. 5 and 10-21.
  • the device-specific templates 154 may include, for example, one or more phone templates, one or more tablet templates, one or more laptop templates, one or more desktop templates, one or more TV templates, one or more software templates, one or more vehicle templates, one or more home automation templates, one or more gaming templates, one or more audio templates, and one or more security templates, as shown in detail in FIGS. 10-21.
  • Physical devices 150 may include any suitable device the user may use, and which includes one or more user settings a user may set. Physical devices 150 may include, for example, one or more phones, one or more tablet computers, one or more laptop computers, one or more desktop computers, one or more TV devices, one or more vehicles, one or more home automation systems, one or more gaming systems, one or more audio systems, one or more security systems, and any other physical device a user may use.
  • As indicated by arrows in FIG. 34, settings in physical devices 150 are stored in corresponding device-specific templates 154, which serve as virtual clones of the physical device. The settings in the device-specific templates 154 may be stored in one or more universal templates 152.
  • the settings in the universal templates 152 may be stored in the master template 140A. Note, however, the universal templates 152 are optional.
  • the master template 140A may store settings directly to device-specific templates 154, and the device-specific templates 154 may store settings directly to the master template 140A.
  • the universal templates 152, when used, provide a more organized hierarchy of templates that may help in the process of mapping user settings between templates and between templates and physical devices.
  • each template includes mapping information to other templates or to a physical device.
  • device-specific templates 154 preferably include settings 3550 that correspond to settings in the physical devices 150, and additionally include inter-template mappings 3560 and device mappings 3570.
  • Inter-template mappings 3560 indicate how the user settings 3550 map to corresponding settings in one or more other templates.
  • Device mappings 3570 indicate how the settings 3550 map to the corresponding physical devices 150. In one specific implementation, device mappings 3570 may not be needed when the settings 3550 correspond exactly to the user settings on a physical device 150.
  • the device mappings 3570 may indicate how to map the user settings 3550 to the settings on a physical device 150.
  • Universal templates 152 may include settings 3530 and inter-template mappings 3540 that indicate how the settings 3530 are mapped to settings in other templates.
  • Master template 140A includes settings 3510 and inter-template mappings 3520 that indicate how the settings 3510 are mapped to settings in other templates.
  • each template includes the mappings that map the user settings.
  • the inter-template mappings 3520, 3540 and 3560, and the device mappings 3570 could be stored separately from the templates.
  • the mappings 3520, 3540, 3560 and 3570 could all be part of the user settings mapping information 326 shown in FIG. 3.
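To make the template-plus-mappings arrangement concrete, the sketch below stores inter-template mappings 3560 inside a device-specific template and uses them to push a setting up to the master template 140A. The dictionary layout and key names are assumptions, not the disclosed format.

```python
# Hypothetical template layout: each template carries its own settings plus
# mapping information to its neighbor (inter-template mappings 3560).
device_template = {
    "settings": {"ringtone.john": "High Tide"},
    "inter_template_mappings": {"ringtone.john": "contacts.john.ringtone"},
}
master_template = {"settings": {}, "inter_template_mappings": {}}  # 140A

def push_to_master(device_tpl: dict, master_tpl: dict) -> None:
    # Map each device-specific setting name to its master-template name,
    # then store the value under that name in the master template.
    for key, value in device_tpl["settings"].items():
        master_key = device_tpl["inter_template_mappings"].get(key, key)
        master_tpl["settings"][master_key] = value

push_to_master(device_template, master_template)
print(master_template["settings"])  # -> {'contacts.john.ringtone': 'High Tide'}
```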
  • a method 3600 begins by receiving a user setting from a physical device (step 3610). This could happen, for example, when a user changes a setting on an existing physical device.
  • the user setting is stored to the device-specific template corresponding to the physical device (step 3620).
  • the mapping information is then used to map the user setting to the master template (step 3630). Note this may include mapping between multiple levels of universal templates to the master template.
  • the user setting is then stored to the master template (step 3640).
  • Method 3600 illustrates how all user settings are stored both in a device-specific template as well as in the master template.
  • the master template in the most preferred implementation thus becomes a superset of all user settings for all of a user's devices, and is thus a central repository for all of a user's settings.
  • FIG. 37 shows a method 3700 for storing a setting from a master template to a physical device.
  • a user setting is read from the master template (step 3710).
  • the mapping information is then used to map the user setting to a corresponding device -specific template (step 3720).
  • the user setting is then mapped to one or more physical devices (step 3730).
  • the user setting is then stored to the corresponding one or more physical devices (step 3740).
  • one setting in the master template may map to multiple physical devices.
  • when the user changes a setting such as a contact's address, the changed address may be written to the user's phone, laptop computer and desktop computer using method 3700 shown in FIG. 37; a short sketch of this fan-out follows.
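The fan-out of method 3700 might look like the following sketch, in which one master-template key maps to a differently named setting in each device-specific template; all structures and key names are invented for illustration.

```python
# Hypothetical fan-out for method 3700: one master setting written to every
# device-specific template (and, by extension, device) that carries it.
master = {"contacts.john.address": "12 Elm St"}

device_templates = {
    "phone":   {"map": {"contacts.john.address": "contact/john/addr"}, "settings": {}},
    "laptop":  {"map": {"contacts.john.address": "addressbook.john.address"}, "settings": {}},
    "desktop": {"map": {"contacts.john.address": "people.john.home_address"}, "settings": {}},
}

def fan_out(master_key: str) -> None:
    value = master[master_key]                   # step 3710: read from master
    for tpl in device_templates.values():
        device_key = tpl["map"].get(master_key)  # step 3720: map to template
        if device_key is not None:
            tpl["settings"][device_key] = value  # steps 3730-3740: store

fan_out("contacts.john.address")
print(device_templates["phone"]["settings"])  # -> {'contact/john/addr': '12 Elm St'}
```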
  • method 3800 begins when an incompatibility in user settings is detected (step 3810).
  • the incompatibility in user settings is displayed to the user (step 3820).
  • the user may then select the preferred setting (step 3830).
  • the preferred setting is then stored in the master template (step 3840).
  • the incompatibility is then logged (step 3850).
  • An example of incompatible settings is shown in FIG. 39. We assume the user has two phones, a Samsung Galaxy S3 and an iPhone 4.
  • the user has a contact John Smith that has a ringtone of High Tide on the Samsung Galaxy S3, while the same contact John Smith on the iPhone 4 has a ringtone of Harp.
  • the device-specific templates 154 shown in FIG. 39 reflect these settings in the respective corresponding physical devices.
  • the user selects the preferred ringtone in step 3830. This setting is then stored in the master template in step 3840. While this simple example shows how to resolve an incompatibility, there may be reasons why an incompatibility cannot be resolved by the user.
  • both ringtones could be stored in the universal template and in the master template.
  • the information logged in step 3850 may indicate to the U-Me system when settings are not compatible, and thus multiple settings for different devices should be stored.
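Method 3800's detect/select/store/log sequence can be sketched as shown below, with the user's selection in step 3830 simulated by a callback and the log of step 3850 kept as a plain list; these are stand-ins, not the disclosed mechanism.

```python
# Hypothetical conflict resolution for method 3800, using the ringtone example.
incompat_log = []

def resolve_incompatibility(setting: str, values: dict, choose) -> str:
    # Steps 3810-3820: an incompatibility exists when the devices disagree.
    if len(set(values.values())) > 1:
        preferred = choose(values)               # step 3830: user picks one
        incompat_log.append((setting, values))   # step 3850: log the conflict
        return preferred                         # step 3840: store in master
    return next(iter(values.values()))

ringtones = {"Samsung Galaxy S3": "High Tide", "iPhone 4": "Harp"}
master_value = resolve_incompatibility(
    "contacts.john.ringtone", ringtones,
    choose=lambda v: v["Samsung Galaxy S3"],  # stand-in for the user's selection
)
print(master_value, incompat_log)
```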
  • FIG. 40 illustrates that multiple levels of universal templates may be optionally employed between the device-specific templates and the master template.
  • the settings from the master template 140A are mapped to and from a computer universal template 152A.
  • the settings to and from the computer universal template 152A are mapped to and from a laptop computer universal template 152B and to and from a desktop computer universal template 152C.
  • the settings in the laptop computer universal template 152B are mapped to and from a Dell N5110 laptop computer device-specific template 154A, which sends settings to and receives settings from a Dell N5110 laptop computer, thereby serving as a virtual clone of the Dell N5110 laptop computer.
  • the settings in the desktop computer universal template 152C are mapped to and from an HP Pavilion 500-205t desktop computer device-specific template 154B, which sends settings to and receives settings from an HP Pavilion 500-205t desktop computer, thereby serving as a virtual clone of the HP Pavilion 500-205t desktop computer.
  • Including multiple levels of universal templates allows creating a hierarchy of templates that eases the process of mapping between templates.
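A multi-level hierarchy like the one in FIG. 40 can be pictured as a chain of per-level mappings, as in this sketch. Each hop is a simple key rename here, though real mappings could also convert units or formats; the key names are invented.

```python
# Hypothetical mapping chain: device-specific template -> laptop universal
# template -> computer universal template -> master template 140A.
HOPS = [
    {"wifi.home_ssid": "network.primary_ssid"},  # Dell N5110 -> laptop universal
    {"network.primary_ssid": "net.ssid"},        # laptop -> computer universal
    {"net.ssid": "user.network.ssid"},           # computer universal -> master
]

def map_up(key: str) -> str:
    # Apply each level's inter-template mapping in turn.
    for hop in HOPS:
        key = hop.get(key, key)
    return key

print(map_up("wifi.home_ssid"))  # -> user.network.ssid
```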
  • Embodiment 1 A computer system comprising:
  • at least one processor;
  • a memory coupled to the at least one processor;
  • user data corresponding to a first user;
  • first user settings corresponding to the first user for a plurality of software applications;
  • second user settings corresponding to the first user for a plurality of physical devices; and a software mechanism executed by the at least one processor that makes the user data, the first user settings, and the second user settings available to the first user on a first device used by the first user.
  • Embodiment 2 The computer system of Embodiment 1, further comprising an authentication mechanism that authenticates the first user on the first device before the software mechanism makes the user data, the first user settings, and the second user settings available to the first user on the first device.
  • Embodiment 3 The computer system of Embodiment 2, wherein the authentication mechanism uses biometric authentication of the first user.
  • Embodiment 4 The computer system of Embodiment 3, wherein the biometric authentication comprises scanning a fingerprint of the first user and comparing the scanned fingerprint against a previously- stored reference fingerprint for the first user.
  • Embodiment 5 The computer system of Embodiment 1, wherein the user data corresponding to the first user comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos.
  • Embodiment 6 The computer system of Embodiment 1, further comprising licensed content in the memory that is licensed to the first user, wherein the software mechanism makes the licensed content available to the first user on the first device.
  • Embodiment 7 The computer system of Embodiment 6, wherein the licensed content comprises music, movies, electronic books, software and games.
  • Embodiment 8 The computer system of Embodiment 1, wherein the first user settings comprise first user preferences for each of the plurality of software applications.
  • Embodiment 9. The computer system of Embodiment 1, wherein the second user settings comprise second user preferences for each of the plurality of physical devices.
  • Embodiment 10 The computer system of Embodiment 1, wherein the plurality of physical devices comprises a mobile phone and a computer system.
  • Embodiment 11 The computer system of Embodiment 1, further comprising a conversion mechanism that converts user settings for a first physical device to user settings for a second physical device.
  • Embodiment 12 The computer system of Embodiment 11, wherein the first physical device and second physical device are of a similar type.
  • Embodiment 13 The computer system of Embodiment 11, wherein the first physical device and second physical device are of different types.
  • Embodiment 14 A computer-implemented method executing on at least one processor comprising the steps of:
  • storing user data corresponding to a first user;
  • storing first user settings corresponding to the first user for a plurality of software applications;
  • storing second user settings corresponding to the first user for a plurality of physical devices; and making the user data, the first user settings, and the second user settings available to the first user on a first device used by the first user.
  • Embodiment 15 The method of Embodiment 14, further comprising the step of authenticating the first user on the first device before making the user data, the first user settings, and the second user settings available to the first user on the first device.
  • Embodiment 16 The method of Embodiment 15, wherein the step of authenticating the first user performs biometric authentication of the first user.
  • Embodiment 17 The method of Embodiment 16, wherein the biometric authentication comprises scanning a fingerprint of the first user and comparing the scanned fingerprint against a previously-stored reference fingerprint for the first user.
  • Embodiment 18 The method of Embodiment 14, wherein the user data corresponding to the first user comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos.
  • Embodiment 19 The method of Embodiment 14, further comprising the step of storing licensed content that is licensed to the first user and making the licensed content available to the first user on the first device.
  • Embodiment 20 The method of Embodiment 19, wherein the licensed content comprises music, movies, electronic books, software and games.
  • Embodiment 21 The method of Embodiment 14, wherein the first user settings comprise first user preferences for each of the plurality of software applications.
  • Embodiment 22 The method of Embodiment 14, wherein the second user settings comprise second user preferences for each of the plurality of physical devices.
  • Embodiment 23 The method of Embodiment 14, wherein the plurality of physical devices comprises a mobile phone and a computer system.
  • Embodiment 24 The method of Embodiment 14, further comprising a conversion mechanism that converts user settings for a first physical device to user settings for a second physical device.
  • Embodiment 25 The method of Embodiment 24, wherein the first physical device and second physical device are of a similar type.
  • Embodiment 26 The method of Embodiment 24, wherein the first physical device and second physical device are of different types.
  • Embodiment 27 A computer-implemented method executing on at least one processor comprising the steps of:
  • storing user data corresponding to a first user, wherein the user data comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos;
  • storing licensed content that is licensed to the first user, wherein the licensed content comprises music, movies, electronic books, software and games;
  • storing first user settings corresponding to the first user for a plurality of software applications, wherein the first user settings comprise first user preferences for each of the plurality of software applications;
  • storing second user settings corresponding to the first user for a plurality of physical devices, wherein the second user settings comprise second user preferences for each of the plurality of physical devices, wherein the plurality of physical devices comprises a mobile phone and a computer system; and making the user data, the licensed content, the first user settings, and the second user settings available to the first user on a first device used by the first user.
  • Embodiment 1 A computer system comprising:
  • at least one processor;
  • a memory coupled to the at least one processor;
  • user-defined information residing in the memory for a plurality of people, including at least one user-defined relationship between the plurality of people and at least one system-derived relationship derived from the at least one user-defined relationship; and
  • a photo mechanism residing in the memory and executed by the at least one processor, the photo mechanism generating indexing information for a digital photo file that includes at least one of the at least one user-defined relationship and the at least one system-derived relationship.
  • Embodiment 2 The computer system of Embodiment 1, wherein the photo mechanism further generates an event in the indexing information for the digital photo file, where the event comprises at least one of:
  • at least one user-defined event; and
  • at least one system-derived event that is derived from the at least one user-defined event.
  • Embodiment 3 The computer system of Embodiment 2, wherein the at least one user-defined event comprises a birth date of a person and the at least one system-derived event comprises an age of the person computed from a date for the digital photo file and the birth date of the person.
  • Embodiment 4 The computer system of Embodiment 3, wherein the at least one system-derived event comprises a birthday of the person.
  • Embodiment 5 The computer system of Embodiment 1, wherein the at least one user-defined event comprises a wedding date of a person and the at least one system-derived event comprises an anniversary of the person computed from a date for the digital photo file and the wedding date of the person.
  • Embodiment 6 The computer system of Embodiment 1, wherein the photo mechanism further generates a location in the indexing information for the photo, where the location comprises at least one of a user-defined location and a system-defined location.
  • Embodiment 7 The computer system of Embodiment 1, wherein the photo mechanism includes a facial recognition mechanism that recognizes at least one face in an image in the digital photo file and allows a user to correlate the recognized face to one of the plurality of people.
  • Embodiment 8 The computer system of Embodiment 1, wherein the indexing information is stored separate from the digital photo file.
  • Embodiment 9 The computer system of Embodiment 1, wherein the photo mechanism allows a user to add new people, to modify at least one of the plurality of people, and to modify the at least one user-defined relationship, and in response, the photo mechanism updates the indexing information for a plurality of digital photo files to reflect the addition or modification by the user.
  • Embodiment 10 A computer-implemented method executing on at least one processor comprising:
  • generating indexing information for a digital photo file that includes at least one of the at least one user-defined relationship and the at least one system-derived relationship.
  • Embodiment 11 The method of Embodiment 10, further comprising: generating an event in the indexing information for the digital photo file, where the event comprises at least one of:
  • Embodiment 12 The method of Embodiment 11, wherein the at least one user-defined event comprises a birth date of a person and the at least one system-derived event comprises an age of the person computed from a date of the digital photo file and the birth date of the person.
  • Embodiment 13 The method of Embodiment 12, wherein the at least one system-derived event comprises a birthday of the person.
  • Embodiment 14 The method of Embodiment 10, wherein the at least one user-defined event comprises a wedding date of a person and the at least one system-derived event comprises an anniversary of the person computed from a date for the digital photo file and the wedding date of the person.
  • Embodiment 15 The method of Embodiment 10, further comprising generating a location in the indexing information for the photo, where the location comprises at least one of a user-defined location and a system-defined location.
  • Embodiment 16 The method of Embodiment 10, further comprising performing facial recognition that recognizes at least one face in the photo and allows a user to correlate the recognized face to one of the plurality of people.
  • Embodiment 17 The method of Embodiment 10, further comprising storing the indexing information separate from the digital photo file.
  • Embodiment 18 The method of Embodiment 10, further comprising detecting when a user adds new people, modifies at least one of the plurality of people, and modifies the at least one user-defined relationship, and in response, updating the indexing information for a plurality of digital photo files to reflect the detected addition or modification by the user.
  • Embodiment 19 A computer-implemented method executing on at least one processor comprising:
  • storing user-defined information for a plurality of people including at least one user-defined relationship between the plurality of people and at least one user-defined event for a person that comprises at least one of a birth date and a wedding date;
  • identifying a person in an image in the digital photo file by performing facial recognition that recognizes a face in the image and allows a user to correlate the recognized face to one of the plurality of people; generating at least one system-derived event for the identified person that is derived from the at least one user-defined event, the at least one system-derived event comprising at least one of:
  • generating a location for the digital photo file, wherein the location comprises at least one of a user-defined location derived from the geocode information in the digital photo file and a system-defined location derived from the geocode information in the digital photo file;
  • generating indexing information for the digital photo file that includes the one of the plurality of people corresponding to the recognized face, at least one user-defined relationship between the one of the plurality of people and at least one other person, at least one system-derived relationship between the one of the plurality of people and at least one other person, at least one system-derived event for the one of the plurality of people, and the location for the photo; and
  • storing the indexing information for the digital photo file.
  • Embodiment 20 The method of Embodiment 19, wherein storing the indexing information for the digital photo file comprises storing the indexing information separate from the digital photo file.
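The system-derived events named in Embodiments 12-14 above (age, birthday, anniversary) could be derived from a photo's date and a person's user-defined dates roughly as sketched below; the helper name and return format are assumptions.

```python
# Hypothetical derivation of system events from a photo date and
# user-defined dates (birth date, wedding date).
from datetime import date
from typing import Optional

def derive_events(photo_date: date, birth: date,
                  wedding: Optional[date] = None) -> dict:
    events = {}
    # Age at the time of the photo, adjusted if the birthday hasn't occurred yet.
    age = photo_date.year - birth.year - (
        (photo_date.month, photo_date.day) < (birth.month, birth.day))
    events["age"] = age
    if (photo_date.month, photo_date.day) == (birth.month, birth.day):
        events["birthday"] = age   # photo taken on the person's birthday
    if wedding and (photo_date.month, photo_date.day) == (wedding.month, wedding.day):
        events["anniversary"] = photo_date.year - wedding.year
    return events

# Photo taken on the subject's 33rd birthday:
print(derive_events(date(2013, 7, 4), birth=date(1980, 7, 4)))
```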
  • Embodiment 1 An apparatus comprising:
  • at least one processor;
  • a memory coupled to the at least one processor;
  • a display coupled to the at least one processor that displays a plurality of graphical icons that may be selected by a user touching the display;
  • an equipment communication interface that transmits a plurality of commands to control equipment external to the apparatus;
  • an external database communication interface that allows retrieving at least one programming parameter from an external database that includes a plurality of entries for a plurality of locations, wherein each of the plurality of entries includes at least one programming parameter for equipment at each location; and a dynamic location-based programming mechanism residing in the memory and executed by the at least one processor, the dynamic location-based programming mechanism accessing the external database using the external communication interface to retrieve the at least one programming parameter for a specified location and dynamically programming the apparatus using the at least one programming parameter for the specified location to make the apparatus control equipment at the specified location by transmitting a plurality of commands via the equipment communication interface.
  • Embodiment 2 The apparatus of Embodiment 1, wherein the equipment communication interface comprises a wireless interface and the external communication interface comprises the wireless interface.
  • Embodiment 3 The apparatus of Embodiment 1, wherein the at least one programming parameter for the specified location comprises:
  • Embodiment 4 The apparatus of Embodiment 3, wherein dynamically programming the apparatus comprises programming the plurality of graphical icons on the display to correspond to the channel map for the television provider at the specified location and programming the apparatus to control the equipment at the specified location using programming codes for controlling the equipment at the specified location.
  • Embodiment 5. The apparatus of Embodiment 1, wherein a user inputs information to the apparatus that identifies the specified location.
  • Embodiment 6 The apparatus of Embodiment 1, further comprising an interface for reading a machine-readable code, wherein the specified location is determined by the interface reading a machine-readable code corresponding to the specified location.
  • Embodiment 7 The apparatus of Embodiment 1, wherein the equipment communication interface transmits the plurality of commands to external hardware that, in turn, transmits a corresponding plurality of codes to the equipment external to the apparatus.
  • Embodiment 8 The apparatus of Embodiment 1, wherein the apparatus comprises a mobile phone running a corresponding application.
  • Embodiment 9 The apparatus of Embodiment 1, wherein the apparatus comprises a tablet computer running a corresponding application.
  • Embodiment 10 The apparatus of Embodiment 1, wherein the apparatus comprises a dedicated universal remote control.
  • Embodiment 11 A method for programming a remote control comprising:
  • each of the plurality of entries includes at least one programming parameter for equipment at each location; retrieving from the external database the at least one programming parameter for the specified location;
  • Embodiment 12 The method of Embodiment 11, wherein the equipment communication interface comprises a wireless interface.
  • Embodiment 13 The method of Embodiment 11, wherein the at least one programming parameter for the specified location comprises:
  • Embodiment 14 The method of Embodiment 13, wherein dynamically programming the remote control comprises programming a plurality of graphical icons on a display on the remote control to correspond to the channel map for the television provider at the specified location and programming the remote control to control the equipment at the specified location using programming codes for controlling the equipment at the specified location.
  • Embodiment 15 The method of Embodiment 11, further comprising the step of a user inputting information to the remote control that identifies the specified location.
  • Embodiment 16 The method of Embodiment 11, further comprising the step of the remote control determining the specified location by reading a machine-readable code corresponding to the specified location.
  • Embodiment 17 The method of Embodiment 11, wherein the remote control transmits the plurality of commands to external hardware that, in turn, transmits a corresponding plurality of codes to equipment.
  • Embodiment 18 The method of Embodiment 11, wherein the remote control comprises a mobile phone running a corresponding application.
  • Embodiment 19 The method of Embodiment 11, wherein the remote control comprises a tablet computer running a corresponding application.
  • Embodiment 20 The method of Embodiment 11, wherein the remote control comprises a dedicated universal remote control.
  • Embodiment 21 A method for programming a remote control comprising:
  • for each of a plurality of specified locations, specifying: a television provider at the specified location and a corresponding channel map that provides television channels available from the television provider along with corresponding channel numbers;
  • Embodiment 1 A computer system comprising:
  • at least one processor;
  • a memory coupled to the at least one processor;
  • first user settings residing in the memory corresponding to a user for a first vehicle;
  • a conversion mechanism residing in the memory and executed by the at least one processor, the conversion mechanism converting the first user settings for the first vehicle to corresponding second user settings for the user for a second vehicle; and
  • a software mechanism executed by the at least one processor that downloads the second user settings to the second vehicle.
  • Embodiment 2 The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise seat position for at least one seat.
  • Embodiment 3 The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise mirror position for at least one mirror.
  • Embodiment 4 The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise at least one climate control setting.
  • Embodiment 5 The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise audio presets.
  • Embodiment 6 The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise music licensed to the user.
  • Embodiment 7 The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise video licensed to the user.
  • Embodiment 8 The computer system of Embodiment 1, wherein the first vehicle and the second vehicle are of a similar type.
  • Embodiment 9 The computer system of Embodiment 1, wherein the first vehicle and the second vehicle are of different types.
  • Embodiment 10 A computer-implemented method executing on at least one processor comprising:
  • Embodiment 11 The method of Embodiment 10, further comprising:
  • Embodiment 12 The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise seat position for at least one seat.
  • Embodiment 13 The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise mirror position for at least one mirror.
  • Embodiment 14 The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise at least one climate control setting.
  • Embodiment 15 The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise audio presets.
  • Embodiment 16 The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise music licensed to the user.
  • Embodiment 17 The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise video licensed to the user.
  • Embodiment 18 The method of Embodiment 10, wherein the first vehicle and the second vehicle are of a similar type.
  • Embodiment 19 The method of Embodiment 10, wherein the first vehicle and the second vehicle are of different types.
  • Embodiment 20 A computer-implemented method executing on at least one processor comprising:
  • storing first user settings corresponding to a user for a first vehicle, wherein the first user settings comprise:
  • at least one climate control setting;
  • Embodiment 1 A computer system comprising:
  • at least one processor;
  • a memory coupled to the at least one processor;
  • first user settings residing in the memory corresponding to a user for a first device;
  • a conversion mechanism residing in the memory and executed by the at least one processor, the conversion mechanism converting the first user settings for the first device to corresponding second user settings for the user for a second device; and
  • a software mechanism executed by the at least one processor that downloads the second user settings to the second device.
  • Embodiment 2 The computer system of Embodiment 1, wherein the first device comprises a first television receiver device and the second device comprises a second television receiver device.
  • Embodiment 3 The computer system of Embodiment 2, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise shows set to record.
  • Embodiment 4 The computer system of Embodiment 2, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise blocked channels and parental controls.
  • Embodiment 5 The computer system of Embodiment 2, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise at least one favorite channels list.
  • Embodiment 6 The computer system of Embodiment 2, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise at least one password.
  • Embodiment 7 The computer system of Embodiment 1, wherein the first device and the second device have the same hardware architecture type and have the same system software type.
  • Embodiment 8 The computer system of Embodiment 1, wherein the first device and the second device have a different hardware architecture type and a different system software type.
  • Embodiment 9 A computer-implemented method executing on at least one processor comprising:
  • Embodiment 10 The method of Embodiment 9, further comprising:
  • Embodiment 11 The method of Embodiment 9, wherein the first device comprises a first television receiver device and the second device comprises a second television receiver device.
  • Embodiment 12 The method of Embodiment 11, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise shows set to record.
  • Embodiment 13 The method of Embodiment 11, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise blocked channels and parental controls.
  • Embodiment 14 The method of Embodiment 11, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise at least one favorite channels list.
  • Embodiment 15 The method of Embodiment 11, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise at least one password.
  • Embodiment 16 The method of Embodiment 9, wherein the first device and the second device have the same hardware architecture type and have the same system software type.
  • Embodiment 17 The method of Embodiment 9, wherein the first device and the second device have a different hardware architecture type and a different system software type.
  • Embodiment 18 An apparatus comprising:
  • at least one processor;
  • a memory coupled to the at least one processor, the memory comprising a first plurality of user television settings defined by a user;
  • a television signal input for receiving a television signal from a television signal source
  • a television signal output for sending a television signal to a television
  • a user settings transfer mechanism residing in the memory and executed by the at least one processor that reads a second plurality of user television settings from an external device coupled to the apparatus and programs at least one of the first plurality of user television settings based on at least one of the second plurality of user television settings.
  • Embodiment 19 The apparatus of Embodiment 18, wherein the external device comprises a computer system coupled via a network connection to the apparatus.
  • Embodiment 20 The apparatus of Embodiment 18, wherein the external device comprises a removable memory coupled to the apparatus.
  • Embodiment 1 An apparatus comprising:
  • at least one processor;
  • a memory coupled to the at least one processor;
  • a plurality of device-specific templates residing in the memory, each comprising a plurality of user settings for a first user for a corresponding physical device;
  • a master template corresponding to the first user that includes the plurality of user settings stored in the plurality of device-specific templates;
  • mapping information residing in the memory for mapping the user settings for the first user in the plurality of device-specific templates to the master template and for mapping the user settings for the first user in the master template to the plurality of device-specific templates;
  • a user settings mechanism residing in the memory and executed by the at least one processor that receives a first user setting from a first physical device used by the first user, stores the first user setting in a first device-specific template corresponding to the first physical device, uses the mapping information to map the first user setting in the first device-specific template to a corresponding setting in the master template, and stores the corresponding setting in the master template.
  • Embodiment 2 The apparatus of Embodiment 1, wherein the user settings mechanism reads a second user setting for the first user from the master template, uses the mapping information to map the second user setting in the master template to a corresponding setting in a second device-specific template, stores the corresponding setting in the second device-specific template, and stores the corresponding setting to a physical device corresponding to the second device-specific template.
  • Embodiment 3 The apparatus of Embodiment 1, wherein the master template includes a superset of all user settings for the first user stored in all of the plurality of device -specific templates and wherein the master template is a repository for all of the user settings for the first user.
  • Embodiment 4 The apparatus of Embodiment 1, wherein the mapping information comprises information for mapping user settings from the plurality of device-specific templates to a plurality of universal templates that correspond to a plurality of device types, and information for mapping user settings from the universal templates to the master template.
  • Embodiment 5 The apparatus of Embodiment 4, wherein each of the plurality of device types is defined by a combination of hardware architecture and system software.
  • Embodiment 6 The apparatus of Embodiment 5, wherein the combination of hardware architecture and system software for each universal template is different than the combination of hardware architecture and system software for other universal templates.
  • Embodiment 7 The apparatus of Embodiment 1, wherein the user settings mechanism uses the mapping information to map the corresponding setting in the master template to a second corresponding setting in a second device-specific template corresponding to the first user, stores the second corresponding setting to the second device-specific template, and stores the second corresponding setting to a second physical device corresponding to the second device-specific template.
  • Embodiment 8 The apparatus of Embodiment 7, wherein the first physical device has a first combination of hardware architecture and system software and the second physical device has a second combination of hardware architecture and system software different than the first combination of hardware architecture and system software.
  • Embodiment 9 A computer-implemented method executed by a processor for managing user settings, the method comprising:

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • Multimedia (AREA)
  • Technology Law (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

A cloud-based computer system changes the modern paradigm from being device-centric to being person-centric. The system makes all user data, software settings, device settings, and licensed content for a user available in the cloud. The system includes a conversion mechanism that can convert information intended for one device type to a different device type. Thus, a user changing smart phone platforms can convert their current smart phone settings to equivalent settings on the new phone platform, and their new phone can then be configured using the user's converted settings stored in the cloud. By storing all the user's relevant information in the cloud, this information may be accessed anywhere and may be used to configure many different devices according to the user's settings.

Description

MAKING A USER'S DATA, SETTINGS, AND LICENSED CONTENT AVAILABLE IN THE
CLOUD
BACKGROUND
[0001] 1. Technical Field
[0002] This disclosure generally relates to computer systems, and more specifically relates to making information relating to a user available in the cloud and to multiple devices used by the user.
[0003] 2. Background Art
[0004] Modern technology has greatly simplified many aspects of our lives. For example, the Internet has made vast amounts of information available at the click of a mouse. Smart phones allow not only making phone calls, but also provide a mobile computing platform by providing the ability to run apps, view e-mail, and access many different types of information, including calendar, contacts, etc.
[0005] Some cloud-based services allow storing data in the cloud, and providing access to that data from any device that has Internet access. Dropbox is an example of a cloud-based file service. A subscriber to Dropbox defines a file folder that is synchronized to the cloud, then all data written to the file folder will be automatically stored in the cloud, making that data automatically available to the user via any device that has an Internet connection. While services like Dropbox are very useful, they have their drawbacks. For example, a Dropbox user must remember to store data in a Dropbox folder or sub-folder. Many different software applications have default settings that save files to a folder that may not be a Dropbox folder. The user must know to change the default folder settings to a Dropbox folder if the data is to be available via
Dropbox. But many users lack the knowledge or sophistication to realize all the changes that need to be made to a computer to assure all of the user's data is stored to Dropbox. As a result, if the user's hard drive crashes and data is not recoverable from the hard drive, the user may discover some of their data was not stored to a Dropbox folder or sub-folder, resulting in loss of that data when the hard drive crashed.
[0006] The evolution of modern technology has resulted in a world that is "device-centric." This means each device must be configured to a user's needs. If a user owns a smart phone, tablet computer, and laptop computer, the user must take the time to configure each of these devices to his or her liking. This effort represents a significant investment of time for the user. For example, let's assume a user has been using the iPhone 4 for over a year, and decides to change to the Samsung Galaxy S4 phone. Depending on the vendor of the Samsung Galaxy S4 phone, the vendor may be able to transfer the phone contacts on the iPhone 4 to the new Samsung phone, but none of the apps or other data can be transferred. As a result, the decision to change to a new smart phone platform will require hours of time for the user to download apps and configure the new phone to his or her liking. The same problem exists when a user buys a new computer. The user must take the time to install all the software the user wants to use on the computer, and must take the time to configure the desired settings and preferences on the new computer. Again, this can be a very time-consuming proposition. It is not unusual for a user to spend many hours installing software and configuring a new computer system to his or her liking. For professionals who do not have the support of an IT department, taking the time to configure a new computer system either takes hours out of their work day, or takes hours of their personal time after work. In either case, the user loses hours of valuable time setting up a new computer system.
[0007] Most home electronics come with dedicated remote controls. When a person has many different electronic devices to control, the result can be many different remote controls sitting on an end table or coffee table. It is frustrating for a user to fumble through many different remote controls to find the right one, then to figure out which buttons to push to accomplish the function the user wants. To eliminate having to deal with many different remote controls with different key layouts, universal remote controls were developed that can be programmed to control multiple pieces of equipment using the same keys. For example, a universal remote control could include buttons for TV, DVR, Receiver and DVD. A user can push one of these buttons, and the user's keypresses will then control the corresponding equipment.
Universal remote controls have greatly reduced the number of remote controls the user must deal with to control the user's home electronics. These remote controls are called "universal remotes" because of the ability to program them to accommodate a large number of devices from many different vendors.
[0008] Programming a universal remote control to control a wide variety of different electronic equipment from a number of different vendors is possible because the universal remote control can include a database of vendor models of equipment and corresponding codes for controlling the equipment. When the user selects a vendor model or code, the remote control can program itself with the set of codes that control the vendor's equipment. This process can be repeated for each piece of equipment the user wants to control with the universal remote control. Thus, a user could program a known universal remote control to control a Samsung television, a DirecTV digital video recorder (DVR), and a Sony DVD player. The ability to program a universal remote control to support different equipment provides the capability for a user to customize the remote control. However, once the remote control is programmed for the user's equipment, the programming for the remote control typically does not change until the user adds a new piece of equipment or replaces an existing piece of equipment with different equipment.
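By way of illustration only, the lookup just described could be implemented as a table keyed by vendor and equipment type. The following Python sketch is hypothetical; the vendors, code values, and function names are invented for illustration and are not taken from any actual remote control database.

    # Hypothetical sketch: a universal remote programs itself from a database
    # of vendor models and corresponding code sets. All data is illustrative.
    CODE_DATABASE = {
        ("Samsung", "TV"): {"power": 0x0E0E, "volume_up": 0x0E07},
        ("Sony", "DVD"): {"power": 0x1A90, "eject": 0x1A68},
    }

    class UniversalRemote:
        def __init__(self):
            self.buttons = {}  # device label -> button-to-code map

        def program_device(self, label, vendor, equipment_type):
            codes = CODE_DATABASE.get((vendor, equipment_type))
            if codes is None:
                raise KeyError("no code set for this vendor/equipment type")
            self.buttons[label] = dict(codes)  # remote now stores the code set

    remote = UniversalRemote()
    remote.program_device("TV", "Samsung", "TV")
    remote.program_device("DVD", "Sony", "DVD")
    print(remote.buttons["TV"]["power"])  # code sent when the TV power key is pressed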
[0009] Some universal remote controls use touch-screens that display graphical symbols called icons that may be selected by a user to perform certain functions. Thus, a universal remote control with a touch screen may present a CNN icon, and when the user selects the CNN icon by pressing it on the touch screen, the remote control will send the appropriate command to change the channel to CNN.
[0010] Proliferation of technology in our modern lives has extended to our vehicles. Many modern cars and trucks include power seats, power mirrors, and separate heating and air conditioning settings for the driver and passenger sides. Vehicles are thus device-centric as well, requiring a user to configure a vehicle to the user's liking. Thus, when a user rents a car or purchases a new car, the user must take the time to configure the car to the user's liking.
[0011] Not only must a user configure each of his or her devices, the configuration and capabilities of each device differ greatly. Apps installed on a smart phone are not made to run on a laptop or desktop computer. Software installed on a desktop or laptop computer is not made to run on smart phones. The result is the user must configure each device and install the software or apps to make the device as functional as the user needs it to be. This requires significant thought and expertise from the user to know how to configure each device.
BRIEF SUMMARY
[0012] A cloud-based computer system referred to herein as "Universal Me" or "U-Me" changes the modern paradigm from being device-centric to being person-centric. The system makes all user data, settings, and licensed content for a user available in the cloud. The system includes a conversion mechanism that can convert information intended for one device type to a different device type. Thus, a user changing smart phone platforms can convert their current smart phone settings to equivalent settings on the new phone platform, and their new phone can then be configured using the user's converted settings stored in the cloud. By storing all the user's relevant information in the cloud, this information may be accessed by the user anywhere and may be used to configure a number of different devices according to the user's data and settings.
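As a purely illustrative sketch of the conversion mechanism described above, settings captured on one phone platform could be mapped to a platform-neutral form and then to the equivalent fields of a second platform. Every field name in the following Python sketch is an invented assumption, not an actual platform setting.

    # Sketch: convert platform-A settings to platform-B settings through a
    # platform-neutral ("universal") representation. Names are illustrative.
    PLATFORM_A_TO_UNIVERSAL = {"ring_volume": "ringer.level", "bg_image": "wallpaper"}
    UNIVERSAL_TO_PLATFORM_B = {"ringer.level": "sound.ringtone_volume",
                               "wallpaper": "home.background"}

    def convert(settings_a):
        universal = {PLATFORM_A_TO_UNIVERSAL[k]: v for k, v in settings_a.items()
                     if k in PLATFORM_A_TO_UNIVERSAL}          # device -> universal
        return {UNIVERSAL_TO_PLATFORM_B[k]: v for k, v in universal.items()
                if k in UNIVERSAL_TO_PLATFORM_B}               # universal -> device

    print(convert({"ring_volume": 7, "bg_image": "beach.jpg"}))
    # -> {'sound.ringtone_volume': 7, 'home.background': 'beach.jpg'}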
[0013] The U-Me system includes a photo processing mechanism that allows cataloging and storing a user's photos using relationships between people that allow the user's photos to be retrieved using a search engine. A user enters people and specifies relationships, and may also enter locations, events, and other information. Photos are then processed, and indexing info is generated for each photo that may include any or all of the following: user-defined relationships, system-derived relationships, user-defined locations, system-defined locations, user-defined events, and system-derived events and ages for the people in the photos. The indexing info is used to catalog a photo for easy retrieval later. The indexing info may be stored as metadata with the photo or may be stored separately from the photo. The indexing info allows photos to be retrieved using a powerful search engine.
[0014] The U-Me system includes a universal remote control that allows dynamically programming the remote control according to location. The remote control includes a communication interface that allows the remote control to communicate with a remote database. A location is specified, and the remote control uses the location information to access corresponding programming information for the remote control. The remote control is then dynamically reprogrammed according to the location information to make the remote control suitable to the location.
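The following hypothetical Python sketch illustrates the location-driven reprogramming just described; the database layout, location names, and equipment identifiers are invented for illustration.

    # Sketch: a remote control fetches the programming parameters stored for a
    # location from a remote database and reprograms itself. Data is illustrative.
    LOCATION_DB = {
        "home": {"TV": "Samsung-TV", "DVR": "DirecTV-DVR"},
        "cabin": {"TV": "Vizio-TV"},
    }

    class LocationAwareRemote:
        def __init__(self, db):
            self.db = db                 # stands in for the communication interface
            self.active_profile = {}

        def reprogram_for(self, location):
            profile = self.db.get(location)
            if profile is None:
                raise KeyError("no programming stored for this location")
            self.active_profile = dict(profile)  # replaces the prior programming

    remote = LocationAwareRemote(LOCATION_DB)
    remote.reprogram_for("cabin")
    print(remote.active_profile)  # -> {'TV': 'Vizio-TV'}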
[0015] The U-Me system may make a user's information available in a vehicle. The system includes a conversion mechanism that can convert information intended for one device type to a different device type. Thus, a user driving a Chevrolet can store the settings for the Chevrolet, which can then be converted to equivalent settings for any other vehicle, including vehicles from different manufacturers.
[0016] The U-Me system allows transferring user settings from a first device to a second device that has the same hardware architecture type and the same system software type as the first device. A conversion mechanism also allows converting user settings for a first device to corresponding user settings for a second device that has a different hardware architecture type and/or different system software type. In addition to transferring user settings from the cloud, the user settings for the first device can be transferred to an external device, which may then be connected to a second device, which can then use the user settings on the external device to program the second device. A television receiver, such as a cable box, a digital video recorder (DVR), a satellite television receiver, etc. is one example of a device that can be programmed from settings of a different device.
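One plausible way to convert a seat setting between manufacturers, sketched below under assumed adjustment ranges, is to normalize the setting to a physical measurement and then map that measurement onto the target vehicle's own range. The step counts, track lengths, and function names are all invented for illustration.

    # Sketch: convert a seat position between two vehicles by normalizing to
    # millimeters of travel along the seat track. All numbers are illustrative.
    def to_universal(seat_steps, steps_total, track_length_mm):
        return seat_steps / steps_total * track_length_mm   # mm from the front stop

    def to_vehicle(universal_mm, steps_total, track_length_mm):
        steps = round(universal_mm / track_length_mm * steps_total)
        return max(0, min(steps_total, steps))              # clamp to the valid range

    mm = to_universal(40, 100, 250.0)   # source vehicle: step 40 of 100, 250 mm track
    print(to_vehicle(mm, 80, 230.0))    # equivalent step on a different vehicle -> 35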
[0017] In the U-Me system, multiple templates provide mapping information from physical devices to a master template that serves as a central repository for all of a user's settings for all of a user's devices. The templates also provide mapping information that allows mapping settings between different physical devices, between physical devices and other templates, and between templates. A user settings mechanism uses the mapping information to propagate user settings stored in one template to other templates and to one or more physical devices, and to propagate user settings stored in a physical device to multiple templates, including a master template that serves as a central repository for all of a user's settings.
[0018] The foregoing and other features and advantages will be apparent from the following more particular description, as illustrated in the accompanying drawings.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
[0019] The disclosure will be described in conjunction with the appended drawings, where like designations denote like elements, and:
[0020] FIG. 1 is a block diagram showing the Universal Me (U-Me) system;
[0021] FIG. 2 is a block diagram showing additional details of the U-Me system;
[0022] FIG. 3 is a block diagram showing a computer system that runs the U-Me system;
[0023] FIG. 4 is a block diagram showing how a user using a physical device can access information in the
U-Me system;
[0024] FIG. 5 is a block diagram showing various features of the U-Me system;
[0025] FIG. 6 is a block diagram showing examples of user data;
[0026] FIG. 7 is a block diagram showing examples of user licensed content;
[0027] FIG. 8 is a block diagram showing examples of user settings;
[0028] FIG. 9 is a block diagram showing examples of universal templates;
[0029] FIG. 10 is a block diagram showing examples of device-specific templates;
[0030] FIG. 11 is a block diagram showing examples of phone templates;
[0031] FIG. 12 is a block diagram showing examples of tablet templates;
[0032] FIG. 13 is a block diagram showing examples of laptop templates;
[0033] FIG. 14 is a block diagram showing examples of desktop templates;
[0034] FIG. 15 is a block diagram showing examples of television templates;
[0035] FIG. 16 is a block diagram showing examples of software templates;
[0036] FIG. 17 is a block diagram showing examples of vehicle templates;
[0037] FIG. 18 is a block diagram showing examples of home automation templates;
[0038] FIG. 19 is a block diagram showing examples of gaming system templates;
[0039] FIG. 20 is a block diagram showing examples of audio system templates;
[0040] FIG. 21 is a block diagram showing examples of security system templates;
[0041] FIG. 22 is a block diagram showing examples of device interfaces;
[0042] FIG. 23 is a block diagram of a universal user interface;
[0043] FIG. 24 is a flow diagram of a method for programming a physical device with user settings;
[0044] FIG. 25 is a flow diagram of a first suitable method for performing step 2410 in FIG. 24 using a mapping between two physical devices;
[0045] FIG. 26 is a block diagram showing the generation of settings for Device2 from settings for Device 1 as shown in the flow diagram in FIG. 25;
[0046] FIG. 27 is a flow diagram of a second suitable method for performing step 2410 in FIG. 24 using a universal template;
[0047] FIG. 28 is a block diagram showing the generation of settings for Device2 from a universal template as shown in the flow diagram in FIG. 27;
[0048] FIG. 29 is a flow diagram of a third suitable method for performing step 2410 in FIG. 24 using settings from a first device and a universal template;
[0049] FIG. 30 is a block diagram showing the generation of settings for Device2 as shown in the flow diagram in FIG. 29;
[0050] FIG. 31 is a table showing mapping of some channel numbers for DirecTV to channel numbers for Dish Network;
[0051] FIG. 32 is a table showing examples of user television settings;
[0052] FIG. 33 is a flow diagram of a method for converting channel numbers for Dish Network to channel numbers for DirecTV;
[0053] FIG. 34 is a flow diagram of a method for reprogramming a remote control for a television;
[0054] FIG. 35 is an example of a display of a television remote control;
[0055] FIG. 36 is a flow diagram of a method for converting a channel number entered by a user to a corresponding different channel number for a target system;
[0056] FIG. 37 is a flow diagram of a method for reprogramming a television remote control according to a target system at a location;
[0057] FIG. 38 is a flow diagram of a method for defining an eReceipt template;
[0058] FIG. 39 is a flow diagram of a method for sending and storing an eReceipt to a user's U-Me account;
[0059] FIG. 40 ...;
[0060] FIG. 41 ... example ...;
[0061] FIG. 42 ...;
[0062] FIG. 43 ...;
[0063] FIG. 44 ...;
[0064] FIG. 45 ... 44;
[0065] FIG. 46 ... manufacturer by sending the manufacturer an e-Receipt;
[0066] FIG. 47 is a flow diagram of a method for handling a timed warranty link;
[0067] FIG. 48 is a flow diagram of a method for prompting a user to purchase an extended warranty when a warranty is about to expire;
[0068] FIG. 49 is a flow diagram of a method for handling a warranty claim using an eReceipt;
[0069] FIG. 50 shows an example of a screen for one suitable implementation of an eReceipt search engine;
[0070] FIG. 51 shows examples of eReceipt queries that could be submitted via the eReceipt search engine screen shown in FIG. 50;
[0071] FIG. 52 is a flow diagram of a method for processing an e-mail receipt to generate an eReceipt;
[0072] FIG. 53 is a flow diagram of a method for generating settings in a universal vehicle template based on settings from a vehicle;
[0073] FIG. 54 shows examples of items that could be included in a universal vehicle template;
[0074] FIG. 55 is a flow diagram of a method for downloading user settings to a car from the user's U-Me account;
[0075] FIG. 56 is a representation of a vehicle seat with respect to the vehicle floor, the vehicle accelerator pedal, and the vehicle steering wheel;
[0076] FIG. 57 is a block diagram of a system for using a phone hands-free in a prior art vehicle;
[0077] FIG. 58 is a block diagram of a system for using a phone hands-free and also for accessing information in the vehicle's engine system;
[0078] FIG. 59 is a flow diagram of a method for prompting a user regarding scheduled maintenance for a vehicle;
[0079] FIG. 60 is a flow diagram of a method for providing shop bids to a user for scheduled maintenance;
[0080] FIG. 61 is a flow diagram of a method for notifying users of the U-Me system of manufacturer recalls and service actions for the manufacturer's vehicles;
[0081] FIG. 62 is a flow diagram of a method for providing vehicle service reminders to a user;
[0082] FIG. 63 is a flow diagram of a method for prompting the user when engine warning information is sent to the user's U-Me account;
[0083] FIG. 64 is an example of a photo system data entry screen;
[0084] FIG. 65 is the example photo system entry screen in FIG. 64 filled with sample data;
[0085] FIG. 66 is a flow diagram of a method for the U-Me system to construct family relationships from information entered in the photo system data entry screen;
[0086] FIG. 67 is a flow diagram of a method for generating indexing information for photos and for storing photos with the indexing information;
[0087] FIG. 68 is a flow diagram of a method for adding a photographer's name to indexing information for a photo;
[0088] FIG. 69 shows examples of photo metadata;
[0089] FIG. 70 is a flow diagram of a method for adding location name to indexing information for a photo;
[0090] FIG. 71 is a flow diagram of a method for storing photos that were scanned from hard copy photos with indexing information;
[0091] FIG. 72 is a flow diagram of a method for a user to define indexing info for one or more photos at the same time;
[0092] FIG. 73 shows examples of photo indexing info;
[0093] FIG. 74 shows an example photo file stored in a user's U-Me account;
[0094] FIG. 75 is an example of a screen for a user to generate indexing info for one or more photos as shown in the method in FIG. 72;
[0095] FIG. 76 shows a screen for an example of a photo search engine;
[0096] FIG. 77 shows examples of photo queries that could be formulated in the photo search engine shown in FIG. 76;
[0097] FIG. 78 shows a screen for an example photo share engine;
[0098] FIG. 79 is a flow diagram of a method for processing a photo just taken and storing the photo with automatically-generated indexing information in the user's U-Me account;
[0099] FIG. 80 shows an example of medical information for a user;
[0100] FIG. 81 is a flow diagram of a method for a user to define semi-private medical info in the user's U-Me account;
[0101] FIG. 82 is a flow diagram of a method for uploading medical information to a user's U-Me account;
[0102] FIG. 83 is a flow diagram of a method for a medical person to attempt to access medical information stored on a U-Me user's device;
[0103] FIG. 84 is a flow diagram of a method for determining whether the current location of the U-Me user's device is at a medical facility;
[0104] FIG. 85 is an example of a display on a U-Me user's smart phone showing a medical button that allows bypassing any security on the smart phone to access the semi-private medical information for the user;
[0105] FIG. 86 is a flow diagram of a method for a medical person to attempt to gain access to a patient's medical information;
[0106] FIG. 87 shows an example of a screen for a medical information sharing engine;
[0107] FIG. 88 is a flow diagram of a method for a user to share the user's medical information with one or more other users;
[0108] FIG. 89 is a flow diagram of a method for a user who was authorized to share medical information for a different user to share that medical information with one or more other users;
[0109] FIG. 90 is a flow diagram of a method for a user to revoke sharing of medical information by other users;
[0110] FIG. 91 is a flow diagram of a method for the U-Me system to track when a user takes meds;
[0111] FIG. 92 is a flow diagram of a method for the U-Me system to provide reminders to a user to take the user's meds;
[0112] FIG. 93 is a flow diagram of a method for a user to authenticate to the U-Me system;
[0113] FIG. 94 shows examples of authentication types that could be used by a user to authenticate to the
U-Me system as shown in FIG. 93;
[0114] FIG. 95 is a flow diagram of a method for assuring the U-Me system functions are available to a user on only one physical device at a time when the user authenticates using non-biometric authentication;
[0115] FIG. 96 is a flow diagram of a method for licensing licensed content to a user, not to a physical device, then making the licensed content available to the user on any device;
[0116] FIG. 97 is a flow diagram of a method for licensing music to a user, not to a physical device, then making the music available to the user on any device that can play music;
[0117] FIG. 98 is a flow diagram of a method for making a user's music settings in the user's U-Me account available on any suitable music player;
[0118] FIG. 99 shows examples of suitable music players;
[0119] FIG. 100 shows license pricing that varies according to the length of the license;
[0120] FIG. 101 is a flow diagram of a method for generating virtual devices in a user's U-Me account that correspond to physical devices used by the user;
[0121] FIG. 102 shows an example of a smart phone and corresponding example of a virtual smart phone that is stored in the user's U-Me account;
[0122] FIG. 103 is a flow diagram of a method for tracking all changes to a physical device and synchronizing all the changes to a corresponding virtual device in the user's U-Me account;
[0123] FIG. 104 is a flow diagram of a method for synchronizing all data changes between the user's physical devices and the user's U-Me account;
[0124] FIG. 105 is a flow diagram of a method for storing data to a user's U-Me account with indexing information that allows retrieving the data later via a search engine;
[0125] FIG. 106 shows examples of data attributes;
[0126] FIG. 107 shows examples of data attributes that could be stored as indexing info to identify type of data stored;
[0127] FIG. 108 shows examples of data attributes that could be stored as indexing info to identify location of where data was created;
[0128] FIG. 109 shows examples of data attributes that could be stored as indexing info to identify time- related parameters for data;
[0129] FIG. 110 shows an example of a data file format for data stored in the U-Me system;
[0130] FIG. 111 shows an example of a data file that complies with the format shown in FIG. 110 and that includes examples of indexing info that helps to retrieve the data later via a search engine;
[0131] FIG. 112 shows an example of a data search engine that allows a user to query data stored in the user's U-Me account;
[0132] FIG. 113 is a flow diagram of a method for configuring a new physical device from information in a user's U-Me account;
[0133] FIG. 114 is a flow diagram of a method for the U-Me system to host software that is licensed to the user;
[0134] FIG. 115 is a flow diagram of a method for the U-Me system to host software that is licensed to a device;
[0135] FIG. 116 is a block diagram of a virtual machine for the user;
[0136] FIG. 117 is a flow diagram of a method for selecting weather alerts for defined geographic regions;
[0137] FIG. 118 is a table showing examples of weather alerts defined by the United States National Oceanic and Atmospheric Administration (NOAA);
[0138] FIG. 119 shows an example of an interface for a user to define weather alerts;
[0139] FIG. 120 shows the interface in FIG. 119 with data that defines a weather alert for a tornado warning;
[0140] FIG. 121 shows the interface in FIG. 119 with data that defines a weather alert for a flash flood watch;
[0141] FIG. 122 shows the interface in FIG. 119 with data that defines a weather alert for a wind chill watch;
[0142] FIG. 123 is a flow diagram of a method for the U-Me system to process weather alerts;
[0143] FIG. 124 is a block diagram showing examples of home automation settings;
[0144] FIG. 125 is a block diagram showing examples of appliance settings;
[0145] FIG. 126 is a block diagram showing examples of HVAC settings;
[0146] FIG. 127 is a block diagram showing examples of light settings;
[0147] FIG. 128 is a block diagram showing examples of security settings;
[0148] FIG. 129 is a block diagram showing examples of home theater settings;
[0149] FIG. 130 is a block diagram showing one specific example of home automation settings;
[0150] FIG. 131 is a flow diagram of a method for the U-Me system to track a user's software and license information;
[0151] FIG. 132 is an example of a license management entry that stores a license key with the software;
[0152] FIG. 133 is a block diagram showing examples of alerts a user can define in the user's U-Me account;
[0153] FIG. 134 is a block diagram showing examples of periodic reminders a user can define in the user's U-Me account;
[0154] FIG. 135 is a block diagram showing examples of seasonal reminders a user can define in the user's U-Me account;
[0155] FIG. 136 is a flow diagram of a method for the U-Me system to automatically destroy data and/or licensed content and/or settings according to a defined retention/destruction policy;
[0156] FIG. 137 is a block diagram showing examples of retention/destruction criteria that could be defined in a retention/destruction policy;
[0157] FIG. 138 is a block diagram showing examples of transfers that could be made within the U-Me system between users;
[0158] FIG. 139 is a flow diagram of a method for a user to transfer licensed content to a different user;
[0159] FIG. 140 is a flow diagram of a method for the U-Me system to transfer upon the death of one user the user's licensed content to other user(s);
[0160] FIG. 141 is a flow diagram of a method for auditing the licensed content in a user's U-Me account;
[0161] FIG. 142 is a flow diagram of a method for deleting content that is unlicensed from a user's U-Me account;
[0162] FIG. 143 shows an example of a U-Me sub-account mechanism;
[0163] FIG. 144 is a flow diagram of a method for defining and using sub-accounts;
[0164] FIG. 145 is a flow diagram of a method for the U-Me system to track credit card usage by a user for online transactions;
[0165] FIG. 146 is an example credit card log that shows three different credit cards and websites where the user used each credit card;
[0166] FIG. 147 is a flow diagram of a method for prompting a user regarding on which websites the user used a credit card when the credit card is about to expire;
[0167] FIG. 148 is a flow diagram of a method for a user to update credit card information on websites where the user has used the credit card;
[0168] FIG. 149 is a block diagram of an example of a macro/script mechanism;
[0169] FIG. 150 is a flow diagram of a method for generating macros and/or scripts;
[0170] FIG. 151 is a flow diagram of a method for scheduling a macro or script to run;
[0171] FIG. 152 is a flow diagram of an example method for running a script to automatically retrieve a bank statement on the 5th of each month, and storing the bank statement to the user's U-Me account;
[0172] FIG. 153 is a flow diagram of a method for downloading settings from a user's U-Me account to a location;
[0173] FIG. 154 shows examples of queries that could be formulated in the data search engine;
[0174] FIG. 155 is a flow diagram of a method for a company to identify a person who is the licensee of software purchased by the company;
[0175] FIG. 156 is a flow diagram of a method for a company to revoke the license of a person to software purchased by the company;
[0176] FIG. 157 is a flow diagram of a method for converting physical items to electronic form and storing those items in a user's U-Me account;
[0177] FIG. 158 is a block diagram of a virtual machine image;
[0178] FIG. 159 is a block diagram of a running virtual machine generated from the virtual machine image in FIG. 158, where the running virtual machine is not specific to any user;
[0179] FIG. 160 is a block diagram of the running virtual machine in FIG. 159 after a U-Me user descriptor file has been written to the U-Me generic user shell to create U-Me user-specific components that are running;
[0180] FIG. 161 is a block diagram representing aspects of a virtual phone;
[0181] FIG. 162 is a block diagram representing one suitable example of a virtual phone representing the items shown in FIG. 161;
[0182] FIG. 163 is a screen display showing steps to configure a new phone using the virtual phone settings in FIG. 162;
[0183] FIG. 164 is a flow diagram of a method for generating indexing info for one or more photos;
[0184] FIG. 165 is a data entry screen for entering info about people into the U-Me system;
[0185] FIG. 166 shows the data entry screen in FIG. 165 after a person fills in information;
[0186] FIG. 167 is a data entry screen for a person to enter family relationships;
[0187] FIG. 168 shows the data entry screen in FIG. 167 after a person fills in information regarding family relationships;
[0188] FIG. 169 is a block diagram showing different entries for a spouse and a wedding date to the spouse;
[0189] FIG. 170 is a block diagram showing user-defined relationships and system-derived relationships that are derived from the user-defined relationships;
[0190] FIG. 171 is a flow diagram of a method for constructing relationships based on the photo system data entry;
[0191] FIG. 172 is a display of a family tree based on the information entered by a user in the data entry screen in FIG. 168;
[0192] FIG. 173 is a block diagram showing the user-defined relationships entered by a user in the data entry screen in FIG. 168;
[0193] FIG. 174 is a display of the family tree in FIG. 172 after adding information relating to the wife and son of Billy Jones;
[0194] FIG. 175 is a block diagram showing both the user-defined relationships as well as the system- derived relationships for the family tree in FIG. 174;
[0195] FIG. 176 is a data entry screen for a person to enter locations;
[0196] FIG. 177 shows a data entry screen that allows a person to define a location based on an address;
[0197] FIG. 178 is a flow diagram of a method for defining a location using an app on a mobile device;
[0198] FIG. 179 is a schematic diagram showing how method 17800 in FIG. 178 could be used for a user to define two different geographic regions that are stored as locations;
[0199] FIG. 180 is a block diagram showing user-defined locations and system-defined locations;
[0200] FIG. 181 shows examples of photo metadata;
[0201] FIG. 182 is a flow diagram of a method for adding location name to indexing information for a photo;
[0202] FIG. 183 is a block diagram showing photo indexing info that could be generated for a photo;
[0203] FIG. 184 is a block diagram showing examples of markup language tags that could be used as photo indexing info;
[0204] FIG. 185 is a block diagram showing examples of user-defined events, system-derived events, and system-defined events selected by a user;
[0205] FIG. 186 is a flow diagram of a method for generating and storing indexing info for a photo;
[0206] FIG. 187 is a flow diagram of a method for processing a photo for facial and feature recognition;
[0207] FIG. 188 is a flow diagram of a method for generating indexing info for a photo;
[0208] FIG. 189 is a flow diagram of a method for generating indexing information relating to one or more location(s) for a photo when a user defines a location for the photo;
[0209] FIG. 190 is a flow diagram of a method for generating indexing information relating to one or more location(s) for a photo based on geocode info in the photo metadata;
[0210] FIG. 191 is a flow diagram of a method for generating indexing information relating to one or more events for a photo based on a date or date range for the photo;
[0211] FIG. 192 is a flow diagram of a method for generating indexing information for a photographer's name based on the camera that took the photo;
[0212] FIG. 193 is a flow diagram of a method for automatically processing a photo using the U-Me system;
[0213] FIG. 194 is a flow diagram of a method for storing photos that were scanned from hard copy photos with corresponding indexing information;
[0214] FIG. 195 is a flow diagram of a method for a user to define indexing info for one or more photos at the same time;
[0215] FIG. 196 shows storing indexing info separate from a digital photo file;
[0216] FIG. 197 shows storing the indexing info within the digital photo file;
[0217] FIG. 198 is an example of a data entry screen for a user to generate indexing info for one or more photos as shown in the method in FIG. 195;
[0218] FIG. 199 shows a screen for an example of a photo search engine;
[0219] FIG. 200 shows examples of photo queries that could be formulated in the photo search engine shown in FIG. 199;
[0220] FIG. 201 shows a screen for an example photo share engine;
[0221] FIG. 202 is a flow diagram of a method for sharing photos in a user's U-Me account with another user;
[0222] FIG. 203 is a representation of a sample photo;
[0223] FIG. 204 is sample indexing info that could be generated for the sample photo in FIG. 203;
[0224] FIG. 205 shows information in a user's U-Me account;
[0225] FIG. 206 represents how a first user's people info, location info, and event info can be shared with a second user, and further shows the second user may have different names that correspond to the faces defined in the first user's account, and may have different indexing info for the photos in the first user's account;
[0226] FIG. 207 is a method for generating indexing info based on existing tags in a digital photo file;
[0227] FIG. 208 is a flow diagram of a method for identifying duplicate photos;
[0228] FIG. 209 is a flow diagram of a method for importing people and relationships from an external file;
[0229] FIG. 210 is a flow diagram of a method for automatically propagating changes to a user's U-Me account to indexing info for the user's photos;
[0230] FIG. 211 is a flow diagram of a method for downloading settings from a user's U-Me account to a location;
[0231] FIG. 212 is a block diagram of a universal remote control that includes a dynamic location-based programming mechanism;
[0232] FIG. 213 shows some examples of different types of remote controls;
[0233] FIG. 214 is a flow diagram of a method for storing remote control programming parameters for a given location;
[0234] FIG. 215 shows an example of remote control programming parameters for a given location;
[0235] FIG. 216 is a flow diagram of a method for a remote control to program itself using programming parameters for a given location stored in an external database;
[0236] FIG. 217 is a flow diagram of a method for converting user settings from a first vehicle to corresponding user settings for a second vehicle that are used to configure the second vehicle;
[0237] FIG. 218 is a block diagram of a television receiver that includes a user settings transfer mechanism for exporting user settings to an external device and for importing user settings from an external device;
[0238] FIG. 219 is a flow diagram of a method for storing user settings to an external device;
[0239] FIG. 220 is a flow diagram of a method for programming user settings for a second device based on the user's settings for a first device;
[0240] FIG. 221 is a flow diagram of a method for providing default settings for a device and for returning the device to its default settings after a user leaves;
[0241] FIG. 222 is a block diagram showing multiple levels of templates for user settings;
[0242] FIG. 223 is a block diagram showing multiple levels of templates and mappings for user settings;
[0243] FIG. 224 is a flow diagram of a method for propagating a user setting from a physical device to multiple templates;
[0244] FIG. 225 is a flow diagram of a method for propagating a user setting from the master template to one or more other templates and to a physical device;
[0245] FIG. 226 is a flow diagram of a method for resolving an incompatibility between user settings in different devices;
[0246] FIG. 227 is a block diagram showing multiple levels of templates and multiple physical devices; and
[0247] FIG. 228 is a block diagram showing multiple levels of templates and mappings for user settings that include multiple levels of universal templates.
DETAILED DESCRIPTION
[0248] The evolution of technology has resulted in a device-centric world. Early desktop computer systems allowed a user to define certain settings or preferences that defined how the computer system functioned. That trend continues today. Each computer system allows installing software according to the user's needs, and allows setting numerous settings or preferences that define how the computer system functions. A user who buys a new computer system typically must spend many hours installing software and setting user preferences and settings to get the computer system to a state where it is usable according to the user's needs.
[0249] The same device-centric approach has been used with cell phones, and now with smart phones. When a user purchases a new phone, the user typically must spend many hours installing apps and setting the appropriate preferences and settings so the smart phone will perform the functions the user desires. Some phone vendors provide a service that can transfer a person's contacts from their old phone to the new phone, and some provide a backup service for those contacts should the person lose or damage their phone. This backup service, however, typically backs up only the contacts, and does not back up apps or settings on the phone. Thus, even with the backup service, when a user gets a new phone, the user still spends hours downloading and installing apps, ringtones, etc. and setting all the system settings to configure the phone to the user's liking.
[0250] While many aspects of modern life have been simplified through the use of technology, other aspects have yet to take advantage of technology in a significant way. For example, let's assume a person is watching television (TV), and the TV has a failure that causes the TV to quit working. The user may then try to remember where she bought the TV, when she bought the TV, and whether the TV is still under warranty. The user must typically then locate a stack or file of paper receipts, then go through the stack or file hoping to find the paper receipt for the TV. Even when the user is able to locate the paper receipt, the receipt itself may not indicate the warranty information for the TV. She may have to search for the hard copy documentation she received with the TV. In the alternative, she could contact the store or the manufacturer to determine the warranty for the TV. And when the TV is under warranty, the user will have to make a photocopy of the receipt and send the copy of the receipt with the TV when the TV is returned for warranty service. This system of paper receipts is grossly inefficient, and does not benefit from technology available today.
[0251] One aspect of modern life that has been greatly simplified through the use of technology is how music is purchased and used. Apple's iPod was a revolutionary device that allowed storing a large number of songs, which the user may listen to at his or her convenience. To satisfy concerns in the music industry regarding the ease of pirating (performing illegal copying) of digital music files, Apple developed the iTunes software application that allows a user to purchase music, which is stored on the user's computer system in their iTunes account. This music may be copied from the computer system to a suitable Apple device, such as an iPod or iPad. However, music from an iPod or iPad cannot be copied to the user's computer because this would make illegal copying of music very easy. Thus, all of a user's music is stored in the user's computer system in their iTunes software. So what happens when the user's hard drive crashes? Recovering the music in an iTunes account that was on a hard drive that crashed is not an easy process. This is because the iTunes account is tied to the computer system on which iTunes is installed. This shows that iTunes is device-centric as well, which means if the device that hosts iTunes crashes, the music that was stored on the device is difficult to recover.
[0252] Another aspect of our modern life that has not fully taken advantage of modern technology is data storage and retrieval. As referenced in the Background section above, Dropbox is an online service that allows storing information to the cloud. However, Dropbox is based on the folder/subfolder (or directory/subdirectory) paradigm. Thus, when using Dropbox, the user must remember to store the data in a Dropbox folder or subfolder, and then must also store the data in a location and use a file name the user is likely to remember. Relying on the memory of a user to remember where the user stored something on a computer system is very inefficient and error-prone. Many users have experienced storing a file to their computer system, then having to search many files across many directories in an attempt to locate the file they stored. Database systems provide very structured ways of storing information, which results in supporting very powerful ways of retrieving information in the database via queries. However, these powerful database tools for storing and retrieving information have not been employed in helping most users to store and retrieve information on their computer systems or smart phones.
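To illustrate the database-style storage and retrieval contrasted above with the folder paradigm, the following Python sketch stores items with searchable attributes so retrieval does not depend on remembering a path or file name. The schema and sample rows are invented for illustration.

    # Sketch: attribute-indexed storage queried like a database instead of
    # browsed like a directory tree. Schema and data are illustrative.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE items (name TEXT, kind TEXT, created TEXT, device TEXT)")
    db.execute("INSERT INTO items VALUES ('budget', 'spreadsheet', '2014-03-02', 'laptop')")
    db.execute("INSERT INTO items VALUES ('DSC_0012', 'photo', '2014-03-02', 'phone')")

    # One query replaces manually searching many folders:
    rows = db.execute("SELECT name FROM items WHERE kind = ? AND created LIKE ?",
                      ("photo", "2014-03%")).fetchall()
    print(rows)  # -> [('DSC_0012',)]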
[0253] Photography is an area that has greatly benefitted from modern technology. Digital cameras and cell phones allow capturing very high-resolution photographs and video in digital form that can be easily stored to an electronic device. While photography itself has been revolutionized by technology, the technology for storing and retrieving photographs has lagged far behind. Many people who have used digital cameras for years have many directories or folders on a computer system that contain thousands of digital photos and videos. When a person uses a digital camera or cell phone to take a photo, the device typically names the photo with a cryptic name that includes a number that is sequential. For example, a Nikon camera may name a photo file with a name such as "DSC_0012.jpg". The digital file for the next photo is the next number in sequence, such as DSC_0013.jpg. Once the photo files are transferred to a computer and are deleted on the digital camera or cell phone, the digital camera or cell phone may reuse file names that were used previously. To avoid overwriting existing photos, many users choose to create a new directory or folder each time photos are downloaded from a camera or cell phone. This results in two significant problems. First, the file name for a photo may be shared by multiple photos in multiple directories. Second, the names of digital photo files give the user no information regarding the photo. Thus, to locate a particular photo of interest, the user may have to navigate a large number of directories, searching thumbnails of the photos in each directory to locate the desired photo. This is grossly inefficient and relies on the memory of the user to locate a desired photo. A user can more efficiently locate photos if the user takes the time to carefully name directories or folders and also takes the time to carefully name individual photo files. But this is very time-consuming, and most users don't take the time needed to name folders and photo files in a way that would make retrieval of the photos easier. Most people who take digital photos have thousands of photos that have cryptic names in dozens or hundreds of different directories or folders that may also have cryptic names. The result is that finding a particular photo may be very difficult.
[0254] While there are programs that allow organizing digital photos, they have not gained widespread acceptance due to their expense and the time and difficulty required for a user to organize their photos using these programs. As a result, these programs have done little to address the widespread problem of most users having thousands of digital photos that are stored using cryptic names in many different directories or folders, making retrieval of photographs difficult. The prior art includes various programs and online services that support photo tagging. Photo tagging is a way to add tags, or identifiers, to the metadata in a photo. Thus, a person could tag a photo with the names of people in the photo. Google's Picasa service includes face recognition and tagging. Thus, the face recognition engine in Picasa can recognize a person's face in multiple photos, and can then create a tag for that person that is written to the metadata for each of the photos. This allows for more easily retrieving photos based on a search of tags. However, current tagging technology is not very sophisticated. If a person tags some photos with the name Jim, and other photos with the name Jimmy for the same person, a search for Jim will identify the photos tagged with Jim but will not identify the photos tagged with Jimmy. Known tagging allows placing simple labels in the metadata of a photo file. A person can then use a search engine to search for photos that have one or more specified tags. But current tags do not allow identifying relationships between people, do not allow storing ages of people, and lack the flexibility and power needed to catalog, store and retrieve photos in a powerful way.
[0255] The disclosure herein presents a paradigm shift, from the device-centric world we live in today, to a person-centric world. This shift gives rise to many different opportunities that are not available in the world we live in today. A system called Universal Me (U-Me) disclosed herein is a cloud-based system that is person-centric. The U-Me system makes a user's data, licensed content and settings available in the cloud to any suitable device that user may choose to use.
[0256] The U-Me system includes a photo processing mechanism that allows cataloging and storing a user's photos using relationships between people that allow the user's photos to be retrieved using a search engine. A user enters people and specifies relationships, and may also enter locations, events, and other information. Photos are then processed, and indexing info is generated for each photo that may include any or all of the following: user-defined relationships, system-derived relationships, user-defined locations, system-defined locations, user-defined events, and system-derived events and ages for the people in the photos. The indexing info is used to catalog a photo for easy retrieval later. The indexing info may be stored as metadata with the photo or may be stored separately from the photo. The indexing info allows photos to be retrieved using a powerful search engine.
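The following Python sketch suggests, with invented names and a deliberately simplified age calculation, how indexing info combining people, user-defined relationships, and ages could be derived for a single photo. It is a sketch of the idea, not the system's actual indexing format.

    # Sketch: derive indexing info for a photo from user-entered people data.
    # The fields and the year-only age rule are simplifications for illustration.
    from datetime import date

    people = {"Ann": {"born": date(1980, 5, 1), "relation": "self"},
              "Tom": {"born": date(2010, 9, 9), "relation": "son"}}

    def index_photo(faces, taken):
        info = []
        for name in faces:
            p = people[name]
            age = taken.year - p["born"].year   # approximate age when photo was taken
            info.append({"name": name, "relation": p["relation"], "age": age})
        return {"taken": taken.isoformat(), "people": info}

    print(index_photo(["Ann", "Tom"], date(2014, 7, 4)))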
[0257] The U-Me system provides multiple templates that provide mapping information from physical devices to a master template that serves as a central repository for all of a user's settings for all of a user's devices. The templates also provide mapping information that allows mapping settings between different physical devices, between physical devices and other templates, and between templates. A user settings mechanism uses the mapping information to propagate user settings stored in one template to other templates and to one or more physical devices, and to propagate user settings stored in a physical device to multiple templates, including a master template that serves as the central repository for all of a user's settings.
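A minimal sketch of this propagation, under assumed mapping keys and template names, is shown below; a real implementation would also handle type conversions and conflicts.

    # Sketch: propagate one changed user setting through the master template to
    # every mapped device field. Mapping keys are invented for illustration.
    MAPPINGS = {  # device field -> master-template field
        "phoneA.ringer_vol": "master.ringer_volume",
        "phoneB.sound.ring": "master.ringer_volume",
    }

    def propagate(changed_field, value, store):
        master_field = MAPPINGS[changed_field]
        store[master_field] = value                    # update the central repository
        for dev_field, m_field in MAPPINGS.items():    # fan out to mapped devices
            if m_field == master_field:
                store[dev_field] = value

    store = {}
    propagate("phoneA.ringer_vol", 5, store)
    print(store["phoneB.sound.ring"])  # -> 5, ready to push to the second phone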
[0258] Referring to FIG. 1, the Universal Me (U-Me) system 100 includes multiple user accounts 110, shown in FIG. 1 as 110A, ..., 110N. Each user account includes data, licensed content, and settings that correspond to the user. Thus, User1 account 110A includes corresponding data 120A, licensed content 130A, and settings 140A. In similar fashion, UserN account 110N includes corresponding data 120N, licensed content 130N, and settings 140N. Any or all of the user's data, licensed content and settings may be made available on any device 150 the user may use. Examples of suitable devices are shown in FIG. 1 to include a smart phone 150A, a tablet computer 150B, a laptop computer 150C, a desktop computer 150D, and other device 150N. The devices shown in FIG. 1 are examples of suitable devices the user could use to access any of the data, licensed content, or settings in the user's account. The disclosure and claims herein expressly extend to using any type of device to access the user's data, licensed content, or settings, whether the device is currently known or developed in the future.
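By way of illustration, the per-user account structure of FIG. 1 could be represented as sketched below; the class and field names are illustrative assumptions, not the system's actual schema.

    # Sketch: a user account holding the three categories shown in FIG. 1.
    from dataclasses import dataclass, field

    @dataclass
    class UserAccount:
        user_id: str
        data: dict = field(default_factory=dict)              # e.g., files, photos
        licensed_content: dict = field(default_factory=dict)  # e.g., music, software
        settings: dict = field(default_factory=dict)          # per-device settings

    accounts = {"user1": UserAccount("user1")}
    accounts["user1"].settings["phone"] = {"ringer_volume": 5}
    print(accounts["user1"].settings)  # -> {'phone': {'ringer_volume': 5}}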
[0259] The U-Me system 100 may include virtual devices in a user's account. Referring to FIG. 2, the
User1 account 110A is shown to include a virtual smart phone 250A that corresponds to the physical smart phone 150A; a virtual tablet computer 250B that corresponds to the physical tablet computer 150B; a virtual laptop computer 250C that corresponds to the physical laptop computer 150C; a virtual desktop computer 250D that corresponds to a physical desktop computer 150D; and a virtual other device 250N that corresponds to a physical other device 150N. The virtual devices preferably include all information that makes a physical device function, including operating system software and settings, software applications (including apps) and their settings, and user settings. Due to access limitations on the physical device, it may be impossible to copy all the information that makes the physical device function. For example, the operating system may not allow its code to be copied. The virtual devices contain as much information as they are allowed to contain by the physical devices. In the most preferred
implementation, the virtual devices contain all information that makes the physical devices function. In this scenario, if a user accidentally flushes his smart phone down the toilet, the user can purchase a new smart phone, and all the needed information to configure the new smart phone exactly as the old one is available in the virtual smart phone stored in the user's U-Me account. Once the user downloads a U-Me app on the new smart phone, the phone will connect to the user's U-Me account, authenticate the user, and the user will then have the option of configuring the new device exactly as the old device was configured using the information in the virtual smart phone in the user's U-Me account.
[0260] There may be some software on a physical device that cannot be copied to the corresponding virtual device. When this is the case, the U-Me account will prompt the user with a list of things to do before the new physical device can be configured using the data in the virtual device. For example, if the user had just applied an operating system update and the new phone did not include that update, the user will be prompted to update the operating system before continuing. If an app installed on the old phone cannot be copied to the user's U-Me account, the U-Me app could prompt the user to install the app before the rest of the phone can be configured. The virtual device preferably contains as much information as possible for configuring the new device, but when information is missing, the U-Me system prompts the user to perform certain tasks as prerequisites. Once the tasks have been performed by the user, the U-Me system can take over and configure the phone using the information stored in the corresponding virtual device.
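The prerequisite check described above could look something like the following sketch; the manifest fields and the deliberately naive version comparison are assumptions made for illustration.

    # Sketch: compare what the virtual device recorded against what the new
    # phone reports, and list the tasks the user must complete first.
    def prerequisite_tasks(virtual, phone):
        tasks = []
        if virtual["os_version"] > phone["os_version"]:   # naive string comparison
            tasks.append("Update the operating system to " + virtual["os_version"])
        for app in virtual["uncopyable_apps"]:
            if app not in phone["installed_apps"]:
                tasks.append("Install app: " + app)
        return tasks  # configuration proceeds once this list is empty

    virtual = {"os_version": "4.3", "uncopyable_apps": ["BankApp"]}
    phone = {"os_version": "4.2", "installed_apps": []}
    print(prerequisite_tasks(virtual, phone))
    # -> ['Update the operating system to 4.3', 'Install app: BankApp']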
[0261] Referring to FIG. 3, a computer system 300 is an example of one suitable computer system that could host the Universal Me system 100. Server computer system 300 could be, for example, an IBM System i computer system. However, those skilled in the art will appreciate that the disclosure and claims herein apply equally to any computer system, regardless of whether the computer system is a complicated multi-user computing apparatus, a single user workstation, or an embedded control system. As shown in FIG. 3, computer system 300 comprises one or more processors 310, a main memory 320, a mass storage interface 330, a display interface 340, and a network interface 350. These system components are interconnected through the use of a system bus 360. Mass storage interface 330 is used to connect mass storage devices, such as local mass storage device 355, to computer system 300. One specific type of local mass storage device 355 is a readable and writable CD-RW drive, which may store data to and read data from a CD-RW 395.
[0262] Main memory 320 preferably contains data 321, an operating system 322, and the Universal Me System 100. Data 321 represents any data that serves as input to or output from any program in computer system 300. Operating system 322 is a multitasking operating system. The Universal Me System 100 is the cloud-based system described in detail in this specification, and includes a user settings mechanism 324 and user settings mapping information 326. The Universal Me System 100 as shown in FIG. 3 is a software mechanism that provides all of the functionality of the U-Me system.
[0263] FIG. 3 in conjunction with FIG. 1 thus shows a computer system comprising at least one processor, a memory coupled to the at least one processor, user data residing in the memory corresponding to a first user, first user settings corresponding to the first user for a plurality of software applications residing in the memory, second user settings corresponding to the first user for a plurality of hardware devices, and a software mechanism executed by the at least one processor that makes the user data, the first user settings, and the second user settings available to the first user on a first device used by the first user.
[0264] Computer system 300 utilizes well known virtual addressing mechanisms that allow the programs of computer system 300 to behave as if they only have access to a large, contiguous address space instead of access to multiple, smaller storage entities such as main memory 320 and local mass storage device 355. Therefore, while data 321, operating system 322, and Universal Me System 100 are shown to reside in main memory 320, those skilled in the art will recognize that these items are not necessarily all completely contained in main memory 320 at the same time. It should also be noted that the term "memory" is used herein generically to refer to the entire virtual memory of computer system 300, and may include the virtual memory of other computer systems coupled to computer system 300.
[0265] Processor 310 may be constructed from one or more microprocessors and/or integrated circuits. Processor 310 executes program instructions stored in main memory 320. Main memory 320 stores programs and data that processor 310 may access. When computer system 300 starts up, processor 310 initially executes the program instructions that make up the operating system 322. Processor 310 also executes the Universal Me System 100.
[0266] Although computer system 300 is shown to contain only a single processor and a single system bus, those skilled in the art will appreciate that the Universal Me system may be practiced using a computer system that has multiple processors and/or multiple buses. In addition, the interfaces that are used preferably each include separate, fully programmed microprocessors that are used to off-load compute-intensive processing from processor 310. However, those skilled in the art will appreciate that these functions may be performed using I/O adapters as well.
[0267] Display interface 340 is used to directly connect one or more displays 365 to computer system 300. These displays 365, which may be non-intelligent (i.e., dumb) terminals or fully programmable workstations, are used to provide system administrators and users the ability to communicate with computer system 300. Note, however, that while display interface 340 is provided to support communication with one or more displays 365, computer system 300 does not necessarily require a display 365, because all needed interaction with users and other processes may occur via network interface 350.
[0268] Network interface 350 is used to connect computer system 300 to other computer systems or workstations 375 via network 370. Network interface 350 broadly represents any suitable way to interconnect electronic devices, regardless of whether the network 370 comprises present-day analog and/or digital techniques or via some networking mechanism of the future. Network interface 350 preferably includes a combination of hardware and software that allow communicating on the network 370. Software in the network interface 350 preferably includes a communication manager that manages communication with other computer systems 375 via network 370 using a suitable network protocol. Many different network protocols can be used to implement a network. These protocols are specialized computer programs that allow computers to communicate across a network. TCP/IP (Transmission Control Protocol/Internet Protocol) is an example of a suitable network protocol that may be used by the communication manager within the network interface 350.
[0269] As will be appreciated by one skilled in the art, aspects of the Universal Me system may be embodied as a system, method or computer program product. Accordingly, aspects of the Universal Me system may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the Universal Me system may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
[0270] Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0271] A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro -magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
[0272] Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
[0273] Computer program code for carrying out operations for aspects of the Universal Me system may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
[0274] Aspects of the Universal Me system are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the Universal Me system. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0275] These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
[0276] The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0277] FIG. 4 shows another view of a configuration for running the U-Me system 100. The U-Me system 100 preferably runs in a cloud, shown in FIG. 4 as cloud 410. A user connects to the U-Me system 100 using some physical device 150 that may include a browser 430 and/or software 440 (such as an application or app) that allows the user to interact with the U-Me system 100. Note the physical device 150 is connected to the U-Me system 100 by a network connection 420, which is representative of network 370 shown in FIG. 3, and which can include any suitable wired or wireless network or combination of networks. The network connection 420 in the most preferred implementation is an Internet connection, which makes the U-Me system available to any physical device that has Internet access. Note, however, other types of networks may be used, such as satellite networks and wireless networks. The disclosure and claims herein expressly extend to any suitable network or connection for connecting a physical device to the U-Me system 100.
[0278] Various features of the U-Me system are represented in FIG. 5. U-Me system 100 includes user data 120, user licensed content 130, and user settings 140, as the specific examples in FIGS. 1 and 2 illustrate. U-Me system 100 further includes a universal user interface 142, universal templates 152, device-specific templates 154, device interfaces 156, a virtual machine mechanism 158, a conversion mechanism 160, a data tracker 162, a data search engine 164, an alert mechanism 166, a licensed content transfer mechanism 168, a retention/destruction mechanism 170, a macro/script mechanism 172, a sharing mechanism 174, a virtual device mechanism 176, an eReceipt mechanism 178, a vehicle mechanism 180, a photo mechanism 182, a medical info mechanism 184, a home automation mechanism 186, a license management mechanism 188, a sub-account mechanism 190, a credit card monitoring mechanism 192, and a user authentication mechanism 194. Each of these features is discussed in more detail below. The virtual devices 150 in FIG. 2 are preferably created and maintained by the virtual device mechanism 176 in FIG. 5.
[0279] FIG. 6 shows some specific examples of user data 120 that could be stored in a user's U-Me account, including personal files 610, contacts 615, e-mail 620, calendar 625, tasks 630, financial info 635, an electronic wallet 640, photos 645, reminders 650, eReceipts 655, medical information 660, and other data 665. The user data shown in FIG. 6 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable data that can be generated by a user, generated for a user, or any other data relating in any way to the user, including data known today as well as data developed in the future.
[0280] Personal files 610 can include any files generated by the user, including word processor files, spreadsheet files, .pdf files, e-mail attachments, etc. Contacts 615 include information for a user's contacts, preferably including name, address, phone number(s), e-mail address, etc. E-mail 620 is e-mail for the user. E-mail 620 may include e-mail from a single e-mail account, or e-mail from multiple e-mail accounts. E-mail 620 may aggregate e-mails from different sources, or may separate e-mails from different sources into different categories or views. Calendar 625 includes an electronic calendar in any suitable form and format. Tasks 630 include tasks that a user may set and tasks set by the U-Me system. Financial info 635 can include any financial information relating to the user, including bank statements, tax returns, investment account information, etc. Electronic wallet 640 includes information for making electronic payments, including credit card and bank account information for the user. Google has a product for Android devices called Google Wallet. The electronic wallet 640 can include the features of known products such as Google Wallet, as well as other features not known in the art.
[0281] Photos 645 include digital electronic files for photographs and videos. While it is understood that a user may have videos that are separate from photographs, the term "photos" as used herein includes both photographs and videos for the sake of convenience in discussing the function of the U-Me system. Reminders 650 include any suitable reminders for the user, including reminders for events on the calendar 625, reminders for tasks 630, and reminders set by the U-Me system for other items or events. eReceipts 655 include electronic receipts in the form of electronic files that may include warranty information and/or links that allow a user to make a warranty claim. Medical info 660 includes any suitable medical information relating to the user, including semi-private medical information, private medical information, and information provided by medical service providers, insurance companies, etc. Other data 665 can include any other suitable data for the user.
[0282] FIG. 7 shows some specific examples of user licensed content 130 that could be stored in a user's U-Me account, including purchased music 710, stored music 715, purchased movies 720, stored movies 725, eBooks 730, software 735, games 740, sheet music 745, purchased images 750, online subscriptions 755, and other licensed content 760. The user licensed content shown in FIG. 7 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable user licensed content, including user licensed content known today as well as user licensed content developed in the future.
[0283] Purchased music 710 includes music purchased from an online source. Note the purchased music 710 could include entire music files, or could include license information that authorizes the user to stream a music file on-demand. Stored music 715 includes music the user owns and which has been put into electronic format, such as music recorded (i.e., ripped) from a compact disc. Purchased movies 720 include movies purchased from an online source. Note the purchased movies 720 could include an entire movie file, or could include license information that authorizes the user to stream a movie on-demand. Stored movies 725 include movies the user owns and which have been put into electronic format, such as movies recorded from a digital video disc (DVD). eBooks 730 include books for the Apple iPad, books for the Kindle Fire, and books for the Barnes & Noble Nook. Of course, eBooks 730 could include books in any suitable electronic format.
[0284] Software 735 includes software licensed to the user and/or to the user's devices. In the most preferred implementation, software is licensed to the user and not to any particular device, which makes the software available to the user on any device capable of running the software. However, software 735 may also include software licensed to a user for use on only one device, as discussed in more detail below. Software 735 may include operating system software, software applications, apps, or any other software capable of running on any device. In addition, software 735 may include a backup of all software stored on all devices used by the user. Games 740 include any suitable electronic games, including games for computer systems and any suitable gaming system. Known gaming systems include Sony Playstation, Microsoft Xbox, Nintendo Wii, and others. Games 740 may include any games for any platform, whether currently known or developed in the future. Sheet music 745 includes sheet music that has been purchased by a user and is in electronic form. This may include sheet music files that are downloaded as well as hard copy sheet music that has been scanned. Some pianos now include an electronic display screen that is capable of displaying documents such as sheet music files. If a user owns such a piano, the user could access via the piano all of the user's stored sheet music 745 in the user's U-Me account. Purchased images 750 include any images purchased by the user, including clip art, pictures, etc. Online subscriptions 755 include content generated for the user on a subscription basis by any suitable provider. For example, if a user subscribes to Time magazine online, the online subscriptions 755 could include electronic copies of Time magazine. Other licensed content 760 can include any other licensed content for a user.
[0285] FIG. 8 shows some specific examples of user settings 140 that could be stored in a user's U-Me account, including universal interface settings 810, phone settings 815, tablet settings 820, laptop settings 825, desktop settings 830, television settings 835, software settings 840, vehicle settings 845, home automation settings 850, gaming system settings 855, audio system settings 860, security system settings 865, user authentication settings 870, and other settings 875. The user settings shown in FIG. 8 are examples shown for the purpose of illustration. The software settings 840, which are user settings for software applications, include user preferences for each software application. Note the term "software application" is used herein to broadly encompass any software the user can use, whether it is operating system software, an application for a desktop, an app for a phone, or any other type of software. User settings for physical devices include user settings for each physical device. The term "physical device" is used herein to broadly include any tangible device, whether currently known or developed in the future, that includes any combination of hardware and software. The disclosure and claims herein extend to any suitable user settings, including user settings known today as well as user settings developed in the future.
[0286] Universal interface settings 810 include settings for a universal interface for the U-Me system that can be presented to a user on any suitable device, which allows the user to interact with the U-Me system using that device. Phone settings 815 include settings for the user's phone, such as a smart phone. Apple iPhone and Samsung Galaxy S4 are examples of known smart phones. Tablet settings 820 include settings for the user's tablet computer. Examples of known tablet computers include the Apple iPad, Amazon Kindle, Barnes & Noble Nook, Samsung Galaxy Tab, and many others. Laptop settings 825 are settings for a laptop computer. Desktop settings 830 are settings for a desktop computer. Television settings 835 are settings for any suitable television device. For example, television settings 835 could include settings for a television, for a cable set-top box, for a satellite digital video recorder (DVR), for a remote control, and for many other television devices. Software settings 840 include settings specific to software used by the user. Examples of software settings include the configuration of a customizable menu bar in a graphics program such as Microsoft Visio; bookmarks in Google Chrome or favorites in Internet Explorer; the default file directory for a word processor such as Microsoft Word; etc. Software settings 840 may include any suitable settings for software that may be defined or configured by a user.
[0287] Vehicle settings 845 include user settings relating to a vehicle, including such things as position of seats, position of mirrors, position of the steering wheel, radio presets, heat/cool settings, music playlists, and video playlists. Home automation settings 850 include settings for a home automation system, and may include settings for appliances, heating/ventilation/air conditioning (HVAC), lights, security, home theater, etc. Gaming system settings 855 include settings relating to any gaming system. Audio system settings 860 include settings for any suitable audio system, including a vehicle audio system, a home theater system, a handheld audio player, etc. The security system settings 865 may include settings for any suitable security system. User authentication settings 870 include settings related to the user's authentication to the U-Me system. Other settings 875 may include any other settings for the user.
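By way of illustration only, the three categories described above (user data per FIG. 6, user licensed content per FIG. 7, and user settings per FIG. 8) could be grouped in a single account record along the following lines. This is a minimal sketch in Python; the class name, field names, and example values are hypothetical, as the disclosure does not prescribe any particular data model.

```python
from dataclasses import dataclass, field

@dataclass
class UMeAccount:
    """Hypothetical record grouping one user's data, licensed content,
    and settings, mirroring the categories of FIGS. 6, 7 and 8."""
    user_id: str
    user_data: dict = field(default_factory=dict)         # FIG. 6 (contacts, e-mail, ...)
    licensed_content: dict = field(default_factory=dict)  # FIG. 7 (music, eBooks, ...)
    user_settings: dict = field(default_factory=dict)     # FIG. 8 (phone, vehicle, ...)

# Example: storing one contact and one phone setting in the account.
account = UMeAccount(user_id="user-0001")
account.user_data.setdefault("contacts", []).append(
    {"name": "Jane Doe", "phone": "555-0100"})
account.user_settings["phone"] = {"ringtone.default": "chime"}
```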
[0288] The U-Me system makes a user's data, licensed content, and settings available to the user on any device the user desires to use. This is a significant advantage for many reasons. First of all, even for people who are comfortable with technology, getting a device configured exactly as the user wants is time-consuming and often requires research to figure out how to configure the device. For example, let's assume a user installs the Google Chrome browser on a desktop computer. When the user downloads a file using Google Chrome, the downloaded file appears as a clickable icon on the lower left of the Google Chrome display. To open the file, the user clicks on the icon. Let's assume the user wants to always open .pdf files after they are downloaded. Because the user does not know how to configure Chrome to do this, the user does a quick search, and discovers that Chrome can be configured to always open .pdf files after they are downloaded by clicking on a down arrow next to the downloaded .pdf file icon, which brings up a pop-up menu, then selecting "Always open files of this type." This configures Google Chrome to always open .pdf files after they are downloaded. However, the user cannot be expected to remember this small tidbit of knowledge. If the user made this setting change to Google Chrome when the desktop computer was new, and two years pass before the user gets a new desktop computer, it is highly unlikely the user will remember how to configure Google Chrome to automatically open .pdf files after they are downloaded. In any modern device, there are dozens or perhaps hundreds of such user settings. By storing these user settings in the user's U-Me account, the user will not have to remember each and every setting the user makes in each and every device. The same is true for configuring a smart phone. Often users have to search online to figure out how to do certain things, such as setting different ringtones for different contacts. In today's world, such settings are lost when a user changes to a different phone, which requires the user to repeat the learning process to configure the new phone. With the U-Me system disclosed herein, all of the user's settings are saved to the user's U-Me account, allowing a new device to be easily configured using the stored user settings.
[0289] While the previous paragraph discusses an example of a user setting in Google Chrome, similar concepts apply to user data and user licensed content. There is currently no known way to make all of a user's data, licensed content, and settings available in the cloud so this information is available to the user on any device or system the user decides to use. The Universal Me system solves this problem. The system is called Universal Me because it "allows me to be me, anywhere" for each user. Thus, a user on vacation in Italy could find an Internet cafe, use a computer in the Internet cafe to access the user's universal interface to the U-Me system, and would then have access to all of the user's data, licensed content, and settings. Similarly, the user could borrow an iPad from a friend, and have access to all the user's data, licensed content, and settings. The power and flexibility of the U-Me system leads to its usage in many different scenarios, several of which are described in detail below.
[0290] While many different categories of user settings are shown in FIG. 8, these are shown by way of example. A benefit of the U-Me system is that a user only has to configure a device once, and the configuration for that device is stored in the user's U-Me account. Replacing a device that is lost, stolen, or broken is a simple matter of buying a new similar device, then following the instructions provided by the U-Me system to configure the new device to be identical to the old device. In the most preferred implementation, the U-Me system will back up all user data, licensed content, and settings related to the device to the user's U-Me account, which will allow the U-Me system to configure the new device automatically with minimal input from the user. However, features of the devices themselves may prevent copying all the relevant data, licensed content and settings to the user's U-Me account. When this is the case, the U-Me system will provide instructions to the user regarding what steps the user needs to take before the U-Me system can configure the device with the information stored in the user's U-Me account.
[0291] The U-Me system could use various templates that define settings for different physical devices. Referring to FIG. 9, universal templates 152 include phone templates 910, tablet templates 915, laptop templates 920, desktop templates 925, television templates 930, software templates 935, vehicle templates 940, home automation templates 945, gaming system templates 950, audio system templates 955, security system templates 960, eReceipt templates 965, medical information templates 970, master template 975, and other templates 980. The universal templates shown in FIG. 9 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable universal templates, including universal templates related to devices known today as well as universal templates related to devices developed in the future.
[0292] The various universal templates in FIG. 9 include categories of devices that may include user settings. One of the benefits of the U-Me system is the ability for a user to store settings for any device or type of device that requires configuration by the user. This allows a user to spend time once to configure a device or type of device, and the stored settings in the user's U-Me account will allow automatically configuring identical or similar devices. The U-Me system expressly extends to storing any suitable user data and/or user licensed content and/or user settings for any suitable device in a user's U-Me account.
[0293] The universal templates 152 provide a platform-independent way of defining settings for a particular type of device. Thus, a universal phone template may be defined by a user using the U-Me system without regard to which particular phone the user currently has or plans to acquire. Because the universal templates are platform-independent, they may include settings that do not directly map to a specific physical device. Note, however, the universal templates may include information uploaded from one or more physical devices. The universal template can thus become a superset of user data, user licensed content, and user settings for multiple devices. The universal templates can also include settings that do not correspond to a particular setting on a particular physical device.
[0294] Referring to FIG. 10, device-specific templates 154 include phone templates 1005, tablet templates 1010, laptop templates 1015, desktop templates 1020, television templates 1025, software templates 1030, vehicle templates 1035, home automation templates 1040, gaming system templates 1045, audio system templates 1050, security system templates 1055, and other templates 1060. The device-specific templates shown in FIG. 10 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable device-specific templates, including device-specific templates for devices known today as well as device-specific templates for devices developed in the future.
[0295] The device-specific templates 154 provide platform-dependent templates. Thus, the user data, user licensed content, and user settings represented in a device-specific template include specific items on a specific device or device type. The device-specific templates 154 may also include mapping information to map settings in a physical device to settings in a universal template. FIGS. 11-21 are related to device-specific templates 154. Referring to FIG. 11, phone templates 1005 may include iPhone templates 1110, Android templates 1120 and Windows phone templates 1130, which represent different phone types. Phone templates 1005 may also include templates for a specific phone, such as an iPhone 4 template 1140 and a Samsung Galaxy S3 template 1150, as well as one or more other phone templates 1160 that may be for a phone type or for a specific phone.
[0296] Tablet templates 1010 are shown in FIG. 12 to include iPad templates 1210 and Nook templates 1220, which represent different tablet platforms. Tablet templates 1010 may also include templates for a specific tablet, such as a Kindle Fire HD template 1230 and an iPad mini 2 template 1240, as well as one or more other tablet templates 1250 that may be for a tablet type or for a specific tablet.
[0297] Laptop templates 1015 are shown in FIG. 13 to include Lenovo laptop templates 1310 and MacBook templates 1320, which represent different laptop computer types. Laptop templates 1015 may also include templates for a specific laptop, such as a Samsung Chromebook template 1330 and an HP Envy template 1340, as well as one or more other laptop templates 1350 that may be for a laptop type or for a specific laptop.
[0298] Desktop templates 1020 are shown in FIG. 14 to include HP desktop templates 1410 and Dell desktop templates 1420, which represent different desktop computer types. Desktop templates 1020 may also include templates for a specific desktop computer, such as an HP Pavilion PS-2355 desktop template 1430 and an Asus M11BB-B05 desktop template 1440, as well as one or more other desktop templates 1450 that may be for a desktop type or for a specific desktop computer.
[0299] Television templates 1025 are shown in FIG. 15 to include a Sony TV template 1510 and a satellite TV template 1520, which represent different types of television devices. Television templates 1025 may also include templates for a specific television device, such as a Mitsubishi WD-60638 template 1530, a Dish Network Hopper DVR template 1540, and an RCA RCU1010 remote template 1550, as well as one or more other television device templates 1560 that may be for a television device type or for a specific television-related device.
[0300] Software templates 1030 are shown in FIG. 16 to include a word processor template 1610 and an e-mail template 1620, which represent different types of software. Software templates 1030 may also include templates for specific software, such as a Microsoft Word template 1630 and a Google Chrome template 1640, as well as one or more other software templates 1650 that may be for a type of software or for specific software.
[0301] Vehicle templates 1035 are shown in FIG. 17 to include a Chevrolet template 1710 and a Toyota template 1720, which represent different types of vehicles. Vehicle templates 1035 may also include templates for specific vehicles, such as a Honda Civic LX template 1730 or a Ford F150 XLT template 1740, as well as one or more other vehicle templates 1750 that may be for a type of vehicle or for a specific vehicle. Note while the only vehicles shown in FIG. 17 are cars and a small truck, the vehicle templates 1035 could include templates for any type of vehicle, including cars, trucks, boats, large semi trucks, planes, and other vehicles. The "type" of the vehicle herein can also vary, and a single vehicle can correspond to many different types. For example, a 2012 Lexus RX350 could be categorized as a passenger vehicle, as a small SUV, as a Lexus, as a Lexus passenger vehicle, as a Lexus small SUV, etc. One of the significant advantages of the U-Me system is the ability to convert settings from a vehicle of one type to a vehicle of a different type. Thus, if a user normally drives a Ford F150 XLT pickup, the user's settings for his Ford pickup can be converted to corresponding settings in a Toyota rental car. Of course, brands or manufacturers are examples of "types" as well.
[0302] Home automation templates 1040 are shown in FIG. 18 to include a refrigerator template 1810, an HVAC template 1820, and an energy usage template 1830, which represent different things that may be controlled by a home automation system. Home automation templates 1040 may also include templates for specific home automation systems, such as a Home Automation Inc. (HAI) Omni template 1840, a Samsung refrigerator template 1850, and a lighting template 1860, as well as one or more other home automation templates 1870 that may be for a type of home automation controller or type of item controlled by a home automation controller, or for a specific home automation controller or item controlled by a home automation controller.
[0303] Gaming system templates 1045 are shown in FIG. 19 to include Xbox templates 1910 and Playstation templates 1920, which represent different types of gaming systems. Gaming system templates 1045 may also include templates for specific gaming systems, such as a Nintendo Wii U template 1930 and an Xbox 360 template 1940, as well as one or more other gaming system templates 1950 that may be for a type of gaming system or for a specific gaming system.
[0304] Audio system templates 1050 are shown in FIG. 20 to include stereo receiver templates 2010, home theater templates 2020, and vehicle audio templates 2030, which represent different types of audio systems. Audio system templates 1050 may also include templates for specific audio systems, such as a Sony STR-DH130 template 2040 and a Yamaha RX-V375 template 2050, as well as one or more other audio system templates 2060 that may be for a type of audio system or for a specific audio system.
[0305] Security system templates 1055 are shown in FIG. 21 to include ADT templates 2110 and FrontPoint templates 2120, which represent different types of security systems from different manufacturers. Security system templates 1055 may also include templates for specific security systems, such as a Fortress S02-B template 2130 and a SimpliSafe2 template 2140, as well as one or more other security system templates 2150 that may be for a type of security system or for a specific security system.
[0306] While the templates disclosed herein may be of any suitable format, it is expected that industry experts will have to spend time brainstorming and meeting to arrive at an industry standard. Thus, the automotive industry may generate an industry-standard template for cars, while the personal computer industry may generate a very different industry-standard template for desktop computers. Generating and publishing standard templates will greatly accelerate the acceptance of the U-Me system.
[0307] The device-specific templates shown in FIGS. 10-21 could be provided by any suitable entity. For example, the U-Me system may provide some of the device-specific templates. However, some device-specific templates will preferably be provided by manufacturers of devices. As discussed below, the U-Me system includes the capability for device manufacturers to become "U-Me Certified", which means their devices have been designed and certified to appropriately interact with the U-Me system. Part of the U-Me certification process for a device manufacturer could be for the manufacturer to provide a universal template for each category of devices the manufacturer produces, a device-specific template for each category of devices the manufacturer produces, as well as a device-specific template for each specific device the manufacturer produces.
[0308] Referring to FIG. 22, device interfaces 156 preferably include phone interfaces 2205, tablet interfaces 2210, laptop interfaces 2215, desktop interfaces 2220, television interfaces 2225, software interfaces 2230, vehicle interfaces 2235, home automation interfaces 2240, gaming system interfaces 2245, audio system interfaces 2250, security system interfaces 2255, and other interfaces 2260. The device interfaces shown in FIG. 22 are examples shown for the purpose of illustration. The disclosure and claims herein extend to any suitable device interfaces, including device interfaces for devices known today as well as device interfaces for devices developed in the future.
[0309] Each device interface provides the logic and intelligence to interact with a specific type of device or with a specific device. Thus, phone interfaces 2205 could include an iPhone interface and an Android interface. In addition, phone interfaces 2205 could include different interfaces for the same type of device. Thus, phone interfaces 2205 could include separate phone interfaces for an iPhone 4 and an iPhone 5. In the alternative, phone interfaces 2205 could be combined into a single phone interface that has the logic and intelligence to communicate with any phone. In the most preferred implementation, a device interface is provided for each specific device that will interact with the U-Me system. Providing a device interface that meets U-Me specifications could be a requirement for the manufacturer of a device to have the device become U-Me certified.
[0310] The U-Me system preferably includes a universal user interface 142 shown in FIG. 5. The universal user interface 2300 shown in FIG. 23 is one suitable example of a specific implementation for the universal user interface 142 shown in FIG. 5. The universal user interface 2300 in FIG. 23 includes several icons the user may select to access various features in the U-Me system. The icons shown in FIG. 23 include a data icon 2310, a licensed content icon 2320, a software icon 2330, a settings icon 2340, a devices icon 2350, and a templates icon 2360. Selecting the data icon 2310 gives the user access to the user data 120 stored in the user's U-Me account, including the types of data shown in FIG. 6. One way for the user to access the user data 120 is via a data search engine, discussed in more detail below. Selecting the licensed content icon 2320 gives the user access to any and all of the user's licensed content 130, including the categories of licensed content shown in FIG. 7. Selecting the software icon 2330 gives the user access to software available in the user's U-Me account. While software is technically a category of licensed content (see 735 in FIG. 7), a separate icon 2330 is provided in the universal user interface 2300 in FIG. 23 because most users would not mentally know to select the licensed content icon 2320 to run software. Selecting the software icon 2330 results in a display of the various software applications available in the user's U-Me account. The user may then select one of the software applications to run. The display of software icons after selecting the software icon 2330 could be considered a "virtual desktop" that is available anywhere via a browser or other suitable interface.
[0311] Selecting the settings icon 2340 gives the user access to any and all of the user settings 140, including the categories of settings shown in FIG. 8. Selecting the devices icon 2350 gives the user access to virtual devices, which are discussed in more detail below, where the virtual devices correspond to a physical device used by the user. The user will also have access to the device interfaces 156, including the device interfaces shown in FIG. 22. Accessing devices via the device interfaces allows the user to have remote control via the universal user interface over different physical devices. Selecting the templates icon 2360 gives the user access to the templates in the user's U-Me account, including universal templates, such as the universal templates shown in FIG. 9, and device-specific templates, such as those shown in FIGS. 10-21. The devices icon 2350 and the templates icon 2360 provide access to information in the user's U-Me account pertaining to devices and templates, which can be part of the settings in the user's U-Me account. While the Devices icon 2350 and Templates icon 2360 could be displayed as a result of the user selecting the Settings icon 2340, providing these icons 2350 and 2360 separate from the Settings icon 2340, as shown in FIG. 23, makes using the universal user interface 2300 more intuitive for the user.
[0312] The universal user interface gives the user great flexibility in accessing a user's U-Me account. In the most preferred implementation, the universal user interface is browser-based, which means it can be accessed on any device that has a web browser. Of course, other configurations for the universal user interface are also possible, and are within the scope of the disclosure and claims herein. For example, a user on vacation in a foreign country can go into an Internet cafe, invoke the login page for the U-Me system, log in, and select an icon that causes the universal user interface (e.g., 2300 in FIG. 23) to be displayed. The user then has access to any and all information stored in the user's U-Me account.
[0313] Because the universal user interface allows a user to access the user's U-Me account on any device, the universal user interface also provides a way for a user to change settings on the user's devices. Because the user's U-Me account includes virtual devices that mirror the configuration of their physical device counterparts, the user could use a laptop or desktop computer to define the settings for the user's phone. This can be a significant advantage, particularly for those who don't see well or who are not dexterous enough to use the tiny keypads on a phone. A simple example will illustrate. Let's assume a U-Me user wants to assign a specific ringtone to her husband's contact info in her phone. The user could sit down at a desktop computer, access the universal user interface 2300, select the Devices icon 2350, select a Phone icon, which then gives the user access to all of the settings in the phone. The user can then navigate a menu displayed on a desktop computer system using a mouse and full-sized keyboard to change settings on the phone instead of touching tiny links and typing on a tiny keyboard provided by the phone. The user could assign the ringtone to her husband's contact info in the settings in the virtual device in the U-Me account that corresponds to her phone. Once she makes the change in the virtual phone settings in the U-Me account, this change will be automatically propagated to her phone. The universal user interface may thus provide access to the user to set or change the settings for all of the user's physical devices.
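As a rough sketch of how such a change might propagate, the following illustrates a virtual device accepting a settings change and pushing it to its physical counterpart. The class names and the push_setting method are hypothetical stand-ins for whatever interface a U-Me certified device would actually expose.

```python
class LoggingPhoneInterface:
    """Stand-in for a device interface (see FIG. 22) that just logs pushes."""
    def push_setting(self, device_id, key, value):
        print(f"push to {device_id}: {key} = {value}")

class VirtualPhone:
    """Hypothetical virtual device mirroring a physical phone's settings."""
    def __init__(self, device_id, settings, device_interface):
        self.device_id = device_id
        self.settings = settings              # mirror of the phone's settings
        self.device_interface = device_interface

    def set_setting(self, key, value):
        self.settings[key] = value            # 1. update the virtual device
        # 2. propagate the change to the physical phone via its interface
        self.device_interface.push_setting(self.device_id, key, value)

# The ringtone change made from a desktop propagates to the phone.
phone = VirtualPhone("janes-phone", {}, LoggingPhoneInterface())
phone.set_setting("contacts.husband.ringtone", "classic_bell")
```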
[0314] The universal user interface 142 can include any suitable interface type. In fact, the universal user interface 142 can provide different levels of interfaces depending on preferences set by the user. Thus, the universal user interface may provide simple, intermediate, and power interfaces that vary in how the information is presented to the user depending on the user's preferences, which could reflect the technical prowess and capability of the user. Those who are the least comfortable with technology could select a simple interface, which could provide wizards and lots of help context to help a user accomplish a desired task. Those more comfortable with technology could select the intermediate interface, which provides fewer wizards and less help, but allows a user to more directly interact with and control the U-Me system. And those who are very technically-oriented can select the power interface, which provides few wizards or help, but allows the user to directly interact with and control many aspects of the U-Me system in a powerful way.
[0315] There are many different ways to program a device using the information in the user's U-Me account. Referring to FIG. 24, a method 2400 for programming a device called Device2 begins by determining settings for Device2 (step 2410), then programming the device with those settings (step 2420). There are different ways to determine the settings for Device2 in step 2410. Referring to FIG. 25, method 2500 shows one suitable implementation for step 2410 in FIG. 24. Settings for a device called Device1 are read (step 2510). A mapping from Device1 to Device2 is then read (step 2520). The settings for Device1 are then converted to the settings for Device2 (step 2530). This is shown graphically in FIG. 26, where the Device1 settings 2610 are converted using the Device1 to Device2 mapping 2620 to Device2 settings 2630. This first example in FIGS. 25 and 26 shows how to program a device by converting settings from one device to settings for a different device. For example, let's assume a user has been using an iPhone 4, then decides to change to a Samsung Galaxy S4 phone. Assuming there are device-specific templates 154 for both phones, the conversion mechanism 160 in FIG. 5 can convert the settings on the iPhone 4 to settings on the Samsung Galaxy S4, provided there is a mapping in the phone templates between the device-specific settings of the two devices. The example in FIGS. 25 and 26 shows how to program a device by converting from settings of a different device.
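A minimal sketch of this first, direct device-to-device conversion follows. The setting names and mapping entries are hypothetical, and a real mapping may also need to translate setting values, not just rename keys.

```python
def convert_settings(source_settings, mapping):
    """Convert Device1 settings to Device2 settings using a key mapping
    (FIGS. 25-26). Settings with no mapping entry have no Device2
    equivalent and are skipped."""
    return {mapping[key]: value
            for key, value in source_settings.items()
            if key in mapping}

# Hypothetical iPhone 4 -> Samsung Galaxy S4 conversion for two settings.
device1_settings = {"wallpaper": "beach.jpg", "ringer.volume": 70}
device1_to_device2 = {"wallpaper": "home_screen.background",
                      "ringer.volume": "sound.ringer_level"}
device2_settings = convert_settings(device1_settings, device1_to_device2)
# {'home_screen.background': 'beach.jpg', 'sound.ringer_level': 70}
```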
[0316] A second suitable implementation for step 2410 in FIG. 24 is shown in FIGS. 27 and 28. In this implementation, Device2 is programmed from settings stored in the Universal Template corresponding to Device2. The universal template settings are read (step 2710). A mapping from the universal template to Device2 is read (step 2720). The conversion mechanism then converts the settings from the universal template to the settings for Device2 (step 2730). This is shown graphically in FIG. 28, where universal template settings 2810 are converted using the universal template to Device2 mapping 2820 to generate Device2 settings 2630. This second implementation in FIGS. 27 and 28 varies from the first implementation in FIGS. 25 and 26 because the conversion is from the universal template settings to the Device2 settings, not from the settings of another device (such as Device1).
[0317] A third suitable implementation for step 2410 in FIG. 24 is shown in FIGS. 29 and 30. Device1 settings are read (step 2910). A mapping from Device1 to the universal template is also read (step 2920). The Device1 settings are then converted to the universal template settings (step 2930). A mapping from the universal template to Device2 is then read (step 2940). The universal template settings are then converted to Device2 settings (step 2950). This is shown graphically in FIG. 30, where the Device1 settings are converted using the Device1 to universal template mapping 3020 to universal template settings 3030, which are then converted using the universal template to Device2 mapping 3040 to Device2 settings 3050. This third implementation converts settings between two devices, similar to the first implementation shown in FIGS. 25 and 26, but rather than using a direct mapping between the two devices, it uses a mapping to and from the universal template settings.
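The third implementation can be sketched by composing two such mappings; again the setting names are hypothetical. A design benefit of the pivot is that each device needs only mappings to and from the universal template, rather than one mapping for every pair of devices.

```python
def convert_settings(settings, mapping):   # same helper as in the sketch above
    return {mapping[key]: value for key, value in settings.items() if key in mapping}

device1_settings = {"wallpaper": "beach.jpg", "ringer.volume": 70}
device1_to_universal = {"wallpaper": "universal.background",
                        "ringer.volume": "universal.ringer_level"}
universal_to_device2 = {"universal.background": "home_screen.background",
                        "universal.ringer_level": "sound.ringer_level"}

# Steps 2910-2950: Device1 -> universal template -> Device2.
universal_settings = convert_settings(device1_settings, device1_to_universal)
device2_settings = convert_settings(universal_settings, universal_to_device2)
# Same result as the direct conversion, but each device needs only mappings
# to and from the universal template, not one mapping per device pair.
```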
[0318] The conversion of settings from one device to another in FIGS. 25-30 can be performed by the conversion mechanism 160 shown in FIG. 5, which could include the user settings mechanism 324 and the user settings mapping information 326 shown in FIG. 3. The examples in FIGS. 25-30 allow converting settings from one device to corresponding settings for a different device. The different device may be of the same type or may be of a different type. Type can be defined according to hardware architecture, system software (e.g., operating system), manufacturer, brand, or any other characteristic that characterizes a device. Thus, an iPhone and a Samsung Galaxy phone are devices of different types because they have a different hardware architecture type and run different system software. A Chevrolet and a Toyota are devices of different types because they are made by different manufacturers. An iPhone 4 and an iPhone 5 could be categorized as devices of the same type because they have the same hardware architecture type and run the same system software, even if the versions of the system software are not exactly the same. The disclosure and claims herein extend to any suitable definition or categorization for the "type" of a device. The conversion mechanism allows converting settings between devices of the same type, between devices of similar type, and also between devices of different types. For example, devices may be of the same type when they have the same hardware architecture type and run the same system software. Devices may be of similar type when they have the same hardware architecture type and run different system software. Devices may be of different types when they have different hardware architecture type and different system software.
[0319] We now consider one specific usage of the U-Me system with regard to television equipment, with respect to FIGS. 31-37. We assume a user's television settings are stored in the user's U-Me account. Examples of suitable television settings 835 are shown in FIG. 32 to include one or more favorite channels lists 3210, shows set to record 3220, blocked channels 3230, parental controls 3240, channel numbers for stations 3250, and one or more passwords 3260. These are all settings the user can define, for example, in a DVR for Dish Network. For this specific example, we assume the user has Dish Network at the user's home, and programs the Dish Network DVR with some or all of the user television settings 835 shown in FIG. 32. We now assume the user travels to a new location during a vacation, such as a hotel room, a resort, a relative's house, etc., and we further assume the new location has DirecTV. Referring to FIG. 33, method 3300 begins by detecting the target system (at the new location) is a DirecTV system (step 3310). The user's Dish Network television settings are converted to equivalent or similar DirecTV settings in the user's U-Me account (step 3320). The converted DirecTV settings from the user's U-Me account are then downloaded to the DirecTV target system (e.g., DVR) at the new location (step 3330). The result is the user's Dish Network television settings are now available on the DirecTV DVR. One part of the conversion in step 3320 is converting the channel numbers from Dish Network to the equivalent channel numbers in DirecTV. A sample mapping for ten channels is shown at 3100 in FIG. 31. Note the channels ABC, NBC, CBS and Fox in the mapping 3100 show "local" instead of a number, because the channel numbers will vary from one geographic region to the next. The indication of "local" in the channel mapping will indicate a need to determine the location of the target system, and determine the appropriate mapping to the target system using the channel numbers that are specific to the geographic region where the target system is located. This is a task easily accomplished by the U-Me system. The mapping 3100 shown in FIG. 31 is one suitable example for user settings mapping information 326 shown in FIG. 3.
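By way of illustration, a channel mapping like the one at 3100 in FIG. 31 might be represented and applied as sketched below. The TNT and Fox News numbers match the FIG. 31 example used in this discussion; the Dish channel number for ABC, the region name, and the region-specific lookup table are hypothetical.

```python
# Dish Network channel -> DirecTV channel, in the style of mapping 3100 in
# FIG. 31. A tuple marks a "local" station whose DirecTV number depends on
# the geographic region of the target system.
DISH_TO_DIRECTV = {
    138: 245,               # TNT
    205: 360,               # Fox News
    8:   ("local", "ABC"),  # hypothetical Dish channel for ABC
}

# Hypothetical region-specific numbers for local stations on DirecTV.
LOCAL_DIRECTV_CHANNELS = {("Orlando", "ABC"): 11}

def map_channel(dish_channel, region):
    """Return the DirecTV equivalent of a Dish channel number (step 3320)."""
    target = DISH_TO_DIRECTV[dish_channel]
    if isinstance(target, tuple):          # "local": resolve using the region
        _, station = target
        return LOCAL_DIRECTV_CHANNELS[(region, station)]
    return target

print(map_channel(138, "Orlando"))  # 245
print(map_channel(8, "Orlando"))    # 11 (region-specific)
```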
[0320] Note that known DVRs for Dish Network and DirecTV do not allow downloading settings as discussed above with respect to method 3300 in FIG. 33. For television providers to work in conjunction with the U-Me system, each provider's DVR will need to be "U-Me Certified", meaning the DVR includes logic and intelligence that allows the DVR to interact with the U-Me system. This certification process will also preferably provide a device-specific template for each DVR, along with information that allows mapping the settings from one provider to another provider. In the most preferred implementation, a universal template for a DVR could be defined with required fields, and each device-specific template for each DVR will have to have the required fields specified in the universal DVR template.
[0321] Changing television settings in the new location would not be very helpful unless the user has a remote control that can accommodate the change. We assume for this example a user has a remote control with a screen that displays channel icons, such as shown in FIG. 35. Such remote controls, including the Pronto touch-screen remote control and the RCA RCU1010 remote control, allow displaying a channel icon. Method 3400 in FIG. 34 can be used to reprogram a remote control to accommodate the change of location in the example above. This example also assumes the remote control is "U-Me Certified", meaning the remote control includes logic and intelligence that allows the remote control to interact with the U-Me system. The settings for the remote control are read (step 3410). Thus, the mapping of channel icons to channel numbers is determined in step 3410. The settings are converted to equivalent or similar settings for the target system (step 3420). This means the channel numbers of the displayed icons in display 3500 in FIG. 35 for Dish Network are converted to the equivalent channel numbers using the mapping 3100 in FIG. 31. The conversion of settings is preferably performed by the conversion mechanism 160 shown in FIG. 5. The remote control is then reprogrammed for the target system (step 3430). This means the channel numbers that are sent by the remote control are now the channel numbers for DirecTV, not Dish Network. Thus, when the user is home and presses the Fox News icon, the remote control sends channel 205 to the Dish Network DVR. But after the remote control has been reprogrammed for the target system at the new location as shown in FIG. 34, when the user presses the Fox News icon, the remote control will now send channel 360 to the DirecTV DVR. This reprogramming thus allows a user to use a remote control with icon-based channels by reprogramming the underlying channel numbers that are sent by the remote control when an icon is pressed. The user is thus able to travel with the user's home remote control, and have the remote control be automatically reprogrammed to accommodate the television system at the new location, assuming the television system at the new location is U-Me compliant.
[0322] In addition to reprogramming the remote to transmit different channel numbers when an icon is pressed, the remote can also be reprogrammed to transmit different channel numbers than the channel number pressed by the user. This is shown in method 3600 in FIG. 36. A user uses the numeric keypad on the remote control to key in a channel number for Device1 (step 3610). The remote automatically converts the channel number entered by the user to the equivalent channel number in the target system (step 3620). The remote then transmits the channel number for Device2 (step 3630). For the simple example given above, with Dish Network at the user's home and DirecTV at the new location, when the user presses channel 138 on the remote control keypad (step 3610), the remote control will detect the number and convert the number 138 for TNT in Dish Network to number 245 for TNT in DirecTV (as shown in FIG. 31) (step 3620). The remote control then transmits channel number 245 to the DirecTV DVR (step 3630). In this manner the user need not learn the new channel numbers at the new location, but can instead use the old channel numbers from home to access the same television channels on the system at the new location.
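Method 3600 can be sketched as follows; the transmit function is a stand-in for whatever IR or Wi-Fi transmission the remote control actually performs, and the two mapping entries are the FIG. 31 example values.

```python
DISH_TO_DIRECTV = {138: 245, 205: 360}  # subset of the FIG. 31 mapping

def on_keypad_entry(dish_channel, transmit):
    """Steps 3610-3630 of FIG. 36: the user keys a channel in Dish Network
    numbering; the remote converts it and transmits the DirecTV number."""
    directv_channel = DISH_TO_DIRECTV[dish_channel]  # step 3620
    transmit(directv_channel)                        # step 3630

# The user keys 138 (TNT on Dish); the remote sends 245 (TNT on DirecTV).
on_keypad_entry(138, transmit=lambda ch: print("sending channel", ch))
```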
[0323] Note there are apps for Android smart phones that allow turning the phone into a touch-screen remote control for certain types of TVs. One such app is Smart TV Remote Control, which can serve as a remote control for certain Samsung televisions. Because known Android smart phones typically do not have an infrared (IR) transmitter that is commonly used in many remote controls, a smart phone cannot be used in the same manner as conventional remote controls. Instead, these apps send signals via the Wi-Fi network, and these signals are transmitted via the Wi-Fi network to the television, which is Wi-Fi enabled. As Wi-Fi enabled televisions that can be controlled by a Wi-Fi remote (such as a smart phone with the appropriate app) become more popular, the methods discussed above with respect to FIGS. 31-36 could be carried out by reprogramming a smart phone app. This will be incredibly convenient because the user will always travel with the user's smart phone, which means the user will always have a remote control that can be reprogrammed by the U-Me system to work on a target system at a new location. Of course, this scenario is many years into the future, after such televisions are widely available and after manufacturers of televisions, television equipment, and remote controls all become U-Me certified.
[0324] Method 3700 in FIG. 37 shows another method for reprogramming a remote control. A user selects a TV provider on the remote control (step 3710). The remote control determines its location (step 3720). From the detected location and the selected TV provider, the remote determines the channel numbers for defined channel icons using a database of TV providers (step 3730). The remote then reprograms itself with the channel numbers for the selected TV provider at the detected location (step 3740). A simple example will illustrate. Let's assume the same scenario discussed in detail above, where a user has Dish Network at home and travels to a location that has DirecTV. The user could press a button, icon, or selection from a drop-down list on the remote control that selects DirecTV in step 3710. The remote control could detect its location in step 3720 in any suitable way, including an internal GPS device, a wireless network interface that detects an Internet Protocol (IP) address and determines a geographic location for that IP address, or in any other suitable way. The remote then consults a database of channel numbers for various TV providers at that geographic location. In one embodiment, the database will be stored in the remote control itself. In another embodiment, the database will be stored external to the remote, such as at a website, and could be accessed by the remote control via a Wi-Fi connection. Once the remote control determines the channel numbers that correspond to DirecTV at the geographic location, the remote control reprograms itself for those channel numbers (step 3740). Note that method 3700 supports changing the underlying channel numbers for displayed channel icons, similar to that discussed with respect to FIG. 34, as well as dynamically changing channel numbers entered by the user, similar to that discussed above with respect to FIG. 36. The U-Me system provides a very powerful way for a user to use settings the user is accustomed to using at home while interacting with an entirely unfamiliar system at a new location.
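A sketch of method 3700 follows. The provider database contents, the location source, and the icon assignments are all hypothetical stand-ins; as noted above, a real remote might hold the database internally or fetch it over a Wi-Fi connection.

```python
# Hypothetical (provider, region) -> {station: channel number} database.
PROVIDER_DB = {
    ("DirecTV", "Orlando"):          {"TNT": 245, "Fox News": 360},
    ("Dish Network", "Kansas City"): {"TNT": 138, "Fox News": 205},
}

def reprogram_remote(provider, detect_location, icon_stations):
    """Steps 3710-3740 of FIG. 37: look up channel numbers for the selected
    provider at the detected location and rebind the remote's channel icons."""
    region = detect_location()                  # step 3720 (GPS, IP lookup, ...)
    channels = PROVIDER_DB[(provider, region)]  # step 3730
    return {icon: channels[station]             # step 3740
            for icon, station in icon_stations.items()}

icons = reprogram_remote("DirecTV", lambda: "Orlando",
                         {"TNT icon": "TNT", "Fox News icon": "Fox News"})
# {'TNT icon': 245, 'Fox News icon': 360}
```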
[0325] One area that has not benefitted much from modern technology is receipts for purchases. For example, let's assume a person buys a television at Best Buy that has a two year warranty. The purchaser is given a paper receipt. Let's then assume that six months later, the television quits working. Let's further assume that for this television, Best Buy's warranty policy states to return the television to the store if warranty service is needed within 90 days of purchase date, and to return the television to the manufacturer if warranty service is needed after 90 days of the purchase date but within the two year warranty period. When the TV quits working, the purchaser will have to locate the paper receipt, then call or e-mail Best Buy to determine where and how to make a warranty claim. This is grossly inefficient, especially in a world where electronic communications and transactions are so easily accomplished.
[0326] Some retailers have made efforts to create electronic receipts and records of purchases. For example, Lowes offers its customers a MyLowes card. After a person signs up for a MyLowes card, the person presents his or her MyLowes card to be scanned for each purchase. The result is that the receipt is made available electronically at the Lowes website when the Lowes customer logs in. This gives a central place where a customer can review all receipts for all purchases made at Lowes. While this is progress, it is one solution for one retailer. What is needed is a way to create electronic receipts for all purchases so a user can easily track all purchases electronically.
[0327] The U-Me system introduces the concept of an eReceipt, which is simply a receipt in electronic form. In the most preferred implementation, the eReceipt may include warranty information as well as a record of a purchase. An eReceipt is processed by the eReceipt mechanism 178 shown in FIG. 5. Referring to FIG. 38, a method 3800 begins by defining an eReceipt template (step 3810). The eReceipt template is then published (step 3820). Once published, any and all vendors may create eReceipts that conform to the published eReceipt template.
[0328] The eReceipt template can be defined in any suitable way. One suitable way uses a markup language such as XML to define fields, some of which are mandatory and some of which are optional. A seller can determine from the fields in the eReceipt template how to format an eReceipt that conforms to the template.
[0329] Method 3900 in FIG. 39 shows how eReceipts are used. A user buys a product (step 3910). The seller determines the warranty policy for the product (step 3920). The seller formats an eReceipt according to the warranty policy and the eReceipt template (step 3930). The seller sends the eReceipt to the user (step 3940), preferably to the user's U-Me account. The eReceipt is then processed and stored (step 3950) in the user's U-Me account. The result of method 3900 is an electronic copy of a receipt that is automatically stored in the user's U-Me account when the user makes a purchase. This assumes a seller delivers eReceipts that are formatted according to the eReceipt template. This is easily done by attaching an eReceipt file to an e-mail. The U-Me system monitors all incoming e-mail, and when an eReceipt is detected in an incoming e-mail, the U-Me system reads the eReceipt. The U-Me system will then process and store the eReceipt in the user's U-Me account. In the most preferred implementation, the eReceipt will be processed and stored in the user's U-Me account without any further input required by the user. However, in an alternative implementation, the user may be prompted to enter information related to the purchase before the eReceipt is stored in the user's U-Me account.
[0330] FIG. 40 shows a sample warranty policy 4000. If a warranty claim is less than 90 days from the date of purchase, the item will be returned to the store. If a warranty claim is 90 days or more from the date of purchase, the item will be returned to the manufacturer. This warranty policy information may be included in the eReceipt. In one specific implementation, the warranty policy is included in the eReceipt in the form of a timed warranty link, discussed in more detail below.
[0331] FIG. 41 shows a method 4100 that illustrates a specific example for method 3900 in FIG. 39. A user buys a television from manufacturer ABC from seller XYZ (step 4110). The seller XYZ formats an eReceipt with a timed warranty link according to the eReceipt template (step 4120). The eReceipt with the timed warranty link is e-mailed to the user (step 4130). The information in the eReceipt is then extracted and stored in fields in an eReceipt database in the user's U-Me account (step 4140), where the fields correspond to the fields defined in the eReceipt template.
[0332] The eReceipt can include a timed warranty link, which allows the user to submit a warranty claim by clicking on the timed warranty link. A timed warranty link may be created and maintained using any suitable method. Two such methods are disclosed herein by way of example. The first method is shown in method 4200 in FIG. 42. We assume the U-Me system detects a timed warranty link (step 4210). For this specific example, we assume the timed warranty link is according to the warranty policy 4000 shown in FIG. 40. Based on the warranty policy, the warranty link in the U-Me system is initially set to point to the seller's warranty return system (step 4220). The U-Me system then sets a timer according to the timed warranty link (step 4230). For the example warranty policy 4000 shown in FIG. 40, the timer will be set for 90 days from the date of purchase. As long as the timer does not fire (step 4240=NO), method 4200 waits. Once the timer fires (step 4240=YES), the warranty link is set to point to the manufacturer's warranty return system. Method 4200 thus shows how the U-Me system can determine the presence of a timed warranty link in an eReceipt, and can then change the warranty link accordingly.
[0333] A second method for defining a timed warranty link is shown in FIG. 43. This timed warranty link appears to the user as a link in plain language, such as "Click here to make a warranty claim." However, the logic shown in FIG. 43 underlies the warranty link. The timed warranty link in FIG. 43 includes a date that is 90 days from the date of purchase (in accordance with the warranty policy 4000 in FIG. 40). If the current date is less than or equal to the set date at 90 days, selecting the link takes the user to the Best Buy warranty claim website. If the current date is greater than the set date at 90 days, selecting the link takes the user to the Mitsubishi warranty claim website. This is shown in method 4700 in FIG. 47. A user clicks on the timed warranty link (step 4710). The system compares the current date to one or more dates in the timed warranty link (step 4720). The system then navigates to the link corresponding to the current date (step 4730). By providing timed warranty links, the user has an improved experience because the appropriate place to submit a warranty claim is automatically presented to the user when the user selects a timed warranty link.
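A minimal sketch of this second method follows. The 90-day cutoff is taken from the example policy of FIG. 40; the URLs are illustrative placeholders, not actual claim sites.

    from datetime import date, timedelta

    purchase_date = date(2013, 8, 2)
    cutoff = purchase_date + timedelta(days=90)  # per the policy of FIG. 40

    SELLER_CLAIM_URL = "https://example.com/bestbuy/warranty"          # hypothetical
    MANUFACTURER_CLAIM_URL = "https://example.com/mitsubishi/warranty" # hypothetical

    def resolve_warranty_link(today):
        # Within 90 days of purchase the claim goes to the seller;
        # 90 days or more after purchase it goes to the manufacturer.
        return SELLER_CLAIM_URL if today <= cutoff else MANUFACTURER_CLAIM_URL

    print(resolve_warranty_link(date(2013, 9, 1)))   # seller's claim site
    print(resolve_warranty_link(date(2014, 1, 15)))  # manufacturer's claim site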
[0334] Other information could also be included in the eReceipt. For example, a link to order parts could be included. Thus, an eReceipt for a lawnmower could include a link to a parts web page that would allow the user to order parts for the lawnmower, including blades, belts, or any other parts. The advantage of providing a parts link with the eReceipt is the information in the eReceipt can be used to direct the user to the correct parts page automatically. The user no longer has to go to the garage and find the sticker on the lawnmower that indicates the name, model and serial number, because this information is preferably included in the eReceipt. The eReceipt thus provides a very effective way for sellers and manufacturers to provide valuable information to customers.
[0335] An example of an eReceipt template 965 is shown in FIG. 44 to include multiple sections, including a Seller Information section, a Product Information section, a Transaction Information section, a Buyer Information section, and an Embedded Metadata section. The Seller Information section includes fields for Seller, Seller ID and Location. The Product Information section includes fields for Product Category, Product Type, Product Attribute, Manufacturer, Product ID, Serial Number, Price, Warranty Link, and Gift. Note the Product Information fields are preferably replicated for each item that is purchased. The Transaction Information section includes fields for Date, Transaction ID, Tax, Shipping and Total. The Buyer Information section includes fields for Buyer Name, Buyer Address, Buyer Phone, and Buyer E-mail. The Embedded Metadata field includes data that is in the eReceipt but that is not visible when an eReceipt is viewed. Of course, any suitable field could be included in an eReceipt. The fields shown in FIG. 44 are by way of example, and are not limiting.
[0336] Embedded Metadata includes a unique identifier that allows uniquely identifying the eReceipt. Values stored in the Embedded Metadata field include constant values, or values generated using any suitable heuristic. For example, a manufacturer could provide embedded metadata in the form of
<SellerID.Date.ValidationCode>, where the SellerID and Date are from the values stored in the eReceipt and the Validation Code is a unique code that is dynamically generated by the seller and assigned to this eReceipt. The Embedded Metadata provides an electronic identifier that can identify this receipt as genuine.
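The following sketch illustrates one way such embedded metadata could be generated. The use of an HMAC keyed by a seller secret is an assumption; any scheme that produces a unique, verifiable validation code would suffice.

    import hmac, hashlib

    def make_embedded_metadata(seller_id, purchase_date, transaction_id, secret_key):
        # Validation code: a short keyed hash over the receipt fields.
        payload = f"{seller_id}.{purchase_date}.{transaction_id}".encode()
        code = hmac.new(secret_key, payload, hashlib.sha256).hexdigest()[:12]
        return f"{seller_id}.{purchase_date}.{code}"

    meta = make_embedded_metadata("14296", "2013-08-02", "543921268", b"seller-secret")
    print(meta)  # e.g. 14296.2013-08-02.<12-character code>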
[0337] An example of an eReceipt 4500 formatted according to the eReceipt template 965 in FIG. 44 is shown in FIG. 45. This particular eReceipt is for a Mitsubishi 60 inch television purchased at Best Buy. The Seller is listed as Best Buy. The Seller ID is shown as 14296, which could be a code that uniquely identifies Best Buy from other sellers. The TV was purchased at Store 564, which is a code that tells Best Buy where the TV was purchased. The Product Category is Home Electronics. The Product Type is Flat Screen TV. The Product Attribute is 60 inch DLP. The manufacturer is Mitsubishi. The Product ID is WD-60735. The serial number is 166-4923. The price is $1,499.99. The Warranty Link shows <Click Here to make a Warranty Claim>, which is abbreviated in FIG. 45 due to space constraints. The Warranty Link includes a timed warranty link as discussed above. The Gift field has a value of No because this TV was not purchased as a gift.
[0338] The purchase date of the TV was 08/02/2013. The transaction ID is 543921268. The Sales Tax is $123.12. The shipping is zero (because the customer purchased the TV at a store). The total for the purchase is $1,623.11. The Buyer Name is Jim Jones. The Buyer Address is 21354 Dogwood, Carthage, MO 64836 (not shown in FIG. 45 due to space constraints). The Buyer Phone is 417-555-3399. The buyer e-mail is J29A@gmail.com. The Embedded Metadata is data that uniquely identifies the eReceipt and can be used in the future to validate the eReceipt.
[0339] Some sellers may include the buyer information, and some may not. In the most preferred implementation, the eReceipt received from the seller includes the Buyer Information. However, in an alternative implementation, the eReceipt could be e-mailed to the buyer without all of the Buyer Information. When this happens, the eReceipt mechanism could automatically add the buyer information to the eReceipt, or could leave the buyer information incomplete.
[0340] One advantage of using eReceipts is that this allows manufacturers to receive from sellers a record of who purchased their product and when. Referring to FIG. 46, method 4600 shows a seller sending an eReceipt to the manufacturer of the product identified in the eReceipt (step 4610). The manufacturer can then register the product to the eReceipt. Note the product is registered to the eReceipt and not necessarily to the buyer, although the eReceipt includes the buyer's information. This is because many products are purchased as gifts, especially during the Christmas season.
[0341] Having eReceipts for gifts is a great advantage, because the eReceipts can be forwarded to gift recipients so they can have the purchase and warranty information for the gifts they received. Thus, after the holiday season is over, a user could do a search for all eReceipts for items purchased in November and December, where the eReceipt indicates the product was a gift. For each such eReceipt, the user who bought the gift could forward the eReceipt via e-mail to the gift recipient. If the gift recipient is a user of the U-Me system, the eReceipt will be processed and put into the user's eReceipt database. Even if the gift recipient is not a user of the U-Me system, having the eReceipt as an attachment to an e-mail will make the eReceipt available on the gift recipient's system so the gift recipient does not have to keep track of a paper gift receipt. Note the eReceipt mechanism 178 can also include the ability to delete price information when forwarding an eReceipt to a gift recipient. A user could check a "Gift Receipt" box, which would delete all financial information from the gift receipt, such as price, sales tax, and shipping, before sending the eReceipt to the gift recipient. The eReceipt mechanism 178 can thus provide an eGiftReceipt to the gift recipient for the product that was given as a gift, which includes all pertinent product information without including the financial information.
[0342] Because many products are purchased by one person then given as a gift to a different person, the method 4600 in FIG. 46 registers a product to an eReceipt received from the seller. Let's assume the eReceipt is then transferred from a first U-Me user to a second U-Me user who received the product as a gift from the first U-Me user. If the second U-Me user needs to make a warranty claim, the second U-Me user can click on the warranty link in the eReceipt, which we assume for this example directs the second U-Me user to the manufacturer's warranty claim website, where the second U-Me user submits the eReceipt to the manufacturer to identify the product. The manufacturer can then search its database and locate the corresponding eReceipt to which the product was registered. The manufacturer could check the embedded metadata in the eReceipt to verify it is the same as the eReceipt to which the product was originally registered. When the fields in the eReceipt sent by the second user are verified by the manufacturer as matching the fields in the eReceipt to which the product was registered, and when the embedded metadata in both eReceipts match, the manufacturer can then provide the warranty service to the new user.
[0343] The eReceipt concept can also be extended to help a user report and potentially recover stolen goods. Let's assume a burglar steals a television, a computer, and a home theater audio system from a U-Me user's house. If the user has eReceipts for these stolen goods, the user can submit the eReceipts to the police, to an insurance company, and to the U-Me system to report the goods as stolen. The U-Me system could, in turn, contact the manufacturer and/or insurance company to inform them these goods were stolen. Because the eReceipt includes all the pertinent information for the product, including serial number, the eReceipt should contain all information law enforcement and insurance companies need to identify the stolen property if it is recovered.
[0344] Having eReceipts stored in a user's U-Me account allows monitoring the eReceipts for warranty expiration. Referring to FIG. 48, in method 4800 the U-Me system monitors for warranty expiration (step 4810). This can be done, for example, by logging warranty information embedded in eReceipts. When there are no warranties near expiration (step 4820=NO), method 4800 loops back and waits. When a warranty for a product in an eReceipt will expire within a specified time period (step 4820=YES), such as two months, the user is notified of the impending warranty expiration (step 4830). When there is an extended warranty available (step 4840=YES), the price and terms of the extended warranty are provided to the user (step 4850). When no extended warranty is available (step 4840=NO), method 4800 ends. Method 4800 provides a way for the U-Me system to provide reminders of warranties that are about to expire, which provides an opportunity for the manufacturer to sell an extended warranty. Even when an extended warranty is not available, the user is still given the notice that the warranty is about to expire.
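A sketch of this monitoring loop follows. The two-month window and the record layout are illustrative only.

    from datetime import date, timedelta

    receipts = [
        {"product": "Flat Screen TV", "warranty_end": date(2014, 8, 2),
         "extended_warranty": {"price": 199.99, "term_months": 24}},
        {"product": "Lawnmower", "warranty_end": date(2016, 5, 1),
         "extended_warranty": None},
    ]

    def check_expirations(receipts, today, window=timedelta(days=60)):
        for r in receipts:
            if today <= r["warranty_end"] <= today + window:
                print(f"Warranty on {r['product']} expires {r['warranty_end']}")
                ext = r["extended_warranty"]
                if ext:
                    print(f"  Extended warranty: ${ext['price']} for {ext['term_months']} months")

    check_expirations(receipts, today=date(2014, 6, 15))  # flags only the TV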
[0345] When a manufacturer receives a warranty claim, the manufacturer can verify the validity of the eReceipt before processing the warranty claim, as shown in method 4900 in FIG. 49. The manufacturer receives a warranty claim (step 4910). When the eReceipt indicates the sale was a valid sale (step 4920=YES), the warranty claim is processed (step 4940). When the eReceipt does not indicate a valid sale (step 4920=NO), the warranty claim is rejected (step 4930). The eReceipt can be validated in step 4920 in any suitable way. For example, if the seller sent the eReceipt to the manufacturer at the time of sale, the manufacturer can compare the eReceipt submitted by the user as part of the warranty claim process to the eReceipt received from the seller at the time of sale. If the two match, the sale is valid. Of course, other mechanisms and methods could be used to determine whether a sale based on an eReceipt is a valid sale in step 4920, and the disclosure and claims herein expressly extend to all suitable mechanisms and methods for determining how to validate an eReceipt to make sure it represents a valid sale. This check prevents a user from receiving warranty service when the user submits a bogus eReceipt.
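One plausible form of the check in step 4920, comparing a few key fields plus the embedded metadata against the seller's copy, is sketched below. This is one option among the many validation mechanisms contemplated; the field names are illustrative.

    def is_valid_sale(claim_receipt, registered_receipt):
        if registered_receipt is None:
            return False  # the seller never reported this sale
        keys = ("EmbeddedMetadata", "SerialNumber", "TransactionID", "Date")
        return all(claim_receipt.get(k) == registered_receipt.get(k) for k in keys)

    registered = {"EmbeddedMetadata": "14296.2013-08-02.3f1a9c", "SerialNumber": "166-4923",
                  "TransactionID": "543921268", "Date": "2013-08-02"}
    print(is_valid_sale(dict(registered), registered))  # True: the claim matches the record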
[0346] An example of a display for an eReceipt search engine is shown at 5000 in FIG. 50. The eReceipts are stored in the user's U-Me account in an eReceipt database using the fields in the eReceipt as indexing information. This allows eReceipts to be searched using powerful database query techniques. Examples of eReceipt queries that could be formulated using the eReceipt search engine 5000 are shown in FIG. 51. The example queries 5100 in FIG. 51 include "All purchases this year", "Home Electronics purchased in the last 5 years", and "All products over $500." While certain fields are shown in FIG. 50, these are shown by way of example. The eReceipt search engine could include many fields not shown in FIG. 50. The disclosure and claims herein extend to using any suitable fields or search criteria for searching for eReceipts.
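As an illustration, the example queries could be run against an eReceipt database as sketched below. The use of a SQL table with one column per template field is an assumption; any indexed store would work.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE ereceipts
                  (product_category TEXT, product_type TEXT,
                   price REAL, purchase_date TEXT)""")
    db.execute("INSERT INTO ereceipts VALUES "
               "('Home Electronics', 'Flat Screen TV', 1499.99, '2013-08-02')")

    # "Home Electronics purchased in the last 5 years"
    recent = db.execute("""SELECT * FROM ereceipts
                           WHERE product_category = 'Home Electronics'
                             AND purchase_date >= date('now', '-5 years')""").fetchall()

    # "All products over $500"
    over_500 = db.execute("SELECT * FROM ereceipts WHERE price > 500").fetchall()
    print(over_500)  # [('Home Electronics', 'Flat Screen TV', 1499.99, '2013-08-02')]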
[0347] While the U-Me system contemplates users receiving eReceipts from sellers in a format defined by the eReceipt template, such as via an attachment to an e-mail, sellers that do not provide eReceipts in the defined format may still e-mail receipts to users. This is particularly true for online sales. Even when e-mailed receipts are received that do not conform to the eReceipt template, the eReceipt mechanism 178 in the U-Me system can process receipt information in an e-mail and generate a corresponding eReceipt, as shown in method 5200 in FIG. 52, which is preferably performed by eReceipt mechanism 178 in FIG. 5. The user's e-mail inbox is monitored (step 5210). If an e-mail does not look like a receipt (step 5220=NO), method 5200 is done. When an e-mail looks like a receipt (step 5220=YES), a determination is made whether user confirmation is needed (step 5230). A user could configure the eReceipt mechanism, for example, to always prompt the user when an e-mail looks like a receipt so the user can confirm the e-mail is a receipt. When user confirmation is needed (step 5230=YES), the user is prompted to confirm the e-mail is a receipt (step 5240). When the user confirms the e-mail is a receipt (step 5250=YES), the eReceipt mechanism processes the e-mail and generates an eReceipt (step 5260), filling in fields of the eReceipt with information in the e-mail. When user confirmation is not needed (step 5230=NO), the eReceipt mechanism proceeds to process the e-mail and generate an eReceipt (step 5260) without further input from the user, filling in fields of the eReceipt with information in the e-mail. While generating an eReceipt from an e-mail is less ideal than receiving an eReceipt in a format that complies with the eReceipt template, the result is still an eReceipt with many of the fields filled in that will allow a user to do searches based on the information in the eReceipt. Method 5200 thus provides a way to convert an ordinary e-mail that includes receipt information into an eReceipt that can be stored in the user's U-Me account and that can be searched using the eReceipt search engine.
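A sketch of the extraction in step 5260 follows. Real parsing would need per-seller heuristics; the e-mail text and the patterns below are illustrative only.

    import re

    email_body = """Thank you for your order!
    Order #: 543921268
    Date: 08/02/2013
    Total: $1,623.11"""

    def looks_like_receipt(text):
        return bool(re.search(r"(order|receipt|total)", text, re.I))

    def extract_fields(text):
        fields = {}
        m = re.search(r"Order #:\s*(\d+)", text)
        if m: fields["TransactionID"] = m.group(1)
        m = re.search(r"Total:\s*\$([\d,]+\.\d{2})", text)
        if m: fields["Total"] = m.group(1).replace(",", "")
        return fields

    if looks_like_receipt(email_body):
        print(extract_fields(email_body))  # {'TransactionID': '543921268', 'Total': '1623.11'}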
[0348] Generating eReceipts instead of paper receipts saves trees because paper receipts are not needed. In addition, having receipt and warranty information in electronic form provides many advantages, discussed in detail above.
[0349] The U-Me system can be used to store vehicle settings for a user and to download those settings to a different vehicle, even a vehicle the user has never driven before. Let's assume a user travels from Kansas City to Chicago via airplane for a business meeting. Upon arriving in Chicago, the user rents a rental car. Let's assume the rental car is U-Me certified. The user can identify the rental car to the U-Me system, which can then determine the type of car, convert any of the user's settings as required for that car, and download the settings to the car. The user thus benefits from having the U-Me system configure the rental car according to his or her settings stored in the user's U-Me account. The various functions with respect to FIGS. 53-63 discussed below are preferably performed by the vehicle mechanism 180 in FIG. 5.
[0350] Referring to FIG. 53, method 5300 begins by uploading settings from a vehicle (step 5310). The vehicle settings may then be converted and stored in a universal vehicle template (step 5320). The vehicle settings could also be stored in a device-specific template for the user's vehicle. The conversion of settings in step 5320 may be performed by the conversion mechanism 160 shown in FIG. 5. One suitable universal vehicle template is shown at 5400 in FIG. 54, which is an example of a suitable universal vehicle template 940. The universal vehicle template 5400 includes settings for driver seat position, passenger seat position, driver mirror position, passenger mirror position, rearview mirror position, steering wheel position, audio presets, driver heat/cool settings, passenger heat/cool settings, music playlists, and video playlists.
[0351] The term "audio presets" in FIG. 54 can include presets for a satellite radio receiver, and can additionally include presets on the receiver or other audio mechanism in the vehicle that correspond to a user's favorite songs. A simple example will illustrate. Let's assume a vehicle audio system allows a user to define three sets of six presets each for satellite radio stations. Let's further assume the vehicle audio system also allows a user to define three additional sets of presets that correspond to the user's favorite songs, which can be made available to the vehicle either by downloading or by streaming. If a user wants to listen to the '80s satellite radio station, the user can select a preset that is programmed for that station. If the user wants to listen to one of her favorite songs that she has purchased, she can select a preset that corresponds to the desired song.
[0352] The driver heat/cool settings can include heat and cool settings for the heating and air conditioning system for the driver's side of the car, and can additionally include heat/cool settings for the driver's seat. In addition, the heat/cool settings can be programmed as a function of temperature exterior to the car and temperature interior to the car. Thus, the user can define several different sets of desired heat/cool settings based on the outside temperature and/or based on the interior temperature of the car. A simple example for heating and cooling the driver's seat follows. Let's assume the user specifies that when the inside temperature of the car is less than 30 degrees Fahrenheit (-1 degrees Celsius), the seat heater is set to high. When the inside temperature is between 30 and 40 degrees Fahrenheit (between -1 and 4 degrees Celsius), the seat heater is set to medium. When the inside temperature is between 40 and 50 degrees (between 4 and 10 degrees Celsius), the seat heater is set to low. When the inside temperature is between 50 and 80 degrees Fahrenheit (between 10 and 27 degrees Celsius), the seat is neither heated nor cooled. When the inside temperature is between 80 and 90 degrees (between 27 and 32 degrees Celsius), the seat cooler is set to low. When the inside temperature is between 90 and 100 degrees (between 32 and 38 degrees Celsius), the seat cooler is set to medium. When the inside temperature is over 100 degrees (over 38 degrees Celsius), the seat cooler is set to high. This simple example shows how the climate control system can vary according to the environmental conditions inside the vehicle. Of course, the outside temperature could also be taken into account. Thus, even when the interior of the car is comfortable on a very cold day, a person will feel colder because of the cold glass that is in proximity to the person's head. Taking interior and/or exterior temperature into account can thus allow a user to define desired heat/cool settings that can be fine-tuned to the user's liking. While the examples above discuss heat/cool settings for the driver's seat, similar settings could be defined for the passenger's seat.
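The seat heat/cool example above can be expressed as a simple threshold mapping, sketched below with the temperature bands from the example; the setting names are illustrative.

    def seat_climate_setting(interior_temp_f):
        # Thresholds follow the example above (degrees Fahrenheit).
        if interior_temp_f < 30:  return ("heat", "high")
        if interior_temp_f < 40:  return ("heat", "medium")
        if interior_temp_f < 50:  return ("heat", "low")
        if interior_temp_f < 80:  return ("off", None)
        if interior_temp_f < 90:  return ("cool", "low")
        if interior_temp_f < 100: return ("cool", "medium")
        return ("cool", "high")

    print(seat_climate_setting(25))  # ('heat', 'high')
    print(seat_climate_setting(72))  # ('off', None)
    print(seat_climate_setting(95))  # ('cool', 'medium')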
[0353] The universal vehicle template is preferably defined such that settings from different car manufacturers can all be converted to and from the settings in the universal vehicle template. We take the example of seat position to illustrate. Referring to FIG. 56, the position of seat 5610 can be expressed in numerous different ways. For example, the position of seat 5610 can be expressed in terms of the height A of the front of the seat above some reference point, such as the floor of the vehicle; height B of the rear of the seat above some reference point; distance C from the accelerator pedal 5640 to a front of the seat; angle D of the back portion with respect to the bottom portion; distance E from the center of the steering wheel to the seat back; and distance F from a reference point on the seat (such as the back) to some fixed reference point.
[0354] The way a car represents seat position may vary with the car manufacturer. For example, let's assume one car manufacturer allows adjusting the forward/backward position of the driver's seat over a ten inch span, and uses a stepper motor to do the adjusting. The position of the seat could be expressed as a numerical value for the stepper motor. A different manufacturer may allow adjusting the forward/backward position of the driver's seat over a twelve inch span using an analog motor and a position sensor, where the seat position is stored as a value of the position sensor. The universal vehicle template 5400 preferably describes seat position in a way that is actually descriptive of the position of the seat itself with respect to one or more physical features in the vehicle, not based on some motor settings or sensor readings of any particular manufacturer. This may require converting a vehicle's settings to a more universal measure of seat position, such as distance and angle, as represented in FIG. 56. In the most preferred implementation, the process of a car vendor becoming U-Me certified includes the car vendor providing a device-specific template for the car that includes mapping information for converting the car vendor's settings to the types of settings referenced in the universal vehicle template. In this scenario, the device-specific template will be used to do the conversion in step 5320 in FIG. 53 from the vehicle settings to the equivalent settings in the universal vehicle template.
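A sketch of such a conversion follows for the stepper-motor example above. The step size and offset are hypothetical values a vendor would supply in its device-specific template.

    STEP_INCHES = 10.0 / 2000  # assumed: 10-inch travel resolved over 2000 steps
    FRONT_OFFSET_IN = 14.0     # assumed: pedal-to-seat-front distance at step 0

    def stepper_to_universal(step_count):
        # Returns distance C of FIG. 56 (accelerator pedal to seat front), in inches.
        return FRONT_OFFSET_IN + step_count * STEP_INCHES

    def universal_to_stepper(distance_c_inches):
        steps = round((distance_c_inches - FRONT_OFFSET_IN) / STEP_INCHES)
        return max(0, min(2000, steps))  # clamp to this vehicle's travel range

    print(stepper_to_universal(800))   # 18.0
    print(universal_to_stepper(18.0))  # 800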
[0355] Let's assume the user drives a 2012 Lexus RX350. When the user's settings in the 2012 Lexus RX350 are uploaded to the user's U-Me account, there may be multiple device-specific vehicle templates that apply to this vehicle, such as: a vehicle template for the 2012 Lexus RX350; a vehicle template for 2012 Lexus vehicles; a vehicle template for Lexus small SUVs; etc. These vehicle templates are preferably provided by the vehicle manufacturer to identify user settings in their vehicles, and how these settings map to the universal vehicle template.
[0356] FIG. 55 shows a method 5500 that could be representative, for example, of the steps when a user rents a car that is U-Me certified. First, the phone is paired to the car (step 5510). Pairing the phone to the car allows the user's phone to send the information identifying the car to the U-Me system, and to authenticate the user to the U-Me system via the user's phone. When the user's U-Me account already has settings for this car (step 5520=YES), the user settings for this car are downloaded from the user's U-Me account to the car (step 5530). This can happen, for example, when the user rents a car that is the same type of car the user drives at home, or is the same type of U-Me certified car the user has rented before. When the user's U-Me account does not have settings for this car (step 5520=NO), the settings for this car are generated from the universal vehicle template (step 5540), and those settings are then downloaded to the car (step 5530). As a result, all of the user's preferred settings (see FIG. 54) are made available in the rental car. The result is a car that is configured to the user's taste with minimal effort from the user. While the discussion above assumes the car communicates with the user's phone, which in turn communicates with the U-Me system, other configurations are possible. For example, a car or other vehicle could include a transceiver that allows the vehicle to directly interact with the U-Me system, instead of going through the user's phone.
[0357] FIG. 57 shows a block diagram of a phone 5710 coupled via Bluetooth interfaces 5730 and 5740 to a prior art vehicle 5720. The Bluetooth interface 5740 of known cars provides a way to pair the phone 5710 to the vehicle 5720 so the user may use the phone hands-free while driving. The Bluetooth interface 5740 thus communicates with a phone mechanism 5780, which controls the microphone 5750, speaker 5760 and controls of the audio system 5770 during a phone call. For example, when the user is listening to the radio and a call comes in, the phone mechanism 5780 mutes the radio using the audio control 5770, announces via speaker 5760 the user has an incoming call, and when the user presses a button to answer the call, the phone mechanism 5780 then communicates with the phone 5710 to service the call, including playing the call audio on the speaker 5760 and receiving the user's voice input to the call via microphone 5750.
[0358] In known vehicles, a user cannot access any of the engine system 5790 via the Bluetooth interface 5740 that communicates with the user's phone. The engine system 5790 includes information in electronic form that could be useful to the user, including mileage 5791, error codes 5792, and warning lights 5793. Because prior art vehicles do not allow the phone to communicate with the engine system 5790, the user cannot use information that is generated in the engine system 5790.
[0359] Referring to FIG. 58, the same phone 5710 with its Bluetooth interface 5730 communicates with the Bluetooth interface 5840 to service telephone calls using microphone 5850, speaker 5860, audio control 5870, and phone mechanism 5880, similar to what is done in the prior art system shown in FIG. 57. Note, however, the Bluetooth interface 5840 has access to the engine system 5890. This means information in the engine system 5890 can be communicated via the Bluetooth interface 5840 to the user's phone, and from there to the user's U-Me account. Information such as mileage 5891, error codes 5892, warning lights 5893, scheduled maintenance 5894, collision detection 5895, and emergency response system 5896 can be made available to the U-Me system by a vehicle such as vehicle 5820 that has been U-Me certified.
[0360] Method 5900 is shown in FIG. 59, and begins by determining the make and model of the vehicle (step 5910). The maintenance schedule for the vehicle is then determined (step 5920). When no scheduled maintenance is needed (step 5930=NO), method 5900 waits until scheduled maintenance is needed (step 5930=YES). The user is prompted regarding the needed scheduled maintenance (step 5940). Method 5900 is then done. Note the information in steps 5910 and 5920 may be stored in the engine system itself, as shown at 5894 in FIG. 58. In the alternative, the information in steps 5910 and 5920 may be retrieved from a manufacturer's website, from a third party website, or from the U-Me system. Because the U-Me system now has access to the engine system 5890 in FIG. 58 via the user's phone, the U-Me system can provide information regarding the status of the engine to the U-Me user.
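A sketch of the maintenance check follows. The schedule entries and service history are illustrative.

    schedule = [
        {"service": "Oil change", "interval_miles": 5000},
        {"service": "Tire rotation", "interval_miles": 7500},
    ]
    last_done = {"Oil change": 43500, "Tire rotation": 40000}  # mileage at last service

    def due_services(current_mileage):
        return [item["service"] for item in schedule
                if current_mileage - last_done.get(item["service"], 0) >= item["interval_miles"]]

    print(due_services(48700))  # ['Oil change', 'Tire rotation']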
[0361] When scheduled maintenance is needed (step 5930=YES in FIG. 59), the U-Me system can perform method 6000 in FIG. 60. A notice can be sent to one or more U-Me certified shops of the needed scheduled maintenance (step 6010). The notified shop(s) then return a bid for performing the scheduled maintenance to the U-Me system (step 6020). The U-Me system then provides the bids to the user (step 6030). In this manner the user can automatically receive bids from a single shop or from competing shops for performing the scheduled maintenance. Note while FIG. 60 shows the specific case of scheduled maintenance, a similar method could be performed when an error code or engine warning light comes on so the user can automatically receive one or more bids for performing the repair that is needed based on the error code or warning light.
[0362] The U-Me system also provides a central place for vehicle manufacturers to notify customers of recalls or service actions, as shown in method 6100 in FIG. 61. Referring to FIG. 61, the vehicle manufacturer sends the recall or service action information to the U-Me system (step 6110). The U-Me system then notifies its users who are affected by the recall or service action (step 6120).
[0363] A great advantage to the U-Me system is having U-Me certified shops store the service performed on a user's vehicle to the user's U-Me account, which can trigger reminders for the user. Referring to FIG. 62, a method 6200 begins when a U-Me certified shop performs service for the U-Me user (step 6210). The shop uploads to the user's U-Me account the service performed by the shop, with a recommended future reminder (step 6220). For example, if the shop changes the oil, the shop could upload a record of the oil change along with a recommendation that a reminder be set to change the oil in 5,000 miles. The U-Me system sets the reminder for the user (step 6230). When the reminder conditions are not met (step
6240=NO), method 6200 waits, until the reminder conditions have been met (step 6240=YES). The U-Me system then provides a reminder to the user (step 6250). Method 6200 is especially useful for service that needs to be performed at specified mileage and/or time intervals, such as oil changes and rotation of the tires.
[0364] Once the user has access to the vehicle's engine system as shown in FIG. 58, other methods are possible, such as method 6300 shown in FIG. 63. The vehicle sends engine warning information to the user's U-Me account (step 6310). Engine warning information can include, for example, information from error codes 5892, warning lights 5893, a collision detection system 5895, or an emergency response system 5896. When the user is authorized to access the engine warning information (step 6320=YES), the engine warning information is provided to the user (step 6350). When the user is not authorized to access the engine warning information (step 6320=NO), the user may be prompted to authorize additional payment for access to the engine warning information (step 6330). When the user does not authorize the additional payment (step 6340=NO), method 6300 is done.
When the user authorizes the additional payment (step 6340=YES), the engine warning information is provided to the user (step 6350). A simple example will illustrate. Let's assume the engine produces an error code that indicates the fuel pump is failing. This could be indicated on the dash by a "service engine soon" light, but this does not give the user any meaningful information regarding what service is required. Having access to this engine warning information could cost a premium above the normal U-Me subscription, so the user could be prompted in step 6330 to authorize an additional charge of, say, $5 to access the information. If the user is on a long highway trip and the "service engine soon" light comes on, the user doesn't know whether the warning is minor or more serious. In the case of a fuel pump that is failing, knowing the fuel pump is failing may allow the user to stop at a repair shop in the next town. In this scenario, paying an extra $5 for the engine warning information is money well-spent.
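A sketch of this flow follows. The price and the prompt and charge operations are illustrative placeholders.

    def prompt_for_payment(user, price):
        # Placeholder: a real system would prompt the user and charge the account.
        return user.get("authorizes_payment", False)

    def engine_warning_flow(user, warning, price=5.00):
        if user.get("engine_warnings_enabled"):
            return f"Engine warning: {warning}"
        if prompt_for_payment(user, price):  # corresponds to steps 6330/6340
            user["engine_warnings_enabled"] = True
            return f"Engine warning: {warning}"
        return "Warning details not available"

    user = {"engine_warnings_enabled": False, "authorizes_payment": True}
    print(engine_warning_flow(user, "fuel pump failing"))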
[0365] As discussed above, the widespread acceptance of digital photography has been accompanied by a corresponding widespread problem of most users having thousands of digital photos that are stored using cryptic names in many different directories or folders on their computer systems, making retrieval of photographs difficult. The U-Me system provides an improved way to manage photos, including photos that originated from a digital camera or other digital device, along with hard copy photos that have been digitized for electronic storage. The U-Me system improves over the known art of software that adds metadata to photos by providing a people-centric approach to managing photos, as described in detail below with reference to FIGS. 64-79. The methods discussed with respect to FIGS. 64-79 are preferably performed by the photo mechanism 182 shown in FIG. 5.
[0366] The U-Me system includes a photo system data entry screen, such as screen 6410 shown in FIG. 64 by way of example. The photo system data entry screen 6410, like all of the U-Me system, is person-centric. Thus, when a user decides to have the U-Me system manage the user's photos, the user starts by entering data for a particular person in the photo system data entry screen 6410. Fields in the photo system data entry screen 6410 include Name, Preferred Name, Birth Date, Father, Mother, Wedding Day, Spouse, Married Name, Child, Camera, Street, City, State, ZIP, and Address Name. The user can provide a sample photo of the person's face at 6450 to help train the facial recognition engine in the U-Me photo system. Note the Child field includes an Add button 6420 that allows the user to add additional children. Similarly, the Camera field includes an Add button 6430 that allows the user to enter all cameras the user uses to take digital photos. When the user enters the location information in the Street, City, State, and ZIP fields, the U-Me system computes GPS coordinates for that location, and stores those GPS coordinates at 6440 relating to the address of the person whose information appears on the screen 6410.
[0367] A sample photo system entry page 6510 with data filled in is shown in FIG. 65. The name of the person is Jim Jones, his preferred name is Jimmy, his birth date is 08/03/1957, his father is Fred Jones, his mother is Sally Jones, his wedding day was 06/21/1983, his spouse is Pat Jones, the Married Name field is empty indicating his married name is the same as what was entered above, he has two children Billy Jones and Sandy Jones, the camera he uses to take photos is a Nikon Coolpix S01, his address is 21354 Dogwood,
Carthage, MO 64836. The name chosen for this address is "Jim and Pat's House." A photo is provided at 6550 that is a good photo of Jim's face. The GPS coordinates for the address are computed and displayed at 6540. Once the user is satisfied with the content in the data entry screen 6510, the user can select the Save button 6560 to save this information to the user's U-Me account. If the user wants to abort the data entry, the user can select the Cancel button 6570.
[0368] Referring to FIG. 66, as entries are made into the photo system (e.g., as shown in FIG. 65), method 6600 monitors the photo system data entry (step 6610) and constructs family relationships from the photo system data entry (step 6620). People naturally think along the lines of family relationships. While known software for adding metadata to a photo allows adding name labels such as "Katie" and performing facial recognition, these labels have no meaning within the context of other people in the photos. The U-Me system, in contrast, constructs family relationships that allow storing and retrieving photos much more effectively than in the prior art.
[0369] The initial entry of photo system data for all the people in a user's immediate and extended family may take some time, but once this work is done the U-Me system can use this data in many ways that allow easily storing photos to and easily retrieving photos from the user's U-Me account.
[0370] Referring to FIG. 67, a method 6700 begins by uploading a photo to the user's U-Me account (step 6710). Facial and feature recognition is performed on the photo (step 6720). Facial recognition is known in the art, but the processing in step 6720 preferably also includes feature recognition. Feature recognition may recognize any suitable feature or features in the photo that could be found in other photos. Examples of features that could be recognized include a beach, mountains, trees, buildings, a ball, a birthday cake, a swing set, a car, a boat, etc. Any existing metadata in the photo is extracted (step 6730) and processed to generate indexing information for the photo (step 6740). If there are unrecognized faces or features (step 6750=YES), the user may be prompted to identify the unrecognized faces and/or features (step 6760). The photo is then stored with the indexing information generated in step 6740 (step 6770). The result is a digital photo stored with indexing information that may be used to retrieve the digital photo later using a powerful database search engine, discussed in more detail below.
[0371] By prompting the user for unrecognized faces and features, method 6700 gives the user the chance to build up a library of faces and features that the system will have an easier time recognizing next time around. For example, step 6760 might display the photo with various different faces and regions defined. The user could select a face, then enter the name for the person, or if the person will appear in many photos, the user could enter some or all of the person's data in a photo system data entry screen, similar to that shown in FIG. 65. The user could also select various regions of the photo to define features that could be recognized in future photos. For example, if a photo shows a couple on a beach with a cruise ship in the background, the user could click on each face to define information corresponding to those two people, and could also click on the sand on the beach and define this feature as "beach", click on the water and define this feature as "water", and click on the cruise ship and define this feature as "boat." Using various heuristics, including artificial intelligence algorithms, these features may be recognized in other photos, which allows adding indexing information that describes those features automatically when the photo is processed, as shown in method 6700 in FIG. 67.
[0372] The indexing information generated in step 6740 preferably includes data that is not in the metadata for the photo, but is generated based on the metadata and information stored in the user's U-Me account.
For example, when the U-Me system recognizes a date in the photo metadata that corresponds to Jim & Pat's wedding anniversary, the U-Me system can generate indexing info for the photo that identifies the Event for the photo as Jim & Pat's Wedding Anniversary. Having dates, locations and relationships defined in the user's U-Me account provides a way to add indexing info to a photo that will help to retrieve the photo later using a powerful search engine, discussed in more detail below.
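One simple heuristic for this date-based indexing, matching a photo's month and day against dates defined in the user's account, is sketched below with illustrative account data.

    from datetime import date

    known_dates = {  # illustrative account data (see FIG. 65)
        (6, 21): "Jim & Pat's Wedding Anniversary",
        (8, 3): "Jimmy's Birthday",
    }

    def event_for_photo(photo_date):
        return known_dates.get((photo_date.month, photo_date.day))

    print(event_for_photo(date(2013, 6, 21)))  # Jim & Pat's Wedding Anniversary
    print(event_for_photo(date(2013, 7, 4)))   # None: no event indexing added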
[0373] One advantage to the U-Me system being person-centric is that camera information can be converted to the corresponding person who took the photo. Referring to FIG. 68, a method 6800 reads camera info from the metadata for a photo (step 6810), looks up the photographer name that corresponds to the camera info (step 6820), and adds the photographer's name to the indexing info (step 6830). In this manner, the metadata in the photo that identifies the camera is used to go a step further to identify the person who uses that camera so the photographer can be specified in the indexing information for the photo.
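A sketch of this camera-to-photographer lookup follows; the table contents come from the example data entry of FIG. 65.

    cameras = {("Nikon", "Coolpix S01"): "Jim Jones"}  # from photo system data entry

    def add_photographer(metadata, indexing_info):
        key = (metadata.get("CameraMake"), metadata.get("CameraModel"))
        photographer = cameras.get(key)
        if photographer:
            indexing_info["Photographer"] = photographer
        return indexing_info

    print(add_photographer({"CameraMake": "Nikon", "CameraModel": "Coolpix S01"}, {}))
    # {'Photographer': 'Jim Jones'}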
[0374] FIG. 69 shows sample metadata 6900 that may exist in known digital photos. Note the term "metadata" is used herein to mean data that is not part of the visible image in the digital photo that describes some attribute of the photo. The metadata 6900 in FIG. 69 is shown to include fields for Camera Make, Camera Model, Camera Serial Number, Resolution of the photo, Image Size of the photo, Date/Timestamp, and Geocode Info. The metadata shown in FIG. 69 is shown by way of example. Many other fields of metadata are known in the art, such as the metadata fields defined at the website photometadata.org. The photo metadata disclosed herein expressly extends to any suitable data, whether currently known or developed in the future, that is placed in the digital photo file by the device that took the photo to describe some attribute that relates to the photo.
[0375] When photo metadata includes geocode info as shown in FIG. 69 that defines the geographical location of where the camera was when the photo was taken (as is common in smart phones), method 7000 in FIG. 70 reads this geocode info from the metadata (step 7010). The geocode info can be in any suitable form such as GPS coordinates or other forms of geocode info that specifies location, whether currently known or developed in the future. The geocode info is processed to determine whether the geocode info corresponds to a recognized location (step 7020). If not (step 7020=NO), method 7000 is done. When the geocode info corresponds to a recognized location (step 7020=YES), the location name is added to the indexing info for the photo (step 7030). For example, let's assume Jim Jones takes a photo with his cell phone of his daughter at his house. The geocode info will reflect that the location corresponds to a stored location, namely, Jim & Pat's House. Jim & Pat's House can then be added to the indexing information, which makes retrieval of photos much easier using a photo search engine.
[0376] While most young adults and children have taken only digital photographs for their entire lives, older people typically have hundreds or thousands of hard copy photographs. These people need a way to store those photos electronically so they can be easily searched and retrieved as needed. Referring to FIG. 71, method 7100 begins by scanning a hard copy photo (step 7110). Facial and feature recognition is performed (step 7120). A wizard prompts the user to enter indexing information for the photo (step 7130).
The indexing information is appended to the scanned image data for the photo (step 7140). The photo with its indexing info is then stored in the user's photo database (step 7150).
[0377] Repeating method 7100 for hundreds or thousands of individual photos may be too time-consuming. Instead, the user may process photos in groups. Referring to FIG. 72, method 7200 begins by a user invoking a photo indexing info generator (step 7210). The user can then define indexing info for groups of photos or for individual photos (step 7220).
[0378] Examples of suitable indexing info 7300 are shown in FIG. 73 to include fields for Recognized Person(s), Age(s) of Recognized Person(s), Recognized Feature(s), Location Name and Event. Note the Recognized Person(s), Age(s) of Recognized Person(s) and Recognized Feature fields could be replicated for as many recognized persons or features as exist in the photo. A sample photo file 7400 is shown in FIG.
74 to include an identifier (ID), Metadata, Indexing Info, and the Image. While the indexing information is "metadata" in a general sense, the term "metadata" as used herein relates to data generated by the camera that describes some attribute related to the image, while "indexing info" as used herein relates to data that was not included in the metadata for the image but was generated by the U-Me system to facilitate retrieval of photos using a powerful search engine.
[0379] An example of a photo indexing info generator screen 7500 is shown in FIG. 75 to include Date fields, a People field, an Event field, a Location field, and a display of thumbnails of photos. The user specifies a date or range of dates in the Date fields. The user specifies one or more people in the People field. The user specifies location in the Location field. An example will illustrate how a user might use the photo indexing info generator in FIG. 75 to generate indexing info for scanned hard copy photos. Let's assume Jim Jones has a stack of 163 wedding-related photos from when he married Pat, including some from the morning of their wedding day showing the wedding ceremony, some that were taken later on their wedding day at the reception, and some from a week later at a second reception in Pat's hometown. Instead of defining indexing info for each photo, Jim could enter a date range that begins at the wedding day and extends to the date of the second reception, could define an event called "Jim & Pat's Wedding", and could select the 163 thumbnails that correspond to the wedding and reception photos. Once this is done, the user selects the Save button 7560, which results in the photos being saved in Jim's photo database with the appropriate dates and event information as indexing information. Note the Event and Location fields can include drop-down lists that list events and locations that have been previously defined, along with a selection to define a new event or location. If the user decides to abort entering the indexing info for photos, the user may select the Cancel button 7570.
[0380] A significant advantage of generating indexing info for photos is the ability to search for and retrieve photos using the indexing info. No longer must a user search through hundreds or thousands of thumbnails stored in dozens or hundreds of directories with cryptic names that mean nothing to a person! Instead, the user can use a photo search engine to retrieve photos based on people, their ages, family relationships both entered and computed, location, and dates.
[0381] One example of a screen 7600 for a photo search engine is shown in FIG. 76. The example shown in FIG. 76 includes fields for Date(s), Event, Location, People, Relationship, and Photographer. Because of the family relationships generated by the U-Me system (e.g., in step 6620 in FIG. 66), searches or queries for photos can now be formulated based on those generated relationships. Examples of photo queries supported by the photo search engine 7600 in FIG. 76 are shown at 7700 in FIG. 77, and include: photos of grandchildren of Jim Jones between the ages of 6-18 months; photos of the wedding of Sandy Jones; and photos taken at the Lake House in 2010. These simple examples illustrate that adding indexing info that relates to people and locations allows for much more powerful querying and retrieving of photos than is known in the art.
[0382] The user may want to share photos stored in the user's U-Me account. This can be done using a photo share engine, a sample display of which is shown at 7800 in FIG. 78. The photo share engine is preferably provided as a feature of the sharing mechanism 174 shown in FIG. 5. The user defines criteria for photos to share, then specifies other U-Me users with whom to share the photos. The user can also select whether to share the metadata and whether to share the indexing info. The criteria for photos to share can include any suitable criteria, including any suitable criteria that could be entered into the photo search engine for retrieving a photo. The "Share with" field could be a drop-down list with people in the U-Me system, could be a drop-down list of people the user has defined in the user's U-Me account, or could be an e-mail address or other unique identifier for the person. A user could thus enter the e-mail address of a person who is not a U-Me member, and this could result in the U-Me system sending an e-mail to the person inviting the person to join U-Me to view the photos the user is trying to share with the person.
[0383] Method 7900 in FIG. 79 shows a method for storing a photo with corresponding indexing information. The user takes the photo (step 7910). The U-Me software or app sends the photo with metadata to the user's U-Me account (step 7920). The U-Me software or app can send the photo with metadata to the user's U-Me account in any suitable way, including a direct connection from the U-Me software or app to the U-Me system. In the alternative, the U-Me software or app can send one or more e-mails to the user. The U-Me system monitors incoming e-mail, and when a photo is detected, embedded in an e-mail or as an attachment, the U-Me system recognizes the file as a photo. The metadata is processed to generate indexing info (step 7930). Facial and feature recognition is performed (step 7940). Indexing information is generated for all recognized faces and features (step 7950). The photo is then stored with its metadata and with the generated indexing info in the user's photo database (step 7960). When input from the user is needed (step 7970=YES), a flag is set to prompt the user for the needed input (step 7980). Setting a flag lets the user decide when to enter the needed input. Thus, when a user has some spare time, the user may log into the U-Me account and enter all needed input that has accumulated for many photos that have been taken. Method 7900 could be carried out by a user taking a photo with a smart phone that is running the U-Me app, which results in the photo being automatically uploaded, processed, and stored in the user's
U-Me account.
[0384] The generation of location-based indexing info for photos may be done using any suitable heuristic and method. For example, let's assume Jim & Pat live on 40 acres of land. The GPS coordinates for their address may correspond to the mailbox at the road, which could be several hundred yards away from the actual house. Using the U-Me app on his smart phone, Jim could walk the perimeter of their 40 acres, identifying the corner points of the property by selecting a button on the app. When Jim arrives back at the point of origin, the U-Me app will recognize that the various points define a closed area, and will define a region that includes the entire area. Jim could repeat the procedure on the outside corners of the house. Jim could then define the 40 acres as "Jim & Pat's Property" and the house as "Jim & Pat's House." If Jim takes a photo of a grandson at a birthday party in his living room in his house, the U-Me system will recognize the location as Jim & Pat's House, and will store this location as indexing info with the photo. If Jim takes a photo of the grandson fishing at a pond on the property, the U-Me system will recognize the smart phone is not at the house but is on the property, and will recognize the location as "Jim & Pat's Property", and will store this location as indexing info with the photo. In addition, various heuristics could be defined to generate location descriptors. For example, anything within 100 yards of a defined location but not at the defined location could be "near" the defined location. In addition, some geographic regions may be predefined within the U-Me system. For example, anything within the city limits of a city could be "within" the city. Large tourist destinations such as Disney World could be included in a location database, so the name Disney World can be added as a location to indexing information for all photos taken while at Disney World. The same could be done for state and national parks, and for any other defined geographical region. The disclosure and claims herein expressly extend to any suitable location information that could be generated and included as indexing information to describe location of where a photo was taken.
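One way to implement the region test is a standard ray-casting point-in-polygon check, with regions checked most-specific first, as sketched below; the coordinates are made up for illustration.

    def point_in_polygon(lat, lon, polygon):
        # Ray-casting test; polygon is a list of (lat, lon) corner points.
        inside = False
        j = len(polygon) - 1
        for i in range(len(polygon)):
            lat_i, lon_i = polygon[i]
            lat_j, lon_j = polygon[j]
            crosses = (lat_i > lat) != (lat_j > lat)
            if crosses and lon < (lon_j - lon_i) * (lat - lat_i) / (lat_j - lat_i) + lon_i:
                inside = not inside
            j = i
        return inside

    regions = [  # checked in order, most specific region first
        ("Jim & Pat's House", [(37.170, -94.310), (37.170, -94.309),
                               (37.169, -94.309), (37.169, -94.310)]),
        ("Jim & Pat's Property", [(37.175, -94.315), (37.175, -94.305),
                                  (37.165, -94.305), (37.165, -94.315)]),
    ]

    def location_name(lat, lon):
        for name, polygon in regions:
            if point_in_polygon(lat, lon, polygon):
                return name
        return None

    print(location_name(37.1695, -94.3095))  # Jim & Pat's House
    print(location_name(37.1720, -94.3120))  # Jim & Pat's Property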
[0385] Technology has led to great advances in some areas of the medical field. For example, modern medical equipment such as Magnetic Resonance Imaging (MRI) machines allows imaging the body in non-invasive ways with sufficient resolution that allows diagnoses based on graphical images. However, some areas of the medical field have lagged way behind technology. One of these areas is how medical records are generated, stored, and retrieved. Most doctors and other medical providers still use hard-copy files. This is grossly inefficient. In addition, the medical files are typically kept by each medical provider in each provider's respective offices. For example, let's assume a patient goes to her dentist, who examines the patient and believes the patient needs a root canal on a tooth. Assuming the dentist does not do root canals, the dentist could refer the patient to an Endodontist for the root canal. Because the files are in hard-copy form, the dentist could make a copy of the exam notes and the X-ray of the tooth, and provide these to the patient, who carries these hard copy records by hand to the Endodontist. The medical profession needs to modernize and create electronic files instead of hard copy files.
[0386] Any patient who has seen many doctors soon realizes what an inefficient system we currently have for medical records. One doctor may request copies of a patient's medical records from another doctor, but this requires written authorization by the patient and also takes time to process and mail the hard copy records. A patient has a right to obtain copies of their medical records, and can request copies in writing. But managing hard copy files of medical information is grossly inefficient in our highly computerized world today. The U-Me system addresses this problem by making a user's medical information available in the user's U-Me account.
[0387] Various functions relating to medical info are shown in FIGS. 80-92 and are discussed in detail below. These functions are preferably performed by the medical info mechanism 184 shown in FIG. 5. Referring to FIG. 80, user medical info 8000 is one suitable example for user medical info 660 in FIG. 6. User medical information 8000 includes semi-private medical info 8010 and private medical info 8020. Semi-private medical info 8010 may include any information the user decides to make available to medical personnel in case of an emergency, while the private medical info 8020 includes the rest of the user's medical info, which can only be shared by the user granting access. Examples of semi-private medical info 8010 include blood type, medical conditions, medications, allergies, warnings, emergency contact info, a living will, a health care power of attorney, and other semi-private medical info. Private medical info 8020 is shown in FIG. 80 to include hospital records, doctor records, test results, lab results, diagnoses, treatments, surgeries, and other private medical information. In the most preferred implementation, all of the user's medical information is initially set to be private medical info 8020. The user may then select which of the private medical info 8020 to make available as semi-private medical info 8010. The medical conditions can include any medical condition the user may have. Allergies can include allergies to medications as well as allergies to food, insects, or other items. Warnings could include any suitable warning that medical personnel should have, such as severe allergies that could send the patient into anaphylactic shock, warnings about brittle bones, warning the patient only has one kidney, or any other suitable warning. Emergency contact info can include the name and contact information in a hierarchical list for those who should be notified of the patient's condition. The emergency contact info could include names, addresses, cell phone numbers, e-mail addresses, relationship to the user, etc. A living will can give the medical person knowledge regarding the patient's wishes if the patient is in a vegetative state. A health care power of attorney will inform the medical person to whom the user has given power of attorney for health care in case of the user's incapacity.
[0388] Referring to FIG. 81, method 8100 begins by a user defining semi-private medical info (step 8110). The semi-private medical information could be entered by the user, but could also be selected from the private medical info 8020. The user also defines the authorization needed to access the semi-private medical info (step 8120). The semi-private medical info defined by the user can be accessed by medical personnel when the authorization defined by the user in step 8120 is satisfied, as discussed in more detail below.
[0389] One of the benefits of the U-Me system is having a user-centric place to store all of the user's information, including medical info. Referring to FIG. 82, method 8200 shows how the U-Me system can store medical information for a user. A U-Me certified medical person treats a U-Me user (step 8210). All medical info related to the treatment is uploaded to the user's U-Me account (step 8220). The result is the user has electronic copies of all the user's medical info. The user can thus make these records available to a doctor if the user decides to switch doctors without having to request those records from the previous doctor. By automatically storing the user's medical info in the user's U-Me account, all of the user's medical info will be present in one place, in electronic form, in a searchable format, which can be easily shared as needed.
The result is a vast improvement over known methods for handling medical information.
[0390] FIG. 85 shows a sample display of a smart phone 8500. Some smart phones, such as Android phones, include security displays that require a user to enter a password or perform some other action to access the functions of the phone. One such display of a security screen 8510 is shown in FIG. 85 to include nine circles. The user may set a pattern of four circles in a path, and when the security screen 8510 is displayed, the user drags a finger over the four circles in the defined path, which then unlocks the phone for use. While this is an effective way to stop a stranger from using the smart phone, it can also prevent a medical person from accessing medical information for the user. Known Android smart phones, such as the Samsung Galaxy S3 phone, include an Emergency Call button 8520 on the security screen 8510 that allows someone to bypass the security screen 8510 and make an emergency call. A similar function for accessing medical information could be provided by the phone's operating system or by a U-Me app running on the phone. Thus, a Medical bypass button 8530 could be provided on the security screen 8510 that allows a medical person to access the user's medical info stored in the smart phone 8500. The medical bypass button 8530 could have text, or could have a symbol such as a red cross.
[0391] FIG. 83 shows a method 8300 for a medical person to access a user's semi-private medical info 8010. Let's assume the U-Me user is in a car accident, is injured and unconscious, and arrives via ambulance at the emergency room of a hospital for treatment. Let's further assume the user has a smart phone with the U-Me app running, and the smart phone provides a medical bypass button, such as 8530 shown in FIG. 85. A nurse or doctor can press the medical bypass button 8530 on the security screen of the user device (step 8310). When the current location of the device is a medical facility (step 8320=YES), the user's semi-private medical info is displayed on the user device to the medical person (step 8330). This would be the case when the user is being treated in the emergency room of a hospital. However, what if the medical person is an Emergency Medical Technician (EMT) on-site at the car accident? In this case, the current location is not a medical facility (step 8320=NO), so the U-Me app prompts the EMT for authentication info to authenticate to the U-Me system (step 8340). Assuming the EMT or the ambulance company is U-Me certified, the EMT enters authentication info. If the medical person is not authenticated to the U-Me system (step 8350=NO), which would happen if the EMT does not have an account on the U-Me system, the semi-private medical info for the user is not displayed on the user's device (step 8360). When the EMT is authenticated to the U-Me system (step 8350=YES), the user's semi-private medical info is displayed on the user's device (step 8330). Method 8300 is then done. Method 8300 thus provides two tiers of security: a first tier where the user's medical info is displayed at the touch of the Medical bypass button 8530 when the location of the user device is determined to be a medical facility, and a second tier that requires the medical person to authenticate to the U-Me system when the location of the user device is not a medical facility. Of course, the same method for the medical person to authenticate to the U-Me system can be performed when a user is at a medical facility. This would prevent someone from stealing the user's phone, driving to a parking lot of a hospital, and accessing the user's semi-private medical info.
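The two-tier logic of method 8300 can be sketched in a few lines of Python. This is a minimal, hypothetical sketch: the helper functions are stubs standing in for the facility lookup of method 8400 and for authentication of a U-Me certified medical person, and all names are assumptions.

    def is_medical_facility(location):
        # Stub standing in for the facility lookup of method 8400.
        return location == "hospital emergency room"

    def authenticate_to_ume(credentials):
        # Stub standing in for authentication of a U-Me certified
        # medical person (e.g., an EMT with a U-Me account).
        return credentials == "valid EMT credentials"

    def medical_bypass(current_location, credentials, semi_private_info):
        # Tier 1: at a medical facility, display the semi-private info
        # at the touch of the medical bypass button.
        if is_medical_facility(current_location):
            return semi_private_info
        # Tier 2: elsewhere (e.g., an EMT at an accident scene), the
        # medical person must first authenticate to the U-Me system.
        if credentials is not None and authenticate_to_ume(credentials):
            return semi_private_info
        return None   # the semi-private info is not displayed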
[0392] The display of the user's semi-private medical info to a medical person could also trigger display of the user's emergency contact info to the medical person. This could include a data input screen that allows the medical person to enter the user's condition and the medical person's contact information. The relevant information could then be texted and sent via e-mail according to the information in the user's emergency contact info, or phone numbers could be called by the medical person.
[0393] One suitable way to determine whether the current location is a medical facility in step 8320 is shown by method 8400 in FIG. 84. The current location is determined (step 8410). This could be done, for example, using the Global Positioning System (GPS) in the user's smart phone. The current location is then checked against a database of medical facilities (step 8420). The database of medical facilities may be stored on the user device, or may be stored in the U-Me system in the cloud and be accessed via the user's device.

[0394] Let's go back to the scenario outlined above, in which a U-Me user has been in a car accident, is injured and unconscious, and arrives via ambulance at an emergency room of a hospital for treatment. Let's further assume the user's smart phone is not available, perhaps because the smart phone is still in the car that was wrecked. The U-Me system still allows for displaying the user's semi-private medical info to a medical person if the medical person or the medical facility is U-Me certified. Method 8600 in FIG. 86 shows one suitable example for displaying a U-Me user's medical info when the U-Me user is not conscious or is otherwise unable to provide access to the user's medical info. A medical person could use a handheld electronic device with a fingerprint scanner to scan a fingerprint of the patient (step 8610). If the patient is not a U-Me member (step 8620=NO), the handheld device will display that medical info for the patient is not available (step 8630). When the patient is a U-Me member (step 8620=YES), the medical person is then prompted for authentication info (step 8640). Assuming the medical person or the medical facility is U-Me certified, the medical person will be authenticated to the U-Me system (step 8650=YES), and the semi-private medical info for the patient is displayed to the medical person (step 8660). If the medical person is not authenticated to the U-Me system (step 8650=NO), the semi-private medical info for the patient is not displayed (step 8670). Method 8600 is then done. Method 8600 illustrates one specific way for a medical person to access the semi-private medical info for a patient when both the patient and the medical person or medical facility are U-Me members or users.
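Returning to the facility check of method 8400, step 8420 might be sketched as follows, assuming a simple coordinate database. The facility list, coordinates, and radius are fabricated for illustration; a deployed system would query a maintained database of medical facilities.

    import math

    # Hypothetical facility database: (name, latitude, longitude).
    MEDICAL_FACILITIES = [
        ("General Hospital", 37.7749, -122.4194),
        ("Eastside Clinic", 37.8044, -122.2712),
    ]

    def is_medical_facility(lat, lon, radius_km=0.5):
        # Compare the device's GPS fix against each facility; the
        # equirectangular approximation is adequate at these distances.
        for _name, flat, flon in MEDICAL_FACILITIES:
            x = math.radians(lon - flon) * math.cos(math.radians((lat + flat) / 2))
            y = math.radians(lat - flat)
            if 6371.0 * math.hypot(x, y) <= radius_km:
                return True
        return False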
[0395] A medical information sharing engine 8700 in FIG. 87 allows easily sharing a user's medical info with others. The medical info sharing engine is preferably provided as a feature of the sharing mechanism 174 shown in FIG. 5. The sharing engine 8700 includes a data entry screen that allows the user to select medical info in the user's U-Me account, then to specify one or more users with whom to share this medical info. In the example shown in FIG. 87, we assume the user had a car accident on 05/14/2013, and was treated in an emergency room on that day. The user puts in a date or date range. In this case, to assure the user catches all relevant medical info pertaining to the injuries from the car accident, the user enters the date of the car accident as the beginning date of a date range and enters a date a week later as the end date of the date range.
The U-Me system then displays all medical information in the user's U-Me account for that date range. The example in FIG. 87 shows three items of medical information: X-rays taken on 05/14/2013; ER treatment on 05/14/2013; and a lab report dated 05/16/2013. The user can select the "Share All" box, which will automatically select all three items to share. In the particular example in FIG. 87, the user has selected the X-rays and ER treatment for sharing and has not selected the lab report for sharing. The user can then specify one or more parties with whom to share the selected medical information.
[0396] In the most preferred implementation, the user can share medical info with one or more other U-Me users, which may include individuals such as family members and doctors as well as organizations such as hospitals, insurance companies, etc. Referring to FIG. 88, a user we call User1 invokes the medical info sharing engine (step 8810). User1 selects medical info to share (step 8820). User1 selects one or more U-Me users with whom to share the selected info (step 8830). The U-Me system then grants access to the selected medical info for User1 by the selected users (step 8840). Granting access can mean the selected users are given permission by the U-Me system to access the selected medical info stored in User1's U-Me account. Granting access could also mean copying the selected medical info from User1's U-Me account to the U-Me accounts of the selected users.
[0397] In some circumstances, a doctor may need to share a patient's medical information with other doctors. For example, some patients suffer from a condition that may require consultation with doctors in different areas of expertise. We assume User1 has consulted with a doctor, who asks permission to share User1's medical info with other doctors. Referring to FIG. 89, method 8900 begins by User1 sharing selected medical info with User2 (e.g., the first doctor) with authorization to allow User2 to share User1's medical info with others (step 8910). User2 can then select one or more other U-Me users (e.g., other doctors) with whom to share User1's medical info (step 8920). The U-Me system then grants access to the medical info for User1 to the U-Me users selected by User2 (step 8930). Method 8900 is then done.
[0398] In one implementation, when a user has shared medical information stored in the user's U-Me account with other U-Me users, the sharing may be revoked at any time. Referring to FIG. 90, method 9000 begins when User1 revokes sharing of User1's medical info to User2 (step 9010). The U-Me system revokes access to User1's medical info by User2 (step 9020). Revoking access can mean not allowing User2 to access the medical info in User1's U-Me account. Revoking access can also mean deleting any of User1's medical info that was copied to User2's U-Me account. The U-Me system also revokes access to medical info of User1 for all users to which User2 shared User1's medical info (step 9030). Method 9000 is then done. A user will be much more likely to share his or her medical info when the user retains control to revoke the access at a later time.
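The grant, re-share, and cascading revocation semantics of methods 8800, 8900 and 9000 could be modeled as sketched below. All names are illustrative assumptions; the key design point is that recording who re-shared each item lets a revocation of User2 propagate automatically to everyone with whom User2 shared (step 9030).

    from collections import defaultdict

    class MedicalSharingEngine:
        def __init__(self):
            self.grants = defaultdict(set)     # item -> users with access
            self.reshares = defaultdict(set)   # (item, grantor) -> grantees

        def share(self, item, grantor, grantee):
            # Grant access and remember who made the grant, so that
            # re-shares (method 8900) can be traced later.
            self.grants[item].add(grantee)
            self.reshares[(item, grantor)].add(grantee)

        def revoke(self, item, grantee):
            # Revoke the direct grant, then cascade to everyone the
            # revoked user re-shared the item with (step 9030).
            self.grants[item].discard(grantee)
            for downstream in self.reshares.pop((item, grantee), set()):
                self.revoke(item, downstream)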
[0399] The U-Me system can also track when a U-Me user takes medication, as shown in method 9100 in FIG. 91. A user takes meds (step 9110). The user indicates to the U-Me system when the user took the meds (step 9120). This could be done, for example, via the U-Me app on the user's smart phone. The U-Me system logs the meds taken and the time in the user's U-Me account. This information about when meds were taken can be added to the user's semi-private medical info so medical personnel will know what medications the user took and when. Method 9100 can track not only prescription medications, but nonprescription (over-the-counter) meds as well. In addition, method 9100 could be used to track the user's consumption of food, vitamins, supplements, herbs, etc.
[0400] The U-Me system can also provide reminders for a user to take meds at the appropriate time, as shown in method 9200 in FIG. 92. We assume a U-Me certified pharmacy dispenses prescription meds to a U-Me user (step 9210). The pharmacy uploads the prescription med info to the user's U-Me account (step 9220). When the user has set an option in the U-Me account for meds reminders (step 9230=YES), the U-Me system reminds the user to take meds at the prescribed times (step 9240). When the user does not want meds reminders (step 9230=NO), method 9200 is done. Even when the user does not want reminders, the user's U-Me account will have an exact record of which prescriptions were filled and when. Note the meds reminders may include any information relevant to taking the meds, such as "take with food", "do not drive after taking this medication", drug interaction warnings, etc.
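A minimal sketch of med logging (method 9100) and reminder scheduling (method 9200) follows. The class and method names are assumptions; a real implementation would handle time zones, dosage schedules, and delivery of reminders to the user's devices.

    from datetime import datetime

    class MedsTracker:
        def __init__(self, reminders_enabled=True):
            self.log = []                  # (timestamp, med) entries
            self.schedule = []             # (hour, minute, med, note)
            self.reminders_enabled = reminders_enabled

        def record_dose(self, med, taken_at=None):
            # Called from the U-Me app when the user reports taking meds
            # (method 9100); the log lives in the user's account.
            self.log.append((taken_at or datetime.now(), med))

        def add_prescription(self, med, hour, minute, note=""):
            # A certified pharmacy uploads prescription info (step 9220).
            self.schedule.append((hour, minute, med, note))

        def due_reminders(self, now):
            if not self.reminders_enabled:     # step 9230=NO
                return []
            return [f"Take {med}" + (f" ({note})" if note else "")
                    for h, m, med, note in self.schedule
                    if (h, m) == (now.hour, now.minute)]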
[0401] The U-Me system includes a user authentication mechanism 194 shown in FIG. 5. The user authentication mechanism 194 can perform suitable methods for authenticating a user to the U-Me system, including the methods shown in FIGS. 93 and 95. Referring to FIG. 93, method 9300 requires a user to authenticate to the U-Me system (step 9310). Once authenticated, the U-Me system functions are available to the user (step 9320). One suitable example of the user authentication mechanism 194 in FIG. 5 is shown as user authentication mechanism 9400 in FIG. 94. The user authentication mechanism 9400 can use biometric authentication 9410 as well as non-biometric authentication 9420. Suitable examples of biometric authentication 9410 shown in FIG. 94 include fingerprint, retina scan, voice print, DNA, and other biometric authentication. Of course, any suitable biometric authentication could be used, whether currently known or developed in the future. Biometric authentication as used herein refers to authentication related to some aspect of a person's body that is unique for each person. Due to the large amount of sensitive data stored in the user's U-Me account, biometric authentication is preferred to assure unauthorized parties cannot access the user's U-Me account. Biometric authentication is performed by providing a sample, storing the sample as the reference, then comparing the reference sample to future samples submitted for authentication. Thus, a user could scan the fingerprint of the user's right index finger, and the scanned fingerprint could be processed and stored as a reference fingerprint. When the user wants to authenticate to the U-Me system, the user scans the same fingerprint, and the newly scanned fingerprint is processed and compared to the stored reference fingerprint. If the new fingerprint scan matches the stored reference fingerprint, the U-Me system is assured the user trying to gain access to the user's U-Me account is, indeed, the user.
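The enroll-then-compare flow described above can be sketched as follows. This is a deliberate simplification: real biometric matching is fuzzy template matching with a similarity threshold, not the exact hash comparison used here, which merely illustrates that the raw reference sample need not be stored.

    import hashlib

    class BiometricAuthenticator:
        def __init__(self):
            self.reference = {}   # user id -> stored reference template

        def enroll(self, user_id, sample_bytes):
            # Process and store the reference sample (here, just a hash,
            # so the raw sample itself is never retained).
            self.reference[user_id] = hashlib.sha256(sample_bytes).hexdigest()

        def verify(self, user_id, sample_bytes):
            # Process the newly submitted sample and compare it to the
            # stored reference.
            candidate = hashlib.sha256(sample_bytes).hexdigest()
            return self.reference.get(user_id) == candidate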
[0402] Suitable examples of non-biometric authentication 9420 shown in FIG. 94 include username/password, touch screen pattern recognition, and other non-biometric authentication. Non-biometric authentication as used herein refers to authentication that is not necessarily unique to a person's body. Most online services today use the username/password paradigm for authenticating a user. The disadvantage of using non-biometric authentication is the possibility of somebody hacking a user's authentication information and accessing the user's U-Me account. The most preferred authentication for the U-Me system is biometric authentication because this assures only the user can access the user's U-Me account. In addition, biometric authentication can become an important way to address potential fears of licensing content such as music to a user instead of to a device. If the U-Me system requires biometric authentication, the U-Me system can be assured: 1) the user is who the user claims to be; and 2) the user can only be in one place at one time, so a user's licensed content can be provided to the user at this place and at this time. This should minimize piracy concerns because only the user can access the user's licensed content after the user authenticates to a location or device.
[0403] Fingerprint scanners are becoming more and more common. Many laptop computers now include a small slit near the keyboard over which a person may slide a fingertip, causing the sensor under the slit to scan the person's fingerprint. Such fingerprint scanners can be added to many systems without great cost to help each system interact in a secure way with users and with the U-Me system. For example, a fingerprint scanner could be added to vehicles to authenticate the driver to the vehicle. When a user rents a rental car, a slot-type fingerprint scanner on the rental car can be used for the user to authenticate to the car, which can, in turn, authenticate to the U-Me system. When the user is authenticated to the car, the U-Me system knows it can provide the user access to the user's music because the user has scanned a fingerprint to gain access to the U-Me system, and only the user has that fingerprint.
[0404] Method 9500 in FIG. 95 uses non-biometric authentication. The user authenticates to the U-Me system using non-biometric authentication (step 9510). The U-Me system functions are made available to the user only on one physical device at a time (step 9520). Restricting U-Me functions to one device at a time reduces the likelihood of a user sharing his or her username and password to give others access to the user's U-Me account, and also reduces the likelihood of a person who hacked a user's U-Me username and password gaining access to the user's U-Me account, at least while the user is logged in to the U-Me system.
[0405] The security of the U-Me system allows a paradigm shift in how licensed content may be licensed. Currently, if someone uses Apple iTunes to purchase and download music to the person's computer, the iTunes account is tied both to the computer and to the device to which the music may be copied. Thus, when a user configures iTunes on a computer system and configures iTunes for the user's iPad, the iTunes account will only function with the user's iPad, not with other devices. This creates a real problem in the event of a crash of the hard disk on the computer system. If the person has not faithfully backed up their hard drive, the person risks losing the licensed content that was in the iTunes account. This is true even when Apple can verify the purchases were made by the user from the iTunes store. Having purchased music tied so tightly to a computer system is a real problem when the computer system fails.
[0406] All of the ills associated with tying licensed content to specific devices can be cured by the U-Me system. But this requires a paradigm shift in how content is licensed to users. Referring to FIG. 96, method 9600 shows licensing licensed content to a user (a human person), not to a physical device (step 9610). The licensed content can then be made available to the user on any physical device (step 9620). But what about piracy concerns? The piracy concerns go away when the user must use biometric authentication to gain access to licensed content in the user's U-Me account. From a logical perspective, when a user purchases a song, shouldn't the user be able to listen to that song regardless of what device the user may have access to? This is the philosophy underlying the U-Me system: to make the user's data, licensed content, and settings available anywhere the user might be on any device the user might use. This is what allows "me to be me, anywhere", which is the goal of the Universal Me system.
[0407] One suitable example for method 9600 in the context of licensed music is shown in method 9700 in FIG. 97. A user purchases music (step 9710). The license for the music is to the user, and is not connected to any physical device (step 9720). The user's licensed music may then be made available by the U-Me system to any U-Me certified music player (step 9730). Method 9700 is then done.
[0408] A user may define settings related to the user's music, as shown in method 9800 in FIG. 98. The user defines music settings by organizing music into favorites, playlists, genres, etc. (step 9810). All of the user's music settings are then made available on any U-Me certified music player (step 9820).
[0409] Examples of physical music players 9900 are shown in FIG. 99 to include a phone, a tablet computer, a laptop computer, a desktop computer, a portable MP3 player, a car audio system, a home audio system, and other music player. Of course, these are listed as examples, and the disclosure and claims herein expressly extend to any suitable music player or type of music player, whether currently known or developed in the future.
[0410] With the paradigm shift to licensing to a person and not to any physical device comes the opportunity to provide creative licensing schemes for licensed content. Referring to FIG. 100, different music licenses for a song could be offered that are priced according to the duration of the license. Thus, FIG. 100 shows a user can purchase a one week license for $0.49; a one year license for $0.89; a five year license for $1.29; a license for the life of the purchaser for $1.59; and a perpetual license for $1.99. The shift to licensing a person rather than a physical device gives rise to the concept of "digital estate planning", where a person may own perpetual rights to licensed content that may be transferred to someone upon the user's death. Such transfers could be handled by the U-Me system automatically once proof of death is verified by a U-Me system administrator, according to transfer-on-death rules defined by the user in the user's U-Me account, or according to a user's will or other estate planning documents.
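The duration-priced tiers of FIG. 100 could be represented as sketched below; the tier names and validity logic are assumptions layered on the figure's prices.

    from datetime import date, timedelta

    # Tier prices from FIG. 100; a duration of None never expires.
    LICENSE_TIERS = {
        "one_week":          (0.49, timedelta(weeks=1)),
        "one_year":          (0.89, timedelta(days=365)),
        "five_year":         (1.29, timedelta(days=5 * 365)),
        "life_of_purchaser": (1.59, None),
        "perpetual":         (1.99, None),
    }

    def license_valid(tier, purchased_on, today, purchaser_alive=True):
        if tier == "life_of_purchaser":
            return purchaser_alive   # transferable only per estate rules
        _price, duration = LICENSE_TIERS[tier]
        return duration is None or today <= purchased_on + duration

    # e.g. license_valid("one_week", date(2013, 5, 14), date(2013, 6, 1))
    # evaluates to False because the one-week license has expired.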
[0411] One of the features of the U-Me system is the generation of virtual devices for the user in the user's U-Me account that correspond to physical devices the user uses. The goal is to have as much information as possible in the virtual device so if a physical device is lost, stolen, or malfunctions, a similar device may be easily configured using the information in the virtual device. Referring to FIG. 101, a physical device is scanned for all data, licensed content, and settings (step 10110). A virtual device is then generated in the U-Me account from the data, licensed content and settings of the physical device (step 10120). In the most preferred implementation, the virtual device will have virtually all information needed to configure a new device to have the exact same configuration as a previous device. An example will illustrate. Let's assume a user has a Samsung Galaxy S3 smart phone, and takes hours to get the phone configured exactly as he wants, with many apps installed and configured, with many different ringtones assigned to different contacts, with photos for many of his contacts, etc. Let's now assume the user registers this device with the U-Me system, which causes a process, such as method 10100 in FIG. 101, to create a virtual device that corresponds to the Samsung Galaxy S3 phone in the user's U-Me account, with all of the data, licensed content, software, and settings that define how the phone is configured and functions. Once the virtual device is created, if the user accidentally flushes his Samsung Galaxy S3 phone down a toilet, the user can go to his phone store, purchase a new Samsung Galaxy S3 phone, install the U-Me app, then log into his U-Me account. In the most preferred implementation, once the user registers the new phone with the user's U-Me account, the U-Me system will ask if the user wants to configure this phone to match the stored configuration in the virtual device in the U-Me account. When the user selects "Yes", the new phone is configured to exactly match the old phone, so the user can have the new phone up and running in very little time with the exact configuration on the new phone that the user spent so many hours defining on the old phone.
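Method 10100 and the later restore step could be sketched as follows, assuming a device is represented as a simple dictionary of data, licensed content, and settings. This is a hypothetical simplification; the real system would work through the device interfaces and handle items that cannot be copied, as discussed next.

    import copy

    def create_virtual_device(physical_device):
        # Method 10100: snapshot the scanned data, licensed content,
        # and settings into a virtual device in the user's account.
        return {key: copy.deepcopy(physical_device[key])
                for key in ("model", "data", "licensed_content", "settings")}

    def restore_to_new_device(virtual_device, new_device):
        # Configure a replacement device from the stored virtual device.
        for key in ("data", "licensed_content", "settings"):
            new_device[key] = copy.deepcopy(virtual_device[key])
        return new_device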
[0412] Current limitations may not allow all of the user's data, licensed content, and settings to be transferred to the virtual device in the user's U-Me account, or from the virtual device in the user's U-Me account to a new device. For example, the operating system may prevent copying its code to the virtual device. Other apps could likewise prevent copying their code as well. The goal of the virtual device is to contain as much information as possible so a new device can be configured more easily than with known techniques used today, which typically take hours to configure a new device. The U-Me system can still be helpful even when not all needed information is contained within the corresponding virtual device. When the U-Me app scans the physical device for data, licensed content, and settings, the U-Me app can log when copying something to the virtual device is not allowed, and can provide a list of instructions for the user to follow. For example, let's assume when the user creates a virtual device that corresponds to his Samsung Galaxy S3 phone (discussed above), the U-Me app cannot copy the operating system or two of the eighteen apps installed on the phone to the virtual device in the user's U-Me account. The U-Me app can then provide a list of instructions stored in the U-Me account for configuring a new device. Thus, in the scenario above where the user flushes his phone down a toilet and buys a new phone to replace it, when the user registers the new phone to the user's U-Me account, a list of instructions will be provided, such as: install OS update 2.6.4, then install Southwest Airlines app, then install Yelp app. Once these preliminary things are done by the user, the U-Me system can then configure the phone using the data, licensed content and settings stored in the corresponding virtual device in the user's U-Me account.
[0413] Known apps may include features that prevent copying their software and settings. This problem can be addressed by providing a process for an app to become U-Me certified. A U-Me certified app will have defined interfaces and methods that allow completely copying the entire app, including all of its settings, to a virtual device. In this manner, U-Me certified apps will allow fully automatic copying of the app and its settings to a new device. Once customers start demanding that apps and devices be U-Me certified, the providers of these apps and devices will feel pressure to get their products U-Me certified, which will aid in the widespread proliferation of the U-Me system.
[0414] For the virtual devices in the user's U-Me account to be up-to-date, a method is needed to synchronize any changes in the device to the virtual device stored in the user's U-Me account. An example of such a method is shown at 10400 in FIG. 104, which is preferably performed by the data tracker 162 shown in FIG. 5. All data additions and deletions in all of the user's physical devices are tracked (step 10410). All data additions and deletions in the user's U-Me account that affect the configuration of the physical devices are also tracked (step 10420). All data is then synchronized between the physical devices and the virtual devices stored in the user's U-Me account (step 10430).
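One possible sketch of the synchronization of step 10430 follows. It is a simplification under stated assumptions: items are keyed by an id, carry a "modified" timestamp, and conflicts resolve last-writer-wins; deletion tracking (e.g., tombstones), which method 10400 also requires, is omitted for brevity.

    def synchronize(device_items, virtual_items):
        # Two-way sync keyed by item id; the newer copy of each item
        # wins. Tombstones for tracked deletions are omitted here.
        merged = {}
        for item_id in set(device_items) | set(virtual_items):
            a = device_items.get(item_id)
            b = virtual_items.get(item_id)
            if a is None or (b is not None and b["modified"] > a["modified"]):
                merged[item_id] = b
            else:
                merged[item_id] = a
        device_items.clear()
        device_items.update(merged)
        virtual_items.clear()
        virtual_items.update(merged)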
[0415] Tracking data as shown in FIG. 104 requires identifying data attributes that describe the data. Note that "data" as discussed in FIGS. 105-109 refers to user data, user licensed content, and user settings. Attributes of the added data are identified (step 10510). The added data is stored to the U-Me account with indexing info that includes the identified attributes (step 10520). The indexing info will help the U-Me system know how the data should be stored, retrieved and handled. Examples of suitable data attributes 10600 are shown in FIG. 106 to include what, where, when, and other data attributes. "What" could refer to the type of data. "Where" could refer to where the data was generated. "When" could refer to a time and date relating to the changed data. Referring to FIG. 107, examples of data type attributes 10700 include operating system data, application data, user input, source, licensed content, size, and other data type. Referring to FIG. 108, examples of location data attributes include a device, the user's U-Me account, and other location attribute. Referring to FIG. 109, examples of time/date attributes include time, date, expiration, and other time/date attribute. By storing data to the user's U-Me account with the indexing info that includes attributes of the data, the data can be appropriately handled when synchronizing the data between the user's U-Me account and the user's devices.
[0416] Referring to FIG. 110, a data file 11000 represents data that is stored in the user's U-Me account. Each data file preferably includes an identifier ID, indexing info, and data. One example of a suitable data file that conforms to the format shown in FIG. 110 is shown in data file 11100 in FIG. 111. The ID is a numerical identifier that uniquely identifies this data record from other data records in the user's U-Me account. The indexing information in the example in FIG. 111 includes Jim Jones, .pdf file, TurboTax form, 2012 Federal Tax Return, 142KB, Desktop Computer, and 04/15/2013 at 21:46:23. The data is the .pdf form data. The indexing info shows Jim Jones created this file, a .pdf file, which is a TurboTax form that is part of his 2012 Federal Tax Return, the size of the file is 142 KB, and the file was generated on Jim's desktop computer on 04/15/2013 at 9:46:23 PM. By storing data files in the user's U-Me account using indexing info, the data can be retrieved using a powerful data search engine.
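A record shaped like data file 11100, together with retrieval by indexing info rather than by directory path and file name, might look like the following sketch; the dictionary layout and the search helper are illustrative assumptions.

    records = [
        {"id": 1,
         "index": {"created_by": "Jim Jones",
                   "type": ".pdf",
                   "description": "TurboTax form, 2012 Federal Tax Return",
                   "size_kb": 142,
                   "device": "Desktop Computer",
                   "timestamp": "2013-04-15T21:46:23"},
         "data": b"%PDF-..."},
    ]

    def search(records, **criteria):
        # Return records whose indexing info matches every criterion,
        # e.g. search(records, created_by="Jim Jones", type=".pdf").
        return [r for r in records
                if all(r["index"].get(k) == v for k, v in criteria.items())]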
[0417] The first desktop computers included a file system that included a hierarchy of directories and subdirectories where files were stored. The same paradigm exists today, more than thirty years later. Although some refer to the directory/subdirectory structure as folders/subfolders, regardless of the label the directory structure is the same, and the end result is the same - the user must decide where to store data on a computer system, and how to name files being stored. Requiring the user to decide where to store data and what file names to use also requires a user to remember the directory and file name when the user wants to retrieve the data. Most computer users have had the experience of storing a file using a directory and filename the user selected, then having great difficulty locating the file later on because the user cannot remember what directory or subdirectory the user saved the file to, or what name the user gave the file when the file was stored. To put users through this frustration is just plain silly given the fact that the computer always knows where data is stored, because it is the entity that stores the data. We need to get away from the very antiquated and outdated directory/subdirectory paradigm that has frustrated and plagued so many computer users for so long. The U-Me system provides a much easier way to store and retrieve data from a user's U-Me account. Instead of using a directory/subdirectory file system that requires a user to remember directory names and file names, the U-Me system allows a user and/or the U-Me system to add indexing info that describes the data, such as the indexing info shown in data file 11100 in FIG. 111. Because this indexing info is stored with the data, the indexing info may be used to retrieve the data using a suitable search engine, such as data search engine 164 in FIG. 5. One example of a screen 11200 for a data search engine is shown in FIG. 112. Screen 11200 includes fields for Data Created By, Data Type, Date and Device. Note each field includes an Add button to add more fields of that type. Data search engine screen 11200 is an example of a screen that could be provided by the data search engine 164 in FIG. 5. A screen for a search engine could include any suitable field that can be used as indexing info to locate the stored data. The data search engine 164 in FIG. 5 can provide different screens, including the e-receipt search engine screen in FIG. 50, the photo search engine screen in FIG. 76, and the data search engine screen in FIG. 112. The data search engine disclosed herein includes any suitable way to specify index information for retrieving data stored previously, whether currently known or developed in the future.
[0418] The U-Me system can provide a level of abstraction that hides the underlying file system. This can be done by creating "containers" for different item types. For example, when the user stores the first photo to the user's U-Me account, the U-Me system can recognize from the file type and the indexing info that this is a photo, and can create a container where all photos are stored. The "container" is also a logical construct, and can be implemented using any suitable technology under the covers. Thus, a virtual machine that is provisioned to run a user's U-Me account could have a directory/subdirectory/filename file system, but this could be hidden from the U-Me user by the containers defined in the user's U-Me account.
[0419] While the concept of data search engines is very well-known in the database field, these concepts have not yet been applied to user interfaces for most physical devices. As a result, the prior art is stuck in the same antiquated directory/subdirectory paradigm for disk file systems. The U-Me system can use any disk file system, including a directory/subdirectory structure. However, the U-Me system preferably provides a layer of abstraction that hides the directory/subdirectory file structure from the U-Me user, and instead provides easy-to-use data entry screens for storing data and search screens for retrieving data.
[0420] Referring to FIG. 113, a method 11300 is performed by the U-Me system when a new physical device needs to be configured. The new physical device is registered with the U-Me system (step 11310). When the new device is a replacement for an identical device (step 11320=YES), the new device is configured as a clone of the stored virtual device (step 11322). For example, in the scenario discussed above where a U-Me user flushes his Samsung Galaxy S3 phone down the toilet, and purchases a new Samsung Galaxy S3 phone as a replacement, the new phone can be configured as a clone of the virtual device in the user's U-Me account. When the new device is not a replacement for an identical device (step 11320=NO), a determination is made whether the new device is a replacement for a similar device (step 11330). For example, if the user who flushed his Samsung Galaxy S3 phone down the toilet buys a Samsung Galaxy S4 phone as a replacement, the new device is not a replacement for an identical device (step 11320=NO), but is a replacement for a similar device (step 11330=YES). The new device configuration for the Samsung Galaxy S4 phone can be determined from the stored virtual device for the Samsung Galaxy S3 phone (step 11332) because these are phones in the same family by the same manufacturer. Of course, any suitable criteria for "similar" could be defined. Note the new device may have settings that were not included in the stored virtual device because the new device is similar, but not identical. Method 11300 configures the new device as closely as possible to the old device, hopefully leaving a minimum of manual configuration for the user to perform. Once the new device is configured, if there are settings on the new device that were not available on the old device, the U-Me system could display those settings to the user for configuration.
[0421] When the new device is not a replacement for a similar device (step 11330=NO), the U-Me system determines whether a device-specific template exists for the new device (step 11340). When a device-specific template exists for the new device (step 11340=YES), the new device configuration is determined from the device-specific template and info in the user's U-Me account (step 11342). When no device-specific template exists for the new device (step 11340=NO), the new device configuration is generated from info in the user's U-Me account (step 11344), such as from a universal template, or from converting settings between two device-specific templates.
[0422] When the configuration for the new device is missing some info (step 11350=YES), the user may be prompted to provide the missing configuration info for the new device (step 11360). When the configuration for the new device is not missing any info (step 11350=NO), or after the user has provided all missing info in step 11360, the new device is configured (step 11370). Configuration of the new device occurs by the U-Me system copying the configuration information to the new device.
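The decision sequence of method 11300 (steps 11320 through 11344) can be sketched as a simple strategy selector; the function and parameter names are assumptions, and each returned string stands in for the corresponding configuration action.

    def configuration_strategy(new_model, virtual_devices, templates, similar):
        # Steps 11320/11322: identical replacement -> clone.
        if new_model in virtual_devices:
            return "clone stored virtual device"
        # Steps 11330/11332: similar device -> convert settings.
        for old_model in similar.get(new_model, []):
            if old_model in virtual_devices:
                return "convert settings from virtual " + old_model
        # Steps 11340/11342: device-specific template available.
        if new_model in templates:
            return "fill device-specific template from account info"
        # Step 11344: fall back to info in the account, e.g. a universal template.
        return "generate from universal template, prompting for missing info"

    # e.g. configuration_strategy("Galaxy S4", {"Galaxy S3": {}}, {},
    #                             {"Galaxy S4": ["Galaxy S3"]})
    # selects the convert-settings path, as in the example above.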
[0423] The definition of "similar" in step 11330 in FIG. 113 can be related to whether a device is of the same type or of a different type. The definition of "type" can be related to the physical characteristics of the device, the operational characteristics, and the manufacturer. Thus, in one scenario, a Samsung Galaxy S4 phone can be deemed to be of the same type as a Samsung Galaxy S3 phone because they both come from the same manufacturer and run the same Android operating system, while an iPhone could be deemed to be of a different type because it has a different manufacturer and runs a different operating system. In a different scenario, an iPhone can be deemed to be of the same type as a Samsung Galaxy S3 phone when the definition of type includes smart phones. The conversion mechanism 160 in FIG. 5 can convert settings between two different types of devices, regardless of how "type" is defined in any specific scenario.
[0424] As shown in FIG. 96, it is preferred in the U-Me system for licensed content, including software, to be licensed to a human user, and not to any physical device. Method 11400 in FIG. 114 shows an example for handling software that is licensed to a user. The user purchases the software that is licensed to the user (step 11410). For this specific example in FIG. 114, we assume this means the software is not licensed to any physical device. The user provides the download and license information to the U-Me system (step 11420). For example, an installer file could be the download information, and the license information could be a license key received via e-mail after purchasing the software. The U-Me system installs the software on a virtual machine for the user (step 11430). The user then interacts with the software running on the virtual machine in the U-Me system (step 11440). By installing purchased software to a U-Me virtual machine instead of to a physical machine, the functions of the software can be made available to the user on any device the user may be using.
[0425] As shown in FIG. 96 and discussed above, it is preferred in the U-Me system for licensed content to be licensed to a human user, and not to any physical device. However, this is not how licensing works today for most software. As a result, there needs to be a path for gradually migrating from the current paradigm of licensing software to a user for use on a particular device to the new paradigm of licensing licensed content to the human user and not to any particular device. This path can be provided using method 11500 in FIG. 115. A user purchases software that is linked to a physical device (step 11510). The user provides the download and license info to the U-Me system (step 11520). The U-Me system generates a virtual device ID for the software (step 11530). The virtual device ID can be the physical ID of any computer system in the U-Me system, or could be a spoofed ID that is not from any physical computer system. The U-Me system then installs the software on a virtual machine for the user (step 11540). The user then interacts with the software running on the virtual machine in the U-Me system (step 11550).
[0426] A suitable example of a virtual machine 11600 is shown in FIG. 116. In this particular example, the virtual machine 11600 hosts the user's data 120A, the user's licensed content 130A, the user's settings 140A, a phone interface 11610, a tablet interface 11612, a laptop interface 11614, a desktop interface 11616, and the universal user interface 142. These interfaces 11610, 11612, 11614 and 11616 are suitable examples of device interfaces 156 in FIG. 5 that allow the U-Me system to communicate with each of the user's physical devices. Note that any and all of the items shown in FIG. 5 could run on the user's virtual machine 11600, but some of these may execute on other systems that interact with the user's virtual machine 11600 using a defined interface or protocol. All of the functions on the virtual machine are provided by the virtual machine mechanism 158 shown in FIG. 5.
[0427] For the methods in FIGS. 114 and 115, a user could interact with software running on a virtual machine in the U-Me system (step 11440 in FIG. 114 and step 11550 in FIG. 115) by invoking the universal user interface 2300 in FIG. 23, then clicking on the Software icon 2330. In response, the universal user interface 2300 could display a screen that shows icons corresponding to all software that is available to run in the user's U-Me account. This could be a sort of "virtual desktop" that provides an icon-based display of software available to the user. When the user selects an icon corresponding to a software program the user wants to run, the software program will be run on a virtual machine in the U-Me system, and the user will then interact with the software using the universal user interface. The universal user interface thus provides an interface to any software running on a virtual machine in the U-Me system. This provides many advantages. First, the user can access and use the software using any suitable physical device. Because the software runs on a virtual machine, the physical device need not run the software. The physical device merely needs to provide a universal user interface to the software running on a virtual machine in the U-Me system. Thus, a user could be on vacation in Italy, go into an Internet cafe, and use a standard web browser to invoke the login page of the U-Me system. The user could then authenticate to the U-Me system, preferably by biometric authentication if the computer in the Internet cafe has that capability, or via a username and password or other non-biometric authentication if the computer does not have biometric authentication capability. Once authenticated, the universal user interface could be displayed in the web browser, which then provides access to all of the user's data, licensed content, and settings, including the user's licensed software, on the computer system in the Internet cafe. Running software on a virtual machine via a universal user interface provides a very powerful platform that may be accessed using any suitable device in any suitable location. For example, if a user rents a hotel room that is U-Me certified, the user could invoke the universal user interface via a web browser on the television, which would make all of the user's data, content and settings, including software, available in the hotel room.
[0428] The U-Me system can also give weather notifications to its users. Weather notifications are referred to herein as weather alerts. The weather alerts may be provided, for example, by the alert mechanism 166 shown in FIG. 5. Referring to FIG. 117, method 11700 allows a user to define one or more geographic regions (step 11710), and for each defined region, the user may select one or more weather alerts (step 11720). The National Oceanic and Atmospheric Administration (NOAA) operates special radio stations in the United States that broadcast weather information continuously. These radio stations work in conjunction with special weather radios that can be activated by a weather alert from the NOAA radio station. Known weather radios allow the user to select counties for which weather alerts are received. Some of the weather alerts defined by NOAA are shown by way of example in FIG. 118. However, known weather radios do not allow the user to customize weather alerts according to the user's preferences.
[0429] If a person lives in an area where tornados can occur, and wants a weather radio to sound the alarm in the event of a tornado watch or a tornado warning, the person typically sets the location on a weather radio to the county where the person lives. While this may result in tornado watches and warnings waking up the user in the middle of the night, which is desirable so the user can take cover in a shelter, this will also result in many other weather alerts waking up the user in the middle of the night. For example, if the user lives several miles from any creeks, streams or rivers, the user probably does not want to be awakened at 3:00 AM by the weather radio giving a flash flood warning. Similarly, if the user is home and warm in his bed, the user probably does not want to be awakened to be informed of a wind chill warning. While this may be valuable information in the morning before the user leaves for work, it is annoying to be repeatedly awakened during the night for all the different weather alerts that may issue from the NOAA radio station.
[0430] Some people become so annoyed at being repeatedly awakened for weather alerts they don't care about that they turn the alert function of the weather radio off. This defeats the entire purpose of the weather radio, because with the alert turned off, the weather radio cannot awaken the user when a tornado watch or tornado warning is issued. There needs to be a better way for a user to customize weather alerts. The U-Me system provides this better way.
[0431] Referring to FIG. 119, a sample input screen 11900 for a weather alert interface is shown to include fields for Alert Type, Geographic Region, Time of Day, Priority, and Send Alert to. The weather alert interface provides a way for a user to define many different types of weather alerts so the user is not repeatedly awakened in the night for weather alerts in which the user has no interest. Sample weather alerts defined by a user are shown in FIGS. 120-122. The weather alert in FIG. 120 is for a Tornado Alert, which is to sound when a tornado alert is issued for this county and neighboring counties, the time of day for the warning is Any, the priority is High, and the Send Alert To specifies to send the alert for the tornado warning to all of the user's registered devices using all message types. Thus, when a tornado warning is issued by NOAA for this county and neighboring counties, the user will be notified with high-priority alerts, which can include alerts on any and all of the user's devices, so the user will be informed of the tornado warning, even if it means the user is awakened in the night.
[0432] A second example weather alert is shown in FIG. 121, which is a weather alert for a Flash Flood Watch. The geographic region may be defined as "within 10 miles of my current location"; the Time of Day is Any; the priority is Low; and the Send Alert To is e-mail and text. Because the priority is low, the user will not be awakened by a flash flood watch. In the U-Me system, a user can define a geographic region in any suitable way, including by specifying defined regions like counties, or by defining a radius from the user's current location. By defining a radius, the U-Me system can dynamically adjust as the user moves. Thus, if the user is in Jasper County but lives within 10 miles of the Jasper/Newton County line, and if the NOAA alerts are done county-by-county as they currently are, the U-Me system could convert the "10 mile radius of my current location" specified in FIG. 121 to include flash flood watches for both Jasper County as well as Newton County. The conversion of settings is preferably performed by the conversion mechanism 160 shown in FIG. 5. Because the low-priority Flash Flood Watch weather alert is sent to the user via e-mail and text, the user will not be awakened by this alert, but the information will be available via e-mail and text when the user awakens.
[0433] A third example weather alert is shown in FIG. 122, which is a weather alert for a Wind Chill Watch. The geographic region is set to a 50 mile radius of my current location, the time of day is Any, the priority is Low, and the alerts are sent via e-mail and text. The ability to truly customize weather alerts allows the U-Me system to benefit from the weather alerts provided by the NOAA system while giving the user flexibility so the user will not be repeatedly awakened by weather alerts for which he has no interest.
[0434] Referring to FIG. 123, a method 12300 shows how the U-Me system processes weather alerts. The U-Me system receives a weather alert for a geographic region (step 12310). The weather alert received in step 12310 could be, for example, a weather alert from NOAA, but could also include other weather alerts from other sources as well. The U-Me system determines which users (if any) have alerts set for the geographic region specified in the alert (step 12320). One of the users that has an alert set for this geographic region is selected (step 12330). When the weather alert satisfies the user's weather alert criteria (step 12340=YES), the user is alerted (step 12350). When the weather alert does not satisfy the user's weather alert criteria (step 12340=NO), the user is not alerted. When there are more users who have alerts set for the geographic region for the weather alert (step 12360=YES), method 12300 loops back to step 12330 and continues until there are no more users who have alerts set for the geographic region for the weather alert (step 12360=NO). At this point, method 12300 is done. The U-Me system provides an additional layer of technology atop the inefficient NOAA weather radio system to allow the user to customize weather alerts to the user's liking. In this manner the user can be awakened for any weather alert he chooses, yet can remain asleep for other weather alerts for which the user does not care to be awakened.
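The matching loop of method 12300 might be sketched as follows, assuming each user's settings are stored as simple records like those of FIGS. 120-122. Real region matching would use the radius-to-county conversion discussed above rather than exact string comparison.

    from dataclasses import dataclass

    @dataclass
    class WeatherAlertSetting:
        alert_type: str    # e.g. "Tornado Warning"
        region: str        # e.g. "Jasper County"
        priority: str      # "High" wakes the user; "Low" does not
        send_to: tuple     # e.g. ("all devices",) or ("email", "text")

    def process_alert(alert_type, region, user_settings):
        # Fan an incoming alert out to each user whose criteria it
        # satisfies (steps 12320-12350); user_settings maps a user id
        # to that user's list of WeatherAlertSetting records.
        deliveries = []
        for user_id, settings in user_settings.items():
            for s in settings:
                if s.alert_type == alert_type and s.region == region:
                    deliveries.append((user_id, s.priority, s.send_to))
        return deliveries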
[0435] Various functions relating to home automation are shown in FIGS. 124-130 and discussed in detail below. These functions are preferably performed by the home automation mechanism 186 shown in FIG. 5. FIG. 8 shows that user settings 140 may include home automation settings 850. Examples of suitable home automation settings are shown at 12400 in FIG. 124. The example home automation settings shown in FIG. 124 include appliance settings 12410, Heating/Ventilation/Air Conditioning (HVAC) settings 12420, light settings 12430, security settings 12440, home theater settings 12450, programs 12460, and other home automation settings 12470.
[0436] Examples of suitable appliance settings 12410 are shown in FIG. 125 to include coffee pot settings 12510, refrigerator settings 12520, alarm clock settings 12530, and other appliance settings 12540. Some appliances already have IP addresses, and some people think all devices that plug in will have IP addresses someday. The U-Me system contemplates the future, when a user may want to define settings in the user's U-Me account for any and all of the user's appliances.
[0437] Examples of suitable HVAC settings 12420 are shown in FIG. 126 to include thermostat settings 12610, heater settings 12620, air conditioning settings 12630, fan settings 12640, air cleaner settings 12650, and other HVAC settings 12660. Thermostat settings 12610 may include settings for different thermostats in different zones. Heater settings 12620 may include the heat temperature setting on a thermostat. Air conditioning settings 12630 may include the cool temperature setting on a thermostat. Fan settings 12640 may include turning various fans on or off, or varying the speed of fans, including fans on heaters, air conditioners, ceiling fans, stove exhaust fans, bathroom exhaust fans, etc. For example, fan settings 12640 could specify to run a bathroom exhaust fan when the bathroom light is turned on and to keep the exhaust fan running for ten minutes after the bathroom light is turned off. Air cleaner settings 12650 could include settings to run the air cleaner for a specified continuous period at night, then intermittently during the day according to a defined schedule.
[0438] Examples of suitable light settings 12430 are shown in FIG. 127 to include kitchen light settings 12710, master bedroom light settings 12720, master bathroom light settings 12730, living room light settings 12740, garage light settings 12750, and other light settings 12760. Light settings 12430 can include settings for any light or group of lights. For example, the user may have a motion sensor near the front door of the house to detect when somebody is approaching the front door. Light settings 12430 could include a setting that turns on the porch light when the motion detector detects motion near the front door. Light settings 12430 could include any suitable condition or trigger for turning lights on or off. For example, exterior security floodlights could be illuminated at dark and be kept on until dawn, or could be selectively turned on and off based on one or more motion sensors.
[0439] Examples of suitable security settings 12440 are shown in FIG. 128 to include an arm code 12810, a disarm code 12820, a bypass when arming condition 12830, a lock doors when arming condition 12840, and other security settings 12850.
[0440] Examples of home theater settings 12450 are shown in FIG. 129 to include news settings 12910, sporting events settings 12920, TV series settings 12930, movie settings 12940, and other home theater settings 12950. The home theater settings 12450 allow a user to define a "scene" for various types of viewing experiences. The settings shown in FIG. 129 could each include settings for the home theater audio system, for lights in the TV room and possibly adjacent rooms, for opening and closing drapes or blinds on one or more windows, etc. Thus, when a user wants to settle in to watch a movie, the user can select "Movie" mode, and the room will then be configured with the appropriate lights, sound settings, window covering positions, etc. for the user to watch a movie.
[0441] Programs 12460 shown in FIG. 124 can include any suitable program or logic to cause different things in the home automation controller to occur based on some specific condition, event, or relationship. An example of a suitable program for a home automation controller is: When arming alarm (ARM), set Thermostat1 to Day High Setting. Thus, when the user presses buttons on the home automation keypad to set the alarm as the user is leaving the house, the program above will cause the settings in Thermostat1 to change since nobody is home. Another example of a suitable program is: When GARAGE DOOR opens, turn on Garage Lights. The programs 12460 can include any suitable logic, setting, or combination of settings for home automation, whether currently known or developed in the future.
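Such trigger/action programs could be represented as sketched below; the rule table mirrors the two example programs above, while the event names and dispatch model are illustrative assumptions.

    # Each rule maps a triggering event to an action on some target.
    RULES = [
        ("ALARM_ARMED",      ("Thermostat1", "set", "Day High Setting")),
        ("GARAGE_DOOR_OPEN", ("Garage Lights", "turn", "on")),
    ]

    def handle_event(event, rules=RULES):
        # Return the actions fired by an event; a real controller would
        # dispatch these to the devices on the home automation bus.
        return [action for trigger, action in rules if trigger == event]

    # e.g. handle_event("GARAGE_DOOR_OPEN")
    # returns [("Garage Lights", "turn", "on")]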
[0442] FIG. 130 shows a display of sample home automation settings 13000 that could reside in a user's U-Me account. Home automation settings 13000 for the simple example shown in FIG. 130 include appliance settings 12410, HVAC settings 12420, and light settings 12430. The appliance settings 12410 in FIG. 130 include "start the coffee pot brewing at 8:00 AM," "turn on the refrigerator icemaker," and "set the alarm clock for an 8:00 AM alarm." The HVAC settings 12420 in FIG. 130 include cool and heat settings from 8:00 AM to 10:00 PM; cool and heat settings from 10:00 PM to 8:00 AM; the furnace fan set to auto on the thermostats; and the heat/cool mode set to auto on the thermostats. The light settings 12430 in FIG. 130 include turning all lights off at 10:00 PM, turning bedroom lights on at 8:00 AM, and turning all lights on in the event that the security system that is part of the home automation controller sounds a security alarm.

[0443] With the user's home automation settings stored in the user's U-Me account, these settings are available for use at a different location. Thus, let's assume the user takes a vacation and rents a condo on the French Riviera. Let's further assume the condo is U-Me certified. When the user arrives at the condo, the user can authenticate to the U-Me system using either the user's smart phone or another user device, or using a device in the condo. Once the U-Me system knows the condo where the user is staying, the U-Me system will have all of the templates that pertain to all of the U-Me certified equipment in that condo. Thus, all of the user's home automation settings (such as those shown in FIG. 130) could be programmed into corresponding equipment at the rental condo. This could even extend to the arm/disarm code for a security system, so the user can use the same codes she uses at home to arm and disarm the security system at the rental condo. This can be done even if the security system at the condo is a totally different brand and type of security system. The conversion mechanism 160 in the U-Me system can convert the user's settings at home to corresponding settings in the rental condo. The U-Me system thus allows "me to be me, anywhere" by making a user's settings in the U-Me account available wherever the user happens to be, on whatever device or equipment the user is using.
[0444] With computers today, users typically install software on their computer system, then enter a license key during installation or an activation code after installation that makes the software work. For ease of discussion herein, the term "license key" means any information the user may enter to gain access to a computer program. Thus, for every program a user installs, there are two critical pieces of information: 1) the program itself; and 2) the license key that allows the user to install the program. Most users today purchase software online, which results in an installation package being downloaded to the user's computer. The license key is typically sent to the user via e-mail. Thus, when the user clicks on the installation package, the user is then prompted to enter the license key. Once the license key is entered, the software can be installed and run. In other scenarios, the software can be completely installed, but the license key (or activation code) must be entered after installation to enable all of the features of the software.
[0445] Many computer users are not tech-savvy enough to know where an installation package is stored when it is downloaded to the user's computer. And many users do not keep track of their e-mails well enough to assure the license key is saved in case the software ever needs to be reinstalled. The result is that one of the two important pieces needed to install and run a computer program can be easily misplaced or lost. This problem is prevented by the license management mechanism 188 in the U-Me system shown in FIG. 5. The license management mechanism 188 can perform a method such as method 13100 in FIG. 131. The user downloads software (step 13110). The user enters the license key to install or activate the software (step 13120). Once the user enters the license key, the license management mechanism 188 has both crucial pieces of information that will be needed if the software ever needs to be installed or activated in the future. A license management entry is thus created in the user's U-Me account for the software and the corresponding license key. One suitable example of such an entry is shown at 13200 in FIG. 132, which includes a license key 13210 and the software 13220. Note the software 13220 can include a fully downloaded computer program ready to install, or can include an installation package that includes a smaller piece of code that, in turn, downloads and installs the software from an online provider.
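The following Python sketch illustrates the pairing of the two pieces of information described above into a license management entry (cf. entry 13200 in FIG. 132). The field names, software name, key, and path are hypothetical.

    # Sketch: creating a license management entry in the user's account.
    import json, time

    def create_license_entry(account, software_name, license_key, package_path):
        entry = {
            "software": software_name,
            "license_key": license_key,
            "package": package_path,           # full installer or download stub
            "recorded": time.strftime("%Y-%m-%d"),
        }
        account.setdefault("license_entries", []).append(entry)
        return entry

    account = {}
    create_license_entry(account, "PhotoEditor 7", "ABCD-1234-EF56",
                         "/packages/photoeditor7_setup.exe")
    print(json.dumps(account, indent=2))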
[0446] While the example in FIGS. 131 and 132 relates to software, the license management mechanism 188 can track license information for all licensed content. For example, when music is purchased, the license management mechanism will create an entry that includes the music file as well as license information for the music file. The license management mechanism thus monitors and stores licensed content and corresponding license information so this information can be accessed in the future. This can be useful, for example, when performing an audit to assure the user has licenses for all licensed content, and when transferring licensed content to another user, as discussed in more detail below.
[0447] Referring to FIG. 133, examples of alerts that can be provided by alert mechanism 166 in FIG. 5 include birthdays, anniversaries, periodic reminders, seasonal reminders, weather alerts, medication alerts, and other alerts. Examples of periodic reminders are shown at 13400 in FIG. 134 to include a reminder for a user to take thyroid medication every day at 7:30 AM, a reminder to check oil on all vehicles on the 1st of each month, a reminder to pay the house payment on the 5th of each month, a reminder to check the air filter on the furnace each quarter, a reminder to pay estimated taxes on specified dates, a reminder to file income tax returns annually on April 15th, and other periodic reminders. Periodic reminders can include any reminder for any type of information, event, or task.
[0448] Examples of seasonal reminders are shown at 13500 in FIG. 135 to include a reminder each October 1st to remove hoses from the hose bibs (so a freeze does not cause the hose bibs to burst), a reminder each April 1st to clean out the roof gutters, a reminder each March 15th to take the cover off the
AC compressor, a reminder each October 15th to put the cover on the AC compressor, a reminder on November 15th and January 15th to charge the riding lawnmower battery (so it doesn't run down over the winter), and other seasonal reminders. Note that seasonal reminders can be thought of as a specific subset of periodic reminders. While the U-Me system can provide its own alerts and reminders, the U-Me system can also interact with a user's calendar and tasks to set reminders in those systems as well.
[0449] Another function that can be provided by the U-Me account is the automated destruction of data, content or settings based on defined criteria in a retention/destruction policy by the retention/destruction mechanism 170 shown in FIG. 5. Referring to FIG. 136, in method 13600 a user defines criteria in the user's retention/destruction policy (step 13610). When none of the data or licensed content or settings satisfy criteria for destruction (step 13620=NO), method 13600 loops back and waits until any of the data and/or licensed content and/or settings satisfy the criteria for destruction (step 13620=YES). The U-Me system then destroys the data and/or licensed content and/or settings that satisfy the criteria for destruction specified in the user's Retention/Destruction policy (step 13630).
[0450] A sample retention/destruction policy is shown at 13700 in FIG. 137. The sample policy specifies: retain tax returns for five years; destroy specified data upon my death; destroy a virtual device that is more than two generations back; and other retention/destruction criteria. Note that destroying data in this context means deleting the data from the user's U-Me account. Instead of specifying to retain tax returns for five years, the user could instead specify to destroy tax returns after five years. These may accomplish the same end result in the U-Me system, or may provide completely different results. For example, if the user specifies to retain tax returns for five years, and the user then attempts to delete a tax return in the user's U-Me account that is less than five years old, the U-Me system could prevent the deletion and provide a message to the user that deletion of the tax return is not allowed due to the retention criteria the user specified in the Retention/Destruction policy. On the other hand, if the user specifies to delete tax returns after five years, this would not necessarily prevent the user from deleting a tax return that is less than five years old. However, the U-Me system would be sure to destroy a tax return once it is more than five years old.
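A minimal sketch of the destruction check of method 13600 follows, assuming a hypothetical policy format and item fields. A real implementation would also enforce the "retain" semantics by blocking early deletion, as described above.

    # Sketch of method 13600: evaluate retention/destruction criteria.
    from datetime import date, timedelta

    RETENTION_YEARS = {"tax_return": 5}   # "destroy tax returns after five years"

    def items_to_destroy(items, today=None):
        today = today or date.today()
        doomed = []
        for item in items:
            years = RETENTION_YEARS.get(item["type"])
            if years and item["created"] < today - timedelta(days=365 * years):
                doomed.append(item)
        return doomed

    items = [
        {"name": "2008 tax return", "type": "tax_return", "created": date(2009, 4, 15)},
        {"name": "2013 tax return", "type": "tax_return", "created": date(2014, 4, 15)},
    ]
    for item in items_to_destroy(items, today=date(2015, 1, 1)):
        print("destroy:", item["name"])   # only the 2008 return satisfies the criteria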
[0451] The U-Me system also provides a licensed content transfer mechanism 168 in FIG. 5. A suitable example of the licensed content transfer mechanism 168 is shown as licensed content transfer mechanism 13800 in FIG. 138, which lists possible transfers of licensed content, including gift, sale, inheritance, and other transfers. Because licensed content in the U-Me system is preferably licensed to a user and not to any physical device, the licensed content becomes digital personal property that can be transferred by gift, sale, inheritance, etc. Thus, let's assume a user buys a perpetual license for a song, which is then downloaded to the user's U-Me account. Let's assume the user tires of the song and wants to sell the song to someone else. In method 13900 in FIG. 139, the U-Me system receives a request from User1 to transfer licensed content to User2 (step 13910). The U-Me system then transfers the licensed content from User1's U-Me account to User2's U-Me account (step 13920). The U-Me system can include appropriate controls to verify User1's license to the licensed content before the transfer, and to transfer the license with the licensed content to User2. Thus, a user who tires of a song could list the song for sale on eBay, and when the song sells to another U-Me user, the seller could request the transfer of the song to the buyer, which is then carried out by the U-Me system.
[0452] Note the second criterion in the Retention/Destruction policy 13700 shown in FIG. 137 specifies to delete specified data upon my death. The Retention/Destruction mechanism 170 thus allows a user to specify certain information that will be destroyed when the U-Me system receives proof of the user's death. For example, if a user is single, upon the user's death, the user's tax returns are no longer relevant, and can be automatically destroyed. One can imagine many other scenarios where automatic destruction of data upon a user's death would be desirable.
[0453] As described above, the U-Me system supports the concept of digital estate planning. This means for content that is licensed to the user in perpetuity, the user can define transfer-on-death rules. A method 14000 is shown in FIG. 140. The U-Me system receives proof of the death of User1 (step 14010). The U-Me system then reads the transfer-on-death rules the user may have defined (step 14020). The U-Me system then transfers User1's licensed content to one or more other U-Me users according to User1's transfer-on-death rules (step 14030). In this manner, content for which the user has a perpetual license may be automatically transferred to one or more different users upon the user's death.

[0454] Another feature of the U-Me system is the ability to audit the licensed content for a user. FIG. 141 shows a suitable method 14100 for performing an audit. The U-Me system reads the licensed content in the user's U-Me account (step 14110). When the user has a license for all of the user's licensed content (step 14120=YES), method 14100 reports the user has a license for all the user's licensed content (step 14130). When the user does not have a license for all of the user's licensed content (step 14120=NO), the user is prompted to acquire the missing licenses for the licensed content (step 14140). Note the display provided to the user in step 14140 may include controls to delete licensed content for which the user does not have a license, may include controls to acquire a license, and may allow the user to put off acquiring the needed license(s) for some period of time.
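The audit of method 14100 can be illustrated with a short sketch. The record fields below are assumptions for illustration; a real account would carry full license entries as described for the license management mechanism.

    # Sketch of method 14100: verify a license exists for every piece of
    # licensed content in the account.
    def audit_licenses(account):
        missing = [c["title"] for c in account.get("licensed_content", [])
                   if not c.get("license")]
        if not missing:
            print("User has a license for all licensed content.")
        else:
            print("Missing licenses for:", ", ".join(missing))
        return missing

    account = {"licensed_content": [
        {"title": "Song A", "license": "perpetual-0001"},
        {"title": "Movie B", "license": None},   # content awaiting proof of license
    ]}
    audit_licenses(account)   # reports a missing license for "Movie B"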
[0455] When the user selects to put off acquiring the needed license(s) for licensed content in the user's U-Me account, the U-Me system can enforce the license audit with a deadline. Referring to FIG. 142, in a method 14200 the U-Me system provides to the user a deadline for acquiring the missing license(s) for licensed content (step 14210). As long as the deadline has not arrived (step 14220=NO), method 14200 waits until the deadline arrives (step 14220=YES). Once the deadline has arrived, any unlicensed content in the user's U-Me account is deleted (step 14230).
[0456] The U-Me system includes a sub-account mechanism 190 that allows a user to set up sub-accounts to the user's U-Me account. FIG. 143 shows a U-Me sub-account mechanism 14300 that is one suitable example of the sub-account mechanism 190 shown in FIG. 5, which includes one or more master accounts, one or more sub-accounts, access controls, and a social media monitor. In one possible scenario, a Mom and Dad are set up as master accounts, and their kids are set up as sub-accounts. The users who have master accounts may define access controls for the sub-accounts, and may further define parameters for the social media monitor. In this manner, parents can control how the kids use their U-Me sub-accounts. This same scenario could be used in a classroom setting, where the teacher has a master account and the students all have sub-accounts. Note the sub-account mechanism can define an account as a sub-account to establish a relationship with the master account without limiting access of the user in the user's account. Thus, if an adult is taking night classes, and the student and the teacher both have U-Me accounts, the sub-account mechanism can be used to establish a relationship between the U-Me users. Thus, the teacher could post homework assignments to her U-Me account, and the homework assignments could then be made available to all sub-accounts.
[0457] Method 14400 in FIG. 144 starts by defining one or more master accounts (step 14410). For each master account, one or more sub-accounts may be defined (step 14420). Access controls may be defined for the sub-accounts (step 14430). In addition, social media activity of the sub-accounts may be reported to the master account(s) (step 14440). The sub-account concept is a powerful tool for creating relationships between U-Me users and for potentially defining how users can access and use the U-Me system.
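A minimal sketch of the structures behind method 14400 follows. The control names (content_filter, social_media_report) are hypothetical illustrations of the access controls and social media monitor parameters described above.

    # Sketch of FIGS. 143-144: master accounts, sub-accounts, access controls.
    master = {"name": "Mom", "sub_accounts": []}

    def add_sub_account(master, name, access_controls):
        sub = {"name": name, "master": master["name"],
               "access_controls": access_controls}
        master["sub_accounts"].append(sub)
        return sub

    add_sub_account(master, "Billy", {"content_filter": "child",
                                      "social_media_report": True})
    add_sub_account(master, "Sally", {"content_filter": "teen",
                                      "social_media_report": True})

    # The social media monitor reports activity for every sub-account whose
    # access controls request it (step 14440).
    for sub in master["sub_accounts"]:
        if sub["access_controls"]["social_media_report"]:
            print(f"Report {sub['name']}'s social media activity to {sub['master']}")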
[0458] The U-Me system also includes a credit card monitoring mechanism 192 shown in FIG. 5. The credit card monitoring mechanism 192 preferably monitors when a user makes a purchase with a credit card online, and creates a log of all websites where the user used each credit card. Referring to FIG. 145, in method 14500 the U-Me system detects when a user enters credit card info on a web page (step 14510). Method 14500 then confirms that the user completed the purchase (step 14520). The website for the web page where the user made the purchase is determined (step 14530). When there is an existing entry in the credit card log for this credit card (step 14540=YES), the website is added to the entry (step 14550). When there is no existing entry for this credit card in the credit card log (step 14540=NO), an entry is created in the credit card log for this credit card (step 14560), and the website is added to the entry (step 14550).
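The log update of steps 14530-14560 can be sketched in a few lines of Python. The dictionary shape is a hypothetical stand-in for the log of FIG. 146.

    # Sketch: log the website against the credit card used there.
    def log_card_use(card_log, card_number, website):
        entry = card_log.get(card_number)
        if entry is None:                       # step 14560: create a new entry
            entry = card_log[card_number] = {"websites": []}
        if website not in entry["websites"]:    # step 14550: add the website
            entry["websites"].append(website)

    card_log = {}
    log_card_use(card_log, "4111-xxxx-xxxx-1111", "amazon.com")
    log_card_use(card_log, "4111-xxxx-xxxx-1111", "paypal.com")
    print(card_log)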
[0459] One suitable example of a credit card log is shown at 14600 in FIG. 146 to include entries for each credit card that specify the credit card name, credit card number, and expiration date, with a list of websites where the credit card was used. Log 14600 in FIG. 146 shows entries 14610, 14620 and 14630 for three credit cards, respectively, with each entry including a list of websites where the credit card was used.
[0460] Referring to FIG. 147, method 14700 waits until a credit card in the credit card log is about to expire (step 14710=YES). Note that "about to expire" can be defined in any suitable way, such as one month before expiration. The user is prompted that the credit card is about to expire (step 14720). When the user selects to view the websites where the credit card was used (step 14730=YES), method 14700 displays to the user the websites where the credit card was used (step 14740). Referring to FIG. 148, the user views the display of websites where the credit card was used (step 14810). The user selects one of the websites (step 14820). The user updates the credit card info on the selected website (step 14830). When there are more displayed websites to process (step 14840=YES), method 14800 loops back to step 14820 until there are no more displayed websites to process (step 14840=NO).
[0461] The credit card monitoring mechanism in the U-Me system allows the user to update the credit card information on some or all of the websites where the credit card was used before the credit card expires. If the use of the credit card was for a one-time purchase at a website, the user will probably not want to update the credit card information for that website. But when the use was for a website the user uses often (e.g., PayPal, Amazon.com), or when the use is for a recurring bill (e.g., electric bill, phone bill), being prompted that a credit card is about to expire can provide important benefits, such as making sure a bill payment is not missed due to an expired credit card.
[0462] The U-Me system also includes a macro/script mechanism 172 shown in FIG. 5. One suitable example of the macro/script mechanism 172 is shown at 14900 in FIG. 149 to include a user interface monitor, a macro record mechanism, a script record mechanism, a macro playback mechanism, a script playback mechanism, and scheduled tasks. Referring to FIG. 150, method 15000 starts the user interface monitor (step 15010). The user interactions are monitored (step 15020). The user interface monitor is stopped (step 15030). A macro or script is then generated that performs the monitored user interactions (step 15040). For example, let's assume a U-Me user receives their bank statements from Wells Fargo via the user's online account with Wells Fargo. The user needs to retrieve the bank statement each month. The user could start the user interface monitor in step 15010, then perform all of the actions to retrieve and store the bank statement to the user's account, which might include the steps of: the user clicks on a bookmark for the Wells Fargo website, which brings up a login page; the user enters the username in the username field; the user enters the password in the password field; the user selects a "login" button; the user selects the account of interest; the user clicks on the link to retrieve this month's bank statement for the account of interest; the bank statement is displayed in a separate window as a .pdf file; the user then saves the .pdf file to the user's
U-Me account. At this point the user could stop the user interface monitor. A macro or script could then be generated that could retrieve each month's bank statement automatically from the Wells Fargo website and store the bank statement in the user's U-Me account. Note the terms "macro" and "script" are used herein as general terms to denote repeating steps that the user interface monitor saw the user perform, and are not limited to any particular program, program type, or technology.
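One way to picture the record/playback split of FIG. 149 is as a list of recorded steps and a routine that replays them. The step names and arguments below are generic placeholders, not tied to any real automation library or to the Wells Fargo site.

    # Sketch: recorded user actions stored as data, replayed later.
    recorded_steps = [
        ("open_url", "https://www.wellsfargo.com"),
        ("fill", "username", "<stored username>"),
        ("fill", "password", "<stored password>"),
        ("click", "login"),
        ("click", "account_of_interest"),
        ("click", "current_statement_link"),
        ("save_pdf", "U-Me:/financial/bank_statements/"),
    ]

    def playback(steps, perform):
        """Replay recorded steps; `perform` carries out one step."""
        for step in steps:
            perform(*step)

    # A real playback mechanism would drive a browser; here we just print.
    playback(recorded_steps, lambda action, *args: print(action, args))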
[0463] Once a macro or script has been defined, a task may be scheduled using that macro or script, as shown in method 15100 in FIG. 151. A macro or script is selected (step 15110). One or more times are scheduled to run the macro or script (step 15120). Thus, for the bank statement example above, assuming the bank statement is available by the 3rd of each month, the user could use method 15100 to schedule the "Wells Fargo Bank Statement" macro or script to run on the 4th of each month.
[0464] A method 15200 in FIG. 152 represents a particular example for the case of automatically retrieving a bank statement as discussed above. A script is defined for retrieving an online bank statement (step 15210). The defined script is then run on the 4th of each month, which retrieves the bank statement for that month and stores the bank statement to the user's U-Me account (step 15220).
[0465] Note the macro/script mechanism 172 in the U-Me system can have more capabilities than merely repeating a user's functions. A simple example will illustrate. Let's assume the user retrieves the bank statement for July 2013 on August 10th in defining the script in step 15210 in FIG. 152, and schedules the macro or script to run on the 4th of each month. When the macro/script mechanism 172 goes to the user's online Wells Fargo account to retrieve the August bank statement on September 4th, the macro/script mechanism has to have the intelligence to retrieve the August bank statement, not simply retrieve the July bank statement again. The script/macro mechanism may thus include intelligence that allows detecting patterns and learning from those patterns. Thus, if the bank statements for 2013 are displayed on the website in a list that includes links called "Statement 01/31/2013", "Statement 02/28/2013", etc. through "Statement 07/31/2013", the macro/script mechanism can recognize the pattern and know to retrieve the statement corresponding to the month previous to the month when the macro/script is run. In the alternative, the user may provide additional input after the macro/script is recorded to direct how the macro/script performs its functions. The macro/script mechanism allows automating routine tasks so the U-Me system can perform these routine tasks automatically for the user.
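The pattern intelligence described above might, in its simplest form, reduce to matching the link labels against a date pattern and selecting the previous month. The following sketch assumes the label format given in the example; real pages would need more robust pattern learning.

    # Sketch: pick the statement link for the month before the run date.
    import re
    from datetime import date

    def pick_statement(link_labels, run_date):
        target_month = (run_date.month - 1) or 12        # wraps to December
        target_year = run_date.year if run_date.month > 1 else run_date.year - 1
        for label in link_labels:
            m = re.match(r"Statement (\d{2})/(\d{2})/(\d{4})", label)
            if m and int(m.group(1)) == target_month and int(m.group(3)) == target_year:
                return label
        return None

    labels = [f"Statement {mm:02d}/28/2013" for mm in range(1, 9)]
    print(pick_statement(labels, date(2013, 9, 4)))   # -> "Statement 08/28/2013"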
[0466] Referring to FIG. 153, a method 15300 shows how the user may access the user's data and/or licensed content and/or settings that are stored in the user's U-Me account. The user authenticates to the U-Me system (step 15310). The user identifies a location that is U-Me certified (step 15320). The U-Me system reads the location settings and compares the location settings with the user settings (step 15330). When conversion is needed (step 15340=YES), the conversion mechanism in the U-Me system converts the user settings to suitable location settings (step 15350). The conversion of settings is preferably performed by the conversion mechanism 160 shown in FIG. 5. The user settings that correspond to the location are then downloaded to devices at the location (step 15360). When no conversion is needed (step 15340=NO), the user settings in the user's U-Me account can be downloaded to devices at the location (step 15360). Method 15300 could be representative of any suitable location, including a vehicle, a hotel room, a rental condo, etc.
[0467] As discussed in detail above, one of the advantages of the U-Me system is not requiring the user to specify a directory/subdirectory/filename for each file that is saved in the user's U-Me account, which the user would then have to remember to retrieve the file. The data tracker 162 in FIG. 5 tracks data changes and generates indexing information that is stored with a file to help retrieve the file using a search engine. The data search engine 164 in the U-Me system allows formulating very powerful queries using drop-down lists, dates or date ranges, key words, dollar amounts, devices, etc. in a very intuitive, plain-English type of interface. A U-Me user thus does not have to be a database expert who is familiar with Structured Query Language (SQL) in order to use the data search engine 164. Some sample queries that could be submitted via the data search engine 164 are shown at 15400 in FIG. 154 to include: purchases over $100 this year; phones I have owned since 06/12/2012; electronic devices I currently own; total charges on credit card XYZ for the second quarter of this year; and warranties that will expire in the next six months. These examples in FIG. 154 show the data search engine 164 supports complex queries the user may formulate without being an expert at formulating database queries.
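One way such a plain-English query might be represented internally, once the user has built it from drop-down lists and date ranges, is as a structured filter over the indexed records. The record fields below are hypothetical; real indexing info generated by the data tracker 162 would be richer.

    # Sketch: "purchases over $100 this year" as a structured query.
    records = [
        {"type": "purchase", "amount": 250.00, "date": "2014-03-02"},
        {"type": "purchase", "amount": 40.00, "date": "2014-05-11"},
        {"type": "purchase", "amount": 120.00, "date": "2013-12-20"},
    ]

    query = {"type": "purchase", "min_amount": 100.00, "year": "2014"}

    def run_query(records, q):
        return [r for r in records
                if r["type"] == q["type"]
                and r["amount"] > q["min_amount"]
                and r["date"].startswith(q["year"])]

    for r in run_query(records, query):
        print(r)   # only the $250 purchase from 2014 matches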
[0468] As discussed above, in the most preferred implementation, licensed content is licensed to a person and not to any particular device. In some circumstances there needs to be separation between ownership of the software and who is licensed to use the software. This is especially true in a company or corporate environment, where a company or corporation purchases licensed content for use by its employees.
Referring to FIG. 155, a company purchases licensed content (step 15510). An administrator within the company then specifies a person who is the licensee for the licensed content (step 15520). The licensee is then licensed to use the licensed content (step 15530). What happens if the licensee quits or is fired? This is shown in method 15600 in FIG. 156. The person who is the licensee quits working for the company (step 15610). The administrator within the company deletes the person from the licensing info for the licensed content (step 15620). The person is no longer authorized to use the licensed content (step 15630). These methods in FIGS. 155 and 156 discuss an employee being the licensee. However, the company could be the licensee, with the person authorized to use the software in the licensing info for the licensed content.
Regardless of the labels applied, the concept is the same: the company owns the licensed content, and can authorize a person to use the licensed content, and can also remove authorization for the person to use the licensed content. This is especially handy when a company purchases software with a site license or with a license that covers many users.

[0469] The world envisioned by a fully-functioning Universal Me system as described in detail above is much different than the world in which we currently live. Implementing the U-Me system will take many years. One aspect of evolving towards the U-Me system is the need to put existing information that does not exist in an electronic format into the user's U-Me account. Method 15700 in FIG. 157 shows how existing physical items can be converted to electronic format and stored in a user's U-Me account. An electronic file for a physical item is created (step 15710). This can be any suitable physical item that is converted to any suitable electronic file format. For example, a hard copy medical record could be scanned to an electronic image file. A DVD of a movie the user owns could be ripped to a .mov file. Music from a CD the user owns could be ripped to a .mp3 file. Hard copy photos could be scanned to an electronic image file. The electronic file is then uploaded to the user's U-Me account with user-specified indexing info (step 15720). The user may then access the electronic file in the user's U-Me account (step 15730).
[0470] Because there will be great demand for services to put physical items into a user's U-Me account, method 15700 could be performed by a third-party provider who specializes in migrating physical items to the U-Me system. For example, a user may become a U-Me member, and may use the system initially to store photos the user takes from a smart phone. Once the user becomes convinced of the value in making all of the user's information available in the user's U-Me account, the user could hire a third-party U-Me migration specialist to help migrate the user's info to the user's U-Me account. Thus, the user could take a box with photos, movies, CDs, medical records, tax returns, etc. to the U-Me migration specialist, who would have the user specify indexing info for each type of item being stored. The user can decide how much or how little indexing info to provide initially, because the user will always have the option to pull up the information later and add additional indexing info.
[0471 ] An example of a virtual machine is shown at 11600 in FIG. 116. Those skilled in the art of cloud computing will recognize that cloud resources are deployed on virtual machines. Virtual machines can be run on logically-partitioned computer systems. Thus, a single server computer can provide many different virtual machines to different users.
[0472] When a virtual machine runs the U-Me system for a particular user, as shown by way of example in FIG. 116, the virtual machine provides all the needed memory, hard disk space and processor power the user requires. Note, however, that keeping a virtual machine dedicated to a particular user always running would be a significant commitment of resources that is not needed if the virtual machine can be created when the user needs to access the U-Me system, and automatically destroyed once the user no longer needs to access the
U-Me system.
[0473] Referring to FIG. 158, a virtual machine image 15800 is shown to include U-Me system info 15810, which includes virtual machine provisioning info 15820, and a U-Me generic user shell 15830. A virtual machine image can be a disk image that, once installed on a physical machine, can be executed. We now assume the virtual machine image 15800 in FIG. 158 is provisioned, or instantiated, on a physical computer system, which results in a running virtual machine 15900 as shown in FIG. 159 that is running in the U-Me system. Note the U-Me system components 15910 are running and a U-Me generic user shell 15920 is running. However, the U-Me generic user shell is not specific to any particular user, because the U-Me user shell only becomes user-specific once a U-Me user descriptor file is written to the U-Me generic user shell 15920. Such an example is shown in FIG. 160, where the U-Me user descriptor file 16020 has been installed in the U-Me generic user shell to create the U-Me user-specific components 16010 for a particular user. This makes the virtual machine 16000 user-specific. The virtual machine 16000 can then provide all the functions the user requires from the U-Me system. Of course, the virtual machine will not necessarily perform all U-Me system functions. In the most preferred implementation, many of the U-Me system functions will be available on other virtual machines that are always running to service requests from many users simultaneously.
[0474] There is a significant advantage in separating the U-Me system components from the information that makes the virtual machine user-specific. The virtual machine 15900 in FIG. 159 is running, and can be easily deployed to serve a user by writing the user's U-Me descriptor file to the U-Me user shell. The time to customize a running virtual machine to a specific user will be a small fraction of the time to provision or instantiate a virtual machine from a virtual machine image. The U-Me system can thus monitor user demand, and in anticipation of a spike in usage, the U-Me system can provision many user-generic virtual machines that are ready and waiting for the U-Me descriptor file to make these virtual machines user-specific. Once the spike in demand hits, the U-Me system can satisfy the demand by configuring each running virtual machine for a particular user. Because the user-generic virtual machines are already up and running, the time to configure a running virtual machine with the user's info will be much less than the time it would take to instantiate a virtual machine from a virtual machine image.
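The pool-then-personalize pattern described above can be sketched briefly. The class and field names are hypothetical; real provisioning would go through a cloud orchestration layer.

    # Sketch: user-generic VMs wait in a pool for a user descriptor file.
    import queue

    class GenericVM:
        def __init__(self, vm_id):
            self.vm_id, self.user = vm_id, None
        def personalize(self, descriptor):
            """Write the user descriptor into the generic user shell."""
            self.user = descriptor["user_id"]

    vm_pool = queue.Queue()
    for i in range(3):            # provisioned ahead of an anticipated spike
        vm_pool.put(GenericVM(i))

    def serve_user(descriptor):
        vm = vm_pool.get()          # fast: the VM is already up and running
        vm.personalize(descriptor)  # much faster than instantiating an image
        return vm

    vm = serve_user({"user_id": "jim.jones"})
    print(vm.vm_id, vm.user)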
[0475] Method 11300 in FIG. 113 was discussed above. An example is now presented to illustrate specifics of how method 11300 may perform some of its steps. We assume a virtual phone 16100 includes many items copied from a physical phone, such as operating system specifications 16110, operating system settings 16120, apps installed 16130, apps settings 16140, and user settings 16150. Virtual phone 16100 also includes a list of items not copied to the virtual phone 16160. A U-Me app running on the physical phone can preferably detect which items in the physical phone cannot be copied to the virtual phone, and provide a list of those items to the virtual phone so the virtual phone knows what items on a physical phone do not exist in the corresponding virtual phone.
[0476] Referring to FIG. 162, a virtual phone 16200 is shown as an example of a virtual phone that could exist for a user's Samsung Galaxy S3 phone. Virtual phone 16200 includes operating system specs 16210 that specify the phone is running the Android operating system, version 4.1.2. The operating system settings 16220 include all settings for the Android 4.1.2 operating system that can be copied to the virtual phone 16200. The list of apps installed 16230 shows apps labeled A, B, C, D, H, I, J, K, L and M. The settings for installed apps 16240 include all settings for all apps that can be copied to the virtual phone 16200. The user settings 16250 include all of the user settings on the physical phone. The items not copied to virtual phone 16260 lists the operating system 16262, App B 16264, App D 16266, and App J 16268. It is assumed these apps have features that prevent copying all of their information to the virtual phone in the user's U-Me account (i.e., are not U-Me certified). Based on the items that could not be copied to the virtual phone 16260, the user may be prompted with steps to take before the U-Me system can configure the phone. For example, let's assume the user flushes his phone down a toilet, purchases a new phone, and installs the U-Me app. The user will log in to his U-Me account, then register the new phone as one of the user's devices. The user will select to configure the new device as a clone from the stored virtual device. But there still may be some missing configuration info for the new device (step 11450 in FIG. 114). At this point, the user is prompted for the missing configuration info (step 11460). The prompt to the user in step 11460 could be as shown at 16300 in FIG. 163. The instructions to the user on how to set up the new Samsung Galaxy S3 phone include a list of steps the user must perform before the U-Me system can install all of the information from the user's virtual phone. First, the user makes sure Android 4.1.2 is installed. If an earlier release is installed, the user updates the operating system to Android 4.1.2. Once the correct operating system is on the phone, the user installs Apps B, D and J manually. Once the operating system is up to date and apps B, D and J have been installed, the user can select the Continue Setup button 16310, which will cause the U-Me system to copy all of the information in the virtual phone 16200 to the physical phone. While apps B, D and J had to be manually installed, the settings for the apps could still be in the settings for installed apps 16240 in the virtual phone 16200. Installing these settings by the U-Me system after the apps are manually installed by the user can result in the apps being configured exactly as they were on the user's previous phone, even though the apps themselves could not be copied to the virtual phone 16200.
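A minimal sketch of how the prompt of FIG. 163 might be derived from the virtual phone's "items not copied" list follows. The data layout and message wording are assumptions for illustration.

    # Sketch: build manual-setup instructions from the virtual phone record.
    virtual_phone = {
        "os": ("Android", "4.1.2"),
        "apps": ["A", "B", "C", "D", "H", "I", "J", "K", "L", "M"],
        "not_copied": ["operating system", "App B", "App D", "App J"],
    }

    def setup_instructions(virtual_phone, installed_os_version):
        steps = []
        os_name, required = virtual_phone["os"]
        if installed_os_version != required:
            steps.append(f"Update {os_name} to {required}.")
        for item in virtual_phone["not_copied"]:
            if item.startswith("App"):
                steps.append(f"Manually install {item}.")
        steps.append("Select Continue Setup to copy the remaining virtual phone info.")
        return steps

    for step in setup_instructions(virtual_phone, "4.0.4"):
        print(step)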
[0477] As discussed above, the widespread acceptance of digital photography has been accompanied by a corresponding widespread problem of most users having thousands of digital photos that are stored using cryptic names in many different directories or folders on their computer systems, making retrieval of photographs difficult. The U-Me system provides an improved way to manage photos, including photos that originated from a digital camera or other digital device, along with hard copy photos that have been digitized for electronic storage. The U-Me system improves over the known art of software that adds metadata to photos by providing a people-centric approach to managing photos, as described in detail below. The methods discussed with respect to FIGS. 164-210 are preferably performed by the photo mechanism 182 shown in FIG. 5.
[0478] Referring to FIG. 164, a method 16400 generates and stores indexing information for a photo. A user defines people and relationships in the U-Me system (step 16410). The U-Me system derives relationships from the user-defined relationships (step 16420). The user may also define one or more locations (step 16430). The U-Me system may also provide defined locations (step 16440). The user may also define one or more events (step 16450). The U-Me system derives events from the user-defined events
(step 16460). The U-Me system then generates indexing info for a photo based on any or all of the user-defined relationships, system-derived relationships, user-defined locations, system-defined locations, user-defined events, and system-derived events (step 16470).
[0479] The U-Me system includes a photo system data entry screen for people, such as screen 16510 shown in FIG. 165 by way of example. The photo system data entry screen 16510, like all of the U-Me system, is person-centric. Thus, when a user decides to have the U-Me system manage the user's photos, the user starts by entering data for a particular person in the photo system data entry screen 16510. Fields in the photo system data entry screen 16510 include Name, Preferred Name, Birth Date and Camera. The user can provide a sample photo of the person's face at 1480 to help train the facial recognition engine in the U-Me photo system. Note the Camera field includes an Add button 16570 that allows the user to enter all cameras the user uses to take digital photos. The data entry screen for people 16510 shown in FIG. 165 includes a People button 16520, a Locations button 16530, an Events button 16540, a Save button 16550, and a Cancel button 16560.
[0480] FIG. 166 shows the data entry screen for people 16510 after a user has entered information into the data entry screen. We assume the user is Jim Jones, and screen 16510 in FIG. 166 shows the pertinent information relating to Jim Jones, including a Preferred Name of Jimmy, a Gender of Male, a Birth Date of
08/03/1957, a Nikon Coolpix S01 Camera, and a photo 16680 showing Jim's face. Once the Save button 16550 is selected, an entry into the user's people database will be created for Jim Jones with the info shown in FIG. 166.
[0481] After selecting the Save button 16550 in FIG. 166, we assume the user selects the People button 16520, which results in a different photo entry screen 16710 in FIG. 167 being displayed to the user. Data entry screen 16710 allows entering relationships for Jim Jones. In the specific example shown in FIG. 167, the user can enter family relationships. Thus, Jim Jones adds all the information relating to members of his family, as shown in screen 16710 in FIG. 168. We assume for this example Jim Jones has a son named Billy Jones with a preferred name of Bubba who is Jim's son by birth, a daughter named Sally Jones with a preferred name of Sally who is Jim's stepdaughter, a wife named Pat who is Jim's current wife, a father named Fred Jones with a preferred name of Dad Jones who is Jim's birth father, and a mother named Nelma Pierce with a preferred name of Mom Jones who is Jim's birth mother. After entering the information shown in screen 16710 in FIG. 168, the user can select the Save button 16550, which results in saving all of the people as people in the user's U-Me database, and which results in saving all the relationships relating to these people. Note that entering information about a spouse can include a type of spouse, such as Current,
Ex and Deceased, as shown in FIG. 169, along with a wedding date. Once a user enters any family members into the data entry screen 16710 shown in FIG. 168, a Display Family Tree button 16810 is displayed that, when selected, will display a family tree of all the relationships for the user. Note that once a person is entered into the user's People database, the user can enter more information for that person by invoking the data entry screens 16510 and 16710. [0482] The initial entry of photo system data for all the people in a user's immediate and extended family may take some time, but once this work is done the U-Me system can use this data in many ways that allow easily storing photos to and easily retrieving photos from the user's U-Me account. In addition, this data relating to people can be shared with others, thus allowing a first user to provide a significant shortcut to a second user who is invited to share the first user's photos as well as people, locations, events, etc.
[0483] The U-Me system is intelligent enough to derive many relationships that are not explicitly specified by a user. For example, FIG. 170 shows user-defined relationships can include son, daughter, father, mother, brother, sister, stepson, stepdaughter, stepfather, stepmother, boss, manager, employee, co-worker, and others. Examples of system-derived relationships in FIG. 170 include grandson, granddaughter, grandpa, grandma, uncle, aunt, nephew, niece, son-in-law, daughter-in-law, mother-in-law, father-in-law, great-grandson, great-granddaughter, great-grandpa, great-grandma, great-uncle, great-aunt, great-nephew, great-niece and others. Note that all of the relationships shown in FIG. 170 are for illustration only, and are not limiting. Other user-defined relationships and system-derived relationships not shown in FIG. 170 are within the scope of the disclosure and claims herein. For example, the system could derive any suitable relationship, such as second cousin twice removed, third-level employee, etc.
[0484] Referring to FIG. 171, as entries are made into the photo system (e.g., as shown in FIGS. 166 and 168), method 17100 monitors the photo system data entry (step 17110) and constructs relationships from the photo system data entry (step 17120). People naturally think along the lines of family relationships and other relationships between people. While known software for adding metadata to a photo allows adding name labels such as "Katie" and performing facial recognition, these labels have no meaning within the context of other people in the photos. The U-Me system, in contrast, constructs relationships between people, such as family relationships, that allow storing and retrieving photos much more effectively than in the prior art.
[0485] FIG. 172 shows a display of a family tree that could be displayed when the user clicks on the Display Family Tree button 16810 after saving the information in the data entry screen 16710 as shown in
FIG. 168. Note there is a small "s" next to Sally Jones' name to indicate she is a stepdaughter of Jim, not a daughter by birth. The user-defined relationships for Jim Jones specified in FIG. 168 are shown in FIG. 173. Now we assume the user selects Billy Jones in the user's People database, and enters information in the data entry screen 16710 shown in FIG. 167 that indicates Billy married Jenny Black and has a son by birth named Todd Jones. This additional information is dynamically added to the family tree as it is entered. The resulting family tree is shown in FIG. 174.
[0486] FIG. 175 shows the addition of Jenny Black and Todd Jones results in the creation of some system-derived relationships. The U-Me system recognizes that the wife of a son is a daughter-in-law, and thus derives from the fact that Jenny Black is listed as Billy Jones' wife that Jenny Black is the daughter-in-law of Jim Jones. In similar fashion, the U-Me system recognizes that the son of a son is a grandson, and thus derives from the fact that Todd Jones is listed as Billy Jones' son that Todd Jones is the grandson of Jim Jones. As the family tree is expanded by adding more user-defined relationships, the U-Me system will monitor the additions and dynamically create more system-derived relationships that are derived from the user-defined relationships.
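The derivation described above amounts to composing user-defined links according to simple rules. The following sketch, with a deliberately tiny rule set, illustrates the idea; a full implementation would cover all of the relationships of FIG. 170.

    # Sketch: derive new relationships by composing user-defined ones.
    DERIVATION_RULES = {
        ("son", "wife"): "daughter-in-law",
        ("son", "son"): "grandson",
        ("son", "daughter"): "granddaughter",
    }

    # user-defined relationships: (person, relation, relative)
    facts = [("Jim Jones", "son", "Billy Jones"),
             ("Billy Jones", "wife", "Jenny Black"),
             ("Billy Jones", "son", "Todd Jones")]

    def derive(facts):
        derived = []
        for p1, r1, p2 in facts:
            for q1, r2, q2 in facts:
                if p2 == q1 and (r1, r2) in DERIVATION_RULES:
                    derived.append((p1, DERIVATION_RULES[(r1, r2)], q2))
        return derived

    print(derive(facts))
    # [('Jim Jones', 'daughter-in-law', 'Jenny Black'),
    #  ('Jim Jones', 'grandson', 'Todd Jones')]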
[0487] In addition to defining people and relationships, a user can also define locations. For example, we assume selecting the Locations button 16530 in FIGS. 165- 168 results in the display of a locations data entry page, of which an example 17610 is shown in FIG. 176. The data entry screen for locations 17610 may include a button 17620 to enter a new location or a button 17630 to add a new location using a phone app. We assume the user selects the Enter a New Location button 17620, which could result, for example, in the display of the data entry screen 17710 shown in FIG. 177. In FIG. 177, the location is named Jim & Pat's House, the street address is 21354 Dogwood, the city is Carthage, the state is Missouri (postal abbreviation of MO), and the ZIP code is 64836. When the user enters the location information in the Street, City, State, and ZIP fields, the U-Me system computes GPS coordinates for that location, and stores those GPS coordinates at 17720 relating to the address of the person whose information appears on the screen 17710.
[0488] The U-Me system includes the capability of allowing a user to define any suitable location using the U-Me app on the user's smart phone or other mobile device, such as when the user selects button 17630 in FIGS. 176 or 177. Method 17800 in FIG. 178 shows an example of a method for a user to define a location using the user's smart phone. The user invokes the U-Me app on the user's smart phone, then activates a "location definition" mode on the U-Me app (step 17810). The user then selects the "Begin" button, which causes the current location of the smart phone to be stored as the beginning boundary point (step 17820). The user then travels to the next boundary point (step 17830), and selects "store" on the U-Me app (step 17840) to store the current location of the smart phone as the next boundary point. When the current boundary point is not the beginning point (step 17850=NO), method 17800 loops back to step 17830, and the user continues to enter boundary points until the user is back to the beginning (step 17850=YES). The boundary points are then connected (step 17860), preferably using straight lines in connect-the-dots fashion. A location is then defined from the region enclosed by the boundary points (step 17870). The coordinates for the region are then sent from the U-Me app to the U-Me system, thereby defining a location for the user in the user's U-Me account. In one particular implementation, the coordinates are GPS coordinates, but any suitable location coordinates could be used.
[0489] FIG. 179 illustrates examples of how method 17800 could be used by a user to define locations in the user's U-Me system. We assume for this example that Jim & Pat's house is in a rural area on 40 acres of land that has an irregular shape. We assume Jim uses the U-Me app on his smart phone, walks to a corner of his property shown at points 1,6 in FIG. 179, activates the "location definition" mode on the U-Me app (step 17810), and selects Begin on the U-Me app (step 17820), which stores point 1 as the Begin point. Jim then walks to point 2 (step 17830), the next corner of his property, and selects Store on the U-Me app to store point 2 as the next boundary point (step 17840). Point 2 is not back to the beginning point (step
17850=NO), so Jim walks to point 3 in FIG. 179 (step 17830) and selects Store on the U-Me app to store point 3 as the next boundary point (step 17840). Point 3 is not back to the beginning point (step
17850=NO), so Jim walks to point 4 in FIG. 179 (step 17830) and selects Store on the U-Me app to store point 4 as the next boundary point (step 17840). Point 4 is not back to the beginning point (step
17850=NO), so Jim walks to point 5 in FIG. 179 (step 17830) and selects Store on the U-Me app to store point 5 as the next boundary point (step 17840). Point 5 is not back to the beginning point (step
17850=NO), so Jim walks to point 6 in FIG. 179 (step 17830) and selects Store on the U-Me app to store point 6 as the next boundary point (step 17840). Because point 6 is the same point as point 1, which was the beginning, the U-Me app recognizes the user is back to the beginning point (step 17850=YES). The U-Me app connects the boundary points (step 17860), and defines a location from the connected boundary points (step 17870). The geographical coordinates for this location can then be sent to the user's U-Me account, and the user can then name the location. We assume for this example the location 17920 shown in FIG. 179 that was defined by the user is named "Jim & Pat's Property."
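The result of method 17800 is a polygon of boundary points, and later photo geocodes can be tested against it. The following sketch uses a standard ray-casting containment test; the coordinates are made up for illustration.

    # Sketch: a walked boundary defines a polygon; test whether a geocode is inside.
    def point_in_polygon(lat, lon, polygon):
        """Ray-casting containment test for a polygon of (lat, lon) vertices."""
        inside = False
        j = len(polygon) - 1
        for i in range(len(polygon)):
            (lat_i, lon_i), (lat_j, lon_j) = polygon[i], polygon[j]
            if ((lon_i > lon) != (lon_j > lon)) and \
               lat < (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i:
                inside = not inside
            j = i
        return inside

    # Boundary points 1-5 stored while walking the property (point 6 closes it).
    jims_property = [(37.10, -94.30), (37.10, -94.28), (37.08, -94.28),
                     (37.07, -94.29), (37.08, -94.31)]
    print(point_in_polygon(37.09, -94.29, jims_property))   # True: on the property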
[0490] In FIG. 177, when the address was entered for Jim & Pat's House, the U-Me system computed latitude and longitude coordinates for that location based on a database of addresses with corresponding location coordinates. However, in a rural area, the location coordinates for an address may not correspond very closely to the location of the house. For example, the location coordinates shown in FIG. 177 might correspond to the driveway entrance to Dogwood Road shown at 17930 in FIG. 179. If the house sits back from the road a substantial distance, the location coordinates of the address may not be accurate for the location of the house. Thus, Jim could use the U-Me app to walk the boundary points of his house, shown at points 7, 8, 9, 10, 11, 12 and 13 in FIG. 179. The U-Me app could then connect the boundary points and define a location 17910. This user-defined location 17910 could be substituted for the system-derived location 17720 shown in FIG. 177 to provide a more accurate location of Jim & Pat's house. Note that a photo taken inside of Jim & Pat's house could include indexing information that includes both Jim & Pat's House and Jim & Pat's Property. In the alternative, however, Jim & Pat's Property may be defined specifically to exclude Jim & Pat's house, so a photo taken in Jim & Pat's house will have indexing information generated that indicates the location as Jim & Pat's House, while a photo taken outside the house on the property (for example, of a grandson fishing in a pond) will have indexing information generated that indicates the location as Jim & Pat's Property.
[0491] Many modern cameras and almost all smart phones include location information in the metadata of a digital photo file that specifies the location of where the photo was taken. FIG. 180 shows examples of user-defined locations and system-defined locations. User-defined locations have a specified name and derived geocode information that defines the location. For example, the derived geocode information for Jim & Pat's Property defined by the user at 17920 in FIG. 179 is all geographical coordinates that fall within the defined location 17920. The user-defined locations include Jim & Pat's House, Jim & Pat's Property, Jim's Office, Billy's House, Dad Jones' House, etc. The system-defined locations can include any location information available from any suitable source, such as online databases or websites, etc. System-defined locations may include, for example, city, county, state, country, city parks, state parks, national parks, tourist attractions, buildings, etc. Thus, when a photo is taken at the Grand Canyon, the U-Me system can detect the location coordinates, check available databases of system-defined locations, detect the location corresponds to the Grand Canyon, and add a location "Grand Canyon" to indexing info for the photo. The same could be done for tourist attractions such as Disney World, and for buildings such as the Empire State Building. It will be appreciated that a user could define many user-defined locations and the system could define any type and number of system-defined locations. Note that one location can correspond to both a user-defined location and a system-defined location. Thus, if a user owns a cabin in a state park, the user could define the location of the cabin as "Cabin", and photos could then include indexing information that specifies both the state park and "Cabin".
[0492] FIG. 181 shows sample metadata 18110 that may exist in known digital photo files. Note the term "metadata" is used herein to mean data that is not part of the visible image in the digital photo that describes some attribute of the photo. The metadata 18110 in FIG. 181 is shown to include fields for Camera Make, Camera Model, Camera Serial Number, Resolution of the photo, Image Size of the photo, Date/Timestamp, and Geocode Info. The metadata shown in FIG. 181 is shown by way of example. Many other fields of metadata are known in the art, such as the metadata fields defined at the website photometadata.org. The photo metadata disclosed herein expressly extends to any suitable data, whether currently known or developed in the future, that is placed in the digital photo file by the device that took the photo to describe some attribute that relates to the photo.
[0493] When photo metadata includes geocode info as shown in FIG. 181 that defines the geographical location of where the camera was when the photo was taken (as is common in smart phones), method 18200 in FIG. 182 reads this geocode info from the metadata (step 18210). The geocode info can be in any suitable form such as GPS coordinates or other forms of geographical info that specifies location, whether currently known or developed in the future. The geocode info is processed to determine whether the geocode info corresponds to a recognized location (step 18220). If not (step 18220=NO), method 18200 is done. When the geocode info corresponds to a recognized location (step 18220=YES), the location name is added to the indexing info for the photo (step 18230). For example, let's assume Jim Jones takes a photo with his cell phone of his daughter in his house. The geocode info will reflect that the location corresponds to a stored location, namely, Jim & Pat's House. Jim & Pat's House can then be added to the indexing information, which makes retrieval of photos much easier using a photo search engine.
[0494] Referring to FIG. 183, examples of photo indexing info 18310 are shown by way of example to include person info, location info, event info, and other info. Person info can include information relating to a person, including relationship info, both user-defined and system-derived. Location info can include any information relating to a location, including user-defined and system-defined. Event info can include any information relating to a date or date range for the photo, including user-defined events, system-derived events, and system-defined events, as discussed in more detail below.

[0495] In one suitable implementation, the photo indexing info is generated using tags in a markup language such as eXtensible Markup Language (XML). Sample tags for the photo indexing info 18400 shown in FIG. 184 could include tags in three categories, namely: person info, location info, and event info. The sample tags for Person Info shown in FIG. 184 include Person_FullName, Person_PreferredName, Person_Age and Person_Other. The sample tags for Location Info shown in FIG. 184 include Location_Name, Location_StreetAddress, Location_City, Location_County, Location_State, Location_ZIP, Location_Country, and Location_Other. The sample tags for Event Info shown in FIG. 184 include Event_Name, Event_Date, Event_Date_Range, Event_BeginDate, Event_EndDate, and Event_Other. Note the specific tags shown in FIG. 184 are shown by way of example, and are not limiting.
[0496] Referring to FIG. 185, events may include user-defined events, system-derived events that are derived from user-defined events, or system-defined events that are selected by the user. Examples of user-defined events include birth dates, wedding dates, event date ranges entered by a user, labels entered by a user that correspond to a date or date range, and others. Examples of system-derived events include Jim's 56th Birthday, Jim & Pat's 30th Anniversary, Jim's Age, and others. Note that a person's age is not an "event" in a generic sense of the word, but the term "event" as used in the disclosure and claims herein includes anything that can be determined based on a date or date range for a photo, including ages of the people in the photo. Examples of events that are system-defined and selected by a user may include fixed-date holidays, variable-date holidays and holiday ranges. Examples of fixed-date holidays in the United States include New Year's Eve, New Year's Day, Valentine's Day, April Fool's Day, Flag Day, Independence Day, Halloween, Veteran's Day, Christmas Eve, and Christmas Day. Examples of variable-date holidays in the United States include Martin Luther King, Jr. Day, President's Day, Easter, Memorial Day, Labor Day and Thanksgiving. Of course, there are many other fixed-date holidays and variable-date holidays that have not been listed here. Holiday ranges could be defined by the system, and could be selected or modified by the user. For example, an event called "Memorial Day Weekend" could be defined by the system to be the Saturday and Sunday before Memorial Day as well as Memorial Day itself. The user could select this system definition for "Memorial Day Weekend", or could modify the definition. For example, the user could change the definition of "Memorial Day Weekend" to include the Friday before Memorial Day as well. Similar holiday ranges could be defined for Labor Day Weekend, Thanksgiving Weekend and Christmas Holidays. Again, a user can accept a system-defined holiday range or could modify the system-defined holiday range to the user's liking. Thus, the system could define "Christmas Holidays" to include December 20th to January 1st. The user could then modify the system definition of "Christmas Holidays" to include December 15th to January 2nd. Note the system-defined holidays may include holidays for a number of different countries, nationalities, ethnic groups, etc., which allows a user to select which holidays the user wants to include in indexing information for the user's photos. Thus, a Jewish user could select to include Jewish holidays while excluding Christian holidays.

[0497] FIG. 186 shows a method 18600 for generating indexing info for a photo. A photo is selected (step 18610). For the purpose of FIG. 186, we assume a photo is a digital photo file. A unique ID and checksum are generated for the photo (step 18620). The unique ID is a simple numerical designator (like a serial number) that uniquely identifies the photo to the U-Me system. The checksum is computed to facilitate detecting duplicate photos, as discussed in more detail below. Facial and feature recognition are performed on the photo (step 18630). Indexing info is generated for recognized faces and features (step 18640).
Indexing info is also generated for recognized locations (step 18650) based on the location of the photo. In one specific example, the location may be entered by a user. This could be useful, for example, when a user scans a hard-copy photo to generate a digital photo file that does not include location info. The user could then specify a location, which would be a recognized location for the photo. In another specific example, the location is determined from geocode info embedded in the metadata of the digital photo file. Indexing info is also generated for recognized events (step 18660). Recognized events may include anything relating to a date or date range for the photo, including user-defined events, system-derived events (including ages of people in the photo), and system-defined and user-selected events, as described above with reference to FIG. 185. Indexing info for other photo metadata may also be generated (step 18670). The photo is stored (step
18680), and the indexing info for the photo is also stored (step 18690). In one particular implementation, the indexing info is stored separately from the photo. In an alternative implementation, the indexing info is stored as metadata in the digital photo file. The indexing information generated in steps 18640, 18650, 18660 and 18670 may include data that is not in the metadata for the photo, but is generated based on the metadata for the photo in conjunction with information stored in the user's U-Me account. For example, when the U-Me system recognizes a date in the photo metadata that corresponds to Jim & Pat's wedding anniversary, the U-Me system can generate indexing info for the photo that identifies the Event for the photo as Jim & Pat's Wedding Anniversary. Having dates, locations and relationships defined in the user's U-Me account provides a way to add indexing info to a photo that will help to retrieve the photo later using a powerful search engine, discussed in more detail below.
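By way of illustration only, the following Python sketch shows one possible shape for steps 18620, 18650 and 18660 of method 18600. All names, the use of SHA-256 as the checksum, and the dictionary layout are hypothetical choices made for this sketch and form no part of the disclosure.

```python
import hashlib
import uuid

def generate_indexing_info(image_bytes, metadata, user_events):
    """One possible sketch of method 18600 for a single photo.

    metadata is a dict of camera metadata; user_events maps an ISO
    date string to a list of user-defined event names (a layout
    chosen only for this illustration).
    """
    indexing_info = {
        # Step 18620: unique ID plus a checksum for duplicate detection
        "photo_id": uuid.uuid4().hex,
        "photo_checksum": hashlib.sha256(image_bytes).hexdigest(),
        "locations": [],
        "events": [],
    }
    # Step 18650: a recognized location from embedded geocode info
    if "geocode" in metadata:
        indexing_info["locations"].append(metadata["geocode"])
    # Step 18660: events recognized from the photo's date
    date_taken = metadata.get("date_taken")
    indexing_info["events"].extend(user_events.get(date_taken, []))
    return indexing_info

# Example: a photo taken on Christmas Day 2012
info = generate_indexing_info(
    b"...image bytes...",
    {"date_taken": "2012-12-25", "geocode": (37.10, -94.31)},
    {"2012-12-25": ["Christmas"]},
)
```

Indexing info produced by such a routine could then be stored either separately from the digital photo file or within it, as discussed below with reference to FIG. 196.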
[0498] One suitable implementation for step 18630 in FIG. 186 is shown as method 18700 in FIG. 187. The photo is processed for facial and feature recognition (step 18710). Facial recognition is known in the art, but the processing in step 18710 preferably also includes feature recognition. Feature recognition may recognize any suitable feature or features in the photo that could be found in other photos. Examples of features that could be recognized include a beach, mountains, trees, buildings, a ball, a birthday cake, a swing set, a car, a boat, etc. If there are unrecognized faces or features (step 18720=YES), the user may be prompted to identify the unrecognized faces and/or features (step 18730). Method 18700 is then done.
[0499] By prompting the user for unrecognized faces and features, method 18700 gives the user the chance to build up a library of faces and features that the system will have an easier time recognizing next time around. For example, step 18730 might display the photo with various different faces and regions defined.
The user could select a face and then enter the name for the person, or if the person will appear in many photos, the user could enter some or all of the person's data in a photo system data entry screen, similar to that shown in FIG. 165. The user could also select various regions of the photo to define features that could be recognized in future photos. For example, if a photo shows a couple on a beach with a cruise ship in the background, the user could click on each face to define information corresponding to those two people, and could also click on the sand on the beach and define this feature as "beach", click on the water and define this feature as "water", and click on the cruise ship and define this feature as "boat." Using various heuristics, including artificial intelligence algorithms, these features may be recognized in other photos, which allows adding indexing information that describes those features automatically when the photo is processed, as shown in method 18700 in FIG. 187.
[0500] Referring to FIG. 188, a method 18800 is one specific implementation for step 18640 in FIG. 186.
Indexing info is generated for recognized faces and/or features based on the user-defined relationships (step 18810). Indexing info is also generated for the recognized faces and/or features based on the system-derived relationships (step 18820).
[0501] Method 18900 in FIG. 189 is one suitable implementation for step 18650 in FIG. 186 for a digital photo file that does not include any geocode info. A user defines the location for the photo (step 18910).
For example, the user may specify the location using geographical coordinates, or by selecting a name of a geographical location that has already been defined by the user or by the system. This would be the logical approach when a digital photo file has been created from a hard-copy photo, and no geocode info is available for the photo. When the location corresponds to one or more system-defined locations (step 18920=YES), indexing info is generated for the system-defined location(s) (step 18930). When the location corresponds to one or more user-defined locations (step 18940=YES), indexing info is generated for the user-defined location(s) (step 18950). When the location is not a user-defined location (step 18940=NO), the user may opt to add the location to the user-defined locations (step 18960). Note that a single photo can include indexing info that relates to multiple user-defined locations and multiple system-defined locations, all based on the one location where the photo was taken. For example, if we assume region 17910 in FIG. 179 is defined as Jim & Pat's House, and region 17920 is defined as Jim & Pat's Property to include Jim & Pat's House, and assuming the house and property are in Jasper County, Missouri, a photo taken in Jim & Pat's House could include indexing info that specifies the location as Jim & Pat's House, Jim & Pat's Property, Jasper County, Missouri, USA.
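As a purely illustrative sketch, nested user-defined regions of the kind shown in FIG. 179 could be matched as follows; the rectangular latitude/longitude regions and all names are assumptions of this sketch, not requirements of the system.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A hypothetical rectangular user-defined location."""
    name: str
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)

def matching_locations(lat, lon, regions):
    """Every user-defined location containing the point; a nested
    region (a house inside a property) yields indexing info for both."""
    return [r.name for r in regions if r.contains(lat, lon)]

regions = [
    Region("Jim & Pat's House", 37.100, 37.101, -94.320, -94.319),
    Region("Jim & Pat's Property", 37.099, 37.102, -94.321, -94.318),
]
print(matching_locations(37.1005, -94.3195, regions))
# ["Jim & Pat's House", "Jim & Pat's Property"]
```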
[0502] Referring to FIG. 190, a method 19000 is one suitable implementation for step 18650 in FIG. 186 for a digital photo file that includes geocode info, such as a photo taken with a smart phone. The geocode info is read from the photo metadata (step 19010). When the geocode info corresponds to one or more system-defined locations (step 19020=YES), indexing info is generated for the system-defined location (step 19030). When the geocode info corresponds to one or more existing user-defined locations (step
19040=YES), indexing info is generated for the user-defined location (step 19050). When the geocode info does not correspond to any existing user-defined locations (step 19040=NO), and when the user wants to add this location as a new user-defined location (step 19060=YES), the location is added to the user-defined locations (step 19070). Here again, indexing info for a photo can specify many different locations that all apply to the photo, both user-defined and system-defined.
[0503] The generation of location-based indexing info for photos may be done using any suitable heuristic and method. For example, if Jim Jones takes a photo of a grandson at a birthday party in his living room in his house, the U-Me system will recognize the location as Jim & Pat's House, and will store this location as indexing info with the photo. If Jim takes a photo of the grandson fishing at a pond on the property, the U-Me system will recognize the smart phone is not at the house but is on the property, and will recognize the location as "Jim & Pat's Property", and will store this location as indexing info with the photo. In addition, various heuristics could be defined to generate location descriptors. For example, anything within 100 yards of a defined location but not at the defined location could be "near" the defined location. The disclosure and claims herein expressly extend to any suitable location information that could be generated and included as indexing information to describe the location where a photo was taken.
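One way the "near" heuristic mentioned above could be realized is sketched below; the great-circle distance formula is standard, but the 20-yard "at the location" threshold and the function names are assumptions of this illustration.

```python
import math

def distance_yards(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in yards."""
    earth_radius_yards = 6371000 / 0.9144
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * earth_radius_yards * math.asin(math.sqrt(a))

def location_descriptor(photo_pt, defined_pt, name, at_threshold=20):
    """Within at_threshold yards -> at the location; within 100 yards
    but not at it -> 'near' the location; otherwise no descriptor."""
    d = distance_yards(*photo_pt, *defined_pt)
    if d <= at_threshold:
        return name
    if d <= 100:
        return "near " + name
    return None
```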
[0504] Method 19100 in FIG. 191 is one suitable implementation for step 18660 in FIG. 186. A date or date range is determined for the photo (step 19110). For the case of a digital photo file from a scanned hard copy photo that does not include a date, the user could specify a date or date range for the photo. Why would the user specify a date range instead of an exact date? One reason is when the user is not sure of the specific date the photo was taken, but can narrow it down to a date range. Another reason is a date range could apply to many photos to make it easier to generate indexing info for those photos. When the digital photo file includes a date that indicates when the photo was taken, determining the date in step 19110 will include reading the date from the metadata in the digital photo file. When the date or date range corresponds to one or more system-defined events (step 19120=YES), indexing info is generated for the corresponding system-defined events (step 19130). When a recognized person is in the photo image, step 19120 will be YES because step 19130 will be performed to compute the age of any and all recognized persons in the photo. Note that age can be computed differently for infants than for older children and adults. When asked how old a baby or a toddler is, the mother will typically reply in months because this is much more informative than stating the age in years. The U-Me system could recognize this, and in addition to generating indexing info that indicates years for a recognized person, when that person is less than, say, three years old, the indexing info generated in step 19130 could additionally include the age of the recognized person in months. When the date or date range corresponds to one or more user-defined events (step 19140=YES), indexing info is generated for the corresponding user-defined event(s) (step 19150). When the date or date range does not correspond to an existing user-defined event (step 19140=NO) and the user wants to create a new user-defined event (step 19160=YES), the event is added to the user-defined events (step 19170).
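The age computation described for step 19130 could look something like the sketch below. The Person_AgeMonths tag name and the three-year cutoff are assumptions made for this illustration only.

```python
from datetime import date

def age_index_entries(birth_date: date, photo_date: date):
    """Report age in years for everyone; additionally report age in
    months for a child under the (assumed) three-year cutoff."""
    years = (photo_date - birth_date).days // 365
    entries = ["Person_Age=%d" % years]
    if years < 3:
        months = ((photo_date.year - birth_date.year) * 12
                  + photo_date.month - birth_date.month)
        if photo_date.day < birth_date.day:
            months -= 1  # the monthly "birthday" has not arrived yet
        entries.append("Person_AgeMonths=%d" % months)
    return entries

# A child born October 5, 2010, in a photo taken Christmas Day 2012
print(age_index_entries(date(2010, 10, 5), date(2012, 12, 25)))
# ['Person_Age=2', 'Person_AgeMonths=26']
```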
[0505] One advantage of the U-Me system being person-centric is that camera information can be converted to the corresponding person who took the photo. Referring to FIG. 192, a method 19200 reads camera info from the metadata for a photo (step 19210), looks up the photographer name that corresponds to the camera info (step 19220), and adds the photographer's name to the indexing info (step 19230). In this manner, the metadata in the photo that identifies the camera is used to go a step further to identify the person who uses that camera so the photographer can be specified in the indexing information for the photo. Method 19200 in FIG. 192 is one suitable implementation for step 18670 in FIG. 186.
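A minimal sketch of method 19200 follows; the (make, serial) lookup key and all names are hypothetical, chosen only to show the camera-to-photographer translation.

```python
def photographer_entry(metadata, camera_owners):
    """Sketch of method 19200: translate camera metadata into the name
    of the person who uses that camera, per the user's account."""
    key = (metadata.get("camera_make"), metadata.get("camera_serial"))
    name = camera_owners.get(key)
    return "Photographer=" + name if name else None

owners = {("Canon", "123456"): "Jim Jones"}
print(photographer_entry(
    {"camera_make": "Canon", "camera_serial": "123456"}, owners))
# Photographer=Jim Jones
```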
[0506] Referring to FIG. 193, method 19300 is a method for storing a photo with corresponding indexing information. The user takes the photo (step 19310). The U-Me software or app sends the photo with metadata (i.e., the digital photo file) to the user's U-Me account (step 19320). The U-Me software or app can send the photo with metadata to the user's U-Me account in any suitable way, including a direct connection from the U-Me software or app to the U-Me system. In the alternative, the U-Me software or app can send one or more e-mails to the user. The U-Me system monitors incoming e-mail, and when a photo is detected, embedded in an e-mail or as an attachment, the U-Me system recognizes the file as a photo. Facial and feature recognition is performed (step 19330). Indexing information is generated for all recognized faces and features (step 19340), for all recognized locations (step 19350), for all recognized events (step 19360), and for any other metadata (step 19370). The digital photo file, including its metadata, is stored with the generated indexing info in the user's photo database (step 19380). When input from the user is needed (step 19382=YES), a flag is set to prompt the user for the needed input (step 19390). Setting a flag lets the user decide when to enter the needed input. Thus, when a user has some spare time, the user may log into the U-Me account and enter all needed input that has accumulated for many photos that have been taken. Method 19300 could be carried out by a user taking a photo with a smart phone that is running the U-Me app, which results in the photo being automatically uploaded, processed, and stored in the user's U-Me account.
[0507] While most young adults and children have taken only digital photographs for their entire lives, older people typically have hundreds or thousands of hard copy photographs. These people need a way to store those photos electronically so they can be easily searched and retrieved as needed. Referring to FIG. 194, method 19400 begins by scanning a hard copy photo (step 19410). Facial and feature recognition is performed (step 19420). A wizard prompts the user to enter indexing information for the photo (step 19430). The photo with its indexing info is then stored in the user's photo database (step 19440). Note the indexing info can be stored in the metadata in the digital photo file, or can be stored separately from the digital photo file.
[0508] Repeating method 19400 for hundreds or thousands of individual photos may be too time-consuming. Instead, the user may process photos in groups. Referring to FIG. 195, method 19500 begins by a user invoking a photo indexing info generator (step 19510). The user can then define indexing info for groups of photos or for individual photos (step 19520).
[0509] A sample digital photo file 19620 is shown in FIG. 196 to include an identifier (ID), Metadata, and the Image. While the indexing information is "metadata" in a general sense, the term "metadata" as used herein relates to data generated by the camera that describes some attribute related to the image, while "indexing info" as used herein relates to data that was not included in the metadata for the image but was generated by the U-Me system to facilitate retrieval of photos using a powerful search engine. The indexing info 19610 can be stored separately from the digital photo file 19620 by simply using the same unique identifier for the photo to correlate the indexing info to the photo, as shown in FIG. 196. In the alternative, the indexing info can be stored as part of the digital photo file, as shown in FIG. 197.
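The separate-storage arrangement of FIG. 196 could be modeled as two stores keyed by the photo's unique identifier, as in the hypothetical sketch below; the store names and the predicate-based search are illustrative assumptions.

```python
# Two stores correlated only by the photo's unique identifier
photo_store = {}     # photo_id -> raw digital photo file bytes
indexing_store = {}  # photo_id -> indexing info dict

def store_photo(photo_id, photo_bytes, indexing_info):
    photo_store[photo_id] = photo_bytes
    indexing_store[photo_id] = indexing_info

def find_photos(predicate):
    """Return the photos whose indexing info satisfies the predicate;
    the search never has to read the photo bytes themselves."""
    return [photo_store[pid]
            for pid, info in indexing_store.items() if predicate(info)]

store_photo("p1", b"...", {"events": ["Christmas"]})
print(len(find_photos(lambda info: "Christmas" in info["events"])))  # 1
```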
[0510] An example of a photo indexing info generator screen 19800 is shown in FIG. 198 to include Date fields, a People field, an Event field, a Location field, and a display of thumbnails of photos. The user specifies a date or range of dates in the Date fields. The user specifies one or more people in the People field. The user specifies location in the Location field. An example will illustrate how a user might use the photo indexing info generator in FIG. 198 to generate indexing info for scanned hard copy photos. Let's assume Jim Jones has a stack of 163 wedding-related photos from when he married Pat, including some on the morning of their wedding day showing the wedding ceremony, some that were taken later on their wedding day at the reception, and some a week later at a second reception in Pat's hometown. Instead of defining indexing info for each photo, Jim could enter a date range that begins at the wedding day and extends to the date of the second reception, could define an event called "Jim & Pat's Wedding", and could select the 163 thumbnails that correspond to the wedding and reception photos. Once this is done, the user selects the Save button 19810, which results in the photos being saved in Jim's photo database with the appropriate dates and event information as indexing information. Note the People, Event and Location fields can include drop-down lists that list people, events and locations that have been previously defined, along with a selection to define a new event or location. If the user decides to abort entering the indexing info for photos, the user may select the Cancel button 19820.
[0511] A significant advantage of generating indexing info for photos is the ability to search for and retrieve photos using the indexing info. No longer must a user search through hundreds or thousands of thumbnails stored in dozens or hundreds of directories with cryptic names that mean nothing to a person! Instead, the user can use a photo search engine to retrieve photos based on people, their ages, family relationships both entered and derived, location, dates, and events.
[0512] One example of a screen 19900 for a photo search engine is shown in FIG. 199. The example shown in FIG. 199 includes fields for Date(s), Event, Location, People, Relationship, and Photographer. Because of the relationships entered by the user and derived by the U-Me system, searches or queries for photos can now be formulated based on those relationships. Examples of photo queries supported by the photo search engine 19900 in FIG. 199 are shown at 20000 in FIG. 200, and include: photos of grandchildren of Jim Jones between the ages of 2 and 4; photos of the wedding of Sandy Jones; and photos taken at the Lake House in 2010. These simple examples illustrate that adding indexing info that relates to people, locations and events allows for much more powerful querying and retrieving of photos than is known in the art. [0513] The user may want to share photos stored in the user's U-Me account. This can be done using a photo share engine, a sample display of which is shown at 20100 in FIG. 201. The photo share engine could be provided as a feature of the sharing mechanism 174 shown in FIG. 5, or could be provided by the photo mechanism 182. The user defines criteria for photos to share, then specifies contact information for people with whom the user wants to share the photos. The user can also select whether to share the user's faces, people, locations, events, metadata, and indexing info. The criteria for photos to share can include any suitable criteria, including any suitable criteria that could be entered into the photo search engine for retrieving a photo. The "Share with" field could be a drop-down list with people in the U-Me system, could be a drop-down list of people the user has defined in the user's U-Me account, or could be an e-mail address or other unique identifier for the person. A user could thus enter the e-mail address of a person who is not a
U-Me member, and this could result in the U-Me system sending an e-mail to the person inviting the person to join U-Me to view the photos the user is trying to share with the person. The Representative Photo could designate a photo that includes many family members so the person invited to share the photos can see how the people in the representative photo are identified by the person sharing the photo.
[0514] Referring to FIG. 202, a method 20200 is one example of how a U-Me user could share information with other U-Me users. For method 20200, P1 denotes a first U-Me user who wants to share photos and related information with a second U-Me user denoted P2. P1 designates P2 to share photos with the "Share All" option (step 20210). Referring to FIG. 201, selecting Yes to the Share All option causes all of the user's photo-related information to be shared with another user, including faces, people, locations, events, metadata, and indexing info. The U-Me system sends an invitation to P2 to share P1's photos (step 20220).
If P2 is not yet a U-Me user, P2 will sign up as a U-Me user. P2 logs in to the U-Me system (step 20230). P1's defined people are displayed to P2 (step 20240). P2 may select one of P1's people (step 20250) and update info for that person (step 20260). In the most preferred implementation, P2 updating the info for that person does not change the info for that person in P1's account. Instead, the info for that person in P1's account is copied to P2's account so changes made by P2 do not affect the person info in P1's account. For example, let's assume Jim Jones invites his son Billy to share some family photos. Let's further assume Billy selects Fred Jones, who has a Preferred Name in Jim's account of Dad Jones. Billy calls Fred Jones Grandpa Jones, not Dad Jones. So Billy could update the Preferred Name for Fred Jones to be Grandpa Jones in step 20260. When there are more of P1's people that P2 may want to update (step 20262=YES), method 20200 loops back to step 20250 until there are no more of P1's people that P2 wants to update (step
20262=NO). P1's representative photo may be displayed to P2 showing the recognized faces (step 20270). P2 may then verify the identities of the recognized faces in the representative photo to assure they are correct. If P2 updated info for P1's people in step 20260, P2's preferred names for those people can now be displayed for the recognized faces in step 20270. When P2 accepts P1's faces (step 20272=YES), P2's people are correlated to P1's faces (step 20290). When P2 does not accept P1's faces (step 20272=NO), P2 may make any needed corrections or changes to P1's faces (step 20280) before correlating P2's people to P1's faces (step 20290).
[0515] An example is now presented to illustrate the generation of indexing information for a photo by the U-Me system. Referring to FIG. 203, a sample photo 20300 is represented that includes Jim Jones and Todd Jones on Christmas Day 2012. Based on information entered by the user in the data entry screen 16710 in
FIG. 168 and based further on entering additional information in Billy Jones' information that shows his wife is Jenny Black and his son is Todd Jones (as shown in FIG. 174), the indexing information shown in FIG. 204 could represent examples of possible indexing information for the photo in FIG. 203 generated by the photo mechanism in the U-Me system. The first tag at 20410 in FIG. 204 is a photo_id tag that provides a unique identifier that identifies the photo to the U-Me system. The unique identifier in tag 20410 shown in
FIG. 204 is in hexadecimal format. The second tag 20420 is photo_checksum, which provides a checksum for the image that is computed based on the information in the image portion of the photo file. While the photo_checksum value could include information in the metadata and indexing info for a photo, in the most preferred implementation the photo_checksum would include only the image data in the checksum computation so changes to the metadata or the indexing info will not affect the value of the
photo_checksum. Indexing info for Jim Jones is shown in the person tag at 20430. Each person may be given a unique face identifier, such as Person_FaceID shown in FIG. 204, that uniquely identifies a face recognized by the facial recognition software in the U-Me system. The indexing info for Jim Jones shown at 20430 includes a Person_FaceID of 4296, a Person_FullName of Jim Jones, a Person_PreferredName of Jimmy, a Person_Age of 53, and four Person_Relationship entries that specify one or more relationships in the U-Me system, including a relationship that specifies Jim Jones is the spouse of Pat Jones, is the father of Bill Jones, is the father-in-law of Jenny Jones, and is a grandpa of Todd Jones. Other person relationships could be included for Jim Jones (such as stepfather of Sally Jones), but are omitted from FIG. 204 due to space constraints. The indexing info for a person preferably includes all relationships for that person, both user-defined and system-derived. Indexing info for Todd Jones is shown in the person tag at 20440. Note the age of Todd Jones is shown as 2, and Todd's family relationships are also indicated. As stated above, Todd's age could additionally be shown in a number of months since he is a young child under some specified age, such as three.
[0516] The indexing info at 20450 is due to the facial and feature recognition engine recognizing a Christmas tree in the photo. The indexing info at 20460 includes all location info for the photo. We assume the photo was taken in Jim's house. Thus, the location information includes the name Jim & Pat's House with the address, city, state, zip and country. In an alternative implementation, a separate location tag could be created for each item that specifies a location. Indexing info 20470 and 20480 are also shown, which correspond to two defined events. Indexing info 20470 identifies the event as Christmas with a date of 2012/12/25, while indexing info 20480 identifies the event as Christmas Holidays with a date range of
2012/12/15 to 2013/01/01. The indexing info shown in FIG. 204 is used by the U-Me system to create indexes that allow easily identifying many different pieces of information that can be used in formulating a sophisticated search. Thus, if Pat Jones does a search for photos of all her grandchildren, this photo 20300 would be returned in the search because of the tag Person_Relationship:Grandson:Pat Jones. If Jim does a search for all photos taken during the Christmas Holidays for the years 2008-2012, this picture will also be returned in the search because of the tag that defines the event as Christmas Holidays for the specified date range. If Pat does a search for all photos of Todd taken at Jim & Pat's house, this photo would also be returned. If Jim does a search for all photos that include grandchildren when Jim's age is over 50, this photo will also be returned in the search. One skilled in the art will readily recognize that all of the information shown in FIG. 204 can be used in a database search engine to formulate complex and sophisticated queries for a user's photos.
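A predicate for one of the searches described above could be sketched as follows; the dictionary layout loosely mirrors the tags of FIG. 204, but the field names are assumptions of this illustration.

```python
def has_relative_in_age_range(indexing_info, relationship, relative, lo, hi):
    """Does any person in the photo have the given relationship to
    `relative` and an age within [lo, hi]?"""
    for person in indexing_info.get("people", []):
        if ((relationship, relative) in person.get("relationships", [])
                and lo <= person.get("age", -1) <= hi):
            return True
    return False

photo_index = {
    "people": [
        {"name": "Todd Jones", "age": 2,
         "relationships": [("Grandson", "Jim Jones"),
                           ("Grandson", "Pat Jones")]},
    ]
}
# "Photos of grandchildren of Jim Jones between the ages of 2 and 4"
print(has_relative_in_age_range(photo_index, "Grandson", "Jim Jones", 2, 4))
# True
```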
[0517] The term "tag" used with respect to the indexing info is a different kind of "tag" than the tag known in the art for tagging photos. This potential confusion is caused by the use of the label "tag" for two different things in the art. A tag in the indexing info described herein could be a markup language tag that identifies some attribute of the photo. Known tags for photos, in contrast, are simple labels. Thus, a user can use Google's Picasa software and service to perform facial recognition on photos and to tag photos with the names of the recognized people. These simple tags are labels that are searchable, but do not contain any relationship information, ages of people in the photos, events, etc. For example, if Jim Jones took the time to use Google's Picasa to tag his photos, he could enter a tag such as Billy on all the photos that include his son Billy. But the tag Billy is a simple text tag, a mere label. While searchable, known photo tags do not provide the flexibility and power of the U-Me system. Known tags are photo-centric, while indexing info generated by the U-Me system is person-centric. With known photo tags, a person cannot do powerful searches looking for grandchildren, looking for people of a particular age, etc. In addition, if Jim were to tag the sample photo 20300 in FIG. 203 with the tag "Christmas 2012", that is a complete label and not subject to parsing. Thus, there is no way to search for all Christmas photos, and have that search return the photo that is tagged with "Christmas 2012." The user would have to specify Christmas 2012 in order for the tag to match the search term. This shows how woefully inadequate known photo tagging is, and how the U-Me photo mechanism provides a significant improvement by generating person-centric indexing info that includes both user-defined information as well as system-derived information that is derived from the user-defined information.
[0518] FIGS. 205 and 206 illustrate how a first user called P1 can share photo-related information with another user called P2. Features of P1's U-Me account related to the photo mechanism are shown in FIG. 205 to include People Info, Location Info, Event Info, Face Info, and Photo Info, which is a database of P1's photos that includes photos and associated indexing info. Note the indexing info could be stored separate from the digital photo file as shown in FIG. 196, or could be stored as part of the digital photo file as shown in FIG. 197. We now assume P1 wants to share all of P1's photos with P2, as shown in method 20200 in
FIG. 202. Because P1 selected "Share All" when sharing P1's photos with P2, when P2 creates P2's U-Me account, all of the People Info, Location Info, and Event Info are copied to P2's account, as shown by the dotted lines in FIG. 206. Once the info has been copied, it can be edited by P2 in any suitable way. For example, let's assume Jim Jones enters a preferred name of Cookie for his wife Pat because that is the nickname he always uses to refer to Pat. If Jim shares his photo info with his son Bill, it is likely that Bill will want to change the preferred name of Cookie for Pat to something else, like Mom. In addition, if Jim defined a location for Jim & Pat's House and Jim & Pat's Property as illustrated in FIG. 179, Bill could change the location name to Mom & Dad's House and Mom & Dad's Property. P2 could thus use the copied definitions for People Info, Location Info and Event Info from P1's account as a shortcut to defining people, locations and events in P2's account.
[0519] When face information is shared, the names of the recognized faces are copied from P1's account to P2's account. However, P2 may want to name the people in the photos differently than P1. As a result, P2's account can have names for the recognized faces that are different than the names P1 uses. Thus, the face info in P2's account could include names that point to face IDs in P1's account. This provides a significant advantage. Facial recognition software recognizes faces, and the processing time may increase with an increase in the number of recognized faces. P2 gets a shortcut because the U-Me photo mechanism has already done facial recognition on P1's photos and has generated indexing info for those faces. By pointing people in P2's account to faces recognized in P1's account, the number of faces in the facial recognition database does not double by adding P2. This can also be a significant issue because some facial recognition software is licensed based on the number of templates (faces) in the facial recognition database. By sharing face info between users as shown in FIG. 206, the U-Me system both benefits from earlier work done for one user and also avoids an unneeded increase in the number of faces in the facial recognition database. In addition, when P2 adds more photos of people who have already been recognized, the facial recognition for that person can improve for P1 as well. Sharing face info thus provides benefits to both the sharing party and the party to whom the face info has been shared.
[0520] The sharing of photos between P1 and P2 is done in a way similar to the sharing of face info. In the most preferred implementation, the photos remain in P1's account, and are not copied to P2's account. Instead, P2's account includes photo info that is indexing info for the photos that can be separate from the digital photo files. Thus, P1's account has photos Photo1, ..., PhotoN and corresponding indexing info for P1, labeled P1_II1, ..., P1_IIN. When P2's account is created to share P1's photos, the indexing info can be copied to P2's account, and the indexing info can then be modified as desired by P2. The result is P2 having its own set of indexing info P2_II1, ..., P2_IIN for the photos stored in P1's account. The U-Me system thus allows users to share photos and related information in a very efficient manner while still providing customization for each user, and while providing an incredibly powerful search engine that can retrieve photos based on relationships, locations and events.
[0521] The U-Me photo mechanism may include a way for a person such as P2 in FIG. 206 to decide to share some of P1's photos but not all. For example, a woman P1 might take seventy photos of P1's daughter at her birthday party, and may share all seventy of those photos with her mom P2. Her mom may not want or need to share all seventy of those photos. The U-Me photo mechanism could display thumbnails for P1's photos, then allow P2 to select which of those P2 wants to share. This could become an issue regarding how a user is billed for using the U-Me system. For example, let's assume a user pays a monthly subscription for access to the U-Me system, and the price depends on the amount of storage the person uses to store his or her photos. Let's further assume that when person P2 accepts to share P1's photos, the size of P1's photos is allotted against the amount of storage in P2's account. By providing a way to select which photos P2 wants out of all the photos P1 is willing to share, P2 maintains control of P2's subscription at a price desired by P2. Thus, P2 could decide to share only a small fraction of the photos offered by P1, thereby giving P2 full access to those photos, while restricting access to the other photos P1 offered to share with P2 until P2 accepts to share those other photos as well.
[0522] The sharing of photos and related information as shown above is especially valuable in the context of families. Thus, let's assume Pat Jones decides to become a U-Me subscriber, takes the time to enter all the information for the people in her family, including birth dates, wedding dates, events, locations, etc., and has the U-Me system generate the indexing info for all of her photos. This significant amount of work can be to the benefit of other users whom Pat invites to share her photos, such as her husband and her children. While some of the preferred names may change, the vast majority of information entered by Pat will apply to her children and spouse as well. This allows one enthused family member to do the majority of the work in defining people and relationships in the U-Me system, and creates an incredibly powerful shortcut for others that are invited to share that family member's photos and related info.
[0523] Some people have already invested a lot of time and effort to tag their photos with known photo tagging tools or software. The U-Me system can leverage this investment of time, as shown in method 20700 in FIG. 207. Method 20700 assumes the existing tags are stored as part of the photo metadata in a digital photo file. The photo metadata is processed for existing tags (step 20710). A list of existing tags is displayed to the user (step 20720). The user is then allowed to correlate existing tags with defined people, locations and events in the user's U-Me account (step 20730). Indexing info is then generated based on the people, locations and events corresponding to the existing tags (step 20740). We see from method 20700 that a user who has taken the time to perform prior art photo tagging on their digital photo files will have an advantage when importing these photos into their U-Me account, because the tags may be used to correlate photos to people, locations and events defined in the user's U-Me account.
[0524] One problem that most users of digital cameras have is duplicate photos in different locations. For example, a person might take some photos on a blank SD card and then download those photos to a computer. The user may re-insert the card into the camera, and take more photos without deleting the first set of photos. When the user downloads the photos, the user may then create a new directory and download both the first set of photos on the SD card that had been previously downloaded, along with the second set of photos on the SD card that were added later. The result is the first set of photos now exists in two different directories on the user's computer system. With dozens or hundreds of directories, many users have many duplicate photos. The U-Me photo mechanism can detect duplicate photos as shown in method 20800 in FIG. 208. Identifiers for the photos are compared (step 20810). One suitable example for identifiers that could be used to compare photos is a checksum that is computed over all the data in the image portion of a digital photo file. When two checksums match, it is very, very likely the photos are duplicates. Photos that have the same identifiers are marked as possible duplicates (step 20820). A list of possible duplicates is then displayed to the user (step 20830). The user can then identify duplicates from the list and delete any duplicates (step 20840). By detecting and deleting duplicates as shown in method 20800, the U-Me photo mechanism avoids needless storage and processing of duplicate photos.
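A hypothetical sketch of the duplicate-detection comparison in steps 20810-20820 follows; the use of SHA-256 over the image portion is one possible choice of identifier, not a requirement.

```python
import hashlib
from collections import defaultdict

def find_possible_duplicates(photos):
    """Group photos whose image-portion checksums match.  `photos`
    maps photo_id -> image bytes (image portion only, so metadata
    differences do not mask a duplicate image)."""
    by_checksum = defaultdict(list)
    for photo_id, image_bytes in photos.items():
        digest = hashlib.sha256(image_bytes).hexdigest()
        by_checksum[digest].append(photo_id)
    # Only groups with more than one member are possible duplicates
    return [ids for ids in by_checksum.values() if len(ids) > 1]

photos = {"p1": b"same image", "p2": b"same image", "p3": b"other"}
print(find_possible_duplicates(photos))  # [['p1', 'p2']]
```

A list such as this could then be displayed to the user per step 20830 so the user can confirm and delete true duplicates.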
[0525] Because the U-Me system is people-centric, and uses relationships between people such as family relationships in generating indexing information, the U-Me system is ideal for genealogists to use for photos of their family members. The U-Me photo mechanism thus includes the capability of importing a file that specifies people and relationships (step 20910), such as a .GEDCOM file generated by most popular genealogy software, such as Roots Magic. Photo system data is generated in the user's U-Me account for the people in the imported file (step 20920). System-derived relationships are then derived based on the relationships in the imported file (step 20930). A genealogy file such as a .GEDCOM file can thus provide a significant shortcut for importing data regarding people and family relationships into a user's U-Me account. Note the file need not necessarily be a genealogy file, but could be any type of file that represents people and/or relationships between people, such as an organization chart for a business.
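How system-derived relationships might be derived from imported parent/child data (step 20930) is sketched below; the input layout is a simplification of what a genealogy file provides, and all names are illustrative.

```python
def derive_grandparents(parent_of):
    """Derive grandparent relationships from imported parent/child
    pairs.  `parent_of` maps a person's name to a list of parents."""
    derived = []
    for child, parents in parent_of.items():
        for parent in parents:
            for grandparent in parent_of.get(parent, []):
                derived.append((grandparent, "Grandparent", child))
    return derived

parent_of = {
    "Todd Jones": ["Bill Jones", "Jenny Jones"],
    "Bill Jones": ["Jim Jones", "Pat Jones"],
}
print(derive_grandparents(parent_of))
# [('Jim Jones', 'Grandparent', 'Todd Jones'),
#  ('Pat Jones', 'Grandparent', 'Todd Jones')]
```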
[0526] One problem with prior art photo tags is they are static. Once defined, they don't change. If a user wants to add more tags to a photo, the user must add the tag manually. A significant advantage of the U-Me system is the ability to dynamically update indexing info for a person when information is added or changed. For example, let's assume the photo mechanism recognizes a person's face in 49 photos, but the user has not yet correlated the face to a defined person. The face is represented by a face ID, as shown by way of example in FIG. 204. Once the face is correlated by the user to a defined person, the attributes for that person may be dynamically added to the indexing info. Thus, if the facial recognition software recognized the face of Todd Jones and assigned it a FaceID of 5893, but this face had not yet been correlated to the personal info for Todd Jones, this FaceID would be the only information in the person tag 20440 in FIG. 204. Once the user correlates Todd's face to Todd's personal info in the user's U-Me account, all of the pertinent info relating to Todd can be added to the photo dynamically without any input required from the user. Thus, by the mere act of identifying the face in photo 20300 as the face of Todd Jones that is defined in the user's U-Me account, the additional information is added to the indexing info automatically.
[0527] Referring to FIG. 210, a method 21000 illustrates how additions or changes are automatically propagated to indexing info for a user's photos. An addition or change to a user's people, relationships, locations or events is detected (step 21010). This change is then propagated to the indexing info for all affected photos (step 21020). A simple example will illustrate. Let's assume Bill and Jenny Jones have a daughter named Polly three years after having Todd. Once Polly's information is entered into the U-Me system, the indexing info for each photo that has any person related to Polly will be updated to reflect that person's relationship to Polly. For example, upon saving the personal information for Polly, the tag 20430 in FIG. 204 would be updated to add a field Person_Relationship=Grandpa:Polly Jones and the tag 20440 would be updated to add a field Person_Relationship=Brother:Polly Jones. In another example, let's assume
Jim & Pat occasionally spend Christmas at a cabin in the mountains that they rent. The owners offer to sell the cabin to Jim & Pat, who accept and purchase the cabin. When a location is defined for "Our Cabin", all pictures that include the geographic information for the cabin will be updated to include the location name "Our Cabin." These examples show that data can essentially lie dormant in a photo for a long time. But once information is added or changed in the user's U-Me account, this information can be updated to reflect the added or changed information. Note the addition or change of information could be selectively controlled to apply to some photos and not to others. In the example above, let's assume Jim & Pat define a location called "Rental Cabin" for the cabin during the years they rented the cabin, then define the same location as "Our Cabin" for the years after the purchase. The purchase date could be entered, and all photos prior to the purchase date could have the location name "Rental Cabin" while all photos after the purchase date could have the location name "Our Cabin." Of course, the U-Me system could recognize the physical location is the same, yet maintain different indexing info for different photos based on date.
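One possible shape for the propagation step 21020, using Polly's addition as the example, is sketched below; the in-memory indexing store and field names are assumptions carried over from the earlier sketches.

```python
def propagate_new_relationship(indexing_store, person, relationship,
                               new_person):
    """When a new person is added to the account, append the derived
    relationship to every photo entry for the related person, with no
    per-photo input from the user (method 21000)."""
    touched = 0
    for info in indexing_store.values():
        for entry in info.get("people", []):
            if entry.get("name") == person:
                entry.setdefault("relationships", []).append(
                    (relationship, new_person))
                touched += 1
    return touched  # number of photo entries updated

# After Polly is saved, every photo of Todd gains a Brother entry
indexing_store = {"p1": {"people": [{"name": "Todd Jones"}]}}
propagate_new_relationship(indexing_store, "Todd Jones",
                           "Brother", "Polly Jones")
```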
[0528] The people-centric nature of the U-Me system lends itself to some great features for the photo mechanism. For example, creating a person in a user's U-Me account can result in automatically creating a "container" corresponding to that person in the user's U-Me account. The user can select any container corresponding to any person, which will result in displaying or listing all photos that person is in. Note the containers do not "contain" the photos in a literal sense, but contain pointers to the photos stored in the user's photo database. Note the displayed photos for a person can be organized in any suitable way, such as chronologically, alphabetically according to location, etc.
[0529] Referring to FIG. 211, a method 21100 shows how the user may access the user's data and/or licensed content and/or settings that are stored in the user's U-Me account. The user authenticates to the U-Me system (step 21110). The user identifies a location that is U-Me certified (step 21120). The U-Me system reads the location settings and compares the location settings with the user settings (step 21130). When conversion is needed (step 21140=YES), the conversion mechanism in the U-Me system converts the user settings to suitable location settings (step 21150). The conversion of settings is preferably performed by the conversion mechanism 160 shown in FIG. 5. The user settings that correspond to the location are then downloaded to devices at the location (step 21160). When no conversion is needed (step 21140=NO), the user settings in the user's U-Me account can be downloaded to devices at the location (step 21160). Method 21100 could be representative of any suitable location, including a vehicle, a hotel room, a rental condo, etc. [0530] Referring to FIG. 212, a universal remote control 21210 is shown to include a processor 21220 coupled to a memory 21230, a touch-screen display 21240, an equipment communication interface 21250, an external database communication interface 21260, and a code reader 21281 via a system bus 21280. Batteries 21272 preferably provide power to a power supply 21270, which provides power to the various components in the universal remote control 21210. Of course, a different power source than batteries could be used, such as power from a wall-plug DC adapter. Batteries 21272 are preferred so the universal remote control 21210 can be used without a power cord.
[0531] Memory 21230 preferably contains a display programming mechanism 21232, a dynamic location-based programming mechanism 21234, and an internal database 21238. The display programming mechanism 21232 allows dynamically programming touch-screen display 21240 so the graphical icons
21242 are programmed to the correct channel numbers for a specified location. If any of the channels represented by graphical icons 21242 are not available in the specified location, the display programming mechanism 21232 could delete those graphical icons, or could show those graphical icons "grayed out", meaning they still show up but are not selectable by the user.
[0532] The dynamic location-based programming mechanism 21234 functions according to a specified location 21236. The specified location 21236 can be specified in any suitable way. For example, the user could enter a numeric identifier that identifies the specified location. The user could use the code reader 21281 to read a suitable machine-readable code or identifier, such as text, a QR code, a barcode, or any other machine-readable identifier. The specified location 21236 could be determined from a GPS device internal to the universal remote control 21210. The specified location 21236 could be determined by communicating any suitable location-specific information, such as IP address, to an external database such as 21290, which could return a location based on the IP address. These examples are not limiting, and the disclosure and claims herein expressly extend to any suitable way of determining or defining specified location 21236.
[0533] Processor 21220 may be constructed from one or more microprocessors and/or integrated circuits. Processor 21220 executes program instructions stored in memory 21230. Memory 21230 stores programs and data that processor 21220 may access.
[0534] Although universal remote control 21210 is shown to contain only a single processor and a single system bus, those skilled in the art will appreciate that a universal remote control as disclosed and claimed herein may have multiple processors and/or multiple buses. In addition, the interfaces that are used preferably each include separate, fully programmed microprocessors that are used to off-load compute-intensive processing from processor 21220. However, those skilled in the art will appreciate that these functions may be performed using I/O adapters as well.
[0535] Touch-screen display 21240 includes graphical icons 21242 that can be selected by a user touching the icon on the display 21240. Equipment communication interface 21250 is used to transmit commands to equipment or devices. For the specific example shown in FIG. 212, the equipment communication interface
21250 is shown communicating with a television 21252, a digital video recorder (DVR) 21254, a digital video disk (DVD) player 21256, an audio receiver 21258, and external hardware 21259. These devices are shown as examples of equipment that can be controlled by the universal remote control 21210. The equipment communication interface 21250 can send commands in any suitable format or combination of formats. For example, let's assume the television 21252 is Wi-Fi enabled, which means it can be controlled via commands sent via the Wi-Fi network. Let's further assume the DVR 21254, DVD player 21256 and audio receiver 21258 are not Wi-Fi enabled, but are controlled by infrared signals. In this specific scenario, the equipment communication interface 21250 would include a Wi-Fi interface that communicates via a Wi-Fi network with the TV 21252, and an infrared interface that communicates with the DVR 21254, DVD player 21256 and audio receiver 21258. Note the equipment communication interface 21250 could also communicate with external hardware 21259, which then communicates with equipment. Thus, for the specific scenario given above, if the universal remote control 21210 is implemented in a smart phone running an app, and the smart phone does not have an infrared transmitter, the equipment communication interface 21250 could be a Wi-Fi interface that communicates directly with the TV and with external hardware 21259 that includes an infrared transmitter so the commands sent via the Wi-Fi interface to the external hardware 21259 can be converted to corresponding commands on the infrared transmitter. These examples show the equipment communication interface 21250 can include any suitable interface or combination of interfaces to control any suitable equipment or device.
[0536] The external database communication interface 21260 provides an interface to an external database 21290 that includes location-specific programming parameters 21292. Any suitable interface 21260 can be used for communicating with any suitable type of external database 21290. For example, the external database communication interface 21260 could be a wireless interface that connects via Wi-Fi to a website that provides the external database 21290. In one specific implementation, the location-specific programming parameters 21292 include all information needed to program the universal remote control 21210 for the specified location 21236. In an alternative implementation, the location-specific parameters 21292 may include location-specific information, such as which devices are at the specified location, while the programming codes for the devices are stored in the internal database 21238. The universal remote control 21210 can thus be programmed automatically to control equipment (devices) at the specified location by the universal remote control interacting with the external database 21290 to determine the location-specific programming parameters 21292 that correspond to the specified location. The universal remote control 21210 is thus more "universally remote" than known universal remote controls, because it can be easily and automatically programmed to suit different locations using location-specific programming parameters. The universal remote control 21210 is thus universal across locations, not just universal in being programmable to control a large number of devices, as is known in the art.
[0537] As will be appreciated by one skilled in the art, aspects of the disclosed universal remote control may be embodied as a system, method or computer program product. Accordingly, aspects of the universal remote control may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the universal remote control may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
[0538] Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0539] Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
[0540] Computer program code for carrying out operations for aspects of the universal remote control may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
[0541] Aspects of the universal remote control are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0542] These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
[0543] The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0544] As shown in FIG. 213, the universal remote control 21210 shown in FIG. 212 can be implemented in a number of different ways. Examples shown in FIG. 213 include a smart phone with an app; a smart phone with external hardware and an app; a tablet computer with an app; a tablet computer with external hardware and an app; and a dedicated universal remote control. Of course, other implementations are also possible within the scope of the disclosure and claims herein.
[0545] Referring to FIG. 214, a method 21400 shows how to set up location-specific programming parameters 21292 in the external database 21290 shown in FIG. 212. A location is specified (step 21410). A TV provider at the specified location is specified (step 21420). A channel map for the specified TV provider is determined (step 21430). Devices at the specified location are specified (step 21440). "Devices" in step 21440 refers to equipment at the specified location that will be controlled by the universal remote control. Programming codes are then specified for each device at the specified location (step 21450). Note that method 21400 could be repeated for each location specified in the external database 21290. Note also that step 21450 could be optional if the programming codes are included in the internal database 21238 shown in FIG. 212.
[0546] One example of an entry 21500 in the external database 21290 in FIG. 212 is shown in FIG. 215. Note that entry 21500 could be generated using method 21400 in FIG. 214. Entry 21500 includes a location identifier 124987 that uniquely identifies a location. The entry 21500 specifies the TV provider at that location, which is DirecTV for this location. The channel map is the DirecTV channel map, which will correlate channels provided by DirecTV at the specified location with the corresponding channel numbers. The devices in entry 21500 include a DirecTV DVR, a Samsung TV, and a Sony Blu Ray player, with their corresponding code sets. Because the entry 21500 includes all location-specific information, the entry 21500 can be used to program the universal remote control for the specified location. Note that entry 21500 is one suitable implementation for the location-specific programming parameters 21292 shown in FIG. 212.
In one specific implementation, the external database 21290 includes many entries similar to 21500 in FIG. 215 that each specifies programming parameters for a different location. Of course, the code sets shown in FIG. 215 need not be in the entries in the external database, but could optionally be stored in the internal database 21238 shown in FIG. 212.
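Purely as an illustration of the kind of entry produced by method 21400 and shown as entry 21500 in FIG. 215, the following Python sketch models one location entry. The class and field names (LocationEntry, tv_provider, code_set, and so on) are assumptions for illustration and are not part of the specification.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str          # e.g., "Samsung TV"
    code_set: str      # identifier for the device's programming code set

@dataclass
class LocationEntry:
    location_id: str               # unique location identifier, e.g., "124987"
    tv_provider: str               # e.g., "DirecTV"
    channel_map: dict[str, int]    # station name -> channel number for this provider
    devices: list[Device] = field(default_factory=list)

# Building an entry roughly follows method 21400: specify the location, the TV
# provider, the channel map, and the devices to be controlled at that location.
entry = LocationEntry(
    location_id="124987",
    tv_provider="DirecTV",
    channel_map={"ESPN": 206, "CNN": 202},
    devices=[Device("DirecTV DVR", "codeset-dtv-dvr"),
             Device("Samsung TV", "codeset-samsung-tv"),
             Device("Sony Blu-ray", "codeset-sony-bd")],
)
```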
[0547] Referring to FIG. 216, a method 21600 is preferably performed by the dynamic location-based programming mechanism 21234 shown in FIG. 212. A location is specified (step 21610). The location can be specified in any suitable manner. One suitable manner for specifying a location is for the user to enter a unique identifier for the location. Another suitable manner is for the remote control to use its code reader to read a machine-readable representation of a unique identifier, such as a text identifier, a QR code, a barcode, or any other machine-readable identifier. The external database is then accessed by the universal remote control to determine the TV provider, channel map, devices and corresponding programming codes for the specified location (step 21620). The remote control then programs itself for the specified location using the TV provider, channel map, devices and corresponding programming codes (step 21630). While the most preferred implementation includes all of the information shown in entry 21500 in FIG. 215 for each location, in an alternative implementation the code sets could be stored in the internal database 21238 of the universal remote control as shown in FIG. 212, while the remainder of the information is stored in entry 21500. In addition, all of the location-specific information could be distributed across multiple databases. Thus, an entry could include the location, TV provider and channel map fields shown in FIG. 215, along with a link to a different entry in a different database that specifies the devices at the location. The disclosure and claims herein expressly extend to storing location-specific programming parameters in any suitable number of locations, and accessing the location-specific programming parameters in any suitable number of locations to program the universal remote control 21210 for a specified location.
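A minimal sketch of the flow of method 21600 is given below, reusing the hypothetical LocationEntry structure from the earlier sketch. The UniversalRemote class and the lookup_entry() helper are illustrative stand-ins, not disclosed interfaces; the fallback to the internal database corresponds to the alternative implementation just described.

```python
class UniversalRemote:
    def __init__(self, internal_code_sets):
        self.internal_code_sets = internal_code_sets  # internal database 21238
        self.channel_icons = {}
        self.device_codes = {}

    def program_for_location(self, location_id, external_db):
        entry = external_db.lookup_entry(location_id)   # step 21620
        # Program the on-screen channel icons from the provider's channel map.
        self.channel_icons = dict(entry.channel_map)
        for device in entry.devices:                    # step 21630
            # Prefer code sets stored with the entry; fall back to the
            # internal database when the entry omits them.
            codes = device.code_set or self.internal_code_sets.get(device.name)
            self.device_codes[device.name] = codes
```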
End Spec Martin.1444 - Start Spec Martin.1445
[0548] Referring to FIG. 217, a method 21700 is shown for converting user settings from a first vehicle to a second vehicle and for configuring the second vehicle using those settings. The user settings for the first vehicle are stored (step 21710). The user settings for the first vehicle are then converted to corresponding user settings for a second vehicle (step 21720). Note the second vehicle could be the same type as the first vehicle, could be a similar type to the first vehicle, or could be a different type than the first vehicle. The user settings for the second vehicle are then downloaded to the second vehicle (step 21730). The second vehicle is then configured using the downloaded user settings (step 21740). Method 21700 allows a user to rent a car of a type the user has never before driven, and have the rental car automatically configured according to the user's settings on a different car.
[0549] As stated above, the "type" of vehicle can vary. A simple example will illustrate. A second vehicle could be considered the "same type" as the first vehicle when the first and second vehicles have the exact same set of settings. This could happen, for example, when the two vehicles are the same vehicle just one model year apart. Note that "same set of settings" means all of the corresponding settings are expressed in the same units or in the same manner. Thus, a setting for horizontal driver seat position measured in distance from the floor is the same setting as a setting in a different vehicle that also specifies horizontal driver seat position measured in distance from the floor. A setting for horizontal seat position measured in distance from the ceiling is a different setting than a setting for horizontal seat position measured in distance from the floor. Thus, if a first vehicle has settings of the same type as a second vehicle, the settings of the first vehicle can be used directly in the second vehicle.
[0550] A second vehicle could be considered a "similar type" to the first vehicle when the first and second vehicles share most of the same settings, but there are differences as well. This could happen, for example, between different models from the same manufacturer. A second vehicle could be considered a "different type" than the first vehicle when the first and second vehicles do not share most of the same settings. This could happen, for example, between different models from different manufacturers. The disclosure and claims herein expressly extend to any suitable way to define the type of a vehicle, and the principles herein apply regardless of whether the settings are being converted between vehicles of the same type, between vehicles of a similar type, or between vehicles of different types.
[0551] Note the process for converting settings from the first vehicle to corresponding settings for the second vehicle varies according to whether the vehicles are of the same type, of a similar type, or of a different type. For vehicles of the same type (e.g., same make and model, different model year), the conversion in step 21720 may produce user settings for the second vehicle that are identical to the user settings for the first vehicle. In this case, conversion means simply using the same settings for the second vehicle as used for the first vehicle. For vehicles of a similar type or of a different type, the conversion in step 21720 may produce user settings for the second vehicle, some of which may be the same as settings for the first vehicle, and some or all of which may be different than the user settings for the first vehicle.
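For illustration only, the following sketch shows one way step 21720 could behave, using the seat-position example from paragraph [0549]. The setting names and the assumed cabin dimension are hypothetical and only demonstrate re-expressing the same physical position against a different reference point.

```python
CABIN_HEIGHT_M = 1.2  # assumed cabin dimension, used only for illustration

def convert_vehicle_settings(first_settings: dict, same_type: bool) -> dict:
    """Sketch of step 21720 in method 21700 (FIG. 217)."""
    if same_type:
        # Vehicles of the same type: use the first vehicle's settings directly.
        return dict(first_settings)
    second_settings = {}
    for name, value in first_settings.items():
        if name == "seat_pos_from_floor":
            # Example from [0549]: the second vehicle expresses the same seat
            # position as a distance from the ceiling instead of the floor.
            second_settings["seat_pos_from_ceiling"] = CABIN_HEIGHT_M - value
        else:
            second_settings[name] = value  # settings shared by both vehicles
    return second_settings
```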
End Spec Martin.1445 - Start Spec Martin.1446
[0552] Referring to FIG. 218, one suitable example of a television receiver 21800 is shown to include one or more processors 21810, a memory 21820, one or more television receivers 21850A, . . ., 21850N that receive input from a TV signal input 21860, which receives a TV signal from a TV signal source, a TV signal output 21870 that is output to a television, a network connection 21880, and an external device interface 21890 that allows transferring user settings to and from an external device 21892. Receivers 21850 could be, for example, tuners that can each receive a single television station. The memory 21820 preferably includes system code 21822, system settings 21824, user settings 21826, a user settings transfer mechanism 21830, and recorded shows 21840 (assuming the television receiver 21800 is a digital video recorder (DVR)). System code 21822 includes code executed by the one or more processors 21810 to make the television receiver 21800 function. System settings 21824 are settings that are required for the system to function correctly. For example, when the television receiver 21800 is a satellite television receiver, the system settings 21824 could include all system information required for the receiver to connect to one or more satellites. While this information may be entered by a human installer, these are different from "user settings" because the system settings are the settings needed for the television receiver 21800 to function properly. The user settings 21826, in contrast, are settings such as those configured by the end user (e.g., satellite TV subscriber) that specify preferences the user can set on the receiver 21800. Examples of suitable user television settings are shown in FIG. 32. While the channel numbers for stations 3250 are typically defined by the system and not by the user, they may be included in user television settings 835 so suitable translation of channel numbers can be performed, as described above.
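Purely to illustrate the distinction drawn above between system settings 21824 and user settings 21826, the following sketch models the relevant partitions of the receiver's memory. All field names are assumptions for illustration, not part of the specification.

```python
from dataclasses import dataclass, field

@dataclass
class ReceiverState:
    # System settings 21824: needed for the receiver to function at all
    # (e.g., satellite connection data entered by an installer).
    system_settings: dict = field(default_factory=dict)
    # User settings 21826: end-user preferences (favorites, parental
    # controls, recording timers, and the like).
    user_settings: dict = field(default_factory=dict)
    # Recorded shows 21840, present when the receiver is a DVR.
    recorded_shows: list = field(default_factory=list)
```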
[0553] The user settings transfer mechanism 21830 includes a user settings external write mechanism 21832, a user settings external read mechanism 21834, and a user settings conversion mechanism 21836. The user settings external write mechanism 21832 allows writing the user settings 21826, and possibly some of the system settings 21824, to an external device, such as external device 21892 coupled to external device interface 21890. The user settings written to the external device 21892 are represented in FIG. 218 as user settings 21894. The user settings external read mechanism 21834 allows reading user settings from an external device, such as reading user settings 21894 from external device 21892. The user settings external write mechanism 21832 thus writes user settings 21826 to the user settings 21894 in the external device 21892, while the user settings external read mechanism 21834 reads user settings 21894 from the external device 21892. Note the external device 21892 could be any device capable of storing the user settings 21894. For example, a thumb drive with a universal serial bus (USB) interface is one suitable example of an external device 21892, which can be used by plugging the thumb drive into a suitable USB port on the television receiver 21800. Such a USB port is one suitable example of the external device interface 21890. Once plugged into the USB port, the television receiver 21800 could read user settings 21894 from the thumb drive or could write user settings 21894 to the thumb drive. Another suitable example of an external device 21892 is a smart phone that could be coupled to the television receiver in any suitable way, including via a Wi-Fi connection to the network connection 21880, via a Bluetooth interface (one specific implementation for external device interface 21890), or via a direct cable-type connection (e.g., USB) to a USB port, which is another suitable implementation of the external device interface 21890. Yet another suitable example of an external device 21892 is the U-Me system that could be accessed via the network connection 21880. The disclosure and claims herein expressly extend to any suitable external device capable of storing user settings, whether currently known or developed in the future, which can communicate with the television receiver 21800 in any suitable way, whether currently known or developed in the future.
[0554] The user settings conversion mechanism 21836 includes logic to convert user settings from one type of television receiver to another. This logic can include direct conversion between device settings, conversion to and from a universal template, and conversion from one device to a universal template followed by conversion from the universal template to the second device, as discussed in detail above with respect to FIGS. 24-30.
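By way of illustration only, the sketch below shows the three conversion paths named in paragraph [0554] using plain dictionaries as settings and mapping tables. The function and mapping-table names are hypothetical, and the dictionary representation is an assumption; the detailed mechanisms are those discussed above with respect to FIGS. 24-30.

```python
def to_universal(settings: dict, device_to_universal: dict) -> dict:
    """Map device-specific setting names onto universal template names."""
    return {device_to_universal[k]: v for k, v in settings.items()
            if k in device_to_universal}

def from_universal(universal: dict, universal_to_device: dict) -> dict:
    """Map universal template names onto a second device's setting names."""
    return {universal_to_device[k]: v for k, v in universal.items()
            if k in universal_to_device}

def convert(settings: dict, maps: tuple[dict, dict]) -> dict:
    """Device 1 -> universal template -> device 2 (the third path in [0554]).
    Direct device-to-device conversion is the degenerate case where the two
    mapping tables compose into one."""
    d1_to_u, u_to_d2 = maps
    return from_universal(to_universal(settings, d1_to_u), u_to_d2)
```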
[0555] A simple example will illustrate. When the user settings 21894 in the external device 21892 are DirecTV settings, and the television receiver 21800 is a Dish Network receiver, the DirecTV settings could be read by the user settings external read mechanism 21834, and could then be converted to equivalent Dish Network user settings by the user settings conversion mechanism 21836. The converted settings may then be written to the user settings 21826, thereby programming the Dish Network receiver with settings similar to those used on the DirecTV system, represented by the user settings 21894. Note the user settings conversion mechanism 21836 could include the logic to convert in both directions, both from the television receiver 21800 to one or more different receivers, and from one or more different receivers to the television receiver 21800. For example, if the user has Dish Network but is going to his parents' cabin, which has DirecTV, the user could specify to convert the user settings in the Dish Network receiver to equivalent DirecTV settings, which could then be written to a thumb drive. Because the settings on the thumb drive are then DirecTV settings, the user could then plug the thumb drive into a USB port on the DirecTV receiver at his parents' cabin, and the DirecTV receiver could then program itself from those settings on the thumb drive.
[0556] The ability to store user settings external to the television receiver is a great advantage in many different scenarios. One such scenario is when a user upgrades to a new television receiver. For example, let's assume a Dish Network customer decides to upgrade from his current DVR that can record two channels at a time to a newer DVR that can record four channels at a time. There is currently no known way to transfer settings between the old DVR and the new DVR. The user is stuck with having to manually enter all the user settings, including those shown in FIG. 32, which can take a long time to do. With the ability to store user settings to an external device as disclosed herein, the user could store the user's settings from the old DVR to the external device, and the new DVR could then read those settings. If any conversion is needed between the old settings and the new settings, this can be handled by the user settings conversion mechanism 21836. The result is that most or all of the user's settings are available on the new DVR, thus greatly enhancing the ease and convenience of upgrading to a newer DVR. Another scenario is described above, where a user wants to take his settings with him to program his parents' DVR at their cabin. Another scenario is when a user wants to program a DVR at a hotel or vacation rental. The ability to store user settings external to the television receiver thus provides a very powerful tool that enhances the convenience of the user.
[0557] Referring to FIG. 219, a method 21900 represents one suitable method that could be performed by the television receiver 21800 in FIG. 218. The settings for Device1 are read (step 21910). These settings are then stored to an external device (step 21920). Method 21900 in FIG. 219 could be performed, for example, by the user settings external write mechanism 21832.
[0558] Referring to FIG. 220, a method 22000 represents one suitable method that could be performed by the television receiver 21800 in FIG. 218. The settings for Device1 are read from the external device (step 22010). When Device2 is the same type as Device1 (step 22020=YES), Device2 is programmed with the settings for Device1 (step 22030). When Device2 is not the same type as Device1 (step 22020=NO), the settings for Device2 are determined from the settings for Device1 (step 22040), and Device2 is then programmed with the Device2 settings (step 22050). The settings are read from the external device in step 22010, for example, by the user settings external read mechanism 21834, while determining the settings for Device2 from the settings for Device1 could be performed, for example, by the user settings conversion mechanism 21836. Note the user settings conversion mechanism 21836 could convert the settings in any suitable way, including the ways discussed in detail above with reference to FIGS. 24-30.
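A minimal sketch of the branch structure of method 22000 follows. The device objects, the layout of the stored settings (a device_type tag plus a values dictionary), and the convert_fn callback are assumptions made only so the steps can be shown in code.

```python
def program_from_external(device2, external_device, convert_fn):
    settings1 = external_device.read_settings()          # step 22010
    if device2.type == settings1["device_type"]:         # step 22020 = YES
        device2.program(settings1["values"])             # step 22030
    else:                                                # step 22020 = NO
        settings2 = convert_fn(settings1["values"])      # step 22040
        device2.program(settings2)                       # step 22050
```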
[0559] The specific implementation shown in FIG. 218 shows the user settings conversion mechanism 21836 residing in the memory 21820 of a television receiver 21800. Note the function of the user settings conversion mechanism 21836 in FIG. 218 could additionally or alternatively be performed by the conversion mechanism 160 in the U-Me system 100 shown in FIG. 5. In this scenario, the user settings external write mechanism 21832 would write the user settings 21826 to the U-Me system, which could then convert those settings to corresponding settings for a second device. The settings for the second device could then be read from the U-Me system by a user settings external read mechanism in the second device, which can then be programmed with those settings.
[0560] In some situations, a user's settings might only be needed for a defined period of time, such as a temporary stay in a hotel or rental condo. In such a scenario it would be desirable to be able to clear out the user's settings once the user's stay is over. Method 22100 in FIG. 221 shows one suitable method for doing this. Default settings are defined for a television receiver Device1 (step 22110). Default settings can be any suitable set of user settings, which could include a lack of settings as when Device1 was new, or any set of user settings the hotel manager or condo owner may want to define. The programming of Device1 with external user settings is enabled (step 22120). This could be done, for example, by the hotel manager or condo owner sending a signal to Device1 that allows the user to program Device1 with his or her settings stored on an external device, such as the U-Me system, a thumb drive, a smart phone, etc. The user then programs Device1 with the user's settings read from the external device (step 22130). The user can then use Device1 with the user's settings. As long as the user's stay is not over (step 22140=NO), the user can continue to use Device1 with the user's settings. Once the user's stay is over (step 22140=YES), Device1 is programmed with its default settings (step 22150), thereby removing all of the user's settings that were previously programmed in step 22130. Note that restoring the device to its default settings can include erasing any shows the user recorded. This can be beneficial in assuring Device1 is in a default state for each guest. Thus, if one guest records horror movies on Device1, once the user checks out, Device1 will be returned to its default state, which will include erasing the horror movies the previous user recorded. This will prevent the next user (such as a family with kids) from seeing the horror movies on Device1 that were recorded by the previous user. Note the determination that the user's stay is over in step 22140 can be made in any suitable way. For example, the user could enter into the DVR the date the user is checking out, and the DVR could then reset itself on that date at the appropriate checkout time. In the alternative, the hotel manager or condo owner could send a message to the DVR that instructs the DVR to reset itself to the default settings. Another option is for the hotel manager or condo owner to use a privileged mode in the DVR to define the time period when the user will be staying. Of course, other options are also possible. The disclosure and claims herein expressly extend to any suitable way to allow a user to program the user's settings to a device, followed by the device being reset to its default settings at some later time.
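The sketch below illustrates the lifecycle of method 22100 for a hotel or rental DVR. The RentalDvr class, its method names, and the way the checkout signal arrives are all assumptions for illustration; the specification leaves the reset trigger open to any suitable mechanism.

```python
class RentalDvr:
    def __init__(self, default_settings):
        self.default_settings = dict(default_settings)    # step 22110
        self.user_settings = dict(default_settings)
        self.recordings = []
        self.guest_programming_enabled = False

    def enable_guest_programming(self):                   # step 22120
        # E.g., triggered by a signal from the hotel manager or condo owner.
        self.guest_programming_enabled = True

    def program_from_external(self, external_settings):   # step 22130
        if self.guest_programming_enabled:
            self.user_settings.update(external_settings)

    def end_of_stay_reset(self):                          # steps 22140/22150
        self.user_settings = dict(self.default_settings)
        self.recordings.clear()  # erase the previous guest's recorded shows
        self.guest_programming_enabled = False
```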
End Spec Martin.1446 - Start Spec Martin.1447
[0561] FIG. 34 shows a suitable hierarchy of templates related to physical devices. A master template 140A shown in FIG. 24 is one suitable implementation for user settings shown by way of example as 140A in FIGS. 1 and 2. The master template 140A in FIG. 34 could be one suitable implementation for master template 975 shown in FIG. 9. Master template 140A includes settings for multiple physical devices for one user. In one preferred implementation, the master template 140A is a superset that includes all settings for all of the user's devices, and thus serves as a central repository for storing all of a user's settings. In the most preferred implementation, the master template 140A includes settings from a single user. However, one of ordinary skill in the art will appreciate that the U-Me system could have a repository for settings from many different users, with a master template corresponding to each user.
[0562] As shown in FIG. 34, the master template 140A may include all of the user's settings, which may include, for example, phone settings, tablet settings, laptop settings, desktop settings, TV settings, software settings, vehicle settings, home automation settings, gaming settings, audio settings, and security settings. Because the U-Me system is intended to include any settings a user may have, the U-Me system could include settings for all of a user's devices, even those that are developed in the future. The master template 140A as shown in FIG. 34 may include any and all settings for a user.
[0563] The U-Me system may optionally include one or more universal templates 152 as discussed above with reference to FIGS. 5 and 9. The universal templates 152 may include, for example, one or more phone templates, one or more tablet templates, one or more laptop templates, one or more desktop templates, one or more TV templates, one or more software templates, one or more vehicle templates, one or more home automation templates, one or more gaming templates, one or more audio templates, and one or more security templates.
[0564] The U-Me system also includes one or more device-specific templates 154 as discussed above with reference to FIGS. 5 and 10-21. The device-specific templates 154 may include, for example, one or more phone templates, one or more tablet templates, one or more laptop templates, one or more desktop templates, one or more TV templates, one or more software templates, one or more vehicle templates, one or more home automation templates, one or more gaming templates, one or more audio templates, and one or more security templates, as shown in detail in FIGS. 10-21.
[0565] Physical devices 150 may include any suitable device the user may use, and which includes one or more user settings a user may set. Physical devices 150 may include, for example, one or more phones, one or more tablet computers, one or more laptop computers, one or more desktop computers, one or more TV devices, one or more vehicles, one or more home automation systems, one or more gaming systems, one or more audio systems, one or more security systems, and any other physical device a user may use.

[0566] As indicated by arrows in FIG. 34, settings in physical devices 150 are stored in corresponding device-specific templates 154, which serve as virtual clones of the physical devices. The settings in the device-specific templates 154 may be stored in one or more universal templates 152. The settings in the universal templates 152 may be stored in the master template 140A. Note, however, the universal templates 152 are optional. The master template 140A may store settings directly to device-specific templates 154, and the device-specific templates 154 may store settings directly to the master template 140A. The universal templates 152, when used, provide a more organized hierarchy of templates that may help in the process of mapping user settings between templates and between templates and physical devices.
[0567] In one suitable implementation, each template includes mapping information to other templates or to a physical device. Referring to FIG. 35, device-specific templates 154 preferably include settings 3550 that correspond to settings in the physical devices 150, and additionally include inter-template mappings 3560 and device mappings 3570. Inter-template mappings 3560 indicate how the user settings 3550 map to corresponding settings in one or more other templates. Device mappings 3570 indicate how the settings 3550 map to the corresponding physical devices 150. In one specific implementation, device mappings 3570 may not be needed when the settings 3550 correspond exactly to the user settings on a physical device 150. In an alternative implementation, the device mappings 3570 may indicate how to map the user settings 3550 to the settings on a physical device 150.
[0568] Universal templates 152 may include settings 3530 and inter-template mappings 3540 that indicate how the settings 3530 are mapped to settings in other templates. Master template 140A includes settings 3510 and inter-template mappings 3520 that indicate how the settings 3510 are mapped to settings in other templates. In the specific example shown in FIG. 35, each template includes the mappings that map the user settings. Of course, other implementations are possible within the scope of the disclosure and claims herein. For example, the inter-template mappings 3520, 3540 and 3560, and the device mappings 3570, could be stored separately from the templates. The mappings 3520, 3540, 3560 and 3570 could all be part of the user settings mapping information 326 shown in FIG. 3.
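For illustration, the template records of FIG. 35 could be modeled as follows. The class and field names are hypothetical; each inter-template mapping is assumed here to map a local setting name to the corresponding setting name in the template one level up, and device mappings are kept separate as described in paragraph [0567].

```python
from dataclasses import dataclass, field

@dataclass
class Template:
    settings: dict = field(default_factory=dict)
    # local setting name -> corresponding setting name in the next template up
    inter_template_mappings: dict = field(default_factory=dict)

@dataclass
class DeviceSpecificTemplate(Template):
    # local setting name -> setting identifier on the physical device; may be
    # omitted when the template settings match the device exactly ([0567]).
    device_mappings: dict = field(default_factory=dict)
```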
[0569] The hierarchy of templates shown in FIGS. 34 and 35 allows user settings to be stored to templates from physical devices, and to physical devices from templates, by the user settings mechanism 324 shown in FIG. 3. Note the user settings mechanism 324 could be implemented within the conversion mechanism 160 shown in FIG. 5. Referring to FIG. 36, a method 3600 begins by receiving a user setting from a physical device (step 3610). This could happen, for example, when a user changes a setting on an existing physical device. The user setting is stored to the device-specific template corresponding to the physical device (step 3620). The mapping information is then used to map the user setting to the master template (step 3630). Note this may include mapping between multiple levels of universal templates to the master template. The user setting is then stored to the master template (step 3640). Method 3600 illustrates how all user settings are stored both in a device-specific template as well as in the master template. The master template in the most preferred implementation thus becomes a superset of all user settings for all of a user's devices, and is thus a central repository for all of a user's settings.
[0570] FIG. 37 shows a method 3700 for storing a setting from a master template to a physical device. A user setting is read from the master template (step 3710). The mapping information is then used to map the user setting to a corresponding device-specific template (step 3720). The user setting is then mapped to one or more physical devices (step 3730). The user setting is then stored to the corresponding one or more physical devices (step 3740). Note that one setting in the master template may map to multiple physical devices. Thus, if the user changes the address for a contact, the changed address may be written to the user's phone, laptop computer and desktop computer using method 3700 shown in FIG. 37.
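Both directions of propagation can be sketched together, building on the hypothetical Template record above. A single-level mapping is assumed in place of multiple universal-template levels, and write_to_device() is an assumed device API, not a disclosed interface.

```python
def store_to_master(master, device_template, name, value):
    """Sketch of method 3600 (FIG. 36): device setting up to the master."""
    device_template.settings[name] = value                       # step 3620
    master_name = device_template.inter_template_mappings[name]  # step 3630
    master.settings[master_name] = value                         # step 3640

def push_from_master(master, device_templates, master_name):
    """Sketch of method 3700 (FIG. 37): master setting down to devices.
    One master setting may fan out to several physical devices."""
    value = master.settings[master_name]                         # step 3710
    for tpl in device_templates:                                 # steps 3720-3740
        for dev_name, mapped in tpl.inter_template_mappings.items():
            if mapped == master_name:
                tpl.settings[dev_name] = value
                tpl.write_to_device(dev_name, value)  # assumed device API
```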
[0571] The user may have different settings on multiple devices that may create an incompatibility that needs to be resolved. Referring to FIG. 38, method 3800 begins when an incompatibility in user settings is detected (step 3810). The incompatibility in user settings is displayed to the user (step 3820). The user may then select the preferred setting (step 3830). The preferred setting is then stored in the master template (step 3840). The incompatibility is then logged (step 3850).
[0572] An example of incompatible settings is shown in FIG. 39. We assume the user has two phones, a Samsung Galaxy S3 and an Apple iPhone 4. The user has a contact John Smith that has a ringtone of High Tide on the Samsung Galaxy S3, while the same contact John Smith on the iPhone 4 has a ringtone of Harp. The device-specific templates 154 shown in FIG. 39 show these settings in the respective corresponding physical devices. We assume the user is prompted with this incompatibility in step 3820 in FIG. 38, and selects the ringtone High Tide in step 3830 as the desired ringtone for the Universal Template 152. This setting is then stored in the master template in step 3840. While this simple example shows how to resolve an incompatibility, there may be reasons why an incompatibility cannot be resolved by the user. For example, if the Samsung Galaxy S3 phone does not have a ringtone called Harp, and if the iPhone 4 does not have a ringtone called High Tide, there will be no way to resolve the discrepancy between the two. Instead, both ringtones could be stored in the universal template and in the master template. The information logged in step 3850 may indicate to the U-Me system when settings are not compatible, and thus multiple settings for different devices should be stored.
[0573] FIG. 40 illustrates that multiple levels of universal templates may optionally be employed between the device-specific templates and the master template. The settings from the master template 140A are mapped to and from a computer universal template 152A. The settings to and from the computer universal template 152A are mapped to and from a laptop computer universal template 152B and to and from a desktop computer universal template 152C. The settings in the laptop computer universal template 152B are mapped to and from a Dell N5110 laptop computer device-specific template 154A, which sends settings to and receives settings from a Dell N5110 laptop computer, thereby serving as a virtual clone of the Dell N5110 laptop computer. The settings in the desktop computer universal template 152C are mapped to and from an HP Pavilion 500-205t desktop computer device-specific template 154B, which sends settings to and receives settings from an HP Pavilion 500-205t desktop computer, thereby serving as a virtual clone of the HP Pavilion 500-205t desktop computer. Including multiple levels of universal templates allows creating a hierarchy of templates that eases the process of mapping between templates.
End Spec Martin.1447
ADDITIONAL SPECIFICATION SUPPORT
In accordance with the details of the disclosure above, and in reference to the accompanying drawings, it is to be understood that the following word-for-word description of embodiments is hereby provided and supported as follows:
Embodiments from Martin.1442
Embodiment 1. A computer system comprising:
at least one processor;
a memory coupled to the at least one processor;
user data residing in the memory corresponding to a first user;
first user settings corresponding to the first user for a plurality of software applications residing in the memory;
second user settings corresponding to the first user for a plurality of physical devices; and
a software mechanism executed by the at least one processor that makes the user data, the first user settings, and the second user settings available to the first user on a first device used by the first user.
Embodiment 2. The computer system of Embodiment 1, further comprising an authentication mechanism that authenticates the first user on the first device before the software mechanism makes the user data, the first user settings, and the second user settings available to the first user on the first device.
Embodiment 3. The computer system of Embodiment 2, wherein the authentication mechanism uses biometric authentication of the first user.
Embodiment 4. The computer system of Embodiment 3, wherein the biometric authentication comprises scanning a fingerprint of the first user and comparing the scanned fingerprint against a previously- stored reference fingerprint for the first user.
Embodiment 5. The computer system of Embodiment 1, wherein the user data corresponding to the first user comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos.
Embodiment 6. The computer system of Embodiment 1, further comprising licensed content in the memory that is licensed to the first user, wherein the software mechanism makes the licensed content available to the first user on the first device.
Embodiment 7. The computer system of Embodiment 6, wherein the licensed content comprises music, movies, electronic books, software and games.
Embodiment 8. The computer system of Embodiment 1, wherein the first user settings comprise first user preferences for each of the plurality of software applications.
Embodiment 9. The computer system of Embodiment 1, wherein the second user settings comprise second user preferences for each of the plurality of physical devices.
Embodiment 10. The computer system of Embodiment 1, wherein the plurality of physical devices comprises a mobile phone and a computer system.
Embodiment 11. The computer system of Embodiment 1, further comprising a conversion mechanism that converts user settings for a first physical device to user settings for a second physical device.
Embodiment 12. The computer system of Embodiment 11, wherein the first physical device and second physical device are of a similar type.
Embodiment 13. The computer system of Embodiment 11, wherein the first physical device and second physical device are of different types.
Embodiment 14. A computer-implemented method executing on at least one processor comprising the steps of:
storing user data corresponding to a first user;
storing first user settings corresponding to the first user for a plurality of software applications;
storing second user settings corresponding to the first user for a plurality of physical devices; and
making the user data, the first user settings, and the second user settings available to the first user on a first device used by the first user.
Embodiment 15. The method of Embodiment 14, further comprising the step of authenticating the first user on the first device before making the user data, the first user settings, and the second user settings available to the first user on the first device.
Embodiment 16. The method of Embodiment 15, wherein the step of authenticating the first user performs biometric authentication of the first user.
Embodiment 17. The method of Embodiment 16, wherein the biometric authentication comprises scanning a fingerprint of the first user and comparing the scanned fingerprint against a previously-stored reference fingerprint for the first user.
Embodiment 18. The method of Embodiment 14, wherein the user data corresponding to the first user comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos.
Embodiment 19. The method of Embodiment 14, further comprising the step of storing licensed content that is licensed to the first user and making the licensed content available to the first user on the first device.
Embodiment 20. The method of Embodiment 19, wherein the licensed content comprises music, movies, electronic books, software and games.
Embodiment 21. The method of Embodiment 14, wherein the first user settings comprise first user preferences for each of the plurality of software applications.
Embodiment 22. The method of Embodiment 14, wherein the second user settings comprise second user preferences for each of the plurality of physical devices.
Embodiment 23. The method of Embodiment 14, wherein the plurality of physical devices comprises a mobile phone and a computer system.
Embodiment 24. The method of Embodiment 14, further comprising a conversion mechanism that converts user settings for a first physical device to user settings for a second physical device.
Embodiment 25. The method of Embodiment 24, wherein the first physical device and second physical device are of a similar type.
Embodiment 26. The method of Embodiment 24, wherein the first physical device and second physical device are of different types.
Embodiment 27. A computer-implemented method executing on at least one processor comprising the steps of:
storing user data corresponding to a first user, wherein the user data corresponding to the first user comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos;
storing licensed content that is licensed to the first user, wherein the licensed content comprises music, movies, electronic books, software and games;
storing first user settings corresponding to the first user for a plurality of software applications, wherein the first user settings comprise first user preferences for each of the plurality of software applications;
storing second user settings corresponding to the first user for a plurality of physical devices, wherein the second user settings comprise second user preferences for each of the plurality of physical devices, wherein the plurality of physical devices comprises a mobile phone and a computer system;
authenticating the first user on a first device used by the first user by scanning a fingerprint of the first user and comparing the scanned fingerprint against a previously-stored reference fingerprint for the first user;
making the user data, the licensed content, the first user settings, and the second user settings available to the first user on a first device used by the first user; and
converting user settings for a first physical device to user settings for a second physical device.
End Embodiments Martin.1442 - Begin Embodiments Martin.1443
Embodiment 1. A computer system comprising:
at least one processor;
a memory coupled to the at least one processor;
information for a plurality of people including at least one user-defined relationship between the plurality of people;
at least one system-derived relationship between the plurality of people that is derived from the at least one user-defined relationship; and
a photo mechanism residing in the memory and executed by the at least one processor, the photo mechanism generating indexing information for a digital photo file that includes at least one of the at least one user-defined relationship and the at least one system-derived relationship.
Embodiment 2. The computer system of Embodiment 1, wherein the photo mechanism further generates an event in the indexing information for the digital photo file, where the event comprises at least one of:
at least one user-defined event; and
at least one system-derived event that is derived from the at least one user-defined event.
Embodiment 3. The computer system of Embodiment 2, wherein the at least one user-defined event comprises a birth date of a person and the at least one system-derived event comprises an age of the person computed from a date for the digital photo file and the birth date of the person.
Embodiment 4. The computer system of Embodiment 3, wherein the at least one system-derived event comprises a birthday of the person.
Embodiment 5. The computer system of Embodiment 1, wherein the at least one user-defined event comprises a wedding date of a person and the at least one system-derived event comprises an anniversary of the person computed from a date for the digital photo file and the wedding date of the person.
Embodiment 6. The computer system of Embodiment 1, wherein the photo mechanism further generates a location in the indexing information for the photo, where the location comprises at least one of a user- defined location and a system -defined location.
Embodiment 7. The computer system of Embodiment 1, wherein the photo mechanism includes a facial recognition mechanism that recognizes at least one face in an image in the digital photo file and allows a user to correlate the recognized face to one of the plurality of people.
Embodiment 8. The computer system of Embodiment 1, wherein the indexing information is stored separate from the digital photo file.
Embodiment 9. The computer system of Embodiment 1, wherein the photo mechanism allows a user to add new people, to modify at least one of the plurality of people, and to modify the at least one user-defined relationship, and in response, the photo mechanism updates the indexing information for a plurality of digital photo files to reflect the addition or modification by the user.
Embodiment 10. A computer-implemented method executing on at least one processor comprising:
defining information for a plurality of people including at least one user-defined relationship between the plurality of people;
deriving at least one system-derived relationship between the plurality of people that is derived from the at least one user-defined relationship; and
generating indexing information for a digital photo file that includes at least one of the at least one user-defined relationship and the at least one system-derived relationship.
Embodiment 11. The method of Embodiment 10, further comprising:
generating an event in the indexing information for the digital photo file, where the event comprises at least one of:
at least one user-defined event; and
at least one system-derived event that is derived from the at least one user-defined event.
Embodiment 12. The method of Embodiment 11, wherein the at least one user-defined event comprises a birth date of a person and the at least one system-derived event comprises an age of the person computed from a date of the digital photo file and the birth date of the person.
Embodiment 13. The method of Embodiment 12, wherein the at least one system-derived event comprises a birthday of the person.
Embodiment 14. The method of Embodiment 10, wherein the at least one user-defined event comprises a wedding date of a person and the at least one system-derived event comprises an anniversary of the person computed from a date for the digital photo file and the wedding date of the person.
Embodiment 15. The method of Embodiment 10, further comprising generating a location in the indexing information for the photo, where the location comprises at least one of a user-defined location and a system-defined location.
Embodiment 16. The method of Embodiment 10, further comprising performing facial recognition that recognizes at least one face in the photo and allows a user to correlate the recognized face to one of the plurality of people.
Embodiment 17. The method of Embodiment 10, further comprising storing the indexing information separate from the digital photo file.
Embodiment 18. The method of Embodiment 10, further comprising detecting when a user adds new people, modifies at least one of the plurality of people, and modifies the at least one user-defined relationship, and in response, updating the indexing information for a plurality of digital photo files to reflect the detected addition or modification by the user.
Embodiment 19. A computer-implemented method executing on at least one processor comprising:
defining user-defined information for a plurality of people including at least one user-defined relationship between the plurality of people and at least one user-defined event for a person that comprises at least one of a birth date and a wedding date;
deriving at least one system-derived relationship between the plurality of people that is derived from the at least one user-defined relationship;
processing a digital photo file that includes a date and geocode information that indicates where the digital photo file was generated;
identifying a person in an image in the digital photo file by performing facial recognition that recognizes a face in the image and allows a user to correlate the recognized face to one of the plurality of people;
generating at least one system-derived event for the identified person that is derived from the at least one user-defined event, the at least one system-derived event comprising at least one of:
an age of the person computed from the date in the digital photo file and the birth date of the person;
a birthday of the person computed from the date in the digital photo file and the birth date of the person; and
an anniversary of the person computed from the date of the digital photo file and the wedding date of the person;
generating a location for the photo, where the location comprises at least one of a user-defined location derived from the geocode information in the digital photo file and a system -defined location derived from the geocode information in the digital photo file;
generating indexing information for the digital photo file that includes the one of the plurality of people corresponding to the recognized face, at least one user-defined relationship between the one of the plurality of people and at least one other person, at least one system-derived relationship between the one of the plurality of people and at least one other person, at least one system-derived event for the one of the plurality of people, and the location for the photo;
storing the indexing information for the digital photo file;
detecting when a user adds new people, modifies at least one of the plurality of people, and modifies the at least one user-defined relationship; and
updating the indexing information for a plurality of digital photo files to reflect the detected addition or modification by the user.
Embodiment 20. The method of Embodiment 19, wherein storing the indexing information for the digital photo file comprises storing the indexing information separate from the digital photo file.
End Embodiments Martin.1443 - Begin Embodiments Martin.1444
Embodiment 1. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
a display coupled to the at least one processor that displays a plurality of graphical icons that may be selected by a user touching the display;
an equipment communication interface that transmits a plurality of commands to control equipment external to the apparatus;
an external database communication interface that allows retrieving at least one programming parameter from an external database that includes a plurality of entries for a plurality of locations, wherein each of the plurality of entries includes at least one programming parameter for equipment at each location; and
a dynamic location-based programming mechanism residing in the memory and executed by the at least one processor, the dynamic location-based programming mechanism accessing the external database using the external database communication interface to retrieve the at least one programming parameter for a specified location and dynamically programming the apparatus using the at least one programming parameter for the specified location to make the apparatus control equipment at the specified location by transmitting a plurality of commands via the equipment communication interface.
Embodiment 2. The apparatus of Embodiment 1, wherein the equipment communication interface comprises a wireless interface and the external database communication interface comprises the wireless interface.
Embodiment 3. The apparatus of Embodiment 1, wherein the at least one programming parameter for the specified location comprises:
a channel map for a television provider at the specified location that provides television channels available from the television provider along with corresponding channel numbers; and
a specification of the equipment at the specified location.
Embodiment 4. The apparatus of Embodiment 3, wherein dynamically programming the apparatus comprises programming the plurality of graphical icons on the display to correspond to the channel map for the television provider at the specified location and programming the apparatus to control the equipment at the specified location using programming codes for controlling the equipment at the specified location.
Embodiment 5. The apparatus of Embodiment 1, wherein a user inputs information to the apparatus that identifies the specified location.
Embodiment 6. The apparatus of Embodiment 1, further comprising an interface for reading a machine-readable code, wherein the specified location is determined by the interface reading a machine-readable code corresponding to the specified location.
Embodiment 7. The apparatus of Embodiment 1, wherein the equipment communication interface transmits the plurality of commands to external hardware that, in turn, transmits a corresponding plurality of codes to the equipment external to the apparatus.
Embodiment 8. The apparatus of Embodiment 1, wherein the apparatus comprises a mobile phone running a corresponding application.
Embodiment 9. The apparatus of Embodiment 1, wherein the apparatus comprises a tablet computer running a corresponding application.
Embodiment 10. The apparatus of Embodiment 1, wherein the apparatus comprises a dedicated universal remote control.
Embodiment 11. A method for programming a remote control comprising:
specifying a location;
accessing an external database that includes a plurality of entries for a plurality of locations including the specified location, wherein each of the plurality of entries includes at least one programming parameter for equipment at each location;
retrieving from the external database the at least one programming parameter for the specified location; and
dynamically programming the remote control using the at least one programming parameter for the specified location to make the remote control able to control the equipment at the specified location by transmitting a plurality of commands via an equipment communication interface to the equipment at the specified location.
Embodiment 12. The method of Embodiment 11, wherein the equipment communication interface comprises a wireless interface.
Embodiment 13. The method of Embodiment 11, wherein the at least one programming parameter for the specified location comprises:
a channel map for a television provider at the specified location that provides television channels available from the television provider along with corresponding channel numbers; and
a specification of the equipment at the specified location.
Embodiment 14. The method of Embodiment 13, wherein dynamically programming the remote control comprises programming a plurality of graphical icons on a display on the remote control to correspond to the channel map for the television provider at the specified location and programming the remote control to control the equipment at the specified location using programming codes for controlling the equipment at the specified location.
Embodiment 15. The method of Embodiment 11, further comprising the step of a user inputting information to the remote control that identifies the specified location.
Embodiment 16. The method of Embodiment 11, further comprising the step of the remote control determining the specified location by reading a machine-readable code corresponding to the specified location.
Embodiment 17. The method of Embodiment 11, wherein the remote control transmits the plurality of commands to external hardware that, in turn, transmits a corresponding plurality of codes to equipment.
Embodiment 18. The method of Embodiment 11, wherein the remote control comprises a mobile phone running a corresponding application.
Embodiment 19. The method of Embodiment 11, wherein the remote control comprises a tablet computer running a corresponding application.
Embodiment 20. The method of Embodiment 11, wherein the remote control comprises a dedicated universal remote control.
Embodiment 21. A method for programming a remote control comprising:
(A) configuring a location database by:
specifying a plurality of locations;
for each of the plurality of specified locations, specifying:
a television provider at the specified location and a corresponding channel map that provides television channels available from the television provider along with corresponding channel numbers; and
a plurality of devices at the specified location and corresponding programming codes for each of the plurality of devices;
(B) dynamically programming the remote control by:
specifying a location;
accessing the location database;
retrieving from the location database the television provider and corresponding channel map for the television provider corresponding to the specified location;
retrieving from the location database the plurality of devices at the specified location; and
dynamically programming the remote control using the channel map for the television provider corresponding to the specified location and using programming codes for each of the plurality of devices.
End Embodiments Martin.1444 - Begin Embodiments Martin.1445
Embodiment 1. A computer system comprising:
at least one processor;
a memory coupled to the at least one processor;
first user settings corresponding to a user for a first vehicle;
a conversion mechanism executed by the at least one processor, the conversion mechanism converting the first user settings for the first vehicle to corresponding second user settings for the user for a second vehicle; and
a software mechanism executed by the at least one processor that downloads the second user settings to the second vehicle.
Embodiment 2. The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise seat position for at least one seat.
Embodiment 3. The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise mirror position for at least one mirror.
Embodiment 4. The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise at least one climate control setting.
Embodiment 5. The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise audio presets.
Embodiment 6. The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise music licensed to the user.
Embodiment 7. The computer system of Embodiment 1, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise video licensed to the user.
Embodiment 8. The computer system of Embodiment 1, wherein the first vehicle and the second vehicle are of a similar type.
Embodiment 9. The computer system of Embodiment 1, wherein the first vehicle and the second vehicle are of different types.
Embodiment 10. A computer-implemented method executing on at least one processor comprising:
storing first user settings corresponding to a user for a first vehicle;
converting the first user settings for the first vehicle to corresponding second user settings for the user for a second vehicle; and
downloading the second user settings to the second vehicle.
Embodiment 11. The method of Embodiment 10, further comprising:
configuring the second vehicle using the second user settings.
Embodiment 12. The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise seat position for at least one seat.
Embodiment 13. The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise mirror position for at least one mirror.
Embodiment 14. The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise at least one climate control setting.
Embodiment 15. The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise audio presets.
Embodiment 16. The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise music licensed to the user.
Embodiment 17. The method of Embodiment 10, wherein the first user settings for the first vehicle and the second user settings for the second vehicle comprise video licensed to the user.
Embodiment 18. The method of Embodiment 10, wherein the first vehicle and the second vehicle are of a similar type.
Embodiment 19. The method of Embodiment 10, wherein the first vehicle and the second vehicle are of different types.
Embodiment 20. A computer-implemented method executing on at least one processor comprising:
storing first user settings corresponding to a user for a first vehicle, wherein the first user settings comprise:
seat position for at least one seat;
mirror position for at least one mirror;
at least one climate control setting;
audio presets; and
music licensed to the user; converting the first user settings for the first vehicle to corresponding second user settings for the user for a second vehicle;
downloading the second user settings to the second vehicle; and
configuring the second vehicle using the second user settings.
End Embodiments Martin.1445 - Begin Embodiments Martin.1446
Embodiment 1. A computer system comprising:
at least one processor;
a memory coupled to the at least one processor;
first user settings corresponding to a user for a first device;
a conversion mechanism executed by the at least one processor, the conversion mechanism converting the first user settings for the first device to corresponding second user settings for the user for a second device; and
a software mechanism executed by the at least one processor that downloads the second user settings to the second device.
Embodiment 2. The computer system of Embodiment 1, wherein the first device comprises a first television receiver device and the second device comprises a second television receiver device.
Embodiment 3. The computer system of Embodiment 2, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise shows set to record.
Embodiment 4. The computer system of Embodiment 2, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise blocked channels and parental controls.
Embodiment 5. The computer system of Embodiment 2, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise at least one favorite channels list.
Embodiment 6. The computer system of Embodiment 2, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise at least one password.
Embodiment 7. The computer system of Embodiment 1, wherein the first device and the second device have the same hardware architecture type and have the same system software type.
Embodiment 8. The computer system of Embodiment 1, wherein the first device and the second device have different hardware architecture type and different system software type.
Embodiment 9. A computer-implemented method executing on at least one processor comprising:
storing first user settings corresponding to a user for a first device;
converting the first user settings for the first device to corresponding second user settings for the user for a second device; and
downloading the second user settings to the second device.
Embodiment 10. The method of Embodiment 9, further comprising:
configuring the second device using the second user settings.
Embodiment 11. The method of Embodiment 9, wherein the first device comprises a first television receiver device and the second device comprises a second television receiver device.
Embodiment 12. The method of Embodiment 11, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise shows set to record.
Embodiment 13. The method of Embodiment 11, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise blocked channels and parental controls.
Embodiment 14. The method of Embodiment 11, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise at least one favorite channels list.
Embodiment 15. The method of Embodiment 11, wherein the first user settings for the first television receiver device and the second user settings for the second television receiver device comprise at least one password.
Embodiment 16. The method of Embodiment 9, wherein the first device and the second device have the same hardware architecture type and have the same system software type.
Embodiment 17. The method of Embodiment 9, wherein the first device and the second device have different hardware architecture type and different system software type.
Embodiment 18. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor, the memory comprising a first plurality of user television settings defined by a user;
a television signal input for receiving a television signal from a television signal source;
a television signal output for sending a television signal to a television; and
a user settings transfer mechanism residing in the memory and executed by the at least one processor that reads a second plurality of user television settings from an external device coupled to the apparatus and programs at least one of the first plurality of user television settings based on at least one of the second plurality of user television settings.
Embodiment 19. The apparatus of Embodiment 18, wherein the external device comprises a computer system coupled via a network connection to the apparatus.
Embodiment 20. The apparatus of Embodiment 18, wherein the external device comprises a removable memory coupled to the apparatus.
Embodiment 21. The apparatus of Embodiment 18, further comprising a conversion mechanism executed by the at least one processor, the conversion mechanism converting the second plurality of user television settings read from the external device and generating therefrom at least one of the first plurality of user television settings.
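Illustrative sketch only: one way the conversion mechanism of Embodiment 21 might remap television settings between receivers whose providers use different channel numbering. The setting keys and the channel map are invented for this example.

    def convert_tv_settings(src_settings, channel_map):
        """Remap channel-number-based settings through a provider channel map;
        provider-independent settings copy through unchanged."""
        return {
            "favorites": [channel_map[ch] for ch in src_settings["favorites"] if ch in channel_map],
            "blocked": [channel_map[ch] for ch in src_settings["blocked"] if ch in channel_map],
            "recordings": list(src_settings["recordings"]),          # shows set to record
            "parental_controls": dict(src_settings["parental_controls"]),
            "passwords": dict(src_settings["passwords"]),
        }

    # Example: channel 5 on the first provider is channel 105 on the second.
    old = {"favorites": [5, 7], "blocked": [9], "recordings": ["Evening News"],
           "parental_controls": {"max_rating": "PG"}, "passwords": {"admin": "1234"}}
    print(convert_tv_settings(old, {5: 105, 7: 107, 9: 109}))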
End Embodiments Martin.1446 - Begin Embodiments Martin.1447
Embodiment 1. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
a plurality of device-specific templates for a first user that each includes a plurality of user settings for the first user for a corresponding physical device;
a master template corresponding to the first user that includes the plurality of user settings stored in the plurality of device-specific templates;
mapping information residing in the memory for mapping the user settings for the first user in the plurality of device-specific templates to the master template and for mapping the user settings for the first user in the master template to the plurality of device-specific templates; and
a user settings mechanism residing in the memory and executed by the at least one processor that receives a first user setting from a first physical device used by the first user, stores the first user setting in a first device-specific template corresponding to the first physical device, uses the mapping information to map the first user setting in the first device-specific template to a corresponding setting in the master template, and stores the corresponding setting in the master template.
Embodiment 2. The apparatus of Embodiment 1, wherein the user settings mechanism reads a second user setting for the first user from the master template, uses the mapping information to map the second user setting in the master template to a corresponding setting in a second device-specific template, stores the corresponding setting in the second device-specific template, and stores the corresponding setting to a physical device corresponding to the second device-specific template.
Embodiment 3. The apparatus of Embodiment 1, wherein the master template includes a superset of all user settings for the first user stored in all of the plurality of device-specific templates and wherein the master template is a repository for all of the user settings for the first user.
Embodiment 4. The apparatus of Embodiment 1, wherein the mapping information comprises information for mapping user settings from the plurality of device-specific templates to a plurality of universal templates that correspond to a plurality of device types, and information for mapping user settings from the universal templates to the master template.
Embodiment 5. The apparatus of Embodiment 4, wherein each of the plurality of device types is defined by a combination of hardware architecture and system software.
Embodiment 6. The apparatus of Embodiment 5, wherein the combination of hardware architecture and system software for each universal template is different than the combination of hardware architecture and system software for other universal templates.
Embodiment 7. The apparatus of Embodiment 1, wherein the user settings mechanism uses the mapping information to map the corresponding setting in the master template to a second corresponding setting in a second device-specific template corresponding to the first user, stores the second corresponding setting to the second device-specific template, and stores the second corresponding setting to a second physical device corresponding to the second device-specific template.
Embodiment 8. The apparatus of Embodiment 7, wherein the first physical device has a first combination of hardware architecture and system software and the second physical device has a second combination of hardware architecture and system software different than the first combination of hardware architecture and system software.
Embodiment 9. A computer-implemented method executed by a processor for managing user settings, the method comprising:
(A) selecting a physical device;
(B) receiving from the physical device a plurality of user settings for a first user;
(C) storing the plurality of user settings for the first user to a device-specific template corresponding to the selected physical device;
(D) repeating steps (A), (B) and (C) for a plurality of physical devices to store the plurality of user settings for the first user to a corresponding plurality of device-specific templates;
(E) reading mapping information that maps the plurality of user settings in each of the plurality of device-specific templates to a master template; and
(F) storing the plurality of user settings in the plurality of device-specific templates to the master template.
Embodiment 10. The method of Embodiment 9, further comprising:
reading a second user setting from the master template;
using the mapping information to map the second user setting in the master template to a corresponding setting in a second device-specific template;
storing the corresponding setting in the second device-specific template; and
storing the corresponding setting to a physical device corresponding to the second device-specific template.
Embodiment 11. The method of Embodiment 9, wherein the master template includes a superset of all user settings for the first user stored in all of the plurality of device-specific templates and wherein the master template is a repository for all of the user settings for the first user.
Embodiment 12. The method of Embodiment 9, wherein the mapping information comprises information for mapping user settings from the plurality of device-specific templates to a plurality of universal templates that correspond to a plurality of device types, and information for mapping user settings from the universal templates to the master template.
Embodiment 13. The method of Embodiment 12, wherein each of the plurality of device types is defined by a combination of hardware architecture and system software.
Embodiment 14. The method of Embodiment 13, wherein the combination of hardware architecture and system software for each universal template is different than the combination of hardware architecture and system software for other universal templates.
Embodiment 15. The method of Embodiment 9, further comprising:
using the mapping information to map the corresponding setting in the master template to a second corresponding setting in a second device-specific template corresponding to the first user;
storing the second corresponding setting to the second device-specific template; and
storing the second corresponding setting to a second physical device corresponding to the second device-specific template.
Embodiment 16. The method of Embodiment 15, wherein the first physical device has a first combination of hardware architecture and system software and the second physical device has a second combination of hardware architecture and system software different than the first combination of hardware architecture and system software.
Embodiment 17. A computer-implemented method executed by a processor for managing user settings, the method comprising:
(A) selecting a physical device;
(B) receiving from the physical device a plurality of user settings for a first user;
(C) storing the plurality of user settings for the first user to a device-specific template corresponding to the selected physical device;
(D) repeating steps (A), (B) and (C) for a plurality of physical devices to store the plurality of user settings for the first user to a corresponding plurality of device-specific templates;
(E) reading mapping information that maps the plurality of user settings in each of the plurality of device-specific templates to a master template, wherein the mapping information comprises information for mapping user settings from the plurality of device-specific templates to a plurality of universal templates that correspond to a plurality of device types, and information for mapping user settings from the universal templates to the master template, wherein the master template includes a superset of all user settings for the first user stored in all of the plurality of device-specific templates and wherein the master template is a repository for all of the user settings for the first user, wherein each of the plurality of device types is defined by a combination of hardware architecture and system software, and wherein the combination of hardware architecture and system software for each universal template is different than the combination of hardware architecture and system software for other universal templates;
(F) using the mapping information to store the plurality of user settings in the plurality of device-specific templates to the master template;
(G) the first user storing a second user setting for the first user to the master template;
(H) using the mapping information to map the second user setting in the master template to a corresponding setting in a second device-specific template;
(I) using the mapping information to store the corresponding setting in the second device-specific template; and
(J) storing the corresponding setting to a physical device corresponding to the second device-specific template.
Embodiment 18. The method of Embodiment 17, wherein the first physical device has a first combination of hardware architecture and system software and the second physical device has a second combination of hardware architecture and system software different than the first combination of hardware architecture and system software.
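Illustrative sketch only: the two-level mapping recited above (device-specific template to universal template to master template), expressed in Python with invented key names. A real implementation would populate these tables from the mapping information residing in the memory.

    # Hypothetical mapping tables: device-specific key -> universal key, per device type.
    DEVICE_TO_UNIVERSAL = {
        "phone_x": {"ringer_vol": "audio.volume", "bg_img": "display.wallpaper"},
        "tablet_y": {"media_volume": "audio.volume", "wallpaper": "display.wallpaper"},
    }
    UNIVERSAL_TO_MASTER = {"audio.volume": "user.volume", "display.wallpaper": "user.wallpaper"}

    def to_master(device, device_template, master):
        """Map a device-specific template into the master template."""
        for dev_key, value in device_template.items():
            uni_key = DEVICE_TO_UNIVERSAL[device].get(dev_key)
            if uni_key:
                master[UNIVERSAL_TO_MASTER[uni_key]] = value

    def from_master(device, master):
        """Project master-template settings back onto a device-specific template."""
        out = {}
        for dev_key, uni_key in DEVICE_TO_UNIVERSAL[device].items():
            master_key = UNIVERSAL_TO_MASTER[uni_key]
            if master_key in master:
                out[dev_key] = master[master_key]
        return out

    # A setting captured from one device type propagates to a different device type.
    master = {}
    to_master("phone_x", {"ringer_vol": 7, "bg_img": "beach.png"}, master)
    print(from_master("tablet_y", master))  # {'media_volume': 7, 'wallpaper': 'beach.png'}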
End Embodiments Martin.1447 - New Embodiments based on eReceipt Disclosure in Martin.1442
Embodiment 1. A computer system comprising:
at least one processor;
a memory coupled to the at least one processor; and
an electronic receipt (eReceipt) mechanism residing in the memory and executed by the at least one processor, the eReceipt mechanism sending an eReceipt to a user for a purchase of a product by the user from a seller, the eReceipt including a selectable warranty link that directs the user to a first web site for warranty service when the user selects the warranty link on a date less than a specified date and directs the user to a second web site different from the first web site for warranty service when the user selects the warranty link on a date greater than the specified date.
Embodiment 2. The computer system of Embodiment 1, wherein the eReceipt mechanism sends the eReceipt in an e-mail to the user.
Embodiment 3. The computer system of Embodiment 2, wherein the eReceipt mechanism sends the eReceipt in an attachment to the e-mail to the user.
Embodiment 4. The computer system of Embodiment 1, wherein the first web site allows the user to make a warranty claim to the seller.
Embodiment 5. The computer system of Embodiment 1, wherein the second web site allows the user to make a warranty claim to a manufacturer of the product.
Embodiment 6. The computer system of Embodiment 1, wherein the eReceipt includes a date of warranty expiration.
Embodiment 7. The computer system of Embodiment 6, wherein the eReceipt mechanism monitors the date of warranty expiration, and at a predetermined time prior to the date of warranty expiration, the eReceipt mechanism notifies the user of the date of warranty expiration.
Embodiment 8. The computer system of Embodiment 7, wherein the eReceipt mechanism further notifies the user that an extended warranty is available.
Embodiment 9. The computer system of Embodiment 1, wherein the first website validates the purchase of the product by the user using the eReceipt when the user makes a warranty claim.
Embodiment 10. The computer system of Embodiment 1, wherein the second website validates the purchase of the product by the user using the eReceipt when the user makes a warranty claim.
Embodiment 11. A computer system comprising:
at least one processor;
a memory coupled to the at least one processor; and
an electronic receipt (eReceipt) mechanism residing in the memory and executed by the at least one processor, the eReceipt mechanism receiving an eReceipt sent to a user for a purchase of a product by the user from a seller, the eReceipt including a selectable warranty link that directs the user to a first web site for warranty service when the user selects the warranty link on a date less than a specified date and directs the user to a second web site different from the first web site for warranty service when the user selects the warranty link on a date greater than the specified date.
Embodiment 12. The computer system of Embodiment 11, wherein the eReceipt mechanism monitors e-mails to the user for eReceipts.
Embodiment 13. The computer system of Embodiment 11, wherein the eReceipt mechanism detects the warranty link in the eReceipt, and in response, sets the warranty link to direct the user to the first web site when the user selects the warranty link.
Embodiment 14. The computer system of Embodiment 11, wherein the eReceipt mechanism determines when the specified date arrives, and in response, sets the warranty link to direct the user to the second web site when the user selects the warranty link.
Embodiment 15. The computer system of Embodiment 11, wherein the eReceipt mechanism comprises a search engine that allows the user to search a database of eReceipts received by the user.
Embodiment 16. A computer-implemented method executing on at least one processor comprising:
determining a warranty policy for a product purchased by a user;
formatting an electronic receipt (eReceipt) according to the warranty policy and according to a template that defines content for the eReceipt, wherein the eReceipt includes a selectable warranty link that directs the user to a first web site for warranty service when the user selects the warranty link on a date less than a specified date and directs the user to a second web site different from the first web site for warranty service when the user selects the warranty link on a date greater than the specified date; and
sending the eReceipt to the user.
Embodiment 17. The method of Embodiment 16, further comprising:
receiving the eReceipt by the user;
detecting the selectable warranty link in the eReceipt;
in response to detecting the selectable warranty link in the eReceipt, setting the warranty link to direct the user to the first web site regardless of date when the user selects the warranty link;
determining the specified date from the eReceipt; and
when the specified date arrives, setting the warranty link to direct the user to the second web site regardless of date when the user selects the warranty link.
Embodiment 18. The method of Embodiment 16, further comprising sending the eReceipt in an e-mail to the user.
Embodiment 19. The method of Embodiment 18, wherein the eReceipt is an attachment to the e-mail sent to the user.
Embodiment 20. The method of Embodiment 16, further comprising making a warranty claim to the seller on the first web site.
Embodiment 21. The method of Embodiment 16, further comprising making a warranty claim to a manufacturer of the product on the second web site.
Embodiment 22. The method of Embodiment 16, wherein the eReceipt includes a date of warranty expiration.
Embodiment 23. The method of Embodiment 22, further comprising:
monitoring the date of warranty expiration; and
at a predetermined time prior to the date of warranty expiration, notifying the user of the date of warranty expiration.
Embodiment 24. The method of Embodiment 23, further comprising notifying the user that an extended warranty is available.
Embodiment 25. The method of Embodiment 16, further comprising the first website validating the purchase of the product by the user using the eReceipt when the user makes a warranty claim.
Embodiment 26. The method of Embodiment 16, further comprising the second website validating the purchase of the product by the user using the eReceipt when the user makes a warranty claim.
Embodiment 27. A networked computer system comprising:
a first computer system comprising:
at least one processor;
a memory coupled to the at least one processor; and
a first electronic receipt (eReceipt) mechanism residing in the memory and executed by the at least one processor, the first eReceipt mechanism sending an eReceipt to a user for a purchase of a product by the user from a seller by sending an e-mail to the user that includes the eReceipt, the eReceipt including a selectable warranty link that directs the user to a first web site for warranty service when the user selects the warranty link on a date less than a specified date to allow the user to make a warranty claim to the seller, and directs the user to a second web site different from the first web site for warranty service when the user selects the warranty link on a date greater than the specified date to allow the user to make a warranty claim to a manufacturer of the product, wherein the eReceipt includes a date of warranty expiration;
a second computer system coupled via a network connection to the first computer system, the second computer system comprising:
at least one processor;
a memory coupled to the at least one processor; and
a second electronic receipt (eReceipt) mechanism residing in the memory and executed by the at least one processor that monitors e-mails to the user for eReceipts, the second eReceipt mechanism receiving from the first eReceipt mechanism an eReceipt sent in an e-mail to the user for the purchase of the product, wherein the second eReceipt mechanism monitors the date of warranty expiration, and at a predetermined time prior to the date of warranty expiration, the second eReceipt mechanism notifies the user of the date of warranty expiration and notifies the user when an extended warranty is available, wherein the second eReceipt mechanism comprises a search engine that allows the user to search a database of eReceipts received by the second eReceipt mechanism.
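Illustrative sketch only: the date-dependent link resolution that the eReceipt embodiments recite, expressed as a small Python function. The URLs and the cutover policy are placeholders; an actual eReceipt would typically embed a link that a server resolves at click time.

    from datetime import date

    def resolve_warranty_link(click_date, cutover, seller_url, manufacturer_url):
        """Direct the user to the seller's warranty site before the specified
        date and to the manufacturer's site on or after it."""
        return seller_url if click_date < cutover else manufacturer_url

    # Example: a 90-day seller window ending March 1 (hypothetical policy).
    cutover = date(2015, 3, 1)
    print(resolve_warranty_link(date(2015, 1, 15), cutover,
                                "https://seller.example/warranty",
                                "https://maker.example/warranty"))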
New Embodiments based on Medical Info Disclosure in Martin.1442
Embodiment 1. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
semi-private medical information for a user residing in the memory;
private medical information for the user residing in the memory; and
a medical information mechanism residing in the memory and executed by the at least one processor, the medical information mechanism detecting when a current location of the apparatus is a medical facility, and in response, displaying the semi-private medical information for the user while not displaying the private medical information for the user.
Embodiment 2. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
semi-private medical information for a user residing in the memory;
private medical information for the user residing in the memory;
biometric identification information for the user residing in the memory; and
a medical information mechanism residing in the memory and executed by the at least one processor, the medical information mechanism receiving biometric identification information, and when the biometric identification information matches the stored biometric identification information for the user, displaying the semi-private medical information for the user while not displaying the private medical information for the user.
Embodiment 3. The apparatus of Embodiment 2, wherein the medical information mechanism does not display the semi-private medical information and does not display the private medical information when the received biometric identification information does not match the stored biometric identification information for the user.
Embodiment 4. A method for accessing medical information for a user comprising:
storing semi-private medical information for the user;
storing private medical information for the user;
detecting when the user is in a medical facility; and
in response to detecting when the user is in the medical facility, displaying the semi-private medical information for the user while not displaying the private medical information for the user.
Embodiment 5. A method for accessing medical information for a user comprising:
storing semi-private medical information for the user on a portable device carried by the user;
storing private medical information for the user on the portable device;
the portable device detecting when the portable device is in a medical facility; and
in response to detecting when the portable device is in the medical facility, the portable device displays the semi-private medical information for the user while not displaying the private medical information for the user.
Embodiment 6. A method for accessing medical information for a user comprising:
storing semi-private medical information for the user;
storing private medical information for the user;
storing biometric identification information for the user;
receiving biometric identification information; and
when the received biometric identification information matches the stored biometric identification information for the user, displaying the semi-private medical information for the user while not displaying the private medical information for the user.
Embodiment 7. The method of Embodiment 6, wherein when the received biometric identification information does not match the stored biometric identification information for the user, not displaying the semi-private medical information and not displaying the private medical information.
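Illustrative sketch only: the gating logic of the medical-information embodiments, combining the location trigger and the biometric trigger in one function. The geofence test, coordinates, and biometric flag are stand-ins for a real location service and biometric sensor.

    # Hypothetical facility coordinates (latitude, longitude).
    FACILITIES = {(44.970, -93.260)}

    def at_medical_facility(lat, lon, facilities, radius_deg=0.01):
        """Crude geofence: true when within roughly one kilometer of a facility."""
        return any(abs(lat - f[0]) < radius_deg and abs(lon - f[1]) < radius_deg
                   for f in facilities)

    def visible_medical_info(semi_private, private, lat, lon, biometric_ok):
        """Display semi-private information when the device is at a medical
        facility or the biometric matches; private information is never
        displayed automatically, and nothing is shown otherwise."""
        if at_medical_facility(lat, lon, FACILITIES) or biometric_ok:
            return semi_private
        return {}

    print(visible_medical_info({"allergies": "penicillin"}, {"history": "..."},
                               44.9705, -93.2601, biometric_ok=False))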
New Embodiments for Martin.1443
Embodiment 1. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
information for a plurality of date-based events comprising:
at least one user-defined event; and
at least one system-derived event that is derived from the at least one user-defined event; and
a photo mechanism residing in the memory and executed by the at least one processor, the photo mechanism generating indexing information for a digital photo file based on a date in the digital photo file, the indexing information including at least one of the at least one user-defined event and the at least one system-derived event based on the date in the digital photo file.
Embodiment 2. The apparatus of Embodiment 1, wherein the at least one user-defined event comprises a birth date of a person and the at least one system-derived event comprises an age of the person computed from the date in the digital photo file and the birth date of the person.
Embodiment 3. The apparatus of Embodiment 2, wherein the at least one system-derived event comprises a birthday of the person.
Embodiment 4. The apparatus of Embodiment 1, wherein the at least one user-defined event comprises a wedding date of a person and the at least one system-derived event comprises an anniversary of the person computed from the date in the digital photo file and the wedding date of the person.
Embodiment 5. The apparatus of Embodiment 1, wherein the indexing information is stored separate from the digital photo file.
Embodiment 6. The apparatus of Embodiment 1, wherein the photo mechanism allows a user to add a new user-defined event or modify at least one user-defined event, and in response, updates the indexing information for a plurality of digital photo files to reflect the detected addition or modification by the user.
Embodiment 7. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
information for a plurality of locations comprising:
at least one user-defined location; and
at least one system-defined location; and
a photo mechanism residing in the memory and executed by the at least one processor, the photo mechanism generating indexing information for a digital photo file based on location data in the digital photo file, the indexing information including at least one of the at least one user-defined location and the at least one system-defined location that corresponds to the location data in the digital photo file.
Embodiment 8. The apparatus of Embodiment 7, wherein the indexing information is stored separate from the digital photo file.
Embodiment 9. A computer-implemented method executing on at least one processor comprising:
defining information for a plurality of date-based events including at least one user-defined event;
deriving at least one system-derived event from the at least one user-defined event; and
generating indexing information for a digital photo file based on a date in the digital photo file, the indexing information including at least one of the at least one user-defined event and the at least one system-derived event based on the date in the digital photo file.
Embodiment 10. The method of Embodiment 9, wherein the at least one user-defined event comprises a birth date of a person and the at least one system-derived event comprises an age of the person computed from the date in the digital photo file and the birth date of the person.
Embodiment 11. The method of Embodiment 10, wherein the at least one system-derived event comprises a birthday of the person.
Embodiment 12. The method of Embodiment 9, wherein the at least one user-defined event comprises a wedding date of a person and the at least one system-derived event comprises an anniversary of the person computed from the date in the digital photo file and the wedding date of the person.
Embodiment 13. The method of Embodiment 9, wherein the indexing information is stored separate from the digital photo file.
Embodiment 14. The method of Embodiment 9, wherein the photo mechanism allows a user to add a new user-defined event or modify at least one user-defined event, and in response, updates the indexing information for a plurality of digital photo files to reflect the detected addition or modification by the user.
Embodiment 15. A computer-implemented method executing on at least one processor comprising:
defining information for a plurality of locations including at least one user-defined location and at least one system-defined location; and
generating indexing information for a digital photo file based on location data in the digital photo file, the indexing information including at least one of the at least one user-defined location and the at least one system-defined location that corresponds to the location data in the digital photo file.
Embodiment 16. The method of Embodiment 15, wherein the indexing information is stored separate from the digital photo file.
Embodiment 17. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
people information residing in the memory for a plurality of people including at least one user-defined relationship between the plurality of people and at least one system-derived relationship between the plurality of people that is derived from the at least one user-defined relationship;
event information residing in the memory for a plurality of date-based events comprising at least one user-defined event and at least one system-derived event that is derived from the at least one user-defined event;
location information residing in the memory for a plurality of locations comprising at least one user-defined location and at least one system-defined location;
a plurality of digital photo files residing in the memory, the plurality of digital photo files including a plurality of existing tags defined by a user that each comprise a text label; and
a photo mechanism residing in the memory and executed by the at least one processor, the photo mechanism displaying a list of the plurality of existing tags to a user, receiving input from the user to correlate at least some of the plurality of existing tags with corresponding people information, corresponding event information, and corresponding location information, and generating indexing information for the plurality of digital photo files based on the corresponding people information, corresponding event information, and corresponding location information.
Embodiment 18. The apparatus of Embodiment 17, wherein the indexing information is stored separate from the digital photo file.
Embodiment 19. A computer-implemented method executing on at least one processor comprising:
providing people information for a plurality of people including at least one user-defined relationship between the plurality of people and at least one system-derived relationship between the plurality of people that is derived from the at least one user-defined relationship;
providing event information for a plurality of date-based events comprising at least one user-defined event and at least one system-derived event that is derived from the at least one user-defined event;
providing location information for a plurality of locations comprising at least one user-defined location and at least one system-defined location;
providing a plurality of digital photo files that include a plurality of existing tags defined by a user that each comprise a text label;
processing the plurality of photo files to display a list of the plurality of existing tags to a user;
receiving input from the user to correlate at least some of the plurality of existing tags with corresponding people information from the people information, with corresponding event information from the event information, and with corresponding location information from the location information; and
generating indexing information for the plurality of digital photo files based on the corresponding people information, corresponding event information, and corresponding location information.
Embodiment 20. The method of Embodiment 19, wherein the indexing information is stored separate from the digital photo file.
Embodiment 21. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
people information residing in the memory for a plurality of people including at least one user-defined relationship between the plurality of people and at least one system-derived relationship between the plurality of people that is derived from the at least one user-defined relationship;
event information residing in the memory for a plurality of date-based events comprising at least one user-defined event and at least one system-derived event that is derived from the at least one user-defined event;
location information residing in the memory for a plurality of locations comprising at least one user-defined location and at least one system-defined location; and
a photo mechanism residing in the memory and executed by the at least one processor, the photo mechanism processing a digital photo file that includes a date and geocode information that indicates where the digital photo file was generated, identifying a person in an image in the digital photo file by performing facial recognition that recognizes a face in the image and allows a user to correlate the recognized face to one of the plurality of people, generating at least one system-derived event for the identified person that is derived from the at least one user-defined event, the at least one system-derived event comprising at least one of:
an age of the person computed from the date in the digital photo file and the birth date of the person;
a birthday of the person computed from the date in the digital photo file and the birth date of the person; and
an anniversary of the person computed from the date of the digital photo file and the wedding date of the person;
the photo mechanism generating a location for the photo, where the location comprises at least one of a first user-defined location in the location information derived from the geocode information in the digital photo file and a first system-defined location derived in the location information from the geocode information in the digital photo file;
the photo mechanism generating indexing information for the digital photo file that includes the one of the plurality of people corresponding to the recognized face, at least one user-defined relationship between the one of the plurality of people and at least one other person, at least one system-derived relationship between the one of the plurality of people and at least one other person, at least one system-derived event for the one of the plurality of people, and location for the photo;
the photo mechanism storing the indexing information for the digital photo file;
the photo mechanism detecting when a user adds new people to the people information, modifies at least one of the plurality of people in the people information, modifies the at least one user-defined relationship in the people information, adds a new date -based event in the event information, modifies the at least one user-defined event in the event information, adds a new location to the location information, and modifies the at least one user-defined location in the location information; and
the photo mechanism updating the indexing information for a plurality of digital photo files to reflect the detected addition or modification by the user.
Embodiment 22. The apparatus of Embodiment 21, wherein the indexing information is stored separate from the digital photo file.
Embodiment 23. A networked computer system comprising:
a computer system comprising:
a first processor;
a memory coupled to the first processor;
people information residing in the memory for a plurality of people including at least one user-defined relationship between the plurality of people and at least one system-derived relationship between the plurality of people that is derived from the at least one user-defined relationship;
event information residing in the memory for a plurality of date-based events comprising at least one user-defined event and at least one system-derived event that is derived from the at least one user-defined event;
location information residing in the memory for a plurality of locations comprising at least one user-defined location and at least one system-defined location; and
a first photo mechanism residing in the memory and executed by the first processor, the first photo mechanism receiving a digital photo file from a device used by a user to take a picture;
the device used by the user coupled via a network connection to the computer system, the device comprising:
a second processor;
a second memory coupled to the second processor;
a digital camera that allows the user to take the picture, wherein the digital camera generates the digital photo file for the picture that includes a date and geocode information that indicates where the digital photo file was generated;
a second photo mechanism residing in the second memory and executed by the second processor, the second photo mechanism receiving the digital photo file from the digital camera and sending the digital photo file to the first photo mechanism in the computer system;
in response to the first photo mechanism receiving the digital photo file from the device, the first photo mechanism processes the digital photo file, identifies a person in an image in the digital photo file by performing facial recognition that recognizes a face in the image, and allows the user to correlate the recognized face to one of the plurality of people in the people information;
generating indexing information for the digital photo file based on the recognized face in the image;
determining from the geocode information and the location information a recognized location for the digital photo file;
adding to the indexing information for the digital photo file the recognized location;
determining from the date in the digital photo file and the event information the at least one recognized system-derived event comprising at least one of:
an age of the person computed from the date in the digital photo file and the birth date of the person;
a birthday of the person computed from the date in the digital photo file and the birth date of the person; and
an anniversary of the person computed from the date of the digital photo file and the wedding date of the person; and
adding to the indexing information for the digital photo file the recognized system-derived event.
Embodiment 24. The networked computer system of Embodiment 23, wherein the indexing information is stored separate from the digital photo file.
Embodiment 25. A method for processing digital photo files comprising:
a user taking a picture using a device that includes a digital camera that generates a digital photo file of the picture;
the device sending the digital photo file to a cloud-based computer system that includes:
people information for a plurality of people including at least one user-defined relationship between the plurality of people and at least one system-derived relationship between the plurality of people that is derived from the at least one user-defined relationship;
event information for a plurality of date-based events comprising at least one user-defined event and at least one system-derived event that is derived from the at least one user-defined event; and
location information for a plurality of locations comprising at least one user-defined location and at least one system-defined location;
in response to the cloud-based computer system receiving the digital photo file from the device, the cloud-based computer system performs the steps of:
identifying a person in an image in the digital photo file by performing facial recognition that recognizes a face in the image, and allows the user to correlate the recognized face to one of the plurality of people in the people information;
generating indexing information for the digital photo file based on the recognized face in the image;
determining from the geocode information and the location information a recognized location for the digital photo file;
adding to the indexing information for the digital photo file the recognized location;
determining from the date in the digital photo file and the event information the at least one recognized system-derived event comprising at least one of:
an age of the person computed from the date in the digital photo file and the birth date of the person;
a birthday of the person computed from the date in the digital photo file and the birth date of the person; and
an anniversary of the person computed from the date of the digital photo file and the wedding date of the person; and
adding to the indexing information for the digital photo file the recognized system-derived event.
Embodiment 26. The method of Embodiment 25, wherein the indexing information is stored separate from the digital photo file.
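Illustrative sketch only: deriving the system-derived events recited above (age, birthday, anniversary) from a user-defined birth date and wedding date and the date in a digital photo file. The date arithmetic shown is one plausible rule for this example, not the disclosed algorithm.

    from datetime import date
    from typing import Optional

    def derived_events(photo_date: date, birth: date, wedding: Optional[date]) -> dict:
        """Compute system-derived events for the date in a digital photo file."""
        events = {}
        # Age at the photo date, adjusting when the birthday has not yet occurred.
        age = photo_date.year - birth.year - (
            (photo_date.month, photo_date.day) < (birth.month, birth.day))
        events["age"] = age
        if (photo_date.month, photo_date.day) == (birth.month, birth.day):
            events["birthday"] = age
        if wedding and (photo_date.month, photo_date.day) == (wedding.month, wedding.day):
            events["anniversary"] = photo_date.year - wedding.year
        return events

    # A photo taken on the subject's 34th birthday and 9th wedding anniversary.
    print(derived_events(date(2014, 6, 14), date(1980, 6, 14), date(2005, 6, 14)))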
New Embodiments for Martin.1444
Embodiment 1. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
a display coupled to the at least one processor that displays a plurality of graphical icons that may be selected by a user touching the display;
an equipment communication interface that transmits a plurality of commands to control equipment external to the apparatus;
a database residing in the memory that includes a plurality of entries for a plurality of locations, wherein each of the plurality of entries includes at least one programming parameter for equipment at each location; and
a dynamic location-based programming mechanism residing in the memory and executed by the at least one processor, the dynamic location-based programming mechanism accessing the database to retrieve the at least one programming parameter for a specified location and dynamically programming the apparatus using the at least one programming parameter for the specified location to make the apparatus control equipment at the specified location by transmitting a plurality of commands via the equipment communication interface.
Embodiment 2. The apparatus of Embodiment 1, wherein the equipment communication interface comprises a wireless interface.
Embodiment 3. The apparatus of Embodiment 1, wherein the at least one programming parameter for the specified location comprises:
a channel map for a television provider at the specified location that provides television channels available from the television provider along with corresponding channel numbers; and
a specification of the equipment at the specified location.
Embodiment 4. The apparatus of Embodiment 3, wherein dynamically programming the apparatus comprises programming the plurality of graphical icons on the display to correspond to the channel map for the television provider at the specified location and programming the apparatus to control the equipment at the specified location using programming codes for controlling the equipment at the specified location.
Embodiment 5. The apparatus of Embodiment 1, wherein a user inputs information to the apparatus that identifies the specified location.
Embodiment 6. The apparatus of Embodiment 1, further comprising an interface for reading a machine-readable code, wherein the specified location is determined by the interface reading a machine-readable code corresponding to the specified location.
Embodiment 7. The apparatus of Embodiment 1, wherein the equipment communication interface transmits the plurality of commands to external hardware that, in turn, transmits a corresponding plurality of codes to the equipment external to the apparatus.
Embodiment 8. The apparatus of Embodiment 1, wherein the apparatus comprises a mobile phone running a corresponding application.
Embodiment 9. The apparatus of Embodiment 1, wherein the apparatus comprises a tablet computer running a corresponding application.
Embodiment 10. The apparatus of Embodiment 1, wherein the apparatus comprises a dedicated universal remote control.
Embodiment 11. A method for programming a remote control comprising:
specifying a location;
accessing a database that includes a plurality of entries for a plurality of locations including the specified location, wherein each of the plurality of entries includes at least one programming parameter for equipment at each location;
retrieving from the database the at least one programming parameter for the specified location; and
dynamically programming the remote control using the at least one programming parameter for the specified location to make the remote control able to control the equipment at the specified location by transmitting a plurality of commands via an equipment communication interface to the equipment at the specified location.
Embodiment 12. The method of Embodiment 11, wherein the equipment communication interface comprises a wireless interface.
Embodiment 13. The method of Embodiment 11, wherein the at least one programming parameter for the specified location comprises:
a channel map for a television provider at the specified location that provides television channels available from the television provider along with corresponding channel numbers; and
a specification of the equipment at the specified location.
Embodiment 14. The method of Embodiment 13, wherein dynamically programming the remote control comprises programming a plurality of graphical icons on a display on the remote control to correspond to the channel map for the television provider at the specified location and programming the remote control to control the equipment at the specified location using programming codes for controlling the equipment at the specified location.
Embodiment 15. The method of Embodiment 11, further comprising a user inputting information to the remote control that identifies the specified location.
Embodiment 16. The method of Embodiment 11, further comprising the remote control determining the specified location by reading a machine-readable code corresponding to the specified location.
Embodiment 17. The method of Embodiment 11, wherein the remote control transmits the plurality of commands to external hardware that, in turn, transmits a corresponding plurality of codes to equipment.
Embodiment 18. The method of Embodiment 11, wherein the remote control comprises a mobile phone running a corresponding application.
Embodiment 19. The method of Embodiment 11, wherein the remote control comprises a tablet computer running a corresponding application.
Embodiment 20. The method of Embodiment 11, wherein the remote control comprises a dedicated universal remote control.
Embodiment 21. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
a display coupled to the at least one processor that displays a plurality of graphical icons that may be selected by a user touching the display;
an equipment communication interface that transmits a plurality of commands to control equipment external to the apparatus; and
a dynamic location-based programming mechanism residing in the memory and executed by the at least one processor, the dynamic location-based programming mechanism specifying a location, accessing a database to determine a television provider for the specified location, to determine a channel map for the specified television provider, to determine equipment at the specified location, and to determine programming codes for the equipment at the specified location, and dynamically programming the apparatus to control the equipment at the specified location using the channel map for the specified television provider at the specified location and the programming codes for the equipment at the specified location.
Embodiment 22. The apparatus of Embodiment 21, wherein dynamically programming the apparatus comprises programming the plurality of graphical icons on the display to correspond to the channel map for the television provider at the specified location.
Embodiment 23. The apparatus of Embodiment 22, wherein a user inputs information to the apparatus that identifies the specified location.
Embodiment 24. The apparatus of Embodiment 21, further comprising an interface for reading a machine-readable code, wherein the specified location is determined by the interface reading a machine-readable code corresponding to the specified location.
Embodiment 25. The apparatus of Embodiment 21, wherein the apparatus comprises a mobile phone running a corresponding application.
Embodiment 26. The apparatus of Embodiment 21, wherein the apparatus comprises a tablet computer running a corresponding application.
Embodiment 27. The apparatus of Embodiment 21, wherein the apparatus comprises a dedicated universal remote control.
Embodiment 28. A method for programming a remote control comprising:
specifying a location;
accessing a database to determine a television provider at the specified location;
accessing the database to determine a channel map for the specified television provider at the specified location;
accessing the database to determine equipment at the specified location;
accessing the database to determine programming codes for the equipment at the specified location; and
dynamically programming the remote control to control the equipment at the specified location using the channel map for the specified television provider at the specified location and the programming codes for the equipment at the specified location.
Embodiment 29. The method of Embodiment 28, wherein dynamically programming the remote control comprises programming a plurality of graphical icons on a display of the remote control to correspond to the channel map for the television provider at the specified location.
Embodiment 30. The method of Embodiment 29, further comprising a user inputting information to the remote control that identifies the specified location.
Embodiment 31. The method of Embodiment 28, wherein the remote control includes an interface for reading a machine-readable code, and further comprising determining the specified location by the interface reading a machine-readable code corresponding to the specified location.
Embodiment 32. The method of Embodiment 28, wherein the remote control comprises a mobile phone running a corresponding application.
Embodiment 33. The method of Embodiment 28, wherein the remote control comprises a tablet computer running a corresponding application.
Embodiment 34. The method of Embodiment 28, wherein the remote control comprises a dedicated universal remote control.
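Illustrative sketch only: the dynamic location-based programming flow, with an invented database entry. A real system would query a hosted database keyed by the specified location and transmit the resulting codes over the equipment communication interface.

    # Hypothetical database entry for one location.
    LOCATION_DB = {
        "hotel_123": {
            "provider": "CableCo",
            "channel_map": {"ESPN": 32, "CNN": 14},
            "equipment": {"tv": "BrandA", "codes": {"power": "0xA90", "vol_up": "0x490"}},
        },
    }

    def program_remote(location_id):
        """Build a remote configuration (icon-to-channel and command-to-code
        tables) from the database entry for the specified location."""
        entry = LOCATION_DB[location_id]
        return {
            "icons": dict(entry["channel_map"]),            # graphical icons map to channels
            "ir_codes": dict(entry["equipment"]["codes"]),  # codes for the local equipment
        }

    config = program_remote("hotel_123")
    print(config["icons"]["ESPN"], config["ir_codes"]["power"])  # 32 0xA90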
New Embodiments for Martin.1445
Embodiment 1. A vehicle comprising:
at least one processor;
a memory coupled to the at least one processor;
a plurality of user settings corresponding to a user residing in the memory, wherein the plurality of user settings comprises at least one of:
seat position for at least one seat in the vehicle;
mirror position for at least one mirror on the vehicle;
at least one climate control setting for the vehicle;
audio presets for the vehicle; and
music licensed to the user;
an interface for the vehicle to communicate with a system external to the vehicle; and
a mechanism for sending the plurality of user settings corresponding to the user to the system external to the vehicle.
Embodiment 2. The vehicle of Embodiment 1, wherein the system external to the vehicle comprises a cloud-based computer system.
Embodiment 3. The vehicle of Embodiment 1, wherein the system external to the vehicle comprises a handheld device.
Embodiment 4. A method for storing user settings in a vehicle to a cloud-based computer system comprising:
a user defining a plurality of user settings for a vehicle, wherein the plurality of user settings comprises at least one of:
seat position for at least one seat in the vehicle;
mirror position for at least one mirror on the vehicle;
at least one climate control setting for the vehicle;
audio presets for the vehicle; and
music licensed to the user;
communicating with a system external to the vehicle; and
sending the plurality of user settings corresponding to the user to the system external to the vehicle.
Embodiment 5. The method of Embodiment 4, wherein the system external to the vehicle comprises a cloud-based computer system.
Embodiment 6. The method of Embodiment 4, wherein the system external to the vehicle comprises a handheld device.
Embodiment 7. A computer system comprising:
at least one processor;
a memory coupled to the at least one processor;
first user settings corresponding to a user for a first vehicle from a first manufacturer residing in the memory, wherein the first user settings comprise at least one of:
seat position for at least one seat in the vehicle;
mirror position for at least one mirror on the vehicle;
at least one climate control setting for the vehicle;
audio presets for the vehicle; and
music licensed to the user;
a conversion mechanism executed by the at least one processor, the conversion mechanism converting the first user settings for the first vehicle to corresponding second user settings for the user for a second vehicle from a second manufacturer different than the first manufacturer; and
a software mechanism executed by the at least one processor that downloads the second user settings to the second vehicle.
Embodiment 8. A computer-implemented method executing on at least one processor comprising:
storing first user settings corresponding to a user for a first vehicle from a first manufacturer, wherein the first user settings comprise at least one of:
seat position for at least one seat in the vehicle;
mirror position for at least one mirror on the vehicle;
at least one climate control setting for the vehicle;
audio presets for the vehicle; and
music licensed to the user;
converting the first user settings for the first vehicle to corresponding second user settings for the user for a second vehicle from a second manufacturer different than the first manufacturer; and
downloading the second user settings to the second vehicle.
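Illustrative sketch only: the vehicle-to-external-system step of the embodiments above, serializing the enumerated setting categories for transmission. The payload shape, endpoint, and user identifier are assumptions, not a defined protocol.

    import json

    def collect_vehicle_settings():
        """Gather the setting categories enumerated in the embodiments above
        (values here are placeholders)."""
        return {
            "seat_position": {"driver": 42},
            "mirror_position": {"left": 12, "right": 15},
            "climate": {"temp_f": 70},
            "audio_presets": ["98.5 FM", "101.3 FM"],
            "licensed_music": ["track_001", "track_002"],  # references, not media files
        }

    def send_to_external_system(settings):
        """Serialize for transmission; a real vehicle would send this over its
        interface to a cloud-based computer system or a paired handheld device."""
        return json.dumps({"user": "user-123", "settings": settings}).encode()

    payload = send_to_external_system(collect_vehicle_settings())
    print(payload[:60])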
New Embodiments for Martin.1446
Embodiment 1. A method for programming a television receiver apparatus comprising:
storing in the television receiver apparatus a first plurality of user television settings corresponding to a user;
reading by the television receiver apparatus a second plurality of user television settings from an external device coupled to the television receiver apparatus; and
programming at least one of the first plurality of user television settings based on at least one of the second plurality of user television settings.
Embodiment 2. The method of Embodiment 1, wherein the external device comprises a computer system coupled via a network connection to the television receiver apparatus.
Embodiment 3. The method of Embodiment 1, wherein the external device comprises a removable memory coupled to the television receiver apparatus.
Embodiment 4. The method of Embodiment 1, further comprising converting the second plurality of user television settings read from the external device and generating therefrom at least one of the first plurality of user television settings.
Embodiment 5. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
a first plurality of user television settings corresponding to a user for a first television receiver device, the first television receiver device comprising a first tuner that receives a first television signal input from a first television signal source, and a first television signal output that is output to a first television external to the first television receiver device;
a conversion mechanism executed by the at least one processor, the conversion mechanism converting the first plurality of user settings for the first television receiver device to a corresponding second plurality of user settings for the user for a second television receiver device, the second television receiver device comprising a second tuner that receives a second television signal input from a second television signal source, and a second television signal output that is output to a second television external to the second television receiver device; and
a software mechanism executed by the at least one processor that downloads the second plurality of user settings to the second television receiver device.
Embodiment 6. The apparatus of Embodiment 5, wherein the first plurality of user settings comprises at least one of:
at least one favorite channel list;
at least one show set to record;
at least one blocked channel;
at least one parental control; and
at least one password.
Embodiment 7. The apparatus of Embodiment 5, wherein the second plurality of user settings comprises at least one of:
at least one favorite channel list;
at least one show set to record;
at least one blocked channel;
at least one parental control; and
at least one password.
Embodiment 8. The apparatus of Embodiment 5, wherein the first television receiver device and the second television receiver device have the same hardware architecture type and the same system software type.
Embodiment 9. The apparatus of Embodiment 5, wherein the first television receiver device and the second television receiver device have different hardware architecture types and different system software types.
Embodiment 10. A method for transferring settings from a first television receiver device to a second television receiver device comprising:
storing a first plurality of user television settings corresponding to a user for the first television receiver device, the first television receiver device comprising a first tuner that receives a first television signal input from a first television signal source, and a first television signal output that is output to a first television external to the first television receiver device;
converting the first plurality of user settings for the first television receiver device to a corresponding second plurality of user settings for the user for the second television receiver device, the second television receiver device comprising a second tuner that receives a second television signal input from a second television signal source, and a second television signal output that is output to a second television external to the second television receiver device; and
downloading the second plurality of user settings to the second television receiver device.
Embodiment 11. The method of Embodiment 10, wherein the first plurality of user settings comprises at least one of:
at least one favorite channel list;
at least one show set to record;
at least one blocked channel;
at least one parental control; and
at least one password.
Embodiment 12. The method of Embodiment 10, wherein the second plurality of user settings comprises at least one of:
at least one favorite channel list;
at least one show set to record;
at least one blocked channel;
at least one parental control; and
at least one password.
Embodiment 13. The method of Embodiment 10, wherein the first television receiver device and the second television receiver device have the same hardware architecture type and the same system software type.
Embodiment 14. The method of Embodiment 10, wherein the first television receiver device and the second television receiver device have different hardware architecture types and different system software types.
Embodiment 15. A television receiver apparatus comprising:
at least one processor;
a memory coupled to the at least one processor, the memory comprising a first plurality of user television settings;
a tuner for receiving a television signal input from a television signal source;
a television signal output for sending a television signal to a television external to the television receiver apparatus;
a user settings transfer mechanism residing in the memory and executed by the at least one processor that reads a second plurality of user television settings from an external device coupled to the television receiver apparatus and determines whether the second plurality of user television settings are from a device of the same type as the television receiver apparatus; when the second plurality of user television settings are from a device of the same type as the television receiver apparatus, the user settings transfer mechanism programs at least one of the first plurality of user television settings based on at least one of the second plurality of user television settings, and when the second plurality of user television settings are from a device of a different type than the television receiver apparatus, the user settings transfer mechanism converts the second user settings to corresponding first user settings for the television receiver apparatus and programs at least one of the first plurality of user television settings in the television receiver apparatus using the converted second user settings.
Embodiment 16. The television receiver apparatus of Embodiment 15, wherein the first plurality of user settings comprises at least one of:
at least one favorite channel list;
at least one show set to record;
at least one blocked channel;
at least one parental control; and
at least one password.
Embodiment 17. The television receiver apparatus of Embodiment 15, wherein the second plurality of user settings comprises at least one of:
at least one favorite channel list;
at least one show set to record;
at least one blocked channel;
at least one parental control; and
at least one password.
Embodiment 18. The television receiver apparatus of Embodiment 15, wherein the external device comprises a computer system coupled via a network connection to the television receiver apparatus.
Embodiment 19. The television receiver apparatus of Embodiment 15, wherein the external device comprises a removable memory coupled to the television receiver apparatus.
Embodiment 20. A method for transferring settings from a second television receiver device to a first television receiver device comprising:
storing a first plurality of user television settings corresponding to a user for the first television receiver device, the first television receiver device comprising a first tuner that receives a first television signal input from a first television signal source, and a first television signal output that is output to a first television external to the first television receiver device;
reading a second plurality of user television settings from an external device coupled to the first television receiver device;
determining whether the second plurality of user television settings are from a device of the same type as the first television receiver device;
when the second plurality of user television settings are from a device of the same type as the first television receiver device, programming at least one of the first plurality of user television settings based on at least one of the second plurality of user television settings; and
when the second plurality of user television settings are from a device of a different type than the first television receiver device, converting the second user settings to corresponding first user settings for the first television receiver device and programming at least one of the first plurality of user television settings in the first television receiver device using the converted second user settings.
Embodiment 21. The method of Embodiment 20, wherein the first plurality of user settings comprises at least one of:
at least one favorite channel list;
at least one show set to record;
at least one blocked channel;
at least one parental control; and
at least one password.
Embodiment 22. The method of Embodiment 20, wherein the second plurality of user settings comprises at least one of:
at least one favorite channel list;
at least one show set to record;
at least one blocked channel;
at least one parental control; and
at least one password.
Embodiment 23. The method of Embodiment 20, wherein the external device comprises a computer system coupled via a network connection to the first television receiver device.
Embodiment 24. The method of Embodiment 20, wherein the external device comprises a removable memory coupled to the first television receiver device.
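Embodiments 15 through 24 turn on a same-type/different-type determination: settings read from a like device are applied directly, while settings from a different device type are first converted. A minimal Python sketch of that decision follows; the device-type tags, setting keys, and channel-number map are hypothetical illustrations, not the patent's actual data format.

```python
# Minimal sketch (not the patent's actual format) of the same-type /
# different-type transfer decision recited in Embodiments 15-24.

def convert_settings(settings: dict) -> dict:
    """Hypothetical conversion from another receiver type's representation."""
    # Map the other lineup's channel numbers onto this receiver's lineup;
    # channels with no local equivalent are dropped.
    CHANNEL_MAP = {2: 102, 7: 107, 11: 111}
    return {
        "favorite_channels": [CHANNEL_MAP[c]
                              for c in settings["favorite_channels"]
                              if c in CHANNEL_MAP],
        "parental_rating_limit": settings["parental_rating_limit"],
    }

def transfer_tv_settings(receiver: dict, external: dict) -> dict:
    """Program a receiver from settings read off an external device."""
    if external["device_type"] == receiver["device_type"]:
        incoming = external["settings"]                    # same type: apply directly
    else:
        incoming = convert_settings(external["settings"])  # different type: convert first
    receiver["settings"].update(incoming)
    return receiver

receiver = {"device_type": "brandX-dvr", "settings": {}}
external = {"device_type": "brandY-stb",
            "settings": {"favorite_channels": [2, 7, 44],
                         "parental_rating_limit": "TV-PG"}}
print(transfer_tv_settings(receiver, external))
# Favorites map to [102, 107] on the new lineup; channel 44 had no equivalent.
```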
New Embodiments for Martin.1447
Embodiment 1. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor;
a plurality of device-specific templates for a first user that each includes a plurality of user settings for the first user for a corresponding physical device;
a master template corresponding to the first user that includes the plurality of user settings stored in the plurality of device-specific templates;
mapping information residing in the memory for mapping the user settings for the first user in the plurality of device-specific templates to the master template and for mapping the user settings for the first user in the master template to the plurality of device-specific templates; and
a user settings mechanism residing in the memory and executed by the at least one processor that reads a first user setting for the first user from the master template, uses the mapping information to map the first user setting in the master template to a corresponding setting in a first device-specific template, stores the corresponding setting in the first device-specific template, and stores the corresponding setting to a first physical device corresponding to the first device-specific template.
Embodiment 2. The apparatus of Embodiment 1, wherein the user settings mechanism receives a second user setting from a second physical device used by the first user, stores the second user setting in a second device-specific template corresponding to the second physical device, uses the mapping information to map the second user setting in the second device-specific template to a corresponding setting in the master template, and stores the corresponding setting in the master template.
Embodiment 3. The apparatus of Embodiment 1, wherein the master template includes a superset of all user settings for the first user stored in all of the plurality of device-specific templates and wherein the master template is a repository for all of the user settings for the first user.
Embodiment 4. The apparatus of Embodiment 1, wherein the mapping information comprises information for mapping user settings from the plurality of device-specific templates to a plurality of universal templates that correspond to a plurality of device types, and information for mapping user settings from the universal templates to the master template.
Embodiment 5. The apparatus of Embodiment 4, wherein each of the plurality of device types is defined by a combination of hardware architecture and system software.
Embodiment 6. The apparatus of Embodiment 5, wherein the combination of hardware architecture and system software for each universal template is different than the combination of hardware architecture and system software for other universal templates.
Embodiment 7. The apparatus of Embodiment 2, wherein the user settings mechanism uses the mapping information to map the corresponding setting in the master template to a third corresponding setting in a third device-specific template corresponding to the first user, stores the third corresponding setting to the third device-specific template, and stores the third corresponding setting to a third physical device corresponding to the third device-specific template.
Embodiment 8. The apparatus of Embodiment 7, wherein the first physical device has a first combination of hardware architecture and system software and the second physical device has a second combination of hardware architecture and system software different than the first combination of hardware architecture and system software.
Embodiment 9. A computer-implemented method executed by a processor for managing user settings, the method comprising:
reading a first user setting from a master template;
reading mapping information that maps the plurality of user settings in the master template to each of a plurality of device-specific templates;
using the mapping information to map the first user setting in the master template to a corresponding setting in a device-specific template;
storing the corresponding setting in the device-specific template; and
storing the corresponding setting to a physical device corresponding to the device-specific template.
Embodiment 10. The method of Embodiment 9, further comprising:
(A) selecting a physical device;
(B) receiving from the physical device a plurality of user settings for the first user;
(C) storing the plurality of user settings for the first user to a device-specific template corresponding to the selected physical device;
(D) repeating steps (A), (B) and (C) for a plurality of physical devices to store the plurality of user settings for the first user to a corresponding plurality of device-specific templates;
(E) reading mapping information that maps the plurality of user settings in each of the plurality of device-specific templates to the master template; and
(F) storing the plurality of user settings in the plurality of device-specific templates to the master template.
Embodiment 11. The method of Embodiment 9, wherein the master template includes a superset of all user settings for the first user stored in all of the plurality of device-specific templates and wherein the master template is a repository for all of the user settings for the first user.
Embodiment 12. The method of Embodiment 9, wherein the mapping information comprises information for mapping user settings from the plurality of device-specific templates to a plurality of universal templates that correspond to a plurality of device types, and information for mapping user settings from the universal templates to the master template.
Embodiment 13. The method of Embodiment 12, wherein each of the plurality of device types is defined by a combination of hardware architecture and system software.
Embodiment 14. The method of Embodiment 13, wherein the combination of hardware architecture and system software for each universal template is different than the combination of hardware architecture and system software for other universal templates.
Embodiment 15. The method of Embodiment 10, further comprising:
using the mapping information to map the corresponding setting in the master template to a third corresponding setting in a third device-specific template corresponding to the first user;
storing the third corresponding setting to the third device-specific template; and
storing the third corresponding setting to a third physical device corresponding to the third device-specific template.
Embodiment 16. The method of Embodiment 15, wherein the first physical device has a first combination of hardware architecture and system software and the second physical device has a second combination of hardware architecture and system software different than the first combination of hardware architecture and system software.
[0574] While the implementation envisioned herein has the majority of storage for the user's information in the cloud, this cloud-based implementation may not be the ultimate end game. As technology advances, a handheld device like a smart phone could eventually have the capability of storing all of a user's information and could have sufficient computing capacity to perform all needed processing. Thus, the apparatus 300 shown in FIG. 3 could be representative of a handheld device, with the Universal Me system 100 running on the handheld device. In this scenario, when a user wants to program a television receiver or other device with the user's settings, the Universal Me app running on the user's smart phone could perform any needed conversion between different device types, and could download the settings to the device via the smart phone's local interface (e.g., Bluetooth). Similarly, when a user rents a rental car, the user could scan a machine-readable code or enter a code into the user's smart phone to identify the rental car. The Universal Me app running on the smart phone then determines the type of car corresponding to the code, and when the Universal Me app does not have settings for this type of car, the Universal Me app converts the user's settings to corresponding settings for this rental car and downloads the settings to the car via the smart phone's local interface (e.g., Bluetooth). As shown by these simple examples, when the storage and computing capacity of smart phones becomes sufficiently large, most of the processing in the Universal Me system could be performed on the smart phone. The cloud would continue to be used for backup and for virtual devices, but much of the user's information could be stored on the user's smart phone, which could then be used to perform the many functions discussed herein without accessing the user's U-Me account in the cloud.
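The Martin.1447 embodiments describe device-specific templates, a master template, and mapping information that moves settings in both directions; per paragraph [0574], the same logic could run in the cloud or locally on a smart phone. The Python sketch below shows one way such a mapping could be realized; all key names and the mapping table are hypothetical illustrations, not taken from the specification.

```python
# Minimal sketch of the master-template / device-specific-template mapping.
# The device models, setting keys, and mapping table are hypothetical.

MAPPING = {
    # device model -> {master-template key: device-specific key}
    "phoneX": {"ringer_volume": "audio.ring_level"},
    "phoneY": {"ringer_volume": "sound/ringtone_vol"},
}

master_template = {"ringer_volume": 7}           # superset repository of settings
device_templates = {"phoneX": {}, "phoneY": {}}  # one template per physical device

def push_setting(key: str, device: str) -> dict:
    """Map one master-template setting into a device-specific template."""
    device_key = MAPPING[device][key]
    device_templates[device][device_key] = master_template[key]
    # A real system would now download the device-specific template
    # to the corresponding physical device.
    return device_templates[device]

def pull_setting(device: str, device_key: str, value) -> None:
    """Propagate a setting changed on one device back to the master template."""
    device_templates[device][device_key] = value
    # Reverse-map the device-specific key to the master-template key.
    reverse = {v: k for k, v in MAPPING[device].items()}
    master_template[reverse[device_key]] = value

print(push_setting("ringer_volume", "phoneX"))   # {'audio.ring_level': 7}
pull_setting("phoneY", "sound/ringtone_vol", 5)  # user changed it on phoneY
print(master_template)                           # {'ringer_volume': 5}
```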
[0575] One of the long-term goals of the U-Me system is to create a place for all of a user's data, licensed content and settings. This includes settings for devices we have not yet dreamed of. The philosophy is very simple. If the user is going to spend time configuring any device to the user's liking by specifying settings for the device, let's store those settings in the user's U-Me account so those settings can be used to reconfigure an identical or similar device, or so those settings can be used to generate suitable settings for a different kind of device. Thus, if a person spends hours configuring a computerized sewing machine to perform certain functions by specifying many different settings for the sewing machine, let's store those settings to the user's U-Me account. This philosophy can extend to any and all devices known today and developed in the future.
[0576] The specification herein uses different terms for phones, including cell phones, smart phones, and just "phones." These are all examples of different mobile phones. The disclosure and claims herein expressly extend to any and all mobile phones, whether currently known or developed in the future.
[0577] The specification herein discusses different types of computing devices, including smart phones, tablets, laptop computers, and desktop computers. The term "computer system" or "computer" as used herein can extend to any or all of these devices, as well as other devices, whether currently known or developed in the future. In one specific context, a computer system is a laptop or desktop computer system, which is a different type than a phone or a tablet.
[0578] The disclosure herein uses some shortened terms for the sake of simplicity. For example, the word "information" is shortened in many instances to "info", the word "photograph" is shortened in many instances to "photo", the word "specifications" is shortened in some instances to "specs", and the word "medications" is shortened in some instances to "meds." Other shortened or colloquial terms may appear in the specification and drawings, which will be understood by those of ordinary skill in the art.
[0579] Many trademarks and service marks have been referenced in this patent application. Applicant has filed US federal service mark applications for "Universal Me" and for "U-Me". All other trademarks and service marks herein are the property of their respective owners, and applicant claims no rights in these other marks.
[0580] One skilled in the art will appreciate that many variations are possible within the scope of the claims. Thus, while the disclosure is particularly shown and described above, it will be understood by those skilled in the art that these and other changes in form and details may be made therein without departing from the spirit and scope of the claims. For example, while the specific examples in the figures and discussed above relate to various specific devices for the purpose of illustration, the principles herein apply equally to any type of device that has user settings, whether currently known or developed in the future.

Claims

Claims and Abstract from Martin.1442
CLAIMS
1. A computer system comprising:
at least one processor;
a memory coupled to the at least one processor;
user data residing in the memory corresponding to a first user;
first user settings corresponding to the first user for a plurality of software applications residing in the memory;
second user settings corresponding to the first user for a plurality of physical devices; and
a software mechanism executed by the at least one processor that makes the user data, the first user settings, and the second user settings available to the first user on a first device used by the first user.
2. The computer system of claim 1 further comprising an authentication mechanism that authenticates the first user on the first device before the software mechanism makes the user data, the first user settings, and the second user settings available to the first user on the first device.
3. The computer system of claim 2 wherein the authentication mechanism uses biometric authentication of the first user.
4. The computer system of claim 3 wherein the biometric authentication comprises scanning a fingerprint of the first user and comparing the scanned fingerprint against a previously-stored reference fingerprint for the first user.
5. The computer system of claim 1 wherein the user data corresponding to the first user comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos.
6. The computer system of claim 1 further comprising licensed content in the memory that is licensed to the first user, wherein the software mechanism makes the licensed content available to the first user on the first device.
7. The computer system of claim 6 wherein the licensed content comprises music, movies, electronic books, software and games.
8. The computer system of claim 1 wherein the first user settings comprise first user preferences for each of the plurality of software applications.
9. The computer system of claim 1 wherein the second user settings comprise second user preferences for each of the plurality of physical devices.
10. The computer system of claim 1 wherein the plurality of physical devices comprises a mobile phone and a computer system.
11. The computer system of claim 1 further comprising a conversion mechanism that converts user settings for a first physical device to user settings for a second physical device.
12. The computer system of claim 11 wherein the first physical device and second physical device are of a similar type.
13. The computer system of claim 11 wherein the first physical device and second physical device are of different types.
14. A computer-implemented method executing on at least one processor comprising the steps of:
storing user data corresponding to a first user;
storing first user settings corresponding to the first user for a plurality of software applications;
storing second user settings corresponding to the first user for a plurality of physical devices; and
making the user data, the first user settings, and the second user settings available to the first user on a first device used by the first user.
15. The method of claim 14 further comprising the step of authenticating the first user on the first device before making the user data, the first user settings, and the second user settings available to the first user on the first device.
16. The method of claim 15 wherein the step of authenticating the first user performs biometric authentication of the first user.
17. The method of claim 16 wherein the biometric authentication comprises scanning a fingerprint of the first user and comparing the scanned fingerprint against a previously-stored reference fingerprint for the first user.
18. The method of claim 14 wherein the user data corresponding to the first user comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos.
19. The method of claim 14 further comprising the step of storing licensed content that is licensed to the first user and making the licensed content available to the first user on the first device.
20. The method of claim 19 wherein the licensed content comprises music, movies, electronic books, software and games.
21. The method of claim 14 wherein the first user settings comprise first user preferences for each of the plurality of software applications.
22. The method of claim 14 wherein the second user settings comprise second user preferences for each of the plurality of physical devices.
23. The method of claim 14 wherein the plurality of physical devices comprises a mobile phone and a computer system.
24. The method of claim 14 further comprising the step of converting user settings for a first physical device to user settings for a second physical device.
25. The method of claim 24 wherein the first physical device and second physical device are of a similar type.
26. The method of claim 24 wherein the first physical device and second physical device are of different types.
27. A computer-implemented method executing on at least one processor comprising the steps of:
storing user data corresponding to a first user, wherein the user data corresponding to the first user comprises files, contacts, e-mail, calendar, tasks, reminders and digital photos;
storing licensed content that is licensed to the first user, wherein the licensed content comprises music, movies, electronic books, software and games;
storing first user settings corresponding to the first user for a plurality of software applications, wherein the first user settings comprise first user preferences for each of the plurality of software applications;
storing second user settings corresponding to the first user for a plurality of physical devices, wherein the second user settings comprise second user preferences for each of the plurality of physical devices, wherein the plurality of physical devices comprises a mobile phone and a computer system;
authenticating the first user on a first device used by the first user by scanning a fingerprint of the first user and comparing the scanned fingerprint against a previously-stored reference fingerprint for the first user;
making the user data, the licensed content, the first user settings, and the second user settings available to the first user on the first device; and
converting user settings for a first physical device to user settings for a second physical device.
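Claims 14 through 27 recite a flow of storing a user's data, settings, and licensed content, authenticating the user on a device (for example by fingerprint comparison), and then making the stored items available there. The following Python sketch reduces that flow to a few lines; the account fields are hypothetical, and the fingerprint check is simplified to a byte comparison standing in for a real biometric matcher.

```python
# Minimal sketch of the authenticate-then-make-available flow in claims 14-27.
# Account fields are hypothetical; the fingerprint match is a plain byte
# comparison, not an actual biometric algorithm.

from dataclasses import dataclass, field

@dataclass
class Account:
    reference_fingerprint: bytes
    user_data: dict = field(default_factory=dict)        # files, contacts, ...
    app_settings: dict = field(default_factory=dict)     # per software application
    device_settings: dict = field(default_factory=dict)  # per physical device
    licensed_content: list = field(default_factory=list) # music, movies, ...

def make_available(account: Account, scanned_fingerprint: bytes, device: str) -> dict:
    # Authenticate before exposing anything (claims 15-17).
    if scanned_fingerprint != account.reference_fingerprint:
        raise PermissionError("fingerprint does not match stored reference")
    return {
        "user_data": account.user_data,
        "app_settings": account.app_settings,
        "device_settings": account.device_settings.get(device, {}),
        "licensed_content": account.licensed_content,
    }

acct = Account(reference_fingerprint=b"\x01\x02",
               user_data={"contacts": ["Alice"]},
               device_settings={"phoneX": {"wallpaper": "beach.jpg"}})
print(make_available(acct, b"\x01\x02", "phoneX"))
```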
PCT/US2014/053596 2013-08-30 2014-08-29 Making a user's data, settings, and licensed content available in the cloud Ceased WO2015031861A1 (en)

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
US14/016,038 US9118670B2 (en) 2013-08-30 2013-08-30 Making a user's data, settings, and licensed content available in the cloud
US14/044,843 US20150066941A1 (en) 2013-08-30 2013-10-02 Photo cataloging, storage and retrieval using relationships between people
US14/132,104 US9456164B2 (en) 2013-08-30 2013-12-18 Universal remote control that is automatically programmable according to location
US14/153,630 US20150066246A1 (en) 2013-08-30 2014-01-13 Making a user's information available in a vehicle
US14/207,490 US20150067099A1 (en) 2013-08-30 2014-03-12 Transferring user settings from one device to another
US14/462,523 US20150066853A1 (en) 2013-08-30 2014-08-18 Templates and mappings for user settings

Publications (1)

Publication Number Publication Date
WO2015031861A1

Family

ID=52587400

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/053596 Ceased WO2015031861A1 (en) 2013-08-30 2014-08-29 Making a user's data, settings, and licensed content available in the cloud

Country Status (1)

Country Link
WO (1) WO2015031861A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060230124A1 (en) * 2000-06-22 2006-10-12 Microsoft Corporation Distributed computing services platform
US20130169546A1 (en) * 2011-12-29 2013-07-04 United Video Properties, Inc. Systems and methods for transferring settings across devices based on user gestures
US20130174237A1 (en) * 2011-12-29 2013-07-04 Ebay Inc. System and method for transferring states between electronic devices

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10372462B2 (en) 2015-11-24 2019-08-06 Nokia Technologies Oy Method and apparatus for device setup
CN111222101A (en) * 2018-11-27 2020-06-02 北京数安鑫云信息技术有限公司 Method and device for preventing illegal copying of software, and method and device for collecting software behavior information
US20230022816A1 (en) * 2021-07-23 2023-01-26 Fresenius Medical Care Holdings Inc. New language transfer
US12159130B2 (en) * 2021-07-23 2024-12-03 Fresenius Medical Care Deutschland Gmbh Acquiring user-specific customization data for a medical device

Similar Documents

Publication Publication Date Title
US9781204B2 (en) Making a user&#39;s data, settings, and licensed content available in the cloud
US20150066941A1 (en) Photo cataloging, storage and retrieval using relationships between people
US11463576B1 (en) Screen interface for a mobile device apparatus
US11431834B1 (en) Screen interface for a mobile device apparatus
US9497173B2 (en) System for the unified organization, secure storage and secure retrieval of digital and paper documents
US20150066246A1 (en) Making a user&#39;s information available in a vehicle
US20130166325A1 (en) Apparatuses, systems and methods for insurance quoting
EP2617190B1 (en) Content capture device and methods for automatically tagging content
US8655881B2 (en) Method and apparatus for automatically tagging content
WO2020106882A9 (en) Digital asset management
US20170308616A1 (en) Digital asset management for enterprises
US20120072463A1 (en) Method and apparatus for managing content tagging and tagged content
US20150067099A1 (en) Transferring user settings from one device to another
US20120067954A1 (en) Sensors, scanners, and methods for automatically tagging content
US20180260582A1 (en) Systems and methods for secure user profiles
US20150081777A1 (en) Dynamic content aggregation
KR20200018792A (en) Universal data scaffold based data management platform
CN115877723A (en) Home appliance control method, related device and system
US20180276611A1 (en) Secure Network Based Order Confirmation, Transportation, and Delivery Processes Utilizing Logistics Automation
US20250061233A1 (en) Systems and methods for tokenization of personally identifiable information (pii) and personal health information (phi)
US9456164B2 (en) Universal remote control that is automatically programmable according to location
WO2015031861A1 (en) Making a user's data, settings, and licensed content available in the cloud
US20150066853A1 (en) Templates and mappings for user settings
US20200143445A1 (en) Methods and apparatus for merchandise generation including an image
US20120096369A1 (en) Automatically displaying photos uploaded remotely to a digital picture frame

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 14839570
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 14839570
    Country of ref document: EP
    Kind code of ref document: A1