WO2019216671A1 - User and/or application profiles - Google Patents
- Publication number
- WO2019216671A1 (PCT/KR2019/005574)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- application
- data
- usage
- applications
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
- G06F21/6254—Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/316—User authentication by observing the pattern of computer usage, e.g. typical user behaviour
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0382—Plural input, i.e. interface arrangements in which a plurality of input devices of the same type are in communication with a PC
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/70—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Definitions
- the present specification relates to user and/or application profiles and relates, for example, to the use of usage signatures for generating user and/or application profiles.
- Application profiles can be generated based on user experiences of using such applications.
- this specification describes a method comprising: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game or an application); converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
- each usage signature is anonymized (in order, for example, to provide for user privacy).
- Each application profile may be based on usage signatures of a plurality of users of the respective application (e.g. an average, or some other combination, of the relevant usage signatures).
- one method may characterise an application by how multiple users interact with an application.
- Each user profile may be based on usage signatures of a single user for a plurality of applications (e.g. an average, or some other combination, of the relevant usage signatures).
- one method may characterise a user by how they interact with multiple applications.
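The two aggregations described above can be sketched as a simple vector average (a minimal illustration; the function name and fixed signature length are assumptions, and the specification notes that combinations other than an average are possible):

```python
import numpy as np

def aggregate_profile(usage_signatures):
    """Combine usage-signature vectors into a single profile vector.

    For an application profile the signatures come from many users of
    one application; for a user profile, from one user across many
    applications. A plain average is used here, one of the combination
    methods the specification mentions.
    """
    return np.mean(np.stack(usage_signatures), axis=0)

# Three users' signatures for the same application -> application profile.
signatures = [np.array([0.2, 0.8, 0.1]),
              np.array([0.4, 0.6, 0.3]),
              np.array([0.3, 0.7, 0.2])]
profile = aggregate_profile(signatures)
```

The same function applied to one user's signatures across several applications would yield a user profile instead.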
- the user inputs can take many forms.
- the user inputs may include one or more of: swipes of one or more of the input devices; touches of one or more of the input devices; pressure applied to one or more of the input devices; pressure size applied to one or more of the input devices; buttons (or other input devices) pressed; pressure applied to buttons (or other input devices); joystick positions; gyroscope data; accelerometer data; user reaction time; user hand position; user hand speed; user playing style; biometric data and other external data.
- one or more of the user inputs may be labelled with external information.
- Each user input may be converted into a usage signature using sparse coding and/or principal component analysis.
- each user input may be converted into a usage signature using a function that generates a unique fingerprint (e.g. without storing the user input data).
- the usage signature may be generated by any suitable one-way process (i.e. a process from which the relevant user input cannot be recovered). The use of such a one-way process may have security or privacy advantages.
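A one-way fingerprint of the kind described above might be sketched with a salted cryptographic hash (an illustrative assumption; the specification does not name a particular function, and a plain hash preserves no similarity structure, so it suits identification-style uses rather than profile averaging):

```python
import hashlib
import json

def usage_fingerprint(user_input_event, salt="per-device-salt"):
    """One-way fingerprint of a user input event.

    The raw event is serialised, salted and hashed with SHA-256, so the
    original input cannot be recovered from the digest, which is the
    privacy property described for one-way processes.
    """
    payload = json.dumps(user_input_event, sort_keys=True)
    return hashlib.sha256((salt + payload).encode()).hexdigest()

event = {"type": "swipe", "direction": "left", "duration_ms": 120}
fp = usage_fingerprint(event)
```

The same event always yields the same fingerprint, while any change to the event (or the salt) yields an unrelated digest.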
- One or more of the user profile(s) may be used to identify a likely change in a user of a user device. In one embodiment, access to some or all functions of a user device is blocked on identifying the likely change in the user of a user device.
- One or more of the user profile(s) may be used to provide user performance feedback over time.
- One embodiment may include identifying changes in user profile data indicative of a potential health problem.
- the potential health problem(s) may include one or more short term health problems (such as tiredness).
- the potential health problem(s) may include one or more long term health problems (which may, for example, be indicative of a more serious health concern).
- One or more of the user profile(s) may be used to predict an emotional status of a user.
- the emotional status may be predicted based on one or more usage signatures and/or a change in one or more usage signatures.
- external information (such as biometric data) for a user may be collected, for example for use in emotional status prediction.
- One or more of the user profile(s) may be used for user behaviour prediction and/or user profile prediction.
- One or more of said application profiles may be used to generate user feedback (e.g. anonymised user feedback) for the respective application.
- Similar applications may be identified on the basis of applications having similar application profiles.
- similar users may be identified on the basis of users having a similar usage signature.
- similar user groups may be identified on the basis of users having similar usage signatures for a given application.
- Applications may be suggested to a user based on an identified, determined or predicted emotional status of the user. Such emotions may be identified, determined or predicted based on one or more usage signatures. Biometric data for a user may be used during the identification, determination or prediction of emotions.
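Identifying similar applications (or users) from their profiles could be sketched with cosine similarity between profile vectors (the metric and the profile values are illustrative assumptions, not from the specification):

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity in [-1, 1] between two profile vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical application profiles (aggregated usage signatures).
profiles = {
    "game_a": np.array([0.9, 0.1, 0.4]),
    "game_b": np.array([0.8, 0.2, 0.5]),
    "news_app": np.array([0.1, 0.9, 0.2]),
}

def most_similar(name, profiles):
    """Return the name of the other profile most similar to `name`."""
    others = {k: v for k, v in profiles.items() if k != name}
    return max(others, key=lambda k: cosine_similarity(profiles[name], others[k]))
```

The same comparison applied to user profiles (or to usage signatures for a given application) would identify similar users or user groups.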
- this specification describes an apparatus configured to perform any method as described with reference to the first aspect.
- this specification describes computer-readable instructions which, when executed by computing apparatus, cause the computing apparatus to perform any method as described with reference to the first aspect.
- this specification describes an apparatus comprising: an input module for receiving a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game or an application); a converter for converting each of a plurality of user inputs into a usage signature (which may, for example, be a vector); and an aggregator for aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
- each usage signature is anonymized (in order, for example, to provide for user privacy).
- the aggregator may be configured to generate each application profile based on usage signatures of a plurality of users of the respective application.
- the aggregator may generate the application profile(s) from an average of the relevant usage signatures (although other methods, such as the use of alternating least squares, are possible).
- the aggregator may be configured to generate each user profile based on usage signatures of a single user for a plurality of applications.
- the aggregator may generate the user profile(s) from an average of the relevant usage signatures (although other methods, such as the use of alternating least squares, are possible).
- the user inputs may include one or more of: swipes of one or more of the input devices; touches of one or more of the input devices; pressure applied to one or more of the input devices; pressure size applied to one or more of the input devices; buttons (or other input devices) pressed; pressure applied to buttons (or other input devices); joystick positions; gyroscope data; accelerometer data; user reaction time; user hand position; user hand speed; user playing style; biometric data and other external data.
- one or more of the user inputs may be labelled with external information.
- the converter may comprise a sparse coding module and/or a principal component analysis module.
- the converter may comprise a module for converting each user input into a usage signature using a function that generates a unique fingerprint (e.g. without storing the user input data).
- the usage signature may be generated by any suitable one-way process (i.e. a process from which the relevant user input cannot be recovered).
- the apparatus of the fourth aspect may further comprise an output module.
- the output module may be configured to identify one or more of: a likely change in a user of a user device; user performance feedback over time; changes in user profile data indicative of a potential health problem; an emotional status of a user; user feedback for the respective application; similar applications, on the basis of applications having similar application profiles; similar users, on the basis of users having a similar usage signature; similar user groups, on the basis of users having similar usage signatures for a given application; user behaviour prediction; user profile prediction; and applications based on an identified, determined or predicted emotional status of a user.
- this specification describes a computer-readable medium having computer-readable code stored thereon, the computer readable code, when executed by at least one processor, causes performance of: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature (wherein each usage signature may be anonymized); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
- this specification describes an apparatus comprising: means for obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); means for converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and means for aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
- this specification describes a non-transitory computer readable medium comprising program instructions stored thereon for performing at least the following: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
- this specification describes an apparatus comprising: at least one processor; and at least one memory including computer program code which, when executed by the at least one processor, causes the apparatus to: obtain a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); convert each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and aggregate some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
- FIG. 1 is a block diagram of an example system.
- FIG. 2 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 3 is a block diagram of a portion of a system in accordance with an example embodiment.
- FIG. 4 shows an example user device that may be used in an example embodiment.
- FIG. 5 shows an example data signal provided for use with an example embodiment.
- FIG. 6 shows an example data signal provided for use with an example embodiment.
- FIG. 7 is a block diagram of a system in accordance with an example embodiment.
- FIG. 8 is a block diagram of a system in accordance with an example embodiment.
- FIG. 9 is a block diagram of a system in accordance with an example embodiment.
- FIG. 10 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 11 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 12 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 13 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 14 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 15 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 16 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 17 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 18 is a flow chart showing an algorithm in accordance with an example embodiment.
- FIG. 19 is a block diagram of a system in accordance with an example embodiment.
- FIG. 1 is a block diagram of an example system, indicated generally by the reference numeral 1.
- the system 1 includes a processor 10.
- the processor 10 has an input for receiving user information and provides an output.
- the system 1 may be used to recommend applications to a user based on information about the user that is input to the processor 10.
- the user information may include details such as: the age of the user, the number of applications that the user has installed, user demographics, and details of the user device.
- the processor may suggest applications that might be of interest to the user.
- the system 1 is not able to make use of information regarding how the user interacts with one or more existing applications in the generation of the output (e.g. how the relevant applications are actually used).
- FIG. 2 is a flow chart showing an algorithm, indicated generally by the reference numeral 20, in accordance with an example embodiment.
- FIG. 3 is a block diagram of a portion of a system, indicated generally by the reference numeral 30, showing examples of user input data that might be collected in the step 22.
- the system 30 includes a collect user inputs module 32 configured to receive inputs including one or more of the following: swipes made on a user interface, pressure applied to a user interface (and/or to a button), degree of pressure applied (to the user interface and/or to the button), user reaction time, user playing style, user hand speed, user hand position, information relating to touches on the user interface, joystick positions, gyroscopic data and accelerometer data.
- the inputs shown in FIG. 3 are provided by way of example only. In any instance of the step 22 of the algorithm 20, some of those inputs may be omitted and/or other inputs may be provided.
- the inputs considered in the step 22 of the algorithm 20 may include inputs that are not related to the direct user interaction with a user device.
- biometric data (such as heart rate data) may be provided as an input. Such biometric data may be used, for example, in determining an emotion of the user (e.g. tired, excited, stimulated, bored etc.).
- Other external user input data sources could also be provided, as shown in FIG. 3.
- at least some user input data may be labelled with external information or data (such as biometric data or user emotion data).
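One way the collected inputs of FIG. 3, optionally labelled with external information, might be represented (all field names here are illustrative assumptions, not from the specification):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserInputSample:
    """A single collected user input of the kinds listed for step 22."""
    swipe_direction: Optional[str] = None      # e.g. "left" or "right"
    pressure: Optional[float] = None           # pressure applied to the UI
    reaction_time_ms: Optional[float] = None   # user reaction time
    gyro: Optional[tuple] = None               # gyroscope reading (x, y, z)
    external_label: Optional[dict] = None      # e.g. biometric/emotion label

sample = UserInputSample(swipe_direction="left", reaction_time_ms=250.0,
                         external_label={"heart_rate": 72})
```

Fields left as `None` simply record that a given input type was not observed for this sample.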
- usage signatures are generated based on the user information collected in the step 22.
- Such usage signatures may, for example, be an interaction profile indicating how a particular user interacts with a particular application.
- the usage signatures may, for example, be modified from the user data such that the data is anonymized.
- one or more profiles are generated based on the usage signatures.
- the generated user profile(s) may include a user profile (relevant to a particular user) and/or an application profile (relating to a particular application).
- the profiles may be generated by aggregating data from multiple usage signatures.
- a user profile may include user information that is not related to the user's interaction with one or more applications (such as biometric data, or some other data indicative of user emotion).
- FIG. 4 shows an example user device 40 that may be used in example embodiments.
- the user device 40 may be used as a user input device (for example for providing input to an application, such as a computer game).
- the user device 40 may be configured to provide at least some of the data provided to the collect user inputs module 32 described above.
- the user device 40 may, for example, be a mobile phone or similar device, and/or a games controller. Alternative implementations for the user device 40 are possible.
- A number of potential interactions between a user and the user device are shown in FIG. 4. These include a swipe left command 42a, a swipe right command 42b, a user device tilting 44 and user device shaking 46.
- FIG. 5 shows example data signals, indicated generally by the reference numeral 50, that might be obtained from the user device 40.
- the data signals 50 show a swipe signature plotted against time.
- the swipe signature includes left facing arrows indicative of a swipe left command 42a and right facing arrows indicative of a swipe right command 42b.
- the position of the arrows on the x-axis indicates the time (and duration) of the swipe.
- the position of the arrows on the y-axis can be used to indicate the vertical position on the user device screen of the swipe command.
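The swipe signature of FIG. 5 could be stored as a time-ordered list of events from which summary features are derived (the tuple layout and the `swipe_rate` summary are illustrative assumptions):

```python
# Each swipe event: (start_time_s, duration_s, direction, screen_y_fraction),
# matching the time, duration and vertical position plotted in FIG. 5.
swipe_signature = [
    (0.5, 0.12, "left", 0.30),
    (1.8, 0.10, "right", 0.62),
    (2.4, 0.15, "left", 0.45),
]

def swipe_rate(signature):
    """Swipes per second over the observed span; one simple summary."""
    span = signature[-1][0] - signature[0][0]
    return len(signature) / span if span > 0 else 0.0

rate = swipe_rate(swipe_signature)
```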
- FIG. 6 shows an example data signal, indicated generally by the reference numeral 60, that might be obtained from the user device 40.
- the data signal 60 shows a tilt signature plotted over time.
- the tilt signature is derived from the user device tilting 44 discussed above.
- the plot 60 plots the degree of tilt (on the y-axis) against time.
- the data signals 50 and 60 are two examples of data signal formats and are provided by way of example only. Many other data signal formats are possible and may be used, as will be readily apparent to those skilled in the art.
- FIG. 7 is a block diagram of a system, indicated generally by the reference numeral 70, in accordance with an example embodiment.
- the system 70 includes a user input module 72, a data compression module 74, a database 76 and a processor 78.
- the user inputs module 72 may take the form of one or more of the collect user inputs modules 32 described above. The user inputs module 72 may therefore be configured to implement step 22 of the algorithm 20 described above. The user inputs module 72 may include one or more user devices 40.
- the data compression module 74 may be configured to anonymize the user inputs obtained from the user inputs module 72.
- the data compression module 74 may therefore be configured to implement the step 24 of the algorithm 20 described above.
- the data compression module 74 may be configured to compress the user input data using sparse coding, principal component analysis (PCA) or a similar technique to generate a usage signature vector.
- PCA is a technique that can be used to reduce the dimensionality of data, for example by applying statistical procedures to the underlying data.
- the compression carried out by the data compression module 74 may be used to reduce the quantity of data and to increase privacy by preventing the original data from being reconstructed from the compressed data.
- Alternative approaches to compressing the data include a machine learning neural network encoder or manually designed statistics.
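A minimal PCA sketch of this compression step, using NumPy's SVD (the choice of two components, and of per-component spread as the signature, are assumptions rather than details from the specification):

```python
import numpy as np

def pca_signature(samples, n_components=2):
    """Reduce raw input samples to a short usage-signature vector.

    samples: (n_samples, n_features) array of raw user-input features.
    The data is centred, projected onto its top principal components,
    and summarised by the spread along each component, so the original
    samples are not retained in the signature.
    """
    X = samples - samples.mean(axis=0)                # centre the data
    _, _, vt = np.linalg.svd(X, full_matrices=False)  # principal axes
    projected = X @ vt[:n_components].T               # reduced representation
    return projected.std(axis=0)                      # per-component spread

rng = np.random.default_rng(0)
raw = rng.normal(size=(50, 8))  # e.g. 50 input events x 8 features
sig = pca_signature(raw)
```

Because only summary statistics of the projections are kept, the individual input events cannot be reconstructed from the signature.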
- the data generated by the data compression module 74 may, for example, be referred to as a fingerprint or a signature.
- the fingerprint or signature may be provided as a vector.
- the fingerprint or signature may include other user information (e.g. general user information such as age or user demographics, and/or measured user information such as heart rate and other biometric data).
- the fingerprint(s) or signature(s) may be generated by any suitable one-way process (i.e. a process from which the user data cannot be recovered).
- Such techniques include machine learning techniques (such as auto-encoders), data statistics (such as mean, variance or probability distributions) or manually designed methods.
- the data compression step 74 may be configured to compress the data such that the information content is available, but the original data (such as the time, duration and position of swipe data in FIG. 5 and the shape of each tilt signature in FIG. 6) cannot be reconstructed from the compressed data.
- the swipe signature may be compressed so that the time and distance between swipes can be recovered, but the location of each swipe of a user interface screen cannot be reconstructed.
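The selective compression described here, with timing recoverable but screen locations not, might look like the following sketch (the event tuple layout is an illustrative assumption):

```python
def compress_swipes(events):
    """Keep inter-swipe intervals and swipe durations; drop positions.

    events: list of (start_time_s, duration_s, direction, screen_y).
    The returned data preserves the time and spacing of swipes, but the
    screen locations cannot be reconstructed from it.
    """
    starts = [e[0] for e in events]
    return {
        "intervals": [b - a for a, b in zip(starts, starts[1:])],
        "durations": [e[1] for e in events],
    }

compressed = compress_swipes([
    (0.5, 0.12, "left", 0.30),
    (1.8, 0.10, "right", 0.62),
    (2.4, 0.15, "left", 0.45),
])
```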
- the compressed data may be stored at the database 76.
- the database 76 may be located, for example, at a server and may be configured to store data relating to many users, obtained from many user inputs. By storing usage signatures generated by one or more instances of the data compression module 74, the database 76 does not need to store information that can be used to reconstruct the original data provided by the user inputs 72. This may be advantageous for privacy or security reasons.
- the processor 78 has access to the database 76 and therefore has access to the compressed data.
- the processor 78 may be configured to extract information from the stored data, as discussed further below.
- the processor 78 may be configured to, for example, extract or generate one or more user profiles and/or one of more application profiles from the data stored in the database 76, thereby implementing the step 26 of the algorithm 20 described above.
- the processor 78 may be configured to aggregate information from multiple data sources in order to generate the user and/or application profiles. For example, as discussed below, data from a single user across multiple applications may be aggregated to generate a user profile and data from multiple users for a single application may be aggregated to generate an application profile. The aggregation may involve a simple average, but other methods, such as the use of alternating least squares, are possible.
- the system 70 described above assumes that the aggregation of data to generate user and application profiles is carried out by the processor 78. This is not essential. For example, data could be aggregated within the database 76.
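The two aggregations (per user across applications, and per application across users) can be sketched as group-by averages over stored signature rows (the table layout and identifiers are illustrative assumptions):

```python
from collections import defaultdict
import numpy as np

# Stored rows: (user_id, application_id, usage-signature vector).
rows = [
    ("u1", "app_a", np.array([0.2, 0.8])),
    ("u1", "app_b", np.array([0.4, 0.6])),
    ("u2", "app_a", np.array([0.6, 0.4])),
]

def group_mean(rows, key_index):
    """Average the signature vectors grouped by the chosen key column."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key_index]].append(row[2])
    return {k: np.mean(np.stack(v), axis=0) for k, v in groups.items()}

user_profiles = group_mean(rows, 0)  # one profile per user
app_profiles = group_mean(rows, 1)   # one profile per application
```

Whether this grouping runs in the processor 78 or inside the database 76 is an implementation choice, as the text notes.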
- FIG. 8 is a block diagram of a system, indicated generally by the reference numeral 80, in accordance with an example embodiment.
- the system 80 comprises a first user module 82, a second user module 83 and a third user module 84.
- Each of the user modules 82 to 84 may comprise a user input device and a data compression module (such as the user inputs module 72 and data compression module 74 described above).
- the system 80 further comprises a database 86 and a user profile generation module 87.
- the database 86 may be the database 76 of the system 70 described above.
- the user profile generation module 87 may be an example of the processor 78 described above.
- the first user module 82 is used by a first user to access a first application
- the second user module 83 is used by the first user to access a second application
- the third user module 84 is used by the first user to access a third application.
- the first, second and third modules 82 to 84 could be the same user module used (e.g. at different times) to access different applications.
- the database 86 may therefore be configured to store data (which may be compressed and/or anonymized) relating to how the first user interacts with multiple applications.
- the user profile generation module 87 may therefore be configured to use the data stored in the database 86 to create a user profile for the first user including information across multiple applications.
- FIG. 9 is a block diagram of a system, indicated generally by the reference numeral 90, in accordance with an example embodiment.
- the system 90 comprises a first user module 92, a second user module 93 and a third user module 94.
- Each of the user modules 92 to 94 may comprise a user input device and a data compression module (such as the user inputs module 72 and data compression module 74 described above).
- the system 90 further comprises a database 96 and an application profile generation module 97.
- the database 96 may be the database 76 of the system 70 described above.
- the application profile generation module 97 may be an example of the processor 78 described above.
- the first user module 92 is used by a first user to access a first application
- the second user module 93 is used by a second user to access the first application
- the third user module 94 is used by a third user to access the first application.
- the first, second and third modules 92 to 94 could be the same module used (at different times) by different users to access the first application.
- the database 96 may therefore store data (which may be compressed and/or anonymized) relating to how different users interact with the first application.
- the application profile generation module 97 may therefore use the data stored in the database 96 to create an application profile for the first application including information across multiple users.
- the system 90 may additionally comprise a fourth user module 98 and a fifth user module 99.
- Each of the user modules 98 and 99 may comprise a user input device and a data compression module (such as the user inputs module 72 and data compression module 74 described above).
- the fourth user module 98 may be used by the first user to access a second application and the fifth user module 99 may be used by the first user to access a third application.
- the database 96 may therefore store data relating to how the first user interacts with multiple applications.
- the user profile generation module 97 may also be able to create a user profile for the first user including information across multiple applications.
- the system 80 can be used to generate a user profile and the system 90 can be used to generate an application profile (and optionally a user profile).
- Such user profiles and applications profiles are examples of the profile(s) that may be generated in the step 26 of the algorithm 20 discussed above.
- FIG. 10 is a flow chart showing an algorithm, indicated generally by the reference numeral 100, in accordance with an example embodiment.
- the algorithm 100 starts at step 102, where user input data is collected, as described above.
- a user profile may be generated based on a user that is currently using a particular application or device.
- at step 104, the user data (e.g. a user profile) generated for the current user of the application or device is compared with the user profile of the normal user of that application or device.
- a difference between the normal user and the current user may be indicative of an unauthorised user. If no difference is detected, the algorithm terminates at step 108. If a difference is detected, the algorithm moves to step 106 where access to the device or application may be restricted or prohibited, before the algorithm terminates at step 108.
- the step 106 may allow access to a device, but may prevent certain function(s), such as payment functions. This would enable, for example, a child to use a mobile phone owned by a parent to access games, but the detection of a different user (based on a generated user profile based on game play style) may be used to prevent the child from authorising payments from the mobile phone.
- the user identification process could be used as an alternative to other device locking methods (e.g. as an alternative to providing a password).
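The comparison step of the algorithm 100 can be illustrated with a minimal sketch. The function names, the example signature values and the fixed threshold below are illustrative assumptions and not part of the specification:

```python
import math

def signature_distance(sig_a, sig_b):
    """Euclidean distance between two usage-signature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

def check_user(current_signature, stored_profile, threshold=1.0):
    """Return True if the current signature is close enough to the stored
    profile of the normal user; False suggests a different user."""
    return signature_distance(current_signature, stored_profile) <= threshold

# Hypothetical stored profile of the device's normal user.
normal_profile = [0.2, 0.8, 0.5]

# A nearby signature passes; a distant one would trigger step 106.
same_user = check_user([0.25, 0.75, 0.5], normal_profile)       # True
different_user = check_user([0.9, 0.1, 0.1], normal_profile)    # False
```

In a real implementation the threshold would be tuned to the variability of the user's own signatures, so that normal session-to-session variation is not flagged as a change of user.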
- FIG. 11 is a flow chart showing an algorithm, indicated generally by the reference numeral 110, in accordance with an example embodiment.
- the algorithm 110 starts at step 112, where user input data is collected, as described above.
- at step 114, a user performance indication is generated.
- the user performance indication may be generated from the user data stored within a database (e.g. database 76, 86 or 96).
- the user performance indication could be based on reaction time and/or user accuracy, which indications may be derivable from the stored user data.
- the step 114 could be based on data for a single application (such as a game), but could also be provided for a single user across multiple applications. A user performance indication for multiple users (of a single application or across multiple applications) could also be provided.
- the step 114 could additionally output information regarding how the user's performance has changed over time and could compare the user's performance with other users, based on the user profile data stored in the database.
- the algorithm 110 terminates at step 116.
- the algorithm 110 can be adapted to provide different outputs (in addition to, or instead of, user performance indication).
- the step 114 could provide a prediction of an emotional status of a user (for example whether the user is one or more of: content, excited, calm, tired and bored).
- Other uses of the principles of the algorithm 110 will be apparent to those skilled in the art.
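Under the assumption that per-session reaction times and accuracies are derivable from the stored user data, the performance indication of step 114 might be summarised as follows (all names and values are illustrative):

```python
def performance_indication(reaction_times_ms, accuracies):
    """Summarise stored per-session user data into a simple performance
    indication.  The input format is an illustrative assumption."""
    n = len(reaction_times_ms)
    mean_rt = sum(reaction_times_ms) / n
    mean_acc = sum(accuracies) / n
    # Crude trend: compare the most recent session with the first one.
    improving = reaction_times_ms[-1] < reaction_times_ms[0]
    return {"mean_reaction_ms": mean_rt,
            "mean_accuracy": mean_acc,
            "improving": improving}

# Four hypothetical sessions: reaction times fall, accuracy rises.
report = performance_indication([420, 400, 380, 350],
                                [0.80, 0.82, 0.85, 0.90])
```

The same summary, computed per user, would also support the comparison with other users mentioned for the step 114.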
- FIG. 12 is a flow chart showing an algorithm, indicated generally by the reference numeral 120, in accordance with an example embodiment.
- the algorithm 120 starts at step 122, where user input data is collected, as described above.
- the user profile data is interrogated to determine whether the data indicates any potential short term health issues. If so, the algorithm 120 moves to step 126; if not, the algorithm terminates at step 128.
- Short-term health issues may be identified in step 124 by identifying a change in user performance, as indicated by the user profile data (e.g. comparing current performance with historical performance).
- the step 124 could, for example, be used to identify fatigue and may, for example, indicate that the user should take a break.
- an alert is raised to the user at step 126 (e.g. recommending or requiring a break) and the algorithm 120 then terminates at step 128.
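The short-term check of step 124 (comparing current performance with historical performance) might be sketched as below; the metric, the event format and the tolerance are illustrative assumptions:

```python
def detect_fatigue(historical_rt_ms, recent_rt_ms, tolerance=0.15):
    """Flag a potential short-term issue (e.g. fatigue) when recent
    reaction times are markedly slower than the historical baseline."""
    baseline = sum(historical_rt_ms) / len(historical_rt_ms)
    recent = sum(recent_rt_ms) / len(recent_rt_ms)
    return recent > baseline * (1 + tolerance)

# Baseline around 400 ms; a recent average near 500 ms raises an alert.
alert = detect_fatigue([390, 400, 410], [490, 500, 510])        # True
no_alert = detect_fatigue([390, 400, 410], [395, 405, 400])     # False
```

The long-term check of step 134 is structurally the same comparison, made over a much longer historical window.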
- FIG. 13 is a flow chart showing an algorithm, indicated generally by the reference numeral 130, in accordance with an example embodiment.
- the algorithm 130 starts at step 132, where user input data is collected, as described above.
- the user profile data is interrogated to determine whether the data indicates any potential long term health issues. If so, the algorithm 130 moves to step 136; if not, the algorithm terminates at step 138.
- Long-term health issues may be identified in step 134 by identifying a change in user performance, as indicated by the user profile data (e.g. comparing current performance with historical performance).
- the step 134 could, for example, be used to identify changes in levels of attention or reaction time.
- an alert is raised to the user at step 136 and the algorithm 130 then terminates at step 138.
- FIG. 14 is a flow chart showing an algorithm, indicated generally by the reference numeral 140, in accordance with an example embodiment.
- the algorithm 140 starts at step 142, where user input data is collected, as described above.
- the user data collected in step 142 may be based on many users of a particular application (such as a game).
- at step 144, user feedback is generated, for example in the form of an application profile.
- the user feedback could be provided to the developer of the application.
- the algorithm 140 then terminates at step 146.
- the user feedback provided in step 144 may enable a game or application developer to obtain information regarding how users are interacting with their game or application. Such data can be provided anonymously and can provide data regarding real users, rather than a test community.
- FIG. 15 is a flow chart showing an algorithm, indicated generally by the reference numeral 150, in accordance with an example embodiment.
- the algorithm 150 starts at step 152, where user input data is collected, as described above.
- the user data collected in step 152 may be based on many users of a particular application (such as a game).
- at step 154, similar applications may be identified. Applications may be deemed to be similar in the event that the application profiles generated for those applications share predefined metrics. The identification of similar applications may be of interest, for example, to a user who enjoys a particular style of game and wishes to identify games with similar attributes.
- the algorithm 150 then terminates at step 156.
- the algorithm 150 may include identifying similar users on the basis of users having similar user fingerprints and/or identifying similar user groups on the basis of users having similar user fingerprints for a given application.
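One possible "predefined metric" for the similarity check of the algorithm 150 is the cosine similarity between application profiles. The sketch below makes that assumption; the profile values and application names are illustrative:

```python
import math

def cosine_similarity(profile_a, profile_b):
    """Cosine of the angle between two application-profile vectors."""
    dot = sum(a * b for a, b in zip(profile_a, profile_b))
    norm_a = math.sqrt(sum(a * a for a in profile_a))
    norm_b = math.sqrt(sum(b * b for b in profile_b))
    return dot / (norm_a * norm_b)

def similar_applications(target, candidates, threshold=0.95):
    """Return names of applications whose profiles are close to the target."""
    return [name for name, profile in candidates.items()
            if cosine_similarity(target, profile) >= threshold]

puzzle_game = [0.9, 0.1, 0.3]                 # hypothetical profile
catalogue = {"game_a": [0.88, 0.12, 0.28],    # similar play style
             "game_b": [0.10, 0.90, 0.70]}    # very different style

matches = similar_applications(puzzle_game, catalogue)  # ["game_a"]
```

The same distance can be applied to user fingerprints to identify similar users or user groups.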
- FIG. 16 is a flow chart showing an algorithm, indicated generally by the reference numeral 160, in accordance with an example embodiment.
- the algorithm 160 starts at step 162, where user input data is collected, as described above.
- the user data collected in step 162 may be based on many users of a particular application (such as a game) and/or users of a plurality of applications.
- at step 164, clustering by similarity is carried out.
- the step 164 may, for example, cluster users into groups of users having similar user characteristics (as extracted from the user inputs collected in step 162).
- the step 164 may cluster applications into groups of applications having similar user characteristics (as extracted from the user inputs collected in step 162).
- at step 166, the cluster information generated in step 164 is used.
- the cluster information could be used for one or more of: targeted advertising, improved recommendations and tailoring mobile experiences to specific user groups.
- generalised statistics from user or application groups can be obtained. Insights obtained from such statistics may be usable to further improve user experiences.
- Experimental results, novel insights and correlations of performance, preference and/or engagement are all possible applications of the step 166. Other uses will be apparent to the skilled person.
- the algorithm 160 then terminates at step 168.
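The clustering of step 164 could use k-means or any standard method; the stdlib-only sketch below uses a simpler greedy stand-in, grouping each signature with the first cluster whose representative lies within a distance threshold (all values are illustrative):

```python
import math

def distance(a, b):
    """Euclidean distance between two signature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cluster_by_similarity(signatures, threshold=0.5):
    """Greedy clustering: each signature joins the first cluster whose
    representative (its first member) is within `threshold`, otherwise
    it starts a new cluster.  A simple stand-in for k-means."""
    clusters = []
    for sig in signatures:
        for cluster in clusters:
            if distance(sig, cluster[0]) <= threshold:
                cluster.append(sig)
                break
        else:
            clusters.append([sig])
    return clusters

# Four user signatures forming two clearly separated groups.
users = [[0.10, 0.20], [0.15, 0.25], [0.90, 0.80], [0.95, 0.85]]
groups = cluster_by_similarity(users)   # two clusters of two users each
```

The resulting groups are what step 166 would consume, for example for group-targeted recommendations.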
- FIG. 17 is a flow chart showing an algorithm, indicated generally by the reference numeral 170, in accordance with an example embodiment.
- the algorithm 170 starts at step 172, where user input data is collected, as described above.
- the user data collected in step 172 may be based on many users of a particular application (such as a game) and/or users of a plurality of applications.
- at step 174, one or more predictions are made based on the user input(s) collected in step 172.
- the algorithm 170 then terminates at step 176.
- the step 174 may be used to predict or infer data. For example, a prediction of movies that a particular user might like may be made on the basis of knowledge of the preferences of near neighbours to the user (with the near neighbours being identified, for example, by identifying other users with similar user characteristics).
- the algorithms 160 and 170 may be combined.
- the step 174 may make use of the clustering techniques of step 164 in the identification of near neighbours for providing predictions in the step 174.
- FIG. 18 is a flow chart showing an algorithm, indicated generally by the reference numeral 180, in accordance with an example embodiment.
- the algorithm 180 starts at step 182, where user input data is collected, as described above.
- the user data collected in step 182 may be based on usage signatures of a single user for one or more applications.
- at step 184, one or more applications are suggested based on a determined emotion of the user (based on the user input(s) collected in step 182).
- the algorithm 180 then terminates at step 186.
- the step 184 may make use of the way in which a user is interacting with a game to identify, determine or predict an emotional status of the user (e.g. excited, bored, short-tempered etc).
- the step 184 may make use of biometric data (e.g. heart rate data) in the identification/determination/prediction of emotional status.
- games or other applications may be suggested accordingly.
- Such games may, for example, be suggested in order to reduce an emotional response (e.g. calming games if a user is over-excited) or to make use of a detected emotion.
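Once an emotional status has been determined, the suggestion part of step 184 could be as simple as a lookup from emotion to application category. The mapping below is purely illustrative:

```python
def suggest_applications(emotion):
    """Map a determined emotional status to suggested application
    categories (a sketch of step 184; the mapping is illustrative)."""
    suggestions = {
        "over-excited": ["calming_puzzle"],     # reduce the response
        "bored": ["fast_action_game"],          # make use of the emotion
        "tired": ["relaxing_music_app"],
    }
    return suggestions.get(emotion, [])

calming = suggest_applications("over-excited")   # ["calming_puzzle"]
```

A fuller implementation would rank candidate applications by their application profiles rather than by fixed categories.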
- FIG. 19 is a schematic diagram of a system, indicated generally by the reference numeral 200, that may be used in example implementations of the principles described herein.
- the system 200 comprises a processor 202, a memory 204 (for example, including RAM and/or ROM), input means 206 and output means 208.
- the processor 202 may be in communication with each of the other components in the system 200 in order to control operation thereof.
- the processor 202 may take any suitable form, such as a microcontroller, plural microcontrollers, a processor, or plural processors.
- the memory 204 may include a non-volatile memory, a hard disk drive (HDD) or a solid state drive (SSD) and may, for example, store an operating system and/or one or more software applications.
- the operating system may contain code which, when executed by the processor, implements aspects of the algorithms described herein.
- the input means 206 and the output means 208 may take many different forms and may be provided, for example, to allow a user (such as an application or games developer) to interact with the system 200.
Abstract
An apparatus and method is described comprising: obtaining a plurality of user inputs from one or more user devices, each user device being used by a user for accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature (wherein each usage signature may be anonymized); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
Description
The present specification relates to user and/or application profiles and relates, for example, to the use of usage signatures for generating user and/or application profiles.
Application profiles can be generated based on user experiences of using such applications.
Such data tends to be unreliable, since only a small number of users are typically willing to provide it. There is a need for improved methods for generating application profiles and user profiles.
In a first aspect, this specification describes a method comprising: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game or an application); converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server). In some embodiments, each usage signature is anonymized (in order, for example, to provide for user privacy).
Each application profile may be based on usage signatures of a plurality of users of the respective application (e.g. an average, or some other combination, of the relevant usage signatures). Thus, one method may characterise an application by how multiple users interact with an application.
Each user profile may be based on usage signatures of a single user for a plurality of applications (e.g. an average, or some other combination, of the relevant usage signatures). Thus, one method may characterise a user by how they interact with multiple applications.
The user inputs can take many forms. By way of example, the user inputs may include one or more of: swipes of one or more of the input devices; touches of one or more of the input devices; pressure applied to one or more of the input devices; pressure size applied to one or more of the input devices; buttons (or other input devices) pressed; pressure applied to buttons (or other input devices); joystick positions; gyroscope data; accelerometer data; user reaction time; user hand position; user hand speed; user playing style; biometric data and other external data. For example, one or more of the user inputs may be labelled with external information.
Each user input may be converted into a usage signature using sparse coding and/or principal component analysis. Alternatively, or in addition, each user input may be converted into a usage signature using a function that generates a unique fingerprint (e.g. without storing the user input data). Thus, in some embodiments, the usage signature may be generated by any suitable one-way process (that is unable to recover the relevant user input). The use of such a one-way process may have security or privacy advantages.
One or more of the user profile(s) may be used to identify a likely change in a user of a user device. In one embodiment, access to some or all functions of a user device is blocked on identifying the likely change in the user of a user device.
One or more of the user profile(s) may be used to provide user performance feedback over time.
One embodiment may include identifying changes in user profile data indicative of a potential health problem. The potential health problem(s) may include one or more short term health problems (such as tiredness). Alternatively, or in addition, the potential health problem(s) may include one or more long term health problems (which may, for example, be indicative of a more serious health concern).
One or more of the user profile(s) may be used to predict an emotional status of a user. The emotional status may be predicted based on one or more usage signatures and/or a change in one or more usage signatures. Alternatively, or in addition, external information (such as biometric data) regarding a user may be collected, for example for use in emotional status prediction.
One or more of the user profile(s) may be used for user behaviour prediction and/or user profile prediction.
One or more of said application profiles may be used to generate user feedback for the respective application. In some embodiments, user feedback (e.g. anonymised user feedback) may be provided, for example, to a game developer and/or an application developer.
Similar applications may be identified on the basis of applications having similar application profiles. Alternatively, or in addition, similar users may be identified on the basis of users having a similar usage signature. Alternatively, or in addition, similar user groups may be identified on the basis of users having similar usage signatures for a given application.
Applications may be suggested to a user based on an identified, determined or predicted emotional status of the user. Such emotions may be identified, determined or predicted based on one or more usage signatures. Biometric data for a user may be used during the identification, determination or prediction of emotions.
In a second aspect, this specification describes an apparatus configured to perform any method as described with reference to the first aspect.
In a third aspect, this specification describes computer-readable instructions which, when executed by computing apparatus, cause the computing apparatus to perform any method as described with reference to the first aspect.
In a fourth aspect, this specification describes an apparatus comprising: an input module for receiving a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game or an application); a converter for converting each of a plurality of user inputs into a usage signature (which may, for example, be a vector); and an aggregator for aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server). In some embodiments, each usage signature is anonymized (in order, for example, to provide for user privacy).
The aggregator may be configured to generate each application profile based on usage signatures of a plurality of users of the respective application. The aggregator may generate the application profile(s) from an average of the relevant usage signatures (although other methods, such as the use of alternating least squares, are possible).
The aggregator may be configured to generate each user profile based on usage signatures of a single user for a plurality of applications. The aggregator may generate the user profile(s) from an average of the relevant usage signatures (although other methods, such as the use of alternating least squares, are possible).
The user inputs may include one or more of: swipes of one or more of the input devices; touches of one or more of the input devices; pressure applied to one or more of the input devices; pressure size applied to one or more of the input devices; buttons (or other input devices) pressed; pressure applied to buttons (or other input devices); joystick positions; gyroscope data; accelerometer data; user reaction time; user hand position; user hand speed; user playing style; biometric data and other external data. For example, one or more of the user inputs may be labelled with external information.
The converter may comprise a sparse coding module and/or a principal component analysis module. The converter may comprise a module for converting each user input into a usage signature using a function that generates a unique fingerprint (e.g. without storing the user input data). Thus, in some embodiments, the usage signature may be generated by any suitable one-way process (that is unable to recover the relevant user input).
The apparatus of the fourth aspect may further comprise an output module. The output module may be configured to identify one or more of: a likely change in a user of a user device; user performance feedback over time; changes in user profile data indicative of a potential health problem; an emotional status of a user; user feedback for the respective application; similar applications, on the basis of applications having similar application profiles; similar users, on the basis of users having a similar usage signature; similar user groups, on the basis of users having similar usage signatures for a given application; user behaviour prediction; user profile prediction; and applications based on an identified, determined or predicted emotional status of a user.
In a fifth aspect, this specification describes a computer-readable medium having computer-readable code stored thereon, the computer readable code, when executed by at least one processor, causes performance of: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature (wherein each usage signature may be anonymized); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
In a sixth aspect, this specification describes an apparatus comprising: means for obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); means for converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and means for aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
In a seventh aspect, this specification describes a non-transitory computer readable medium comprising program instructions stored thereon for performing at least the following: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
In an eighth aspect, this specification describes an apparatus comprising: at least one processor; and at least one memory including computer program code which, when executed by the at least one processor, causes the apparatus to: obtain a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); convert each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and aggregate some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
Example embodiments will now be described, by way of non-limiting examples, with reference to the following schematic drawings, in which:
FIG. 1 is a block diagram of an example system;
FIG. 2 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 3 is a block diagram of a portion of a system in accordance with an example embodiment;
FIG. 4 shows an example user device that may be used in an example embodiment;
FIG. 5 shows an example data signal provided for use with an example embodiment;
FIG. 6 shows an example data signal provided for use with an example embodiment;
FIG. 7 is a block diagram of a system in accordance with an example embodiment;
FIG. 8 is a block diagram of a system in accordance with an example embodiment;
FIG. 9 is a block diagram of a system in accordance with an example embodiment;
FIG. 10 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 11 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 12 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 13 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 14 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 15 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 16 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 17 is a flow chart showing an algorithm in accordance with an example embodiment;
FIG. 18 is a flow chart showing an algorithm in accordance with an example embodiment; and
FIG. 19 is a block diagram of a system in accordance with an example embodiment.
FIG. 1 is a block diagram of an example system, indicated generally by the reference numeral 1. The system 1 includes a processor 10. The processor has an input receiving user information and provides an output.
By way of example, the system 1 may be used to recommend applications to a user based on information about the user that is input to the processor 10. The user information may include details such as: the age of the user, the number of applications that the user has installed, user demographics, and details of the user device. Based on the user information provided, the processor may suggest applications that might be of interest to the user. The system 1 is not able to make use of information regarding how the user interacts with one or more existing applications in the generation of the output (e.g. how the relevant applications are actually used).
FIG. 2 is a flow chart showing an algorithm, indicated generally by the reference numeral 20, in accordance with an example embodiment.
The algorithm 20 starts at step 22, where user input from one or more users is collected. The user inputs may take many forms. FIG. 3 is a block diagram of a portion of a system, indicated generally by the reference numeral 30, showing examples of user input data that might be collected in the step 22. The system 30 includes a collect user inputs module 32 configured to receive inputs including one or more of the following: swipes made on a user interface, pressure applied to a user interface (and/or to a button), degree of pressure applied (to the user interface and/or to the button), user reaction time, user playing style, user hand speed, user hand position, information relating to touches on the user interface, joystick positions, gyroscopic data and accelerometer data.
The inputs shown in FIG. 3 are provided by way of example only. In any instance of the step 22 of the algorithm 20, some of those inputs may be omitted and/or other inputs may be provided. For example, the inputs considered in the step 22 of the algorithm 20 may include inputs that are not related to the direct user interaction with a user device. For example, biometric data (such as heart rate data) for a user may be provided as an input. Such biometric data may be used, for example, in determining an emotion of the user (e.g. tired, excited, stimulated, bored etc.). Other external user input data sources (not related to user inputs or biometric data) could also be provided, as shown in FIG. 3. For example, at least some user input data may be labelled with external information or data (such as biometric data or user emotion data).
At step 24 of the algorithm 20, usage signatures are generated based on the user information collected in the step 22. Such usage signatures may, for example, be an interaction profile indicating how a particular user interacts with a particular application. As discussed further below, the usage signatures may, for example, be modified from the user data such that the data is anonymized.
At step 26, one or more profiles are generated based on the usage signatures. As discussed further below, the generated user profile(s) may include a user profile (relevant to a particular user) and/or an application profile (relating to a particular application). The profiles may be generated by aggregating data from multiple usage signatures. A user profile may include user information that is not related to the user's interaction with one or more applications (such as biometric data, or some other data indicative of user emotion).
FIG. 4 shows an example user device 40 that may be used in example embodiments. The user device 40 may be used as a user input device (for example for providing input to an application, such as a computer game). The user device 40 may be configured to provide at least some of the data provided to the collect user input module 32 described above. The user device 40 may, for example, be a mobile phone or similar device, and/or a games controller. Alternative implementations for the user device 40 are possible.
A number of potential interactions between a user and the user device are shown in FIG. 4. These include a swipe left command 42a, a swipe right command 42b, a user device tilting 44 and user device shaking 46.
FIG. 5 shows example data signals, indicated generally by the reference numeral 50, that might be obtained from the user device 40. The data signals 50 show a swipe signature plotted against time. The swipe signature includes left facing arrows indicative of a swipe left command 42a and right facing arrows indicative of a swipe right command 42b. The position of the arrows on the x-axis indicates the time (and duration) of the swipe. The position of the arrows on the y-axis can be used to indicate the vertical position on the user device screen of the swipe command.
FIG. 6 shows an example data signal, indicated generally by the reference numeral 60, that might be obtained from the user device 40. The data signal 60 shows a tilt signature plotted over time. The tilt signature is derived from the user device tilting 44 discussed above. The plot 60 plots the degree of tilt (on the y-axis) against time.
The data signals 50 and 60 are two examples of data signal formats and are provided by way of example only. Many other data signals formats are possible and may be used, as will be readily apparent to those skilled in the art.
FIG. 7 is a block diagram of a system, indicated generally by the reference numeral 70, in accordance with an example embodiment. The system 70 includes a user input module 72, a data compression module 74, a database 76 and a processor 78.
The user inputs module 72 may take the form of one or more collect user inputs modules 32 described above. The user inputs module 72 may therefore be configured to implement the step 22 of the algorithm 20 described above. The user inputs module 72 may include one or more user devices 40.
The data compression module 74 may be configured to anonymize the user inputs obtained from the user inputs module 72. The data compression module 74 may therefore be configured to implement the step 24 of the algorithm 20 described above. The data compression module 74 may be configured to compress the user input data using sparse coding or some form of principal component analysis (PCA) to generate a usage signature vector. PCA is a technique that can be used to reduce the dimensionality of data, for example by applying statistical procedures to the underlying data. The compression carried out by the data compression module 74 may be used to reduce the quantity of data and to increase privacy by preventing the original data from being reconstructed from the compressed data. Alternative approaches to compressing the data include a machine learning neural network encoder or manually created statistics. The data generated by the data compression module 74 may, for example, be referred to as a fingerprint or a signature. The fingerprint or signature may be provided as a vector. The fingerprint or signature may include other user information (general user information, such as age and user demographics, and/or measured user information, such as heart rate and other biometric data). The fingerprint(s) or signature(s) may be generated by any suitable one-way process (that is unable to recover the user data). Such techniques include machine learning techniques, such as auto-encoders, data statistics (such as mean, variance or probability distributions) or manually designed methods.
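A PCA-based compression of the kind the data compression module 74 might perform is sketched below using NumPy. The event format, the number of retained components and the choice of per-axis spread as the signature are all illustrative assumptions:

```python
import numpy as np

def compress_to_signature(user_inputs, n_components=2):
    """Compress raw per-event input data into a short usage-signature
    vector via PCA: project the centred data onto its principal axes
    and keep only the spread along each axis.  The individual events
    cannot be reconstructed from the result."""
    X = np.asarray(user_inputs, dtype=float)
    X_centred = X - X.mean(axis=0)
    # Principal axes via the singular value decomposition.
    _, _, vt = np.linalg.svd(X_centred, full_matrices=False)
    projections = X_centred @ vt[:n_components].T
    return projections.std(axis=0)

# Hypothetical raw events: (pressure, duration_s, tilt) per interaction.
events = [[1.0, 0.30, 0.10],
          [1.2, 0.28, 0.20],
          [0.9, 0.35, 0.15],
          [1.1, 0.31, 0.05]]
signature = compress_to_signature(events)   # a length-2 vector
```

Because only summary statistics of the projections are kept, the mapping is one-way in the sense described above: the original event sequence is not recoverable from the signature.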
Consider, for example, the data shown in FIGS. 5 and 6. The data compression module 74 may be configured to compress the data such that the information content is available, but the original data (such as the time, duration and position of the swipe data in FIG. 5 and the shape of each tilt signature in FIG. 6) cannot be reconstructed from the compressed data. For example, the swipe signature may be compressed so that the time and distance between swipes can be recovered, but the location of each swipe on a user interface screen cannot be reconstructed.
The compressed data may be stored at the database 76. The database 76 may be located, for example, at a server and may be configured to store data relating to many users, obtained from many user inputs. By storing usage signatures generated by one or more instances of the data compression module 74, the database 76 does not need to store information that can be used to reconstruct the original data provided by the user inputs 72. This may be advantageous for privacy or security reasons.
The processor 78 has access to the database 76 and therefore has access to the compressed data. The processor 78 may be configured to extract information from the stored data, as discussed further below. The processor 78 may be configured to, for example, extract or generate one or more user profiles and/or one or more application profiles from the data stored in the database 76, thereby implementing the step 26 of the algorithm 20 described above.
The processor 78 may be configured to aggregate information from multiple data sources in order to generate the user and/or application profiles. For example, as discussed below, data from a single user across multiple applications may be aggregated to generate a user profile and data from multiple users for a single application may be aggregated to generate an application profile. The aggregation may involve a simple average, but other methods, such as the use of alternating least squares, are possible.
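As a minimal sketch of the simple-average aggregation mentioned above (the signature values are invented for illustration), a user profile averages one user's signatures across applications, and an application profile averages many users' signatures for one application:

```python
import numpy as np

# Hypothetical usage signatures, one vector per observed session.
signatures_user_a = [np.array([0.2, 1.1, 0.5]),
                     np.array([0.4, 0.9, 0.7]),
                     np.array([0.3, 1.0, 0.6])]

def simple_profile(signatures):
    """Aggregate usage signatures by a simple element-wise average."""
    return np.mean(np.stack(signatures), axis=0)

profile = simple_profile(signatures_user_a)
print(profile)  # [0.3 1.  0.6]
```

More elaborate aggregation methods, such as the alternating least squares approach mentioned above, would replace `simple_profile` while leaving the surrounding data flow unchanged.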
The system 70 described above assumes that the aggregation of data to generate user and application profiles is carried out by the processor 78. This is not essential. For example, data could be aggregated within the database 76.
FIG. 8 is a block diagram of a system, indicated generally by the reference numeral 80, in accordance with an example embodiment.
The system 80 comprises a first user module 82, a second user module 83 and a third user module 84. Each of the user modules 82 to 84 may comprise a user input device and a data compression module (such as the user inputs module 72 and data compression module 74 described above). The system 80 further comprises a database 86 and a user profile generation module 87. The database 86 may be the database 76 of the system 70 described above. The user profile generation module 87 may be an example of the processor 78 described above.
As shown in FIG. 8, the first user module 82 is used by a first user to access a first application, the second user module 83 is used by the first user to access a second application and the third user module 84 is used by the first user to access a third application. Of course, the first, second and third modules 82 to 84 could be the same user module used (e.g. at different times) to access different applications.
The database 86 may therefore be configured to store data (which may be compressed and/or anonymized) relating to how the first user interacts with multiple applications. The user profile generation module 87 may therefore be configured to use the data stored in the database 86 to create a user profile for the first user including information across multiple applications.
FIG. 9 is a block diagram of a system, indicated generally by the reference numeral 90, in accordance with an example embodiment. The system 90 comprises a first user module 92, a second user module 93 and a third user module 94. Each of the user modules 92 to 94 may comprise a user input device and a data compression module (such as the user inputs module 72 and data compression module 74 described above). The system 90 further comprises a database 96 and an application profile generation module 97. The database 96 may be the database 76 of the system 70 described above. The application profile generation module 97 may be an example of the processor 78 described above.
As shown in FIG. 9, the first user module 92 is used by a first user to access a first application, the second user module 93 is used by a second user to access the first application and the third user module 94 is used by a third user to access the first application. Of course, the first, second and third modules 92 to 94 could be the same module used (at different times) by different users to access the first application.
The database 96 may therefore store data (which may be compressed and/or anonymized) relating to how different users interact with the first application. The application profile generation module 97 may therefore use the data stored in the database 96 to create an application profile for the first application including information across multiple users.
The system 90 may additionally comprise a fourth user module 98 and a fifth user module 99. Each of the user modules 98 and 99 may comprise a user input device and a data compression module (such as the user inputs module 72 and data compression module 74 described above). As shown in FIG. 9, the fourth user module 98 may be used by the first user to access a second application and the fifth user module 99 may be used by the first user to access a third application. The database 96 may therefore store data relating to how the first user interacts with multiple applications. Thus, the user profile generation module 97 may also be able to create a user profile for the first user including information across multiple applications.
As discussed above, the system 80 can be used to generate a user profile and the system 90 can be used to generate an application profile (and optionally a user profile). Such user profiles and application profiles are examples of the profile(s) that may be generated in the step 26 of the algorithm 20 discussed above.
User and application profiles generated in accordance with the principles described above have a wide variety of potential uses. A number of potential uses are described below with reference to FIGS. 10 to 18. It should be understood that these are only some of the many uses of such user and application profiles that will be apparent to those skilled in the art.
FIG. 10 is a flow chart showing an algorithm, indicated generally by the reference numeral 100, in accordance with an example embodiment. The algorithm 100 starts at step 102, where user input data is collected, as described above. A user profile may be generated based on a user that is currently using a particular application or device.
At step 104, the user data (e.g. user profile) of a person currently using the particular application or device is compared with the user profile of the normal user of the application or device. A difference between the normal user and the current user may be indicative of an unauthorised user. If no difference is detected, the algorithm terminates at step 108. If a difference is detected, the algorithm moves to step 106 where access to the device or application may be restricted or prohibited, before the algorithm terminates at step 108.
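The comparison of step 104 could, as one hedged example, be a similarity test between the current usage signature and the stored profile of the normal user; the cosine measure and the 0.9 threshold below are assumptions made for illustration, not part of the specification:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_likely_same_user(current_sig, stored_profile, threshold=0.9):
    """Flag a possible change of user when the current usage signature
    diverges from the stored profile. A deployed system would calibrate
    the threshold rather than use this illustrative value."""
    return cosine_similarity(current_sig, stored_profile) >= threshold

stored = np.array([0.3, 1.0, 0.6])      # normal user's profile
current = np.array([0.31, 0.98, 0.62])  # close to the profile
intruder = np.array([1.2, 0.1, 0.05])   # very different interaction style

print(is_likely_same_user(current, stored))   # True
print(is_likely_same_user(intruder, stored))  # False
```

A negative result would then trigger step 106, restricting or prohibiting access to the device or application.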
By way of example, the step 106 may allow access to a device, but may prevent certain function(s), such as payment functions. This would enable, for example, a child to use a mobile phone owned by a parent to access games, but the detection of a different user (based on a generated user profile based on game play style) may be used to prevent the child from authorising payments from the mobile phone.
There are further potential uses for the algorithm 100. For example, the user identification process could be used as an alternative to other device locking methods (e.g. as an alternative to providing a password).
FIG. 11 is a flow chart showing an algorithm, indicated generally by the reference numeral 110, in accordance with an example embodiment. The algorithm 110 starts at step 112, where user input data is collected, as described above.
At step 114, a user performance indication is generated. The user performance indication may be generated from the user data stored within a database (e.g. database 76, 86 or 96). By way of example, the user performance indication could be based on reaction time and/or user accuracy, which may be derivable from the stored user data. The indication could be based on data for a single application (such as a game), but could also be provided for a single user across multiple applications. A user performance indication for multiple users (of a single application or across multiple applications) could also be provided.
The step 114 could additionally output information regarding how the user's performance has changed over time and could compare the user's performance with other users, based on the user profile data stored in the database.
Once the user performance indication has been presented to the user, the algorithm 110 terminates at step 116.
The algorithm 110 can be adapted to provide different outputs (in addition to, or instead of, user performance indication). For example, the step 114 could provide a prediction of an emotional status of a user (for example whether the user is one or more of: content, excited, calm, tired and bored). Other uses of the principles of the algorithm 110 will be apparent to those skilled in the art.
FIG. 12 is a flow chart showing an algorithm, indicated generally by the reference numeral 120, in accordance with an example embodiment. The algorithm 120 starts at step 122, where user input data is collected, as described above.
At step 124, the user profile data is interrogated to determine whether the data indicates any potential short-term health issues. If so, the algorithm 120 moves to step 126; if not, the algorithm terminates at step 128.
Short-term health issues may be identified in step 124 by identifying a change in user performance, as indicated by the user profile data (e.g. comparing current performance with historical performance). The step 124 could, for example, be used to identify fatigue and may, for example, indicate that the user should take a break. In the event that a short-term health issue is identified, an alert is raised to the user at step 126 (e.g. recommending or requiring a break) and the algorithm 120 then terminates at step 128.
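A toy sketch of such short-term change detection follows; the reaction times, window size and z-score threshold are all illustrative assumptions rather than values from the specification:

```python
from statistics import mean, stdev

def fatigue_alert(reaction_times_ms, window=20, z_threshold=2.0):
    """Raise an alert when the most recent reaction time is unusually
    slow compared with the user's recent history -- a simple stand-in
    for the short-term change detection of step 124."""
    history, latest = reaction_times_ms[-window - 1:-1], reaction_times_ms[-1]
    baseline, spread = mean(history), stdev(history)
    # Guard against a zero spread before computing the z-score.
    return spread > 0 and (latest - baseline) / spread > z_threshold

# Hypothetical data: steady ~250 ms reactions, then a sudden slowdown.
times = [250, 248, 252, 251, 249, 250, 253, 247, 251, 250, 340]
print(fatigue_alert(times, window=10))  # True -- suggests taking a break
```

The long-term variant of FIG. 13 would use the same comparison over a much longer historical baseline.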
FIG. 13 is a flow chart showing an algorithm, indicated generally by the reference numeral 130, in accordance with an example embodiment. The algorithm 130 starts at step 132, where user input data is collected, as described above.
At step 134, the user profile data is interrogated to determine whether the data indicates any potential long-term health issues. If so, the algorithm 130 moves to step 136; if not, the algorithm terminates at step 138.
Long-term health issues may be identified in step 134 by identifying a change in user performance, as indicated by the user profile data (e.g. comparing current performance with historical performance). The step 134 could, for example, be used to identify changes in levels of attention or reaction time. In the event that a long-term health issue is identified, an alert is raised to the user at step 136 and the algorithm 130 then terminates at step 138.
FIG. 14 is a flow chart showing an algorithm, indicated generally by the reference numeral 140, in accordance with an example embodiment. The algorithm 140 starts at step 142, where user input data is collected, as described above. The user data collected in step 142 may be based on many users of a particular application (such as a game).
At step 144, user feedback, for example in the form of an application profile, is provided. For example, the user feedback could be provided to the developer of the application. The algorithm 140 then terminates at step 146. The user feedback provided in step 144 may enable a game or application developer to obtain information regarding how users are interacting with their game or application. Such data can be provided anonymously and can provide data regarding real users, rather than a test community.
FIG. 15 is a flow chart showing an algorithm, indicated generally by the reference numeral 150, in accordance with an example embodiment. The algorithm 150 starts at step 152, where user input data is collected, as described above. The user data collected in step 152 may be based on many users of a particular application (such as a game).
At step 154, similar applications may be identified. Applications may be deemed to be similar in the event that the application profiles generated for those applications are similar according to predefined metrics. The identification of similar applications may be of interest, for example, to a user who enjoys a particular style of game and wishes to identify games with similar attributes. The algorithm 150 then terminates at step 156.
In addition to, or instead of, the step 154, the algorithm 150 may include identifying similar users on the basis of users having similar user fingerprints and/or identifying similar user groups on the basis of users having similar user fingerprints for a given application.
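One hedged way to realise the similarity test of step 154 is to rank applications by the distance between their aggregated profiles; the application names, profile vectors and choice of Euclidean distance below are invented for the example:

```python
import math

# Hypothetical application profiles: aggregated usage-signature vectors
# capturing, say, average swipe speed, session length and tap rate.
profiles = {
    "puzzle_a": (0.2, 0.9, 0.3),
    "puzzle_b": (0.25, 0.85, 0.35),
    "racer_c": (0.9, 0.2, 0.8),
}

def most_similar(app, profiles):
    """Return the other applications ordered by profile distance,
    closest (most similar) first."""
    ref = profiles[app]
    others = [name for name in profiles if name != app]
    return sorted(others, key=lambda n: math.dist(ref, profiles[n]))

print(most_similar("puzzle_a", profiles))  # ['puzzle_b', 'racer_c']
```

The same ranking applied to user fingerprints instead of application profiles yields the similar-user variant described above.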
FIG. 16 is a flow chart showing an algorithm, indicated generally by the reference numeral 160, in accordance with an example embodiment. The algorithm 160 starts at step 162, where user input data is collected, as described above. The user data collected in step 162 may be based on many users of a particular application (such as a game) and/or users of a plurality of applications.
At step 164, clustering by similarity is carried out. The step 164 may, for example, cluster users into groups of users having similar user characteristics (as extracted from the user inputs collected in step 162). Alternatively, or in addition, the step 164 may cluster applications into groups of applications having similar user characteristics (as extracted from the user inputs collected in step 162).
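The clustering of step 164 could, for example, be sketched with a minimal k-means over usage-signature vectors; the two behavioural groups below are invented illustrative data, and a production system might use a library implementation instead:

```python
import numpy as np

def kmeans(points, k, iterations=20, seed=0):
    """A minimal k-means sketch: cluster usage signatures into k groups
    of users (or applications) with similar characteristics."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iterations):
        # Assign each signature to its nearest centroid.
        labels = np.argmin(
            np.linalg.norm(points[:, None] - centroids[None], axis=2), axis=1)
        # Move each centroid to the mean of its assigned signatures.
        centroids = np.array(
            [points[labels == i].mean(axis=0) for i in range(k)])
    return labels

# Two hypothetical behavioural groups: fast, aggressive players versus
# slow, deliberate players.
sigs = np.array([[0.9, 0.1], [0.85, 0.15], [0.1, 0.9], [0.15, 0.85]])
print(kmeans(sigs, k=2))  # e.g. [0 0 1 1] -- two clear groups
```

The resulting group labels feed directly into the uses of step 166, such as targeted recommendations per user group.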
At step 166, the cluster information collected in step 164 is used. For example, the cluster information could be used for one or more of: targeted advertising, improved recommendations and tailoring mobile experiences to specific user groups. Once grouped, generalised statistics from user or application groups can be obtained. Insights obtained from such statistics may be usable to further improve user experiences. Experimental results, novel insights and correlations of performance, preference and/or engagement are all possible applications of the step 166. Other uses will be apparent to the skilled person.
The algorithm 160 then terminates at step 168.
FIG. 17 is a flow chart showing an algorithm, indicated generally by the reference numeral 170, in accordance with an example embodiment. The algorithm 170 starts at step 172, where user input data is collected, as described above. The user data collected in step 172 may be based on many users of a particular application (such as a game) and/or users of a plurality of applications.
At step 174, one or more predictions are made based on the user input(s) collected in step 172. The algorithm 170 then terminates at step 176.
For example, the step 174 may be used to predict or infer data. For example, movies that a particular user might like may be predicted on the basis of the known preferences of the user's near neighbours (with the near neighbours being identified, for example, as other users with similar user characteristics).
In some embodiments, the algorithms 160 and 170 may be combined. For example, the step 174 may make use of the clustering techniques of step 164 in the identification of near neighbours for providing predictions in the step 174.
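A tiny sketch of the near-neighbour prediction of step 174 follows; the users, signature vectors and movie titles are all hypothetical:

```python
import math

# Hypothetical users with usage-signature vectors and known movie likes.
users = {
    "alice": ((0.9, 0.1), {"thriller_x", "heist_y"}),
    "bob":   ((0.88, 0.12), {"thriller_x", "noir_z"}),
    "carol": ((0.1, 0.9), {"romcom_q"}),
}

def predict_likes(target, users, k=1):
    """Infer likes for `target` from its k nearest neighbours by
    signature distance, excluding items the target already likes."""
    sig, own = users[target]
    neighbours = sorted(
        (n for n in users if n != target),
        key=lambda n: math.dist(sig, users[n][0]))[:k]
    return set().union(*(users[n][1] for n in neighbours)) - own

print(predict_likes("alice", users))  # {'noir_z'} -- from neighbour bob
```

When combined with the clustering of step 164, the neighbour search could be restricted to the target user's cluster for efficiency.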
FIG. 18 is a flow chart showing an algorithm, indicated generally by the reference numeral 180, in accordance with an example embodiment. The algorithm 180 starts at step 182, where user input data is collected, as described above. The user data collected in step 182 may be based on usage signatures of a single user for one or more applications.
At step 184, one or more applications are suggested based on a determined emotion of the user (based on the user input(s) collected in step 182). The algorithm 180 then terminates at step 186.
For example, the step 184 may make use of the way in which a user is interacting with a game to identify, determine or predict an emotional status of the user (e.g. excited, bored, short-tempered etc.). In addition, the step 184 may make use of biometric data (e.g. heart rate data) in the identification, determination or prediction of emotional status. Once an emotion has been identified, determined or predicted, games (or other applications) may be suggested accordingly. Such games may, for example, be suggested in order to reduce an emotional response (e.g. calming games if a user is over-excited) or to make use of a detected emotion.
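As a purely illustrative sketch of step 184 (the emotion labels and suggested categories are assumptions; real emotion detection would come from the interaction style and biometric data described above):

```python
# Illustrative mapping from a detected emotional status to suggested
# application categories. The labels and categories are hypothetical.
suggestions = {
    "over-excited": ["calming puzzle", "ambient exploration"],
    "bored": ["fast-paced arcade", "competitive multiplayer"],
    "content": ["continue current game"],
}

def suggest_apps(emotion: str) -> list[str]:
    """Return suggested application categories for a detected emotion,
    falling back to no suggestion for unrecognised states."""
    return suggestions.get(emotion, ["no suggestion"])

print(suggest_apps("over-excited"))  # ['calming puzzle', 'ambient exploration']
```

A richer implementation might weight suggestions by the user profile, so that the calming suggestion matches the genres the user already plays.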
If desired, different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined. For example, any combination of the algorithms described above with reference to FIGS. 10 to 18 may be provided within a single implementation. For example, the detection of potential short-term and long-term health issues as described above with reference to the algorithms 120 and 130 could readily be combined. Moreover, those skilled in the art will be aware of variants to such algorithms.
For completeness, FIG. 19 is a schematic diagram of a system, indicated generally by the reference numeral 200, that may be used in example implementations of the principles described herein. The system 200 comprises a processor 202, a memory 204 (for example, including RAM and/or ROM), input means 206 and output means 208. The processor 202 may be in communication with each of the other components in the system 200 in order to control operation thereof. The processor 202 may take any suitable form, such as a microcontroller, plural microcontrollers, a processor, or plural processors.
The memory 204 may include a non-volatile memory, a hard disk drive (HDD) or a solid state drive (SSD) and may, for example, store an operating system and/or one or more software applications. The operating system may contain code which, when executed by the processor, implements aspects of the algorithms described herein.
The input means 206 and the output means 208 may take many different forms and may be provided, for example, to allow a user (such as an application or games developer) to interact with the system 200.
It will be appreciated that the above described example embodiments are purely illustrative and not limiting on the scope of the invention. Other variants and modifications will be apparent to persons skilled in the art upon reading the present specification.
Moreover, the disclosure of the present application should be understood to include any novel features or any novel combination of features either explicitly or implicitly disclosed herein or any generalization thereof and during the prosecution of the present application or of any application derived therefrom, new claims may be formulated to cover any such features and/or combination of such features.
Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.
It is also noted herein that while the above describes various examples, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present invention as defined in the appended claims.
Claims (15)
- A method for controlling an apparatus, comprising:
  obtaining, by the apparatus, a plurality of user inputs;
  converting, by the apparatus, the plurality of user inputs into a plurality of usage signatures respectively, wherein the plurality of usage signatures include an interaction profile indicating how a specified user interacts with a specified application; and
  generating, by the apparatus, one or more application profiles and/or one or more user profiles based on the plurality of usage signatures.
- The method of claim 1, wherein each of the one or more application profiles is based on the plurality of usage signatures of a plurality of users of a respective application among a plurality of applications.
- The method of claim 1, wherein each of the one or more user profiles is based on usage signatures of a single user for a plurality of applications.
- The method of claim 1, wherein each of the plurality of user inputs is converted into the plurality of usage signatures using a function that generates a unique fingerprint without storing data of the plurality of user inputs.
- The method of claim 1, further comprising using at least one of the one or more user profiles to identify a likely change in a user of a user device.
- The method of claim 5, further comprising blocking access to one or more functions of the user device on identifying the likely change in the user of the user device.
- An apparatus comprising:
  an input module configured to receive a plurality of user inputs;
  a converter configured to convert the plurality of user inputs into a plurality of usage signatures respectively, wherein the plurality of usage signatures include an interaction profile indicating how a specified user interacts with a specified application; and
  a processor configured to generate one or more application profiles and/or one or more user profiles based on the plurality of usage signatures.
- The apparatus of claim 7, wherein each of the plurality of usage signatures is anonymized.
- The apparatus of claim 7, wherein the processor is configured to generate each of the one or more application profiles based on the plurality of usage signatures of a plurality of users of a respective application among a plurality of applications.
- The apparatus of claim 9, wherein the processor is configured to generate the one or more application profiles based on an average of relevant usage signatures.
- The apparatus of claim 7, wherein the processor is configured to generate each of the one or more user profiles based on usage signatures of a single user for a plurality of applications.
- The apparatus of claim 11, wherein the processor is configured to generate the one or more user profiles based on an average of the relevant usage signatures.
- The apparatus of claim 7, wherein the plurality of user inputs include at least one of: swipes of one or more user devices, touches of the one or more user devices, pressure applied to the one or more user devices, pressure size applied to the one or more user devices, buttons pressed, pressure applied to buttons, joystick positions, gyroscope data, accelerometer data, user reaction time, user hand position, user hand speed, user playing style, biometric data or other external data.
- The apparatus of claim 7, wherein the converter comprises at least one of a sparse coding module, a principal component analysis module, or a module using a function for generating a unique fingerprint without storing data of the plurality of user inputs.
- The apparatus of claim 7, further comprising an output module, wherein the output module is configured to identify at least one of:
  a likely change in a user of a user device;
  user performance feedback over time;
  changes in user profile data indicative of a potential health problem;
  an emotional status of a user;
  user feedback for a respective application among a plurality of applications;
  similar applications, on the basis of applications having similar application profiles;
  similar users, on the basis of users having a similar usage signature;
  similar user groups, on the basis of users having similar usage signatures for a given application;
  user behaviour prediction;
  user profile prediction; or
  applications based on the emotional status of the user.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1807658.8A GB2575236A (en) | 2018-05-11 | 2018-05-11 | User and/or application profiles |
| GB1807658.8 | 2018-05-11 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019216671A1 true WO2019216671A1 (en) | 2019-11-14 |
Family
ID=62623247
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2019/005574 Ceased WO2019216671A1 (en) | 2018-05-11 | 2019-05-09 | User and/or application profiles |
Country Status (2)
| Country | Link |
|---|---|
| GB (1) | GB2575236A (en) |
| WO (1) | WO2019216671A1 (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12125054B2 (en) | 2018-09-25 | 2024-10-22 | Valideck International Corporation | System, devices, and methods for acquiring and verifying online information |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070214144A1 (en) * | 2001-04-26 | 2007-09-13 | Lawson Robert J | System and method for managing user profiles |
| US20120239752A1 (en) * | 2009-12-02 | 2012-09-20 | Vinod Kumar Gopinath | Management of user profiles in a cloud based managed utility computing environment |
| US20120323694A1 (en) * | 2011-06-15 | 2012-12-20 | Blue Kai, Inc. | Non-invasive sampling and fingerprinting of online users and their behavior |
| US20150088955A1 (en) * | 2013-09-20 | 2015-03-26 | Nuance Communications, Inc. | Mobile application daily user engagement scores and user profiles |
| US20170086053A1 (en) * | 2012-01-27 | 2017-03-23 | Microsoft Technology Licensing, Llc | Data usage profiles for users and applications |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9696336B2 (en) * | 2011-11-30 | 2017-07-04 | The Nielsen Company (Us), Llc | Multiple meter detection and processing using motion data |
| US10289819B2 (en) * | 2015-08-12 | 2019-05-14 | Kryptowire LLC | Active authentication of users |
- 2018-05-11: GB application GB1807658.8A filed (published as GB2575236A; withdrawn)
- 2019-05-09: PCT application PCT/KR2019/005574 filed (published as WO2019216671A1; ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| GB2575236A (en) | 2020-01-08 |
| GB201807658D0 (en) | 2018-06-27 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19799796 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19799796 Country of ref document: EP Kind code of ref document: A1 |