
WO2015167470A1 - Measuring user action performance by classifying user actions - Google Patents

Measuring user action performance by classifying user actions

Info

Publication number
WO2015167470A1
Authority
WO
WIPO (PCT)
Prior art keywords
user action
user
data
control type
classification
Prior art date
Application number
PCT/US2014/035915
Other languages
French (fr)
Inventor
Haim SHUVALI
Amichai Nitsan
Dana GILBOA
Yirat HENDLER
Guy Offer
Original Assignee
Hewlett-Packard Development Company, Lp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, Lp filed Critical Hewlett-Packard Development Company, Lp
Priority to PCT/US2014/035915 priority Critical patent/WO2015167470A1/en
Publication of WO2015167470A1 publication Critical patent/WO2015167470A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 - Administration; Management
    • G06Q 10/06 - Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063 - Operations research, analysis or management
    • G06Q 10/0633 - Workflow analysis
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/30 - Monitoring
    • G06F 11/34 - Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3438 - Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment; monitoring of user actions
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • A typical computing device may allow a user to interact with a software application through a user interface displayed on a display device. For instance, when a user performs a user action, such as clicking a button displayed on a user interface of the computing device, the application may cause the processor of the computing device to perform a number of operations that ultimately result in an output that is visible to the user, such as an update to the user interface.
  • FIG. 1 is a block diagram of an example system for measuring user action performance consistent with disclosed implementations.
  • FIG. 2 is a block diagram of an example user action performance measurement device consistent with disclosed implementations.
  • FIG. 3 is a block diagram of an example user action performance measurement device consistent with disclosed implementations.
  • FIG. 4 is a flow chart of an example process for measuring user action performance consistent with disclosed implementations.
  • FIG. 5 is a flow chart of an example process for capturing user action data consistent with disclosed implementations.
  • FIG. 6 is a flow chart of an example process for classifying user actions consistent with disclosed implementations.
  • FIG. 7 is a flow chart of an example process for analyzing user actions based on user action classifications consistent with disclosed implementations.
  • FIG. 8 is an example of a user interface for displaying user action-related statistics consistent with disclosed implementations.
  • The average response time may be skewed by a very few long network response times, and the comparison of each user action to a single threshold may not accurately reflect user frustration when an action a user expects to take a short amount of time is taking longer than the user expects, but less time than the single static threshold. Moreover, measuring the network response time instead of the time it takes for the user to obtain the result he or she expects (e.g., a complete display update) provides little insight into the user's satisfaction. Accordingly, an accurate measurement of the user's experience with the application should be based on user expectations.
  • Examples disclosed herein provide user action performance measurement based on user expectation. To this end, example implementations disclosed herein may measure user action performance by distinguishing between actions that a user expects to take a short amount of time and those that a user expects to take a long amount of time.
  • User actions may be automatically classified by analyzing a control type (e.g., a type of object with which a user can interact, such as a hyperlink, a checkbox, a text entry box, a button, etc.) associated with a user action and the sequence of the user action within the user session.
  • Some implementations may capture data associated with the user actions, such as a control type associated with a first user action and a control type associated with a second user action that precedes the first user action. Some implementations may also classify the first user action based on the first control type and the second control type, and analyze the captured data based on the classification. Moreover, some implementations may determine the perceived duration of a user action rather than the network response time.
  • FIG. 1 is a block diagram of an example system 100 for measuring user action performance consistent with disclosed implementations.
  • System 100 may be implemented in a number of different configurations without departing from the scope of the disclosed examples.
  • System 100 may include a user action performance measurement device 110, a client device 120, an application content provider device 130, a database 140, and a network 150 for connecting user action performance measurement device 110 with client device 120, application content provider device 130, and/or database 140.
  • User action performance measurement device 110 may be a computing system that performs various functions consistent with disclosed examples, such as classifying user actions based on the control type associated with the user action and the control type associated with a preceding user action.
  • Measurement device 110 may be a desktop computer, a laptop computer, a tablet computing device, a mobile phone, a server, and/or any other type of computing device.
  • measurement device 110 may process information received from client device 120, application content provider device 130, and/or database 140.
  • Measurement device 110 may capture data associated with a first user action, classify the first user action based on the control type of the first user action and a control type of a second user action that precedes the first user action, and/or perform data analysis on the captured data. Examples of measurement device 110 and certain functions that may be performed by device 110 are described in greater detail below with respect to, for example, FIGs. 2-8.
  • Client device 120 may be a computing system operated by a user.
  • Client device 120 may be a desktop computer, a laptop computer, a tablet computing device, a mobile phone, a server, and/or any other type of computing device.
  • client device 120 may be a computing device to perform operations consistent with certain disclosed implementations.
  • client device 120 may be adapted to detect a user action and to update a user interface in response to the user action.
  • Client device 120 may include a processor to execute instructions stored in a machine-readable storage medium. In the example shown in FIG. 1, client device 120 may include a processor 122, a machine-readable storage medium 124, a display device 126, and an interface 128.
  • Processor 122 of client device 120 may be at least one processing unit (CPU), microprocessor, and/or another hardware device to execute instructions to perform operations.
  • Processor 122 may fetch, decode, and execute instructions stored in machine-readable storage medium 124 (such as application instructions 127) to display a user interface, to detect a user action, to update the user interface in response to the user action, and/or to collect and/or transmit data associated with the user action.
  • Machine-readable storage medium 124 may be any electronic, magnetic, optical, or other non-transitory storage device that stores instructions executed by processor 122.
  • Display device 126 may be any type of display device that presents information, such as a user interface, to a user operating client device 120.
  • Interface device 128 may be any combination of hardware and/or programming that facilitates the exchange of data between the internal components of client device 120 and external components, such as user action performance measurement device 110.
  • interface device 128 may include a network interface device that allows client device 120 to receive and send data to and from application content provider device 130 via network 150.
  • Application content provider device 130 may be a computing system operated by the content provider of an application.
  • Content provider device 130 may be a desktop computer, a laptop computer, a tablet computing device, a mobile phone, a server, and/or any other type of computing device.
  • Content provider device 130 may be a computing device capable of receiving a request from client device 120 to update a user interface based on a user action.
  • Content provider device 130 may transmit user interface update instructions based on the user action to client device 120.
  • the user interface update instructions may be used by client device 120 to provide a user interface response to the user via a display, such as display device 126.
  • Client device 120 may store the user interface update instructions in machine-readable storage medium 124, and/or processor 122 of client device 120 may execute the user interface update instructions to update the user interface with new text, graphics, and/or a combination of text and graphics.
  • Application content provider device 130 (and/or another component, such as measurement device 110, client device 120, etc.) may transmit information related to when the user interface has been fully updated such that all of the text and/or graphics have fully loaded. This information may be stored in a storage device, such as a machine-readable storage medium and/or a database, such as database 140.
  • Database 140 may be any type of storage system configuration that facilitates the storage of data.
  • Database 140 may facilitate the locating, accessing, and retrieving of data (e.g., SaaS, SQL, Access, etc. databases).
  • Database 140 can be populated by a number of methods.
  • Measurement device 110 may populate database 140 with database entries generated by measurement device 110, and store the database entries in database 140.
  • measurement device 110 may populate database 140 by receiving a set of database entries from another component, a wireless network operator, or a user of client device 120 and/or application content provider device 130, and storing the database entries in database 140.
  • The database entries can contain a plurality of fields, which may include information related to user actions, such as, for example, the date/time of the user action, the name of the user action, the position of the user action within the sequence of user actions, other user actions within the sequence, the user, the client device type, the device operating system type, the amount of time between the user action and a user interface response to the user action, the control type associated with the user action, the classification of the user action, and/or the classification of the user interface control.
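  • For illustration, a minimal Python sketch of how such a database entry might be represented is shown below; the class and field names are assumptions for this sketch and are not taken from the patent.

        from dataclasses import dataclass
        from datetime import datetime
        from typing import Optional

        @dataclass
        class UserActionRecord:
            # Hypothetical record mirroring the fields described above.
            timestamp: datetime                   # date/time of the user action
            action_name: str                      # name of the user action
            sequence_position: int                # position within the sequence of user actions
            user: str                             # user identifier
            client_device_type: str               # e.g., phone, tablet, desktop
            operating_system: str                 # device operating system type
            response_duration_ms: float           # time between the user action and the UI response
            control_type: str                     # e.g., "button", "checkbox", "text entry"
            classification: Optional[str] = None  # e.g., "long", "short", or "unknown"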
  • Database 140 is a single component external to components 110, 120, and 130.
  • Database 140 may comprise separate databases and/or may be part of devices 110, 120, 130, and/or another device.
  • Database 140 may be managed by components of devices 110, 120, and/or 130 that are capable of accessing, creating, controlling, and/or otherwise managing data remotely through network 150.
  • Network 150 may be any type of network that facilitates communication between remote components, such as measurement device 110 and client device 120.
  • Network 150 may be a local area network (LAN), a wide area network (WAN), a virtual private network, a dedicated intranet, the Internet, and/or a wireless network.
  • FIG. 1 shows one measurement device 110, client device 120, content provider device 130, database 140, and network 150.
  • System 100 may include any number of components 110, 120, 130, 140, and 150, as well as other components not depicted in FIG. 1.
  • System 100 may also omit any of components 110, 120, 130, 140, and 150.
  • Measurement device 110 and content provider device 130 may be directly connected instead of being connected via network 150.
  • FIG. 2 is a block diagram of an example user action performance measurement device 210 consistent with disclosed implementations.
  • User action performance measurement device 210 may correspond to user action performance measurement device 110 of FIG. 1.
  • Measurement device 210 may be implemented in various ways.
  • measurement device 210 may be a special purpose computer, a server, a mainframe computer, a computing device executing instructions that receive and process information and provide responses, a mobile phone, and/or any other type of computing device.
  • Measurement device 210 may include a processor 220, an interface 230, and a machine-readable storage medium 240.
  • Processor 220 may be at least one processing unit (CPU), microprocessor, and/or another hardware device to execute instructions to perform operations.
  • Processor 220 may fetch, decode, and execute user action performance measurement instructions 250 (e.g., instructions 252, 254, and/or 256) stored in machine-readable storage medium 240 to perform operations related to disclosed examples.
  • Interface device 230 may be any device that facilitates the transfer of information between device 210 and external components, such as client device 120.
  • interface device 230 may include a network interface device that allows device 210 to receive and send data to and from network 150.
  • Interface device 230 may retrieve and process data related to user actions from client device 120 via network 150.
  • Machine-readable storage medium 240 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions.
  • machine-readable storage medium 240 may be, for example, Random Access Memory (RAM), Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like.
  • Machine-readable storage medium 240 may be a non-transitory computer-readable storage medium, where the term "non-transitory" does not encompass transitory propagating signals.
  • Machine-readable storage medium 240 may be encoded with instructions that, when executed by processor 220, perform operations consistent with disclosed implementations.
  • Machine-readable storage medium 240 may include instructions that perform operations that may classify user actions based on the control type of the user interface control associated with the user action and the context or flow in which the user interface control is being used. In the example shown in FIG. 2, machine-readable storage medium 240 may include data capture instructions 252, user action classification instructions 254, and data analysis instructions 256.
  • Data capture instructions 252 may function to capture data related to a user action. For example, when data capture instructions 252 are executed by processor 220, data capture instructions 252 may cause processor 220 of measurement device 210, processor 122 of client device 120, and/or another processor to capture data related to a user action, classify the user action as either a first classification or a second classification based on the data, and/or determine the amount of time between the user action and the corresponding user interface response. This analysis is described in further detail below with respect to, for example, FIGs. 4 and 5.
  • User action classification instructions 254 may function to classify user actions. For example, when user action classification instructions 254 are executed by processor 220, user action classification instructions 254 may cause processor 220 of measurement device 210 and/or another processor to classify user actions.
  • User action classification instructions 254 may also cause a processor to store the classification of the user action and/or the user interface control in machine-readable storage medium 240 and/or in another storage device, such as database 140. This analysis is described in further detail below with respect to, for example, FIGs. 4 and 6.
  • Data analysis instructions 256 may function to analyze data related to the user action based on the classification of the user action. For example, when data analysis instructions 256 are executed by a processor, such as processor 220 of measurement device 210, data analysis instructions 256 may cause processor 220 of measurement device 210, processor 122 of client device 120, and/or another processor to determine statistics for user actions of the same classification. As another example, data analysis instructions 256 may cause processor 220 and/or another processor to determine whether the amount of time that elapses between a user action and a user interface response to the user action exceeds a threshold, where the threshold is based on the classification of the user action. This analysis is described in further detail below with respect to, for example, FIGs. 4, 7, and 8.
  • FIG. 3 is a block diagram of an example user action performance measurement device 310 consistent with disclosed implementations.
  • user action performance measurement device 310 may correspond to user action performance measurement device 110 of FIG. 1.
  • Device 310 may be implemented in various ways.
  • Device 310, like device 210, may be a special purpose computer, a server, a mainframe computer, a computing device executing instructions that receive and process information and provide responses, and/or any other type of computing device.
  • device 310 may include an interface device 330, a data capture engine 340, a user action classification engine 350, and a data analysis engine 360.
  • Interface device 330 may be any device that facilitates the transfer of information between user action performance measurement device 310 and external components, such as client device 120.
  • interface device 330 may include a network interface device that allows user action performance measurement device 310 to receive and send data to and from network 150.
  • Interface device 330 may retrieve and process data related to user actions from client device 120 via network 150.
  • Engines 340, 350, and 360 may be electronic circuitry for implementing functionality consistent with disclosed examples.
  • Engines 340, 350, and 360 may represent combinations of hardware devices and programming to implement the functionality consistent with disclosed implementations.
  • The functionality of engines 340, 350, and 360 may correspond to operations performed by user action performance measurement device 210 of FIG. 2, such as operations performed when user action performance measurement instructions 250 are executed by processor 220 (described above with respect to FIG. 2).
  • Data capture engine 340 may represent a combination of hardware and programming that performs operations similar to those performed when processor 220 executes data capture instructions 252.
  • User action classification engine 350 may represent a combination of hardware and programming that performs operations similar to those performed when processor 220 executes user action classification instructions 254.
  • data analysis engine 360 may represent a combination of hardware and programming that performs operations similar to those performed when processor 220 executes data analysis instructions 256.
  • FIG. 4 is a flow chart of an example process 400 for measuring user action performance consistent with disclosed implementations.
  • Although execution of process 400 is described below with reference to system 100 of FIG. 1 and/or specific components of system 100, other suitable systems and devices for execution of at least one step of process 400 may be used.
  • Processes described below as being performed by measurement device 110 may be performed by client device 120, user action performance measurement device 210, user action performance measurement device 310, and/or any other suitable device.
  • Processes described below as being performed by client device 120 may be performed by measurement devices 110, 210, 310, and/or any other suitable device.
  • Process 400 may be implemented in the form of executable instructions stored on a machine-readable storage medium and/or in the form of electronic circuitry.
  • Process 400 may start (step S405) when a user action has been performed on a user interface.
  • Client device 120 may output a user interface for an application on an available display, such as display device 126.
  • The user interface may include content, such as text, graphics, or a combination of text and graphics, which represents information and/or actions that are available to a user.
  • The user interface may include hyperlinks, radio buttons, checkboxes, text entry boxes, buttons, and/or other types of controls that a user may interact with. Users may interact with the interface by inputting a user action related to the control to client device 120.
  • A user may execute a mouse click, move a mouse, execute a touch gesture on a touch-enabled display, execute a voice command, or provide another type of input.
  • process 400 may include capturing user action data (step S420)
  • system 100 may receive the user action input and/or determine a control type associated with the user action.
  • System 100 may also determine the amount of time that elapses between the user action and a user interface response to the user action.
  • System 100 may determine the amount of time between the user action and a complete user interface response such that the user interface is fully updated (e.g., all graphics and text are loaded).
  • System 100 may also collect and/or store data related to the user action. Examples of the steps that may be involved with capturing user action data are discussed in greater detail below with respect to, for example, FIG. 5.
  • Process 400 may also include classifying the user action (step S430) based on user action data.
  • System 100 may classify data based on the control type associated with the user action and the context or flow in which the control is being used. In some examples, system 100 may determine a control type associated with a user action (e.g., a first user action) and a control type of a preceding user action (e.g., a second user action) and compare those control types with known control types to determine a classification. For example, system 100 may compare the control types to sets of long action control types and sets of short action control types stored in a storage device, such as database 140, to determine a match.
  • System 100 may then classify the user action (and/or the control associated with the user action) as one of a first classification (e.g., a long action) or a second classification (e.g., a short action) based on the match. Examples of the steps that may be involved with classifying user actions are discussed in greater detail below with respect to, for example, FIG. 6.
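  • A minimal Python sketch of this comparison is shown below; it assumes the sets are stored as (preceding control type, current control type) pairs, and the pairs listed are only a few of the example combinations given later in this description.

        # Hypothetical pair sets: (initial control type, subsequent control type).
        LONG_ACTION_SETS = {
            ("checkbox", "refresh"),
            ("switch", "refresh"),
            ("open menu option", "refresh"),
            ("radio button", "refresh"),
            ("text entry", "button"),
        }
        SHORT_ACTION_SETS = {
            ("refresh", "checkbox"),
            ("list option", "scroll view"),
            ("radio button", "text box"),
        }

        def classify_user_action(preceding_control_type, control_type):
            """Return 'long', 'short', or 'unknown' for the current user action."""
            pair = (preceding_control_type, control_type)
            if pair in LONG_ACTION_SETS:
                return "long"    # first classification
            if pair in SHORT_ACTION_SETS:
                return "short"   # second classification
            return "unknown"     # third classification

        # Example: a refresh that follows a checkbox is treated as a long action.
        print(classify_user_action("checkbox", "refresh"))  # -> "long"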
  • Process 400 may also include analyzing user action data based on the classification (step S440). For example, measurement device 110 may calculate statistics for user actions assigned to the first classification, and/or statistics for user actions assigned to the second classification. As another example, measurement device 110 may determine whether the amount of time between the user action and the user interface response to the user action exceeds a threshold, where the threshold is based on the classification of the user action. If so, measurement device 110 may generate an alert indicating that the application is experiencing perceived user action durations that exceed the threshold, and may calculate statistics for user actions of the same classification that exceed the threshold. Examples of the steps involved with analyzing user action data based on the classification are discussed in greater detail below with respect to, for example, FIGs. 7 and 8.
  • After the user action data is captured (step S420), the user action is classified based on the user action data (step S430), and/or the user action data is analyzed based on the classification (step S440), process 400 may end (step S455).
  • FIG. 5 is a flow chart of an example process 500 for capturing user action data consistent with disclosed implementations.
  • Although execution of process 500 is described below with reference to system 100 of FIG. 1 and/or specific components of system 100, other suitable systems and devices for execution of at least one step of process 500 may be used.
  • Processes described below as being performed by measurement device 110 may be performed by client device 120, user action performance measurement device 310, and/or any other suitable device, and processes described below as being performed by client device 120 may be performed by measurement devices 110, 210, 310, and/or any other suitable device.
  • Process 500 may be implemented in the form of executable instructions stored on a machine-readable storage medium and/or in the form of electronic circuitry. In certain aspects, process 500 may relate to the processes associated with step S420 of FIG. 4.
  • Process 500 may start (step S505) after a user action has been performed on a user interface.
  • system 100 may receive a user action input (step S510), such as an input to client device 120.
  • Example inputs may include, but are not limited to, executing a mouse click, moving a mouse, executing a touch gesture on a touch-enabled display, executing a voice command, entry of numeric, alphanumeric, and/or other character strings, and/or any other type of input.
  • the user action may correspond to a request from the user to provide additional content and/or otherwise receive a user interface response reflecting the content requested.
  • Receipt of the user action input may cause client 120 to run several tasks or processes, some of which may function to update a user interface displayed on display device 126. Tasks and/or processes that update the user interface may be considered to be user interface tasks.
  • process 500 may continue by marking the time at which the user action was performed (step S520).
  • Measurement device 110 may include instructions which, when executed by a processor, generate a time stamp which contains the start time of the user action. The time stamp may be stored in a storage device such as, for example, a machine-readable medium and/or database 140.
  • Measurement device 110 may start a timer at the time the user action was performed, and the timer may continue to run until a specified action occurs, such as when the user interface tasks are complete (e.g., when the user interface is fully updated).
  • Process 500 may continue by transmitting a request to update the user interface based on the user action (step S530).
  • Client device 120 may transmit a request to content provider device 130 to provide content requested by the user.
  • The request may include information associated with the user action, such as the time and date of the user action, the type of client device 120, the operating system of client device 120, the display type of display device 126, the user name, an identifier that indicates the content requested by the user based on the user action, information associated with previous user actions (e.g., characters entered by a user in a text box), and any other suitable information that content provider device 130 may use to provide content to client device 120.
  • Application content provider device 130 may receive the request and may provide a response including instructions to client 120 via network 150 to update the user interface based on the request.
  • Client 120 may receive the network response (step S540) and may execute instructions, such as instructions received from content provider device 130, to provide a user interface response (step S550).
  • Client 120 may update the user interface to reflect the user action input. This may include, for example, marking or unmarking a checkbox, activating a button, updating the position of a slider on a slider bar, returning search results, navigating to a different part of the user interface, presenting a drop down menu, providing new text and/or graphics, or otherwise changing another type of interface element.
  • Client 120 may output the updated user interface on a display device, such as display device 126.
  • Process 500 may continue by marking the time at which the user interface response was complete (step S560). For example, similar to step S520, measurement device 110 may include instructions which, when executed by a processor, generate a time stamp which contains the time the user interface update was complete.
  • Client 120 may run several tasks or processes, some of which may update the user interface.
  • A complete user interface update may occur when the last task that updates the user interface is complete.
  • The time stamp may include the time that the last task that updates the user interface is complete.
  • The time stamp may be stored in a storage device such as, for example, a machine-readable medium and/or database 140.
  • System 100 may end the timer started in step S520 once the last task that updates the user interface is complete.
  • Process 500 may also determine the amount of time that elapsed between the user action and the user interface response (step S570). For example, in some implementations, device 110 may calculate the elapsed time by subtracting the time stamp generated in step S520 (e.g., the time at which the user action was performed) from the time stamp generated in step S560 (e.g., the time at which the user interface was updated). As another example, device 110 may determine the ending time of the timer started in step S520 and stopped in step S560. As another example, any other method of measuring the time elapsed between the user action and when the corresponding user interface response to the user action is displayed may be used.
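  • As a minimal Python sketch of this timing approach (the function names and the use of a monotonic clock are assumptions for illustration):

        import time

        def mark_time():
            # Generate a time stamp (a monotonic clock is assumed here).
            return time.monotonic()

        # Step S520: mark the time at which the user action was performed.
        action_time = mark_time()

        # ... the request is transmitted and the user interface tasks run ...
        time.sleep(0.25)  # stands in for the user interface update completing

        # Step S560: mark the time at which the user interface response was complete.
        response_time = mark_time()

        # Step S570: elapsed time between the user action and the complete UI response.
        elapsed_ms = (response_time - action_time) * 1000.0
        print(f"perceived user action duration: {elapsed_ms:.0f} ms")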
  • Process 500 may also determine a control type associated with the user action (step S580).
  • Device 110 may use a number of methods for determining a control type.
  • device 110 may determine a control type by text associated with the control type, object mapping, and/or any other method for determining a control type.
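  • One way such a mapping could look is sketched below in Python; the element descriptors and the mapping itself are purely illustrative assumptions, since the description does not define a specific mapping.

        # Hypothetical mapping from a UI element descriptor to a control type.
        ELEMENT_TO_CONTROL_TYPE = {
            "a": "hyperlink",
            "input[type=checkbox]": "checkbox",
            "input[type=radio]": "radio button",
            "input[type=text]": "text entry",
            "button": "button",
            "select": "open menu option",
        }

        def determine_control_type(element_descriptor):
            """Return the control type for a UI element, or 'unknown' if unmapped."""
            return ELEMENT_TO_CONTROL_TYPE.get(element_descriptor, "unknown")

        print(determine_control_type("input[type=checkbox]"))  # -> "checkbox"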
  • Process 500 may also store data associated with the user action in a storage device (step S590).
  • Device 110 may store data in a machine-readable medium, in database 140, and/or in any other suitable type of storage device.
  • The stored data may include information relating to the user action (e.g., the date/time of the user action, the position of the user action within the sequence of user actions in a user session, the client device type, the device operating system type, the control type associated with the user action, etc.) and information relating to transmitting the request to update the user interface.
  • process 500 may end (step S595).
  • FIG. 6 is a flow chart of an example process 600 for classifying user actions consistent with disclosed implementations.
  • Process 600 may classify user actions based on the control type associated with the user action and the sequence of the user action within the user session.
  • Although execution of process 600 is described below with reference to system 100 of FIG. 1 and/or specific components of system 100, other suitable systems and devices for execution of at least one step of process 600 may be used.
  • Processes described below as being performed by measurement device 110 may be performed by client device 120, user action performance measurement device 210, user action performance measurement device 310, and/or any other suitable device.
  • Processes described below as being performed by client device 120 may be performed by measurement devices 110, 210, 310, and/or any other suitable device.
  • Process 600 may be implemented in the form of executable instructions stored on a machine-readable storage medium and/or in the form of electronic circuitry. In certain aspects, process 600 may relate to the processes associated with step S430 of FIG. 4.
  • Process 600 may start (step S605) after a user action has been performed on a user interface. After the user action (e.g., a first user action) has been performed, process 600 may include determining a control type of the user action (step S610). In some implementations, device 110 may access data related to the user action to determine the control type associated with the user action. For example, device 110 may retrieve, for a particular user action, the stored user action data (including the control type associated with the user action) stored in a storage device, such as a machine-readable storage medium, database 140, and/or any other suitable storage device. The stored user action data may correspond to the data stored in step S590 of FIG. 5.
  • Process 600 may also include determining a control type associated with a preceding user action (step S620).
  • Measurement device 110 may access data stored in a machine-readable storage medium, in database 140, and/or in another storage device to identify preceding user actions (e.g., a second user action) within the user session.
  • Measurement device 110 may perform a query based on the position of the particular user action within the sequence (e.g., first, second, third, fourth, etc.) to identify user actions that have earlier positions within the sequence. Measurement device 110 may also determine which of those user actions occurred immediately before the particular user action.
  • Measurement device 110 may perform a query to identify user actions within the session that were performed before the date and time of the particular user action.
  • Measurement device 110 may then compare the dates and times of the identified user actions with the date and time of the particular user action to determine which of the identified user actions occurred at a time closest to the time of the particular user action. Once the user action or actions that precede the particular user action are identified, measurement device 110 may access data stored in a storage device, such as a machine-readable storage medium, database 140, and/or another storage device, to identify the control type associated with the preceding user actions.
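  • A minimal Python sketch of the time-based lookup described above is shown below; the record fields ("session_id", "timestamp") are assumptions for this sketch.

        from typing import Optional

        def find_preceding_action(actions, current) -> Optional[dict]:
            """Return the stored action that occurred most recently before 'current'
            within the same user session, or None if there is no earlier action."""
            earlier = [
                a for a in actions
                if a["session_id"] == current["session_id"]
                and a["timestamp"] < current["timestamp"]
            ]
            if not earlier:
                return None
            # The identified action whose time is closest to the current action's time.
            return max(earlier, key=lambda a: a["timestamp"])

        # The preceding action's control type can then be read from the returned record.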
  • Process 600 may also include determining whether the control type of the user action and the control type of a preceding user action match a first set of control types (step S630).
  • System 100 may store sets of long action control types in a storage device, such as database 140.
  • The sets of long action control types may define control types that, based on their respective positions within a sequence of user actions, may be associated with user actions that a user may perceive to take a long amount of time.
  • the sets of long action control types may include control types associated with user actions that occur after a user has entered or viewed some type of custom input.
  • the sets of long action control types may include an initial long control type and a subsequent long control type.
  • the initial long control type may be a type of control associated with a user action that occurs immediately before the subsequent long control type.
  • Sets of long action control types may include: (1) a checkbox as the initial long control type and a refresh as the subsequent long control type; (2) a switch as the initial long control type and a refresh as the subsequent long control type; (3) an open menu option as the initial long control type and a refresh as the subsequent long control type; (4) a radio button as the initial long control type and a refresh as the subsequent long control type; (5) a text entry as the initial long control type and a button as the subsequent long control type (e.g., search button, login button, and the like); and (6) any other suitable combination of an initial long control type and a subsequent long control type.
  • While each of the sets of long action control types described above includes two control types, any number of control types (e.g., one control type, three control types, ten control types, etc.) may be part of a set, and each set need not have the same number of control types.
  • Measurement device 110 may compare the user action control type and the preceding user action control type to the sets of long action control types to determine a match. For example, measurement device 110 may determine whether the control type of the user action and the control type of the preceding user action match the subsequent long control type and the initial long control type, respectively. If the control type of the user action matches the subsequent long control type and the control type of the preceding user action matches the initial long control type in a set (step S630; yes), the user action and/or the control associated with the user action may be classified as a first classification (step S640) (e.g., a long action and/or a long action control, respectively). Measurement device 110 may provide data regarding the classification to a storage device, such as database 140, for storage and/or to another device for processing (step S680).
  • process 600 may also determine whether the control type of the user action and the control type of the preceding user action match a second set of control types (step S650).
  • Measurement device 110 may store sets of short action control types in a storage device, such as database 140.
  • the sets of short action control types may define control types that, based on their respective positions within a sequence of user actions, may be associated with user actions that a user may perceive to take a short amount of time.
  • The sets of short action control types may include control types associated with user actions that relate to system-prepared user interface options.
  • the sets of short action control types may include an initial short control type and a subsequent short control type.
  • The initial short control type may be a type of control associated with a user action that occurs immediately before the subsequent short control type.
  • Examples of sets of short action control types may include, but are not limited to, any combination of (1) a refresh, checkbox, switch, radio button, open menu option, list option, and scroll view (without refresh) as the initial short control type and (2) a text box, checkbox, switch, radio button, open menu option, list option, and scroll view (without refresh) as the subsequent short control type.
  • Any number of control types may be part of a set, and each set need not have the same number of control types.
  • Measurement device 110 may compare the user action control type and the preceding user action control type to the control types in the sets of short action control types to determine a match. For example, measurement device 110 may determine whether the control type of the user action and the control type of the preceding user action match the subsequent short control type and the initial short control type, respectively. If the control type of the user action matches the subsequent short control type and the control type of the preceding user action matches the initial short control type (step S650; yes), the user action and/or the control associated with the user action may be classified as a second classification (step S660) (e.g., a short action and/or a short action control, respectively). Measurement device 110 may provide data regarding the classification to a storage device, such as database 140, for storage and/or to another device for processing (step S680).
  • If no match is found (step S650; no), the user action and/or the control associated with the user action may be classified as a third classification (step S670) (e.g., an unknown type of user action and/or action control, respectively).
  • Device 110 may then provide data regarding the classification to a storage device, such as database 140, for storage and/or to another device for processing (step S680).
  • Measurement device 110 may process the data related to user actions and/or user action controls of the third classification in a number of ways. For example, measurement device 110 may automatically change the classification of the user action to another classification, such as the first classification (e.g., reclassify the user action as a long action) or the second classification (e.g., reclassify the user action as a short action). Measurement device 110 may also request additional information regarding the user action from the user.
  • Measurement device 110 may generate and/or display a data collection screen to collect additional data related to the user action from the user.
  • the data collection screen may include a set of data request areas indicating the data requested, such as whether the user perceived the user interface response to take a long amount of time or a short amount of time.
  • the data collection screen may include text entry boxes or radio buttons by which the user can submit his or her perceived user action duration to system 100.
  • Measurement device 110 may process the perceived user action duration input by the user and modify the classification of the user action based on the input (e.g., change the classification of the user action from the third classification to the first classification or the second classification).
  • Measurement device 110 may then provide data regarding the new classification to a storage device, such as database 140, for storage and/or to another device for processing. After the data is stored, process 600 may end (step S695).
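  • A minimal Python sketch of this feedback-driven reclassification is shown below; the record layout and the 'long'/'short'/'unknown' labels are assumptions carried over from the earlier sketches.

        def reclassify_from_user_feedback(record, perceived_duration):
            """Replace an 'unknown' (third) classification with the classification
            suggested by the user's own perception; 'perceived_duration' is assumed
            to be either 'long' or 'short', as collected by the data collection screen."""
            if record.get("classification") == "unknown" and perceived_duration in ("long", "short"):
                record["classification"] = perceived_duration
            return record

        # Example: the user reports that the response felt long.
        entry = {"action_name": "search", "classification": "unknown"}
        print(reclassify_from_user_feedback(entry, "long"))  # classification becomes "long"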
  • FIG. 7 is a flow chart of an example process 700 for analyzing user actions based on user action classifications consistent with disclosed implementations.
  • Although execution of process 700 is described below with reference to system 100 of FIG. 1 and/or specific components of system 100, other suitable systems and devices for execution of at least one step of process 700 may be used.
  • Processes described below as being performed by measurement device 110 may be performed by client device 120, user action performance measurement device 310, and/or any other suitable device.
  • Processes described below as being performed by client device 120 may be performed by measurement devices 110, 210, 310, and/or any other suitable device.
  • Process 700 may be implemented in the form of executable instructions stored on a machine-readable storage medium and/or in the form of electronic circuitry.
  • Process 700 may relate to the processes associated with step S440 of FIG. 4.
  • Process 700 may start (step S705) after a user action has been classified. After the user action has been classified, process 700 may include determining whether the amount of time between the user action and the user interface response exceeds a classification-based threshold value. For example, measurement device 110 may access data related to the user action to determine the classification and the amount of time between the user action and the user interface response.
  • Device 110 may retrieve, for a particular user action, the stored user action data (including the classification and the amount of time between the user action and the user interface response) stored in a storage device, such as a machine-readable storage medium, database 140, and/or any other suitable storage device.
  • The stored user action data may correspond to the data stored in step S590 of FIG. 5 and/or step S680 of FIG. 6.
  • Process 700 may also set classification-based threshold values (step S710).
  • Measurement device 110 may set a first threshold value that defines an acceptable duration for user interface responses of the first classification (e.g., long actions), and a second threshold value that defines an acceptable duration for user interface responses of the second classification (e.g., short actions).
  • the first threshold value and the second threshold value may be set based on the stored user action data
  • measurement device 110 may use the stored user action data to set the first threshold value to be an average of all user interface response durations of the first classification.
  • Device 110 may use the stored user action data to set the second threshold value to be an average of all user interface response durations of the second classification.
  • the first threshold value and the second threshold value may be predetermined numbers.
  • Measurement device 110 may set the first threshold value (e.g., the threshold value associated with long actions) to be between 2-8 seconds and may set the second threshold value (e.g., the threshold value associated with short actions) to be 150 ms.
  • Measurement device 110 may then provide data regarding the first threshold and/or the second threshold to a storage device, such as a machine-readable medium, database 140, and/or another device for additional processing.
  • Process 700 may also determine whether the amount of time exceeds a classification-based threshold (step S720). For example, if the user action has been classified as the first classification, measurement device 110 may compare the amount of time between the user action and the user interface response to the first threshold. Similarly, if the user action has been classified as the second classification, measurement device 110 may compare the amount of time between the user action and the user interface response to the second threshold.
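  • A minimal Python sketch of this classification-based comparison and alerting is shown below; the concrete threshold values reuse the examples above (a long-action threshold in the 2-8 second range, here 5 seconds, and a 150 ms short-action threshold), and the print-based alert is a placeholder.

        # Classification-based thresholds in milliseconds (illustrative values).
        THRESHOLDS_MS = {
            "long": 5000.0,   # somewhere in the 2-8 second range mentioned above
            "short": 150.0,
        }

        def exceeds_threshold(classification, elapsed_ms):
            threshold = THRESHOLDS_MS.get(classification)
            return threshold is not None and elapsed_ms > threshold

        def check_and_alert(classification, elapsed_ms):
            # Steps S720/S730: compare against the classification-based threshold
            # and generate an alert when it is exceeded.
            if exceeds_threshold(classification, elapsed_ms):
                print(f"ALERT: {classification} action took {elapsed_ms:.0f} ms, "
                      f"exceeding the {THRESHOLDS_MS[classification]:.0f} ms threshold")

        check_and_alert("short", 420.0)   # triggers an alert
        check_and_alert("long", 3200.0)   # within the long-action threshold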
  • Process 700 may continue by calculating statistics for user actions of the same classification (step S740) (discussed in more detail below) and storing the statistics and/or other data (e.g., data indicating that the user action exceeded the classification-based threshold) in a storage device (step S750) (also discussed in more detail below).
  • process 700 may continue by generating an alert indicating that the amount of time exceeds the threshold (step S730).
  • Measurement device 110 may generate and provide an alert to a device, such as application content provider device 130, client device 120, and/or another suitable type of device.
  • Measurement device 110 may generate an alert to include user action data obtained, for example, from machine-readable medium 124 of client 120, database 140, and/or another component.
  • Measurement device 110 may generate the alert such that it may include information about user actions that exceed the classification-based threshold (e.g., aggregate information about the number of user actions that exceeded the classification-based threshold, information about particular user actions, etc.).
  • Process 700 may also include calculating statistics for user actions of the same classification (step S740).
  • the statistics may relate to user interface performance based on user expectations.
  • Measurement device 110 may calculate statistics for a single user action, for multiple user actions, for single users, for multiple users, for a single device type, for multiple device types, for a single control type, for multiple control types, for a particular control, and/or for multiple controls.
  • Measurement device 110 may access data relating to the user actions (e.g., data stored in step S590 of FIG. 5 and/or step S680 of FIG. 6) stored in a storage device, such as database 140, to identify information for use in various statistical calculations.
  • Device 110 may use the data to calculate an average user interface response duration for all user actions assigned to the first classification and an average user interface response duration for all user actions assigned to the second classification.
  • Measurement device 110 may calculate the number of user actions assigned to the first classification that exceeded the first threshold and the number of user actions assigned to the second classification that exceeded the second threshold.
  • Measurement device 110 may calculate, for a particular user action, the number of users that experienced a user interface response duration that exceeded the classification-based threshold.
  • Measurement device 110 may provide data regarding the calculated statistics to a storage device, such as a machine-readable medium and/or database 140, and/or to another device for additional processing (step S750). After the data is stored, process 700 may end (step S765).
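  • A minimal Python sketch of such per-classification statistics is shown below; each record is assumed to carry 'classification', 'response_duration_ms', and a precomputed 'exceeded_threshold' flag, which are naming assumptions for this sketch.

        from statistics import mean

        def classification_statistics(records):
            """Compute, per classification, the number of actions, the average
            response duration, and how many actions exceeded their
            classification-based threshold."""
            stats = {}
            for classification in ("long", "short"):
                matching = [r for r in records if r["classification"] == classification]
                if not matching:
                    continue
                stats[classification] = {
                    "count": len(matching),
                    "average_response_ms": mean(r["response_duration_ms"] for r in matching),
                    "exceeded_threshold": sum(1 for r in matching if r["exceeded_threshold"]),
                }
            return stats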
  • FIG. 8 is an example of a user interface for displaying user action-related statistics consistent with disclosed implementations.
  • User interface (UI) performance display 800 may be generated by device 110 using data obtained from, for example, a machine-readable medium, database 140, and/or another component.
  • Device 110 may obtain the results of the calculations performed by measurement device 110 and described above with respect to step S740 of FIG. 7.
  • UI performance display 800 may display results of the calculations performed in step S740 as text, graphics, or a combination of text and graphics in a way that aids the user in determining user action performance.
  • UI performance display 800 may include an area 820 for displaying statistics related to overall user action performance, and an area 840 for displaying statistics related to user actions that exceed the classification-based threshold.
  • Area 820 may display the total number of actions, the percentage of actions assigned to the first classification (e.g., long actions) that exceeded the first threshold, the average response time of actions assigned to the first classification that exceeded the first threshold, the percentage of actions assigned to the second classification that exceeded the second threshold, and/or the average response time of actions assigned to the second classification that exceeded the second threshold.
  • Area 840 may display information about the particular user actions that exceeded the classification-based threshold. For example, area 840 may display the control type associated with the user action 842, the name of the user action 844, the number of users who performed the particular user action and had a response time that exceeded the classification-based threshold 846, and the average response time for the particular user action 848.
  • Measurement device 110 may limit the information displayed on UI performance display 800 and/or may display user action-related data according to certain selection criteria. For example, a user may instruct measurement device 110 to limit the data displayed to statistics related to the first classification or to statistics related to the second classification. As another example, a user may instruct measurement device 110 to limit the data displayed to data related to a particular type of device, a particular operating system, and/or a particular application. Measurement device 110 may display the UI performance display 800 on a display device, such as a display device associated with content provider device 130.
  • The disclosed examples may include systems, devices, computer-readable storage media, and methods for measuring user action performance. For purposes of explanation, certain examples are described with reference to the components illustrated in FIGs. 1-8. The functionality of the illustrated components may overlap, however, and may be present in a fewer or greater number of elements and components. Further, all or part of the functionality of illustrated elements may coexist or be distributed among several geographically dispersed locations. Moreover, the disclosed examples may be implemented in various environments and are not limited to the illustrated examples.
  • Measurement device 110 of system 100 may simply receive data captured from client device 120, and/or client device 120 may classify the user action.
  • Implementations consistent with the disclosed examples need not perform the sequence of operations in any particular order, including those in FIGs. 4-7.
  • The present disclosure merely sets forth possible examples of implementations, and many variations and modifications may be made to the described examples. All such modifications and variations are intended to be included within the scope of this disclosure and protected by the following claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Educational Administration (AREA)
  • Tourism & Hospitality (AREA)
  • Operations Research (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Game Theory and Decision Science (AREA)
  • Development Economics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Example implementations relate to measuring user action performance. In some examples, data associated with a first user action and a first user interface response to the first user action is obtained. The data may include a first control type associated with the first user action. The first user action may be classified based on the first control type and a second control type associated with a second user action, the second user action preceding the first user action. The data may also be analyzed based on the classification.

Description

MEASURING USER ACTION PERFORMANCE BY CLASSIFYING USER
ACTIONS
BACKGROUND
[0001] A typical computing device may allow a user to interact with a software application through a user interface displayed on a display device. For instance, when a user performs a user action, such as clicking a button displayed on a user interface of the computing device, the application may cause the processor of the computing device to perform a number of operations that ultimately result in an output that is visible to the user, such as an update to the user interface.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] The following detailed description references the drawings, wherein:
[0003] FIG. 1 is a block diagram of an example system for measuring user action performance consistent with disclosed implementations;
[0004] FIG. 2 is a block diagram of an example user action performance measurement device consistent with disclosed implementations;
[0005] FIG. 3 is a block diagram of an example user action performance measurement device consistent with disclosed implementations;
[0006] FIG. 4 is a flow chart of an example process for measuring user action performance consistent with disclosed implementations;
[0007] FIG. 5 is a flow chart of an example process for capturing user action data consistent with disclosed implementations;
[0008] FIG. 6 is a flow chart of an example process for classifying user actions consistent with disclosed implementations;
[0009] FIG. 7 is a flow chart of an example process for analyzing user actions based on user action classifications consistent with disclosed implementations; and
[0010] FIG. 8 is an example of a user interface for displaying user action-related statistics consistent with disclosed implementations.
DETAILED DESCRIPTION
[0011] The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar parts. While several examples are described in this document, modifications, adaptations, and other implementations are possible. Accordingly, the following detailed description does not limit the disclosed examples. Instead, the proper scope of the disclosed examples may be defined by the appended claims.
[0012] As detailed above, when a user performs a user action, a number of operations are performed that ultimately result in output that is visible to the user, such as an update to the user interface. While some of these updates may seem almost instantaneous to the user, other updates may be perceived as taking a longer amount of time. For example, a user may expect that an action such as selecting an item via a checkbox should result in a near-instantaneous user interface response. Similarly, a user may expect that an action such as signing in by clicking a button may take longer to complete, since the application is validating the user's login and password. Thus, even before completing the action, users may expect different user actions to have different durations.
[0013] Traditional methods of measuring user satisfaction with the performance of the application do not distinguish between actions that a user expects to take a long amount of time and actions that a user expects to take a short amount of time. Instead, each user action is treated in the same manner and in a way that does not fully take into account the perspective of the user interacting with the application. For example, the calculation of traditional metrics may involve determining the average network response time for all of the user actions, or comparing the network response times of the user actions to a single static threshold. However, these approaches may result in metrics that do not accurately reflect user satisfaction. For example, the average response time may be skewed by very few long network response times, and the comparison of each user action to a single threshold may not accurately reflect user frustration when an action a user expects to take a short amount of time is taking longer than the user expects, but less time than the single static threshold. Moreover, by measuring the network response time instead of the time it takes for the user to obtain the result he or she expects (e.g., a complete display update), little insight is provided into the user's satisfaction. Accordingly, an accurate measurement of the user's experience with the application should be based on user expectations.
[0014] Examples disclosed herein provide user action performance measurement based on user expectation. To this end, example implementations disclosed herein may measure user action performance by distinguishing between actions that a user expects to take a short amount of time and those that a user expects to take a long amount of time. For example, in some implementations user actions may be automatically classified by analyzing a control type (e.g., a type of object with which a user can interact, such as a hyperlink, a checkbox, a text entry box, a button, etc.) associated with a user action and the sequence of the user action within the user session. Additionally, some implementations may capture data associated with the user actions, such as a control type associated with a first user action and a control type associated with a second user action that precedes the first user action. Some implementations may also classify the first user action based on the first control type and the second control type, and analyze the captured data based on the classification. Moreover, some implementations may determine the perceived duration of a user action rather than the network response time.
[0015] Referring now to the drawings, FIG. 1 is a block diagram of an example system 100 for measuring user action performance consistent with disclosed implementations. System 100 may be implemented in a number of different configurations without departing from the scope of the disclosed examples. In the example shown in FIG. 1, system 100 may include a user action performance measurement device 110, a client device 120, an application content provider device 130, a database 140, and a network 150 for connecting user action performance measurement device 110 with client device 120, application content provider device 130, and/or database 140.
[0016] User action performance measurement device 110 may be a computing system that performs various functions consistent with disclosed examples, such as classifying user actions based on the control type associated with the user action and the control type associated with a preceding user action. For example, measurement device 110 may be a desktop computer, a laptop computer, a tablet computing device, a mobile phone, a server, and/or any other type of computing device. In some examples, measurement device 110 may process information received from client device 120, application content provider device 130, and/or database 140. For example, measurement device 110 may capture data associated with a first user action, classify the first user action based on the control type of the first user action and a control type of a second user action that precedes the first user action, and/or perform data analysis on the captured data. Examples of measurement device 110 and certain functions that may be performed by device 110 are described in greater detail below with respect to, for example, FIGs. 2-8.
[0017] Client device 120 may be a computing system operated by a user. For example, client device 120 may be a desktop computer, a laptop computer, a tablet computing device, a mobile phone, a server, and/or any other type of computing device. In some examples, client device 120 may be a computing device to perform operations consistent with certain disclosed implementations. For example, client device 120 may be adapted to detect a user action and to update a user interface in response to the user action.
[0018] Client device 120 may include a processor to execute instructions stored in a machine-readable storage medium. In the example shown in FIG. 1, client device 120 may include a processor 122, a machine-readable storage medium 124, a display device 126, and an interface 128. Processor 122 of client device 120 may be at least one processing unit (CPU), microprocessor, and/or another hardware device to execute instructions to perform operations. For example, processor 122 may fetch, decode, and execute instructions stored in machine-readable storage medium 124 (such as application instructions 127) to display a user interface, to detect a user action, to update the user interface in response to the user action, and/or to collect and/or transmit data associated with the user action. Machine-readable storage medium 124 may be any electronic, magnetic, optical, or other non-transitory storage device that stores instructions executed by processor 122. Display device 126 may be any type of display device that presents information, such as a user interface, to a user operating client device 120. Interface device 128 may be any combination of hardware and/or programming that facilitates the exchange of data between the internal components of client device 120 and external components, such as user action performance measurement device 110. In some examples, interface device 128 may include a network interface device that allows client device 120 to receive and send data to and from application content provider device 130 via network 150.
[0019] Application content provider device 130 may be a computing system operated by the content provider of an application. For example, content provider device 130 may be a desktop computer, a laptop computer, a tablet computing device, a mobile phone, a server, and/or any other type of computing device. In some examples, content provider device 130 may be a computing device capable of receiving a request from client device 120 to update a user interface based on a user action. Additionally, content provider device 130 may transmit user interface update instructions based on the user action to client device 120. The user interface update instructions may be used by client device 120 to provide a user interface response to the user via a display, such as display device 126. For example, client device 120 may store the user interface update instructions in machine-readable storage medium 124, and/or processor 122 of client device 120 may execute the user interface update instructions to update the user interface with new text, graphics, and/or a combination of text and graphics. In some implementations, application content provider device 130 (and/or another component, such as measurement device 110, client device 120, etc.) may transmit information related to when the user interface has been fully updated such that all of the text and/or graphics have fully loaded. This information may be stored in a storage device, such as a machine-readable storage medium and/or a database, such as database 140.
[0020] Database 140 may be any type of storage system configuration that facilitates the storage of data. For example, database 140 may facilitate the locating, accessing, and retrieving of data (e.g., SaaS, SQL, Access, etc. databases). Database 140 can be populated by a number of methods. For example, measurement device 110 may populate database 140 with database entries generated by measurement device 110, and store the database entries in database 140. As another example, measurement device 110 may populate database 140 by receiving a set of database entries from another component, a wireless network operator, or a user of client device 120 and/or application content provider device 130, and storing the database entries in database 140. The database entries can contain a plurality of fields, which may include information related to user actions, such as, for example, the date/time of the user action, the name of the user action, the position of the user action within the sequence of user actions, other user actions within the sequence, the user, the client device type, the device operating system type, the amount of time between the user action and a user interface response to the user action, the control type associated with the user action, the classification of the user action, and/or the classification of the user interface control. While in the example shown in FIG. 1 database 140 is a single component external to components 110, 120, and 130, database 140 may comprise separate databases and/or may be part of devices 110, 120, 130, and/or another device. In some implementations, database 140 may be managed by components of devices 110, 120, and/or 130 that are capable of accessing, creating, controlling, and/or otherwise managing data remotely through network 150.
[0021] Network 150 may be any type of network that facilitates communication between remote components, such as measurement device 110 and client device 120. For example, network 150 may be a local area network (LAN), a wide area network (WAN), a virtual private network, a dedicated intranet, the Internet, and/or a wireless network.
[0022] The arrangement illustrated in FIG. 1 is simply an example, and system 100 may be implemented in a number of different configurations. For example, while FIG. 1 shows one measurement device 110, client device 120, content provider device 130, database 140, and network 150, system 100 may include any number of components 110, 120, 130, 140, and 150, as well as other components not depicted in FIG. 1. System 100 may also omit any of components 110, 120, 130, 140, and 150. For example, measurement device 110 and content provider device 130 may be directly connected instead of being connected via network 150.
[0023] FIG. 2 is a block diagram of an example user action performance measurement device 210 consistent with disclosed implementations. In certain aspects, user action performance measurement device 210 may correspond to user action performance measurement device 110 of FIG. 1. Measurement device 210 may be implemented in various ways. For example, measurement device 210 may be a special purpose computer, a server, a mainframe computer, a computing device executing instructions that receive and process information and provide responses, a mobile phone, and/or any other type of computing device. In the example shown in FIG. 2, measurement device 210 may include a processor 220, an interface 230, and a machine-readable storage medium 240.
[0024] Processor 220 may be at least one processing unit (CPU), microprocessor, and/or another hardware device to execute instructions to perform operations. For example, processor 220 may fetch, decode, and execute user action performance measurement instructions 250 (e.g., instructions 252, 254, and/or 256) stored in machine-readable storage medium 240 to perform operations related to disclosed examples.
[0025] Interface device 230 may be any device that facilitates the transfer of information between device 210 and external components, such as client device 120. In some examples, interface device 230 may include a network interface device that allows device 210 to receive and send data to and from network 150. For example, interface device 230 may retrieve and process data related to user actions from client device 120 via network 150.
[0026] Machine-readable storage medium 240 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions. Thus, machine-readable storage medium 240 may be, for example, Random Access Memory (RAM), Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like. In some implementations, machine-readable storage medium 240 may be a non-transitory computer-readable storage medium, where the term "non-transitory" does not encompass transitory propagating signals. Machine-readable storage medium 240 may be encoded with instructions that, when executed by processor 220, perform operations consistent with disclosed implementations. For example, machine-readable storage medium 240 may include instructions that perform operations that may classify user actions based on the control type of the user interface control associated with the user action and the context or flow in which the user interface control is being used. In the example shown in FIG. 2, machine-readable storage medium 240 may include data capture instructions 252, user action classification instructions 254, and data analysis instructions 256.
[0027] Data capture instructions 252 may function to capture data related to a user action. For example, when data capture instructions 252 are executed by processor 220, data capture instructions 252 may cause processor 220 of measurement device 210, processor 122 of client device 120, and/or another processor to capture data related to a user action, classify the user action as either a first classification or a second classification based on the data, and/or determine the amount of time between the user action and the corresponding user interface response. This analysis is described in further detail below with respect to, for example, FIGs. 4 and 5.
[0028] User action classification instructions 254 may function to classify user actions. For example, when user action classification instructions 254 are executed by processor 220, user action classification instructions 254 may cause the processor 220 of measurement device 210, the processor 122 of client device 120, and/or another processor to classify a user action and/or a user interface control associated with the user action based on a control type associated with the user action and a control type associated with a preceding user action. User action classification instructions 254 may also cause a processor to store the classification of the user action and/or the user interface control in machine-readable storage medium 240 and/or in another storage device, such as database 140. This analysis is described in further detail below with respect to, for example, FIGs. 4 and 6.
[0029] Data analysis instructions 256 may function to analyze data related to the user action based on the classification of the user action. For example, when data analysis instructions 256 are executed by a processor, such as processor 220 of measurement device 210, data analysis instructions 256 may cause processor 220 of measurement device 210, processor 122 of client device 120, and/or another processor to determine statistics for user actions of the same classification. As another example, data analysis instructions 256 may cause processor 220 and/or another processor to determine whether the amount of time that elapses between a user action and a user interface response to the user action exceeds a threshold, where the threshold is based on the classification of the user action. This analysis is described in further detail below with respect to, for example, FIGs. 4, 7, and 8.
[0030] FIG. 3 is a block diagram of an example user action performance measurement device 310 consistent with disclosed implementations. In certain aspects, user action performance measurement device 310 may correspond to user action performance measurement device 110 of FIG. 1. Device 310 may be implemented in various ways. For example, device 310, like device 210, may be a special purpose computer, a server, a mainframe computer, a computing device executing instructions that receive and process information and provide responses, and/or any other type of computing device. In the example shown in FIG. 3, device 310 may include an interface device 330, a data capture engine 340, a user action classification engine 350, and a data analysis engine 360.
[0031] Interface device 330 may be any device that facilitates the transfer of information between user action performance measurement device 310 and external components, such as client device 120. In some examples, interface device 330 may include a network interface device that allows user action performance measurement device 310 to receive and send data to and from network 150. For example, interface device 330 may retrieve and process data related to user actions from client device 120 via network 150.
[0032] Engines 340, 350, and 360 may be electronic circuitry for implementing functionality consistent with disclosed examples. For example, engines 340, 350, and 360 may represent combinations of hardware devices and programming to implement the functionality consistent with disclosed implementations. In some examples, the functionality of engines 340, 350, and 360 may correspond to operations performed by user action performance measurement device 210 of FIG. 2, such as operations performed when user action performance measurement instructions 250 are executed by processor 220 (described above with respect to FIG. 2). In FIG. 3, data capture engine 340 may represent a combination of hardware and programming that performs operations similar to those performed when processor 220 executes data capture instructions 252. Similarly, user action classification engine 350 may represent a combination of hardware and programming that performs operations similar to those performed when processor 220 executes user action classification instructions 254, and data analysis engine 360 may represent a combination of hardware and programming that performs operations similar to those performed when processor 220 executes data analysis instructions 256.
[0033] FIG. 4 is a flow chart of an example process 400 for measuring user action performance consistent with disclosed implementations. Although execution of process 400 is described below with reference to system 100 of FIG. 1 and/or specific components of system 100, other suitable systems and devices for execution of at least one step of process 400 may be used. For example, processes described below as being performed by measurement device 110 may be performed by client device 120, user action performance measurement device 210, user action performance measurement device 310, and/or any other suitable device. Similarly, processes described below as being performed by client device 120 may be performed by measurement devices 110, 210, 310, and/or any other suitable device. Process 400 may be implemented in the form of executable instructions stored on a machine-readable storage medium and/or in the form of electronic circuitry.
[0034] Process 400 may start (step S405) when a user action has been performed on a user interface. For example, to begin a user session, client device 120 may output a user interface for an application on an available display, such as display device 126. The user interface may include content, such as text, graphics, or a combination of text and graphics, which represents information and/or actions that are available to a user. For example, the user interface may include hyperlinks, radio buttons, checkboxes, text entry boxes, buttons, and/or other types of controls that a user may interact with. Users may interact with the interface by inputting a user action related to the control to client device 120. For example, a user may execute a mouse click, move a mouse, execute a touch gesture on a touch-enabled display, execute a voice command, or provide another type of input.
[0035] Once the user action has been performed, process 400 may include capturing user action data (step S420). For example, system 100 may receive the user action input and/or determine a control type associated with the user action. System 100 may also determine the amount of time that elapses between the user action and a user interface response to the user action. For example, system 100 may determine the amount of time between the user action and a complete user interface response such that the user interface is fully updated (e.g., all graphics and text are loaded). System 100 may also collect and/or store data related to the user action. Examples of the steps that may be involved with capturing user action data are discussed in greater detail below with respect to, for example, FIG. 5.
[0036] Process 400 may also include classifying the user action (step S430) based on user action data. For example, system 100 may classify data based on the control type associated with the user action and the context or flow in which the control is being used. In some examples, system 100 may determine a control type associated with a user action (e.g., a first user action) and a control type of a preceding user action (e.g., a second user action) and compare those control types with known control types to determine a classification. For example, system 100 may compare the control types to sets of long action control types and sets of short action control types stored in a storage device, such as database 140, to determine a match. System 100 may then classify the user action (and/or the control associated with the user action) as one of a first classification (e.g., a long action) or a second classification (e.g., a short action) based on the match. Examples of the steps that may be involved with classifying user actions are discussed in greater detail below with respect to, for example, FIG. 6.
[0037] Process 400 may also include analyzing user action data based on the classification (step S440). For example, measurement device 110 may calculate statistics for user actions assigned to the first classification, and/or statistics for user actions assigned to the second classification. As another example, measurement device 110 may determine whether the amount of time between the user action and the user interface response to the user action exceeds a threshold, where the threshold is based on the classification of the user action. If so, measurement device 110 may generate an alert indicating that the application is experiencing perceived user action durations that exceed the threshold, and may calculate statistics for user actions of the same classification that exceed the threshold. Examples of the steps involved with analyzing user action data based on the classification are discussed in greater detail below with respect to, for example, FIGs. 7 and 8.
[0038] After the user action data is captured (step S420), the user action is classified based on the user action data (step S430), and/or the user action data is analyzed based on the classification (step S440), process 400 may end (step S455).
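As a compact, non-normative illustration of steps S420-S440 taken together, the following Python sketch strings capture, classification, and analysis into a single pass over a session. Every name in it is hypothetical, and the classification rule is reduced to a single example pair; the fuller rule sets appear with the discussion of FIG. 6 below.
```python
from typing import List, Optional

def classify_pair(current: str, preceding: Optional[str]) -> str:
    # Reduced rule: a button following a text entry is treated as a long action,
    # everything else as a short action (the real sets are described with FIG. 6).
    return "long" if (preceding, current) == ("text_entry", "button") else "short"

def run_session(events: List[dict]) -> dict:
    """events: [{'control_type': ..., 'elapsed_ms': ...}, ...] in session order."""
    stats = {"long": [], "short": []}
    preceding = None
    for event in events:                                           # step S420: captured data
        label = classify_pair(event["control_type"], preceding)   # step S430: classify
        stats[label].append(event["elapsed_ms"])                  # step S440: analysis input
        preceding = event["control_type"]
    # Average perceived duration per classification.
    return {k: (sum(v) / len(v) if v else None) for k, v in stats.items()}

print(run_session([{"control_type": "text_entry", "elapsed_ms": 80.0},
                   {"control_type": "button", "elapsed_ms": 2400.0}]))
# -> {'long': 2400.0, 'short': 80.0}
```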
[0039] FIG. 5 is a flow chart of an example process 500 for capturing user action data consistent with disclosed implementations. Although execution of process 500 is described below with reference to system 100 of FIG. 1 and/or specific components of system 100, other suitable systems and devices for execution of at least one step of process 500 may be used. For example, processes described below as being performed by measurement device 110 may be performed by client device 120, user action performance measurement device 210, user action performance measurement device 310, and/or any other suitable device. Similarly, processes described below as being performed by client device 120 may be performed by measurement devices 110, 210, 310, and/or any other suitable device. Process 500 may be implemented in the form of executable instructions stored on a machine-readable storage medium and/or in the form of electronic circuitry. In certain aspects, process 500 may relate to the processes associated with step S420 of FIG. 4.
[0040] Process 500 may start (step S505) after a user action has been performed on a user interface. As discussed above, system 100 may receive a user action input (step S510), such as an input to client device 120. Example inputs may include, but are not limited to, executing a mouse click, moving a mouse, executing a touch gesture on a touch-enabled display, executing a voice command, entry of numeric, alphanumeric, and/or other character strings, and/or any other type of input. The user action may correspond to a request from the user to provide additional content and/or otherwise receive a user interface response reflecting the content requested. Receipt of the user action input may cause client 120 to run several tasks or processes, some of which may function to update a user interface displayed on display device 126. Tasks and/or processes that update the user interface may be considered to be user interface tasks.
[0041] Upon performance of the user action, process 500 may continue by marking the time at which the user action was performed (step S520). For example, measurement device 110 may include instructions which, when executed by a processor, generate a time stamp which contains the start time of the user action. The time stamp may be stored in a storage device, such as, for example, a machine-readable medium and/or database 140. As another example, measurement device 110 may start a timer at the time the user action was performed, and the timer may continue to run until a specified action occurs, such as when the user interface tasks are complete (e.g., when the user interface is fully updated).
[0042] Process 500 may continue by transmitting a request to update the user interface based on the user action (step S530). For example, in some instances, client device 120 may transmit a request to content provider device 130 to provide content requested by the user. The request may include information associated with the user action, such as the time and date of the user action, the type of client device 120, the operating system of client device 120, the display type of display device 126, the user name, an identifier that indicates the content requested by the user based on the user action, information associated with previous user actions (e.g., characters entered by a user in a text box), and any other suitable information that content provider device 130 may use to provide content to client device 120.
[0043] Application content provider device 130 may receive the request and may provide a response including instructions to client 120 via network 150 to update the user interface based on the request. Client 120 may receive the network response (step S540) and may execute instructions, such as instructions received from content provider device 130, to provide a user interface response (step S550). For example, client 120 may update the user interface to reflect the user action input. This may include, for example, marking or unmarking a checkbox, activating a button, updating the position of a slider on a slider bar, returning search results, navigating to a different part of the user interface, presenting a drop-down menu, providing new text and/or graphics, or otherwise changing another type of interface element. Client 120 may output the updated user interface on a display device, such as display device 126.
[0044] Once the user interface has been updated, process 500 may continue by marking the time at which the user interface response was complete (step S560). For example, similar to step S520, measurement device 110 may include instructions which, when executed by a processor, generate a time stamp which contains the time the user interface update was complete. As discussed above, in response to the user action, client 120 may run several tasks or processes, some of which may update the user interface. In some implementations, a complete user interface update may occur when the last task that updates the user interface is complete. Thus, in some examples, the time stamp may include the time that the last task that updates the user interface is complete. The time stamp may be stored in a storage device, such as, for example, a machine-readable medium and/or database 140. As another example, system 100 may end the timer started in step S520 once the last task that updates the user interface is complete.
[0045] Process 500 may also determine the amount of time that elapsed between the user action and the user interface response (step S570). For example, in some implementations device 110 may calculate the elapsed time by subtracting the time stamp generated in step S520 (e.g., the time at which the user action was performed) from the time stamp generated in step S560 (e.g., the time at which the user interface was updated). As another example, device 110 may determine the ending time of the timer started in step S520 and stopped in step S560. As another example, any other method of measuring the time elapsed between the user action and when the corresponding user interface response to the user action is displayed may be used.
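As an illustration of the timing described in steps S520, S560, and S570, the sketch below marks the action start, marks the completion of the last user-interface task, and computes the difference. It uses Python's time.monotonic() as a stand-in for whatever clock the measurement device actually uses; all function names are assumptions made for this example.
```python
import time

def mark_action_start() -> float:
    # Step S520: record the time at which the user action was performed.
    return time.monotonic()

def mark_ui_complete() -> float:
    # Step S560: record the time at which the last UI-updating task finished.
    return time.monotonic()

def elapsed_ms(action_start: float, ui_complete: float) -> float:
    # Step S570: subtract the action time stamp from the completion time stamp.
    return (ui_complete - action_start) * 1000.0

# Usage (the sleep stands in for the request/response and UI update work):
start = mark_action_start()
time.sleep(0.05)
print(f"perceived duration: {elapsed_ms(start, mark_ui_complete()):.1f} ms")
```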
[0046] Process 500 may also determine a control type associated with the user action (step S580). For example, device 110 may use a number of methods for determining a control type. In some implementations, device 110 may determine a control type by text associated with the control type, object mapping, and/or any other method for determining a control type.
[0047] Process 500 may also store data associated with the user action in a storage device (step S590). For example, device 110 may store data in a machine-readable medium, in database 140, and/or in any other suitable type of storage device. The stored data may include: information relating to the user action (e.g., the date/time of the user action, the position of the user action within the sequence of user actions in a user session, the client device type, the device operating system type, the control type associated with the user action, etc.); information relating to transmitting the request to update the user interface (e.g., the date/time of the transmission, information indicating the type of content requested, information regarding the transmission device, etc.); information relating to the completion of the last user interface task (e.g., the time the user interface was updated, the amount of time that elapsed between the user action and the corresponding update to the user interface, etc.); and any other data related to the user action and user interface response. After the data is stored, process 500 may end (step S595).
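One possible in-memory shape for the record stored in step S590 is sketched below, kept close to the fields listed above. The field names and the example values are assumptions for illustration only and are not the schema actually used by database 140.
```python
from dataclasses import dataclass, asdict
from datetime import datetime
from typing import Optional

@dataclass
class UserActionRecord:
    # Information relating to the user action itself.
    timestamp: datetime
    sequence_position: int          # position within the user session
    device_type: str
    operating_system: str
    control_type: str               # e.g. "checkbox", "button", "text_entry"
    # Information relating to the UI response.
    ui_complete_time: Optional[datetime] = None
    elapsed_ms: Optional[float] = None
    # Assigned later by the classification step.
    classification: Optional[str] = None

record = UserActionRecord(
    timestamp=datetime.now(),
    sequence_position=3,
    device_type="mobile",
    operating_system="Android",
    control_type="button",
    elapsed_ms=2400.0,
)
print(asdict(record))   # dict form, ready to be written to a store such as database 140
```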
[0048] FIG. 6 is a flow chart of an example process 600 for classifying user actions consistent with disclosed implementations. For example, process 600 may classify user actions based on the control type associated with the user action and the sequence of the user action within the user session. Although execution of process 600 is described below with reference to system 100 of FIG. 1 and/or specific components of system 100, other suitable systems and devices for execution of at least one step of process 600 may be used. For example, processes described below as being performed by measurement device 110 may be performed by client device 120, user action performance measurement device 210, user action performance measurement device 310, and/or any other suitable device. Similarly, processes described below as being performed by client device 120 may be performed by measurement devices 110, 210, 310, and/or any other suitable device. Process 600 may be implemented in the form of executable instructions stored on a machine-readable storage medium and/or in the form of electronic circuitry. In certain aspects, process 600 may relate to the processes associated with step S430 of FIG. 4.
[0049] Process 600 may start (step S605) after a user action has been performed on a user interface. After the user action (e.g., a first user action) has been performed, process 600 may include determining a control type of the user action (step S610). In some implementations, device 110 may access data related to the user action to determine the control type associated with the user action. For example, device 110 may retrieve, for a particular user action, the stored user action data (including the control type associated with the user action) stored in a storage device, such as a machine-readable storage medium, database 140, and/or any other suitable storage device. The stored user action data may correspond to the data stored in step S590 of FIG. 5.
[0050] Process 600 may also include determining a control type associated with a preceding user action (step S620). In some implementations, measurement device 110 may access data stored in a machine-readable storage medium, in database 140, and/or in another storage device to identify preceding user actions (e.g., a second user action) within the user session. For example, measurement device 110 may perform a query based on the position of the particular user action within the sequence (e.g., first, second, third, fourth, etc.) to identify user actions that have earlier positions within the sequence. Measurement device 110 may also determine which of those user actions occurred immediately before the particular user action. As another example, measurement device 110 may perform a query to identify user actions within the session that were performed before the date and time of the particular user action. Measurement device 110 may then compare the dates and times of the identified user actions with the date and time of the particular user action to determine which of the identified user actions occurred at a time closest to the time of the particular user action. Once the user action or actions that precede the particular user action are identified, measurement device 110 may access data stored in a storage device, such as a machine-readable storage medium, database 140, and/or another storage device to identify the control type associated with the preceding user actions.
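The two query strategies described above (by sequence position and by time stamp) can be sketched as follows, assuming the session's records are available as a simple list of dictionaries rather than rows in database 140; the key names are illustrative.
```python
from typing import List, Optional

def preceding_by_position(actions: List[dict], current: dict) -> Optional[dict]:
    # Keep only actions with an earlier position in the sequence, then take the latest.
    earlier = [a for a in actions if a["position"] < current["position"]]
    return max(earlier, key=lambda a: a["position"]) if earlier else None

def preceding_by_time(actions: List[dict], current: dict) -> Optional[dict]:
    # Keep only actions performed before the current one, then take the closest in time.
    earlier = [a for a in actions if a["time"] < current["time"]]
    return min(earlier, key=lambda a: current["time"] - a["time"]) if earlier else None

session = [
    {"position": 1, "time": 10.0, "control_type": "text_entry"},
    {"position": 2, "time": 14.5, "control_type": "button"},
]
print(preceding_by_position(session, session[1])["control_type"])  # -> "text_entry"
```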
[0051] Process 600 may also include determining whether the control type of the user action and the control type of a preceding user action match a first set of control types (step S630). For example, system 100 may store sets of long action control types in a storage device, such as database 140. The sets of long action control types may define control types that, based on their respective positions within a sequence of user actions, may be associated with user actions that a user may perceive to take a long amount of time. For example, the sets of long action control types may include control types associated with user actions that occur after a user has entered or viewed some type of custom input. The sets of long action control types may include an initial long control type and a subsequent long control type. The initial long control type may be a type of control associated with a user action that occurs immediately before the subsequent long control type. Examples of sets of long action control types may include: (1) a checkbox as the initial long control type and a refresh as the subsequent long control type; (2) a switch as the initial long control type and a refresh as the subsequent long control type; (3) an open menu option as the initial long control type and a refresh as the subsequent long control type; (4) a radio button as the initial long control type and a refresh as the subsequent long control type; (5) a text entry as the initial long control type and a button as the subsequent long control type (e.g., search button, login button, and the like); and (6) any other suitable initial long control type. While in the example described above each of the sets of long action control types includes two control types, any number of control types (e.g., one control type, three control types, ten control types, etc.) may be part of a set, and each set need not have the same number of control types.
[0052] Measurement device 110 may compare the user action control type and the preceding user action control type to the sets of long action control types to determine a match. For example, measurement device 110 may determine whether the control type of the user action and the control type of the preceding user action match the subsequent long control type and the initial long control type, respectively. If the control type of the user action matches the subsequent long control type and the control type of the preceding user action matches the initial long control type in a set (step S630; yes), the user action and/or the control associated with the user action may be classified as a first classification (step S640) (e.g., a long action and/or a long action control, respectively). Measurement device 110 may provide data regarding the classification to a storage device, such as database 140, for storage and/or to another device for processing (step S680).
[0053] If the control type of the user action does not match the subsequent long control type and the control type of the preceding user action does not match the initial long control type (step S630; no), process 600 may also determine whether the control type of the user action and the control type of the preceding user action match a second set of control types (step S650). For example, like with the sets of long action control types, measurement device 110 may store sets of short action control types in a storage device, such as database 140. The sets of short action control types may define control types that, based on their respective positions within a sequence of user actions, may be associated with user actions that a user may perceive to take a short amount of time. For example, the sets of short action control types may include control types associated with user actions that relate to system prepared user interface options. The sets of short action control types may include an initial short control type and a subsequent short control type. The initial short control type may be a type of control associated with a user action that occurs immediately before the subsequent short control type. Examples of sets of short action control types may include, but are not limited to, any combination of a (1) refresh, checkbox, switch, radio button, open menu option, list option, and scroll view (without refresh) as the initial short control type and a (2) text box, checkbox, switch, radio button, open menu option, list option, and scroll view (without refresh) as the subsequent short control type. While in the example described above the sets of short action control types include two control types, any number of control types (e.g., one control type, three control types, ten control types, etc.) may be part of a set, and each set need not have the same number of control types.
[0054] Measurement device 110 may compare the user action control type and the preceding user action control type to the control types in the sets of short action control types to determine a match. For example, measurement device 110 may determine whether the control type of the user action and the control type of the preceding user action match the subsequent short control type and the initial short control type, respectively. If the control type of the user action matches the subsequent short control type and the control type of the preceding user action matches the initial short control type (step S650; yes), the user action and/or the control associated with the user action may be classified as a second classification (step S660) (e.g., a short action and/or a short action control, respectively). Measurement device 110 may provide data regarding the classification to a storage device, such as database 140, for storage and/or to another device for processing (step S680).
[0055] If the control type of the user action does not match the subsequent short control type and the control type of the preceding user action does not match the initial short control type of a short action control set (step S650; no), the user action and/or the control associated with the user action may be classified as a third classification (step S670) (e.g., an unknown type of user action and/or action control, respectively). Device 110 may then provide data regarding the classification to a storage device, such as database 140, for storage and/or to another device for processing (step S680).
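Read together, steps S630 through S670 amount to a lookup over (initial, subsequent) control-type pairs. The sketch below encodes a few of the example sets given above; the exact contents of the sets and the labels "long", "short", and "unknown" are assumptions chosen for illustration, standing in for the first, second, and third classifications.
```python
# Each long action set is an (initial control type, subsequent control type) pair.
LONG_ACTION_SETS = {
    ("checkbox", "refresh"),
    ("switch", "refresh"),
    ("open_menu_option", "refresh"),
    ("radio_button", "refresh"),
    ("text_entry", "button"),        # e.g. a search or login button
}

SHORT_INITIAL = {"refresh", "checkbox", "switch", "radio_button",
                 "open_menu_option", "list_option", "scroll_view"}
SHORT_SUBSEQUENT = {"text_box", "checkbox", "switch", "radio_button",
                    "open_menu_option", "list_option", "scroll_view"}

def classify(current_control: str, preceding_control: str) -> str:
    """Classify a user action from its control type and the preceding control type."""
    if (preceding_control, current_control) in LONG_ACTION_SETS:
        return "long"            # first classification (step S640)
    if preceding_control in SHORT_INITIAL and current_control in SHORT_SUBSEQUENT:
        return "short"           # second classification (step S660)
    return "unknown"             # third classification (step S670)

print(classify("button", "text_entry"))   # -> "long"
print(classify("checkbox", "refresh"))    # -> "short"
```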
[0056] Measurement device 110 may process the data related to user actions and/or user action controls of the third classification in a number of ways. For example, measurement device 110 may automatically change the classification of the user action to another classification, such as the first classification (e.g., reclassify the user action as a long action) or the second classification (e.g., reclassify the user action as a short action). Measurement device 110 may also request additional information regarding the user action from the user. For example, measurement device 110 may generate and/or display a data collection screen to collect additional data related to the user action from the user. The data collection screen may include a set of data request areas indicating the data requested, such as whether the user perceived the user interface response to take a long amount of time or a short amount of time. For example, the data collection screen may include text entry boxes or radio buttons by which the user can submit his or her perceived user action duration to system 100. Measurement device 110 may process the perceived user action duration input by the user and modify the classification of the user action based on the input (e.g., change the classification of the user action from the third classification to the first classification or the second classification). Measurement device 110 may then provide data regarding the new classification to a storage device, such as database 140, for storage and/or to another device for processing. After the data is stored, process 600 may end (step S695).
[0057] FIG. 7 is a flow chart of an example process 700 for analyzing user actions based on user action classifications consistent with disclosed implementations. Although execution of process 700 is described below with reference to system 100 of FIG. 1 and/or specific components of system 100, other suitable systems and devices for execution of at least one step of process 700 may be used. For example, processes described below as being performed by measurement device 110 may be performed by client device 120, user action performance measurement device 210, user action performance measurement device 310, and/or any other suitable device. Similarly, processes described below as being performed by client device 120 may be performed by measurement devices 110, 210, 310, and/or any other suitable device. Process 700 may be implemented in the form of executable instructions stored on a machine-readable storage medium and/or in the form of electronic circuitry. In certain aspects, process 700 may relate to the processes associated with step S440 of FIG. 4.
[0058] Process 700 may start (step S705) after a user action has been classified. After the user action has been classified, process 700 may include determining whether the amount of time between the user action and the user interface response exceeds a classification-based threshold value. For example, measurement device 110 may access data related to the user action to determine the classification and the amount of time between the user action and the user interface response. In some implementations, device 110 may retrieve, for a particular user action, the stored user action data (including the classification and the amount of time between the user action and the user interface response) stored in a storage device, such as a machine-readable storage medium, database 140, and/or any other suitable storage device. The stored user action data may correspond to the data stored in step S590 of FIG. 5 and/or step S680 of FIG. 6.
[0059] Process 700 may also set classification-based threshold values (step S710). For example, measurement device 110 may set a first threshold value that defines an acceptable duration for user interface responses of the first classification (e.g., long actions), and a second threshold value that defines an acceptable duration for user interface responses of the second classification (e.g., short actions). In some implementations, the first threshold value and the second threshold value may be set based on the stored user action data. For example, measurement device 110 may use the stored user action data to set the first threshold value to be an average of all user interface response durations of the first classification. Similarly, device 110 may use the stored user action data to set the second threshold value to be an average of all user interface response durations of the second classification. In some implementations, the first threshold value and the second threshold value may be predetermined numbers. For example, measurement device 110 may set the first threshold value (e.g., the threshold value associated with long actions) to be between 2-8 seconds and may set the second threshold value (e.g., the threshold value associated with short actions) to be 150 ms. Measurement device 110 may then provide data regarding the first threshold and/or the second threshold to a storage device, such as a machine-readable medium, database 140, and/or another device for additional processing.
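A sketch of step S710, assuming stored response durations are available as plain per-classification lists. It shows both strategies described above: deriving each threshold from the average duration of its classification when history exists, with fixed fall-back values otherwise (5 seconds for long actions and 150 ms for short actions, chosen here only as illustrative points within the ranges mentioned).
```python
from statistics import mean
from typing import Dict, List

# Fall-back values used when no history is available (illustrative only).
DEFAULT_THRESHOLD_MS = {"long": 5000.0, "short": 150.0}

def set_thresholds(durations_ms: Dict[str, List[float]]) -> Dict[str, float]:
    """Return a threshold per classification, based on stored response durations."""
    thresholds = {}
    for classification, default in DEFAULT_THRESHOLD_MS.items():
        history = durations_ms.get(classification, [])
        thresholds[classification] = mean(history) if history else default
    return thresholds

print(set_thresholds({"long": [3200.0, 4100.0], "short": []}))
# -> {'long': 3650.0, 'short': 150.0}
```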
[0060] Process 700 may also determine whether the amount of time exceeds a classification-based threshold (step S720). For example, if the user action has been classified as the first classification, measurement device 110 may compare the amount of time between the user action and the user interface response to the first threshold. Similarly, if the user action has been classified as the second classification, measurement device 110 may compare the amount of time between the user action and the user interface response to the second threshold. If the amount of time does not exceed the classification-based threshold (step S720; no), process 700 may continue by calculating statistics for user actions of the same classification (step S740) (discussed in more detail below) and store the statistics and/or other data (e.g., data indicating whether the user action exceeded the classification-based threshold) in a storage device (step S750) (also discussed in more detail below).
[0061] If the amount of time exceeds the classification-based threshold (step S720; yes), process 700 may continue by generating an alert indicating that the amount of time exceeds the threshold (step S730). For example, measurement device 110 may generate and provide an alert to a device, such as application content provider device 130, client device 120, and/or another suitable type of device. In certain aspects, measurement device 110 may generate an alert to include user action data obtained, for example, from machine-readable medium 124 of client 120, database 140, and/or another component. In some implementations, measurement device 110 may generate the alert such that it may include information about user actions that exceed the classification-based threshold (e.g., aggregate information about the number of user actions that exceeded the classification-based threshold, information about particular user actions, etc.).
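Putting steps S720 and S730 together, the sketch below compares a measured duration against the threshold for its classification and builds a simple alert payload when the threshold is exceeded. The alert structure and the logging call are placeholders; the actual delivery mechanism to a device such as content provider device 130 is not specified here.
```python
import logging
from typing import Dict, Optional

logging.basicConfig(level=logging.INFO)

def check_and_alert(action_name: str, classification: str, elapsed_ms: float,
                    thresholds: Dict[str, float]) -> Optional[dict]:
    """Step S720: compare elapsed time to the classification-based threshold.
    Step S730: generate an alert when the threshold is exceeded."""
    threshold = thresholds[classification]
    if elapsed_ms <= threshold:
        return None
    alert = {
        "action": action_name,
        "classification": classification,
        "elapsed_ms": elapsed_ms,
        "threshold_ms": threshold,
    }
    logging.info("threshold exceeded: %s", alert)   # stand-in for delivery to device 130
    return alert

check_and_alert("login", "short", 600.0, {"long": 5000.0, "short": 150.0})
```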
[0062] Process 700 may also include calculating statistics for user actions of the same classification (step S740). In some implementations, the statistics may relate to user interface performance based on user expectations. For example, measurement device 110 may calculate statistics for a single user action, for multiple user actions, for single users, for multiple users, for a single device type, for multiple device types, for a single control type, for multiple control types, for a particular control, and/or for multiple controls. In some implementations, measurement device 110 may access data relating to the user actions (e.g., data stored in step S590 of FIG. 5 and/or step S680 of FIG. 6) stored in a storage device, such as database 140, to identify information for use in various statistical calculations. For example, device 110 may use the data to calculate an average user interface response duration for all user actions assigned to the first classification and an average user interface response duration for all user actions assigned to the second classification. As another example, measurement device 110 may calculate the number of user actions assigned to the first classification that exceeded the first threshold and the number of user actions assigned to the second classification that exceeded the second threshold. As yet another example, measurement device 110 may calculate, for a particular user action, the number of users that experienced a user interface response duration that exceeded the classification-based threshold. Measurement device 110 may provide data regarding the calculated statistics to a storage device, such as a machine-readable medium and/or database 140, and/or to another device for additional processing (step S750). After the data is stored, process 700 may end (step S765).
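The statistics of step S740 are straightforward aggregates over the stored records. A possible computation is sketched below, assuming each record carries a classification, an elapsed time, an action name, and a user identifier; these field names and the sample data are illustrative only.
```python
from statistics import mean
from typing import Dict, List

def summarize(records: List[dict], thresholds: Dict[str, float]) -> dict:
    """Aggregate response times per classification and count threshold violations."""
    summary = {"total_actions": len(records)}
    for classification, threshold in thresholds.items():
        group = [r for r in records if r["classification"] == classification]
        slow = [r for r in group if r["elapsed_ms"] > threshold]
        summary[classification] = {
            "average_ms": mean(r["elapsed_ms"] for r in group) if group else None,
            "count_exceeding": len(slow),
            # Number of distinct users who saw a slow response, per action name.
            "users_exceeding_by_action": {
                name: len({r["user"] for r in slow if r["action"] == name})
                for name in {r["action"] for r in slow}
            },
        }
    return summary

records = [
    {"classification": "short", "elapsed_ms": 90.0,   "action": "toggle", "user": "u1"},
    {"classification": "short", "elapsed_ms": 400.0,  "action": "toggle", "user": "u2"},
    {"classification": "long",  "elapsed_ms": 2600.0, "action": "login",  "user": "u1"},
]
print(summarize(records, {"long": 5000.0, "short": 150.0}))
```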
[0063] FIG. 8 is an example of a user interface for displaying user action-related statistics consistent with disclosed implementations. In some implementations, user interface (UI) performance display 800 may be generated by device 110 using data obtained from, for example, a machine-readable medium, database 140, and/or another component. For example, device 110 may obtain the results of the calculations performed by measurement device 110 and described above with respect to step S740 of FIG. 7. As shown in FIG. 8, UI performance display 800 may display results of the calculations performed in step S740 as text, graphics, or a combination of text and graphics in a way that aids the user in determining user action performance. For example, as shown in FIG. 8, UI performance display 800 may include an area 820 for displaying statistics related to overall user action performance, and an area 840 for displaying statistics related to user actions that exceed the classification-based threshold.
[0064] In the example shown in FIG. 8, area 820 may display the total number of actions, the percentage of actions assigned to the first classification (e.g., long actions) that exceeded the first threshold, the average response time of actions assigned to the first classification that exceeded the first threshold, the percentage of actions assigned to the second classification that exceeded the second threshold, and/or the average response time of actions assigned to the second classification that exceeded the second threshold. In the example shown in FIG. 8, area 840 may display information about the particular user actions that exceeded the classification-based threshold. For example, area 840 may display the control type associated with the user action 842, the name of the user action 844, the number of users who performed the particular user action and had a response time that exceeded the classification-based threshold 846, and the average response time for the particular user action 848.
[0065] Measurement device 110 may limit the information displayed on UI performance display 800 and/or may display user action-related data according to certain selection criteria. For example, a user may instruct measurement device 110 to limit the data displayed to statistics related to the first classification or to statistics related to the second classification. As another example, a user may instruct measurement device 110 to limit the data displayed to data related to a particular type of device, a particular operating system, and/or a particular application. In some implementations, measurement device 110 may display the UI performance display 800 on a display device, such as a display device associated with content provider device 130.
[0066] The disclosed examples may include systems, devices, computer-readable storage media, and methods for measuring user action performance. For purposes of explanation, certain examples are described with reference to the components illustrated in FIGs. 1-8. The functionality of the illustrated components may overlap, however, and may be present in a fewer or greater number of elements and components. Further, all or part of the functionality of illustrated elements may coexist or be distributed among several geographically dispersed locations. Moreover, the disclosed examples may be implemented in various environments and are not limited to the illustrated examples.
[0067] Further, the sequences of operations described in connection with FIGs. 1-8 are examples and are not intended to be limiting. Additional or fewer operations or combinations of operations may be used or may vary without departing from the scope of the disclosed examples. For example, measurement device 110 of system 100 may simply receive data captured from client device 120, and/or client device 120 may classify the user action. Furthermore, implementations consistent with the disclosed examples need not perform the sequence of operations in any particular order, including those in FIGs. 4-7. Thus, the present disclosure merely sets forth possible examples of implementations, and many variations and modifications may be made to the described examples. All such modifications and variations are intended to be included within the scope of this disclosure and protected by the following claims.

Claims

We claim:
1. A system for measuring user action performance comprising:
a memory storing instructions; and
a processor to execute the instructions to perform operations for measuring user action performance, the operations including:
capturing data associated with a plurality of user actions and user interface responses to the plurality of user actions, the data including:
a first control type associated with a first user action; and
a second control type associated with a second user action, the second user action preceding the first user action;
classifying the first user action based on the first control type and the second control type; and
analyzing the data based on the classification.
2. The system of claim 1, the operations comprising:
calculating an amount of time that elapses between the first user action and a first user interface response to the first user action.
3. The system of claim 2, wherein analyzing the data based on the classification comprises:
determining whether the amount of time exceeds a threshold, the threshold being based on the classification of the first user action; and
generating an alert if the amount of time exceeds the threshold.
4. The system of claim 1, wherein classifying the first user action based on the first control type and the second control type comprises:
assigning the first user action to a first classification if the first control type and the second control type match a first set of control types; and
assigning the first user action to a second classification if the first control type and the second control type match a second set of control types.
5. The system of claim 4, wherein classifying the first user action comprises:
assigning the first user action to a third classification if the first control type and the second control type do not match the first set of control types or the second set of control types; and
generating a data collection screen to collect additional data related to the user action, the additional data including whether the user perceived the user action to take a long amount of time or a short amount of time.
6. The system of claim 1, the operations including:
classifying a first control based on the first control type and the second control type, the first user action being performed on the first control.
7. The system of claim 1, wherein analyzing the data based on the classification comprises:
calculating first statistics based on a first portion of the data, the first portion of the data including data relating to user actions of the first classification;
calculating second statistics based on a second portion of the data, the second portion of the data including data relating to user actions of the second classification; and
transmitting the first statistics and the second statistics to a computing device.
8. A non-transitory computer-readable storage medium including instructions
which, when executed by a processor of a device for measuring user action performance, cause the processor to:
obtain data associated with a first user action and a first user interface response to the first user action, the data including a first control type associated with the first user action;
classify the first user action based on the first control type and a second control type associated with a second user action, the second user action preceding the first user action; and
analyze the data based on the classification.
9. The non-transitory computer-readable storage medium of claim 8, wherein:
the data includes an amount of time that elapses between the first user action and a first user interface response to the first user action; and
the first user interface response is a complete response to the first user action such that a user interface is fully updated.
10. The non-transitory computer-readable storage medium of claim 8, wherein classifying the first user action comprises:
assigning the first user action to a first classification if the first control type and the second control type match a first set of control types; and
assigning the first user action to a second classification if the first control type and the second control type match a second set of control types.
11. The non-transitory computer-readable storage medium of claim 8, the data including a time of completion of a last user interface task.
12. The non-transitory computer-readable storage medium of claim 8, the instructions causing the processor to:
calculate first statistics based on a first portion of the data, the first portion of the data including data relating to user actions assigned to the first classification;
calculate second statistics based on a second portion of the data, the second portion of the data including data relating to user actions assigned to the second classification; and
display the first statistics and the second statistics on a display device.
13. A computer-implemented method for measuring user action performance comprising:
accessing, via a processor, data associated with a plurality of user actions and user interface responses to the plurality of user actions, the data being accessed from a client device through network interface circuitry;
determining, via the processor, an amount of time that elapses between the first user action and a corresponding full update to a user interface, the determination being based on the data;
determining, via the processor, a control type associated with each user action based on the data;
classifying, via the processor, a first user action of the plurality of user actions based on the control type associated with the first user action, the control type of a second user action that precedes the first user action, and the amount of time; and
analyzing, via the processor, the data based on the classification.
14. The method of claim 13, comprising:
determining whether the amount of time exceeds a threshold, the threshold being based on the classification of the first user action; and
generating an alert if the amount of time exceeds the threshold.
15. The method of claim 14, wherein the alert includes aggregate information about a number of user actions that exceeded the threshold and information about the first user action.
PCT/US2014/035915 2014-04-29 2014-04-29 Measuring user action performance by classifying user actions WO2015167470A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2014/035915 WO2015167470A1 (en) 2014-04-29 2014-04-29 Measuring user action performance by classifying user actions

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2014/035915 WO2015167470A1 (en) 2014-04-29 2014-04-29 Measuring user action performance by classifying user actions

Publications (1)

Publication Number Publication Date
WO2015167470A1 true WO2015167470A1 (en) 2015-11-05

Family

ID=54359026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/035915 WO2015167470A1 (en) 2014-04-29 2014-04-29 Measuring user action performance by classifying user actions

Country Status (1)

Country Link
WO (1) WO2015167470A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10346285B2 (en) 2017-06-09 2019-07-09 Microsoft Technology Licensing, Llc Instrumentation of user actions in software applications

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110161331A1 (en) * 2006-03-29 2011-06-30 Christina Yip Chung Incremental Update Of Long-Term And Short-Term User Profile Scores In A Behavioral Targeting System
US20120166636A1 (en) * 2009-07-24 2012-06-28 Queen Mary And Westfiled College University Of London Method of monitoring the performance of a software application
US8214805B2 (en) * 2006-12-21 2012-07-03 International Business Machines Corporation Method and system for graphical user interface testing
US20120198476A1 (en) * 2011-01-31 2012-08-02 Dmitry Markuza Evaluating performance of an application using event-driven transactions
US20120284735A1 (en) * 2011-05-06 2012-11-08 Microsoft Corporation Interaction-Based Interface to a Logical Client

Similar Documents

Publication Publication Date Title
US11687631B2 (en) Method for generating a human likeness score
US11210160B1 (en) Computer information technology alert remediation selection based on alert similarity
US9430626B1 (en) User authentication via known text input cadence
US10095753B2 (en) Aggregation and generation of confidential data insights with confidence values
US20180089442A1 (en) External dataset-based outlier detection for confidential data in a computer system
US10068088B2 (en) Method, computer program and system that uses behavioral biometric algorithms
CN104508602B (en) Method, storage medium and system especially related to touch gesture offset
TWI584145B (en) Biometrics data recognition apparatus, system, method and computer readable medium
US11163783B2 (en) Auto-selection of hierarchically-related near-term forecasting models
US12001984B2 (en) Enhanced user selection for communication workflows using machine-learning techniques
US20210374778A1 (en) User experience management system
US11740916B1 (en) System and method for providing a customized graphical user interface based on user inputs
US11381528B2 (en) Information management apparatus and information management method
KR20190135691A (en) Method for customer relationship management of hospital using medical emr data and management server implementing the same
US10255457B2 (en) Outlier detection based on distribution fitness
CN114093472A (en) Triage information display method and client for Internet medical treatment
US20160070902A1 (en) Smart captchas
US20180089443A1 (en) Internal dataset-based outlier detection for confidential data in a computer system
US10867249B1 (en) Method for deriving variable importance on case level for predictive modeling techniques
US11507992B1 (en) Systems and methods for displaying filters and intercepts leveraging a predictive analytics architecture
JP6100832B2 (en) Method and system for providing recommended search terms based on messenger dialogue content, and recording medium
KR102040858B1 (en) System and Method for Interaction Analysis of Virtual Space
JP5827447B1 (en) Information processing apparatus, information processing method, and program
US11475221B2 (en) Techniques for selecting content to include in user communications
WO2015167470A1 (en) Measuring user action performance by classifying user actions

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14890974

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14890974

Country of ref document: EP

Kind code of ref document: A1