
CA3072385A1 - Systems and methods for assessing skill and trait levels - Google Patents

Systems and methods for assessing skill and trait levels

Info

Publication number
CA3072385A1
CA3072385A1
Authority
CA
Canada
Prior art keywords
user
skill
server
level
trait
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA3072385A
Other languages
French (fr)
Inventor
James Francis Knupfer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Misellf Inc
Original Assignee
Misellf Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 16/274,980 (published as US20200258045A1)
Application filed by Misellf Inc filed Critical Misellf Inc
Publication of CA3072385A1 publication Critical patent/CA3072385A1/en

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Disclosed herein are embodiments of systems, methods, and products comprising a server for assessing user skill and trait levels. The server receives user information and generates a user profile based on the received user information. The server determines a skill to be assessed and the initial level of the skill based on the user profile. The server displays a challenge corresponding to the skill to be assessed and the initial skill level in a graphical user interface of a client computing device. As the user is answering the questions included in the challenge, the server receives the response inputs and continuously monitors and tracks the interaction events. The server also executes an artificial intelligence model to determine the user's skill level and trait level based on the response inputs and interaction events. The server updates the user skill and trait levels in the user profile with the new assessment results.

Description

SYSTEMS AND METHODS FOR ASSESSING SKILL AND TRAIT LEVELS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001]
This application claims priority from U.S. Patent Application Serial No.
16/274,980, filed February 13, 2019.
TECHNICAL FIELD
[0002]
This application relates generally to methods and systems for assessing skill and trait levels.
BACKGROUND
[0003] In recruitment processes, an employer may post positions externally and/or internally to attract suitable candidates and review previously received applications.
Alternatively, an employer may engage with one or more recruitment agencies.
However, these conventional methods and solutions have created several shortcomings and a new set of challenges. The process of finding the right candidates to fill job vacancies can be cumbersome, resource-intensive, slow, and expensive. This is especially true in fields where labor shortages have made finding suitably skilled candidates more difficult.
[0004]
Some conventional methods for recruitment may be inefficient and time-consuming. For example, in the conventional methods, the employer may review and filter the applications by manually reviewing the application materials (e.g., resumes).
The review of applications may take considerable time, especially when the number of applications is large.
While automating this process using keyword filters can help, the process is still inefficient.
[0005]
Conventional methods using automated filtering may produce inaccurate results.
For example, in the conventional methods, the employer may review and filter the applications based on information provided by the applicants. Applicants, however, may intentionally or unintentionally mischaracterize their skill levels and history. Skill levels may be overstated or even understated or omitted. Further, the applications may be outdated in that they do not accurately characterize the current skill levels of the applicants. Still further, an applicant's skill levels and actual skills may not match their current or prior positions, making it difficult to judge
their competency. Even if references are consulted regarding the skill levels of the applicants, subjective bias can color the information received from the references. Even where an internal candidate is applying, the assessment of the candidate's skill levels by his/her manager may be inaccurate. As a result, the review and filtering process in conventional methods may provide inadequate information regarding the candidates' qualifications.
[0006]
Some conventional methods require additional expense to pursue unqualified applicants. Once the applications are filtered, applicants may be interviewed and subsequently tested. This testing provides a snapshot of the applicants' skill levels, but is expensive and time-consuming to employers. For example, an employer may need to pay for the hotels and flight tickets for bringing the applicants to onsite interviews. In cases where the employer engages with recruitment agencies, the employer may need to pay the recruitment agencies fees in the range from 15% to 25% of the worker's annual salary.
[0007]
The conventional methods may also provide insufficient information on behavior traits of applicants. The assessment of behavioral traits of applicants is, in many cases, challenging. Different positions can have different requirements, not just technically, but also personality-wise. When the employer is looking for a worker to integrate into a larger team, evidence of teamwork can often be more desirable. In scenarios where the employer is looking for a worker for a small team or who will work alone, teamwork may be less important, and independence can be more desirable. The assessment of these behavioral traits of applicants can be difficult, as little information is available to the employer. While internal and external references can provide some insight into how the applicant works, this information is subjective.
Further, the employer may only have a short period of time to spend with an applicant before having to make a decision as to whether the applicant possesses appropriate behavioral traits for the position.
[0008] It can be difficult to objectively locate the best candidates for a position.
With the trend of globalization and working remotely, many markets are moving from roles-based to skills-based, thereby reducing the effectiveness of traditional recruitment processes. Further, due to the rapidly changing demands in certain markets, such as technology, media, and communications, there can be significant lost revenue due to skills shortages and/or the delay in filling a position.

SUMMARY
[0009]
For the aforementioned reasons, what is needed is a system and method to assess the user skill and trait levels more accurately and objectively. Embodiments disclosed herein address the above challenges by providing a system and method for presenting challenges to test user skills, continuously monitoring the user interactions with the challenges, and assessing the user skill and trait levels based on user interactions. Embodiments described herein also provide a system and method for iteratively updating the user's skill and trait levels by executing an artificial intelligence model based on the user's interactions with new challenges. Whereas many conventional systems are limited to the submitted application documents, the embodiments herein recite methods for obtaining additional information and generating new information for consideration.
[0010]
Specifically, an analytic server may generate a prompt displaying a challenge comprising one or more questions for the user to respond. The analytic server may continuously monitor user interaction events as the user interacts with the prompt. The analytic server may determine a user's skill level and trait level by executing an artificial intelligence model based on the monitored interaction events. As the user's skill level and trait level update, the analytic server may provide new challenges with new questions corresponding to the user's updated skill level and trait level. The analytic server may iteratively monitor the user interaction events as the user responds to the new challenges and update the user's skill level and trait level based on the new interaction events. The analytic server may repeat performing this process to provide a more accurate and objective assessment of the user's skill and trait levels and generate a more comprehensive user profile regarding different user skills and traits over time.
[0011]
Because the embodiments described herein provide a more accurate and objective assessment of the users' skill level and trait level, the embodiments described may enable employers to search for candidates based on criteria comprising the required skill sets and the required skill level and trait level. As a result, the embodiments described herein may make the recruitment process more efficient and less expensive.
[0012] In one embodiment, a method comprises generating, by a server, a prompt displaying one or more questions corresponding to a selected skill and an initial skill level of the selected skill, the prompt comprising a string for each question and one or more input elements
configured to receive response inputs to the question; continuously monitoring, by the server, interaction events corresponding to user interactions with the prompt, wherein the interaction events comprise the response inputs entered into the one or more input elements, a time period in which a user enters the response inputs, changes of the response inputs, interactions with other electronic applications executing on the user computing device, and a timestamp of each user interaction with the prompt; executing, by the server, an artificial intelligence model to determine an adjustment from the initial skill level to a second skill level for the selected skill and a trait level based on the monitored interaction events, the artificial intelligence model being trained based on historical users and their respective skill levels, trait levels, and interaction events; updating, by the server, a user profile with the second skill level and the trait level;
continuously monitoring, by the server, the user profile; when the server identifies a modification to the user profile, generating, by the server, a second prompt to display a second set of questions associated with the modification to the user profile; continuously monitoring, by the server, a second set of interaction events corresponding to user interactions with the second prompt;
executing, by the server, the artificial intelligence model to determine an adjustment from the second skill level to a new skill level and a new trait level based on the second set of interaction events; and updating, by the server, the user profile with the new skill level and the new trait level.
[0013] In another embodiment, a system comprises a user computing device and a server in communication with the user computing device and configured to: generate a prompt on the user computing device displaying one or more questions corresponding to a selected skill and an initial skill level of the selected skill, the prompt comprising a string for each question and one or more input elements configured to receive response inputs to the question;
continuously monitor interaction events corresponding to user interactions with the prompt, wherein the interaction events comprise the response inputs entered into the one or more input elements, a time period in which a user enters the response inputs, changes of the response inputs, interactions with other electronic applications executing on the user computing device, and a timestamp of each user interaction with the prompt; execute an artificial intelligence model to determine an adjustment from the initial skill level to a second skill level for the selected skill and a trait level based on the monitored interaction events, the artificial intelligence model being trained based on historical users and their respective skill levels, trait levels, and interaction events; update a user
profile with the second skill level and the trait level; continuously monitor the user profile; when the server identifies a modification to the user profile, generate a second prompt to display a second set of questions associated with the modification to the user profile;
continuously monitor a second set of interaction events corresponding to user interactions with the second prompt;
execute the artificial intelligence model to determine an adjustment from the second skill level to a new skill level and a new trait level based on the second set of interaction events; and update the user profile with the new skill level and the new trait level.
[0014] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the disclosed embodiment and subject matter as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015]
The present disclosure can be better understood by referring to the following figures. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the disclosure. In the figures, reference numerals designate corresponding parts throughout the different views.
[0016]
FIG. 1 illustrates a computer system for assessing skill and trait levels, according to an embodiment.
[0017]
FIG. 2 illustrates a schematic diagram showing various physical and logical components of an analytic server in the computer system, according to an embodiment.
[0018]
FIG. 3 illustrates a schematic diagram showing main software components of the analytic server and a client computing device, according to an embodiment.
[0019]
FIG. 4 illustrates various data structures maintained by the computer system, according to an embodiment.
[0020]
FIG. 5 illustrates a flowchart depicting operational steps for using the skill and trait level assessment service by a user, according to an embodiment.
[0021]
FIG. 6 illustrates a graphical user interface for user profile setup, according to an embodiment.

[0022]
FIG. 7 illustrates a flowchart depicting operational steps for the analytic server to assess skill and trait levels based on a user's response input, according to an embodiment.
[0023]
FIG. 8 illustrates a graphical user interface for presenting a challenge with simple multiple choice response options, according to an embodiment.
[0024]
FIG. 9 illustrates a graphical user interface for presenting a challenge with complex multiple choice response options, according to an embodiment.
[0025]
FIG. 10 illustrates a graphical user interface displaying content of a response option, according to an embodiment.
[0026]
FIG. 11 illustrates a graphical user interface upon selection of one of the multiple choice response options, according to an embodiment.
[0027]
FIG. 12 illustrates a flowchart depicting operational steps for updating user skill and trait levels, according to an embodiment.
[0028]
FIGS. 13A-13B illustrate a graphical user interface showing a user skill summary, according to an embodiment.
[0029]
FIG. 14 illustrates a graphical user interface of a dashboard for an employer, according to an embodiment.
[0030]
FIGS. 15A-15B illustrate a graphical user interface for an employer to specify search criteria, according to an embodiment.
[0031]
FIG. 16 illustrates a graphical user interface displaying a list of candidates matching specified search criteria, according to an embodiment.
[0032]
FIGS. 17A-17B illustrate a graphical user interface for summarizing a selected candidate's skill and trait levels, according to an embodiment.
[0033]
FIG. 18 illustrates a graphical user interface showing a region summary for users, according to an embodiment.
[0034]
FIG. 19 illustrates a flowchart depicting operational steps for assessing skill and trait levels, according to an embodiment.

[0035]
FIG. 20 illustrates a graphical user interface for summarizing challenge results, according to an embodiment.
[0036]
FIG. 21 illustrates a graphical user interface displaying skill and trait level assessment results for a user, according to an embodiment.
DETAILED DESCRIPTION
[0037]
Reference will now be made to the illustrative embodiments illustrated in the drawings, and specific language will be used here to describe the same. It will nevertheless be understood that no limitation of the scope of the claims or this disclosure is thereby intended.
Alterations and further modifications of the inventive features illustrated herein, and additional applications of the principles of the subject matter illustrated herein, which would occur to one ordinarily skilled in the relevant art and having possession of this disclosure, are to be considered within the scope of the subject matter disclosed herein. The present disclosure is here described in detail with reference to embodiments illustrated in the drawings, which form a part here. Other embodiments may be used and/or other changes may be made without departing from the spirit or scope of the present disclosure. The illustrative embodiments described in the detailed description are not meant to be limiting of the subject matter presented here.
[0038]
Embodiments disclosed herein provide a system and method for assessing skill and trait levels. Specifically, an analytic server may receive user information and generate a user profile based on the received user information. The analytic server may determine a skill to be assessed for the user and the initial level of the skill based on the user profile. Alternatively, the analytic server may receive a selection of the skill to be assessed from the user. The analytic server may display a challenge corresponding to the skill to be assessed and the initial skill level in a graphical user interface of a client computing device. The challenge may comprise one or more questions for the user to answer. As the user is answering the questions included in the challenge, the analytic server may receive the response inputs and continuously monitor and track the interaction events corresponding to the user interactions with the graphical user interface. The analytic server may score the response inputs for the challenge. The analytic server may execute an artificial intelligence model to determine the user's skill level based on the user's score and the user interaction events. The analytic server may also execute the artificial intelligence model to determine the user's trait level, such as the user's personal nature
and characteristics based on the interaction events corresponding to the user behavior. The analytic server may update the user skill and trait levels in the user profile with the new assessment results. The analytic server may continuously monitor the user's interaction events and update the user's skill and trait levels to generate a more accurate and objective assessment of the user.
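As a non-limiting illustration of this overall flow, the sketch below models the assessment loop in Python. The challenge bank, the helper names (UserProfile, select_challenge, assess), and the default initial level of 50 are hypothetical stand-ins rather than the disclosed implementation; the level update simply echoes the example formula given later in paragraphs [0087]-[0088].
```python
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    skill_levels: dict = field(default_factory=dict)  # skill -> level on a 0-100 scale
    trait_levels: dict = field(default_factory=dict)  # trait -> level


# Hypothetical challenge bank: (skill tag, challenge level) pairs.
CHALLENGES = [("HTML", 40), ("HTML", 55), ("HTML", 70), ("HTML", 85)]


def select_challenge(skill, level):
    """Pick the challenge tagged with the skill whose level is closest to the user's."""
    return min((c for c in CHALLENGES if c[0] == skill), key=lambda c: abs(c[1] - level))


def assess(profile, skill, response_scores):
    """Present one challenge per score and update the profile after each response."""
    level = profile.skill_levels.get(skill, 50)            # initial level if never assessed
    for score in response_scores:                          # score in [0, 1] for each challenge
        _, challenge_level = select_challenge(skill, level)
        expected = 0.60 + (level - challenge_level) / 50   # expected-score example from [0087]
        level = max(0.0, min(100.0, level + (score - expected) * 10))
        profile.skill_levels[skill] = level                # persist the updated assessment
    return profile


profile = assess(UserProfile(), "HTML", response_scores=[1.0, 0.8, 0.4])
print(profile.skill_levels)
```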
[0039]
FIG. 1 illustrates components of a system 100 for assessing skill and trait levels, according to an embodiment. The system 100 may comprise an analytic server 110, a database 120, and a set of client computing devices 130 that are connected with each other via hardware and software components of one or more networks 140. Examples of the network 140 include, but are not limited to, Local Area Network (LAN), Wireless Local Area Network (WLAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), and the Internet.
The communication over the network 140 may be performed in accordance with various communication protocols, such as Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), and IEEE communication protocols.
[0040]
The analytic server 110 may be any computing device comprising a processor and other computing hardware and software components. The analytic server 110 may be logically and physically organized within the same or different devices or structures, and may be distributed across any number of physical structures and locations (e.g., cabinets, rooms, buildings, cities).
[0041]
The analytic server 110 may be a computing device comprising a processing unit.
The processing unit may include a processor with computer-readable medium, such as a random access memory coupled to the processor. The analytic server 110 may be running algorithms or computer executable program instructions, which may be executed by a single processor or multiple processors in a distributed configuration. The analytic server 110 may be configured to interact with one or more software modules of a same or a different type operating within the system 100.
[0042]
Non-limiting examples of the processor may include a microprocessor, an application specific integrated circuit, and a field programmable object array, among others.
Non-limiting examples of the analytic server 110 may include a server computer, a workstation computer, a tablet device, and a mobile device (e.g., smartphone). Some embodiments may
include multiple computing devices functioning as the analytic server 110.
Some other embodiments may include a single computing device capable of performing the various tasks described herein.
[0043]
The set of client computing devices 130 may be any computing device allowing a user to interact with the analytic server 110. The client computing devices 130 may be any computing device comprising a processor and non-transitory machine-readable storage medium.
The examples of the computing device may include, but are not limited to, a desktop computer, a laptop, a personal digital assistant (PDA), a smartphone, a tablet computer, and the like. The client computing devices 130 may comprise any number of input and output devices supporting various types of data, such as text, image, audio, video, and the like.
[0044]
The analytic server 110 may receive user information via a mobile application (e.g., client application) installed in the client computing device 130 and/or a web application (e.g., comprising a hyperlink of a website). The analytic server 110 may display a graphical user interface (GUI) on the client computing device 130 that allows the user to input user information, such as the user's basic information including name, age, gender, location (e.g., home address), zip code, email address, and the like. The user information may also include the user's professional background and experience, such as the user's education history, work projects, previous work experiences, skill sets, and the like.
[0045] In some embodiments, the analytic server 110 may collect user information from external sources (not shown). The analytic server 110 may web crawl various websites (e.g., social networks) and collect the user's relevant data from the various websites. In some other embodiments, the analytic server 110 may receive one or more documents containing the user information from the client computing device 130. For example, the analytic server 110 may render a GUI on the client computing device 130 that allows the user to upload a resume.
[0046]
The analytic server 110 may generate a user profile in the database 120 based on the received user information. The user profile may comprise the user's attributes including the user's basic information, professional background and experience, career goals, and any other data describing the user. The analytic server 110 may determine a skill to be assessed for the user based on the user's career goal, the user's background, the job market, and any other relevant data. Furthermore, the analytic server 110 may determine the initial level of the user's skill based
on the user's background and experience and other user attributes in the user profile. In some embodiments, the analytic server may let the user select the skill to be assessed. The analytic server 110 may display a list of skills for the user to select via the mobile application or web application on the client computing device 130. The analytic server 110 may display the list of skill options in an interactive component (e.g., a menu, a button, a link, or an icon) for the user to select.
[0047]
The analytic server 110 may display a challenge corresponding to the skill to be assessed and the initial skill level on the client computing device 130. Each challenge may include one or more questions for the user to answer. The analytic server 110 may determine the challenge based on the skill and the skill level, and display the one or more questions included in the challenge in a GUI on the client computing device 130. The user operating the client computing device 130 may be able to answer the one or more questions by interacting with the GUI. The GUI may include options for the user to receive hints, ask for assistance, chat with other people, and the like.
[0048] As the user is answering the questions by interacting with the GUI via the mobile or web application in the client computing device 130, the analytic server 110 may receive the user's response input to each of the questions. Furthermore, the analytic server 110 may monitor and track other interaction events corresponding to the user's interaction with the GUI. The interaction events may correspond to the user behaviors. The analytic server 110 may use such data to predict and explore the user's trait, such as the user's personal nature and characteristic, and any other user attributes.

[0049]
The analytic server 110 may determine a score for the user's response inputs to the challenge. The analytic server 110 may compare the user's response inputs to the standard answers in the scoring process. The score may indicate the user's skill level in the assessed skill.
[0050] In addition, the analytic server 110 may execute an artificial intelligence (AI) model to determine the user's skill level (also referred to as skill value) and trait level (also referred to as trait value) based on the response inputs and interaction events. As the user chooses to respond to another challenge, the analytic server 110 may continuously monitor the user's interaction events, and determine and update the user's skill level and trait level. By iteratively updating the user's skill level and trait level based on new challenges and user behaviors, the
analytic server 110 may generate a more accurate assessment of the user. The analytic server 110 may store the updated skill level and trait level in the user profile in a database 120. The analytic server 110 may also generate a GUI on the client computing device 130 to display the user's assessment results of the new skill level and trait level.
[0051]
The database 120 may be any non-transitory machine-readable media configured to store system data including the user data and the challenge data for skill and trait assessment.
For example, the user data in database 120 may comprise the user's basic information including name, age, gender, location (e.g., address), zip code, email address, and the like. The user data may also include the user's professional background and experience, such as the user's education history, work projects, previous work experiences, skill sets, and the like.
The user data may also include the user's preferences, user interaction events, user skill levels in different skills and trait level, and any other information about the user. The challenge data for skill and trait assessment in the database 120 may comprise the challenges associated/tagged with the different skills and the levels of the different skills, scoring parameters including the standard responses for the questions in the challenges and the method for determining a score, hints associated with the challenges, and any other information for scoring and assessment.
[0052]
The database 120 may be part of the analytic server 110. The database 120 may be a separate component in communication with the analytic server 110. The database 120 may have a logical construct of data files, which may be stored in non-transitory machine-readable storage media, such as a hard disk or memory, controlled by software modules of a database program (e.g., SQL), and a database management system that executes the code modules (e.g., SQL scripts) for various data queries and management functions.
[0053]
FIG. 2 illustrates a schematic diagram 200 showing various physical and logical components of the analytic server in the computer system, according to an embodiment. The analytic server has a number of physical and logical components, including a central processing unit ("CPU") 202, random access memory ("RAM") 204, an input/output ("I/O") interface 206, a network interface 208, non-volatile storage 210, and a local bus 212 enabling CPU 202 to communicate with the other components. The RAM 204 provides relatively responsive volatile storage to the CPU 202. The I/O interface 206 allows for input to be received from one or more devices, such as a keyboard, a mouse, etc., and outputs information to output devices, such as a
display and/or speakers. The network interface 208 permits communication with other computing devices, such as the client computing devices, over computer networks such as Internet. The non-volatile storage 210 stores the operating system and programs, including computer-executable instructions for implementing the skill and trait level assessment system.
System data 212 in the database is stored in the non-volatile storage 210.
During operation of the computer system, the operating system, the programs, and the data may be retrieved from the non-volatile storage 210 and placed in RAM 204 to facilitate execution.
[0054]
FIG. 3 illustrates a schematic diagram 300 showing main software components of the analytic server and a client computing device, according to an embodiment.
The server application 304 executing on the analytic server 312 may perform the skill and trait level assessment. The server application 304 may access the system data 308 and provide access to the skill and trait level assessment services as a web service and/or via one or more other APIs. The server application 304 may use an artificial intelligence model 306 to process data received from the client computing device 310.
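As one non-limiting sketch of exposing the service over HTTP, the example below uses Flask; the framework choice, the route paths, and the in-memory stores are assumptions for illustration only, since the disclosure does not name a particular web service interface.
```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory stand-ins for the system data 308.
SKILLS = ["HTML", "CSS", "JavaScript"]
USER_VECTORS = {}  # user id -> {skill: assessed level}


@app.route("/skills")
def list_skills():
    """Return the list of skills a user may select to be assessed in."""
    return jsonify({"skills": SKILLS})


@app.route("/users/<user_id>/skills", methods=["POST"])
def update_skill(user_id):
    """Record an updated skill level produced by the assessment model."""
    body = request.get_json()
    USER_VECTORS.setdefault(user_id, {})[body["skill"]] = body["level"]
    return jsonify(USER_VECTORS[user_id])


if __name__ == "__main__":
    app.run()
```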
[0055]
A user operating the client computing device 310 may access the skill and trait level assessment service via a client application 302. The client application 302 may be a web application or a set of web pages presented in a web browser tab, or another app that can communicate with the server application 304. The user may log into the client application 302 by providing credentials via the client application 302, and then interact with the client application 302 to set up a user profile. The user may select skills from a list maintained by the analytic server 312 for which the user wishes to be assessed, and then respond to challenges selected for presentation by the analytic server 312. The client application 302 may be of different types; for example, the client application 302 may be a web browser or a purpose-built app.
[0056]
The server application 304 executing on the analytic server 312 may assess the user's skill and trait level based on the user's response input and interaction with the client application 302. The response input is the answer, selection, solution, etc.
inputted by a user in response to a challenge. Examples of response input in this implementation include yes or no answers, selection of one (or more, where appropriate) choices, expected output based on a challenge's input, and program code to solve a problem. When the skills are spoken languages, the challenges can be words and/or phrases to translate or respond to, and the response input can
be the translated response words and/or phrases. Further, the response data can be orally provided to assess pronunciation.
[0057]
FIG. 4 illustrates various data structures maintained by the computer system, according to an embodiment. The system data in the database may include a set of skills 402, a set of traits 404, skill assessment data 406 used to assess the level of users in at least some of the skills, and user data 416. Each skill 402 may have an associated skill identifier. Similarly, each trait 404 may have an associated trait identifier.
[0058]
The skill assessment data 406 may include a large set of challenges 408, each of which is tagged with one or more skills 402 to identify associations with certain skills 402.
Challenges 408 can be tagged with multiple skills 402. For example, one challenge can test both a user's HTML (Hypertext Markup Language) and CSS (Cascading Style Sheets) skills. Each challenge 408 represents a set of skill-testing questions or problems to be responded to by users, as well as one or more satisfactory responses or outcomes. The challenges 408 can include finite solutions, such as yes or no responses or multiple choice answers, or can alternatively accept a variety of solutions that satisfactorily address the challenge 408. Each challenge 408 has a unique challenge ID.
[0059]
Each challenge 408 has a level 410 for each skill 402 with which the challenge 408 is tagged. The level 410 for each skill 402 with which the challenge 408 is tagged is based on the difficulty of the challenge 408 with respect to the particular skill 402. For example, when a challenge 408 is tagged with two skills 402, the challenge 408 may require greater knowledge of a first skill and lesser knowledge of a second skill, and thus the level for each skill 402 with which the challenge is tagged may vary. The level 410 for a skill 402 with which a challenge 408 is tagged can be discrete or continuous.
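A minimal sketch of such a tagged challenge record is shown below; the dataclass fields and example values are illustrative assumptions, not taken from the disclosure.
```python
from dataclasses import dataclass, field


@dataclass
class Challenge:
    challenge_id: str
    question: str
    # Level per tagged skill: the same challenge can demand deep knowledge of
    # one skill and only shallow knowledge of another.
    skill_levels: dict = field(default_factory=dict)  # skill id -> level (0-100)
    hints: list = field(default_factory=list)

    @property
    def tagged_skills(self):
        return set(self.skill_levels)


challenge = Challenge(
    challenge_id="ch-001",
    question="Produce the markup for a responsive two-column layout.",
    skill_levels={"HTML": 70, "CSS": 45},  # harder with respect to HTML than CSS
    hints=["Consider how the column widths should behave on narrow screens."],
)
print(challenge.tagged_skills)  # {'HTML', 'CSS'}
```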
[0060]
Further, one or more hints 412 can be provided for each challenge 408.
The hints 412 may be simple text messages, images, audio prompts, etc., or, alternatively, the hints 412 may be interactive.
[0061]
Further, each challenge 408 can have response scoring parameters 414 that determine how the response data is scored. In some circumstances, it may be appropriate to provide either a full score or no score, such as where the challenge expects a yes or no response.
In other circumstances, such as where the response data is mostly correct but has an error, or where the solution provided is less optimal, a partial score may be provided for a partially correct response.
For example, a challenge 408 can ask a user to provide HTML code for generating a particularly formatted web page element. The challenge 408 can be tagged with two skills 402, such as HTML and CSS. One solution may perform the requisite function, but may be less optimal than another known approach using CSS. As a result, the response scoring 414 can indicate that the less optimal solution can have a lower score for both HTML and CSS skills 402 for the challenge 408, and that the more optimal solution can have a higher score for both HTML
and CSS skills 402 for the challenge 408.
[0062]
The user data 416 includes a set of user profiles 418. Each user profile 418 includes basic user data 420, such as the user's name, email address, and location. By default, the location can be determined from the IP address of the client computing device, but can be overwritten by the user. In addition, preferences 422 are stored for each user profile 418. These preferences can include who the user wishes to expose his/her data to, when to be notified, etc.
[0063]
The user data 416 also includes a user interaction event log 424 of some or all of the user's interactions with the client application. The interaction events include, but are not limited to, the challenges 408 (via the challenge IDs) that a user has been presented with, the response input provided, and all of the interactions for how and when the user interacted with the user interface of the client application. This includes the keystrokes a user makes in providing a response, including deletions, touch interactions with the user interface, user interactions with chat functionality of the client application, either to ask for assistance or to provide assistance to others, the requesting of a hint, the losing and gaining of focus of the client application (such as via switching to another application, browser tab, etc.). These entries in the user interaction event log 424 are time-stamped.
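One possible representation of such a time-stamped interaction event log, with illustrative event kinds, is sketched below.
```python
import time
from dataclasses import dataclass, field


@dataclass
class InteractionEvent:
    challenge_id: str
    kind: str        # e.g. "keystroke", "deletion", "hint_request", "chat", "focus_lost"
    detail: str = ""
    timestamp: float = field(default_factory=time.time)  # every entry is time-stamped


event_log = []
event_log.append(InteractionEvent("ch-001", "keystroke", "<div cl"))
event_log.append(InteractionEvent("ch-001", "deletion", "backspace x2"))
event_log.append(InteractionEvent("ch-001", "focus_lost", "switched to another browser tab"))
```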
[0064]
User skill and trait vectors 426 register the user's assessed proficiency levels in the skills 402 and determined levels in the traits 404. As used herein, when "skill level" refers to a user's proficiency level in a skill 402, "skill level" and "proficiency level" may be used interchangeably. These user skill and trait vectors 426 are determined from the user interaction events that are registered in the user interaction event log 424.
[0065]
Additionally, the user skill and trait vectors 426 include a response vector that acts as a counter for the number of challenges tagged with each skill that the user has answered.
Thus, if a user has provided response input to 103 challenges tagged with the JavaScript skill, 42 challenges tagged with the web security skill, and 26 challenges tagged with both the JavaScript and web security skills, the JavaScript and web security skills elements in the response vector will be 129 and 68 respectively. The analytic server may use the response vector to determine a confidence level for the user's skill levels in each skill.
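The counting behavior in this example can be reproduced with a short sketch; representing the response vector as a Counter is an assumption made only for illustration.
```python
from collections import Counter

# Hypothetical history: each entry is the set of skills tagged on one answered challenge.
answered = (
    [{"JavaScript"}] * 103
    + [{"web security"}] * 42
    + [{"JavaScript", "web security"}] * 26
)

response_vector = Counter()
for tags in answered:
    response_vector.update(tags)  # a challenge increments every skill it is tagged with

print(response_vector["JavaScript"])    # 129
print(response_vector["web security"])  # 68
```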
[0066]
Still further, the user skill and trait vectors 426 include weight vectors that represent the amount of user interaction event data that each user skill vector is generated from.
The user skill and trait vectors 426 are keyed to the skill identifiers of the skills 402 and the trait identifiers of the traits 404.
[0067]
A machine learning technique 428 is used by the artificial intelligence model to draw conclusions from the user interaction events. The machine learning technique 428 is generated using training data. Using the machine learning technique 428, the artificial intelligence model processes the user interaction events in order to assess a user's proficiency levels in skills and trait levels; that is, to generate the user skill and trait vectors 426.
[0068]
While, in the illustrated data model, only the top-most challenge 408 is shown being tagged with associated skills 402, and having hints 412 and response scoring parameters 414, and while only the top-most user profile 418 is shown having basic user data 420, preferences 422, an associated user interaction event log 424, and user skill and trait vectors 426, it should be understood that the top-most challenge 408 and user profile 418 are representative of all challenges and user profiles respectively.
[0069]
The user can use either the mobile application (e.g., client application) installed in the client computing device and/or a web application (e.g., comprising a hyperlink of a website) served by the analytic server to create a user profile 418. The user profile 418 includes basic user data 420 such as the user's name, email address, and location.
[0070]
FIG. 5 illustrates a flowchart 500 depicting operational steps for using the skill and trait level assessment service by a user, according to an embodiment. Other embodiments may comprise additional or alternative steps, or may omit some steps altogether.

[0071] At step 510, the user launches the client application. Once the user has registered with the analytic server, the user can then launch the client application on the client computing device that connects to the analytic server. The client application can be, for example, a web page/application retrieved from the analytic server, a custom application, and can even in some cases be a telephony application whereby the user interacts with the analytic server via voice. It will be understood that, if the user has previously set up a user profile, the user may not need to separately launch the client application at this step.
[0072] At step 520, once the client application is executing on the client computing device, the user can select one of a list of skills offered by the analytic server to be assessed. The user can select to be assessed in a skill previously selected by the user, or can select to be assessed in a new skill.
[0073] At step 530, upon receiving a selected skill for assessment, the analytic server begins to select and present challenges to the user via the client application. The analytic server may access the user skill and trait vectors to retrieve all of the user's skill levels for different skills. If the user's proficiency level for the selected skill has not previously been assessed, the user's proficiency level for that skill is set to an initial level.
[0074]
The analytic server may select a challenge based on the user's selected skill and the level for the selected skill. In addition, when a challenge is tagged with other skills, the selection of the challenge may also consider the user's skill levels in the other skills and the level of the challenge in relation to the other tagged skills. In this embodiment, the analytic server selects a range for each skill based on the user skill levels for each skill.
For example, assuming a user has a particular skill level for a skill, the range may be from 80% to 120% of the user's skill level for the skill. The analytic server then locates challenges with levels that fall in this range for the user. Challenges that have been presented to the user within a set time period, such as the last six months, are then eliminated. The analytic server then selects one of the challenges satisfying these criteria.
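A sketch of this selection step is given below; the challenge data, the six-month cutoff of roughly 182 days, and the random tie-breaking are invented for illustration.
```python
import random
from datetime import datetime, timedelta

challenges = [
    {"id": "ch-001", "levels": {"HTML": 48}},
    {"id": "ch-002", "levels": {"HTML": 61}},
    {"id": "ch-003", "levels": {"HTML": 90}},
]
recently_presented = {"ch-002": datetime.now() - timedelta(days=30)}


def select_challenge(skill, user_level):
    """Keep challenges within 80%-120% of the user's level, drop recently seen ones."""
    low, high = 0.8 * user_level, 1.2 * user_level
    cutoff = datetime.now() - timedelta(days=182)  # roughly the last six months
    eligible = [
        c for c in challenges
        if low <= c["levels"].get(skill, -1) <= high
        and recently_presented.get(c["id"], cutoff) <= cutoff
    ]
    return random.choice(eligible) if eligible else None


print(select_challenge("HTML", 55))  # ch-001; ch-002 is in range but was shown recently
```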
[0075] At step 540, the analytic server may present the challenge to the user via the client application. The analytic server may transmit the challenge data via any known approach, such as AJAX calls, etc. In another example, the analytic server may transmit the challenge data as part of a web page. Upon receipt of the challenge data, the client application may present the
challenge to the user. The challenge data may include the hints, if any, for the challenge. The client application may present the challenge on a display of the client computing device.
Alternatively, the client application may present the challenge via audio speakers of or attached to the client computing device.
[0076] At step 550, the analytic server determines if the user wishes to assess skill level in another skill. After the user completes the challenge for the selected skill, the user may indicate that he/she wants to further assess his/her skill level in another skill by selecting another skill from the list of skills. If the user wishes to assess another skill, the process may go to step 530, where the user selects a new skill to be assessed. If the user does not wish to assess another skill, the process goes to step 560.
[0077] At step 560, the user closes the client application, after which the user may elect to recommence assessment of skill levels in the skills by relaunching the client application at step 520.
[0078]
When the client application is active, the client application may register the user interaction events and transmit the interaction events to the analytic server for processing. The analytic server may create the user interaction event log based on such interaction events. As discussed above, these user interaction events may include the keystrokes a user makes in providing a response including deletions, touch interactions with the user interface, user interactions with chat functionality of the client application, either to ask for assistance or to provide assistance to others, the requesting of a hint, the losing and gaining of focus of the client application (such as via switching to another application, browser tab, etc.).
The entries in the user interaction event log are time-stamped.
[0079]
FIG. 6 illustrates a graphical user interface 600 for user profile setup, according to an embodiment. The graphical user interface 600 presented on the client computing device via the client application may enable a user to enter basic information, such as a username, a password, surname and given name, a location, etc. This information is stored in the basic user data in the database.
[0001]
FIG. 7 illustrates a flowchart 700 depicting operational steps for the analytic server to assess skill and trait levels based on a user's response input, according to an
embodiment. Other embodiments may comprise additional or alternative steps, or may omit some steps altogether.
[0080] At step 710, the analytic server may retrieve the user skill level. The analytic server may access the user skill and trait vector to retrieve all of the user's skill levels for different skills. If the user's proficiency level for the selected skill has not previously been assessed, the user's proficiency level for that skill is set to an initial level.
[0081] At step 720, the analytic server may select a challenge based on the user skill level for the selected skill. The analytic server may select a challenge based on the user's selected skill and the level for the selected skill. In addition, when a challenge is tagged with other skills, the selection of the challenge may also consider the user's skill levels in the other skills and the level of the challenge in relation to the other tagged skills.
[0082] At step 730, the analytic server may present the challenge to the user via the client application. The analytic server may transmit the challenge data via any known approach, such as AJAX calls, etc. In another example, the analytic server may transmit the challenge data as part of a web page. Upon receipt of the challenge data, the client application may present the challenge to the user. The challenge data may include the hints, if any, for the challenge. The client application may present the challenge on a display of the client computing device.
Alternatively, the client application may present the challenge via audio speakers of or attached to the client computing device.
[0083] At step 740, the analytic server may receive response input from the user.
Response input is the data received from the user to specifically respond to the challenge presented. The client application provides the user a response interface for inputting response input to the challenge. Typically, the response interface includes a text field in which text can be entered and a "submit" button to transmit the entered response input to the analytic server.
Alternatively, the response interface can include a set of buttons or controls enabling the user to select "true" or "false", "yes" or "no", one or more of a set of choices, etc.
In this case, the response input may be deemed to be received upon selection of one of the choices, or can be registered upon activating a "submit" button or other control. In other implementations, the response interface can include an audio capture device, such as a microphone of the client computing device, enabling the user to provide an oral response. Buttons or other controls can
optionally be used to control audio recording, or algorithms can be used to detect when the user is providing an oral response and when the user has completed his/her response.
[0084] In some embodiments, the analytic server may monitor the user's interactions with the response interface. For example, as the user is entering a response and making edits, the analytic server may register the edits as interaction events for the purpose of adjusting the scoring of the response input.
[0085]
Upon receipt of the response input for the challenge, typically after the user selects to submit the entered input, the client application transmits the response input to the analytic server. In turn, the analytic server registers the response input in the user interaction event log.
[0086] At step 750, the analytic server may score the response input upon receipt of the response input. The analytic server may use the response scoring parameters to determine the score for the response input. The score may be a value in the range from zero to one. In some embodiments, the response input is limited to a binary set of options, such as "yes" or "no", or "true" or "false", or selection of one or more choices. Scoring such response input is straightforward and definite because the response is either right or wrong. In some other embodiments, the process of evaluating the response input can be more complicated. For example, the challenge can ask a user to create a short program to perform a function. In this case, the evaluation of the response input can involve (a) syntax validation, (b) execution, (c) comparison of the output to expected results, and (d) determination of the efficiency of the program. For other skills, the method of scoring the response input can vary.
For example, where the skill is a spoken language, the response input can be scored on its grammatical correctness, intonation, etc. Further, when a challenge is tagged with more than one skill, the response input is scored for each tagged skill.
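For a programming challenge, evaluation steps (a) through (c) could be sketched as follows; the partial-credit values and the omission of the efficiency check (d) are illustrative assumptions.
```python
def score_code_response(source, func_name, cases):
    """Return a score in [0, 1] for a submitted function definition."""
    try:
        compile(source, "<response>", "exec")     # (a) syntax validation
    except SyntaxError:
        return 0.0
    namespace = {}
    try:
        exec(source, namespace)                   # (b) execution of the submission
        func = namespace[func_name]
        passed = sum(1 for args, expected in cases if func(*args) == expected)
    except Exception:
        return 0.1                                # ran but failed: small partial credit
    return 0.1 + 0.9 * passed / len(cases)        # (c) comparison of output to expected results


submission = "def add(a, b):\n    return a + b\n"
print(score_code_response(submission, "add", [((1, 2), 3), ((5, 5), 10)]))  # 1.0
```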
[0087] At step 760, the analytic server may update the skill level based on the response input score. Upon receiving and scoring the revised response input, the analytic server may update the skill level(s) corresponding to the challenge. In order to determine how the user skill levels are to be updated, the analytic server determines an expected response input score based on the user's current skill levels in the user skill and trait vectors and the level for each tagged skill for the challenge. A user with a skill level that is higher than the level of the selected
challenge for a particular skill can be expected to do better than a user with a user skill level that is lower than the level of the selected challenge for the particular skill. An example of a function to determine the expected response input score is:
expected response input score = 60% + (user skill level - level of the challenge) / 50,
where the user skill level and the level of the challenge are determined on a scale from 0 to 100.
[0088]
The response input score is then compared to the expected response input score to determine how the user skill level for the particular skill is to be adjusted.
In the illustrated embodiment, the user skill level is adjusted as follows:
skill level = skill level + (response input score - expected response input score) x 10
[0089]
The analytic server registers the updated skill levels in the user skill and trait vectors. In addition, the analytic server updates the response vector in the user skill and trait vectors for the skills tagged for the challenge associated for which the response was received.
Further, the analytic server registers the challenge and the received response input in the user interaction event log.
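Transcribed directly into code, the example functions of paragraphs [0087] and [0088] might read as follows; expressing scores in the range [0, 1] and clamping levels to the 0-100 scale are added assumptions.
```python
def expected_score(user_level, challenge_level):
    # 60% + (user skill level - level of the challenge) / 50, levels on a 0-100 scale
    return 0.60 + (user_level - challenge_level) / 50


def updated_skill_level(user_level, challenge_level, response_score):
    # skill level = skill level + (response input score - expected response input score) x 10
    new_level = user_level + (response_score - expected_score(user_level, challenge_level)) * 10
    return max(0.0, min(100.0, new_level))  # clamping is an added assumption


# A user at level 50 fully solves a level-60 challenge: expected score 0.40, gain of 6 points.
print(updated_skill_level(50, 60, 1.0))  # 56.0
```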
[0090] In some embodiments, the skill and trait level assessment system may enable the user to revise his/her response. For example, after the user has inputted a number of lines of programming code, the user may select to review and revise his/her response input, such as after consulting with others or reviewing a resource. The client application may present an interactive component (e.g., a button) to enable the user to remain on the same challenge and revise his/her response input.
[0091] At step 770, the analytic server may determine if the user selects to revise their response input. If the user requests to revise the response input, the process goes to step 730. In other embodiments, the analytic server may update the user's skill level for the revised response input. The analytic server may register the user's additional interaction, such as the user's desire to improve the skill. If the user does not request to revise the response input, the process goes to step 780.
[0092] At step 780, the analytic server determines if the user wishes to respond to another challenge. If the user indicates that he/she is interested in another challenge, the process goes to step 720 where the analytic server may select another challenge for the user and present
the new challenge to the user via the client application. The challenge selected is based on the updated user skill level. Thus, if a user continues to respond well or poorly to the challenge, the challenge presented to the user will increase or decrease in difficulty. If, instead, the user indicates that he/she does not wish to attempt another challenge, the assessment of the user's skill level in the selected skill ends. In this manner, the user can decide how little or how much time he/she wishes to spend having his/her skill level assessed.
[0093]
FIG. 8 illustrates a graphical user interface 800 for presenting a challenge with simple multiple choice response options, according to an embodiment. The GUI
800 may include a prompt 802 explaining the challenge, as well as a set of response options 804. In this case, only a single response option may be selected by a user as response input, but for other challenges, more than one response option can be selected. A response submission button 806 enables a user to submit the response input. Additionally, a pause button 808 enables pausing of the challenge.
A chat window, hints, etc. can be accessed via a menu button 810.
[0094]
FIG. 9 illustrates a graphical user interface 900 for presenting a challenge with complex multiple choice response options, according to an embodiment. In some scenarios, such as where the screen size available to the client application is limited, as in this illustrated example, it can be beneficial to selectively expose information-rich response options. The GUI 900 includes a prompt 902 explaining the challenge, as well as a set of drop-down response options 904. Each drop-down response option 904 has a disclosure control 906. When the user interacts
with the disclosure control 906, the analytic server may display the GUI in FIG. 10 to show the content of the drop-down response option 904.
[0095]
FIG. 10 illustrates a graphical user interface 1000 displaying content of a response option, according to an embodiment. Upon activating the disclosure control 1002 of a response option, the content 1004 of the response option is exposed. While, in this example, the content 1004 of the response option includes a set of programming code, it will be understood that the contents 1004 of the response options can be any combination of text, images, audio signals, visual effects, etc.
[0002]
FIG. 11 illustrates a graphical user interface 1100 upon selection of one of the multiple choice response options, according to an embodiment. Upon receiving the selection of
one of the response options, the analytic server may display a response submission button 1102 on the GUI 1100 to enable the user to submit the response input.
[0003]
FIG. 12 illustrates a flowchart 1200 depicting operational steps for updating user skill and trait levels, according to an embodiment. Other embodiments may comprise additional or alternative steps, or may omit some steps altogether.
[0096] At step 1210, the analytic server may receive the user interaction events. The analytic server may pass the interaction events as inputs to the artificial intelligence model.
[0097] At step 1220, the analytic server may retrieve the current user skill and trait vectors for the user and use the vectors as input of the artificial intelligence model.
[0098] At step 1230, the analytic server may update the user skill and trait vectors.
The analytic server may execute the artificial intelligence model to determine the user's new skill and trait levels and further update the skill and trait vectors with the new skill and trait levels. The analytic server may utilize machine learning techniques in the artificial intelligence model to determine the trait levels based on the behavior traits of the user interaction events.
[0099]
For example, if a user requests assistance from other users via a chat function of the client application, the artificial intelligence model can update the trait level for teamwork for the user. If the user accesses resources available through the client application after being presented with a challenge but before providing response input, the artificial intelligence model can update the skill level of the user for the skills tagged for the challenge, as well as a trait level for resourcefulness. If the client application has lost focus, such as by a user's switching to another browser tab or to another application, the artificial intelligence model may update the trait level and/or the skill level to reflect a possible access to external resources by the user.
These updated skill and trait levels are written to the user skill and trait vectors.
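By way of a non-limiting illustration, the following Python sketch shows one way such an update step could be realized. The event type names ("chat_assistance_requested", "resource_accessed", "focus_lost"), the numeric weights, and the dictionary-based vector layout are assumptions introduced only for this example and are not defined by the present disclosure.

    # Illustrative sketch only: event names, weights, and vector layout are assumptions.
    from typing import Dict, List

    def update_vectors(skills: Dict[str, float],
                       traits: Dict[str, float],
                       events: List[dict],
                       challenge_skills: List[str]) -> None:
        """Apply heuristic adjustments to the skill and trait vectors from interaction events."""
        for event in events:
            kind = event.get("type")
            if kind == "chat_assistance_requested":
                # Requesting help from other users nudges the teamwork trait upward.
                traits["teamwork"] = traits.get("teamwork", 0.0) + 0.05
            elif kind == "resource_accessed":
                # Consulting in-app resources suggests resourcefulness but slightly
                # discounts the skill levels tagged for the challenge.
                traits["resourcefulness"] = traits.get("resourcefulness", 0.0) + 0.05
                for skill in challenge_skills:
                    skills[skill] = skills.get(skill, 0.0) - 0.02
            elif kind == "focus_lost":
                # Switching tabs or applications may indicate access to external resources.
                for skill in challenge_skills:
                    skills[skill] = skills.get(skill, 0.0) - 0.05

    skills, traits = {"JavaScript": 5.0}, {}
    update_vectors(skills, traits, [{"type": "chat_assistance_requested"}], ["JavaScript"])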
[00100]
The client application is continuously transmitting user interaction events to the analytic server for processing. The skill and trait level assessment service provided by the analytic server may enable users to build an assessment of their skill and trait levels that may be exposed to potential employers.
[00101]
FIGS. 13A-13B illustrate a graphical user interface 1300 showing a user skill summary, according to an embodiment. The GUI 1300 shows the user's assessed skill levels, as
well as a progress graph to give an indication of the user's progress. In addition, the skill summary GUI 1300 presents a confidence level 1302 generated for the user's skill levels by the analytic server. The confidence level is at least partially determined using the response vector in the user skill and trait vectors. As the user responds to more challenges, the analytic server becomes more confident in the skill levels that it has calculated for the user.
[00102] In the current implementation, a unitary confidence level is determined for a user based on the total number of responses to challenges and the user interaction events. If, for example, the user interaction events in the user interaction events log suggest that a user takes a relatively long time to provide a response, or switches to other applications or browser tabs during the process of responding to a challenge, the analytic server may provide a lower confidence level in the user skill levels determined from the response input. If the user provides response input more quickly or without switching applications/browser tabs, the analytic server may have a higher confidence level for the assessed user skill level.
[00103]
The analytic server may draw various other inferences from the user interaction events that cause the confidence level to be adjusted. In order for the unitary confidence level to be applied by the analytic server to a particular skill for a user, the user must have answered a minimum threshold number of challenges tagged with that particular skill. In other embodiments, a separate confidence level may be determined for each skill at least partially based on the user interaction events registered during the response to a challenge tagged with that skill.
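As an illustration only, a unitary confidence calculation of this kind might resemble the following Python sketch; the response-count scale, the penalty weights, and the minimum threshold of five tagged challenges are arbitrary assumptions rather than values prescribed by this disclosure.

    # Illustrative sketch: thresholds and penalty weights are assumptions.
    def unitary_confidence(num_responses: int,
                           avg_response_seconds: float,
                           focus_loss_count: int) -> float:
        """Confidence grows with the number of responses and shrinks with suspect behavior."""
        base = min(1.0, num_responses / 50.0)              # more responses -> more confidence
        slow_penalty = 0.1 if avg_response_seconds > 300 else 0.0
        focus_penalty = min(0.3, 0.05 * focus_loss_count)
        return max(0.0, base - slow_penalty - focus_penalty)

    def confidence_for_skill(confidence: float, responses_for_skill: int, min_threshold: int = 5):
        """The unitary confidence applies to a skill only after enough tagged challenges were answered."""
        return confidence if responses_for_skill >= min_threshold else None  # None -> indeterminate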
[00104]
Users can select the types of positions that they are interested in being notified of.
The types of positions can be specified by various criteria, including geographic location, size of the employer, the compensation level, particular companies, etc. Employers interested in hiring workers can post positions, specifying what skill proficiency and trait levels are desired for one or more positions. When submitted positions match a user's criteria, the user is notified via the position notifications. The user may then elect to apply or not apply for the posted position. If the user elects to apply for a position, his/her skill proficiency and trait levels can be exposed to the employer.
[00105]
Unlike the conventional hiring process, the skill proficiency and trait levels assessed for a user can be deemed a more credible assessment of the skills of the user than a resume or even the testimony of a personal reference. Users need not expose or even provide their gender, age, or employment history. Not only can the user's current proficiency levels in various skills be exposed to the employer, but also various aspects of their progression stored in the user interaction event log can be exposed. Based on such information, the employer is able to determine if the user has made rapid progress in their proficiency level, is readily distracted, etc.
Thus, the systems and methods described herein provide a more holistic skill-based and not role-based perspective of the users.
[00106]
Further, because much of the subjectivity is removed, the cost of assessing whether a user possesses the appropriate skills is significantly reduced for the employer. Because of the objectivity of the skills proficiency level assessment provided by the systems and methods described herein, the user proficiency level assessments can be used for other purposes, such as being provided as evidence to immigration officials that the user possesses needed skills. The systems and methods described herein can also aid users in identifying areas for improvement.
Employers can also use the skill and trait level assessment system in order to locate candidates for work.
[00107]
FIG. 14 illustrates a graphical user interface 1400 of a dashboard for an employer, according to an embodiment. The employer dashboard GUI 1400 may indicate the number of connection requests made for users, the number of users that have accepted such requests, and the number of candidates that have declined. The employer dashboard GUI 1400 may also indicate the number of user profiles that the employer has chosen to follow, the number of new profiles that match search criteria that the employer has specified, and the number of saved candidate searches. The employer dashboard GUI 1400 also enables an employer to specify criteria for a new candidate search via a new search button 1402.
[00108]
FIGS. 15A-15B illustrate a graphical user interface 1500 for an employer to specify search criteria, according to an embodiment. When an employer activates a new search for candidates, the analytic server may display the GUI 1500 to enable the employer to specify the desired skills and skill levels, how closely candidates have to match the specified criteria to be returned in the results, the confidence level desired for the users' skill levels, the location of the role, the type of role, remuneration details, and governmental work eligibility requirements.
[00109]
FIG. 16 illustrates a graphical user interface 1600 displaying a list of candidates matching specified search criteria, according to an embodiment. The candidate search result GUI
1600 may include a list of candidates, the percent match for each candidate to the desired skill
levels, and the confidence level for the user's skill levels. If a user has not answered the minimum threshold number of challenges for a requisite skill specified in the search criteria, the user may be shown as having an indeterminate confidence level as insufficient data has been collected and thus may appear lower in the search results. This can be conveyed to the searcher via the search results interface. In other embodiments, the confidence level for the user may be reduced to reflect the at least partially insufficient data.
[00110]
The search results do not include personally-identifying information, enabling the candidates to maintain anonymity. The employer can elect to view more details for a candidate via a corresponding candidate view button 1602, and can select to follow the progress of the candidate via a follow button 1604.
[00111]
FIGS. 17A-17B illustrate a graphical user interface 1700 for summarizing a selected candidate's skill and trait levels, according to an embodiment. The analytic server may present the candidate summary GUI 1700 to the employer when the employer selects to view a candidate from the list of candidates matching the employer's search criteria.
The candidate summary GUI 1700 shows how the candidate ranks relative to other candidates, the candidate's location, whether the candidate is willing to relocate, the candidate's skill levels, the amount of time the candidate spent answering challenges, a graph of technical skill progression, and a behavioral overview graph illustrating some key traits of the candidate.
A connect button 1702 enables the employer to send a request to the candidate to open a discussion.
When the candidate receives the request, the candidate can decide whether to expose more information, including contact details, name, etc., or reject the request to maintain his/her anonymity.
[00112]
FIG. 18 illustrates a graphical user interface 1800 showing a region summary for users, according to an embodiment. The GUI 1800 shows a region summary for JavaScript developers that can be viewed by an employer. The region summary GUI 1800 graphically presents an overview of the current and projected JavaScript developer markets by region.
[00113]
FIG. 19 illustrates a flowchart depicting operational steps for assessing skill and trait levels, according to an embodiment. Other embodiments may comprise additional or alternative steps, or may omit some steps altogether.
[00114]
At step 1902, the analytic server may receive user information and generate a user profile based on the received user information. The analytic server may display a GUI
comprising a set of input fields configured to receive user inputs on user information. The GUI
may allow the user to input user information, such as the user's basic information including name, age, gender, address, zip code, email address, and the like. The user information may also include the user's professional background and experience, such as the user's education history, work projects, previous work experiences, skill sets, and the like. The analytic server may receive the user information via the mobile application (e.g., client application) installed in the client computing device and/or a web application (e.g., comprising a hyperlink of a website).
[00115]
The analytic server may generate a user profile for each user based on the user information received from the client computing device. The user profile may comprise the user's attributes including the user's basic information, professional background and experience, career goals, and any other data describing the user. The analytic server may create a user account including a username and a password for the user. The analytic server may also store the user's profile into a database.
[00116]
The analytic server may receive the user information from the client computing device based on the user's input. Alternatively, the analytic server may retrieve the user information by web crawling social networks from external data sources. The analytic server may access external data sources to collect the user data by using web crawling or other data mining algorithms. For example, the analytic server may visit various websites (e.g., social networks) by going through a list of Uniform Resource Locator (URL) web addresses and collect the user's relevant data from the various websites. In some other embodiments, the analytic server may request the user to upload one or more documents, parse the documents to extract the user information, and generate the user profile based on the extracted user information. For example, the analytic server may request the user to upload a resume. The analytic server may receive the document (e.g., resume) uploaded by the client computing device and generate the user profile based on the user information extracted from the document (e.g., resume).
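For illustration, a simple keyword-matching pass over an uploaded resume could resemble the following Python sketch. The skill vocabulary and the plain-text matching approach are assumptions standing in for whatever parsing and data mining the analytic server actually performs.

    # Illustrative sketch: the skill vocabulary and plain-text extraction are assumptions.
    import re

    KNOWN_SKILLS = {"python", "javascript", "html", "css", "sql", "java"}

    def extract_skills_from_resume(resume_text: str) -> set:
        """Return the skills mentioned in the resume text, matched against a known vocabulary."""
        tokens = set(re.findall(r"[A-Za-z+#]+", resume_text.lower()))
        return tokens & KNOWN_SKILLS

    profile_skills = extract_skills_from_resume("Experienced developer: Python, SQL, and HTML/CSS.")
    # -> {"python", "sql", "html", "css"}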
[00117] At step 1904, the analytic server may determine a skill to be assessed for the user and an initial skill level for the skill. The analytic server may display a GUI comprising an interactive component with a list of skills configured to receive a selection of a skill from the list of skills. The analytic server may display the list of skills for the user to select via the mobile application or web application. The list of skills may be skill options offered by the analytic
server for assessment. The analytic server may display the list of skill options in an interactive component (e.g., a menu, a button, a link, or an icon) for the user to select.
[00118] In some embodiments, the list of skills may be based on the user profile. For example, if the user is a software engineer, the list of skills may include a plurality of programming languages. In some other embodiments, the list of skills may be based on the user's request. The user may be able to enter the skill sets the user is interested in. For example, even though a user is an accountant, the user may want to make a transition and request to be assessed for programming languages in computer science. Upon the user interacting with the interactive component, the analytic server may receive the user's selected skill for assessment.
[00119] In some embodiments, the analytic server may receive a selection of the skill to be assessed from a list of skills displayed on a GUI. In some other embodiments, instead of letting the user determine the skill to be assessed, the analytic server may determine the required skill to be assessed for the user. For example, the analytic server may determine the required skill for a user based on the user's background and career goals in the user profile and job market data. If a user is looking for a certain kind of job, the analytic server may determine the required skill based on the job requirement. The analytic server may collect historical data for other users, historical data for the job market, and historical data for accepted candidates for different jobs. The analytic server may train an AI model based on the historical data. The analytic server may execute the AI model to determine the required skills for a user based on the user's background and career goals in the user profile and the job market data. For example, if a user wants to become a software engineer, the analytic server may determine the programming languages to be assessed for the user based on the job requirements for such positions.
Furthermore, the analytic server may determine the initial level of the user's skill based on the user's background and experience and other user attributes in the user profile. For example, a new graduate who majored in computer science may have a lower level in programming languages than a software engineer with two years of experience.
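For illustration, an initial-level heuristic along these lines might look like the following Python sketch; the 1-10 scale and the weights given to a related degree and to years of experience are assumptions made for the example.

    # Illustrative sketch: the mapping from profile attributes to an initial level is an assumption.
    def initial_skill_level(years_of_experience: float, related_degree: bool) -> int:
        """Rough starting level on a 1-10 scale, to be refined by actual challenge results."""
        level = 1
        if related_degree:
            level += 2                                   # e.g., a computer science degree
        level += min(5, int(years_of_experience))        # cap the experience contribution
        return min(level, 10)

    print(initial_skill_level(0, True))   # new graduate -> 3
    print(initial_skill_level(2, True))   # engineer with two years of experience -> 5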
[00120] At step 1906, the analytic server may display a challenge corresponding to the skill to be assessed and the initial skill level. Upon receiving a selection of the skill for assessment, the analytic server may determine a challenge corresponding to the skill and the initial skill level. Each challenge may include one or more questions for the user to answer. Each
challenge may be tagged with one or more skills and the skill level for each of the one or more skills. For example, one challenge may test both HTML and CSS skills. The level of each skill may be based on the difficulty of the challenge. For example, the challenge may require greater knowledge of HTML and less knowledge of CSS. As a result, the challenge may be tagged with a higher level for the HTML skill than for the CSS skill.
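For illustration, a challenge record tagged with per-skill levels might be represented as in the following Python sketch; the field names and the 1-10 level scale are assumptions.

    # Illustrative sketch: field names and the level scale are assumptions.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Challenge:
        challenge_id: str
        questions: List[str]
        skill_levels: Dict[str, int] = field(default_factory=dict)  # skill tag -> tagged level

    # A challenge testing both HTML and CSS, weighted more heavily toward HTML.
    challenge = Challenge(
        challenge_id="web-basics-07",
        questions=["Which tag defines a table row?", "Which property sets the font size?"],
        skill_levels={"HTML": 6, "CSS": 3},
    )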
[00121]
The analytic server may display the one or more questions included in the challenge in a GUI. For example, the analytic server may display a prompt comprising a string for each question and one or more input elements configured to receive response inputs to the question. The user may be able to answer the one or more questions by interacting with the prompt in the GUI. The prompt may provide the questions, multiple-choice answers, and other functions in the GUI. Furthermore, the GUI may include options for the user to receive hints, ask for assistance, chat with other people, and the like.
[00122] At step 1908, the analytic server may receive response inputs and continuously monitor and track interaction events as the user answers the questions included in the challenge.
As the user is answering the questions by interacting with the prompt via the mobile application or web application in the client computing device, the analytic server may receive the user's response inputs to each of the questions. Furthermore, the analytic server may monitor and track other interaction events corresponding to the user's interaction with the prompt. For example, the analytic server may track all interactions indicating how and when the user answers each question by interacting with the prompt in the GUI of the client application.
[00123]
The interaction events may include response inputs entered into the one or more input elements of the prompt (e.g., the answers to the one or more questions included in the challenge) and a time period in which the user enters the response inputs (e.g., the time spent on answering each question after the question is presented). The interaction events may further comprise interactions with other electronic applications executing on the client computing device, including whether the user requests a hint, the user's interaction with chat functions, the user's messaging events, whether the user asks for assistance or provides assistance to others, and the losing and gaining of focus of the client application. The interaction events may also comprise changes of the response inputs, whether the user selects the right answer on the first try, whether the user changes and deletes the answers multiple times to reach a final decision, and the like.
The interaction events may also comprise a timestamp of each user interaction with the prompt.
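For illustration, an interaction event logged by the client application might be recorded as in the following Python sketch; the event schema and the event type names are assumptions.

    # Illustrative sketch: the event schema is an assumption.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class InteractionEvent:
        event_type: str      # e.g., "response_input", "hint_requested", "focus_lost"
        challenge_id: str
        payload: dict        # e.g., the selected option or a chat message identifier
        timestamp: datetime

    def log_event(log: list, event_type: str, challenge_id: str, payload: dict) -> None:
        """Append a timestamped interaction event to the client-side log for transmission."""
        log.append(InteractionEvent(event_type, challenge_id, payload,
                                    datetime.now(timezone.utc)))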
[00124]
The analytic server may continuously monitor and track the user's interaction events over time. The interaction events may correspond to the user behaviors.
The analytic server may use such data to predict and explore the user's traits, such as the user's personal nature and characteristics, and any other user attributes.
[00125] At step 1910, the analytic server may execute an artificial intelligence model to determine an adjustment from the initial skill level to a second skill level for the assessed skill and a trait level based on the response inputs and interaction events. The analytic server may determine a score for the user's response inputs to the challenge. The analytic server may compare the user's response inputs to the standard answers in the scoring process. The score may indicate the user's skill level in the assessed skill.
[00126] In addition, the analytic server may train an AI model based on historical users and their respective skill levels, trait levels, and interaction events. The user behaviors corresponding to the interaction events may include meaningful information for evaluating the user's skill levels and trait levels. For example, a user spending a long time to figure out the right answer to a question may be less familiar with the involved knowledge. In another example, a user frequently providing assistance to other people may be cooperative. The AI model may explore the correlation between the user's interaction events (e.g., user behaviors) and the user's skill and trait levels based on the historical data.
[00127]
The analytic server may train the AI model using predictive modeling or machine learning techniques, including, but not limited to, neural networks (NNs), support vector machines (SVMs), decision trees, k-nearest neighbors, linear and logistic regression, clustering, association rules, and scorecards.
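As one non-limiting illustration of applying such techniques, the following Python sketch fits a logistic regression (via scikit-learn) to a handful of hand-made historical responses; the chosen features, the labels, and the toy data are assumptions and do not reflect any actual training data or model used by the analytic server.

    # Illustrative sketch: feature choice, labels, and data are assumptions; scikit-learn's
    # logistic regression is one of the techniques listed above.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Features per historical response: [correct, seconds_to_answer, answer_changes, hints_used]
    X = np.array([
        [1,  45, 0, 0],
        [1, 300, 3, 1],
        [0, 120, 2, 0],
        [1,  60, 0, 0],
        [0, 400, 4, 2],
        [1,  90, 1, 0],
    ])
    # Label: whether the skill level was adjusted upward (1) or not (0) for that response.
    y = np.array([1, 0, 0, 1, 0, 1])

    model = LogisticRegression().fit(X, y)
    new_response = np.array([[1, 50, 0, 0]])
    print(model.predict_proba(new_response)[0, 1])  # probability of an upward adjustment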
[00128]
The analytic server may execute the AI model based on the monitored interaction events (e.g., user behaviors) to determine the user's second skill level and trait level. By exploring the correlation between the user's interaction events (e.g., user behaviors) and the user's skill and trait levels, the analytic server may determine the user's second skill level based on the score of the challenge and the user's interaction events. The analytic server may also
determine the user's trait level, such as the user's personal nature and characteristics based on the interaction events corresponding to the user behaviors.
[00129] At step 1912, the analytic server may update the user profile with the newly assessed skill level (e.g., second skill level) and trait level. For example, the analytic server may set the initial skill level as the second skill level and update the user profile with the newly derived skill and trait levels based on the user's interactions with the challenge.
[00130]
The analytic server may display another challenge by proceeding to step 1906 and repeat the process. As the user responds to a new challenge, the analytic server may continuously monitor the user's interaction events, iteratively determine an adjustment from the second skill level to a new skill level and a new trait level, and iteratively update the user profile with the user's new skill level and new trait level derived based on the user's interaction events with the new challenge. By iteratively updating the user's skill level and trait level based on new challenges and user behaviors, the analytic server may generate a more accurate and objective assessment of the user and generate a more comprehensive user profile regarding different user skills and traits over time.
[00131]
For example, as the user's skill level and trait level update, the analytic server may provide new challenges with new questions corresponding to the user's updated skill level and trait level. The analytic server may iteratively monitor the user interaction events as the user responds to the new challenges and update the user's skill level and trait level based on the new interaction events. The analytic server may repeat performing this process to provide a more accurate and objective assessment of the user's skill and trait levels. In some embodiments, the user may select to respond to new challenges in different skill sets with different skill levels. The analytic server may display new challenges based on the user's selection.
[00132] In some embodiments, the analytic server may continuously monitor the user profile by periodically parsing the user profile and identifying modifications to the user profile. The modifications to the user profile may include any new data received from user inputs and/or extracted from external data sources. For example, the user may add some new skills to the user profile or user resume. The user may add new work experience to the user profile. The user may receive new endorsements for certain skills. When the analytic server identifies a modification to the user profile, the analytic server may update the skill level and the trait level associated with the modification. Specifically, the analytic server may generate a new prompt to display a new set of questions associated with the modification to the user profile. For example, if the user adds Python as a new skill, the analytic server may generate a new prompt to display a new set of questions related to Python. The analytic server may continuously monitor a new set of interaction events corresponding to the user interactions with the new prompt.
The analytic server may execute the artificial intelligence model to determine a new skill level and a new trait level based on the new set of interaction events. The analytic server may update the user profile with the new skill level and the new trait level.
[00133]
The analytic server may store the updated skill level and trait level in the user profile in the database. The analytic server may generate a GUI to display the user's new skill level and trait level. The analytic server may also use the updated skill level and trait level for job recommendation and matching. For example, the analytic server may receive a request from an employer electronic device operated by an employer looking for candidates.
The request may comprise criteria for the required skill sets and the required skill levels and trait levels. The analytic server may search the user profile of each user and identify the users matching the criteria. The analytic server may display a list of users comprising the users matching the criteria on the employer electronic device. The analytic server may also display the detailed information regarding each user in the GUI displayed on the employer electronic device.
For example, the GUI may include the user identifier, the skill level for each required skill, the confidence level for the skill level, and the like. In some embodiments, the analytic server may determine a matching score based on the user profile and the criteria. The analytic server may display the list of matching users based on the ranking of the matching scores.
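For illustration, a matching score of this kind could be computed and used for ranking as in the following Python sketch; the averaging formula, the 1-10 level scale, and the example data are assumptions.

    # Illustrative sketch: the scoring formula and example levels are assumptions.
    def matching_score(required: dict, assessed: dict) -> float:
        """Average, over required skills, of the fraction of the required level the candidate meets."""
        if not required:
            return 0.0
        total = 0.0
        for skill, required_level in required.items():
            total += min(assessed.get(skill, 0) / required_level, 1.0)
        return total / len(required)

    required = {"Python": 7, "SQL": 5}
    candidates = {"user_1": {"Python": 8, "SQL": 4}, "user_2": {"Python": 5, "SQL": 6}}
    ranked = sorted(candidates, key=lambda u: matching_score(required, candidates[u]), reverse=True)
    # ranked == ["user_1", "user_2"]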
[00134]
The analytic server may also determine a confidence level (also referred to as confidence value) at least partially based on the received user responses and the monitored interaction events (e.g., user behavior). The confidence level may correspond to the extent of confidence of the analytic server with regard to the assessed skill level. The confidence value may be the AUC (area under curve) value of a ROC (receiver operating characteristic) curve. As the user responds to more challenges, the analytic server may become more confident in the skill level assessed for the user. The analytic server may determine the confidence value for the skill level based on a number of responses to the challenge provided by the user.
The number of responses satisfies a minimum threshold. The user may need to respond to a minimum number (e.g., a threshold) of questions for a certain skill assessment to receive a confidence level for that skill level. In some embodiments, the analytic server may determine the confidence level based on the total number of responses to the challenges provided by the user. In some other embodiments, the analytic server may determine the confidence level based on the user behaviors during the user interactions with the GUI. If a user takes a relatively long time to provide a response, switches answers among multiple choices, or switches to browser tabs or other applications during the process of responding to a challenge, the analytic server may determine the user's skill level with a lower confidence level.
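For illustration, the following Python sketch computes such an AUC value with scikit-learn; the correctness labels and the predicted scores are placeholder data only.

    # Illustrative sketch: labels and scores are placeholders; roc_auc_score is one way
    # to compute the AUC of a ROC curve referred to above.
    from sklearn.metrics import roc_auc_score

    # Whether each of the user's responses was actually correct ...
    y_true = [1, 0, 1, 1, 0, 1, 0, 1]
    # ... and the model's predicted probability of a correct response for each one.
    y_score = [0.9, 0.3, 0.8, 0.7, 0.4, 0.85, 0.2, 0.6]

    confidence_value = roc_auc_score(y_true, y_score)  # 1.0 here; 0.5 would indicate chance level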
[00135]
FIG. 20 illustrates a graphical user interface 2000 for summarizing challenge results, according to an embodiment. The challenge results shown in the GUI
2000 may include the skills tested in the challenge 2002, the challenge duration 2004, the percentile ranking 2006 and the accuracy 2008. The GUI 2000 may also include a button 2010 for a new challenge that enables the user to start another challenge.
[00136]
FIG. 21 illustrates a graphical user interface 2100 displaying skill and trait level assessment results for a user, according to an embodiment. The GUI 2100 may include the user identifier 2102, the summary of the user's completed challenges 2104, such as the number of challenges the user has completed, the accuracy and ranking of the user. The GUI 2100 may also include the skill levels of different skills 2106 based on the assessment results. The GUI 2100 may further include a graphical indicator 2108 that illustrates the user's progress over time.
[00137]
The foregoing method descriptions and the process flow diagrams are provided merely as illustrative examples and are not intended to require or imply that the steps of the various embodiments must be performed in the order presented. As will be appreciated by one of skill in the art, the steps in the foregoing embodiments may be performed in any order. Words such as "then," "next," etc. are not intended to limit the order of the steps;
these words are simply used to guide the reader through the description of the methods.
Although process flow diagrams may describe the operations as a sequential process, many of the operations may be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process may correspond to a method, a function, a procedure, a subroutine, a
subprogram, etc. When a process corresponds to a function, its termination may correspond to a return of the function to the calling function or the main function.
[00138]
The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed here may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
[00139]
Embodiments implemented in computer software may be implemented in software, firmware, middleware, microcode, hardware description languages, or any combination thereof. A code segment or machine-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
[00140]
The actual software code or specialized control hardware used to implement these systems and methods is not limiting of the invention. Thus, the operation and behavior of the systems and methods were described without reference to the specific software code, it being understood that software and control hardware can be designed to implement the systems and methods based on the description here.
[00141]
When implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable or processor-readable storage medium. The steps of a method or algorithm disclosed here may be embodied in a processor-executable software module, which may reside on a computer-readable or processor-readable
storage medium. A non-transitory computer-readable or processor-readable medium includes both computer storage media and tangible storage media that facilitate transfer of a computer program from one place to another. A non-transitory processor-readable storage medium may be any available medium that may be accessed by a computer. By way of example, and not limitation, such non-transitory processor-readable media may comprise RAM, ROM, EEPROM, CD-ROM
or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible storage medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer or processor. Disk and disc, as used here, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory processor-readable medium and/or computer-readable medium, which may be incorporated into a computer program product.
[00142]
When implemented in hardware, the functionality may be implemented within circuitry of a wireless signal processing circuit that may be suitable for use in a wireless receiver or mobile device. Such a wireless signal processing circuit may include circuits for accomplishing the signal measuring and calculating steps described in the various embodiments.
[00143]
The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A
general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A
processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with
a DSP core, or any other such configuration. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
[00144]
Any reference to claim elements in the singular, for example, using the articles "a," "an" or "the," is not to be construed as limiting the element to the singular.
[00145]
The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.

Claims (20)

What is claimed is:
1. A method comprising:
generating, by a server, a prompt displaying one or more questions corresponding to a selected skill and an initial skill level of the selected skill, the prompt comprising a string for each question and one or more input elements configured to receive response inputs to the question;
continuously monitoring, by the server, interaction events corresponding to user interactions with the prompt, wherein the interaction events comprise the response inputs entered into the one or more input elements, a time period in which a user enters the response inputs, changes of the response inputs, interactions with other electronic applications executing on the user computing device, and a timestamp of each user interaction with the prompt;
executing, by the server, an artificial intelligence model to determine an adjustment from the initial skill level to a second skill level for the selected skill and a trait level based on the monitored interaction events, the artificial intelligence model being trained based on historical users and their respective skill levels, trait levels, and interaction events;
updating, by the server, a user profile with the second skill level and the trait level;
continuously monitoring, by the server, the user profile;
when the server identifies a modification to the user profile, generating, by the server, a second prompt to display a second set of questions associated with the modification to the user profile;
continuously monitoring, by the server, a second set of interaction events corresponding to user interactions with the second prompt;
executing, by the server, the artificial intelligence model to determine an adjustment from the second skill level to a new skill level and a new trait level based on the second set of interaction events; and updating, by the server, the user profile with the new skill level and the new trait level.
2. The method of claim 1, further comprising:
displaying, by the server on a user computing device, a graphical user interface comprising an interactive component with a list of skills; and receiving, by the server, the selected skill from the list of skills.
3. The method of claim 1, further comprising:
receiving, by the server, a document uploaded from a user computing device;
and determining, by the server, the initial skill level of the selected skill based on user information extracted from the document by parsing the document.
4. The method of claim 1, further comprising:
retrieving, by the server, user information by web crawling social networks from external data sources; and determining, by the server, the initial skill level of the selected skill based on the user information retrieved from the external data sources.
5. The method of claim 1, further comprising:
determining, by the server, a confidence value for the second skill level based on the interaction events.
6. The method of claim 1, further comprising:
determining, by the server, a confidence value for the second skill level based on a number of responses to the one or more questions, the number of responses satisfying a minimum threshold.
7. The method of claim 1, further comprising:
receiving, by the server, a request from an employer electronic device, the request comprising a criteria for a set of required skills and required skill levels and trait levels;
searching, by the server, the user profile of each user to identify users matching the criteria; and displaying, by the server, a list of users comprising the users matching the criteria on the employer electronic device.
8. The method of claim 1, wherein the interaction events comprise messaging events.
9. The method of claim 1, wherein the interaction events comprise requests for assistance.
10. The method of claim 1, wherein the one or more questions are tagged with one or more skills and the skill level for each of the one or more skills.
11. A system comprising:
a user computing device; and a server in communication with the user computing device and configured to:
generate a prompt on the user computing device displaying one or more questions corresponding to a selected skill and an initial skill level of the selected skill, the prompt comprising a string for each question and one or more input elements configured to receive response inputs to the question;
continuously monitor interaction events corresponding to user interactions with the prompt, wherein the interaction events comprise the response inputs entered into the one or more input elements, a time period in which a user enters the response inputs, changes of the response inputs, interactions with other electronic applications executing on the user computing device, and a timestamp of each user interaction with the prompt;
execute an artificial intelligence model to determine an adjustment from the initial skill level to a second skill level for the selected skill and a trait level based on the monitored interaction events, the artificial intelligence model being trained based on historical users and their respective skill levels, trait levels, and interaction events;
update a user profile with the second skill level and the trait level;
continuously monitor the user profile;
when the server identifies a modification to the user profile, generate a second prompt to display a second set of questions associated with the modification to the user profile;
continuously monitor a second set of interaction events corresponding to user interactions with the second prompt;

execute the artificial intelligence model to determine an adjustment from the second skill level to a new skill level and a new trait level based on the second set of interaction events; and update the user profile with the new skill level and the new trait level.
12. The system of claim 11, wherein the server is further configured to:
display, on the user computing device, a graphical user interface comprising an interactive component with a list of skills; and receive the selected skill from the list of skills.
13. The system of claim 11, wherein the server is further configured to:
receive a document uploaded from the user computing device; and determine the initial skill level of the selected skill based on user information extracted from the document by parsing the document.
14. The system of claim 11, wherein the server is further configured to:
retrieve user information by web crawling social networks from external data sources;
and determine the initial skill level of the selected skill based on the user information retrieved from the external data sources.
15. The system of claim 11, wherein the server is further configured to:
determine a confidence value for the second skill level based on the interaction events.
16. The system of claim 11, wherein the server is further configured to:
determine a confidence value for the second skill level based on a number of responses to the one or more questions, the number of responses satisfying a minimum threshold.
17. The system of claim 11, wherein the server is further configured to:
receive a request from an employer electronic device, the request comprising a criteria for a set of required skills and required skill levels and trait levels;
search the user profile of each user to identify users matching the criteria;
and display a list of users comprising the users matching the criteria on the employer electronic device.
18. The system of claim 11, wherein the interaction events comprise messaging events.
19. The system of claim 11, wherein the interaction events comprise requests for assistance.
20. The system of claim 11, wherein the one or more questions are tagged with one or more skills and the skill level for each of the one or more skills.
CA3072385A 2019-02-13 2020-02-13 Systems and methods for assessing skill and trait levels Abandoned CA3072385A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US16/274,980 US20200258045A1 (en) 2019-02-13 2019-02-13 System and method for assessing skill and trait levels
US16/274,980 2019-02-13
US201916597619A 2019-10-09 2019-10-09
US16/597,619 2019-10-09

Publications (1)

Publication Number Publication Date
CA3072385A1 true CA3072385A1 (en) 2020-08-13

Family

ID=72039906

Family Applications (1)

Application Number Title Priority Date Filing Date
CA3072385A Abandoned CA3072385A1 (en) 2019-02-13 2020-02-13 Systems and methods for assessing skill and trait levels

Country Status (1)

Country Link
CA (1) CA3072385A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112990702A (en) * 2021-03-12 2021-06-18 深圳工盟科技有限公司 Construction team matching method, device and equipment based on construction task and storage medium
US20230281564A1 (en) * 2022-03-03 2023-09-07 Hireteammate, Inc. System and method for managing data in a platform


Similar Documents

Publication Publication Date Title
US20200258045A1 (en) System and method for assessing skill and trait levels
US9262746B2 (en) Prescription of electronic resources based on observational assessments
US9575616B2 (en) Educator effectiveness
US8930398B1 (en) System and method for improving a resume according to a job description
US10223442B2 (en) Prioritizing survey text responses
US20110306028A1 (en) Educational decision support system and associated methods
US11586656B2 (en) Opportunity network system for providing career insights by determining potential next positions and a degree of match to a potential next position
US20170344927A1 (en) Skill proficiency system
US10438500B2 (en) Job profile integration into talent management systems
Barba et al. Web analytics reveal user behavior: TTU Libraries’ experience with Google Analytics
US20190189020A1 (en) Arrangements for delivery of a tailored educational experience
US11797938B2 (en) Prediction of psychometric attributes relevant for job positions
US20160026347A1 (en) Method, system and device for aggregating data to provide a display in a user interface
Buzick et al. Personalizing large‐scale assessment in practice
US20220198951A1 (en) Performance analytics engine for group responses
US12518237B2 (en) Whole self portfolio
US11868374B2 (en) User degree matching algorithm
CA3072385A1 (en) Systems and methods for assessing skill and trait levels
McAvinue et al. Comparative evaluation of Large Language Models using key metrics and emerging tools
CA3214128A1 (en) Model-based candidate screening and evaluation tools
Gummer et al. Learning effects in coders and their implications for managing content analyses
US11120404B2 (en) Method and system for dynamic data collection while optimize a smart device
US20200020242A1 (en) Student Assessment and Reporting
Capps et al. Example‐based reasoning and fact‐weighting guidance in accounting standards
US20200401279A1 (en) User interface for providing machine-learned reviewer recommendations

Legal Events

Date Code Title Description
FZDE Discontinued

Effective date: 20220803
