WO2014205421A1 - Automated detection of insider threats - Google Patents


Info

Publication number
WO2014205421A1
WO2014205421A1
Authority
WO
WIPO (PCT)
Prior art keywords
usage
subject
analysis
behavioral biometric
anomalous behavior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2014/043528
Other languages
French (fr)
Inventor
Joseph S. VALACICH
Jeffrey L. JENKINS
John Howie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Arizona
Original Assignee
University of Arizona
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Arizona filed Critical University of Arizona
Publication of WO2014205421A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/554Detecting local intrusion or implementing counter-measures involving event detection and direct action

Definitions

  • the present invention relates to a system and a method of detecting or determining anomalous behavior in a subject by using the subject's behavioral biometric and the subject's behavioral system usage activity pattern.
  • Insider threats - trusted adversaries who operate within an organization's boundaries - are a significant danger to both private and public sectors, and are often cited as the greatest threat to an organization.
  • Insider threats include disgruntled employees or ex-employees, contractors, business partners, or auditors.
  • the damage caused by an insider threat can take many forms, including workplace violence; the introduction of malware into corporate networks; the theft of information, corporate secrets or money; the corruption or deletion of data; and so on.
  • the identification of insider threats is an extremely difficult, expensive, error-prone, and time-consuming task. This difficulty is heightened in very large organizations. For instance, identifying a small number of potential insider threats within an organization with thousands of employees is a literal "needle in the haystack" problem. According to a recent survey, it takes companies on average 416 days to contain an insider attack.
  • the system and method provide an automated detection of insider threats called AuDIT (Automated Detection of Insider Threats).
  • AuDIT combines behavioral biometric data (e.g., mouse and keyboard activities) and the subject's behavioral system (or system resource) usage activity pattern (e.g., what programs the person uses, and when) to identify anomalous behavior (e.g., a potential insider threat).
  • AuDIT can aid in preventing and/or identifying insider threats.
  • the invention includes behavioral biometric data (i.e., input device usage characteristics, e.g., mouse and keyboard activities) and behavioral system usage activity patterns to identify anomalous behavior (i.e., a potential insider threat).
  • the terms "behavioral system usage activity" and "behavioral system resource usage activity" are used interchangeably herein and refer to the subject's access and/or utilization patterns of various system resources including, but not limited to, software module usage patterns (e.g., software programs a person normally uses on a computer, the programs installed on a computer a person does not use, etc.), network usage patterns (e.g., where a person usually visits on a network, the resources a person usually uses on a network, etc.), and hardware resource usage (e.g., the devices a person usually uses), as well as any other system resource usage patterns.
  • One particular aspect of the invention provides a system and a method for detecting anomalous behavior by a subject.
  • the system of the invention comprises a behavioral biometric-based analysis system and a system usage analysis system that is coupled to the behavioral biometric-based analysis system.
  • the behavioral biometric-based analysis system monitors how the user or the subject uses input device(s).
  • the behavioral biometric-based analysis system comprises:
  • a data interception unit configured to intercept input from a subject, wherein the data interception unit is configured to passively collect an input device usage characteristic
  • a behavior comparison unit operatively coupled to said behavior analysis unit (i.e., behavioral system usage analysis unit).
  • the behavioral biometric-based analysis system dynamically monitors and passively collects behavioral biometric information of the subject, and translates said behavioral biometric information into representative data, stores and compares results, and outputs a behavioral biometric-based analysis result associated with anomalous behavior by the subject.
  • system resource usage analysis system that is operatively coupled to said behavioral biometric-based analysis system, said system resource usage analysis system comprising:
  • the system resource usage analysis system dynamically monitors and passively collects system usage characteristics of the subject, and translates said system resource usage characteristics into representative data, stores and compares results, and outputs a system usage result associated with anomalous behavior by the subject.
  • the behavioral biometric-based analysis system is configured to transmit the behavioral biometric-based analysis result to said system resource usage analysis system when the behavioral biometric-based analysis result indicates anomalous behavior by the subject.
  • said system resource usage analysis system is configured to determine anomalous behavior by the subject only upon receiving the behavioral biometric-based analysis result from said behavioral biometric-based analysis system indicating anomalous behavior by the subject.
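The gating described in the two configuration bullets above can be sketched as follows. The names, the score field, and the boolean fusion rule are illustrative assumptions, not the patent's specified implementation:

```python
from dataclasses import dataclass

@dataclass
class BiometricResult:
    anomalous: bool   # verdict of the behavioral biometric-based analysis system
    score: float      # magnitude of deviation from the subject's baseline

def fuse(biometric: BiometricResult, usage_anomalous: bool) -> bool:
    """The system-usage stage renders a verdict only when the biometric
    stage has flagged the subject; otherwise no threat is reported."""
    if not biometric.anomalous:
        return False  # biometric result not transmitted; usage stage is not consulted
    return usage_anomalous
```

This mirrors the claim that the system usage analysis determines anomalous behavior only upon receiving an anomalous biometric-based analysis result.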
  • said input device comprises a mouse, a touch pad, a touch screen, a stylus, a microphone, a track ball, a pointer device, a remote camera, a joy stick, or a combination thereof.
  • said data interception unit is configured to passively collect usage characteristics comprising how and the way (i.e., method) the subject uses a plurality of input devices.
  • said data interception unit is configured to passively collect usage characteristics comprising subject's method of selecting a given function.
  • said data interception unit is further configured to collect system usage information.
  • the system usage information comprises a current process running on the system, a start time of a given application, a run time of a given application, an end time of a given application, or a combination thereof.
  • said system is suitably configured for real-time monitoring.
  • Another aspect of the invention provides a method for determining anomalous behavior by a subject. Such a method typically comprises:
  • the step of determining anomalous behavior based on the system resource usage is conducted only when said step (a) indicates anomalous behavior by the subject.
  • said step (b) comprises analyzing network usage patterns, software usage activities, hardware resource usage, or a combination thereof.
  • Figure 1 is a schematic illustration of one particular embodiment of a self-adaptive automatic insider threat detection framework that includes continuous monitoring, analysis, and policy response.
  • Figure 2 is a screenshot of an example question in negative-emotion condition.
  • Figure 3 shows the Self-Assessment Manikin (SAM) scales.
  • the present inventors' research on deception has established that humans have uncontrolled physiological changes (e.g., a rush of adrenalin) that can be detected as observable behavioral changes when committing actions known to be immoral, criminal, or unethical.
  • we have detected such behavioral changes with a variety of sensor technologies and approaches including vocalic, linguistic, kinesics, mouse movement, keyboard typing patterns, and so on.
  • One aspect of the system of the invention is based on the discovery that heightened emotion and stress, significantly deviating from established norms for a particular person, can be detected through subtle behavioral changes captured by anomalous input device (e.g., keyboard and mouse) usage patterns when using a particular technology resource (e.g., network usage, software usage, hardware resource) in illegitimate ways.
  • the behavioral biometric- based analysis system is designed to develop baselines for each user by longitudinally collecting and analyzing input device usage characteristics (e.g., keystroke and mouse movement patterns) for specific critical technology resources (e.g., network resources, software resources, hardware resources) over an initial training period.
  • the baseline (i.e., the user's own reference input device usage characteristics) is updated and refined as a person's behavior and usage evolve due to natural changes in job roles, projects, work environment, deployment of new critical resources, and so on.
  • the system of the invention detects when anomalous patterns emerge.
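One minimal way to realize such baseline-and-deviation detection is a per-feature mean/standard-deviation model with a three-sigma threshold; both the model and the threshold are assumptions for illustration, not mandated by the specification:

```python
import statistics

def build_baseline(samples):
    """Summarize a longitudinally collected input-device feature
    (e.g., keystroke dwell times in ms) gathered over a training period."""
    return {"mean": statistics.mean(samples), "std": statistics.stdev(samples)}

def is_anomalous(value, baseline, k=3.0):
    """Flag a new observation deviating more than k standard deviations
    from the user's own established norm."""
    return abs(value - baseline["mean"]) > k * baseline["std"]
```

A dwell time near the training mean passes quietly, while a gross departure from the user's norm is flagged for further analysis.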
  • the system of the invention provides an innovative paradigm for detecting anomalous behaviors that could be indications of insider threats (e.g., heightened emotional state due to copying and pasting activity within a document that should not be shared or modified).
  • all information associated with human and system activities is contextually and continuously monitored by the system of the invention, triggering a real-time, policy-driven alert when anomalies diagnostic of insider threat occur (e.g., notifying management, restricting system access, etc.).
  • the term "contextually monitored” refers to analyzing the input device usage characteristics of a user for a given application or task.
  • input device usage characteristics refers to how and the way (or method of) a particular application or program or task is used by the user.
  • “how" a user uses an input device can be objectively measured and includes, without limitation, the duration of keyboard press, pressure applied to a keyboard when typing, keyboard stroke speed, the dwell time (the length of time a key is held down) and transition time (the time to move from one key to another), rollover time (the time that overlaps between two key presses) for keyboard actions, etc.
  • the term "way" of using an input device refers to how a particular function is selected, e.g., whether a keyboard or a mouse is used for a particular task; whether the user uses a numeric keypad or the number keys across the top of the keyboard, etc.
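The keyboard timing measures defined above (dwell, transition, and rollover times) could be computed from raw key events as in this sketch; the event format and the press-to-press definition of transition time are assumptions:

```python
def keystroke_features(events):
    """Compute dwell, transition, and rollover times (ms) from
    (key, press_ms, release_ms) events, ordered by press time.

    dwell      = time a key is held down (release - press)
    transition = time from one key press to the next (press-to-press)
    rollover   = overlap between one key's hold and the next key's press
                 (0 when the first key is released before the second is pressed)
    """
    dwell = [release - press for _, press, release in events]
    transition, rollover = [], []
    for (_, p1, r1), (_, p2, _) in zip(events, events[1:]):
        transition.append(p2 - p1)
        rollover.append(max(0, r1 - p2))
    return {"dwell": dwell, "transition": transition, "rollover": rollover}
```

For instance, typing "th" with overlapping holds produces a positive rollover for that pair, while a pause between keys yields a rollover of zero.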
  • a list of exemplary behavior biometrics that can be measured and used to detect anomalous behavior is provided in Table 1.
  • Table 1. Examples of behavior biometrics that can be used to detect an insider threat
  • Idle Time: a change in time greater than 200 ms with no cursor movement is counted as idle time
  • Hover Region: the amount of time a person hovers over a region
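The two Table 1 measures might be derived from timestamped cursor samples as follows; the sample format and the rectangular region model are illustrative assumptions:

```python
IDLE_GAP_MS = 200  # per Table 1: a gap > 200 ms with no movement counts as idle time

def idle_time(samples):
    """Total idle time from (timestamp_ms, x, y) cursor samples."""
    idle = 0
    for (t1, x1, y1), (t2, x2, y2) in zip(samples, samples[1:]):
        if (x1, y1) == (x2, y2) and t2 - t1 > IDLE_GAP_MS:
            idle += t2 - t1
    return idle

def hover_time(samples, region):
    """Time the cursor spends inside a rectangular region (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = region
    total = 0
    for (t1, x, y), (t2, _, _) in zip(samples, samples[1:]):
        if x0 <= x <= x1 and y0 <= y <= y1:
            total += t2 - t1
    return total
```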
  • Previous insider threat systems have focused on either detecting anomalies in system resource utilization (e.g., operating system usage, memory, I/O calls, file access monitoring, etc.) or employing lengthy human behavioral analysis techniques (e.g., the polygraph examination) that are often invasive, expensive, and time consuming.
  • the system of the invention integrates the human behavioral monitoring via non-invasive input device (e.g., mouse and keyboard) usage patterns and the subject's behavioral system usage activity patterns for detecting anomalies that could be triggered by insider threats.
  • the system of the invention scales easily and can be deployed throughout an organization with any number of employees.
  • the system of the invention utilizes information from behavioral-based input device usage characteristics and behavioral system usage activity patterns to achieve unprecedented behavioral analysis capabilities that lead to high detection rates of insider threats with very low false alarms.
  • System (i.e., system resource) usage characteristics or baselines for each individual are developed for specific critical resources (including network resources, software resources, and hardware resources).
  • the baseline is developed for each individual by analyzing software module usage patterns (e.g., software programs a person normally uses on a computer, the programs installed on a computer a person does not use, etc.), network usage patterns (e.g., where a person usually visits on a network, the resources a person usually uses on a network, etc.), hardware resource usage (e.g., the devices a person usually uses), and so on.
  • baselines for each individual. For example, one can have baselines depending on the time of day, e.g., morning, afternoon and/or evening baselines, and/or one can have baselines depending on the day of the week, e.g., Monday, Tuesday, Wednesday, Thursday, Friday, and/or weekend(s) of each week, or simply weekday baselines and weekend baselines, etc.
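A sketch of how such context-dependent baselines might be keyed, assuming a simple day-type/time-of-day partition (the exact cut-off hours are assumptions):

```python
from datetime import datetime

def baseline_key(ts: datetime):
    """Key baselines by day type and time of day, so that, e.g., Monday-morning
    activity is compared against a weekday-morning baseline rather than a
    weekend-evening one."""
    day = "weekend" if ts.weekday() >= 5 else "weekday"
    if ts.hour < 12:
        period = "morning"
    elif ts.hour < 18:
        period = "afternoon"
    else:
        period = "evening"
    return (day, period)
```

A monitoring component would then store and look up per-user statistics in a dictionary indexed by `baseline_key(now)`.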
  • the system of the invention is capable of real-time monitoring of individuals' use of computing resources using a client-server model.
  • the system continuously monitors input device (e.g., mouse and keyboard) usage activity, paired with technical system usage information (e.g., the application that the individual is using at the time and the way in which the system is being used).
  • Monitoring data is sampled and sent to the system server for analysis either in real time as the samples are taken or in a batch mode that allows the system to continue to function even when the user is not connected to the corporate network.
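The client-side sampling with offline batch fallback could be sketched as below; the class and callback names are hypothetical:

```python
from collections import deque

class MonitorClient:
    """Buffer sampled monitoring data on the client; flush to the server when
    connected, otherwise hold samples for a later batch upload."""

    def __init__(self, send):
        self.send = send          # callable that transmits one batch to the server
        self.buffer = deque()
        self.connected = False

    def record(self, sample):
        self.buffer.append(sample)
        if self.connected:
            self.flush()          # real-time mode: forward as samples are taken

    def flush(self):
        batch = list(self.buffer)
        self.buffer.clear()
        if batch:
            self.send(batch)      # batch mode catches up once reconnected
```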
  • an alert is routed to a policy-driven management dashboard.
  • the appropriate actions according to the organization policies can be taken promptly; for example, a follow up online survey can be invoked to get further clarification from the user about the anomalous behavior; an alert is sent to the appropriate unit for deeper inspection and analysis; and so on.
  • The monitoring and analysis performed by the system of the invention has minimal impact on system performance, and thus does not interfere with daily operations; the system is designed to be easily deployed using group policy management.
  • the system of the invention also provides a customizable and secure management dashboard, which can be configured to display and highlight anomalous behavioral indicators for review. Because the system of the invention provides continuous real-time anomaly detection, it utilizes event policies to respond to various anomalous events (e.g., notification of specific individuals, access restrictions, etc.). Using this dashboard, designated individuals can have the ability to review the indicators of users with anomalous behaviors, drilling down to specific activities and even screenshots if desired. In some embodiments, to better understand the potential severity and ramification of an event, system of the invention provides a variety of visualization and analysis tools. For instance, the dashboard provides a suspect's social network by analyzing email and communication logs, potentially identifying abnormal relationships that span organizational units.
  • This data fusion within the system of the invention identifies not only suspicious individuals, but can assist in revealing networks of conspirators. For example, if multiple individuals are exhibiting abnormal behavior indicative of insider threat, and share the same supervisor, this could suggest that the supervisor should also be investigated as a possible threat. Or, as another example, if an individual is discovered as a possible insider threat, an analysis of whom else might be an accomplice can be performed by examining communication logs.
  • the system of the invention reduces the likelihood of Type 1 errors (false positive detection of an individual as an insider threat) through analysis of organizational information available in directories (e.g., Active Directory and OpenLDAP) and HR systems (e.g. PeopleSoft) to identify manager-employee hierarchy, shared and individual calendars, cost center information, and task lists to identify causes of stress arising from project deadlines, tense meetings where others are showing stress, and so on.
  • the analysis system of the invention also supports deep analysis of usage patterns of various system resources using existing system log analysis tools. By identifying potential insider threats, the system facilitates targeted analysis and mining of system logs generated by operating systems and various system applications (e.g., proxy servers, database logs, Intranet server logs, etc.).
  • the analysis system of the invention also can be configured to interface with leading Security Incident and Event Management (SIEM) systems, to tune them to monitor system usage by individuals flagged as potential insider threats.
  • AuDIT Framework: One embodiment of the invention is shown in Figure 1.
  • the system continuously captures behavioral biometric data (input device usage data) and behavioral system usage activity data.
  • the system uses this signature to compare against future behavioral biometric data and system usage activity data to identify anomalies (e.g., an emotional reaction while using the resource in an abnormal way).
  • a policy-driven response is generated.
  • the baseline of the user is continually updated. In some instances, as more baselines are gathered, the latest baseline data are weighted more heavily than the previous baselines. In this manner, the baseline of each user is compared against more recent activities. Accordingly, the system and method of the invention include evolving baselines for users.
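One common way to weight recent baseline data more heavily is an exponentially weighted update; the smoothing factor below is an assumption, not a value from the specification:

```python
def update_baseline(old_mean, new_value, alpha=0.3):
    """Exponentially weighted baseline update: the most recent observation
    contributes a fixed fraction (alpha), so older data decays geometrically
    and the baseline evolves with the user's behavior."""
    return (1 - alpha) * old_mean + alpha * new_value
```

With `alpha=0.5`, a baseline of 100 ms and a new observation of 200 ms yield an updated baseline of 150 ms, half-way toward the recent behavior.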
  • the anomaly detection system of the invention can be applied to detect and analyze anomalous behavior.
  • the detection system is applied to analyze and detect anomalous behavior in human-computer interaction when accessing critical organizational resources (network resources, software resources, and hardware resources).
  • This detection approach is based on supervised learning and cumulative anomaly techniques.
  • a system training period can be used to develop baseline behavioral and usage pattern for critical resources for each individual within an organization.
  • the detection system of the invention considers the patterns of behavior and use over a period of time, for a broad range of resources, instead of considering only one isolated behavior independent of the particular system being used. Research indicates that usage patterns may differ across different applications— thus, in some embodiments independent baselines are established for each individual for each critical resource to accurately identify anomalies.
  • the detection system of the invention is the first system to perform continuous, real-time monitoring and anomaly analysis of human behavioral sensing via input device (e.g., mouse and keyboard) usage patterns and behavioral system usage activity patterns to identify a possible insider threat. It has many advantages over alternative methods, as those methods are often reactive, time consuming, and manually intensive (research has shown that it takes an average of 416 days to identify and contain an insider attack), invasive, require employee cooperation, and may be easily fooled by a cunning individual.
  • the detection system of the invention overcomes these obstacles in at least the following ways: (i) A data capture process runs continuously in the background of the computer system during normal workday activities, while analysis takes place on a separate and secure remote system; (ii) Behavioral sensing data (i.e., keyboard and mouse usage) is gathered in a continuous and unobtrusive manner with no adverse effect to the user; (iii) Users need not be aware of the data collection that is taking place; (iv) If desired, raw keystroke data could be collected in such a way that the content of the message is unknown until suspicion is raised (e.g., by using an encryption key unique to the system administrator), at which point messages may be reconstructed.
  • the system's behavioral analysis approach is language agnostic (i.e., the detection methodology will work with English, Spanish, Arabic, etc.) because it relies on longitudinal usage patterns rather than message content;
  • the system is not easily fooled, as the heightened emotions that would trigger an anomalous event typically manifest themselves as subtle differences in typing or mouse movement behavior that occur within 20 to 100 milliseconds.
  • the focus of the detection system of the invention is primarily to confirm the individuals that are operating within a given acceptable range of behavior and system usage. Regardless, however, it provides a powerful tool for
  • Study 1 induced negative emotion (i.e., negative emotional valence) through validated manipulation, and explored its influence on mouse cursor speed and distance in a highly controlled experiment.
  • Study 2 extended the results of Study 1 to focus more broadly on emotional valence (positive, neutral and negative) and arousal (low and high).
  • Study 3 extended these results to predict frustration— a common negative emotion in online interactions— in a realistic online e-commerce scenario.
  • Study 1 was a laboratory experiment with a previously validated negative emotion manipulation. The experiment had a single factor with two conditions: a) a negative-emotion (i.e., negative emotional valence) condition and b) a baseline condition. After the emotion manipulation, participants immediately completed another task that was identical for both conditions while mouse cursor movement data were collected. Data were analyzed to determine whether participants in the negative-emotion group displayed slower cursor speed and greater cursor distance than participants in the baseline group on an otherwise identical task.
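The two dependent measures in Study 1, cursor distance and cursor speed, can be computed from timestamped cursor coordinates; the sample format is an assumption:

```python
import math

def cursor_distance_and_speed(samples):
    """Total cursor distance (px) and average speed (px/s) from
    (timestamp_ms, x, y) cursor samples."""
    dist = sum(math.hypot(x2 - x1, y2 - y1)
               for (_, x1, y1), (_, x2, y2) in zip(samples, samples[1:]))
    elapsed_s = (samples[-1][0] - samples[0][0]) / 1000.0
    return dist, dist / elapsed_s if elapsed_s else 0.0
```

Study 1's hypothesis then amounts to comparing these two numbers between the negative-emotion and baseline groups on an otherwise identical task.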
  • Procedure and Manipulation: After the participants consented to participate in the study, each participant was randomly assigned to the negative-emotion or the baseline (i.e., non-negative-valence) condition. After completing the condition, all participants completed an identical follow-up task, during which mouse cursor movements were measured.
  • Negative-Emotion Condition: In the negative-emotion condition, emotional valence was manipulated through a previously developed and validated intelligence test designed to be unfair. The instructions of the intelligence test explained that the test was timed, and that the score would be computed based on how many questions the participants answered correctly within the allotted time. While the participants read the instructions, the clock began to count in the upper right hand corner of the screen. Before loading the first question, three messages were shown for eight seconds each explaining that the question was being loaded. During this time, the timer incremented. The messages were, in order, "Still loading the first question... Please be patient"; "Loading the first question... Please wait"; and "Question loaded. Processing first question."
  • the question was shown (e.g., Figure 2), giving participants 15 seconds to answer the question before the page automatically advanced.
  • the question was difficult, requiring longer than 15 seconds to answer.
  • the cycle repeated; the three 'loading' messages were shown again for eight seconds each, while the timer advanced.
  • the participants were told that their time had expired. They were also given the feedback that they had only been given two out of three questions because they had taken too long to answer. Participants were then told that because of their slow response time and incorrect answers, their score indicated a lower intelligence level than that of most people who had taken the test.
  • Baseline (non-negative-emotion) Condition: As in the negative-emotion condition above, the instructions explained to participants in the baseline condition that they would take an intelligence test. The delivery of the questions in this condition was similar to the negative-emotion condition, except that the negative-emotion-inducing mechanisms were not implemented (i.e., the test was not timed; the three questions could easily be answered; and, at the conclusion, the system congratulated participants for answering the questions correctly). Thus, negative emotion would likely not be induced in the task.
  • Participants were recruited through Amazon.com's Mechanical Turk (MTurk).
  • Table 3 shows the summary statistics of the two conditions for cursor distance and speed.
  • Study 2: The purpose of Study 2 was to extend the findings of Study 1 to a broader set of emotions. As previously discussed, emotions can be generally categorized on a scale from high to low arousal and positive to negative valence. These dimensions allow for classification of most emotions. In this experiment, both valence (positive vs. neutral vs. negative valence) and arousal (high vs. low arousal) were manipulated in a 2 x 3 factorial design. See Table 4. Doing so allowed determination as to whether emotional valence and arousal interact to influence mouse cursor movements, and how the influence of negatively valenced emotions on mouse cursor movements compares to the effects of neutral and positively valenced emotions.
  • The International Affect Picture System (IAPS) is a collection of pictures that is widely used in experiments to elicit a range of emotions. Consistent with the present inventors' theoretical development, the emotions induced by the images were shown to trigger physiological responses related to the behavioral inhibition system, even to the extent of influencing users' mouse cursor movements. Each picture in the library has a rating of pleasure (valence) and arousal invoked during viewing.
  • Procedures: After the participants had consented to participate in this study, they were presented with the following instructions: "Thank you for participating in this study. After you click the next button, you will see a picture for 10 seconds. After ten seconds, the page will automatically advance and ask you how you feel. After answering the questions, you will be directed to a computer store website. Your task is to navigate the website and pretend to purchase the following product: J. Crew Abingdon Laptop Bag for a 17 inch laptop. Please write down these details so you remember what product to find. After clicking on purchase, you will be guided to a survey."
  • the website was designed such that there was only one obvious link on each page that would lead the user closer to goal attainment (finding the correct laptop bag).
  • On the home webpage, users had to click on the "shop laptop bags" link (the only link on the page relevant to finding laptop bags) in the left sidebar to advance to the second webpage.
  • On the second webpage, users had to click on the "J. Crew Abingdon Laptop Bag" that accompanied the picture shown earlier to get to the next webpage.
  • users had to select their laptop's "screen size” and push "submit” to arrive at the last webpage.
  • users could review the product and click "purchase.” After clicking on purchase, the task was complete and the website led participants to a post survey.
  • Condition 5 was significantly different from Condition 1 (p < 0.05), Condition 2 (p < 0.05), Condition 3 (p < 0.05), and Condition 4 (p < 0.05).
  • Condition 6 was significantly different from Condition 1 (p < 0.01), Condition 2 (p < 0.01), Condition 3 (p < 0.01), and Condition 4 (p < 0.01).
  • Study 3 had two main purposes. First, Study 3 manipulated negative emotion immediately before Pages 2, 3, and 4 instead of before Page 1, and used the same website as Study 2. Doing so helped explore whether the significant results (Page 1) and nonsignificant results (Pages 2-4) in Study 2 were due to the design of the webpages or the timing and longevity of the negative emotion manipulations (as previously suggested). Second, Study 3 explored whether the tracking and analysis of mouse cursor movements can be used to infer negative emotion. In a realistic e-commerce scenario, negative emotion was manipulated in a 1 x 2 factorial experiment design, and the condition each user received was then predicted based on mouse cursor speed and distance. The prediction accuracy rate is calculated and reported herein.
  • Study 3 replicated the website and general task of Study 2. Similar to the previous study, participants were asked to navigate a website to purchase a product. Upon agreeing, participants were directed to a webpage that presented them with the following instructions: "After clicking next, you will be directed to a computer store website. Your task is to navigate the website and pretend to purchase the following product: J. Crew Abingdon Laptop Bag for a 17 inch laptop. Please write down these details so you remember what product to find. After clicking on purchase, you will be guided to a survey. "
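The patent excerpt does not specify the prediction model used in Study 3; as one hedged illustration, a nearest-centroid rule over (speed, distance) features could assign a user to a condition:

```python
def nearest_centroid(features, centroids):
    """Predict the condition whose centroid of (mean speed, mean distance)
    is closest to this user's observed features.

    features:  tuple (speed, distance) for one user
    centroids: dict mapping condition label -> (mean speed, mean distance)
    """
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(features, centroids[label]))
```

The centroid values in the usage example below are made up for illustration; in the study, they would be estimated from the training portion of the collected cursor data.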
  • Download delay has been shown to induce frustration and negative valence, for example, previous studies have found that small download delays can have profound impacts on users' intentions and attitudes— decreases in performance and behavioral intentions begin to flatten when the delays extend to 4 seconds, and attitudes flatten when the delays extend to 8 seconds or longer.
  • The screen was dimmed, all links were disabled, and three successive messages were shown. First, a message saying "Please wait while the next page loads" was displayed for eight seconds. Afterwards, a second message saying "Page still loading. Please be patient." was shown for another eight seconds. Finally, a message was shown saying "Error loading page. Please try again." Such error messages have previously been shown to increase frustration.

Abstract

The present invention relates to a system and a method of detecting or determining anomalous behavior in a subject using the subject's behavioral biometric and the subject's behavioral system usage activity pattern.

Description

AUTOMATED DETECTION OF INSIDER THREATS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the priority benefit of U.S. Provisional Application
No. 61/838,149, filed June 21, 2013, which is incorporated herein by reference in its entirety.
FIELD OF THE INVENTION
[0002] The present invention relates to a system and a method of detecting or determining anomalous behavior in a subject by using the subject's behavioral biometric and the subject's behavioral system usage activity pattern.
BACKGROUND OF THE INVENTION
[0003] Insider threats - trusted adversaries who operate within an organization's boundaries - are a significant danger to both private and public sectors, and are often cited as the greatest threat to an organization. Insider threats include disgruntled employees or ex-employees, contractors, business partners, or auditors. The damage caused by an insider threat can take many forms, including workplace violence; the introduction of malware into corporate networks; the theft of information, corporate secrets, or money; the corruption or deletion of data; and so on. Identifying insider threats, however, is an extremely difficult, expensive, error-prone, and time-consuming task, and the challenge is heightened in very large organizations. For instance, identifying a small number of potential insider threats within an organization with thousands of employees is a veritable "needle in a haystack" problem. According to a recent survey, it takes companies on average 416 days to contain an insider attack.
[0004] Conventional insider threat systems focus on either detecting anomalies in system resource utilization (e.g., operating system usage, memory, I/O calls, file access monitoring, etc.) or employing lengthy human behavioral analysis techniques (e.g., the polygraph examination) that are often invasive, expensive, and time consuming.
[0005] Therefore, there is a continuing need for a system and/or a method for noninvasively and automatically detecting insider threats.
SUMMARY OF THE INVENTION
[0006] Some aspects of the invention provide a system and a method for detecting insider threats. In one particular aspect of the invention, the system and method provide automated detection of insider threats, called AuDIT (Automated Detection of Insider Threats). AuDIT combines behavioral biometric data (e.g., mouse and keyboard activities) with the subject's behavioral system (or system resource) usage activity pattern (e.g., what programs the person uses, and when) to identify anomalous behavior (e.g., a potential insider threat). By detecting anomalous behavior of a computer user, AuDIT can aid in preventing and/or identifying insider threats. The invention uses behavioral biometric data (i.e., input device usage characteristics, e.g., mouse and keyboard activities) and behavioral system usage activity patterns to identify anomalous behavior (i.e., a potential insider threat). The terms "behavioral system usage activity" and "behavioral system resource usage activity" are used interchangeably herein and refer to the subject's access and/or utilization patterns for various system resources including, but not limited to, software module usage patterns (e.g., the software programs a person normally uses on a computer, the programs installed on a computer that a person does not use, etc.), network usage patterns (e.g., where a person usually visits on a network, the resources a person usually uses on a network, etc.), and hardware resource usage (e.g., the devices a person usually uses), as well as any other system resources or units accessed or used by the subject.
[0007] One particular aspect of the invention provides a system and a method for detecting anomalous behavior by a subject. The system of the invention comprises a behavioral biometric-based analysis system and a system usage analysis system that is coupled to the behavioral biometric-based analysis system. Generally, the behavioral biometric-based analysis system monitors how the user or subject uses input device(s). Some aspects of obtaining the behavioral biometrics of a user are disclosed in commonly assigned U.S. Provisional Patent Application No. 61/837,153, filed June 19, 2013, as well as commonly assigned PCT Patent Application No. PCT/US14/43057, filed June 18, 2014, and U.S. Patent No. 8,230,232, issued to Ahmed et al., which are incorporated herein by reference in their entirety.
[0008] In some embodiments, the behavioral biometric-based analysis system comprises:
(i) a data interception unit configured to intercept input from a subject, wherein the data interception unit is configured to passively collect an input device usage characteristic;
(ii) a behavior analysis unit operatively coupled to said data interception unit to receive the passively collected input device usage characteristic; and
(iii) a behavior comparison unit operatively coupled to said behavior analysis unit (i.e., behavioral system usage analysis unit). The behavioral biometric-based analysis system dynamically monitors and passively collects behavioral biometric information of the subject, and translates said behavioral biometric information into representative data, stores and compares results, and outputs a behavioral biometric-based analysis result associated with anomalous behavior by the subject.
[0009] Yet in other embodiments, the system resource usage analysis system that is operatively coupled to said behavioral biometric-based analysis system, said system resource usage analysis system comprising:
(i) a system usage monitoring system that is configured to passively collect system usage characteristic of the subject;
(ii) a system usage analysis unit operatively coupled to said system usage
monitoring system to receive the passively collected system usage
characteristic; and
(iii) a system usage comparison unit operatively coupled to said system usage analysis unit.
The system resource usage analysis system dynamically monitors and passively collects system usage characteristic of the subject, and translates said system resource usage characteristic into representative data, stores and compares results, and outputs a system usage result associated with anomalous behavior by the subject.
[0010] Still in other embodiments, the behavioral biometric-based analysis system is configured to transmit the behavioral biometric-based analysis result to said system resource usage analysis system when the behavioral biometric-based analysis result indicates anomalous behavior by the subject. In some instances, said system resource usage analysis system is configured to determine anomalous behavior by the subject only upon receiving the behavioral biometric-based analysis result from said behavioral biometric-based analysis system indicating anomalous behavior by the subject.
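The gating described in this paragraph, in which the system resource usage analysis is invoked only after the behavioral biometric analysis flags the subject, might be sketched as follows. All function names, the z-score test, and the threshold values are illustrative assumptions, not details from the specification:

```python
# Illustrative two-stage anomaly gate: the system-usage analysis runs only
# when the behavioral biometric analysis has already flagged the subject.
# Thresholds and scoring functions are hypothetical placeholders.

def biometric_anomaly(sample, baseline, threshold=3.0):
    """Flag the sample if it deviates from the baseline mean by more
    than `threshold` standard deviations (a simple z-score test)."""
    if baseline["std"] == 0:
        return False
    z = abs(sample - baseline["mean"]) / baseline["std"]
    return z > threshold

def usage_anomaly(resources_used, usual_resources):
    """Flag usage of any system resource outside the subject's norm."""
    return not set(resources_used) <= set(usual_resources)

def detect_insider_threat(sample, baseline, resources_used, usual_resources):
    # Stage 1: behavioral biometric analysis.
    if not biometric_anomaly(sample, baseline):
        return False  # no biometric anomaly, so stage 2 is never invoked
    # Stage 2: system resource usage analysis, run only after stage 1 fires.
    return usage_anomaly(resources_used, usual_resources)

baseline = {"mean": 120.0, "std": 10.0}           # e.g., mean dwell time in ms
print(detect_insider_threat(121.0, baseline, ["editor"], ["editor"]))   # False
print(detect_insider_threat(190.0, baseline, ["db_dump"], ["editor"]))  # True
```

Only when both stages indicate an anomaly does the sketch report a potential threat, mirroring the two-part determination in the paragraph above.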
[0011] Yet in other embodiments, said input device comprises a mouse, a touch pad, a touch screen, a stylus, a microphone, a trackball, a pointer device, a remote camera, a joystick, or a combination thereof. In some instances, said data interception unit is configured to passively collect usage characteristics comprising how, and the way (i.e., method) in which, the subject uses a plurality of input devices.
[0012] In some embodiments, said data interception unit is configured to passively collect usage characteristics comprising the subject's method of selecting a given function.
[0013] In other embodiments, said data interception unit is further configured to collect system usage information. In some instances, the system usage information comprises a current process running on the system, a start time of a given application, a run time of a given application, an end time of a given application, or a combination thereof.
[0014] Typically, said system is suitably configured for real-time monitoring.
[0015] Another aspect of the invention provides a method for determining anomalous behavior by a subject. Such a method typically comprises:
(a) determining whether the subject is exhibiting anomalous behavior based on a behavioral biometric-based analysis; and
(b) determining whether the subject is exhibiting anomalous behavior based on the system resource usage,
wherein when the subject exhibits anomalous behavior as determined by both the behavioral biometric-based analysis and the system resource usage, it is an indication that the subject is behaving anomalously.
[0016] In some embodiments, the step of determining anomalous behavior based on the system resource usage is conducted only when said step (a) indicates anomalous behavior by the subject.
[0017] Still in other embodiments, said step (b) comprises analyzing network usage patterns, software usage activities, hardware resource usage, or a combination thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] Figure 1 is a schematic illustration of one particular embodiment of a self-adaptive automatic insider threat detection framework that includes continuous monitoring, analysis, and policy response.
[0019] Figure 2 is a screenshot of an example question in negative-emotion condition.
[0020] Figure 3 shows the Self-Assessment Manikin (SAM) scales.
DETAILED DESCRIPTION OF THE INVENTION
[0021] The present inventors' research on deception has established that humans have uncontrolled physiological changes (e.g., a rush of adrenalin) that can be detected as observable behavioral changes when committing actions known to be immoral, criminal, or unethical. In our prior research, we have detected such behavioral changes with a variety of sensor technologies and approaches including vocalic, linguistic, kinesics, mouse movement, keyboard typing patterns, and so on. One of the possible catalysts for creating an
uncontrolled physiological response in a person is online activity known to be illegitimate (e.g., accessing critical resources for criminal intent).
[0022] One aspect of the system of the invention is based on the discovery that heightened emotion and stress, significantly deviating from established norms for a particular person, can be detected through subtle behavioral changes captured by anomalous input device (e.g., keyboard and mouse) usage patterns when using a particular technology resource (e.g., network usage, software usage, hardware resource) in illegitimate ways.
Additionally, significant deviations from established baselines (how a person uses the input device, such as mouse or keyboard, with critical resources) indicate that something has significantly changed about an individual. Such deviations could be a response to a variety of factors that ultimately are benign to the organization, e.g., a death in the family, relationship problems, etc. However, others could be precursors to insider threat concerns, e.g., financial crisis, or reflect an actual insider threat event. As such, when such deviations occur, it is important for the organization to not only identify a potential problem or event but to also proactively target investigations when anomalous events occur.
[0023] In some embodiments, to identify anomalous events, the behavioral biometric-based analysis system is designed to develop baselines for each user by longitudinally collecting and analyzing input device usage characteristics (e.g., keystroke and mouse movement patterns) for specific critical technology resources (e.g., network resources, software resources, hardware resources) over an initial training period. Over time, baselines are updated and refined as a person's behavior and usage evolve due to natural changes in job roles, projects, work environment, deployment of new critical resources, and so on. Thus, the baseline (i.e., the user's own reference input device usage characteristics) is dynamically updated unless the change in the input device usage characteristics is significantly different. Once baselines have been developed for each user across various system resources, the system of the invention detects when anomalous patterns emerge. Fusing behavioral human-computer monitoring through input device usage characteristics with system usage data, the system of the invention provides an innovative paradigm for detecting anomalous behaviors that could be indications of insider threats (e.g., a heightened emotional state during copying and pasting activity within a document that should not be shared or modified). In some embodiments, all information associated with human and system activities is contextually and continuously monitored by the system of the invention, triggering a real-time, policy-driven alert when anomalies diagnostic of insider threat occur (e.g., notifying management, restricting system access, etc.). The term "contextually monitored" refers to analyzing the input device usage characteristics of a user for a given application or task. The term "input device usage characteristics" refers to how, and the way (or method) in which, a particular application, program, or task is used by the user.
In general, "how" a user uses an input device can be objectively measured and includes, without limitation, the duration of keyboard press, pressure applied to a keyboard when typing, keyboard stroke speed, the dwell time (the length of time a key is held down) and transition time (the time to move from one key to another), rollover time (the time that overlaps between two key presses) for keyboard actions, etc. The term "way" of using an input device refers to how a particular function is selected, e.g., whether a keyboard or a mouse is used for a particular task; whether the user uses a numeric keypad or the number keys across the top of the keyboard, etc. A list of exemplary behavior biometrics that can be measured and used to detect anomalous behavior is provided in Table 1.
Table 1: Examples of behavior biometrics that can be used to detect an insider threat
(The first rows of Table 1 appear in the source only as an image, imgf000008_0001; the recoverable rows follow.)
Additional AUC: The AUC minus the minimum AUC.
Overall Distance: The total distance traveled by the mouse trajectory.
Additional Distance: The distance a user's mouse cursor traveled on the screen minus the distance that would have been required traveling along the idealized response trajectory (i.e., straight lines between the user's mouse clicks).
Distance Buckets: Distance traveled for each 75 ms.
X Flips: The number of reversals on the x axis.
Y Flips: The number of reversals on the y axis.
Maximum Deviation: The largest perpendicular deviation between the actual trajectory and its idealized response trajectory (i.e., straight lines between the user's mouse clicks).
Speed Buckets: Average speed for each 75 ms.
Overall Speed: Average overall speed.
Idle Time: If there is a change in time greater than 200 ms but no movement, this is counted as idle time.
Idle Time on Same Location: If there is a change in time but not a change in location, this means an event other than movement triggered a recording (e.g., leaving the page). The time in this event is summed.
Idle Time on 100 Distance: If there is a change in distance greater than 100 between two points, this may indicate that someone left the screen and came back in another area.
Total Time: Total response time.
Click Mean Speed: The mean speed of user clicks.
Click Median Speed: The median speed of user clicks.
Click Mean Latency: The mean time between when a user clicks down and releases the click.
Click Median Latency: The median time between when a user clicks down and releases the click.
Answer Changes: The number of times an answer was selected; if over 1, the person changed answers.
Hover Changes: The number of times an answer was hovered over; if over 1, the person hovered over answers they did not choose.
Hover Region: The amount of time a person hovers over a region.
Return Sum: The number of times a person returns to a region after leaving it.
Dwell: The measurement of how long a key is held down.
Transition: The time between key presses.
Rollover: The time between when one key is released and the subsequent key is pushed.
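Several of the trajectory features in Table 1 can be computed directly from a list of (x, y) cursor points. The sketch below is illustrative only: the point data are invented, and a straight line between the first and last recorded points stands in for the idealized response trajectory (Table 1 defines it as straight lines between mouse clicks):

```python
import math

def overall_distance(points):
    """Total distance traveled along the recorded trajectory."""
    return sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def additional_distance(points):
    """Recorded distance minus the idealized straight-line distance
    between the trajectory's start and end points."""
    return overall_distance(points) - math.dist(points[0], points[-1])

def x_flips(points):
    """Number of reversals of direction along the x axis."""
    flips, last_sign = 0, 0
    for (x0, _), (x1, _) in zip(points, points[1:]):
        sign = (x1 > x0) - (x1 < x0)
        if sign != 0 and last_sign != 0 and sign != last_sign:
            flips += 1
        if sign != 0:
            last_sign = sign
    return flips

def max_deviation(points):
    """Largest perpendicular deviation from the straight line joining
    the start and end points."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    length = math.dist(points[0], points[-1])
    if length == 0:
        return 0.0
    return max(abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / length
               for x, y in points)

path = [(0, 0), (3, 4), (6, 0)]                  # detour above the straight line
print(overall_distance(path))                    # 10.0
print(additional_distance(path))                 # 4.0
print(max_deviation(path))                       # 4.0
print(x_flips([(0, 0), (5, 0), (2, 0), (6, 0)])) # 2
```

Speed features (speed buckets, overall speed) follow the same pattern once each point carries a timestamp.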
[0024] Previous insider threat systems have focused on either detecting anomalies in system resource utilization (e.g., operating system usage, memory, I/O calls, file access monitoring, etc.) or employing lengthy human behavioral analysis techniques (e.g., the polygraph examination) that are often invasive, expensive, and time consuming. The system of the invention integrates the human behavioral monitoring via non-invasive input device (e.g., mouse and keyboard) usage patterns and the subject's behavioral system usage activity patterns for detecting anomalies that could be triggered by insider threats. The system of the invention scales easily and can be deployed throughout an organization with any number of employees.
[0025] Online Human Behavioral Sensor Monitoring: Cognitive psychology research has demonstrated that deceptive behavior and malicious intent, both hallmarks of insider threats, result in increased mental workload and heightened emotion. These increases in mental workload and emotion are manifested in uncontrolled physiological changes that can be detected through subtle changes to input device usage characteristics (e.g., keystroke and mouse usage behavior). When developing behavioral baselines (i.e., reference usage characteristics) in order to detect anomalous events, input device (e.g., mouse and keyboard) usage data is collected over time so that a comprehensive behavioral pattern or usage characteristics is developed for various critical online resources in order to reduce false alarms in the anomaly detection process. By comparing anomalous behavior to robust baselines or dynamic usage characteristics based on the context and application a person is using, the system of the invention is able to identify instances in which an individual may be acting with malicious intent.
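One illustrative realization of this baselining for a single feature is sketched below. The warm-up length, z-score cutoff, and exponentially weighted update are assumptions chosen for illustration, not details from the specification:

```python
# Illustrative evolving baseline for one input-device feature (e.g., mean
# keystroke dwell time, in ms). The baseline tracks a running mean and
# variance and is updated only when a new observation is NOT a significant
# deviation, so anomalous samples do not contaminate the norm.

class FeatureBaseline:
    def __init__(self, alpha=0.1, z_cutoff=3.0, warmup=5):
        self.mean = None
        self.var = 0.0
        self.n = 0
        self.alpha = alpha        # weight given to the newest sample
        self.z_cutoff = z_cutoff  # deviations beyond this are anomalies
        self.warmup = warmup      # minimum samples before anomaly testing

    def observe(self, x):
        """Return True if x is anomalous; otherwise fold x into the baseline."""
        self.n += 1
        if self.mean is None:     # training period: first sample
            self.mean = float(x)
            return False
        std = self.var ** 0.5
        if self.n > self.warmup and std > 0:
            if abs(x - self.mean) / std > self.z_cutoff:
                return True       # anomaly: baseline left unchanged
        # Exponentially weighted update: recent behavior counts more.
        diff = x - self.mean
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        return False

b = FeatureBaseline()
for dwell in [100, 105, 95, 102, 98, 103]:   # normal training samples
    b.observe(dwell)
print(b.observe(180))   # True: flagged as a significant deviation
```

A full deployment would maintain one such baseline per feature, per user, and per critical resource, as described above.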
[0026] Online System Resources Usage Monitoring: The system of the invention utilizes information from behavior-based input device usage characteristics and behavioral system usage activity patterns to achieve unprecedented behavioral analysis capabilities that lead to high detection rates of insider threats with very low false alarms. System (i.e., system resource) usage characteristics or baselines for each individual are developed for specific critical resources (including network resources, software resources, and hardware resources). The baseline is developed for each individual by analyzing software module usage patterns (e.g., the software programs a person normally uses on a computer, the programs installed on a computer that a person does not use, etc.), network usage patterns (e.g., where a person usually visits on a network, the resources a person usually uses on a network, etc.), hardware resource usage (e.g., the devices a person usually uses), and so on. Once baselines are established, the system of the invention integrates the information collected by both monitoring routines and correlates them in order to identify anomalous behavior. For example, a possible threat can be identified when the system of the invention detects heightened behavioral indicators along with dramatic changes in system usage patterns for a critical resource. Because users can and often do have different behavioral biometrics and/or different behavioral system resource usage depending on the time of day and/or the day of the week, it should be appreciated that there can be multiple baselines for each individual. For example, one can have baselines depending on the time of day, e.g., morning, afternoon and/or evening baselines, and/or one can have baselines depending on the day of the week, e.g., Monday, Tuesday, Wednesday, Thursday, Friday, and/or weekend(s) of each week, or simply weekday baselines and weekend baselines, etc.
[0027] System Implementation and Management Dashboard: The system of the invention is capable of real-time monitoring of individuals' use of computing resources using a client-server model. On the client side, the system continuously monitors input device (e.g., mouse and keyboard) usage activity, paired with technical system usage information (e.g., the application that the individual is using at the time and the way in which the system is being used). Monitoring data is sampled and sent to the system server for analysis either in real time as the samples are taken or in a batch mode that would allow the system to continue to function even when the user is not connected to the corporate network. When the results of the integrated analysis show that an individual is behaving outside of their pre-established baselines (e.g., experiencing a heightened emotional response while using the system (i.e., system resource) in an abnormal way), an alert is routed to a policy-driven management dashboard. According to the risk and impact analysis of the detected anomalous behavior, the appropriate actions according to the organization policies can be taken promptly; for example, a follow-up online survey can be invoked to get further clarification from the user about the anomalous behavior; an alert is sent to the appropriate unit for deeper inspection and analysis; and so on. The system's monitoring and analysis have minimal impact on system performance, do not interfere with daily operations, and are designed to be easily deployed using group policy management.
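The client-side sampling with real-time and batch delivery modes might be sketched as follows. The buffering scheme, batch size, field layout, and transport stub are all assumptions for illustration:

```python
# Illustrative client-side monitor: samples of (timestamp, feature, value,
# active_application) are buffered locally and flushed to the analysis
# server when connected, or retained for batch upload when offline.
# The transport is stubbed out with a plain callable.

class ClientMonitor:
    def __init__(self, send_fn, batch_size=3):
        self.send_fn = send_fn        # delivers a batch to the server
        self.batch_size = batch_size
        self.buffer = []
        self.connected = True

    def record(self, timestamp, feature, value, application):
        """Pair an input-device measurement with the application in use."""
        self.buffer.append((timestamp, feature, value, application))
        if self.connected and len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send_fn(list(self.buffer))
            self.buffer.clear()

received = []
monitor = ClientMonitor(received.append)
monitor.connected = False                      # user off the corporate network
monitor.record(0.00, "dwell_ms", 98, "editor")
monitor.record(0.25, "dwell_ms", 104, "editor")
monitor.record(0.50, "dwell_ms", 101, "browser")
print(len(received))        # 0: samples retained while offline
monitor.connected = True
monitor.flush()             # batch mode: deliver everything on reconnect
print(len(received[0]))     # 3
```

Pairing each measurement with the active application is what lets the server apply the per-resource baselines described earlier.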
[0028] The system of the invention also provides a customizable and secure management dashboard, which can be configured to display and highlight anomalous behavioral indicators for review. Because the system of the invention provides continuous real-time anomaly detection, it utilizes event policies to respond to various anomalous events (e.g., notification of specific individuals, access restrictions, etc.). Using this dashboard, designated individuals can have the ability to review the indicators of users with anomalous behaviors, drilling down to specific activities and even screenshots if desired. In some embodiments, to better understand the potential severity and ramification of an event, the system of the invention provides a variety of visualization and analysis tools. For instance, the dashboard provides a suspect's social network by analyzing email and communication logs, potentially identifying abnormal relationships that span organizational units. This data fusion within the system of the invention identifies not only suspicious individuals, but can assist in revealing networks of conspirators. For example, if multiple individuals are exhibiting abnormal behavior indicative of insider threat, and share the same supervisor, this could suggest that the supervisor should also be investigated as a possible threat. Or, as another example, if an individual is discovered as a possible insider threat, an analysis of who else might be an accomplice can be performed by examining communication logs.
[0029] The system of the invention reduces the likelihood of Type 1 errors (false positive detection of an individual as an insider threat) through analysis of organizational information available in directories (e.g., Active Directory and OpenLDAP) and HR systems (e.g. PeopleSoft) to identify manager-employee hierarchy, shared and individual calendars, cost center information, and task lists to identify causes of stress arising from project deadlines, tense meetings where others are showing stress, and so on.
[0030] In some embodiments, the analysis system of the invention also supports deep analysis of usage patterns of various system resources using existing system log analysis tools. By identifying potential insider threats, the system facilitates targeted analysis and mining of system logs generated by operating systems and various system applications (e.g., proxy servers, database logs, Intranet server logs, etc.). The analysis system of the invention also can be configured to interface with leading Security Incident and Event Management (SIEM) systems, to tune them to monitor system usage by individuals flagged as potential insider threats.
[0031] AuDIT Framework: One embodiment of the invention is shown in Figure 1.
Once the system is deployed in an organization, it continuously captures behavioral biometric data (input device usage data) and behavioral system usage activity data. The system combines these streams of data to create a behavioral biometric signature for each resource monitored (which may include software, hardware, and network resources). The system compares this signature to future behavioral biometric data and system usage activity data to identify anomalies (e.g., an emotional reaction while using the resource in an abnormal way). When an anomaly is detected, a policy-driven response is generated. It should be appreciated that as more behavioral biometric signatures and system usage activity data are gathered for an individual, a more robust/reliable baseline can be established. Thus, in some embodiments, the baseline of the user is continually updated. In some instances, as more baselines are gathered, the latest baseline data are weighted more than the previous baselines. In this manner, the baselines of each user are compared to more recent activities. Accordingly, the system and method of the invention include evolving baselines for users.
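The recency weighting of baselines described above might be sketched as follows; the geometric decay, its factor, and the resource names are assumed for illustration:

```python
# Illustrative per-resource signature store in which more recent baseline
# snapshots are weighted more heavily than older ones, using a simple
# geometric decay. The decay factor is an assumed parameter.

def weighted_signature(snapshots, decay=0.5):
    """Combine historical baseline snapshots (oldest first) into a single
    reference value, weighting the latest snapshots the most."""
    weights = [decay ** (len(snapshots) - 1 - i) for i in range(len(snapshots))]
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, snapshots)) / total

signatures = {
    # resource -> chronological snapshots of one feature's baseline mean
    "payroll_db": [110.0, 108.0, 100.0],
    "email":      [95.0, 96.0, 97.0],
}
reference = {r: weighted_signature(s) for r, s in signatures.items()}
print(round(reference["payroll_db"], 2))   # 103.71: pulled toward the latest snapshot
```

With `decay=0.5` each older snapshot counts half as much as the one after it, so the reference tracks recent behavior while retaining some history.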
[0032] The anomaly detection system of the invention can be applied to detect and analyze anomalous behavior in human-computer interaction when accessing critical organizational resources (network resources, software resources, and hardware resources). This detection approach is based on supervised learning and cumulative anomaly techniques. A system training period can be used to develop baseline behavioral and usage patterns for critical resources for each individual within an organization. In some embodiments, the detection system of the invention considers the patterns of behavior and use over a period of time, for a broad range of resources, instead of considering only one isolated behavior independent of the particular system being used. Research indicates that usage patterns may differ across different applications; thus, in some embodiments, independent baselines are established for each individual for each critical resource to accurately identify anomalies.
[0033] The detection system of the invention is the first system to perform continuous, real-time monitoring and anomaly analysis of human behavioral sensing via input device (e.g., mouse and keyboard) usage patterns and behavioral system usage activity patterns to identify a possible insider threat. It has many advantages over alternative methods, as those methods are often reactive, time consuming, and manually intensive (research has shown that it takes an average of 416 days to identify and contain an insider attack), invasive, require employee cooperation, and may be easily fooled by a cunning individual. The detection system of the invention overcomes these obstacles in at least the following ways: (i) A data capture process runs continuously in the background of the computer system during normal workday activities, while analysis takes place on a separate and secure remote system; (ii) Behavioral sensing data (i.e., keyboard and mouse usage) is gathered in a continuous and unobtrusive manner with no adverse effect to the user; (iii) Users need not be aware of the data collection that is taking place; (iv) If desired, raw keystroke data could be collected in such a way that the content of the message is unknown until suspicion is raised (e.g., by using an encryption key unique to the system administrator), at which point messages may be reconstructed. Not only does this protect the privacy of end users, but it also enhances security of sensitive information, as the content of the message does not have to be broadcast across the network; (v) Unlike systems that rely on linguistic features, the system's behavioral analysis approach is language agnostic (i.e., the detection methodology will work with English, Spanish, Arabic, etc.)
because it relies on longitudinal usage patterns rather than message content; (vi) The system is not easily fooled, as heightened emotions that would trigger an anomalous event typically manifest themselves as subtle differences in typing or mouse movement behavior that occur between 20 and 100 milliseconds. Attempts to modify one's keystroke or mouse use would be flagged as abnormal, thus identifying individuals attempting to fool the system; (vii) Organizational structure, communication patterns, meeting schedules, and resource utilization can be analyzed without interruption to daily activities; (viii) As opposed to comprehensively examining server logs, which can be very time consuming and result in information overload, our system helps to focus deep exploration of logs based on the identification of specific individuals and events.
[0034] In some embodiments, the focus of the detection system of the invention is primarily to confirm that individuals are operating within a given acceptable range of behavior and system usage. Regardless, however, it provides a powerful tool for
management to proactively identify and investigate those individuals who are displaying abnormal behavior.
[0035] Additional objects, advantages, and novel features of this invention will become apparent to those skilled in the art upon examination of the following examples thereof, which are not intended to be limiting. In the Examples, procedures that are constructively reduced to practice are described in the present tense, and procedures that have been carried out in the laboratory are set forth in the past tense.
EXAMPLES
[0036] The examples below are experiments that were conducted to detect heightened emotional responses through the speed and distance of users' mouse movements. Similar studies explore how other features (e.g., mousing precision, area under the curve, typing dynamics, etc.) predict emotional responses.
[0037] In particular, described below are three studies that detect negative emotions through mouse cursor speed and distance. Study 1 induced negative emotion (i.e., negative emotional valence) through validated manipulation, and explored its influence on mouse cursor speed and distance in a highly controlled experiment. Study 2 extended the results of Study 1 to focus more broadly on emotional valence (positive, neutral and negative) and arousal (low and high). Study 3 extended these results to predict frustration— a common negative emotion in online interactions— in a realistic online e-commerce scenario.
[0038] Study 1 : Study 1 was a laboratory experiment with a previously validated negative emotion manipulation. The experiment had a single factor with two conditions: a) a negative-emotion (i.e., negative emotional valence) condition and b) a baseline condition. After the emotion manipulation, participants immediately completed another task that was identical for both conditions while mouse cursor movement data were collected. Data were analyzed to determine whether participants in the negative-emotion group displayed slower cursor speed and greater cursor distance than participants in the baseline group on an otherwise identical task.
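The between-groups comparison described here can be illustrated with a hand-computed Welch's t statistic; the speed values below are invented and are not data from Study 1:

```python
# Illustrative between-groups comparison of mean cursor speed. Welch's
# t statistic is computed by hand with the standard library; the samples
# are hypothetical, chosen so the baseline group is faster on average.

from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

baseline_speed = [4.1, 3.9, 4.3, 4.0, 4.2]      # px/ms, hypothetical
negative_speed = [3.1, 2.9, 3.3, 3.0, 3.2]      # slower under negative emotion
t = welch_t(baseline_speed, negative_speed)
print(t > 0)   # True: the baseline group moved faster on average
```

A large positive t here supports the hypothesized pattern (slower cursor movement under negative emotion); the same comparison applies to cursor distance with the inequality reversed.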
[0039] Procedure and Manipulation: After the participants consented to participate in the study, each participant was randomly assigned to the negative-emotion or the baseline (i.e., non-negative-valence) condition. After completing the condition, all participants completed an identical follow-up task, during which mouse cursor movements were measured.
[0040] Negative-Emotion Condition: In the negative-emotion condition emotional valence was manipulated through an intelligence test designed to be unfair that was previously developed and validated. The instructions of the intelligence test explained that the test was timed, and the score would be computed based on how many questions the participants answered correctly within the allotted time. While the participants read the instructions, the clock began to count in the upper right hand corner of the screen. Before loading the first question, three messages were shown for eight seconds each explaining that the question was being loaded. During this time, the timer incremented. The messages were, in order, "Still loading the first question... Please be patient"; "Loading the first question... Please wait"; and "Question loaded. Processing first question..."
[0041] Finally, the question was shown (e.g., Figure 2), giving participants 15 seconds to answer the question before the page automatically advanced. To induce negative emotional valence, the question was difficult, requiring longer than 15 seconds to answer. After the page automatically advanced, the cycle repeated; the three 'loading' messages were shown again for eight seconds each, while the timer advanced. This was followed by a second difficult question that would also take longer than 15 seconds to complete. After the second question automatically advanced after 15 seconds, the participants were told that their time had expired. They were also given the feedback that they had only been given two out of three questions because they had taken too long to answer. Participants were then told that because of their slow response time and incorrect answers, their score indicated a lower intelligence level than that of most people who had taken the test. As their score was outside of their control (resulting from the system being too slow and not giving them enough time to answer), studies have shown that participants will likely have a negative emotional response. All participants were debriefed after the experiment, informing them that the test did not actually measure intelligence, but that it was a task designed to induce negative valence to see if emotions influenced mouse cursor movements.
[0042] Baseline Condition (non-negative-emotion condition): As in the negative-emotion condition above, the instructions explained to participants in the baseline condition that they would take an intelligence test. The delivery of the questions in this condition was similar to the negative-emotion condition, except that the negative-emotion-inducing mechanisms were not implemented (i.e., the test was not timed; the three questions could easily be answered; and, at the conclusion, the system congratulated participants for answering the questions correctly). Thus, negative emotion would likely not be induced in the task.
[0043] Follow-up Task: Following the manipulation, all participants engaged in an identical follow-up task, during which mouse cursor movements were recorded and analyzed. This allowed comparison of mouse movement differences induced by the negative-emotion condition on an otherwise identical task. The task required participants to drag six 4-digit numbers from a box on the left side of the screen to a box on the right side of the screen, and to arrange them in ascending order.
[0044] Participants: Participants were recruited for the experiment from Amazon.com's Mechanical Turk (MTurk). Using MTurk to recruit participants has been deemed appropriate for random sample populations. Social scientists are increasingly using MTurk, as the diversity of the participant pool is larger than that of typical undergraduate college samples, and the data are as reliable as those collected using other methods, if not more so. Further, some studies have found that the behavior of MTurk respondents closely resembles that of participants in traditional laboratory experiments.
[0045] All participants were required to have an Amazon Masters certification (awarded to people who have demonstrated accuracy and quality across a wide variety of tasks), and were paid US$0.40 for a 4-minute task (equaling a US$6 hourly wage). Previous studies have shown that data quality is not affected by compensation rate on MTurk. To verify that responses were not automated, a control question was implemented in the post-survey. Given that this study focused on mouse cursor movements, the type of device being used to complete the task was also captured, and data points from mobile devices were removed from analysis. This resulted in a final sample size of 65 participants. Table 2 shows the sample sizes and demographics of each condition.
Table 2. Participant Demographics of Study 1:
Negative Valence Neutral
Figure imgf000017_0001
[0046] Measures: Using a publicly available JavaScript library (JQuery), the webpage that contained the follow-up task captured the mouse cursor's x/y position and timestamp at millisecond precision while the participants completed the follow-up task. The data were sent to a server via an AJAX call, and the server calculated the cursor distance for each participant by summing the distances between each recorded x/y position. Specifically, it calculated the distance (in pixels) using the Euclidean distance d(a_i, a_{i+1}) between two consecutive x/y positions a_i and a_{i+1}, leading to a total task distance of D = Σ_{i=1}^{n−1} d(a_i, a_{i+1}) between the recorded points a_1, a_2, ..., a_n. Cursor speed was calculated by the server as a function of cursor distance D and movement time t during the task, measured in pixels per millisecond: v = D / t.
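The per-participant distance and speed computation described above can be sketched as follows. This is an illustrative reconstruction in Python, not the actual server code used in the study; the function name and the sample data are hypothetical.

```python
import math

def cursor_distance_and_speed(points):
    """Compute total cursor distance (pixels) and average speed
    (pixels per millisecond) from recorded (x, y, timestamp_ms) samples.

    Illustrative reconstruction of the computation described in the text.
    """
    # Sum Euclidean distances between consecutive x/y positions:
    # D = sum_{i=1}^{n-1} d(a_i, a_{i+1})
    distance = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1, _), (x2, y2, _) in zip(points, points[1:])
    )
    # Movement time t spans the first to the last recorded timestamp
    elapsed_ms = points[-1][2] - points[0][2]
    speed = distance / elapsed_ms if elapsed_ms else 0.0  # v = D / t
    return distance, speed

# Example: four samples moving right, down, then right over 300 ms
samples = [(0, 0, 0), (30, 0, 100), (30, 40, 200), (90, 40, 300)]
d, v = cursor_distance_and_speed(samples)
print(round(d, 1), round(v, 3))  # 130.0 total pixels, ~0.433 px/ms
```

In practice the x/y samples would arrive from the browser-side capture script via the AJAX call described above, and the two statistics would be stored per participant.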
[0047] As a manipulation check in a post survey, emotions were measured using a non-verbal (pictorial) emotion scale (Self-Assessment Manikin, SAM). Because participants took part from a variety of countries, a pictorial scale was chosen to minimize potential cultural/linguistic effects/biases. Two of the constructs assessed by the SAM scale were valence and arousal. The scale used 9-point ratings, accompanied by graphic depictions of the measured dimensions. The SAM scale has been used extensively by various researchers, especially in situations where it is cumbersome to use verbal scales consisting of a large number of items or when dealing with non-native English speaking participant populations.
[0048] Results: The results of the manipulation check showed that the participants in the negative-emotion condition experienced significantly more negative emotional valence than did those in the baseline task (F(1,64) = 83.936, p < 0.001), indicating that the conditions successfully manipulated emotional valence. No significant difference was observed in arousal between the conditions (F(1,64) = 2.325, p > 0.05). A one-way analysis of variance (ANOVA) was conducted to test the relationships between negative emotion and mouse cursor movements, and found that participants in the negative-valence condition had significantly lower mouse cursor speed (F(1,64) = 5.093, p < 0.05, η² = 0.091) and greater cursor distance (F(1,64) = 7.696, p < 0.01, η² = 0.109) than did participants in the baseline condition. Table 3 shows the summary statistics of the two conditions for cursor distance and speed.
Figure imgf000018_0001
[0049] Study 2: The purpose of Study 2 was to extend the findings of Study 1 to a broader set of emotions. As previously discussed, emotions can be generally categorized on a scale from high to low arousal and positive to negative valence. These dimensions allow for classification of most emotions. In this experiment, both valence (positive vs. neutral vs. negative) and arousal (high vs. low) were manipulated in a 2 x 3 factorial design. See Table 4. Doing so allowed determination as to whether emotional valence and arousal interact to influence mouse cursor movements, and how the influence of negatively valenced emotions on mouse cursor movements compares to the effects of neutral and positively valenced emotions.
[0050] Manipulations: Valence and arousal were manipulated using images from the International Affect Picture System (IAPS). IAPS is a collection of pictures that is widely used in experiments to elicit a range of emotions. Consistent with the present inventors' theoretical development, the emotions induced by the images were shown to trigger physiological responses related to the behavioral inhibition system, even to the extent of influencing users' mouse cursor movements. Each picture in the library has a rating of pleasure (valence) and arousal invoked during viewing.
[0051] After emotional valence and arousal were manipulated, all participants then interacted with the same website while mouse cursor movements were analyzed to explore how the evoked emotions influenced mouse cursor movements. To manipulate valence and arousal, six images were selected, with one image representing each condition. The images were selected to maximize consistency among the different groups. Striving for consistent condition groups eliminated some extreme emotions. For example, highly negatively-valenced images are almost always associated with very high arousal, whereas achieving that level of arousal is not possible with a neutrally-valenced image. Thus, these highly negatively-valenced images were excluded from the study because it was not possible to find a neutrally-valenced image with a comparable level of arousal. On a scale from 1 to 9, all high-arousal images had an emotional arousal score between 6.16 and 6.22; all low-arousal images had an emotional arousal score between 3.60 and 3.67. The positively-valenced images had emotional valence scores between 8.08 and 8.20; the negatively-valenced images had emotional valence scores between 2.82 and 2.91; and the neutrally-valenced images had emotional valence scores around the neutral point (a score of 5), one slightly above (5.14) and one slightly below (4.63). The images and associated valence scores are summarized in Table 4.
Figure imgf000019_0001
[0052] Procedures: After the participants had consented to participate in this study, they were presented with the following instructions: "Thank you for participating in this study. After you click the next button, you will see a picture for 10 seconds. After ten seconds, the page will automatically advance and ask you how you feel. After answering the questions, you will be directed to a computer store website. Your task is to navigate the website and pretend to purchase the following product: J. Crew Abingdon Laptop Bag for a 17 inch laptop. Please write down these details so you remember what product to find. After clicking on purchase, you will be guided to a survey."
[0053] Once participants clicked 'next' on the page, they were randomly shown one of the IAPS images for 10 seconds. After 10 seconds, the page automatically advanced and requested participants to report how they currently felt by responding to the SAM scale for valence and arousal as a manipulation check (to verify the images elicited the desired emotional response). After answering the manipulation check questions, all participants (regardless of condition) interacted with the same website containing four different pages. The website was developed for this experiment (to ensure no one had previous experience with the website) and mimicked an online computer store. A professionally made computer-store template was used for the website. To complete the task, users had to navigate the website's four pages to find the product. The website was designed such that there was only one obvious link on each page that would lead the user closer to goal attainment (finding the correct laptop bag). On the first webpage (the home webpage), users had to click on the "shop laptop bags" link (the only link on the page relevant to finding laptop bags) in the left sidebar to advance to the second webpage. On the second webpage, users had to click on the "J. Crew Abingdon Laptop Bag" that accompanied the picture shown earlier to get to the next webpage. On the third webpage, users had to select their laptop's "screen size" and push "submit" to arrive at the last webpage. On the fourth webpage, users could review the product and click "purchase." After clicking on purchase, the task was complete and the website led participants to a post survey.
[0054] Participants: Two-hundred-twenty-four people from Amazon's MTurk participated in the experiment (see Table 5 for a condition demographic breakdown). All participants were required to have an Amazon Masters certification and to use a computer mouse to take the experiment, and were paid US$0.75 for a 5-minute task (equaling a US$9 hourly wage).
Table 5. Participant Demographics of Study 2
Figure imgf000020_0001
[0055] Measures: While the participants browsed the website to purchase the bag, the website recorded and analyzed mouse cursor movements using the same JQuery script used in the previous experiment. Once the mouse cursor movements were captured, they were sent via an AJAX call to a web service that calculated the speed and distance. After calculating the speed and distance, the web service recorded the two statistics in a database and returned the statistics to the webpage before the next webpage was served. This entire analysis process took less than a second in every case.
[0056] As previously discussed, immediately after seeing the image, participants also answered manipulation check questions using the SAM scales to verify that the images successfully induced emotional valence and arousal as desired. After the experiment, a post-survey was administered that again measured users' emotional valence and arousal using the same measures. The purpose of asking the manipulation checks a second time was to see if the induced emotional states persisted throughout the task, as emotions are often very short in duration. Furthermore, demographic information was collected in the post-survey.
[0057] Results: Prior to analyzing mouse cursor movements, manipulation checks were performed to ensure that the IAPS images induced the desired emotions; specifically, an ANOVA was performed to compare the levels of valence and arousal induced by the six images immediately after seeing the image. The ANOVA indicated a significant difference in valence (F(5,218) = 63.526, p < 0.001) and arousal (F(5,218) = 13.005, p < 0.001) among the images.
[0058] A Bonferroni post-hoc comparison was then conducted to explore how the individual images differed in terms of arousal and valence. The results suggest that the two negatively-valenced images (Conditions 5, 6), as expected, were not statistically different from each other on emotional valence (p > 0.05), but induced significantly more negative valence than the four other images (p < 0.001 for each comparison). The two neutrally-valenced images (Conditions 3, 4), as expected, were not statistically different from each other on emotional valence (p > 0.05), but induced significantly more positive valence than did the two negatively-valenced images (p < 0.001 for each comparison), and significantly more negative valence than did the two positively-valenced images (p < 0.001 for each comparison). Finally, the two positively-valenced images (Conditions 1 and 2), as expected, were not statistically different from each other on emotional valence (p > 0.05), but induced significantly more positive valence than did the four other images (p < 0.001 for all comparisons).
[0059] Regarding arousal, the high-arousal images (Conditions 1, 3, 5), as expected, were not statistically different from each other on emotional arousal (p > 0.05 for each comparison), but did induce significantly higher arousal than did the low-arousal images (p < 0.001 for each comparison). Similarly, no differences were observed in emotional arousal among the low-arousal images, as expected, suggesting that they induced statistically the same degree of arousal. The means, standard deviations and comparison differences are shown in Table 6. In summary, the manipulations of valence (positive, neutral and negative) and arousal (high and low) all appear to be successful.
Table 6. Manipulation Checks Means and Standard Deviations for Valence and Arousal
Figure imgf000022_0001
[0060] Next, whether mouse cursor distance and speed were significantly different among the condition groups for each page of the website (Pages 1-4) was explored using an analysis of variance and Bonferroni post-hoc comparisons.
[0061] Analysis of Web Page 1. Distance: On the first page of the website, an ANOVA indicated that there was a difference between condition groups for mouse cursor distance (F(5,218) = 5.725, p < 0.001). A Bonferroni post-hoc comparison indicated that the two negative emotionally-valenced image conditions (Conditions 5 and 6) resulted in significantly greater distance than did the other condition groups. Condition 5 was significantly different from Condition 1 (p < 0.05), Condition 2 (p < 0.05), Condition 3 (p < 0.05), and Condition 4 (p < 0.05). Condition 6 was significantly different from Condition 1 (p < 0.01), Condition 2 (p < 0.01), Condition 3 (p < 0.01), and Condition 4 (p < 0.01). Although Conditions 5 and 6 significantly differed in arousal (see the manipulation check section), this did not lead to a difference in mouse cursor distance between the two groups (p > 0.05). When running an F-test with only the two negatively-valenced conditions together (Conditions 5 and 6), a non-significant p-value was observed for distance (F(1,70) = 0.227, p > 0.05), suggesting that the two negatively-valenced conditions could be grouped together when analyzing distance. Likewise, an F-test with only the four non-negatively-valenced conditions resulted in a non-significant p-value (F(3,148) = 0.022, p > 0.05), suggesting that they could be grouped together for analyzing distance. A t-test was then conducted comparing the means of the two reduced groups (Conditions 5 and 6 together against Conditions 1-4). First, as the sample sizes of the two tests differed, a Levene's test was conducted to test for equality of variances. The variance among the two sample populations was different (F = 22.820, p < 0.001), thus the assumption of equal variance for the normal t-test was violated. To compensate for unequal variance, a t-test for equality of means that assumed unequal variance was conducted in SPSS. The results suggested that the negatively-valenced conditions induced greater distance than did the non-negatively-valenced conditions (t(92.387) = 4.681, p < 0.001).
[0062] Speed: An ANOVA indicated that a difference existed between condition groups for mouse cursor speed on the first page (F(5,218) = 6.983, p < 0.001). A Bonferroni post-hoc comparison indicated that the two negative emotionally-valenced image conditions (Conditions 5 and 6) resulted in significantly slower speed than did the other condition groups. Condition 5 had significantly slower speed than did Condition 1 (p < 0.01), Condition 2 (p < 0.01), Condition 3 (p < 0.01), and Condition 4 (p < 0.001). Condition 6 had significantly slower speed than did Condition 1 (p < 0.01), Condition 2 (p < 0.05), Condition 3 (p < 0.05) and Condition 4 (p < 0.001). Although Conditions 5 and 6 significantly differed in emotional arousal, this did not lead to a difference in mouse cursor speed between the two groups (p > 0.05). Again, when running an F-test with only the two negatively-valenced conditions together (Conditions 5 and 6), a non-significant p-value was observed for speed (F(1,70) = 1.496, p > 0.05), suggesting that the two negatively-valenced conditions could be grouped together for analyzing speed. Likewise, an F-test with only the four non-negatively-valenced conditions resulted in a non-significant p-value (F(3,148) = 0.619, p > 0.05), suggesting that they could be grouped together for analyzing speed. A t-test was then conducted comparing the means of the two reduced groups (Conditions 5 and 6 together against Conditions 1-4). First, as the sample sizes of the two tests differed, a Levene's test was conducted to test for equality of variances. The variance between the two sample populations was different (F = 9.419, p < 0.01), thus the assumption of equal variance for the normal t-test was violated. To compensate for unequal variance, a t-test for equality of means that assumed unequal variance was conducted in SPSS. The results suggested that the negatively-valenced conditions induced slower speed than did the non-negatively-valenced conditions (t(208.570) = 7.587, p < 0.001).
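The variance-check-then-t-test procedure used in these analyses (Levene's test for equality of variances, followed by a t-test that does not assume equal variances) relies on Welch's t statistic and the Welch-Satterthwaite degrees of freedom. A minimal sketch, using hypothetical data rather than the study's measurements:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom,
    used when Levene's test indicates unequal group variances."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical cursor-distance samples (pixels) for two unequal-size,
# unequal-variance groups; values are illustrative only.
negative = [2600, 1900, 3100, 2200, 2800, 1700, 2500, 3000]
baseline = [1500, 1600, 1450, 1550, 1480, 1520, 1610, 1470, 1540, 1490]

t, df = welch_t(negative, baseline)
print(t > 0)  # True: the negative-emotion group shows greater mean distance
```

A statistics package (e.g., SPSS's "equal variances not assumed" row) reports the same t and df; the sketch only illustrates what that correction computes.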
[0063] Analysis of Web Pages 2-4: ANOVAs were conducted to explore whether distance and speed were different on the subsequent pages: Pages 2, 3 and 4 of the website. On Page 2, the test indicated that the condition groups did not significantly differ in terms of mouse cursor distance (F(5,218) = 0.280, p > 0.05) or speed (F(5,218) = 1.525, p > 0.05). Likewise, on Page 3, an ANOVA showed that the condition groups did not significantly differ in terms of mouse cursor distance (F(5,218) = 1.535, p > 0.05) or speed (F(5,218) = 1.575, p > 0.05). Finally, on Page 4, an ANOVA again indicated that the condition groups did not significantly differ in terms of mouse cursor distance (F(5,218) = 1.432, p > 0.05) or speed (F(5,218) = 0.657, p > 0.05). To further explore these non-significant relationships, an analysis was conducted on the level of emotional valence and arousal reported in the post-survey (i.e., after users interacted with the website). As previously mentioned, in the post-survey, participants again reported their emotional valence and arousal. The analysis of these items suggested that although the images induced valence and arousal at the beginning of the experiment (see the manipulation check analysis), this emotional state did not persist until the end of the experiment for either valence (F(5,218) = 1.432, p > 0.05) or arousal (F(5,218) = 0.464, p > 0.05).
[0064] The results thus suggest that although negative valence was exhibited for the first manipulation check and likely the first page (Page 1), the effects of the condition eventually deteriorated (i.e., participants' emotional valence shifted toward neutral), perhaps to the degree that the conditions did not influence behavior on Pages 2-4, and ultimately no differences in valence and arousal were observed in the post survey. This is consistent with past reports suggesting that emotions are often intense but short-lived events that may only last seconds or, in some cases, minutes. The next study explores this in greater detail.
[0065] Study 3: Study 3 had two main purposes. First, Study 3 manipulated negative emotion immediately before Pages 2, 3, and 4 instead of before Page 1, and used the same website as Study 2. Doing so helped explore whether the significant results (Page 1) and non-significant results (Pages 2-4) in Study 2 were due to the design of the webpages or the timing/longevity of the negative emotion manipulations (as previously suggested). Second, Study 3 explored whether the tracking and analysis of mouse cursor movements can be used to infer negative emotion. In a realistic e-commerce scenario, negative emotion was manipulated in a 1 x 2 factorial experiment design, and the condition users received was then predicted based on mouse cursor speed and distance. The prediction accuracy rate is calculated and reported herein.
[0066] Experimental Task and Manipulations: Study 3 replicated the website and general task of Study 2. Similar to the previous study, participants were asked to navigate a website to purchase a product. Upon agreeing, participants were directed to a webpage that presented them with the following instructions: "After clicking next, you will be directed to a computer store website. Your task is to navigate the website and pretend to purchase the following product: J. Crew Abingdon Laptop Bag for a 17 inch laptop. Please write down these details so you remember what product to find. After clicking on purchase, you will be guided to a survey."
[0067] After clicking 'next,' all participants were led to the same webpage as Study 2 (Page 1). Participants then found and clicked on the link titled "Shop Laptop Cases" (the only link on the page relevant to finding laptop bags). Having all participants click on the same link ensured they had the same ending point on the page, regardless of condition. After clicking on the link, mouse cursor movements were no longer analyzed until the user reached Page 2 (at which point the analysis continued). For the baseline condition, the page automatically advanced to Page 2; for the negative-emotion condition, download delay (i.e., webpage loading speed) was manipulated to induce frustration (a common negatively-valenced emotion experienced when interacting with websites). Download delay has been shown to induce frustration and negative valence; for example, previous studies have found that small download delays can have profound impacts on users' intentions and attitudes: decreases in performance and behavioral intentions begin to flatten when delays extend to 4 seconds, and attitudes flatten when delays extend to 8 seconds or longer. In the negative-emotion condition, the screen was dimmed, all links were disabled, and three successive messages were shown. First, a message saying "please wait while the next page loads" was displayed for eight seconds. Afterwards, a second message saying "Page still loading. Please be patient." was shown for another eight seconds. Finally, a message was shown saying "Error loading page. Please try again." Such error messages have previously been shown to increase frustration. Afterwards, users re-clicked on the link, which acted as an anchor so that participants (regardless of condition) would have approximately the same starting position on the next page. This time, however, the page immediately advanced to Page 2, at which point the system began again to analyze mouse cursor movements.
Hence, the scope of the mouse cursor movement analysis had the same beginning and ending points for every participant, regardless of condition (and excluded the frustration manipulation that was unique to the negative-valence condition group). This process repeated itself between the second and third webpages and between the third and fourth webpages for the negative-emotion condition group.
[0068] Participants: One-hundred-twenty-six students participated in the experiment from a management school at a large university in the United States (see Table 7). As compensation, students were given 0.25% extra credit applied to a participating management course of their choice. Students represent an age group and demographic that commonly uses the internet; 97% of student-aged people (18-29) use the internet in the United States, and 97% of people with a college degree use the internet, which is significantly more than most other age groups and educational levels. Therefore, the student population represents an important internet demographic. Furthermore, students have been argued to be an appropriate population to establish the relationships among constructs. As one purpose of this study was to better understand the relationship between negative emotion and mouse movements in a more realistic task, students were deemed to be an appropriate sample.
Figure imgf000026_0001
[0069] Measures: The website collected and analyzed mouse cursor movements using the same process as described in Study 2. After clicking on the "purchase" link, participants were led to an online survey. The participants were asked several questions as manipulation checks to ensure frustration was manipulated. First, participants responded to the 9-point Self-Assessment Manikin (SAM) scale to indicate their emotional valence and arousal. Next, using a 7-point Likert scale, participants responded to three questions created by the research team regarding the level of frustration they experienced while interacting with the website: F1) I felt frustrated while interacting with the website; F2) interacting with the website was frustrating; and F3) the website made me frustrated.
[0070] Results: First, manipulation checks were performed comparing the means for valence, arousal and frustration. Participants in the negative-emotion condition experienced significantly lower valence (t(124) = 10.845, p < 0.001), significantly higher arousal (t(124) = 5.440, p < 0.05) (frustration is characterized by both negative valence and high arousal), and significantly higher frustration on the frustration manipulation check items (F1: t(124) = 13.847, p < 0.001; F2: t(124) = 15.462, p < 0.001; F3: t(124) = 14.630, p < 0.001). Thus, the manipulations appeared to be successful.
[0071] An ANOVA was then conducted to test the relationships between negative emotion and mouse cursor movement. First, mouse cursor movements were compared on each of the four pages separately. Because no manipulation was made prior to the first page in the website, the mousing behavior should hypothetically be the same between condition groups on this page. However, because frustration was manipulated before users interacted with Pages 2, 3, and 4, the mousing behavior should be different on these pages. The results are shown in Table 8. As expected, since Page 1 was not preceded by a manipulation, neither mouse cursor speed nor distance was significantly different between the two condition groups. However, as Pages 2, 3, and 4 were preceded by frustration-inducing manipulations, participants in the negative-emotion condition group exhibited greater distance and slower speed than did participants in the non-frustration group.
Table 8. Experimental Results
Figure imgf000027_0001
[0072] In addition, when averaging the overall speed and distance on Pages 2, 3, and 4 (the webpages that were preceded by a frustration manipulation) for each participant, the average speed was significantly slower for participants in the negative-emotion condition group than for participants in the baseline group (F(1,124) = 32.455, p < 0.001, η² = .262). Participants in the negative-emotion condition group also exhibited greater average distance than did participants in the baseline group (F(1,124) = 31.437, p < 0.001, η² = .202). Table 9 shows the summary statistics of the two conditions for average cursor distance and speed.
Table 9. Descriptive Statistics for Study 3
Figure imgf000028_0001
[0073] Prediction: Based on these results, the present inventors explored whether one can predict if participants were in the negative-emotion group or the baseline group, based on average mouse cursor speed and distance on Pages 2-4 (the pages that were preceded by the frustration condition). To create a prediction model, a simple logistic regression model was specified using the Weka Data Mining Software. The simple logistic regression model is shown below:
Frustration = 0.4 + 0.00008 × distance − 7.84 × speed

The model was validated through 10-fold cross-validation and achieved an overall accuracy rate of 81.746%. The detailed accuracy rates are shown in Table 10. Thus, cursor speed and distance proved to be predictors of the induced frustration in this experiment at an accuracy rate over 80%.
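Assuming the reported coefficients enter a standard logistic link (as in Weka's simple logistic regression), classifying a participant from average cursor distance and speed can be sketched as follows. The 0.5 decision threshold and the example inputs are assumptions for illustration.

```python
import math

# Coefficients as reported for the simple logistic regression model:
# z = 0.4 + 0.00008 * distance - 7.84 * speed
INTERCEPT, B_DISTANCE, B_SPEED = 0.4, 0.00008, -7.84

def frustration_probability(distance_px, speed_px_per_ms):
    """Probability of the negative-emotion (frustration) condition,
    via the logistic link p = 1 / (1 + e^(-z))."""
    z = INTERCEPT + B_DISTANCE * distance_px + B_SPEED * speed_px_per_ms
    return 1.0 / (1.0 + math.exp(-z))

def predict_frustrated(distance_px, speed_px_per_ms, threshold=0.5):
    # The 0.5 decision boundary is an assumed default, not stated in the text
    return frustration_probability(distance_px, speed_px_per_ms) >= threshold

# Greater distance and slower speed push the prediction toward "frustrated"
print(predict_frustrated(distance_px=6000, speed_px_per_ms=0.05))  # True
print(predict_frustrated(distance_px=2000, speed_px_per_ms=0.60))  # False
```

Consistent with the experimental results, the negative speed coefficient and positive distance coefficient mean slower, longer cursor paths raise the predicted probability of frustration.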
Table 10. Prediction Results
Figure imgf000028_0002
[0074] The foregoing discussion of the invention has been presented for purposes of illustration and description. The foregoing is not intended to limit the invention to the form or forms disclosed herein. Although the description of the invention has included description of one or more embodiments and certain variations and modifications, other variations and modifications are within the scope of the invention, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative embodiments to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims

What is Claimed is:
1. A system for detecting anomalous behavior by a subject, said system comprising:
(a) a behavioral biometric-based analysis system comprising:
(i) a data interception unit configured to intercept input from a subject, wherein the data interception unit is configured to passively collect an input device usage characteristic;
(ii) a behavior analysis unit operatively coupled to said data interception unit to receive the passively collected input device usage characteristic; and
(iii) a behavior comparison unit operatively coupled to said behavior
analysis unit,
wherein said behavioral biometric-based analysis system dynamically monitors and passively collects behavioral biometric information of the subject, and translates said behavioral biometric information into representative data, stores and compares results, and outputs a behavioral biometric-based analysis result associated with anomalous behavior by the subject; and
(b) a system usage analysis system that is operatively connected to said behavioral biometric-based analysis system, said system usage analysis system comprising:
(i) a system usage monitoring system that is configured to passively collect a system usage characteristic of the subject;
(ii) a system usage analysis unit operatively connected to said system
usage monitoring system to receive the passively collected system usage characteristic; and
(iii) a system usage comparison unit operatively connected to said system usage analysis unit,
wherein said system usage analysis system dynamically monitors and passively collects a system usage characteristic of the subject, and translates said system usage characteristic into representative data, stores and compares results, and outputs a system usage result associated with anomalous behavior by the subject.
2. The system of Claim 1, wherein said behavioral biometric-based analysis system is configured to transmit the behavioral biometric-based analysis result to said system usage analysis system when the behavioral biometric-based analysis result indicates anomalous behavior by the subject.
3. The system of Claim 2, wherein said system usage analysis system is configured to determine anomalous behavior by the subject only upon receiving the behavioral biometric-based analysis result from said behavioral biometric-based analysis system indicating anomalous behavior by the subject.
4. The system of Claim 1, wherein said input device comprises a mouse, a touch pad, a touch screen, a stylus, a keyboard, a microphone, a track ball, a pointer device, a remote camera, a joy stick, or a combination thereof.
5. The system of Claim 4, wherein said data interception unit is configured to passively collect usage characteristics comprising the manner in which the subject uses an input device.
6. The system of Claim 1, wherein said data interception unit is configured to passively collect usage characteristics comprising the subject's method of selecting a given function.
7. The system of Claim 1, wherein said data interception unit is further configured to collect system usage information, and wherein said system usage information comprises a current process running on the system, a start time of a given application, a run time of a given application, an end time of a given application, or a combination thereof.
8. The system of Claim 1, wherein said system is configured for real-time monitoring.
9. A method for determining anomalous behavior by a subject, said method comprising:
(a) determining whether the subject is exhibiting anomalous behavior based on a behavioral biometric-based analysis; and
(b) determining whether the subject is exhibiting anomalous behavior based on the system resource usage,
wherein when the subject exhibits anomalous behavior as determined by both the behavioral biometric-based analysis and the system resource usage, it is an indication that the subject is behaving anomalously.
10. The method of Claim 9, wherein said step of determining anomalous behavior based on the system resource usage is conducted only when said step (a) indicates anomalous behavior by the subject.
11. The method of Claim 9, wherein said step (b) comprises analyzing a network resource usage pattern, a software usage pattern, a hardware resource usage pattern, or a combination thereof.
PCT/US2014/043528 2013-06-21 2014-06-20 Automated detection of insider threats Ceased WO2014205421A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361838149P 2013-06-21 2013-06-21
US61/838,149 2013-06-21

Publications (1)

Publication Number Publication Date
WO2014205421A1 true WO2014205421A1 (en) 2014-12-24

Family

ID=52105380

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/043528 Ceased WO2014205421A1 (en) 2013-06-21 2014-06-20 Automated detection of insider threats

Country Status (1)

Country Link
WO (1) WO2014205421A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107196942A (en) * 2017-05-24 2017-09-22 山东省计算中心(国家超级计算济南中心) A kind of inside threat detection method based on user language feature
NO20170249A1 (en) * 2017-02-20 2018-08-21 Jazz Networks Ltd Secure access by behavior recognition
US11082454B1 (en) * 2019-05-10 2021-08-03 Bank Of America Corporation Dynamically filtering and analyzing internal communications in an enterprise computing environment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040225627A1 (en) * 1999-10-25 2004-11-11 Visa International Service Association, A Delaware Corporation Synthesis of anomalous data to create artificial feature sets and use of same in computer network intrusion detection systems
US20050108562A1 (en) * 2003-06-18 2005-05-19 Khazan Roger I. Technique for detecting executable malicious code using a combination of static and dynamic analyses
US20070300301A1 (en) * 2004-11-26 2007-12-27 Gianluca Cangini Instrusion Detection Method and System, Related Network and Computer Program Product Therefor
US7519860B2 (en) * 2000-09-11 2009-04-14 Nokia Corporation System, device and method for automatic anomaly detection
US7860970B2 (en) * 2003-10-15 2010-12-28 International Business Machines Corporation Secure initialization of intrusion detection system
WO2011094484A1 (en) * 2010-01-28 2011-08-04 Drexel University Detection, diagnosis, and mitigation of software faults
US8135657B2 (en) * 2000-09-25 2012-03-13 Crossbeam Systems, Inc. Systems and methods for processing data flows
US20120137367A1 (en) * 2009-11-06 2012-05-31 Cataphora, Inc. Continuous anomaly detection based on behavior modeling and heterogeneous information analysis
US20130031598A1 (en) * 2010-11-18 2013-01-31 The Boeing Company Contextual-Based Virtual Data Boundaries




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14813910; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14813910; Country of ref document: EP; Kind code of ref document: A1)