US20090249477A1 - Method and system for determining whether a computer user is human
- Publication number
- US20090249477A1 (application Ser. No. 12/058,420)
- Authority
- US
- United States
- Prior art keywords
- user
- challenge
- information
- online service
- response
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/083—Network architectures or network communication protocols for network security for authentication of entities using passwords
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/107—Computer-aided management of electronic mailing [e-mailing]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/21—Monitoring or handling of messages
- H04L51/212—Monitoring or handling of messages using filtering or selective blocking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2133—Verifying human interaction, e.g., Captcha
Definitions
- the present invention relates to computer systems that allow users to create accounts. Specifically, the present invention relates to a method and system for determining whether a user setting up an account is a computer or human.
- Email websites and message boards offer an easy and cost effective way for individuals to communicate with one another. In many cases, these services are provided at no cost to the user. The user merely has to generate an account by providing information, such as a username, password and perhaps some personal information.
- a spam message typically includes unsolicited offers to sell some product or service. These messages tend to clutter the inbox of an email account and lead to aggravation on the part of the account's owner.
- One way to minimize the aggravation caused by these messages may be to identify the spammer that sends a spam message and block any new spam messages from the spammer via a junk mail filter.
- many spammers have taken advantage of the easy account generation described above and developed automated systems for generating numerous email addresses. In many instances, simply changing the email address by one character may be sufficient to circumvent the junk mail filters described above.
- CAPTCHA stands for Completely Automated Public Turing test to tell Computers and Humans Apart.
- the CAPTCHA may consist of a user challenge or image of several characters presented in a distorted fashion. The user may then be asked to solve the challenge or transcribe the text in the image.
- the CAPTCHA may be easily readable by a human, but not by a computer.
- CAPTCHA is a trademark of Carnegie Mellon University.
- CAPTCHAs may be vulnerable to relay attacks that use humans to solve the user challenge presented in the CAPTCHA.
- the CAPTCHA may be forwarded to a sweatshop of human operators who may be capable of solving the CAPTCHA.
- the CAPTCHA may be solved by posting the CAPTCHA on a website offering free services and asking users to solve the user challenge presented.
- the CAPTCHA may be utilized on a website offering pornography. Human users attempting to gain access to the website may be asked to solve the user challenge.
- the answer may be utilized by an automated system attempting to generate, for example, an email account on an email server.
- some email systems may restrain the number of mail messages that can be sent by a user until the user becomes trusted. Once trusted, however, the restraints may be removed. To overcome these safeguards, some automated systems may behave as a normal user. For example, the automated system may only send a small number of emails to a limited number of email addresses at any given time. However, once the automated system becomes trusted and the restraints have been removed, these automated systems may attempt to send millions of spam messages.
- the method may include collecting information about the online service user, generating a question based on the personal information, communicating the question to the online service user in the form of a CAPTCHA, and receiving a response to the question presented in the CAPTCHA, wherein a correct response is interpreted to mean that the online service user is human.
- the CAPTCHA may present the question in a distorted fashion so as to make it difficult for automated systems to read the question presented.
- the question may be based on personal information received during a registration process so as to make it impossible for another human unrelated to the online service user to know the correct answer.
- the method and system may also include measuring the response time in answering the question.
- FIG. 1 is a diagram depicting a computer user communicating with a web server via an internet connection in accordance with the present invention.
- FIG. 2 a is a web page for entering registration information in connection with a first embodiment of the invention
- FIG. 2 b is a first web page for logging into a user account in connection with the first embodiment of the invention
- FIG. 2 c is a second web page for logging into a user account in connection with the first embodiment of the invention
- FIG. 3 a is a first web page for logging into a user account in connection with a second embodiment of the invention
- FIG. 3 b is a second web page for logging into a user account in connection with the second embodiment of the invention.
- FIG. 4 is a flow diagram for verifying that a user is human in a first embodiment of the invention
- FIG. 5 a is an exemplary text question distorted utilizing a first distortion method that may be utilized in connection with the present invention
- FIG. 5 b is an exemplary text distorted utilizing a second distortion method that may be utilized in connection with the present invention
- FIG. 5 c is an exemplary text distorted utilizing a third distortion method that may be utilized in connection with the present invention.
- FIG. 6 illustrates a general computer system, which may represent any of the computing devices referenced herein.
- FIG. 1 shows a data communication system 150 .
- a computer user communicates with an email server via the internet connection.
- the system 150 includes a user 120 , a user terminal 100 , an email server 105 , a registration database 110 , and registration data 115 .
- the email server 105 may be utilized to communicate web pages to the computer user 120 via the user terminal 100 that may enable generating a user account for the user 120 on the email server 105 , logging into the email server, and creating and reading email messages.
- the email server 105 may be implemented using any conventional computer or other data processing device.
- the email server 105 may further be implemented using a specialized data processing device which has been particularly adapted to performing the functions of an email server. These functions include communicating with users operating user terminals such as the user terminal 100 , communicating with other networked equipment to transmit and receive email information including email messages and control information, and storing and retrieving email messages. Such messages and other email information may include data defining text, images, video, audio, or other information.
- the email server 105 may include a hardware device, a software application or combinations of the two.
- the email server 105 may also include timer software and circuitry that may enable determining response times of users.
- the registration database 110 may be utilized to store registration data 115 provided by the user 120 .
- the registration data 115 may include information, such as the user's 120 username, password, and address.
- the registration data 115 may also include personal information about the user 120 , such as a favorite color or favorite pet.
- the registration database 110 may store information about a plurality of registered users. For example, usernames and passwords for a plurality of users may be stored in the registration database 110 .
- personal information about the users may be stored in the registration database 110 .
- the personal information may include information such as a favorite color or favorite animal.
- the registration database 110 may reside in any type of memory.
- the memory may be a solid state memory or a magnetically based memory such as a hard drive.
- the user terminal 100 may be implemented using any conventional computer or other data processing device.
- the user terminal 100 may further be implemented using a specialized data processing device which has been particularly adapted to performing the functions of a user terminal. These functions include communicating with servers, such as the email server 105 or web servers, communicating with other networked equipment to transmit and receive email information including email messages and control information, and storing and retrieving email messages. Such messages and other email information may include data defining text, images, video, audio, or other information.
- the user terminal 100 may include a hardware device, a software application or combinations of the two.
- the user 120 may be required to generate a user account on the email server 105 .
- the user 120 may navigate to a website operating on an email server 105 offering free email services.
- the website may have a widget for generating new user accounts. Clicking the widget may cause the email server 105 to communicate to the user 120 a registration web page, such as the registration web page 200 shown in FIG. 2 a .
- This web page may enable the user 120 to specify, for example, basic information 205 , such as a name, address, city and state, username, password, and personal information 210 , such as a favorite color or favorite pet to be associated with the user account. This information may then be stored in the registration database 110 .
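As a concrete illustration of how the registration data 115 might be stored, the sketch below uses an in-memory SQLite table. The schema and field names (favorite_color, favorite_pet) are hypothetical; the patent does not prescribe a storage format.

```python
import sqlite3

def create_registration_db(path=":memory:"):
    # Minimal registration store; schema is illustrative only.
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE registration (
               username TEXT PRIMARY KEY,
               password TEXT,
               favorite_color TEXT,
               favorite_pet TEXT)"""
    )
    return conn

def register_user(conn, username, password, favorite_color, favorite_pet):
    # Store the basic and personal information supplied on the registration page.
    conn.execute(
        "INSERT INTO registration VALUES (?, ?, ?, ?)",
        (username, password, favorite_color, favorite_pet),
    )
    conn.commit()

conn = create_registration_db()
register_user(conn, "alice", "s3cret", "blue", "parrot")
row = conn.execute(
    "SELECT favorite_color FROM registration WHERE username = ?", ("alice",)
).fetchone()
print(row[0])  # blue
```

The stored personal fields are what the later challenge-generation step draws on.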
- the user 120 may then navigate to a first logon screen 215 as shown in FIG. 2 b , where the user 120 may be prompted to enter a username and password 220 .
- the user 120 may then be presented with a second logon screen 225 , as shown in FIG. 2 c .
- the second logon screen 225 may include a user challenge, such as a question to be solved.
- the question may ask the user 120 a question that only the user 120 would know. In this regard, the question may be based on the personal information 210 provided by the user 120 via the registration web page 200 .
- the question may be presented in the form of a CAPTCHA 225 .
- the question presented in the CAPTCHA may be visually distorted in such a way as to make it difficult or even impossible for an automated system to interpret.
- the user 120 may then be asked to solve the user challenge.
- the user 120 may then be allowed to access other web sites provided on the email server 105 , such as those associated with reading and writing email messages.
- the user 120 may have to provide the answer within a predetermined time. This may further help determine whether the user 120 is human because a human may be able to answer a question posed more quickly than a computer.
- Prompting the user 120 to solve the question and distorting the question may enable determining whether the user 120 is human rather than an automated system.
- the CAPTCHA may not be vulnerable to the relay attacks described above because other humans may not know the answers to the questions presented in the CAPTCHA: the humans attempting to solve the CAPTCHA will likely not know what personal information was utilized to generate it. For example, although a human at a relay site might be able to read a question, such as “what is your favorite color?”, the same human likely would not know what color was specified as the answer and thus would probably provide an incorrect answer to the question posed in the CAPTCHA. This combination may prevent automated systems, alone or in combination with human help, from generating the email addresses necessary for proliferating spam and other junk mail.
- the user 120 may not be required to register before using the system.
- the user 120 may be presented with a first logon screen 300 , as shown in FIG. 3 a .
- the first logon screen 300 may prompt the user 120 to enter basic information 305 , such as a username and password, and also personal information 310 , such as a favorite color or favorite pet.
- the user 120 may then be presented with a second logon screen 315 , as shown in FIG. 3 b .
- the second logon screen 315 may prompt the user 120 to answer a question presented in a CAPTCHA 320 .
- the question posed in the CAPTCHA 320 may be based on the personal information specified in the first logon screen 300 .
- protection against automated systems may be enhanced by asking several questions related to several pieces of personal information that may have been provided by the user 120 .
- personal information may be specified via drop down lists.
- a drop down list may be utilized to specify a favorite color and limit the number of responses.
- Images may be utilized as well. For example, images of various animals may be presented to the user 120 to enable the user 120 to specify a favorite animal.
- the challenge presented to the user 120 may be based on information collected about the activities of the user 120 .
- the user challenge may be a question, such as “which of the following user ids have you sent/received an email to/from in the last five days?” Then a list of user id choices may be presented to the user 120 where one of the user ids corresponds to a recipient/sender of an email that the user 120 recently sent/received.
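A minimal sketch of such an activity-based challenge, assuming the server can enumerate the user's recent correspondents. The helper name, decoy list, and multiple-choice format are illustrative assumptions, not part of the disclosure.

```python
import random

def activity_challenge(recent_correspondents, decoys, n_choices=4):
    """Build a multiple-choice challenge from the user's recent email
    activity: one correct recent correspondent padded with decoy ids
    the user has never corresponded with (hypothetical format)."""
    answer = random.choice(recent_correspondents)
    choices = random.sample(decoys, n_choices - 1) + [answer]
    random.shuffle(choices)
    question = ("Which of the following user ids have you sent or "
                "received an email to/from in the last five days?")
    return question, choices, answer

q, choices, answer = activity_challenge(
    ["bob@example.com"],
    ["eve@example.com", "mallory@example.com", "trent@example.com"],
)
assert answer in choices and len(choices) == 4
```

Only the account holder (or the server) knows which choice is correct, so a relay human cannot answer it.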
- Another example may be a question that asks the user 120 about recent web pages the user 120 may have visited. For example, the user 120 may be asked a question about an article that may have been on one of the web pages viewed.
- these sorts of questions may be presented to the user 120 when the email server 105 suspects the user 120 of being an automated system, such as when the user 120 takes too long to answer a CAPTCHA question about personal information or when the user 120 answers too many CAPTCHA questions incorrectly. This may be done so as to not bother an ordinary user who is not suspected of taking part in a spamming operation. These and other methods may further protect against automated systems.
- servers utilized for online banking may generate a CAPTCHA as described above and may communicate the CAPTCHA to a web browser operating on a personal computer. This may enable the banking server to verify the identity of the user of the web browser and may also enable verifying that a human is operating the web browser.
- FIG. 4 is a flow diagram for providing a logon screen for verifying that a user is a human in a first embodiment of the invention.
- logon information may be received.
- the user 120 may provide a username and password via a webpage such as the logon webpage 215 shown in FIG. 2 b .
- personal information stored in a database may be selected so as to create a user challenge based on the information.
- the email server 105 may, for example, randomly select personal information related to the user 120 from the registration database 110 described above.
- a text formatted question based on randomly selected personal information stored in the database may be created. An example of such a text question may be “what is your favorite color?” or “what is your favorite animal?”
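The question-generation step might be sketched as follows: one stored personal-information field is picked at random and mapped to its question template. The field names and templates are assumptions for illustration only.

```python
import random

# Hypothetical registration record; field names are illustrative.
registration_data = {
    "favorite_color": "blue",
    "favorite_animal": "parrot",
}

QUESTION_TEMPLATES = {
    "favorite_color": "what is your favorite color?",
    "favorite_animal": "what is your favorite animal?",
}

def make_question(record):
    # Randomly select one personal-information field and build the
    # text-formatted question for it, returning the expected answer too.
    field = random.choice(list(record))
    return QUESTION_TEMPLATES[field], record[field]

question, expected = make_question(registration_data)
print(question)  # e.g. "what is your favorite color?"
```

The text question produced here is what the next step converts into a distorted image.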
- the text formatted question may be converted into an image, such as the text image 500 shown in FIG. 5 a .
- the text image 500 may be distorted to make it difficult or impossible for an automated system to convert the text image 500 back into a text format.
- the text image 500 may be warped so as to prevent optical character recognition (OCR) software programs from deciphering the text.
- the text may be distorted as shown in the images 505 and 510 in FIG. 5 b and FIG. 5 c . In these images 505 and 510 , the characters in the text message contact one another. That is, there is no space between them. This may make it difficult to distinguish the individual characters, which may be one of the steps required by many OCR programs.
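The touching-characters distortion can be illustrated by the glyph-layout arithmetic alone: advancing each character by less than a full glyph width makes adjacent glyphs overlap, defeating the character-segmentation step many OCR programs rely on. The sketch below computes those positions; rendering to an actual image is omitted and glyph widths are simplified to a fixed 10 pixels.

```python
def layout_touching(text, char_width=10, overlap=2):
    """Compute x positions so adjacent glyphs contact or overlap one
    another. Simplified: real glyph widths vary per character."""
    step = char_width - overlap  # advance less than a full glyph width
    return [(ch, i * step) for i, ch in enumerate(text)]

positions = layout_touching("color?")
# each glyph starts before the previous one ends, so the glyphs touch
gap = positions[1][1] - (positions[0][1] + 10)
print(gap)  # -2
```

A negative gap means every pair of neighboring glyphs shares pixels, so there is no whitespace column for a segmenter to cut on.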
- the image may be communicated to a user.
- the image may be presented to the user 120 via the user terminal 100 during the logon process.
- a timer may be started as well. The timer may be utilized to measure elapsed time between communicating the image to the user and receiving a response.
- a response to the question presented in the image may be provided by the user. For example, in response to the question “what is your favorite animal?” the user 120 may specify “Parrot.”
- the response may be compared to the stored answer data associated with the previously generated text question.
- the mail server 105 described above may verify that the response to the question presented corresponds to the registration data 115 stored in the registration database 110 .
- the timer started above may be stopped so as to measure the elapsed time between communicating the message at block 420 and receiving the response at block 425 .
- a computer such as the email server 105 may determine whether the amount of time that elapsed between communicating the image to the user at block 420 and receiving the correct response from the user at block 425 is less than a threshold amount of time. For example, in the present embodiment, the email server 105 may allow for a turnaround time of 30 seconds. If the elapsed time is less than the threshold, then at block 450 the user may be successfully logged into the system. If the elapsed time is greater than the threshold, then the user may be required to re-enter the logon information at block 400 . Alternatively, the user may be barred from logging back in for a pre-determined amount of time, such as 1 hour. Yet another alternative may be to lock the user out indefinitely until the user contacts service personnel associated with the web services he may be trying to gain access to.
- the computer may check the number of failed attempts at answering the user challenge. If the number of attempts is below a threshold, then the process may go back to block 405 where a different text formatted question may be generated. If the number of failed attempts exceeds the threshold, then the user may be required to re-enter the logon information at block 400 . Alternatively, the user may be barred from logging back in for a pre-determined amount of time, such as 1 hour. Yet another alternative may be to lock the user out indefinitely until the user contacts service personnel associated with the web services he may be trying to gain access to.
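The decision logic described above (answer check, elapsed-time threshold, and failed-attempt count) might be condensed as follows. The 30-second limit matches the example in the text; the attempt threshold and return codes are illustrative assumptions.

```python
import time

TIME_LIMIT = 30.0   # seconds allowed for a response (example from the text)
MAX_ATTEMPTS = 3    # failed answers tolerated before forcing re-logon (assumed)

def check_response(expected, response, started, attempts, now=None):
    """Decide the outcome of one challenge round: 'ok' (log the user in),
    'retry' (pose a different question), or 'relogon' (restart logon,
    or apply a timed/indefinite lockout)."""
    now = time.monotonic() if now is None else now
    elapsed = now - started
    if response == expected and elapsed <= TIME_LIMIT:
        return "ok", attempts
    attempts += 1
    if elapsed > TIME_LIMIT or attempts >= MAX_ATTEMPTS:
        return "relogon", attempts
    return "retry", attempts  # back to question generation

# a quick, correct answer succeeds
status, n = check_response("blue", "blue", started=0.0, attempts=0, now=5.0)
print(status)  # ok
```

Even a correct answer fails when it arrives after the time limit, which is the property used to catch relay attacks that route the challenge through a human elsewhere.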
- FIG. 6 illustrates a general computer system, which may represent an email server 105 , user terminal 100 , or any of the other computing devices referenced herein.
- the computer system 600 may include a set of instructions 645 that may be executed to cause the computer system 600 to perform any one or more of the methods or computer based functions disclosed herein.
- the computer system 600 may operate as a standalone device or may be connected, e.g., using a network, to other computer systems or peripheral devices.
- the computer system may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment.
- the computer system 600 may also be implemented as or incorporated into various devices, such as a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile device, a palmtop computer, a laptop computer, a desktop computer, a communications device, a wireless telephone, a land-line telephone, a control system, a camera, a scanner, a facsimile machine, a printer, a pager, a personal trusted device, a web appliance, a network router, switch or bridge, or any other machine capable of executing a set of instructions 645 (sequential or otherwise) that specify actions to be taken by that machine.
- the computer system 600 may be implemented using electronic devices that provide voice, video or data communication. Further, while a single computer system 600 may be illustrated, the term “system” shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.
- the computer system 600 may include a processor 605 , such as, a central processing unit (CPU), a graphics processing unit (GPU), or both.
- the processor 605 may be a component in a variety of systems.
- the processor 605 may be part of a standard personal computer or a workstation.
- the processor 605 may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analyzing and processing data.
- the processor 605 may implement a software program, such as code generated manually (i.e., programmed).
- the computer system 600 may include a memory 610 that can communicate via a bus 620 .
- the registration database 110 may be stored in the memory.
- the memory 610 may be a main memory, a static memory, or a dynamic memory.
- the memory 610 may include, but may not be limited to computer readable storage media such as various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like.
- the memory 610 may include a cache or random access memory for the processor 605 .
- the memory 610 may be separate from the processor 605 , such as a cache memory of a processor, the system memory, or other memory.
- the memory 610 may be an external storage device or database for storing data. Examples may include a hard drive, compact disc (“CD”), digital video disc (“DVD”), memory card, memory stick, floppy disc, universal serial bus (“USB”) memory device, or any other device operative to store data.
- the memory 610 may be operable to store instructions 645 executable by the processor 605 .
- the functions, acts or tasks illustrated in the figures or described herein may be performed by the programmed processor 605 executing the instructions 645 stored in the memory 610 .
- processing strategies may include multiprocessing, multitasking, parallel processing and the like.
- the computer system 600 may further include a display 630 , such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, a cathode ray tube (CRT), a projector, a printer or other now known or later developed display device for outputting determined information.
- the display 630 may act as an interface for the user to see the functioning of the processor 605 , or specifically as an interface with the software stored in the memory 610 or in the drive unit 615 .
- the display 630 may be utilized to display, for example, whether a business organization is a candidate for transformation.
- the display 630 may also be utilized to display a transformation plan.
- the various reports and surveys described above may be presented on the display 630 .
- the computer system 600 may include an input device 625 configured to allow a user to interact with any of the components of system 600 .
- the input device 625 may be a number pad, a keyboard, or a cursor control device, such as a mouse, or a joystick, touch screen display, remote control or any other device operative to interact with the system 600 .
- the computer system 600 may also include a disk or optical drive unit 615 .
- the disk drive unit 615 may include a computer-readable medium 640 in which one or more sets of instructions 645 , e.g. software, can be embedded. Further, the instructions 645 may perform one or more of the methods or logic as described herein.
- the instructions 645 may reside completely, or at least partially, within the memory 610 and/or within the processor 605 during execution by the computer system 600 .
- the memory 610 and the processor 605 also may include computer-readable media as discussed above.
- the present disclosure contemplates a computer-readable medium 640 that includes instructions 645 or receives and executes instructions 645 responsive to a propagated signal; so that a device connected to a network 650 may communicate voice, video, audio, images or any other data over the network 650 .
- the instructions 645 may be implemented with hardware, software and/or firmware, or any combination thereof. Further, the instructions 645 may be transmitted or received over the network 650 via a communication interface 635 .
- the communication interface 635 may be a part of the processor 605 or may be a separate component.
- the communication interface 635 may be created in software or may be a physical connection in hardware.
- the communication interface 635 may be configured to connect with a network 650 , external media, the display 630 , or any other components in system 600 , or combinations thereof.
- the connection with the network 650 may be a physical connection, such as a wired Ethernet connection or may be established wirelessly as discussed below.
- the additional connections with other components of the system 600 may be physical connections or may be established wirelessly.
- the network 650 may include wired networks, wireless networks, or combinations thereof. Information related to business organizations may be provided via the network 650 .
- the wireless network may be a cellular telephone network, an 802.11, 802.16, 802.20, or WiMax network.
- the network 650 may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed including, but not limited to TCP/IP based networking protocols.
- the computer-readable medium 640 may be a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions.
- the term “computer-readable medium” may also include any medium that may be capable of storing, encoding or carrying a set of instructions for execution by a processor or that may cause a computer system to perform any one or more of the methods or operations disclosed herein.
- the computer-readable medium 640 may include a solid-state memory such as a memory card or other package that houses one or more non-volatile read-only memories.
- the computer-readable medium 640 also may be a random access memory or other volatile re-writable memory.
- the computer-readable medium 640 may include a magneto-optical or optical medium, such as a disk or tape, or another storage device to capture carrier wave signals such as a signal communicated over a transmission medium.
- a digital file attachment to an e-mail or other self-contained information archive or set of archives may be considered a distribution medium that may be a tangible storage medium. Accordingly, the disclosure may be considered to include any one or more of a computer-readable medium or a distribution medium and other equivalents and successor media, in which data or instructions may be stored.
- dedicated hardware implementations such as application specific integrated circuits, programmable logic arrays and other hardware devices, may be constructed to implement one or more of the methods described herein.
- Applications that may include the apparatus and systems of various embodiments may broadly include a variety of electronic and computer systems.
- One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that may be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system may encompass software, firmware, and hardware implementations.
- the method and system may be realized in hardware, software, or a combination of hardware and software.
- the method and system may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited.
- a typical combination of hardware and software may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
- the method and system may also be embedded in a computer program product, which includes all the features enabling the implementation of the methods described herein, and which, when loaded in a computer system, is able to carry out these methods.
- Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
- the embodiments disclosed herein provide an improved approach for verifying that a user is human rather than a computer. Rather than simply relying on prior CAPTCHA methods, which may be circumvented via relay attacks, this approach creates a CAPTCHA question based on randomly selected personal information only known to the user. The addition of personal information to the CAPTCHA renders the CAPTCHA less susceptible to circumvention because, while the humans that take part in the relay attack may be able to read the question, they may not know the answer.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- Computer Hardware Design (AREA)
- Human Resources & Organizations (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Entrepreneurship & Innovation (AREA)
- Strategic Management (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Marketing (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Economics (AREA)
- Software Systems (AREA)
- Data Mining & Analysis (AREA)
- Computing Systems (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
- 1. Field of Invention
- The present invention relates to computer systems that allow users to create accounts. Specifically, the present invention relates to a method and system for determining whether a user setting up an account is a computer or human.
- 2. Background Information
- The growth of the internet has fueled a boom in web-based applications. For example, commonly available applications include search engines, mapping tools, email websites, and message boards. Email websites and message boards offer an easy and cost-effective way for individuals to communicate with one another. In many cases, these services are provided at no cost to the user. The user merely has to generate an account by providing information, such as a username, a password, and perhaps some personal information.
- But along with the benefits of this enhanced communication has come the aggravation of junk mail, or spam messages. A spam message typically includes unsolicited offers to sell some product or service. These messages tend to clutter the inbox of an email account and lead to aggravation on the part of the owner of the account. One way to minimize this aggravation may be to identify the spammer that sends a spam message and block any new spam messages from that spammer via a junk mail filter. However, many spammers have taken advantage of the easy account generation described above and developed automated systems for generating numerous email addresses. In many instances, simply changing the email address by one character may be sufficient to circumvent the junk mail filters described above.
- One method utilized to prevent the abuse described above is to present a CAPTCHA (“Completely Automated Public Turing test to tell Computers and Humans Apart”) to the user attempting to create an account. The CAPTCHA may consist of a user challenge or image of several characters presented in a distorted fashion. The user may then be asked to solve the challenge or transcribe the text in the image. The CAPTCHA may be easily readable by a human, but not by a computer. CAPTCHA is a trademark of Carnegie Mellon University.
- However, CAPTCHAs may be vulnerable to relay attacks that use humans to solve the user challenge presented in the CAPTCHA. In some cases, the CAPTCHA may be forwarded to a sweatshop of human operators who may be capable of solving the CAPTCHA. In other instances, the CAPTCHA may be solved by posting the CAPTCHA on a website offering free services and asking users to solve the user challenge presented. For example, the CAPTCHA may be utilized on a website offering pornography. Human users attempting to gain access to the website may be asked to solve the user challenge. Once solved, the answer may be utilized by an automated system attempting to generate, for example, an email account on an email server.
- In an effort to limit the amount of spam an automated system may generate, some email systems may restrain the number of mail messages that can be sent by a user until the user becomes trusted. Once trusted, however, the restraints may be removed. To overcome these safeguards, some automated systems may behave as a normal user. For example, the automated system may only send a small number of emails to a limited number of email addresses at any given time. However, once the automated system becomes trusted and the restraints have been removed, these automated systems may attempt to send millions of spam messages.
- To address the problems outlined above, a method and system for determining whether an online service user is human is provided. In one implementation, the method may include collecting personal information about the online service user, generating a question based on the personal information, communicating the question to the online service user in the form of a CAPTCHA, and receiving a response to the question presented in the CAPTCHA, wherein a correct response is interpreted to mean that the online service user is human. The CAPTCHA may present the question in a distorted fashion so as to make it difficult for automated systems to read the question presented. The question may be based on personal information received during a registration process so as to make it difficult or impossible for another human, unrelated to the online service user, to know the correct answer. The method and system may also include measuring the response time in answering the question.
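The flow just described — randomly select a piece of personal information, pose it as a question, and accept only a correct answer given within a time limit — can be sketched as follows. This is an illustrative sketch, not the patented implementation; the field names, the helper names, and the 30-second default threshold are assumptions made for the example.

```python
import random
import time

# Illustrative registration record; the field names are assumptions, not from the patent.
REGISTRATION_DATA = {
    "username": "jdoe",
    "favorite color": "blue",
    "favorite animal": "parrot",
}

def generate_challenge(registration):
    """Randomly select one piece of personal information and build a question from it."""
    field = random.choice([k for k in registration if k != "username"])
    return field, "What is your " + field + "?"

def verify_response(registration, field, response, elapsed, threshold=30.0):
    """A correct answer given within the time limit is interpreted to mean 'human'."""
    return response.strip().lower() == registration[field].lower() and elapsed < threshold

field, question = generate_challenge(REGISTRATION_DATA)
start = time.time()
answer = REGISTRATION_DATA[field]  # stand-in for the reply a real user would type
elapsed = time.time() - start
print(question, "->", verify_response(REGISTRATION_DATA, field, answer, elapsed))
```

A wrong answer, or a correct answer arriving after the threshold, both fail the check, which mirrors the timing-based test described later in the flow diagram of FIG. 4.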
- FIG. 1 is a diagram depicting a computer user communicating with a web server via an internet connection in the present invention;
- FIG. 2 a is a web page for entering registration information in connection with a first embodiment of the invention;
- FIG. 2 b is a first web page for logging into a user account in connection with the first embodiment of the invention;
- FIG. 2 c is a second web page for logging into a user account in connection with the first embodiment of the invention;
- FIG. 3 a is a first web page for logging into a user account in connection with a second embodiment of the invention;
- FIG. 3 b is a second web page for logging into a user account in connection with the second embodiment of the invention;
- FIG. 4 is a flow diagram for verifying that a user is human in a first embodiment of the invention;
- FIG. 5 a is an exemplary text question distorted utilizing a first distortion method that may be utilized in connection with the present invention;
- FIG. 5 b is an exemplary text distorted utilizing a second distortion method that may be utilized in connection with the present invention;
- FIG. 5 c is an exemplary text distorted utilizing a third distortion method that may be utilized in connection with the present invention; and
- FIG. 6 illustrates a general computer system, which may represent any of the computing devices referenced herein.
- FIG. 1 shows a data communication system 150. In the data communication system 150, a computer user communicates with an email server via the internet connection. Referring to FIG. 1, the system 150 includes a user 120, a user terminal 100, an email server 105, a registration database 110, and registration data 115.
- The email server 105 may be utilized to communicate web pages to the computer user 120 via the user terminal 100 that may enable generating a user account for the user 120 on the email server 105, logging into the email server, and creating and reading email messages. The email server 105 may be implemented using any conventional computer or other data processing device. The email server 105 may further be implemented using a specialized data processing device which has been particularly adapted to performing the functions of an email server. These functions include communicating with users operating user terminals such as the user terminal 100, communicating with other networked equipment to transmit and receive email information including email messages and control information, and storing and retrieving email messages. Such messages and such email information may include data defining text, images, video, audio, or other information. The email server 105 may include a hardware device, a software application, or a combination of the two. The email server 105 may also include timer software and circuitry that may enable determining response times of users.
- The registration database 110 may be utilized to store registration data 115 provided by the user 120. The registration data 115 may include information such as the user's 120 username, password, and address. The registration data 115 may also include personal information about the user 120, such as a favorite color or favorite pet. The registration database 110 may store information about a plurality of registered users. For example, usernames and passwords for a plurality of users may be stored in the registration database 110, along with personal information about those users, such as a favorite color or favorite animal. The registration database 110 may reside in any type of memory. For example, the memory may be a solid state memory or a magnetically based memory, such as a hard drive.
- The user terminal 100 may be implemented using any conventional computer or other data processing device. The user terminal 100 may further be implemented using a specialized data processing device which has been particularly adapted to performing the functions of a user terminal. These functions include communicating with servers, such as the email server 105 or web servers, communicating with other networked equipment to transmit and receive email information including email messages and control information, and storing and retrieving email messages. Such messages and such email information may include data defining text, images, video, audio, or other information. The user terminal 100 may include a hardware device, a software application, or a combination of the two.
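As a concrete illustration of how the registration database 110 might hold this data, the following sketch uses an in-memory SQLite table. The schema, column names, and sample values are assumptions made for the example, not details from the patent.

```python
import sqlite3

# A minimal stand-in for the registration database 110; the schema is illustrative.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE registration (
                  username       TEXT PRIMARY KEY,
                  password       TEXT,
                  favorite_color TEXT,
                  favorite_pet   TEXT)""")
db.execute("INSERT INTO registration VALUES (?, ?, ?, ?)",
           ("jdoe", "s3cret", "blue", "parrot"))
db.commit()

# The email server 105 can later pull the personal fields back out
# to build a user challenge for this particular user.
row = db.execute("SELECT favorite_color, favorite_pet FROM registration "
                 "WHERE username = ?", ("jdoe",)).fetchone()
print(row)  # ('blue', 'parrot')
```

Any persistent store would do equally well; the point is only that the personal fields are retrievable per user when a challenge must be generated.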
- In operation, before being allowed to read and write email messages, the user 120 may be required to generate a user account on the email server 105. For example, the user 120 may navigate to a website operating on an email server 105 offering free email services. The website may have a widget for generating new user accounts. Clicking the widget may cause the email server 105 to communicate to the user 120 a registration web page, such as the registration web page 200 shown in FIG. 2 a. This web page may enable the user 120 to specify, for example, basic information 205, such as a name, address, city and state, username, and password, and personal information 210, such as a favorite color or favorite pet, to be associated with the user account. This information may then be stored in the registration database 110. After registering, the user 120 may then navigate to a first logon screen 215 as shown in FIG. 2 b, where the user 120 may be prompted to enter a username and password 220.
- After this, the user 120 may then be presented with a second logon screen 225, as shown in FIG. 2 c. The second logon screen 225 may include a user challenge, such as a question to be solved. The question may ask the user 120 something that only the user 120 would know. In this regard, the question may be based on the personal information 210 provided by the user 120 via the registration web page 200. The question may be presented in the form of a CAPTCHA 225. The question presented in the CAPTCHA may be visually distorted in such a way as to make it difficult or even impossible for an automated system to interpret. The user 120 may then be asked to solve the user challenge. Upon providing the correct solution to the question, the user 120 may then be allowed to access other web sites provided on the email server 105, such as those associated with reading and writing email messages. In addition, the user 120 may have to provide the answer within a predetermined time. This may further help determine whether the user 120 is human because a human may be able to answer the question posed more quickly than a computer.
- Prompting the user 120 to solve the question and distorting the question may enable determining whether the user 120 is human rather than an automated system. In addition, as the question is based on personal information, the CAPTCHA may not be vulnerable to the relay attacks described above: the humans attempting to solve the CAPTCHA will likely not know what personal information was utilized to generate it. For example, although a human at a relay site might be able to read a question such as "what is your favorite color?", the same human likely would not know what color was specified as the answer and thus would probably provide an incorrect answer to the question posed in the CAPTCHA. This combination may prevent automated systems, and automated systems in combination with human help, from generating the email addresses necessary for proliferating spam and other junk mail.
- In an alternative embodiment, the user 120 may not be required to register before using the system. In this case, the user 120 may be presented with a first logon screen 300, as shown in FIG. 3 a. The first logon screen 300 may prompt the user 120 to enter basic information 305, such as a username and password, and also personal information 310, such as a favorite color or favorite pet. After specifying this information, the user 120 may then be presented with a second logon screen 315, as shown in FIG. 3 b. The second logon screen 315 may prompt the user 120 to answer a question presented in a CAPTCHA 320. The question posed in the CAPTCHA 320 may be based on the personal information specified in the first logon screen 300.
- Other embodiments are contemplated as well. For example, protection against automated systems may be enhanced by asking several questions related to several pieces of personal information that may have been provided by the user 120. In addition, personal information may be specified via drop-down lists. For example, a drop-down list may be utilized to specify a favorite color and limit the number of responses. Images may be utilized as well. For example, images of various animals may be presented to the user 120 to enable the user 120 to specify a favorite animal.
- In yet other embodiments, the challenge presented to the user 120 may be based on information collected about the activities of the user 120. For example, the user challenge may be a question such as "which of the following user ids have you sent/received an email to/from in the last five days?" A list of user id choices may then be presented to the user 120, where one of the user ids corresponds to a recipient/sender of an email that the user 120 recently sent/received. Another example may be a question that asks the user 120 about recent web pages the user 120 may have visited. For example, the user 120 may be asked a question about an article that may have been on one of the web pages viewed.
- In an effort to improve the user experience, these sorts of questions may be presented to the user 120 when the email server 105 suspects the user 120 of being an automated system, such as when the user 120 takes too long to answer a CAPTCHA question about personal information or when the user 120 answers too many CAPTCHA questions incorrectly. This may be done so as to not bother an ordinary user who is not suspected of taking part in a spamming operation. These and other methods may further protect against automated systems.
- It is to be understood that the advantages described above are not limited to email systems. For example, the system may be adapted to operate with other systems in which a secure communication channel is desired. For example, servers utilized for online banking may generate a CAPTCHA as described above and may communicate the CAPTCHA to a web browser operating on a personal computer. This may enable the banking server to verify the identity of the user of the web browser and may also enable verifying that a human is operating the web browser.
- FIG. 4 is a flow diagram for providing a logon screen for verifying that a user is a human in a first embodiment of the invention. At block 400, logon information may be received. For example, the user 120 may provide a username and password via a web page such as the logon web page 215 shown in FIG. 2 b. At block 405, personal information stored in a database may be selected so as to create a user challenge based on the information. For example, the email server 105 may randomly select personal information related to the user 120 from the registration database 110 described above. At block 410, a text formatted question based on the randomly selected personal information stored in the database may be created. An example of such a text question may be "what is your favorite color?" or "what is your favorite animal?"
- At block 415, the text formatted question may be converted into an image, such as the text image 500 shown in FIG. 5 a. As shown in FIG. 5 a, the text image 500 may be distorted to make it difficult or impossible for an automated system to convert the text image 500 back into a text format. For example, the text image 500 may be warped so as to prevent optical character recognition (OCR) software programs from deciphering the text. There may be numerous other methods to distort the text as well. For example, the text may be distorted as shown in the images 505 and 510 in FIG. 5 b and FIG. 5 c. In these images 505 and 510, the characters in the text message contact one another; that is, there is no space between them. This may make it difficult to distinguish the individual characters, which may be one of the steps required by many OCR programs.
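As a rough illustration of the warping idea (not the specific distortion methods of FIGS. 5 a-5 c), the sketch below displaces each character of a question along a sine wave; a real implementation would rasterize the result to an image. The amplitude, wavelength, and spacing values are arbitrary choices for the example.

```python
import math

def warp_positions(text, amplitude=4.0, wavelength=30.0, spacing=7):
    """Compute a per-character (x, y) placement that follows a sine wave.

    A small spacing value makes adjacent glyphs touch when rendered,
    similar to the run-together distortions of images 505 and 510.
    """
    positions = []
    for i, ch in enumerate(text):
        x = i * spacing                                    # horizontal advance
        y = amplitude * math.sin(2 * math.pi * x / wavelength)  # vertical wobble
        positions.append((ch, x, round(y, 2)))
    return positions

for ch, x, y in warp_positions("what is your favorite color?")[:5]:
    print(ch, x, y)
```

Feeding these coordinates to any raster drawing routine would yield a wavy, overlapping rendering of the question that is harder for character-segmentation stages of OCR to process.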
- Referring back to FIG. 4, at block 420, the image may be communicated to a user. For example, the image may be presented to the user 120 via the user terminal 100 during the logon process. A timer may be started as well. The timer may be utilized to measure the elapsed time between communicating the image to the user and receiving a response. At block 425, a response to the question presented in the image may be provided by the user. For example, in response to the question "what is your favorite animal?" the user 120 may specify "Parrot." At block 430, the response may be compared to the data associated with the text question. For example, the email server 105 described above may verify that the response to the question presented corresponds to the registration data 115 stored in the registration database 110. The timer started above may be stopped so as to measure the elapsed time between communicating the image at block 420 and receiving the response at block 425.
- At block 435, if the response matches the registration data, then at block 440 a computer, such as the email server 105, may determine whether the amount of time that elapsed between communicating the image to the user at block 420 and receiving the correct response from the user at block 425 is less than a threshold amount of time. For example, in the present embodiment, the email server 105 may allow for a turnaround time of 30 seconds. If the elapsed time is less than the threshold, then at block 450 the user may be successfully logged into the system. If the elapsed time is greater than the threshold, then the user may be required to re-enter the logon information at block 400. Alternatively, the user may be barred from logging back in for a pre-determined amount of time, such as 1 hour. Yet another alternative may be to lock the user out indefinitely until the user contacts service personnel associated with the web services he may be trying to gain access to.
- Referring back to block 435, if the response is incorrect, then at block 445 the computer may check the number of failed attempts at answering the user challenge. If the number of attempts is below a threshold, then the process may go back to block 405, where a different text formatted question may be generated. If the number of failed attempts exceeds the threshold, then the user may be required to re-enter the logon information at block 400. Alternatively, the user may be barred from logging back in for a pre-determined amount of time, such as 1 hour. Yet another alternative may be to lock the user out indefinitely until the user contacts service personnel associated with the web services he may be trying to gain access to.
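The branching at blocks 435-450 can be summarized in a small decision function. The 30-second threshold comes from the embodiment above, while the failed-attempt limit and the returned action strings are assumptions made for illustration; the patent leaves those values open.

```python
THRESHOLD_SECONDS = 30.0   # turnaround time allowed in the embodiment above
MAX_FAILED_ATTEMPTS = 3    # assumed value; the failure threshold is not specified

def next_step(response_correct, elapsed_seconds, failed_attempts):
    """Decide what happens after a CAPTCHA response, per blocks 435-450."""
    if response_correct:
        if elapsed_seconds < THRESHOLD_SECONDS:
            return "log in"                      # block 450: success
        return "re-enter logon information"      # correct, but too slow
    if failed_attempts < MAX_FAILED_ATTEMPTS:
        return "ask a different question"        # back to block 405
    return "re-enter logon information"          # or bar / lock out the user

print(next_step(True, 12.0, 0))   # log in
print(next_step(True, 45.0, 0))   # re-enter logon information
print(next_step(False, 5.0, 1))   # ask a different question
```

The timed-out and exhausted-attempts branches could equally return the temporary-bar or indefinite-lockout actions described above; the function shows only the default path.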
- FIG. 6 illustrates a general computer system, which may represent an email server 105, a user terminal 100, or any of the other computing devices referenced herein. The computer system 600 may include a set of instructions 645 that may be executed to cause the computer system 600 to perform any one or more of the methods or computer based functions disclosed herein. The computer system 600 may operate as a standalone device or may be connected, e.g., using a network, to other computer systems or peripheral devices.
- In a networked deployment, the computer system may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The computer system 600 may also be implemented as or incorporated into various devices, such as a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile device, a palmtop computer, a laptop computer, a desktop computer, a communications device, a wireless telephone, a land-line telephone, a control system, a camera, a scanner, a facsimile machine, a printer, a pager, a personal trusted device, a web appliance, a network router, switch or bridge, or any other machine capable of executing a set of instructions 645 (sequential or otherwise) that specify actions to be taken by that machine. In one embodiment, the computer system 600 may be implemented using electronic devices that provide voice, video or data communication. Further, while a single computer system 600 may be illustrated, the term "system" shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.
- As illustrated in FIG. 6, the computer system 600 may include a processor 605, such as a central processing unit (CPU), a graphics processing unit (GPU), or both. The processor 605 may be a component in a variety of systems. For example, the processor 605 may be part of a standard personal computer or a workstation. The processor 605 may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analyzing and processing data. The processor 605 may implement a software program, such as code generated manually (i.e., programmed).
- The computer system 600 may include a memory 610 that can communicate via a bus 620. For example, the registration database 110 may be stored in the memory. The memory 610 may be a main memory, a static memory, or a dynamic memory. The memory 610 may include, but is not limited to, computer readable storage media such as various types of volatile and non-volatile storage media, including random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media, and the like. In one case, the memory 610 may include a cache or random access memory for the processor 605. Alternatively or in addition, the memory 610 may be separate from the processor 605, such as a cache memory of a processor, the system memory, or other memory. The memory 610 may be an external storage device or database for storing data. Examples may include a hard drive, compact disc ("CD"), digital video disc ("DVD"), memory card, memory stick, floppy disc, universal serial bus ("USB") memory device, or any other device operative to store data. The memory 610 may be operable to store instructions 645 executable by the processor 605. The functions, acts or tasks illustrated in the figures or described herein may be performed by the programmed processor 605 executing the instructions 645 stored in the memory 610. The functions, acts or tasks may be independent of the particular type of instruction set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro-code and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing and the like.
- The computer system 600 may further include a display 630, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, a cathode ray tube (CRT), a projector, a printer or other now known or later developed display device for outputting determined information. The display 630 may act as an interface for the user to see the functioning of the processor 605, or specifically as an interface with the software stored in the memory 610 or in the drive unit 615. In this regard, the display 630 may be utilized to display, for example, the logon screens and the distorted CAPTCHA images described above.
- Additionally, the computer system 600 may include an input device 625 configured to allow a user to interact with any of the components of system 600. The input device 625 may be a number pad, a keyboard, or a cursor control device, such as a mouse, or a joystick, touch screen display, remote control or any other device operative to interact with the system 600.
- The computer system 600 may also include a disk or optical drive unit 615. The disk drive unit 615 may include a computer-readable medium 640 in which one or more sets of instructions 645, e.g. software, can be embedded. Further, the instructions 645 may perform one or more of the methods or logic as described herein. The instructions 645 may reside completely, or at least partially, within the memory 610 and/or within the processor 605 during execution by the computer system 600. The memory 610 and the processor 605 also may include computer-readable media as discussed above.
- The present disclosure contemplates a computer-readable medium 640 that includes instructions 645 or receives and executes instructions 645 responsive to a propagated signal, so that a device connected to a network 650 may communicate voice, video, audio, images or any other data over the network 650. The instructions 645 may be implemented with hardware, software and/or firmware, or any combination thereof. Further, the instructions 645 may be transmitted or received over the network 650 via a communication interface 635. The communication interface 635 may be a part of the processor 605 or may be a separate component. The communication interface 635 may be created in software or may be a physical connection in hardware. The communication interface 635 may be configured to connect with a network 650, external media, the display 630, or any other components in system 600, or combinations thereof. The connection with the network 650 may be a physical connection, such as a wired Ethernet connection, or may be established wirelessly as discussed below. Likewise, the additional connections with other components of the system 600 may be physical connections or may be established wirelessly.
- The network 650 may include wired networks, wireless networks, or combinations thereof. Registration information and user challenges may be provided via the network 650. The wireless network may be a cellular telephone network, an 802.11, 802.16, 802.20, or WiMax network. Further, the network 650 may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed, including, but not limited to, TCP/IP based networking protocols.
- The computer-readable medium 640 may be a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. The term "computer-readable medium" may also include any medium that may be capable of storing, encoding or carrying a set of instructions for execution by a processor or that may cause a computer system to perform any one or more of the methods or operations disclosed herein.
- The computer-readable medium 640 may include a solid-state memory such as a memory card or other package that houses one or more non-volatile read-only memories. The computer-readable medium 640 also may be a random access memory or other volatile re-writable memory. Additionally, the computer-readable medium 640 may include a magneto-optical or optical medium, such as a disk or tape, or other storage device to capture carrier wave signals, such as a signal communicated over a transmission medium. A digital file attachment to an e-mail or other self-contained information archive or set of archives may be considered a distribution medium that may be a tangible storage medium. Accordingly, the disclosure may be considered to include any one or more of a computer-readable medium or a distribution medium and other equivalents and successor media, in which data or instructions may be stored.
- Alternatively or in addition, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays and other hardware devices, may be constructed to implement one or more of the methods described herein. Applications that may include the apparatus and systems of various embodiments may broadly include a variety of electronic and computer systems. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that may be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system may encompass software, firmware, and hardware implementations.
- Accordingly, the method and system may be realized in hardware, software, or a combination of hardware and software. The method and system may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
- The method and system may also be embedded in a computer program product, which includes all of the features enabling the implementation of the methods described herein, and which, when loaded in a computer system, is able to carry out these methods. Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
- While the method and system have been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope. In addition, many modifications may be made to adapt a particular situation or material to the teachings without departing from its scope. Therefore, it is intended that the present method and system not be limited to the particular embodiment disclosed, but that the method and system include all embodiments falling within the scope of the appended claims.
- From the foregoing, it may be seen that the embodiments disclosed herein provide an improved approach for verifying that a user is human rather than a computer. Rather than relying solely on prior CAPTCHA methods, which may be circumvented via relay attacks, this approach creates a CAPTCHA question based on randomly selected personal information known only to the user. The addition of personal information to the CAPTCHA renders it less susceptible to circumvention because, while the humans who take part in the relay attack may be able to read the question, they may not know the answer.
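The personalized-challenge idea above can be illustrated with a minimal sketch. The patent does not supply code; the profile store, field names, and function names below are hypothetical, and a real deployment would render the question as a distorted image and draw on information the online service actually holds about the user:

```python
import random

# Hypothetical profile store: personal information collected at
# registration, known only to the online service and the user.
USER_PROFILES = {
    "alice": {
        "city of birth": "Denver",
        "first pet's name": "Rex",
        "mother's maiden name": "Nguyen",
    },
}

def generate_challenge(user_id, rng=random):
    """Pick one personal-information item at random and phrase it as a
    challenge question.  Returns (question, expected_answer)."""
    profile = USER_PROFILES[user_id]
    field = rng.choice(sorted(profile))          # random selection per attempt
    question = f"What is your {field}?"
    return question, profile[field]

def verify_response(expected, response):
    """Accept the response if it matches the stored value, ignoring
    case and surrounding whitespace."""
    return response.strip().lower() == expected.strip().lower()
```

Even if a relay attacker forwards the rendered question to a human solver, the solver can read it but cannot supply the stored answer, which is the property the embodiments rely on.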
Claims (23)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/058,420 US20090249477A1 (en) | 2008-03-28 | 2008-03-28 | Method and system for determining whether a computer user is human |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/058,420 US20090249477A1 (en) | 2008-03-28 | 2008-03-28 | Method and system for determining whether a computer user is human |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20090249477A1 true US20090249477A1 (en) | 2009-10-01 |
Family
ID=41119212
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/058,420 Abandoned US20090249477A1 (en) | 2008-03-28 | 2008-03-28 | Method and system for determining whether a computer user is human |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US20090249477A1 (en) |
Cited By (40)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070011066A1 (en) * | 2005-07-08 | 2007-01-11 | Microsoft Corporation | Secure online transactions using a trusted digital identity |
| US20070143624A1 (en) * | 2005-12-15 | 2007-06-21 | Microsoft Corporation | Client-side captcha ceremony for user verification |
| US20100031330A1 (en) * | 2007-01-23 | 2010-02-04 | Carnegie Mellon University | Methods and apparatuses for controlling access to computer systems and for annotating media files |
| US20100122340A1 (en) * | 2008-11-13 | 2010-05-13 | Palo Alto Research Center Incorporated | Enterprise password reset |
| US20100162357A1 (en) * | 2008-12-19 | 2010-06-24 | Microsoft Corporation | Image-based human interactive proofs |
| US20100229223A1 (en) * | 2009-03-06 | 2010-09-09 | Facebook, Inc. | Using social information for authenticating a user session |
| JP2012003467A (en) * | 2010-06-16 | 2012-01-05 | Ricoh Co Ltd | Authentication device, authentication system, and authentication method |
| US8196198B1 (en) | 2008-12-29 | 2012-06-05 | Google Inc. | Access using images |
| CN102542137A (en) * | 2010-12-21 | 2012-07-04 | F2威尔股份有限公司 | Method and system for processing data based on full-automatic human and computer distinguishing test data |
| US20120210393A1 (en) * | 2010-08-31 | 2012-08-16 | Rakuten, Inc. | Response determination apparatus, response determination method, response determination program, recording medium, and response determination system |
| US8392986B1 (en) * | 2009-06-17 | 2013-03-05 | Google Inc. | Evaluating text-based access strings |
| US20130218566A1 (en) * | 2012-02-17 | 2013-08-22 | Microsoft Corporation | Audio human interactive proof based on text-to-speech and semantics |
| US8542251B1 (en) | 2008-10-20 | 2013-09-24 | Google Inc. | Access using image-based manipulation |
| US20130276125A1 (en) * | 2008-04-01 | 2013-10-17 | Leap Marketing Technologies Inc. | Systems and methods for assessing security risk |
| US8621396B1 (en) | 2008-10-20 | 2013-12-31 | Google Inc. | Access using image-based manipulation |
| US20140059663A1 (en) * | 2011-08-05 | 2014-02-27 | EngageClick, Inc. | System and method for creating and implementing scalable and effective multi-media objects with human interaction proof (hip) capabilities |
| WO2014044507A1 (en) * | 2012-09-20 | 2014-03-27 | Endress+Hauser Flowtec Ag | Method for the secure operation of a field device |
| US8693807B1 (en) | 2008-10-20 | 2014-04-08 | Google Inc. | Systems and methods for providing image feedback |
| US8745698B1 (en) * | 2009-06-09 | 2014-06-03 | Bank Of America Corporation | Dynamic authentication engine |
| US8856954B1 (en) * | 2010-12-29 | 2014-10-07 | Emc Corporation | Authenticating using organization based information |
| WO2015102510A1 (en) * | 2013-12-30 | 2015-07-09 | Limited Liability Company Mail.Ru | Systems and methods for determining whether user is human |
| US20150271166A1 (en) * | 2011-03-24 | 2015-09-24 | AYaH, LLC | Method for generating a human likeness score |
| US9378354B2 (en) | 2008-04-01 | 2016-06-28 | Nudata Security Inc. | Systems and methods for assessing security risk |
| WO2017040570A1 (en) * | 2015-09-01 | 2017-03-09 | Alibaba Group Holding Limited | System and method for authentication |
| US9648034B2 (en) | 2015-09-05 | 2017-05-09 | Nudata Security Inc. | Systems and methods for detecting and scoring anomalies |
| US20170154173A1 (en) * | 2015-11-27 | 2017-06-01 | Chao-Hung Wang | Array password authentication system and method thereof |
| US9723005B1 (en) * | 2014-09-29 | 2017-08-01 | Amazon Technologies, Inc. | Turing test via reaction to test modifications |
| US9767263B1 (en) | 2014-09-29 | 2017-09-19 | Amazon Technologies, Inc. | Turing test via failure |
| US9860247B2 (en) * | 2013-01-04 | 2018-01-02 | Gary Stephen Shuster | CAPTCHA systems and methods |
| US9985943B1 (en) | 2013-12-18 | 2018-05-29 | Amazon Technologies, Inc. | Automated agent detection using multiple factors |
| US9990487B1 (en) | 2017-05-05 | 2018-06-05 | Mastercard Technologies Canada ULC | Systems and methods for distinguishing among human users and software robots |
| US10007776B1 (en) | 2017-05-05 | 2018-06-26 | Mastercard Technologies Canada ULC | Systems and methods for distinguishing among human users and software robots |
| US10127373B1 (en) | 2017-05-05 | 2018-11-13 | Mastercard Technologies Canada ULC | Systems and methods for distinguishing among human users and software robots |
| US10438225B1 (en) | 2013-12-18 | 2019-10-08 | Amazon Technologies, Inc. | Game-based automated agent detection |
| US10511611B2 (en) * | 2016-09-11 | 2019-12-17 | Cisco Technology, Inc. | Conditional content access |
| US10558789B2 (en) | 2011-08-05 | 2020-02-11 | [24]7.ai, Inc. | Creating and implementing scalable and effective multimedia objects with human interaction proof (HIP) capabilities, with challenges comprising different levels of difficulty based on the degree on suspiciousness |
| US10778694B2 (en) | 2018-08-29 | 2020-09-15 | International Business Machines Corporation | Location detection based presentation |
| US20200396277A1 (en) * | 2014-06-24 | 2020-12-17 | Alibaba Group Holding Limited | Method and system for securely identifying users |
| US11693943B2 (en) | 2018-07-06 | 2023-07-04 | International Business Machines Corporation | Authenticating a user via a customized image-based challenge |
| US11775853B2 (en) | 2007-11-19 | 2023-10-03 | Nobots Llc | Systems, methods and apparatus for evaluating status of computing device user |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20040199597A1 (en) * | 2003-04-04 | 2004-10-07 | Yahoo! Inc. | Method and system for image verification to prevent messaging abuse |
| US20080189768A1 (en) * | 2007-02-02 | 2008-08-07 | Ezra Callahan | System and method for determining a trust level in a social network environment |
| US20090235327A1 (en) * | 2008-03-11 | 2009-09-17 | Palo Alto Research Center Incorporated | Selectable captchas |
| US7624277B1 (en) * | 2003-02-25 | 2009-11-24 | Microsoft Corporation | Content alteration for prevention of unauthorized scripts |
| US8019127B2 (en) * | 2006-09-13 | 2011-09-13 | George Mason Intellectual Properties, Inc. | Image based turing test |
| US8036902B1 (en) * | 2006-06-21 | 2011-10-11 | Tellme Networks, Inc. | Audio human verification |
| US8056129B2 (en) * | 2007-04-19 | 2011-11-08 | International Business Machines Corporation | Validating active computer terminal sessions |
| US8073912B2 (en) * | 2007-07-13 | 2011-12-06 | Michael Gregor Kaplan | Sender authentication for difficult to classify email |
- 2008-03-28 US US12/058,420 patent/US20090249477A1/en not_active Abandoned
Cited By (82)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070011066A1 (en) * | 2005-07-08 | 2007-01-11 | Microsoft Corporation | Secure online transactions using a trusted digital identity |
| US9213992B2 (en) | 2005-07-08 | 2015-12-15 | Microsoft Technology Licensing, Llc | Secure online transactions using a trusted digital identity |
| US20070143624A1 (en) * | 2005-12-15 | 2007-06-21 | Microsoft Corporation | Client-side captcha ceremony for user verification |
| US8782425B2 (en) | 2005-12-15 | 2014-07-15 | Microsoft Corporation | Client-side CAPTCHA ceremony for user verification |
| US8145914B2 (en) * | 2005-12-15 | 2012-03-27 | Microsoft Corporation | Client-side CAPTCHA ceremony for user verification |
| US20100031330A1 (en) * | 2007-01-23 | 2010-02-04 | Carnegie Mellon University | Methods and apparatuses for controlling access to computer systems and for annotating media files |
| US9600648B2 (en) | 2007-01-23 | 2017-03-21 | Carnegie Mellon University | Methods and apparatuses for controlling access to computer systems and for annotating media files |
| US8555353B2 (en) | 2007-01-23 | 2013-10-08 | Carnegie Mellon University | Methods and apparatuses for controlling access to computer systems and for annotating media files |
| US11836647B2 (en) | 2007-11-19 | 2023-12-05 | Nobots Llc | Systems, methods and apparatus for evaluating status of computing device user |
| US11810014B2 (en) | 2007-11-19 | 2023-11-07 | Nobots Llc | Systems, methods and apparatus for evaluating status of computing device user |
| US11775853B2 (en) | 2007-11-19 | 2023-10-03 | Nobots Llc | Systems, methods and apparatus for evaluating status of computing device user |
| US11036847B2 (en) | 2008-04-01 | 2021-06-15 | Mastercard Technologies Canada ULC | Systems and methods for assessing security risk |
| US9946864B2 (en) | 2008-04-01 | 2018-04-17 | Nudata Security Inc. | Systems and methods for implementing and tracking identification tests |
| US10997284B2 (en) | 2008-04-01 | 2021-05-04 | Mastercard Technologies Canada ULC | Systems and methods for assessing security risk |
| US10839065B2 (en) | 2008-04-01 | 2020-11-17 | Mastercard Technologies Canada ULC | Systems and methods for assessing security risk |
| US9378354B2 (en) | 2008-04-01 | 2016-06-28 | Nudata Security Inc. | Systems and methods for assessing security risk |
| US9842204B2 (en) * | 2008-04-01 | 2017-12-12 | Nudata Security Inc. | Systems and methods for assessing security risk |
| US20130276125A1 (en) * | 2008-04-01 | 2013-10-17 | Leap Marketing Technologies Inc. | Systems and methods for assessing security risk |
| US9633190B2 (en) | 2008-04-01 | 2017-04-25 | Nudata Security Inc. | Systems and methods for assessing security risk |
| US8693807B1 (en) | 2008-10-20 | 2014-04-08 | Google Inc. | Systems and methods for providing image feedback |
| US8621396B1 (en) | 2008-10-20 | 2013-12-31 | Google Inc. | Access using image-based manipulation |
| US8542251B1 (en) | 2008-10-20 | 2013-09-24 | Google Inc. | Access using image-based manipulation |
| US20100122340A1 (en) * | 2008-11-13 | 2010-05-13 | Palo Alto Research Center Incorporated | Enterprise password reset |
| US8881266B2 (en) * | 2008-11-13 | 2014-11-04 | Palo Alto Research Center Incorporated | Enterprise password reset |
| US20100162357A1 (en) * | 2008-12-19 | 2010-06-24 | Microsoft Corporation | Image-based human interactive proofs |
| US8196198B1 (en) | 2008-12-29 | 2012-06-05 | Google Inc. | Access using images |
| US8332937B1 (en) | 2008-12-29 | 2012-12-11 | Google Inc. | Access using images |
| US20100229223A1 (en) * | 2009-03-06 | 2010-09-09 | Facebook, Inc. | Using social information for authenticating a user session |
| US8910251B2 (en) * | 2009-03-06 | 2014-12-09 | Facebook, Inc. | Using social information for authenticating a user session |
| US8745698B1 (en) * | 2009-06-09 | 2014-06-03 | Bank Of America Corporation | Dynamic authentication engine |
| US8392986B1 (en) * | 2009-06-17 | 2013-03-05 | Google Inc. | Evaluating text-based access strings |
| JP2012003467A (en) * | 2010-06-16 | 2012-01-05 | Ricoh Co Ltd | Authentication device, authentication system, and authentication method |
| CN102687160A (en) * | 2010-08-31 | 2012-09-19 | 乐天株式会社 | Response determining device,response determining method,response determining program,recording medium and response determining system |
| CN102687160B (en) * | 2010-08-31 | 2015-12-16 | 乐天株式会社 | Response decision maker, response decision method and response decision-making system |
| US8863233B2 (en) * | 2010-08-31 | 2014-10-14 | Rakuten, Inc. | Response determination apparatus, response determination method, response determination program, recording medium, and response determination system |
| EP2472428A4 (en) * | 2010-08-31 | 2017-11-22 | Rakuten, Inc. | Response determining device, response determining method, response determining program, recording medium and response determining system |
| US20120210393A1 (en) * | 2010-08-31 | 2012-08-16 | Rakuten, Inc. | Response determination apparatus, response determination method, response determination program, recording medium, and response determination system |
| KR101385352B1 (en) | 2010-08-31 | 2014-04-14 | 라쿠텐 인코포레이티드 | Response determining device, response determining method, recording medium and response determining system |
| CN102542137A (en) * | 2010-12-21 | 2012-07-04 | F2威尔股份有限公司 | Method and system for processing data based on full-automatic human and computer distinguishing test data |
| US8856954B1 (en) * | 2010-12-29 | 2014-10-07 | Emc Corporation | Authenticating using organization based information |
| US11687631B2 (en) | 2011-03-24 | 2023-06-27 | Imperva, Inc. | Method for generating a human likeness score |
| US10068075B2 (en) * | 2011-03-24 | 2018-09-04 | Distil Networks, Inc. | Method for generating a human likeness score |
| US11423130B2 (en) | 2011-03-24 | 2022-08-23 | Imperva, Inc. | Method for generating a human likeness score |
| US20150271166A1 (en) * | 2011-03-24 | 2015-09-24 | AYaH, LLC | Method for generating a human likeness score |
| US10558789B2 (en) | 2011-08-05 | 2020-02-11 | [24]7.ai, Inc. | Creating and implementing scalable and effective multimedia objects with human interaction proof (HIP) capabilities, with challenges comprising different levels of difficulty based on the degree on suspiciousness |
| US20140059663A1 (en) * | 2011-08-05 | 2014-02-27 | EngageClick, Inc. | System and method for creating and implementing scalable and effective multi-media objects with human interaction proof (hip) capabilities |
| US9621528B2 (en) * | 2011-08-05 | 2017-04-11 | 24/7 Customer, Inc. | Creating and implementing scalable and effective multimedia objects with human interaction proof (HIP) capabilities, with challenges comprising secret question and answer created by user, and advertisement corresponding to the secret question |
| US20130218566A1 (en) * | 2012-02-17 | 2013-08-22 | Microsoft Corporation | Audio human interactive proof based on text-to-speech and semantics |
| US10319363B2 (en) * | 2012-02-17 | 2019-06-11 | Microsoft Technology Licensing, Llc | Audio human interactive proof based on text-to-speech and semantics |
| WO2014044507A1 (en) * | 2012-09-20 | 2014-03-27 | Endress+Hauser Flowtec Ag | Method for the secure operation of a field device |
| US9860247B2 (en) * | 2013-01-04 | 2018-01-02 | Gary Stephen Shuster | CAPTCHA systems and methods |
| US10298569B2 (en) * | 2013-01-04 | 2019-05-21 | Gary Stephen Shuster | CAPTCHA systems and methods |
| US10438225B1 (en) | 2013-12-18 | 2019-10-08 | Amazon Technologies, Inc. | Game-based automated agent detection |
| US9985943B1 (en) | 2013-12-18 | 2018-05-29 | Amazon Technologies, Inc. | Automated agent detection using multiple factors |
| WO2015102510A1 (en) * | 2013-12-30 | 2015-07-09 | Limited Liability Company Mail.Ru | Systems and methods for determining whether user is human |
| US20200396277A1 (en) * | 2014-06-24 | 2020-12-17 | Alibaba Group Holding Limited | Method and system for securely identifying users |
| US11677811B2 (en) * | 2014-06-24 | 2023-06-13 | Advanced New Technologies Co., Ltd. | Method and system for securely identifying users |
| US10262121B2 (en) | 2014-09-29 | 2019-04-16 | Amazon Technologies, Inc. | Turing test via failure |
| US9723005B1 (en) * | 2014-09-29 | 2017-08-01 | Amazon Technologies, Inc. | Turing test via reaction to test modifications |
| US9767263B1 (en) | 2014-09-29 | 2017-09-19 | Amazon Technologies, Inc. | Turing test via failure |
| US10333939B2 (en) | 2015-09-01 | 2019-06-25 | Alibaba Group Holding Limited | System and method for authentication |
| WO2017040570A1 (en) * | 2015-09-01 | 2017-03-09 | Alibaba Group Holding Limited | System and method for authentication |
| US9979747B2 (en) | 2015-09-05 | 2018-05-22 | Mastercard Technologies Canada ULC | Systems and methods for detecting and preventing spoofing |
| US9749358B2 (en) | 2015-09-05 | 2017-08-29 | Nudata Security Inc. | Systems and methods for matching and scoring sameness |
| US10212180B2 (en) | 2015-09-05 | 2019-02-19 | Mastercard Technologies Canada ULC | Systems and methods for detecting and preventing spoofing |
| US9648034B2 (en) | 2015-09-05 | 2017-05-09 | Nudata Security Inc. | Systems and methods for detecting and scoring anomalies |
| US9813446B2 (en) | 2015-09-05 | 2017-11-07 | Nudata Security Inc. | Systems and methods for matching and scoring sameness |
| US10749884B2 (en) | 2015-09-05 | 2020-08-18 | Mastercard Technologies Canada ULC | Systems and methods for detecting and preventing spoofing |
| US9680868B2 (en) | 2015-09-05 | 2017-06-13 | Nudata Security Inc. | Systems and methods for matching and scoring sameness |
| US10805328B2 (en) | 2015-09-05 | 2020-10-13 | Mastercard Technologies Canada ULC | Systems and methods for detecting and scoring anomalies |
| US9800601B2 (en) | 2015-09-05 | 2017-10-24 | Nudata Security Inc. | Systems and methods for detecting and scoring anomalies |
| US10129279B2 (en) | 2015-09-05 | 2018-11-13 | Mastercard Technologies Canada ULC | Systems and methods for detecting and preventing spoofing |
| US10965695B2 (en) | 2015-09-05 | 2021-03-30 | Mastercard Technologies Canada ULC | Systems and methods for matching and scoring sameness |
| US9749357B2 (en) | 2015-09-05 | 2017-08-29 | Nudata Security Inc. | Systems and methods for matching and scoring sameness |
| US9749356B2 (en) | 2015-09-05 | 2017-08-29 | Nudata Security Inc. | Systems and methods for detecting and scoring anomalies |
| US20170154173A1 (en) * | 2015-11-27 | 2017-06-01 | Chao-Hung Wang | Array password authentication system and method thereof |
| US10511611B2 (en) * | 2016-09-11 | 2019-12-17 | Cisco Technology, Inc. | Conditional content access |
| US9990487B1 (en) | 2017-05-05 | 2018-06-05 | Mastercard Technologies Canada ULC | Systems and methods for distinguishing among human users and software robots |
| US10007776B1 (en) | 2017-05-05 | 2018-06-26 | Mastercard Technologies Canada ULC | Systems and methods for distinguishing among human users and software robots |
| US10127373B1 (en) | 2017-05-05 | 2018-11-13 | Mastercard Technologies Canada ULC | Systems and methods for distinguishing among human users and software robots |
| US11693943B2 (en) | 2018-07-06 | 2023-07-04 | International Business Machines Corporation | Authenticating a user via a customized image-based challenge |
| US10778694B2 (en) | 2018-08-29 | 2020-09-15 | International Business Machines Corporation | Location detection based presentation |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20090249477A1 (en) | Method and system for determining whether a computer user is human | |
| US20240396858A1 (en) | Determining Authenticity of Reported User Action in Cybersecurity Risk Assessment | |
| CN112567710B (en) | Systems and methods for polluting phishing campaign responses | |
| Alabdan | Phishing attacks survey: Types, vectors, and technical approaches | |
| US10182031B2 (en) | Automated message security scanner detection system | |
| US10027701B1 (en) | Method and system for reducing reporting of non-malicious electronic messages in a cybersecurity system | |
| US9774626B1 (en) | Method and system for assessing and classifying reported potentially malicious messages in a cybersecurity system | |
| Jansson et al. | Phishing for phishing awareness | |
| US9912687B1 (en) | Advanced processing of electronic messages with attachments in a cybersecurity system | |
| US9264418B1 (en) | Client-side spam detection and prevention | |
| US20200382496A9 (en) | Domain-based Isolated Mailboxes | |
| US8510557B2 (en) | Secure message and file delivery | |
| WO2022071961A1 (en) | Automated collection of branded training data for security awareness training | |
| US9276923B1 (en) | Generating authentication challenges based on preferences of a user's contacts | |
| US9037864B1 (en) | Generating authentication challenges based on social network activity information | |
| US20120317217A1 (en) | Methods and systems for managing virtual identities | |
| Farooqi et al. | Canarytrap: Detecting data misuse by third-party apps on online social networks | |
| Hu et al. | Revisiting email spoofing attacks | |
| US20220197997A1 (en) | Systems and Methods for Attacks, Countermeasures, Archiving, Data Leak Prevention, and Other Novel Services for Active Messages | |
| US8738764B1 (en) | Methods and systems for controlling communications | |
| Klint | Cybersecurity in home-office environments: An examination of security best practices post Covid | |
| Maceiras et al. | Know their customers: An empirical study of online account enumeration attacks | |
| Oxley | A best practices guide for mitigating risk in the use of social media | |
| Lain et al. | {URL} Inspection Tasks: Helping Users Detect Phishing Links in Emails | |
| US20250252164A1 (en) | Multidimensional local large language model user authentication |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: YAHOO! INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PUNERA, KUNAL;REEL/FRAME:022572/0240 Effective date: 20080327 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
| AS | Assignment |
Owner name: YAHOO HOLDINGS, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAHOO! INC.;REEL/FRAME:042963/0211 Effective date: 20170613 |
|
| AS | Assignment |
Owner name: OATH INC., NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAHOO HOLDINGS, INC.;REEL/FRAME:045240/0310 Effective date: 20171231 |