

Methods to test visual attention online

Yung et al., 2015

Document ID: 2494197700403825745
Authors: Yung A, Cardoso-Leite P, Dale G, Bavelier D, Green C
Publication year: 2015
Publication venue: Journal of Visualized Experiments (JoVE)


Snippet

Online data collection methods have particular appeal to behavioral scientists because they offer the promise of much larger and much more representative data samples than can typically be collected on college campuses. However, before such methods can be widely …
Continue reading at pmc.ncbi.nlm.nih.gov (HTML)

Classifications

    • G06Q 50/20: Education
    • G06F 19/34: Computer-assisted medical diagnosis or treatment, e.g. computerised prescription or delivery of medication or diets, computerised local control of medical devices, medical expert systems or telemedicine
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06Q 50/22: Health care, e.g. hospitals; Social work
    • G09B 7/02: Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
    • G09B 5/02: Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • G09B 19/0038: Sports (repetitive work cycles; sequence of movements)
    • G09B 23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes

Similar Documents

Backx et al. Comparing web-based and lab-based cognitive assessment using the Cambridge Neuropsychological Test Automated Battery: A within-subjects counterbalanced study
Tsay et al. Interactions between sensory prediction error and task error during implicit motor learning
Bolden et al. How young children view mathematical representations: a study using eye-tracking technology
Colbert et al. Teaching metacognitive skills: Helping your physician trainees in the quest to ‘know what they don't know’
Barrett et al. Comparing virtual reality, desktop-based 3D, and 2D versions of a category learning experiment
US20130226845A1 (en) Instruction System with Eyetracking-Based Adaptive Scaffolding
Fontaine et al. Effectiveness of adaptive e-learning environments on knowledge, competence, and behavior in health professionals and students: protocol for a systematic review and meta-analysis
US11263914B2 (en) Multi-level executive functioning tasks
Niehorster et al. Accuracy and tuning of flow parsing for visual perception of object motion during self-motion
Intarasirisawat et al. Exploring the touch and motion features in game-based cognitive assessments
Ellison et al. Determining eye–hand coordination using the sport vision trainer: An evaluation of test–retest reliability
Jost et al. Manual training of mental rotation performance: Visual representation of rotating figures is the main driver for improvements
Mei et al. Usability issues with 3D user interfaces for adolescents with high functioning autism
Castro-Alonso et al. VAR: A battery of computer-based instruments to measure visuospatial processing
Mia et al. A scoping review on mobile health technology for assessment and intervention of upper limb motor function in children with motor impairments
Haesner et al. An eye movement analysis of web usability: Differences between older adults with and without mild cognitive impairment
Makai-Bölöni et al. Touchscreen-based finger tapping: Repeatability and configuration effects on tapping performance
Zhang et al. Towards a computer-assisted comprehensive evaluation of visual motor integration for children with autism spectrum disorder: a pilot study
Corbett et al. Statistical extraction affects visually guided action
Daza et al. A multimodal dataset for understanding the impact of mobile phones on remote online virtual education
US8042946B1 (en) Contrast sensitivity test
Topalli et al. Eye-hand coordination patterns of intermediate and novice surgeons in a simulation-based endoscopic surgery training environment
Wilms et al. Indirect versus direct feedback in computer-based Prism Adaptation Therapy
Tian et al. The least increasing aversion (LIA) protocol: Illustration on identifying individual susceptibility to cybersickness triggers