
CN118021308B - Human factors intelligent cockpit driver experience evaluation system and method - Google Patents

Human factors intelligent cockpit driver experience evaluation system and method

Info

Publication number
CN118021308B
CN118021308B (application CN202311869332.9A)
Authority
CN
China
Prior art keywords
evaluation
task
data
driver
management
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311869332.9A
Other languages
Chinese (zh)
Other versions
CN118021308A (en)
Inventor
Name withheld at the inventor's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kingfar International Inc
Original Assignee
Kingfar International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kingfar International Inc filed Critical Kingfar International Inc
Priority to CN202311869332.9A priority Critical patent/CN118021308B/en
Publication of CN118021308A publication Critical patent/CN118021308A/en
Priority to US18/948,357 priority patent/US20250208000A1/en
Priority to EP24220178.8A priority patent/EP4575799A1/en
Application granted granted Critical
Publication of CN118021308B publication Critical patent/CN118021308B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/024 Measuring pulse rate or heart rate
    • A61B5/02405 Determining heart rate variability
    • A61B5/08 Measuring devices for evaluating the respiratory organs
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1118 Determining activity level
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1126 Measuring movement using a particular sensing technique
    • A61B5/1128 Measuring movement using image analysis
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/163 Evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A61B5/18 Devices for psychotechnics for vehicle drivers or machine operators
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B5/33 Heart-related electrical modalities specially adapted for cooperation with other devices
    • A61B5/389 Electromyography [EMG]
    • A61B5/48 Other medical applications
    • A61B5/4803 Speech analysis specially adapted for diagnostic purposes
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Sensors specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/6803 Head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B5/681 Wristwatch-type devices
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6814 Head
    • A61B5/6815 Ear
    • A61B5/6823 Trunk, e.g. chest, back, abdomen, hip
    • A61B5/6824 Arm or wrist
    • A61B5/6825 Hand
    • A61B5/6826 Finger
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271 Specific aspects of physiological measurement analysis
    • A61B5/74 Details of notification to user or communication with user or patient; User input means
    • A61B5/742 Notification using visual displays
    • A61B5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A61B5/7445 Display arrangements, e.g. multiple display units
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00 Testing of vehicles
    • G01M17/007 Wheeled or endless-tracked vehicles

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Physiology (AREA)
  • Cardiology (AREA)
  • Psychiatry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Child & Adolescent Psychology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • Dentistry (AREA)
  • Signal Processing (AREA)
  • Pulmonology (AREA)
  • Artificial Intelligence (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Physics & Mathematics (AREA)
  • Otolaryngology (AREA)
  • Geometry (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application relates to a human factors intelligent cockpit driver experience evaluation system and method. The system comprises at least one single-vehicle evaluation system and an evaluation management system. The single-vehicle evaluation system collects sensing data generated by at least one evaluated driver performing the corresponding evaluation content and identifies the driver state of the at least one evaluated driver from that sensing data. The evaluation management system generates the evaluation content for the at least one evaluated driver and receives the sensing data and driver state uploaded by the at least one single-vehicle evaluation system to obtain experience evaluation data for the human factors intelligent cockpit. According to embodiments of the application, the generated evaluation content can be used to run experience tests on the driver, and the experience evaluation data of the intelligent cockpit is obtained from the driver's sensing data and state feedback, thereby providing a comprehensive and accurate evaluation standard for driver experience evaluation, meeting the practical evaluation requirements of the human factors intelligent cockpit, and effectively improving the efficiency and accuracy of the evaluation.

Description

Human factors intelligent cockpit driver experience evaluation system and method
Technical Field
The application relates to the technical field of intelligent cockpits, and in particular to a human factors intelligent cockpit driver experience evaluation system and method.
Background
With the rapid growth of automobile intelligence, people's requirements for vehicle comfort are rising; in particular, the experience of the driver in the vehicle's intelligent cockpit is receiving more and more attention. Experience testing and evaluation of the intelligent cockpit driver is an important way to measure the driver experience effect.
In the related art, an experience test scenario can be selected according to the travel demand, the data in that scenario can be analyzed and processed to obtain an experience test result for the person, and the cockpit can then be assessed to obtain an experience evaluation result for the cockpit.
However, when a vehicle cockpit undergoes driver experience testing, an objective basis for evaluating the driver experience test is lacking; the practical evaluation requirements of the intelligent cockpit cannot be met, the driver's experience during the evaluation process is affected, and the efficiency and accuracy of the evaluation are poor, so improvement is needed.
Disclosure of Invention
The application provides a human factors intelligent cockpit driver experience evaluation system and method, which address the problems that, when a vehicle cockpit undergoes driver experience testing, an objective evaluation basis is lacking, the practical evaluation requirements of the intelligent cockpit cannot be met, the driver's experience during the evaluation process is affected, and the efficiency and accuracy of the evaluation are poor.
The embodiment of the first aspect of the application provides a human factors intelligent cockpit driver experience evaluation system, which comprises at least one single-vehicle evaluation system and an evaluation management system. The single-vehicle evaluation system is used to collect sensing data generated by at least one evaluated driver based on corresponding evaluation content and to identify the driver state of the at least one evaluated driver according to the sensing data. The evaluation management system is used to generate the evaluation content for the at least one evaluated driver, transmit the evaluation content to the at least one single-vehicle evaluation system, and receive the sensing data and driver state uploaded by the at least one single-vehicle evaluation system, so as to obtain experience evaluation data of the human factors intelligent cockpit according to the sensing data and the driver state, wherein the experience evaluation data are used to evaluate the interactivity between the human factors intelligent cockpit and the driver.
According to these technical means, embodiments of the application can run experience tests on the driver using the generated evaluation content and obtain the experience evaluation data of the human factors intelligent cockpit from the driver's sensing data and state feedback, thereby providing a comprehensive and accurate evaluation standard for driver experience evaluation, meeting the practical evaluation requirements of the human factors intelligent cockpit, and effectively improving the efficiency and accuracy of the evaluation.
Illustratively, in one embodiment of the application, each of the at least one single-vehicle evaluation system includes an acquisition component for acquiring at least one of eye movement data, physiological data, behavior data, and voice data of the at least one evaluated driver as the sensing data, and an evaluation component for evaluating the fatigue level and/or emotional state of the at least one evaluated driver as the driver state based on the sensing data.
According to these technical means, embodiments of the application can estimate the driver's fatigue level from the sensing data, the driver's emotional state from the sensing data, or both, so that different states of the driver during driving can be analyzed in a targeted manner, providing a comprehensive and accurate evaluation standard for driver experience evaluation and meeting the practical requirements of the human factors intelligent cockpit.
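The mapping from multimodal sensing data to a fatigue level and emotional state might be sketched as follows. This is a minimal illustration only: the feature names (PERCLOS, SDNN, speech valence) and all thresholds are assumptions for the sake of the example, not values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SensingSample:
    """One windowed slice of the driver's multimodal sensing data."""
    perclos: float         # fraction of time the eyes are mostly closed (eye-movement data)
    mean_hr: float         # mean heart rate, beats per minute (physiological data)
    sdnn_ms: float         # heart-rate variability (SDNN), milliseconds
    speech_valence: float  # voice-derived emotion valence in [-1, 1]

def fatigue_level(s: SensingSample) -> str:
    """Map eye-closure and HRV features to a coarse fatigue level.

    Thresholds are illustrative placeholders, not values from the patent.
    """
    score = 0
    if s.perclos > 0.15:   # drowsy drivers keep their eyes closed longer
        score += 1
    if s.sdnn_ms < 30.0:   # low heart-rate variability is associated with fatigue
        score += 1
    return ("alert", "moderate", "fatigued")[score]

def emotional_state(s: SensingSample) -> str:
    """Coarse emotion label from voice valence and heart rate."""
    if s.speech_valence < -0.3 and s.mean_hr > 95:
        return "stressed"
    if s.speech_valence > 0.3:
        return "positive"
    return "neutral"

def driver_state(s: SensingSample) -> dict:
    """Combine both assessments into the driver-state record that is uploaded."""
    return {"fatigue": fatigue_level(s), "emotion": emotional_state(s)}
```

In a real system the two classifiers would be trained state-assessment models; the point here is only the shape of the interface: windowed multimodal features in, a compact driver-state record out.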
Illustratively, in one embodiment of the application, the acquisition component is further configured to time-synchronize two or more of the eye movement data, physiological data, behavior data, and voice data of the at least one evaluated driver.
According to these technical means, embodiments of the application can also time-synchronize multiple streams among the driver's eye movement, physiological, behavior, and voice data, so that the driver's driving behavior and state are monitored and evaluated in real time, effectively improving the efficiency and accuracy of the evaluation.
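One simple way to time-synchronize heterogeneous streams is to resample every timestamped stream onto a common clock by nearest-neighbour matching. The sketch below is a hypothetical simplification of the patent's synchronization step (the stream names and the 100 ms tick are assumptions):

```python
import bisect

def synchronize(streams: dict[str, list[tuple[float, float]]],
                period_s: float = 0.1) -> list[dict[str, float]]:
    """Resample several (timestamp, value) streams onto one shared clock.

    For each tick of the common timeline, the sample from each stream whose
    timestamp is nearest to the tick is taken (nearest-neighbour alignment).
    Each input stream must be sorted by timestamp and non-empty.
    """
    # only the interval covered by every stream can be synchronized
    start = max(s[0][0] for s in streams.values())
    stop = min(s[-1][0] for s in streams.values())
    frames = []
    t = start
    while t <= stop:
        frame = {"t": round(t, 6)}
        for name, samples in streams.items():
            times = [ts for ts, _ in samples]
            i = bisect.bisect_left(times, t)
            # pick whichever neighbour (i-1 or i) lies closest to the tick
            if i == 0:
                j = 0
            elif i == len(times) or t - times[i - 1] <= times[i] - t:
                j = i - 1
            else:
                j = i
            frame[name] = samples[j][1]
        frames.append(frame)
        t += period_s
    return frames
```

A production acquisition component would instead align against hardware timestamps or a shared trigger signal, but the result is the same: one fused frame per tick containing a value from each modality.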
Illustratively, in one embodiment of the application, the evaluation management system comprises a system management module, a resource library module, an evaluation design module, a data analysis module, and an index system and weight management module. The system management module is used to execute at least one of a test project management task, a vehicle model management task, a test-subject management task, and an experimenter management task of the evaluation system; the resource library module is used to execute a resource library management task of the evaluation system; the evaluation design module is used to call the resource library module to generate an evaluation flow and the corresponding evaluation content based on a timeline; the data analysis module is used to execute a data analysis task of the evaluation system according to the sensing data and the driver state; and the index system and weight management module is used to construct an index system of the evaluation system, obtain corresponding weights for the sensing data and the driver state based on the index system, and execute tasks using those weights.
According to these technical means, embodiments of the application can execute test project management tasks, vehicle model management tasks, test-subject management tasks, experimenter management tasks, and the like, thereby realizing integrated management of real-vehicle intelligent cockpit test projects and improving the efficiency of managing the test flow; executing tasks using the weights screens the indices to be used and assigns their weights, which is more reliable.
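The index-screening-and-weighting step can be illustrated with a small weighted aggregation. The index names and the simple renormalized weighted sum below are assumptions for the example, not the patent's actual index system:

```python
def experience_score(indicators: dict[str, float],
                     weights: dict[str, float]) -> float:
    """Aggregate normalized indicator values into one experience score.

    `indicators` holds per-index values already normalized to [0, 1];
    `weights` holds the index-system weights. Indices with zero weight,
    or with no measured value, are screened out, and the remaining
    weights are renormalized so they sum to 1.
    """
    used = {k: w for k, w in weights.items() if k in indicators and w > 0}
    total = sum(used.values())
    if total == 0:
        raise ValueError("no applicable indices with positive weight")
    return sum(indicators[k] * w / total for k, w in used.items())
```

Renormalizing after screening keeps the score comparable across evaluations in which different subsets of indices were available.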
Illustratively, in one embodiment of the application, the test project management task comprises at least one of a project creation task, a personnel allocation task, a vehicle model association task, and a project progress display task; the vehicle model management task comprises at least one of a vehicle model information creation task and a management task; the test-subject management task comprises at least one of a subject addition task, a subject demographic information addition task, a custom demographic information addition task, a subject history task, a subject import and export task, and a demographic information statistics task; and the experimenter management task comprises at least one of an experimenter addition task and an experimenter history task.
According to these technical means, embodiments of the application can create projects, allocate personnel, associate vehicle models, and display project progress, thereby realizing integrated management of real-vehicle intelligent cockpit test projects based on a complete evaluation project workflow; comprehensive information about test subjects is maintained to ensure their effective participation and management, and the information of each experimenter is completely recorded and managed through the experimenter management tasks, providing a comprehensive and accurate evaluation standard for driver experience evaluation and effectively improving the efficiency and accuracy of the evaluation.
Illustratively, in one embodiment of the application, the resource library management task includes one or more of a use-case management subtask, a questionnaire management subtask, a behavior experiment management subtask, a trip management subtask, and a speech evaluation management subtask in a subjective evaluation task, as well as an evaluation tool management subtask and a state evaluation algorithm model management subtask in an objective evaluation task.
According to these technical means, comprehensive evaluation can be ensured through the use-case, questionnaire, behavior experiment, trip, and speech evaluation management subtasks in the subjective evaluation task, together with the evaluation tool management subtask and the state evaluation algorithm model management subtask in the objective evaluation task, meeting the practical evaluation requirements of the human factors intelligent cockpit.
Illustratively, in one embodiment of the application, the data analysis task includes at least one of a video behavior data analysis task, an eye movement data analysis task, a physiological data analysis task, a voice data analysis task, and a visual reporting task.
According to these technical means, embodiments of the application can improve the data acquisition rate through the video behavior, eye movement, physiological, and voice data analysis tasks and the visual reporting task, ensure the reliability of the analysis, and ultimately improve the efficiency and accuracy of the evaluation.
The embodiment of the second aspect of the application provides a human factors intelligent cockpit driver experience evaluation method, which comprises the following steps: collecting sensing data generated by at least one evaluated driver based on corresponding evaluation content; identifying the driver state of the at least one evaluated driver according to the sensing data; and obtaining experience evaluation data of the human factors intelligent cockpit according to the sensing data and the driver state, wherein the experience evaluation data are used to evaluate the interactivity between the human factors intelligent cockpit and the driver.
According to these technical means, embodiments of the application can run experience tests on the driver using the generated evaluation content and obtain the experience evaluation data of the human factors intelligent cockpit from the driver's sensing data and state feedback, thereby providing a comprehensive and accurate evaluation standard for driver experience evaluation, meeting the practical evaluation requirements of the human factors intelligent cockpit, and effectively improving the efficiency and accuracy of the evaluation.
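The three method steps above can be sketched end-to-end as follows. Every function body here is a toy stand-in (the sensor values, thresholds, and scoring rule are assumptions, not the patent's): the point is only the collect → identify → evaluate pipeline shape.

```python
def collect_sensing_data(evaluation_content: dict) -> dict:
    """Stand-in for the acquisition step (eye movement, physiology, voice).

    A real single-vehicle evaluation system would drive the sensors while the
    driver performs `evaluation_content`; here canned values are returned.
    """
    return {"perclos": 0.1, "mean_hr": 72.0, "speech_valence": 0.2}

def identify_driver_state(sensing: dict) -> dict:
    """Toy state identification; thresholds are illustrative placeholders."""
    fatigue = "fatigued" if sensing["perclos"] > 0.15 else "alert"
    emotion = "positive" if sensing["speech_valence"] > 0.0 else "neutral"
    return {"fatigue": fatigue, "emotion": emotion}

def evaluate_experience(sensing: dict, state: dict) -> dict:
    """Combine raw data and identified state into experience evaluation data."""
    score = 1.0 if state["fatigue"] == "alert" else 0.5
    if state["emotion"] == "positive":
        score = min(1.0, score + 0.2)  # pleasant interaction raises the score
    return {"interaction_score": round(score, 2), "state": state}

def run_evaluation(evaluation_content: dict) -> dict:
    """The method's three steps, executed in order."""
    sensing = collect_sensing_data(evaluation_content)
    state = identify_driver_state(sensing)
    return evaluate_experience(sensing, state)
```

In the claimed system the first two steps run on the single-vehicle evaluation system and the last on the evaluation management system, with the sensing data and driver state uploaded in between.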
Illustratively, in one embodiment of the application, collecting the sensing data generated by the at least one evaluated driver based on the corresponding evaluation content comprises collecting at least one of eye movement data, physiological data, behavior data, and voice data of the at least one evaluated driver as the sensing data, and evaluating the fatigue level and/or emotional state of the at least one evaluated driver as the driver state according to the sensing data.
According to these technical means, embodiments of the application can estimate the driver's fatigue level from the sensing data, the driver's emotional state from the sensing data, or both, so that different states of the driver during driving can be analyzed in a targeted manner, providing a comprehensive and accurate evaluation standard for driver experience evaluation and meeting the practical requirements of the human factors intelligent cockpit.
Illustratively, in one embodiment of the application, collecting at least one of the eye movement data, physiological data, behavior data, and voice data of the at least one evaluated driver as the sensing data includes time-synchronizing two or more of the eye movement data, physiological data, behavior data, and voice data of the at least one evaluated driver.
According to these technical means, embodiments of the application can also time-synchronize multiple streams among the driver's eye movement, physiological, behavior, and voice data, so that the driver's driving behavior and state are monitored and evaluated in real time, effectively improving the efficiency and accuracy of the evaluation.
Illustratively, in one embodiment of the application, obtaining the experience evaluation data of the human factors intelligent cockpit according to the sensing data and the driver state comprises: executing at least one of a test project management task, a vehicle model management task, a test-subject management task, and an experimenter management task; executing a resource library management task; generating an evaluation flow and corresponding evaluation content based on a timeline; executing a data analysis task according to the sensing data and the driver state; and constructing an index system, obtaining corresponding weights for the sensing data and the driver state based on the index system, and executing tasks using those weights.
According to these technical means, embodiments of the application can execute test project management tasks, vehicle model management tasks, test-subject management tasks, experimenter management tasks, and the like, thereby realizing integrated management of real-vehicle intelligent cockpit test projects and improving the efficiency of managing the test flow; executing tasks using the weights screens the indices to be used and assigns their weights, which is more reliable.
Illustratively, in one embodiment of the application, the test project management task comprises at least one of a project creation task, a personnel allocation task, a vehicle model association task, and a project progress display task; the vehicle model management task comprises at least one of a vehicle model information creation task and a management task; the test-subject management task comprises at least one of a subject addition task, a subject demographic information addition task, a custom demographic information addition task, a subject history task, a subject import and export task, and a demographic information statistics task; and the experimenter management task comprises at least one of an experimenter addition task and an experimenter history task.
According to the above technical means, the embodiment of the application can execute the create project task, the personnel allocation task, the vehicle type association task and the project progress presentation task, thereby realizing integrated management of intelligent cabin real-vehicle test projects based on a complete evaluation project workflow; maintain comprehensive information of the tested persons so as to ensure their effective participation and management; and completely record and manage the information of each main tester through the main test tasks, providing a comprehensive and accurate evaluation system specification for driver experience evaluation and effectively improving the working efficiency and accuracy of the evaluation.
Illustratively, in one embodiment of the present application, the resource library management task includes one or more of a use case management subtask, a questionnaire scale management subtask, a behavior experiment management subtask, a trip management subtask and a speech assessment management subtask in a subjective assessment task, and an assessment tool management subtask and a state assessment algorithm model management subtask in an objective assessment task.
According to the above technical means, comprehensive evaluation can be ensured through the use case management subtask, the questionnaire scale management subtask, the behavior experiment management subtask, the trip management subtask and the voice evaluation management subtask in the subjective evaluation task, as well as the evaluation tool management subtask and the state evaluation algorithm model management subtask in the objective evaluation task, thereby meeting the actual evaluation requirements of the human factors intelligent cabin.
Illustratively, in one embodiment of the application, the data analysis tasks include at least one of a video behavior data analysis task, an eye movement data analysis task, a physiological data analysis task, a voice data analysis task and a visual reporting task.
According to the above technical means, the embodiment of the application can improve the data acquisition rate through the video behavior data analysis task, the eye movement data analysis task, the physiological data analysis task, the voice data analysis task, the visual reporting task and the like, ensuring the reliability of the analysis and ultimately improving the working efficiency and accuracy of the evaluation.
An embodiment of a third aspect of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the program to implement the human factors intelligent cockpit driver experience evaluation method of the above embodiment.
A fourth aspect of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the human factors intelligent cockpit driver experience evaluation method as above.
According to the embodiment of the application, the generated evaluation content can be used to conduct an experience test on the driver, and the experience evaluation data of the human factors intelligent cabin can be obtained according to the sensing data and the state feedback of the driver, thereby providing a comprehensive and accurate evaluation system standard for driver experience evaluation, meeting the actual evaluation requirements of the human factors intelligent cabin, and effectively improving the working efficiency and accuracy of the evaluation. This solves the problems that, when driver evaluation is performed in a vehicle cabin, an objective evaluation basis is lacking, the actual evaluation requirements of the intelligent cabin cannot be met, the driver's experience during the evaluation process is affected, and the working efficiency and accuracy of the evaluation are poor.
Additional aspects and advantages of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic structural diagram of a human factors intelligent cockpit driver evaluation system according to an embodiment of the present application;
FIG. 2 is a schematic illustration of event and segment markers for different data types in a human factors intelligent cockpit driver evaluation system according to one embodiment of the present application;
FIG. 3 is a schematic diagram of clock synchronization in a human factors intelligent cockpit driver evaluation system according to one embodiment of the present application;
FIG. 4 is a schematic diagram of a wearable eye tracker of a human factors intelligent cockpit driver evaluation system according to one embodiment of the present application;
FIG. 5 is a schematic diagram of physiological sensors of a human factors intelligent cockpit driver evaluation system according to one embodiment of the present application;
FIG. 6 is a schematic diagram of comprehensive video analysis in a human factors intelligent cockpit driver evaluation system according to one embodiment of the present application;
FIG. 7 is a schematic diagram of eye movement data analysis in a human factors intelligent cockpit driver evaluation system according to an embodiment of the present application;
FIG. 8 is a schematic diagram of physiological data analysis in a human factors intelligent cockpit driver evaluation system according to one embodiment of the present application;
FIG. 9 is a schematic diagram of a questionnaire report of a human factors intelligent cockpit driver evaluation system according to one embodiment of the present application;
FIG. 10 is a schematic diagram of a behavioral test report of a human factors intelligent cockpit driver evaluation system according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a human factors intelligent cockpit driver evaluation system according to an embodiment of the present application;
Fig. 12 is a flowchart of a human factors intelligent cockpit driver evaluation method according to an embodiment of the present application;
Fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present application and should not be construed as limiting the application.
The following describes a human factors intelligent cockpit driver evaluation system and method according to the embodiments of the application with reference to the accompanying drawings. In view of the problems mentioned in the background art, namely that when driver evaluation is performed in a vehicle cabin the actual evaluation requirements of the intelligent cabin cannot be met, the driver's experience during the evaluation process is affected, and the working efficiency and accuracy of the evaluation are poor, the application provides a human factors intelligent cockpit driver evaluation system. The generated evaluation content can be used to conduct an experience test on the driver, and the experience evaluation data of the human factors intelligent cabin can be obtained according to the sensing data and state feedback of the driver, thereby providing a comprehensive and accurate evaluation system specification for driver experience evaluation, meeting the actual evaluation requirements of the human factors intelligent cabin, and effectively improving the working efficiency and accuracy of the evaluation. This solves the problems that, when driver evaluation is performed in a vehicle cabin, an objective evaluation basis is lacking, the actual evaluation requirements of the intelligent cabin cannot be met, the driver's experience during the evaluation process is affected, and the working efficiency and accuracy of the evaluation are poor.
Specifically, fig. 1 is a schematic structural diagram of a human factors intelligent cockpit driver evaluation system provided by an embodiment of the present application.
As shown in fig. 1, the human factors intelligent cockpit driver evaluation system 10 includes:
Specifically, the at least one single-vehicle evaluation system 100 is configured to collect sensing data generated by at least one evaluated driver based on the corresponding evaluation content, and to identify the driver state of the at least one evaluated driver according to the sensing data.
It may be understood that, in the embodiment of the present application, the at least one single-vehicle evaluation system 100 may be a sub-server; efficient data communication and task management may be implemented between a main server and a field sub-server, and the field evaluation terminal may perform professional and systematic evaluation operations according to the received evaluation trip requirements. After the evaluation work is finished, the data is automatically or manually uploaded to the main server.
In actual implementation, the embodiment of the application can collect, through the at least one single-vehicle evaluation system 100, the sensing data generated by at least one evaluated driver based on the corresponding evaluation content, and identify the driver state of the at least one evaluated driver according to the sensing data. For example, the embodiment of the application can collect the blink frequency, pupil state and the like of the driver; when the blink frequency is 11 times per minute and the pupil size remains relatively stable, the driver state is identified as a driving concentration state, and when the blink frequency is 6 times per minute and the pupil is contracted, the driver state is identified as a driving fatigue state.
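The state-identification rule in the example above can be sketched as a simple threshold classifier. This is a minimal illustration under the blink-frequency and pupil thresholds given in the example, not the system's actual algorithm; the function name is illustrative.

```python
def classify_driver_state(blinks_per_minute, pupil_stable):
    """Toy rule-based classifier mirroring the example thresholds above.

    blinks_per_minute: measured blink frequency of the driver
    pupil_stable: True if pupil size remained relatively stable
    """
    if blinks_per_minute >= 11 and pupil_stable:
        return "driving concentration state"
    if blinks_per_minute <= 6 and not pupil_stable:
        return "driving fatigue state"
    return "indeterminate"

print(classify_driver_state(11, True))   # driving concentration state
print(classify_driver_state(6, False))   # driving fatigue state
```

A production system would fuse many more indexes (gaze, physiology, voice) rather than relying on two thresholds.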
According to the embodiment of the application, the generated evaluation content can be used to conduct an experience test on the driver, so that the driver state is identified, support is provided for subsequent driver evaluation, the evaluation is conducted in a targeted manner, the actual evaluation requirements of the human factors intelligent cabin are met, and the intelligent reliability of the evaluation system is ensured.
Illustratively, in one embodiment of the present application, each of the at least one single-vehicle evaluation systems 100 includes an acquisition component 101 for acquiring at least one of eye movement data, physiological data, behavioral data and voice data of the at least one evaluated driver as the sensing data, and an evaluation component 102 for evaluating the fatigue level and/or emotional state of the at least one evaluated driver as the driver state based on the sensing data.
It can be understood that when eye movement data, physiological data, video behavior data and voice data are accessed during evaluation execution, the embodiment of the application can acquire them synchronously in real time, with a synchronization interval of no more than 10 ms, ensuring the consistency of the time points of the multi-modal data; events and segment markers can be applied to the different data types, as shown in fig. 2. By adopting NTP (Network Time Protocol), the computer clock can be synchronized to UTC (Coordinated Universal Time), and clock synchronization is realized using timestamps and a calibration algorithm, ensuring that the clocks of multiple devices or systems remain consistent, as shown in fig. 3.
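The NTP-style calibration mentioned above works from four timestamps per request-reply exchange. The following sketch uses the standard NTP offset and round-trip-delay formulas with hypothetical timestamp values; it illustrates the principle only, not the platform's implementation:

```python
def ntp_offset(t0, t1, t2, t3):
    """Standard NTP offset/delay estimate from one request-reply exchange.

    t0: client transmit time   t1: server receive time
    t2: server transmit time   t3: client receive time
    (t0, t3 read from the client clock; t1, t2 from the server clock)
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2.0  # estimated client clock error
    delay = (t3 - t0) - (t2 - t1)           # network round-trip time
    return offset, delay

# Hypothetical exchange in which the client clock runs 5 ms behind
# the server and each network leg takes 10 ms:
offset, delay = ntp_offset(t0=100.000, t1=100.015, t2=100.016, t3=100.021)
print(offset, delay)
```

The client then slews its clock by the estimated offset, which is how multiple in-vehicle devices can be kept within the stated 10 ms tolerance.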
In the actual execution process, the embodiment of the application can collect the eye movement data of the at least one evaluated driver through the acquisition component 101. For example, the eye movement data during single-vehicle evaluation execution is collected in real time by a wearable eye tracker, including but not limited to data indexes such as pupil, gaze and blink, so as to facilitate further analysis of the driver's eye movement state and cognitive processing capability during driving, as shown in fig. 4.
The embodiment of the application can collect the physiological data of the at least one evaluated driver through the acquisition component 101. For example, the driver's physiological indexes are collected in real time by physiological sensors, including but not limited to physiological data such as electrocardiogram, pulse, electrodermal activity, electromyography and respiration, so as to facilitate further statistical analysis of the driver's psychological state changes during driving; the matched hardware devices are shown in fig. 5.
According to the embodiment of the application, the behavior data of the at least one evaluated driver can be acquired through the acquisition component 101. For example, the behavior data during single-vehicle evaluation execution can be collected in real time by a behavior observation camera, the operation behaviors are recorded in video form, and classified behaviors are custom-defined and comprehensively counted based on their characteristics and rules. Table 1 lists the companion hardware devices for behavioral data acquisition:
TABLE 1
Sequence number | Hardware name
1 | First-person view recorder
2 | Stylus
3 | Hand-held terminal
4 | Behavior recording camera
5 | High-speed camera
According to the embodiment of the application, the voice data of the at least one evaluated driver can be acquired through the acquisition component 101. For example, in-vehicle voice interaction data is collected in real time by voice equipment throughout the whole testing process; the voice data content includes but is not limited to voice instructions, voice tone, dialogue content and the like. The collected voice data can be post-marked and classified, and voice interaction events and background noise can be extracted to facilitate subsequent analysis. The hardware devices for voice data collection are shown in Table 2:
TABLE 2
The embodiment of the application can use the sensing data to evaluate the driver's fatigue level, the driver's emotional state, or both, thereby analyzing the driver's different states during driving in a targeted manner, providing a comprehensive and accurate evaluation system standard for driver evaluation, and meeting the actual requirements of the cabin.
Illustratively, in one embodiment of the present application, the acquisition component 101 is further configured to time-synchronize a plurality of the eye movement data, physiological data, behavioral data and voice data of the at least one evaluated driver.
It may be understood that the eye movement data in the embodiment of the present application includes but is not limited to pupil, gaze, blink, etc.; the physiological data includes but is not limited to electrocardiogram, pulse, electrodermal activity, electromyography, respiration, etc.; the behavioral data includes but is not limited to behavior data collected by cameras; and the voice data includes but is not limited to voice instructions, voice emotion, dialogue content, etc.
In the actual execution process, the embodiment of the application can time-synchronize, through the acquisition component 101, several of the eye movement data, physiological data, behavior data and voice data of the evaluated driver. For example, the eye movement data when the driver blinks, the physiological data when the driver breathes, and the voice data when a voice command is issued can be time-synchronized, thereby ensuring the consistency of the time points of the multi-modal data.
The embodiment of the application can time-synchronize multiple items among the eye movement data, physiological data, behavior data and voice data of the driver, thereby monitoring and evaluating the driving behavior and driving state of the driver in real time and effectively improving the working efficiency and accuracy of the evaluation.
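One simple way to realize such alignment after acquisition is nearest-neighbour matching of timestamps within the 10 ms synchronization tolerance mentioned earlier. The sketch below is illustrative only (the sample times are invented); the actual system may instead rely on hardware triggers or NTP-disciplined clocks:

```python
import bisect

def align_to_reference(reference_ts, stream_ts, tolerance=0.010):
    """Pair each reference timestamp with the nearest sample of another
    (sorted) stream, keeping pairs whose gap is within `tolerance` seconds."""
    pairs = []
    for t in reference_ts:
        i = bisect.bisect_left(stream_ts, t)
        # the nearest neighbour is either stream_ts[i-1] or stream_ts[i]
        candidates = [c for c in (i - 1, i) if 0 <= c < len(stream_ts)]
        if not candidates:
            continue
        best = min(candidates, key=lambda c: abs(stream_ts[c] - t))
        if abs(stream_ts[best] - t) <= tolerance:
            pairs.append((t, stream_ts[best]))
    return pairs

eye = [0.000, 0.008, 0.020]          # eye-movement sample times (s)
ecg = [0.001, 0.012, 0.019, 0.050]   # physiological sample times (s)
print(align_to_reference(eye, ecg))
```

Samples farther than the tolerance from any reference point (such as the 0.050 s ECG sample above) are simply dropped from the paired view.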
The evaluation management system 200 is configured to generate the evaluation content corresponding to the at least one evaluated driver, send the evaluation content to the at least one single-vehicle evaluation system 100, and receive the sensing data and the driver state uploaded by the at least one single-vehicle evaluation system 100, so as to obtain the experience evaluation data of the human factors intelligent cabin according to the sensing data and the driver state, where the experience evaluation data is used to evaluate the interactivity between the human factors intelligent cabin and the driver.
It can be understood that the application performs human factors intelligent cockpit driver evaluation. Here, human factors intelligence is a technology combining artificial intelligence (AI) with human factors engineering, which applies human psychological and physiological principles to the engineering and design of products, processes and systems; in other words, it designs and improves the human-machine-environment system according to human characteristics and, combined with artificial intelligence technology, enables machines to interact with humans more intelligently. The driver state in the embodiment of the application includes but is not limited to a driving fatigue state, a driving concentration state, an emotionally active state, an emotionally neutral state and an emotionally passive state; the experience evaluation data of the human factors intelligent cabin includes but is not limited to data such as satisfied, neutral and dissatisfied; and the sensing data can be acquired by, but is not limited to, sensors.
For example, the embodiment of the application can generate evaluation content for evaluating the driving satisfaction of at least one evaluated driver and send the evaluation content about driving satisfaction to the at least one single-vehicle evaluation system 100, so as to meet the actual evaluation requirements of the human factors intelligent cabin. For another example, after sending such evaluation content, the embodiment of the application can acquire, through a camera, sensing data showing the driver's confusion and a passive emotional state of the driver, and obtain the experience evaluation data of the human factors intelligent cabin according to the sensing data and the passive emotional state, thereby determining the experience evaluation data as dissatisfied, where the experience evaluation data is used to evaluate the interactivity between the human factors intelligent cabin and the driver.
According to the embodiment of the application, the experience evaluation data of the human factors intelligent cabin can be obtained according to the sensing data and the state feedback of the driver, thereby providing a comprehensive and accurate evaluation system standard for the human factors intelligent cabin, meeting its actual evaluation requirements, and effectively improving the working efficiency and accuracy of the evaluation.
Illustratively, in one embodiment of the present application, the assessment management system 200 includes: a system management module 201 configured to perform at least one of a test project management task, a vehicle type management task, a tested management task and a main test management task of the evaluation system 100; a resource library module 202 configured to perform the resource library management task of the evaluation system; an assessment design module 203 configured to invoke the resource library module 202 to generate an evaluation flow and corresponding evaluation content based on a time axis; a data analysis module 204 configured to perform the data analysis task of the evaluation system 100 according to the sensing data and the driver state; and an index system and weight management module 205 configured to construct an index system of the evaluation system and obtain corresponding weights for the sensing data and the driver state based on the index system, so as to execute tasks using the weights.
In the actual execution process, the embodiment of the application can realize integrated management of human factors intelligent cabin real-vehicle test projects by executing the test project management task, the vehicle type management task, the tested management task, the main test management task and the like of the evaluation system 100 through the system management module 201 in the evaluation management system 200. The resource library management task of the evaluation system can be executed through the resource library module 202 to ensure comprehensive management. The assessment design module 203 can invoke the resource library module 202, add evaluation task content based on an evaluation flow designed on a time axis, set the evaluation method, call subjective and objective evaluation resources, and issue evaluation tasks; use cases, questionnaire scales, behavior experiments and trips in the resource library can be called as evaluation task content, and issuing evaluation tasks to client terminals for testing is supported. The use case management subtask in the subjective evaluation task can define and edit the evaluation use cases of the intelligent cabin system, and the test flow can be managed efficiently through visual display, execution history and flexible parameter configuration, which may include the following steps:
(1) A test case library subtask supporting the addition, editing and deletion of test cases;
(2) An advanced screening subtask, which can screen the test cases according to set rules.
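A minimal sketch of these two subtasks (add/edit/delete plus rule-based screening) is given below, assuming an in-memory store; the field names "module" and "priority" are illustrative and not part of the original system:

```python
class TestCaseLibrary:
    """Toy use-case library: add, edit, delete, and screen test cases."""

    def __init__(self):
        self.cases = {}
        self._next_id = 1

    def add(self, name, module, priority):
        case_id = self._next_id
        self._next_id += 1
        self.cases[case_id] = {"name": name, "module": module,
                               "priority": priority}
        return case_id

    def edit(self, case_id, **fields):
        self.cases[case_id].update(fields)

    def delete(self, case_id):
        del self.cases[case_id]

    def screen(self, **rules):
        """Advanced screening: keep cases matching every field rule."""
        return [cid for cid, c in self.cases.items()
                if all(c.get(k) == v for k, v in rules.items())]

lib = TestCaseLibrary()
lib.add("wake word response", "voice", "high")
lib.add("navigation re-route", "navigation", "high")
print(lib.screen(priority="high", module="voice"))  # [1]
```

A real library would persist cases and support richer rule operators (ranges, substring match) rather than strict equality.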
Further, the embodiment of the application can execute the data analysis task of the evaluation system 100 according to the sensing data and the driver state based on the data analysis module 204, which is more reliable; an index system of the evaluation system can be constructed through the index system and weight management module 205, and corresponding weights for the sensing data and the driver state can be obtained based on the index system so that tasks are executed using the weights, thereby screening the indexes to be used and allocating their weights.
In the index system and weight management module 205, the construction of the index system is based on the constituent elements of the human-machine-environment system. The index system for driver experience can be divided into safety, efficiency and pleasure dimensions, corresponding respectively to the interaction process and the constituent elements of the human-machine-environment system as experienced by the driver, and the platform supports screening the indexes to be used and allocating weights according to the index system, as shown in Table 3:
TABLE 3
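The screened indexes and their allocated weights can then be combined into an overall experience score by weighted aggregation. The sketch below uses hypothetical dimension scores and weights for the safety/efficiency/pleasure split mentioned above; it is an illustration, not the platform's actual formula:

```python
def weighted_experience_score(scores, weights):
    """Aggregate per-dimension index scores (a 0-100 scale is assumed)
    with allocated weights; the weights are expected to sum to 1."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(scores[k] * weights[k] for k in weights)

# Hypothetical dimension scores and screened weights:
scores = {"safe": 90.0, "efficient": 80.0, "pleasant": 70.0}
weights = {"safe": 0.5, "efficient": 0.3, "pleasant": 0.2}
print(round(weighted_experience_score(scores, weights), 2))  # 83.0
```

Adjusting the weight allocation changes which dimension dominates the overall score, which is exactly what the screening and weight-allocation support in the module enables.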
The embodiment of the application can execute the test project management task, the vehicle type management task, the tested management task, the main test management task and the like, thereby realizing integrated management of intelligent cabin real-vehicle test projects and improving the efficiency of the management and test flow; executing tasks using the weights allows the indexes to be used to be screened and weighted, which is more reliable.
Illustratively, in one embodiment of the present application, the test project management task includes at least one of a create project task, a personnel allocation task, a vehicle type association task and a project progress presentation task; the vehicle type management task includes at least one of a creation task and a management task for vehicle type information; the tested management task includes at least one of an add tested task, a tested demographic information addition task, a demographic information custom addition task, a tested history record task, a tested import and export task and a demographic information statistics task; and the main test management task includes at least one of an add main test task and a main test history record task.
The test item management tasks in embodiments of the present application may include, but are not limited to:
(1) A create project task: supports creating a custom local evaluation project, editing the project name and project storage path, opening a locally stored project with one click, and viewing historical project records;
(2) A personnel allocation task: selects the main test personnel and tested personnel of an evaluation project, calling the personnel information added by the tested management module and the main test management module;
(3) A vehicle type association task: selects the vehicle type of the evaluated vehicle, calling the vehicle information added by the vehicle type management module, so that evaluation results can later be viewed and compared by vehicle type;
(4) A project progress presentation task: decomposes the project into key milestones on a time axis and presents the progress toward target completion using visualization tools.
In the embodiment of the application, the vehicle type management task can be used for creating and managing vehicle type information, namely adding a test vehicle and filling in its vehicle type information, such as the manufacturer, vehicle model number, VIN (Vehicle Identification Number), vehicle number, production date and software version.
The tested management tasks in the embodiment of the application may include but are not limited to:
(1) An add tested task: supports manually adding a tested person and filling in the tested person's name and demographic information;
(2) A tested demographic information addition task: the information includes gender, age, height, weight, identity card number, contact phone and email;
(3) A demographic information custom addition task: supports adding different types of information such as options, text and numbers, for example driving age, expression capability and primary vehicle driven, and sensitive information can be encrypted or hidden;
(4) A tested history record task: records the participation history of a tested person, including the projects participated in and the vehicle types evaluated;
(5) A tested import and export task: supports batch import and export of tested information, facilitating cross-project testing;
(6) A demographic information statistics task: supports filtering and counting tested demographic information by options, text content and numeric range.
The main test management tasks in the embodiments of the present application may include but are not limited to:
(1) An add main test task: supports adding a main tester and filling in the main tester's name and other information;
(2) A main test history record task: records and displays the evaluation projects that the relevant main test personnel have participated in.
The embodiment of the application can execute the create project task, the personnel allocation task, the vehicle type association task, the project progress presentation task and the like, so that integrated management of intelligent cabin real-vehicle test projects can be realized based on a complete evaluation project workflow. Through the add tested task, the tested demographic information addition task, the demographic information custom addition task, the tested history record task, the tested import and export task, the demographic information statistics task and the like of the tested management task, comprehensive information of the tested persons is maintained, ensuring their effective participation and management. Through the add main test task, the main test history record task and the like of the main test management task, the information of each main tester can be completely recorded and managed, ensuring the system's comprehensive understanding of the main testers.
Illustratively, in one embodiment of the present application, the resource library management tasks include one or more of a use case management subtask, a questionnaire scale management subtask, a behavioral experiment management subtask, a trip management subtask and a speech assessment management subtask in a subjective assessment task, and an assessment tool management subtask and a state assessment algorithm model management subtask in an objective assessment task.
In the embodiment of the application, the questionnaire scale management subtask can manage and maintain subjective questionnaires and scales, ensuring that the system can comprehensively and accurately collect driver experience data, and includes the following:
(1) A scale library subtask: the system has a built-in library of driver-experience-related scales, each with a detail page covering its theoretical background, sample items, scoring rules and the like. The built-in scales are listed in Table 4:
TABLE 4
Sequence number | Name
1 | System Usability Scale (SUS)
2 | Questionnaire for User Interaction Satisfaction (QUIS)
3 | Usefulness, Satisfaction, and Ease of use questionnaire (USE)
4 | Post-Study System Usability Questionnaire (PSSUQ)
5 | Likert five-point satisfaction questionnaire scale
6 | Ten-point satisfaction and acceptance scale (SAE)
7 | After-Scenario Questionnaire (ASQ)
8 | NASA Task Load Index (NASA-TLX)
(2) A custom scale subtask: supports modifying the items and scoring of the scales in the scale library, and creating a new scale from a questionnaire scale template or a question-and-answer scale template;
(3) A search and screening subtask: supports searching and screening by fields such as scale name and type.
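As an example of the scoring rules such a detail page would document, the System Usability Scale (SUS, item 1 in Table 4) has a standard published formula: odd items contribute (score - 1), even items (5 - score), and the sum is scaled by 2.5 to a 0-100 range. The sketch below implements that public formula; it is not code from the platform itself.

```python
def sus_score(responses):
    """System Usability Scale scoring: ten items answered on a 1-5 scale."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# All-neutral answers (3 everywhere) give the midpoint score:
print(sus_score([3] * 10))  # 50.0
```

The alternation exists because odd-numbered SUS items are positively worded and even-numbered items negatively worded.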
In an embodiment of the present application, the behavior experiment management subtask can manage and maintain behavior experiments available for driving subtasks, so as to evaluate the cognitive and behavioral responses of the tested under specific conditions, and may include but is not limited to:
(1) A behavior experiment library subtask: the system has built-in behavior experiments suitable for driving subtasks, such as the Go/No-Go task (a paradigm for studying response inhibition) and the oddball task (a classical experimental paradigm for ERP (Event-Related Potentials));
(2) A custom experiment subtask: supports modifying experiment parameters in the behavior experiment library and creating a new experiment from a reaction-time paradigm template or an experiment paradigm template;
(3) Search and screening subtasks, supporting search and screening according to fields such as experiment names, types and the like.
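To make the behavior experiment results concrete, here is a minimal sketch of how a GO-NOGO run could be summarized into the usual response-inhibition measures (hit rate on go trials, commission errors on no-go trials, mean correct reaction time); the trial format and function are hypothetical, not the system's actual interface.

```python
def score_go_nogo(trials):
    """Summarize a GO-NOGO run into response-inhibition measures.

    trials: list of (stimulus, responded, rt_ms) tuples, where stimulus is
    "go" or "nogo", responded is True if the subject pressed, and rt_ms is
    the reaction time in ms (None when no press was made).
    """
    go = [t for t in trials if t[0] == "go"]
    nogo = [t for t in trials if t[0] == "nogo"]
    hits = [t for t in go if t[1]]              # correct go responses
    false_alarms = [t for t in nogo if t[1]]    # commission errors
    return {
        "hit_rate": len(hits) / len(go) if go else None,
        "false_alarm_rate": len(false_alarms) / len(nogo) if nogo else None,
        "mean_go_rt_ms": sum(t[2] for t in hits) / len(hits) if hits else None,
    }
```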
In embodiments of the present application, the journey management subtask may define and create different driver journeys to simulate the driver's complete operational flow in the intelligent cockpit system, and may include, but is not limited to:
(1) Journey library subtask, namely the system has built-in driver journeys for more than 10 different scenarios;
(2) Custom journey subtask, supporting modification of journeys in the journey library and construction of new journeys by calling content from the use case library and the scale library;
(3) Search and screening subtask, supporting search and filtering by fields such as journey name and type.
In the embodiment of the present application, the voice evaluation management subtask may manage the corpus and voice instructions used for voice interaction evaluation in real-vehicle testing, and may include, but is not limited to:
(1) Corpus subtask, namely the system has a built-in corpus of at least 2000 entries covering each functional module of in-vehicle voice interaction;
(2) Voice library subtask, namely the system has at least 20000 built-in voice instructions;
(3) Voice interaction evaluation subtask, supporting calls to corpus and voice library content;
(4) Search subtask, supporting search of content in the corpus and voice library;
(5) Import subtask, supporting batch import of voice instructions into the voice library.
In the embodiment of the application, the evaluation tool management subtask in the objective evaluation task can objectively and quantitatively evaluate, through different types of measurement tools, the driver's operation behavior, physiological changes, eye movement state, voice interaction, and the like during the interactive evaluation task, and can manage and collect statistics on the corresponding data indexes, including but not limited to:
(1) Eye movement evaluation tool, providing rich eye movement data such as pupil diameter, blink frequency, and fixation time, for evaluating the design and usability of the intelligent cockpit driver interface. By measuring the driver's eye movement state, the driver's attention distribution, browsing pattern, interaction path, and the like can be understood. During intelligent cockpit human-machine interaction, eye movement data reveal the driver's reactions and performance when interacting with the cockpit, which helps improve human-machine interface design;
(2) Physiological evaluation tool, providing data on physiological changes such as HRV (Heart Rate Variability), EDA (Electrodermal Activity), RESP (Respiration), and EMG (Electromyography), for evaluating the driver's emotional state, psychological stress, fatigue level, anxiety level, and the like during real-vehicle evaluation, and for understanding the influence of the intelligent cockpit system on the driver's psychological state;
(3) Behavior evaluation tool, providing operation performance data such as behavior occurrence counts, interaction time, and accuracy, for evaluating operation performance in interactions such as finger touch. For example, when a driver performs touch operations on the cockpit's central control screen, the tool measures and records the total fingertip movement distance, the fingertip direction-change frequency, the operation time, the number of operation steps, and the like;
(4) Voice evaluation tool, providing objective data on voice interaction and language expression such as speech rate, pitch characteristics, noise level, and recognition accuracy, for evaluating speech recognition and speech synthesis performance in the intelligent cockpit, measuring how well the cockpit system perceives and adapts to the driver's vocal emotion, and evaluating the accuracy of voice command recognition and the quality of the interactive experience.
In the embodiment of the present application, the state evaluation algorithm model management subtask in the objective evaluation task may support evaluating the subject's psychological state during the evaluation according to multimodal physiological data collected from the subject in real time and the algorithm models deployed in the system, and may include, but is not limited to:
(1) A load state recognition model, which can predict in real time at least two classification states of the subject (high load level / low load level) through fusion calculation of data sources including, but not limited to, HRV heart rate variability, SKT skin temperature, and EDA electrodermal activity; it is a classical state recognition model in the industry, validated against scale results, ground-truth state labels, and the like, with model accuracy above 70%;
(2) A fatigue state recognition model, which can predict in real time at least two classification states of the subject (fatigued / non-fatigued) through fusion calculation of data sources including, but not limited to, HRV heart rate variability, SKT skin temperature, and EDA electrodermal activity; it is a classical state recognition model, validated against scale results, ground-truth state labels, and the like, with model accuracy above 70%;
(3) An emotion recognition model, which can predict in real time at least three classification states of the subject (positive / negative / neutral) through fusion calculation of data sources including, but not limited to, HRV heart rate variability, SKT skin temperature, and EDA electrodermal activity; it is a classical state recognition model in the industry, validated against scale results, ground-truth state labels, and the like, with model accuracy above 70%.
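As a hedged sketch of the "fusion calculation" these models perform, the function below combines the three named data sources (HRV, SKT, EDA) with a simple majority vote. The thresholds, feature choices, and two-out-of-three rule are illustrative placeholders; the actual deployed models are trained and validated against scale results as described above.

```python
def classify_load(rmssd_ms, skt_delta_c, eda_microsiemens,
                  rmssd_thresh=30.0, skt_thresh=-0.2, eda_thresh=5.0):
    """Binary high/low load vote from three physiological cues.

    Under load, HRV (here the RMSSD index) typically drops, fingertip skin
    temperature falls, and skin conductance rises. Each cue casts one vote;
    the majority decides. All thresholds are placeholder assumptions.
    """
    votes = 0
    votes += rmssd_ms < rmssd_thresh          # suppressed vagal activity
    votes += skt_delta_c < skt_thresh         # peripheral vasoconstriction
    votes += eda_microsiemens > eda_thresh    # sympathetic arousal
    return "high" if votes >= 2 else "low"
```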
The embodiments of the application can ensure the comprehensiveness of the evaluation through the use case management subtask, questionnaire scale management subtask, behavior experiment management subtask, journey management subtask, and voice evaluation management subtask in the subjective evaluation task, together with the evaluation tool management subtask and state evaluation algorithm model management subtask in the objective evaluation task, thereby meeting the actual evaluation requirements of the human-factors intelligent cockpit.
Illustratively, in one embodiment of the application, the data analysis tasks include at least one of a video behavior data analysis task, an eye movement data analysis task, a physiological data analysis task, a voice data analysis task, and a visual reporting task.
It may be understood that, in the embodiment of the present application, the video behavior data analysis task may support processing and analysis of the interactive operation behavior data recorded on video during the evaluation; the eye movement data analysis task may support batch processing and visual analysis of the eye movement data collected during the evaluation; the physiological data analysis task may support batch processing and visual analysis of the physiological data collected during the evaluation; the voice data analysis task may support batch processing and visual analysis of the voice interaction data collected during the evaluation; and the visual reporting task may support automatic generation or custom compilation of the evaluation result report in text and graphics, provide a visual interactive editing function for adjusting the report content, and support export of the result report.
In the actual implementation process, the video behavior data analysis task in the embodiment of the present application may include, but is not limited to:
(1) Video synchronous playback, supporting synchronized playback of two-channel and multi-channel video, with user-defined switching of video channels;
(2) Task duration statistics, namely the system can automatically acquire the start and end time points of the test task and the current time indicated by the timeline, and automatically calculate the task duration;
(3) Cross-screen operation behavior statistics, namely recording the driver's operation data on each screen, performing custom classification based on the characteristics and rules of the operation behaviors (including operation categories such as long-press, short-press, slide, and drag), counting the occurrences of each operation behavior, and intuitively presenting operation density through visualizations such as heat maps, swarm plots, and trajectory plots, so as to further explore and analyze differences between devices or driver populations;
(4) Feedback opinion and analysis judgment recording, namely recording the subject's operational feedback via questionnaire or verbal report, associating it with the operation data, and performing comprehensive video analysis combined with the experimenter's (main test's) video analysis judgment, as shown in figure 5.
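The custom classification and counting of operation behaviors described above can be sketched as follows; the event format and thresholds (500 ms for a long-press, 10 px of travel for a slide) are illustrative assumptions, not values from the patent.

```python
from collections import Counter
import math

def classify_touch(events, long_press_ms=500, move_thresh_px=10):
    """Label raw touch events as short-press / long-press / slide and count them.

    Each event is (duration_ms, path), where path is a list of (x, y) samples.
    A touch whose fingertip barely moves is a press (long or short by duration);
    anything travelling past move_thresh_px is treated as a slide.
    """
    labels = []
    for duration, path in events:
        # total fingertip movement distance along the sampled path
        dist = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
        if dist > move_thresh_px:
            labels.append("slide")
        elif duration >= long_press_ms:
            labels.append("long-press")
        else:
            labels.append("short-press")
    return Counter(labels)
```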
The eye movement data analysis tasks in the embodiment of the application can comprise:
(1) Eye movement data processing, including extraction of fixations, blinks, and saccades based on the I-VT algorithm, supporting user-defined processing parameters including noise reduction level, data interpolation, fixation time difference, fixation visual-angle difference, and the like;
(2) Visual eye movement data Chart, including raw data, the X and Y fixation coordinates of the processed data, angular velocity, and left and right pupil data;
(3) Gaze Mapping, supporting local Assisted Mapping automatic overlay and manual coding functions, providing visual presentation of Assisted Mapping confidence intervals, and automatic processing of multi-task data;
(4) Visual analysis of eye movement, namely providing visual result presentation and export such as heat maps and gaze-trajectory playback videos;
(5) AOI interest-area analysis, supporting user-defined sequence settings based on a single AOI or an AOI Group, and supporting statistical analysis of eye movement data within created Snapshot AOIs and Dynamic AOIs;
(6) TOI event sequence analysis, namely customizing the Metrics, time Interval set, and offset of the visual results, and exporting event-based data such as sequence length, TTF, visit count, behavior count, and TOI duty ratio in Excel-format files;
(7) Eye movement data export, namely exporting all raw data, processed data, and analysis data in Excel, CSV, TSV, XLSX, and PLOF formats, supporting gyroscope, acceleration sensor, magnetometer, and pupil data as well as statistical indexes of interest-area-based Fixation, Visit, and Saccade, as shown in figure 6.
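The I-VT extraction named in item (1) classifies gaze samples by angular velocity. Below is a minimal one-dimensional sketch with simplified inputs and no noise-reduction or interpolation stage; parameter values are illustrative defaults, not the system's.

```python
def ivt_fixations(samples, velocity_thresh_deg_s=30.0, min_fix_ms=60):
    """Minimal I-VT pass over (t_ms, angle_deg) gaze samples along one axis.

    Consecutive sample pairs whose angular velocity stays below the
    threshold are merged into fixations; runs shorter than min_fix_ms are
    discarded. Returns (start_ms, end_ms) fixation intervals.
    """
    fixations, start = [], None
    for (t0, a0), (t1, a1) in zip(samples, samples[1:]):
        v = abs(a1 - a0) / ((t1 - t0) / 1000.0)  # deg/s between samples
        if v < velocity_thresh_deg_s:
            start = t0 if start is None else start
            end = t1
        else:
            if start is not None and end - start >= min_fix_ms:
                fixations.append((start, end))
            start = None
    if start is not None and end - start >= min_fix_ms:
        fixations.append((start, end))
    return fixations
```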
The physiological data analysis tasks in embodiments of the present application may include, but are not limited to:
(1) Physiological data processing, including batch processing of HRV, EDA, RESP, EMG, and other signals, with multi-subject batch processing via pre-customized processing parameters or system default parameters, and obtaining accurate and reliable physiological data through filtering and other parameters;
(2) HRV heart rate variability data analysis, namely supporting IBI point detection, HRV time-domain analysis, frequency-domain analysis, difference scatter plot analysis, and the like, for quantitatively evaluating sympathetic and parasympathetic nervous activity;
(3) EDA electrodermal data analysis, namely supporting automatic identification of SCR skin conductance responses, ER event-related analysis, time-domain analysis, ER overall statistics, and visual chart display;
(4) RESP respiration data analysis, namely supporting time-domain and frequency-domain analysis of respiration signals;
(5) EMG electromyography data analysis, namely supporting signal normalization, EMG processed-data analysis, rectified-data analysis, normalized-data analysis, frequency-domain analysis, and dynamic periodic force identification;
(6) Visualization Chart & data export, supporting export of raw data, processed data, analysis data, overall data reports, downsampled data, relative-time data, absolute-time data, and the like in .xlsx and .csv formats, as shown in FIG. 7.
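The HRV time-domain analysis in item (2) typically reduces an inter-beat-interval (IBI) series to indices such as SDNN and RMSSD. A minimal sketch, using a population-style standard deviation and illustrative names:

```python
import math

def hrv_time_domain(ibi_ms):
    """Classic HRV time-domain indices from an inter-beat-interval series (ms).

    SDNN: standard deviation of all intervals; RMSSD: root mean square of
    successive differences (a common vagal-tone proxy); mean heart rate is
    derived from the mean IBI.
    """
    n = len(ibi_ms)
    mean_ibi = sum(ibi_ms) / n
    sdnn = math.sqrt(sum((x - mean_ibi) ** 2 for x in ibi_ms) / n)
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"mean_hr_bpm": 60000.0 / mean_ibi, "sdnn_ms": sdnn, "rmssd_ms": rmssd}
```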
The voice data analysis tasks in the embodiments of the present application may include, but are not limited to:
(1) Voice data processing, namely supporting filter-based cleaning of voice data and extraction of voice features;
(2) Visual analysis of voice data, namely extracting the audio from video, visually displaying the audio track, and quickly judging from the visualization whether voice activity exists in a given time period;
(3) Speech segment sequence analysis, namely marking the sequence of speech paragraphs and corpus entries through voice activity detection, and performing speech segment-group analysis based on the sequence;
(4) Voice interaction data analysis, namely analyzing, counting, and evaluating the voice interaction data generated during execution, including voice interaction accuracy, response time, number of voice interactions, and the like;
(5) Voice data export, supporting export of voice raw data, processed data, analysis data, and overall data reports.
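The voice activity detection underlying item (3) can be sketched as a simple energy gate over per-frame energies; real systems use more robust features, so this is only an illustrative minimum:

```python
def detect_voice_segments(frames, energy_thresh):
    """Energy-gate voice activity detection over per-frame RMS energies.

    Returns (start_idx, end_idx) pairs of consecutive frames whose energy
    exceeds the threshold; end_idx is exclusive. These segments are what a
    corpus-alignment step would then label.
    """
    segments, start = [], None
    for i, e in enumerate(frames):
        if e > energy_thresh and start is None:
            start = i                       # speech onset
        elif e <= energy_thresh and start is not None:
            segments.append((start, i))     # speech offset
            start = None
    if start is not None:
        segments.append((start, len(frames)))
    return segments
```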
Visual reporting tasks in embodiments of the present application may include:
(1) Questionnaire scale report, which can provide the subject's scores on each dimension of the scale and the specific options chosen, as shown in fig. 8;
(2) Behavior experiment report, which can provide overall result statistics of the experiment and detailed results for each trial, as shown in fig. 9;
(3) Physiological data report, which can provide comprehensive statistical results and visual charts of each physiological index based on events or segments and different analysis methods;
(4) Eye movement data report, which can provide comprehensive statistical results and visual charts of each eye movement index based on different recordings, segment types, and analysis methods;
(5) Behavior data report, which can provide comprehensive statistical results and visual charts of each behavior index based on different recordings, segment types, behavior types, and analysis methods.
The embodiments of the application can improve the data acquisition rate through the video behavior data analysis task, eye movement data analysis task, physiological data analysis task, voice data analysis task, visual reporting task, and the like, ensure the reliability of the analysis, and ultimately effectively improve the work efficiency and accuracy of the evaluation.
Specifically, with reference to fig. 11, the working principle of the human-factors intelligent cockpit driver experience evaluation system of the embodiment of the application is described in detail through a specific embodiment.
As shown in fig. 11, the embodiment of the present application includes at least one single-vehicle evaluation system 100 and an evaluation management system 200.
The at least one single-vehicle evaluation system 100 comprises an acquisition component 101 (including an eye movement data acquisition unit 1011, a physiological data acquisition unit 1012, a behavior data acquisition unit 1013, and a voice data acquisition unit 1014) and an evaluation component 102 (including a fatigue level evaluation unit 1021 and an emotion state evaluation unit 1022).
The evaluation management system 200 includes a system management module 201 (with a project management unit 2011, a vehicle model management unit 2012, a subject management unit 2013, and a main test management unit 2014), a resource library module 202 (with a subjective evaluation unit 2021, a questionnaire investigation unit 2022, a behavior test unit 2023, a use case management unit 2024, a trip unit 2025, an objective evaluation unit 2026, an eye movement evaluation unit 2027, a physiological evaluation unit 2028, a behavior evaluation unit 2029, a voice evaluation unit 2030, and an intelligent evaluation unit 2031), an evaluation design module 203 (with a time axis establishment unit 2031, a resource library call unit 2032, a subjective and objective evaluation unit 2033, and a test content distribution unit 2034), a data analysis module 204 (with a video behavior data analysis unit 2041, an eye movement data analysis unit 2042, a physiological data analysis unit 2043, a voice data analysis unit 2044, and a visual report derivation unit 2045), and an index system and weight management module 205.
According to the human-factors intelligent cockpit driver experience evaluation system provided by the embodiment of the application, the generated evaluation content can be used to conduct experience tests on a driver, and experience evaluation data of the human-factors intelligent cockpit is obtained according to the driver's sensing data and driver state feedback. This provides a comprehensive and accurate evaluation system standard for driver experience evaluation, meets the actual evaluation requirements of the human-factors intelligent cockpit, and effectively improves the work efficiency and accuracy of the evaluation. It thereby solves the problems that, when a vehicle cockpit conducts driver experience testing, an objective evaluation basis is lacking, the actual evaluation requirements of the cockpit cannot be met, the driver's experience during the evaluation is affected, and the evaluation work efficiency and accuracy are poor.
Next, a human-factors intelligent cockpit driver experience evaluation method according to an embodiment of the application is described with reference to the drawings.
Fig. 12 is a flowchart of a human-factors intelligent cockpit driver experience evaluation method according to an embodiment of the present application.
As shown in fig. 12, the human-factors intelligent cockpit driver experience evaluation method comprises the following steps:
In step S1201, sensing data generated by at least one evaluation driver based on the corresponding evaluation content is collected, and the driver state of the at least one evaluation driver is identified from the sensing data.
In step S1202, experience evaluation data of the human-factors intelligent cockpit is obtained according to the sensing data and the driver state, wherein the experience evaluation data is used for evaluating the interactivity between the human-factors intelligent cockpit and the driver.
Illustratively, in one embodiment of the present application, collecting sensing data generated by at least one evaluation driver based on corresponding evaluation content includes collecting at least one of eye movement data, physiological data, behavioral data, and voice data of the at least one evaluation driver as the sensing data, and evaluating the fatigue level and/or emotional state of the at least one evaluation driver as the driver state based on the sensing data.
Illustratively, in one embodiment of the present application, collecting at least one of eye movement data, physiological data, behavioral data, and voice data of the at least one assessment driver as the sensing data includes time synchronizing a plurality of the at least one assessment driver's eye movement data, physiological data, behavioral data, and voice data.
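Time synchronization of the multimodal streams can be sketched as nearest-timestamp pairing against a reference stream; the function and skew tolerance below are illustrative assumptions, since the alignment algorithm itself is not specified here.

```python
import bisect

def align_streams(reference_ts, other_ts, max_skew_ms=20):
    """Pair each reference timestamp with the nearest timestamp of another
    modality, dropping pairs whose skew exceeds max_skew_ms.

    Both input lists must be sorted ascending and expressed on a shared
    clock; mapping each stream onto that clock is assumed done upstream.
    """
    pairs = []
    for t in reference_ts:
        i = bisect.bisect_left(other_ts, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other_ts)]
        if not candidates:
            continue
        best = min(candidates, key=lambda j: abs(other_ts[j] - t))
        if abs(other_ts[best] - t) <= max_skew_ms:
            pairs.append((t, other_ts[best]))
    return pairs
```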
The method comprises the steps of: executing at least one of a test project management task, a vehicle model management task, a tested (subject) management task, and a main test management task; executing a resource library management task; generating an evaluation flow and corresponding evaluation content based on a time axis; executing a data analysis task of the evaluation system according to the sensing data and the driver state; and obtaining corresponding weights of the sensing data and the driver state based on an index system, so as to execute tasks using the weights.
Illustratively, in one embodiment of the present application, the test project management task includes at least one of a project creation task, a personnel assignment task, a vehicle model association task, and a project progress presentation task; the vehicle model management task includes at least one of a vehicle model information creation task and a vehicle model information management task; the tested (subject) management task includes at least one of a subject creation task, a subject demographic information addition task, a custom demographic information field task, a subject history task, a subject import and export task, and a demographic information statistics task; and the main test management task includes at least one of a main test creation task and a main test history task.
Illustratively, in one embodiment of the present application, the resource library management tasks include one or more of a use case management subtask, a questionnaire scale management subtask, a behavior experiment management subtask, a journey management subtask, and a voice evaluation management subtask in a subjective evaluation task, together with an evaluation tool management subtask and a state evaluation algorithm model management subtask in an objective evaluation task.
Illustratively, in one embodiment of the application, the data analysis tasks include at least one of a video behavior data analysis task, an eye movement data analysis task, a physiological data analysis task, a voice data analysis task, and a visual reporting task.
It should be noted that the explanation of the embodiment of the human-factors intelligent cockpit driver experience evaluation system also applies to the human-factors intelligent cockpit driver experience evaluation method of this embodiment, and is not repeated here.
According to the human-factors intelligent cockpit driver experience evaluation method provided by the embodiment of the application, the generated evaluation content can be used to conduct experience tests on a driver, and experience evaluation data of the human-factors intelligent cockpit is obtained according to the driver's sensing data and driver state feedback. This provides a comprehensive and accurate evaluation system standard for driver experience evaluation, meets the actual evaluation requirements of the human-factors intelligent cockpit, and effectively improves the work efficiency and accuracy of the evaluation. It thereby solves the problems that, when a vehicle cockpit conducts driver experience testing, an objective evaluation basis is lacking, the actual evaluation requirements of the cockpit cannot be met, the driver's experience during the evaluation is affected, and the evaluation work efficiency and accuracy are poor.
Fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device may include:
memory 1301, processor 1302, and computer programs stored on memory 1301 and executable on processor 1302.
The processor 1302, when executing the program, implements the human-factors intelligent cockpit driver experience evaluation method provided in the above embodiment.
Further, the electronic device further includes:
a communication interface 1303 for communication between the memory 1301 and the processor 1302.
Memory 1301 is used to store a computer program that can run on processor 1302.
Memory 1301 may comprise high-speed RAM memory or may also comprise non-volatile memory (non-volatile memory), such as at least one disk memory.
If the memory 1301, the processor 1302, and the communication interface 1303 are implemented independently, the communication interface 1303, the memory 1301, and the processor 1302 may be connected to each other through a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, or an Extended Industry Standard Architecture (EISA) bus, among others. Buses may be divided into address buses, data buses, control buses, etc. For ease of illustration, only one thick line is shown in fig. 13, but this does not mean there is only one bus or one type of bus.
Alternatively, in a specific implementation, if the memory 1301, the processor 1302 and the communication interface 1303 are integrated on a chip, the memory 1301, the processor 1302 and the communication interface 1303 may complete communication with each other through internal interfaces.
The processor 1302 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the application.
The embodiment of the application also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the above human-factors intelligent cockpit driver experience evaluation method.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, "N" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and additional implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order from that shown or discussed, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber device, and a portable Compact Disc Read-Only Memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program may be electronically captured, via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware as in another embodiment, may be implemented using any one or more combinations of discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits having suitable combinational logic gates, programmable Gate Arrays (PGAs), field Programmable Gate Arrays (FPGAs), and the like, as is known in the art.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated in one processing module, or each unit may exist physically on its own, or two or more units may be integrated in one module. The integrated module may be implemented in hardware or as a software functional module. If implemented as a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium.
The above-mentioned storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like. While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the application; variations, modifications, substitutions, and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the application.

Claims (7)

1. A human factors intelligent cockpit driver experience evaluation system, characterized by comprising:
at least one single-vehicle evaluation system, configured to acquire sensing data generated by at least one evaluation driver based on corresponding evaluation content, and to recognize a driver state of the at least one evaluation driver from the sensing data;
wherein each of the at least one single-vehicle evaluation system comprises:
an acquisition component, configured to acquire physiological data, behavior data, and voice data of the at least one evaluation driver as the sensing data, and to time-synchronize the physiological data, the behavior data, and the voice data of the at least one evaluation driver;
an evaluation component, configured to evaluate a fatigue level and/or an emotional state of the at least one evaluation driver as the driver state based on the sensing data;
an evaluation management system, comprising:
a system management module, configured to execute at least one of a test project management task, a vehicle type management task, a test subject management task, and an examiner management task of the evaluation system;
a resource library module, configured to execute a resource library management task of the evaluation system;
an evaluation design module, configured to invoke the resource library module and to generate an evaluation flow and corresponding evaluation content based on a time axis;
a data analysis module, configured to execute a data analysis task of the evaluation system based on the sensing data and the driver state;
an index system and weight management module, configured to construct an index system for the evaluation system, to obtain weights corresponding to the sensing data and the driver state based on the index system, and to execute tasks using the weights;
wherein the evaluation management system is configured to generate evaluation content corresponding to the at least one evaluation driver, to transmit the evaluation content to the at least one single-vehicle evaluation system, and to receive the sensing data and the driver state uploaded by the at least one single-vehicle evaluation system, so as to obtain experience evaluation data of the human factors intelligent cockpit from the sensing data and the driver state, the experience evaluation data being used to evaluate the interactivity between the human factors intelligent cockpit and the driver.
2. The human factors intelligent cockpit driver experience evaluation system according to claim 1, wherein:
the test project management task comprises at least one of a project creation task, a personnel allocation task, a vehicle type association task, and a project progress display task;
the vehicle type management task comprises at least one of a vehicle type information creation task and a vehicle type information management task;
the test subject management task comprises at least one of an add-subject task, a subject demographic information adding task, a custom demographic information adding task, a subject history task, a subject import and export task, and a demographic information statistics task;
the examiner management task comprises at least one of an add-examiner task and an examiner history task.
3. The human factors intelligent cockpit driver experience evaluation system according to claim 1, wherein the resource library management task comprises one or more of a use case management subtask in a subjective evaluation task, a questionnaire and scale management subtask, a behavior experiment management subtask, a trip management subtask, a voice evaluation management subtask, an evaluation tool management subtask, and a state evaluation algorithm model management subtask.
4. The human factors intelligent cockpit driver experience evaluation system according to claim 1, wherein the data analysis task comprises at least one of a video behavior data analysis task, an eye movement data analysis task, a physiological data analysis task, a voice data analysis task, and a visual reporting task.
5. A human factors intelligent cockpit driver experience evaluation method, implemented using the human factors intelligent cockpit driver experience evaluation system according to any one of claims 1-4, characterized by comprising:
acquiring sensing data generated by at least one evaluation driver based on corresponding evaluation content, and a driver state of the at least one evaluation driver recognized from the sensing data; and
obtaining experience evaluation data of the human factors intelligent cockpit from the sensing data and the driver state, wherein the experience evaluation data are used to evaluate the interactivity between the human factors intelligent cockpit and the driver.
6. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the program to implement the human factors intelligent cockpit driver experience evaluation method according to claim 5.
7. A computer-readable storage medium having a computer program stored thereon, wherein the program is executed by a processor to implement the human factors intelligent cockpit driver experience evaluation method according to claim 5.
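As a non-authoritative sketch of the pipeline described in claim 1, the following Python fragment illustrates one possible realization: time-synchronizing the physiological, behavior, and voice streams onto a common time axis, then combining per-indicator scores with the recognized driver state under index-system weights. All stream names, weight values, and the scoring rule are illustrative assumptions and are not taken from the patent.

```python
# Illustrative sketch (not the patented method): align multimodal streams
# on a shared time grid, then compute a weighted experience score.

def interp(t, ts, vs):
    """Piecewise-linear interpolation of (ts, vs) at time t, clamped at the edges."""
    if t <= ts[0]:
        return vs[0]
    if t >= ts[-1]:
        return vs[-1]
    for i in range(1, len(ts)):
        if t <= ts[i]:
            frac = (t - ts[i - 1]) / (ts[i] - ts[i - 1])
            return vs[i - 1] + frac * (vs[i] - vs[i - 1])

def synchronize(streams, grid):
    """Resample each timestamped stream onto the shared time grid."""
    return {name: [interp(t, ts, vs) for t in grid]
            for name, (ts, vs) in streams.items()}

def experience_score(indicators, weights):
    """Weighted average of normalized indicator scores (each in 0..1)."""
    total = sum(weights.values())
    return sum(weights[k] * indicators[k] for k in weights) / total

# Toy data: each stream is (timestamps in seconds, normalized indicator values).
streams = {
    "physiological": ([0.0, 1.0, 2.0], [0.8, 0.7, 0.6]),
    "behavior":      ([0.0, 0.5, 2.0], [0.9, 0.9, 0.8]),
    "voice":         ([0.0, 2.0],      [0.6, 0.6]),
}
grid = [0.0, 0.5, 1.0, 1.5, 2.0]
synced = synchronize(streams, grid)

indicators = {name: sum(vals) / len(vals) for name, vals in synced.items()}
indicators["driver_state"] = 0.75  # assumed scale: e.g. low fatigue, neutral emotion

weights = {"physiological": 0.3, "behavior": 0.3,
           "voice": 0.2, "driver_state": 0.2}
score = experience_score(indicators, weights)
```

In a real deployment the interpolation, indicator normalization, and weights would come from the acquisition component, the state evaluation models, and the index system and weight management module, respectively; the hard-coded values above only stand in for those outputs.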
CN202311869332.9A 2023-12-22 2023-12-29 Human factors intelligent cockpit driver experience evaluation system and method Active CN118021308B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202311869332.9A CN118021308B (en) 2023-12-29 2023-12-29 Human factors intelligent cockpit driver experience evaluation system and method
US18/948,357 US20250208000A1 (en) 2023-12-22 2024-11-14 Method for evaluating human-machine interaction of vehicle, system, edge computing device, and medium
EP24220178.8A EP4575799A1 (en) 2023-12-22 2024-12-16 Method for evaluating human-machine interaction of vehicle, system, edge computing device, and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311869332.9A CN118021308B (en) 2023-12-29 2023-12-29 Human factors intelligent cockpit driver experience evaluation system and method

Publications (2)

Publication Number Publication Date
CN118021308A CN118021308A (en) 2024-05-14
CN118021308B true CN118021308B (en) 2025-02-11

Family

ID=90997558

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311869332.9A Active CN118021308B (en) 2023-12-22 2023-12-29 Human factors intelligent cockpit driver experience evaluation system and method

Country Status (1)

Country Link
CN (1) CN118021308B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119246087A (en) * 2024-09-23 2025-01-03 奇瑞汽车股份有限公司 Vehicle experience evaluation method, device, electronic device and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN116089279A (en) * 2022-12-30 2023-05-09 北京津发科技股份有限公司 Intelligent cabin HMI evaluation method, system and storage medium
CN117075576A (en) * 2023-08-17 2023-11-17 重庆长安汽车股份有限公司 Vehicle intelligent cabin testing method, cloud server, vehicle-mounted terminal and storage medium

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US9650051B2 (en) * 2013-12-22 2017-05-16 Lytx, Inc. Autonomous driving comparison and evaluation
CN112525551B (en) * 2020-12-10 2023-08-29 北京百度网讯科技有限公司 Drive test method, device, equipment and storage medium for automatic driving vehicle
CN114298469A (en) * 2021-11-24 2022-04-08 重庆大学 User experience test and evaluation method of automotive intelligent cockpit
CN116594858B (en) * 2022-12-30 2024-08-27 北京津发科技股份有限公司 Intelligent cabin man-machine interaction evaluation method and system

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN116089279A (en) * 2022-12-30 2023-05-09 北京津发科技股份有限公司 Intelligent cabin HMI evaluation method, system and storage medium
CN117075576A (en) * 2023-08-17 2023-11-17 重庆长安汽车股份有限公司 Vehicle intelligent cabin testing method, cloud server, vehicle-mounted terminal and storage medium

Also Published As

Publication number Publication date
CN118021308A (en) 2024-05-14

Similar Documents

Publication Publication Date Title
Chen et al. Assessing task mental workload in construction projects: A novel electroencephalography approach
Kocielnik et al. Smart technologies for long-term stress monitoring at work
Lemon et al. A narrative review of methods for applying user experience in the design and assessment of mental health smartphone interventions
JP2009530071A (en) Visual attention and emotional reaction detection display system
Charland et al. Assessing the multiple dimensions of engagement to characterize learning: A neurophysiological perspective
Kovesdi et al. Application of eye tracking for measurement and evaluation in human factors studies in control room modernization
Qin et al. An EEG-based mental workload evaluation for AR head-mounted display use in construction assembly tasks
WO2019086856A1 (en) Systems and methods for combining and analysing human states
CN118021308B (en) Human factors intelligent cockpit driver experience evaluation system and method
Bauer et al. Application of wearable technology for the acquisition of learning motivation in an adaptive e-learning platform
Bitkina et al. User stress in artificial intelligence: modeling in case of system failure
CN116089279A (en) Intelligent cabin HMI evaluation method, system and storage medium
Aoyama Lawrence et al. Being in-sync: A multimodal framework on the emotional and cognitive synchronization of collaborative learners
Suryani et al. Role, methodology, and measurement of cognitive load in computer science and information systems research
Liapis et al. Don’t leave me alone: Retrospective think aloud supported by real-time monitoring of participant’s physiology
Majumder et al. A smart cyber-human system to support mental well-being through social engagement
Rodger Reinforcing inspiration for technology acceptance: Improving memory and software training results through neuro-physiological performance
Jusufi et al. Visualization of spiral drawing data of patients with Parkinson's disease
US20250208000A1 (en) Method for evaluating human-machine interaction of vehicle, system, edge computing device, and medium
Zheng et al. Evaluation of mental load using EEG and eye movement characteristics
Ramsay et al. Equinox: exploring naturalistic distortions of time perception
Feng et al. Cognitive friction measurement: interaction assessment of interface information in complex information systems
CN104992080A (en) Stimulus information compiling method of potential value test
Mogire et al. Probing for psycho-physiological correlates of cognitive interaction with cybersecurity events
Jeong Suggestion of methods for understanding user’s emotional changes while using a product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant