Disclosure of Invention
The main objective of the present invention is to solve the problem that electronic devices in the prior art cannot be operated accurately through eye tracking technology.
To solve the above problem, the present invention provides a user interface synchronization system for pairing an electronic device with a wearable device, the user interface synchronization system comprising:
a graphic output module, coupled to the electronic device, which accesses the image data of the electronic device and transmits the image data to the wearable device through a wireless network;
a mapping module, coupled to the wearable device, which displays the image data on an output unit of the wearable device for a user to view and operate;
an eye movement instruction analysis module, coupled to the wearable device, which analyzes the eye movement instructions acquired by the wearable device; and
an instruction conversion module, coupled to the electronic device or the wearable device, which, upon receiving an eye movement instruction, converts it and outputs it as an action instruction executable by the electronic device.
Furthermore, the mapping module establishes, on an output unit of the wearable device, a user interface window for displaying the image data, and the user interface window is enlarged or reduced in equal proportion according to the length and width of the display screen of the electronic device.
Furthermore, the eye movement instruction analysis module analyzes the gaze direction of the user according to the acquired eye image, and forms a cursor capable of moving according to the gaze direction on the user interface window.
Further, when the user's gaze direction stays on the user interface window, the eye movement instruction analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and when the gaze direction stays substantially on the same graphical interface for longer than a set threshold time, transmits a trigger instruction to the instruction conversion module through the wireless network to start one or more programs corresponding to that graphical interface.
Further, when the eye movement instruction analysis module detects that the user's gaze direction moves rapidly from left to right, it transmits a left-side page-turning eye movement instruction to the instruction conversion module, so that the electronic device executes an action instruction for turning the page from left to right; when it detects that the user's gaze direction moves rapidly from right to left, it transmits a right-side page-turning eye movement instruction to the instruction conversion module, so that the electronic device executes an action instruction for turning the page from right to left.
Furthermore, the eye movement instruction analysis module sets a reference coordinate when detecting the trigger action of the user, continuously detects the gaze direction of the user, records the X-axis movement distance and the Y-axis movement distance of the gaze direction relative to the reference coordinate, and transmits an eye movement instruction corresponding to the eye movement direction and the movement distance to the instruction conversion module when the X-axis movement distance or the Y-axis movement distance is greater than a threshold value, so that the electronic device executes a scrolling action instruction corresponding to the eye movement direction.
Further, the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
Another objective of the present invention is to provide an electronic device serving as a controlled end, which includes a display screen, a wireless transmission unit, and a processor connected to the display screen and the wireless transmission unit. The processor has a graphics processing unit for transmitting image data to the display screen to provide a user interface for user operation. The processor comprises an arithmetic unit for loading and executing the following programs:
a graphic output module, which accesses the image data of the electronic device and transmits the image data displayed by the electronic device to the wearable device through a wireless network, to be output by the wearable device for the user's visual operation; and
an instruction conversion module, which receives an eye movement instruction provided by the wearable device through the wireless network and outputs it as an action instruction executable by the electronic device.
Further, the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
Another objective of the present invention is to provide a wearable device serving as a main control end, which includes an output unit, a wireless transmission unit, a camera unit, and a processor connected to the output unit, the wireless transmission unit, and the camera unit. The camera unit is used for capturing eye images of the user. The processor has a graphics processing unit for transmitting image data to the output unit for user interface operation. The processor comprises an arithmetic unit for loading and executing the following programs:
the mapping module obtains image data of the electronic device through a wireless network and transmits the image data to the graphic processing unit of the wearable device to be displayed on the output unit for a user to watch and operate.
The eye movement instruction analysis module acquires the eye image shot by the camera unit, acquires an eye movement instruction from the eye image, and transmits the eye movement instruction to the electronic device through a wireless network so as to start one or more programs of the electronic device.
Furthermore, the mapping module establishes, on an output unit of the wearable device, a user interface window for displaying the image data, and the user interface window is enlarged or reduced in equal proportion according to the length and width of the display screen of the electronic device.
Furthermore, the eye movement instruction analysis module analyzes the gaze direction of the user according to the acquired eye image, and forms a cursor on the user interface window, wherein the cursor can move according to the gaze direction.
Further, when the user's gaze direction stays on the user interface window, the eye movement instruction analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and when it detects that the gaze direction stays substantially on the same graphical interface for longer than a set threshold time, transmits a trigger instruction to the electronic device through the wireless network to start one or more programs corresponding to that graphical interface.
Further, when the eye movement instruction analysis module detects that the user's gaze direction moves rapidly from left to right, it transmits a left-side page-turning eye movement instruction to the electronic device, so that the electronic device executes a left-to-right page-turning program; when it detects that the user's gaze direction moves rapidly from right to left, it transmits a right-side page-turning eye movement instruction to the electronic device, so that the electronic device executes a right-to-left page-turning program.
Furthermore, the eye movement instruction analysis module sets a reference coordinate when detecting the trigger action of the user, continuously detects the gaze direction of the user, records an X-axis movement distance and a Y-axis movement distance of the gaze direction relative to the reference coordinate, and transmits an eye movement instruction corresponding to the eye movement direction and the movement distance to the electronic device when the X-axis movement distance or the Y-axis movement distance is greater than a threshold value, so that the electronic device executes a scrolling program corresponding to the eye movement direction.
Further, the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
Another objective of the present invention is to provide an interface synchronization method for a wearable device and an electronic device, comprising: accessing the image data of the electronic device and transmitting the image data to the wearable device through a wireless network; displaying the image data on an output unit of the wearable device for the user to view; analyzing the eye movement instruction acquired by the wearable device and transmitting it to the electronic device through the wireless network; and, upon receiving the eye movement instruction, converting it and outputting it as an action instruction executable by the electronic device.
Further, after receiving the image data, the wearable device enlarges or reduces the user interface window for displaying the image data in equal proportion according to the length and width of the display screen of the electronic device.
Furthermore, the wearable device analyzes the gazing direction of the user according to the acquired eye image, and forms a cursor capable of moving according to the gazing direction on the user interface window.
Further, when the user's gaze direction stays on the user interface window, the coordinate position on the display screen of the electronic device corresponding to the gaze direction is recorded, and when the gaze direction stays substantially on the same graphical interface for longer than a set threshold time, a trigger instruction is transmitted to the electronic device through the wireless network to start one or more programs corresponding to that graphical interface.
Further, when it is detected that the user's gaze direction moves rapidly from left to right, a left-side page-turning eye movement instruction is transmitted to the electronic device, so that the electronic device executes an action instruction for turning the page from left to right; when it is detected that the user's gaze direction moves rapidly from right to left, a right-side page-turning eye movement instruction is transmitted to the electronic device, so that the electronic device executes an action instruction for turning the page from right to left.
Further, a reference coordinate is set when a trigger action of the user is detected, the user's gaze direction is continuously detected, the X-axis movement distance and the Y-axis movement distance of the gaze direction relative to the reference coordinate are recorded, and when the X-axis movement distance or the Y-axis movement distance is greater than a threshold value, an eye movement instruction corresponding to the eye movement direction and the movement distance is transmitted to the electronic device, so that the electronic device executes a scrolling action instruction corresponding to the eye movement direction.
It is still another object of the present invention to provide a computer-readable recording medium on which a program is recorded; when the program is loaded and executed by an electronic device and a wearable device, the method described above is performed.
It is still another object of the present invention to provide a computer program product which, when loaded into an electronic device and a wearable device, performs the method described above.
Therefore, compared with the aforementioned conventional techniques, the present invention has the following advantageous effects:
1. The user interface synchronization system can transmit the image data of the electronic device to the output unit of the wearable device, so that the electronic device can be operated by tracking the user's eye movements.
2. The front lens can be kept at a fixed distance from the user's eyes, so that the user's eye movements can be detected easily.
Detailed Description
The detailed description and technical contents of the present invention are described below with reference to the drawings. For convenience of explanation, the drawings are not necessarily to scale and some features may be exaggerated; the drawings and their proportions are not intended to limit the scope of the present invention.
The present invention is a user interface synchronization system 100 for transmitting a picture of an electronic device 10 to a wearable device 20, and tracking an eye movement of a user through the wearable device 20 to operate the electronic device 10.
The electronic device 10 at least includes a display screen 11, a processing unit 12 (CPU), and a graphics processing unit 13 (GPU) capable of inputting and outputting images. Specifically, the electronic device 10 may be, for example, a cellular phone, a smart phone, a tablet computer, a handheld mobile communication device, a personal digital assistant (PDA), or a similar portable electronic device, and the electronic device 10 may also be an electronic device having a display and control interface, such as a calculator, a desktop computer, a notebook computer, or a vehicle-mounted computer.
The wearable device 20 is in particular a device worn on the head of a user; it can provide a user interface operable by the user through the output unit 21, capture the user's eyes through the camera unit 25 to obtain eye images, and let the user operate the user interface through the gaze direction. The wearable device 20 includes at least an output unit 21 for providing an image to the user's eyes, a camera unit 25 for capturing the user's eyes to obtain eye images, a processing unit 22 (CPU), and a graphics processing unit 23 (GPU) for inputting and outputting images. Specifically, the wearable device 20 may be smart glasses, an eye tracker, an augmented reality device, a virtual reality device, or the like.
FIG. 1 is a block diagram of a user interface synchronization system according to the present invention, as shown in the figure:
The hardware architecture of the electronic device 10 and the wearable device 20 is described first; the relevant parts of the software architecture are described afterwards.
An electronic device:
as mentioned above, the electronic device 10 serves as a controlled end that transmits its image data to the wearable device 20 and is operated through the wireless network by the eye tracking function of the wearable device 20. The electronic device 10 includes a display screen 11, a processing unit 12 (CPU), a graphics processing unit 13 (GPU) capable of inputting and outputting images, a storage unit 14, and a wireless transmission unit 16.
The processing unit 12 and the graphics processing unit 13 may together form a processor CU1, so that the processing unit 12 and the graphics processing unit 13 can be integrated on a single chip, thereby reducing the volume required by the components. For example, the processor CU1 may be a processor of a series developed by ARM Holdings, Ltd., or a Loongson processor developed by the Institute of Computing Technology (ICT) of the Chinese Academy of Sciences; the present invention is not limited thereto.
In another preferred embodiment, the processing unit 12 and the graphics processing unit 13 can be configured as separate processors, handling logical operations and image processing respectively, and jointly or cooperatively processing a portion of the program instructions.
In another preferred embodiment, the processing unit 12 and the storage unit 14 together form a processor, and the processing unit 12 can load a program pre-stored in the storage unit 14 and execute a corresponding algorithm.
In this implementation, the processing unit 12 and the graphics processing unit 13 together form a processor CU1, and the processor CU1 is coupled to the storage unit 14. The processor CU1 may be a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), programmable logic device (PLD), or other similar device, or any combination thereof.
The display screen 11 is used for displaying graphic data, such as a user operation interface, a graphical interface, or a multimedia image, and the image or operation interface is displayed on the display screen 11 for the user to read. The display screen 11 may be an active-matrix organic light-emitting diode (AMOLED) display, a thin-film transistor (TFT) display, or another display device; the present invention is not limited thereto. The display screen 11 is driven by a control circuit, which drives the light-emitting units (pixel elements) of the panel at the corresponding coordinates by inputting corresponding signals to the data-line driving circuit and the scan-line driving circuit. After the display screen 11 accesses the data in the storage unit 14 through the graphics processing unit 13, the corresponding multimedia data is presented on the display screen 11 for the user to observe visually.
The wireless transmission unit 16 may perform data transmission via a wireless network. Specifically, the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP). In another preferred embodiment, the wireless transmission unit 16 can be paired with the wearable device 20 by radio frequency identification (RFID) technology, so as to perform short- and medium-range wireless data transmission with the wearable device 20.
Wearable device:
the wearable device 20 serves as a main control end for receiving the image data of the electronic device 10 and outputting the image data to the user's eyes for viewing. As described above, the wearable device 20 includes an output unit 21, a processing unit 22 (CPU), a graphics processing unit 23 (GPU) that can input and output images, a camera unit 25, and a wireless transmission unit 26.
The processing unit 22 is substantially the same as the processing unit 12 of the electronic device 10, and its description is not repeated here. As with the electronic device 10, in a preferred embodiment the processing unit 22 and the graphics processing unit 23 together form a processor CU2, so that the two can be integrated on a single chip. In another preferred embodiment, the processing unit 22 and the graphics processing unit 23 may be configured as separate processors, handling logical operations and image processing respectively, and jointly or cooperatively processing a portion of the program instructions. In another preferred embodiment, the processing unit 22 and the storage unit 24 together form a processor, and the processing unit 22 can load a program pre-stored in the storage unit 24 and execute a corresponding algorithm.
In this implementation, the processing unit 22 and the graphics processing unit 23 together form a processor CU2, and the processor CU2 is coupled to the storage unit 24. The processor CU2 may be a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), programmable logic device (PLD), or other similar device, or a combination thereof.
The output unit 21 is used for displaying the graphic data and presenting it to the user's eyes for visual operation. The output unit 21 may be a display screen, such as an active-matrix organic light-emitting diode (AMOLED) display, a thin-film transistor (TFT) display, or another such display device. In another preferred embodiment, the output unit 21 may be a retinal display that projects the image directly onto the retina through retinal image projection (RID) technology. The retinal display forms an image directly on the retina by reflection from the glass, so that the image beam is fused with the real scene seen by the naked eye. After accessing the data in the storage unit 24 through the graphics processing unit 23 (GPU), the output unit 21 presents the corresponding multimedia data to the user's eyes for visual observation.
The camera unit 25 is a camera equipped with a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) sensor; the present invention is not limited thereto. The camera unit 25 is used for capturing an eye image of the user, and the obtained eye image is transmitted to the eye movement instruction analysis module 28 for further analysis, so as to track the user's eye movements and convert them into corresponding eye movement instructions.
The wireless transmission unit 26 may perform data transmission via a wireless network. Specifically, the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
The hardware architecture of the electronic device 10 and the wearable device 20 having been described in detail, the architecture of the user interface synchronization system according to the present invention is further described below on the basis of that hardware:
referring to fig. 2, a schematic diagram of a user interface synchronization system according to the present invention is shown, as follows:
the present invention can pair the electronic device 10 with the wearable device 20. After the pairing is successful, the electronic device 10 can transmit the image data on the display screen 11 to the wearable device 20 through the wireless network, so that the user can read the display screen 11 through the wearable device 20. The wearable device 20 can capture an image of the user's eyes through the camera unit 25, and wirelessly operate the electronic device 10 through eye movements, thereby starting one or more programs of the electronic device 10.
Referring to fig. 1, the user interface synchronization system 100 includes a graphic output module 17 and an instruction conversion module 18 coupled to the electronic device 10, and a mapping module 27 and an eye movement instruction analysis module 28 coupled to the wearable device 20.
In the present embodiment, the graphic output module 17 and the instruction conversion module 18 are pre-stored in the storage unit 14 of the electronic device 10, so that the processing unit 12 of the electronic device 10 executes the corresponding algorithms after loading the programs of the graphic output module 17 and the instruction conversion module 18. The mapping module 27 and the eye movement instruction analysis module 28 are pre-stored in the storage unit 24 of the wearable device 20, so that the processing unit 22 of the wearable device 20 executes the corresponding algorithms after loading the programs of the mapping module 27 and the eye movement instruction analysis module 28. In another preferred embodiment, the instruction conversion module 18 can instead be loaded on the wearable device 20; the wearable device 20 converts the eye movement instruction into an action instruction for the electronic device 10 to execute through the instruction conversion module 18, and then transmits the action instruction to the electronic device 10 through the wireless network to start one or more programs. In this case the wearable device 20 encrypts the action instruction, and the electronic device 10 only needs to decrypt and execute it; this can be understood as another similar implementation of the present invention.
The graphic output module 17 is coupled to the electronic device 10 to access the image data of the electronic device 10 and transmit it to the wearable device 20 via the wireless network. The graphic output module 17 accesses the image data (user interface) provided by the graphics processing unit 13, and the acquired image data is synchronized with the image displayed on the display screen 11; alternatively, the display screen 11 may be forcibly turned off according to a preset or user setting, so that the graphic output module 17 transmits the image data directly to the wearable device 20 through the wireless network, thereby reducing the extra burden on the graphic output module 17 and the power consumption of the electronic device.
The mapping module 27 is paired with the graphic output module 17 through the wireless transmission unit 26, and the mapping module 27 is used for displaying image data on the output unit 21 of the wearable device 20 for the user to watch. Referring to fig. 3, the mapping module 27 can establish a user interface window W for displaying the image data, and proportionally enlarge or reduce the user interface window W according to the length and width of the display screen 11 of the electronic device 10, so that the user can operate the user interface window W within a better visual perception range. The mapping module 27 displays a movable cursor W1 on the user interface window W, the cursor W1 moves along the gazing direction of the user, and the gazing direction is calculated by the eye movement command analysis module 28.
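To make the proportional scaling of the user interface window W concrete, the following Python sketch (with hypothetical helper names; the actual implementation of the mapping module 27 is not limited to this) computes the size and placement of the window W from the dimensions of the display screen 11 and the output unit 21 while preserving the screen's aspect ratio:

    def fit_ui_window(screen_w, screen_h, output_w, output_h, margin=0.9):
        # Scale the device screen (screen_w x screen_h) in equal proportion
        # so it fits the wearable output unit (output_w x output_h).
        # Returns the window size and its top-left offset, centered on the
        # output unit; `margin` leaves a border for comfortable viewing.
        # A single scale factor for both axes keeps the aspect ratio intact.
        scale = min(output_w / screen_w, output_h / screen_h) * margin
        win_w, win_h = round(screen_w * scale), round(screen_h * scale)
        return win_w, win_h, ((output_w - win_w) // 2, (output_h - win_h) // 2)

    # Example: a 1080x1920 phone screen mapped onto a 1280x720 output unit.
    print(fit_ui_window(1080, 1920, 1280, 720))  # -> (364, 648, (458, 36))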
In a preferred embodiment, the graphic output module 17 may compress a series of image data into a media stream, transmit the data in segments through the wireless network, and let the mapping module 27 coupled to the wearable device 20 decompress the stream packets, so as to display the image data on the output unit 21 of the wearable device 20 in real time.
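One possible realization of this streamed transfer is sketched below in Python (standard library only; the function names and the use of zlib are illustrative assumptions, since the disclosure does not fix a codec or transport): the sender compresses each frame and transmits it as a length-prefixed segment, and the receiver reassembles and decompresses it for real-time display.

    import socket, struct, zlib

    def send_frame(sock: socket.socket, frame: bytes) -> None:
        # Graphic output module 17 side: compress one frame of image data
        # and transmit it as a length-prefixed packet over the wireless link.
        payload = zlib.compress(frame)
        sock.sendall(struct.pack(">I", len(payload)) + payload)

    def _recv_exact(sock: socket.socket, n: int) -> bytes:
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("stream closed mid-frame")
            buf += chunk
        return buf

    def recv_frame(sock: socket.socket) -> bytes:
        # Mapping module 27 side: read one length-prefixed packet and
        # decompress it back into raw image data for the output unit 21.
        (length,) = struct.unpack(">I", _recv_exact(sock, 4))
        return zlib.decompress(_recv_exact(sock, length))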
After the image data is displayed in the user interface window W provided by the output unit 21, the user can operate the cursor W1 on the user interface window W through eye movements, or generate corresponding eye movement instructions through continuous eye movements to operate the user interface window W. After analyzing the user's eye movement, the eye movement instruction analysis module 28 converts it into an eye movement instruction and transmits the instruction through the wireless network to the instruction conversion module 18 coupled to the electronic device 10. The instruction conversion module 18 analyzes the eye movement instruction, finds the action instruction of the electronic device 10 corresponding to it, and starts one or more programs corresponding to that action instruction through the processing unit 12. In a preferred embodiment, the instruction conversion module 18 includes a lookup table, through which it converts the eye movement instruction into the corresponding action instruction of the electronic device 10. Therefore, when the wearable device 20 is successfully paired with the electronic device 10, the user can operate the user interface window W through the wearable device 20: the camera unit 25 continuously captures the user's eye image, and the user operates the electronic device 10 through the eye movements captured by the wearable device 20.
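The lookup-table conversion performed by the instruction conversion module 18 can be pictured as a plain dictionary mapping; the instruction names below are illustrative placeholders rather than the actual instruction set of the system:

    # Hypothetical eye-movement-instruction -> action-instruction table.
    EYE_TO_ACTION = {
        "EYE_SWIPE_LEFT":  "ACTION_TURN_PAGE_LEFT_TO_RIGHT",
        "EYE_SWIPE_RIGHT": "ACTION_TURN_PAGE_RIGHT_TO_LEFT",
        "EYE_DWELL":       "ACTION_TOUCH_START",
        "EYE_SCROLL":      "ACTION_SCROLL",
    }

    def convert(eye_instruction: str) -> str:
        # Instruction conversion module 18: map a received eye movement
        # instruction to an action instruction executable by the
        # processing unit 12; unknown instructions are rejected.
        if eye_instruction not in EYE_TO_ACTION:
            raise ValueError("unknown eye movement instruction: " + eye_instruction)
        return EYE_TO_ACTION[eye_instruction]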
Specifically, the eye movement instruction analysis module 28 performs two functions: first, it analyzes the corresponding eye movement from the user's eye image and determines the user's gaze direction from that movement; second, it determines the eye movement instruction the user intends to input according to the gaze direction.
Techniques for obtaining the eye movement (gaze direction) from the eye image include the Purkinje image tracking method (DPI), the infrared video system method (IRVS), and the infrared oculography method (IROG); the present invention is not limited thereto. With such a method, a plurality of sample points of the eye image mapped onto the output unit 21 (as shown in fig. 4) can be acquired from a plurality of eye images of the user, and the acquired sample points are used to further analyze the eye movement instruction the user intends to input.
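As one concrete, purely illustrative way to obtain such sample points, the sketch below fits a least-squares affine calibration that maps pupil-center coordinates measured in the eye image onto output-unit coordinates; it stands in for whichever of the above gaze-estimation methods is actually used, and assumes numpy is available:

    import numpy as np

    def fit_affine(pupil_pts, screen_pts):
        # Fit the affine map [x, y, 1] -> screen coordinates from
        # calibration pairs (pupil-center point, known gaze target).
        P = np.hstack([np.asarray(pupil_pts, float), np.ones((len(pupil_pts), 1))])
        S = np.asarray(screen_pts, float)
        A, *_ = np.linalg.lstsq(P, S, rcond=None)
        return A  # shape (3, 2)

    def gaze_sample(A, pupil_xy):
        # Map one pupil-center measurement to a sample point on the
        # output unit 21.
        x, y = pupil_xy
        return np.array([x, y, 1.0]) @ A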
Referring to fig. 4, which discloses a series of sample points recorded while the user performs a left-side page turn: by system default, the middle position is set as the starting point and end point of an eye trajectory instruction; in the figure, the starting position is shown as ○, the end position is shown as ×, and the detected coordinates are arranged as dots in sequence. When the eye movement instruction analysis module 28 detects that the user's gaze direction moves rapidly from the middle position to the right-side position, it transmits a left-side page-turning eye movement instruction to the instruction conversion module 18 of the electronic device 10. Upon receiving the left-side page-turning eye movement instruction, the instruction conversion module 18 calls the corresponding action instruction through the lookup table, so that the processing unit 12 executes a left-to-right page-turning event.
For the right-side page turn, please refer to fig. 5, which discloses a series of sample points recorded while the user performs a right-side page turn under the system default: in the figure, the starting position is shown as ○, the end position is shown as ×, and the detected coordinates are arranged as dots in sequence. When the eye movement instruction analysis module 28 detects that the user's gaze direction moves rapidly from the middle position to the left-side position, it transmits a right-side page-turning eye movement instruction to the instruction conversion module 18 of the electronic device 10. Upon receiving the right-side page-turning eye movement instruction, the instruction conversion module 18 finds the corresponding action instruction through the lookup table, so that the processing unit 12 executes an event of turning the right-side page to the left.
After a page-turning instruction is input, the electronic device 10 enters a short non-response period (e.g., one second), so that the return of the user's eyes to the middle position after a page turn is not interpreted as another page-turning instruction.
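A minimal sketch of this page-turn detection follows, assuming normalized horizontal gaze coordinates in [0, 1] sampled over time; the travel span, detection window, and the one-second non-response period are illustrative values, not parameters fixed by the disclosure:

    import time

    class SwipeDetector:
        # Detect rapid left->right or right->left gaze motion and emit a
        # page-turn eye movement instruction, then ignore input briefly so
        # the eye's return to the middle position is not read as a second swipe.

        def __init__(self, span=0.4, window=0.3, cooldown=1.0):
            self.span = span          # minimum horizontal travel (fraction of window W)
            self.window = window      # time window for "rapid" motion, in seconds
            self.cooldown = cooldown  # non-response period after a swipe
            self.samples = []         # (timestamp, x) gaze history
            self.blocked_until = 0.0

        def feed(self, x, now=None):
            now = time.monotonic() if now is None else now
            if now < self.blocked_until:
                return None
            self.samples.append((now, x))
            # Keep only samples inside the detection window.
            self.samples = [(t, v) for t, v in self.samples if now - t <= self.window]
            xs = [v for _, v in self.samples]
            if len(xs) < 2:
                return None
            travel = xs[-1] - xs[0]
            if abs(travel) >= self.span:
                self.samples.clear()
                self.blocked_until = now + self.cooldown
                # Left -> right turns the left page; right -> left the right page.
                return "EYE_SWIPE_LEFT" if travel > 0 else "EYE_SWIPE_RIGHT"
            return None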
In another preferred embodiment, the user can input a page-scrolling action instruction to the electronic device 10 through an eye movement instruction, so that the page on the user interface window W scrolls in the direction of the user's gaze. Specifically, the eye movement instruction analysis module 28 detects the user's eye movements, sets a reference coordinate when it detects a trigger action of the user (the trigger action may be blinking, closing the eyes, drawing a circle, or another predefined eye movement), and continuously records the X-axis movement distance and the Y-axis movement distance of the gaze direction relative to the reference coordinate. When the X-axis movement distance or the Y-axis movement distance is greater than a threshold value, the eye movement instruction analysis module 28 transmits an eye movement instruction containing the movement direction (i.e., a positive or negative value on the X-axis or Y-axis) and the movement distance to the instruction conversion module 18, which passes the converted instruction to the processing unit 12, so that the processing unit 12 executes a scrolling action instruction corresponding to the eye movement direction. The obtained eye movement direction serves as the reference value for the scroll direction, and the obtained movement distance serves as the reference value for the scroll speed.
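The scrolling decision described above reduces to comparing the gaze displacement against a threshold; a minimal sketch follows, assuming pixel coordinates and an illustrative threshold of 50 pixels (the actual value is a design choice):

    def scroll_instruction(ref, gaze, threshold=50):
        # Given the reference coordinate `ref` set at the trigger action and
        # the current gaze point `gaze`, return a scroll eye movement
        # instruction once either axis exceeds `threshold`, else None.
        # The sign of the dominant displacement gives the scroll direction;
        # its magnitude serves as the scroll-speed reference value.
        dx, dy = gaze[0] - ref[0], gaze[1] - ref[1]
        if abs(dx) <= threshold and abs(dy) <= threshold:
            return None
        if abs(dx) >= abs(dy):
            return {"instruction": "EYE_SCROLL",
                    "direction": "RIGHT" if dx > 0 else "LEFT",
                    "distance": abs(dx)}
        return {"instruction": "EYE_SCROLL",
                "direction": "DOWN" if dy > 0 else "UP",
                "distance": abs(dy)}

    # Example: reference set at (400, 300); the gaze drifts 80 px downward.
    print(scroll_instruction((400, 300), (410, 380)))
    # -> {'instruction': 'EYE_SCROLL', 'direction': 'DOWN', 'distance': 80}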
Referring to fig. 6, the user can manipulate the cursor W1 on the user interface window W by gazing at the user interface window W. The eye movement instruction analysis module 28 analyzes the user's gaze direction from the acquired eye images, and forms on the user interface window W a cursor W1 that moves according to the gaze direction. When the user's gaze direction stays on the user interface window W, the eye movement instruction analysis module 28 records the coordinate position on the display screen 11 of the electronic device 10 corresponding to the gaze direction, and when the gaze direction stays substantially on the same graphical interface for more than a set threshold time, transmits a trigger instruction to the instruction conversion module 18 via the wireless network; the instruction conversion module 18 converts the trigger instruction into a touch-start action instruction to start the software program or command corresponding to the graphical interface at that coordinate position.
As shown in fig. 6, in a preferred embodiment, the mapping module 27, upon receiving the image data, defines and marks the range of each corresponding graphical interface on the output unit 21 (the defined ranges are obtained from the graphics processing unit 13 of the electronic device 10). When the user's gaze direction stays within the range defined for one of the graphical interfaces, the cursor W1 is converted into a timer W2 that marks the time remaining until the graphical interface is started. As shown, the timer W2 includes a percentage indicator and a timing bar; when the user's gaze direction stays on the graphical interface, the cursor W1 becomes the timer W2, whose percentage indicator and timing bar display the remaining time. When the percentage reaches 100% and the timing bar is full, the eye movement instruction analysis module 28 transmits a trigger instruction to the instruction conversion module 18 via the wireless network to start the software program or command corresponding to the graphical interface.
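The cursor-to-timer behavior can be sketched as a small dwell timer; the 1.5-second threshold sits inside the 1-2 second range mentioned elsewhere in the embodiments, and the class structure is an illustrative assumption:

    class DwellTimer:
        # Convert a sustained gaze on one graphical interface into a trigger
        # instruction, exposing the fill percentage displayed by timer W2.

        def __init__(self, threshold_s=1.5):
            self.threshold_s = threshold_s
            self.target = None  # graphical interface currently gazed at
            self.start = None   # time the gaze entered the target's range

        def update(self, target_id, now):
            # Feed the graphical interface under the gaze (or None) at time
            # `now`; return "TRIGGER" once the dwell threshold is crossed.
            if target_id != self.target:      # gaze moved: restart timing
                self.target, self.start = target_id, now
                return None
            if target_id is None or self.start is None:
                return None
            if now - self.start >= self.threshold_s:
                self.start = None             # fire once, then wait for a new gaze
                return "TRIGGER"
            return None

        def percent(self, now):
            # Fill level of timer W2, from 0 to 100.
            if self.target is None or self.start is None:
                return 0
            return min(100, int(100 * (now - self.start) / self.threshold_s))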
Referring to fig. 7, in another preferred embodiment, the user interface window W may be provided with a plurality of operation areas, and the user can gaze at the graphical interface corresponding to each operation area to generate the eye movement instruction associated with that area. For example, when the user rests the cursor W1 (gaze direction) on the arrow W3 in the right direction, the eye movement instruction analysis module 28 transmits a right-side page-turning eye movement instruction to the instruction conversion module 18 of the electronic device 10; when the user rests the cursor W1 on the arrow W4 in the left direction, the eye movement instruction analysis module 28 transmits a left-side page-turning eye movement instruction to the instruction conversion module 18; when the user rests the cursor W1 on the arrow W5 in the upper direction, the eye movement instruction analysis module 28 transmits an upward page-turning eye movement instruction to the instruction conversion module 18; when the user rests the cursor W1 on the arrow W6 in the lower direction, the eye movement instruction analysis module 28 transmits a downward page-turning eye movement instruction to the instruction conversion module 18. After receiving the eye movement instruction, the instruction conversion module 18 finds the corresponding action instruction through the lookup table or in an object-oriented manner, and transmits the action instruction to the processing unit 12 to start the corresponding one or more programs.
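These operation areas amount to hit-testing the cursor position against fixed regions of the window W; a short sketch follows, with made-up rectangle coordinates (the real layout of W3-W6 follows fig. 7):

    # Hypothetical operation areas on the user interface window W, given as
    # (x, y, width, height), each paired with its eye movement instruction.
    OPERATION_AREAS = [
        ((1180, 310, 100, 100), "EYE_PAGE_RIGHT"),  # right-direction arrow W3
        ((0,    310, 100, 100), "EYE_PAGE_LEFT"),   # left-direction arrow W4
        ((590,  0,   100, 100), "EYE_PAGE_UP"),     # upper-direction arrow W5
        ((590,  620, 100, 100), "EYE_PAGE_DOWN"),   # lower-direction arrow W6
    ]

    def area_instruction(gaze_x, gaze_y):
        # Return the eye movement instruction of the operation area that the
        # cursor W1 rests on, or None when the gaze is outside every area.
        for (x, y, w, h), instruction in OPERATION_AREAS:
            if x <= gaze_x < x + w and y <= gaze_y < y + h:
                return instruction
        return None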
The following is a detailed description of the user interface synchronization method of the present invention with reference to the drawings, and fig. 8 is a schematic flow chart of the user interface synchronization method of the present invention, as shown in the drawings:
the user interface synchronization method of the present invention is applied to the electronic device 10 and the wearable device 20, and can transfer the screen on the electronic device 10 to the wearable device 20 to wirelessly operate the user interface of the electronic device 10 through the eye control function of the wearable device 20. The specific flow related to interface synchronization is as follows:
initially, the electronic device 10 and the wearable device 20 are paired, and the pairing may be performed through encryption, key establishment, or other mutual authentication methods, so as to establish a connection between the electronic device 10 and the wearable device 20. (step S201)
When the pairing is completed, the image data of the electronic device 10 (for example, the image on the display screen 11) is accessed, and the image data of the electronic device 10 is transmitted to the wearable device 20 via the wireless network. (step S202)
After receiving the image data, the wearable device 20 displays the image data on the output unit 21 of the wearable device 20 for the user to view. The wearable device 20 establishes a user interface window W for displaying the image data, enlarged or reduced in equal proportion according to the length and width of the display screen 11 of the electronic device 10. (step S203)
The wearable device 20 analyzes the gazing direction of the user according to the acquired eye image, and forms a cursor W1 movable according to the gazing direction on the user interface window W. (step S204)
The eye movement instruction acquired by the wearable device 20 is analyzed and transmitted to the electronic device 10 via the wireless network. Upon receiving the eye movement instruction, the electronic device 10 converts it and outputs it as an action instruction that the electronic device 10 can execute. (step S205)
Three different embodiments of step S205 are described below; it should be understood that the three embodiments can be performed simultaneously in step S205:
In the first embodiment, the contents of fig. 9 are referred to below. First, the range corresponding to each graphical interface on the display screen is obtained; this range can be obtained from the graphics processing unit 13 of the electronic device 10, and the range of each graphical interface is associated with the one or more programs corresponding to that graphical interface (step S2051A). When the user's gaze direction stays on the user interface window, the coordinate position on the display screen 11 of the electronic device 10 corresponding to the gaze direction is recorded to confirm the user's gaze direction (step S2052A). When the gaze direction stays on one of the graphical interfaces, a timer is started to record the dwell time, and it is determined whether the dwell time exceeds a set threshold time, for example 1-2 seconds (step S2053A). If the threshold time is exceeded, a trigger instruction is transmitted to the electronic device 10 to start the one or more programs corresponding to the graphical interface (step S2054A). On the other hand, if the gaze direction leaves the range of the graphical interface, the process returns to step S2052A and continues to detect the user's gaze direction.
In the second embodiment, the contents of fig. 10 are referred to below. First, a decision process is started in which the user's gaze direction is continuously detected (step S2051B). When it is detected that the gaze direction moves rapidly from left to right, a left-side page-turning eye movement instruction is transmitted to the electronic device 10, so that the electronic device 10 executes an action instruction for turning the page from left to right (step S2052B). When it is detected that the gaze direction moves rapidly from right to left, a right-side page-turning eye movement instruction is transmitted to the electronic device 10, so that the electronic device 10 executes an action instruction for turning the page from right to left (step S2053B).
In the third embodiment, the contents of fig. 11 are referred to below. First, the wearable device 20 detects whether the user performs a trigger action (step S2051C). When the trigger action is detected, a reference coordinate is set, the user's gaze direction is continuously detected, and the X-axis movement distance and the Y-axis movement distance of the gaze direction relative to the reference coordinate are recorded (step S2052C). Then, it is determined whether the X-axis movement distance or the Y-axis movement distance is greater than a threshold value (step S2053C); if so, the process proceeds to the next step, otherwise it returns to step S2052C. When the X-axis movement distance or the Y-axis movement distance is greater than the threshold value, the eye movement direction is determined (step S2054C). An eye movement instruction containing the eye movement direction and the movement distance is transmitted to the electronic device 10, so that the electronic device 10 executes the scrolling action instruction corresponding to the eye movement direction (step S2055C). In the above steps, if the distance of the user's gaze direction returns to below the threshold value, the process returns to the state before step S2051C and continues to detect whether the user produces a trigger action.
The method steps described in the present invention can also be implemented as a program recorded on a computer-readable recording medium, such as an optical disc, a hard disk, or a semiconductor memory device, and loaded onto an electronic device through the computer-readable recording medium for access and use by that electronic device or apparatus.
The method steps described in the present invention can also be implemented as a computer program product stored on the hard disk or memory device of a network server, such as an App Store, Google Play, a Windows marketplace, or another similar online application publishing platform, and implemented by uploading the computer program product to the server for a user to download, for a fee.
In summary, the user interface synchronization system of the present invention can transmit the image data of the electronic device to the output unit of the wearable device, so that the electronic device can be operated by tracking the user's eye movements. The front lens can be kept at a fixed distance from the user's eyes, so that the user's eye movements can be detected easily.
Although the present invention has been described in detail, it should be understood that the foregoing is only illustrative of the preferred embodiments of the present invention, and that various changes and modifications can be made herein without departing from the spirit and scope of the invention.