
CN106201284B - User interface synchronization system and method - Google Patents

User interface synchronization system and method

Info

Publication number
CN106201284B
Authority
CN
China
Prior art keywords
user
electronic device
eye movement
eye
command
Prior art date
Legal status
Active
Application number
CN201510340993.1A
Other languages
Chinese (zh)
Other versions
CN106201284A
Inventor
邹嘉骏
Current Assignee
Utechzone Co Ltd
Original Assignee
Utechzone Co Ltd
Priority date
Filing date
Publication date
Application filed by Utechzone Co Ltd
Publication of CN106201284A
Application granted
Publication of CN106201284B
Legal status: Active


Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract


A user interface synchronization system includes a graphics output module and a command conversion module coupled to an electronic device, and a mapping module and an eye movement command analysis module coupled to a wearable device. The graphics output module accesses image data of the electronic device and transmits it to the wearable device over a wireless network. The mapping module displays the image data on the wearable device's output unit so that the user can operate the interface by gaze. The eye movement command analysis module analyzes the acquired eye-action commands and transmits them to the electronic device over the wireless network. On receiving an eye-action command, the command conversion module converts it into an action command that the electronic device can execute.


Description

User interface synchronization system and method
Technical Field
The present invention relates to a user interface synchronization system and method, and more particularly to a user interface synchronization system and method for use between an electronic device and a wearable device.
Background
Eye tracking is a technique for following eye movement by analyzing images of a user's eyes or the motion of the eyeballs relative to the head. An eye tracker is a device that measures eyeball position and movement, and it is widely used in research on visual systems, psychology, and cognitive linguistics. Several eye tracking methods currently exist; the more common ones include the Dual Purkinje Image (DPI) method, the infrared image system (IRVS) method, and the infrared oculography (IROG) method, all of which can determine the user's gaze direction from captured eye images.
Recently, some mobile terminal brands have integrated eye tracking into their devices: by following the user's gaze direction, the mobile terminal can generate a corresponding operation command. Such techniques make the display screen easier to operate; when the user has no free hand for the mobile terminal, commands can still be entered through eye tracking. Another application is video playback, where the terminal determines from the gaze direction whether the user is looking at the display screen and pauses playback when the user looks away, ensuring that the user does not miss any highlight of the movie.
However, in the prior art the user's eyes are photographed through the mobile terminal's front lens, so the captured image is easily affected by environmental factors (such as very bright or very dim lighting), which makes eye tracking difficult. Accurate eye tracking (such as gaze-direction detection) is also hampered because the distance between the front lens or display screen and the user's eyes is not fixed; the gaze direction therefore cannot be obtained by triangulation, and the user's image can only support a rough judgment of whether the user is looking left or right.
Disclosure of Invention
The main objective of the present invention is to solve the problem that, in the prior art, an electronic device cannot be operated accurately by eye tracking technology.
To solve the above problem, the present invention provides a user interface synchronization system for pairing an electronic device with a wearable device, the user interface synchronization system comprising:
a graphics output module, coupled to the electronic device, which accesses the image data of the electronic device and transmits it to the wearable device through a wireless network;
a mapping module, coupled to the wearable device, which displays the image data on an output unit of the wearable device for the user to view and operate by gaze;
an eye movement command analysis module, coupled to the wearable device, which analyzes the eye-action commands acquired by the wearable device; and
a command conversion module, coupled to the electronic device or the wearable device, which converts each eye-action command upon receipt so as to output it as an action command executable by the electronic device.
Furthermore, the mapping module establishes, on the output unit of the wearable device, a user interface window for displaying the image data; the window is enlarged or reduced in equal proportion to the length and width of the display screen of the electronic device.
Furthermore, the eye movement command analysis module analyzes the user's gaze direction from the acquired eye images and forms on the user interface window a cursor that moves with the gaze direction.
Further, when the user's gaze direction stays on the user interface window, the eye movement command analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and when the gaze stays on substantially the same graphical interface for longer than a set threshold time, it transmits a trigger command through the wireless network to the command conversion module to start the one or more programs corresponding to that graphical interface.
Further, when the eye movement command analysis module detects that the user's gaze direction moves rapidly from left to right, it transmits a left-side page-turn eye-action command to the command conversion module so that the electronic device executes an action command for turning the page from left to right; when it detects that the gaze direction moves rapidly from right to left, it transmits a right-side page-turn eye-action command to the command conversion module so that the electronic device executes an action command for turning the page from right to left.
Furthermore, the eye movement command analysis module sets a reference coordinate when it detects a trigger action of the user, continuously detects the user's gaze direction, and records the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate; when either distance exceeds a threshold, it transmits an eye-action command corresponding to the eye movement direction and movement distance to the command conversion module, so that the electronic device executes a scrolling action command corresponding to the eye movement direction.
Further, the wireless network employs a wireless fidelity Direct (WiFi Direct) protocol, Bluetooth wireless transmission (Bluetooth), or a virtual wireless AP (Wi-Fi soft AP).
Another objective of the present invention is to provide an electronic device serving as a controlled end, which includes a display screen, a wireless transmission unit, and a processor connected to the display screen and the wireless transmission unit. The processor has a graphics processing unit for transmitting image data to the display screen to provide a user interface for operation, and an arithmetic unit for loading and executing the following programs:
a graphics output module, which accesses the image data of the electronic device and transmits the displayed image data to the wearable device through a wireless network, so that the wearable device can output it for the user's visual operation; and
a command conversion module, which receives the eye-action command provided by the wearable device through the wireless network and outputs it as an action command executable by the electronic device.
Further, the wireless network employs a wireless fidelity Direct (WiFi Direct) protocol, Bluetooth wireless transmission (Bluetooth), or a virtual wireless AP (Wi-Fi soft AP).
Another objective of the present invention is to provide a wearable device serving as a master control end, which includes an output unit, a wireless transmission unit, a camera unit, and a processor connected to the output unit, the wireless transmission unit, and the camera unit. The camera unit photographs the user's eyes to obtain eye images. The processor has a graphics processing unit for transmitting image data to the output unit to provide an operable user interface, and an arithmetic unit for loading and executing the following programs:
a mapping module, which obtains the image data of the electronic device through a wireless network and transmits it to the graphics processing unit of the wearable device for display on the output unit, so that the user can view and operate it by gaze; and
an eye movement command analysis module, which acquires the eye image captured by the camera unit, derives an eye-action command from it, and transmits the command to the electronic device through the wireless network so as to start one or more programs of the electronic device.
Furthermore, the mapping module establishes, on the output unit of the wearable device, a user interface window for displaying the image data; the window is enlarged or reduced in equal proportion to the length and width of the display screen of the electronic device.
Furthermore, the eye movement command analysis module analyzes the user's gaze direction from the acquired eye image and forms on the user interface window a cursor that moves with the gaze direction.
Further, when the user's gaze direction stays on the user interface window, the eye movement command analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and when the gaze is detected to stay on substantially the same graphical interface for longer than a set threshold time, it transmits a trigger command through the wireless network to the electronic device to start the one or more programs corresponding to that graphical interface.
Further, the eye movement command analysis module, upon detecting that the user's gaze direction moves rapidly from left to right, transmits a left-side page-turn eye-action command to the electronic device so that the electronic device executes a left-to-right page-turning program, and upon detecting that the gaze direction moves rapidly from right to left, transmits a right-side page-turn eye-action command to the electronic device so that the electronic device executes a right-to-left page-turning program.
Furthermore, the eye movement command analysis module sets a reference coordinate when it detects a trigger action of the user, continuously detects the user's gaze direction, and records the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate; when either distance exceeds a threshold, it transmits an eye-action command corresponding to the eye movement direction and movement distance to the electronic device, so that the electronic device executes a scrolling program corresponding to the eye movement direction.
Further, the wireless network employs a wireless fidelity Direct (WiFi Direct) protocol, Bluetooth wireless transmission (Bluetooth), or a virtual wireless AP (Wi-Fi soft AP).
Another objective of the present invention is to provide an interface synchronization method for a wearable device and an electronic device, comprising: accessing the image data of the electronic device and transmitting it to the wearable device through a wireless network; displaying the image data on an output unit of the wearable device for the user to view; analyzing the eye-action command acquired by the wearable device and transmitting it to the electronic device through the wireless network; and converting the eye-action command upon receipt so as to output it as an action command executable by the electronic device.
Further, after receiving the image data, the wearable device establishes a user interface window for displaying the image data, enlarged or reduced in equal proportion to the length and width of the display screen of the electronic device.
Furthermore, the wearable device analyzes the user's gaze direction from the acquired eye image and forms on the user interface window a cursor that moves with the gaze direction.
Further, when the user's gaze direction stays on the user interface window, the coordinate position on the display screen of the electronic device corresponding to the gaze direction is recorded, and when the gaze stays on substantially the same graphical interface for longer than a set threshold time, a trigger command is transmitted through the wireless network to the electronic device to start the one or more programs corresponding to that graphical interface.
Further, when it is detected that the user's gaze direction moves rapidly from left to right, a left-side page-turn eye-action command is transmitted to the electronic device so that the electronic device executes an action command for turning the page from left to right, and when it is detected that the gaze direction moves rapidly from right to left, a right-side page-turn eye-action command is transmitted to the electronic device so that the electronic device executes an action command for turning the page from right to left.
Further, a reference coordinate is set when a trigger action of the user is detected, the user's gaze direction is continuously detected, and the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate are recorded; when either distance exceeds a threshold, an eye-action command corresponding to the eye movement direction and movement distance is transmitted to the electronic device, so that the electronic device executes a scrolling action command corresponding to the eye movement direction.
It is still another object of the present invention to provide a computer-readable recording medium recording a program which, when loaded and executed by an electronic device and a wearable device, performs the method described above.
It is still another object of the present invention to provide a computer program product which, when loaded into an electronic device and a wearable device, performs the method described above.
Accordingly, compared with the prior art, the present invention has the following advantages:
1. The user interface synchronization system transmits the image data of the electronic device to the output unit of the wearable device, so the electronic device can be operated by tracking the user's eye movement.
2. The front lens is kept at a fixed distance from the user's eyes, making the eye movement easy to detect.
Drawings
FIG. 1: a block diagram of the user interface synchronization system of the present invention.
FIG. 2: a schematic diagram illustrating use of the user interface synchronization system of the present invention.
FIG. 3: a schematic diagram (1) of a user interface window.
FIG. 4: a schematic diagram (1) of a trajectory generated by an eye action on the user interface window.
FIG. 5: a schematic diagram (2) of a trajectory generated by an eye action on the user interface window.
FIG. 6: a schematic diagram (2) of a user interface window.
FIG. 7: a schematic diagram of another user interface window.
FIG. 8: a flow diagram (1) of the user interface synchronization method of the present invention.
FIG. 9: a flow diagram (2) of the user interface synchronization method of the present invention.
FIG. 10: a flow diagram (3) of the user interface synchronization method of the present invention.
FIG. 11: a flow diagram (4) of the user interface synchronization method of the present invention.
Description of the symbols
100 user interface synchronization system
10 electronic device
11 display screen
12 processing unit
13 graphic processing unit
14 storage unit
16 wireless transmission unit
17 graphic output module
18 instruction conversion module
CU1 processor
20 wearable device
21 output unit
22 processing unit
23 graphic processing unit
24 storage unit
25 camera unit
26 wireless transmission unit
27 mapping module
28 eye movement instruction analysis module
CU2 processor
W user interface window
W1 cursor
W2 timer
W3, W4, W5, W6 arrows (right, left, up, down)
Steps S201 to S205
Steps S2051A-S2054A
Steps S2051B-S2053B
Steps S2051C-S2055C
Detailed Description
The detailed description and technical content of the present invention are now described with reference to the drawings. For convenience of explanation, the drawings are not necessarily to scale and some features may be exaggerated; neither the drawings nor their proportions limit the scope of the present invention.
The present invention provides a user interface synchronization system 100 for transmitting the screen of an electronic device 10 to a wearable device 20 and tracking the user's eye movement through the wearable device 20 so as to operate the electronic device 10.
The electronic device 10 at least includes a display screen 11, a processing unit 12 (CPU), and a graphics processing unit 13 (GPU) capable of inputting and outputting images. Specifically, the electronic device 10 may be a cellular phone, smart phone, tablet computer, handheld mobile communication device, personal digital assistant (PDA), or similar portable electronic device; it may also be an electronic device having a display and control interface, such as a calculator, desktop computer, notebook computer, or vehicle-mounted computer.
The wearable device 20 is in particular a device worn on the user's head. It provides a user interface operable through the output unit 21, captures the user's eyes through the camera unit 25 to obtain eye images, and lets the user operate the interface through the gaze direction. The wearable device 20 includes at least an output unit 21 for presenting an image to the user's eyes, a camera unit 25 for photographing the user's eyes to obtain eye images, a processing unit 22 (CPU), and a graphics processing unit 23 (GPU) for inputting and outputting images. Specifically, the wearable device 20 may be smart glasses, an eye tracker, an augmented reality device, a virtual reality device, or the like.
FIG. 1 is a block diagram of a user interface synchronization system according to the present invention, as shown in the figure:
the hardware architecture of the electronic device 10 and the wearable device 20 will be described below, and after the description of the hardware architecture, the following description will further describe a part of the software architecture.
An electronic device:
As mentioned above, the electronic device 10 serves as a controlled end that transmits its image data to the wearable device 20 and is operated over the wireless network through the eye tracking function of the wearable device 20. The electronic device 10 includes a display screen 11, a processing unit 12, a graphics processing unit 13 (GPU) capable of inputting and outputting images, a storage unit 14, and a wireless transmission unit 16.
The processing unit 12 and the graphics processing unit 13 may together form a processor CU1, allowing the two to be integrated on a single chip and reducing the board space the components require. For example, the processor CU1 may be a processor series developed by ARM Holdings, Ltd., or a Loongson processor developed by the Institute of Computing Technology (ICT) of the Chinese Academy of Sciences; the present invention is not limited to any particular series.
In another preferred embodiment, the processing unit 12 and the graphics processing unit 13 may be implemented as separate processors, handling logical operations and image processing respectively while jointly or cooperatively processing part of the program instructions.
In another preferred embodiment, the processing unit 12 and the storage unit 14 together form a processor, and the processing unit 12 loads a program pre-stored in the storage unit 14 and executes the corresponding algorithm.
In this implementation, the processing unit 12 and the graphics processing unit 13 together form the processor CU1, which is coupled to the storage unit 14. The processor CU1 may be a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), programmable logic device (PLD), or the like, or any combination thereof.
The display screen 11 displays graphic data, such as a user operation interface, a graphical interface, or multimedia images, for the user to read. The display screen 11 may be an active-matrix organic light-emitting diode (AMOLED) display, a thin-film transistor (TFT) display, or another display device; the present invention is not limited in this respect. The display screen 11 is driven by a control circuit that feeds corresponding signals to the data-line and scan-line driving circuits, driving the light-emitting units (pixels) of the panel at the corresponding coordinates. After the display screen 11 accesses the data in the storage unit 14 through the graphics processing unit 13, the corresponding multimedia data is rendered on the display screen 11 for the user to view.
The wireless transmission unit 16 performs data transmission over a wireless network. Specifically, the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless AP (Wi-Fi soft AP). In another preferred embodiment, the wireless transmission unit 16 may be paired with the wearable device 20 by radio frequency identification (RFID) technology, so as to perform short- and medium-range wireless data transmission with the wearable device 20.
Wearable device:
the wearable device 20 can be used as a main control terminal for receiving the image data of the electronic device 10 and outputting the image data to the eyes of the user for the user to watch. As described above, the wearable device 20 includes an output Unit 21, a Processing Unit 22 (CPU), a Graphics Processing Unit 23(GPU) that can input and output images, an imaging Unit 25, and a wireless transmission Unit 26.
The processing unit 22 is substantially the same as the processing unit 12 of the electronic device 10, and a description of the processing unit 22 is omitted here. As with the processing unit 12 of the electronic device 10, in a preferred embodiment, the processing unit 22 and the GPU 23 together form a processor CU2, such that the processing unit 22 and the GPU 23 can be integrated on a single chip. In another preferred embodiment, the processing unit 22 and the graphics processing unit 23 may be configured as processors respectively, and respectively process logical operations and image processing, and jointly or cooperatively process a part of the program instructions. In another preferred embodiment, the processing unit 22 and the storage unit 24 together form a processor, and the processing unit 22 can load a program pre-stored in the storage unit 24 and execute a corresponding algorithm.
In this implementation, processing unit 22 and graphics processing unit 23 together form processor CU2, and processor CU2 is coupled to storage unit 24. The Processing Unit 22 may be a Central Processing Unit (CPU), or other programmable general purpose or special purpose Microprocessor (Microprocessor), Digital Signal Processor (DSP), programmable controller, Application Specific Integrated Circuit (ASIC), Programmable Logic Device (PLD), or other similar devices or combinations thereof.
The output unit 21 displays the graphic data and delivers it to the user's eyes for visual operation. The output unit 21 may be a display screen, such as an active-matrix organic light-emitting diode (AMOLED) display, a thin-film transistor (TFT) display, or a similar display device. In another preferred embodiment, the output unit 21 may be a retinal display that projects the image directly onto the retina through retinal image projection (RID) technology; such a display forms the image on the retina via reflection off the glasses, so that the image beam merges with the real scene seen by the naked eye. After accessing data in the storage unit 24 through the graphics processing unit 23 (GPU), the output unit 21 presents the corresponding multimedia data to the user's eyes.
The camera unit 25 is a camera equipped with a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor; the present invention is not limited in this respect. The camera unit 25 captures images of the user's eyes, and the obtained eye images are transmitted to the eye movement command analysis module 28 for further analysis, so as to track the user's eye movement and convert it into corresponding eye-action commands.
The wireless transmission unit 26 performs data transmission over a wireless network. Specifically, the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless AP (Wi-Fi soft AP).
The hardware architecture of the electronic device 10 and the wearable device 20 has now been described in detail; the architecture of the user interface synchronization system of the present invention is described next on that basis:
referring to fig. 2, a schematic diagram of a user interface synchronization system according to the present invention is shown, as follows:
the present invention can pair the electronic device 10 with the wearable device 20. After the pairing is successful, the electronic device 10 can transmit the image data on the display screen 11 to the wearable device 20 through the wireless network, so that the user can read the display screen 11 through the wearable device 20. The wearable device 20 can capture an image of the user's eyes through the camera unit 25, and wirelessly operate the electronic device 10 through eye movements, thereby starting one or more programs of the electronic device 10.
Referring to fig. 1, the user interface synchronization system 100 includes a graphics output module 17 and a command conversion module 18 coupled to the electronic device 10, and a mapping module 27 and an eye movement command analysis module 28 coupled to the wearable device 20.
In the present embodiment, the graphics output module 17 and the command conversion module 18 are pre-stored in the storage unit 14 of the electronic device 10, so that the processing unit 12 executes their algorithms after loading their programs. Likewise, the mapping module 27 and the eye movement command analysis module 28 are pre-stored in the storage unit 24 of the wearable device 20, so that the processing unit 22 executes their algorithms after loading their programs. In another preferred embodiment, the command conversion module 18 may instead be loaded on the wearable device 20: the wearable device 20 converts the eye-action command into an action command executable by the electronic device 10 through the command conversion module 18, and then transmits the action command to the electronic device 10 over the wireless network to start one or more programs. In this variant, the wearable device 20 encrypts the action command and the electronic device 10 only needs to decrypt and execute it; this can be understood as another similar implementation of the present invention.
The graphics output module 17 is coupled to the electronic device 10 to access its image data and transmit it to the wearable device 20 via a wireless network. The module reads the image data (user interface) held in the graphics processing unit 13, and the acquired image data is synchronized with the image shown on the display screen 11. Alternatively, according to a default or user setting, the display screen 11 may be forcibly turned off while the graphics output module 17 transmits the image data directly to the wearable device 20 over the wireless network, reducing unnecessary load and the power consumption of the electronic device.
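To make this capture-and-forward role concrete, the following Python fragment is a minimal sketch only: it assumes a length-prefixed TCP stream as a stand-in for the Wi-Fi Direct link, and the names send_frame, grab_frame, and graphics_output_loop are hypothetical rather than taken from the patent.

```python
import socket
import struct

FRAME_HEADER = struct.Struct("!I")  # 4-byte big-endian payload length

def send_frame(sock: socket.socket, frame: bytes) -> None:
    # Length-prefix each frame so the wearable side can find packet
    # boundaries on the byte stream standing in for the wireless link.
    sock.sendall(FRAME_HEADER.pack(len(frame)) + frame)

def graphics_output_loop(grab_frame, sock: socket.socket) -> None:
    # grab_frame() stands in for reading the current image data from the
    # graphics processing unit 13; it returns bytes, or None when no new
    # frame is ready. Blanking the local display for power saving, as
    # described above, is left to the operating system.
    while True:
        frame = grab_frame()
        if frame is None:
            continue
        send_frame(sock, frame)
```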
The mapping module 27 is paired with the graphics output module 17 through the wireless transmission unit 26 and displays the image data on the output unit 21 of the wearable device 20 for the user to view. Referring to fig. 3, the mapping module 27 creates a user interface window W for displaying the image data and proportionally enlarges or reduces it according to the length and width of the display screen 11 of the electronic device 10, so that the user can operate the window within a comfortable visual range. The mapping module 27 also displays a movable cursor W1 on the user interface window W; the cursor W1 follows the user's gaze direction, which is calculated by the eye movement command analysis module 28.
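The equal-proportion adjustment amounts to an aspect-ratio-preserving fit. The helper below is a sketch under that assumption; the function name and the resolutions in the example are illustrative, not values from the patent.

```python
def fit_window(src_w: int, src_h: int, out_w: int, out_h: int) -> tuple:
    # Largest equal-proportion scale of the phone screen (src_w x src_h)
    # that still fits within the wearable's output unit (out_w x out_h).
    scale = min(out_w / src_w, out_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# e.g. a 1080x1920 phone screen rendered on a 1280x720 output unit
print(fit_window(1080, 1920, 1280, 720))  # -> (405, 720)
```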
In a preferred embodiment, the graphics output module 17 may compress the series of image data as streaming media and transmit it in segments over the wireless network, with the mapping module 27 on the wearable device decompressing the stream packets, so that the image data is displayed on the output unit 21 of the wearable device 20 in real time.
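A minimal sketch of this segment-and-decompress round trip, assuming zlib compression and an arbitrary 8 KB segment size (the patent specifies neither the codec nor the packet size):

```python
import zlib

CHUNK = 8192  # wire segment size; an assumed value, not from the patent

def compress_and_segment(frame: bytes):
    # Graphics output module side: compress one frame and yield it in
    # wire-sized segments for transmission over the wireless network.
    payload = zlib.compress(frame)
    for i in range(0, len(payload), CHUNK):
        yield payload[i:i + CHUNK]

def reassemble_and_decompress(segments) -> bytes:
    # Mapping module side: rebuild the stream packet and decompress it
    # before handing the frame to the output unit 21.
    return zlib.decompress(b"".join(segments))

frame = b"\x00" * 100_000  # dummy image data
assert reassemble_and_decompress(compress_and_segment(frame)) == frame
```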
After the image data is displayed in the user interface window W provided by the output unit 21, the user can operate the cursor W1 on the window through eye movement, or generate corresponding eye-action commands through continuous eye movements. After analyzing the user's eye movement, the eye movement command analysis module 28 converts it into an eye-action command and transmits the command over the wireless network to the command conversion module 18 coupled to the electronic device 10. The command conversion module 18 parses it, finds the action command of the electronic device 10 corresponding to the eye-action command, and starts the corresponding one or more programs through the processing unit 12. In a preferred embodiment, the command conversion module 18 includes a lookup table through which it converts the eye-action command into the corresponding action command of the electronic device 10. Thus, once the wearable device 20 is successfully paired with the electronic device 10, the user can operate the user interface window W through the wearable device 20: the camera unit 25 continuously captures the user's eye image, and the captured eye movement lets the user operate the electronic device 10 through the wearable device 20.
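The lookup-table conversion can be pictured as a dictionary from eye-action commands to device action commands; the entries below are hypothetical, since the patent does not enumerate the table.

```python
# Hypothetical eye-action -> device-action lookup table; the patent says a
# lookup table is used but does not list its entries.
EYE_TO_ACTION = {
    "flip_left_page":  "TURN_PAGE_LEFT_TO_RIGHT",
    "flip_right_page": "TURN_PAGE_RIGHT_TO_LEFT",
    "dwell_trigger":   "TOUCH_AT_COORDINATE",
    "scroll":          "SCROLL_PAGE",
}

def convert(eye_command: str) -> str:
    # Command conversion module 18: translate a received eye-action
    # command into an action command the processing unit 12 can execute.
    try:
        return EYE_TO_ACTION[eye_command]
    except KeyError:
        raise ValueError(f"unknown eye-action command: {eye_command!r}")
```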
Specifically, the eye movement command analysis module 28 has two functions: first, analyzing the corresponding eye movement from the user's eye image and determining the user's gaze direction from that movement; second, determining from the gaze direction which eye-action command the user wants to input.
As a technique for obtaining the eye movement (gaze direction) from the eye image, the Purkinje image tracking (DPI) method, the infrared image system (IRVS) method, or the infrared oculography (IROG) method may be used; the present invention is not limited to these. With such a method, a plurality of sample points of the eye image mapped onto the output unit 21 (as shown in fig. 4) can be acquired from successive eye images of the user, and the acquired sample points are used to further analyze the eye-action command that the user wants to input.
Referring to fig. 4, which discloses a series of sample points produced when the user performs a left-side page-turn operation: by system default, the middle position is set as the start and end point of an eye trajectory command. In the figure, the start position is shown as ○, the end position as ✕, and the detected coordinates are arranged as dots in sequence. When the eye movement command analysis module 28 detects that the user's gaze direction moves rapidly from the middle position toward the right, it transmits the left-side page-turn eye-action command to the command conversion module 18 of the electronic device 10; on receiving it, the command conversion module 18 retrieves the corresponding action command through a lookup table, so that the processing unit 12 executes a left-to-right page-turn event.
For the right-side page turn, refer to fig. 5, which discloses a series of sample points produced when the user performs a right-side page-turn operation. In the figure, the start position is shown as ○, the end position as ✕, and the detected coordinates are arranged as dots in sequence. When the eye movement command analysis module 28 detects that the user's gaze direction moves rapidly from the middle position toward the left, it transmits the right-side page-turn eye-action command to the command conversion module 18 of the electronic device 10; on receiving it, the command conversion module 18 finds the corresponding action command through the lookup table, so that the processing unit 12 executes a right-to-left page-turn event.
After a page-turn command is input, the electronic device 10 enters a short non-response period (e.g., one second), so that the eyes returning to the middle position after the page turn are not interpreted as another page-turn command.
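A sketch of this gesture classification and dead-time handling, assuming gaze samples arrive as (timestamp, x) pairs on the window; the sweep span and time window are illustrative guesses, and only the roughly one-second dead time comes from the text above.

```python
import time

def classify_page_flip(samples, window_width, min_span=0.5, max_time=0.3):
    # samples: chronologically ordered (timestamp, x) gaze points on the
    # user interface window. A sweep covering more than min_span of the
    # window width within max_time seconds counts as "fast"; both values
    # are illustrative, not figures from the patent.
    if len(samples) < 2:
        return None
    (t0, x0), (t1, x1) = samples[0], samples[-1]
    if t1 - t0 > max_time:
        return None
    dx = x1 - x0
    if dx > min_span * window_width:
        return "flip_left_page"    # middle -> right sweep
    if dx < -min_span * window_width:
        return "flip_right_page"   # middle -> left sweep
    return None

class RefractoryGate:
    # Suppress further commands for a short dead time (about one second,
    # per the text above) so the eyes returning to the middle position
    # are not read as another page-turn command.
    def __init__(self, dead_time: float = 1.0):
        self.dead_time = dead_time
        self._last = -float("inf")

    def fire(self, command):
        now = time.monotonic()
        if command and now - self._last >= self.dead_time:
            self._last = now
            return command
        return None
```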
In another preferred embodiment, the user can input a page-scrolling action command to the electronic device 10 through an eye-action command, so that the page on the user interface window W scrolls along the user's gaze. Specifically, the eye movement command analysis module 28 detects the user's eye movement and sets a reference point when it detects a trigger action (the trigger action may be blinking, closing the eyes, drawing a circle, or another predefined eye movement), then continuously records the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate. When either distance exceeds the threshold, the eye movement command analysis module 28 transmits an eye-action command containing the movement direction (i.e., the sign along the X or Y axis) and the movement distance to the command conversion module 18, which passes it to the processing unit 12 so that a scrolling action command corresponding to the eye movement direction is executed. The eye movement direction serves as the reference value for the scroll direction, and the movement distance as the reference value for the scroll speed.
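A minimal sketch of that displacement test, assuming pixel coordinates and an illustrative 80-pixel threshold (the patent fixes neither units nor values):

```python
def scroll_command(ref, gaze, threshold=80):
    # Compare the current gaze point against the reference coordinate set
    # at the trigger action; emit a scroll eye-action command once either
    # axis displacement exceeds the threshold.
    dx, dy = gaze[0] - ref[0], gaze[1] - ref[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return None  # still within the dead zone
    axis, delta = ("x", dx) if abs(dx) >= abs(dy) else ("y", dy)
    return {
        "cmd": "scroll",
        "axis": axis,
        "sign": 1 if delta > 0 else -1,   # reference for scroll direction
        "distance": abs(delta),           # reference for scroll speed
    }

print(scroll_command((200, 300), (205, 420)))
# -> {'cmd': 'scroll', 'axis': 'y', 'sign': 1, 'distance': 120}
```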
Referring to fig. 6, the user can manipulate the cursor W1 by gazing at the user interface window W. The eye movement command analysis module 28 analyzes the user's gaze direction from the acquired eye images and forms on the window a cursor W1 that moves with the gaze. When the gaze stays on the user interface window W, the module records the coordinate position on the display screen 11 of the electronic device 10 corresponding to the gaze direction; when the gaze stays on substantially the same graphical interface for longer than a set threshold time, it transmits a trigger command over the wireless network to the command conversion module 18, which converts it into a touch action command that starts the software program or command of the graphical interface at that coordinate position.
As shown in fig. 6, in a preferred embodiment, the mapping module 27, upon receiving the image data, defines the range of each graphical interface on the output unit 21 (the ranges are obtained from the graphics processing unit 13 of the electronic device 10). When the user's gaze stays within the range of one graphical interface, the cursor W1 changes into a timer W2 that shows the time remaining before the interface is triggered. As illustrated, the timer W2 includes a percentage indicator and a timing bar; while the gaze rests on the graphical interface they count up, and when the percentage reaches 100% and the bar fills, the eye movement command analysis module 28 transmits a trigger command over the wireless network to the command conversion module 18 to start the software program or command of that graphical interface.
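A minimal sketch of the dwell-to-trigger timer behind the W1-to-W2 conversion, assuming a 2-second threshold (the description elsewhere suggests roughly 1-2 seconds) and leaving the drawing of the percentage indicator and timing bar to the caller:

```python
class DwellTimer:
    def __init__(self, threshold: float = 2.0):
        self.threshold = threshold  # seconds of dwell before triggering
        self._start = None

    def update(self, now: float, gaze_inside_icon: bool):
        # Returns (percent, triggered). Leaving the icon's range resets
        # the timer, since the gaze must stay on the same graphical
        # interface for the whole threshold time.
        if not gaze_inside_icon:
            self._start = None
            return 0.0, False
        if self._start is None:
            self._start = now
        percent = min((now - self._start) / self.threshold, 1.0) * 100.0
        return percent, percent >= 100.0
```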
Referring to fig. 7, in another preferred embodiment, the user interface window W may provide several operation areas, and the user gazes at the graphical interface of an area to generate the eye-action command assigned to it. For example, when the user rests the cursor W1 (gaze direction) on the right-direction arrow W3, the eye movement command analysis module 28 transmits the right-side page-turn eye-action command to the command conversion module 18 of the electronic device 10; when the user rests the cursor on the left-direction arrow W4, it transmits the left-side page-turn eye-action command; when the user rests the cursor on the up-direction arrow W5, it transmits the upward page-turn eye-action command; and when the user rests the cursor on the down-direction arrow W6, it transmits the downward page-turn eye-action command. On receiving an eye-action command, the command conversion module 18 finds the corresponding action command through a lookup table or an object-oriented mapping and passes it to the processing unit 12 to start the corresponding one or more programs.
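A hit-test sketch for these operation areas, assuming the arrows occupy 15%-wide bands along the window edges; the geometry and the precedence of the checks are assumptions, since the patent does not specify them.

```python
def arrow_region(x, y, w, h, margin=0.15):
    # Map a gaze point inside a w x h window to one of the four arrow
    # operation areas, or None for the central content area.
    if x > (1 - margin) * w:
        return "W3"  # right arrow -> right-side page-turn command
    if x < margin * w:
        return "W4"  # left arrow -> left-side page-turn command
    if y < margin * h:
        return "W5"  # up arrow -> upward page-turn command
    if y > (1 - margin) * h:
        return "W6"  # down arrow -> downward page-turn command
    return None

print(arrow_region(1230, 360, 1280, 720))  # -> 'W3'
```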
The user interface synchronization method of the present invention is now described in detail with reference to the drawings; fig. 8 is a flow diagram of the method, as shown in the figure:
The user interface synchronization method of the present invention is applied to the electronic device 10 and the wearable device 20, and transfers the screen of the electronic device 10 to the wearable device 20 so that the user interface of the electronic device 10 can be operated wirelessly through the eye-control function of the wearable device 20. The specific interface synchronization flow is as follows:
initially, the electronic device 10 and the wearable device 20 are paired, and the pairing may be performed through encryption, key establishment, or other mutual authentication methods, so as to establish a connection between the electronic device 10 and the wearable device 20. (step S201)
When the pairing is completed, the image data of the electronic device 10 (for example, the image on the display screen 11) is accessed, and the image data of the electronic device 10 is transmitted to the wearable device 20 via the wireless network. (step S202)
After receiving the image data, the wearable device 20 displays it on the output unit 21 for the user to view, establishing a user interface window W that is enlarged or reduced in equal proportion to the length and width of the display screen 11 of the electronic device 10. (step S203)
The wearable device 20 analyzes the gazing direction of the user according to the acquired eye image, and forms a cursor W1 movable according to the gazing direction on the user interface window W. (step S204)
The eye-action command acquired by the wearable device 20 is analyzed and transmitted to the electronic device 10 via the wireless network. On receiving the eye-action command, the electronic device 10 converts it so as to output it as an action command that the electronic device 10 can execute. (step S205)
Three different embodiments of step S205 are described below; it should be understood that all three can be carried out simultaneously within step S205:
In the first embodiment, refer to fig. 9. First, the range of each graphical interface on the display screen is obtained; the ranges come from the graphics processing unit 13 of the electronic device 10, and each graphical interface's range is associated with its one or more programs (step S2051A). When the user's gaze direction stays on the user interface window, the coordinate position on the display screen 11 of the electronic device 10 corresponding to the gaze direction is recorded to confirm the gaze direction (step S2052A). When the gaze stays on one of the graphical interfaces, a timer records the dwell time and it is determined whether the dwell exceeds a set threshold time, for example 1-2 seconds (step S2053A). If it does, a trigger command is transmitted to the electronic device 10 to start the one or more programs corresponding to that graphical interface (step S2054A). If instead the gaze direction leaves the range of the graphical interface, the process returns to step S2052A and continues to detect the user's gaze direction.
In the second embodiment, refer to fig. 10. A judgment procedure is started that continuously detects the user's gaze direction (step S2051B). When the gaze direction is detected moving rapidly from left to right, a left-side page-turn eye-action command is transmitted to the electronic device 10, which executes an action command for turning the page from left to right (step S2052B). When the gaze direction is detected moving rapidly from right to left, a right-side page-turn eye-action command is transmitted to the electronic device 10, which executes an action command for turning the page from right to left (step S2053B).
In the third embodiment, refer to fig. 11. First, the wearable device 20 detects whether the user performs a trigger action (step S2051C). When a trigger action is detected, a reference coordinate is set, the user's gaze direction is continuously detected, and the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate are recorded (step S2052C). It is then determined whether the X-axis or Y-axis movement distance exceeds a threshold (step S2053C); if so, the process proceeds to the next step, otherwise it returns to step S2052C. When either distance exceeds the threshold, the eye movement direction is determined (step S2054C), and an eye-action command containing the eye movement direction and distance is transmitted to the electronic device 10, which executes the scrolling action command corresponding to that direction (step S2055C). If the gaze displacement falls back below the threshold, the process returns to the point before step S2051C and continues to watch for a trigger action.
The method steps described in the present invention may also be implemented as a program stored on a computer-readable recording medium, such as an optical disc, a hard disk, or a semiconductor memory device, and loaded onto an electronic device through that medium for access and use by the electronic device or apparatus.
The method steps described in the present invention may also be implemented as a computer program product stored on the hard disk or memory device of a network server, such as App Store, Google Play, Windows, or another similar online application publishing platform; this can be realized by uploading the computer program product to the server for users to pay for and download.
In summary, the user interface synchronization system of the present invention transmits the image data of the electronic device to the output unit of the wearable device, so that the electronic device can be operated by tracking the user's eye movement. The front lens is kept at a fixed distance from the user's eyes, making the eye movement easy to detect.
Although the present invention has been described in detail, it should be understood that the foregoing is only illustrative of the preferred embodiments of the present invention, and that various changes and modifications can be made herein without departing from the spirit and scope of the invention.

Claims (18)

1. A user interface synchronization system for pairing an electronic device with a wearable device, the user interface synchronization system comprising:
a graphics output module, coupled to the electronic device, which accesses image data of the electronic device and transmits it to the wearable device via a wireless network;
a mapping module, coupled to the wearable device, which displays the image data on an output unit of the wearable device for the user to view and operate by gaze;
an eye movement command analysis module, coupled to the wearable device, which analyzes the eye-action commands acquired by the wearable device, sets a reference coordinate when a trigger action of the user is detected, continuously detects the user's gaze direction, and records the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate; and
a command conversion module, coupled to the electronic device or the wearable device, wherein when the X-axis movement distance or the Y-axis movement distance is greater than a threshold, the eye movement command analysis module transmits to the command conversion module an eye-action command corresponding to the user's eye movement direction and movement distance, and the command conversion module converts the eye-action command upon receipt so as to output it as an action command executable by the electronic device for scrolling in the eye movement direction.

2. The user interface synchronization system of claim 1, wherein the mapping module establishes, on the output unit of the wearable device, a user interface window displaying the image data, the window being enlarged or reduced in equal proportion to the length and width of a display screen of the electronic device.

3. The user interface synchronization system of claim 2, wherein the eye movement command analysis module analyzes the user's gaze direction from the acquired eye image and forms on the user interface window a cursor that moves with the gaze direction.

4. The user interface synchronization system of claim 3, wherein when the user's gaze direction stays on the user interface window, the eye movement command analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and when the gaze direction stays on the same graphical interface for longer than a set threshold time, transmits a trigger command via the wireless network to the command conversion module to start the one or more programs corresponding to the graphical interface.

5. The user interface synchronization system of claim 3, wherein the eye movement command analysis module, upon detecting that the user's gaze direction moves rapidly from left to right, transmits a left-side page-turn eye-action command to the command conversion module so that the electronic device executes an action command for turning the page from left to right, and upon detecting that the user's gaze direction moves rapidly from right to left, transmits a right-side page-turn eye-action command to the command conversion module so that the electronic device executes an action command for turning the page from right to left.

6. The user interface synchronization system of claim 1, wherein the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point.

7. A master-control-end wearable device, comprising an output unit, a wireless transmission unit, a camera unit, and a processor connected to the output unit, the wireless transmission unit, and the camera unit, the camera unit photographing and obtaining eye images of a user, the processor comprising:
a graphics processing unit for transmitting image data to the output unit to provide a user interface for user operation; and
an arithmetic unit for loading and executing the following programs:
a mapping module, which obtains image data of an electronic device through a wireless network and transmits it to the graphics processing unit of the wearable device for display on the output unit, for the user to view and operate by gaze; and
an eye movement command analysis module, which acquires the eye image captured by the camera unit and derives eye-action commands from it, sets a reference coordinate when a trigger action of the user is detected, continuously detects the user's gaze direction, and records the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate; when the X-axis movement distance or the Y-axis movement distance is greater than a threshold, an eye-action command corresponding to the user's eye movement direction and movement distance is transmitted through the wireless network to the electronic device, so as to start the electronic device executing a scrolling program corresponding to the eye movement direction.

8. The wearable device of claim 7, wherein the mapping module establishes, on the output unit of the wearable device, a user interface window displaying the image data, the window being enlarged or reduced in equal proportion to the length and width of a display screen of the electronic device.

9. The wearable device of claim 8, wherein the eye movement command analysis module analyzes the user's gaze direction from the acquired eye image and forms on the user interface window a cursor that moves with the gaze direction.

10. The wearable device of claim 9, wherein when the user's gaze direction stays on the user interface window, the eye movement command analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and when the gaze direction is detected to stay on the same graphical interface for longer than a set threshold time, transmits a trigger command via the wireless network to the electronic device to start the one or more programs corresponding to the graphical interface.

11. The wearable device of claim 9, wherein the eye movement command analysis module, upon detecting that the user's gaze direction moves rapidly from left to right, transmits a left-side page-turn eye-action command to the electronic device so that the electronic device executes a left-to-right page-turning program, and upon detecting that the user's gaze direction moves rapidly from right to left, transmits a right-side page-turn eye-action command to the electronic device so that the electronic device executes a right-to-left page-turning program.

12. The wearable device of claim 7, wherein the wireless network employs the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point.

13. An interface synchronization method for a wearable device and an electronic device, comprising:
accessing image data of the electronic device and transmitting it to the wearable device via a wireless network;
displaying the image data, through the wearable device, on an output unit of the wearable device for the user to view and operate by gaze;
analyzing the eye-action commands acquired by the wearable device, setting a reference coordinate when a trigger action of the user is detected, continuously detecting the user's gaze direction, recording the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate, and, when the X-axis movement distance or the Y-axis movement distance is greater than a threshold, transmitting an eye-action command corresponding to the eye movement direction and movement distance to the electronic device via the wireless network; and
converting the eye-action command upon receipt so as to output it as an action command executable by the electronic device for scrolling in the eye movement direction.

14. The interface synchronization method of claim 13, wherein after receiving the image data, the wearable device establishes a user interface window displaying the image data, enlarged or reduced in equal proportion to the length and width of a display screen of the electronic device.

15. The interface synchronization method of claim 14, wherein the wearable device analyzes the user's gaze direction from the acquired eye image and forms on the user interface window a cursor that moves with the gaze direction.

16. The interface synchronization method of claim 15, wherein when the user's gaze direction stays on the user interface window, the coordinate position on the display screen of the electronic device corresponding to the gaze direction is recorded, and when the gaze direction stays on the same graphical interface for longer than a set threshold time, a trigger command is transmitted via the wireless network to the electronic device to start the one or more programs corresponding to the graphical interface.

17. The interface synchronization method of claim 15, wherein when it is detected that the user's gaze direction moves rapidly from left to right, a left-side page-turn eye-action command is transmitted to the electronic device so that the electronic device executes an action command for turning the page from left to right, and when it is detected that the user's gaze direction moves rapidly from right to left, a right-side page-turn eye-action command is transmitted to the electronic device so that the electronic device executes an action command for turning the page from right to left.
Make the electronic device execute the action command of turning pages from left to right, when detecting that the user's gaze direction is moving rapidly from right to left, transmit the eye action command of turning pages on the right to the electronic device, so as to Make the electronic device execute the action instruction of turning pages from right to left. 18.一种计算机可读取记录媒体,其特征在于,在所述可读取记录媒体上记录一程序,当电子装置及穿戴式装置加载该程序并执行后,可完成如权利要求13至17中任一项所述的方法。18. A computer-readable recording medium, wherein a program is recorded on the readable recording medium, and after the electronic device and the wearable device load the program and execute it, the program as claimed in claims 13 to 17 can be completed. The method of any of the above.
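The claims define the eye-movement command pipeline only functionally; the patent discloses no source code. As a rough illustration only, the following Python sketch shows one way the claimed logic could be organized: the reference coordinate set on a trigger action and the X-/Y-distance threshold for scrolling (claims 1, 7 and 13), the dwell-time launch of claim 4, and the rapid-sweep page turn of claim 5. Every name and number in it (GazeSample, EyeCommandAnalyzer, send_command, the threshold constants) is a hypothetical assumption, not something taken from the specification.

```python
from dataclasses import dataclass

# Illustrative sketch of the claimed eye-movement command analysis.
# All thresholds, class names and command formats are assumptions;
# the patent does not disclose concrete values or interfaces.

MOVE_THRESHOLD_PX = 80     # X/Y movement distance that triggers scrolling
DWELL_TIME_S = 1.5         # dwell time that launches the gazed-at program
SWEEP_SPEED_PX_S = 600     # gaze speed treated as a rapid page-turn sweep


@dataclass
class GazeSample:
    x: float   # gaze point on the mirrored UI window, in pixels
    y: float
    t: float   # capture timestamp, in seconds


class EyeCommandAnalyzer:
    """Turns a stream of gaze samples into abstract action commands."""

    def __init__(self, send_command):
        self.send = send_command   # e.g. a wireless-network transmit callback
        self.reference = None      # reference coordinate set on trigger action
        self.dwell_target = None
        self.dwell_start = None
        self.prev = None

    def on_trigger_action(self, sample: GazeSample):
        # Claims 1/7/13: a reference coordinate is set when the user's
        # trigger action (e.g. a deliberate blink) is detected.
        self.reference = (sample.x, sample.y)

    def on_gaze(self, sample: GazeSample, widget_id):
        # Scrolling: compare the gaze point against the reference coordinate
        # and emit a scroll command once either axis exceeds the threshold.
        if self.reference is not None:
            dx = sample.x - self.reference[0]
            dy = sample.y - self.reference[1]
            if abs(dx) > MOVE_THRESHOLD_PX or abs(dy) > MOVE_THRESHOLD_PX:
                if abs(dx) >= abs(dy):
                    direction = "right" if dx > 0 else "left"
                else:
                    direction = "down" if dy > 0 else "up"
                self.send({"type": "scroll", "direction": direction,
                           "distance": max(abs(dx), abs(dy))})
                # Re-anchoring here is an implementation choice, not claimed.
                self.reference = (sample.x, sample.y)

        # Dwell trigger (claim 4): gazing at the same graphical element
        # beyond a threshold time launches the program behind it.
        if widget_id is None or widget_id != self.dwell_target:
            self.dwell_target, self.dwell_start = widget_id, sample.t
        elif self.dwell_start is not None and sample.t - self.dwell_start >= DWELL_TIME_S:
            self.send({"type": "launch", "widget": widget_id})
            self.dwell_start = None   # disarm until the gaze moves away

        # Page turn (claim 5): a rapid horizontal sweep of the gaze.
        if self.prev is not None and sample.t > self.prev.t:
            vx = (sample.x - self.prev.x) / (sample.t - self.prev.t)
            if vx > SWEEP_SPEED_PX_S:
                self.send({"type": "page_turn", "direction": "left_to_right"})
            elif vx < -SWEEP_SPEED_PX_S:
                self.send({"type": "page_turn", "direction": "right_to_left"})
        self.prev = sample
```

On the electronic-device side, the command conversion module of claim 1 would then map these abstract commands onto platform input events (for example, synthesized touch scrolls); the claims leave that mapping open. The window scaling of claims 2, 8 and 14 amounts to multiplying the source screen's width and height by a single common factor, so the mirrored user interface window preserves the electronic device's aspect ratio.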
CN201510340993.1A 2015-04-29 2015-06-18 User interface synchronization system and method Active CN106201284B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW104113704A TWI571768B (en) 2015-04-29 2015-04-29 User interface synchronization system, device, method, computer readable recording medium, and computer program product
TW104113704 2015-04-29

Publications (2)

Publication Number Publication Date
CN106201284A CN106201284A (en) 2016-12-07
CN106201284B true CN106201284B (en) 2020-03-24

Family

ID=57453126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510340993.1A Active CN106201284B (en) 2015-04-29 2015-06-18 User interface synchronization system and method

Country Status (2)

Country Link
CN (1) CN106201284B (en)
TW (1) TWI571768B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180150204A1 (en) * 2016-11-30 2018-05-31 Google Inc. Switching of active objects in an augmented and/or virtual reality environment
US20190107885A1 (en) * 2016-12-09 2019-04-11 Shenzhen Royole Technologies Co. Ltd. Head-mounted display and method and system for adjusting user interface thereof
CN106774893B * 2016-12-15 2019-10-18 飞狐信息技术(天津)有限公司 Virtual reality interaction method and virtual reality device
US10511842B2 (en) 2017-10-06 2019-12-17 Qualcomm Incorporated System and method for foveated compression of image frames in a system on a chip
CN112560572B (en) * 2020-10-24 2024-11-12 北京博睿维讯科技有限公司 A camera and large screen interactive processing method, device and system
CN119621846A (en) * 2024-11-21 2025-03-14 数字广东网络建设有限公司 Data synchronization method, device, electronic device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101258436A (en) * 2005-09-08 2008-09-03 瑞士电信流动电话公司 Communication devices, systems and methods
CN101272727A (en) * 2005-09-27 2008-09-24 潘尼公司 A device for controlling an external unit
US7762665B2 (en) * 2003-03-21 2010-07-27 Queen's University At Kingston Method and apparatus for communication between humans and devices
CN103472915A * 2013-08-30 2013-12-25 深圳Tcl新技术有限公司 Reading control method, reading control device and display device based on pupil tracking
TWM472854U (en) * 2013-11-27 2014-02-21 Chipsip Technology Co Ltd Wearable display
CN103885589A (en) * 2014-03-06 2014-06-25 华为技术有限公司 Eye tracking method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100235786A1 (en) * 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
JP5539945B2 (en) * 2011-11-01 2014-07-02 株式会社コナミデジタルエンタテインメント GAME DEVICE AND PROGRAM

Also Published As

Publication number Publication date
TWI571768B (en) 2017-02-21
CN106201284A (en) 2016-12-07
TW201638723A (en) 2016-11-01

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant