
CN106201284A - user interface synchronization system and method - Google Patents

User interface synchronization system and method

Info

Publication number
CN106201284A
Authority
CN
China
Prior art keywords
electronic device
user
eye movement
wearable device
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510340993.1A
Other languages
Chinese (zh)
Other versions
CN106201284B (en)
Inventor
邹嘉骏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Utechzone Co Ltd
Original Assignee
Utechzone Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Utechzone Co Ltd filed Critical Utechzone Co Ltd
Publication of CN106201284A publication Critical patent/CN106201284A/en
Application granted granted Critical
Publication of CN106201284B publication Critical patent/CN106201284B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A user interface synchronization system includes a graphics output module and an instruction conversion module coupled to an electronic device, and an imaging module and an eye-movement instruction analysis module coupled to a wearable device. The graphics output module accesses the image data of the electronic device and transmits it to the wearable device over a wireless network. The imaging module displays the image data on the output unit of the wearable device so that the user can operate it by gaze. The eye-movement instruction analysis module analyzes the captured eye-movement instruction and transmits it to the electronic device over the wireless network. Upon receiving the eye-movement instruction, the instruction conversion module converts it into an action instruction that the electronic device can execute.

Description

User interface synchronization system and method

Technical Field

The present invention relates to a user interface synchronization system and method, and more particularly to a user interface synchronization system and method for use between an electronic device and a wearable device.

Background

Eye tracking refers to the process of following eye movement by analyzing images of the user's eyes or the motion of the eyeballs relative to the head. An eye tracker is a device that measures eyeball position and eye-movement information, and is widely used in research on the visual system, psychology, and cognitive linguistics. Several eye-tracking methods exist today; the more common techniques include the Dual-Purkinje-Image (DPI) method, the Infra-Red Video System (IRVS) method, and Infra-Red Oculography (IROG), all of which determine the user's gaze direction from images of the user's eyes.

Recently, some brands of mobile terminals have integrated eye-tracking technology into the terminal itself: by tracking the user's gaze direction, the mobile terminal generates corresponding operation instructions and issues them to itself. Such technology makes it easier for the user to operate the display screen; when one of the user's hands is not free to operate the terminal, instructions can still be entered by tracking the user's eyes. Another scenario applies to video playback: the gaze direction is used to judge whether the user is watching the display screen, and when the user is not, playback is paused temporarily so that the user does not miss any highlight of the video.

However, in the existing technology the user's eyes are photographed through the front camera of the mobile terminal, so the captured images are easily affected by environmental factors (for example, high-light or low-light conditions), which makes tracking eye movements difficult. When accurate eye-movement tracking (such as gaze-direction detection) is desired, the distances between the front camera, the display screen, and the user's eyes are not fixed, so the gaze direction cannot be obtained by triangulation; the user's image can only roughly indicate whether the user is looking left or right.

Summary of the Invention

The main purpose of the present invention is to solve the problem in the prior art that an electronic device cannot be operated accurately through eye-tracking technology.

To solve the above problem, the present invention provides a user interface synchronization system for pairing an electronic device with a wearable device. The user interface synchronization system includes:

a graphics output module, coupled to the electronic device, for accessing the image data of the electronic device and transmitting it to the wearable device via a wireless network;

an imaging module, coupled to the wearable device, for displaying the image data on the output unit of the wearable device so that the user can operate it by gaze;

an eye-movement instruction analysis module, coupled to the wearable device, for analyzing the eye-movement instruction captured by the wearable device; and

an instruction conversion module, coupled to the electronic device or the wearable device, for converting the eye-movement instruction upon receipt so as to output it as an action instruction that the electronic device can execute.

Further, the imaging module creates, on the output unit of the wearable device, a user interface window that displays the image data, and scales the window up or down in proportion to the length and width of the display screen of the electronic device.
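The proportional scaling described above can be sketched as follows; the function name and the fit-to-output policy are illustrative assumptions, not part of the patent.

```python
def scale_window(screen_w, screen_h, out_w, out_h):
    """Scale the electronic device's screen dimensions to fit the
    wearable's output unit while preserving the aspect ratio
    (illustrative sketch)."""
    ratio = min(out_w / screen_w, out_h / screen_h)
    return round(screen_w * ratio), round(screen_h * ratio)

# A 1080x1920 phone screen shown on a 1280x720 output unit keeps its
# 9:16 aspect ratio: 720 px tall, so 405 px wide.
print(scale_window(1080, 1920, 1280, 720))  # (405, 720)
```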

Further, the eye-movement instruction analysis module analyzes the user's gaze direction from the captured eye images and renders, on the user interface window, a cursor that moves with that gaze direction.

Further, when the user's gaze stays on the user interface window, the eye-movement instruction analysis module records the coordinate position on the display screen of the electronic device to which the gaze direction corresponds, and when the gaze remains roughly on the same graphical interface for longer than a set threshold time, a trigger instruction is transmitted via the wireless network to the instruction conversion module to launch the one or more programs corresponding to that graphical interface.
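A minimal sketch of such a dwell-time trigger, assuming gaze samples arrive as time-ordered (timestamp, target) pairs; the function names and the one-second threshold are illustrative assumptions.

```python
DWELL_THRESHOLD = 1.0  # seconds the gaze must stay on one target (assumed)

def detect_dwell(samples, threshold=DWELL_THRESHOLD):
    """Return the first target gazed at continuously for at least
    `threshold` seconds, or None. `samples` is a time-ordered list of
    (timestamp, target_id) gaze fixations."""
    start_time = None
    current = None
    for t, target in samples:
        if target != current:
            current, start_time = target, t  # gaze moved: restart timing
        elif current is not None and t - start_time >= threshold:
            return current  # fire the trigger instruction for this target
    return None

gaze = [(0.0, "mail"), (0.4, "mail"), (0.9, "mail"), (1.1, "mail")]
print(detect_dwell(gaze))  # "mail": gaze held on one icon past the threshold
```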

Further, when the eye-movement instruction analysis module detects that the user's gaze direction moves quickly from left to right, it passes a left-page-turn eye-movement instruction to the instruction conversion module so that the electronic device executes a left-to-right page-turn action; when it detects that the gaze direction moves quickly from right to left, it passes a right-page-turn eye-movement instruction to the instruction conversion module so that the electronic device executes a right-to-left page-turn action.
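One way to classify such a quick horizontal gaze sweep, assuming the gaze x-coordinates are sampled at a fixed interval; the velocity threshold and gesture names are illustrative assumptions.

```python
SWEEP_VELOCITY = 500.0  # px/s of horizontal gaze motion treated as a sweep (assumed)

def classify_sweep(x_positions, dt, threshold=SWEEP_VELOCITY):
    """Classify a burst of gaze x-coordinates (one sample every `dt`
    seconds) as a page-turn gesture, or None for ordinary viewing."""
    velocity = (x_positions[-1] - x_positions[0]) / (dt * (len(x_positions) - 1))
    if velocity >= threshold:
        return "page_left_to_right"   # fast left-to-right gaze sweep
    if velocity <= -threshold:
        return "page_right_to_left"   # fast right-to-left gaze sweep
    return None

print(classify_sweep([100, 300, 500, 700], dt=0.02))  # page_left_to_right
```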

Further, the eye-movement instruction analysis module sets a reference coordinate when it detects a trigger action by the user, then continuously detects the user's gaze direction and records the X-axis and Y-axis distances the gaze has moved relative to that reference coordinate. When the X-axis or Y-axis movement distance exceeds a threshold, it passes an eye-movement instruction corresponding to the eye's movement direction and distance to the instruction conversion module, so that the electronic device executes a scrolling action in the corresponding direction.
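The relative-scroll step above can be sketched as follows; the 40-pixel dead zone and the command names are illustrative assumptions.

```python
SCROLL_DEADZONE = 40  # px the gaze must travel before scrolling starts (assumed)

def scroll_command(anchor, gaze, deadzone=SCROLL_DEADZONE):
    """Compare the current gaze point against the reference coordinate
    set at the trigger action, and emit a scroll command once the
    movement on either axis exceeds the dead zone."""
    dx, dy = gaze[0] - anchor[0], gaze[1] - anchor[1]
    if abs(dx) < deadzone and abs(dy) < deadzone:
        return None  # still inside the dead zone: no scrolling yet
    if abs(dx) >= abs(dy):
        return ("scroll_right" if dx > 0 else "scroll_left", abs(dx))
    return ("scroll_down" if dy > 0 else "scroll_up", abs(dy))

print(scroll_command(anchor=(200, 200), gaze=(200, 330)))  # ('scroll_down', 130)
```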

Further, the wireless network uses the Wi-Fi Direct protocol, Bluetooth, or a virtual wireless access point (Wi-Fi soft AP).

Another object of the present invention is to provide a controlled-end electronic device that includes a display screen, a wireless transmission unit, and a processor connected to the display screen and the wireless transmission unit. The processor has a graphics processing unit for transmitting image data to the display screen to provide a user interface for user operation. The processor includes a computing unit that loads and executes the following programs:

a graphics output module, for accessing the image data of the electronic device and transmitting the image data it displays to a wearable device via a wireless network, so that the wearable device can output it for the user's visual operation; and

an instruction conversion module, for receiving the eye-movement instruction provided by the wearable device via the wireless network and outputting it as an action instruction that the electronic device can execute.
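A minimal sketch of such a conversion step, assuming the wearable sends the eye-movement instruction as a small tagged message; the message format, mapping, and action names are all illustrative assumptions.

```python
# Hypothetical mapping from wearable-side eye-movement instructions to
# actions the controlled device can execute directly (all names assumed).
EYE_TO_ACTION = {
    "dwell_trigger": "tap",           # fixation past the threshold -> tap at gaze point
    "sweep_left_to_right": "page_next",
    "sweep_right_to_left": "page_prev",
}

def convert(eye_instruction):
    """Translate an eye-movement instruction into a device action
    instruction, passing any coordinate payload through unchanged."""
    kind = eye_instruction["kind"]
    return {"action": EYE_TO_ACTION[kind], "pos": eye_instruction.get("pos")}

print(convert({"kind": "dwell_trigger", "pos": (405, 320)}))
# {'action': 'tap', 'pos': (405, 320)}
```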

Further, the wireless network uses the Wi-Fi Direct protocol, Bluetooth, or a virtual wireless access point (Wi-Fi soft AP).

Another object of the present invention is to provide a master-end wearable device that includes an output unit, a wireless transmission unit, a camera unit, and a processor connected to the output unit, the wireless transmission unit, and the camera unit. The camera unit captures images of the user's eyes. The processor has a graphics processing unit for transmitting image data to the output unit as a user interface for the user to operate. The processor includes a computing unit that loads and executes the following programs:

an imaging module, for obtaining image data of an electronic device over a wireless network and passing it to the graphics processing unit of the wearable device for display on the output unit, so that the user can operate it by gaze; and

an eye-movement instruction analysis module, for obtaining the eye images captured by the camera unit, deriving an eye-movement instruction from them, and transmitting that instruction to the electronic device over the wireless network to launch one or more programs on the electronic device.

Further, the imaging module creates, on the output unit of the wearable device, a user interface window that displays the image data, and scales the window up or down in proportion to the length and width of the display screen of the electronic device.

Further, the eye-movement instruction analysis module analyzes the user's gaze direction from the captured eye images and renders, on the user interface window, a cursor that moves with that gaze direction.

Further, when the user's gaze stays on the user interface window, the eye-movement instruction analysis module records the coordinate position on the display screen of the electronic device to which the gaze direction corresponds, and when it detects that the gaze remains roughly on the same graphical interface for longer than the set threshold time, it transmits a trigger instruction to the electronic device via the wireless network to launch the one or more programs corresponding to that graphical interface.

Further, when the eye-movement instruction analysis module detects that the user's gaze direction moves quickly from left to right, it passes a left-page-turn eye-movement instruction to the electronic device so that the electronic device executes a left-to-right page-turn routine; when it detects that the gaze direction moves quickly from right to left, it passes a right-page-turn eye-movement instruction to the electronic device so that the electronic device executes a right-to-left page-turn routine.

Further, the eye-movement instruction analysis module sets a reference coordinate when it detects a trigger action by the user, then continuously detects the user's gaze direction and records the X-axis and Y-axis distances the gaze has moved relative to that reference coordinate. When the X-axis or Y-axis movement distance exceeds a threshold, it passes an eye-movement instruction corresponding to the eye's movement direction and distance to the electronic device, so that the electronic device executes a scrolling routine in the corresponding direction.

Further, the wireless network uses the Wi-Fi Direct protocol, Bluetooth, or a virtual wireless access point (Wi-Fi soft AP).

Another object of the present invention is to provide an interface synchronization method for a wearable device and an electronic device, including: accessing the image data of the electronic device and transmitting it to the wearable device via a wireless network; displaying the image data on the output unit of the wearable device so that the user can operate it by gaze; analyzing the eye-movement instruction captured by the wearable device and transmitting it to the electronic device via the wireless network; and converting the eye-movement instruction upon receipt so as to output it as an action instruction that the electronic device can execute.

Further, after receiving the image data, the wearable device creates a user interface window that displays the image data, scaled up or down in proportion to the length and width of the display screen of the electronic device.

Further, the wearable device analyzes the user's gaze direction from the captured eye images and renders, on the user interface window, a cursor that moves with that gaze direction.

Further, when the user's gaze stays on the user interface window, the coordinate position on the display screen of the electronic device to which the gaze direction corresponds is recorded, and when the gaze remains roughly on the same graphical interface for longer than the set threshold time, a trigger instruction is transmitted to the electronic device via the wireless network to launch the one or more programs corresponding to that graphical interface.

Further, when it is detected that the user's gaze direction moves quickly from left to right, a left-page-turn eye-movement instruction is passed to the electronic device so that it executes a left-to-right page-turn action; when it is detected that the gaze direction moves quickly from right to left, a right-page-turn eye-movement instruction is passed to the electronic device so that it executes a right-to-left page-turn action.

Further, a reference coordinate is set when a trigger action by the user is detected, the user's gaze direction is continuously detected, and the X-axis and Y-axis distances the gaze has moved relative to that reference coordinate are recorded. When the X-axis or Y-axis movement distance exceeds a threshold, an eye-movement instruction corresponding to the eye's movement direction and distance is passed to the electronic device, so that the electronic device executes a scrolling action in the corresponding direction.

A further object of the present invention is to provide a computer-readable recording medium on which a program is recorded; when the program is loaded and executed by an electronic device and a wearable device, the method described above is carried out.

A further object of the present invention is to provide a computer program product which, when loaded into and executed by an electronic device and a wearable device, carries out the method described above.

Compared with the prior art described above, the present invention therefore has the following advantages:

1. The user interface synchronization system of the present invention can transmit the image data of the electronic device to the output unit of the wearable device, so that the electronic device is operated by tracking the user's eye movements.

2. The front camera of the present invention can be kept at a fixed distance from the user's eyes, making it easier to detect the user's eye movements.

Brief Description of the Drawings

FIG. 1 is a block diagram of the user interface synchronization system of the present invention.

FIG. 2 is a schematic diagram of the user interface synchronization system of the present invention in use.

FIG. 3 is a schematic diagram of the user interface window (1).

FIG. 4 is a schematic diagram of the trajectory produced by eye movement on the user interface window (1).

FIG. 5 is a schematic diagram of the trajectory produced by eye movement on the user interface window (2).

FIG. 6 is a schematic diagram of the user interface window (2).

FIG. 7 is a schematic diagram of another user interface window.

FIG. 8 is a flow chart of the user interface synchronization method of the present invention (1).

FIG. 9 is a flow chart of the user interface synchronization method of the present invention (2).

FIG. 10 is a flow chart of the user interface synchronization method of the present invention (3).

FIG. 11 is a flow chart of the user interface synchronization method of the present invention (4).

Symbol Description

100 User interface synchronization system

10 Electronic device

11 Display screen

12 Processing unit

13 Graphics processing unit

14 Storage unit

16 Wireless transmission unit

17 Graphics output module

18 Instruction conversion module

CU1 Processor

20 Wearable device

21 Output unit

22 Processing unit

23 Graphics processing unit

24 Storage unit

25 Camera unit

26 Wireless transmission unit

27 Imaging module

28 Eye-movement instruction analysis module

CU2 Processor

W User interface window

W1 Cursor

W2 Timer

W3, W4, W5, W6 Arrows (up, down, left, right)

Steps S201 to S205

Steps S2051A to S2054A

Steps S2051B to S2053B

Steps S2051C to S2055C

Detailed Description

The detailed description and technical content of the present invention are explained below with reference to the drawings. For convenience of illustration, the drawings of the present invention are not necessarily drawn to scale and may be exaggerated; the drawings and their proportions are not intended to limit the scope of the present invention.

The present invention is a user interface synchronization system 100 for transmitting the screen of an electronic device 10 to a wearable device 20, which tracks the user's eye movements so that the user can operate the electronic device 10.

The electronic device 10 includes at least a display screen 11, a processing unit 12 (Central Processing Unit, CPU), and a graphics processing unit 13 (Graphics Processing Unit, GPU) capable of inputting and outputting images. Specifically, the electronic device 10 may be, for example, a cellular phone, a smartphone, a tablet computer, a handheld mobile communication device, a personal digital assistant (PDA), or a similar portable electronic device; the electronic device 10 may also be an electronic device with a display and control interface, such as a calculator, a desktop computer, a notebook computer, or a vehicle-mounted computer.

The wearable device 20 specifically refers to a device worn on the user's head. It provides a user-operable interface through the output unit 21, captures images of the user's eyes through the camera unit 25, and lets the user operate that interface with the gaze direction. The wearable device 20 includes at least an output unit 21 that delivers images to the user's eyes, a camera unit 25 that photographs the user's eyes to obtain eye images, a processing unit 22 (CPU), and a graphics processing unit 23 (GPU) capable of inputting and outputting images. Specifically, the wearable device 20 may be smart glasses, an eye tracker, an augmented reality device, a virtual reality device, or a similar smart wearable device.

FIG. 1 is a block diagram of the user interface synchronization system of the present invention, as shown:

The hardware architectures of the electronic device 10 and the wearable device 20 are described separately below; after the hardware architecture, the software architecture is described further.

电子装置:Electronics:

如前所述,所述的电子装置10作为受控端,以将影像数据传送至该穿戴式装置20,并通过该穿戴式装置20的眼部动作追踪功能通过无线网络操作该电子装置10。该电子装置10系包括有显示屏幕11、处理单元12(Processing Unit)、可输入输出影像的图形处理单元13(Graphics Processing Unit,GPU)、储存单元14、及无线传输单元16。As mentioned above, the electronic device 10 is used as a controlled terminal to transmit image data to the wearable device 20 , and operate the electronic device 10 through the wireless network through the eye movement tracking function of the wearable device 20 . The electronic device 10 includes a display screen 11 , a processing unit 12 (Processing Unit), a Graphics Processing Unit 13 (Graphics Processing Unit, GPU) capable of inputting and outputting images, a storage unit 14 , and a wireless transmission unit 16 .

所述的处理单元12可与图形处理单元13共同构成处理器CU1,使该处理单元12与图形处理单元13可以集成于单一芯片上,因此减少组件所需占去的体积。举例而言,所述的处理器CU1可以为例如ARM Holdings,Ltd.开发的系列处理器、以及中国科学院的计算技术研究所(ICT)开发的龙芯(Loongson)处理器等,于本发明中不予以限制。The processing unit 12 and the graphics processing unit 13 can jointly constitute the processor CU1, so that the processing unit 12 and the graphics processing unit 13 can be integrated on a single chip, thereby reducing the required volume of components. For example, the processor CU1 may be developed by, for example, ARM Holdings, Ltd. series of processors, and Loongson processors developed by the Institute of Computing Technology (ICT) of the Chinese Academy of Sciences, etc., are not limited in the present invention.

在另一较佳实施例中,所述的处理单元12、及图形处理单元13可个别构成处理器,分别处理逻辑运算、及图像处理等工作,并共同或协同处理部分程序指令。In another preferred embodiment, the processing unit 12 and the graphics processing unit 13 can individually constitute processors, which respectively process logic operations and image processing, and jointly or cooperatively process some program instructions.

在另一较佳实施例中,所述的处理单元12可与储存单元14共同构成处理器,该处理单元12可加载该储存单元14所预存的程序,并执行对应的算法。In another preferred embodiment, the processing unit 12 and the storage unit 14 together constitute a processor, and the processing unit 12 can load a program pre-stored in the storage unit 14 and execute a corresponding algorithm.

在本实施中,该处理单元12系与图形处理单元13共同构成处理器CU1,该处理器CU1并耦接于该储存单元14。该处理器CU1可为中央处理器(Central Processing Unit,CPU),或是其他可程序化并具有一般用途或特殊用途的微处理器(Microprocessor)、数字信号处理器(Digital SignalProcessor,DSP)、可程序化控制器、特殊应用集成电路(Application Specific Integrated Circuits,ASIC)、可程序化逻辑设备(Programmable Logic Device,PLD)或其他类似装置或这些装置的组合。In this implementation, the processing unit 12 and the graphics processing unit 13 jointly constitute a processor CU1 , and the processor CU1 is coupled to the storage unit 14 . The processor CU1 can be a central processing unit (Central Processing Unit, CPU), or other programmable microprocessor (Microprocessor), digital signal processor (Digital Signal Processor, DSP) with general purpose or special purpose, Programmable controller, application specific integrated circuit (Application Specific Integrated Circuits, ASIC), programmable logic device (Programmable Logic Device, PLD) or other similar devices or a combination of these devices.

所述的显示屏幕11系用于显示图形数据,例如用户操作界面、图形化界面、或多媒体影像等,通过将影像或操作界面显示在该显示屏幕11上以供用户读取。所述的显示屏幕11可为主动式数组有机发光装置(AMOLED)显示器、薄膜晶体管(Thin FilmTransistor,TFT)显示器、或其他类此的显示设备,于本发明中不予以限制。所述的显示屏幕11通过控制电路驱动,通过输入对应的讯号至数据线驱动电路及扫描线驱动电路,以驱动面板在对应坐标上的发光单元(像素元)。所述的显示屏幕11通过图形处理单元13在存取储存单元14内的数据后,将对应的多媒体数据发布在显示屏幕11上,以供用户目视。The display screen 11 is used to display graphic data, such as user operation interface, graphical interface, or multimedia images, etc., by displaying the image or operation interface on the display screen 11 for users to read. The display screen 11 can be an active matrix organic light emitting device (AMOLED) display, a thin film transistor (Thin Film Transistor, TFT) display, or other similar display devices, which are not limited in the present invention. The display screen 11 is driven by the control circuit, by inputting corresponding signals to the data line driving circuit and the scanning line driving circuit, to drive the light-emitting units (pixel elements) of the panel on the corresponding coordinates. The display screen 11 publishes the corresponding multimedia data on the display screen 11 after the graphics processing unit 13 accesses the data in the storage unit 14 for visual viewing by the user.

The wireless transmission unit 16 performs data transmission over a wireless network. Specifically, the wireless network adopts the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP). In another preferred embodiment, the wireless transmission unit 16 may be paired with the wearable device 20 through radio frequency identification (RFID) technology, so as to carry out medium- and short-range wireless data transmission with the wearable device 20.

Wearable device:

The wearable device 20 may serve as the master control terminal: it receives the image data of the electronic device 10 and outputs the image data to the user's eyes for gaze-based operation. As mentioned above, the wearable device 20 includes an output unit 21, a processing unit 22 (CPU), a graphics processing unit 23 (GPU) capable of image input and output, a camera unit 25, and a wireless transmission unit 26.

The processing unit 22 is substantially the same as the processing unit 12 of the electronic device 10, so its description is not repeated here. As with the processing unit 12 of the electronic device 10, in a preferred embodiment the processing unit 22 and the graphics processing unit 23 together constitute a processor CU2, so that the processing unit 22 and the graphics processing unit 23 can be integrated on a single chip. In another preferred embodiment, the processing unit 22 and the graphics processing unit 23 may each constitute a separate processor, handling logic operations and image processing respectively, while jointly or cooperatively processing some program instructions. In yet another preferred embodiment, the processing unit 22 and the storage unit 24 together constitute a processor; the processing unit 22 can load the program pre-stored in the storage unit 24 and execute the corresponding algorithm.

In this embodiment, the processing unit 22 and the graphics processing unit 23 together constitute a processor CU2, and the processor CU2 is coupled to the storage unit 24. The processing unit 22 may be a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), programmable logic device (PLD), other similar devices, or a combination of these devices.

The output unit 21 is used to display graphic data and deliver the graphic data to the user's eyes for visual operation. The output unit 21 may be a display screen, such as an AMOLED display, a TFT display, or another similar display device. In another preferred embodiment, the output unit 21 may be a retinal display, which projects images directly onto the retina for the user to view through retinal imaging display (RID) technology. In a retinal display, the light beam is reflected by the glass and imaged directly on the retina, so that the image beam merges with the real scene seen by the naked eye. After the graphics processing unit 23 (GPU) accesses the data in the storage unit 24, the output unit 21 delivers the corresponding multimedia data to the user's eyes for viewing.

The camera unit 25 is a camera equipped with a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) sensor; the present invention imposes no limitation in this regard. The camera unit 25 is used to capture images of the user's eyes; the acquired eye images are sent to the eye movement command analysis module 28 for further analysis, so as to track the user's eye movements and convert those eye movements into corresponding eye movement commands.

The wireless transmission unit 26 performs data transmission over a wireless network. Specifically, the wireless network adopts the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).

The hardware architectures of the electronic device 10 and the wearable device 20 have been described in detail above. Based on these hardware architectures, the architecture of the user interface synchronization system of the present invention is now described in detail:

Referring to FIG. 2, which is a schematic diagram of the user interface synchronization system of the present invention in use, as shown in the figure:

The present invention pairs the electronic device 10 with the wearable device 20. After pairing succeeds, the electronic device 10 can transmit the image data on the display screen 11 to the wearable device 20 over the wireless network, so that the user can read the display screen 11 through the wearable device 20. The wearable device 20 captures images of the user's eyes through the camera unit 25, allowing the user to operate the electronic device 10 wirelessly through eye movements and thereby launch one or more programs of the electronic device 10.

Referring to FIG. 1, the user interface synchronization system 100 includes a graphics output module 17 and a command conversion module 18 coupled to the electronic device 10, and a mapping module 27 and an eye movement command analysis module 28 coupled to the wearable device 20.

In this embodiment, the graphics output module 17 and the command conversion module 18 are pre-stored in the storage unit 14 of the electronic device 10, so that the processing unit 12 of the electronic device 10 executes their algorithms after loading the programs of the graphics output module 17 and the command conversion module 18. The mapping module 27 and the eye movement command analysis module 28 are pre-stored in the storage unit 24 of the wearable device 20, so that the processing unit 22 of the wearable device 20 executes their algorithms after loading the programs of the mapping module 27 and the eye movement command analysis module 28. In another preferred embodiment, the command conversion module 18 may instead be loaded on the wearable device 20; the wearable device 20 uses the command conversion module 18 to convert the eye movement command into an action command executable by the electronic device 10, and then transmits the action command to the electronic device 10 over the wireless network to launch one or more programs. In this step, the wearable device 20 encrypts the action command, and the electronic device 10 merely needs to decrypt the action command and execute it; this approach can be understood as another similar implementation of the present invention.

The graphics output module 17 is coupled to the electronic device 10 to access the image data of the electronic device 10 and transmit that image data to the wearable device 20 over the wireless network. The graphics output module 17 accesses the image data (the user interface) provided by the graphics processing unit 13; the acquired image data is synchronized with the image shown on the display screen 11, or, according to a default setting or a user setting, the display screen 11 is forcibly turned off so that the graphics output module 17 transmits the image data directly to the wearable device 20 over the wireless network, thereby reducing the extra load on the graphics output module 17 and the power consumption of the electronic device.

The mapping module 27 is paired with the graphics output module 17 via the wireless transmission unit 26 and is used to display the image data on the output unit 21 of the wearable device 20 for the user's gaze-based operation. As shown in FIG. 3, the mapping module 27 can create a user interface window W that displays the image data, and proportionally enlarge or shrink the user interface window W according to the length and width of the display screen 11 of the electronic device 10, so that the user operates the user interface window W within a comfortable visual range. The mapping module 27 displays a movable cursor W1 on the user interface window W; the cursor W1 follows the user's gaze direction, which is computed by the eye movement command analysis module 28.
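The proportional enlargement or reduction of the user interface window W is, in essence, an aspect-ratio-preserving fit of the display screen 11 into the output unit 21. A minimal sketch follows; the function name and the example resolutions are illustrative assumptions, not taken from the patent:

```python
def fit_window(src_w, src_h, out_w, out_h):
    """Scale a source screen (src_w x src_h) to the largest size that
    fits the output unit (out_w x out_h) while preserving aspect ratio,
    and center the resulting window W."""
    scale = min(out_w / src_w, out_h / src_h)
    win_w, win_h = round(src_w * scale), round(src_h * scale)
    x = (out_w - win_w) // 2   # horizontal offset of window W
    y = (out_h - win_h) // 2   # vertical offset of window W
    return x, y, win_w, win_h

# Example: a portrait 1080x1920 phone screen mapped onto a 1280x720 display.
window = fit_window(1080, 1920, 1280, 720)
```

The same computation applies whether the window is enlarged or shrunk, since `scale` may be greater or less than 1.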

In a preferred embodiment, the graphics output module 17 can compress a series of image data using streaming media technology and transmit the data in segments over the wireless network; the mapping module 27 coupled to the wearable device 20 then decompresses the streaming packets, so that the image data is displayed on the output unit 21 of the wearable device 20 in real time.
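The compress, segment, transmit, reassemble, decompress path can be sketched as below. This uses zlib as a stand-in codec and an assumed packet format of (sequence number, total count, payload); the patent specifies neither the codec nor the packet layout:

```python
import zlib

CHUNK = 1400  # assumed payload size per packet, below a typical Wi-Fi MTU

def make_stream_packets(frame_bytes):
    """Sender side (graphics output module 17): compress one frame and
    split it into numbered segments."""
    data = zlib.compress(frame_bytes)
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    return [(seq, len(chunks), c) for seq, c in enumerate(chunks)]

def reassemble(packets):
    """Receiver side (mapping module 27): reorder segments, check
    completeness, and decompress back to the frame."""
    ordered = sorted(packets, key=lambda p: p[0])
    if len(ordered) != ordered[0][1]:
        raise ValueError("incomplete stream")
    return zlib.decompress(b"".join(c for _, _, c in ordered))

frame = bytes(range(256)) * 64        # stand-in for one image frame
packets = make_stream_packets(frame)
```

A real implementation would use a video codec rather than general-purpose compression, but the segmentation and reordering logic is the same.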

After the image data is displayed in the user interface window W provided by the output unit 21, the user can operate the cursor W1 on the user interface window W through eye movements, or generate corresponding eye movement commands through a sequence of eye movements to operate the user interface window W. After analyzing the user's eye movements, the eye movement command analysis module 28 converts the eye movements into an eye movement command and transmits that command over the wireless network to the command conversion module 18 coupled to the electronic device 10. The command conversion module 18 analyzes it and finds the action command of the electronic device 10 corresponding to the eye movement command, so that the processing unit 12 launches one or more programs corresponding to that action command. In a preferred embodiment, the command conversion module 18 includes a lookup table through which it converts the eye movement command into the action command corresponding to the electronic device 10. Thus, when the wearable device 20 and the electronic device 10 are successfully paired, the user can operate the user interface window W through the wearable device 20; the camera unit 25 continuously captures the user's eye images, and through the captured eye movements the user operates the electronic device 10 via the wearable device 20.
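The lookup table of the command conversion module 18 can be sketched as a plain mapping from eye movement command identifiers to device action commands. All identifiers below are hypothetical; the patent does not define a wire format:

```python
# Hypothetical eye-command identifiers sent by the wearable device 20,
# mapped to action commands executable by the electronic device 10.
LOOKUP_TABLE = {
    "gaze_swipe_left_to_right": "page_turn_left_to_right",
    "gaze_swipe_right_to_left": "page_turn_right_to_left",
    "gaze_dwell_trigger": "tap",
}

def convert(eye_command):
    """Command conversion module 18: look up the action command that
    corresponds to a received eye movement command."""
    action = LOOKUP_TABLE.get(eye_command)
    if action is None:
        raise KeyError(f"unknown eye movement command: {eye_command}")
    return action
```

The processing unit 12 would then dispatch the returned action command to the corresponding program.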

Specifically, the eye movement command analysis module 28 provides two functions: first, analyzing the corresponding eye movement from the user's eye image and determining the user's gaze direction from that eye movement; second, determining, from the user's gaze direction, the eye movement command the user intends to input.

For obtaining eye movements (gaze direction) from eye images, the dual Purkinje image (DPI) tracking method, the infrared video system (IRVS) method, or infrared oculography (IROG) may be used; the present invention imposes no limitation in this regard. Through these methods, multiple eye images of the user can be used to collect multiple sample points at which the eye image maps onto the output unit 21 (as shown in FIG. 4); the acquired sample points are used to further analyze the eye movement command the user intends to input.

Some of the main eye movement commands are described below. For example, the user can input a page-turning action command to the electronic device 10 through an eye movement command, causing the electronic device 10 to flip the picture in the user interface window W to achieve a page-turning effect. Referring to FIG. 4, which shows a series of sample points as the user performs a rightward scroll, the system by default sets the middle position as the start and end point of an eye trajectory command; in the figure the start position is shown as ○, the end position is shown as ╳, and the detected coordinates are arranged as dots in sequence. When the eye movement command analysis module 28 detects that the user's gaze direction moves rapidly from left (the middle position) to right (the right position), it transmits a left-page-turn eye movement command to the command conversion module 18 of the electronic device 10; on receiving this left-page-turn eye movement command, the command conversion module 18 calls the corresponding action command via the lookup table, so that the processing unit 12 executes a left-to-right page-turn event.

For turning the page to the right, refer to FIG. 5, which shows a series of sample points as the user performs a right page turn; by default, the start position is shown as ○, the end position is shown as ╳, and the detected coordinates are arranged as dots in sequence. When the eye movement command analysis module 28 detects that the user's gaze direction moves rapidly from right (the middle position) to left (the left position), it transmits a right-page-turn eye movement command to the command conversion module 18 of the electronic device 10; on receiving the right-page-turn command, the command conversion module 18 finds the corresponding action command via the lookup table, so that the processing unit 12 executes a right-to-left page-turn event.
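The page-turn decisions of FIGS. 4 and 5 reduce to checking whether the gaze trajectory sweeps far enough horizontally within a short time. A sketch with assumed thresholds (the 0.3 width fraction and 0.5 s duration are illustrative, not from the patent):

```python
def classify_swipe(samples, min_dx=0.3, max_duration=0.5):
    """Classify a gaze trajectory as a page-turn gesture.

    samples: list of (t, x) pairs, with t in seconds and x normalized to
    [0, 1] across the output unit.  Returns "left_page_turn" for a rapid
    left-to-right sweep, "right_page_turn" for a rapid right-to-left
    sweep, or None for anything else.
    """
    if len(samples) < 2:
        return None
    (t0, x0), (t1, x1) = samples[0], samples[-1]
    if t1 - t0 > max_duration:        # too slow: not a rapid movement
        return None
    dx = x1 - x0
    if dx >= min_dx:
        return "left_page_turn"       # gaze moved middle -> right
    if dx <= -min_dx:
        return "right_page_turn"      # gaze moved middle -> left
    return None
```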

When the input of a page-turning command is completed, the electronic device 10 enters a short refractory period (for example, one second) during which it does not respond; this prevents the return of the eyes to the middle position, after the user turns a page through eye movements, from being interpreted as another page-turning command.
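This refractory period can be sketched as a timestamp guard placed in front of command emission; the one-second duration is the example value from the text:

```python
class RefractoryGate:
    """Suppress commands for a short period after each accepted command,
    so that the eyes returning to the middle position are not
    re-interpreted as another page turn."""

    def __init__(self, refractory_s=1.0):
        self.refractory_s = refractory_s
        self._last_accepted = None

    def accept(self, now):
        """Return True if a command arriving at time `now` (seconds)
        may fire; record the time when it does."""
        if (self._last_accepted is not None
                and now - self._last_accepted < self.refractory_s):
            return False
        self._last_accepted = now
        return True
```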

In another preferred embodiment, the user can input a page-scrolling action command to the electronic device 10 through an eye movement command, so that the page on the user interface W scrolls in the direction of the user's gaze. Specifically, the eye movement command analysis module 28 detects the user's eye movements and sets a reference point when it detects the user's trigger action (the trigger action may be a blink, an eye closure, drawing a circle, or another predefined eye movement), then continuously records the X-axis and Y-axis displacement of the gaze direction relative to the reference coordinate. When the X-axis or Y-axis displacement exceeds a threshold, the eye movement command analysis module 28 transmits to the command conversion module 18 an eye movement command containing the eye movement direction (i.e., the sign of the X-axis or Y-axis displacement) and the movement distance; the command conversion module 18 passes the command to the processing unit 12, so that the processing unit 12 executes a scrolling action command corresponding to the eye movement direction. The acquired eye movement direction serves as the reference value for determining the scroll direction, and the acquired movement distance serves as the reference value for determining the scroll speed.

Referring to FIG. 6, by gazing at the user interface window W, the user can operate the cursor W1 on the user interface window W. The eye movement command analysis module 28 analyzes the user's gaze direction from the acquired eye images and forms, on the user interface window W, a cursor W1 that moves according to the gaze direction. When the user's gaze direction rests on the user interface window W, the eye movement command analysis module 28 records the coordinate position on the display screen 11 of the electronic device 10 corresponding to the gaze direction; when the gaze direction rests on essentially the same graphical interface element for longer than a set threshold time, a trigger command is transmitted over the wireless network to the command conversion module 18, which converts the trigger command into a tap action command to launch the software program or instruction corresponding to the graphical interface element at that coordinate position.
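The dwell-to-trigger behavior can be sketched as a small state holder fed with the element currently under the gaze. The 1.5 s threshold here is an assumed example within the 1 to 2 s range mentioned for step S2053A:

```python
class DwellTrigger:
    """Fire a trigger command when the gaze rests on the same graphical
    element longer than a threshold time."""

    def __init__(self, dwell_s=1.5):
        self.dwell_s = dwell_s
        self._element = None
        self._since = None

    def update(self, element, now):
        """Feed the element currently under the gaze (or None) at time
        `now` in seconds; return the element to trigger, or None."""
        if element != self._element:      # gaze moved to a new element
            self._element, self._since = element, now
            return None
        if element is not None and now - self._since >= self.dwell_s:
            self._since = float("inf")    # fire at most once per dwell
            return element
        return None
```

A display front end could drive the timer W2's percentage figure from `(now - self._since) / self.dwell_s` while the dwell is in progress.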

As shown in FIG. 6, in a preferred embodiment, upon receiving the image data the mapping module 27 can delimit and objectify the corresponding graphical interface elements on the output unit 21 (the delimited ranges can be obtained from the graphics processing unit 13 of the electronic device 10). When the user's gaze direction rests within the range delimited by one of the graphical interface elements, the cursor W1 turns into a timer W2 that indicates the time required to activate that element. As illustrated, the timer W2 includes a percentage figure and a timing bar; when the user's gaze rests on a graphical interface element, the cursor W1 turns into the timer W2, and the percentage figure and timing bar show the remaining time. When the percentage reaches 100% and the timing bar shows full, the eye movement command analysis module 28 transmits a trigger command over the wireless network to the command conversion module 18 to launch the software program or instruction corresponding to that graphical interface element.

Referring also to FIG. 7, which discloses another preferred embodiment, the user interface window W may be provided with multiple operating areas, and the user can gaze at the graphical interface element in each operating area to generate the eye movement command corresponding to that area. For example, when the user rests the cursor W1 (gaze direction) on the right-pointing arrow graphic W3, the eye movement command analysis module 28 transmits a right-page-turn eye movement command to the command conversion module 18 of the electronic device 10; when the user rests the cursor W1 (gaze direction) on the left-pointing arrow graphic W4, the eye movement command analysis module 28 transmits a left-page-turn eye movement command to the command conversion module 18 of the electronic device 10; when the user rests the cursor W1 (gaze direction) on the up-pointing arrow graphic W5, the eye movement command analysis module 28 transmits an up-page-turn eye movement command to the command conversion module 18 of the electronic device 10; when the user rests the cursor W1 (gaze direction) on the down-pointing arrow graphic W6, the eye movement command analysis module 28 transmits a down-page-turn eye movement command to the command conversion module 18 of the electronic device 10. After receiving any of these eye movement commands, the command conversion module 18 finds the corresponding action command through the lookup table or an object-oriented mechanism and passes the action command to the processing unit 12 to launch the corresponding one or more programs.
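Mapping a gaze point to one of the operating areas W3 through W6 is a rectangle hit test. The region geometry below is an invented example layout on a normalized window; the patent does not give coordinates:

```python
# Hypothetical layout of the four arrow regions on a [0,1] x [0,1] window.
REGIONS = {
    "W3_right_arrow": (0.8, 0.3, 1.0, 0.7),   # (x0, y0, x1, y1)
    "W4_left_arrow":  (0.0, 0.3, 0.2, 0.7),
    "W5_up_arrow":    (0.3, 0.0, 0.7, 0.2),
    "W6_down_arrow":  (0.3, 0.8, 0.7, 1.0),
}

def hit_region(gaze):
    """Return the name of the operating area containing the gaze point,
    or None if the gaze is outside every arrow region."""
    gx, gy = gaze
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return name
    return None
```

The returned region name would then be translated into the corresponding page-turn eye movement command.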

The user interface synchronization method of the present invention is described in detail below with reference to the drawings. FIG. 8 is a schematic flowchart of the user interface synchronization method of the present invention, as shown in the figure:

The user interface synchronization method of the present invention is applied to the electronic device 10 and the wearable device 20: the picture on the electronic device 10 is transferred to the wearable device 20, so that the user interface of the electronic device 10 can be operated wirelessly through the eye control function of the wearable device 20. The specific interface synchronization procedure is as follows:

Initially, the electronic device 10 is paired with the wearable device 20; the pairing may be performed through encryption, key establishment, or another mutually verifiable method, so as to establish a connection between the electronic device 10 and the wearable device 20. (Step S201)

When pairing is completed, the image data of the electronic device 10 (for example, the image on the display screen 11) is accessed and transmitted to the wearable device 20 over the wireless network. (Step S202)

After receiving the image data, the wearable device 20 displays it on the output unit 21 of the wearable device 20 for the user's gaze-based operation. The wearable device 20 creates a user interface window W displaying the image data, proportionally enlarged or shrunk according to the length and width of the display screen 11 of the electronic device 10. (Step S203)

The wearable device 20 analyzes the user's gaze direction from the acquired eye images and forms, on the user interface window W, a cursor W1 that moves according to the gaze direction. (Step S204)

The eye movement command acquired by the wearable device 20 is analyzed, and the eye movement command is transmitted to the electronic device 10 over the wireless network. Upon receiving the eye movement command, the electronic device 10 converts it, outputting it as an action command executable by the electronic device 10. (Step S205)

Three different implementations of step S205 are described below; it should be understood that all three implementations can be carried out simultaneously in step S205:

First implementation, with reference to FIG. 9. First, the range corresponding to each graphical interface element on the display screen array is obtained; this range can be obtained from the graphics processing unit 13 of the electronic device 10, and the range of each graphical interface element contains the one or more programs corresponding to that element (step S2051A). When the user's gaze direction rests on the user interface window, the coordinate position on the display screen 11 of the electronic device 10 corresponding to the gaze direction is recorded, confirming the user's gaze direction (step S2052A). When the user's gaze direction rests on one of the graphical interface elements, a timer is started to record the dwell time of the gaze, and it is judged whether the gaze has exceeded a set threshold time, for example 1 to 2 seconds (step S2053A). When the user's gaze direction rests on the same graphical interface element longer than the set threshold time (the eye movement command), a trigger command is transmitted over the wireless network to the electronic device 10 to launch the one or more programs corresponding to that graphical interface element (the action command) (step S2054A). Conversely, if the gaze direction leaves the range of the graphical interface element, the flow returns to step S2052A and continues to detect the user's gaze direction.

Second implementation, with reference to FIG. 10. First, a decision procedure is started, in which the user's gaze direction is continuously detected (step S2051B). When the user's gaze direction is detected moving rapidly from left to right, a left-page-turn eye movement command is transmitted to the electronic device 10, so that the electronic device 10 executes a left-to-right page-turn action command (step S2052B). When the user's gaze direction is detected moving rapidly from right to left, a right-page-turn eye movement command is transmitted to the electronic device 10, so that the electronic device 10 executes a right-to-left page-turn action command (step S2053B).

Third implementation, with reference to FIG. 11. First, the wearable device 20 detects whether the user performs a trigger action (step S2051C). When the user's trigger action is detected, a reference coordinate is set, the user's gaze direction is continuously detected, and the X-axis and Y-axis displacement of the gaze direction relative to the reference coordinate is recorded (step S2052C). Next, it is judged whether the X-axis or Y-axis displacement exceeds a threshold (step S2053C); if so, the flow proceeds to the next step, and if not, it returns to step S2052C. When the X-axis or Y-axis displacement exceeds the threshold, the eye movement direction is determined (step S2054C). An eye movement command containing the eye movement direction and movement distance is transmitted to the electronic device 10, so that the electronic device 10 executes a scrolling action command corresponding to the eye movement direction (step S2055C). In the above steps, if the displacement of the user's gaze direction returns to below the threshold, the flow reverts to the procedure before step S2051C and continues to detect whether the user performs a trigger action.

The method steps described in the present invention may also be implemented as a computer-readable recording medium, stored on a medium such as an optical disc, a hard disk, or a semiconductor memory device, and loaded via that medium onto an electronic device so that the electronic device or related equipment can access and execute them.

The method steps described in the present invention may also be implemented as a computer program product stored on the hard disk or memory device of a network server, for example an app store, Google Play, the Windows Store, or a similar online application distribution platform, and may be deployed by uploading the computer program product to the server for users to download for a fee.

In summary, the user interface synchronization system of the present invention can transmit the image data of the electronic device to the output unit of the wearable device, so that the electronic device can be operated by tracking the user's eye movements. Because the front camera is kept at a fixed distance from the user's eyes, the user's eye movements are easier to detect.

The present invention has been described in detail above; however, the foregoing is merely a preferred embodiment of the invention and shall not limit the scope of its implementation. All equivalent changes and modifications made within the scope of the patent claims of the present invention shall remain covered by this patent.

Claims (24)

1. A user interface synchronization system for pairing an electronic device with a wearable device, characterized in that the user interface synchronization system comprises: a graphics output module, coupled to the electronic device, for accessing image data of the electronic device and transmitting the image data of the electronic device to the wearable device via a wireless network; an imaging module, coupled to the wearable device, for displaying the image data on an output unit of the wearable device for the user's gaze operation; an eye movement command analysis module, coupled to the wearable device, for analyzing eye movement commands acquired by the wearable device; and a command conversion module, coupled to the electronic device or the wearable device, for converting a received eye movement command so as to output the eye movement command as an action command executable by the electronic device.
2. The user interface synchronization system according to claim 1, characterized in that the imaging module creates, on the output unit of the wearable device, a user interface window displaying the image data, and scales the user interface window up or down in proportion to the length and width of the display screen of the electronic device.
3. The user interface synchronization system according to claim 2, characterized in that the eye movement command analysis module analyzes the user's gaze direction from the acquired eye image and forms, on the user interface window, a cursor movable according to the gaze direction.
4. The user interface synchronization system according to claim 3, characterized in that, when the user's gaze direction rests on the user interface window, the eye movement command analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and, when the gaze direction remains substantially on the same graphical interface element longer than a set threshold time, transmits a trigger command via the wireless network to the command conversion module to launch one or more programs corresponding to that graphical interface element.
5. The user interface synchronization system according to claim 3, characterized in that, upon detecting that the user's gaze direction moves rapidly from left to right, the eye movement command analysis module transmits a left page-turn eye movement command to the command conversion module so that the electronic device executes an action command that turns the page from left to right, and, upon detecting that the user's gaze direction moves rapidly from right to left, transmits a right page-turn eye movement command to the command conversion module so that the electronic device executes an action command that turns the page from right to left.
6. The user interface synchronization system according to claim 5, characterized in that the eye movement command analysis module sets a reference coordinate upon detecting a trigger action by the user, continuously detects the user's gaze direction, and records the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate; when the X-axis movement distance or the Y-axis movement distance exceeds a threshold, it transmits an eye movement command corresponding to the eye movement direction and movement distance to the command conversion module so that the electronic device executes an action command that scrolls in the direction of the eye movement.
7. The user interface synchronization system according to claim 1, characterized in that the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point.
8. A controlled-end electronic device, characterized by comprising: a display screen, a wireless transmission unit, and a processor connected to the display screen and the wireless transmission unit, the processor comprising: a graphics processing unit for transmitting image data to the display screen to provide a user interface for user operation; and a computing unit for loading and executing the following programs: a graphics output module for accessing the image data of the electronic device and transmitting the image data displayed by the electronic device via a wireless network to a wearable device, which outputs it for visual operation by the user; and a command conversion module for receiving, via the wireless network, an eye movement command provided by the wearable device and outputting the eye movement command as an action command executable by the electronic device.
9. The electronic device according to claim 8, characterized in that the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point.
10. A master-end wearable device, characterized by comprising: an output unit, a wireless transmission unit, a camera unit, and a processor connected to the output unit, the wireless transmission unit, and the camera unit, the camera unit capturing an eye image of the user, the processor comprising: a graphics processing unit for transmitting image data to the output unit to provide a user interface for user operation; and a computing unit for loading and executing the following programs: an imaging module for obtaining image data of an electronic device via a wireless network and transmitting the image data to the graphics processing unit of the wearable device for display on the output unit for the user's gaze operation; and an eye movement command analysis module for obtaining the eye image captured by the camera unit, deriving an eye movement command from the eye image, and transmitting the eye movement command to the electronic device via the wireless network to launch one or more programs of the electronic device.
11. The wearable device according to claim 10, characterized in that the imaging module creates, on the output unit of the wearable device, a user interface window displaying the image data, and scales the user interface window up or down in proportion to the length and width of the display screen of the electronic device.
12. The wearable device according to claim 11, characterized in that the eye movement command analysis module analyzes the user's gaze direction from the acquired eye image and forms, on the user interface window, a cursor movable according to the gaze direction.
13. The wearable device according to claim 12, characterized in that, when the user's gaze direction rests on the user interface window, the eye movement command analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction and, upon detecting that the gaze direction remains on the same graphical interface element longer than a set threshold time, transmits a trigger command to the electronic device via the wireless network to launch one or more programs corresponding to that graphical interface element.
14. The wearable device according to claim 12, characterized in that, upon detecting that the user's gaze direction moves rapidly from left to right, the eye movement command analysis module transmits a left page-turn eye movement command to the electronic device so that the electronic device executes a program that turns the page from left to right, and, upon detecting that the user's gaze direction moves rapidly from right to left, transmits a right page-turn eye movement command to the electronic device so that the electronic device executes a program that turns the page from right to left.
15. The wearable device according to claim 12, characterized in that the eye movement command analysis module sets a reference coordinate upon detecting a trigger action by the user, continuously detects the user's gaze direction, and records the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate; when the X-axis movement distance or the Y-axis movement distance exceeds a threshold, it transmits an eye movement command corresponding to the eye movement direction and movement distance to the electronic device so that the electronic device executes a program that scrolls in the direction of the eye movement.
16. The wearable device according to claim 10, characterized in that the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point.
17. An interface synchronization method for a wearable device and an electronic device, characterized by comprising: accessing image data of the electronic device, and transmitting the image data of the electronic device to the wearable device via a wireless network; displaying the image data on an output unit of the wearable device for the user's gaze operation; analyzing an eye movement command acquired by the wearable device, and transmitting the eye movement command to the electronic device via the wireless network; and converting the eye movement command upon receipt, so as to output the eye movement command as an action command executable by the electronic device.
18. The interface synchronization method according to claim 17, characterized in that, after receiving the image data, the wearable device creates a user interface window displaying the image data, scaled up or down in proportion to the length and width of the display screen of the electronic device.
19. The interface synchronization method according to claim 18, characterized in that the wearable device analyzes the user's gaze direction from the acquired eye image and forms, on the user interface window, a cursor movable according to the gaze direction.
20. The interface synchronization method according to claim 19, characterized in that, when the user's gaze direction rests on the user interface window, the coordinate position on the display screen of the electronic device corresponding to the gaze direction is recorded, and, when the gaze direction remains substantially on the same graphical interface element longer than a set threshold time, a trigger command is transmitted to the electronic device via the wireless network to launch one or more programs corresponding to that graphical interface element.
21. The interface synchronization method according to claim 19, characterized in that, upon detecting that the user's gaze direction moves rapidly from left to right, a left page-turn eye movement command is transmitted to the electronic device so that the electronic device executes an action command that turns the page from left to right, and, upon detecting that the user's gaze direction moves rapidly from right to left, a right page-turn eye movement command is transmitted to the electronic device so that the electronic device executes an action command that turns the page from right to left.
22. The interface synchronization method according to claim 19, characterized in that a reference coordinate is set upon detecting a trigger action by the user, the user's gaze direction is continuously detected, and the X-axis and Y-axis movement distances of the gaze direction relative to the reference coordinate are recorded; when the X-axis movement distance or the Y-axis movement distance exceeds a threshold, an eye movement command corresponding to the eye movement direction and movement distance is transmitted to the electronic device so that the electronic device executes an action command that scrolls in the direction of the eye movement.
23. A computer-readable recording medium, characterized in that a program is recorded on the recording medium, and, when the program is loaded and executed by an electronic device and a wearable device, the method according to any one of claims 17 to 22 is carried out.
24. A computer program product, characterized in that, when the computer program product is loaded into and executed by an electronic device and a wearable device, the method according to any one of claims 17 to 22 is carried out.
CN201510340993.1A 2015-04-29 2015-06-18 User interface synchronization system and method Active CN106201284B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW104113704A TWI571768B (en) 2015-04-29 2015-04-29 User interface synchronization system, device, method, computer readable recording medium, and computer program product
TW104113704 2015-04-29

Publications (2)

Publication Number Publication Date
CN106201284A true CN106201284A (en) 2016-12-07
CN106201284B CN106201284B (en) 2020-03-24

Family

ID=57453126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510340993.1A Active CN106201284B (en) 2015-04-29 2015-06-18 User interface synchronization system and method

Country Status (2)

Country Link
CN (1) CN106201284B (en)
TW (1) TWI571768B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106774893A (en) * 2016-12-15 2017-05-31 飞狐信息技术(天津)有限公司 A kind of virtual reality exchange method and virtual reality device
CN107820599A (en) * 2016-12-09 2018-03-20 深圳市柔宇科技有限公司 The method of adjustment of user interface, adjustment system and wear display device
CN112560572A (en) * 2020-10-24 2021-03-26 北京博睿维讯科技有限公司 Camera shooting and large screen interaction processing method, device and system
CN119621846A (en) * 2024-11-21 2025-03-14 数字广东网络建设有限公司 Data synchronization method, device, electronic device and storage medium
TWI901406B (en) * 2024-10-25 2025-10-11 森思股份有限公司 Visual perception testing system

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
US20180150204A1 (en) * 2016-11-30 2018-05-31 Google Inc. Switching of active objects in an augmented and/or virtual reality environment
US10511842B2 (en) 2017-10-06 2019-12-17 Qualcomm Incorporated System and method for foveated compression of image frames in a system on a chip

Citations (6)

Publication number Priority date Publication date Assignee Title
CN101258436A (en) * 2005-09-08 2008-09-03 瑞士电信流动电话公司 Communication devices, systems and methods
CN101272727A (en) * 2005-09-27 2008-09-24 潘尼公司 Devices for controlling external units
US7762665B2 (en) * 2003-03-21 2010-07-27 Queen's University At Kingston Method and apparatus for communication between humans and devices
CN103472915A (en) * 2013-08-30 2013-12-25 深圳Tcl新技术有限公司 Reading control method and reading control device on basis of pupil tracking and display equipment
TWM472854U (en) * 2013-11-27 2014-02-21 Chipsip Technology Co Ltd Wearable display
CN103885589A (en) * 2014-03-06 2014-06-25 华为技术有限公司 Eye tracking method and device

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US20100235786A1 (en) * 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
JP5539945B2 (en) * 2011-11-01 2014-07-02 株式会社コナミデジタルエンタテインメント GAME DEVICE AND PROGRAM


Cited By (8)

Publication number Priority date Publication date Assignee Title
CN107820599A (en) * 2016-12-09 2018-03-20 深圳市柔宇科技有限公司 The method of adjustment of user interface, adjustment system and wear display device
CN107820599B (en) * 2016-12-09 2021-03-23 深圳市柔宇科技股份有限公司 User interface adjustment method, adjustment system, and head mounted display device
CN106774893A (en) * 2016-12-15 2017-05-31 飞狐信息技术(天津)有限公司 A kind of virtual reality exchange method and virtual reality device
CN106774893B (en) * 2016-12-15 2019-10-18 飞狐信息技术(天津)有限公司 A kind of virtual reality exchange method and virtual reality device
CN112560572A (en) * 2020-10-24 2021-03-26 北京博睿维讯科技有限公司 Camera shooting and large screen interaction processing method, device and system
CN112560572B (en) * 2020-10-24 2024-11-12 北京博睿维讯科技有限公司 A camera and large screen interactive processing method, device and system
TWI901406B (en) * 2024-10-25 2025-10-11 森思股份有限公司 Visual perception testing system
CN119621846A (en) * 2024-11-21 2025-03-14 数字广东网络建设有限公司 Data synchronization method, device, electronic device and storage medium

Also Published As

Publication number Publication date
TWI571768B (en) 2017-02-21
TW201638723A (en) 2016-11-01
CN106201284B (en) 2020-03-24

Similar Documents

Publication Publication Date Title
CN106201284B (en) User interface synchronization system and method
EP3035656B1 (en) Method and apparatus for controlling an electronic device
CN110163045B (en) A method, device and equipment for identifying gestures
EP3467707A1 (en) System and method for deep learning based hand gesture recognition in first person view
US9337926B2 (en) Apparatus and method for providing dynamic fiducial markers for devices
WO2020187157A1 (en) Control method and electronic device
WO2020177585A1 (en) Gesture processing method and device
EP4047549A1 (en) Method and device for image detection, and electronic device
US10635180B2 (en) Remote control of a desktop application via a mobile device
CN103309437B (en) The caching mechanism of posture based on video camera
US20190080188A1 (en) Facial recognition method and related product
CN102779000B (en) User interaction system and method
CN110196640A (en) A kind of method of controlling operation thereof and terminal
CN110604579B (en) Data acquisition method, device, terminal and storage medium
JP6504058B2 (en) INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM
CN114201030A (en) Device interaction method, electronic device and interaction system
US12432443B2 (en) System and method for detecting a user intent to start a video recording
CN115529405B (en) Image display method and device for front camera
US11662592B2 (en) Head-mounted display device and head-mounted display system
CN116382460A (en) Terminal control method, device, equipment and storage medium
CN116132811A (en) Camera light filling lamp control method and device, camera, robot equipment and medium
US12518473B1 (en) Method and device for dynamically adjusting a pointer visualization
US20260030839A1 (en) Input event detection for extended reality (xr) headsets based on eye tracking
US20240377886A1 (en) System and method for interacting with extended reality environment
RU2782960C1 (en) Control method and electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant