CN117130471A - Human-computer interaction method, electronic device, and system - Google Patents
Human-computer interaction method, electronic device, and system
- Publication number
- CN117130471A (application number CN202310378606.8A)
- Authority
- CN (China)
- Prior art keywords
- application
- screen
- gesture
- electronic device
- virtual space
- Prior art date
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Abstract
Description
Technical Field
This application relates to the field of terminal technology, and in particular to a human-computer interaction method, an electronic device, and a system.
Background
With the development of computer graphics technology, extended reality (XR) technologies such as augmented reality (AR), virtual reality (VR), and mixed reality (MR) are gradually entering people's daily lives. Existing XR devices are typically operated by using an electronic device such as a mobile phone as a remote control. How to make operating an XR device through an electronic device more convenient is a problem that needs to be solved.
Summary of the Invention
Embodiments of this application provide a human-computer interaction method, an electronic device, and a system that offer a convenient way for users to operate the virtual space of an XR device through an electronic device such as a mobile phone, improving the user experience.
To achieve the above objectives, the embodiments of this application adopt the following technical solutions:
In a first aspect, a human-computer interaction method is provided, applied to an electronic device connected to a head-mounted display device. The method includes: after the electronic device starts a first application, it receives a first gesture from the user on the display interface of the first application. The first application is used to manage the virtual space of the head-mounted display device. The first gesture includes a screenshot gesture, a screen-recording gesture, or a back (return to the previous level) gesture. Because the first application is running in the foreground, system gestures such as the screenshot gesture, the screen-recording gesture, or the back gesture act on the head-mounted display device. In response to the first gesture, the electronic device performs the action corresponding to the first gesture on the virtual space of the head-mounted display device.
Specifically, when the first gesture is a screenshot gesture, performing the corresponding action on the display interface of the virtual space includes taking a screenshot of the display interface of the virtual space; when the first gesture is a screen-recording gesture, it includes recording the display interface of the virtual space; when the first gesture is a back gesture, it includes causing the virtual space to display the interface one level above the currently displayed interface.
In this method, the user can perform system gestures (a screenshot gesture, a screen-recording gesture, or a back gesture) on the touchscreen of the electronic device to take a screenshot of, record, or navigate back within the display interface of the virtual space of the head-mounted display device. In this way, while watching content in the virtual space, the user only needs to make a system gesture on the touchscreen of the electronic device to take a screenshot, record the screen, or go back one level in the virtual space. The user's gaze does not need to leave the virtual space, which avoids shifting attention back and forth between the virtual space and the screen of the electronic device. Moreover, gesture operation is convenient and fast, with low operation cost and a good user experience.
With reference to the first aspect, in one implementation, if the first application is closed or moved to the background, the electronic device stops displaying the display interface of the first application and displays the application interface of a second application. For example, the second application may be a launcher (desktop) application, a system application, or a third-party application. The electronic device receives the user's first gesture on the display interface of the second application; in response to this first gesture, the electronic device performs the action corresponding to the first gesture on the display interface of the second application. Specifically, when the first gesture is a screenshot gesture, the action includes taking a screenshot of the display interface of the second application; when it is a screen-recording gesture, the action includes recording the display interface of the second application; when it is a back gesture, the action includes the electronic device displaying the interface one level above the display interface of the second application.
In this method, after receiving a system gesture, the electronic device distinguishes between different situations and either performs the action corresponding to the system gesture itself or triggers the head-mounted display device to perform it. If the first application is not running in the foreground, the electronic device itself performs the action corresponding to the system gesture (that is, it performs the action corresponding to the first gesture on the display interface of the second application). This supports taking screenshots of, recording, and navigating back on the electronic device itself through gestures.
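The routing rule described above (system gestures act on the virtual space only while the management application is in the foreground, and on the phone itself otherwise) can be illustrated with a short sketch. The following Kotlin is a hedged illustration only: the type names, interfaces, and the foreground check are assumptions made for readability and are not defined by the patent.

```kotlin
// Hedged sketch of gesture routing; all names are illustrative, not from the patent.

enum class SystemGesture { SCREENSHOT, SCREEN_RECORD, BACK }

// Common set of actions supported by both targets (the HMD's virtual space and
// the phone's own screen).
interface GestureTarget {
    fun captureScreenshot()
    fun startScreenRecording()
    fun navigateBack()
}

class GestureRouter(
    private val hmdVirtualSpace: GestureTarget,   // acts on the head-mounted display
    private val phoneScreen: GestureTarget,       // acts on the phone's own UI
    private val isManagerAppForeground: () -> Boolean
) {
    fun onSystemGesture(gesture: SystemGesture) {
        // Rule from the first aspect: when the management app is in the foreground,
        // system gestures act on the virtual space; otherwise on the phone itself.
        val target = if (isManagerAppForeground()) hmdVirtualSpace else phoneScreen
        when (gesture) {
            SystemGesture.SCREENSHOT    -> target.captureScreenshot()
            SystemGesture.SCREEN_RECORD -> target.startScreenRecording()
            SystemGesture.BACK          -> target.navigateBack()
        }
    }
}
```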
With reference to the first aspect, in one implementation, the screenshot gesture includes any one of a multi-finger swipe down on the screen, a multi-finger swipe up, a single-knuckle double tap on the screen, or a double-knuckle double tap on the screen; the screen-recording gesture likewise includes any one of a multi-finger swipe down, a multi-finger swipe up, a single-knuckle double tap, or a double-knuckle double tap, where the screen-recording gesture is different from the screenshot gesture; the back gesture includes any one of a single-finger swipe from the left edge of the screen toward the right, a single-finger swipe from the right edge of the screen toward the left, or fingers swiping inward from both edges of the screen.
With reference to the first aspect, in one implementation, the electronic device generates the display interface of the virtual space and sends this display interface to the head-mounted display device. In other words, the electronic device generates the display data and display interface of the head-mounted display device, the electronic device provides the display data to the head-mounted display device, and the head-mounted display device serves as a display device of the electronic device.
In this implementation, the electronic device locally holds the information of the display interface of the head-mounted display device, so the electronic device can take a screenshot, record the screen, or generate the interface one level above the current display interface of the virtual space based on the locally available display interface. In this way, when taking a screenshot or recording the screen, the electronic device does not need to fetch a large amount of virtual-space display data from the head-mounted display device, which reduces the data exchange between the two devices and avoids being limited by the communication rate between them.
With reference to the first aspect, in one implementation, the electronic device instead triggers the head-mounted display device to take a screenshot of, record, or generate the interface one level above the display interface of the virtual space. Further, the head-mounted display device may send the screenshot image or the screen-recording file that it generates to the electronic device.
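For the variant just described, in which the head-mounted display performs the capture itself and sends the resulting file back, a minimal message-passing sketch might look as follows. The command format, link interface, and file handling are assumptions; the patent does not specify them.

```kotlin
// Hedged sketch: the phone only sends a capture command; the head-mounted display
// captures its own virtual space and returns the file. Message and link types are
// assumptions.

import java.io.File

sealed class CaptureCommand {
    object Screenshot : CaptureCommand()
    data class ScreenRecord(val start: Boolean) : CaptureCommand()
}

interface HmdLink {
    fun send(command: CaptureCommand)
    fun onFileReceived(handler: (File) -> Unit)   // screenshot image or recording file
}

class RemoteCaptureController(private val link: HmdLink, private val saveDir: File) {
    init {
        // When the HMD finishes a capture it sends the resulting file back;
        // store it locally so the user can browse it on the phone.
        link.onFileReceived { received ->
            received.copyTo(File(saveDir, received.name), overwrite = true)
        }
    }

    fun takeScreenshot() = link.send(CaptureCommand.Screenshot)
    fun startRecording() = link.send(CaptureCommand.ScreenRecord(start = true))
    fun stopRecording()  = link.send(CaptureCommand.ScreenRecord(start = false))
}
```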
In a second aspect, a human-computer interaction method is provided, applied to an electronic device connected to a head-mounted display device. The method includes: after the electronic device starts the first application used for managing the virtual space of the head-mounted display device, it receives the user's first operation on the display interface of the first application; in response to the first operation, the electronic device triggers the virtual space of the head-mounted display device to display at least one of a screenshot virtual button, a screen-recording virtual button, a back virtual button, and a home virtual button. The screenshot virtual button is used to take a screenshot of the display interface of the virtual space; the screen-recording virtual button is used to record the display interface of the virtual space; the back virtual button is used to make the virtual space display the interface one level above the currently displayed interface; and the home virtual button is used to make the virtual space display the desktop (home) interface.
In this method, in response to the first operation entered by the user on the application interface of the first application on the electronic device, virtual buttons such as screenshot, screen recording, back, and home pop up in the virtual space of the head-mounted display device. These virtual buttons do not depend on the interface currently displayed in the virtual space and can pop up on any display interface of the virtual space. In this way, when the user needs to take a screenshot, record the screen, go back one level, or return to the home screen, the user's gaze does not need to leave the virtual space and the user does not need to exit the application currently displayed in the virtual space; a simple gesture is enough, making the operation simple and fast and the user experience better.
With reference to the second aspect, in one implementation, the electronic device triggering the virtual space of the head-mounted display device to display at least one of the screenshot virtual button, the screen-recording virtual button, the back virtual button, and the home virtual button includes: the electronic device triggers the virtual space of the head-mounted display device to display a shortcut operation panel, on which at least one of the screenshot virtual button, the screen-recording virtual button, the back virtual button, and the home virtual button is provided.
The shape, size, and position of the shortcut operation panel can all be set according to the specific situation.
In one implementation, the shortcut operation panel is superimposed on the application interface of the second application displayed in the virtual space.
In another implementation, the shortcut operation panel is displayed in a blank area of the display interface of the virtual space.
With reference to the second aspect, in one implementation, at least one of the screenshot virtual button, the screen-recording virtual button, the back virtual button, and the home virtual button is superimposed on the application interface of the second application displayed in the virtual space.
With reference to the second aspect, in one implementation, the first operation includes any one of a single-finger tap, a multi-finger tap, a knuckle tap, or a single-finger circle drawn on the screen.
With reference to the second aspect, in one implementation, in response to the first operation, the electronic device triggering the virtual space of the head-mounted display device to display at least one of the screenshot virtual button, the screen-recording virtual button, the back virtual button, and the home virtual button includes: in response to the first operation, the electronic device generates first display data and sends the first display data to the head-mounted display device; the first display data is used by the head-mounted display device to display, in the virtual space, at least one of the screenshot virtual button, the screen-recording virtual button, the back virtual button, and the home virtual button.
In this implementation, the electronic device generates the display data and display interface of the head-mounted display device, the electronic device provides the display data to the head-mounted display device, and the head-mounted display device serves as a display device of the electronic device.
With reference to the second aspect, in another implementation, in response to the first operation, the electronic device triggering the virtual space of the head-mounted display device to display at least one of the screenshot virtual button, the screen-recording virtual button, the back virtual button, and the home virtual button includes: in response to the first operation, the electronic device sends a first event corresponding to the first operation to the head-mounted display device; the first event is used to trigger the head-mounted display device to generate first display data, and the first display data is used by the head-mounted display device to display, in the virtual space, at least one of the screenshot virtual button, the screen-recording virtual button, the back virtual button, and the home virtual button.
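The two implementations above differ only in where the overlay is rendered: the phone can generate the first display data itself, or it can forward a first event and let the head-mounted display generate the display data. A hedged sketch of that branch, with all type and function names invented for illustration:

```kotlin
// Hedged sketch contrasting the two implementations: the phone either renders the
// button overlay itself and streams display data, or forwards an event and lets
// the HMD render the overlay. All types are illustrative placeholders.

class DisplayFrame(val pixels: ByteArray)    // a rendered frame containing the buttons
class FirstEvent(val name: String)           // e.g. "show_shortcut_panel"

interface HmdChannel {
    fun sendDisplayData(frame: DisplayFrame) // variant 1: phone is the renderer
    fun sendEvent(event: FirstEvent)         // variant 2: HMD renders on its own
}

class ShortcutPanelTrigger(
    private val channel: HmdChannel,
    private val phoneRendersVirtualSpace: Boolean,
    private val renderOverlayFrame: () -> DisplayFrame
) {
    fun onFirstOperation() {
        if (phoneRendersVirtualSpace) {
            // Variant 1: compose the screenshot / record / back / home buttons into
            // the next frame (the "first display data") and stream it to the HMD.
            channel.sendDisplayData(renderOverlayFrame())
        } else {
            // Variant 2: only report the operation (the "first event"); the HMD
            // generates the display data for the shortcut panel itself.
            channel.sendEvent(FirstEvent("show_shortcut_panel"))
        }
    }
}
```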
In a third aspect, an electronic device is provided, which has the functionality to implement the method described in the first aspect or the second aspect. This functionality can be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functionality.
In a fourth aspect, an electronic device is provided, including a processor, a touchscreen, and a memory. The memory is used to store computer-executable instructions. When the electronic device runs, the processor executes the computer-executable instructions stored in the memory, so that the electronic device performs the method described in any one of the first aspect or the second aspect.
In a fifth aspect, an electronic device is provided, including a processor. The processor is configured to be coupled to a memory and, after reading the instructions in the memory, to execute the method described in any one of the first aspect or the second aspect according to those instructions.
In a sixth aspect, a computer-readable storage medium is provided. The computer-readable storage medium stores instructions that, when run on a computer, enable the computer to perform the method described in any one of the first aspect or the second aspect.
In a seventh aspect, a computer program product containing instructions is provided which, when run on a computer, enables the computer to perform the method described in any one of the first aspect or the second aspect.
In an eighth aspect, an apparatus is provided (for example, the apparatus may be a chip system). The apparatus includes a processor configured to support an electronic device in implementing the functions involved in the first aspect or the second aspect. In a possible design, the apparatus further includes a memory for storing the program instructions and data necessary for the electronic device. When the apparatus is a chip system, it may consist of a chip, or it may include a chip and other discrete components.
For the technical effects brought by any of the designs in the third to eighth aspects, refer to the technical effects of the corresponding designs in the first or second aspect, which are not repeated here.
Brief Description of the Drawings
Figure 1 is a schematic diagram of the system architecture to which the human-computer interaction method provided by the embodiments of this application applies;
Figure 2 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of this application;
Figure 3A is a schematic diagram of the optical structure of a head-mounted display device provided by an embodiment of this application;
Figure 3B is a schematic diagram of the hardware structure of a head-mounted display device provided by an embodiment of this application;
Figure 4A is a schematic diagram of an example scenario of human-computer interaction between a user and a head-mounted display device;
Figure 4B is a schematic diagram of another example scenario of human-computer interaction between a user and a head-mounted display device;
Figure 5 is a schematic diagram of an example scenario in which a screenshot gesture is received while the mobile phone displays the application interface of the AR application, according to an embodiment of this application;
Figure 6 is a schematic diagram of an example scenario in which a screenshot gesture is received while the mobile phone is not displaying the application interface of the AR application;
Figure 7 is a schematic diagram of an example scenario in which a screen-recording gesture is received while the mobile phone displays the application interface of the AR application;
Figure 8 is a schematic diagram of an example scenario in which a screen-recording gesture is received while the mobile phone is not displaying the application interface of the AR application;
Figure 9 is a schematic diagram of an example scenario in which a back gesture is received while the mobile phone displays the application interface of the AR application;
Figure 10 is a schematic diagram of an example scenario in which a back gesture is received while the mobile phone is not displaying the application interface of the AR application;
Figure 11 is a schematic flowchart of the human-computer interaction method provided by an embodiment of this application;
Figures 12 and 13 are a detailed schematic flowchart of the human-computer interaction method provided by an embodiment of this application;
Figure 14 is a schematic diagram of an example scenario in which the mobile phone receives the first operation, according to an embodiment of this application;
Figures 15A-15D are schematic diagrams of example scenarios in which a user performs human-computer interaction in the virtual space;
Figures 16A-16C are schematic diagrams of examples of display interfaces of the virtual space;
Figure 17 is another schematic flowchart of the human-computer interaction method provided by an embodiment of this application;
Figure 18 is a schematic diagram of the structure of an electronic device provided by an embodiment of this application;
Figure 19 is a schematic diagram of the structure of a chip system provided by an embodiment of this application.
Detailed Description
In the description of the embodiments of this application, the terms used in the following embodiments are only for the purpose of describing specific embodiments and are not intended to limit this application. As used in the specification and the appended claims of this application, the singular forms "a", "an", "the", "said", and "this" are intended to also include forms such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of this application, "at least one" and "one or more" mean one, or two or more (including two). The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" can mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it.
Reference in this specification to "one embodiment" or "some embodiments" and the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of this application. Therefore, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in other embodiments", and the like appearing in different places in this specification do not necessarily all refer to the same embodiment; rather, they mean "one or more but not all embodiments", unless specifically emphasized otherwise. The terms "include", "comprise", "have", and their variants all mean "including but not limited to", unless specifically emphasized otherwise. The term "connected" includes both direct and indirect connections, unless otherwise stated. "First" and "second" are used only for descriptive purposes and should not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features referred to.
In the embodiments of this application, words such as "exemplary" or "for example" are used to mean serving as an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of this application should not be construed as being preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present the related concepts in a concrete manner.
The human-computer interaction method provided by the embodiments of this application can be applied to the system shown in Figure 1. The system 10 may include an electronic device 100 and a head-mounted display device 200.
The electronic device 100 and the head-mounted display device 200 may be connected in a wired or wireless manner. A wired connection may communicate through an interface such as a universal serial bus (USB) interface or a high definition multimedia interface (HDMI) interface. A wireless connection may include one or more connections that communicate through technologies such as Bluetooth, wireless fidelity (Wi-Fi) direct (e.g. Wi-Fi P2P), Wi-Fi softAP, Wi-Fi LAN, or radio frequency. The embodiments of this application do not limit the connection method between the two devices.
The electronic device 100 may be a mobile phone or a tablet computer, or a non-portable terminal device such as a laptop computer with a touch-sensitive surface or touch panel, or a desktop computer with a touch-sensitive surface or touch panel. The electronic device 100 can run specific application programs to provide content that is transmitted to the head-mounted display device 200 for display; such an application may be, for example, a video application, a game application, a music application, a desktop application, or a screen-mirroring application.
Possible implementation forms of the head-mounted display device 200 include helmets, glasses, headsets, and other electronic apparatuses that can be worn on the user's head. The head-mounted display device 200 uses technologies such as AR, VR, and MR to display images in a virtual space, allowing the user to experience 3D scenes and providing an AR/VR/MR experience. A 3D scene may include 3D images, 3D video, audio, and so on. It should be understood that the virtual space shown in Figure 1 is a plane; in actual use, the virtual space may be a curved space with curvature.
The head-mounted display device 200 can be worn on the user's head and is equivalent to an extended display of the electronic device 100. The electronic device 100 provides display data to the head-mounted display device 200.
The electronic device 100 can also serve as an input device, receiving user operations such as taps and swipes, and it can cast a ray into the AR/VR/MR virtual space (an AR space, VR space, or MR space) to simulate a mouse pointer, making it easier for the user to control the content displayed by the head-mounted display device 200.
When the electronic device 100 is used as an input device, it can receive user input through the various sensors it is equipped with, such as a touch-sensitive sensor, an acceleration sensor, a gyroscope sensor, a magnetic sensor, and a pressure sensor. The acceleration sensor and the gyroscope sensor can detect the user's movement of the electronic device 100, which can be used to change the direction of the ray; the touch-sensitive sensor, the pressure sensor, and the like can detect the user's touch operations on a touch panel such as a touchscreen, for example swipe operations, tap operations, short presses, and long presses.
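As a hedged illustration of the ray-pointer idea described above (not an implementation taken from the patent), the phone's orientation can be mapped to a ray direction and intersected with a virtual plane to obtain a cursor position. The simple yaw/pitch model and the plane intersection below are simplifying assumptions.

```kotlin
// Hedged sketch: mapping the phone's orientation (yaw/pitch from the gyroscope and
// accelerometer) to a ray in the virtual space, then to a 2D cursor on a virtual
// plane in front of the user.

import kotlin.math.cos
import kotlin.math.sin

data class Vec3(val x: Float, val y: Float, val z: Float)

// Unit direction of the ray that acts like a mouse pointer in the virtual space.
fun rayDirection(yaw: Float, pitch: Float): Vec3 =
    Vec3(
        x = cos(pitch) * sin(yaw),
        y = sin(pitch),
        z = cos(pitch) * cos(yaw)
    )

// Intersect the ray with a virtual plane at depth planeZ in front of the user to
// obtain the cursor position on the virtual display surface.
fun cursorOnPlane(origin: Vec3, dir: Vec3, planeZ: Float): Pair<Float, Float>? {
    if (dir.z <= 0f) return null              // ray points away from the plane
    val t = (planeZ - origin.z) / dir.z
    return Pair(origin.x + t * dir.x, origin.y + t * dir.y)
}
```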
The head-mounted display device 200 may be provided with some physical buttons to receive user input, for example a button for turning the screen on and off, a button for adjusting screen brightness, and a button for switching between spatial mode and mirror mode. These user inputs can be transmitted to the electronic device 100 through the wired or wireless communication connection between the head-mounted display device 200 and the electronic device 100, which then triggers the electronic device 100 to respond. For example, in response to a user input switching from spatial mode to mirror mode, the electronic device 100 may stop transmitting spatial-mode display data to the head-mounted display device 200 and start transmitting mirror-mode display data. The mirror-mode display data is mainly the screen stream of the electronic device 100 and can be provided by a screen-mirroring application on the electronic device 100. The spatial-mode display data can be provided by a specific application on the electronic device 100, such as a video application, a game application, a music application, or a desktop application.
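A minimal sketch of the mode switch just described, assuming the phone maintains two frame sources and the head-mounted display merely reports the button press over the existing connection; all interfaces and names are illustrative:

```kotlin
// Hedged sketch of the spatial/mirror mode switch; the frame-source interfaces and
// the simplified streaming step are assumptions, not details from the patent.

enum class DisplayMode { SPATIAL, MIRROR }

interface FrameSource { fun nextFrame(): ByteArray }

class DisplayModeController(
    private val spatialSource: FrameSource,   // frames from the video/game/desktop app
    private val mirrorSource: FrameSource,    // the phone's own screen stream
    private val sendToHmd: (ByteArray) -> Unit
) {
    @Volatile private var mode = DisplayMode.SPATIAL

    // Called when the HMD forwards the physical-button press over the wired or
    // wireless connection.
    fun onModeSwitchRequested(newMode: DisplayMode) {
        mode = newMode
    }

    // Streaming step: always pull from the source that matches the current mode,
    // so switching modes stops one stream and starts the other.
    fun streamOnce() {
        val source = if (mode == DisplayMode.SPATIAL) spatialSource else mirrorSource
        sendToHmd(source.nextFrame())
    }
}
```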
After seeing the image displayed by the head-mounted display device 200, the user can control the content displayed in the virtual space, as well as the working state of the head-mounted display device 200 (for example, its on/off state and screen brightness), by entering user operations on the electronic device 100 or on the head-mounted display device 200.
Figure 2 schematically shows the hardware architecture of the electronic device 100 provided by an embodiment of this application. As shown in Figure 2, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, a camera 191, and a display screen 192. The sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and so on.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. Different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate operation control signals based on instruction operation codes and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory can hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The charging management module 140 is used to receive charging input from a charger. The charger may be a wireless charger or a wired charger.
The power management module 141 is used to connect the battery 142, the charging management module 140, and the processor 110.
The wireless communication function of the electronic device 100 can be implemented through antenna 1, antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.
Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single communication band or multiple communication bands. Different antennas can also be multiplexed to improve antenna utilization. For example, antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In some other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 can provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on. The mobile communication module 150 can receive electromagnetic waves through antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation through antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 and at least some modules of the processor 110 may be provided in the same device.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be transmitted into a medium-to-high-frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal and then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A and the receiver 170B) or displays an image or video through the display screen 192. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technologies. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive signals to be transmitted from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves for radiation through antenna 2.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions through the GPU, the display screen 192, the application processor, and so on. The GPU is a microprocessor for image processing and connects the display screen 192 and the application processor. The GPU performs mathematical and geometric computations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 192 is used to display images, videos, and so on. The display screen 192 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and so on. In some embodiments, the electronic device 100 may include 1 or N display screens 192, where N is a positive integer greater than 1.
The pressure sensor is used to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor may be disposed on the display screen 192. There are many types of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates of conductive material. When a force acts on the pressure sensor, the capacitance between the electrodes changes, and the electronic device 100 determines the intensity of the pressure based on the change in capacitance. When a touch operation acts on the display screen 192, the electronic device 100 detects the strength of the touch operation through the pressure sensor. The electronic device 100 can also calculate the touch position based on the detection signal of the pressure sensor.
The touch sensor is also called a "touch panel". The touch sensor may be disposed on the display screen 192; the touch sensor and the display screen 192 together form a touchscreen, also called a "touch screen". The touch sensor is used to detect touch operations on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 192.
The electronic device 100 can detect gestures made by the user on the display screen 192 through the pressure sensor or the touch sensor, for example a swipe-up gesture, a swipe-down gesture, a swipe-left gesture, a swipe-right gesture, a tap gesture, a long-press gesture, or a knuckle-tap gesture.
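As an illustration of how such touch input might be classified into the gestures listed above, the following sketch maps the displacement and duration between touch-down and touch-up to a gesture type. The thresholds and the single-pointer model are arbitrary example assumptions, not values from the patent.

```kotlin
// Hedged sketch of classifying a single-pointer touch into basic gestures.

import kotlin.math.abs

data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

enum class Gesture { SWIPE_UP, SWIPE_DOWN, SWIPE_LEFT, SWIPE_RIGHT, TAP, LONG_PRESS }

fun classify(down: TouchSample, up: TouchSample): Gesture {
    val dx = up.x - down.x
    val dy = up.y - down.y
    val durationMs = up.timeMs - down.timeMs
    val moved = maxOf(abs(dx), abs(dy)) > 50f           // example movement threshold (px)
    return when {
        !moved && durationMs < 300 -> Gesture.TAP
        !moved                     -> Gesture.LONG_PRESS
        abs(dx) > abs(dy)          -> if (dx > 0) Gesture.SWIPE_RIGHT else Gesture.SWIPE_LEFT
        dy > 0                     -> Gesture.SWIPE_DOWN // screen y grows downward
        else                       -> Gesture.SWIPE_UP
    }
}
```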
The electronic device 100 can implement the shooting function through the ISP, the camera 191, the video codec, the GPU, the display screen 192, the application processor, and so on.
The ISP is used to process the data fed back by the camera 191. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye.
The camera 191 is used to capture still images or video. An object passes through the lens to produce an optical image, which is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then passes it to the ISP, which converts it into a digital image signal. In some embodiments, the electronic device 100 may include 1 or N cameras 191, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transmission pattern between neurons in the human brain, it can quickly process input information and can also continuously self-learn.
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
The internal memory 121 can be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system, the application programs required for at least one function (such as a sound playback function or an image playback function), and so on. The data storage area can store data created during use of the electronic device 100 (such as audio data and a phone book). In addition, the internal memory 121 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or universal flash storage (UFS). The processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor.
The electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, and the application processor. The audio module 170 is used to convert digital audio information into an analog audio signal for output and to convert analog audio input into a digital audio signal. The audio module 170 can also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110, or some functional modules of the audio module 170 may be provided in the processor 110.
The internal memory 121 can store the application programs of one or more applications, and these application programs include instructions. When an application program is executed by the processor 110, the electronic device 100 generates content for presentation to the user. For example, the applications may include an application for managing the head-mounted display device 200, a game application, a conferencing application, a video application, a desktop application, or other applications.
The GPU can perform mathematical and geometric operations based on the data obtained from the processor 110 (for example, data provided by an application program), render images using computer graphics technology, computer simulation technology, and the like, and determine the images to be displayed on the head-mounted display device 200. In some embodiments, the GPU can add correction or pre-distortion to the image rendering process to compensate for or correct the distortion caused by the optical components of the head-mounted display device 200.
In the embodiments of this application, the electronic device 100 can send the image processed by the GPU to the head-mounted display device 200 through the mobile communication module 150, the wireless communication module 160, or a wired interface.
It should be noted that the structure shown in Figure 2 does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or use a different component arrangement. The illustrated components can be implemented in hardware, software, or a combination of software and hardware.
FIG. 3A exemplarily shows the optical structure of the head-mounted display device 200 provided in an embodiment of the present application. As shown in FIG. 3A, the head-mounted display device 200 may include a display screen 201, an optical assembly 202, a display screen 203, and an optical assembly 204. The display screen 201 and the display screen 203 may be a single unit, that is, the left and right parts of one whole screen. The optical assembly 202 and the optical assembly 204 are the same in material, structure, and the like. Each of the optical assembly 202 and the optical assembly 204 may consist of one or more lenses, and the lenses may include one or more of convex lenses, Fresnel lenses, or other types of lenses.
The display screen 201 and the optical assembly 202 correspond to the user's left eye. When the user wears the head-mounted display device 200, an image a1 may be displayed on the display screen 201. Light emitted when the display screen 201 displays the image a1 is transmitted through the optical assembly 202 and forms a virtual image a1' of the image a1 in front of the user's left eye.
The display screen 203 and the optical assembly 204 correspond to the user's right eye. When the user wears the head-mounted display device, an image a2 may be displayed on the display screen 203. Light emitted when the display screen 203 displays the image a2 is transmitted through the optical assembly 204 and forms a virtual image a2' of the image a2 in front of the user's right eye.
The image a1 and the image a2 are two images, with parallax, of a same object, for example, an object a. Parallax is the difference in the apparent position of an object in the field of view when the same object is observed from two points separated by a certain distance. The virtual image a1' and the virtual image a2' are located on a same plane, which may be referred to as a virtual image plane.
When wearing the head-mounted display device 200, the user's left eye focuses on the virtual image a1', and the user's right eye focuses on the virtual image a2'. The virtual image a1' and the virtual image a2' are then superimposed in the user's brain into one complete, stereoscopic image; this process is called vergence. During vergence, the point at which the lines of sight of the two eyes converge is perceived by the user as the actual location of the object depicted by the image a1 and the image a2. Owing to the vergence process, the user can perceive the 3D scene provided by the head-mounted display device 200.
FIG. 3B exemplarily shows the hardware architecture of the head-mounted display device 200 provided in an embodiment of the present application. As shown in FIG. 3B, the head-mounted display device 200 may include a processor 210, a memory 211, a communication module 212, a sensor system 213, a camera 214, a display apparatus 215, and an audio apparatus 216. The foregoing components may be coupled to and communicate with each other.
It may be understood that the structure shown in FIG. 3B does not constitute a specific limitation on the head-mounted display device 200. In other embodiments of the present application, the head-mounted display device 200 may include more or fewer components than shown in the figure, some components may be combined, some components may be split, or the components may be arranged differently. For example, the head-mounted display device 200 may further include physical buttons such as a power key, volume keys, and screen brightness adjustment keys, as well as various interfaces such as a USB interface. The illustrated components may be implemented by hardware, software, or a combination of software and hardware.
The processor 210 may include one or more processing units. For example, the processor may include an AP, a modem processor, a GPU, an ISP, a controller, a video codec, a DSP, a baseband processor, and/or an NPU. Different processing units may be independent devices or may be integrated into one or more processors. The controller may generate operation control signals based on instruction operation codes and timing signals to control instruction fetching and execution, so that each component performs its corresponding function, for example, human-computer interaction, motion tracking/prediction, rendering and display, and audio processing.
The memory 211 may store executable instructions. The memory 211 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (for example, a sound playing function or an image playing function), and the like. The data storage area may store data (for example, audio data) created during use of the head-mounted display device 200. In addition, the memory 211 may include a high-speed random access memory, and may further include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 210 runs the instructions stored in the memory 211 and/or the instructions stored in a memory disposed in the processor, to perform the various function applications and data processing of the head-mounted display device 200.
The communication module 212 may include a mobile communication module and a wireless communication module. The mobile communication module may provide wireless communication solutions applied to the head-mounted display device 200, including 2G/3G/4G/5G. The wireless communication module may provide wireless communication solutions applied to the head-mounted display device 200, including wireless local area networks (WLAN) (for example, a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology. The wireless communication module may be one or more devices integrating at least one communication processing module.
The sensor system 213 may include an accelerometer, a compass, a gyroscope, a magnetometer, other sensors for detecting motion, or the like. The sensor system 213 is used to collect corresponding data; for example, an acceleration sensor collects the acceleration of the head-mounted display device 200, and a gyroscope sensor collects the movement speed of the head-mounted display device 200. The data collected by the sensor system 213 can reflect the head movement of the user wearing the head-mounted display device 200. In some embodiments, the sensor system 213 may be an inertial measurement unit (IMU) disposed in the head-mounted display device 200. In some embodiments, the head-mounted display device 200 may send the data acquired by the sensor system to the processor 210 for analysis. The user may trigger the head-mounted display device 200 to perform a corresponding function by inputting a head movement operation on the head-mounted display device 200. The head movement may include whether the head rotates, the direction of rotation, and the like.
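Purely for illustration, and assuming an Android-style sensor stack on the device, the following Kotlin sketch shows one way head-rotation samples could be collected and handed to an analysis callback. The class name HeadMotionTracker and the callback wiring are assumptions made for this example, not part of this application.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Minimal sketch: read gyroscope samples that reflect the wearer's head rotation
// and hand them to an analysis callback (standing in for the processor 210 analysis).
class HeadMotionTracker(context: Context, private val onSample: (FloatArray) -> Unit) :
    SensorEventListener {

    private val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val gyro: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)

    fun start() {
        gyro?.let { sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME) }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // event.values holds the angular speed (rad/s) around the x, y and z axes,
        // from which whether and in which direction the head rotates can be analyzed.
        onSample(event.values.clone())
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```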
The sensor system 213 may further include an optical sensor, which works with the camera 214 to track the position of the user's eyes and capture eye movement data. The eye movement data may be used, for example, to determine the user's interpupillary distance, the 3D position of each eye relative to the head-mounted display device 200, the magnitude of the torsion and rotation (that is, roll, pitch, and yaw) of each eye, and the gaze direction. In one example, infrared light is emitted inside the head-mounted display device 200 and reflected from each eye; the reflected light is detected by the camera 214 or the optical sensor, and the detected data is transmitted to the processor 210, so that the processor 210 analyzes the position, pupil diameter, movement state, and the like of the user's eyes from the changes in the infrared light reflected by each eye.
The camera 214 may be used to capture still images or video. The still images or video may be outward-facing images or video of the user's surroundings, or inward-facing images or video. The camera 214 may track the movement of one or both of the user's eyes. The camera 214 includes, but is not limited to, a conventional color camera (RGB camera), a depth camera (RGB depth camera), and a dynamic vision sensor (DVS) camera. A depth camera can obtain depth information of the photographed object. In some embodiments, the camera 214 may be used to capture images of the user's eyes and send the images to the processor 210 for analysis. The processor 210 may determine the state of the user's eyes based on the images collected by the camera 214, and perform a corresponding function according to the state of the user's eyes. In other words, the user may trigger the head-mounted display device 200 to perform a corresponding function by inputting an eye movement operation on the head-mounted display device 200. The state of the user's eyes may include whether the eyes rotate, the direction of rotation, whether the eyes have not moved for a long time, the angle at which they look toward the outside world, and the like.
The audio apparatus 216 is used to collect and output audio. The audio apparatus 216 may include, but is not limited to, a microphone, a speaker, and earphones.
The head-mounted display device 200 presents or displays images through the GPU, the display apparatus 215, the application processor, and the like.
The GPU is a microprocessor for image processing and is connected to the display apparatus 215 and the application processor. The processor 210 may include one or more GPUs that execute program instructions to generate or change display information. The GPU is configured to perform mathematical and geometric calculations based on data obtained from the processor 210, and to render images by using computer graphics technology, computer simulation technology, and the like, to provide content for display on the display apparatus 215. The GPU is further configured to add correction or pre-distortion to the image rendering process to compensate for or correct distortion caused by the optical components in the display apparatus 215. The GPU may also adjust the content provided to the display apparatus 215 based on data from the sensor system 213. For example, the GPU may add depth-of-field information to the content provided to the display apparatus 215 based on the 3D positions of the user's eyes, the interpupillary distance, and the like.
The display apparatus 215 may include one or more display screens and one or more optical assemblies. For the structure of the display screens and the optical assemblies and the positional relationship between them, refer to the related description in FIG. 3A. The display screen may include a display panel, and the display panel may be used to display images to present a stereoscopic virtual scene to the user. The display panel may be an LCD, OLED, AMOLED, FLED, Mini-LED, Micro-LED, Micro-OLED, QLED, or the like. The optical assembly may be used to guide the light from the display screen to the exit pupil for perception by the user. In some implementations, one or more optical elements (for example, lenses) in the optical assembly may have one or more coatings, such as an anti-reflective coating. The magnification of the image light by the optical assembly allows the display screen to be physically smaller and lighter and to consume less power. In addition, the magnification of the image light can increase the field of view of the content displayed on the display screen. For example, the optical assembly may make the field of view of the displayed content cover the user's entire field of view.
In this embodiment of the present application, the display screen in the head-mounted display device 200 may be used to display the display data transmitted by the electronic device 100, providing the user with an XR experience.
The head-mounted display device 200 can display images in the virtual space so that the user perceives a 3D scene, providing the user with an AR/VR/MR experience. For example, as shown in FIG. 4A, the virtual space of the head-mounted display device 200 displays a desktop interface, on which desktop icons of various applications may be displayed, for example, the desktop icons of applications such as "Contacts", "Calendar", "Weather", "Video", "Time", "Shopping", "Game", "Short Video", and "Music". The desktop interface is not limited to applications and may also include objects such as files and folders. In some virtual spaces, the user can operate the displayed objects in the virtual space by moving a ray in the virtual space.
The electronic device 100 may serve as an input device of the head-mounted display device 200, for example, a mobile phone serving as the input device of AR glasses; the user can interact with the head-mounted display device 200 by operating on the electronic device 100. Take the electronic device 100 being a mobile phone and the head-mounted display device 200 being AR glasses as an example. Referring to FIG. 4A, an AR application is installed on the mobile phone 100, and the AR application is used to manage the virtual space of the AR glasses 200. In response to the user tapping the application icon of the AR application, the mobile phone 100 starts the AR application and displays the touchpad interface 101. After the AR glasses 200 display the virtual space, the user can use the AR application to interact with the virtual space of the AR glasses 200. Optionally, the user can control the pointing position of the ray in the virtual space by moving the position and orientation of the mobile phone 100, thereby interacting with the objects displayed in the virtual space. Optionally, various virtual buttons are provided on the touchpad interface 101, and the user can interact with the objects displayed in the virtual space by operating the virtual buttons. For example, the touchpad interface 101 includes a touch area 102; the user can slide up, down, left, or right in the touch area 102 (perform a sliding gesture) to move the ray in the virtual space up, down, left, or right accordingly, and can also perform operations such as tapping and long-pressing in the touch area 102 (perform a tap gesture or a long-press gesture) to interact with the objects displayed in the virtual space.
For example, referring to FIG. 4A, the user can slide up, down, left, or right in the touch area 102 to move the ray in the virtual space until the ray points to the desktop icon of the short video application on the desktop interface; the user can then perform a tap operation in the touch area 102 to select the desktop icon of the short video application. In response to the tap operation, the head-mounted display device 200 displays the application interface of the short video application in the virtual space.
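As a rough, non-authoritative sketch of how the touch area 102 described above might be wired up on the phone side, the following Kotlin fragment maps sliding to ray movement and tapping to selection. GlassesChannel and TouchAreaController are hypothetical names introduced only for this example.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Hypothetical transport that forwards interaction events to the glasses
// (for example over the existing wireless or wired connection).
interface GlassesChannel {
    fun moveRay(dx: Float, dy: Float)   // move the ray in the virtual space
    fun selectAtRay()                   // select the object the ray points at
}

class TouchAreaController(context: Context, private val channel: GlassesChannel) :
    View.OnTouchListener {

    private val detector = GestureDetector(context,
        object : GestureDetector.SimpleOnGestureListener() {
            override fun onScroll(e1: MotionEvent?, e2: MotionEvent,
                                  distanceX: Float, distanceY: Float): Boolean {
                // Sliding in the touch area 102 moves the ray accordingly.
                channel.moveRay(-distanceX, -distanceY)
                return true
            }

            override fun onSingleTapUp(e: MotionEvent): Boolean {
                // Tapping selects the object under the ray, for example the
                // desktop icon of the short video application.
                channel.selectAtRay()
                return true
            }
        })

    override fun onTouch(v: View, event: MotionEvent): Boolean =
        detector.onTouchEvent(event)
}
```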
Generally speaking, interacting with the virtual space of the head-mounted display device 200 by inputting gestures on the electronic device 100 is convenient for the user and provides a good user experience, for example, performing sliding gestures, tap gestures, and long-press gestures in the touch area 102 as in the foregoing implementation. During human-computer interaction, in addition to the foregoing operations, the user also needs to perform operations such as taking a screenshot, recording the screen, and returning to the previous level in the virtual space of the head-mounted display device 200.
Electronic devices with a touchscreen, such as mobile phones, support a relatively rich set of gesture operations. For example, a screenshot of the mobile phone's display interface can be taken by performing a multi-finger slide-down gesture on the touchscreen, or by performing a knuckle double-tap gesture on the touchscreen; and the mobile phone's display interface can be returned to the previous level by performing a leftward sliding gesture on the touchscreen.
In the prior art, to avoid conflicts with the system gestures of the electronic device 100 itself (for example, the multi-finger slide-down gesture for taking a screenshot, the knuckle double-tap gesture for taking a screenshot, and the leftward sliding gesture for returning to the previous level), when the user needs to take a screenshot, record the screen, or return to the previous level in the virtual space of the head-mounted display device 200, these operations are not implemented by gestures on the electronic device 100 but by means such as virtual buttons.
For example, in one implementation, the user taps virtual buttons on the electronic device 100 to take a screenshot, record the screen, return to the previous level, and so on in the virtual space of the head-mounted display device 200. For example, referring to FIG. 4B, the touchpad interface 101 of the electronic device 100 includes a "Home" button 103, and the user can tap the "Home" button 103 to switch the display interface of the virtual space to the desktop interface. The touchpad interface 101 includes a "Back" button 104, and the user can tap the "Back" button 104 to return the display interface of the virtual space to the previous level. The touchpad interface 101 includes a "Screenshot" button 105, and the user can tap the "Screenshot" button 105 to take a screenshot of the display interface of the virtual space. The touchpad interface 101 includes a "Record" button 106, and the user can tap the "Record" button 106 to record the display interface of the virtual space. When the virtual buttons on the electronic device 100 are used to take a screenshot, record the screen, or return to the previous level in the virtual space, the user needs to move their line of sight away from the head-mounted display device 200 to the touchpad interface of the electronic device 100 to find the corresponding virtual button. During this shift of gaze, because both eyes need to refocus to see the two devices clearly, the user experiences noticeable discomfort, and the user experience is poor.
An embodiment of the present application provides a human-computer interaction method. When the electronic device serves as the input device of the head-mounted display device, the user is supported in performing system gestures on the touchscreen of the electronic device to take a screenshot of, record, or return to the previous level of the display interface of the virtual space of the head-mounted display device. This simplifies the user's operation flow and improves the user experience.
Below, taking the electronic device being a mobile phone and the head-mounted display device being AR glasses as an example, the human-computer interaction method provided in the embodiments of the present application is described in detail with reference to the accompanying drawings.
An AR application is installed on the mobile phone, and the user can start the AR application on the mobile phone to manage the virtual space of the AR glasses. After the mobile phone starts the AR application and displays the application interface of the AR application, if the mobile phone receives a system gesture from the user on the mobile phone's touchscreen, for example, a screenshot gesture, a screen recording gesture, or a return-to-previous-level gesture, the system gesture acts on the virtual space of the AR glasses rather than on the mobile phone. Specifically, after the mobile phone starts the AR application and displays the application interface of the AR application: if a screenshot gesture is received on the mobile phone's touchscreen, a screenshot is taken of the display interface of the virtual space of the AR glasses rather than of the mobile phone's display interface; if a screen recording gesture is received, the display interface of the virtual space of the AR glasses is recorded rather than the mobile phone's display interface; and if a return-to-previous-level gesture is received, the return operation is applied to the display interface of the virtual space of the AR glasses rather than to the mobile phone's display interface. In this way, while viewing the picture in the virtual space, the user only needs to make a system gesture on the mobile phone's touchscreen to take a screenshot of, record, or return to the previous level of the display interface of the virtual space of the AR glasses. The user's line of sight does not need to leave the virtual space, which avoids the gaze jumping between the virtual space and the mobile phone screen. Moreover, gesture operation is convenient and fast, the operation cost for the user is low, and the experience is good.
When the mobile phone exits the AR application, or the AR application is moved to the background, the mobile phone no longer displays the application interface of the AR application. For example, the mobile phone may display the desktop interface, or may display the application interface of another application (for example, the application interface of a video application or a chat application). If the mobile phone receives a system gesture from the user on the mobile phone's touchscreen, for example, a screenshot gesture, a screen recording gesture, or a return-to-previous-level gesture, the system gesture acts on the mobile phone. Specifically, when the mobile phone does not display the application interface of the AR application: if a screenshot gesture is received on the mobile phone's touchscreen, a screenshot of the mobile phone's display interface is taken; if a screen recording gesture is received, the mobile phone's display interface is recorded; and if a return-to-previous-level gesture is received, the return operation is applied to the mobile phone's display interface. In this way, the user can make system gestures on the mobile phone's touchscreen to conveniently take a screenshot of, record, or return to the previous level of the mobile phone's display interface.
FIG. 5 to FIG. 10 are schematic diagrams of some example scenarios in which the mobile phone receives a system gesture. It should be noted that the specific gestures shown in FIG. 5 to FIG. 10 are merely examples for the corresponding scenarios and are used to schematically illustrate the system gestures. In other implementations, the system gestures such as the screenshot gesture, the screen recording gesture, and the return-to-previous-level gesture may take other specific forms. For example, the screenshot gesture may be a multi-finger downward slide or a multi-finger upward slide on the screen, or a single-knuckle or double-knuckle double tap on the screen. The screen recording gesture may be a multi-finger upward slide or a multi-finger downward slide on the screen, or a single-knuckle or double-knuckle double tap on the screen. It can be understood that the screenshot gesture is different from the screen recording gesture. The return-to-previous-level gesture may be a single finger sliding rightward from the left edge of the screen, a single finger sliding leftward from the right edge of the screen, or fingers sliding inward from both sides of the screen.
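For illustration only, the following Kotlin sketch shows one way the three system gestures used in the following scenarios (multi-finger slide down for screenshot, multi-finger slide up for screen recording, single finger from the left edge for back) could be told apart from raw touch events. The thresholds and the SystemGesture enum are assumptions rather than values defined by this application.

```kotlin
import android.view.MotionEvent

enum class SystemGesture { SCREENSHOT, SCREEN_RECORD, BACK, NONE }

class SystemGestureClassifier(
    private val edgeWidthPx: Int = 40,     // assumed edge zone for the back gesture
    private val minSwipePx: Float = 200f   // assumed minimum travel for a swipe
) {
    private var downX = 0f
    private var downY = 0f
    private var maxPointers = 1

    fun onTouchEvent(event: MotionEvent): SystemGesture {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downX = event.x; downY = event.y; maxPointers = 1
            }
            MotionEvent.ACTION_POINTER_DOWN ->
                maxPointers = maxOf(maxPointers, event.pointerCount)
            MotionEvent.ACTION_UP -> {
                val dx = event.x - downX
                val dy = event.y - downY
                return when {
                    // Multi-finger slide down: screenshot gesture.
                    maxPointers >= 3 && dy > minSwipePx -> SystemGesture.SCREENSHOT
                    // Multi-finger slide up: screen recording gesture.
                    maxPointers >= 3 && dy < -minSwipePx -> SystemGesture.SCREEN_RECORD
                    // Single finger from the left edge to the right: back gesture.
                    maxPointers == 1 && downX < edgeWidthPx && dx > minSwipePx ->
                        SystemGesture.BACK
                    else -> SystemGesture.NONE
                }
            }
        }
        return SystemGesture.NONE
    }
}
```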
Taking the screenshot gesture being a multi-finger downward slide on the screen as an example, FIG. 5 is a schematic diagram of an example scenario in which a screenshot gesture is received while the mobile phone is displaying the application interface of the AR application.
As shown in FIG. 5, after starting the AR application, the mobile phone 100 displays the touchpad interface 101, and the virtual space of the AR glasses 200 displays the application interface of the short video application. The user makes a screenshot gesture on the touchscreen of the mobile phone 100; for example, as shown in FIG. 5, the user slides down on the screen of the mobile phone 100 with three fingers. After receiving the screenshot gesture, the mobile phone 100 triggers the generation of a screenshot picture of the display interface of the virtual space of the AR glasses 200. For example, a screenshot of the application interface of the short video application is taken, and a screenshot picture is generated. Further, the mobile phone 100 may also save the generated screenshot picture.
In one implementation, after receiving the screenshot gesture, the mobile phone 100 determines that the interface currently displayed on the mobile phone 100 is the application interface of the AR application (for example, the touchpad interface 101). The mobile phone 100 obtains the image of the display interface of the virtual space of the AR glasses 200, takes a screenshot of the image, and generates a screenshot picture of the display interface of the virtual space.
In another implementation, after receiving the screenshot gesture, the mobile phone 100 determines that the interface currently displayed on the mobile phone 100 is the application interface of the AR application (for example, the touchpad interface 101). The mobile phone 100 notifies the AR glasses 200 to take a screenshot of the image of the display interface of the virtual space. In one example, the mobile phone 100 generates a screenshot event according to the screenshot gesture and dispatches the screenshot event to the AR glasses 200. In another example, the mobile phone 100 sends a first instruction message to the AR glasses 200, and the first instruction message is used to instruct the AR glasses 200 to take a screenshot of the display interface of the virtual space. After receiving the screenshot event or the first instruction message, the AR glasses 200 take a screenshot of the image of the display interface of the virtual space and generate a screenshot picture of the display interface of the virtual space. The AR glasses 200 may save the generated screenshot picture; further, the AR glasses 200 may also send the generated screenshot picture to the mobile phone 100, and the mobile phone 100 may also save the screenshot picture.
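A minimal Kotlin sketch of the first screenshot implementation, assuming the phone composes the virtual-space interface into a VirtualDisplay backed by an ImageReader (consistent with the display1/display2 description later in this document); the class name and display name are illustrative assumptions, not part of this application.

```kotlin
import android.graphics.Bitmap
import android.graphics.PixelFormat
import android.hardware.display.DisplayManager
import android.media.ImageReader

// Assumed setup: the AR application renders the virtual-space interface into a
// VirtualDisplay whose Surface comes from this ImageReader.
class VirtualSpaceCapturer(displayManager: DisplayManager, width: Int, height: Int, dpi: Int) {

    private val reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2)

    private val virtualDisplay = displayManager.createVirtualDisplay(
        "ar_virtual_space",            // hypothetical display name
        width, height, dpi,
        reader.surface,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION
    )

    // Called when the screenshot gesture is recognized while the AR application is
    // in the foreground: grab the latest frame of the virtual-space interface.
    fun takeScreenshot(): Bitmap? {
        val image = reader.acquireLatestImage() ?: return null
        return try {
            val plane = image.planes[0]
            val rowPaddingPx = (plane.rowStride - plane.pixelStride * image.width) / plane.pixelStride
            val full = Bitmap.createBitmap(
                image.width + rowPaddingPx, image.height, Bitmap.Config.ARGB_8888
            )
            full.copyPixelsFromBuffer(plane.buffer)
            // Crop away the row padding so the result matches the display size.
            Bitmap.createBitmap(full, 0, 0, image.width, image.height)
        } finally {
            image.close()
        }
    }
}
```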
In this scenario, the AR application on the mobile phone runs in the foreground, and the mobile phone displays the application interface of the AR application. When the mobile phone receives the user's screenshot gesture, a screenshot of the display interface of the virtual space of the AR glasses is taken.
Taking the screenshot gesture being a multi-finger downward slide on the screen as an example, FIG. 6 is a schematic diagram of an example scenario in which a screenshot gesture is received while the mobile phone is not displaying the application interface of the AR application.
As shown in FIG. 6, the AR application is not started on the mobile phone 100, and the mobile phone 100 displays the desktop interface. It can be understood that the mobile phone 100 may also display the application interface of another application, such as the application interface of a video application or a chat application. The virtual space of the AR glasses 200 displays the application interface of the video application. The user makes a screenshot gesture on the touchscreen of the mobile phone 100; for example, as shown in FIG. 6, the user slides down with three fingers. After receiving the screenshot gesture, the mobile phone 100 generates a screenshot picture of the mobile phone's display interface, for example, a screenshot picture of the mobile phone's desktop interface. Further, the mobile phone 100 may save the generated screenshot picture.
In this scenario, the AR application is not running on the mobile phone or is running in the background, and the mobile phone does not display the application interface of the AR application. When the mobile phone receives the user's screenshot gesture, a screenshot of the mobile phone's display interface is taken.
Taking the screen recording gesture being a multi-finger upward slide on the screen as an example, FIG. 7 is a schematic diagram of an example scenario in which a screen recording gesture is received while the mobile phone is displaying the application interface of the AR application.
As shown in FIG. 7, after starting the AR application, the mobile phone 100 displays the touchpad interface 101, and the virtual space of the AR glasses 200 displays the application interface of the short video application. The user makes a screen recording gesture on the touchscreen of the mobile phone 100; for example, as shown in FIG. 7, the user slides up on the screen of the mobile phone 100 with three fingers. After receiving the screen recording gesture, the mobile phone 100 triggers screen recording of the display interface of the virtual space of the AR glasses 200 and generates a screen recording file. For example, the application interface of the short video application is recorded, and a screen recording file is generated. Further, the mobile phone 100 may also save the generated screen recording file.
In one implementation, after receiving the screen recording gesture, the mobile phone 100 determines that the interface currently displayed on the mobile phone 100 is the application interface of the AR application (for example, the touchpad interface 101). The mobile phone 100 records the display interface of the virtual space of the AR glasses 200 and generates a screen recording file.
In another implementation, after receiving the screen recording gesture, the mobile phone 100 determines that the interface currently displayed on the mobile phone 100 is the application interface of the AR application (for example, the touchpad interface 101). The mobile phone 100 notifies the AR glasses 200 to record the display interface of the virtual space. In one example, the mobile phone 100 generates a screen recording event according to the screen recording gesture and dispatches the screen recording event to the AR glasses 200. In another example, the mobile phone 100 sends a second instruction message to the AR glasses 200, and the second instruction message is used to instruct the AR glasses 200 to record the display interface of the virtual space. After receiving the screen recording event or the second instruction message, the AR glasses 200 record the display interface of the virtual space and generate a screen recording file of the display interface of the virtual space. The AR glasses 200 may save the generated screen recording file; further, the AR glasses 200 may also send the generated screen recording file to the mobile phone 100, and the mobile phone 100 may also save the screen recording file.
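A minimal sketch of the first recording implementation, under the same assumption that the virtual-space interface is composed on the phone into a VirtualDisplay: while recording, the display's output surface is redirected into a MediaRecorder. The frame rate, bit rate, and class name are illustrative assumptions.

```kotlin
import android.hardware.display.VirtualDisplay
import android.media.MediaRecorder

// Assumed setup: `virtualDisplay` is the display on which the phone composes the
// virtual-space interface (see the screenshot sketch above).
class VirtualSpaceRecorder(private val virtualDisplay: VirtualDisplay) {

    private var recorder: MediaRecorder? = null

    fun start(outputPath: String, width: Int, height: Int) {
        recorder = MediaRecorder().apply {
            setVideoSource(MediaRecorder.VideoSource.SURFACE)
            setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
            setVideoEncoder(MediaRecorder.VideoEncoder.H264)
            setVideoSize(width, height)
            setVideoFrameRate(30)                 // assumed frame rate
            setVideoEncodingBitRate(8_000_000)    // assumed bit rate
            setOutputFile(outputPath)
            prepare()
        }
        // Route the virtual-space frames into the encoder while recording.
        virtualDisplay.setSurface(recorder!!.surface)
        recorder!!.start()
    }

    fun stop() {
        recorder?.apply { stop(); reset(); release() }
        recorder = null
    }
}
```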
In this scenario, the AR application on the mobile phone runs in the foreground, and the mobile phone displays the application interface of the AR application. When the mobile phone receives the user's screen recording gesture, the display interface of the virtual space of the AR glasses is recorded.
Taking the screen recording gesture being a multi-finger upward slide on the screen as an example, FIG. 8 is a schematic diagram of an example scenario in which a screen recording gesture is received while the mobile phone is not displaying the application interface of the AR application.
As shown in FIG. 8, the AR application is not started on the mobile phone 100; the mobile phone 100 starts a video call application and displays the application interface of the video call application. It can be understood that the mobile phone 100 may also display the application interface of another application. The virtual space of the AR glasses 200 displays the application interface of the video application. The user makes a screen recording gesture on the touchscreen of the mobile phone 100; for example, as shown in FIG. 8, the user slides up on the screen of the mobile phone 100 with three fingers. After receiving the screen recording gesture, the mobile phone 100 starts recording the mobile phone's display interface (the application interface of the video call application) and generates a screen recording file of the mobile phone's display interface, for example, a screen recording file of the application interface of the video call application. Further, the mobile phone 100 may save the generated screen recording file.
In this scenario, the AR application is not running on the mobile phone or is running in the background, and the mobile phone does not display the application interface of the AR application. When the mobile phone receives the user's screen recording gesture, the mobile phone's display interface is recorded.
Taking the return-to-previous-level gesture being a single finger sliding rightward from the left edge of the screen as an example, FIG. 9 is a schematic diagram of an example scenario in which a return-to-previous-level gesture is received while the mobile phone is displaying the application interface of the AR application.
As shown in FIG. 9, after starting the AR application, the mobile phone 100 displays the touchpad interface 101, and the virtual space of the AR glasses 200 displays the application interface of the short video application. The user makes a return-to-previous-level gesture on the touchscreen of the mobile phone 100; for example, as shown in FIG. 9, the user slides a single finger from the left edge of the screen of the mobile phone 100 to the right. After receiving the return-to-previous-level gesture, the mobile phone 100 triggers the virtual space of the AR glasses 200 to return to the previous-level display interface. For example, in response to the mobile phone 100 receiving the return-to-previous-level gesture, the virtual space of the AR glasses 200 displays the desktop interface.
In one implementation, after receiving the return-to-previous-level gesture, the mobile phone 100 determines that the interface currently displayed on the mobile phone 100 is the application interface of the AR application (for example, the touchpad interface 101). Based on the interface currently displayed in the virtual space of the AR glasses 200, the mobile phone 100 generates the display data of the previous-level display interface of that interface and sends the display data of the previous-level display interface to the AR glasses 200. The AR glasses 200 display in the virtual space according to the display data of the previous-level display interface.
In another implementation, after receiving the return-to-previous-level gesture, the mobile phone 100 determines that the interface currently displayed on the mobile phone 100 is the application interface of the AR application (for example, the touchpad interface 101). The mobile phone 100 notifies the AR glasses 200 to return to the previous-level display interface. In one example, the mobile phone 100 generates a return-to-previous-level event according to the return-to-previous-level gesture and dispatches the event to the AR glasses 200. In another example, the mobile phone 100 sends a third instruction message to the AR glasses 200, and the third instruction message is used to instruct the AR glasses 200 to return to the previous-level display interface. After receiving the return-to-previous-level event or the third instruction message, the AR glasses 200 display the previous-level display interface of the current display interface.
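One hypothetical way to trigger the phone side to produce the previous-level interface, assuming the interface shown in the virtual space is an activity of the phone running on the secondary display (application 2 on display2 in the later figures), is to apply the normal back handling of that activity. VirtualSpaceStack is an illustrative helper introduced only for this sketch, not part of this application.

```kotlin
import androidx.activity.ComponentActivity

// Hypothetical registry that tracks the activity currently shown on the
// secondary display driving the virtual space (display2 in the later figures).
object VirtualSpaceStack {
    var topActivity: ComponentActivity? = null
}

// Called when the return-to-previous-level gesture is recognized while the AR
// application is in the foreground: apply "back" to the virtual-space interface
// instead of the phone's own interface.
fun returnPreviousLevelInVirtualSpace(): Boolean {
    val activity = VirtualSpaceStack.topActivity ?: return false
    activity.runOnUiThread {
        // Dispatches the usual back handling of the activity shown in the
        // virtual space, so its previous-level interface is displayed next.
        activity.onBackPressedDispatcher.onBackPressed()
    }
    return true
}
```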
In this scenario, the AR application on the mobile phone runs in the foreground, and the mobile phone displays the application interface of the AR application. When the mobile phone receives the user's return-to-previous-level gesture, the display interface of the virtual space of the AR glasses returns to the previous level.
Taking the return-to-previous-level gesture being a single finger sliding rightward from the left edge of the screen as an example, FIG. 10 is a schematic diagram of an example scenario in which a return-to-previous-level gesture is received while the mobile phone is not displaying the application interface of the AR application.
As shown in FIG. 10, the AR application is not started on the mobile phone 100; the mobile phone 100 starts a chat application and displays the application interface of the chat application. The virtual space of the AR glasses 200 displays the application interface of the video application. The user makes a return-to-previous-level gesture on the touchscreen of the mobile phone 100; for example, as shown in FIG. 10, the user slides a single finger from the left edge of the screen of the mobile phone 100 to the right. After receiving the return-to-previous-level gesture, the mobile phone 100 displays the previous-level display interface of the current interface. For example, the current interface of the mobile phone is the application interface of the chat application, and the previous-level display interface of the chat application's interface is the desktop interface. In response to receiving the return-to-previous-level gesture, the mobile phone 100 displays the desktop interface.
In this scenario, the AR application is not running on the mobile phone or is running in the background, and the mobile phone does not display the application interface of the AR application. When the mobile phone receives the user's return-to-previous-level gesture, it displays the previous-level display interface of the current display interface.
An embodiment of the present application provides a human-computer interaction method that supports making system gestures on an electronic device (for example, a mobile phone). By determining whether the first application is running in the foreground, the electronic device decides whether the action corresponding to the system gesture is performed by the electronic device itself or by the head-mounted display device (for example, AR glasses).
For example, FIG. 11 is a schematic flowchart of the human-computer interaction method provided in an embodiment of the present application. As shown in FIG. 11, the electronic device and the head-mounted display device are communicatively connected. The electronic device receives a first gesture from the user, for example, a screenshot gesture, a screen recording gesture, or a return-to-previous-level gesture. The electronic device determines whether a first application (for example, an AR application) is running in the foreground, where the first application is used to manage the virtual space of the head-mounted display device. In one implementation, if the electronic device is displaying the application interface of the first application, it is determined that the first application is running in the foreground; if the electronic device is not displaying the application interface of the first application, it is determined that the first application is not running in the foreground.
If the electronic device determines that the first application is running in the foreground, the head-mounted display device (for example, the AR glasses) performs the action corresponding to the system gesture, as in the scenarios shown in FIG. 5, FIG. 7, and FIG. 9.
If the electronic device determines that the first application is not running in the foreground, the electronic device (for example, the mobile phone) performs the action corresponding to the system gesture, as in the scenarios shown in FIG. 6, FIG. 8, and FIG. 10.
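Putting the pieces together, the following Kotlin sketch illustrates the dispatch decision of FIG. 11, reusing the SystemGesture type from the earlier sketch. The foreground check and the GestureTarget interface are hypothetical stand-ins for however the electronic device tracks the first application and routes actions; they are not defined by this application.

```kotlin
// Illustrative dispatch logic, assuming the phone can tell whether the first
// application's interface (for example the touchpad interface) is currently displayed.

interface GestureTarget {
    fun takeScreenshot()
    fun startScreenRecording()
    fun returnPreviousLevel()
}

class SystemGestureDispatcher(
    private val isFirstAppInForeground: () -> Boolean,  // e.g. AR application interface displayed
    private val phone: GestureTarget,                    // acts on the phone's display interface
    private val glasses: GestureTarget                   // acts on the virtual-space display interface
) {
    fun dispatch(gesture: SystemGesture) {
        // The first application manages the virtual space; when it is in the
        // foreground, system gestures act on the head-mounted display device.
        val target = if (isFirstAppInForeground()) glasses else phone
        when (gesture) {
            SystemGesture.SCREENSHOT -> target.takeScreenshot()
            SystemGesture.SCREEN_RECORD -> target.startScreenRecording()
            SystemGesture.BACK -> target.returnPreviousLevel()
            SystemGesture.NONE -> Unit
        }
    }
}
```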
在该方法中,电子设备接收到系统手势后,区分不同的情况,触发电子设备自身执行系统手势对应的动作,或者触发头戴式显示设备执行系统手势对应的动作。支持用户在电子设备的触摸屏上执行系统手势,实现对头戴式显示设备的虚拟空间的显示界面进行截屏、录屏、返回上一级等操作。这样,用户在观看虚拟空间的画面时,只需要在手机触摸屏上做出系统手势,就可以实现对头戴式显示设备的虚拟空间的显示界面进行截屏、录屏、返回上一级等操作。用户的视线不需要离开虚拟空间,避免了视线在虚拟空间和手机屏幕之间跳转。而且手势操作方便快捷,用户操作成本低、体验好。In this method, after receiving the system gesture, the electronic device distinguishes between different situations and triggers the electronic device itself to perform the action corresponding to the system gesture, or triggers the head-mounted display device to perform the action corresponding to the system gesture. Supports users to perform system gestures on the touch screen of electronic devices to enable operations such as taking screenshots, recording screens, and returning to the previous level on the display interface of the virtual space of the head-mounted display device. In this way, when the user is watching the screen in the virtual space, he only needs to make system gestures on the touch screen of the mobile phone to take screenshots, record the screen, return to the previous level and other operations on the virtual space display interface of the head-mounted display device. The user's eyes do not need to leave the virtual space, avoiding the jump between the virtual space and the mobile phone screen. Moreover, gesture operation is convenient and fast, and the user operation cost is low and the user experience is good.
示例性的,图12和图13示出了本申请实施例提供的人机交互方法的一种详细流程示意图。For example, FIG. 12 and FIG. 13 show a detailed flowchart of the human-computer interaction method provided by the embodiment of the present application.
如图12所示,手机100的操作系统安装有应用1和应用2,应用1的应用界面显示在手机100上,应用2的应用界面显示在AR眼镜200的虚拟空间。示例性的,应用1为手机的桌面应用,应用2为视频应用。应用1启动运行后,生成应用1对应的活动(Activity)。应用2启动运行后,生成应用2对应的活动(Activity)。可以理解的,一个应用可以包括一个或多个界面,每个界面对应一个Activity;一个Activity包括一个或多个图层(layer)。每个图层包括一个或多个元素(控件)。操作系统的框架层(framework)包括窗口管理服务(windowmanager service,WMS),显示管理单元,表面合成(SurfaceFlinger)模块,活动管理服务(Activity manager service,AMS)等。WMS用于窗口管理(比如,新增窗口、删除窗口、修改窗口等)。应用1启动,WMS创建应用1对应的窗口1;应用2启动,WMS创建应用2对应的窗口2。显示管理单元为每个窗口创建对应的显示(Display),并建立窗口与Display的一一对应关系。表面合成(SurfaceFlinger)模块用于从WMS获取各个应用界面的显示数据(比如,界面包括的图层个数,每个图层的显示元素)。SurfaceFlinger根据应用界面的显示数据确定每个显示数据对应的合成指令(比如,OPENGL ES指令,HWComposer指令)。图形渲染合成组件采用合成指令对显示数据进行图形合成,生成应用界面各个图层的图像;并对各个图层的图像进行合成和渲染,生成Activity对应的界面。进一步的,图形渲染合成组件将不同应用的界面生成至对应的显示(display),每个display对应一个屏幕。As shown in FIG. 12 , the operating system of the mobile phone 100 is installed with application 1 and application 2. The application interface of application 1 is displayed on the mobile phone 100 , and the application interface of application 2 is displayed in the virtual space of the AR glasses 200 . For example, application 1 is a mobile phone desktop application, and application 2 is a video application. After application 1 starts running, the activity (Activity) corresponding to application 1 is generated. After application 2 starts running, the activity (Activity) corresponding to application 2 is generated. It can be understood that an application can include one or more interfaces, and each interface corresponds to an Activity; an Activity includes one or more layers. Each layer contains one or more elements (controls). The framework layer of the operating system includes window management service (windowmanager service, WMS), display management unit, surface synthesis (SurfaceFlinger) module, activity management service (Activity manager service, AMS), etc. WMS is used for window management (for example, adding windows, deleting windows, modifying windows, etc.). When application 1 is started, WMS creates window 1 corresponding to application 1; when application 2 is started, WMS creates window 2 corresponding to application 2. The display management unit creates a corresponding display (Display) for each window and establishes a one-to-one correspondence between the window and the Display. The surface synthesis (SurfaceFlinger) module is used to obtain the display data of each application interface from WMS (for example, the number of layers included in the interface, the display elements of each layer). SurfaceFlinger determines the synthesis instructions corresponding to each display data (for example, OPENGL ES instructions, HWComposer instructions) based on the display data of the application interface. The graphics rendering synthesis component uses synthesis instructions to perform graphics synthesis on the display data to generate images of each layer of the application interface; it also synthesizes and renders the images of each layer to generate an interface corresponding to the Activity. Further, the graphics rendering synthesis component generates the interfaces of different applications to corresponding displays, and each display corresponds to one screen.
在一种示例中，SurfaceFlinger从WMS获取应用1的Activity的显示数据1，图形渲染合成组件采用合成指令对显示数据1进行图形合成，生成各个图层的图像；并将各个图层图像进行合成渲染，生成应用1的界面至显示1(display1)。显示1上保存的界面用于在手机100的屏幕上进行显示。这样，手机100的屏幕显示应用1的应用界面；示例性的，如图4A所示，手机100显示桌面界面。SurfaceFlinger从WMS获取应用2的Activity的显示数据2，图形渲染合成组件采用合成指令对显示数据2进行图形合成，生成各个图层的图像；并将各个图层图像进行合成渲染，生成应用2的界面至显示2(display2)。显示2上保存的界面用于在AR眼镜200的虚拟空间内进行显示。进一步的，手机100将应用2的界面信息发送给AR眼镜200，这样，AR眼镜200就可以在虚拟空间内显示应用2的应用界面。In one example, SurfaceFlinger obtains display data 1 of the Activity of application 1 from WMS. The graphics rendering and composition component uses composition instructions to synthesize display data 1, generating an image for each layer; it then composites and renders the layer images to generate the interface of application 1 into display 1 (display1). The interface saved on display 1 is displayed on the screen of the mobile phone 100. In this way, the screen of the mobile phone 100 displays the application interface of application 1; for example, as shown in FIG. 4A, the mobile phone 100 displays the desktop interface. Similarly, SurfaceFlinger obtains display data 2 of the Activity of application 2 from WMS. The graphics rendering and composition component uses composition instructions to synthesize display data 2, generating an image for each layer; it then composites and renders the layer images to generate the interface of application 2 into display 2 (display2). The interface saved on display 2 is displayed in the virtual space of the AR glasses 200. Further, the mobile phone 100 sends the interface information of application 2 to the AR glasses 200, so that the AR glasses 200 can display the application interface of application 2 in the virtual space.
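To make the display routing above concrete, the following is a minimal Kotlin sketch against the public Android DisplayManager and ActivityOptions APIs. It is an illustration only, not the patent's implementation: the surface handed in (`glassesSurface`), the display name, and the assumption that the glasses consume the virtual display's Surface are assumptions, and in practice launching an activity on a secondary display may require additional privileges or flags.

```kotlin
import android.app.ActivityOptions
import android.content.Context
import android.content.Intent
import android.hardware.display.DisplayManager
import android.hardware.display.VirtualDisplay
import android.view.Surface

// Sketch: application 1 stays on the phone's built-in display ("display1"),
// while application 2 is launched onto a second logical display ("display2")
// whose Surface is assumed to be streamed to the AR glasses.
fun launchAppOnGlassesDisplay(
    context: Context,
    glassesSurface: Surface,   // assumed: provided by the glasses connection
    app2Intent: Intent,        // launch intent of "application 2"
    widthPx: Int,
    heightPx: Int,
    densityDpi: Int
): VirtualDisplay {
    val dm = context.getSystemService(DisplayManager::class.java)
    // Create display2; SurfaceFlinger composites application 2's layers into it.
    val virtualDisplay = dm.createVirtualDisplay(
        "ar_glasses_display",  // illustrative name
        widthPx, heightPx, densityDpi,
        glassesSurface,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION
    )
    // Ask the system to place application 2's Activity on display2 rather than
    // on the default display (may require extra permissions on real devices).
    val options = ActivityOptions.makeBasic()
        .setLaunchDisplayId(virtualDisplay.display.displayId)
    app2Intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    context.startActivity(app2Intent, options.toBundle())
    return virtualDisplay
}
```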
手机100接收到用户输入的系统手势(比如截屏手势、录屏手势或返回上一级手势等)。手机确定桌面应用在前台运行，AR应用未在前台运行，确定由手机执行系统手势对应的动作。参考图12，响应于接收到系统手势，手机对应用1的应用界面进行截屏或录屏或返回上一级操作。The mobile phone 100 receives a system gesture input by the user (for example, a screenshot gesture, a screen recording gesture, or a return-to-previous-level gesture). The mobile phone determines that the desktop application is running in the foreground and the AR application is not, and therefore determines that the mobile phone itself performs the action corresponding to the system gesture. Referring to FIG. 12, in response to receiving the system gesture, the mobile phone takes a screenshot of, records, or returns to the previous level of the application interface of application 1.
在另一些实施例中，手机100的操作系统还安装有应用3，应用3用于对AR眼镜200的虚拟空间进行管理，应用3的应用界面显示在手机100上。示例性的，应用3为AR应用。如图13所示，应用3启动运行后，生成应用3对应的活动(Activity)。WMS创建应用3对应的窗口3。SurfaceFlinger从WMS获取应用3的Activity的显示数据3，图形渲染合成组件采用合成指令对显示数据3进行图形合成，生成各个图层的图像；并将各个图层图像进行合成渲染，生成应用3的界面至显示1(display1)。显示1上保存的界面用于在手机100的屏幕上进行显示。这样，手机100的屏幕显示应用3的应用界面；示例性的，如图4A所示，手机100启动AR应用后，显示触摸板界面101。In other embodiments, the operating system of the mobile phone 100 also has application 3 installed. Application 3 is used to manage the virtual space of the AR glasses 200, and the application interface of application 3 is displayed on the mobile phone 100. For example, application 3 is an AR application. As shown in FIG. 13, after application 3 starts running, the activity (Activity) corresponding to application 3 is generated, and WMS creates window 3 corresponding to application 3. SurfaceFlinger obtains display data 3 of the Activity of application 3 from WMS. The graphics rendering and composition component uses composition instructions to synthesize display data 3, generating an image for each layer; it then composites and renders the layer images to generate the interface of application 3 into display 1 (display1). The interface saved on display 1 is displayed on the screen of the mobile phone 100. In this way, the screen of the mobile phone 100 displays the application interface of application 3; for example, as shown in FIG. 4A, after the mobile phone 100 starts the AR application, the touch panel interface 101 is displayed.
手机100接收到用户输入的系统手势(比如截屏手势、录屏手势或返回上一级手势等)。手机确定AR应用在前台运行，确定由AR眼镜执行系统手势对应的动作。参考图13，响应于接收到系统手势，对应用2的应用界面进行截屏或录屏或返回上一级操作。The mobile phone 100 receives a system gesture input by the user (for example, a screenshot gesture, a screen recording gesture, or a return-to-previous-level gesture). The mobile phone determines that the AR application is running in the foreground, and therefore determines that the AR glasses perform the action corresponding to the system gesture. Referring to FIG. 13, in response to receiving the system gesture, a screenshot, screen recording, or return-to-previous-level operation is performed on the application interface of application 2.
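The decision in the two preceding paragraphs, that is, whether the phone or the glasses acts on a recognized system gesture, reduces to a small routing step. The Kotlin sketch below is hypothetical: `isArAppInForeground`, `performLocally`, and `sendToGlasses` are assumed helpers and do not come from the patent or from any Android API.

```kotlin
// Hypothetical routing of a recognized system gesture based on which
// application currently runs in the foreground on the phone.
enum class SystemGesture { SCREENSHOT, SCREEN_RECORD, BACK }

class SystemGestureRouter(
    private val isArAppInForeground: () -> Boolean,       // assumed foreground check
    private val performLocally: (SystemGesture) -> Unit,  // acts on the phone screen (display1)
    private val sendToGlasses: (SystemGesture) -> Unit    // acts on the virtual space (display2)
) {
    fun onSystemGesture(gesture: SystemGesture) {
        if (isArAppInForeground()) {
            // The AR application (application 3) is in the foreground: the action
            // targets the interface shown in the glasses' virtual space.
            sendToGlasses(gesture)
        } else {
            // The desktop or another ordinary application is in the foreground:
            // the phone executes the action on its own application interface.
            performLocally(gesture)
        }
    }
}
```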
在现有技术的一种实现方式中，虚拟空间的桌面界面上设置有虚拟按键，用户可以通过移动射线位置，点击桌面界面上的虚拟按键实现在虚拟空间内进行截屏、录屏等操作。但是，这种操作方式的操作步骤较繁琐，比如，当用户在观看其他应用的显示界面时，需要先退出当前显示的应用界面，进入桌面界面，然后再点击桌面界面上的虚拟按键，进行截屏、录屏等操作。用户的使用体验较差。In one prior-art implementation, virtual buttons are provided on the desktop interface of the virtual space. By moving the ray position and clicking a virtual button on the desktop interface, the user can take screenshots, record the screen, and so on in the virtual space. However, the steps of this operation method are cumbersome: for example, when viewing the display interface of another application, the user first needs to exit the currently displayed application interface, enter the desktop interface, and then click the virtual button on the desktop interface to take a screenshot, record the screen, or perform other operations. The user experience is poor.
本申请实施例还提供了一种人机交互方法，当用户在电子设备上AR应用的应用界面上输入第一操作(比如指关节敲击)，响应于该第一操作，头戴式显示设备的虚拟空间弹出快捷操作面板，该快捷操作面板上设置有截屏、录屏、返回上一级、返回主页等虚拟按键。用户可以移动射线位置，点击快捷操作面板上的虚拟按键来实现对虚拟空间显示界面的截屏、录屏、返回上一级、返回主页等操作。用户在观看虚拟空间的画面时，只需要在手机触摸屏上输入第一操作(比如指关节敲击)，然后就可以通过虚拟空间内的虚拟按键实现对虚拟空间的显示界面进行截屏、录屏、返回上一级等操作。用户的视线不需要离开虚拟空间，避免了视线在虚拟空间和手机屏幕之间跳转带来的较差体验。而且，虚拟按键是设置在弹出的快捷操作面板上，不依赖于虚拟空间的显示界面。当用户在使用具体应用时，不需要退出应用进入桌面界面进行操作。操作方面快捷，用户操作成本低、体验好。Embodiments of the present application also provide a human-computer interaction method. When the user inputs a first operation (such as a knuckle tap) on the application interface of the AR application on the electronic device, a shortcut operation panel pops up in the virtual space of the head-mounted display device in response to the first operation. The shortcut operation panel is provided with virtual buttons such as screenshot, screen recording, return to the previous level, and return to the home page. The user can move the ray position and click the virtual buttons on the shortcut operation panel to take a screenshot of, record, return to the previous level of, or return to the home page from the virtual space display interface. While watching the virtual space, the user only needs to input the first operation (such as a knuckle tap) on the touch screen of the mobile phone, and can then use the virtual buttons in the virtual space to take a screenshot of, record, or return to the previous level of the display interface of the virtual space. The user's line of sight does not need to leave the virtual space, avoiding the poor experience caused by the line of sight jumping between the virtual space and the mobile phone screen. Moreover, the virtual buttons are provided on the pop-up shortcut operation panel and do not depend on the display interface of the virtual space. When using a specific application, the user does not need to exit the application and enter the desktop interface to operate. The operation is convenient and fast, the operation cost for the user is low, and the experience is good.
下面以电子设备是手机,头戴式显示设备是AR眼镜为例,对本申请实施例提供的人机交互方法进行详细介绍。Taking the electronic device as a mobile phone and the head-mounted display device as AR glasses as an example, the human-computer interaction method provided by the embodiment of the present application will be introduced in detail below.
示例性的，如图14所示，手机100和AR眼镜200通信连接。手机100启动AR应用，显示触摸板界面101；AR眼镜200的虚拟空间显示短视频应用的应用界面301。用户可以佩戴AR眼镜200观看虚拟空间内的显示界面。For example, as shown in FIG. 14, the mobile phone 100 and the AR glasses 200 are communicatively connected. The mobile phone 100 starts the AR application and displays the touch panel interface 101; the virtual space of the AR glasses 200 displays the application interface 301 of the short video application. The user can wear the AR glasses 200 to view the display interface in the virtual space.
用户可以在手机100的触摸板界面101上输入预设的第一操作。在一种示例中,触摸板界面101包括触控区域102,用户可以在触控区域102输入第一操作。示例性的,第一操作可以包括单指敲击、多指敲击、指关节敲击、单指划圈等。可以理解的,第一操作还可以包括其他形式,本申请实施例对此并不进行限定。The user can input a preset first operation on the touch panel interface 101 of the mobile phone 100 . In one example, the touch panel interface 101 includes a touch area 102, and the user can input a first operation in the touch area 102. For example, the first operation may include single-finger tapping, multi-finger tapping, knuckle tapping, single-finger circling, etc. It can be understood that the first operation may also include other forms, which are not limited by the embodiments of the present application.
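As a rough illustration of how such a first operation could be recognized on the touch area, the Kotlin sketch below classifies a tap by pointer count and press duration from raw MotionEvents. It is an assumption-laden example, not the patent's recognizer: knuckle-tap or circle recognition would need additional sensor or acoustic classification and is not shown, and all names are made up for illustration.

```kotlin
import android.view.MotionEvent

// Rough classifier for the "first operation" on the touch area: distinguishes
// single-finger and multi-finger taps by pointer count and press duration.
// Knuckle taps or circle gestures would need additional classification.
class FirstOperationDetector(private val onFirstOperation: (String) -> Unit) {

    private var downTime = 0L
    private var maxPointerCount = 0

    fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downTime = event.eventTime
                maxPointerCount = 1
            }
            MotionEvent.ACTION_POINTER_DOWN ->
                maxPointerCount = maxOf(maxPointerCount, event.pointerCount)
            MotionEvent.ACTION_UP -> {
                val isShortTap = event.eventTime - downTime < 300  // short press only
                when {
                    isShortTap && maxPointerCount >= 2 -> onFirstOperation("multi_finger_tap")
                    isShortTap -> onFirstOperation("single_finger_tap")
                }
            }
        }
        return true
    }
}
```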
参考图14，以第一操作是指关节敲击为例。手机100接收到用户在触控区域102内指关节敲击的动作，响应于该指关节敲击的动作，AR眼镜200的虚拟空间弹出快捷操作面板302，快捷操作面板302上设置有虚拟按键。示例性的，如图14所示，快捷操作面板302上包括“主页”按钮303、“返回上一级”按钮304、“截屏”按钮305、“录屏”按钮306。其中，“主页”按钮303用于返回虚拟空间的桌面界面，“返回上一级”按钮304用于返回当前显示界面的上一级显示界面，“截屏”按钮305用于对虚拟空间的显示界面进行截屏，“录屏”按钮306用于对虚拟空间的显示界面进行录屏。可选的，快捷操作面板302还包括“关闭”按钮307，用于关闭快捷操作面板302。Referring to FIG. 14, take the case where the first operation is a knuckle tap as an example. The mobile phone 100 receives the user's knuckle tap in the touch area 102. In response to the knuckle tap, the shortcut operation panel 302 pops up in the virtual space of the AR glasses 200, and the shortcut operation panel 302 is provided with virtual buttons. For example, as shown in FIG. 14, the shortcut operation panel 302 includes a "Home" button 303, a "Return to the previous level" button 304, a "Screenshot" button 305, and a "Screen Recording" button 306. The "Home" button 303 is used to return to the desktop interface of the virtual space, the "Return to the previous level" button 304 is used to return to the display interface one level above the currently displayed interface, the "Screenshot" button 305 is used to take a screenshot of the display interface of the virtual space, and the "Screen Recording" button 306 is used to record the display interface of the virtual space. Optionally, the shortcut operation panel 302 also includes a "Close" button 307 for closing the shortcut operation panel 302.
示例性的，如图15A所示，用户可以选中“主页”按钮303；比如，用户在手机100显示的触控区域102内向上、向下、向左或向右滑动，移动虚拟空间内的射线至“主页”按钮303，然后在触控区域102内点击一下，选中“主页”按钮303。响应于用户对“主页”按钮303的选中操作，虚拟空间显示桌面界面310(主页)。For example, as shown in FIG. 15A, the user can select the "Home" button 303; for instance, the user slides up, down, left, or right in the touch area 102 displayed on the mobile phone 100 to move the ray in the virtual space to the "Home" button 303, and then taps once in the touch area 102 to select the "Home" button 303. In response to the user's selection of the "Home" button 303, the virtual space displays the desktop interface 310 (home page).
在一种示例中，用户可以选中“返回上一级”按钮304；响应于用户对“返回上一级”按钮304的选中操作，虚拟空间显示短视频应用的应用界面301的上一级显示界面。示例性的，短视频应用的应用界面301的上一级显示界面为桌面界面310，响应于用户对“返回上一级”按钮304的选中操作，虚拟空间显示桌面界面310。In one example, the user can select the "Return to the previous level" button 304; in response to the user's selection of the "Return to the previous level" button 304, the virtual space displays the display interface one level above the application interface 301 of the short video application. For example, if the display interface one level above the application interface 301 of the short video application is the desktop interface 310, then in response to the user's selection of the "Return to the previous level" button 304, the virtual space displays the desktop interface 310.
示例性的，如图15B所示，用户可以选中“截屏”按钮305；比如，用户在手机100显示的触控区域102内向上、向下、向左或向右滑动，移动虚拟空间内的射线至“截屏”按钮305，然后在触控区域102内点击一下，选中“截屏”按钮305。响应于用户对“截屏”按钮305的选中操作，对虚拟空间的显示界面(短视频应用的应用界面301)进行截屏，生成截图图片。For example, as shown in FIG. 15B, the user can select the "Screenshot" button 305; for instance, the user slides up, down, left, or right in the touch area 102 displayed on the mobile phone 100 to move the ray in the virtual space to the "Screenshot" button 305, and then taps once in the touch area 102 to select the "Screenshot" button 305. In response to the user's selection of the "Screenshot" button 305, a screenshot of the display interface of the virtual space (the application interface 301 of the short video application) is taken and a screenshot image is generated.
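How a screenshot of the off-screen content might be captured can be sketched as follows. This is a hedged illustration rather than the patent's method: an android.media.ImageReader is assumed to act as the frame consumer for the content shown in the virtual space (for example by passing its surface to createVirtualDisplay()), and the helper name is made up.

```kotlin
import android.graphics.Bitmap
import android.media.Image
import android.media.ImageReader

// Sketch: capture one frame of the off-screen content shown in the virtual space.
// The ImageReader is assumed to be the frame consumer, e.g. created with
// ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2) and attached
// to the virtual display that backs the virtual space.
fun captureVirtualSpaceFrame(reader: ImageReader): Bitmap? {
    val image: Image = reader.acquireLatestImage() ?: return null
    return try {
        val plane = image.planes[0]
        val pixelStride = plane.pixelStride
        val rowStride = plane.rowStride
        val paddedWidth = rowStride / pixelStride
        // Copy the RGBA buffer into a Bitmap, then crop away any row padding.
        val padded = Bitmap.createBitmap(paddedWidth, image.height, Bitmap.Config.ARGB_8888)
        padded.copyPixelsFromBuffer(plane.buffer)
        Bitmap.createBitmap(padded, 0, 0, image.width, image.height)
    } finally {
        image.close()
    }
}
```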
示例性的，如图15C所示，用户可以选中“录屏”按钮306；比如，用户在手机100显示的触控区域102内向上、向下、向左或向右滑动，移动虚拟空间内的射线至“录屏”按钮306，然后在触控区域102内点击一下，选中“录屏”按钮306。响应于用户对“录屏”按钮306的选中操作，对虚拟空间的显示界面进行录屏，生成录屏文件。For example, as shown in FIG. 15C, the user can select the "Screen Recording" button 306; for instance, the user slides up, down, left, or right in the touch area 102 displayed on the mobile phone 100 to move the ray in the virtual space to the "Screen Recording" button 306, and then taps once in the touch area 102 to select the "Screen Recording" button 306. In response to the user's selection of the "Screen Recording" button 306, the display interface of the virtual space is recorded and a screen-recording file is generated.
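A corresponding hedged sketch for screen recording: a MediaRecorder configured with a SURFACE video source can consume the off-screen display's frames. The resolution, frame rate, and output path are placeholders, and wiring the recorder's input surface to the display is left to the caller; none of this is prescribed by the patent.

```kotlin
import android.media.MediaRecorder

// Sketch: prepare a recorder whose input surface can consume the frames of the
// off-screen display shown in the virtual space. Values are placeholders and
// error handling is omitted.
fun prepareVirtualSpaceRecorder(widthPx: Int, heightPx: Int, outputPath: String): MediaRecorder {
    val recorder = MediaRecorder()
    recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE)
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
    recorder.setOutputFile(outputPath)
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264)
    recorder.setVideoSize(widthPx, heightPx)
    recorder.setVideoFrameRate(30)
    recorder.prepare()
    // After prepare(), recorder.surface is valid: hand it to the virtual display
    // (or mirror display2 into it), then call start(); stop() finalizes the file.
    return recorder
}
```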
示例性的,如图15D所示,用户可以选中“关闭”按钮307;响应于用户对“关闭”按钮307的选中操作,虚拟空间停止显示快捷操作面板302。For example, as shown in Figure 15D, the user can select the "Close" button 307; in response to the user's selection operation of the "Close" button 307, the virtual space stops displaying the shortcut operation panel 302.
需要说明的是,图15A-图15D所示的快捷操作面板302仅为一种示例。在具体实现中,快捷操作面板可以为其他形式,比如,快捷操作面板的形状、大小、位置都可以根据具体情况进行设置。It should be noted that the shortcut operation panel 302 shown in FIGS. 15A to 15D is only an example. In specific implementation, the shortcut operation panel can be in other forms. For example, the shape, size, and position of the shortcut operation panel can be set according to specific circumstances.
在一种示例中,如图16A所示,快捷操作面板302可以叠加于虚拟空间当前显示界面(短视频应用的应用界面301)上。可选的,快捷操作面板302设置于虚拟空间当前显示界面的一侧。In one example, as shown in FIG. 16A , the shortcut operation panel 302 can be superimposed on the current display interface of the virtual space (the application interface 301 of the short video application). Optionally, the shortcut operation panel 302 is provided on one side of the current display interface of the virtual space.
在另一种示例中,如图16B所示,快捷操作面板302可以显示在虚拟空间中空白区域,与短视频应用的应用界面301位于同一层。In another example, as shown in FIG. 16B , the shortcut operation panel 302 can be displayed in a blank area in the virtual space, and is located on the same layer as the application interface 301 of the short video application.
在另一种示例中，如图16C所示，虚拟空间中可以不显示快捷操作面板302的边框，而是直接显示“主页”按钮303、“返回上一级”按钮304、“截屏”按钮305、“录屏”按钮306等。In another example, as shown in FIG. 16C, the frame of the shortcut operation panel 302 may not be displayed in the virtual space; instead, the "Home" button 303, the "Return to the previous level" button 304, the "Screenshot" button 305, the "Screen Recording" button 306, and so on are displayed directly.
在本申请实施例提供的人机交互方法中，响应于用户在电子设备上AR应用的应用界面上输入的第一操作，头戴式显示设备的虚拟空间弹出截屏、录屏、返回上一级、返回主页等虚拟按键。这些虚拟按键不依赖于虚拟空间当前显示界面，在任意的显示界面都可以弹出。这样，用户需要进行截屏、录屏、返回上一级、返回主页等操作时，不需要退出当前应用，操作简便快捷。In the human-computer interaction method provided by the embodiments of the present application, in response to the first operation input by the user on the application interface of the AR application on the electronic device, virtual buttons for screenshot, screen recording, return to the previous level, return to the home page, and so on pop up in the virtual space of the head-mounted display device. These virtual buttons do not depend on the interface currently displayed in the virtual space and can pop up on any display interface. In this way, when the user needs to take a screenshot, record the screen, return to the previous level, return to the home page, or perform similar operations, there is no need to exit the current application, and the operation is simple and fast.
示例性的,图17示出了本申请实施例提供的人机交互方法的一种流程示意图。如图17所示,该方法包括:For example, FIG. 17 shows a schematic flowchart of the human-computer interaction method provided by the embodiment of the present application. As shown in Figure 17, the method includes:
S401、电子设备和头戴式显示设备通信连接。S401. Communication connection between the electronic device and the head-mounted display device.
电子设备可以作为头戴式显示设备的输入设备。The electronic device can serve as an input device for the head-mounted display device.
S402、电子设备启动第一应用,显示第一应用的应用界面。S402. The electronic device starts the first application and displays the application interface of the first application.
电子设备上安装有第一应用(比如AR应用),用于对头戴式显示设备的虚拟空间进行管理。示例性的,用户可以点击电子设备桌面界面上第一应用的应用图标,来启动第一应用。示例性的,如图14所示,手机100显示AR应用的触摸板界面101。A first application (such as an AR application) is installed on the electronic device for managing the virtual space of the head-mounted display device. For example, the user can click the application icon of the first application on the desktop interface of the electronic device to start the first application. For example, as shown in Figure 14, the mobile phone 100 displays the touchpad interface 101 of the AR application.
S403、电子设备接收用户在第一应用的应用界面的第一操作。S403. The electronic device receives the user's first operation on the application interface of the first application.
在一种示例中,触摸板界面101包括触控区域102,用户可以在触控区域102输入第一操作。示例性的,第一操作可以包括单指敲击、多指敲击、指关节敲击、单指划圈等。可以理解的,第一操作还可以包括其他形式,本申请实施例对此并不进行限定。In one example, the touch panel interface 101 includes a touch area 102, and the user can input a first operation in the touch area 102. For example, the first operation may include single-finger tapping, multi-finger tapping, knuckle tapping, single-finger circling, etc. It can be understood that the first operation may also include other forms, which are not limited by the embodiments of the present application.
S404、响应于该第一操作,头戴式显示设备的虚拟空间显示截屏虚拟按键、录屏虚拟按键、返回上一级虚拟按键和返回主页虚拟按键中至少一项。S404. In response to the first operation, the virtual space of the head-mounted display device displays at least one of a screenshot virtual button, a screen recording virtual button, a return to the previous level virtual button, and a return to homepage virtual button.
在一种实现方式中，电子设备接收到第一操作后，生成虚拟按键的显示数据，并将虚拟按键的显示数据发送给头戴式显示设备；头戴式显示设备根据接收到的虚拟按键的显示数据，在虚拟空间显示截屏虚拟按键、录屏虚拟按键、返回上一级虚拟按键、返回主页虚拟按键等。In one implementation, after receiving the first operation, the electronic device generates display data of the virtual keys and sends the display data of the virtual keys to the head-mounted display device; according to the received display data of the virtual keys, the head-mounted display device displays, in the virtual space, the screenshot virtual key, the screen-recording virtual key, the return-to-previous-level virtual key, the return-to-home-page virtual key, and so on.
在另一种实现方式中，电子设备接收到第一操作后，生成对应的第一事件。电子设备将第一事件分发至头戴式显示设备。头戴式显示设备接收到第一事件，生成虚拟按键的显示数据，并根据虚拟按键的显示数据，在虚拟空间显示截屏虚拟按键、录屏虚拟按键、返回上一级虚拟按键、返回主页虚拟按键等。In another implementation, after receiving the first operation, the electronic device generates a corresponding first event and distributes the first event to the head-mounted display device. On receiving the first event, the head-mounted display device generates the display data of the virtual keys and, according to this display data, displays in the virtual space the screenshot virtual key, the screen-recording virtual key, the return-to-previous-level virtual key, the return-to-home-page virtual key, and so on.
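The two implementations differ only in which side turns the first operation into display data for the virtual keys. The following Kotlin sketch of the message flow is purely illustrative; the message types and helper callbacks are invented for this example and are not defined in the patent.

```kotlin
// Hypothetical messages exchanged between the phone and the glasses after the
// first operation is recognized. In implementation 1 the phone renders the
// shortcut panel and ships display data; in implementation 2 it ships only the
// event and the glasses generate the panel's display data themselves.
sealed class PhoneToGlassesMessage {
    data class PanelDisplayData(val encodedLayer: ByteArray) : PhoneToGlassesMessage()
    data class FirstEvent(val type: String, val timestampMs: Long) : PhoneToGlassesMessage()
}

class FirstOperationDispatcher(
    private val renderPanelLocally: () -> ByteArray,    // assumed phone-side renderer
    private val send: (PhoneToGlassesMessage) -> Unit,  // assumed transport to the glasses
    private val phoneRendersPanel: Boolean              // chooses implementation 1 or 2
) {
    fun onFirstOperation() {
        val message = if (phoneRendersPanel) {
            PhoneToGlassesMessage.PanelDisplayData(renderPanelLocally())
        } else {
            PhoneToGlassesMessage.FirstEvent("knuckle_tap", System.currentTimeMillis())
        }
        send(message)  // the glasses then show the shortcut panel in the virtual space
    }
}
```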
在一种示例中,头戴式显示设备的虚拟空间显示快捷操作面板,快捷操作面板上包括截屏虚拟按键、录屏虚拟按键、返回上一级虚拟按键、返回主页虚拟按键等。示例性的,如图14、图16A和图16B所示。In one example, the virtual space of the head-mounted display device displays a shortcut operation panel. The shortcut operation panel includes virtual buttons for taking screenshots, virtual buttons for recording screens, virtual buttons for returning to the previous level, virtual buttons for returning to the homepage, etc. Illustratively, as shown in Figure 14, Figure 16A and Figure 16B.
在另一种示例中,截屏虚拟按键、录屏虚拟按键、返回上一级虚拟按键、返回主页虚拟按键等直接显示在虚拟空间内应用界面上。示例性的,如图16C所示,短视频应用的应用界面301上叠加显示“主页”按钮303、“返回上一级”按钮304、“截屏”按钮305和“录屏”按钮306。In another example, virtual keys for taking screenshots, virtual keys for recording screens, virtual keys for returning to the previous level, virtual keys for returning to homepage, etc. are directly displayed on the application interface in the virtual space. For example, as shown in Figure 16C, the application interface 301 of the short video application displays a "Home" button 303, a "Return to the previous level" button 304, a "Screenshot" button 305 and a "Screen Recording" button 306 in a superimposed manner.
可以理解的是,上述各个设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的范围。It can be understood that, in order to implement the above functions, each of the above devices includes corresponding hardware structures and/or software modules for performing each function. Persons skilled in the art should easily realize that, in conjunction with the units and algorithm steps of each example described in the embodiments disclosed herein, the embodiments of the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving the hardware depends on the specific application and design constraints of the technical solution. Professionals and technicians may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the embodiments of the present application.
本申请实施例可以根据上述方法示例对上述设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。Embodiments of the present application can divide the above-mentioned device into functional modules according to the above-mentioned method examples. For example, each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module. The above integrated modules can be implemented in the form of hardware or software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic and is only a logical function division. In actual implementation, there may be other division methods.
在采用集成的单元的情况下，图18示出了上述实施例中所涉及的电子设备的一种可能的结构示意图。该电子设备1800包括：处理单元1801、通信单元1802和存储单元1803。其中，处理单元1801，用于对电子设备1800的动作进行控制管理；通信单元1802，用于支持电子设备1800与其他网络实体的通信；存储单元1803，保存电子设备1800的指令和数据，上述指令可以用于执行本申请相应实施例中的各个步骤。In the case of using integrated units, FIG. 18 shows a possible structural diagram of the electronic device involved in the above embodiments. The electronic device 1800 includes: a processing unit 1801, a communication unit 1802, and a storage unit 1803. The processing unit 1801 is used to control and manage the actions of the electronic device 1800; the communication unit 1802 is used to support communication between the electronic device 1800 and other network entities; the storage unit 1803 stores the instructions and data of the electronic device 1800, and these instructions can be used to perform the steps in the corresponding embodiments of this application.
当然,上述电子设备1800中的单元模块包括但不限于上述处理单元1801、通信单元1802和存储单元1803。例如,电子设备1800中还可以包括显示单元等,显示单元用于显示电子设备1800的用户界面。Of course, the unit modules in the above-mentioned electronic device 1800 include but are not limited to the above-mentioned processing unit 1801, communication unit 1802 and storage unit 1803. For example, the electronic device 1800 may further include a display unit and the like, and the display unit is used to display a user interface of the electronic device 1800 .
其中，处理单元1801可以是处理器或控制器，例如可以是中央处理器(central processing unit, CPU)，数字信号处理器(digital signal processor, DSP)，专用集成电路(application-specific integrated circuit, ASIC)，现场可编程门阵列(field programmable gate array, FPGA)或者其他可编程逻辑器件、晶体管逻辑器件、硬件部件或者其任意组合。通信单元1802可以是收发器、收发电路等。存储单元1803可以是存储器。The processing unit 1801 may be a processor or a controller, for example a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA), or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The communication unit 1802 may be a transceiver, a transceiver circuit, or the like. The storage unit 1803 may be a memory.
本申请实施例所提供的电子设备1800可以为图2所示的电子设备100。其中,上述处理器、存储器、通信接口等可以连接在一起,例如通过总线连接。处理器调用存储器存储的程序代码,以执行以上方法实施例中的各个步骤。The electronic device 1800 provided in the embodiment of the present application may be the electronic device 100 shown in FIG. 2 . The above-mentioned processors, memories, communication interfaces, etc. may be connected together, for example, through a bus. The processor calls the program code stored in the memory to execute each step in the above method embodiment.
本申请实施例还提供一种芯片系统,如图19所示,该芯片系统包括至少一个处理器1901和至少一个接口电路1902。处理器1901和接口电路1902可通过线路互联。例如,接口电路1902可用于从其它装置(例如路由设备的存储器)接收信号。又例如,接口电路1902可用于向其它装置(例如处理器1901)发送信号。示例性的,接口电路1902可读取存储器中存储的指令,并将该指令发送给处理器1901。当所述指令被处理器1901执行时,可使得电子设备执行上述实施例中的各个步骤。当然,该芯片系统还可以包含其他分立器件,本申请实施例对此不作具体限定。An embodiment of the present application also provides a chip system. As shown in Figure 19, the chip system includes at least one processor 1901 and at least one interface circuit 1902. The processor 1901 and the interface circuit 1902 may be interconnected via wires. For example, interface circuitry 1902 may be used to receive signals from other devices, such as memory of a routing device. As another example, interface circuit 1902 may be used to send signals to other devices (eg, processor 1901). For example, the interface circuit 1902 can read instructions stored in the memory and send the instructions to the processor 1901. When the instructions are executed by the processor 1901, the electronic device can be caused to perform various steps in the above embodiments. Of course, the chip system may also include other discrete devices, which are not specifically limited in the embodiments of this application.
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序代码,当上述处理器执行该计算机程序代码时,电子设备执行上述实施例中的方法。Embodiments of the present application also provide a computer-readable storage medium. The computer-readable storage medium stores computer program code. When the above-mentioned processor executes the computer program code, the electronic device executes the method in the above-mentioned embodiment.
本申请实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述实施例中的方法。Embodiments of the present application also provide a computer program product. When the computer program product is run on a computer, it causes the computer to execute the method in the above embodiment.
其中,本申请实施例提供的电子设备1800、芯片系统、计算机可读存储介质或者计算机程序产品均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。Among them, the electronic device 1800, chip system, computer-readable storage medium or computer program product provided by the embodiment of the present application are all used to execute the corresponding method provided above. Therefore, the beneficial effects it can achieve can be referred to the above. The beneficial effects of the corresponding methods provided will not be described again here.
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。Through the above description of the embodiments, those skilled in the art can clearly understand that for the convenience and simplicity of description, only the division of the above functional modules is used as an example. In actual applications, the above functions can be allocated as needed. It is completed by different functional modules, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above.
在本申请所提供的几个实施例中，应该理解到，所揭露的装置和方法，可以通过其它的方式实现。例如，以上所描述的装置实施例仅仅是示意性的，例如，所述模块或单元的划分，仅仅为一种逻辑功能划分，实际实现时可以有另外的划分方式，例如多个单元或组件可以结合或者可以集成到另一个装置，或一些特征可以忽略，或不执行。另一点，所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口，装置或单元的间接耦合或通信连接，可以是电性，机械或其它的形式。In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are only illustrative; the division of modules or units is only a logical functional division, and there may be other divisions in actual implementation. For example, multiple units or components may be combined or integrated into another apparatus, or some features may be ignored or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以使用硬件的形式实现,也可以使用软件功能单元的形式实现。In addition, each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit. The above integrated units can be implemented in the form of hardware or software functional units.
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、磁碟或者光盘等各种可以存储程序代码的介质。If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application are essentially or contribute to the existing technology, or all or part of the technical solution can be embodied in the form of a software product, and the software product is stored in a storage medium , including several instructions to cause a device (which can be a microcontroller, a chip, etc.) or a processor to execute all or part of the steps of the methods described in various embodiments of this application. The aforementioned storage media include: U disk, mobile hard disk, ROM, magnetic disk or optical disk and other media that can store program codes.
以上所述，仅为本申请的具体实施方式，但本申请的保护范围并不局限于此，任何在本申请揭露的技术范围内的变化或替换，都应涵盖在本申请的保护范围之内。因此，本申请的保护范围应以所述权利要求的保护范围为准。The above is only a specific implementation of the present application, but the protection scope of the present application is not limited thereto. Any change or substitution within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (16)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310378606.8A CN117130471B (en) | 2023-03-31 | 2023-03-31 | Human-computer interaction method, electronic equipment and system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310378606.8A CN117130471B (en) | 2023-03-31 | 2023-03-31 | Human-computer interaction method, electronic equipment and system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN117130471A true CN117130471A (en) | 2023-11-28 |
| CN117130471B CN117130471B (en) | 2024-07-26 |
Family
ID=88855304
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202310378606.8A Active CN117130471B (en) | 2023-03-31 | 2023-03-31 | Human-computer interaction method, electronic equipment and system |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN117130471B (en) |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109634503A (en) * | 2018-07-26 | 2019-04-16 | 维沃移动通信有限公司 | A kind of operation response method and mobile terminal |
| CN111142675A (en) * | 2019-12-31 | 2020-05-12 | 维沃移动通信有限公司 | Input method and head-mounted electronic equipment |
| CN112019877A (en) * | 2020-10-19 | 2020-12-01 | 深圳乐播科技有限公司 | Screen projection method, device and equipment based on VR equipment and storage medium |
| CN114510186A (en) * | 2020-10-28 | 2022-05-17 | 华为技术有限公司 | Cross-device control method and device |
| CN113225610A (en) * | 2021-03-31 | 2021-08-06 | 北京达佳互联信息技术有限公司 | Screen projection method, device, equipment and storage medium |
| CN115756268A (en) * | 2021-09-03 | 2023-03-07 | 华为技术有限公司 | Cross-device interaction method and device, screen projection system and terminal |
Also Published As
| Publication number | Publication date |
|---|---|
| CN117130471B (en) | 2024-07-26 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN112558825B (en) | Information processing method and electronic device | |
| EP4057135B1 (en) | Display method for electronic device having foldable screen, and electronic device | |
| CN112717370B (en) | A control method and electronic device | |
| US12254174B2 (en) | Method and device for displaying two application interfaces in folded and unfolded states of the device | |
| CN113220139B (en) | Method for controlling display of large-screen equipment, mobile terminal and first system | |
| CN114895861B (en) | A message processing method, related device and system | |
| CN112835445B (en) | Interaction method, device and system in virtual reality scene | |
| EP3846427B1 (en) | Control method and electronic device | |
| WO2020177585A1 (en) | Gesture processing method and device | |
| WO2021036770A1 (en) | Split-screen processing method and terminal device | |
| WO2023030099A1 (en) | Cross-device interaction method and apparatus, and screen projection system and terminal | |
| WO2022057644A1 (en) | Device interaction method, electronic device, and interactive system | |
| CN113821129B (en) | Display window control method and electronic equipment | |
| US20230119849A1 (en) | Three-dimensional interface control method and terminal | |
| CN110830645A (en) | An operating method and electronic device | |
| CN113391775A (en) | Man-machine interaction method and equipment | |
| US20250373579A1 (en) | Message processing method, electronic device, and readable storage medium | |
| WO2022134691A1 (en) | Method and device for screech processing in terminal device, and terminal | |
| CN112738332A (en) | Electronic equipment interaction method and electronic equipment | |
| WO2022206659A1 (en) | Screencast method and related apparatus | |
| CN116932101A (en) | Interface display method and electronic equipment | |
| US12436610B2 (en) | Display method and electronic device | |
| CN117130471B (en) | Human-computer interaction method, electronic equipment and system | |
| CN117130472B (en) | Virtual space operation guide display method, mobile device and system | |
| CN116225274A (en) | Identification method and device for touch operation, electronic equipment and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| GR01 | Patent grant | ||
| CP03 | Change of name, title or address | Address after: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040; Patentee after: Honor Terminal Co.,Ltd.; Country or region after: China; Address before: 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong; Patentee before: Honor Device Co.,Ltd.; Country or region before: China |
| CP03 | Change of name, title or address |