
CN113918015B - Interaction method and device for augmented reality - Google Patents


Info

Publication number
CN113918015B
Authority
CN
China
Prior art keywords
display area
display
spherical
limb
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111171422.1A
Other languages
Chinese (zh)
Other versions
CN113918015A (en)
Inventor
梁效富
高磊
武永超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center, Samsung Electronics Co Ltd filed Critical Samsung Electronics China R&D Center
Priority to CN202111171422.1A priority Critical patent/CN113918015B/en
Publication of CN113918015A publication Critical patent/CN113918015A/en
Priority to PCT/KR2022/015033 priority patent/WO2023059087A1/en
Application granted granted Critical
Publication of CN113918015B publication Critical patent/CN113918015B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/0209Systems with very large relative bandwidth, i.e. larger than 10 %, e.g. baseband, pulse, carrier-free, ultrawideband
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415Identification of targets based on measurements of movement associated with the target
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present application discloses an augmented reality interaction method and device, wherein the method comprises: in an augmented reality (AR) scene, monitoring a user's designated body movements based on an ultra-wideband (UWB) radar sensor, wherein the coverage area of the UWB radar sensor overlaps with the AR display area; when a designated body movement is monitored, rendering the AR scene according to the operation corresponding to the body movement; and, if the operation corresponding to the body movement is an operation on a specific object in the scene, determining the target operation object in the AR display area according to the body positioning coordinates obtained during the corresponding monitoring. With the present application, convenience of operation is preserved while the accuracy of operation recognition is improved.

Description

Interaction method and device for augmented reality
Technical Field
The present invention relates to computer application technology, and in particular, to an Augmented Reality (AR) interaction method and apparatus.
Background
For interaction in AR scenes, one existing scheme uses a camera to capture a user's operation gestures and executes the corresponding operations on target objects in the AR scene according to the captured gestures. Because this scheme does not require external equipment such as a keyboard or mouse, it avoids the cumbersome input that such equipment imposes on a user wearing an AR headset.
In the course of implementing the present invention, the inventors found that this scheme relies on an image capturing device to capture the user's operation gestures. The recognition precision of such a device is limited, and the target operation object in the AR scene is easily occluded by other display content, so user operations cannot be accurately recognized from the captured images; recognition accuracy is particularly poor for continuous actions.
Disclosure of Invention
In view of the above, the main objective of the present invention is to provide an interaction method and device for Augmented Reality (AR) that improve the accuracy of user operation recognition in AR scenes while preserving convenience of user operation.
In order to achieve the above purpose, the technical solution provided by the embodiment of the present application is as follows:
an augmented reality interaction method, comprising:
In an AR scene, monitoring a user's designated limb movements based on an ultra-wideband (UWB) radar sensor; wherein the coverage area of the UWB radar sensor overlaps with the AR display area;
When the designated limb action is monitored, rendering the AR scene accordingly according to the operation corresponding to the limb action; wherein, if the operation corresponding to the limb action is an operation on a specific object in the scene, the target operation object in the AR display area is determined according to the limb positioning coordinates obtained during the corresponding monitoring.
Preferably, the method further comprises:
The emission point of the UWB radar sensor is the view cone vertex of the AR display area; the origin of the spherical coordinate system for UWB radar positioning is the view cone vertex of the AR display area;
The determining the target operation object in the AR display area according to the limb positioning coordinates obtained during corresponding monitoring comprises:
and taking the display object positioned at the limb positioning coordinates in the spherical coordinate system as the target operation object.
Preferably, the method further comprises:
the AR display area is cut into a corresponding number of sub display areas in a spherical coordinate system by spherical surfaces having different spherical radii; the position of the display object in the AR display area in the spherical coordinate system is the position of the display object on the sphere corresponding to the sub-display area.
Preferably, the position of the display object in the spherical coordinate system is commonly characterized by a spherical radius R and coordinates (x, y, z); wherein R is the sphere radius of the sphere corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are the coordinates of the display object on the sphere corresponding to the sub-display area to which the display object belongs.
Preferably, the sphere radius is obtained based on a preset sphere radius interval.
Preferably, the specified limb motion is a gesture motion.
The embodiment of the invention also discloses an interaction device for augmented reality, which comprises:
The monitoring module is used for monitoring the user's designated limb actions based on the UWB radar sensor in the AR scene; wherein the coverage area of the UWB radar sensor overlaps with the AR display area;
the rendering module is used for rendering the AR scene according to the operation corresponding to the limb action when the designated limb action is monitored; and, if the operation corresponding to the limb action is an operation on a specific object in the scene, determining the target operation object in the AR display area according to the limb positioning coordinates obtained during the corresponding monitoring.
Preferably, the emission point of the UWB radar sensor is the vertex of a view cone of the AR display area; the origin of the spherical coordinate system for UWB radar positioning is the view cone vertex of the AR display area;
the rendering module is used for determining a target operation object in the AR display area according to limb positioning coordinates obtained during corresponding monitoring, and comprises the following steps:
and taking the display object positioned at the limb positioning coordinates in the spherical coordinate system as the target operation object.
Preferably, the AR display area of the AR scene is cut into a corresponding number of sub display areas by spheres having different sphere radii in a spherical coordinate system; the position of the display object in the AR display area in the spherical coordinate system is the position of the display object on the sphere corresponding to the sub-display area.
The embodiment of the application also discloses a non-volatile computer readable storage medium, which stores instructions, characterized in that the instructions, when executed by a processor, cause the processor to perform the steps of the augmented reality interaction method as described above.
The embodiment of the application also discloses an electronic device which comprises the nonvolatile computer readable storage medium and the processor which can access the nonvolatile computer readable storage medium.
According to the technical scheme above, the augmented reality interaction scheme provided by the embodiments of the invention monitors the user's designated limb actions with a UWB radar sensor in the AR scene and makes the coverage area of the UWB radar sensor overlap with the AR display area. When a designated limb action is monitored, the target operation object in the AR display area can therefore be determined quickly and accurately from the corresponding limb positioning coordinates, so that the user's operation instruction can be identified quickly and accurately.
Drawings
FIG. 1 is a schematic flow chart of a method according to an embodiment of the invention;
FIG. 2 is a schematic diagram of a spherical coordinate system according to an embodiment of the present invention;
FIG. 3 is a schematic view of a view cone according to an embodiment of the present invention;
FIG. 4 is a schematic cross-sectional view of a view cone cut into layers according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating a finger drag operation according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a gesture set according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a scenario in which an embodiment of the present invention is implemented;
FIG. 8 is a schematic diagram illustrating a process of moving an object in a first scene according to an embodiment of the present invention;
FIG. 9 is a second schematic view of a scenario in accordance with an embodiment of the present invention;
FIG. 10 is a schematic diagram illustrating a process of rotating a cart in a second scenario according to an embodiment of the present invention;
FIG. 11 is a schematic diagram illustrating a process of copying/pasting objects in an AR scene according to an embodiment of the present invention;
Fig. 12 is a schematic view of a device structure according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and the embodiments, in order to make the objects, technical solutions and advantages of the present invention more apparent.
Fig. 1 is a schematic flow chart of an embodiment of the present invention, and as shown in fig. 1, an interaction method of augmented reality implemented by this embodiment mainly includes the following steps:
Step 101: in an AR scene, monitor the user's designated limb actions based on a UWB radar sensor, where the coverage area of the UWB radar sensor overlaps with the AR display area.
In this step, the user's designated limb actions in the AR scene are monitored with a UWB radar sensor, so that the positioning capability of UWB radar can be fully exploited and the designated limb actions can be captured quickly and accurately in real time. Moreover, because the coverage area of the UWB radar sensor overlaps with the AR display area, the limb positioning coordinates obtained during motion monitoring can be directly associated with specific objects in the AR display area, and the user can indicate the object to be operated on in the AR scene with a limb motion alone. This keeps operation convenient for the user while allowing the user's operation instruction to be identified quickly and accurately.
With UWB radar technology, the user's limb actions (including single-point stationary actions and continuous actions) can be accurately located and identified from the transmitted and returned waveforms. The monitoring of the user's limb movements in this step can be implemented with existing techniques and is not described further here.
In one embodiment, since the user's visual range from the origin of the field of view is cone-shaped, the UWB radar (transmit + receive) coverage area and the AR display area can be made to overlap, with low interaction overhead, by placing the emission point of the UWB radar sensor at the view cone vertex of the AR display area. In addition, the origin of the spherical coordinate system used for UWB radar positioning (shown in Fig. 2) is set at the view cone vertex of the AR display area (shown in Fig. 3), so that the positions of display contents in the AR display area can be identified directly in the same spherical coordinate system used for UWB radar positioning. Because the coordinate system that locates content in the AR display area coincides with the coordinate system that locates the UWB radar returns, the positioning coordinates of a limb action agree with the coordinates of the target operation object in the AR display area, and the target operation object can be located quickly and directly from the limb positioning coordinates.
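As a rough illustration of this shared coordinate system, the sketch below (a minimal example, not code from the patent) converts a point given in Cartesian coordinates relative to the view-cone apex into spherical coordinates; the function name and the angle convention are assumptions made here for illustration.

```python
import math

def to_spherical(x: float, y: float, z: float):
    """Convert a point given in Cartesian coordinates whose origin is the
    view-cone apex (also the UWB emission point) into spherical coordinates.

    Returns (r, theta, phi):
      r     -- radial distance from the apex
      theta -- polar angle measured from the +z (viewing) axis
      phi   -- azimuth angle in the x-y plane
    """
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(z / r) if r > 0 else 0.0
    phi = math.atan2(y, x)
    return r, theta, phi

# Because the radar and the display share the same origin, a limb echo at
# (x, y, z) and a display object at (x, y, z) yield identical (r, theta, phi),
# so no coordinate transformation between the two systems is needed.
```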
In one embodiment, to make it easier to identify the position of the target operation object in the AR display area, display objects are located in the spherical coordinate system as follows. The AR display area is cut, in the spherical coordinate system, by spheres centered at the coordinate origin and having different sphere radii, yielding a corresponding number of sub-display areas, each bounded by spheres of different radii. The position of a display object in the AR display area can then be identified accurately using the sphere of largest radius bounding the sub-display area it belongs to; that is, the position of a display object in the spherical coordinate system is its position on the sphere corresponding to its sub-display area.
In one embodiment, the position of each display object in the AR scene in a spherical coordinate system may be characterized in particular by a sphere radius R and coordinates (x, y, z) together.
Wherein R is the sphere radius of the sphere corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are the coordinates of the display object on the sphere corresponding to the sub-display area to which the display object belongs.
Here, the sphere corresponding to each sub-display area may specifically be the sphere with the maximum radius or the minimum radius bounding that sub-display area.
Fig. 4 shows a schematic cross-section of the view cone from the view origin to the far plane. Two points in the AR display area, P and Q, are shown: point P lies on the sphere of radius Rp, while point Q lies between the spheres of radii Rp and Rq and is associated with the sphere of radius Rq. The coordinates of P and Q on their respective spheres can be written P(Px, Py, Pz) and Q(Qx, Qy, Qz), so their positions can be marked Fp(Rp, Px, Py, Pz) and Fq(Rq, Qx, Qy, Qz), respectively.
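A minimal sketch of this (R, x, y, z) labelling is given below, assuming a sorted list of cutting-sphere radii and a rule of snapping each point to the bounding sphere of its sub-display area; the helper name, the radii, and the example values are illustrative assumptions, not values from the patent.

```python
import bisect
import math

def position_label(point, shell_radii):
    """Label a display point with (R, x, y, z), where R is the radius of the
    sphere bounding the sub-display area that contains the point.

    point       -- (x, y, z) Cartesian coordinates from the view-cone apex
    shell_radii -- sorted list of sphere radii that cut the AR display area
    """
    x, y, z = point
    r = math.sqrt(x * x + y * y + z * z)
    # Index of the first cutting sphere whose radius is >= r, i.e. the sphere
    # bounding the sub-display area containing the point.
    i = bisect.bisect_left(shell_radii, r)
    i = min(i, len(shell_radii) - 1)   # clamp to the far-plane shell
    return (shell_radii[i], x, y, z)

# Example in the spirit of Fig. 4: P lies just inside the shell of radius 1.0,
# Q lies between two shells and is labelled with the bounding radius 1.5.
shells = [0.5, 1.0, 1.5, 2.0]                    # assumed radii (e.g. metres)
Fp = position_label((0.3, 0.2, 0.9), shells)     # -> (1.0, 0.3, 0.2, 0.9)
Fq = position_label((0.4, 0.1, 1.2), shells)     # -> (1.5, 0.4, 0.1, 1.2)
```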
In one embodiment, the specific limb motion may be a gesture motion, but is not limited thereto, and may be a foot motion, for example.
Different designated limb motions may be defined from the transmitted and reflected waveforms. For example, for a finger drag motion as shown in Fig. 5, the finger selects a point P with position Fp(Rp, Px, Py, Pz) and is then dragged to a point Q with position Fq(Rq, Qx, Qy, Qz); this sequence is defined as a drag. When the finger dwells at a position, the position information of that point can be located accurately from the single-point echo signal.
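One possible in-code representation of such a drag event is sketched below; the class and field names are assumptions introduced here for illustration rather than an interface defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class PositionLabel:
    """Position of a point as (R, x, y, z) in the shared spherical system."""
    R: float   # radius of the sphere bounding the sub-display area
    x: float
    y: float
    z: float

@dataclass
class DragGesture:
    """A drag: the finger selects a point Fp and is released at a point Fq."""
    start: PositionLabel   # Fp(Rp, Px, Py, Pz), where the finger selects
    end: PositionLabel     # Fq(Rq, Qx, Qy, Qz), where the finger is released
```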
In practical applications, the set of designated limb actions may be chosen according to actual needs; as an example, a set of gesture actions such as that shown in Fig. 6 may be used, although the set is not limited thereto.
In one embodiment, the sphere radii may be generated from a preset sphere radius interval, that is, as a group of radii with equal spacing, so that the adjacent sub-display areas obtained after cutting the AR display area all have the same thickness.
The smaller the sphere radius interval, the finer the cutting granularity of the AR display area and the more accurately scene content can be located by its sub-display area; in practice, a person skilled in the art can set the interval to a suitable value according to actual needs.
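A small sketch of generating the cutting radii from such an interval, under assumed values for the interval and the far-plane distance:

```python
def shell_radii(far_plane: float, interval: float):
    """Radii of the spheres that cut the AR display area, one every
    `interval` units out to the far plane of the view cone."""
    count = int(far_plane // interval)
    return [interval * (i + 1) for i in range(count)]

# Assumed values: a 3.0-unit far plane cut every 0.5 units gives six
# sub-display areas; halving the interval doubles the granularity.
radii = shell_radii(far_plane=3.0, interval=0.5)   # -> [0.5, 1.0, ..., 3.0]
```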
Step 102: when a designated limb action is monitored, render the AR scene according to the operation corresponding to the limb action; if that operation is an operation on a specific object in the scene, determine the target operation object in the AR display area according to the limb positioning coordinates obtained during the corresponding monitoring.
In this step, if the currently monitored limb action is an operation on a specific object, the association between the limb positioning coordinates and the display content in the AR scene is used to determine the target operation object in the AR display area quickly and accurately from those coordinates.
In one embodiment, the following method may be specifically used to determine the target operation object:
The display object located at the limb positioning coordinates in the spherical coordinate system is taken as the target operation object.
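A hedged sketch of that lookup is shown below: display objects are kept in a simple registry keyed by identifier, and the limb positioning coordinates select the object whose stored position matches within a small tolerance. The registry structure, the tolerance, and the function name are assumptions for illustration.

```python
import math

def find_target_object(limb_pos, objects, tolerance=0.05):
    """Return the id of the display object located at the limb coordinates.

    limb_pos  -- (x, y, z) from the UWB radar, in the shared coordinate system
    objects   -- dict mapping object id -> (x, y, z) display position
    tolerance -- maximum distance (assumed units) still counted as a match
    """
    lx, ly, lz = limb_pos
    best_id, best_dist = None, tolerance
    for obj_id, (ox, oy, oz) in objects.items():
        d = math.dist((lx, ly, lz), (ox, oy, oz))
        if d <= best_dist:
            best_id, best_dist = obj_id, d
    return best_id

# Example: the limb coordinates fall on the sofa of scenario one below.
scene_objects = {"sofa": (0.3, 0.2, 0.9), "lamp": (1.1, -0.4, 1.6)}
target = find_target_object((0.31, 0.21, 0.9), scene_objects)   # -> "sofa"
```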
In this embodiment, interaction between the user and the AR scene is realized by introducing UWB radar technology and calibrating the UWB radar coverage area to overlap the AR display area; combined with marking the AR display area in layers, this improves the accuracy of identifying user operations at low operating cost. The implementation of the above method embodiment is described in detail below for several specific application scenarios.
Scene one: moving virtual objects in an AR scene
Fig. 7 is a schematic diagram of scenario one, and Fig. 8 shows the process of moving an object in this scenario. As shown in Fig. 8, the finger points to select a virtual object, drags it to the target area, and is released; the selected object is then re-rendered at the new position. In Fig. 7, a sofa (virtual object) is selected and moved, allowing a more reasonable home layout to be chosen.
Scene II: rotating virtual objects in an AR scene
Fig. 9 is a schematic diagram of scenario two, and Fig. 10 shows the process of rotating a toy car in this scenario. As shown in Fig. 10, the finger selects the virtual object, rotates through a certain angle (e.g., 180° clockwise), and is released; the selected object is re-rendered at its current position but rotated by 180°. In Fig. 9, the selected toy car (virtual object) is rotated so that more information on its other faces can be seen.
Scene III: copying/pasting virtual objects in an AR scene
Fig. 11 is a schematic diagram of copying/pasting an object in an AR scene. As shown in Fig. 11, the virtual object is selected with both hands; the left hand remains still while the right hand drags, and when the right hand is released, a copy of the originally selected virtual object is rendered at the release position. The two virtual objects are identical and differ only in position.
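Taken together, scenarios one to three amount to three small scene-graph updates followed by re-rendering. The sketch below is an illustrative toy implementation; the class, its methods, and the object representation are assumptions, not an API described in the patent.

```python
class ARScene:
    """Toy scene graph illustrating the three example interactions."""

    def __init__(self):
        # object id -> {"position": (x, y, z), "angle": degrees}
        self.objects = {}

    def move(self, obj_id, new_position):
        # Scenario one: select, drag, release -> re-render at the new position.
        self.objects[obj_id]["position"] = new_position
        self.render(obj_id)

    def rotate(self, obj_id, degrees):
        # Scenario two: select and rotate (e.g. 180 degrees clockwise), then
        # re-render at the same position with the new angle.
        self.objects[obj_id]["angle"] = (self.objects[obj_id]["angle"] + degrees) % 360
        self.render(obj_id)

    def copy(self, obj_id, new_id, new_position):
        # Scenario three: two-handed copy/paste -> an identical object is
        # rendered at the position where the right hand was released.
        self.objects[new_id] = dict(self.objects[obj_id], position=new_position)
        self.render(new_id)

    def render(self, obj_id):
        # Placeholder for the AR re-rendering step.
        print(f"render {obj_id}: {self.objects[obj_id]}")


scene = ARScene()
scene.objects["car"] = {"position": (0.4, 0.1, 1.2), "angle": 0}
scene.rotate("car", 180)                        # scenario two
scene.copy("car", "car_copy", (0.8, 0.1, 1.2))  # scenario three
```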
Based on the above method embodiment, the embodiment of the present application also discloses an augmented reality interaction device. As shown in Fig. 12, the interaction device includes:
The monitoring module is used for monitoring the user's designated limb actions based on the UWB radar sensor in the augmented reality (AR) scene; wherein the coverage area of the UWB radar sensor overlaps with the AR display area;
the rendering module is used for rendering the AR scene according to the operation corresponding to the limb action when the designated limb action is monitored; and, if the operation corresponding to the limb action is an operation on a specific object in the scene, determining the target operation object in the AR display area according to the limb positioning coordinates obtained during the corresponding monitoring.
In one embodiment, the emission point of the UWB radar sensor is the cone apex of the AR display area; the origin of the spherical coordinate system for UWB radar localization is the cone vertex of the AR display area.
The rendering module is specifically configured to determine, according to limb positioning coordinates obtained during corresponding monitoring, a target operation object in the AR display area, including: and taking the display object positioned at the limb positioning coordinates in the spherical coordinate system as the target operation object.
In one embodiment, the AR display area of the AR scene is cut in a spherical coordinate system by spheres having different spherical radii into a corresponding number of sub display areas; the position of the display object in the AR display area in the spherical coordinate system is the position of the display object on the sphere corresponding to the sub-display area.
In one embodiment, the specified limb motion is a gesture motion.
In one embodiment, the sphere radius is derived based on a preset sphere radius interval.
In one embodiment, the position of the display object in the spherical coordinate system may be characterized by a sphere radius R and coordinates (x, y, z) together.
The radius R is the sphere radius of the sphere corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are the coordinates of the display object on the sphere corresponding to the sub-display area to which the display object belongs.
Based on the above embodiments of the augmented reality interaction method, an embodiment of the present application also provides an augmented reality interaction electronic device comprising a processor and a memory; the memory stores an application executable by the processor for causing the processor to perform the augmented reality interaction method described above. Specifically, a system or apparatus may be provided with a storage medium storing software program code that realizes the functions of any of the above embodiments, and a computer (or CPU or MPU) of the system or apparatus may read out and execute the program code stored in the storage medium. Further, some or all of the actual operations may be performed by an operating system or the like running on the computer based on instructions of the program code. The program code read from the storage medium may also be written into a memory on an expansion board inserted into the computer, or into a memory in an expansion unit connected to the computer, after which a CPU or the like mounted on the expansion board or expansion unit performs part or all of the actual operations based on the instructions of the program code, thereby realizing the functions of any of the above embodiments of the augmented reality interaction method.
The memory may be implemented as various storage media such as an electrically erasable programmable read-only memory (EEPROM), a Flash memory (Flash memory), a programmable read-only memory (PROM), and the like. A processor may be implemented to include one or more central processors or one or more field programmable gate arrays, where the field programmable gate arrays integrate one or more central processor cores. In particular, the central processor or central processor core may be implemented as a CPU or MCU.
Embodiments of the present application implement a computer program product comprising computer programs/instructions which, when executed by a processor, implement the steps of an augmented reality interaction method as described above.
It should be noted that not all the steps and modules in the above processes and the structure diagrams are necessary, and some steps or modules may be omitted according to actual needs. The execution sequence of the steps is not fixed and can be adjusted as required. The division of the modules is merely for convenience of description and the division of functions adopted in the embodiments, and in actual implementation, one module may be implemented by a plurality of modules, and functions of a plurality of modules may be implemented by the same module, and the modules may be located in the same device or different devices.
The hardware modules in the various embodiments may be implemented mechanically or electronically. For example, a hardware module may include specially designed permanent circuits or logic devices (e.g., special purpose processors such as FPGAs or ASICs) for performing certain operations. A hardware module may also include programmable logic devices or circuits (e.g., including a general purpose processor or other programmable processor) temporarily configured by software for performing particular operations. As regards implementation of the hardware modules in a mechanical manner, either by dedicated permanent circuits or by circuits that are temporarily configured (e.g. by software), this may be determined by cost and time considerations.
In this document, "schematic" means "serving as an example, instance, or illustration," and any illustrations, embodiments described herein as "schematic" should not be construed as a more preferred or advantageous solution. For simplicity of the drawing, the parts relevant to the present invention are shown only schematically in the drawings, and do not represent the actual structure thereof as a product. Additionally, in order to simplify the drawing for ease of understanding, components having the same structure or function in some of the drawings are shown schematically with only one of them, or only one of them is labeled. In this document, "a" does not mean to limit the number of relevant portions of the present invention to "only one thereof", and "an" does not mean to exclude the case where the number of relevant portions of the present invention is "more than one". In this document, "upper", "lower", "front", "rear", "left", "right", "inner", "outer", and the like are used merely to indicate relative positional relationships between the relevant portions, and do not limit the absolute positions of the relevant portions.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (9)

1. An augmented reality interaction method, comprising:
in an augmented reality (AR) scene, monitoring a user's designated limb movements based on an ultra-wideband (UWB) radar sensor; wherein the coverage area of the UWB radar sensor overlaps with the AR display area; the emission point of the UWB radar sensor is the view cone vertex of the AR display area; the origin of the spherical coordinate system used for UWB radar positioning is the view cone vertex of the AR display area; and the AR display area is cut, in the spherical coordinate system, into a corresponding number of sub-display areas by spheres with different sphere radii;
when a designated limb movement is monitored, rendering the AR scene accordingly according to the operation corresponding to the limb movement; wherein, if the operation corresponding to the limb movement is an operation on a specific object in the scene, the target operation object in the AR display area is determined according to the limb positioning coordinates obtained during the corresponding monitoring; and wherein the position of a display object in the AR display area in the spherical coordinate system is the position of the display object on the sphere corresponding to the sub-display area to which it belongs.

2. The method according to claim 1, wherein determining the target operation object in the AR display area according to the limb positioning coordinates obtained during the corresponding monitoring comprises: taking the display object located at the limb positioning coordinates in the spherical coordinate system as the target operation object.

3. The method according to claim 1, wherein the position of the display object in the spherical coordinate system is jointly characterized by a sphere radius R and coordinates (x, y, z); wherein R is the sphere radius of the sphere corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are the coordinates of the display object on the sphere corresponding to the sub-display area to which the display object belongs.

4. The method according to claim 1, wherein the sphere radii are obtained based on a preset sphere radius interval.

5. The method according to claim 1, wherein the designated limb movement is a gesture movement.

6. An augmented reality interaction device, comprising:
a monitoring module, for monitoring a user's designated limb movements in an augmented reality (AR) scene based on an ultra-wideband (UWB) radar sensor; wherein the coverage area of the UWB radar sensor overlaps with the AR display area; the emission point of the UWB radar sensor is the view cone vertex of the AR display area; the origin of the spherical coordinate system used for UWB radar positioning is the view cone vertex of the AR display area; and the AR display area is cut, in the spherical coordinate system, into a corresponding number of sub-display areas by spheres with different sphere radii; and
a rendering module, for rendering the AR scene accordingly according to the operation corresponding to the limb movement when a designated limb movement is monitored; wherein, if the operation corresponding to the limb movement is an operation on a specific object in the scene, the target operation object in the AR display area is determined according to the limb positioning coordinates obtained during the corresponding monitoring; and wherein the position of a display object in the AR display area in the spherical coordinate system is the position of the display object on the sphere corresponding to the sub-display area to which it belongs.

7. The device according to claim 6, wherein the rendering module determines the target operation object in the AR display area according to the limb positioning coordinates obtained during the corresponding monitoring by taking the display object located at the limb positioning coordinates in the spherical coordinate system as the target operation object.

8. A non-volatile computer-readable storage medium storing instructions which, when executed by a processor, cause the processor to perform the steps of the augmented reality interaction method according to any one of claims 1 to 5.

9. An electronic device, comprising the non-volatile computer-readable storage medium according to claim 8 and a processor capable of accessing the non-volatile computer-readable storage medium.
CN202111171422.1A 2021-10-08 2021-10-08 Interaction method and device for augmented reality Active CN113918015B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111171422.1A CN113918015B (en) 2021-10-08 2021-10-08 Interaction method and device for augmented reality
PCT/KR2022/015033 WO2023059087A1 (en) 2021-10-08 2022-10-06 Augmented reality interaction method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111171422.1A CN113918015B (en) 2021-10-08 2021-10-08 Interaction method and device for augmented reality

Publications (2)

Publication Number Publication Date
CN113918015A CN113918015A (en) 2022-01-11
CN113918015B (en) 2024-04-19

Family

ID=79238238

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111171422.1A Active CN113918015B (en) 2021-10-08 2021-10-08 Interaction method and device for augmented reality

Country Status (2)

Country Link
CN (1) CN113918015B (en)
WO (1) WO2023059087A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20240047186A (en) * 2022-10-04 2024-04-12 삼성전자주식회사 Augmented reality apparatus and operating method thereof
CN117368869B (en) * 2023-12-06 2024-03-19 航天宏图信息技术股份有限公司 Visualization method, device, equipment and medium for radar three-dimensional power range

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106055089A (en) * 2016-04-27 2016-10-26 深圳市前海万象智慧科技有限公司 Control system for gesture recognition based on man-machine interaction equipment and control method for same
CN111149079A (en) * 2018-08-24 2020-05-12 谷歌有限责任公司 Smart phone, system and method including radar system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7942744B2 (en) * 2004-08-19 2011-05-17 Igt Virtual input system
WO2013032955A1 (en) * 2011-08-26 2013-03-07 Reincloud Corporation Equipment, systems and methods for navigating through multiple reality models
US9921657B2 (en) * 2014-03-28 2018-03-20 Intel Corporation Radar-based gesture recognition
US9761049B2 (en) * 2014-03-28 2017-09-12 Intel Corporation Determination of mobile display position and orientation using micropower impulse radar
EP3762190A4 (en) * 2018-03-05 2022-01-26 The Regents Of The University Of Colorado HUMAN-ROBOT INTERACTION AUGMENTED REALITY COORDINATION
EP3797345A4 (en) * 2018-05-22 2022-03-09 Magic Leap, Inc. TRANSMODAL INPUT FUSION FOR PORTABLE SYSTEM

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106055089A (en) * 2016-04-27 2016-10-26 深圳市前海万象智慧科技有限公司 Control system for gesture recognition based on man-machine interaction equipment and control method for same
CN111149079A (en) * 2018-08-24 2020-05-12 谷歌有限责任公司 Smart phone, system and method including radar system

Also Published As

Publication number Publication date
WO2023059087A1 (en) 2023-04-13
CN113918015A (en) 2022-01-11

Similar Documents

Publication Publication Date Title
KR102590841B1 (en) virtual object driving Method, apparatus, electronic device, and readable storage medium
Kim et al. Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality
CN105528082B (en) Three dimensions and gesture identification tracking exchange method, device and system
WO2018191091A1 (en) Identifying a position of a marker in an environment
CN113119104B (en) Mechanical arm control method, mechanical arm control device, computing equipment and system
CN113918015B (en) Interaction method and device for augmented reality
CN111433783B (en) Hand model generation method, device, terminal device and hand motion capture method
US11995254B2 (en) Methods, devices, apparatuses, and storage media for mapping mouse models for computer mouses
US20240203069A1 (en) Method and system for tracking object for augmented reality
US10162737B2 (en) Emulating a user performing spatial gestures
CN102221884A (en) Visual tele-existence device based on real-time calibration of camera and working method thereof
Park et al. Hand tracking with a near-range depth camera for virtual object manipulation in an wearable augmented reality
CN113139892A (en) Sight line track calculation method and device and computer readable storage medium
US20250272927A1 (en) Method for achieving virtual touch control with virtual three-dimensional cursor, a storage medium and a chip implementing said method
Wang et al. A mixed reality-based aircraft cable harness installation assistance system with fully occluded gesture recognition
Matulic et al. Above-Screen Fingertip Tracking with a Phone in Virtual Reality
CN115892528B (en) Path planning method, device, equipment and medium for multi-arm spacecraft mobile mission
US11429247B1 (en) Interactions with slices of medical data in augmented reality
US20230267692A1 (en) Mixed reality processing system and mixed reality processing method
CN119579941A (en) Point cloud annotation method, device and storage medium
CN110471577B (en) 360-degree omnibearing virtual touch control method, system, platform and storage medium
CN118034483A (en) Gesture recognition method, apparatus, device, storage medium and program product
CN107944337A (en) A kind of low target intelligent-tracking method and system, storage medium and electric terminal
CN114373016A (en) Method for positioning implementation point in augmented reality technical scene
US12373985B2 (en) Positioning method and apparatus, device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant