
CN111176445A - Identification method, terminal device and readable storage medium of interactive device - Google Patents

Identification method, terminal device and readable storage medium of interactive device

Info

Publication number
CN111176445A
CN111176445A
Authority
CN
China
Prior art keywords
sub
markers
marker
interaction device
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911342720.5A
Other languages
Chinese (zh)
Other versions
CN111176445B (en)
Inventor
胡永涛
戴景文
贺杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Virtual Reality Technology Co Ltd
Original Assignee
Guangdong Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Virtual Reality Technology Co Ltd filed Critical Guangdong Virtual Reality Technology Co Ltd
Priority to CN201911342720.5A
Publication of CN111176445A
Application granted
Publication of CN111176445B
Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract



An embodiment of the present application provides a method for identifying an interaction device. The interaction device includes a first interaction device and a second interaction device; the first interaction device is provided with a first marker and the second interaction device with a second marker, where the first marker is distinguishable from the second marker and both markers include multiple sub-markers. The method first identifies the sub-markers in a target image; after the first marker and the second marker have been recognized, it obtains the current pose information of the interaction device corresponding to each marker, and finally uses that pose information to determine whether the interaction device corresponds to the left hand or the right hand, so that both interaction devices can be accurately identified and tracked, improving the accuracy of interaction. Embodiments of the present application further provide a terminal device and a computer-readable storage medium.


Description

Interactive device identification method, terminal device and readable storage medium
Technical Field
The present application relates to the field of interactive device control, and more particularly, to an interactive device identification method, a terminal device, and a readable storage medium.
Background
In recent years, with the progress of science and technology, technologies such as Augmented Reality (AR) and Virtual Reality (VR) have become hot spots of research at home and abroad.
In conventional technology, when augmented reality or mixed reality is displayed by superimposing virtual content on a real scene image, a user can control or interact with the virtual content through a single controller. In some scenarios, however, the user needs to interact through dual controllers, and the identification of the dual controllers must be improved to ensure the accuracy of the interaction.
Disclosure of Invention
The embodiments of the application provide an identification method for an interaction device, which can identify dual interaction devices and improve interactivity.
In one aspect, an embodiment of the application provides an identification method for an interaction device, applied to a terminal device. The interaction device comprises a first interaction device and a second interaction device; the first interaction device is provided with a first marker and the second interaction device with a second marker, where the first marker differs from the second marker and both markers comprise a plurality of sub-markers. The method comprises the following steps: acquiring a target image containing sub-markers; identifying the sub-markers in the target image, grouping them, and acquiring a grouping result; determining, based on the grouping result, the current pose information of the interaction device to which the sub-markers classified into the same group belong, according to the image coordinates of those sub-markers in the target image; and determining, according to the current pose information, whether that interaction device corresponds to the left hand or the right hand.
In another aspect, an embodiment of the present application provides a terminal device that includes one or more processors, a memory, and one or more application programs, where the one or more application programs are stored in the memory and configured to be executed by the one or more processors to perform the method described above.
In another aspect, an embodiment of the present application provides a computer-readable storage medium, in which a program code is stored, and the program code can be called by a processor to execute the method described above.
According to the identification method of the interaction device, the terminal device, and the readable storage medium provided by the embodiments of the present application, the sub-markers in the target image are identified so as to recognize the first marker and the second marker; the pose information of the interaction device corresponding to each marker is then acquired; and finally the left hand or the right hand corresponding to the interaction device is determined from that pose information, so that the dual interaction devices are accurately identified and tracked and the interaction accuracy is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a schematic diagram of an identification tracking system according to an embodiment of the present application.
Fig. 2 is a block diagram of a terminal device according to an embodiment of the present disclosure.
Fig. 3 is a flowchart of an identification method of an interactive device according to an embodiment of the present disclosure.
Fig. 4 is a flowchart of another identification method for an interactive device according to an embodiment of the present disclosure.
Fig. 5 is a block flow diagram of an embodiment of step S203 of fig. 4.
Fig. 6 is a block flow diagram of an embodiment of step S205 of fig. 4.
Fig. 7 is a block flow diagram of another embodiment of step S205 of fig. 4.
Fig. 8 is a schematic diagram of the arrangement of the sub-markers of the first marker according to the first rule and the arrangement of the sub-markers of the second marker according to the second rule provided in the embodiment of the present application.
Fig. 9 is a schematic diagram of establishing a first spatial coordinate system and a second spatial coordinate system according to the first rule and the second rule shown in fig. 8.
Fig. 10 is another schematic diagram of establishing a first spatial coordinate system and a second spatial coordinate system according to the first rule and the second rule shown in fig. 8.
Fig. 11 is a block flow diagram of an embodiment of step S207 of fig. 4.
Fig. 12 is a block diagram of an identification apparatus of an interaction apparatus according to an embodiment of the present application.
Fig. 13 is a block diagram of a structure of a readable storage medium according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Referring to fig. 1 and fig. 2 together, an identification tracking system 10 of an interactive device 200 according to an embodiment of the present application is shown, which includes a terminal device 100 and the interactive device 200. The terminal device 100 and the interaction apparatus 200 may be connected wirelessly through Bluetooth, Wi-Fi (Wireless-Fidelity), ZigBee, and the like, may be connected through a data line or similar, or may not be connected at all. The connection mode of the terminal device 100 and the interaction apparatus 200 is not limited in the embodiments of the present application.
In some embodiments, the terminal device 100 may be a head-mounted display device, and may also be a mobile device such as a mobile phone and a tablet. When the terminal device 100 is a head-mounted display device, the head-mounted display device may be an integrated head-mounted display device. The terminal device 100 may also be an external head-mounted display device, and is connected to an intelligent terminal such as a mobile phone as a processing and storage device of the head-mounted display device, that is, the terminal device 100 may be inserted into or connected to the intelligent terminal to display virtual content on the external head-mounted display device. In some embodiments, the terminal device 100 may be inserted or connected to an external head-mounted display device as a processing and storage device of the head-mounted display device, and display the virtual content in the head-mounted display device.
In one embodiment, terminal device 100 includes one or more processors 110, memory 130. Wherein the memory 130 stores one or more application programs 150, the one or more application programs 150 are configurable to be executed by the one or more processors 110 to implement the identification method of the interaction apparatus 200.
Processor 110 may include one or more processing cores. The processor 110 connects the various parts of the terminal device 100 using various interfaces and lines, and performs the functions of the terminal device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 130 and by calling data stored in the memory 130. Optionally, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like, where the CPU mainly handles the operating system, user interface, and application programs; the GPU renders and draws display content; and the modem handles wireless communication. The modem may also not be integrated into the processor 110 and instead be implemented by a separate communication chip.
The memory 130 may include Random Access Memory (RAM) or Read-Only Memory (ROM), and may be used to store instructions, programs, code, code sets, or instruction sets. The memory 130 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the method embodiments described below, and the like. The data storage area may store data created by the terminal device 100 in use.
In the embodiment of the present application, the terminal device 100 further includes a camera, and the camera is configured to capture an image of a real object and capture a scene image of a target scene. The camera may be an infrared camera or a visible light camera, and the specific type is not limited in the embodiments of the present application.
In one embodiment, the terminal device is a head-mounted display device, and may further include one or more of the following components in addition to the processor, the memory, and the image sensor described above: display device module, optical module, communication module and power.
The display module may include a display control unit. The display control unit receives the display image of the virtual content rendered by the processor and projects it onto the optical module, so that the user can view the virtual content through the optical module. The display device may be a display screen or a projection device, used to display an image.
The optical module can adopt an off-axis optical system or a waveguide optical system as a transflective lens, so that the display image shown by the display device is reflected by the optical module into the eyes of the user; the user thus sees the display image projected by the display device through the optical module. In some embodiments, the user can also observe the real environment through the optical module, so what the user's eyes receive is the display image of the virtual content overlaid on the real environment, producing an augmented reality scene effect.
The communication module can be a Bluetooth, Wi-Fi (Wireless-Fidelity), or ZigBee module, through which the head-mounted display device can be communicatively connected to the terminal equipment and exchange information and instructions with it. For example, the head-mounted display device may receive image data transmitted from the terminal device via the communication module, and generate and display virtual content of a virtual world from the received image data.
The power supply can supply power for the whole head-mounted display device, and the normal operation of each part of the head-mounted display device is ensured.
Referring to fig. 1, the interactive device 200 includes a first interactive device 210 and a second interactive device 230; the first interactive device 210 is provided with a first marker 211, the second interactive device 230 with a second marker 231, and the first marker 211 differs from the second marker 231. The terminal device 100 can therefore identify, that is, distinguish, the first interaction device 210 and the second interaction device 230 through the distinguishable first marker 211 and second marker 231.
The user enters a preset virtual scene using the terminal device 100, and when the interaction apparatus 200 is within the visual range of the camera, the camera captures a target image including the first interaction apparatus 210 and the second interaction apparatus 230. The processor 110 obtains the first marker 211 and the second marker 231 contained in the target image together with their related information, and calculates the relative position information between each of the first interaction device 210 and the second interaction device 230 and the terminal device 100. In some embodiments, it can then be determined whether the user's left hand or right hand corresponds to the first interaction device 210 or the second interaction device 230, so as to accurately identify and track the interaction device 200 and improve the accuracy of the interaction.
Based on the identification tracking system of the interaction device, the embodiment of the application provides an identification method of the interaction device, which is applied to the terminal equipment and the interaction device of the identification tracking system.
Referring to fig. 3, an embodiment of the present application provides an identification method for an interaction device, applicable to the terminal device described above; the method is used to identify the first interaction device and the second interaction device and to track at least one of them. The method includes:
step S101: an image of the target containing the sub-marker is acquired.
Specifically, when part or all of the first interaction device and the second interaction device is within the field of view of the camera of the terminal device, the camera acquires an image containing the markers on the interaction devices; this image is the target image, which the processor then obtains from the camera. The target image should include at least a plurality of sub-markers, each of which may belong to either the first marker or the second marker.
Step S103: and identifying the sub-markers in the target image, grouping the sub-markers in the target image and acquiring a grouping result.
In some embodiments, the sub-markers in the target image may be grouped according to the arrangement information and/or the identity information of the sub-markers in the target image, and a grouping result may be obtained. In the embodiments of the present application, the markers comprise a first marker and a second marker, and thus the grouping result comprises a first group and a second group. The arrangement information of the sub-markers in the target image refers to their distribution in the target image, for example, the relative positional relationship of the other sub-markers with respect to one sub-marker used as a reference. The identity information of the sub-markers in the target image may be used to identify each sub-marker, and may include, for example, the number of the sub-marker.
Step S105: and determining the current pose information of the interaction devices to which the sub-markers classified into the same group belong according to the image coordinates of the sub-markers classified into the same group in the target image based on the grouping result.
In the embodiment of the present application, the sub-markers of the first group can be identified and judged, for example by identifying their identities and determining their arrangement rule, to determine whether they belong to the first marker or the second marker. Correspondingly, the sub-markers of the second group are identified and judged in the same way to determine whether they belong to the first marker or the second marker.
In some embodiments, the first marker may include a plurality of sub-markers, each having different identity information, and the second marker likewise includes a plurality of sub-markers, each having different identity information. The identity information may be represented by the number of the sub-marker, for example, but not limited to, numbering the sub-markers 1, 2, 3, and so on. The identity information of a sub-marker can be determined by the number of feature points it includes: for example, a sub-marker containing 1 feature point corresponds to number 1, a sub-marker containing 2 feature points corresponds to number 2, and so on. The identity information may also be determined according to the shape or color of the sub-marker, or the shape of its feature points, which is not limited herein.
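As a minimal illustration of the feature-point-count scheme just described (the function name and the validation are illustrative additions, not from the patent), the mapping from a sub-marker to its number could look like:

```python
def sub_marker_identity(feature_point_count: int) -> int:
    """Assign identity by feature-point count: a sub-marker with n
    feature points gets identity number n (one of the schemes the text
    names; shape- or colour-based schemes are equally possible)."""
    if feature_point_count < 1:
        raise ValueError("a sub-marker must contain at least one feature point")
    return feature_point_count
```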
The first marker may comprise a sub-marker having different identity information from the second marker. In some embodiments, the first and second markers may also comprise sub-markers of the same identity information, but the sub-markers are arranged differently.
In one embodiment, when the first group of sub-markers is determined to belong to the first marker, the second group can be directly determined to belong to the second marker. Alternatively, whether the sub-markers of the second group belong to the second marker may be verified by the same method used to determine that the first group belongs to the first marker.
Further, the current pose information of the interaction device to which the sub-markers classified into the same group belong is determined according to the image coordinates of those sub-markers in the target image. It should be noted that the pose information of the interaction device includes its position information and its attitude information, where the attitude information may include the rotation direction and rotation angle of the interaction device relative to the terminal device.
If the sub-markers classified into the same group belong to the first marker, the interaction device to which they belong is the first interaction device, so the current pose information of the first interaction device is determined according to the image coordinates of the first marker in the target image. Likewise, if the sub-markers classified into the same group belong to the second marker, the interaction device to which they belong is the second interaction device, so the current pose information of the second interaction device is determined according to the image coordinates of the second marker in the target image.
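The patent does not spell out how pose is computed from image coordinates. One standard ingredient, shown here purely as a hedged sketch, is the pinhole-camera relation that recovers a marker's depth from its known physical size; a full pose solver (for example, PnP over all sub-marker coordinates) would also recover orientation:

```python
def estimate_marker_depth(focal_length_px: float,
                          marker_width_m: float,
                          apparent_width_px: float) -> float:
    """Pinhole relation Z = f * W / w: a marker of physical width W metres,
    imaged w pixels wide by a camera of focal length f pixels, lies roughly
    Z metres from the camera. Illustrative only; not the patent's algorithm."""
    if apparent_width_px <= 0:
        raise ValueError("apparent width must be positive")
    return focal_length_px * marker_width_m / apparent_width_px
```

For instance, a 5 cm marker imaged 40 pixels wide by a camera with an 800-pixel focal length is estimated to be about 1 m away.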
Step S107: and determining that the interaction devices belonging to the sub-markers classified into the same group correspond to the left hand or the right hand according to the current pose information.
In the embodiment of the application, the hand corresponding to each interaction device can be determined according to its pose information. For example, when the interaction devices are held by a user, which of the two devices is held in the left hand and which in the right hand can be determined from the pose information of the first interaction device and the second interaction device respectively.
According to the identification method of the interaction device described above, the sub-markers in the target image are identified so as to recognize the first marker and the second marker; the current pose information of the interaction device corresponding to each marker is then obtained; and finally the left hand or the right hand corresponding to the interaction device is determined from that pose information, so that the dual interaction devices are accurately identified and tracked and the interaction accuracy is improved.
Referring to fig. 4, an embodiment of the present application provides another identification method for an interactive device, which can be applied to the terminal device; this method is used to identify either of the first interactive device and the second interactive device and to track the identified device. The method may include the following steps:
step S201: and when the first interaction device and the second interaction device start to be used, calibrating the reference corresponding states of the first interaction device and the second interaction device.
As an embodiment, the reference corresponding state of the first interaction device and the second interaction device may be determined according to their relative positional relationship when they start to be used. Based on this, step S201 may include: when the first interaction device and the second interaction device start to be used, acquiring a reference image containing the first marker and the second marker, and determining a first relative position of the first marker and the second marker in the reference image; and determining a second relative position of the first interaction device and the second interaction device according to the first relative position, and calibrating the reference corresponding state of the two interaction devices according to the second relative position.
When the first interaction device and the second interaction device start to be used, a reference image containing the first marker and the second marker can be acquired through the terminal device. The reference image is an image of the first interaction device and the second interaction device in an initial state; in the embodiments of the present application, the initial state is to be understood as the state when the two interaction devices start to be used.
In one embodiment, when the terminal device collects the reference image, it may first determine whether the pose information of the first interaction device and the second interaction device is within a preset reference range, and collect the reference image and calibrate the reference corresponding state only if it is. The reference range can be preset as a pose range in which each interaction device roughly faces the terminal device; this prevents improper use, such as a flipped device, from making the subsequent identification and tracking of the interaction devices inaccurate.
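A sketch of the pre-calibration check described above; the yaw/pitch parametrisation and the angular window are assumptions, since the patent only says the device should roughly face the terminal:

```python
def within_reference_range(yaw_deg: float, pitch_deg: float,
                           max_angle_deg: float = 45.0) -> bool:
    """True when the device roughly faces the terminal, i.e. both its yaw
    and pitch lie inside a preset angular window (45 degrees here is an
    illustrative choice, not a value from the patent)."""
    return abs(yaw_deg) <= max_angle_deg and abs(pitch_deg) <= max_angle_deg
```

Only when both devices pass this check would the reference image be captured and the reference corresponding state calibrated.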
Further, the second relative position of the first and second interaction devices may be determined from the first relative position of the first and second markers in the reference image. For example, if the first marker is located to the left of the second marker in the reference image, it may be determined that the first interaction device is located to the left of the second interaction device; if the first marker is located to the right of the second marker, the first interaction device is located to the right of the second interaction device. Here "left" and "right" are determined with respect to the left and right sides of the reference image: for example, if the first marker is on the left side of the reference image and the second marker on the right side, the first marker is determined to be to the left of the second marker.
Further, the reference corresponding states of the first interaction device and the second interaction device are calibrated according to the second relative position, where the reference corresponding state indicates whether the first interaction device and the second interaction device each correspond to the left hand or the right hand. Taking as an example a second relative position in which the first interaction device is located on the left side of the second interaction device, the reference corresponding state may be that the first interaction device corresponds to the left hand and the second interaction device to the right hand. It should be noted that, depending on the convention adopted (the camera's view of the devices mirrors the user's own view), the reference corresponding state in this case may instead be that the first interaction device corresponds to the right hand and the second interaction device to the left hand.
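The calibration rule above can be sketched as follows; the convention that a smaller image x-coordinate means "left", and the left-marker-equals-left-hand assignment, are assumptions the patent explicitly leaves open:

```python
def calibrate_reference_state(first_marker_x: float,
                              second_marker_x: float) -> dict:
    """Derive the reference corresponding state from the markers'
    x-coordinates in the reference image (x grows toward the image's
    right edge). The device whose marker lies further left is taken as
    the left-hand device; the mirrored convention is equally valid."""
    if first_marker_x < second_marker_x:
        return {"first": "left hand", "second": "right hand"}
    return {"first": "right hand", "second": "left hand"}
```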
Step S202: an image of the target containing the sub-marker is acquired.
Step S202 is similar to step S101, and is not described herein.
Step S203: and identifying the sub-markers in the target image, grouping the sub-markers in the target image and acquiring a grouping result.
In some embodiments, the sub-markers of the first marker have non-repeating identity information and the sub-markers of the second marker have non-repeating identity information, where the identity information of the sub-markers of the first marker is in one-to-one correspondence with, and identical to, that of the sub-markers of the second marker. The identity information of each sub-marker in the target image can then be acquired by identifying the sub-markers in the target image; sub-markers with the same identity information are then divided into different groups.
For example, suppose identity information is given by numbers, the sub-markers of the first marker are numbered 1 through 8, and the sub-markers of the second marker are likewise numbered 1 through 8; for convenience of description, denote sub-markers belonging to the first marker by "A" and those belonging to the second marker by "B". While identifying the target image, the "A" sub-markers cannot yet be distinguished from the corresponding "B" sub-markers: the numbers 1 through 8 are each recognized on two sub-markers, and the two sub-markers sharing the same number can then be divided into different groups.
As an example, after identifying the sub-markers in the target image, the sub-markers may be grouped according to the relative distance between each sub-marker and a reference sub-marker, and the grouping result obtained. Based on this, referring to fig. 5, step S203 may include steps S301 to S305:
step S301: a fiducial sub-marker is determined.
Specifically, the reference sub-marker is any one of the sub-markers of the target image, which is used as a reference for all the sub-markers in the target image, and is also a reference point for grouping all the sub-markers in the target image in the present embodiment.
For example, the target image includes a plurality of sub-markers, and one of them is randomly selected as the reference sub-marker. It should be noted that, when the reference sub-marker is actually determined, it is not yet known whether it belongs to the first marker or the second marker.
Step S303: relative distance information between the fiducial sub-marker and other sub-markers in the target image is obtained.
After the reference sub-marker is determined, relative distance information between other sub-markers in the target image and the reference sub-marker may be acquired with reference to the reference sub-marker.
Step S305: and grouping the sub-markers in the target image with the relative distance information larger than the distance threshold into one group, and grouping the sub-markers with the relative distance information smaller than or equal to the distance threshold into another group.
Wherein the sub-markers in the target image include sub-markers whose relative distance information is greater than the distance threshold, and sub-markers whose relative distance information is less than or equal to the distance threshold. Since the first marker and the second marker belong to different interaction devices, if the reference sub-marker is a sub-marker of the first marker, the relative distance between it and the other sub-markers of the first marker will generally be significantly smaller than the relative distance between it and the sub-markers of the second marker.
For example, if the reference sub-marker belongs to the first marker, the relative distance between it and the other sub-markers of the first marker will generally be smaller than the distance between it and the sub-markers of the second marker. Therefore, by comparing the relative distance information with the distance threshold, grouping the sub-markers whose relative distance information is greater than the distance threshold into one group and those whose relative distance information is less than or equal to the distance threshold into another group, an approximate grouping of the sub-markers of the first marker and the sub-markers of the second marker can be achieved. In some embodiments, the distance threshold may be the maximum relative distance between any sub-marker of the first marker and the other sub-markers of the first marker, or the maximum relative distance between any sub-marker of the second marker and the other sub-markers of the second marker. By grouping according to the relative distance information between the reference sub-marker and the other sub-markers, the sub-markers in the target image can be grouped quickly, improving the identification speed of the interaction devices.
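Steps S301 to S305 can be sketched as follows. This is a hedged Python illustration, not code from the patent: the point format (`id -> (x, y)` image coordinates) and the use of `random.choice` for the reference sub-marker are assumptions for the example.

```python
import math
import random

def group_by_distance(points, threshold):
    """Pick a random reference sub-marker (step S301) and split the rest
    by their distance to it (steps S303 and S305).
    `points` maps sub-marker id -> (x, y) image coordinates."""
    ids = list(points)
    ref = random.choice(ids)  # any sub-marker may serve as the reference
    near, far = [ref], []
    for sid in ids:
        if sid == ref:
            continue
        d = math.dist(points[sid], points[ref])        # step S303
        (near if d <= threshold else far).append(sid)  # step S305
    return near, far
```

The group containing the reference sub-marker gathers the sub-markers of the same physical marker; which marker that is must still be determined in step S205.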
Step S205: according to the grouping result, it is determined that the sub-markers classified into the same group belong to the first marker or the second marker.
In the embodiments of the present application, whether the sub-markers of the first group belong to the first marker or the second marker can be determined by identifying and analyzing them, for example, by identifying the identity of the sub-markers and determining their arrangement rule. Correspondingly, the sub-markers of the second group are identified and analyzed in the same way, for example by identifying the identity of the sub-markers and determining their arrangement rule, thereby determining whether the markers of the second group belong to the first marker or the second marker.
As an embodiment, after the sub-markers in the target image are identified and grouped, whether the sub-markers grouped in the same group belong to the first marker or the second marker may be determined through historical frame image information of the markers. Based on this, referring to fig. 6, step S205 may further include steps S401 to S409:
step S401: historical frame images containing a first marker and a second marker are acquired.
The historical frame image is the target image of the frame, or of several frames, preceding the target image acquired in step S202; that is, the historical frame image is an image containing the first marker and the second marker acquired at a historical time node. Further, the recognition results of the first marker and the second marker are already contained in the historical frame image; in other words, each sub-marker in the historical frame image has already been determined to belong to the first marker or the second marker.
Step S403: first image coordinates of a sub-marker of the first marker and second image coordinates of a sub-marker of the second marker in the historical frame images are identified.
The image coordinate system is a plane coordinate system established by the plane of the target image.
Further, the coordinates of the sub-markers of the first marker in the history frame image in the image coordinate system are identified and recorded as first image coordinates. The coordinates of the sub-markers of the second marker in the historical frame image in the image coordinate system are identified and recorded as second image coordinates.
Step S405: the current image coordinates of the sub-markers grouped into the same group in the target image are acquired.
And acquiring the coordinates of the sub-markers grouped into the same group in the target image and recording the coordinates as the current image coordinates. The sub-markers classified as the same group are the first group or the second group in step S203, and for convenience of description, the sub-markers classified as the same group are described as the sub-markers of the first group. Accordingly, the processing method of the second group of sub-markers is similar to that of the first group of sub-markers, and can be performed by referring to the processing method, which is not described herein again. The image coordinate system in which the target image in step S405 is located and the image coordinate system in which the history frame image in step S403 is located are the same coordinate system.
Step S407: and acquiring a first displacement degree of the current image coordinate relative to the first image coordinate.
Wherein the current image coordinates may be used to identify the position of the first set of sub-markers in the current target image and the first image coordinates may be used to identify the position of the sub-markers of the first marker in the historical frame images. Further, a degree of displacement between the sub-markers of the first set and the sub-markers of the first marker may be obtained, i.e. a first degree of displacement of the current image coordinates relative to the first image coordinates is obtained.
As an embodiment, the first displacement degree may be a variance of displacement amounts performed by the sub-markers of the first marker when the sub-markers of the first marker in the history frame image are displaced to correspond to (or coincide with) the sub-markers of the first group in the current target image. Alternatively, the first degree of displacement may be an average of the amounts of displacement performed by the sub-markers of the first marker when the sub-markers of the first marker are displaced to correspond to (or coincide with) the sub-markers of the first group.
Step S408: and acquiring a second displacement degree of the current image coordinate relative to the second image coordinate.
Wherein the second image coordinates may be used to identify the location of the sub-markers of the second marker in the historical frame images. Further, a degree of displacement between the sub-markers of the first set and the sub-markers of the second marker may be obtained, i.e. a second degree of displacement of the current image coordinates relative to the second image coordinates is obtained.
As an embodiment, the second displacement degree may be a variance of displacement amounts performed by the sub-markers of the second marker when the sub-markers of the second marker in the history frame image are displaced to correspond to (or coincide with) the sub-markers of the first group of the current target image. Alternatively, the second degree of displacement may be an average of the amounts of displacement performed by the sub-markers of the second marker when the sub-markers of the second marker are displaced to correspond to (or coincide with) the sub-markers of the first group.
Step S409: and determining that the sub-markers classified into the same group belong to the first marker or the second marker according to the first degree of displacement and the second degree of displacement.
Wherein the sub-markers of the first group can be determined to belong to the first marker or the second marker according to the magnitude of the first degree of displacement and the second degree of displacement. If the first degree of displacement is less than the second degree of displacement, determining that the sub-markers of the first group are the sub-markers of the first marker; and if the second displacement degree is smaller than the first displacement degree, determining the sub-markers of the first group as the sub-markers of the second marker.
For example, when the first displacement degree is the variance of the displacement amounts of the sub-markers of the first marker, the variance value reflects how much the individual displacement amounts during the motion deviate from their average. A higher deviation indicates a lower degree of correspondence between the sub-markers of the first marker and the sub-markers of the first group; a lower deviation indicates a higher degree of correspondence between the sub-markers of the first marker and the sub-markers of the first group.
Further, by comparing the first displacement degree and the second displacement degree, the markers with higher degree of correspondence with the sub-markers of the first group can be obtained, and then the sub-markers of the first group are determined to belong to the first marker or the second marker. Since the history frame image is the target image of the previous frame or the previous frames of the target image acquired in step S202, the displacement change between the first marker in the history frame image and the first marker in the target image acquired in step S202 is not large. Furthermore, the sub-markers classified into the same group can be judged to belong to the first marker or the second marker by comparing the first displacement degree and the second displacement degree, the sub-markers in the current target image are identified by referring to the historical frame image, and the speed of judging the first marker and the second marker can be increased.
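Steps S401 to S409 can be sketched as follows, using the variance option described above. This is a minimal Python illustration, not code from the patent; it assumes the sub-markers in the current group and in each historical marker are already matched by identity number so the lists align index by index.

```python
import statistics

def displacement_degree(current, history):
    """Variance of per-sub-marker displacement magnitudes between the
    current image coordinates and a historical frame (one option in the
    text; the mean displacement could be used instead)."""
    mags = [((cx - hx) ** 2 + (cy - hy) ** 2) ** 0.5
            for (cx, cy), (hx, hy) in zip(current, history)]
    return statistics.pvariance(mags)

def assign_group(current, hist_first, hist_second):
    """Steps S407-S409: the group is attributed to whichever marker it
    has moved from most consistently (smaller displacement degree)."""
    d1 = displacement_degree(current, hist_first)   # step S407
    d2 = displacement_degree(current, hist_second)  # step S408
    return "first" if d1 < d2 else "second"         # step S409
```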
As an embodiment, the arrangement of the sub-markers of the first marker and the arrangement of the sub-markers of the second marker are mirror images of each other. Take as an example that the sub-markers of the first marker are arranged on the first interaction device according to a first rule, and the sub-markers of the second marker are arranged on the second interaction device according to a second rule. Based on this, referring to fig. 7, step S205 may further include steps S501 to S509. Steps S501 to S509 may replace steps S401 to S409, or may be used to verify the determination results of steps S401 to S409, so as to improve the accuracy of determining whether the sub-markers grouped together belong to the first marker or the second marker.
Specifically, the sub-markers of the first marker are arranged according to a first rule on the first interaction device, the sub-markers of the second marker are arranged according to a second rule on the second interaction device, and the second rule and the first rule are mirror images of each other. The second rule and the first rule are mirror images of each other, that is, the sub-markers with the same identity information in the sub-markers of the first marker and the sub-markers of the second marker are mirror images of each other. Steps S501 to S509 include:
step S501: and constructing a first space coordinate system according to a first rule and constructing a second space coordinate system according to a second rule.
In embodiments of the present application, the sub-markers of the first marker are arranged according to a first rule. Further, as shown in FIG. 8, the first rule is that sub-markers 1, 3, 5 and 7 of the "A" group form a plane, 4 and 8 are distributed on the left side of the plane in the drawing, and 2 and 6 are distributed on the right side of the plane in the drawing. It should be noted that the first rule can equivalently be stated as numbering 1, 2, 3, 4 and 5, 6, 7, 8 clockwise. "Left", "right" and "clockwise" in the description of fig. 8 all take the directions in fig. 8 as reference; they are defined only for description and are not limiting.
In the embodiment of the present application, the sub-markers of the second marker are arranged according to a second rule, which numbers the sub-markers of the second marker counterclockwise in sequence; the second spatial coordinate system constructed from this rule is a mirror image of the first spatial coordinate system. Further, as shown in FIG. 8, the second rule is that sub-markers 1, 3, 5 and 7 of the "B" group form a plane, 2 and 6 are distributed on the left side of the plane, and 4 and 8 are distributed on the right side of the plane. It should be noted that the second rule can equivalently be stated as numbering 1, 2, 3, 4 and 5, 6, 7, 8 counterclockwise. "Left", "right" and "counterclockwise" all take the directions in fig. 8 as reference; they are defined only for description and are not limiting.
As shown in FIG. 8, the first rule refers to the arrangement rules of 1 to 8 (sub-markers of the first marker) of the group "A", and the second rule refers to the arrangement rules of 1 to 8 (sub-markers of the second marker) of the group "B". 1-8 arranged according to the first rule and 1-8 arranged according to the second rule are mirror images, and the identity information of the corresponding sub-markers is the same.
Further, the sub-markers of the first marker are arranged on the first interaction means according to a first rule, the sub-markers of the second marker are arranged on the second interaction means according to a second rule, and the first rule and the second rule are mirror images of each other, so that the arrangement of the sub-markers of the first marker and the arrangement of the sub-markers of the second marker are mirror images of each other. The first space coordinate system is established according to the first rule, the second space coordinate system is established according to the second rule, the X axis and the Y axis of the space coordinate system can be determined according to the same establishing mechanism, and then the Z axis of the space coordinate system is determined.
The first space coordinate system and the second space coordinate system have the same establishing rule, and the first arrangement rule and the second arrangement rule are mirror images, so that the first space coordinate system and the second space coordinate system are mirror images.
For example, as shown in fig. 9, the first spatial coordinate system and the second spatial coordinate system are established by the following mechanism: the X axis and the Y axis are established by taking the midpoint of the line segment from sub-marker 6 to sub-marker 8 as the origin, with the positive direction of the X axis being the direction from the origin toward sub-marker 8, and the positive direction of the Y axis being the direction from the origin toward sub-marker 7. The Z axis is then determined in the same rectangular-coordinate-system manner for both systems. Further, sub-markers having the same identity information in the second spatial coordinate system and the first spatial coordinate system have the same coordinate values. In other words, sub-markers with the same number have the same coordinate values in the two spatial coordinate systems.
In other embodiments, as shown in fig. 10, the first spatial coordinate system and the second spatial coordinate system are established as follows: the X axis and the Y axis are established by taking the midpoint of the line segment from sub-marker 6 to sub-marker 8 as the origin, with the positive direction of the X axis being the direction from the origin toward sub-marker 8 and the positive direction of the Y axis being the direction from the origin toward sub-marker 7; the Z axes of the first and second spatial coordinate systems are then each determined according to the right-hand coordinate system convention, so that the Z axes of the two spatial coordinate systems point in opposite directions.
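The frame construction described for figs. 9 and 10 can be sketched as follows. This is a hedged Python/NumPy illustration, not code from the patent; the `handedness` flag is an assumption standing in for the mirrored second coordinate system, and sub-marker positions are assumed to be 3D points.

```python
import numpy as np

def build_frame(p6, p7, p8, handedness="right"):
    """Construct a marker-local coordinate system: origin at the midpoint
    of sub-markers 6 and 8, X toward sub-marker 8, Y toward sub-marker 7
    (orthogonalized), Z completing a right-handed or mirrored frame."""
    p6, p7, p8 = map(np.asarray, (p6, p7, p8))
    origin = (p6 + p8) / 2.0
    x = p8 - origin
    x = x / np.linalg.norm(x)
    y = p7 - origin
    y = y - x * np.dot(y, x)   # make Y orthogonal to X
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)         # right-hand rule
    if handedness == "left":   # mirrored frame for the second marker
        z = -z
    return origin, x, y, z
```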
Step S503: and acquiring the arrangement rule of the sub-markers in the target image, which are classified into the same group, in the target image.
In the embodiment of the application, the arrangement rule, within the target image, of the sub-markers grouped into the same group is obtained. Optionally, the arrangement rule may be the distribution of some of the grouped sub-markers relative to a reference plane, where the reference plane is determined according to the positional relationship among the remaining sub-markers; for example, the reference plane may be the plane formed by connecting sub-markers 1, 3, 5 and 7 of the group.
Step S505: and acquiring a first matching degree between the arrangement rule of the sub-markers classified into the same group in the target image and the first rule.
For example, the first matching degree is the degree of match between the positions of 2, 4, 6 and 8 relative to the plane connecting 1, 3, 5 and 7 in the observed arrangement rule, and the corresponding positions in the first rule (the "A" group of sub-markers). Optionally, the first matching degree may be taken as the number of sub-markers among 2, 4, 6 and 8 whose position relative to that plane is the same as in the "A" group.
Step S507: and acquiring a second matching degree between the arrangement rule of the sub-markers classified into the same group in the target image and the second rule.
For example, the second matching degree is the degree of match between the positions of 2, 4, 6 and 8 relative to the plane connecting 1, 3, 5 and 7 in the observed arrangement rule, and the corresponding positions in the second rule (the "B" group of sub-markers). Optionally, the second matching degree may be taken as the number of sub-markers among 2, 4, 6 and 8 whose position relative to that plane is the same as in the "B" group.
Step S509: and determining whether the sub-markers classified into the same group belong to the first marker or the second marker according to the first matching degree and the second matching degree.
According to the first matching degree and the second matching degree, the arrangement rule of the sub-markers classified into the same group is determined to be close to the first rule or close to the second rule, and then the sub-markers classified into the same group are determined to belong to the first marker or the second marker. If the first matching degree is greater than the second matching degree, it may be determined that the arrangement rule of the sub-markers classified into the same group is close to the first rule, and the sub-markers classified into the same group belong to the first marker. If the second matching degree is greater than the first matching degree, it may be determined that the arrangement rule of the sub-markers classified into the same group is close to the second rule, and the sub-markers classified into the same group belong to the second marker.
For example, as shown in fig. 8, suppose the observed arrangement rule of the sub-markers grouped into the same group is that 2 and 6 lie on the right of the plane connected by 1, 3, 5 and 7, while 4 and 8 lie on its left. Comparing this arrangement rule with the first rule (the arrangement rule of the "A" group of sub-markers), the positions of 2, 4, 6 and 8 relative to the plane all agree, so the first matching degree is 4. Comparing it with the second rule (the arrangement rule of the "B" group of sub-markers), none of the positions of 2, 4, 6 and 8 relative to the plane agree, so the second matching degree is recorded as 0. It follows that the first matching degree is greater than the second matching degree, and the sub-markers grouped into the same group belong to the first marker. Through the acquisition and comparison of the first matching degree and the second matching degree, whether the sub-markers grouped into the same group belong to the first marker or the second marker can be accurately determined, improving the identification speed and accuracy of the interaction devices.
As an embodiment, a matching threshold may be set. For example, when the positions of 2, 4, 6 and 8 relative to the plane connected by 1, 3, 5 and 7 are matched against the first rule and the second rule, and at least two of the four points agree with either the first rule or the second rule, the first matching degree and the second matching degree can be used to determine whether the sub-markers grouped into the same group belong to the first marker or the second marker. When fewer than two of the points agree with either rule, the grouping result needs to be corrected, for example, through steps S401 to S409.
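The matching-degree comparison of steps S505 to S509 can be sketched as follows. This is a hedged Python illustration, not code from the patent; encoding the side of the 1-3-5-7 plane as "L"/"R" labels is an assumption made for the example, with the rule tables taken from the fig. 8 description above.

```python
# Side of the 1-3-5-7 plane for sub-markers 2, 4, 6, 8 in each rule
FIRST_RULE = {2: "R", 4: "L", 6: "R", 8: "L"}   # "A" group, fig. 8
SECOND_RULE = {2: "L", 4: "R", 6: "L", 8: "R"}  # mirror image, "B" group

def classify(observed):
    """Steps S505-S509: count agreements with each rule and attribute the
    group to the marker whose rule matches better. `observed` maps
    sub-marker number -> 'L' or 'R' as measured in the target image."""
    m1 = sum(observed[k] == FIRST_RULE[k] for k in FIRST_RULE)    # step S505
    m2 = sum(observed[k] == SECOND_RULE[k] for k in SECOND_RULE)  # step S507
    return ("first", m1, m2) if m1 > m2 else ("second", m1, m2)   # step S509
```

With a matching threshold as described above, a result where fewer than two points agree with either rule would instead trigger the correction via steps S401 to S409.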
Alternatively, in some embodiments, if the sub-markers of the first marker and the second marker are significantly different, for example differing in size or shape, the sub-markers in the target image can be directly identified as belonging to the first marker or the second marker, and step S203 can be omitted.
Step S207: and determining the current pose information of the interaction devices to which the sub-markers classified into the same group belong according to the images of the sub-markers classified into the same group in the target image.
If the sub-markers classified into the same group belong to the first marker, the interaction device to which they belong is the first interaction device, and the pose information of that interaction device is determined according to the images of these sub-markers in the target image. Referring to fig. 11, step S207 includes steps S601 to S605:
s601: pixel coordinates of the sub-markers of the first marker in the image coordinate system are acquired.
The pixel coordinates are coordinates of the sub-markers of the first marker in the image coordinate system corresponding to the target image. Optionally, the processor may acquire the pixel coordinates of all the sub-markers of the first marker.
S603: physical coordinates of the sub-markers of the first marker are obtained.
The physical coordinates are coordinates of the sub-markers of the first marker in a physical coordinate system corresponding to the first interaction device, and are used for representing the actual positions of the sub-markers on the first interaction device. The physical coordinates of the sub-markers may be preset and then recalled based on the identified sub-markers. For example, a plurality of sub-markers are arranged on the marking surface of the first interaction device, and a certain point on the marking surface is selected as the origin to establish a physical coordinate system; the coordinate values of each sub-marker on the coordinate axes of this system are then preset values. For example, an XYZ coordinate system is established with a point of the controller as the origin, and the position of each sub-marker (and of the feature points it contains) relative to the origin can be measured, whereby the physical coordinates (X0, Y0, Z0) of each sub-marker in the XYZ coordinate system can be determined.
S605: and acquiring the current pose information of the first interaction device according to the pixel coordinates and the physical coordinates.
After the pixel coordinates and the physical coordinates of the sub-markers of the first marker in the target image are acquired, the relative position and posture information between the terminal device and the first interaction device is obtained from them. Specifically, the mapping parameters between the image coordinate system and the physical coordinate system are obtained according to the pixel coordinates and physical coordinates of each sub-marker, together with the pre-acquired internal parameters of the terminal device.
For example, by operating the acquired pixel coordinates and physical coordinates of the sub-markers and the pre-acquired internal parameters of the terminal device through a preset algorithm (such as an SVD algorithm), rotation parameters between the image coordinate system and the physical coordinate system of the terminal device and translation parameters between the image coordinate system and the physical coordinate system of the terminal device can be acquired.
It should be noted that the rotation parameter and the translation parameter are used to represent the current pose information between the terminal device and the first interaction apparatus. The rotation parameter represents a rotation state between the image coordinate system and the physical coordinate system, that is, a rotation degree of freedom of each coordinate axis of the first interaction device and the physical coordinate system in the physical coordinate system. The translation parameter represents a moving state between the image coordinate system and the physical coordinate system, that is, a degree of freedom of movement of the first interaction device and each coordinate axis of the physical coordinate system in the physical coordinate system. The rotation parameter and the translation parameter are information of six degrees of freedom of the first interaction device in the physical coordinate system, and can represent the rotation and movement states of the first interaction device in the physical coordinate system, that is, the angle, the distance, and the like between the view field of the terminal device and each coordinate axis in the physical coordinate system can be obtained.
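The patent only names "an SVD algorithm" for recovering the rotation and translation parameters. The following Kabsch-style alignment is one common SVD-based way to recover a rotation and translation between two point sets; it is a simplified stand-in shown with known 3D correspondences, whereas a real pipeline would solve a PnP problem from 2D pixel coordinates and the terminal device's internal parameters.

```python
import numpy as np

def rigid_transform_svd(phys, cam):
    """Kabsch/SVD alignment: recover rotation R and translation t such
    that cam ~= R @ phys + t, given matched point rows in the physical
    coordinate system (phys) and the camera/view space (cam)."""
    phys, cam = np.asarray(phys, float), np.asarray(cam, float)
    cp, cc = phys.mean(axis=0), cam.mean(axis=0)
    H = (phys - cp).T @ (cam - cc)      # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # correct an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cc - R @ cp
    return R, t
```

The recovered R and t play the role of the rotation and translation parameters described above: together they give the six-degree-of-freedom pose of the interaction device relative to the viewer.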
Step S208: and determining that the interaction devices belonging to the sub-markers classified into the same group correspond to the left hand or the right hand according to the current pose information.
In the embodiment of the present application, taking the interaction device to which the sub markers classified into the same group belong as the first interaction device as an example, the current pose information of the first interaction device may be determined, and then the left hand or the right hand corresponding to the first interaction device may be determined according to the pose information of the first interaction device in the reference image and the reference corresponding state.
The pose information and the reference corresponding state of the first interaction device in the reference image may be the pose information and the calibrated reference corresponding state of the first interaction device obtained in step S201. The left hand or the right hand corresponding to the first interaction device can be determined through the pose information of the first interaction device in the reference image and the current pose information of the interaction device. Accordingly, the interactive device to which the sub-markers classified as the same group belong is the second interactive device, and the manner of determining whether the second interactive device corresponds to the left hand or the right hand is also similar to the above method, which is not described herein again.
In one embodiment, after acquiring the current pose information of the interaction devices grouped into the same group, acquiring first historical pose information of a first interaction device and second historical pose information of a second interaction device according to a historical frame image containing a first marker and a second marker; respectively determining pose changes among the current pose information, the first historical pose information and the second historical pose information; and determining the interaction devices belonging to the sub-markers classified into the same group as the first interaction device or the second interaction device according to the pose change. At this time, step S205 may be omitted.
The pose change is used for reflecting the change of the first historical pose information and the current pose information and the change of the second historical pose information and the current pose information. By comparing the change of the first historical pose information and the current pose information with the change of the second historical pose information and the current pose information, the pose change of the interaction device to which the sub markers classified into the same group belong can be determined to be in accordance with the pose change of the first interaction device or the pose change of the second interaction device, and therefore the interaction device to which the sub markers classified into the same group belong is determined to be the first interaction device or the second interaction device.
Further, it is determined whether the interactive device to which the sub-markers classified into the same group belong corresponds to the left hand or corresponds to the right hand according to the reference correspondence state. In other embodiments, the approximate moving route of the interaction device to which the sub markers classified into the same group belong may also be obtained by comparing the change between the pose information of the interaction device to which the sub markers classified into the same group belong in the historical frame image and the current pose information of the interaction device to which the sub markers classified into the same group belong, so as to determine that the interaction device to which the sub markers classified into the same group belong corresponds to the left hand or the right hand.
Step S209: determining and executing a corresponding control instruction according to the correspondence state of the interaction device to which the sub-markers classified into the same group belong.
The identification method of the interaction device may further determine a control instruction according to the determined pose information of the interaction device and the hand to which the interaction device corresponds, thereby realizing different controls of the interaction device. The pose information of the interaction device may be the pose information obtained in step S207.
For example, if the first interaction device is determined to correspond to the left hand, a first control instruction is determined according to the pose information of the first interaction device; if the first interaction device is determined to correspond to the right hand, a second control instruction is determined according to the same pose information. The first control instruction differs from the second control instruction; that is, interaction devices in the same pose yield different control instructions depending on the hand they correspond to. The terminal device can then execute the first control instruction or the second control instruction according to whether the currently corresponding hand is the left hand or the right hand, enabling diversified control of the virtual content.
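A simple way to realize this hand-dependent mapping is a lookup table keyed by (hand, pose). The gesture and instruction names below are purely illustrative; the description only requires that the left-hand and right-hand instructions differ for the same pose:

```python
def select_instruction(hand, pose_gesture):
    """Map the same pose/gesture to a different control instruction
    depending on which hand the interaction device corresponds to.

    Gesture and instruction names are hypothetical placeholders.
    """
    table = {
        ("left", "tilt"): "rotate_virtual_content",
        ("right", "tilt"): "scale_virtual_content",
    }
    return table.get((hand, pose_gesture))
```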
Referring to fig. 12, a block diagram of an identification apparatus 500 of an interaction device according to an embodiment of the present application is shown, applied to a terminal device. The identification apparatus 500 includes an image acquiring module 510, a sub-marker grouping module 530, a pose information acquiring module 550, and a correspondence determining module 570.
The image acquiring module 510 is configured to acquire a target image containing sub-markers. The sub-marker grouping module 530 is configured to identify the sub-markers in the target image, group the sub-markers in the target image, and obtain a grouping result. The pose information acquiring module 550 is configured to determine, according to the images of the sub-markers classified into the same group in the target image, current pose information of the interaction device to which the sub-markers classified into the same group belong. The correspondence determining module 570 is configured to determine, according to the current pose information, whether the interaction device to which the sub-markers classified into the same group belong corresponds to a left hand or a right hand.
In some embodiments, the sub-marker grouping module 530 includes a reference determining unit, a distance acquiring unit, and a grouping unit. The reference determining unit is configured to determine a reference sub-marker, which may be any one of the sub-markers in the target image. The distance acquiring unit is configured to acquire relative distance information between the reference sub-marker and the other sub-markers in the target image. The grouping unit is configured to group the sub-markers whose relative distance information is greater than a distance threshold into one group, and the sub-markers whose relative distance information is less than or equal to the distance threshold into another group.
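This distance-based grouping can be sketched as follows, assuming each sub-marker is represented by its 2-D image coordinates and the first detection serves as the reference sub-marker:

```python
import math

def group_by_distance(points, threshold):
    """Split sub-marker image points into two groups relative to a
    reference sub-marker (here simply the first detected point).

    Points farther than `threshold` from the reference form one group;
    the rest, including the reference itself, form the other.
    """
    reference = points[0]
    near, far = [], []
    for p in points:
        (near if math.dist(p, reference) <= threshold else far).append(p)
    return near, far
```

With a suitable threshold, the two groups correspond to the sub-markers of the two markers, since sub-markers on the same interaction device lie close together in the image.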
In other embodiments, the sub-markers contained in the first marker are distinguishable from each other, and the sub-markers contained in the second marker are distinguishable from each other, wherein the sub-markers of the first marker are identical to the sub-markers of the second marker, but the sub-markers of the first marker are arranged differently from the sub-markers of the second marker. The sub-marker grouping module 530 includes an identity information recognizing unit and a grouping unit. The identity information recognizing unit is configured to recognize the sub-markers in the target image and obtain the identity information of each sub-marker in the target image. The grouping unit is configured to classify sub-markers having the same identity information into different groups respectively.
Furthermore, the arrangement of the sub-markers of the first marker and the arrangement of the sub-markers of the second marker may be mirror images of each other. A first spatial coordinate system is constructed according to a first rule, and a second spatial coordinate system is constructed according to a second rule, the second spatial coordinate system being mirror-symmetric to the first spatial coordinate system, wherein the first rule is the arrangement rule of the sub-markers of the first marker and the second rule is the arrangement rule of the sub-markers of the second marker. The sub-markers of the first marker have non-repeating identity information, the sub-markers of the second marker have non-repeating identity information, and sub-markers having the same coordinate values in the second spatial coordinate system and in the first spatial coordinate system have the same identity information.
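Under these constraints, the second coordinate system can be derived directly from the first. The sketch below assumes mirroring about the y-axis and represents each coordinate system as a mapping from sub-marker identity to planar coordinates (both assumptions are illustrative, not fixed by the description):

```python
def mirror_coordinates(first_system):
    """Build the second spatial coordinate system as the mirror image of
    the first, so sub-markers with the same identity information occupy
    mirrored coordinates.

    `first_system` maps identity -> (x, y); mirroring about the y-axis
    negates x.
    """
    return {identity: (-x, y) for identity, (x, y) in first_system.items()}
```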
In some embodiments, the identification apparatus 500 includes a marker determining module configured to determine whether the sub-markers classified into the same group belong to the first marker or the second marker. The marker determining module may include a preset rule acquiring unit, a current rule acquiring unit, a matching degree acquiring unit, and a determining unit. The preset rule acquiring unit is configured to obtain the first rule and the second rule. The current rule acquiring unit is configured to obtain the arrangement rule, in the target image, of the sub-markers classified into the same group. The matching degree acquiring unit is configured to obtain a first matching degree between that arrangement rule and the first rule and a second matching degree between that arrangement rule and the second rule. The determining unit is configured to determine, based on the first matching degree and the second matching degree, whether the sub-markers classified into the same group belong to the first marker or the second marker.
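The matching-degree comparison might look like the following, using a hypothetical positional metric over identity sequences (the description does not fix a particular matching measure):

```python
def matching_degree(observed, rule):
    """Fraction of positions at which the observed sub-marker identity
    sequence agrees with an arrangement rule."""
    if not rule:
        return 0.0
    hits = sum(1 for o, r in zip(observed, rule) if o == r)
    return hits / len(rule)

def classify_group(observed, first_rule, second_rule):
    """Attribute a group of sub-markers to the first or second marker
    according to the higher matching degree."""
    return ("first" if matching_degree(observed, first_rule)
            >= matching_degree(observed, second_rule) else "second")
```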
In other embodiments, the marker determining module may include a historical frame unit, a historical frame image coordinate acquiring unit, a current image coordinate acquiring unit, a first displacement degree unit, a second displacement degree unit, and a determining unit. The historical frame unit is configured to acquire a historical frame image containing the first marker and the second marker. The historical frame image coordinate acquiring unit is configured to identify, in the historical frame image, first image coordinates of the sub-markers of the first marker and second image coordinates of the sub-markers of the second marker. The current image coordinate acquiring unit is configured to acquire current image coordinates, in the target image, of the sub-markers classified into the same group. The first displacement degree unit is configured to acquire a first degree of displacement of the current image coordinates relative to the first image coordinates. The second displacement degree unit is configured to acquire a second degree of displacement of the current image coordinates relative to the second image coordinates. The determining unit is configured to determine, based on the first degree of displacement and the second degree of displacement, whether the sub-markers classified into the same group belong to the first marker or the second marker.
The pose information acquiring module 550 includes a pixel coordinate acquiring unit, a physical coordinate acquiring unit, and a current pose acquiring unit. If the interaction device to which the sub-markers classified into the same group belong is the first interaction device, the pixel coordinate acquiring unit is configured to acquire the pixel coordinates of the sub-markers of the first marker in the image coordinate system. The physical coordinate acquiring unit is configured to acquire the physical coordinates of the sub-markers of the first marker, the physical coordinates being the coordinates of those sub-markers in the first spatial coordinate system. The current pose acquiring unit is configured to acquire the current pose information of the first interaction device according to the pixel coordinates and the physical coordinates. It should be noted that the pixel coordinate acquiring unit, the physical coordinate acquiring unit, and the current pose acquiring unit may also be used to acquire the current pose of the second interaction device; the process is similar to that of acquiring the current pose of the first interaction device and is not repeated here.
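Recovering a 6-DoF pose from such pixel/physical correspondences is typically done with a PnP solver over all sub-markers. As a minimal illustration of the underlying pinhole relation only (assumed intrinsics, a single point, and a known depth rather than a full solve):

```python
def back_project(pixel, intrinsics, depth):
    """Map a sub-marker's pixel coordinate back to a 3-D point in the
    camera frame, given pinhole intrinsics and a known depth.

    `intrinsics` is (fx, fy, cx, cy). A complete implementation would
    instead solve PnP over all sub-marker pixel/physical pairs to
    obtain the device's rotation and translation.
    """
    u, v = pixel
    fx, fy, cx, cy = intrinsics
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)
```

A pixel at the principal point maps to a point straight ahead of the camera at the given depth.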
In some embodiments, the identification apparatus 500 further includes a calibration module (not shown in the figures) configured to calibrate the reference correspondence states when the first interaction device and the second interaction device start to be used. Specifically, the calibration module includes a first relative position unit, a second relative position unit, and a calibration unit. The first relative position unit is configured to acquire, when the first interaction device and the second interaction device start to be used, a reference image containing the first marker and the second marker, and to determine a first relative position of the first marker and the second marker in the reference image. The second relative position unit is configured to determine a second relative position between the first interaction device and the second interaction device according to the first relative position. The calibration unit is configured to calibrate, according to the second relative position, the reference correspondence states of the first interaction device and the second interaction device, the reference correspondence states representing whether the first interaction device and the second interaction device respectively correspond to the left hand or the right hand.
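One way to picture this calibration, assuming each marker is summarized by its image x-coordinate and that the marker appearing further left in the reference image belongs to the left hand (a real system would additionally account for camera mirroring):

```python
def calibrate_reference_state(first_marker_x, second_marker_x):
    """Derive the reference correspondence states from the markers'
    relative horizontal position in the reference image.

    Assumption: the smaller image x-coordinate means the device is held
    in the left hand; no mirroring correction is applied here.
    """
    if first_marker_x < second_marker_x:
        return {"first": "left", "second": "right"}
    return {"first": "right", "second": "left"}
```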
Further, the correspondence determining module 570 includes a reference position determining unit and a correspondence determining unit. The reference position determining unit is configured to determine, according to the reference image, the respective reference position information of the first interaction device and the second interaction device when they start to be used. The correspondence determining unit is configured to determine whether the first interaction device corresponds to the left hand or the right hand according to the reference position information, the current pose information, and the reference correspondence state.
As an embodiment, the identification apparatus 500 further includes an instruction determining module configured to determine and execute a control instruction of the interaction device. The instruction determining module includes a first instruction acquiring unit, a second instruction acquiring unit, and an execution instruction determining unit. The first instruction acquiring unit is configured to determine a first control instruction according to the pose information of the first interaction device if the first interaction device is determined to correspond to the left hand. The second instruction acquiring unit is configured to determine a second control instruction according to the pose information of the first interaction device if the first interaction device is determined to correspond to the right hand. The execution instruction determining unit is configured to execute the first control instruction or the second control instruction.
It should be noted that where the units of the above modules are described with respect to the first interaction device, they apply equally to the second interaction device; the related processing for the second interaction device is the same as that for the first interaction device and is not repeated here.
In summary, according to the identification method and identification apparatus for an interaction device provided in the embodiments of the present application, the sub-markers in the target image are identified, the first marker and the second marker are distinguished, the pose information of the interaction device corresponding to each marker is obtained, and the left hand or right hand corresponding to the interaction device is finally determined according to the pose information, so that the interaction device can be accurately identified and tracked, thereby improving the accuracy of interaction.
Referring to fig. 13, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable storage medium 600 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium 600 includes a non-volatile computer-readable storage medium. The computer-readable storage medium 600 has storage space for program code 610 for performing any of the method steps of the methods described above. The program code may be read from or written into one or more computer program products. The program code 610 may, for example, be compressed in a suitable form.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. An identification method for an interaction device, applied to a terminal device, wherein the interaction device comprises a first interaction device and a second interaction device, the first interaction device is provided with a first marker, the second interaction device is provided with a second marker, the first marker is different from the second marker, and each of the first marker and the second marker comprises a plurality of sub-markers; the method comprising: acquiring a target image containing sub-markers; identifying the sub-markers in the target image, grouping the sub-markers in the target image, and obtaining a grouping result; based on the grouping result, determining, according to image coordinates of the sub-markers classified into the same group in the target image, current pose information of the interaction device to which the sub-markers classified into the same group belong; and determining, according to the current pose information, whether the interaction device to which the sub-markers classified into the same group belong corresponds to a left hand or a right hand. 2. The method of claim 1, wherein the identifying the sub-markers in the image, grouping the sub-markers in the target image and obtaining a grouping result comprises: determining a reference sub-marker, the reference sub-marker being any one of the sub-markers in the target image; obtaining relative distance information between the reference sub-marker and the other sub-markers in the target image; and grouping the sub-markers whose relative distance information is greater than a distance threshold into one group, and the sub-markers whose relative distance information is less than or equal to the distance threshold into another group. 3. The method of claim 1, wherein the sub-markers contained in the first marker are distinguishable from each other, the sub-markers contained in the second marker are distinguishable from each other, the sub-markers of the first marker are the same as the sub-markers of the second marker, and the arrangement of the sub-markers of the first marker is different from that of the sub-markers of the second marker; the identifying the sub-markers in the target image, grouping the sub-markers in the target image and obtaining a grouping result comprises: identifying the sub-markers in the target image and acquiring identity information of each sub-marker in the target image; and classifying sub-markers having the same identity information into different groups respectively. 4. The method of claim 2 or 3, wherein after the obtaining the grouping result, the method further comprises: determining, according to the grouping result, that the sub-markers classified into the same group belong to the first marker or the second marker, comprising: obtaining a first rule and a second rule, wherein the first rule is the arrangement rule of the sub-markers of the first marker, and the second rule is the arrangement rule of the sub-markers of the second marker; obtaining the arrangement rule, in the target image, of the sub-markers classified into the same group; obtaining a first matching degree between the arrangement rule of the sub-markers classified into the same group in the target image and the first rule, and a second matching degree between that arrangement rule and the second rule; and determining, according to the first matching degree and the second matching degree, that the sub-markers classified into the same group belong to the first marker or the second marker. 5. The method of claim 2 or 3, wherein the determining, according to the image coordinates of the sub-markers classified into the same group in the target image, the current pose information of the interaction device to which the sub-markers classified into the same group belong comprises: obtaining pixel coordinates, in the image coordinate system, of the sub-markers classified into the same group; obtaining physical coordinates of the same group of sub-markers, wherein the physical coordinates represent the real physical positions of the sub-markers on the interaction device to which they belong; and obtaining, according to the pixel coordinates and the physical coordinates, the current pose information of the interaction device to which the sub-markers classified into the same group belong. 6. The method of claim 1, wherein before the acquiring the target image containing the sub-markers, the method further comprises: when the first interaction device and the second interaction device start to be used, acquiring a reference image containing the first marker and the second marker, and determining a first relative position of the first marker and the second marker in the reference image; determining a second relative position between the first interaction device and the second interaction device according to the first relative position; and calibrating, according to the second relative position, reference correspondence states of the first interaction device and the second interaction device, wherein the reference correspondence states represent that the first interaction device and the second interaction device respectively correspond to a left hand or a right hand. 7. The method of claim 6, wherein the determining, according to the current pose information, that the interaction device to which the sub-markers classified into the same group belong corresponds to a left hand or a right hand comprises: acquiring first historical pose information of the first interaction device and second historical pose information of the second interaction device according to a historical frame image containing the first marker and the second marker; respectively determining pose changes between the current pose information and the first historical pose information and the second historical pose information; and determining, according to the pose changes, that the interaction device to which the sub-markers classified into the same group belong is the first interaction device or the second interaction device, and determining, according to the reference correspondence states, that the interaction device to which the sub-markers classified into the same group belong corresponds to the left hand or the right hand. 8. The method of claim 1, wherein after the obtaining the grouping result, the method further comprises: determining, according to the grouping result, that the sub-markers classified into the same group belong to the first marker or the second marker, comprising: acquiring a historical frame image containing the first marker and the second marker; identifying, in the historical frame image, first image coordinates of the sub-markers of the first marker and second image coordinates of the sub-markers of the second marker; obtaining current image coordinates, in the target image, of the sub-markers classified into the same group; obtaining a first degree of displacement of the current image coordinates relative to the first image coordinates; obtaining a second degree of displacement of the current image coordinates relative to the second image coordinates; and determining, according to the first degree of displacement and the second degree of displacement, that the sub-markers classified into the same group belong to the first marker or the second marker. 9. A terminal device, comprising: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more application programs being configured to perform the method of any one of claims 1 to 8. 10. A computer-readable storage medium, wherein the computer-readable storage medium stores program code, and the program code can be invoked by a processor to execute the method of any one of claims 1 to 8.
CN201911342720.5A 2019-12-23 2019-12-23 Identification method of interactive device, terminal equipment and readable storage medium Active CN111176445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911342720.5A CN111176445B (en) 2019-12-23 2019-12-23 Identification method of interactive device, terminal equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911342720.5A CN111176445B (en) 2019-12-23 2019-12-23 Identification method of interactive device, terminal equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111176445A true CN111176445A (en) 2020-05-19
CN111176445B CN111176445B (en) 2023-07-14

Family

ID=70655640

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911342720.5A Active CN111176445B (en) 2019-12-23 2019-12-23 Identification method of interactive device, terminal equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111176445B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112558759A (en) * 2020-11-30 2021-03-26 苏州端云创新科技有限公司 Education-based VR (virtual reality) interaction method, interaction development platform and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113138666B (en) * 2021-04-15 2023-04-25 潍坊歌尔电子有限公司 Handle calibration method, handle, head-mounted display and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108765498A (en) * 2018-05-30 2018-11-06 百度在线网络技术(北京)有限公司 Monocular vision tracking, device and storage medium
CN208722146U (en) * 2018-09-30 2019-04-09 广东虚拟现实科技有限公司 Wearables for Assisted Location Tracking
CN110120062A (en) * 2018-02-06 2019-08-13 广东虚拟现实科技有限公司 Image processing method and device
CN110443853A (en) * 2019-07-19 2019-11-12 广东虚拟现实科技有限公司 Scaling method, device, terminal device and storage medium based on binocular camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110120062A (en) * 2018-02-06 2019-08-13 广东虚拟现实科技有限公司 Image processing method and device
CN108765498A (en) * 2018-05-30 2018-11-06 百度在线网络技术(北京)有限公司 Monocular vision tracking, device and storage medium
US20190371003A1 (en) * 2018-05-30 2019-12-05 Baidu Online Network Technology (Beijing) Co., Ltd . Monocular vision tracking method, apparatus and non-volatile computer-readable storage medium
CN208722146U (en) * 2018-09-30 2019-04-09 广东虚拟现实科技有限公司 Wearables for Assisted Location Tracking
CN110443853A (en) * 2019-07-19 2019-11-12 广东虚拟现实科技有限公司 Scaling method, device, terminal device and storage medium based on binocular camera


Also Published As

Publication number Publication date
CN111176445B (en) 2023-07-14

Similar Documents

Publication Publication Date Title
CN110443853B (en) Calibration method and device based on binocular camera, terminal equipment and storage medium
CN107223269B (en) Three-dimensional scene positioning method and device
US11380063B2 (en) Three-dimensional distortion display method, terminal device, and storage medium
US11244511B2 (en) Augmented reality method, system and terminal device of displaying and controlling virtual content via interaction device
CN111158469A (en) Viewing angle switching method, device, terminal device and storage medium
US20190362559A1 (en) Augmented reality method for displaying virtual object and terminal device therefor
CN110737414B (en) Interactive display method, device, terminal equipment and storage medium
US11127156B2 (en) Method of device tracking, terminal device, and storage medium
CN111427452A (en) Controller tracking method and VR system
CN114549285A (en) Controller positioning method and device, head-mounted display equipment and storage medium
CN111813214A (en) Method, device, terminal device and storage medium for processing virtual content
CN111176445A (en) Identification method, terminal device and readable storage medium of interactive device
CN110874868A (en) Data processing method, device, terminal device and storage medium
CN106980378A (en) Virtual display methods and system
CN103033145A (en) Method and system for identifying shapes of plurality of objects
US20200184222A1 (en) Augmented reality tools for lighting design
TW202402364A (en) Tracking apparatus, method, and non-transitory computer readable storage medium thereof
WO2020114395A1 (en) Virtual picture control method, terminal device and storage medium
CN110908508B (en) Control method of virtual picture, terminal device and storage medium
CN110473257A (en) Information scaling method, device, terminal device and storage medium
CN111913564B (en) Virtual content manipulation method, device, system, terminal equipment and storage medium
CN117994839B (en) Gesture recognition method, device, equipment, medium and program
US20240035648A1 (en) Method for creating xyz focus paths with a user device
CN113643358B (en) External parameter calibration method, device, storage medium and system of camera
CN110598605B (en) Positioning method, positioning device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Identification method, terminal device, and readable storage medium for interactive devices

Granted publication date: 20230714

Pledgee: Guangdong Provincial Bank of Communications Co.,Ltd.

Pledgor: GUANGDONG VIRTUAL REALITY TECHNOLOGY Co.,Ltd.

Registration number: Y2024980041215

PE01 Entry into force of the registration of the contract for pledge of patent right
PC01 Cancellation of the registration of the contract for pledge of patent right

Granted publication date: 20230714

Pledgee: Guangdong Provincial Bank of Communications Co.,Ltd.

Pledgor: GUANGDONG VIRTUAL REALITY TECHNOLOGY Co.,Ltd.

Registration number: Y2024980041215

PC01 Cancellation of the registration of the contract for pledge of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Identification method, terminal device, and readable storage medium for interactive devices

Granted publication date: 20230714

Pledgee: Guangdong Provincial Bank of Communications Co.,Ltd.

Pledgor: GUANGDONG VIRTUAL REALITY TECHNOLOGY Co.,Ltd.

Registration number: Y2025980042692

PE01 Entry into force of the registration of the contract for pledge of patent right