CN107943293B - Information interaction method and information processing device - Google Patents
- Publication number
- CN107943293B CN107943293B CN201711193606.1A CN201711193606A CN107943293B CN 107943293 B CN107943293 B CN 107943293B CN 201711193606 A CN201711193606 A CN 201711193606A CN 107943293 B CN107943293 B CN 107943293B
- Authority
- CN
- China
- Prior art keywords
- electronic device
- identifications
- obtaining
- electronic
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Abstract
The invention provides an information interaction method, comprising: obtaining a first operation of a user of a first electronic device; identifying operation information of the first operation; and feeding back the operation information to the first electronic device. In this method, the first operation can be obtained by an electronic device other than the first electronic device, or by a processor arranged on the server side, which identifies the operation information of the first operation and feeds it back to the first electronic device. The first electronic device therefore does not need its own function for identifying the user's first operation, and it can be designed to be lighter and more convenient to use.
Description
Technical Field
The invention relates to an information interaction method and an information processing device.
Background
With the development of science and technology, AR (Augmented Reality) devices appear more and more in our daily life and work. However, to support SLAM (simultaneous localization and mapping), gesture recognition, object recognition, and the like, a conventional AR device must carry powerful sensors and a powerful processor; its power consumption is relatively high, and it is difficult to make it lightweight. As a result, in some application scenarios the AR device is not convenient to use. If the AR device is made lighter to suit those scenarios, many functions (such as gesture recognition) must inevitably be abandoned, so the user cannot experience the full effect of the AR device, and the user experience is degraded.
Disclosure of Invention
In view of the above problems in the prior art, the present invention provides an information interaction method and an information processing apparatus that keep the device light and easy to use while still supporting recognition of user operations, without degrading the user experience.
In order to solve the above problem, the present invention provides an information interaction method, including:
obtaining a first operation of a user of a first electronic device;
identifying operation information of the first operation;
and feeding back the operation information to the first electronic device.
Preferably, the obtaining a first operation of the user of the first electronic device includes:
obtaining the relative position of the first electronic device and a second electronic device;
obtaining the first operation based on the relative position.
Preferably, the obtaining the relative position of the first electronic device and the second electronic device includes:
acquiring a designated position set by the first electronic device;
obtaining the relative position based on the specified position.
Preferably, the obtaining the relative position of the first electronic device and the second electronic device includes:
acquiring identification information of the first electronic device;
obtaining the relative position based on the identification information.
Preferably, the obtaining a first operation of the user of the first electronic device includes:
obtaining a first identifier of the first electronic device;
obtaining a second identifier of the user, wherein the first identifier corresponds to the second identifier in a one-to-one manner;
obtaining the first operation based on the first identifier and the second identifier.
The invention also provides an information interaction method, which comprises the following steps:
providing a first operation of a user of a first electronic device;
receiving an operation instruction of the first operation, wherein the operation instruction is operation information identified by the second electronic device based on the first operation;
and executing the first operation based on the operation instruction.
The present invention also provides an information processing apparatus comprising:
an acquisition unit configured to acquire a first operation of a user of the first electronic device;
a processor configured to identify operation information of the first operation;
a feedback unit for feeding back the operation information to the first electronic device.
Preferably, the acquisition unit is further configured to acquire a relative position of the first electronic device and the second electronic device;
the processor is further configured to obtain the first operation based on the relative position.
Preferably, the acquisition unit is further configured to obtain a first identifier of the first electronic device;
obtaining a second identifier of the user, wherein the first identifier corresponds to the second identifier in a one-to-one manner;
the processor is further configured to obtain the first operation based on the first identification and the second identification.
The present invention also provides an information processing apparatus comprising:
the electronic device comprises a receiving unit, a processing unit and a processing unit, wherein the receiving unit is used for receiving an operation instruction of a first operation, and the operation instruction is operation information which is identified by a second electronic device based on the provided first operation of a user of the first electronic device;
an execution unit to execute the first operation.
Compared with the prior art, the invention has the following beneficial effects. The first operation of the user of the first electronic device can be obtained by an electronic device other than the first electronic device, or by a processor arranged on the server side, which identifies the operation information of the first operation and feeds it back to the first electronic device. The first electronic device therefore does not need its own function for identifying the user's first operation, so it can be designed to be lighter and more convenient to use. At the same time, because the first operation is identified by the other electronic device and the identified operation information is fed back to the first electronic device, the user can still experience the full effect of the electronic device, and the user experience is preserved.
Drawings
Fig. 1 is a flowchart of an information interaction method according to a first embodiment of the present invention;
FIG. 2 is a flowchart of an information interaction method according to a second embodiment of the present invention;
FIG. 3 is a block diagram of an information processing apparatus according to a third embodiment of the present invention;
fig. 4 is a block diagram of an information processing apparatus according to a fourth embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
Various aspects and features of the present invention are described herein with reference to the drawings.
These and other characteristics of the invention will become apparent from the following description of a preferred form of embodiment, given as a non-limiting example, with reference to the accompanying drawings.
It should also be understood that, although the invention has been described with reference to some specific examples, a person of skill in the art shall certainly be able to achieve many other equivalent forms of the invention, having the characteristics as set forth in the claims and hence all coming within the field of protection defined thereby.
The above and other aspects, features and advantages of the present invention will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present invention are described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which can be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail to avoid obscuring the invention in unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure.
The specification may use the phrases "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the invention.
The invention provides an information interaction method, which comprises the following steps:
obtaining a first operation of a user of a first electronic device;
identifying operation information of the first operation;
and feeding back the operation information to the first electronic device.
It can be seen from the above that, in this information interaction method, the first operation of the user of the first electronic device can be obtained by an electronic device other than the first electronic device, or by a processor arranged on the server side, which identifies the operation information of the first operation and feeds it back to the first electronic device. The first electronic device therefore does not need its own function for identifying the user's first operation, and it can be designed to be lighter and more convenient to use.
To better understand the technical solution, a specific flow of the information interaction method is described below with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1, fig. 1 is a flowchart of an information interaction method in a first embodiment of the present invention, and the information interaction method provided by the present invention includes the following steps:
step 101: obtaining a first operation of a user of a first electronic device;
the first electronic devices may be, but are not limited to, AR glasses, AR helmets, or AR masks, which can provide a virtual scene, and the number of the first electronic devices may be one or more, and each first electronic device is used by one user. The first operation is made by a user corresponding to the first electronic device, and is a corresponding operation performed by the user for a scene seen by the first electronic device used by the user. In addition, the device for acquiring the first operation may be another device different from the first electronic device or a processor provided on the server side.
The following description will take the first electronic device as AR glasses, the user as a child, and the first operation as a selection operation as an example.
At this time, step 101 obtains a selection operation of the child wearing the AR glasses. Of course, there may be multiple pairs of AR glasses and multiple children, and the first operations may differ: for example, a selection operation is obtained from a first child wearing first AR glasses, a deletion operation from a second child wearing second AR glasses, and a movement operation from a third child wearing third AR glasses, all at the same time. The first operation of the child wearing the AR glasses may be obtained by another AR device different from the above AR glasses, or by a processor on the server side.
In a preferred embodiment, in step 101, the obtaining a first operation of the user of the first electronic device may include:
obtaining the relative position of the first electronic equipment and the second electronic equipment;
the second electronic device may have an identification function of identifying a first operation of a user of the first electronic device, and the obtaining of the relative position between the first electronic device and the second electronic device may be obtained by the first electronic device, may also be obtained by the second electronic device, or may of course be obtained by another device, and then transmits the position information to the first electronic device and/or the second electronic device.
Obtaining the relative position of the first electronic device and the second electronic device may include:
acquiring a designated position set by the first electronic device. The designated position of the first electronic device may be acquired by the second electronic device, or acquired by another device that then transfers the position information to the second electronic device.
Obtaining the relative position based on the designated position. After the designated position of the first electronic device is obtained, the relative position may be computed by the second electronic device or by a processor on the server side.
The following description will be made by taking the first electronic device as AR glasses, the user as a child, and the second device as an AR device with gesture recognition function.
After the children put on the AR glasses, they sit in rows according to the seats arranged in the classroom, so each pair of AR glasses is effectively fixed in its area. The AR device with the gesture recognition function is placed at a position in the classroom from which it can observe the operation actions of every child at the same time, and it then acquires the relative position between itself and each pair of AR glasses (or each child). Of course, the position of each pair of AR glasses may instead be obtained by a camera set up in the classroom and sent to the processor on the server side, which computes the relative position; in that case the second device is equivalent to the camera, i.e., the relative position between the camera and each pair of AR glasses is obtained.
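The designated-position scheme above can be sketched as follows. This is a hypothetical illustration only: the device names, coordinates, and classroom layout are all assumptions for the example, not taken from the patent.

```python
# Hypothetical sketch: each pair of AR glasses sits at a fixed, pre-set seat
# position (the "designated position"), and the recognizing AR device is at a
# known spot in the classroom. The relative position is then a simple offset.

AR_DEVICE_POSITION = (0.0, 0.0, 2.0)  # assumed mounting point (x, y, z), meters

# Assumed registry of designated seat positions for each pair of AR glasses.
DESIGNATED_POSITIONS = {
    "glasses_A": (1.0, 2.0, 1.0),
    "glasses_B": (2.0, 2.0, 1.0),
}

def relative_position(glasses_id: str) -> tuple:
    """Offset from the recognizing AR device to the given AR glasses."""
    gx, gy, gz = DESIGNATED_POSITIONS[glasses_id]
    dx, dy, dz = AR_DEVICE_POSITION
    return (gx - dx, gy - dy, gz - dz)

print(relative_position("glasses_A"))  # → (1.0, 2.0, -1.0)
```

With fixed seating, the registry only has to be built once; the recognizing device can then resolve any pair of glasses to a viewing direction without re-measuring.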
Of course, obtaining the relative position of the first electronic device and the second electronic device may also include:
acquiring identification information of the first electronic device; that is, each first electronic device is correspondingly provided with identification information, which is information capable of identifying that first electronic device.
Obtaining the relative position based on the identification information. After the identification information of the first electronic device is obtained, the position of the first electronic device can be obtained, and then the relative position of the first electronic device and the second electronic device can be obtained.
The following description will be made by taking the first electronic device as AR glasses, the user as a child, and the second device as an AR device with gesture recognition function.
After the children put on the AR glasses, the AR device can acquire the identification information of each pair of AR glasses, determine the position of each pair on that basis, and then obtain the relative position between the AR device and each pair of AR glasses.
After the relative position is obtained, the next step is continued:
obtaining the first operation based on the relative position. After the relative positions of the second electronic device and the first electronic device are obtained, it can be determined from which angle the second electronic device has completed the acquisition of the first operation of the user of the first electronic device.
The following description will be made by taking the first electronic device as AR glasses, the user as a child, and the second device as an AR device with gesture recognition function.
On the basis that the children wear the AR glasses and the relative positions have been obtained, the first operation performed by each child on the scene shown by his or her AR glasses is obtained; the first operation may be a deletion operation, a selection operation, a movement operation, and so on.
Of course, when the second electronic device is a camera, the first operation can be obtained by switching a depth camera between its near-range and far-range functions, enabling remote gesture recognition; a gesture made at a greater distance can be recognized with an RGB camera. After the first operation performed by each child on the scene displayed by his or her AR glasses is captured, the processor on the server side obtains the first operation. The camera and the server-side processor communicate through a wireless local area network (WLAN), a mobile/cellular network (third-generation (3G), fourth-generation (4G), fifth-generation (5G), or later evolutions), or short-range communication such as a Bluetooth or ZigBee protocol network.
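The near/far switching described above can be sketched as a simple selection rule. The range threshold and the capture call are assumptions for illustration; the patent does not fix either.

```python
# Hypothetical sketch of the near/far camera switching described above.
# The 4 m threshold and the capture call are assumed, not from the patent.

DEPTH_RANGE_M = 4.0  # assumed reliable range of the depth camera, in meters

def pick_camera(distance_m: float) -> str:
    """Use the depth camera up close; fall back to the RGB camera farther away."""
    return "depth" if distance_m <= DEPTH_RANGE_M else "rgb"

def capture_gesture(distance_m: float) -> dict:
    """Record which sensor would capture a child's gesture at this distance."""
    camera = pick_camera(distance_m)
    # frame = grab_frame(camera)  # hypothetical capture call, not implemented here
    return {"camera": camera, "distance_m": distance_m}

print(capture_gesture(2.0)["camera"])  # → depth
print(capture_gesture(7.5)["camera"])  # → rgb
```

The relative position obtained earlier supplies the distance, so the sensor choice follows directly from the seating layout.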
In a preferred embodiment, in step 101, the obtaining a first operation of the user of the first electronic device may further include:
obtaining a first identifier of the first electronic device; the first identification is used for identifying the first electronic equipment.
Obtaining a second identifier of the user, wherein the first identifier corresponds to the second identifier in a one-to-one manner; wherein the second identifier is used for identifying a user of the first electronic device.
Obtaining the first operation based on the first identifier and the second identifier. That is, on the basis of the first identifier of the first electronic device and the second identifier of its user, the first operation that the user corresponding to the first electronic device performs with respect to that device is obtained.
The following description again takes the first electronic device as AR glasses, the user as a child, and the second device as an AR device with a gesture recognition function.
At this time, the AR glasses have a first identifier and the child wearing them has a second identifier. After the AR device acquires the first identifier of the AR glasses and the second identifier of the child wearing them, it acquires the child's first operation, which may be a deletion operation, a selection operation, a movement operation, and so on. When there are multiple pairs of AR glasses and multiple children, the first identifiers of the AR glasses may be A, B, C, and so on; correspondingly, the second identifiers of the children wearing them may be a, b, c, and so on. The AR device acquires the first identifier A of the first AR glasses and the second identifier a of the child wearing them, and the first identifier B of the second AR glasses and the second identifier b of the child wearing them. It then obtains the first operation of the child wearing the first AR glasses on the first AR glasses, and the first operation of the child wearing the second AR glasses on the second AR glasses.
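The one-to-one identifier pairing above (glasses A, B, C matched with children a, b, c) can be sketched as a lookup. The mapping and operation names are assumptions for illustration.

```python
# Hypothetical sketch of the one-to-one identifier pairing: first identifiers
# ("A", "B", ...) name the AR glasses, second identifiers ("a", "b", ...)
# name the children wearing them, and an observed operation is attributed
# to a child/glasses pair only when the identifiers match.

GLASSES_TO_USER = {"A": "a", "B": "b", "C": "c"}  # assumed one-to-one mapping

def attribute_operation(first_id: str, second_id: str, operation: str) -> dict:
    """Accept an operation only when the glasses/user pair matches."""
    if GLASSES_TO_USER.get(first_id) != second_id:
        raise ValueError("identifier pair does not match")
    return {"glasses": first_id, "user": second_id, "operation": operation}

print(attribute_operation("A", "a", "selection"))
# → {'glasses': 'A', 'user': 'a', 'operation': 'selection'}
```

Because the mapping is one-to-one, each recognized gesture can be routed back to exactly one pair of glasses, which is what allows several children to operate simultaneously.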
Step 102: identifying operation information of the first operation;
the operation information is information of corresponding operation which the user wants the first electronic equipment to complete.
The first operation may be identified by the second electronic device or by the server-side processor.
The following description again takes the first electronic device as AR glasses, the user as a child, the first operation as a deletion operation, and the second device as an AR device with a gesture recognition function.
When this function is completed by the second electronic device:
the AR device obtains an object A in a virtual scene provided by the AR glasses and of a child wearing the AR glasses, and the AR device recognizes the deletion operation.
When this function is completed by the server-side processor:
the processor obtains the object A of the child wearing the AR glasses in a virtual scene provided by the AR glasses through the camera to perform deletion operation, and the processor recognizes the deletion operation.
Step 103: feeding back the operation information to the first electronic device;
that is, the second electronic device or the server-side processor feeds back the identified operation information to the first electronic device, so that the first electronic device executes the first operation.
The description continues with the first electronic device being an AR glasses, the user being a child, the first operation being a delete operation, and the second device being an AR device with a gesture recognition function.
When this function is completed by the second electronic device:
after the AR device recognizes the deletion operation, the deletion operation is fed back to the AR glasses.
When this function is completed by the server-side processor:
the processor recognizes the deletion operation and feeds it back to the AR glasses.
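Steps 101 through 103 on the recognizing side (second device or server) can be sketched end to end as follows. The gesture-to-operation table and the feedback channel are stand-ins: the patent deliberately leaves the recognition algorithm and the transport open.

```python
# Hypothetical sketch of steps 101-103 on the recognizing side: obtain the
# raw first operation, identify its operation information, feed it back.
# The classifier is stubbed with a lookup table (assumed, not from the patent).

GESTURE_TO_OPERATION = {
    "pinch": "select",
    "swipe_left": "delete",
    "drag": "move",
}

def identify(raw_gesture: str) -> str:
    """Step 102: map the captured first operation to operation information."""
    return GESTURE_TO_OPERATION[raw_gesture]

def handle_first_operation(glasses_id: str, raw_gesture: str, feedback_channel: list) -> str:
    """Steps 101-103 for one pair of AR glasses."""
    op_info = identify(raw_gesture)                 # step 102: identify
    feedback_channel.append((glasses_id, op_info))  # step 103: feed back
    return op_info

channel = []  # stand-in for the wireless link back to the glasses
handle_first_operation("glasses_A", "swipe_left", channel)
print(channel)  # → [('glasses_A', 'delete')]
```

The glasses themselves never run the classifier; they only consume what arrives on the feedback channel, which is the point of the lightweight design.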
Based on the same concept as the information interaction method, the invention also provides another information interaction method, which comprises the following steps:
providing a first operation of a user of a first electronic device;
receiving an operation instruction of the first operation, wherein the operation instruction is operation information identified by the second electronic device based on the first operation;
and executing the first operation based on the operation instruction.
It can be seen from the above that, in this information interaction method, the first operation of the user of the first electronic device can be obtained by an electronic device other than the first electronic device, or by a processor arranged on the server side, which identifies the operation information of the first operation and feeds it back to the first electronic device. The first electronic device therefore does not need its own function for identifying the user's first operation, and it can be designed to be lighter and more convenient to use. At the same time, because the first operation is identified by the other electronic device and the identified operation information is fed back to the first electronic device, the user can still experience the full effect of the electronic device, and the user experience is preserved.
To better understand the technical solution, a specific flow of the information interaction method is described below with reference to the accompanying drawings and specific embodiments.
As shown in fig. 2, fig. 2 is a flowchart of an information interaction method in a second embodiment of the present invention, and the information interaction method provided by the present invention includes the following steps:
step 201: providing a first operation of a user of a first electronic device;
the description will be made by taking the first electronic device as AR glasses, the user as a child, the first operation as a moving operation, and the second device as an AR device having a gesture recognition function.
That is, the AR glasses provide the AR device with the movement operation that the child wearing them makes with respect to the AR glasses.
Of course, the AR glasses may instead provide that movement operation to the processor on the server side.
Step 202: receiving an operation instruction of the first operation, wherein the operation instruction is operation information identified by the second electronic device based on the first operation;
In the following, the first electronic device is again AR glasses, the user a child, the first operation a movement operation, and the second device an AR device with a gesture recognition function.
After the AR glasses provide the AR device with the movement operation that the child makes on an object A in the virtual scene provided by the AR glasses, the AR device identifies the movement operation and sends an operation instruction for it to the AR glasses; the AR glasses receive the operation instruction of the movement operation fed back by the AR device.
When the AR glasses provide the movement operation to the processor on the server side, the second electronic device is equivalent to that processor: the processor recognizes the movement operation and sends an operation instruction for it to the AR glasses, which receive the operation instruction fed back from the processor. The processor and the AR glasses communicate through a wireless local area network (WLAN), a mobile/cellular network (3G, 4G, 5G, or later evolutions), or short-range communication such as a Bluetooth or ZigBee protocol network.
Step 203: and executing the first operation based on the operation instruction.
In the following, the first electronic device is again AR glasses, the user a child, and the first operation a movement operation.
after the AR glasses receive the operation instruction of the movement operation fed back, the movement operation is executed, and the object a in the virtual scene provided for the child in the AR glasses is moved.
Third Embodiment
Based on the same concept as the above information interaction method, as shown in fig. 3, the present invention also provides an information processing apparatus, comprising:
an acquisition unit 11 configured to acquire a first operation of a user of the first electronic device;
a processor 12 configured to identify operation information of the first operation;
a feedback unit 13, configured to feed back the operation information to the first electronic device.
Preferably, the acquisition unit 11 is further configured to acquire a relative position of the first electronic device and the second electronic device;
the processor 12 is further configured to obtain the first operation based on the relative position.
Preferably, the acquisition unit 11 is further configured to obtain a first identifier of the first electronic device;
obtaining a second identifier of the user, wherein the first identifier corresponds to the second identifier in a one-to-one manner;
the processor 12 is further configured to obtain the first operation based on the first identity and the second identity.
Fourth Embodiment
Based on the same concept as the above information interaction method, as shown in fig. 4, the present invention also provides another information processing apparatus, including:
a receiving unit 21 configured to receive an operation instruction of a first operation, wherein the operation instruction is operation information recognized by a second electronic device based on the first operation provided by a user of a first electronic device;
an execution unit 22 for executing the first operation.
Since the electronic device described in this embodiment corresponds to the information interaction method in this application, a person skilled in the art can, on the basis of that method, understand the specific implementation of this electronic device and its various variations; a detailed description is therefore omitted here. Any electronic device used by a person skilled in the art to implement the information interaction method in the embodiments of the present application falls within the scope of protection of the present application.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above embodiments are only exemplary embodiments of the present invention, and are not intended to limit the present invention, and the scope of the present invention is defined by the claims. Various modifications and equivalents may be made by those skilled in the art within the spirit and scope of the present invention, and such modifications and equivalents should also be considered as falling within the scope of the present invention.
Claims (8)
1. An information interaction method, applied to a second electronic device, the method comprising:
obtaining a first operation of a user of a first electronic device;
identifying operation information of the first operation;
feeding back the operation information to the first electronic device;
wherein the obtaining of the first operation of the user of the first electronic device includes:
respectively obtaining first identifications of a plurality of first electronic devices, wherein the first identifications are used for identifying the first electronic devices;
respectively obtaining second identifications of a plurality of users, wherein the first identifications correspond to the second identifications in a one-to-one manner, and the second identifications are used for identifying the users of the first electronic devices;
and respectively obtaining the first operations of the users of the plurality of first electronic devices based on the first identification and the second identification.
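The flow of claim 1 can be sketched as a simple second-device hub; this is a minimal illustration under stated assumptions, not the patented implementation, and every name (`SecondDevice`, `register`, `recognize`, `feed_back`) is hypothetical:

```python
# Hypothetical sketch of claim 1: a second electronic device stores the
# one-to-one mapping between first identifications (devices) and second
# identifications (users), identifies each user's first operation, and
# feeds the operation information back. All names are illustrative.

class SecondDevice:
    def __init__(self):
        # first identification -> second identification, one-to-one
        self.device_to_user = {}

    def register(self, first_id, second_id):
        """Store the one-to-one mapping between a first electronic
        device's identification and its user's identification."""
        self.device_to_user[first_id] = second_id

    def recognize(self, first_id, raw):
        """Identify operation information for the user's first operation.
        A trivial string transform stands in for real recognition
        (e.g. gesture recognition)."""
        user = self.device_to_user[first_id]
        return {"user": user, "operation": raw.upper()}

    def feed_back(self, first_id, operation_info):
        """Return the identified operation information to the first
        electronic device (here: simply return the pair)."""
        return (first_id, operation_info)


hub = SecondDevice()
hub.register("device-A", "user-1")
info = hub.recognize("device-A", "swipe-left")
print(hub.feed_back("device-A", info))
```

Because the mapping is one-to-one, the second device can attribute each recognized operation to exactly one user and route the result back to exactly one first device.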
2. The information interaction method of claim 1, wherein the obtaining of the first operation of the user of the first electronic device comprises:
obtaining a relative position of the first electronic device and the second electronic device;
obtaining the first operation based on the relative position.
3. The information interaction method of claim 2, wherein the obtaining of the relative positions of the first electronic device and the second electronic device comprises:
obtaining a designated position set by the first electronic device;
obtaining the relative position based on the specified position.
4. The information interaction method of claim 2, wherein the obtaining of the relative positions of the first electronic device and the second electronic device comprises:
obtaining identification information of the first electronic device;
obtaining the relative position based on the identification information.
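Claims 3 and 4 describe two ways of deriving the relative position of the devices; the sketch below illustrates both under assumed conventions (3-D coordinate tuples, a lookup of where a device's identification marker was detected) — the function names and coordinate scheme are assumptions, not part of the claims:

```python
# Hypothetical sketch of claims 2-4: the second electronic device derives
# the relative position of a first electronic device either from a
# designated position the first device sets (claim 3), or from where the
# first device's identification information was detected (claim 4).

def relative_position(second_pos, first_pos):
    """Vector from the second device to the first device."""
    return tuple(f - s for f, s in zip(first_pos, second_pos))

def from_designated_position(second_pos, designated_pos):
    # Claim 3: the first device reports a designated position directly.
    return relative_position(second_pos, designated_pos)

def from_identification(second_pos, detections, first_id):
    # Claim 4: look up where the first device's identification
    # information (e.g. a visual marker) was detected.
    return relative_position(second_pos, detections[first_id])


second = (0.0, 0.0, 0.0)
print(from_designated_position(second, (1.0, 2.0, 0.5)))
print(from_identification(second, {"dev-A": (3.0, 0.0, 1.0)}, "dev-A"))
```

Either path yields the same kind of relative-position vector, which the second device can then use to attribute an observed first operation to the right first device.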
5. An information interaction method, applied to a plurality of first electronic devices having first identifications, the method comprising:
providing first operations corresponding to users of a plurality of first electronic devices respectively;
respectively receiving, from a second electronic device, a plurality of pieces of operation information identified based on the plurality of first operations;
based on the operation information, executing the corresponding first operation;
wherein the providing of the first operations corresponding to the users of the plurality of first electronic devices respectively comprises:
respectively providing first identifications of a plurality of first electronic devices, wherein the first identifications are used for identifying the first electronic devices;
respectively providing second identifications of a plurality of users, wherein the first identifications correspond to the second identifications in a one-to-one manner, and the second identifications are used for identifying the users of the first electronic devices;
so that the second electronic device obtains the first operations of the users of the plurality of first electronic devices respectively based on the first identifications and the second identifications.
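Claim 5 describes the same exchange from the first-device side; a minimal sketch follows, with all class and method names (`FirstDevice`, `identifications`, `receive`) being illustrative assumptions:

```python
# Hypothetical sketch of claim 5, from a first electronic device's view:
# it provides its first/second identification pair, receives the
# operation information identified by the second device, and executes
# the corresponding first operation (here: records it).

class FirstDevice:
    def __init__(self, first_id, second_id):
        self.first_id = first_id    # identifies this device
        self.second_id = second_id  # identifies this device's user
        self.executed = []

    def identifications(self):
        """Provide the one-to-one (device, user) identification pair."""
        return (self.first_id, self.second_id)

    def receive(self, operation_info):
        """Receive operation information from the second device and
        execute the corresponding first operation."""
        self.executed.append(operation_info["operation"])
        return self.executed[-1]


dev = FirstDevice("device-A", "user-1")
print(dev.identifications())
print(dev.receive({"operation": "SWIPE-LEFT"}))
```

Because recognition happens on the second device, the first device only needs to identify itself and act on the returned operation information — which is the lightweight-design point the abstract emphasizes.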
6. An electronic device, comprising:
an acquisition unit configured to obtain a first operation of a user of a first electronic device;
a processor configured to identify operation information of the first operation;
a feedback unit for feeding back the operation information to the first electronic device;
the acquisition unit is further configured to:
respectively obtaining first identifications of a plurality of first electronic devices, wherein the first identifications are used for identifying the first electronic devices;
respectively obtaining second identifications of a plurality of users, wherein the first identifications correspond to the second identifications in a one-to-one manner, and the second identifications are used for identifying the users of the first electronic devices;
the processor is further configured to obtain the first operations of the users of the plurality of first electronic devices based on the first and second identifications, respectively.
7. The electronic device of claim 6, wherein the acquisition unit is further configured to obtain a relative position between the first electronic device and the electronic device;
the processor is further configured to obtain the first operation based on the relative position.
8. An electronic device, comprising:
a providing unit configured to provide first operations corresponding to users of a plurality of the electronic devices, respectively, the electronic devices having first identifications;
a receiving unit configured to respectively receive, from a second electronic device, operation information identified based on a plurality of first operations;
an execution unit, configured to execute the corresponding first operation based on the operation information;
wherein the providing unit is specifically configured to:
respectively providing first identifications of a plurality of electronic devices, wherein the first identifications are used for identifying the electronic devices;
respectively providing second identifications of a plurality of users, wherein the first identifications correspond to the second identifications in a one-to-one manner, and the second identifications are used for identifying the users of the electronic devices;
so that the second electronic device obtains the first operations of the users of the plurality of electronic devices respectively based on the first identifications and the second identifications.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711193606.1A CN107943293B (en) | 2017-11-24 | 2017-11-24 | Information interaction method and information processing device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107943293A CN107943293A (en) | 2018-04-20 |
CN107943293B true CN107943293B (en) | 2021-01-15 |
Family
ID=61948752
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711193606.1A Active CN107943293B (en) | 2017-11-24 | 2017-11-24 | Information interaction method and information processing device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107943293B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105103082A (en) * | 2012-12-11 | 2015-11-25 | Microsoft Technology Licensing, LLC | People-triggered holographic reminders
CN106716306A (en) * | 2014-09-30 | 2017-05-24 | Sony Interactive Entertainment Inc. | Synchronize multiple HMDs to Unity Space and make object movements in Unity Space related
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101252169B1 (en) * | 2011-05-27 | 2013-04-05 | LG Electronics Inc. | Mobile terminal and operation control method thereof
CN104834249A (en) * | 2015-03-16 | 2015-08-12 | 张时勉 | Wearable remote controller
CN105068649A (en) * | 2015-08-12 | 2015-11-18 | 深圳市埃微信息技术有限公司 | Binocular gesture recognition device and method based on virtual reality helmet
CN105045398B (en) * | 2015-09-07 | 2018-04-03 | 哈尔滨市一舍科技有限公司 | Virtual reality interaction device based on gesture recognition
CN205581784U (en) * | 2016-04-14 | 2016-09-14 | 江苏华博创意产业有限公司 | Interactive mixed reality platform based on real scenes
CN106200981B (en) * | 2016-07-21 | 2019-06-28 | 北京小鸟看看科技有限公司 | Virtual reality system and wireless implementation method thereof
CN206400472U (en) * | 2016-08-24 | 2017-08-11 | 王忠民 | Virtual reality device and positioning system thereof
- 2017-11-24: Application CN201711193606.1A filed; granted as CN107943293B (status: Active)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3237991B1 (en) | Communication system comprising head wearable devices | |
KR20220160665A (en) | A collection of augmented reality items | |
CN107390863B (en) | Device control method and device, electronic device and storage medium | |
CN109918975A (en) | An augmented reality processing method, object recognition method and terminal | |
CN109271125A (en) | Screen display control method and device of split type terminal equipment and storage medium | |
EP3163473A1 (en) | Video playing method and device | |
US10664011B2 (en) | Wearable apparatus and method for controlling VR apparatus | |
CN105026981A (en) | Methods and apparatus for displaying images on a head mounted display | |
CN104469158A (en) | Mobile shooting and shooting control method and device
US10084986B2 (en) | System and method for video call using augmented reality | |
US20160228763A1 (en) | Method and apparatus for adjusting game scene | |
WO2016110009A1 (en) | Control method, system and apparatus for projection device | |
CN103916978A (en) | Wireless connection establishing method and electronic devices | |
CN106168895A (en) | Voice control method for intelligent terminal, and intelligent terminal
CN109389687A (en) | AR-based information processing method, apparatus, device and readable storage medium
WO2015184903A1 (en) | Picture processing method and device | |
CN103631225B (en) | Remote control method and device for scene devices
CN106803854A (en) | Method, device and wearable device for receiving and sending information
CN107943293B (en) | Information interaction method and information processing device
CN104133607A (en) | Handwriting sharing method and handwriting sharing device
CN106161725B (en) | Information processing method and electronic device
CN106445121A (en) | Virtual reality device and terminal interaction method and apparatus | |
US12282592B2 (en) | Co-located full-body gestures | |
CN108459716B (en) | Method for realizing multi-person cooperation to complete task in VR | |
CN104598140A (en) | Information processing method and first electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||