CN111813220A - Interactive system based on augmented reality or virtual reality intelligent head-mounted equipment - Google Patents
- Publication number
- CN111813220A CN111813220A CN202010568809.XA CN202010568809A CN111813220A CN 111813220 A CN111813220 A CN 111813220A CN 202010568809 A CN202010568809 A CN 202010568809A CN 111813220 A CN111813220 A CN 111813220A
- Authority
- CN
- China
- Prior art keywords
- module
- chip
- touch
- wireless communication
- head
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Abstract
The invention discloses an interactive system based on an augmented reality or virtual reality intelligent head-mounted device, comprising a head-mounted device having a display component. The head-mounted device is provided with a first control unit, a storage module and a display control unit; the first control unit has a display output end for outputting pictures to the display component, and the storage module provides storage space. The head-mounted device is further provided with a camera unit and a first wireless communication module. The interactive system further comprises a handle provided with a system control chip, a second wireless communication module, a six-axis sensor module, a physical key module, a touch control module and a controllable power supply; the second wireless communication module, six-axis sensor module, physical key module, touch control module and controllable power supply are electrically connected with the system control chip. The physical key module and the touch control module acquire operation information input by the user; the six-axis sensor module acquires the position information of a preset point on the handle and the position change information of that point; and the second wireless communication module sends the operation information, the position information and the position change information to the first wireless communication module.
Description
Technical Field
The invention relates to the field of wireless handles and, in particular, to an interaction system based on an augmented reality or virtual reality intelligent head-mounted device.
Background
VR (Virtual Reality) is a computer simulation technology that creates a virtual world for the user to experience. A computer generates a simulated environment, an interactive three-dimensional dynamic scene that fuses multi-source information and simulated physical behavior, and immerses the user in that environment.
AR (Augmented Reality) is a technology that skillfully fuses virtual information with the real world. Using techniques such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction and sensing, computer-generated virtual information such as text, images, three-dimensional models, music and video is simulated and then overlaid on the real world. The two kinds of information complement each other, so that the real world is enhanced.
DoF is an abbreviation for Degrees of Freedom. 3DoF means three rotational degrees of freedom.
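As a concrete illustration of 3DoF (not part of the patent), a 3DoF pose can be held as three rotation angles and updated by integrating gyroscope angular rates over time:

```python
# Minimal 3DoF sketch: a pose is just (yaw, pitch, roll) angles, and one
# gyroscope sample advances each angle by rate * dt (Euler integration).
def integrate_3dof(pose, gyro_rates, dt):
    """pose and gyro_rates are (yaw, pitch, roll) tuples in degrees
    and degrees/second; dt is the sample interval in seconds."""
    return tuple(a + r * dt for a, r in zip(pose, gyro_rates))

pose = (0.0, 0.0, 0.0)
# One 0.1 s sample: yaw rate 10 deg/s, roll rate -5 deg/s
pose = integrate_3dof(pose, (10.0, 0.0, -5.0), 0.1)
```

This is only the rotational half of tracking; the patent's six-axis module additionally reports linear acceleration.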
With the continuous development and popularization of modern electronic technology, VR (virtual reality) and AR (augmented reality) are ever more widely used in daily life. VR and AR devices that are convenient to operate, low-latency and user-friendly are increasingly valued by the market, and the industry has devised a wide variety of means to improve these metrics.
Traditional VR or AR devices are controlled by three head motions: shaking the head, turning it left and right, and moving it up and down. This is inconvenient and far from user-friendly, and poses particular difficulty for people with neck problems. Moreover, because the device is controlled by body motion, the limited speed of body motion all but rules out fast control; that is, latency is severe. A further drawback is that traditional VR or AR devices transmit signals over a cable, which greatly reduces convenience. Some devices have been improved but rely on mechanical keys, which compromises ease of operation and is a poor design in terms of power consumption. For these reasons, conventional VR or AR devices cannot meet users' requirements for convenient operation, low latency and user-friendliness.
Disclosure of Invention
The invention aims to solve these technical problems by providing an interactive system based on an augmented reality or virtual reality intelligent head-mounted device. The system is operated with physical keys and a touch panel, automatically detects the attitude of a handle, and uses wireless signals for fast, low-latency transmission; system power consumption is low and operation is convenient.
In order to achieve the purpose, the invention adopts the technical scheme that:
An interactive system based on an augmented reality or virtual reality intelligent head-mounted device includes: the head-mounted device, provided with a display part through which the user perceives pictures; the head-mounted device is provided with a first control unit, a storage module and a display control unit, wherein the first control unit has a display output end for outputting the picture to the display part, and the storage module provides storage space and can store a preset program and acquired and processed data; the head-mounted device is further provided with a camera unit that can acquire data based on environmental information to output image and/or video signals; the head-mounted device is further provided with a first wireless communication module that can receive a control signal from the outside, and the first control unit can execute a preset action based on the control signal, the preset action including changing the picture perceived by the user based on the control signal; the interactive system further comprises a handle provided with a system control chip, a second wireless communication module, a six-axis sensor module, a physical key module, a touch control module and a controllable power supply, the second wireless communication module, six-axis sensor module, physical key module, touch control module and controllable power supply being electrically connected with the system control chip; the physical key module and the touch control module acquire operation information input by the user; the six-axis sensor module acquires the position information of a preset point on the handle and the position change information of that point; and the second wireless communication module sends the operation information, the position information and the position change information to the first wireless communication module.
Preferably, the head-mounted device is augmented reality glasses.
Preferably, the head-mounted device is virtual reality glasses.
Preferably, the first wireless communication module includes a first bluetooth chip, and the second wireless communication module includes a second bluetooth chip capable of performing communication connection with the first bluetooth chip based on a bluetooth protocol.
Preferably, the handle further comprises a micro control unit, and the micro control unit can control the on-off of the controllable power supply.
Preferably, the controllable power supply comprises a removable battery.
Preferably, the physical key module comprises a determination key, a startup key and a return key.
Preferably, the battery is a rechargeable battery, and the handle is provided with a charging interface.
Preferably, the six-axis sensor module includes a three-axis gyroscope and a three-axis accelerometer.
Preferably, the model of the system control chip is nRF52832, the six-axis sensor module comprises a six-axis sensor chip MPU6000 integrating a three-axis gyroscope and a three-axis accelerometer, the touch control module comprises a touch chip with a model of PCT1322QK, a touch pad interface and a touch panel, the touch chip is in communication connection with the system control chip through I2C, and the six-axis sensor chip is in SPI communication connection with the system control chip.
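The patent does not give register-level detail of the SPI link between the system control chip and the six-axis sensor chip. Purely as an illustration, and assuming the public MPU-60x0 register map (where a read transaction sets the most significant bit of the register address), a read frame could be built like this:

```python
MPU_READ_FLAG = 0x80  # MSB of the address byte marks a read on MPU-series SPI

def mpu_spi_read_frame(reg, n_bytes):
    """Build the MOSI byte sequence for a burst read of n_bytes starting
    at register reg: address byte with read flag, then dummy clock bytes."""
    return bytes([reg | MPU_READ_FLAG]) + bytes(n_bytes)

# WHO_AM_I of the MPU-60x0 family is register 0x75 (public datasheet value)
frame = mpu_spi_read_frame(0x75, 1)
```

On real hardware the frame would be clocked out with an SPI driver while reading MISO; here only the framing is shown.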
The invention has the following technical effects. By providing an interactive system based on an augmented reality or virtual reality smart headset as described above, comprising the head-mounted device with its display part, first control unit, storage module, camera unit and first wireless communication module, together with the handle carrying the system control chip, second wireless communication module, six-axis sensor module, physical key module, touch control module and controllable power supply, the user can select targets and operate without moving the head. This solves a problem of concern to the industry: it improves efficiency of use and reduces neck fatigue while the device is worn, and engaging hand coordination helps train the motor nerves and their coordination with the brain for the purpose of mental training, which benefits health.
In addition, in a preferred embodiment, by using an nRF52832 system control chip that integrates a 32-bit micro control unit with wireless Bluetooth, an MPU6000 six-axis sensor module that integrates a three-axis gyroscope and a three-axis accelerometer, and a PCT1322QK five-point capacitive touch chip, the invention achieves high system integration, small circuit volume and low power consumption, and can be powered by a battery.
Furthermore, in a preferred embodiment, the physical key module and the touch control module are each in electrically controlled connection with the system control chip; the handle's system control chip communicates interactively with AR devices, VR devices, PC devices and the like over 2.4 GHz wireless Bluetooth; and the six-axis sensor module and the touch control module replace head-motion control with control by handle attitude and touch movement, making interactive control more comfortable, essentially free of perceptible delay, and enhancing the user experience.
Drawings
The invention is described in further detail below with reference to the following figures and embodiments:
FIG. 1 is a schematic circuit diagram of a handle in one embodiment of the present invention;
FIG. 2 is a circuit schematic of one embodiment of the micro-control unit of FIG. 1;
FIG. 3 is a circuit schematic of one embodiment of the six-axis sensor module of FIG. 1;
FIG. 4 is a circuit schematic of one embodiment of the physical key module of FIG. 1;
FIG. 5 is a circuit schematic of one embodiment of the touch chip of FIG. 1;
FIG. 6 is a circuit schematic of one embodiment of the touch interface of FIG. 1;
FIG. 7 is a circuit schematic of one embodiment of the controllable power supply of FIG. 1;
fig. 8 is a block diagram of the system components of the present invention.
Reference numbers in the figures:
system control chip-100; a micro control unit-201; wireless bluetooth transmission module-202; six-axis sensor module-300; a three-axis gyroscope-301; a three-axis accelerometer-302; physical key module-400; touch control Module-500; a touch chip-501; touch pad interface-502; a touch panel-503; a controllable power supply-601; the smart headset 700; a display section 701; a storage module 702; an image pickup unit 703; a first control unit 704; a first wireless communication module 705; a handle 800.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
In the present invention, unless otherwise expressly specified or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, mean fixedly connected, detachably connected, or integrally connected; mechanically or electrically connected; connected directly or indirectly through an intermediate medium; or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In the present invention, unless otherwise expressly stated or limited, "above" or "below" a first feature means that the first and second features are in direct contact, or that the first and second features are not in direct contact but are in contact with each other via another feature therebetween. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The invention aims to solve the prior-art problem that an application can be selected only by head movement, which is inconvenient in use. It therefore provides an interactive system based on an augmented reality or virtual reality intelligent head-mounted device that the user operates by hand instead of with the traditional head movements, achieving a good experience and convenient use. At the same time, engaging hand coordination helps train the motor nerves and their coordination with the brain for the purpose of mental training, which benefits health.
In one possible implementation, the interaction system based on an augmented reality or virtual reality smart headset is as set forth above: a head-mounted device with a display part, a first control unit, a storage module, a display control unit, a camera unit and a first wireless communication module, and a handle with a system control chip to which a second wireless communication module, a six-axis sensor module, a physical key module, a touch control module and a controllable power supply are electrically connected. The physical key module and the touch control module acquire the operation information input by the user; the six-axis sensor module acquires the position information of a preset point on the handle and the position change information of that point; and the second wireless communication module sends the operation information, the position information and the position change information to the first wireless communication module.
In the invention, the head-mounted equipment can be augmented reality glasses or virtual reality glasses.
In the invention, the first wireless communication module comprises a first Bluetooth chip, and the second wireless communication module comprises a second Bluetooth chip which can be in communication connection with the first Bluetooth chip based on a Bluetooth protocol.
In a possible embodiment, the handle further comprises a micro control unit, which can control the on/off of the controllable power source. Preferably, the controllable power supply comprises a detachable battery, more preferably, the battery is a rechargeable battery, and the handle is provided with a charging interface.
In a possible implementation manner, the physical key module comprises a determination key, a starting key and a return key.
In one possible implementation, the six-axis sensor module includes a three-axis gyroscope and a three-axis accelerometer.
Specific embodiments of the present invention will be described below with reference to fig. 1 to 8.
Referring to fig. 1 and 8, an interaction system based on an augmented reality or virtual reality smart headset is shown, including a handle 800 and a smart headset 700, where the handle 800 includes: the system comprises a micro control unit 201, a wireless Bluetooth transmission module 202 serving as a second wireless communication module, a six-axis sensor module 300, a three-axis gyroscope 301, a three-axis accelerometer 302, a physical key module 400, a touch control module 500, a touch chip 501, a touch pad interface 502, a touch panel 503 and a controllable power supply 601.
The head-mounted device 700 is provided with a display part 701 through which the user perceives pictures; in one embodiment the display part comprises a display screen mounted on the head-mounted device. The head-mounted device 700 is provided with a first control unit 704 having a display output end for outputting the picture to the display part 701, and with a storage module 702 for providing storage space; the storage module 702 can store a preset program and acquired and processed data. The head-mounted device 700 is further provided with a camera unit 703 that can acquire data based on environmental information to output image and/or video signals. The head-mounted device 700 is further provided with a first wireless communication module 705 that can receive an external control signal; the first control unit 704 can execute a preset action based on the control signal, the preset action including changing the picture perceived by the user, for example moving a cursor or entering a related display interface.
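A hypothetical sketch of how the first control unit might dispatch incoming control signals to preset actions such as moving a cursor or entering an interface; the opcode names and state layout below are invented for illustration and are not from the patent:

```python
# Sketch of a control-signal dispatcher for the headset's first control unit.
def make_dispatcher():
    state = {"cursor": [0, 0], "interface": "home"}

    def handle(signal):
        kind, payload = signal
        if kind == "move":        # touch-pad delta from the handle -> cursor move
            state["cursor"][0] += payload[0]
            state["cursor"][1] += payload[1]
        elif kind == "enter":     # confirm key -> open the targeted interface
            state["interface"] = payload
        return state

    return handle

handle = make_dispatcher()
handle(("move", (5, -3)))         # cursor moves without any head motion
```

The point of the design is visible even in this toy: all selection happens through handle input, never through head movement.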
The system control chip 100 implements the control method and the wireless Bluetooth transmission and reception in this embodiment. Because it adopts SoC technology, the system control chip 100 contains the micro control unit 201 and the wireless Bluetooth transmission module 202, i.e. it performs the functions of both.
The six-axis sensor module 300 is electrically connected to the system control chip 100, the physical key module 400 is electrically connected to the system control chip 100, the touch control module 500 is electrically connected to the system control chip 100, and the controllable power source 601 is electrically connected to the touch control module 500, the six-axis sensor module 300, and the system control chip 100, respectively.
The six-axis sensor module 300 integrates the three-axis gyroscope 301 and the three-axis accelerometer 302 in a single chip, so the module performs the functions of both; the six-axis sensor module 300 is used to capture the operating attitude of the handle.
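The patent does not say how the gyroscope and accelerometer outputs are fused into a handle attitude. One common technique is a complementary filter, sketched here purely as an assumption (the blend weight `alpha` and the axis convention are illustrative):

```python
import math

def complementary_pitch(pitch, gyro_rate, accel_xyz, dt, alpha=0.98):
    """One complementary-filter step for the pitch angle (degrees):
    trust the integrated gyro rate short-term, and the accelerometer's
    gravity direction long-term; alpha weights the two sources."""
    ax, ay, az = accel_xyz
    # Pitch implied by gravity alone (device at rest): tilt of the x axis
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Handle level (gravity on +z), no rotation: pitch stays at zero
pitch = complementary_pitch(0.0, 0.0, (0.0, 0.0, 1.0), 0.01)
```

The gyro term tracks fast motion with low latency, while the accelerometer term prevents the integrated angle from drifting, which matches the module's stated role of capturing the operating attitude.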
The physical key module 400 is electrically connected to the system control chip 100 and in this embodiment provides the power-on function, the return function, the confirm function, and the confirm (press) function of the touch panel.
The touch control module 500 is electrically connected to the system control chip 100, the touch control module 500 includes a touch chip 501, a touch pad interface 502, and a touch panel 503, and the touch control module 500 is used for collecting and processing thumb sliding and pressing operations of a user in this embodiment.
The controllable power source 601 is electrically connected to the system control chip 100, the controllable power source 601 is electrically connected to the six-axis sensor module 300, and the controllable power source 601 supplies power to the touch control module 500 and the six-axis sensor module 300 in this embodiment and controls power consumption.
Referring to fig. 2 and fig. 3, in the present embodiment, the system control chip 100 is a radio frequency control chip with a 32-bit controller, manufactured by Nordic Semiconductor, model nRF52832. The six-axis sensor module 300 is manufactured by TDK Corporation of Japan, model MPU6000. The system control chip 100 has a control data input end 212, the six-axis sensor module 300 has a control data output end 311, and the control data input end 212 of the system control chip 100 is electrically connected to the control data output end 311 of the six-axis sensor module 300.
The six-axis sensor module 300 internally integrates the three-axis gyroscope 301 and the three-axis accelerometer 302, so it performs the functions of both; the six-axis sensor module 300 and the system control chip 100 communicate via the SPI bus protocol.
Referring to fig. 2 and fig. 4, in this embodiment, the physical key module 400 refers to the physical key set consisting of the handle's confirm key, power-on key, return key, and the confirm key under the capacitive touch panel. The physical key module 400 has a control data output end 411, the system control chip 100 has a control data input end 211, and the control data input end 211 of the system control chip 100 is electrically connected to the control data output end 411 of the physical key module 400.
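A minimal sketch of how the micro control unit might decode the key module's output; the bit assignments below are invented for illustration and are not taken from the patent:

```python
# Hypothetical key bitmask read from the key module's data output end.
KEYS = {0x01: "confirm", 0x02: "power", 0x04: "back", 0x08: "touchpad_press"}

def decode_keys(raw):
    """Return the names of all keys whose bit is set in the raw byte,
    in a fixed (bit-order) sequence."""
    return [name for bit, name in sorted(KEYS.items()) if raw & bit]

pressed = decode_keys(0x05)   # confirm and back held together
```

In firmware this would typically run after debouncing the raw key inputs; the decoding itself is independent of that.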
Referring to fig. 2 and fig. 5, in the present embodiment, the touch control module 500 includes a touch chip 501, a touch pad interface 502, and a touch panel 503. The touch chip 501 has a control data input end 511, the system control chip 100 has a control data output end 213, and the control data input end 511 of the touch chip 501 is electrically connected to the control data output end 213 of the system control chip 100.
The touch chip 501 and the system control chip 100 communicate via the I2C bus protocol.
Referring to fig. 5 and fig. 6, in the present embodiment, the touch chip 501 has a touch signal data input 517 and a touch signal data input 514, and the touch pad interface 502 has a touch signal data output 515 and a touch signal data output 516; the touch signal data input end 517 of the touch chip 501 is electrically connected with the touch signal data output end 515 of the touch pad interface 502, and the touch signal data input end 514 of the touch chip 501 is electrically connected with the touch signal data output end 516 of the touch pad interface 502.
The touch pad interface 502 is electrically connected to the touch panel 503 through an FPC13 interface package. The user slides and presses the touch panel 503 with a thumb; the touch signals are sampled and sent to the touch chip 501 for processing, and finally fed back to the system control chip 100 over the I2C bus protocol, improving comfort of use.
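The PCT1322QK's report format is not described in the patent. Purely for illustration, here is a sketch that decodes a hypothetical single-touch report as it might be read back over I2C (the byte layout is entirely invented):

```python
# Hypothetical 4-byte touch report: [contact flags, x low byte, x high byte, y].
def decode_touch_report(report):
    """Decode one touch point from a raw report. Returns (contact, x, y)."""
    contact = bool(report[0] & 0x01)      # bit 0: finger present
    x = report[1] | (report[2] << 8)      # 16-bit x, little-endian
    y = report[3]                         # 8-bit y for this toy layout
    return contact, x, y

contact, x, y = decode_touch_report(bytes([0x01, 0x34, 0x01, 0x40]))
```

A real driver would read the chip's actual registers and handle up to five points, since the chip is described as five-point capacitive touch.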
Referring to fig. 2 and fig. 7, in the present embodiment, the controllable power source 601 has a data input end 611, the system control chip 100 has a control data output end 215, and the data input end 611 of the controllable power source 601 is electrically connected to the data output end 215 of the system control chip 100.
Referring to fig. 7, fig. 3 and fig. 5, in the present embodiment, the controllable power supply 601 has a power supply output end 612, the six-axis sensor module 300 has a power supply input end 312, and the touch chip 501 has a power supply input end 512; the power supply output end 612 of the controllable power supply 601 is electrically connected to the power supply input end 312 of the six-axis sensor module 300 and to the power supply input end 512 of the touch chip 501.
Referring to fig. 2 and fig. 5, in the present embodiment, the system control chip 100 has a data input end 214, the touch chip 501 has a data output end 513, and the data input end 214 of the system control chip 100 is electrically connected to the data output end 513 of the touch chip 501.
In this embodiment, after the touch chip 501 receives and processes the touch data of the touch panel 503, it generates a trigger signal at the data output end 513 and inputs it to port 214 of the system control chip 100, triggering the system control chip 100 to read the touch data. At the same time, the system control chip 100 drives port 611 of the controllable power supply 601 according to the trigger signal, thereby controlling whether port 612 of the controllable power supply 601 outputs power.
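The trigger-gated power flow described above can be sketched as follows; the class and method names are illustrative, not from the patent:

```python
class ControllablePower:
    """Toy model of the trigger path: a touch event both delivers data
    and causes the MCU to enable the controllable supply's output rail."""
    def __init__(self):
        self.rail_on = False      # state of the power output (port 612)
        self.delivered = []       # touch data accepted by the MCU

    def on_trigger(self, touch_data):
        # Trigger signal arrives at the MCU: enable the rail and take the data.
        self.rail_on = True
        self.delivered.append(touch_data)
        return touch_data

psu = ControllablePower()
psu.on_trigger((120, 80))         # a touch at hypothetical coordinates
```

Gating the supply on demand like this is what lets the handle keep the touch and sensor circuitry powered down between interactions, matching the patent's low-power aim.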
The beneficial effects of implementing the embodiment at least comprise:
1. By using the nRF52832 system control chip 100, which integrates a 32-bit micro control unit and wireless Bluetooth, the MPU6000 six-axis sensor module 300, which integrates the three-axis gyroscope 301 and the three-axis accelerometer 302, and the PCT1322QK five-point capacitive touch chip 501, the whole system of this embodiment achieves high integration, small circuit volume and low power consumption, and can be powered by a battery.
2. The physical key module 400 and the touch control module 500 are each in electrically controlled connection with the system control chip 100; the handle's system control chip 100 communicates interactively with AR devices, VR devices, PCs and the like over 2.4 GHz wireless Bluetooth; and the six-axis sensor module 300 and the touch control module 500 replace head-motion control with control by handle attitude and touch movement, making interactive control more comfortable, essentially free of perceptible delay, and enhancing the user experience.
The present invention has been described in detail with reference to the above embodiments, but it is not limited thereto. The scope of the present invention is not limited to the above embodiments; equivalent modifications and variations made by those skilled in the art in light of the present disclosure fall within the scope of the appended claims.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like, mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The foregoing is a more detailed description of the present invention that is presented in conjunction with specific embodiments, and the practice of the invention is not to be considered limited to those descriptions. It will be apparent to those skilled in the art that a number of simple derivations or substitutions can be made without departing from the inventive concept.
Claims (10)
1. An interactive system based on an augmented reality or virtual reality intelligent head-mounted device, characterized in that it comprises:
the head-mounted device, which is provided with a display part used by a user to perceive a picture;
the head-mounted device is provided with a first control unit, a storage module and a display control unit, the first control unit having a display output end for outputting the picture to the display part, and the storage module providing storage space for preset programs and for acquired and processed data;
the head-mounted device is further provided with a camera unit capable of acquiring data based on environmental information and outputting image and/or video signals;
the head-mounted device is further provided with a first wireless communication module capable of receiving a control signal from the outside, the first control unit being capable of executing a preset action based on the control signal, the preset action comprising changing the picture perceived by the user based on the control signal;
the interactive system further comprises a handle provided with a system control chip, a second wireless communication module, a six-axis sensor module, a physical key module, a touch control module and a controllable power supply, the second wireless communication module, the six-axis sensor module, the physical key module, the touch control module and the controllable power supply being electrically connected with the system control chip;
the physical key module and the touch control module are used for acquiring operation information input by the user;
the six-axis sensor module is used for acquiring position information of a preset point on the handle and position change information of the preset point;
the second wireless communication module is configured to send the operation information, the position information and the position change information to the first wireless communication module.
2. The augmented reality or virtual reality smart headset-based interaction system of claim 1, wherein the headset is augmented reality glasses.
3. The augmented reality or virtual reality smart headset-based interaction system of claim 1, wherein the headset is virtual reality glasses.
4. The augmented reality or virtual reality smart headset-based interaction system of claim 1, wherein the first wireless communication module comprises a first bluetooth chip and the second wireless communication module comprises a second bluetooth chip communicatively connectable to the first bluetooth chip based on a bluetooth protocol.
5. The augmented reality or virtual reality smart headset-based interaction system of claim 1, wherein the handle further comprises a micro control unit capable of switching the controllable power supply on and off.
6. The augmented reality or virtual reality smart headset-based interaction system of claim 5, wherein the controllable power source comprises a removable battery.
7. The augmented reality or virtual reality smart headset-based interaction system of claim 1, wherein the physical key module comprises a confirm key, a power key and a return key.
8. The augmented reality or virtual reality smart headset-based interaction system of claim 6, wherein the battery is a rechargeable battery and a charging interface is arranged on the handle.
9. The augmented reality or virtual reality smart headset-based interaction system of any one of claims 1-8, wherein the six-axis sensor module comprises a three-axis gyroscope and a three-axis accelerometer.
10. The augmented reality or virtual reality smart headset-based interaction system of claim 9, wherein the system control chip is of model nRF52832, the six-axis sensor module comprises a six-axis sensor chip MPU6000 integrating the three-axis gyroscope and the three-axis accelerometer, the touch control module comprises a touch chip of model PCT1322QK, a touch pad interface and a touch panel, the touch chip establishes an I2C communication connection with the system control chip, and the six-axis sensor chip establishes an SPI communication connection with the system control chip.
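Claim 10 names the MPU6000 six-axis chip read over SPI by the nRF52832. A typical MPU6000 driver burst-reads 14 bytes starting at the accelerometer output registers and decodes them as big-endian signed 16-bit values; the sketch below shows only that decoding step, and it assumes the chip's power-on default full-scale ranges (+/-2 g, +/-250 deg/s), which the patent does not specify.

```python
import struct

ACCEL_XOUT_H = 0x3B  # first register of the MPU6000's 14-byte sensor burst

def parse_mpu6000_burst(buf):
    """Decode a 14-byte MPU6000 burst read: accel XYZ, temperature, gyro XYZ.

    The registers hold big-endian signed 16-bit values. Scale factors
    below assume the power-on default ranges (+/-2 g, +/-250 deg/s);
    the patent does not state how the chip is configured.
    """
    ax, ay, az, t, gx, gy, gz = struct.unpack(">7h", buf)
    accel = tuple(v / 16384.0 for v in (ax, ay, az))  # in g (16384 LSB/g)
    temp_c = t / 340.0 + 36.53                        # datasheet conversion
    gyro = tuple(v / 131.0 for v in (gx, gy, gz))     # in deg/s (131 LSB per deg/s)
    return accel, temp_c, gyro
```

On the real handle this buffer would come from an SPI transaction issued by the nRF52832; here it is shown as a pure parsing function so the decoding can be checked in isolation.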
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010568809.XA | 2020-06-19 | 2020-06-19 | Interactive system based on augmented reality or virtual reality intelligent head-mounted equipment |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN111813220A (en) | 2020-10-23 |
Family
ID=72845338
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010568809.XA (Pending) | Interactive system based on augmented reality or virtual reality intelligent head-mounted equipment | 2020-06-19 | 2020-06-19 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN111813220A (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106226903A (en) * | 2016-08-02 | 2016-12-14 | 彭顺德 | A kind of virtual reality helmet |
| CN106445098A (en) * | 2016-07-29 | 2017-02-22 | 北京小米移动软件有限公司 | Control method and control apparatus used for head-mounted device, and mobile device |
| US20170115689A1 (en) * | 2015-10-21 | 2017-04-27 | Beijing Pico Technology Co., Ltd. | Virtual reality glasses |
| CN207833717U (en) * | 2017-09-21 | 2018-09-07 | 李颖智 | A kind of public security training system based on virtual reality |
| CN109375764A (en) * | 2018-08-28 | 2019-02-22 | 北京凌宇智控科技有限公司 | A kind of head-mounted display, cloud server, VR system and data processing method |
| CN111124136A (en) * | 2019-12-31 | 2020-05-08 | 维沃移动通信有限公司 | Virtual picture synchronization method and wearable device |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3797344B1 (en) | Computer systems with finger devices | |
| CN205563457U (en) | Virtual reality's portable host computer, virtual reality helmet and system | |
| US8217893B2 (en) | Inertial sensor-based pointing device with removable transceiver | |
| CN105868738B (en) | Intelligent bracelet | |
| JP3176588U (en) | Remote touch circuit and attachment | |
| CN102779000B (en) | User interaction system and method | |
| US20130241927A1 (en) | Computer device in form of wearable glasses and user interface thereof | |
| CN104618712A (en) | Head wearing type virtual reality equipment and virtual reality system comprising equipment | |
| US20130265300A1 (en) | Computer device in form of wearable glasses and user interface thereof | |
| CN203217500U (en) | Handheld virtual space controller | |
| CN114005511A (en) | Rehabilitation training method and system, training self-service equipment and storage medium | |
| CN204442580U (en) | A kind of wear-type virtual reality device and comprise the virtual reality system of this equipment | |
| CN112739304A (en) | System and method for controlling a massage device | |
| CN107241579A (en) | A kind of utilization AR realizes the method and apparatus of Telemedicine Consultation | |
| JPWO2016042862A1 (en) | Control device, control method and program | |
| Vatavu | Sensorimotor realities: Formalizing ability-mediating design for computer-mediated reality environments | |
| CN102750121A (en) | Electronic display device intelligent expanding and accurate controlling system and multi-user multi-task encryption sharing method thereof | |
| CN111161581B (en) | A virtual power safety training system with haptic feedback | |
| JP2020201575A (en) | Display controller, display control method, and display control program | |
| CN205692375U (en) | A kind of smart handle | |
| CN111813220A (en) | Interactive system based on augmented reality or virtual reality intelligent head-mounted equipment | |
| CN208239980U (en) | A kind of virtual interacting device based on data glove | |
| CN105717994A (en) | Wearable Smart Devices | |
| WO2025237021A1 (en) | Human-computer interaction method and apparatus, computer device, and storage medium | |
| CN107025784A (en) | A kind of remote control, helmet and system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||