
CN108632373B - Equipment control method and system - Google Patents


Info

Publication number
CN108632373B
CN108632373B (application CN201810436971.9A)
Authority
CN
China
Prior art keywords
equipment
user
information
control system
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810436971.9A
Other languages
Chinese (zh)
Other versions
CN108632373A (en)
Inventor
方超
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN201810436971.9A
Publication of CN108632373A
Application granted
Publication of CN108632373B
Legal status: Active
Anticipated expiration

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04L — TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 — Network arrangements or protocols for supporting network services or applications
    • H04L67/01 — Protocols
    • H04L67/12 — Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125 — Protocols specially adapted for proprietary or special-purpose networking environments involving control of end-device applications over a network
    • G — PHYSICS
    • G06 — COMPUTING OR CALCULATING; COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 — Eye tracking input arrangements
    • G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 — Scenes; scene-specific elements
    • G06V20/10 — Terrestrial scenes
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 — Movements or behaviour, e.g. gesture recognition
    • G06V40/70 — Multimodal biometrics, e.g. combining information from different biometric modalities

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The invention relates to a device control system for controlling devices and recognizing target images in the Internet of Things. The system comprises an information acquisition unit and a control unit: the information acquisition unit collects information about the user in the current environment and sends it to the control unit, and the control unit analyzes the user's intention to use a device from this information and controls the corresponding device to perform the corresponding action. With the device control method and system, the user is connected to devices through the device control system, which acquires the positional relationship between device and user along with the device information, and recognizes the user's intention features to control the device, making interaction between user and device more convenient and rapid.

Description

Equipment control method and system
Technical Field
The invention relates to the field of application of the Internet of things, in particular to a device control method and system.
Background
Today, as science and technology develop rapidly, people's habits and customs are quietly changing, and the Internet of Things has become increasingly well known through the popularity of smart devices.
With the development of mobile internet technology, more and more application programs are installed on intelligent terminals, whose operating systems include the Android system developed by Google, Apple's iOS system, Microsoft's Windows system, and the like. Using these application programs, a smart device can be controlled from the intelligent terminal, but this mode of control offers relatively poor interactivity and only a single mode of interaction.
Disclosure of Invention
Therefore, it is necessary to provide a device control method and system to solve the problems of poor interactivity and a single interaction mode when an intelligent terminal controls a smart device through an application program.
A device control system for controlling devices and recognizing target images in the Internet of Things comprises an information acquisition unit and a control unit: the information acquisition unit collects information about the user in the current environment and sends it to the control unit, and the control unit analyzes the user's intention to use a device from this information and controls the corresponding device to perform the corresponding action.
In one embodiment, the information acquisition unit includes,
the environment acquisition module is used for acquiring environment information and equipment information;
the identity authentication acquisition module is used for acquiring the identity characteristic information of the user;
a user analysis module for identifying a user in the environment and identifying intent characteristics of the user;
a first screen module for outputting an image;
the control unit is connected with the environment acquisition module, the identity authentication acquisition module and the user analysis module, and is preset with operation instructions corresponding to intention features; it is used for analyzing the environment information acquired by the environment acquisition module to locate the user, comparing the user's intention features with the preset intention features, sending the corresponding operation instruction to the device, and verifying whether the user has authority to control the device control system.
In one embodiment, the information acquisition unit further comprises a first wireless module for connecting with devices in the Internet of Things.
In one embodiment, the identity authentication acquisition module is an infrared camera used to acquire finger-vein images and knuckle-line images of the user.
In one embodiment, the device control system further comprises an output unit communicatively connected with the information acquisition unit and the control unit; the output unit comprises,
the audio module is used for audio acquisition and audio output;
the second screen module is used for outputting an image; and
the output control module is connected with the audio module, the second screen module and the second wireless module and used for generating an operation instruction to control the equipment.
In one embodiment, when the information acquisition unit is connected with the output unit, the image presented by the first screen module is displayed on the second screen module, and the first screen module does not display the image.
In one embodiment, the equipment control system is connected with a server, the server is used for processing data information of the equipment control system, and the equipment control system controls equipment connected with the server through the server.
A device control method is applied to a device control system and comprises the following steps:
collecting information of a user in a current environment;
and analyzing the use intention of the user on the equipment according to the information, and controlling the corresponding equipment to perform corresponding action according to the use intention.
In one embodiment, the method further comprises,
acquiring user identity feature information, and verifying whether it is consistent with the user identity feature information preset in the control unit;
if consistent, the device control system responds to the user's input instruction;
if inconsistent, the device control system does not respond to the user's input instruction.
In one embodiment, the step of analyzing, by the control unit, the usage intention of the user on the device according to the information, and controlling the corresponding device to perform the corresponding action according to the usage intention further includes:
the equipment control system is connected with the equipment and/or the server is connected with the equipment, the intention characteristics of the user are identified to control the equipment and/or the spatial position relation between the user and the equipment is determined to control the equipment;
and acquiring a target image, identifying the target image through an equipment control system and/or a server, and starting software corresponding to the target image in the equipment control system.
According to the device control method and system, the user is connected to devices through the device control system, which acquires the positional relationship between device and user along with the device information, and recognizes the user's intention features to control the device, making interaction between user and device more convenient and rapid.
Drawings
FIG. 1 is a schematic diagram of an equipment control system according to an embodiment of the present invention;
FIG. 2 is a schematic view of the information acquisition unit of FIG. 1;
FIG. 3 is a schematic diagram of the output unit of FIG. 1;
fig. 4 is a diagram of an apparatus control method according to an embodiment of the present invention.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present invention are shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
As shown in fig. 1, fig. 1 is a schematic diagram of an Internet-of-Things-based device control system according to an embodiment of the present invention, configured to control a device 400 in the Internet of Things, where the device 400 may be, but is not limited to, a vehicle, a household appliance, furniture, office supplies, an electronic device, and the like. The device control system includes an information acquisition unit 100 and a control unit 300 communicatively connected with it; the information acquisition unit 100 collects information about the user in the current environment and sends it to the control unit 300, and the control unit 300 analyzes the user's intention to use a device from this information and controls the corresponding device 400 to perform the corresponding action.
The device control system comprises an information acquisition unit 100, an output unit 200 and a control unit 300, which are communicatively connected with one another. The device control system is connected with a server that processes the device control system's data information.
The device control system is communicatively coupled to the device 400 and is capable of detecting, identifying, and controlling the device 400 based on predetermined trigger conditions. The device control system and the device 400 may be connected via an internal closed network or via the internet.
The device control system may be connected to an intelligent terminal capable of controlling the device 400 to which the device control system is connected. The intelligent terminal comprises a mobile phone, a tablet, a computer or other intelligent equipment. In one embodiment, the intelligent terminal is a mobile phone.
As shown in fig. 2, fig. 2 is a schematic diagram of the information collecting unit 100 of fig. 1. The information collection unit 100 includes: an environment collection module 120, an identity authentication collection module 110, a user analysis module 130, a first wireless module 140, and a first screen module 150. The environment collecting module 120 is configured to collect environment information and device 400 information; the identity authentication acquisition module 110 is used for acquiring user identity characteristic information; the user analysis module 130 is used for identifying a user in the environment and identifying the intention characteristics of the user; the first screen module 150 is used for outputting an image; the first wireless module 140 is used to connect with the device 400 in the internet of things.
The control unit 300 is communicatively connected with each module in the information acquisition unit 100 and is preset with operation instructions corresponding to intention features. It is configured to analyze the environment information acquired by the environment acquisition module 120 to locate the user's position, compare the user's intention features with the preset intention features, send the corresponding operation instruction to the device 400, and analyze the user identity feature information to verify whether the user has authority to control the device control system.
The identity authentication collecting module 110 sends the collected user identity feature information to the control unit 300. The control unit 300 is preset with user identity feature information and compares whether the information acquired by the identity authentication collecting module 110 is consistent with it; if not, the system does not respond to the user's input instruction, and if so, the system responds to it. In one embodiment, the identity authentication collecting module 110 is an infrared camera used to collect the user's finger-vein image and knuckle-line image. It is understood that the identity authentication collecting module 110 may also be an iris recognizer, a fingerprint recognizer, a face recognition device, or another biometric information recognition device.
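The comparison-and-gate flow just described can be sketched as follows. This is a hypothetical illustration only: the patent does not specify how identity features are represented, so the vector encoding, tolerance value, and all names below are assumptions.

```python
# Hypothetical sketch of the identity check: the control unit compares the
# collected identity features against a preset template and responds to
# input only when they match. Vector features and the tolerance value are
# illustrative assumptions, not taken from the patent.

def features_match(collected, preset, tolerance=0.05):
    """Match if the feature vectors have equal length and all components are close."""
    if len(collected) != len(preset):
        return False
    return all(abs(c - p) <= tolerance for c, p in zip(collected, preset))

def handle_input(collected, preset, instruction):
    """Respond to the user's input instruction only when verification passes."""
    if features_match(collected, preset):
        return f"executing: {instruction}"
    return "no response: identity verification failed"
```

Any biometric source (finger vein, iris, fingerprint, face) would feed the same gate; only the feature extraction in front of it differs.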
The control unit 300 is provided with a vision processor. The control unit 300 receives the device 400 information and the environment information acquired by the environment collection module 120, analyzes them with the vision processor, and identifies fixed identifiers in the environment information to locate the user. The fixed identifiers include household electrical appliances, furniture, guideboards, signal lamps, trees, buildings, or other stationary objects.
In one embodiment, the control unit 300 collects the environment information and the device 400 information via the environment collection module 120, establishes a spatial rectangular coordinate system, models the fixed identifiers found in the device 400 information and the environment information, and assigns them corresponding coordinates, thereby locating the user's position. The user has a six-degree-of-freedom pose in the spatial rectangular coordinate system; the control unit 300 analyzes the environment information, determines the positional relationship between the user's pose and the fixed identifiers, and from it determines the positional relationship between the device 400 and the user. The control unit 300 then controls the operation of the device 400 according to that positional relationship.
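A minimal sketch of the positioning step, under a simplifying assumption not stated in the patent: each fixed identifier yields the user's measured offset from a known landmark coordinate, and the estimates are averaged. All function names and the averaging scheme are illustrative.

```python
import math

def user_position_from_landmark(landmark_pos, offset_to_landmark):
    """User position implied by one landmark: landmark coordinate minus the
    user's measured offset to it (illustrative model, not the patent's)."""
    return tuple(l - o for l, o in zip(landmark_pos, offset_to_landmark))

def estimate_user_position(observations):
    """Average the estimates from several (landmark_pos, offset) pairs."""
    estimates = [user_position_from_landmark(lm, off) for lm, off in observations]
    n = len(estimates)
    return tuple(sum(e[i] for e in estimates) / n for i in range(3))

def distance(a, b):
    """Euclidean distance in the spatial rectangular coordinate system."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
```

The resulting user-device distance is what the proximity-based control mode below would consume.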
The user analysis module 130 sends the user's intention features to the control unit 300, in which operation instructions corresponding to intention features are preset. In one embodiment, the intention features are the user's gestures and wrist spatial-pose information and/or the user's eye-focus information. The control unit analyzes these features and compares them against those preset in the control unit 300; if they are consistent, the corresponding operation instruction is sent to the device 400, and if not, no operation instruction is sent and the user is prompted that the gesture was not recognized. It is understood that the intention feature may also be eye-focus information, a body movement, a voice password, or another user intention feature, as long as the control unit 300 can match it to a preset operation instruction. The prompt may likewise be a vibration, voice, light, text, image, or other prompt. In one embodiment, the user analysis module 130 may designate only a local range of the information collected by the environment collection module 120 to be submitted to the control unit for processing, reducing the image-recognition pixel computation and eliminating useless images, so that the device control system recognizes the device 400 faster and more accurately.
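The intention-feature matching can be pictured as a table lookup. The specific gestures, wrist poses, and instruction names below are hypothetical; the patent only specifies that a matched feature maps to a preset operation instruction and an unmatched one triggers a prompt.

```python
# Illustrative lookup of preset operation instructions keyed on intention
# features (gesture + wrist pose). All keys and instruction names are
# assumptions for the sake of the example.

PRESET_INSTRUCTIONS = {
    ("swipe_up", "wrist_raised"): "volume_up",
    ("swipe_down", "wrist_raised"): "volume_down",
    ("fist", "wrist_level"): "power_off",
}

def resolve_intent(gesture, wrist_pose):
    """Return (instruction_for_device, user_prompt); exactly one is None."""
    instruction = PRESET_INSTRUCTIONS.get((gesture, wrist_pose))
    if instruction is None:
        return None, "prompt: gesture not recognized"
    return instruction, None
```

A voice password or eye-focus feature would slot into the same table with a different key type.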
As shown in fig. 3, fig. 3 is a schematic diagram of the output unit 200 of fig. 1. The output unit 200 includes: an audio module 210 for audio acquisition and output; a second screen module 230 for outputting images; a second wireless module 220 for connecting with the device 400 in the Internet of Things; and an output control module 240, connected to the second screen module 230 and the audio module 210, for generating operation instructions to control the device 400.
The output unit 200 further includes a second environment collection module 250, which sends the supplementary environment information it collects to the control unit 300; the control unit 300 compares this supplementary information against the environment information collected by the information acquisition unit, allowing the user's position to be located more accurately.
When the information collecting unit 100 and the output unit 200 are connected, the image presented by the first screen module 150 is projected to the second screen module 230, and the first screen module 150 does not display the image.
The audio module 210 may further recognize the user's voice and send it to the output control module 240, in which operation instructions corresponding to voices are preset. The user's voice is analyzed and compared against the voice preset in the control unit 300; if they are consistent, the device control system responds to the user's input instruction, and if not, it does not respond and prompts the user that the voice was not recognized. The prompt may likewise be a vibration, voice, light, text, image, or other prompt. The audio module 210 and the user analysis module 130 in the information collection unit 100 may be used together to achieve the fastest input result.
The following are several control modes of the equipment control system:
In the first mode, when the user wears the device control system and approaches the device 400, the image information obtained by the environment collection module 120 is analyzed by the control unit 300 to obtain the user's six-degree-of-freedom pose in the spatial coordinate system, i.e. the user positioning data, and from it the user's distance and direction relative to the controlled device 400; the system can then issue instructions to control the device 400 automatically. For example, when the device control system is connected with a smart toilet and the user approaches it, the system sends an opening instruction and the toilet lid opens; when the user leaves after use, the system sends a closing instruction and the lid closes and the toilet flushes. Similarly, when the device control system is connected with a smart desk lamp, the lamp is switched on when the user sits down at the desk and switched off when the user leaves.
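The toilet-lid and desk-lamp examples amount to distance-threshold triggering. The sketch below is an assumption-laden illustration: the patent names no thresholds, and the hysteresis gap between the approach and leave distances (added here to avoid rapid toggling near the boundary) is a design choice of this example, not the patent.

```python
# Sketch of the first control mode: derive open/close instructions from the
# user-device distance. Threshold values and the hysteresis gap are
# illustrative choices.

def proximity_instruction(distance_m, device_active, approach_m=1.0, leave_m=2.0):
    """Return 'open', 'close', or None given the current distance in metres."""
    if not device_active and distance_m <= approach_m:
        return "open"       # user has come close enough: activate
    if device_active and distance_m >= leave_m:
        return "close"      # user has moved away: deactivate
    return None             # no state change in the dead band
```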
In the second mode, the user determines the controlled device 400 through image recognition, and the device control system connects with the determined device 400 to control it. For example, the user indicates with a specific gesture that the smart television is to be controlled; the device control system connects to the smart television and displays its control menu on the output unit 200 for the user to select from, via corresponding gestures and/or a touch screen. As another example, the device control system builds a virtual reality from environment information collected in advance, or acquires real-time environment information through a proxy camera; the output unit 200 displays the virtual reality of another room, the user points at a device 400 in it, the device control system recognizes the device 400 in the image and connects to it, and the image of the selected device 400 in the virtual reality performs the action corresponding to the operation instruction matching the user's specific gesture.
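The recognize-then-present step of this mode reduces to mapping a recognized device label to its control menu. Device labels and menu entries here are invented for illustration; the patent does not enumerate them.

```python
# Sketch of the second mode: once image recognition determines the
# controlled device, the output unit shows that device's control menu.
# Labels and menu contents are hypothetical.

CONTROL_MENUS = {
    "smart_tv": ["power", "channel", "volume"],
    "smart_lamp": ["power", "brightness", "color"],
}

def connect_and_show_menu(recognized_label):
    """Return the menu to display, or a prompt if the device is unknown."""
    menu = CONTROL_MENUS.get(recognized_label)
    if menu is None:
        return "prompt: device not recognized"
    return menu
```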
In the third mode, the user directs the device control system to acquire a target image; the acquired image is recognized by the device control system and/or the server, and a control menu is displayed on the output unit 200. During recognition, the fixed identifiers in the environment information may be collected by the information collection unit 100 to make the recognition more accurate. For example, if the user points the device control system at an express-delivery locker, the control unit 300 recognizes the target image as an express-delivery locker and can also locate the locker from the fixed identifiers around it; the device control system opens the software (app) corresponding to the locker, and after the information acquisition unit 100 submits the identity authentication information, the pickup code is automatically sent and the locker door opens. As another example, the user directs the device control system to identify a vehicle; once the device control system and/or the server recognizes it, the output unit 200 displays a control menu for the user to select from, including the identified vehicle model and search information that varies with the vehicle information. For instance, identifying a bus opens the route map for that bus, while identifying a car opens taxi-hailing software (app); image analysis may further be performed on parts of the vehicle, or the vehicle's route or other relevant vehicle information may be identified.
In the fourth mode, the user acquires a target image with the device control system, where the target image is software information such as a two-dimensional code and/or accompanying text description. The control unit 300 processes the two-dimensional code and/or the text description and starts the corresponding software accordingly. For example, a payment two-dimensional code may contain the Chinese character 'zhi' and a WeChat payment two-dimensional code the character 'mi'; the control unit 300 starts the corresponding software by recognizing the character in the code. Likewise, a merchant typically prints a text description around a two-dimensional code to explain it, and the control unit 300 starts the corresponding software according to the processed text description.
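The character/keyword dispatch of this mode can be sketched as a first-match scan over the decoded text. The keyword table loosely mirrors the transliterated examples above and is purely illustrative; the app names are assumptions.

```python
# Sketch of the fourth mode: choose which software to launch by scanning the
# decoded two-dimensional-code text (or surrounding caption) for identifying
# keywords. The keyword-to-app table is an illustrative assumption.

APP_KEYWORDS = {
    "alipay": ["zhi"],
    "wechat_pay": ["mi"],
    "express_locker": ["express", "locker"],
}

def software_for_target(decoded_text):
    """Return the first app whose keyword appears in the decoded text."""
    text = decoded_text.lower()
    for app, keywords in APP_KEYWORDS.items():
        if any(k in text for k in keywords):
            return app
    return None     # no match: nothing is launched
```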
In all four modes, during use the information acquisition unit 100 simultaneously acquires the user's identity feature information, so that identity can be verified directly, for instance to enter taxi-hailing software (app), without additional operation by the user. It is understood that this applies to any other software (app) requiring identity verification. It can also be understood that the user identity feature information may be used for identity authentication with the device connected to the device control system: the user is allowed to use the device if authentication passes, and is not allowed to if it fails.
As shown in fig. 4, fig. 4 is a diagram of an apparatus control method according to an embodiment of the present invention.
A device control method is applied to a device control system and comprises the following steps:
step 410, collecting information of a user in a current environment;
and 420, analyzing the use intention of the user on the equipment according to the information, and controlling the corresponding equipment to perform corresponding action according to the use intention.
The device control method further includes: acquiring user identity feature information and verifying whether it is consistent with the user identity feature information prestored in a database; if consistent, the device control system responds to the user's input instruction, and if inconsistent, it does not respond.
The step of analyzing the user's intention to use the device according to the information, and controlling the corresponding device to perform the corresponding action, further includes: connecting the device control system and/or the server with the device, and controlling the device by recognizing the user's intention features and/or by determining the spatial positional relationship between the user and the device; and acquiring a target image, recognizing it through the device control system and/or the server, and starting the software in the device control system corresponding to the target image. If the software requires identity authentication, the user identity feature information is acquired through the information acquisition unit for verification; if not, the software is started directly.
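The method's steps can be pulled together into one end-to-end sketch: verify identity, resolve the usage intention, and issue the corresponding command. Data shapes, keys, and return strings are assumptions made for illustration.

```python
# End-to-end sketch of the claimed method: identity gate, then intention
# analysis, then device control. All names and shapes are hypothetical.

def device_control_method(collected, preset_identity, intent_table):
    """Run one pass of the verify -> analyze-intent -> control pipeline."""
    if collected["identity"] != preset_identity:
        return "no response"              # identity verification failed
    command = intent_table.get(collected["intent"])
    if command is None:
        return "no response"              # intention not recognized
    return f"device <- {command}"         # control the corresponding device
```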
According to the device control method, when the user uses the device control system, the system verifies whether the user's identity feature information is consistent with that prestored in the database; if so, the user is allowed to use the device control system, and if not, the user is prohibited from controlling the device 400. The information acquisition unit collects information about the user in the current environment and sends it to the control unit, which analyzes the user's intention to use the device and controls the corresponding device to act accordingly. The information collection unit 100 collects environment information and analyzes it to locate the user's position, and the user connects to the device 400 through the device control system to control it with gestures and voice. The device 400 may also be controlled via the positional relationship between user and device: when the distance and direction located by the information acquisition unit 100 reach preset values, the device 400 is actuated. The information collection unit 100 can further acquire a target image and, by recognizing it, open the corresponding software (app), which may reside on the device control system or on the server that controls the device.
According to the device control method and system provided by the invention, the user connects to the device 400 through the equipment control system; the system acquires the positional relationship between the device 400 and the user together with information about the device 400, and identifies the intention characteristics of the user to control the device 400, making interaction between the user and the device 400 more convenient and efficient.
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any combination that contains no contradiction should be considered within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present invention, and their description, while specific and detailed, should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (9)

1. A device control system for controlling a device and recognizing a target image in an internet of things, comprising: an information acquisition unit and a control unit in communication connection with the information acquisition unit, wherein the information acquisition unit is used for collecting information of a user in the current environment and sending the information to the control unit, the control unit analyzes the use intention of the user on the device according to the information and controls the corresponding device to execute a corresponding action according to the use intention, and the device control system is wearable;
the information acquisition unit comprises:
the environment acquisition module is used for acquiring environment information and equipment information;
the identity authentication acquisition module is used for acquiring the identity characteristic information of the user;
a user analysis module for identifying a user in the environment and identifying intent characteristics of the user;
a first screen module for outputting an image;
the control unit is connected with the environment acquisition module, the identity authentication acquisition module and the user analysis module, and is used for presetting operation instructions corresponding to intention characteristics, analyzing the environment information collected by the environment acquisition module to locate the user, comparing the intention characteristics of the user with the preset intention characteristics, sending the corresponding operation instruction to the equipment, and verifying, by analyzing the user identity characteristic information, whether the user has authority to control the equipment control system;
the equipment control system acquires the target image, the acquired target image is identified through the equipment control system and/or the server, and the information acquisition unit acquires a fixed identifier in the environmental information in the process of identifying the target image; wherein:
the equipment control system also comprises an output unit which is in communication connection with the information acquisition unit and the control unit; the output unit comprises an output control module, and the output control module is used for generating an operation instruction to control the equipment;
the equipment control system is connected with the server, the server is used for processing data information of the equipment control system, and the equipment control system controls equipment connected with the server through the server.
2. The device control system according to claim 1, wherein the information acquisition unit further comprises:
the first wireless module is used for connecting with equipment in the Internet of things; and
the first screen module is used for outputting images.
3. The device control system of claim 1, wherein the identity authentication acquisition module is an infrared camera, and the infrared camera is configured to acquire a finger vein image and a joint texture image of the user.
4. The device control system according to any one of claims 1 to 3, wherein the output unit further includes:
the audio module is used for audio acquisition and audio output;
the second screen module is used for outputting an image;
the second wireless module is used for connecting with the equipment; and
the output control module is connected with the audio module, the second screen module and the second wireless module and used for generating an operation instruction to control the equipment.
5. The device control system according to claim 4, wherein when the information acquisition unit and the output unit are connected, an image presented by the first screen module is displayed on the second screen module, and the first screen module does not display the image.
6. A device control method applied to the device control system according to any one of claims 1 to 5, characterized by comprising the steps of:
collecting information of a user in a current environment;
and analyzing the use intention of the user on the equipment according to the information, and controlling the corresponding equipment to perform corresponding action according to the use intention.
7. The device control method according to claim 6, characterized in that the method further comprises:
acquiring user identity characteristic information, and verifying whether the user identity characteristic information is consistent with user identity characteristic information preset in a control unit;
if they are consistent, the equipment control system responds to the input instruction of the user;
and if they are inconsistent, the equipment control system does not respond to the input instruction of the user.
8. The device control method according to claim 6, wherein the step of the control unit analyzing the usage intention of the user on the device according to the information and controlling the corresponding device to perform corresponding action according to the usage intention further comprises:
the equipment control system is connected with the equipment and/or the server is connected with the equipment, the intention characteristics of the user are identified to control the equipment and/or the spatial position relation between the user and the equipment is determined to control the equipment;
and acquiring a target image, identifying the target image through an equipment control system and/or a server, and starting software corresponding to the target image in the equipment control system.
9. The device control method according to claim 6, characterized in that the method further comprises:
establishing a virtual reality through an equipment control system according to environment information collected in advance, or acquiring real-time environment information through a proxy camera;
the user displays the virtual reality of another room in the output unit, and when the user points at the equipment, the equipment control system performs image recognition on the equipment and then connects to it;
and the image of the equipment selected by the user in the virtual reality is displayed through the output unit, and the equipment performs a corresponding action according to the operation instruction corresponding to a specific gesture made by the user on the equipment image in the output unit.
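The gesture-to-instruction binding that claims 8 and 9 rely on can be sketched as a simple lookup from a (device, gesture) pair to a preset operation instruction. The gesture names and instruction strings below are illustrative assumptions, not values taken from the patent.

```python
# Preset operation instructions keyed by (device, gesture) pairs,
# as configured in the control unit.
GESTURE_INSTRUCTIONS = {
    ("lamp", "pinch"): "toggle_power",
    ("lamp", "swipe_up"): "brightness_up",
    ("curtain", "swipe_left"): "close",
}

def dispatch(device, gesture):
    """Return the operation instruction bound to a specific gesture
    made on a device image, or a fallback when none is preset."""
    instruction = GESTURE_INSTRUCTIONS.get((device, gesture))
    if instruction is None:
        return "no instruction bound to this gesture"
    return instruction
```

Keeping the mapping as preset data matches the claim language: the control unit compares a recognized intention characteristic against preset ones and forwards the corresponding instruction to the equipment.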
CN201810436971.9A 2018-05-09 2018-05-09 Equipment control method and system Active CN108632373B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810436971.9A CN108632373B (en) 2018-05-09 2018-05-09 Equipment control method and system

Publications (2)

Publication Number Publication Date
CN108632373A CN108632373A (en) 2018-10-09
CN108632373B true CN108632373B (en) 2021-11-30

Family

ID=63692175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810436971.9A Active CN108632373B (en) 2018-05-09 2018-05-09 Equipment control method and system

Country Status (1)

Country Link
CN (1) CN108632373B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109395416B (en) * 2018-11-06 2021-05-11 广东乐博士教育装备有限公司 Intelligent building block module interaction implementation method, device and system
CN111080919A (en) * 2019-01-02 2020-04-28 姜鹏飞 Control method and system for shared input and output equipment
CN111147505A (en) * 2019-01-02 2020-05-12 姜鹏飞 Calling method, device, equipment and computer readable storage medium
CN110913145A (en) * 2019-01-10 2020-03-24 姜鹏飞 Shooting control method, device, equipment and computer readable storage medium
CN112801596A (en) * 2020-06-18 2021-05-14 汪永强 Logistics information pushing and inquiring system based on big data
CN112578721B (en) * 2020-11-27 2022-10-18 宁波阶梯教育科技有限公司 Equipment control method, equipment and computer readable storage medium
CN117831122B (en) * 2023-12-20 2024-08-06 慧之安信息技术股份有限公司 Underground vehicle-booking method and system based on gesture recognition

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101506868A (en) * 2006-09-08 2009-08-12 索尼株式会社 Display device and display method
CN104062909A (en) * 2013-03-19 2014-09-24 海尔集团公司 Household appliance equipment and control device and method thereof
CN105353871A (en) * 2015-10-29 2016-02-24 上海乐相科技有限公司 Target object control method and apparatus in virtual reality scene
CN105549408A (en) * 2015-12-31 2016-05-04 歌尔声学股份有限公司 Wearable device and control method thereof, intelligent household server and control method thereof, and system
CN106502401A (en) * 2016-10-31 2017-03-15 宇龙计算机通信科技(深圳)有限公司 A kind of display control method and device
CN107239139A (en) * 2017-05-18 2017-10-10 刘国华 Based on the man-machine interaction method and system faced

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105446474B (en) * 2014-09-26 2018-08-10 中芯国际集成电路制造(上海)有限公司 Wearable smart machine and its method of interaction, wearable smart machine system
CN104635505A (en) * 2015-01-19 2015-05-20 赵树乔 Method for interacting with indoor intelligent equipment
US20160378204A1 (en) * 2015-06-24 2016-12-29 Google Inc. System for tracking a handheld device in an augmented and/or virtual reality environment
US10186086B2 (en) * 2015-09-02 2019-01-22 Microsoft Technology Licensing, Llc Augmented reality control of computing device
CN105657505A (en) * 2016-01-05 2016-06-08 天脉聚源(北京)传媒科技有限公司 Video remote control play method and apparatus
CN106557167B (en) * 2016-11-23 2020-08-04 上海擎感智能科技有限公司 Intelligent glasses and method, system and controller for controlling equipment of intelligent glasses

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101506868A (en) * 2006-09-08 2009-08-12 索尼株式会社 Display device and display method
CN104062909A (en) * 2013-03-19 2014-09-24 海尔集团公司 Household appliance equipment and control device and method thereof
CN105353871A (en) * 2015-10-29 2016-02-24 上海乐相科技有限公司 Target object control method and apparatus in virtual reality scene
CN105549408A (en) * 2015-12-31 2016-05-04 歌尔声学股份有限公司 Wearable device and control method thereof, intelligent household server and control method thereof, and system
CN106502401A (en) * 2016-10-31 2017-03-15 宇龙计算机通信科技(深圳)有限公司 A kind of display control method and device
CN107239139A (en) * 2017-05-18 2017-10-10 刘国华 Based on the man-machine interaction method and system faced

Also Published As

Publication number Publication date
CN108632373A (en) 2018-10-09

Similar Documents

Publication Publication Date Title
CN108632373B (en) Equipment control method and system
EP2942698B1 (en) Non-contact gesture control method, and electronic terminal device
CN107239139B (en) Based on the man-machine interaction method and system faced
CN108681399B (en) Equipment control method, device, control equipment and storage medium
CN109230927B (en) Elevator control method, device, computer equipment and storage medium
US20170053149A1 (en) Method and apparatus for fingerprint identification
CN108345907B (en) Recognition method, augmented reality device, and storage medium
US20130159939A1 (en) Authenticated gesture recognition
CN105139470A (en) Checking-in method, device and system based on face recognition
CN107622246B (en) Face recognition method and related products
CN106888204B (en) Implicit identity authentication method based on natural interaction
CN106774848A (en) Remote control equipment and remote control system
EP3098692A1 (en) Gesture device, operation method for same, and vehicle comprising same
CN111881431A (en) Man-machine verification method, device, equipment and storage medium
CN110647732B (en) Voice interaction method, system, medium and device based on biological recognition characteristics
CN109858320B (en) Fingerprint entry method and related equipment
CN114363547A (en) Double-recording device and double-recording interaction control method
CN108153568A (en) A kind of information processing method and electronic equipment
WO2019213855A1 (en) Device control method and system
CN107992825B (en) A method and system for face recognition based on augmented reality
CN116088992A (en) Click control method and system based on image recognition and voice recognition
CN111176594B (en) Screen display method of intelligent sound box, intelligent sound box and storage medium
CN107577929B (en) Different system access control method based on biological characteristics and electronic equipment
US10521647B2 (en) Body information analysis apparatus capable of indicating shading-areas
CN115205955B (en) Eye-controlled intelligent integrated measurement and control device and eye-control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant