
CN113495617A - Method and device for controlling equipment, terminal equipment and storage medium


Info

Publication number
CN113495617A
Authority
CN
China
Prior art keywords
gesture
target
household equipment
image
acquiring
Prior art date
Legal status
Pending
Application number
CN202010252369.7A
Other languages
Chinese (zh)
Inventor
赵超 (Zhao Chao)
Current Assignee
Shenzhen Lumi United Technology Co Ltd
Lumi United Technology Co Ltd
Original Assignee
Lumi United Technology Co Ltd
Application filed by Lumi United Technology Co Ltd
Priority to CN202010252369.7A
Publication of CN113495617A


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00Systems controlled by a computer
    • G05B15/02Systems controlled by a computer electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/4183Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/26Pc applications
    • G05B2219/2642Domotique, domestic, home control, automation, smart house
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application disclose a device control method and apparatus, a terminal device, and a storage medium. The method comprises the following steps: acquiring a first gesture action and an image of at least one piece of household equipment, both captured by a camera device; determining target household equipment from the at least one household equipment based on the first gesture action and the image of the at least one household equipment; acquiring a second gesture action captured by the camera device; and controlling the target household equipment based on the second gesture action. In this way, the controlled household equipment is determined from a captured gesture action and then controlled by a further gesture action, so that household equipment can be controlled flexibly. The positional relationship between a gesture action and the household equipment at imaging time can be mapped to a variety of control modes; operation is simple and convenient, accuracy is high, and the method breaks away from the monotony of a small fixed set of control gestures.

Description

Method and device for controlling equipment, terminal equipment and storage medium
Technical Field
The present application relates to the field of smart home technologies, and in particular, to a method and an apparatus for controlling a device, a terminal device, and a storage medium.
Background
With the development of communication technology and the smart home, smart home devices are growing in number and functionality, bringing convenience to people's daily lives. Given the large number of smart home devices, how to control them quickly and conveniently is a problem of increasing concern to users. At present, an application program for controlling smart home devices can be installed on a smart terminal, and the corresponding smart home devices are controlled through the user interface of the application program. However, when there are many smart home devices, the desired device has to be searched for within the app, which makes control inflexible.
Disclosure of Invention
The embodiment of the application provides a device control method and device, a terminal device and a storage medium, so as to solve the above problems.
In a first aspect, an embodiment of the present application provides a method for device control, where the method includes: acquiring a first gesture action and an image of at least one piece of household equipment, both captured by the camera device; determining target household equipment from the at least one household equipment based on the first gesture action and the image of the at least one household equipment; acquiring a second gesture action captured by the camera device; and controlling the target household equipment based on the second gesture action.
In a second aspect, an embodiment of the present application provides an apparatus for device control, where the apparatus includes: a first action acquisition module, configured to acquire a first gesture action and an image of at least one piece of household equipment, both captured by the camera device; a target device determining module, configured to determine target household equipment from the at least one household equipment based on the first gesture action and the image of the at least one household equipment; a second action acquisition module, configured to acquire a second gesture action captured by the camera device; and a target device control module, configured to control the target household equipment based on the second gesture action.
In a third aspect, an embodiment of the present application provides a terminal device, including a memory and a processor, where the memory is coupled to the processor and stores instructions which, when executed by the processor, cause the processor to perform the above method.
In a fourth aspect, the present application provides a computer-readable storage medium in which program code is stored, the program code being invocable by a processor to perform the above method.
In the embodiment of the application, a first gesture action and an image of at least one piece of household equipment, both captured by a camera device, are acquired; target household equipment is determined from the at least one household equipment based on the first gesture action and the image of the at least one household equipment; a second gesture action captured by the camera device is acquired; and the target household equipment is controlled based on the second gesture action. In this way, the controlled household equipment is determined from a captured gesture action and then controlled by a further gesture action, so that household equipment can be controlled flexibly. The positional relationship between a gesture action and the household equipment at imaging time can be mapped to a variety of control modes; operation is simple and convenient, accuracy is high, and the method breaks away from the monotony of a small fixed set of control gestures. Moreover, an association relation is established between the controlled equipment and the gesture action in the imaging area, and the controlled equipment appearing in the imaging area can be changed at will, so that control over the controlled equipment is richer. Compared with control by a single fixed gesture, the controlled equipment in this scheme can be changed flexibly, so the control mode is more flexible.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
Fig. 1 is a schematic application environment diagram illustrating a method for controlling a device according to an embodiment of the present application;
FIG. 2 is a flow chart illustrating a method for device control provided by an embodiment of the present application;
FIG. 3 is a flow chart illustrating a method for device control according to another embodiment of the present application;
FIG. 4 is a flow chart illustrating a method for device control according to yet another embodiment of the present application;
FIG. 5 is a flow chart illustrating a method for device control provided by another embodiment of the present application;
FIG. 6 is a flow chart illustrating a method for controlling a device according to another embodiment of the present application;
FIG. 7 is a flow chart illustrating a method for device control according to yet another embodiment of the present application;
FIG. 8 is a block diagram illustrating an apparatus for controlling a device according to an embodiment of the present application;
fig. 9 is a schematic diagram illustrating a hardware structure of a terminal device according to an embodiment of the present application for executing a method for device control according to the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
With the development of communication technology and the smart home, smart home devices are growing in number and functionality, bringing convenience to people's daily lives. Given the large number of smart home devices, how to control them quickly and conveniently is a problem of increasing concern to users. At present, an application program for controlling smart home devices can be installed on a smart terminal, and the corresponding smart home devices are controlled through the user interface of the application program. However, when there are many smart home devices, the desired device has to be searched for within the app, which makes control inflexible.
Therefore, the inventor proposes the device control method and apparatus, terminal device, and storage medium provided in the embodiments of the present application, which determine the controlled household equipment based on a captured gesture image and control it according to a further gesture image, so that household equipment can be controlled flexibly.
An application environment to which the present application relates will be described below.
Referring to fig. 1, fig. 1 is a schematic diagram of an application environment suitable for the embodiment of the present application. The device control method provided by the embodiment of the application can be applied to the smart home system 10 shown in fig. 1. The smart home system 10 includes a terminal device 100 and a home device 200. The terminal device 100 is connected to the home device 200, and the terminal device may include a personal computer, a smart phone, a tablet computer, a wearable electronic device, and the like, which is not limited herein. As an embodiment, the terminal device 100 may include a camera device, and may be configured to capture an image of the home device 200.
The smart home system 10 further includes home devices 200, where the home devices 200 may include, but are not limited to, door and window sensors, smart switches, lamps, air conditioners, curtains, televisions, refrigerators, and fans. The number of home devices 200 is at least one, and the at least one home device 200 is connected to the terminal device 100. The terminal device 100 and the home devices 200 may be connected through Bluetooth, Wi-Fi, or ZigBee.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating a method for controlling a device according to an embodiment of the present application. As will be explained in detail below with respect to the embodiment shown in fig. 2, the method may specifically include, but is not limited to, the following steps:
step S110: the method comprises the steps of obtaining a first gesture action and at least one image of household equipment, wherein the first gesture action and the at least one image are collected by a camera device.
In the embodiment of the application, the terminal device may include a camera device; when a user wants to control household equipment, the terminal device can act as a stand-in for human eyes and control the household equipment in combination with gesture actions. Specifically, the terminal device may acquire the first gesture action and the image of at least one piece of household equipment, both captured by the camera device. The household equipment includes, but is not limited to, door and window sensors, smart switches, lamps, air conditioners, curtains, televisions, refrigerators, and fans. Further, the image of at least one piece of household equipment at least contains an image of the household equipment that the user wants to control. In some embodiments, the image of at least one piece of household equipment may include an image of the household equipment that the user wants to control as well as images of other household equipment. Further, the first gesture action may be a single gesture action, such as a single-finger gesture, a multi-finger gesture, or a fist, where "multi-finger" means two or more fingers; the first gesture action may also be a compound gesture action, that is, the first gesture action may include at least two actions, for example, first making a fist and then making a single-finger gesture, which is not limited herein.
Step S120: and determining target household equipment from the at least one household equipment based on the first gesture and the image of the at least one household equipment.
In the embodiment of the application, the target home equipment can be determined from the at least one home equipment based on the acquired first gesture and the image of the at least one home equipment, wherein the target home equipment is the home equipment which the user wants to control.
In some embodiments, a relative positional relationship between the first gesture action and the at least one piece of household equipment may be determined according to the first gesture action and the image of the at least one piece of household equipment, and the target household equipment may be determined according to that relative positional relationship. As an implementation, the relative positional relationship can indicate which device the gesture points at, and that device is then taken as the target household equipment. For example, suppose the camera device captures an image of a television and an image of a refrigerator, and the relative positional relationship between the first gesture action and the television and refrigerator is determined from the first gesture action and those images. If it is determined that the first gesture action points at the television, it can be considered that the user wants to control the television, and the television can be taken as the target household equipment.
In some embodiments, the type of the first gesture action may be determined from the first gesture action; when the type of the first gesture action matches a specified type, the relative positional relationship between the first gesture action and the at least one piece of household equipment is acquired, and the target household equipment is determined according to that relationship. For example, if the type of the first gesture action is an extended index finger and the index finger points at the air conditioner, the air conditioner can be taken as the target household equipment. In some embodiments, the first gesture action may be a compound gesture, for example one combining a multi-finger gesture with a single-finger gesture: the multi-finger gesture may be used to zoom the imaging area, and the single-finger gesture may then be used to point at the target household equipment in the imaging area to determine the controlled device.
As an implementation, the dwell time of the first gesture action on the household equipment can also be acquired, and when the dwell time reaches a preset duration, that household equipment can be taken as the target household equipment, which mitigates the problem of false triggering to a certain extent. For example, when the first gesture action stays on the desk lamp for one second, the desk lamp may be taken as the target household equipment.
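As a purely illustrative aid (not part of the patent text), the following minimal Python sketch shows one way the pointing-plus-dwell selection described above might look; the bounding boxes, the fingertip detector feeding the stream, and the one-second threshold are all assumptions.

```python
# Hypothetical bounding boxes of detected household equipment in image
# coordinates (x_min, y_min, x_max, y_max); the values are assumptions.
DEVICE_BOXES = {
    "desk_lamp": (40, 160, 120, 260),
    "television": (200, 60, 460, 220),
}

DWELL_SECONDS = 1.0  # assumed "preset duration" used to avoid false triggering


def device_under_fingertip(fingertip_xy):
    """Return the device whose bounding box contains the fingertip, if any."""
    x, y = fingertip_xy
    for name, (x0, y0, x1, y1) in DEVICE_BOXES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None


def select_target_device(fingertip_stream):
    """Pick a target once the first gesture dwells on one device long enough.

    fingertip_stream yields (timestamp, (x, y)) pairs produced by an assumed
    upstream hand detector running on the camera frames.
    """
    current, since = None, None
    for ts, xy in fingertip_stream:
        hit = device_under_fingertip(xy)
        if hit != current:
            current, since = hit, ts  # gesture moved: restart the dwell timer
        elif hit is not None and ts - since >= DWELL_SECONDS:
            return hit
    return None
```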
In some embodiments, the terminal device may display the acquired first gesture motion and the image of the at least one home device on a screen in real time, so that a user may see a position of the first gesture motion in the image in real time, and the user may adjust a position of the gesture according to the viewed image. Further, the household equipment in the imaging area of the camera device can be changed at will, so that the control of the controlled equipment is richer.
In some embodiments, there may be multiple pieces of household equipment in the imaging area at different distances, so the target household equipment can be made clearer by zooming, so as to establish the association between the first gesture action and the target household equipment.
Step S130: and acquiring a second gesture motion acquired by the camera device.
In the embodiment of the application, after the target household equipment is determined, it can be controlled according to the second gesture action. Specifically, the second gesture action captured by the camera device may be acquired. In some embodiments, while the target household equipment is being controlled by the second gesture action, the imaging area of the camera device may or may not contain the image of the target household equipment, which is not limited herein. The second gesture action can be a single gesture action, such as a single-finger gesture, a multi-finger gesture, or a fist, where "multi-finger" means two or more fingers; the second gesture action may also be a compound gesture action, that is, it may include at least two actions, for example, first making a fist and then making a single-finger gesture, which is not limited herein.
In some embodiments, the scene in which the first gesture action is acquired and the scene in which the second gesture action is acquired may be the same scene; that is, within one scene, the controlled device is determined through the first gesture action and then controlled through the second gesture action. As an embodiment, the terminal device may determine the target imaging area in which the camera device captured the first gesture action and the image of the at least one piece of household equipment, and acquire the second gesture action within that target imaging area. For example, suppose the imaging area includes areas a, b, and c, and area b is the area for acquiring television control authority, i.e. the target imaging area. When a gesture appears in area b, the control authority of the television can be obtained, and the second gesture action can then be acquired in area b. Further, the target imaging area may also be enlarged to capture the second gesture action more accurately. For example, when a gesture appears in area b, area b is automatically enlarged, and the volume and channel can then be adjusted up and down by changing the type of the second gesture action.
In some embodiments, the scene for acquiring the first gesture action and the scene for acquiring the second gesture action may not be in the same scene, that is, the first gesture action is adopted to determine the controlled device in the first scene, and the second gesture action is adopted to control the controlled device in the second scene. For example, the imaging area includes areas a, b, and c, and the terminal device determines that the target imaging area where the image capturing device captures the first gesture and the image of the at least one piece of home equipment is area b, and may acquire the second gesture in area a or area c.
Step S140: and controlling the target household equipment based on the second gesture action.
In the embodiment of the application, the target household equipment can be controlled based on the second gesture action.
In some embodiments, the second gesture action may be recognized, the type of the second gesture obtained, and the control mode for the target household equipment determined based on that type. For example, suppose the target household equipment is a television: after the second gesture action is recognized, a right-hand finger pointing up can adjust the channel up, a right-hand finger pointing down can adjust the channel down, a left-hand finger pointing up can raise the volume, a left-hand finger pointing down can lower the volume, a right-hand finger pointing right can fast-forward, a right-hand finger pointing left can rewind, and a clenched fist can turn the television off. If the target household equipment is a curtain, a fist may indicate closing the curtain and an open palm may indicate opening it; if the target household equipment is a light, a fist may indicate turning the light off and an open palm turning it on; if the target household equipment is an air conditioner, a fist may indicate turning it off, an open palm turning it on, a right-hand finger pointing up raising the temperature, and a right-hand finger pointing down lowering it. The foregoing examples are merely illustrative and are not intended to be limiting.
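To make the mapping concrete (again as an illustration only, not the patent's required implementation), a dispatch table keyed by device type and gesture type is one natural way to encode the examples above; the command names and the `send` transport are assumptions.

```python
# Hypothetical (device type, gesture type) -> command table mirroring the
# examples in the text; the command strings are assumptions.
GESTURE_COMMANDS = {
    ("television", "right_finger_up"): "channel_up",
    ("television", "right_finger_down"): "channel_down",
    ("television", "left_finger_up"): "volume_up",
    ("television", "left_finger_down"): "volume_down",
    ("television", "fist"): "power_off",
    ("curtain", "fist"): "close",
    ("curtain", "palm"): "open",
    ("light", "fist"): "off",
    ("light", "palm"): "on",
    ("air_conditioner", "fist"): "power_off",
    ("air_conditioner", "palm"): "power_on",
    ("air_conditioner", "right_finger_up"): "temperature_up",
    ("air_conditioner", "right_finger_down"): "temperature_down",
}


def control_target(device_type, gesture_type, send):
    """Look up and send the command for this device/gesture pair, if any.

    `send` stands in for whatever transport (Bluetooth, Wi-Fi, ZigBee) links
    the terminal device to the household equipment.
    """
    command = GESTURE_COMMANDS.get((device_type, gesture_type))
    if command is not None:
        send(device_type, command)
    return command
```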
In some embodiments, the control mode for the target household equipment can also be determined from the area in which the second gesture action falls within the imaging area of the camera device. For example, if the target household equipment is a desk lamp, the desk lamp can be turned on when the second gesture action is located on the left side of the imaging area. The above modes are merely examples and are not limiting. In some embodiments, false triggering can be prevented by requiring the gesture to stay in an area for a minimum time; for example, the imaging area can be divided into left, middle, and right areas, where a gesture held on the left for three seconds turns the device on and a gesture held on the right for one second turns it off.
The method for device control provided by this embodiment acquires a first gesture action and an image of at least one piece of household equipment, both captured by a camera device; determines target household equipment from the at least one household equipment based on the first gesture action and the image of the at least one household equipment; acquires a second gesture action captured by the camera device; and controls the target household equipment based on the second gesture action. In this way, the controlled household equipment is determined from a captured gesture action and then controlled by a further gesture action, so that household equipment can be controlled flexibly. The positional relationship between a gesture action and the household equipment at imaging time can be mapped to a variety of control modes; operation is simple and convenient, accuracy is high, and the method breaks away from the monotony of a small fixed set of control gestures. Moreover, an association relation is established between the controlled equipment and the gesture action in the imaging area, and the controlled equipment appearing in the imaging area can be changed at will, so that control over the controlled equipment is richer. Compared with control by a single fixed gesture, the controlled equipment in this scheme can be changed flexibly, so the control mode is more flexible.
Referring to fig. 3, fig. 3 is a flow chart illustrating a method for controlling a device according to another embodiment of the present application. As will be explained in detail below with respect to the flow shown in fig. 3, the method may specifically include, but is not limited to, the following steps:
step S210: the method comprises the steps of obtaining a first gesture action and at least one image of household equipment, wherein the first gesture action and the at least one image are collected by a camera device.
For detailed description of step S210, please refer to step S110, which is not described herein again.
Step S220: and acquiring the relative position relation between the first gesture action and the at least one household device based on the first gesture action and the image of the at least one household device.
In this embodiment of the application, a relative position relationship between the first gesture and the at least one household device may be obtained based on the first gesture and the image of the at least one household device.
In some embodiments, a coordinate system may be established with a center of the imaging area as an origin, coordinates of the first gesture in the coordinate system and coordinates of the at least one household device in the coordinate system are acquired, and a relative position relationship between the first gesture and the at least one household device is calculated according to the acquired coordinates.
Step S230: and determining target household equipment from the at least one household equipment based on the relative position relation between the first gesture and the at least one household equipment.
In some embodiments, the target household device may be determined from the at least one household device based on a relative positional relationship of the first gesture and the at least one household device.
In some embodiments, the household equipment on which the first gesture action falls may be determined according to the relative positional relationship between the first gesture action and the at least one piece of household equipment, and that household equipment is then taken as the target household equipment. For example, when it is determined that the first gesture action falls on a rice cooker, the rice cooker may be taken as the target household equipment.
In some embodiments, when there are multiple pieces of household equipment in the imaging area that partially overlap and sit at different distances, the target household equipment may be determined by zooming, so as to improve the accuracy of control.
In some embodiments, the distance between the first gesture and the at least one household device may be determined according to the relative position relationship between the first gesture and the at least one household device, and the household device with the smallest distance may be used as the target household device. For example, the distance between the first gesture and the table lamp is the smallest, which can indicate that the first gesture is closest to the table lamp, and then the table lamp can be used as the target home device.
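As an illustrative sketch only (the coordinate values and device names are assumptions), the nearest-device rule just described can be implemented by comparing distances in the shared image coordinate system:

```python
import math


def nearest_device(gesture_xy, device_centers):
    """Pick the device whose image center is closest to the gesture position.

    device_centers maps a device name to its (x, y) center in the coordinate
    system described above (origin at the center of the imaging area).
    """
    gx, gy = gesture_xy
    return min(
        device_centers,
        key=lambda name: math.hypot(device_centers[name][0] - gx,
                                    device_centers[name][1] - gy),
    )


# Usage with assumed coordinates: the desk lamp is nearest to the first
# gesture, so it is selected as the target household equipment.
centers = {"desk_lamp": (12.0, -3.0), "television": (140.0, 55.0)}
assert nearest_device((10.0, 0.0), centers) == "desk_lamp"
```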
Step S240: and acquiring a second gesture motion acquired by the camera device.
Step S250: and controlling the target household equipment based on the second gesture action.
For detailed description of steps S240 to S250, please refer to steps S130 to S140, which are not described herein again.
The method for device control provided by this embodiment acquires a first gesture action and an image of at least one piece of household equipment, both captured by a camera device; acquires the relative positional relationship between the first gesture action and the at least one piece of household equipment based on the first gesture action and the image; determines target household equipment from the at least one household equipment based on that relative positional relationship; acquires a second gesture action captured by the camera device; and controls the target household equipment based on the second gesture action. By determining the target household equipment from the positional relationship between the first gesture action and the household equipment, this embodiment improves the flexibility of household equipment control.
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating a method for controlling a device according to another embodiment of the present application. As will be explained in detail below with respect to the flow shown in fig. 4, the method may specifically include, but is not limited to, the following steps:
step S310: the method comprises the steps of obtaining a first gesture action and at least one image of household equipment, wherein the first gesture action and the at least one image are collected by a camera device.
Step S320: and determining target household equipment from the at least one household equipment based on the first gesture and the image of the at least one household equipment.
Step S330: and acquiring a second gesture motion acquired by the camera device.
For the detailed description of steps S310 to S330, refer to steps S110 to S130, which are not described herein again.
Step S340: and acquiring a region containing the first gesture motion and the image of the target household equipment in an imaging region of the camera device as a first target region.
In the embodiment of the application, by enlarging the area containing the first gesture action and the image of the target household equipment, the specific object the gesture points at can be determined more accurately and then controlled according to the second gesture action. Specifically, the area of the imaging area of the camera device that contains the first gesture action and the image of the target household equipment may be acquired as the first target area.
In some embodiments, the imaging area of the imaging device may include the first gesture motion, the image of the target home device, the background image, and the like, and an area including the first gesture motion and the image of the target home device may be acquired as the first target area and enlarged.
In some embodiments, the terminal device and the household equipment may be bound first, and the background image and the images of unbound household equipment may then be removed from the imaging area, or filled with a solid color, so that only the first gesture action and the household equipment bound to the terminal device remain, improving the accuracy of identifying the target household equipment.
Step S350: and amplifying the first target area, and acquiring the relative position relation between the amplified first gesture action and the amplified image of the target household equipment.
In an embodiment of the application, the first target area may be enlarged in order to determine a specific pointing object for the first gesture. Specifically, the first target area may be enlarged, and a relative position relationship between the enlarged first gesture and the enlarged image of the target home device may be acquired.
Further, after the first target area is enlarged, the image in the enlarged first target area may be processed, for example, impurities in the image may be removed, or the image may be subjected to noise reduction, enhancement, and the like, so as to improve the quality of the image, and thus, components on the target home equipment corresponding to the first gesture motion may be determined more accurately.
In some embodiments, a coordinate system may be established with the center of the imaging area as an origin, coordinates of the amplified first gesture motion are obtained, then coordinates of the amplified image of the target home device are obtained, and a relative position relationship between the amplified first gesture motion and the amplified image of the target home device is calculated.
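For illustration (assuming OpenCV is available; the scale factor and region values are assumptions, not part of the patent text), cropping and enlarging the first target area while keeping coordinates comparable might look like this:

```python
def crop_and_scale(frame, roi, scale=2.0):
    """Crop the first target area from the frame and enlarge it.

    frame: an HxWx3 image array from the camera device.
    roi: (x0, y0, x1, y1), the area containing the first gesture action and
         the target household equipment.
    scale: assumed enlargement factor.
    Returns the enlarged crop plus a helper that maps original-frame
    coordinates into the enlarged crop, so the relative positional
    relationship can be computed after enlargement.
    """
    import cv2  # assumes OpenCV is installed

    x0, y0, x1, y1 = roi
    crop = frame[y0:y1, x0:x1]
    enlarged = cv2.resize(crop, None, fx=scale, fy=scale,
                          interpolation=cv2.INTER_LINEAR)

    def to_enlarged(x, y):
        # Translate into the crop, then apply the same scale as the image.
        return (x - x0) * scale, (y - y0) * scale

    return enlarged, to_enlarged
```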
Step S360: and determining a target part from at least one part of the target household equipment based on the relative position relation between the amplified first gesture and the amplified image of the target household equipment.
In this embodiment of the application, the target component may be determined from at least one component of the target household device based on a relative positional relationship between the amplified first gesture and the amplified image of the target household device.
In some embodiments, the image of the target household device may include an image of at least one component, for example, an image of a television may include an image of a component that controls volume, an image of a component that controls a channel, an image of a component that controls a switch, and the like. As an implementation manner, a coordinate system may be established with the center of the imaging area as an origin, coordinates of at least one component in the enlarged image of the target home device and coordinates of the first gesture are obtained, and a component specifically pointed by the first gesture is determined as the target component.
In some embodiments, a coordinate system may be established with a center of the imaging area as an origin, coordinates of the enlarged first gesture motion may be acquired, the image of the enlarged target home device may be divided into several areas, each area includes at least one component, the coordinates of each area are acquired, an area to which the first gesture is specifically directed is determined according to the coordinates of the first gesture motion and the coordinates of each area, and the component in the area may be used as the target component.
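A minimal hit-testing sketch for the component lookup described above (the component rectangles are hypothetical and would come from the enlarged image of the target household equipment):

```python
# Hypothetical component regions within the enlarged television image, as
# (x_min, y_min, x_max, y_max) rectangles; the values are assumptions.
COMPONENT_REGIONS = {
    "volume_control": (0, 0, 100, 200),
    "channel_control": (100, 0, 200, 200),
    "power_switch": (200, 0, 300, 200),
}


def target_component(gesture_xy, regions=COMPONENT_REGIONS):
    """Return the component whose region contains the enlarged gesture point."""
    x, y = gesture_xy
    for part, (x0, y0, x1, y1) in regions.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return part
    return None
```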
Step S370: controlling the target component based on the second gesture motion.
In the embodiment of the present application, the target component may be controlled based on the second gesture motion.
In some embodiments, a second gesture motion may be recognized, a type of the second gesture may be obtained, and a manner of controlling the target component may be determined based on the type of the second gesture. For example, the target component is a component for controlling a channel of a television, and after the second gesture motion is recognized, if the type of the acquired second gesture is left-hand finger-up, the channel can be adjusted upwards.
The method for device control provided by this embodiment acquires a first gesture action and an image of at least one piece of household equipment, both captured by a camera device; determines target household equipment from the at least one household equipment based on the first gesture action and the image; acquires a second gesture action captured by the camera device; acquires the area of the imaging area of the camera device containing the first gesture action and the image of the target household equipment as the first target area; enlarges the first target area and acquires the relative positional relationship between the enlarged first gesture action and the enlarged image of the target household equipment; determines a target component from at least one component of the target household equipment based on that relationship; and controls the target component based on the second gesture action. By enlarging the area containing the first gesture action and the image of the target household equipment, this embodiment makes it easier to extract the specific object the gesture points at, so the correspondence between the gesture and the target household equipment is established more accurately.
Referring to fig. 5, fig. 5 is a schematic flow chart illustrating a method for controlling a device according to another embodiment of the present application, and the method specifically includes, but is not limited to, the following steps:
step S410: the method comprises the steps of obtaining a first gesture action and at least one image of household equipment, wherein the first gesture action and the at least one image are collected by a camera device.
Step S420: and determining target household equipment from the at least one household equipment based on the first gesture and the image of the at least one household equipment.
Step S430: and acquiring a second gesture motion acquired by the camera device.
For detailed description of steps S410 to S430, please refer to steps S110 to S130, which are not described herein again.
Step S440: recognizing the second gesture action, and taking the area in which the second gesture action is located within the imaging area as the target area.
In this embodiment of the application, the imaging area of the imaging device includes a plurality of areas, each of the plurality of areas corresponds to at least one piece of instruction information, and the control mode corresponding to the second gesture can be determined according to the area where the second gesture is located in the imaging area. Specifically, the acquired second gesture image may be recognized, and an area where the second gesture is located in the imaging area is acquired as the target area.
In some embodiments, the second gesture image captured by the camera device can be acquired in real time and displayed on the screen of the terminal device, so that the user knows exactly which area the finger is in, and false triggering can be prevented by requiring the gesture to stay in an area for a minimum time.
Step S450: acquiring first instruction information corresponding to the target area.
In the embodiment of the application, first instruction information corresponding to a target area can be acquired.
In some embodiments, the imaging area may include multiple areas, and the correspondence between areas and instruction information may be stored in advance; after the area in which the second gesture is located within the imaging area, i.e. the target area, is obtained, the first instruction information corresponding to the target area can be obtained from that correspondence. For example, the imaging area may be divided into three areas (left, middle, and right), with the left area corresponding to turning the device on and the right area corresponding to turning the device off; when the second gesture is detected in the left area, the first instruction information obtained is to turn the device on.
Step S460: and controlling the target household equipment based on the first instruction information.
In the embodiment of the application, the target household equipment can be controlled based on the first instruction information. For example, with the imaging area divided into three areas as above, the left area corresponding to turning the device on and the right area corresponding to turning it off, when the second gesture is detected in the left area the first instruction information obtained is to turn the device on, and the target household equipment is controlled to turn on.
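As an illustrative sketch of steps S440-S460 (the three-way partition and the dwell thresholds follow the examples above, but the data structures are assumptions):

```python
# Assumed area -> instruction mapping, with a per-area dwell threshold to
# prevent false triggering (thresholds follow the example in the text).
AREA_INSTRUCTIONS = {
    "left":  {"instruction": "turn_on",  "dwell_seconds": 3.0},
    "right": {"instruction": "turn_off", "dwell_seconds": 1.0},
}


def area_of(x, frame_width):
    """Classify an x coordinate into the left, middle, or right area."""
    third = frame_width / 3
    if x < third:
        return "left"
    if x >= 2 * third:
        return "right"
    return "middle"


def instruction_from_dwell(samples, frame_width):
    """Derive first instruction information from the second gesture's track.

    samples: (timestamp, x) pairs for the second gesture's position.
    """
    current, since = None, None
    for ts, x in samples:
        area = area_of(x, frame_width)
        if area != current:
            current, since = area, ts  # gesture changed areas: reset timer
        spec = AREA_INSTRUCTIONS.get(current)
        if spec and ts - since >= spec["dwell_seconds"]:
            return spec["instruction"]
    return None
```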
The method for device control provided by this embodiment acquires a first gesture action and an image of at least one piece of household equipment, both captured by a camera device; determines target household equipment from the at least one household equipment based on the first gesture action and the image; acquires a second gesture action captured by the camera device; recognizes the second gesture action and acquires the area in which the second gesture is located within the imaging area as the target area; acquires the first instruction information corresponding to the target area; and controls the target household equipment based on the first instruction information. By determining the control mode for the target household equipment from the position of the second gesture within the imaging area, this embodiment makes control of the target household equipment richer.
Referring to fig. 6, fig. 6 is a schematic flow chart illustrating a method for controlling a device according to still another embodiment of the present application, and the method specifically includes, but is not limited to, the following steps:
step S510: the method comprises the steps of obtaining a first gesture action and at least one image of household equipment, wherein the first gesture action and the at least one image are collected by a camera device.
Step S520: and determining target household equipment from the at least one household equipment based on the first gesture and the image of the at least one household equipment.
Step S530: and acquiring a second gesture motion acquired by the camera device.
For the detailed description of steps S510 to S530, please refer to steps S110 to S130, which are not described herein again.
Step S540: and identifying the second gesture action, and acquiring a gesture type corresponding to the second gesture action.
In this embodiment of the application, the control mode of the target household device may be determined based on the type of the second gesture. Specifically, the second gesture motion may be recognized, and a gesture type corresponding to the second gesture may be obtained.
In some embodiments, a machine learning model may be established in advance, and the machine learning model may be trained by using a gesture image and a gesture type corresponding to the gesture image as training samples, so that a second gesture motion may be input into the machine learning model to obtain a gesture type corresponding to a second gesture output by the machine learning model.
In some embodiments, the area of the region where the gesture is located may be increased appropriately, so as to extract the gesture motion more completely, and increase the accuracy of control. Specifically, a region including the second gesture motion in the imaging region of the imaging device may be acquired, the region including the second gesture motion in the imaging region of the imaging device may be used as a second target region, the second target region may be enlarged, and the enlarged second gesture motion may be recognized to obtain a gesture type corresponding to the second gesture. In one embodiment, the enlarged second gesture motion may be input to the machine learning model, and the gesture type output by the machine learning model and corresponding to the enlarged second gesture motion may be obtained.
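A minimal sketch of the gesture-type classifier mentioned above, assuming gesture images have already been reduced to fixed-length feature vectors (for example, flattened hand-landmark coordinates); the kNN choice, the feature extractor, and the labels are assumptions, not the patent's required model:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def train_gesture_model(features, labels):
    """Train on (gesture feature vector, gesture type) training samples.

    features: array of shape (n_samples, n_dims); labels: gesture-type
    strings such as "fist" or "left_finger_up".
    """
    model = KNeighborsClassifier(n_neighbors=3)
    model.fit(np.asarray(features), np.asarray(labels))
    return model


def classify_second_gesture(model, feature_vector):
    """Return the gesture type predicted for one (enlarged) second gesture."""
    return model.predict(np.asarray(feature_vector).reshape(1, -1))[0]
```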
Step S550: and acquiring second instruction information corresponding to the gesture type according to the target household equipment.
In the embodiment of the application, after the gesture type corresponding to the second gesture is obtained, second instruction information corresponding to the gesture type can be obtained according to the target household equipment.
In some embodiments, the correspondence between household equipment, gesture type, and second instruction information may be stored in advance. For example, when the household equipment is an air conditioner, a fist turns the air conditioner off, an open palm turns it on, a right-hand finger pointing up raises the temperature, and a right-hand finger pointing down lowers it. When the household equipment is a television, a right-hand finger pointing up adjusts the channel up, a right-hand finger pointing down adjusts it down, a left-hand finger pointing up raises the volume, a left-hand finger pointing down lowers it, and a clenched fist turns the television off. The above examples are illustrative only and are not limiting. Given the determined target household equipment and the type corresponding to the second gesture, the corresponding second instruction information is determined from the stored correspondence between household equipment, gesture type, and second instruction information.
Step S560: and controlling the target household equipment based on the second instruction information.
In the embodiment of the application, the target household equipment can be controlled based on the second instruction information. For example, when the target household device is an air conditioner, the temperature of the air conditioner can be controlled to be increased if the second instruction information is to adjust the temperature upwards.
The method for device control provided by this embodiment acquires a first gesture action and an image of at least one piece of household equipment, both captured by a camera device; determines target household equipment from the at least one household equipment based on the first gesture action and the image; acquires a second gesture action captured by the camera device; recognizes the second gesture action to obtain the gesture type corresponding to it; acquires the second instruction information corresponding to the gesture type; and controls the target household equipment based on the second instruction information. By determining the control mode for the household equipment from the type of the second gesture, this embodiment makes the control modes for the target household equipment richer.
Referring to fig. 7, fig. 7 is a schematic flowchart illustrating a method for controlling a device according to still another embodiment of the present application, where before acquiring a first gesture motion and an image of at least one piece of home equipment, step S610-step S620 shown in fig. 7 may be further performed, and the method may specifically include, but is not limited to, the following steps:
step S610: when the imaging area of the camera device contains a plurality of household equipment, the use frequencies of the household equipment in a specified time period are respectively obtained.
In the embodiment of the application, when the imaging area of the imaging device contains a plurality of household devices, the use frequencies of the plurality of household devices in a specified time period can be respectively acquired.
In some embodiments, historical usage information of the multiple pieces of household equipment within a specified time period may be obtained, and their usage frequencies determined from that information. For example, at 19:30, with a television and an air conditioner both in the imaging area of the camera device, the usage of the television and the air conditioner during the period 19:00-19:30 can be obtained; if the user operated the television several times in that half hour but never operated the air conditioner, the usage frequency of the television is higher than that of the air conditioner.
Step S620: determining a corresponding area size of each household device in the imaging area based on the use frequency, wherein the use frequency is positively correlated with the area size.
In some embodiments, the corresponding area size of each household device in the imaging area may be determined based on a frequency of use, wherein the frequency of use is positively correlated with the area size, i.e., the higher the frequency of use, the larger the corresponding area size. For example, in the time period of 19:00-19:30, the use frequency of the television is higher than that of the air conditioner, and the corresponding area of the television in the imaging area is larger than that of the air conditioner in the imaging area.
In some embodiments, step S620 may be further followed by step S630: displaying a plurality of home devices in an imaging area based on an area size.
In the embodiment of the application, the multiple pieces of household equipment can be displayed in the imaging area based on the area sizes. In some embodiments, the larger the area assigned to a device, the larger the space in which it is displayed within the imaging area.
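One illustrative way to allocate display areas proportionally to usage frequency (the smoothing term and minimum floor are assumptions, added so rarely used devices remain selectable):

```python
def allocate_display_areas(usage_counts, total_area):
    """Give each device a display area proportional to its recent usage.

    usage_counts: device name -> number of operations in the specified
    time period; total_area: total displayable area in pixels.
    """
    floor = 0.05 * total_area / max(len(usage_counts), 1)  # assumed minimum
    weights = {d: c + 1 for d, c in usage_counts.items()}  # +1 smoothing
    total_weight = sum(weights.values())
    return {
        d: max(floor, total_area * w / total_weight)
        for d, w in weights.items()
    }


# Usage with assumed counts: the television was used more during
# 19:00-19:30, so it gets the larger area, matching the positive
# correlation described above.
areas = allocate_display_areas({"television": 6, "air_conditioner": 0}, 10000.0)
assert areas["television"] > areas["air_conditioner"]
```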
The device control method provided by this embodiment, when the imaging area of the camera device contains multiple pieces of household equipment, obtains the usage frequencies of the multiple pieces of household equipment within a specified time period; determines the corresponding area size of each piece of household equipment in the imaging area based on the usage frequency, where the usage frequency is positively correlated with the area size; and displays the multiple pieces of household equipment in the imaging area based on the area sizes. By allocating area sizes in the imaging area according to each device's usage frequency over a historical period, this embodiment allows household equipment to be associated with gesture actions more accurately.
Referring to fig. 8, fig. 8 is a block diagram illustrating an apparatus 800 for controlling a device according to an embodiment of the present disclosure. As will be explained below with respect to the block diagram shown in fig. 8, the apparatus control apparatus 800 is applied to a terminal device including an image pickup apparatus, and the apparatus control apparatus 800 includes: a first action obtaining module 810, a target device determining module 820, a second action obtaining module 830, and a target device controlling module 840, wherein:
the first action obtaining module 810 is configured to obtain a first gesture action and an image of at least one piece of home equipment, where the first gesture action is collected by the camera device.
And the target device determining module 820 is configured to determine a target home device from the at least one home device based on the first gesture and the image of the at least one home device.
Further, the target device determining module 820 includes: the device comprises a position relation obtaining submodule and a target device determining submodule, wherein:
and the position relation acquisition sub-module is used for acquiring the relative position relation between the first gesture action and the at least one piece of household equipment based on the first gesture action and the image of the at least one piece of household equipment.
And the target equipment determining submodule is used for determining target household equipment from the at least one household equipment based on the relative position relation between the first gesture and the at least one household equipment.
And a second action obtaining module 830, configured to obtain a second gesture action acquired by the image capturing device.
Further, the second action obtaining module 830 includes: a target area determination submodule and a second action acquisition submodule, wherein:
and the target area determining submodule is used for determining a target imaging area of the image of the at least one piece of home equipment and the first gesture action acquired by the camera device.
And the second action acquisition sub-module is used for acquiring a second gesture action in the target imaging area.
And the target device control module 840 is used for controlling the target household device based on the second gesture action.
Further, the target device control module 840 includes: a first area acquisition submodule, a position relation acquisition submodule, a target component determination submodule, and a target component control submodule, wherein:
and the first area acquisition submodule is used for acquiring an area which contains the first gesture and the image of the target household equipment in the imaging area of the camera device as a first target area.
And the position relation acquisition submodule is used for amplifying the first target area and acquiring the relative position relation between the amplified first gesture action and the amplified image of the target household equipment.
And the target component determining submodule is used for determining a target component from at least one component of the target household equipment based on the relative position relation between the amplified first gesture and the amplified image of the target household equipment.
And the target component control sub-module is used for controlling the target component based on the second gesture action.
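A hedged sketch of the enlarge-and-locate step follows: the first target area is cropped and enlarged, and the component whose box contains the gesture point becomes the target component. OpenCV's resize is used here only as a stand-in for the enlargement (the patent names no library), and every box, coordinate and name is an illustrative assumption:

    import cv2
    import numpy as np

    def zoom_first_target_area(frame, region, scale=2.0):
        # Crop the area containing the first gesture action and the target
        # household device, then enlarge it so relative positions between the
        # gesture and the device's components can be measured more precisely.
        x0, y0, x1, y1 = region
        crop = frame[y0:y1, x0:x1]
        return cv2.resize(crop, None, fx=scale, fy=scale,
                          interpolation=cv2.INTER_LINEAR)

    def pick_target_component(gesture_xy, component_boxes):
        # component_boxes maps part names (e.g. buttons of the device) to
        # (x0, y0, x1, y1) boxes in the enlarged image; the component whose
        # box contains the gesture point is chosen as the target component.
        x, y = gesture_xy
        for name, (x0, y0, x1, y1) in component_boxes.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in camera frame
    enlarged = zoom_first_target_area(frame, (600, 100, 1200, 700))
    part = pick_target_component((250, 300),
                                 {"power_button": (200, 250, 320, 360)})
    # part == "power_button", which the second gesture action then controls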
Further, the imaging area of the camera device may include a plurality of areas, each corresponding to at least one piece of instruction information; in that case the target device control module 840 further includes a first image recognition sub-module, a first information obtaining sub-module and a first control sub-module, wherein:
The first image recognition sub-module is configured to recognize the second gesture action and take the area of the imaging area in which the second gesture action is located as a target area.
The first information obtaining sub-module is configured to obtain the first instruction information corresponding to that target area.
The first control sub-module is configured to control the target household device based on the first instruction information.
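For the area-to-instruction mapping, a minimal sketch might quarter the imaging area and attach one piece of instruction information to each quadrant; the quadrant scheme and command names are illustrative assumptions of this sketch:

    REGION_COMMANDS = {
        "top_left": "power_on",
        "top_right": "power_off",
        "bottom_left": "level_down",
        "bottom_right": "level_up",
    }

    def region_of(point, width, height):
        # Classify a gesture position into one of four imaging-area quadrants.
        x, y = point
        vertical = "top" if y < height / 2 else "bottom"
        horizontal = "left" if x < width / 2 else "right"
        return vertical + "_" + horizontal

    def first_instruction_for(gesture_xy, width=1920, height=1080):
        # The first instruction information is read off the area in which the
        # second gesture action was recognized.
        return REGION_COMMANDS[region_of(gesture_xy, width, height)]

    print(first_instruction_for((300, 200)))  # -> "power_on"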
Further, the target device control module 840 further includes a second image recognition sub-module, a second information obtaining sub-module and a second control sub-module, wherein:
The second image recognition sub-module is configured to recognize the second gesture action and obtain the gesture type corresponding to the second gesture action.
Further, the second image recognition sub-module includes an area obtaining unit and an image recognition unit, wherein:
The area obtaining unit is configured to take, as a second target area, the area of the imaging area of the camera device that contains the second gesture action.
The image recognition unit is configured to enlarge the second target area and recognize the enlarged second gesture action, obtaining the gesture type corresponding to the second gesture action.
The second information obtaining sub-module is configured to obtain the second instruction information corresponding to the gesture type.
The second control sub-module is configured to control the target household device based on the second instruction information.
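The gesture-type path can be sketched as a per-device lookup table, reflecting that the second instruction information is obtained for the gesture type in view of the target household device. The table contents and the send callback below are illustrative assumptions:

    GESTURE_COMMANDS = {
        # The same gesture type may mean different things on different devices.
        "air_conditioner": {"fist": "power_toggle", "palm": "temp_up", "ok": "temp_down"},
        "lamp": {"fist": "power_toggle", "palm": "brightness_up", "ok": "brightness_down"},
    }

    def control_by_gesture_type(device, gesture_type, send):
        # Look up the second instruction information for this device/gesture
        # pair and, if one exists, issue it to the target household device.
        command = GESTURE_COMMANDS.get(device, {}).get(gesture_type)
        if command is not None:
            send(device, command)
        return command

    control_by_gesture_type("lamp", "palm",
                            send=lambda dev, cmd: print(dev, cmd))
    # prints: lamp brightness_up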
Further, the apparatus 800 for device control further includes a use frequency obtaining module, an area size determining module and a household device display module, wherein:
The use frequency obtaining module is configured to, when the imaging area of the camera device contains a plurality of household devices, obtain the use frequency of each of those household devices over a specified time period.
The area size determining module is configured to determine, based on the use frequencies, the size of the area corresponding to each household device in the imaging area, the use frequency being positively correlated with the area size.
The household device display module is configured to display the plurality of household devices in the imaging area according to those area sizes.
The apparatus for device control provided by the embodiment of the application comprises a first action obtaining module for obtaining a first gesture action and an image of at least one household device; a target device determining module for determining a target household device from the at least one household device based on the first gesture action and the image; a second action obtaining module for obtaining a second gesture action collected by the camera device; and a target device control module for controlling the target household device based on the second gesture action. The controlled household device is thus determined from the acquired gesture action and then controlled by gesture, so household devices can be controlled flexibly. Multiple control modes can be defined from the positional relation between the gesture action and the household device at imaging time; operation is simple and accurate, and control is no longer limited to a small, fixed set of gestures. Because an association is established between the gesture action and whichever controlled device appears in the imaging area, the controlled device can be changed at will, making control of the devices richer. Compared with controlling a single fixed device by gesture, the controlled device in this scheme can be switched flexibly, so the control mode is more flexible.
As will be clear to those skilled in the art, the apparatus for device control provided in the embodiment of the present application can implement each process implemented by the terminal device in the method embodiments of fig. 2 to fig. 7; for brevity, the specific working processes of the apparatus and modules described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the several embodiments provided in the present application, the coupling, direct coupling or communication connection between the modules shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or modules may be electrical, mechanical or of another form. In addition, the functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module.
Fig. 9 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present application.
The terminal device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the structure shown in fig. 9 does not constitute a limitation of the terminal device; the terminal device may include more or fewer components than shown, combine certain components, or arrange the components differently. In the embodiments of the present application, terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palm computers, vehicle-mounted terminal devices, wearable devices, pedometers, and the like.
The processor 110 is configured to respond to a device network-access instruction by scanning for smart devices using a short-range wireless communication technology, to obtain, from the scanned smart devices, target smart devices that meet a preset network-access condition, and to push the target smart devices according to a preset rule, so that a target smart device can be selected quickly and accurately for network connection, improving the user experience.
It should be understood that, in the embodiments of the present application, the radio frequency unit 101 may be used to receive and send signals during messaging or calls; specifically, it receives downlink data from a base station and forwards it to the processor 110 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. The audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call-signal reception sound or a message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (e.g., a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data; in a phone-call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, can collect touch operations by the user on or near it (for example, operations performed with a finger, a stylus, or any other suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave panel. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described in detail here.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to that type. Although in fig. 9 the touch panel 1071 and the display panel 1061 are shown as two independent components implementing the input and output functions of the terminal device, in some embodiments they may be integrated to implement those functions; no limitation is imposed here.
The interface unit 108 is an interface for connecting an external device to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data or power) from an external device and transmit it to one or more elements within the terminal device 100, or to transmit data between the terminal device 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area; the program storage area may store an operating system and the application programs required by at least one function (such as a sound playing function or an image playing function), while the data storage area may store data created through use of the mobile phone (such as audio data and a phonebook). Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal device. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
An embodiment of the present application further provides a terminal device, which includes a processor 110, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When executed by the processor 110, the computer program implements each process of the above method embodiment for device control and can achieve the same technical effects; to avoid repetition, details are not repeated here.
The terminal device provided by the embodiment of the application acquires a first gesture action and an image of at least one household device collected by the camera device; determines a target household device from the at least one household device based on the first gesture action and the image; acquires a second gesture action collected by the camera device; and controls the target household device based on the second gesture action. The controlled household device is thus determined from the acquired gesture action and controlled by gesture, with the flexibility benefits described above for the apparatus embodiment: control modes derived from the positional relation at imaging time, simple and accurate operation, and a controlled device that can be changed at will within the imaging area.
An embodiment of the present application further provides a computer-readable storage medium 900, on which a computer program 910 is stored. When executed by a processor, the computer program 910 implements each process of the above method embodiment for device control and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium 900 may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
In summary, the embodiments of the application disclose a device control method, an apparatus, a terminal device and a storage medium: a first gesture action and an image of at least one household device are acquired by the camera device; a target household device is determined from the at least one household device based on the first gesture action and the image; a second gesture action is acquired by the camera device; and the target household device is controlled based on the second gesture action. Because the controlled household device is determined from the acquired gesture action and then controlled by gesture, household devices can be controlled flexibly.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a smart gateway, a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the embodiments of the present application have been described with reference to the accompanying drawings, the application is not limited to the above-described embodiments, which are illustrative rather than restrictive; those skilled in the art may make various changes and modifications without departing from the spirit and scope of the present application.

Claims (11)

1. A method for device control, applied to a terminal device comprising a camera device, the method comprising:
acquiring a first gesture action and an image of at least one piece of household equipment acquired by the camera device;
determining target household equipment from the at least one household equipment based on the first gesture and the image of the at least one household equipment;
acquiring a second gesture collected by the camera device;
and controlling the target household equipment based on the second gesture action.
2. The method according to claim 1, wherein the determining a target household device from the at least one household device based on the first gesture and the image of the at least one household device comprises:
acquiring a relative position relation between the first gesture action and the at least one household device based on the first gesture action and the image of the at least one household device;
and determining the target household equipment from the at least one household equipment based on the relative position relationship between the first gesture and the at least one household equipment.
3. The method according to claim 1, wherein the controlling the target home device based on the second gesture action comprises:
acquiring a region containing the first gesture action and an image of target household equipment in an imaging region of the camera device as a first target region;
amplifying the first target area, and acquiring a relative position relation between the amplified first gesture and the amplified image of the target household equipment;
determining a target part from at least one part of the target household equipment based on the relative position relation between the amplified first gesture and the amplified image of the target household equipment;
controlling the target component based on the second gesture action.
4. The method according to claim 1, wherein an imaging area of the camera device comprises a plurality of areas, each of the plurality of areas corresponds to at least one piece of instruction information, and the controlling the target household device based on the second gesture comprises:
recognizing the second gesture action, and taking the area of the second gesture action in the imaging area as a first target area;
acquiring first instruction information corresponding to the first target area;
and controlling the target household equipment based on the first instruction information.
5. The method according to claim 1, wherein the controlling the target home device based on the second gesture action comprises:
identifying the second gesture action to acquire a gesture type corresponding to the second gesture action;
acquiring second instruction information corresponding to the gesture type according to the target household equipment;
and controlling the target household equipment based on the second instruction information.
6. The method according to claim 5, wherein the recognizing the second gesture action and acquiring the gesture type corresponding to the second gesture action comprise:
acquiring a region containing the second gesture action in an imaging area of the camera device, and taking that region as a second target region;
amplifying the second target region, and recognizing the amplified second gesture action to obtain the gesture type corresponding to the second gesture action.
7. The method according to claim 1, wherein before the acquiring the first gesture motion and the image of the at least one household device collected by the camera device, the method further comprises:
when the imaging area of the camera device comprises a plurality of household equipment, respectively acquiring the use frequency of the plurality of household equipment in a specified time period;
determining a corresponding area size of each household device in the imaging area based on the use frequency, wherein the use frequency is positively correlated with the area size.
8. The method according to any one of claims 1-7, wherein the acquiring of the second gesture motion captured by the camera device comprises:
determining a target imaging area in which the camera device acquired the first gesture action and the image of the at least one piece of household equipment;
acquiring the second gesture within the target imaging region.
9. An apparatus for controlling a device, applied to a terminal device comprising a camera device, the apparatus comprising:
the first image acquisition module is used for acquiring a first gesture action and an image of at least one piece of household equipment, which are acquired by the camera device;
the target equipment determining module is used for determining target household equipment from the at least one household equipment based on the first gesture and the image of the at least one household equipment;
the second image acquisition module is used for acquiring a second gesture motion acquired by the camera device;
and the target equipment control module is used for controlling the target household equipment based on the second gesture action.
10. A terminal device, comprising a memory and a processor, the memory being coupled to the processor and storing instructions which, when executed by the processor, cause the processor to perform the method of any one of claims 1-8.
11. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 8.