CN116136659A - Smart device control method and electronic device - Google Patents
- Publication number
- CN116136659A (application CN202111355520.0A)
- Authority
- CN
- China
- Prior art keywords
- intention
- user
- group
- electronic device
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Manufacturing & Machinery (AREA)
- Quality & Reliability (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
This application provides a smart device control method and an electronic device, relating to the field of terminal technology. The method divides devices into groups, enables group control of the devices, simplifies the device control flow, and improves the user experience. The method comprises the following steps: a first interface is displayed that includes a first intent identifier and a first group identifier corresponding to a first group, where the first group includes X electronic devices capable of executing the first intent corresponding to the first intent identifier. Then, a first operation by the user on the first group identifier is received; in response, the first group identifier is moved toward the first intent identifier, and the X electronic devices are instructed to execute the first intent.
Description
Technical Field
Embodiments of this application relate to the field of terminal technology, and in particular to a smart device control method and an electronic device.
Background
With the development of terminal technology, users own more and more electronic devices. As shown in fig. 1, in a home scenario, various devices in the home (such as an audio/video device 11, a lighting system device 12, an environmental control device 13, a security system device 14, etc.) are connected through Internet-of-Things technology to form a smart home system, realizing centralized control of the devices and providing the user with functions such as home-appliance control, lighting control, and burglar alarms.
However, because of the large number of devices, the user must connect, arrange, and group them one by one before device control is possible. Moreover, to operate several devices the user has to switch repeatedly between multiple interfaces of the smart home system and operate each device separately, which is cumbersome and time-consuming and gives the user no overview of the state of all devices in the home.
Disclosure of Invention
To solve the above technical problems, embodiments of this application provide a smart device control method and an electronic device. According to the technical solution provided by the embodiments, devices can be divided into groups, realizing group control of the devices, simplifying the device control flow, and improving the user experience.
To achieve this, the embodiments of this application provide the following technical solutions:
In a first aspect, a smart device control method is provided, applied to a first electronic device. The method comprises: displaying a first interface, where the first interface includes a first intent identifier and a first group identifier corresponding to a first group, the first group includes X electronic devices capable of executing the first intent corresponding to the first intent identifier, and X is a positive integer; receiving a first operation by the user on the first group identifier; and, in response to the first operation, moving the first group identifier to the first intent identifier and instructing the X electronic devices to execute the first intent.
In some embodiments, the first intent corresponds to a function the electronic device can perform, such as a turn-on-light intent or a constant-temperature intent. An intent identifier is a function identifier representing the corresponding achievable function: the turn-on-light intent identifier represents the turn-on-light function, and the constant-temperature intent identifier represents the constant-temperature function.
In some embodiments, the first group includes a plurality of electronic devices that share the same category attribute and/or location attribute and can implement the corresponding intent. For example, the category attribute and/or location attribute of the X electronic devices are the same.
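The entities described above — intents, devices with category and location attributes, and groups whose members can execute an intent — might be modeled as in the following sketch. This is an illustration, not the patent's implementation; all class and field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Intent:
    # An intent names a function a device can perform,
    # e.g. "turn_on_light" or "hold_temperature".
    name: str

@dataclass
class Device:
    device_id: str
    category: str                  # category attribute, e.g. "lighting"
    location: str                  # location attribute, e.g. "living_room"
    supported_intents: frozenset   # intents this device can execute

    def can_execute(self, intent: Intent) -> bool:
        return intent in self.supported_intents

@dataclass
class Group:
    group_id: str
    devices: list = field(default_factory=list)

    def executors_of(self, intent: Intent) -> list:
        # The X devices in the group able to execute the given intent.
        return [d for d in self.devices if d.can_execute(intent)]
```

A group's identifier can then be shown on the first interface, and `executors_of` yields the X devices to instruct when the group identifier is moved onto an intent identifier.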
In this way, the first electronic device displays the first intent identifier and the first group identifier, so the user can see at a glance which intent the devices in a group can execute and which groups are controllable. Group control of the electronic devices can then be achieved with a simple operation, without controlling each device one by one, which simplifies user operation and improves the user experience.
According to the first aspect, before displaying the first interface, the method further comprises: displaying a second interface that includes a plurality of electronic device identifiers corresponding to a plurality of electronic devices; receiving a second operation that selects X electronic device identifiers from the plurality of electronic device identifiers; and, in response to the second operation, establishing a first group comprising the X electronic devices.
In some embodiments, before implementing group control, the first electronic device may establish one or more groups, each including a plurality of electronic devices, according to user operations. When a plurality of electronic device identifiers are displayed, the devices forming a group are determined according to the second operation by which the user selects device identifiers. Optionally, the second operation may be clicking to select an electronic device identifier, or a sliding operation that circles several electronic device identifiers.
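The second operation — turning a user's selection of X device identifiers into a first group — could be sketched as below. The function and dictionary keys are illustrative assumptions, not part of the patent.

```python
def establish_group(all_devices: dict, selected_ids: list, group_id: str) -> dict:
    """Build a group from the device identifiers the user selected
    on the second interface (via click or circle selection)."""
    unknown = [i for i in selected_ids if i not in all_devices]
    if unknown:
        # A selected identifier must correspond to a known device.
        raise ValueError(f"unknown device identifiers: {unknown}")
    return {"group_id": group_id,
            "devices": [all_devices[i] for i in selected_ids]}
```

The check against `all_devices` mirrors the fact that only identifiers displayed on the second interface can be selected.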
In some embodiments, the first electronic device may display the electronic devices capable of joining a group, receive a user operation while displaying them, and determine the grouping result according to that operation. The first electronic device may discover such devices by searching for them directly, or it may interact with a server to obtain the device list for the home maintained on the server and determine from it the devices capable of joining a group. The device list includes information such as each device's room and category.
In other embodiments, the first electronic device may be any electronic device with a display screen and group-processing capability. For example, when the first electronic device has an explicit category attribute and/or location attribute, it may automatically confirm that nearby electronic devices have the same category attribute and/or location attribute and prompt the user to establish a group. The category attribute represents the category of the electronic device; the location attribute represents the space where the device is located, for example dividing that space at room granularity.
In still other embodiments, the user may establish a group of electronic devices through a tap operation using a near-field connection function of the first electronic device (such as an NFC function).
For example, the first electronic device detects a tap by the user, such as tapping the NFC tag of device A, obtains the device information of device A, and may broadcast that information to search, within a preset distance, for electronic devices with the same category attribute as device A. The identifier of each electronic device found may be displayed, and a group including all or some of those devices is established according to the user operation.
For another example, within a preset time the user taps a plurality of electronic devices with the first electronic device; the first electronic device obtains the device information of those devices and displays their identifiers. A group including all or some of those devices is then established according to the user operation.
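The NFC-tap example — searching within a preset distance for devices that share the tapped device's category attribute — might look like the following sketch. The position tuples, distance metric, and field names are assumptions for illustration; the patent does not specify how distance is measured.

```python
def nearby_same_category(tapped: dict, candidates: list, max_distance: float) -> list:
    """After a tap on one device, find candidate devices sharing its
    category attribute within a preset distance (Euclidean, assumed)."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    return [c for c in candidates
            if c["category"] == tapped["category"]
            and dist(c["pos"], tapped["pos"]) <= max_distance]
```

The identifiers of the returned devices would then be displayed so the user can include all or some of them in the new group.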
In this way, the user can add devices in batches with a simple operation to establish a group of electronic devices, which lowers the difficulty of operation and improves the user experience.
According to the first aspect, or any implementation of the first aspect, the method further comprises: aggregating and displaying the electronic device identifiers of those electronic devices, among the plurality of electronic devices, that have the same category attribute and/or location attribute.
For example, the first electronic device may divide the electronic devices by category attribute and/or location attribute: by category attribute into kitchen appliances, audio/video devices, lighting devices, etc.; by location attribute into devices in the master bedroom, devices in the kitchen, devices in the living room, etc. The first electronic device can then display the device identifiers in clusters, making it easy for the user to quickly select the devices to put into one group. For instance, if the first electronic device displays the identifiers of the lighting devices as one cluster, the user can directly circle the clustered identifiers to establish a group corresponding to the lighting intent.
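Clustering identifiers by shared attributes is a straightforward grouping step; a minimal sketch (illustrative names only) is:

```python
from collections import defaultdict

def aggregate_identifiers(devices: list) -> dict:
    """Cluster device identifiers that share the same category and
    location attribute, so each cluster can be displayed together."""
    clusters = defaultdict(list)
    for dev in devices:
        clusters[(dev["category"], dev["location"])].append(dev["id"])
    return dict(clusters)
```

Each key of the result identifies one cluster to render as an aggregated block of identifiers, which the user can then circle in a single gesture.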
Thus, by displaying the electronic device identifiers in clusters, the first electronic device makes the operation easier for the user and further improves the efficiency of group establishment.
According to the first aspect, or any implementation of the first aspect, before the first operation on the first group identifier is received, the first intent identifier is displayed in a first area of the first interface, and the first group identifier is displayed in an area other than the first area; the first operation comprises moving the first group identifier into the first area.
Dividing the interface by the area where an intent identifier is located makes the control operation convenient: the user only needs to bring the group identifier near, or move it into, the first area, and the first electronic device can determine that the user wants the devices behind that group identifier to execute the intent corresponding to the first area.
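Detecting whether a dragged group identifier has entered the first area reduces to a rectangle hit test; a sketch under the assumption that the area is an axis-aligned rectangle:

```python
def in_first_area(point: tuple, area: tuple) -> bool:
    """Return True when the drop point of the dragged group identifier
    falls inside the first area (left, top, width, height)."""
    x, y = point
    left, top, width, height = area
    return left <= x < left + width and top <= y < top + height
```

On drag release, a True result would trigger instructing the group's devices to execute the intent corresponding to the first area.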
According to the first aspect, or any implementation manner of the first aspect, the method further includes: a third operation of the user on the first group identification is received. And responding to the third operation, and displaying X electronic equipment identifiers corresponding to the X electronic equipment respectively on the first interface.
Alternatively, the third operation may be an operation of clicking on the first group identification.
In some embodiments, the first electronic device may highlight a first group identifier whose first intent has been executed, so the user can determine the execution status from the highlighting of the group identifier. Then, if the first electronic device detects that the user clicks the first group identifier, it may display the identifiers of the devices in the first group that have executed the first intent.
For example, in response to the third operation the first electronic device may display the identifiers of the devices among the X electronic devices that executed the first intent: if M devices executed the first intent successfully and N devices failed, the first electronic device may display the identifiers of the M devices, while the identifiers of the devices that did not execute, or failed to execute, the first intent are not displayed or not highlighted. In this way, the user can determine which devices executed the first intent and which did not.
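The M-successful / N-failed display rule amounts to filtering on the per-device execution result; a minimal sketch (result format assumed):

```python
def identifiers_to_display(results: dict) -> list:
    """Given {device_id: executed_ok}, return only the identifiers of
    the M devices that executed the first intent successfully; the N
    devices that failed or did not execute are omitted."""
    return [device_id for device_id, ok in results.items() if ok]
```

A variant could instead return all identifiers with a highlight flag, if the interface dims failed devices rather than hiding them.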
According to the first aspect, or any implementation of the first aspect, after the X electronic device identifiers are displayed on the first interface, the method further comprises: receiving a fourth operation by the user on the identifier, among the X identifiers, that corresponds to a second electronic device; and, in response to the fourth operation, moving the identifier of the second electronic device out of the first area and instructing the second electronic device to cancel execution of the first intent.
Optionally, the fourth operation may be dragging the identifier of the second electronic device out of the first area.
Cancelling execution of an intent can mean instructing the device to execute the opposite command: for example, cancelling the turn-on-light intent instructs the device to turn the light off. It can also mean no longer executing the intent: for example, if the intent is a constant-temperature intent, cancelling it instructs the device to stop holding the temperature.
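The two cancellation forms just described — issue the opposite command, or simply stop a continuous intent — can be sketched with a lookup table. The command strings are assumptions; the patent names no concrete protocol.

```python
# Intents with a natural opposite map to the opposite command;
# continuous intents (e.g. constant temperature) are simply stopped.
OPPOSITE_COMMAND = {"turn_on_light": "turn_off_light"}

def cancel_command(intent_name: str) -> str:
    """Return the command sent to a device when the user cancels
    execution of `intent_name`."""
    if intent_name in OPPOSITE_COMMAND:
        return OPPOSITE_COMMAND[intent_name]   # e.g. turn the light off
    return f"stop:{intent_name}"               # stop executing the intent
```

Dragging a device identifier out of the first area would invoke this for the area's intent and send the result to that one device.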
For example, when the first electronic device detects a fourth operation that moves the identifier of the second electronic device away from the first intent identifier, it may instruct the second electronic device to cancel execution of the first intent.
In this way, the first electronic device supports individual control of devices within a group alongside group control, meeting the user's needs.
According to a first aspect, or any implementation manner of the first aspect, the category attribute and/or the location attribute of the X electronic devices are the same.
According to the first aspect, or any implementation of the first aspect, the method further comprises: displaying prompt information on the first interface, where the prompt information feeds back how the X electronic devices have executed the first intent.
The first electronic device may instruct the X electronic devices to execute the first intent, and each device may send a feedback signal after executing it. Based on the received feedback signals, the first electronic device can determine whether the first intent was executed. Optionally, the first electronic device may display prompt information according to the feedback signals, for example showing the execution status of the command within the group identifier.
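Aggregating the feedback signals into the prompt information could look like the following sketch; the feedback format and prompt wording are assumptions.

```python
def summarize_feedback(expected_ids: list, feedback: dict) -> dict:
    """Aggregate per-device feedback signals into the prompt shown on
    the first interface: which of the X devices confirmed execution."""
    succeeded = [i for i in expected_ids if feedback.get(i) is True]
    failed = [i for i in expected_ids if i not in succeeded]
    return {
        "succeeded": succeeded,
        "failed": failed,   # includes devices that never replied
        "prompt": f"{len(succeeded)}/{len(expected_ids)} devices executed the intent",
    }
```

Devices that sent no feedback are treated as failed here; a real system would more likely distinguish timeouts from explicit failures.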
Thus the user can directly read the execution status of the command from the display of the group identifier, which improves the user experience.
According to the first aspect, or any implementation of the first aspect, the first interface further displays a second intent identifier, and the method further comprises: receiving a fifth operation by which the user moves the second intent identifier into the first area; and, in response to the fifth operation, instructing Y electronic devices in the first group to execute the second intent corresponding to the second intent identifier, where Y is a positive integer.
In some embodiments, the first electronic device determines a fused intent, which may include at least two intents, based on the user operation. Thus, when the user needs two or more intents executed, there is no need to operate the intent identifiers one by one: a fused intent containing the required intents can be established in advance, and the devices in the group can be instructed to execute it directly, simplifying the user operation.
Optionally, after detecting the fifth operation, the first electronic device may determine which devices in the first group can execute the second intent and send an indication signal only to those devices. Alternatively, it may send the indication signal directly to all devices in the first group and then determine the execution status from their feedback signals.
For example, as shown in fig. 31 (a), the first electronic device displays the second intent identifier, detects the fifth operation moving it into the first area, and may generate a fused intent combining the first intent and the second intent. It then determines whether the first group corresponding to the first intent contains devices that can execute the second intent and, if so, instructs those devices to execute it. The devices that execute the second intent may be devices that have executed the first intent, devices that have not executed the first intent, or a subset of the devices that executed the first intent.
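Dispatching a fused intent — sending each constituent intent only to the group members able to execute it — can be sketched as below; the dictionary shapes are illustrative assumptions.

```python
def dispatch_fused_intent(group_devices: list, intents: list) -> dict:
    """For a fused intent, plan which intents each device in the group
    should execute; a device may receive one, several, or none of them."""
    plan = {}
    for dev in group_devices:
        todo = [i for i in intents if i in dev["supported"]]
        if todo:
            plan[dev["id"]] = todo
    return plan
```

The returned plan reflects the text above: the executors of the second intent may overlap fully, partially, or not at all with the executors of the first intent.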
According to the first aspect, or any implementation of the first aspect, after the fifth operation is received, the method further comprises: receiving a sixth operation by the user on a second group identifier displayed in the first interface, the second group identifier corresponding to a second group that includes Z electronic devices capable of executing the first intent and the second intent, where Z is a positive integer; and, in response to the sixth operation, moving the second group identifier into the first area and instructing the Z electronic devices to execute the first intent and the second intent.
In some embodiments, the first electronic device may display the group identifiers of all groups corresponding to an intent, and the user may select some or all of those groups as needed to instruct the devices in them to execute the intent.
Thus the user can choose which groups execute an intent through a simple operation, making group control of the electronic devices convenient.
According to the first aspect, or any implementation manner of the first aspect, the method further includes: a seventh operation of the user on the first region is received. In response to the seventh operation, the first group identification and the second group identification are controlled to move out of the first area and instruct the Y electronic devices and the Z electronic devices to cancel executing the first intention and the second intention.
The seventh operation is an operation of pressing an arbitrary position in the first area for a long time, for example. After the first electronic device detects the seventh operation of the user, it is determined that all electronic devices indicating that the first intention and the second intention have been performed cancel performing the first intention and the second intention.
Also, for example, the first electronic device may instruct the electronic devices in all the currently displayed group identifications to execute the intention corresponding to the current first area when detecting the operation of the user pressing the first area for a long time at any position.
Therefore, the user can indicate more group execution intents or cancel the execution intents through simple operation, and the operation difficulty of the user is further reduced.
According to the first aspect, or any implementation manner of the first aspect, the method further includes: an eighth operation of the user on a second region is received, the second region corresponding to a second intent. In response to the eighth operation, the second intent identification is controlled to move out of the first area and the Y electronic devices are instructed to cancel execution of the second intent.
The first region includes a second region, the second region corresponding to a second intent. For example, if the converged intent includes M intents, then the first electronic device may divide the first region into an average of hot zones corresponding to the M intents, each hot zone corresponding to one intent. The first electronic device determines that the user performs eight operations on a certain hot zone, can remove the intention corresponding to the hot zone from the fusion intention, and indicates the electronic device currently executing the fusion intention to cancel the intention of executing the removal.
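Dividing the first area evenly into M hot zones is simple geometry; a sketch assuming an axis-aligned rectangle split horizontally (the patent does not fix the split direction):

```python
def split_into_hot_zones(area: tuple, m: int) -> list:
    """Divide the first area (left, top, width, height) evenly into m
    hot zones, one per intent in the fused intent, so an eighth
    operation on a zone removes that intent."""
    left, top, width, height = area
    zone_width = width / m
    return [(left + k * zone_width, top, zone_width, height)
            for k in range(m)]
```

For M = 2 this yields exactly the left-half and right-half regions mentioned below for a two-intent fusion.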
For example, if two intents are fused, the second region may be the left half or the right half of the first area.
Thus the first electronic device can create a fused intent according to the user's needs and also split it according to the user's needs, further improving the user experience.
According to the first aspect, or any implementation manner of the first aspect, the third operation is a click operation. The fourth operation, the fifth operation, the sixth operation, and the eighth operation are drag operations. The seventh operation is a long press operation.
In a second aspect, an electronic device is provided. The electronic device includes: a processor, a memory, and a display screen, the memory coupled to the processor, the memory for storing computer program code, the computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to: and displaying a first interface, wherein the first interface comprises a first intention identifier and a first group identifier corresponding to the first group, the first group comprises X electronic devices capable of executing the first intention corresponding to the first intention identifier, and X is a positive integer. A first operation of a user on a first group identification is received. In response to the first operation, the first group identification is controlled to move to the first intent identification and the X electronic devices are instructed to execute the first intent.
According to a second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to: and displaying a second interface, wherein the second interface comprises a plurality of electronic equipment identifiers respectively corresponding to the plurality of electronic equipment. A second operation is received to select X electronic device identifications from the plurality of electronic device identifications. In response to the second operation, a first group comprising X electronic devices is established.
According to a second aspect, or any implementation manner of the second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to perform the following operations: and aggregating and displaying the electronic device identifiers corresponding to the electronic devices with the same category attribute and/or position attribute in the plurality of electronic devices.
According to a second aspect, or any implementation manner of the above second aspect, before receiving a first operation of a first group identifier by a user, the first intention identifier is displayed in a first area of a first interface, and the first group identifier is displayed in other areas than the first area; the first operation includes: the first group identification is moved to a first area.
According to a second aspect, or any implementation manner of the second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to perform the following operations: a third operation of the user on the first group identification is received. And responding to the third operation, and displaying X electronic equipment identifiers corresponding to the X electronic equipment respectively on the first interface.
According to a second aspect, or any implementation manner of the second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to perform the following operations: and receiving fourth operation of the user on the electronic equipment identifier corresponding to the second electronic equipment in the X electronic equipment identifiers. And responding to the fourth operation, controlling the electronic device identifier corresponding to the second electronic device to move out of the first area, and indicating the second electronic device to cancel executing the first intention.
According to a second aspect, or any implementation manner of the second aspect above, the category attribute and/or the location attribute of the X electronic devices are the same.
According to the second aspect, or any implementation of the second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to perform the following operations: displaying prompt information on the first interface, where the prompt information feeds back how the X electronic devices have executed the first intent.
According to a second aspect, or any implementation manner of the second aspect, the first interface further displays a second intention identifier, and when the processor reads the computer instructions from the memory, the electronic device is further caused to perform the following operations: a fifth operation of the user moving the second intent identification to the first area is received. And in response to the fifth operation, instructing Y electronic devices in the first group to execute the second intention corresponding to the second intention identification, wherein Y is a positive integer.
According to a second aspect, or any implementation manner of the second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to perform the following operations: a sixth operation of a second group identification displayed in the first interface by the user is received, the second group identification corresponding to a second group of Z electronic devices including executable first intent and second intent, Z being a positive integer. In response to the sixth operation, the second group identification is moved to the first area and the Z electronic devices are instructed to perform the first intent and the second intent.
According to the second aspect, or any implementation manner of the second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to perform the following operations: receiving a seventh operation of the user on the first area; and in response to the seventh operation, controlling the first group identifier and the second group identifier to move out of the first area, and instructing the Y electronic devices and the Z electronic devices to cancel execution of the first intention and the second intention.
According to the second aspect, or any implementation manner of the second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to perform the following operations: receiving an eighth operation of the user on a second area, where the second area corresponds to the second intention; and in response to the eighth operation, controlling the second intention identifier to move out of the first area and instructing the Y electronic devices to cancel execution of the second intention.
According to the second aspect, or any implementation manner of the second aspect above, the third operation is a click operation; the fourth operation, the fifth operation, the sixth operation, and the eighth operation are drag operations; and the seventh operation is a long-press operation.
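The operation semantics above (drag an intention identifier into the first area to have the group execute that intention; drag it out, or long-press the area, to cancel) can be illustrated with a minimal sketch. This is not the disclosed implementation; the class and method names below are hypothetical analogues of the fifth through eighth operations.

```python
# Hypothetical sketch of the operation-to-action mapping described above:
# dragging an intention identifier into the "first area" instructs every
# device in the group to execute that intention; dragging it out or
# long-pressing the area cancels execution.
class IntentionArea:
    def __init__(self, group_devices):
        self.devices = list(group_devices)   # devices in the first group
        self.active_intentions = set()       # intentions currently in the area

    def drag_in(self, intention):
        """Fifth-operation analogue: move an intention identifier into the
        first area and instruct each device to execute the intention."""
        self.active_intentions.add(intention)
        return [(d, "execute", intention) for d in self.devices]

    def drag_out(self, intention):
        """Eighth-operation analogue: move an intention identifier out of
        the area and instruct each device to cancel the intention."""
        self.active_intentions.discard(intention)
        return [(d, "cancel", intention) for d in self.devices]

    def long_press(self):
        """Seventh-operation analogue: clear the area, cancelling every
        active intention on every device."""
        cancelled = [(d, "cancel", i)
                     for i in sorted(self.active_intentions)
                     for d in self.devices]
        self.active_intentions.clear()
        return cancelled

area = IntentionArea(["desk lamp", "speaker"])
area.drag_in("lighting")
area.drag_in("audio")
print(area.long_press())
```

In this toy model the return values stand in for the control messages sent to the group; a real implementation would also render the identifier moving in the interface.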
For the technical effects corresponding to the second aspect and any implementation manner of the second aspect, reference may be made to the technical effects corresponding to the first aspect and any implementation manner of the first aspect; details are not repeated here.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device has a function of implementing the smart device control method described in the first aspect and any one of the possible implementation manners of the first aspect. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function described above.
For the technical effects corresponding to the third aspect and any implementation manner of the third aspect, reference may be made to the technical effects corresponding to the first aspect and any implementation manner of the first aspect; details are not repeated here.
In a fourth aspect, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program (which may also be referred to as instructions or code) that, when executed by an electronic device, causes the electronic device to perform the method of the first aspect or any implementation manner of the first aspect.
For the technical effects corresponding to the fourth aspect and any implementation manner of the fourth aspect, reference may be made to the technical effects corresponding to the first aspect and any implementation manner of the first aspect; details are not repeated here.
In a fifth aspect, an embodiment of the present application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of the first aspect or any implementation manner of the first aspect.
For the technical effects corresponding to the fifth aspect and any implementation manner of the fifth aspect, reference may be made to the technical effects corresponding to the first aspect and any implementation manner of the first aspect; details are not repeated here.
In a sixth aspect, an embodiment of the present application provides circuitry comprising processing circuitry configured to perform the method of the first aspect or any implementation manner of the first aspect.
For the technical effects corresponding to the sixth aspect and any implementation manner of the sixth aspect, reference may be made to the technical effects corresponding to the first aspect and any implementation manner of the first aspect; details are not repeated here.
In a seventh aspect, embodiments of the present application provide a chip system, including at least one processor and at least one interface circuit, where the at least one interface circuit is configured to perform a transceiver function and send an instruction to the at least one processor, and when the at least one processor executes the instruction, the at least one processor performs the method of the first aspect or any implementation manner of the first aspect.
For the technical effects corresponding to the seventh aspect and any implementation manner of the seventh aspect, reference may be made to the technical effects corresponding to the first aspect and any implementation manner of the first aspect; details are not repeated here.
Drawings
Fig. 1 is a schematic view of a home scenario provided in an embodiment of the present application;
Fig. 2 is a first schematic interface diagram provided in an embodiment of the present application;
Fig. 3 is a second schematic interface diagram provided in an embodiment of the present application;
Fig. 4A is a first schematic diagram of a system architecture provided in an embodiment of the present application;
Fig. 4B is a second schematic diagram of a system architecture provided in an embodiment of the present application;
Fig. 5A is a schematic diagram of a hardware structure of a first electronic device provided in an embodiment of the present application;
Fig. 5B is a schematic diagram of a hardware structure of a communication device provided in an embodiment of the present application;
Fig. 6 is a third schematic interface diagram provided in an embodiment of the present application;
Fig. 7 is a fourth schematic interface diagram provided in an embodiment of the present application;
Fig. 8 is a fifth schematic interface diagram provided in an embodiment of the present application;
Fig. 9 is a sixth schematic interface diagram provided in an embodiment of the present application;
Fig. 10 is a seventh schematic interface diagram provided in an embodiment of the present application;
Fig. 11 is an eighth schematic interface diagram provided in an embodiment of the present application;
Fig. 12 is a ninth schematic interface diagram provided in an embodiment of the present application;
Fig. 13A is a first schematic diagram of a group division scenario provided in an embodiment of the present application;
Fig. 13B is a tenth schematic interface diagram provided in an embodiment of the present application;
Fig. 14A is a second schematic diagram of a group division scenario provided in an embodiment of the present application;
Fig. 14B is a third schematic diagram of a group division scenario provided in an embodiment of the present application;
Fig. 14C is a fourth schematic diagram of a group division scenario provided in an embodiment of the present application;
Fig. 15 is an eleventh schematic interface diagram provided in an embodiment of the present application;
Fig. 16 is a twelfth schematic interface diagram provided in an embodiment of the present application;
Fig. 17 is a thirteenth schematic interface diagram provided in an embodiment of the present application;
Fig. 18 is a fourteenth schematic interface diagram provided in an embodiment of the present application;
Fig. 19 is a fifteenth schematic interface diagram provided in an embodiment of the present application;
Fig. 20A is a sixteenth schematic interface diagram provided in an embodiment of the present application;
Fig. 20B is a seventeenth schematic interface diagram provided in an embodiment of the present application;
Fig. 21 is an eighteenth schematic interface diagram provided in an embodiment of the present application;
Fig. 22 is a nineteenth schematic interface diagram provided in an embodiment of the present application;
Fig. 23 is a twentieth schematic interface diagram provided in an embodiment of the present application;
Fig. 24 is a twenty-first schematic interface diagram provided in an embodiment of the present application;
Fig. 25 is a twenty-second schematic interface diagram provided in an embodiment of the present application;
Fig. 26 is a twenty-third schematic interface diagram provided in an embodiment of the present application;
Fig. 27 is a twenty-fourth schematic interface diagram provided in an embodiment of the present application;
Fig. 28A is a twenty-fifth schematic interface diagram provided in an embodiment of the present application;
Fig. 28B is a twenty-sixth schematic interface diagram provided in an embodiment of the present application;
Fig. 29 is a twenty-seventh schematic interface diagram provided in an embodiment of the present application;
Fig. 30 is a twenty-eighth schematic interface diagram provided in an embodiment of the present application;
Fig. 31 is a twenty-ninth schematic interface diagram provided in an embodiment of the present application;
Fig. 32 is a flowchart of a smart device control method provided in an embodiment of the present application;
Fig. 33 is a schematic structural diagram of a first electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments of the present application, the terminology used in the embodiments below is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the embodiments below, "at least one" and "one or more" mean one, two, or more than two.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Various electronic devices have entered people's lives, and for the household use of these devices the concept of a smart home system has been proposed. Taking the house as a platform, a smart home system uses technologies such as the Internet of things and automatic control to organically combine the electronic devices and application subsystems related to home life. The electronic devices in a smart home system are, for example, smart home devices, that is, intelligent devices including audio and video devices (such as large screen devices and Bluetooth speakers), lighting devices (such as ceiling lamps, desk lamps, and spotlights), environment control devices (such as air conditioners and air purifiers), anti-theft alarm devices (such as human body sensors and cameras), and the like.
For example, assume that an air conditioner, as a smart home device, is connected with a mobile phone; the air conditioner can then receive control commands sent by the user through the mobile phone. For example, on receiving an "on" command input by the user through the mobile phone, the air conditioner starts automatically. For another example, on receiving a command input by the user through the mobile phone to adjust the temperature to 26 °C, the air conditioner automatically adjusts the temperature to 26 °C. Optionally, a smart home application (such as a smart life application) is installed in the electronic device (such as the mobile phone) and paired with the smart home device to manage and control the smart home device.
However, to control a smart home device in this way, the electronic device needs to establish a connection with the smart home device in advance and configure the smart home device.
Illustratively, assume that a smart life application is installed in a mobile phone. The mobile phone starts the smart life application and displays a scene interface 201 as shown in fig. 2 (a). After detecting that the user clicks the add control 21, the mobile phone displays an interface 202 as shown in fig. 2 (b) and receives the user's scene-creation operations, such as adding the condition for controlling the smart home device and the task the smart home device needs to execute. After detecting that the user clicks the add task control 22, the mobile phone displays an interface 203 as shown in fig. 2 (c). On the interface 203, after detecting that the user clicks the smart device control 23, the mobile phone determines that the user needs to add a task of controlling a smart device, and can display the controllable smart devices for the user to select. Assuming that the smart device selected by the user is an air conditioner, the mobile phone may display the execution task corresponding to the air conditioner selected by the user, as shown by reference numeral 24 in the interface 204 shown in fig. 2 (d). As indicated by reference numeral 25, the trigger condition for controlling the air conditioner defaults to "when clicking the scene card". After detecting that the user clicks the control shown by reference numeral 25, the mobile phone can modify the trigger condition for controlling the air conditioner according to the user's operation; alternatively, as shown in the interface 202 of fig. 2 (b), after detecting that the user clicks the add condition control 26, the mobile phone may also receive a trigger condition input by the user. Then, as shown in the interface 204 in fig. 2 (d), after detecting that the user clicks the confirmation control 27, the mobile phone confirms that the user's scene creation is complete. Thereafter, the mobile phone displays an interface 205 as shown in fig. 2 (e), for example prompting the user through the scene card 28 that the scene was created successfully.
It can be seen that, to control smart devices in this way, the scenes need to be orchestrated in advance. However, as described for fig. 2, the scene editing process is complicated and difficult for the user to operate. Moreover, if the user needs to add or delete some devices in a scene, another complex editing process is needed, which affects the user experience.
In other embodiments, the electronic device is configured with a super terminal function for implementing inter-device linkage and control. For example, through the super terminal function, the electronic device can establish connections with other nearby electronic devices (smart devices such as mobile phones, tablets, and speakers) to transmit data or send display content to the other electronic devices for display. However, the super terminal function can only realize linkage between the electronic device that starts the super terminal and other electronic devices; it cannot realize linkage between arbitrary electronic devices.
Illustratively, assume that the mobile phone is configured with the super terminal function. As shown in interface 301 of fig. 3 (a), the mobile phone displays a drop-down menu; after detecting that the user clicks the connection control 31, the mobile phone determines that the user instructs it to establish connections with nearby electronic devices. The mobile phone displays an interface 302 as shown in fig. 3 (b), on which the connectable electronic devices near the mobile phone are displayed. After detecting that the user drags the icon 33 to the mobile phone icon 32, the mobile phone determines that the smart screen corresponding to the icon 33 needs to be connected with the mobile phone. As shown in the interface 303 of fig. 3 (c), the mobile phone prompts the user, via the icon shown by reference numeral 34, that the smart screen is establishing a connection with the mobile phone. After the connection is established successfully, the mobile phone can, in response to a user operation, send the video being played to the smart screen for playing. If the user then needs another electronic device to establish a connection with the mobile phone, the operation is repeated. For example, as shown in interface 303 in fig. 3 (c), after detecting that the user drags the icon 35 to the mobile phone icon 32, the mobile phone determines that the smart screen corresponding to the icon 35 needs to be connected with the mobile phone. As shown in interface 304 of fig. 3 (d), the mobile phone prompts the user, via the icon shown by reference numeral 36, that this smart screen is establishing a connection with the mobile phone.
It can be seen that, in the super terminal scenario, the electronic device configured with the super terminal function acts as a central device: each electronic device can only establish a connection with this central device for linkage, and the devices cannot establish connections with each other directly, which limits other application scenarios. Moreover, if the mobile phone needs to connect with multiple electronic devices, the user has to drag them one by one, which is cumbersome. Correspondingly, the linkage between the mobile phone and other electronic devices still needs to be configured by the user one-to-one, so the operation is difficult for the user and affects the user experience.
Therefore, an embodiment of the present application provides a smart device control method that divides electronic devices into groups. Based on the grouping result, group connection and control are realized, which effectively reduces the operation difficulty for the user and improves the user experience.
Fig. 4A is a schematic diagram of a system architecture according to an embodiment of the present application. As shown in fig. 4A, the system architecture includes a first electronic device 100, a server 200, and a control device 300.
Optionally, the first electronic device 100 may be, for example, a speaker 101, a large screen device 102, a desk lamp 103, an electric lamp 104, a camera 105, an air purifier 106, a mobile phone, a tablet computer, a personal computer (PC), a personal digital assistant (PDA), a netbook, a wearable electronic device, or an artificial intelligence (AI) terminal; the specific form of the electronic device is not limited in this application. The first electronic device 100 may run any of various operating systems, or may not be equipped with an operating system. In some embodiments, the first electronic device 100 may be a fixed device or a portable device. The specific type of the first electronic device 100, whether an operating system is installed, and, if so, which operating system is installed, are not limited in the present application.
In some embodiments, the first electronic devices 100 may be smart home devices, and may be connected to each other to form a smart home system. The first electronic devices 100 can establish connections with the server 200, which manages each of them. For example, the server 200 manages, in units of home, one or more first electronic devices 100 included in one or more home units (homes); during the network configuration in which a first electronic device 100 requests to join the smart home system, the first electronic device 100 is added to the corresponding home.
Alternatively, the server 200 may be a device with computing capability, such as a cloud server, or a network device such as a network server. The server 200 may be a single server, a server cluster formed by multiple servers, or a cloud computing service center. The server 200 may also be described as a smart home cloud platform for managing the smart home devices and the like included in a smart home system.
Optionally, as shown in fig. 4A, a control device 300 may also be included in the system architecture. The control device 300 may be connected to one or more first electronic devices 100 for managing and controlling the first electronic devices 100.
Alternatively, the control device 300 may be a dedicated device for controlling smart home devices, or a device that includes the functionality of controlling smart home devices. For example, the control device 300 may be a smart home control panel 302, or a terminal device such as a mobile phone 301, a tablet, a smart speaker, or a smart watch. The smart home control panel 302 is a dedicated device for controlling the smart home devices in the smart home system. In some embodiments, the control device 300 may be a stationary device or a portable device. The specific form of the control device 300 is not particularly limited in this application.
In some embodiments, the control device 300 is connected to one or more first electronic devices 100 to obtain device information of the first electronic devices 100. The control device 300 provides a man-machine interaction interface, displays device information of the first electronic device 100 for a user through the man-machine interaction interface, and receives a control command of the user on the first electronic device 100.
In some embodiments, the control device 300 has a first application installed. The first application is a smart home application that can connect to smart home devices and edit and manage them. As shown in fig. 4A, the control device 300 connects to one or more first electronic devices 100 through the first application. Optionally, the first application is a smart life application.
In some embodiments, after starting the first application, the control device 300 detects the user's add-device operation, searches for nearby first electronic devices 100, and performs network configuration on the devices found. During network configuration, the control device 300 sends the network information of the local area network (such as the network name and password) to the first electronic device 100, helping it join the same local area network as the control device 300 so that the first electronic device 100 can establish a wireless communication connection with the control device 300. The control device 300 also sends the device information of the first electronic device 100 to the server 200, so that the server 200 adds the first electronic device 100 to the corresponding home and assigns it a device identity (identity document, ID). The server 200 can then uniformly manage the first electronic devices 100 included in the home.
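The provisioning flow just described (control device pushes the local-network credentials to the device, then registers it with the server, which adds it to a home and assigns a device ID) might be sketched as follows. This is a minimal toy model, not the disclosed implementation; all class names and the ID format are hypothetical.

```python
import itertools


class Server:
    """Toy stand-in for the smart home cloud platform (server 200)."""

    def __init__(self):
        self._ids = itertools.count(1)
        self.homes = {}  # home name -> {device_id: device_name}

    def register(self, home, device_name):
        """Add a device to a home and assign it a device ID."""
        device_id = f"dev-{next(self._ids)}"
        self.homes.setdefault(home, {})[device_id] = device_name
        return device_id


class Device:
    """Toy stand-in for a first electronic device 100 being provisioned."""

    def __init__(self, name):
        self.name = name
        self.network = None
        self.device_id = None

    def join_network(self, ssid, password):
        # A real device would authenticate with the password; the toy
        # model only records which network it joined.
        self.network = ssid
        return True


def provision(home, server, device, ssid, password):
    """Control-device-side flow: push LAN credentials to the device,
    then register it with the server so it joins the corresponding home."""
    if device.join_network(ssid, password):
        device.device_id = server.register(home, device.name)
    return device.device_id
```

Once provisioned this way, the server holds a per-home registry, which is what later allows any capable device to query the other members of its home.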
For example, the control device 300 receives a command input by the user to start a certain first electronic device and forwards the command to the server 200 for processing. The server 200 parses the command, determines the target first electronic device, and issues the command to the target first electronic device to start it.
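The command path in this example (the control device forwards the user's command to the server, which parses it, determines the target device, and issues the command to it) can be sketched as below. The parsing scheme and names are illustrative assumptions, not the format actually used by the disclosed system.

```python
class TargetDevice:
    """Toy first electronic device that reacts to issued commands."""

    def __init__(self, name):
        self.name = name
        self.on = False

    def handle(self, action):
        if action == "start":
            self.on = True
        elif action == "stop":
            self.on = False


class CloudRouter:
    """Toy server 200: parse a forwarded command and issue it to the target."""

    def __init__(self, devices):
        self.devices = {d.name: d for d in devices}

    def forward(self, command):
        # Assumed command format: "<action> <target name>",
        # e.g. "start air conditioner".
        action, _, target = command.partition(" ")
        device = self.devices.get(target)
        if device is None:
            return False  # unknown target: nothing issued
        device.handle(action)
        return True
```

A real cloud platform would authenticate the control device and use device IDs rather than free-text names; the sketch only shows the forward-parse-issue structure of the paragraph above.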
It should be noted that the above example describes the process of adding a first electronic device 100 to the corresponding home by taking as an example the case where the control device 300 triggers the joining while running the first application. It is understood that a first electronic device 100 may also be triggered to join the home in other ways. For example, if the control device 300 does not start the first application after being powered on, it may automatically search for first electronic devices 100 that have not joined the local area network and/or the home and, according to user operations, connect some or all of them to the local area network and add them to the home. The embodiment of the present application is not particularly limited in this respect.
In some embodiments, as shown in fig. 4B, the system architecture may not include the control device 300. The first electronic devices 100 join the home managed by the server 200, and the server 200 can directly acquire the information of all the first electronic devices 100 in the home. Any first electronic device 100 with processing capability can then send a request to the server 200 as needed to obtain the information of the other first electronic devices 100. That device then acts as a master device and can be used to control the other first electronic devices 100.
For example, suppose the first electronic devices 100 include a range hood, which can search for nearby devices of the same category according to a user operation. The range hood can send a device search request to the server 200 to acquire the information of all devices in the home where it is located, and then determine the same-category devices among them; or the range hood can determine the same-category devices among the devices it finds within a preset range, according to the information of those devices and the device information acquired from the server 200. The range hood then establishes a group with the same-category devices (e.g., group A) and reports the group information to the server 200. Thereafter, the user can control the other devices in group A through any one device in group A, for example starting all devices in group A simultaneously; or the user can control the devices in other groups through any device in the home, for example acquiring the information of other devices in the home through device A (such as the range hood) in group A, and then controlling device B (a device in group B) to start through device A, or controlling all devices in group B to start through device A. In this way, the user can configure and control a set of devices as a group without configuring and operating each device one by one, which reduces the operation difficulty for the user. A device category represents a consumer demand; categories include kitchen appliances, audio-visual appliances, lighting appliances, and the like.
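The range hood example above (search nearby devices, keep those of the same category, form a group, then control the whole group through any one member) can be sketched as follows. The names and category labels are illustrative; the group-reporting step to the server is elided.

```python
class SmartDevice:
    """Toy smart home device with a category attribute used for grouping."""

    def __init__(self, name, category):
        self.name = name
        self.category = category
        self.on = False


def build_group(initiator, nearby_devices):
    """Group the initiator with nearby devices of the same category,
    as the range hood does with other kitchen appliances."""
    return [initiator] + [d for d in nearby_devices
                          if d.category == initiator.category]


def control_group(group, action):
    """Control every device in the group at once, e.g. through any
    one member; 'start' switches all of them on."""
    for device in group:
        device.on = (action == "start")


hood = SmartDevice("range hood", "kitchen")
nearby = [SmartDevice("oven", "kitchen"),
          SmartDevice("ceiling lamp", "lighting")]
group_a = build_group(hood, nearby)      # the ceiling lamp is excluded
control_group(group_a, "start")          # one operation starts the group
```

This illustrates the key point of the paragraph: after grouping, a single user operation fans out to every device in the group, instead of requiring one configuration step per device.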
By way of example, fig. 5A shows a schematic structural diagram of the first electronic device 100.
The first electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to a touch sensor, a charger, a flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor through an I2C interface, such that the processor 110 communicates with the touch sensor through the I2C bus interface to implement the touch function of the first electronic device 100.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing function of first electronic device 100. The processor 110 and the display 194 communicate via the DSI interface to implement the display functionality of the first electronic device 100.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the first electronic device 100, or to transfer data between the first electronic device 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the first electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the first electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the first electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on the first electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device or displays images or video through a display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied on the first electronic device 100, including wireless local area network (wireless local area networks, WLAN) (e.g., a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC) technology, infrared (IR) technology, etc. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signal, and sends the processed signal to the processor 110. The wireless communication module 160 may also receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of first electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that first electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
In some embodiments, the first electronic device 100 communicates with the server 200, the control device 300, or other first electronic devices 100 through the mobile communication module 150 or the wireless communication module 160, to achieve uniform control of the first electronic devices 100 within the group.
For example, if the device 1, the device 2 and the device 3 are divided into the same group, any one of the device 1, the device 2 and the device 3 may control the three devices simultaneously through interaction with the server 200. Alternatively, the control device 300 may realize simultaneous control of the device 1, the device 2, and the device 3 through interaction with the server 200.
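For illustration only, the group fan-out described above can be sketched as follows; the class and method names in this sketch are assumptions of the illustration and do not correspond to any concrete implementation in the embodiments:

```python
# Minimal sketch of group control: the server keeps group membership and
# fans a command addressed to a group out to every device in that group.
class Server:
    def __init__(self):
        self.groups = {}  # group name -> list of device identifiers

    def add_to_group(self, group, device_id):
        self.groups.setdefault(group, []).append(device_id)

    def control_group(self, group, command):
        # Relay the command to each device in the group; here we simply
        # return the (device, command) pairs that would be sent.
        return [(device_id, command) for device_id in self.groups.get(group, [])]


server = Server()
for dev in ("device1", "device2", "device3"):
    server.add_to_group("kitchen", dev)

# Any group member, or the control device, asks the server to act on the
# whole group at once.
sent = server.control_group("kitchen", "power_on")
```

The point of the sketch is only that one request from any member (or from the control device 300) reaches all devices in the group via the server.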
The first electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may use a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the first electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In some embodiments, the first electronic device 100 may obtain full-house device information from the server 200 and display it on the display screen 194, so that the user can view the status of devices throughout the house. Further, based on the full-house device status, the user can control the devices in a group in a unified manner on the display screen 194, with the group as the unit of control.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the first electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the first electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the first electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the first electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110. The first electronic device 100 may play, record, etc. music through the audio module 170. The audio module 170 may include a speaker, a receiver, a microphone, a headphone interface, an application processor, etc. to implement audio functions.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
The pressure sensor is used for sensing a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be provided on the display screen 194. There are many kinds of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. When a force is applied to the pressure sensor, the capacitance between the electrodes changes, and the first electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation is applied to the display screen, the first electronic device 100 detects the touch operation intensity according to the pressure sensor. The first electronic device 100 may also calculate the position of the touch according to the detection signal of the pressure sensor. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
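The pressure-threshold dispatch in the short-message example can be sketched as follows; the threshold value and the function name are illustrative assumptions, not part of the embodiments:

```python
# Sketch: the same touch location maps to different operation
# instructions depending on the touch operation intensity.
FIRST_PRESSURE_THRESHOLD = 0.5  # illustrative value only

def handle_touch_on_sms_icon(intensity):
    # A light press views short messages; a firm press creates a new one.
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"
    return "create_sms"
```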
The touch sensor is also referred to as a "touch device". The touch sensor may be disposed on the display screen 194, and the touch sensor and the display screen 194 form a touch screen, also referred to as a "touchscreen". The touch sensor is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor may also be disposed on the surface of the first electronic device 100 at a location different from that of the display 194.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The first electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the first electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in battery level, a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to enable contact and separation with the first electronic device 100. The first electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1.
Alternatively, the server 200 and the control device 300 in the embodiments of the present application may be implemented by different devices. For example, the server 200 and the control device 300 in the embodiment of the present application may be implemented by the communication device in fig. 5B. Fig. 5B is a schematic hardware structure of a communication device according to an embodiment of the present application. The communication device comprises at least one processor 501, communication lines 502, a memory 503 and at least one communication interface 504. Wherein the memory 503 may also be included in the processor 501.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the communication device. In other embodiments of the present application, the communication device may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. For example, if the communication device is the control device 300 and the control device 300 is a mobile phone, the control device 300 may further be configured with modules such as a SIM card interface, a camera, and an audio module.
The processor 501 may be a general purpose central processing unit (central processing unit, CPU), microprocessor, application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present application.
Communication line 502 may include a pathway to transfer information between the aforementioned components.
A communication interface 504 for communicating with other devices. In the embodiment of the application, the communication interface may be a module, a circuit, a bus, an interface, a transceiver, or other devices capable of implementing a communication function, for communicating with other devices. Alternatively, when the communication interface is a transceiver, the transceiver may be a separately provided transmitter that is operable to transmit information to other devices, or a separately provided receiver that is operable to receive information from other devices. The transceiver may also be a component that integrates the functions of transmitting and receiving information, and the embodiments of the present application do not limit the specific implementation of the transceiver.
The memory 503 may be, but is not limited to, a read-only memory (read-only memory, ROM) or other type of static storage device that can store static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via the communication line 502. The memory may also be integrated with the processor.
The memory 503 is used to store computer-executable instructions for implementing the embodiments of the present application, and execution is controlled by the processor 501. The processor 501 is configured to execute the computer-executable instructions stored in the memory 503, thereby implementing the smart device control method provided in the following embodiments of the present application.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application code, instructions, computer programs, or other names, and the embodiments of the present application are not limited in detail.
In a particular implementation, as one embodiment, processor 501 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 5B.
In a particular implementation, as one embodiment, the communication device may include multiple processors, such as processor 501 and processor 507 in FIG. 5B. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In a specific implementation, the communication device may further include an output device 505 and an input device 506, as one embodiment. The output device 505 communicates with the processor 501 and may display information in a variety of ways. For example, the output device 505 may be a liquid crystal display (liquid crystal display, LCD), a light emitting diode (light emitting diode, LED) display device, a Cathode Ray Tube (CRT) display device, or a projector (projector), or the like. The input device 506 is in communication with the processor 501 and may receive user input in a variety of ways. For example, the input device 506 may be a mouse, a keyboard, a touch screen device, a sensing device, or the like.
The communication device may be a general purpose device or a special purpose device, and the embodiments of the present application are not limited to the type of communication device. For example, the communication device is an intelligent home control panel, and is a special device for controlling the intelligent home device. For another example, the communication device is a mobile phone, and is a general device capable of controlling the smart home device.
The following describes the smart device control method provided in the embodiments of the present application, taking as an example that the control device 300 is a mobile phone or a smart home control panel, and that the application for managing smart home devices is a smart life application.
In some embodiments, after an electronic device accesses a local area network, it can discover other electronic devices that access the same local area network and/or log in to the same account, and the electronic devices can be grouped. Subsequently, with the divided groups as units, group control can be implemented according to user requirements. The account refers to the smart home device management system account that the electronic device registers with and logs in to on the server.
For example, a user registers with a smart life application and obtains a user name and password for an account. Subsequently, during the network configuration of a new electronic device, the user logs in to the account through another electronic device (such as a mobile phone) that is already connected to the network, so as to assist the new electronic device in connecting to the network. The server then divides the electronic devices under the same account into the same home, so as to manage the electronic devices with the home as a unit. Optionally, the server manages one or more homes, each including one or more groups. A home is a large group that includes all the electronic devices added by the user.
Specifically, taking a mobile phone as an example, after the mobile phone logs in to the smart life application and detects the user's operation of adding an electronic device, it sends the device information of the newly added electronic device to the server. The server determines an electronic device ID and divides the electronic device into the home corresponding to the account currently logged in on the mobile phone, completing the network configuration of the electronic device. Alternatively, the smart home control panel, in response to a user operation, sends the device information of the newly added electronic device to the server; the server determines the electronic device ID and divides the electronic device into the home corresponding to the smart home control panel. Optionally, the smart home control panel may log in to multiple accounts, in which case the server divides homes according to the accounts.
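For illustration, the server-side step of determining a device ID and dividing the device into the home bound to the logged-in account might look like the following sketch; the class name, fields, and sequential ID scheme are assumptions of the illustration:

```python
import itertools

# Sketch of network configuration on the server: each newly added device
# receives an ID and is divided into the home of the current account.
class HomeServer:
    def __init__(self):
        self._next_id = itertools.count(1)  # simple sequential device IDs
        self.homes = {}  # account -> list of device IDs in that home

    def add_device(self, account, device_info):
        device_id = next(self._next_id)
        self.homes.setdefault(account, []).append(device_id)
        return device_id


server = HomeServer()
oven_id = server.add_device("account_a", {"category": "oven"})
steamer_id = server.add_device("account_a", {"category": "steamer"})
```

Both devices end up in the home of `account_a`, which is what later enables home-level and group-level management.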
The mobile phone can perform connection management of electronic devices through the super terminal function. As shown in interface 601 of fig. 6 (a), the mobile phone displays a drop-down menu in which the super terminal card 61 is displayed. After the mobile phone detects the user's operation of clicking the connection control 62 displayed on the super terminal card 61, it determines that the user needs to perform connection management on nearby devices, and displays an interface 602 shown in fig. 6 (b). On the interface 602, icons of electronic devices found by the mobile phone that can be group-connected are displayed. For example, the center icon displayed on the interface 602 is a mobile phone icon 63, and icons of nearby electronic devices, such as a dishwasher icon 64, a steamer icon 65, and an oven icon 66, are displayed around the mobile phone icon according to the positional relationship between each electronic device and the mobile phone.
As another example, the mobile phone may perform connection management of electronic devices through a smart life application. As shown in interface 701 of fig. 7 (a), the mobile phone displays the smart life application. After detecting the user's operation of clicking the add control 71, the mobile phone displays an interface 702 as shown in fig. 7 (b). Upon detecting the user's operation of clicking the add device control 73 displayed in the menu bar 72, the mobile phone determines that the user needs to add an electronic device to the smart life application, and displays an interface 703 as shown in fig. 7 (c). Electronic devices near the mobile phone that can be group-connected are displayed on the interface 703.
A specific implementation method for establishing a group connection is described as follows. For example, while the control device 300 displays the first electronic devices 100 capable of group connection, it establishes a group connection with the first electronic devices 100 selected by the user according to a user operation. Alternatively, the group connection is established according to the category attribute and/or the location attribute of the first electronic devices 100. Or, using the NFC function of the first electronic device 100, a group connection of a plurality of first electronic devices 100 is established through the user's "bump" (tap-to-connect) operation. The respective methods are described in detail below.
In some embodiments, the mobile phone receives a user operation while displaying the electronic devices capable of group connection, and determines the grouping result of the electronic devices according to the user operation. The mobile phone can determine the electronic devices capable of group connection by searching for devices, or can obtain, through interaction with the server, the device list corresponding to the home maintained on the server and determine from it the electronic devices capable of group connection. The device list includes information such as the room and the category of each device.
Illustratively, as shown in interface 801 in fig. 8 (a), the mobile phone displays icons of electronic devices located on the same local area network and/or logged in to the same account, and the user may divide some or all of the devices into one or more groups. Generally, a user may divide electronic devices located in a certain area into one group, or may divide electronic devices of the same category into one group, so as to facilitate group control. For example, if the electronic devices in the same room are divided into a group, the user may activate all of the electronic devices in the room at the same time, such as turning on all of the lights in the bedroom. For another example, if the electronic devices of the same category are divided into a group, the user may activate all the electronic devices of that category at the same time, such as turning on all the lighting devices in the home. Then, for convenience of user operation, icons of the electronic devices, clustered by geographical position, are displayed near the mobile phone icon 81 as shown in the interface 801. For example, the projector, the sound box, and the television are all located in the living room, and their corresponding icons can be displayed in an aggregated manner. For another example, the dishwasher, the steam box, and the oven are all located in the kitchen, and their corresponding icons can be displayed in an aggregated manner. Alternatively, as shown in the interface 801, icons of the electronic devices, clustered by device category, are displayed near the mobile phone icon 81.
For example, the projector, the sound box and the television belong to video-audio electronic equipment, and can gather and display corresponding icons; the dish washer, the steam box and the oven belong to kitchen electric electronic equipment, and can gather and display corresponding icons; the intelligent door lock belongs to safety electronic equipment and displays corresponding icons independently.
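The clustered display described above amounts to grouping devices by a shared attribute, either room or category. A minimal sketch follows; the device records and field names are assumptions of the illustration:

```python
from collections import defaultdict

# Sketch: cluster device icons for display either by room (location
# attribute) or by device category (category attribute).
def cluster_devices(devices, key):
    clusters = defaultdict(list)
    for device in devices:
        clusters[device[key]].append(device["name"])
    return dict(clusters)


devices = [
    {"name": "projector", "room": "living room", "category": "audio-video"},
    {"name": "sound box", "room": "living room", "category": "audio-video"},
    {"name": "dishwasher", "room": "kitchen", "category": "kitchen appliance"},
    {"name": "steam box", "room": "kitchen", "category": "kitchen appliance"},
    {"name": "oven", "room": "kitchen", "category": "kitchen appliance"},
    {"name": "smart door lock", "room": "entrance", "category": "security"},
]

by_room = cluster_devices(devices, "room")
by_category = cluster_devices(devices, "category")
```

The same helper serves both clustering modes; only the attribute passed as `key` changes, which mirrors the two display options on interface 801.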
Then, as shown in an interface 802 in fig. 8 (b), the mobile phone detects the user's selection operation on the display screen and divides the electronic devices selected by the user into one group. For example, as shown in interface 802, the mobile phone detects the user's sliding operation across a plurality of electronic device icons on the display screen, or detects the user's consecutive clicks on a plurality of electronic device icons. The mobile phone then displays an interface 803 as shown in fig. 8 (c); on the interface 803, as shown by reference numeral 82, the mobile phone highlights the icons of the electronic devices selected by the user to prompt the user to confirm whether the selection is correct. For example, when the mobile phone detects that the user long-presses a highlighted icon (such as the dishwasher, the steam box, or the oven), that icon can be deselected. For another example, the mobile phone displays a prompt message to ask the user to confirm whether the device selection result is correct. For another example, upon detecting the user's operation of clicking a blank area, the mobile phone determines that the selection of all icons is canceled.
Thereafter, the mobile phone displays an interface 804 as shown in fig. 8 (d), and as shown by reference numeral 83, the selected electronic device icons move close to the mobile phone icon 81 to prompt the user that a group is being established, and the established group information may be synchronized to the server. The manner in which an icon is highlighted includes, for example, darkening, changing color, flashing, etc., which is not particularly limited in the embodiments of the present application.
Illustratively, the mobile phone establishes a group including the dishwasher, the steam oven and the oven through the process shown in fig. 8, and may acquire the device information from the server, such as the 3 devices belonging to the kitchen group. Then, as shown in interface 901 of fig. 9, the mobile phone starts the smart life application in response to a user operation and displays the spaces managed by the smart life application; as shown by reference numeral 91, the 3 devices that the user has added are displayed in the kitchen card.
Therefore, the user can add devices in batches through a simple operation, which reduces the operation difficulty for the user and improves the user experience.
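The batch-grouping flow above can be sketched as a small data model: the selected device icons become a group record that is then synchronized to the server. This is only an illustrative sketch; the class names, field names and the dictionary standing in for the server are all hypothetical and not part of the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class Device:
    device_id: str
    name: str
    category: str   # e.g. "kitchen appliance"
    location: str   # e.g. "kitchen"

@dataclass
class Group:
    name: str
    device_ids: list

def create_group(name, selected_devices):
    """Build a group record from the devices the user selected on screen."""
    return Group(name=name, device_ids=[d.device_id for d in selected_devices])

def sync_to_server(group, server_store):
    """Synchronize the established group information to the (mock) server."""
    server_store[group.name] = group.device_ids

# The user slides across the dishwasher, steam oven and oven icons:
selected = [
    Device("dev-01", "dishwasher", "kitchen appliance", "kitchen"),
    Device("dev-02", "steam oven", "kitchen appliance", "kitchen"),
    Device("dev-03", "oven", "kitchen appliance", "kitchen"),
]
server = {}
kitchen = create_group("kitchen", selected)
sync_to_server(kitchen, server)
print(server["kitchen"])  # ['dev-01', 'dev-02', 'dev-03']
```

Once the group record reaches the server, any client (such as the smart life application) can fetch it and render the kitchen card with its 3 devices.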
In other embodiments, the user may also establish the group of electronic devices through the smart home control panel.
Illustratively, as shown in interface 1001 of fig. 10, after the smart home control panel detects an operation of the user clicking the connect-new-device control 101, it determines that the user needs to connect a new device. The smart home control panel may then search for nearby electronic devices, or obtain a device list corresponding to the home from the server. Thereafter, the smart home control panel displays an interface 1101 as shown in fig. 11, and receives a grouping operation of the user on the interface 1101. The center icon displayed on the interface 1101 is the smart home control panel icon 111, that is, the smart home control panel serves as the central device. Optionally, the smart home control panel may also determine, according to a user operation, that an electronic device selected by the user is the central device.
In other embodiments, the user may also establish the group of electronic devices through any electronic device having a display screen and group processing capabilities. Optionally, the electronic device used to establish the group serves as the master control device. For example, if the master control device has an explicit category attribute and/or location attribute, nearby electronic devices with the same category attribute and/or location attribute can be automatically identified, and the user is prompted to establish the group. The category attribute is used to represent the category of the electronic device. The location attribute is used to represent the space where the electronic device is located, such as dividing the space at room granularity.
Illustratively, assume that the master control device is a range hood. The range hood detects an operation of the user establishing a group on its display screen, and starts the group connection function. Thereafter, the range hood displays an interface 1201 as shown in fig. 12 (a); the center icon displayed is the range hood icon 121, and icons of electronic devices located on the same local area network and/or logged into the same account are displayed around it, wherein the icons of electronic devices having the same category attribute and/or location attribute as the range hood are highlighted. For example, the range hood may acquire a device information list from the server, determine the category attribute and/or location attribute of each electronic device, and then highlight the icons of the electronic devices with the same category attribute and/or location attribute. As shown in the interface 1201, the range hood determines that it is itself a kitchen appliance, and may determine to highlight the icon 122 of the steam oven, the icon 123 of the oven, and the icon 124 of the dishwasher, which are also kitchen appliances. Alternatively, as shown in interface 1201, the range hood determines that it is itself located in the kitchen, and may determine to highlight the icons of the electronic devices that are also in the kitchen, such as the steam oven icon 122, the oven icon 123, and the dishwasher icon 124.
Then, the range hood detects an operation of the user dragging an icon to the center icon, and establishes a group relationship between the electronic device corresponding to that icon and the range hood. For example, as shown in interface 1202 of fig. 12 (b), the range hood detects that the user drags the icon 122 of the steam oven, the icon 123 of the oven, and the icon 124 of the dishwasher to the range hood icon 121, and after detecting that the user clicks the confirm control 125, determines that the user establishes a group including the range hood, the steam oven, the oven, and the dishwasher. Alternatively, the range hood automatically gathers and displays, around the range hood icon, the icons of all electronic devices having the same category attribute and/or location attribute as the range hood, and when it detects an operation of the user dragging an icon away from the range hood icon, determines that the user does not want to add the corresponding electronic device to the group, thereby reducing the number of dragging operations required of the user.
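The attribute-matching step the range hood performs can be sketched as a simple filter over the device list obtained from the server: candidates sharing the master control device's category attribute and/or location attribute are the ones whose icons get highlighted or pre-gathered. The device records and identifiers below are hypothetical, used only to illustrate the matching rule.

```python
def suggest_group_members(master, candidates):
    """Return devices sharing the master control device's category attribute
    and/or location attribute; in the UI their icons would be highlighted
    or automatically gathered around the center icon."""
    return [d for d in candidates
            if d["category"] == master["category"] or d["location"] == master["location"]]

hood = {"id": "hood-1", "category": "kitchen appliance", "location": "kitchen"}
nearby = [
    {"id": "steam-1", "category": "kitchen appliance", "location": "kitchen"},
    {"id": "oven-1",  "category": "kitchen appliance", "location": "kitchen"},
    {"id": "tv-1",    "category": "audio-visual",      "location": "living room"},
]
print([d["id"] for d in suggest_group_members(hood, nearby)])  # ['steam-1', 'oven-1']
```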
Alternatively, the master control device (e.g., the range hood) may send the group information to the server after the group is established. Then, after the mobile phone starts the smart life application in response to a user operation, the mobile phone can obtain the latest device information from the server, display the latest group division result, and enable the user's group control of the electronic devices.
In other embodiments, the user may also establish the group of electronic devices through a tap operation using a near-field connection function of the mobile phone (such as the NFC function).
For example, as shown in fig. 13A, assuming that device A is configured with an NFC tag, when the user needs to establish a group including device A, the user may tap the NFC tag of device A using the NFC function of the mobile phone, and the mobile phone may then generate broadcast information for searching, within a preset distance, for electronic devices having the same category attribute as device A. As shown in fig. 13A, it is assumed that device B and device C are located within the preset distance of device A and belong to the same category as device A (e.g., all are office devices).
Then, as shown in interface 1301 of fig. 13B, the group connection card 131 is displayed on the interface 1301. As shown in interface 1301, icons of the electronic devices that the mobile phone detected as meeting the requirements are displayed on the group connection card 131. And, as indicated by reference numeral 132, the names of these electronic devices and the name of the group are displayed. For example, the mobile phone finds device B and device C; the names of device A, device B and device C are displayed on the group connection card 131, and the corresponding group name is office devices (for example, the group is automatically named after the device category). The mobile phone may receive an operation of the user on the group connection card 131, such as an operation of deleting the icons of some of the electronic devices, an operation of modifying the group name, and the like. If the mobile phone detects that the user deletes the icons of some electronic devices, it may determine that the user does not want to add the corresponding electronic devices to the group including device A. Thereafter, after detecting an operation of the user clicking the group connection control 133 displayed on the group connection card 131, the mobile phone determines to establish a group including device A, device B, and device C.
Optionally, after the mobile phone establishes the group, the group information may be sent to the server. Then, when the smart life application is started in response to a user operation, the smart life application can obtain the latest device information from the server, display the latest group division result, and enable the user's group control of the electronic devices.
Also illustratively, as shown in fig. 14A, the mobile phone detects that the user has tapped the NFC tag of device A via the NFC function, determines that the user needs to establish a group including device A, and displays an interface 1401. On the interface 1401, the group connection card 141 of device A is displayed, and an icon of device A is displayed on the group connection card 141.
Then, as shown in fig. 14B, within a preset time (e.g., 20 seconds), the mobile phone detects that the user taps the NFC tag of device B through the NFC function, determines that the user needs to add device B to the group including device A, and displays the interface 1402. As shown by reference numeral 142, the icon of device A and the icon of the newly added device B are displayed on the group connection card 141 of device A, prompting the user that device B has been added.
Thereafter, as shown in fig. 14C, within a preset time (e.g., 20 seconds) after device B is added, the mobile phone detects that the user taps the NFC tag of device C through the NFC function again, determines that the user needs to add device C to the group including device A and device B, and displays the interface 1403. As shown by reference numeral 143, the icon of device A, the icon of device B, and the icon of the newly added device C are displayed on the group connection card 141 of device A, prompting the user that device C has been added.
As shown in fig. 14C, after detecting an operation of the user clicking the group connection control 144 displayed on the group connection card 141, the mobile phone determines that the user has finished adding devices to the group, and may establish device A, device B, and device C, which have been confirmed for addition, as one group. Alternatively, if the mobile phone does not detect an operation of the user adding a new device within the preset time (such as an operation of tapping a device's NFC tag via the NFC function), the devices that have been added are automatically established as one group.
Optionally, after the mobile phone establishes the group, the group information may be sent to the server. Then, when the smart life application is started in response to a user operation, the smart life application can obtain the latest device information from the server, display the latest group division result, and enable the user's group control of the electronic devices.
It should be noted that the scenarios described above in fig. 13A to 14C describe the group establishment process taking the tap operation of the NFC function as an example. It can be understood that the group may also be established through other functions, such as the shake function of the mobile phone, or by entering, on the mobile phone, the device serial number of the electronic device to be added.
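The timeout-driven NFC grouping of figs. 14A to 14C can be sketched as a small session object: each tap adds a device and restarts the preset window, and when the window elapses with no further tap the collected devices are established as one group. The class, its method names, and the 20-second window value (taken from the example above) are illustrative assumptions, not the disclosed implementation.

```python
class NfcGroupSession:
    """Timeout-driven grouping: each NFC tap adds a device and restarts
    the preset window; once the window elapses with no new tap, the
    collected devices are automatically established as one group."""

    def __init__(self, window_seconds=20):
        self.window = window_seconds
        self.members = []
        self.last_tap = None   # timestamp of the most recent tap

    def tap(self, device_id, now):
        """Register an NFC tap at time `now` (seconds)."""
        if self.last_tap is not None and now - self.last_tap > self.window:
            raise RuntimeError("grouping window already closed")
        self.members.append(device_id)
        self.last_tap = now

    def finalize(self, now):
        """Return the group if the preset time has elapsed, else None."""
        if self.last_tap is not None and now - self.last_tap >= self.window:
            return list(self.members)
        return None

session = NfcGroupSession(window_seconds=20)
session.tap("device-A", now=0)
session.tap("device-B", now=10)   # within 20 s of the previous tap
session.tap("device-C", now=25)   # within 20 s of adding device B
print(session.finalize(now=50))   # ['device-A', 'device-B', 'device-C']
```

Clicking the group connection control 144 would correspond to calling `finalize` immediately instead of waiting for the window to elapse.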
The above describes how to establish a group connection of the first electronic device 100; the following describes how to implement control over a group according to the intention of the user. In the smart home application scenario, an intention is used to express what the user desires, and may include, for example: turning lights on/off, music playing, air purification, curtain control, temperature control, etc.
The group control method is described below by taking a room as the group unit as an example. For example, the smart home devices are divided into devices in the primary bedroom, devices in the secondary bedroom, devices in the living room, devices in the study, devices in the children's room, and the like. It can be understood that, as described in the group connection section above, groups of other granularity, such as a group of office devices, may be established according to user operations, and the control methods for those groups may refer to the following group control method, which will not be described again herein.
In some embodiments, there are many kinds of smart home devices, and subsystems, such as a lighting subsystem, an environment subsystem, a security subsystem, and the like, are divided based on the functions of the smart home devices according to the intentions of the user. Each subsystem corresponds to one or more intentions. For example, the intentions corresponding to the lighting subsystem include the light on/off intention, etc.; the intentions corresponding to the environment subsystem include the constant temperature intention, constant humidity intention, constant purification intention, etc. Then, the control apparatus 300, after determining the intention of the user and determining the group that the user needs to control, may determine the first electronic device 100 in the group that can realize the intention and instruct the first electronic device 100 to execute the intention of the user.
In some embodiments, a subsystem configuration file and an intention configuration file are configured in the server. The subsystem configuration file is used to configure the subsystem; as shown in table 1 below, the subsystem configuration file includes a subsystem name, a device type, a device ID, and an intention identifier. Taking the lighting subsystem as an example, the subsystem configuration file of the lighting subsystem includes the device IDs of the electronic devices capable of realizing lighting and the corresponding intention identifiers. The intention identifier is used to represent the control command corresponding to the electronic device executing the intention, and the server can determine, according to the control command, the corresponding operation that the electronic device needs to execute, and then send a corresponding instruction to the electronic device. Optionally, the control device 300 stores the subsystem configuration file; after determining the user intention, the control device 300 can determine, according to the subsystem configuration file, the device IDs of the electronic devices capable of realizing the user intention, and then recommend to the user the groups including those electronic devices, so as to facilitate the user's group control of the electronic devices.
TABLE 1
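The structure of the subsystem configuration file described above (subsystem name, device type, device ID, intention identifier) can be sketched as follows. The concrete subsystem names, device types and identifier strings are illustrative assumptions; only the field layout follows the description of table 1.

```python
# Sketch of a subsystem configuration file, following the fields named in
# the text: subsystem name, device type, device ID, intention identifier.
subsystem_profiles = [
    {
        "subsystem": "lighting",
        "entries": [
            {"device_type": "ceiling lamp", "device_id": "lamp-01", "intents": ["ON_OFF_LIGHT"]},
            {"device_type": "desk lamp",    "device_id": "lamp-02", "intents": ["ON_OFF_LIGHT"]},
        ],
    },
    {
        "subsystem": "environment",
        "entries": [
            {"device_type": "air conditioner", "device_id": "ac-01",
             "intents": ["CONSTANT_TEMPERATURE"]},
        ],
    },
]

def devices_for_intent(profiles, intent_id):
    """Resolve an intention identifier to the device IDs able to realize it,
    as the control device 300 would before recommending groups."""
    return [e["device_id"]
            for p in profiles for e in p["entries"]
            if intent_id in e["intents"]]

print(devices_for_intent(subsystem_profiles, "ON_OFF_LIGHT"))  # ['lamp-01', 'lamp-02']
```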
The intention configuration file is used to configure the execution conditions and execution actions corresponding to an intention; as shown in table 2 below, the intention configuration file includes a device ID, an intention identifier, execution conditions and execution actions. One subsystem configuration file corresponds to one or more intention configuration files. The execution conditions represent the conditions under which the execution actions are performed; the execution of some intentions may not require execution conditions, i.e., execution conditions need not be configured in the intention configuration file.
For example, as shown in table 2 below, taking the lighting subsystem as an example, the intention configuration file corresponding to the lighting subsystem includes the device ID of an electronic device capable of lighting (assumed to be the device ID of device A), an intention identifier, and the corresponding execution action. Then, if the intention configuration files of all intentions are stored in the server, as shown in table 2 below, the server determines according to the intention configuration file that device A needs to be started (for example, turned on), and sends an indication signal to device A instructing it to start. For another example, if realizing an intention corresponding to the security subsystem requires the coordination of a plurality of electronic devices, the intention configuration file includes the execution conditions for the execution actions of those electronic devices. If the electronic devices corresponding to the security subsystem include cameras, motion sensors, alarms and the like, the corresponding execution actions can be executed only after the content detected by these electronic devices meets the execution conditions. Therefore, corresponding execution conditions need to be configured in the intention configuration file corresponding to the security subsystem. For another example, if an electronic device needs to execute the corresponding intention only under specific conditions of temperature, time, etc., the corresponding execution conditions also need to be configured in the intention configuration file.
TABLE 2
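The intention configuration file described above (device ID, intention identifier, optional execution condition, execution action) can be sketched as a lookup keyed by device and intention. The identifier strings, action names and the 26 °C threshold below are illustrative assumptions; only the field layout follows the description of table 2.

```python
# Sketch of an intention configuration file: (device ID, intention identifier)
# maps to an optional execution condition plus an execution action.
intent_profiles = {
    ("lamp-01", "ON_OFF_LIGHT"): {"condition": None, "action": "power_on"},
    ("ac-01", "CONSTANT_TEMPERATURE"): {
        "condition": lambda env: env["temperature"] > 26,  # only cool above 26 °C (assumed)
        "action": "start_cooling",
    },
}

def resolve_action(device_id, intent_id, env):
    """Server-side lookup: return the execution action to dispatch, or None
    if an execution condition is configured and not met."""
    entry = intent_profiles[(device_id, intent_id)]
    cond = entry["condition"]
    if cond is not None and not cond(env):
        return None
    return entry["action"]

print(resolve_action("lamp-01", "ON_OFF_LIGHT", {}))                         # power_on
print(resolve_action("ac-01", "CONSTANT_TEMPERATURE", {"temperature": 24}))  # None
```

An intention without a configured condition, such as the lighting entry, dispatches unconditionally; the constant temperature entry only dispatches when its condition holds.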
The group control process will be described below by taking the control device 300 as an example of an intelligent home control panel.
For example, as shown in the interface 1501 in fig. 15, the smart home control panel detects the user clicking the full-house control 151, determines that the user needs to operate the smart home devices, and enters the full-house control system to display the interface 1601 shown in fig. 16. On the interface 1601, an intention identifier column 161 is displayed; the intention identifier column 161 is used to display the intention identifiers of all intentions, or the intention identifiers of some of the intentions. Here, all intentions refers to the full set of intentions defined by the developer or the user. The partial intentions may include the intentions realizable by the electronic devices currently in the house and/or intentions selected by the user from among all intentions.
For example, the partial intentions may include the intentions realizable by the electronic devices currently in the house. Assuming the first electronic devices 100 currently in the house include a lamp and a speaker, the corresponding realizable intentions include the light-on intention and the music playing intention, and the smart home control panel may display the light-on intention identifier and the music playing intention identifier in the intention identifier column. For another example, the partial intentions include the intentions realizable by the electronic devices currently in the house and the intentions selected by the user from among all intentions. Assuming the first electronic devices 100 currently in the house include a lamp and a speaker, the corresponding realizable intentions include the light-on intention and the music playing intention, and the smart home control panel further determines that the intention selected by the user is the air purification intention (assumed to be an intention not realizable by the electronic devices currently in the house). The smart home control panel may then display the light-on intention identifier, the music playing intention identifier, and the air purification intention identifier in the intention identifier column. For another example, according to a user operation, the smart home control panel displays in the intention identifier column only the intention identifiers of the intentions selected by the user, such as the air purification intention identifier.
The intention selected by the user may be an intention that is realizable in the current house, or an intention that is not realizable in the current house. For example, according to a user operation, the smart home control panel displays, in the intention identifier column, the intention identifiers of some of the intentions that the electronic devices currently in the house can realize.
Where the intention identifiers displayed in the intention identifier column include an intention identifier corresponding to an intention that the electronic devices currently in the house cannot realize, that intention identifier may be displayed grayed out in an inoperable state. If a new electronic device is connected and an intention realizable by the new electronic device corresponds to an intention identifier currently displayed in the inoperable state, that intention identifier is converted from the inoperable state to the operable state.
Alternatively, as shown in fig. 16, the order in which the intention identifiers are displayed in the intention identifier column 161 may be a default order; or an order determined according to user operations, such as adjusting the display order of the intention identifiers according to a user operation; or an order determined according to the frequency of user operation, for example, the intention identifiers of the intentions the user operates frequently are displayed at higher-priority display positions (such as display positions convenient for the user to operate, or earlier display positions) so as to facilitate user operation. Alternatively, the intention identifier column 161 is divided into an adjustable region and a non-adjustable region; the adjustable region adjusts the display of intention identifiers according to user operations, and the non-adjustable region displays the default intention identifiers in a preset number and a preset display order.
As shown in fig. 16, the light-on intention identifier 162 displayed in the intention identifier column 161 is highlighted, indicating that the interface 1601 currently displayed on the smart home control panel is the control interface corresponding to the light-on intention identifier 162. As indicated by reference numeral 163, the central intention displayed is the light-on intention, and the circles displayed around the central intention represent the groups containing electronic devices that can realize the light-on intention. As indicated by reference numeral 164, the living room contains electronic devices that can realize the light-on intention, and thus the smart home control panel displays the identifier of the living room group (as the circle indicated by reference numeral 164 displayed on the interface 1601). Alternatively, a circle displayed around the central intention may also represent a single electronic device that can realize the light-on intention. For example, if some electronic devices capable of realizing the light-on intention have not joined any group, or a group includes only one electronic device (which can realize the light-on intention), the smart home control panel may display the identifier of that electronic device directly around the central intention. That is, the identifiers displayed around the central intention identifier include group identifiers and/or device identifiers.
Specifically, after the smart home control panel detects an operation of the user clicking the light-on intention identifier 162, it determines, according to the subsystem configuration file shown in table 1 above, the electronic devices capable of realizing the light-on intention, and then determines the groups in which those electronic devices are located. Thereafter, the central intention identifier is displayed on the interface 1601 together with the identifiers of the determined groups, facilitating the user's group control according to intention. The smart home control panel may store the subsystem configuration file; or, after determining the intention selected by the user, the smart home control panel sends a request to the server and the server determines the device IDs of the corresponding electronic devices; or, the smart home control panel requests the server to deliver the subsystem configuration file.
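The group lookup step can be sketched as follows: given the device IDs that can realize the selected intention (resolved from the subsystem configuration file) and the room groups, the panel keeps the groups containing at least one capable device and draws their identifiers around the central intention identifier. Group names and device IDs below are hypothetical.

```python
def groups_for_intent(intent_devices, groups):
    """Return the names of the groups whose members include at least one
    device able to realize the selected intention; these are the group
    identifiers drawn around the central intention identifier."""
    capable = set(intent_devices)
    return [name for name, members in groups.items() if capable & set(members)]

groups = {
    "living room": ["lamp-03", "tv-01"],
    "primary bedroom": ["lamp-01", "lamp-02"],
    "kitchen": ["dishwasher-01"],
}
print(groups_for_intent(["lamp-01", "lamp-02", "lamp-03"], groups))
# ['living room', 'primary bedroom']
```

The kitchen group is omitted because none of its members can realize the light-on intention; a device that belongs to no group would be shown with its own device identifier instead.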
Optionally, as shown in fig. 16, the group identifiers displayed around the central intention identifier further display the states of the electronic devices corresponding to the central intention that have not been turned on, for example, the electronic devices corresponding to the light-on intention in the children's room, the secondary bedroom, the living room, the primary bedroom, and the study. The user can thus see the whole-house status at a glance through the smart home control panel, without confirming device states one by one through complex operations, which improves the user experience.
In some embodiments, the smart home control panel detects a preset operation of the user on a group identifier, and determines that the user instructs the electronic devices in the group corresponding to that group identifier to execute the preset action under the preset condition corresponding to the central intention. Optionally, the preset operation includes, for example, an operation of the user dragging the group identifier to collide with the central intention identifier, or an operation of the user dragging the central intention identifier to collide with the group identifier.
Illustratively, as shown in fig. 17 (a), the central intention displayed by the smart home control panel is the light-on intention, as shown by the light-on intention identifier 171 on the interface 1701. The smart home control panel detects an operation of the user dragging the primary bedroom identifier 172 toward the light-on intention identifier 171 along the direction indicated by the arrow 173, and after detecting that the primary bedroom identifier 172 collides with the light-on intention identifier 171, determines that the user needs to control the electronic devices corresponding to the light-on intention in the primary bedroom to start the lighting function. Then, the smart home control panel determines, according to the subsystem configuration file, the device IDs of the electronic devices corresponding to the light-on intention in the primary bedroom group and the intention identifier corresponding to the light-on intention (this intention identifier also corresponds to the light-on intention identifier 171 displayed by the smart home control panel), and sends the device IDs and the intention identifier to the server. The server determines, according to the device IDs, the intention identifier and the intention configuration file, the execution actions and execution conditions that the electronic devices corresponding to the device IDs need to execute. The server generates a signal indication and sends it to the electronic devices corresponding to the device IDs, so that the electronic devices execute the corresponding execution actions, such as turning on the lights. Alternatively, the server generates the signal indication and sends it to the smart home control panel, which forwards it to the electronic devices corresponding to the device IDs, so that the electronic devices execute the corresponding execution actions, such as turning on the lights.
Then, the electronic devices in the primary bedroom send a feedback signal after turning on according to the signal indication, and the smart home control panel determines whether the signal indication has been executed according to the received feedback signal forwarded by the server or the received feedback signal sent by the electronic devices.
As shown in interface 1702 of fig. 17 (b), the smart home control panel displays the execution result of the command in the group identifier of the primary bedroom according to the received feedback signal, such as 4 lights on, as shown by reference numeral 174. The user can intuitively determine the execution result of the command from the display of the group identifier, which improves the user experience.
Therefore, the user can realize group control of the electronic devices through a simple dragging operation, which reduces the device operation difficulty for the user and meets the user's multi-device control needs.
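The drag-and-collide flow above (panel resolves devices, server or panel dispatches the indication, devices acknowledge) can be sketched as a dispatch loop that counts feedback signals, which is what drives the "4 lights on" display in the group identifier. The transport is mocked with a plain function; all device IDs and the failing device are hypothetical.

```python
def execute_intent_for_group(group_devices, intent_id, send_to_device):
    """Dispatch the intention to every capable device in the group and
    count acknowledgements; the count feeds the group identifier display
    (e.g. '4 lights on')."""
    succeeded = 0
    for device_id in group_devices:
        # In the described flow this indication travels via the server,
        # or is forwarded by the smart home control panel.
        ack = send_to_device(device_id, intent_id)
        if ack:
            succeeded += 1
    return succeeded

# Mock transport: one of five lamps fails to acknowledge.
responses = {"lamp-01": True, "lamp-02": True, "lamp-03": True,
             "lamp-04": True, "lamp-05": False}
count = execute_intent_for_group(list(responses), "ON_OFF_LIGHT",
                                 lambda d, i: responses[d])
print(f"{count} lights on")  # 4 lights on
```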
In some embodiments, after the electronic devices execute the execution action corresponding to the intention, the smart home control panel may further display which electronic devices in the group actually executed the execution action, so that the user can learn the execution status of the electronic devices.
For example, corresponding to the scenario shown in fig. 17, as shown in interface 1801 in fig. 18, after determining that the electronic devices in the primary bedroom have started the lighting function, the smart home control panel flashes in sequence, within the group identifier, the identifiers of the electronic devices that have started the lighting function. As indicated by reference numeral 181, the identifier of an electric lamp in the primary bedroom is displayed. For example, assuming that there should be 6 lighting devices in the primary bedroom but only 4 lighting devices started in the scenario shown in fig. 17, the user can intuitively determine, from the electronic device identifiers shown in fig. 18, which electronic devices have executed the execution action corresponding to the intention and which have not, thereby simplifying the user's confirmation operation.
In some embodiments, when the smart home control panel detects a preset operation of the user on a group identifier corresponding to electronic devices that have executed the central intention, it expands and displays the identifiers of all the electronic devices corresponding to that group identifier that have executed the central intention.
Illustratively, corresponding to the scenario shown in fig. 17, as in the interface 1901 shown in fig. 19 (a), the smart home control panel detects an operation of the user clicking the primary bedroom identifier 191, and displays the interface 1902 shown in fig. 19 (b). As shown in interface 1902, the smart home control panel displays, around the light-on intention identifier, the identifiers of the electronic devices in the primary bedroom that have started the lighting function (e.g., the ceiling lamp, the spotlight, the floor lamp, and the desk lamp).
Therefore, the user can check the intention execution status of the electronic devices in a specific group as needed, and quickly learn the state of the whole-house devices. This reduces the operation difficulty for the user and improves interactivity.
In some embodiments, while the identifiers of the electronic devices in a group that have executed the intention are expanded and displayed, in response to an operation of the user on a device identifier, the smart home control panel cancels the execution action and execution condition corresponding to the intention executed by the electronic device corresponding to that device identifier.
For example, corresponding to the scenario shown in fig. 19, as shown in interface 2001 in fig. 20A (a), while the identifiers of the electronic devices in the primary bedroom that have started the lighting function are expanded and displayed, the smart home control panel detects an operation of the user dragging the desk lamp identifier 202 away from the light-on intention identifier 201, and determines that the user needs to cancel the lighting of the desk lamp (i.e., the user instructs the desk lamp to be turned off). The smart home control panel sends the device ID of the desk lamp and a command for cancelling the executed intention to the server, and the server sends an indication signal for turning off the light to the desk lamp according to the device ID. Alternatively, the smart home control panel directly sends the indication signal to the desk lamp, instructing the desk lamp to turn off. Then, after receiving the indication signal, the desk lamp executes the light-off command and sends a feedback signal. The smart home control panel displays the interface 2002 shown in fig. 20A (b) according to the feedback signal forwarded by the server or the feedback signal received directly from the desk lamp. As shown in interface 2002, the desk lamp identifier 202 is no longer highlighted, prompting the user that the desk lamp has been turned off. The command for cancelling an executed intention includes a command instructing the electronic device to execute a command opposite to the intention; for example, the command for cancelling the light-on intention instructs the electronic device to turn off the light.
The command to cancel the execution of an intention may also instruct the electronic device to simply stop executing the intention; for example, when the intention is a constant temperature intention, the cancel command indicates that the electronic device no longer executes the constant temperature intention.
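The two cancel cases above (an intention with a natural opposite action versus one that is simply stopped) can be sketched as follows. This is a minimal illustration only; the function and mapping names are hypothetical and not taken from the patent.

```python
# Illustrative sketch: resolving a "cancel intention" into the command
# actually sent to the device. OPPOSITE_COMMANDS and resolve_cancel_command
# are hypothetical names.

# Intentions with a natural opposite map to the opposite command.
OPPOSITE_COMMANDS = {
    "turn_on_light": "turn_off_light",
    "open_curtain": "close_curtain",
}

def resolve_cancel_command(intention: str) -> str:
    """Return the command sent to the device when an intention is canceled."""
    if intention in OPPOSITE_COMMANDS:
        # Cancel by executing the opposite action (e.g. turn the light off).
        return OPPOSITE_COMMANDS[intention]
    # Otherwise the device is simply told to stop executing the intention
    # (e.g. stop holding a constant temperature).
    return f"stop:{intention}"
```

For example, canceling `turn_on_light` yields the opposite command, while canceling a constant temperature intention yields a plain stop instruction.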
Thereafter, as shown in fig. 20A (b), the user may continue to turn off other lighting devices in the master bedroom, such as the ceiling light and spotlights, on the interface 2002. After the smart home control panel detects that the user clicks the close control 203, it determines that the user has finished controlling the electronic devices in the current group, and may display an interface 2003 as shown in fig. 20A (c). On the interface 2003, the command execution status displayed in the master bedroom identifier changes from "4 lights on in the master bedroom" shown in fig. 19 (a) to "3 lights on in the master bedroom" shown by reference numeral 204, i.e., one of the lamps (the desk lamp) in the master bedroom has been turned off.
In some embodiments, the smart home control panel determines a modification operation of the group parameters by the user, and performs group control according to the modified group parameters when the user subsequently controls the electronic devices in the group. The modification operations on group parameters include changing the number of devices included in the group, changing device parameters within the group, and the like. Optionally, the modification of the group parameters may be implemented by means of a blacklist. For example, the device ID of an electronic device deleted by the user is added to the blacklist, and the smart home control panel excludes the blacklisted electronic devices from the corresponding intention when determining the subsystem configuration file. Alternatively, the modification of the group parameters may directly modify the subsystem configuration file and the intention configuration file. For example, the smart home control panel sends the modified device parameters to the server, and the server modifies the corresponding parameters in the intention configuration file, so that when the execution conditions and execution actions are subsequently determined, the electronic device is instructed to execute the intention with the modified device parameters. Optionally, the blacklist, the modified subsystem configuration file, and the intention configuration file determined by the server may be synchronized to the smart home control panel.
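The blacklist mechanism described above can be sketched as a simple filter over the subsystem configuration file. The data layout and function name below are assumptions for illustration, not the patent's actual format.

```python
# Illustrative sketch: excluding blacklisted devices when determining
# which devices in a group can execute an intention. The profile
# structure is a hypothetical stand-in for the subsystem configuration file.

def devices_for_intention(subsystem_profile, intention, blacklist):
    """Return the device IDs in the group that can execute `intention`,
    excluding devices the user has deleted (i.e., blacklisted)."""
    return [
        dev["id"]
        for dev in subsystem_profile["devices"]
        if intention in dev["intentions"] and dev["id"] not in blacklist
    ]

# Example: a master-bedroom group with four lights; the desk lamp has
# been blacklisted, so only three devices remain for the on-light intention.
master_bedroom = {
    "devices": [
        {"id": "ceiling_light", "intentions": ["on_light"]},
        {"id": "spot_light_1", "intentions": ["on_light"]},
        {"id": "spot_light_2", "intentions": ["on_light"]},
        {"id": "desk_lamp", "intentions": ["on_light"]},
    ]
}
```

With the desk lamp blacklisted, a subsequent on-light command would reach only the remaining three lighting devices, matching the "3 lights on" status in fig. 20B.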
Illustratively, in the scenario shown in fig. 20A, the smart home control panel deletes the desk lamp in the master bedroom in response to the user operation. Further, assume that the user then turns off all lighting devices in the master bedroom. For example, as shown in fig. 20A (c), the user drags the master bedroom identifier shown by reference numeral 204 away from the on-light intention identifier, and the smart home control panel determines that the three lights that were on should be turned off. Subsequently, as shown in interface 2004 in fig. 20B (a), after the smart home control panel detects that the user clicks the on-light intention identifier 205 displayed in the intention identifier column, the displayed center intention identifier is the on-light intention identifier 206, and the master bedroom identifier, not highlighted, is displayed around the on-light intention identifier 206. Thereafter, as shown in interface 2005 in fig. 20B (b), the smart home control panel detects an operation of the user dragging the master bedroom identifier 207 toward the on-light intention identifier 206 until they collide, and determines that the lighting devices in the master bedroom should execute the on-light intention. According to the subsystem configuration file and the blacklist, the smart home control panel determines that the master bedroom currently includes 3 lighting devices, determines the corresponding device IDs and intention identifiers, and controls the 3 lighting devices to execute the on-light intention. The smart home control panel then displays an interface 2006 as shown in fig. 20B (c); as shown by reference numeral 208, the command execution status displayed in the master bedroom identifier is "3 lights on", which does not include the desk lamp added to the blacklist in response to the user operation.
Further, if the user turns on the desk lamp again, the smart home control panel may delete the desk lamp from the blacklist and synchronize the change to the server. Subsequently, when the smart home control panel determines that the lighting devices in the master bedroom should execute the on-light intention, it may instruct the desk lamp to execute the on-light intention as well.
As another example, assume that the execution action in the intention configuration file of the on-light intention includes a brightness parameter of the lamps, such as 350 candela per square meter (cd/m²). Then, in the scenario shown in fig. 17 described above, the server instructs the lighting devices in the master bedroom to start at a brightness of 350 cd/m². Assume that the smart home control panel detects an operation of the user adjusting the brightness of the desk lamp among the lighting devices in the master bedroom, and determines that the user has adjusted the brightness of the desk lamp to 300 cd/m². The smart home control panel then sends the adjusted parameter to the server, which modifies and synchronizes the intention configuration file. Subsequently, after detecting the operation of starting the lighting devices in the master bedroom again, the smart home control panel instructs the desk lamp in the master bedroom to start at a brightness of 300 cd/m², while the remaining lighting devices start at a brightness of 350 cd/m².
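The per-device brightness behavior above amounts to a lookup with a default: a device uses the user's override if one exists, otherwise the intention-wide value. The sketch below illustrates this with a hypothetical intention-configuration structure; the field names are assumptions.

```python
# Illustrative sketch: per-device parameter overrides in an intention
# configuration file. "default_brightness" and "overrides" are
# hypothetical field names, not the patent's actual schema.

def execution_brightness(intent_profile, device_id):
    """Return the brightness (cd/m²) a device should start at: the
    user's per-device override if one exists, otherwise the
    intention-wide default."""
    return intent_profile.get("overrides", {}).get(
        device_id, intent_profile["default_brightness"]
    )

# The on-light intention defaults to 350 cd/m²; the user has adjusted
# the desk lamp to 300 cd/m².
on_light_profile = {
    "default_brightness": 350,
    "overrides": {"desk_lamp": 300},
}
```

So the desk lamp starts at 300 cd/m² while every other lighting device in the group starts at the 350 cd/m² default.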
In this way, the smart home control panel can adaptively adjust the subsystem configuration file and the intention configuration file according to user operations, so that group control better matches the user's needs and the user experience is further improved.
In some embodiments, limited by the display area of the display screen of the smart home control panel, the intention identifier column can only display a limited number of intention identifiers. Therefore, when the current page of the intention identifier column does not display some intention identifiers, the user can view more of them through operations such as sliding within the intention identifier column. That is, the smart home control panel scrolls the intention identifiers in response to a sliding operation or the like by the user in the intention identifier column.
Illustratively, as shown in interface 2101 in fig. 21 (a), the smart home control panel displays the on-light intention identifier, the music playing intention identifier, the air purification intention identifier, and the curtain intention identifier in the intention identifier column 211. Assume that none of the currently displayed intention identifiers corresponds to the intention that the user wants to control. Then, as shown in the interface 2101, the smart home control panel detects a sliding operation of the user on the intention identifier column 211 in the direction indicated by the arrow 212, and scrolls the intention identifiers displayed in the intention identifier column 211. For example, the smart home control panel displays an interface 2102 as shown in fig. 21 (b) in response to the sliding operation of the user, and the constant temperature intention identifier 213, which could not be displayed in the interface 2101, is displayed in the intention identifier column 211.
In this way, the user can view more intentions through a simple sliding operation, thereby satisfying a wider range of the user's intention needs.
In some embodiments, upon detecting an operation of the user clicking an intention identifier in the intention identifier column, the smart home control panel displays the intention interface corresponding to that intention identifier.
For example, as shown in interface 2201 in fig. 22 (a), while displaying the interface of the on-light intention, the smart home control panel detects an operation of the user clicking the curtain intention identifier 221, and determines that the user wants to view the interface corresponding to the curtain intention. The smart home control panel obtains the groups corresponding to the curtain intention and the execution status of the curtain intention by the electronic devices in those groups, and displays an interface 2202 as shown in fig. 22 (b). As shown in interface 2202, the displayed center intention identifier is the curtain intention identifier 222, around which are displayed the identifiers of the groups that include electronic devices capable of implementing the curtain intention. Further, assuming that the second bedroom includes an electronic device that has executed the curtain intention, the second bedroom identifier 223 is highlighted as shown in the interface 2202, and the execution status of the intention, such as "curtains open in the second bedroom", is displayed in the second bedroom identifier 223.
In this way, the user can switch between different intention interfaces through the intention identifier column, so as to determine the state of the electronic devices throughout the house. Moreover, by switching intention interfaces, the electronic devices corresponding to different intentions can be controlled.
In some embodiments, the smart home control panel detects a deletion operation by the user in the intention identifier column, and can delete the intention identifiers that the user does not need.
Illustratively, as shown in interface 2301 in fig. 23 (a), the smart home control panel detects that the user long-presses the music playing intention identifier 231, and may display the deletion bar 232. Thereafter, upon detecting an operation of the user dragging the music playing intention identifier 231 to the deletion bar 232 in the direction indicated by the arrow 233, the panel may delete the music playing identifier and display an interface 2302 as shown in fig. 23 (b). As shown in interface 2302, the deleted music playing identifier is no longer displayed in the intention identifier column 234.
In some embodiments, when the groups corresponding to the group identifiers displayed around the center intention identifier have not executed the center intention, an operation of the user long-pressing the center intention identifier is detected, and an intention execution command may be sent to the electronic devices in all the groups corresponding to the center intention. In this way, the user can quickly operate all the electronic devices corresponding to the intention without operating them group by group, which further reduces the difficulty of operation, improves the control efficiency of the electronic devices, and improves the user experience.
Illustratively, as shown in interface 2401 in fig. 24 (a), the center intention is the on-light intention. The smart home control panel detects an operation of the user long-pressing the on-light intention identifier 241, determines that none of the groups corresponding to the on-light intention has executed the on-light intention, and determines the device IDs and intention identifiers of all lighting devices corresponding to the on-light intention according to the subsystem configuration file. The smart home control panel sends these device IDs and intention identifiers to the server, so that all the lighting devices in these groups are started. Thereafter, the smart home control panel displays an interface 2402 as shown in fig. 24 (b); as shown by reference numeral 242, the group identifiers of all the groups corresponding to the on-light intention are highlighted and gathered around the on-light intention identifier, prompting the user that all the lighting devices corresponding to the on-light intention have been started.
In some embodiments, when all the groups around the center intention have executed the center intention, an operation of the user long-pressing the center intention identifier is detected, and an intention cancel command may be sent to the electronic devices in all the groups corresponding to the center intention. In this way, the user can quickly operate all the electronic devices corresponding to the intention without operating them group by group, which further reduces the difficulty of operation, improves the control efficiency of the electronic devices, and improves the user experience.
Illustratively, as shown in interface 2501 in fig. 25 (a), the center intention is the on-light intention, and all the groups corresponding to the on-light intention have executed it (as shown in interface 2501, all the group identifiers displayed around the on-light intention identifier 251 are highlighted). The smart home control panel detects an operation of the user long-pressing the on-light intention identifier 251, and determines the device IDs and intention identifiers of all lighting devices corresponding to the on-light intention according to the subsystem configuration file. The smart home control panel sends these device IDs and intention identifiers to the server, so that all the lighting devices are turned off. Thereafter, the smart home control panel displays an interface 2502 as shown in fig. 25 (b), and none of the group identifiers is highlighted any longer, prompting the user that all the lighting devices corresponding to the on-light intention have been turned off.
In other embodiments, corresponding to the scenarios shown in fig. 24 and 25, some of the group identifiers corresponding to the center intention are highlighted while others are not. That is, the electronic devices in some groups have executed the execution conditions and execution actions corresponding to the center intention, while the electronic devices in the other groups have not. In this case, after detecting an operation of the user long-pressing the center intention identifier, the smart home control panel determines, according to a preset rule, how to control all the groups with respect to the execution conditions and execution actions corresponding to the center intention. For example, the center intention is the on-light intention, and upon detecting the operation of the user long-pressing the on-light intention identifier, the smart home control panel controls, according to a preset rule, the lighting devices in the groups that have not executed the on-light intention to execute it. As another example, upon detecting the operation of the user long-pressing the on-light intention identifier, the smart home control panel controls, according to a preset rule, the lighting devices in the groups that have already executed the on-light intention to turn off. The preset rule may be a rule preset in the smart home control panel by a developer, or a rule customized by the user.
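One possible preset rule combining the behaviors of figs. 24 and 25 is: if every group has executed the intention, cancel it everywhere; otherwise, execute it in the groups that have not yet done so. The sketch below is one hypothetical formulation of such a rule, not the patent's defined behavior.

```python
# Illustrative sketch: one possible preset rule for a long press on the
# center intention identifier. The function name and the rule itself are
# assumptions for illustration.

def long_press_action(group_states):
    """Given a map of group name -> whether that group has executed the
    intention, return (command, affected_groups).

    Rule sketched here: if every group has executed the intention,
    cancel it in all groups (fig. 25 behavior); otherwise execute it
    only in the groups that have not yet done so (fig. 24 behavior,
    extended to partially-highlighted interfaces)."""
    if group_states and all(group_states.values()):
        return "cancel", sorted(group_states)
    return "execute", sorted(g for g, done in group_states.items() if not done)
```

A developer- or user-customized rule could equally well cancel the already-executing groups instead, as the second example in the paragraph above describes.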
In some embodiments, the smart home control panel detects an intention fusion operation by the user, and may fuse two or more intentions to generate a fusion intention. In this way, the user can directly control the electronic devices corresponding to the multiple intentions included in the fusion intention through the fusion intention, which further simplifies the user's operations.
Illustratively, as shown in interface 2601 in fig. 26 (a), assume that the smart home control panel is displaying the on-light intention interface. The smart home control panel detects an operation of the user dragging the constant temperature intention identifier 261 displayed in the intention identifier column toward the center intention (i.e., on-light intention) identifier in the direction indicated by the arrow 262. As shown by reference numeral 264, after moving to within a preset range around the center intention, the constant temperature intention identifier is highlighted, prompting the user that the intention fusion mode has been started and that the constant temperature intention and the center intention can be fused. If the smart home control panel detects that the user continues to drag the constant temperature intention identifier to the on-light intention identifier 263, as shown in interface 2602 in fig. 26 (b) and by reference numeral 265, upon detecting that the constant temperature identifier collides with the on-light identifier (for example, upon detecting that the two intention identifiers touch), a diagram of the two intention identifiers merging is displayed, prompting the user that the two intentions have been fused. The smart home control panel determines that the intentions to be fused are the on-light intention and the constant temperature intention, and may generate a fusion intention identifier. As shown in interface 2603 in fig. 26 (c), the fusion intention identifier 266 is displayed in the intention identifier column. Moreover, the center intention is switched to the fusion intention, and the fusion intention interface is displayed, such as the center intention identifier 267 of the fusion of the on-light intention and the constant temperature intention.
The fused intentions are displayed on the fusion intention identifier 267, prompting the user that the electronic devices capable of executing the on-light intention and the constant temperature intention can be controlled through the fusion intention.
Optionally, after generating the new fusion intention, the smart home control panel removes the highlighting of all the group identifiers that were highlighted around the original center intention identifier. For example, as shown in interface 2602 in fig. 26 (b), the master bedroom identifier is highlighted around the on-light intention identifier; then, as shown in interface 2603 in fig. 26 (c), after determining the fusion intention, the smart home control panel does not highlight the group identifiers around the fusion intention identifier 267.
It should be noted that if a group identifier that was highlighted in the original intention interface is not highlighted when the fusion intention interface is displayed, this does not mean that the intention execution of the electronic devices in that group is directly canceled. For example, in the scenario shown in fig. 26, after the on-light intention and the constant temperature intention are fused, the master bedroom identifier is not highlighted in the fusion intention interface shown in fig. 26 (c). However, as shown in fig. 26 (a), 4 lamps are on in the master bedroom, and after the intention fusion, the 4 lamps remain on. As shown in fig. 26 (c), the on-light intention identifier 268 remains displayed in the intention identifier column; if the user clicks the on-light intention identifier 268, the display switches to the on-light intention interface, on which the master bedroom identifier is highlighted.
Thus, based on the scenario shown in fig. 26, before the intention fusion, if the user needed to start the groups corresponding to the on-light intention and the constant temperature intention, the intention identifiers of the two intentions had to be operated separately by the method shown in fig. 17, which is cumbersome. After the fusion intention is established, the user can control the electronic devices in the groups corresponding to the fusion intention directly by operating the fusion intention identifier, which simplifies the user's operations.
For example, assume that before the intention fusion, if the user needs to start the lighting devices and the thermostat device in the master bedroom, the user must first operate the master bedroom identifier on the on-light intention interface to start the lighting devices in the master bedroom, and then switch to the constant temperature intention interface and operate the master bedroom identifier to start the thermostat device. After the intention fusion, as shown in interface 2701 in fig. 27 (a), the smart home control panel detects an operation of the user clicking the fusion identifier 271 displayed in the intention identifier column, and determines that the intentions corresponding to the fusion identifier 271 include the on-light intention and the constant temperature intention. After determining the electronic devices included in the two subsystem configuration files, the panel determines the groups that include electronic devices capable of executing the on-light intention and electronic devices capable of executing the constant temperature intention, and sends the device IDs and corresponding intention identifiers of the electronic devices in those groups capable of executing the on-light intention and/or the constant temperature intention to the server, so as to instruct the corresponding electronic devices to execute the execution conditions and execution actions of the intentions. For example, assume that, as shown in interface 2701, the master bedroom, study, children's room, second bedroom, and living room all include lighting devices capable of executing the on-light intention, but only the master bedroom includes a thermostat device capable of executing the constant temperature intention.
Therefore, the smart home control panel determines the device IDs and corresponding intention identifiers of the lighting devices and the thermostat device in the master bedroom, and instructs the lighting devices and the thermostat device to execute the corresponding execution conditions and execution actions. Thereafter, after determining that the electronic devices in the master bedroom group have executed the corresponding execution conditions and execution actions, the smart home control panel displays an interface 2702 as shown in fig. 27 (b), at which time the master bedroom identifier 274 is highlighted and the command execution status is displayed on it, such as "4 lights on in the master bedroom, temperature 26 °C".
In this way, the user can quickly start the lighting devices and the thermostat device in the master bedroom. Correspondingly, the user can also quickly cancel the intention execution of the electronic devices through the fusion intention. For the operation of canceling intention execution, reference may be made to the related content shown in fig. 20A or fig. 25 above, which is not repeated here.
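The dispatch logic of the fig. 27 example can be sketched as follows: each device in each group receives only those fused intentions it actually supports. The data layout and function name are assumptions for illustration only.

```python
# Illustrative sketch: dispatching a fusion intention. Each device in
# each group is sent only the fused intentions it supports; a group with
# no capable device receives nothing. Structure and names are hypothetical.

def fused_intention_commands(subsystems, fused_intentions):
    """Map group -> {device_id: [intentions that device should execute]}."""
    commands = {}
    for group, devices in subsystems.items():
        for dev_id, supported in devices.items():
            runnable = [i for i in fused_intentions if i in supported]
            if runnable:
                commands.setdefault(group, {})[dev_id] = runnable
    return commands

# As in the fig. 27 example: several rooms have lighting devices, but
# only the master bedroom has a thermostat device.
subsystems = {
    "master_bedroom": {"lamp_1": {"on_light"}, "thermostat": {"constant_temp"}},
    "study": {"lamp_2": {"on_light"}},
}
```

Dispatching the fused `["on_light", "constant_temp"]` intention then starts both the master bedroom's lamp and thermostat, while the study's lamp receives only the on-light intention.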
In other embodiments, during the intention fusion process, the smart home control panel determines that, in the currently displayed intention interface, the group corresponding to a highlighted group identifier includes electronic devices capable of executing the intention to be fused. Then, after the fusion, the smart home control panel may keep that group identifier highlighted on the fusion intention interface, but the command execution status displayed in the group identifier changes to the corresponding status after the fusion intention is executed. Conversely, if the smart home control panel determines that, in the currently displayed intention interface, the group corresponding to a highlighted group identifier does not include electronic devices capable of executing the intention to be fused, that group identifier is not highlighted in the fusion intention interface. That is, during the intention fusion process, the smart home control panel needs to determine, for the electronic devices corresponding to the group identifiers highlighted in the current interface, the execution status of the center intention and of the intention to be fused.
Illustratively, corresponding to the scenario shown in fig. 26, assume that the master bedroom includes a thermostat device capable of executing the constant temperature intention. On the interface 2801 shown in fig. 28A, when the center intention identifier is the constant temperature intention identifier 281, the master bedroom identifier 282 is highlighted on the interface 2801, and it is determined that the thermostat device in the master bedroom is started. Further, as shown in interface 2601 in fig. 26 (a), when the center intention identifier is the on-light intention identifier 263, the master bedroom identifier is highlighted on the interface 2601, and it is determined that 4 lamps in the master bedroom are on. Then, after the smart home control panel detects the intention fusion operation on the interface 2601, it may directly display the interface 2802 shown in fig. 28B after determining the fusion intention. On the interface 2802, the center intention identifier is the fusion intention identifier 283 of the on-light intention and the constant temperature intention, the master bedroom identifier is highlighted as shown by reference numeral 284, and the command execution status of the fusion intention, such as "4 lights on, temperature 26 °C", is displayed on the master bedroom identifier.
In other embodiments, besides dragging an intention identifier onto the center intention identifier as shown in fig. 26, the intention fusion operation may have other implementations. For example, the smart home control panel detects an operation of the user dragging one intention identifier displayed in the intention identifier column onto another intention identifier in the column, and can determine the intentions corresponding to the two identifiers. The two identifiers may both be identifiers of single intentions; or both may be identifiers of fusion intentions; or one may be the identifier of a single intention and the other the identifier of a fusion intention. As another example, the smart home control panel detects an operation of the user long-pressing an intention identifier displayed in the intention identifier column, and may select that identifier. Then, upon detecting an operation of the user successively selecting multiple intention identifiers, the smart home control panel fuses the intentions corresponding to the successively selected identifiers to obtain a fusion intention.
In some embodiments, the smart home control panel may split a fusion intention in response to a splitting operation by the user on the fusion intention identifier, thereby further satisfying the user's needs for controlling the electronic devices.
Illustratively, as shown in interface 2901 in fig. 29 (a), the center intention identifier displayed in interface 2901 is the fusion intention identifier corresponding to the fusion of the on-light intention and the constant temperature intention. The fusion intention identifier includes hot zones corresponding to the intentions included in the fusion intention, such as the on-light intention hot zone 291 and the constant temperature intention hot zone 292, and the user can intuitively determine the position of each hot zone from the content displayed on the fusion intention identifier. The smart home control panel detects an operation of the user long-pressing the constant temperature hot zone 292 and moving it away from the fusion intention identifier, and determines that the user wants to strip the constant temperature intention from the fusion intention. As shown in interface 2902 in fig. 29 (b), during the stripping, the smart home control panel displays a constant temperature intention identifier, as shown by reference numeral 293, prompting the user to continue dragging to complete the stripping of the constant temperature intention. If the smart home control panel detects that the user keeps dragging the constant temperature hot zone away from the fusion intention identifier, it may display an interface 2903 as shown in fig. 29 (c). On the interface 2903, the fusion intention identifier corresponding to the fusion of the stripped constant temperature intention and the on-light intention is no longer displayed in the intention identifier column 295.
Optionally, after an intention is stripped from the fusion intention, the smart home control panel may instruct the electronic devices in the groups that had executed the original fusion intention to cancel executing the stripped intention. For example, in the scenario shown in fig. 29, the smart home control panel strips the constant temperature intention from the fusion intention in response to the user operation. The smart home control panel may then instruct the thermostat device in the master bedroom that had executed the fusion intention to cancel executing the constant temperature intention.
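The stripping step above reduces to two outputs: the intentions that remain fused, and the cancel commands sent to groups that had executed the original fusion intention. A minimal sketch, with hypothetical names not taken from the patent:

```python
# Illustrative sketch: stripping one intention out of a fusion intention.
# Each group that had executed the fusion intention is told to cancel
# the stripped intention; the remaining intentions stay fused.

def split_fusion(fused_intentions, stripped, executing_groups):
    """Return (remaining intentions, cancel commands). Each cancel
    command is a (group, intention) pair for the stripped intention."""
    remaining = [i for i in fused_intentions if i != stripped]
    cancels = [(group, stripped) for group in executing_groups]
    return remaining, cancels
```

In the fig. 29 scenario, stripping the constant temperature intention leaves the on-light intention as the new center intention and sends the master bedroom's thermostat a cancel command, while the lamps stay on.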
Optionally, as shown in interface 2903 in fig. 29 (c), the center intention identifier is changed to the intention identifier corresponding to the intention remaining after the stripping, and the group identifier of a group that includes electronic devices executing the corresponding intention, such as the master bedroom identifier, is highlighted, as shown by reference numeral 294.
It should be noted that if the intentions remaining after the stripping still constitute a fusion intention, the smart home control panel may generate and display the fusion intention identifier of the remaining intentions. For the method of determining whether to highlight the corresponding group identifiers for the new fusion intention identifier, reference may be made to the methods shown in fig. 26 or figs. 28A and 28B, which are not repeated here.
The above embodiments describe the group control process taking a bubble-like (circular) center intention identifier as an example. It can be understood that the center intention identifier may also be displayed in other shapes or sizes, which is not limited in the embodiments of this application. In some embodiments, the center intention identifier may also be implemented as an area display; after the smart home control panel detects a user operation in the center intention identifier area corresponding to the center intention, the corresponding group control can likewise be implemented. It can be understood that the embodiments of this application likewise do not limit the shape and size of the center intention identifier area display.
Illustratively, as shown in interface 3001 in fig. 30 (a), the center intention displayed by the smart home control panel is the on-light intention, such as the on-light intention identifier area 301 corresponding to the on-light intention. The identifiers of the groups that include electronic devices capable of implementing the on-light intention (e.g., the master bedroom identifier, the study identifier, etc.) are displayed around the on-light intention identifier area 301. The smart home control panel detects an operation of the user dragging the master bedroom identifier 302 into the on-light intention identifier area 301 in the direction indicated by the arrow 303, and determines that the user wants to control the electronic devices corresponding to the on-light intention in the master bedroom to start the lighting function. Then, with reference to the related content described in fig. 17, in the scenario shown in fig. 30, the smart home control panel may likewise send the determined device IDs and intention identifiers to the server, so as to control the electronic devices corresponding to the on-light intention in the master bedroom to start the lighting function.
After the smart home control panel determines that the corresponding electronic devices in the master bedroom have started the light-on function, as shown in interface 3002 in fig. 30 (b), the master bedroom identifier is displayed within the light-on intention identifier area 301, and the execution status of the command is displayed in the master bedroom identifier, such as "4 lights in the master bedroom turned on".
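The drag-based group control described above can be summarized in a brief sketch. This is purely illustrative: the configuration structure and the names `SUBSYSTEM_CONFIG` and `on_group_dropped` are assumptions for illustration, not part of the disclosed embodiment.

```python
# Hypothetical sketch: when a group identifier is dragged into the center
# intention identifier area, resolve which devices in that group can
# execute the center intention (per a subsystem configuration file).

SUBSYSTEM_CONFIG = {
    # group -> {device_id: intentions the device can execute}
    "master_bedroom": {
        "desk_lamp": {"light_on"},
        "ceiling_lamp": {"light_on"},
        "air_conditioner": {"thermostat"},
    },
}

def on_group_dropped(group: str, center_intent: str) -> list[str]:
    """Return the device IDs in the dragged group that support the intention."""
    devices = SUBSYSTEM_CONFIG.get(group, {})
    return sorted(d for d, intents in devices.items() if center_intent in intents)
```

The returned device IDs would then be sent, together with the intention identifier, to the server as described with reference to fig. 17.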
Then, correspondingly, the functions that the smart home control panel can implement in the group control process described in fig. 16-29 can likewise be implemented in the case where the center intention is displayed as a center intention identifier area.
For example, as shown in fig. 18, the identifiers of the electronic devices that have started the light-on function flash in sequence within the master bedroom identifier. Correspondingly, in the scenario shown in fig. 30, after determining that the corresponding electronic devices in the master bedroom have started the light-on function, the smart home control panel may flash, in sequence, the identifiers of those electronic devices within the master bedroom identifier displayed in the light-on intention identifier area 301.
For another example, as shown in fig. 19, the identifiers of the electronic devices in the master bedroom that have started the light-on function are displayed around the master bedroom identifier. Accordingly, in the scenario illustrated in fig. 30, the smart home control panel may also display those identifiers around the master bedroom identifier displayed in the light-on intention identifier area 301. Optionally, the size of the center intention identifier area may adapt to the displayed content. In the scenario of fig. 30, the light-on intention identifier area 301 is displayed at a preset initial size. After determining that the identifiers of the electronic devices that have started the light-on function need to be displayed around the master bedroom identifier, the smart home control panel determines that the currently displayed light-on intention identifier area 301 is insufficient to display the identifiers to be displayed, and thus may automatically enlarge the light-on intention identifier area 301 so that the master bedroom identifier and the corresponding electronic device identifiers can be displayed within it.
Then, corresponding to the process shown in fig. 20A of instructing some electronic devices in a group that have executed the center intention to cancel executing it, the smart home control panel may, after detecting that the user drags an electronic device identifier displayed around the group identifier out of the center intention identifier area, determine that the electronic device corresponding to that identifier should cancel executing the center intention.
For another example, after detecting a long-press operation by the user at any position in the center intention identifier area where no identifier (such as a group identifier or a device identifier) is displayed, the smart home control panel may determine to select all the groups corresponding to the group identifiers displayed around the area, and control the electronic devices therein to execute the center intention. Alternatively, it may determine to deselect all of those groups, and control the electronic devices therein to cancel executing the center intention.
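The select-all / deselect-all behaviour of the long press can be modelled as a simple toggle. The function name and set-based representation below are illustrative assumptions, not the disclosed implementation.

```python
def on_long_press(selected: set[str], surrounding: set[str]) -> set[str]:
    """Toggle between selecting every surrounding group (all execute the
    center intention) and deselecting all (all cancel executing it)."""
    return set() if selected == surrounding else set(surrounding)
```

A partially selected state would, under this sketch, resolve to select-all first, and a second long press would then deselect everything.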
For another example, corresponding to the intention fusion scenarios shown in figs. 27, 28A and 28B, when the center intention displayed by the smart home control panel is a center intention identifier area, intention fusion can likewise be implemented after the user's fusion operation is detected.
For example, as shown in interface 3101 in fig. 31 (a), after the smart home control panel detects the operation of the user dragging the constant temperature identifier 311 displayed in the intention identifier column to the light-on intention identifier area 312, it determines that the constant temperature intention corresponding to the constant temperature identifier 311 and the light-on intention corresponding to the light-on intention identifier area 312 need to be fused. Thereafter, as shown in interface 3102 in fig. 31 (b), a fused intention identifier area 313 of "light on + constant temperature" is displayed, and the master bedroom identifier is displayed within the fused intention identifier area 313, the master bedroom containing electronic devices that can execute the light-on intention and the constant temperature intention. Likewise, the fused intention identifier area includes hot zones corresponding to each fused intention, so that stripping of a fused intention can be achieved. In the fused intention identifier area 313 shown in fig. 31 (b), the left half is the hot zone corresponding to the light-on intention, and the right half is the hot zone corresponding to the constant temperature intention.
Optionally, for other implementations of the center intention identifier area displayed by the smart home control panel, reference may be made to the related content of the center intention identifier displayed by the smart home control panel, which is not described herein again.
Therefore, by displaying the center intention identifier area, the smart home control panel enables the user to determine, from the group identifiers displayed in the area, how the groups have executed the center intention, which facilitates group control by the user and improves the user experience.
Fig. 32 is a schematic flow chart of a smart device control method according to an embodiment of the present application. The method is applied to a first electronic device, which may be the control device 300 (such as a smart home control panel or a mobile phone), or may be the first electronic device 100. As shown in fig. 32, the method includes the following steps.
S3201, displaying a first interface, where the first interface includes a first intention identifier and a first group identifier corresponding to a first group.
The first group includes X electronic devices capable of executing a first intention corresponding to the first intention identifier, where X is a positive integer. Optionally, the category attribute and/or the location attribute of the X electronic devices are the same. The first intention corresponds to a function that the electronic devices can execute, such as a light-on intention or a constant temperature intention. An intention identifier is a function identifier representing the corresponding achievable function; for example, the light-on intention identifier represents the light-on function, and the constant temperature identifier represents the constant temperature function.
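The relationship between groups, devices, attributes and intentions can be illustrated with a minimal data model. The class and attribute names are assumptions for illustration, not the disclosed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    device_id: str
    category: str                               # category attribute, e.g. "lamp"
    location: str                               # location attribute, e.g. "master_bedroom"
    intents: set = field(default_factory=set)   # intentions the device can execute

@dataclass
class Group:
    name: str
    devices: list

    def executors_of(self, intent: str) -> list:
        """The X devices in the group able to execute the given intention."""
        return [d for d in self.devices if intent in d.intents]
```

Under this sketch, the X devices of step S3201 are simply `group.executors_of(first_intent)`.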
In some embodiments, the first intention identifier is displayed in a first area of the first interface, and the first group identifier is displayed in an area other than the first area.
Illustratively, as shown in interface 3001 of fig. 30 (a), the first area is the light-on intention identifier area 301; the light-on intention identifier may be an intention identifier displayed within the light-on intention identifier area 301, occupying all or part of it.
S3202, receiving a first operation of the user on the first group identifier.
In some embodiments, the first operation includes: moving the first group identifier to the first area.
Illustratively, as shown in interface 3001 in fig. 30 (a), the first electronic device (e.g., the smart home control panel) detects the first operation in which the user drags the master bedroom identifier 302, in the direction indicated by arrow 303, into the light-on intention identifier area 301 (i.e., the first area).
S3203, in response to the first operation, controlling the first group identifier to move to the first intention identifier, and instructing the X electronic devices to execute the first intention.
In some embodiments, after the first electronic device detects the first operation, it can determine, according to the subsystem configuration file, the device IDs of the X electronic devices in the first group that can execute the first intention. Then, the first electronic device may control the first group identifier to move to the first intention identifier, and send the device IDs and the intention identifier corresponding to the displayed first intention identifier to the server. The server determines, according to the device IDs, the intention identifier and the intention configuration file, the execution actions and execution conditions of the X electronic devices corresponding to the device IDs. The server then generates a signal indication and sends it to the X electronic devices corresponding to the device IDs, so as to instruct them to execute the corresponding actions, such as turning on a light. Alternatively, the server generates the signal indication and sends it to the first electronic device, which forwards it to the X electronic devices corresponding to the device IDs, so as to instruct them to execute the corresponding actions, such as turning on a light.
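The server-side step of turning device IDs plus an intention identifier into per-device signal indications can be sketched as follows. The intention configuration contents, field names, and function name are invented for illustration, not part of the disclosed embodiment.

```python
# Hypothetical intention configuration file: intention -> execution action.
INTENT_CONFIG = {
    "light_on": {"action": "set_power", "args": {"power": "on"}},
    "thermostat": {"action": "set_mode", "args": {"mode": "constant_temperature"}},
}

def build_signal_indications(device_ids: list[str], intent_id: str) -> list[dict]:
    """Resolve the execution action for the intention and emit one signal
    indication per target device, as the server (or the first electronic
    device, when forwarding) would deliver them."""
    cfg = INTENT_CONFIG[intent_id]
    return [
        {"device_id": d, "action": cfg["action"], "args": cfg["args"]}
        for d in device_ids
    ]
```

Each resulting indication corresponds to one "execute action" instruction, e.g. turning on a light.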
For example, as shown in fig. 30 (a), after the first electronic device (such as the smart home control panel) detects the first operation of the user on the master bedroom identifier 172, it determines that the user needs to control the electronic devices in the master bedroom corresponding to the light-on intention to start the light-on function. Then, in response to the first operation, the first electronic device displays the movement of the master bedroom identifier 172 into the light-on intention identifier area 301, and instructs the X electronic devices determined in the master bedroom to execute the light-on intention.
In this way, the user can implement group control of electronic devices through a simple drag operation, which reduces the difficulty of device operation and meets the user's multi-device control requirements.
In some embodiments, the first electronic device may further display a prompt message on the first interface, where the prompt message is used to feed back the status of the X electronic devices executing the first intention.
For example, in the scenario of fig. 30, after the electronic devices in the master bedroom turn on the lights according to the signal indication, they may send a feedback signal, and the first electronic device determines whether the signal indication has been executed according to the feedback signal forwarded by the server or sent directly by the electronic devices. As shown in interface 3002 in fig. 30 (b), the first electronic device displays a prompt message according to the received feedback signal; for example, the execution status of the command is displayed in the master bedroom group identifier, such as "4 lights in the master bedroom turned on", as shown by reference numeral 304. The user can thus intuitively determine the execution status of the command from the group identifier display, which improves the user experience.
In some embodiments, the first electronic device receives a third operation of the user on the first group identifier and, in response to the third operation, displays on the first interface X electronic device identifiers respectively corresponding to the X electronic devices. Optionally, the third operation may be an operation of clicking the first group identifier.
Illustratively, as shown in fig. 19 (a), when the first electronic device (e.g., the smart home control panel) detects that the user clicks the master bedroom identifier 191, it may display interface 1902 as shown in fig. 19 (b). On interface 1902, the first electronic device displays the electronic device identifiers of the electronic devices that have executed the light-on intention (i.e., the first intention), such as a desk lamp identifier, a ceiling lamp identifier, a spotlight identifier, and a floor lamp identifier.
Optionally, in response to the third operation, the first electronic device may display the device identifiers of only those electronic devices among the X that have executed the first intention. For example, if M electronic devices execute the first intention successfully and N electronic devices fail to execute it, the first electronic device may display the identifiers of the M electronic devices. That is, the identifiers of electronic devices that did not execute, or failed to execute, the first intention may not be displayed, or may not be highlighted. In this way, the user can determine which electronic devices executed the first intention and which did not.
In some embodiments, the first electronic device receives a fourth operation of the user on the electronic device identifier corresponding to a second electronic device among the X electronic device identifiers. In response to the fourth operation, it controls the electronic device identifier corresponding to the second electronic device to move out of the first area, and instructs the second electronic device to cancel executing the first intention. Optionally, the fourth operation may be an operation of dragging the electronic device identifier of the second electronic device out of the first area.
Here, cancelling an intention may include instructing the electronic device to execute a command opposite to the intention; for example, cancelling the light-on intention instructs the electronic device to turn off the light. Cancelling an intention may also include simply no longer executing it; for example, if the intention is the constant temperature intention, cancelling it instructs the electronic device to stop executing the constant temperature intention.
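These two cancellation semantics can be sketched as a simple lookup: intentions with an opposite command are reversed, while the rest are simply stopped. The table contents and names below are illustrative assumptions.

```python
# Intentions that have a defined opposite command (illustrative).
OPPOSITE_COMMAND = {"light_on": "light_off"}

def cancel_command(intent: str) -> dict:
    """Build the instruction sent to a device when an intention is cancelled."""
    if intent in OPPOSITE_COMMAND:
        # e.g. cancelling the light-on intention turns the light off
        return {"action": OPPOSITE_COMMAND[intent]}
    # e.g. cancelling the constant temperature intention just stops it
    return {"action": "stop_executing", "intent": intent}
```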
For example, as shown in fig. 20A (a), the first electronic device detects the fourth operation of moving the electronic device identifier of the second electronic device (e.g., the desk lamp identifier 202 of the desk lamp) in a direction away from the first intention identifier (e.g., the light-on intention identifier), and may instruct the desk lamp to cancel executing the light-on intention. In this way, individual control of devices within a group can be achieved alongside group control, meeting user requirements.
In some embodiments, the first interface also displays a second intention identifier, and the first electronic device receives a fifth operation by the user of moving the second intention identifier to the first area. In response to the fifth operation, Y electronic devices in the first group are instructed to execute the second intention corresponding to the second intention identifier, where Y is a positive integer. Optionally, the Y electronic devices are identical, different, or partially identical to the X electronic devices. Optionally, the fifth operation may be an operation of dragging the second intention identifier to the first area.
For example, as shown in fig. 31 (a), the first electronic device (e.g., the smart home control panel) displays the second intention identifier (e.g., the constant temperature identifier). On detecting the fifth operation of the user moving the constant temperature identifier 311 to the first area (e.g., the light-on intention identifier area 312), it may generate a fused intention of the light-on intention and the constant temperature intention. It then determines whether electronic devices capable of executing the constant temperature intention exist in the first group (such as the master bedroom group) corresponding to the light-on intention, and if so, instructs them to execute the constant temperature intention. The electronic devices executing the constant temperature intention may be exactly the electronic devices that have executed the light-on intention, electronic devices that have not executed it, or a partially overlapping set.
In some embodiments, the first electronic device receives a sixth operation by the user on a second group identifier displayed in the first interface, the second group identifier corresponding to a second group including Z electronic devices capable of executing the first intention and the second intention, Z being a positive integer. In response to the sixth operation, the second group identifier is moved to the first area and the Z electronic devices are instructed to execute the first intention and the second intention. Optionally, the sixth operation may be an operation of dragging the second group identifier to the first area.
For example, corresponding to the scenario shown in fig. 31, after the first intention (e.g., the light-on intention) and the second intention (e.g., the constant temperature intention) are fused, a fused intention is generated, and the first area then represents the area corresponding to the fused intention. On detecting the sixth operation of moving the second group identifier to the first area, the first electronic device may determine to instruct the Z electronic devices in the second group to execute the fused intention, i.e., to execute the first intention and the second intention. For example, some of the electronic devices in the second group execute the first intention and some execute the second intention; the two sets of devices may be identical, disjoint, or partially overlapping, and together they constitute the Z electronic devices.
In some embodiments, the first electronic device receives a seventh operation of the user on the first area. In response to the seventh operation, the first group identifier and the second group identifier are controlled to move out of the first area, and the Y electronic devices and the Z electronic devices are instructed to cancel executing the first intention and the second intention.
For example, the seventh operation is a long press at any position in the first area. After detecting the seventh operation of the user, the first electronic device determines to instruct all electronic devices that have executed the first intention and the second intention to cancel executing them.
In some embodiments, the first electronic device receives an eighth operation by the user on a second area, the second area corresponding to the second intention. In response to the eighth operation, the second intention identifier is controlled to move out of the first area and the Y electronic devices are instructed to cancel executing the second intention.
Optionally, the first area includes the second area, the second area corresponding to the second intention. For example, as shown in fig. 31 (b), the fused intention identifier area 313 represents the first area corresponding to the light-on intention and the constant temperature intention; the first electronic device then divides the first area equally into hot zones according to the number of intentions included in the fused intention (here, 2). The area on the left side of the fused intention identifier area 313 is the hot zone corresponding to the light-on intention (the first intention), and the area on the right side is the hot zone corresponding to the constant temperature intention (the second intention).
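The equal-width hot-zone division can be sketched as a mapping from a touch coordinate to the corresponding fused intention. This is an illustrative sketch only; the function name and parameters are assumptions.

```python
def hot_zone_intent(x: float, area_width: float, fused_intents: list[str]) -> str:
    """Map a touch x-coordinate within the fused intention area to the
    intention whose equal-width hot zone contains it."""
    zone_width = area_width / len(fused_intents)
    index = min(int(x // zone_width), len(fused_intents) - 1)  # clamp right edge
    return fused_intents[index]
```

With two fused intentions, touches in the left half resolve to the first intention and touches in the right half to the second, matching the layout of fig. 31 (b).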
In some scenarios, a step of establishing the first group may also be included before step S3201 described above.
In some embodiments, the first electronic device displays a second interface including a plurality of electronic device identifiers respectively corresponding to a plurality of electronic devices. It receives a second operation of selecting X electronic device identifiers from the plurality of electronic device identifiers and, in response to the second operation, establishes the first group including the X electronic devices. Optionally, the electronic device identifiers corresponding to electronic devices with the same category attribute and/or location attribute among the plurality of electronic devices are displayed in an aggregated manner.
Illustratively, as shown in fig. 8 (b), the first electronic device (e.g., a mobile phone) displays an interface 802 on which a plurality of electronic device identifiers are displayed, and the identifiers of electronic devices with the same category attribute are displayed in an aggregated manner. Upon detecting a user operation selecting the identifiers of the dishwasher, the steaming box, and the oven, the three electronic devices may be divided into one group.
Alternatively, other methods for establishing the group may refer to the related content shown in fig. 10-14C, which is not described herein.
Therefore, the user can add the equipment in batches through simple operation, the operation difficulty of the user is reduced, and the use experience of the user is improved.
The intelligent device control method provided in the embodiment of the present application is described in detail above with reference to fig. 6 to 32. The following describes in detail the intelligent device control apparatus provided in the embodiment of the present application with reference to fig. 33.
In one possible design, fig. 33 is a schematic structural diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 33, the first electronic device 3300 may include: a display unit 3301, a transmitting/receiving unit 3302, and a processing unit 3303. The first electronic device 3300, as an intelligent device control apparatus, may be used to implement the functions of the first electronic device 100 or the control device 300 related to the above-described method embodiments.
Optionally, a display unit 3301 is configured to support the first electronic device 3300 to display interface content; and/or support the first electronic device 3300 to perform S3201 and S3203 in fig. 32.
Alternatively, the transceiver unit 3302 is configured to support the first electronic device 3300 to execute S3202 in fig. 32.
Optionally, the processing unit 3303 is configured to support the first electronic device 3300 to execute S3203 in fig. 32.
The transceiver unit may include a receiving unit and a transmitting unit, may be implemented by a transceiver or a transceiver related circuit component, and may be a transceiver or a transceiver module. The operations and/or functions of each unit in the first electronic device 3300 are respectively for implementing the corresponding flow of the intelligent device control method described in the above method embodiment, and all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional unit, which is not repeated herein for brevity.
Optionally, the first electronic device 3300 shown in fig. 33 may further include a storage unit (not shown in fig. 33) in which a program or instructions are stored. When the display unit 3301, the transceiver unit 3302, and the processing unit 3303 execute the program or instructions, the first electronic device 3300 shown in fig. 33 is enabled to execute the smart device control method described in the above-described method embodiment.
The technical effects of the first electronic device 3300 shown in fig. 33 may refer to the technical effects of the intelligent device control method described in the above method embodiment, and will not be described herein.
In addition to the form of the first electronic device 3300, the technical solution provided in the present application may also be a functional unit or a chip in the first electronic device, or a device that is matched with the first electronic device for use.
The embodiment of the application also provides a chip system, which comprises: a processor coupled to a memory for storing programs or instructions which, when executed by the processor, cause the system-on-a-chip to implement the method of any of the method embodiments described above.
Alternatively, the processor in the system-on-chip may be one or more. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general purpose processor, implemented by reading software code stored in a memory.
Alternatively, the memory in the system-on-chip may be one or more. The memory may be integrated with the processor or may be separate from the processor, and embodiments of the present application are not limited. For example, the memory may be a non-transitory processor, such as a ROM, which may be integrated on the same chip as the processor, or may be separately disposed on different chips, and the type of memory and the manner of disposing the memory and the processor in the embodiments of the present application are not specifically limited.
Illustratively, the chip system may be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD) or another integrated chip.
It should be understood that the steps in the above-described method embodiments may be accomplished by integrated logic circuitry in hardware in a processor or instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor or in a combination of hardware and software modules in a processor.
The embodiment of the application further provides a computer readable storage medium, in which a computer program is stored, which when executed on a computer, causes the computer to perform the above related steps to implement the smart device control method in the above embodiment.
The embodiment of the present application further provides a computer program product, which when run on a computer, causes the computer to perform the above-mentioned related steps to implement the smart device control method in the above-mentioned embodiment.
In addition, the embodiment of the application also provides a device. The apparatus may be a component or module in particular, and may comprise one or more processors and memory coupled. Wherein the memory is for storing a computer program. The computer program, when executed by one or more processors, causes the apparatus to perform the smart device control method in the method embodiments described above.
The apparatus, computer-readable storage medium, computer program product, and chip provided in the embodiments of the present application are each configured to perform the corresponding method provided above. Therefore, for the advantageous effects they can achieve, reference may be made to the advantageous effects of the corresponding methods provided above, which are not described herein again.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, or in software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable disk, a compact disc read only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application specific integrated circuit (ASIC).
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that the foregoing functional block divisions are merely illustrative, for convenience and brevity of description. In practical applications, the above functions can be allocated to different functional modules as needed; that is, the internal structure of the device can be divided into different functional modules to perform all or part of the functions described above. For the specific working processes of the systems, devices and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the several embodiments provided in this application, it should be understood that the disclosed methods may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division of the modules or units is only one logical function division, and other division manners may be adopted in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, modules or units, and may be electrical, mechanical or in other forms.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
Computer readable storage media include, but are not limited to, any of the following: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (16)
1. An intelligent device control method applied to a first electronic device is characterized by comprising the following steps:
displaying a first interface, wherein the first interface comprises a first intention identifier and a first group identifier corresponding to a first group, the first group comprises X electronic devices capable of executing a first intention corresponding to the first intention identifier, and X is a positive integer;
receiving a first operation of a user on the first group identifier;
in response to the first operation, controlling the first group identifier to move to the first intention identifier, and instructing the X electronic devices to execute the first intention.
2. The method of claim 1, wherein prior to said displaying the first interface, the method further comprises:
displaying a second interface, wherein the second interface comprises a plurality of electronic device identifiers respectively corresponding to a plurality of electronic devices;
receiving a second operation of a user selecting X electronic device identifiers from the plurality of electronic device identifiers;
in response to the second operation, establishing the first group comprising the X electronic devices.
3. The method according to claim 2, wherein the method further comprises:
displaying, in an aggregated manner, the electronic device identifiers corresponding to electronic devices having the same category attribute and/or location attribute among the plurality of electronic devices.
4. The method according to any one of claims 1-3, wherein, before the receiving of a first operation of a user on the first group identifier, the first intention identifier is displayed in a first area of the first interface, and the first group identifier is displayed in an area other than the first area; and the first operation comprises: moving the first group identifier to the first area.
5. The method according to claim 4, wherein the method further comprises:
receiving a third operation of a user on the first group identifier;
in response to the third operation, displaying, on the first interface, X electronic device identifiers respectively corresponding to the X electronic devices.
6. The method according to claim 5, wherein, after the first interface displays the X electronic device identifiers respectively corresponding to the X electronic devices, the method further comprises:
receiving a fourth operation of a user on the electronic device identifier corresponding to a second electronic device among the X electronic device identifiers;
in response to the fourth operation, controlling the electronic device identifier corresponding to the second electronic device to move out of the first area, and instructing the second electronic device to cancel execution of the first intention.
7. The method according to any one of claims 1-6, wherein the category attributes and/or location attributes of the X electronic devices are the same.
8. The method according to any one of claims 1-7, further comprising:
displaying prompt information on the first interface, wherein the prompt information is used to feed back the status of the X electronic devices executing the first intention.
9. The method according to claim 4, wherein the first interface further displays a second intention identifier, and the method further comprises:
receiving a fifth operation of a user moving the second intention identifier to the first area;
in response to the fifth operation, instructing Y electronic devices in the first group to execute a second intention corresponding to the second intention identifier, wherein Y is a positive integer.
10. The method of claim 9, wherein after receiving the fifth operation, the method further comprises:
receiving a sixth operation of a user on a second group identifier displayed in the first interface, the second group identifier corresponding to a second group, the second group comprising Z electronic devices capable of executing the first intention and the second intention, wherein Z is a positive integer;
in response to the sixth operation, controlling the second group identifier to move to the first area, and instructing the Z electronic devices to execute the first intention and the second intention.
11. The method according to claim 10, wherein the method further comprises:
receiving a seventh operation of a user on the first area;
in response to the seventh operation, controlling the first group identifier and the second group identifier to move out of the first area, and instructing the Y electronic devices and the Z electronic devices to cancel execution of the first intention and the second intention.
12. The method according to claim 9, wherein the method further comprises:
receiving an eighth operation of a user on a second area, the second area corresponding to the second intention;
in response to the eighth operation, controlling the second intention identifier to move out of the first area, and instructing the Y electronic devices to cancel execution of the second intention.
13. The method according to any one of claims 1-12, wherein:
the third operation is a click operation;
the fourth operation, the fifth operation, the sixth operation, and the eighth operation are drag operations; and
the seventh operation is a long-press operation.
14. An electronic device, comprising: a processor, a memory and a display screen, the memory and the display screen being coupled to the processor, the memory being for storing computer program code, the computer program code comprising computer instructions which, when read from the memory by the processor, cause the electronic device to perform the method of any of claims 1-13.
15. A computer readable storage medium, characterized in that the computer readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the method according to any one of claims 1-13.
16. A computer program product, characterized in that the computer program product, when run on a computer, causes the computer to perform the method according to any of claims 1-13.
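The interaction claimed in claim 1 — dragging a group identifier onto an intention identifier so that every device in the group executes that intention — can be illustrated with a minimal sketch. The class and method names below are illustrative assumptions for explanation only, not part of any claimed implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    """One of the X electronic devices in a group (hypothetical model)."""
    name: str
    executed: list = field(default_factory=list)

    def execute(self, intention: str) -> None:
        # Stand-in for sending a real control command to the device.
        self.executed.append(intention)

@dataclass
class Group:
    """A first group: X devices that can all execute a given intention."""
    identifier: str
    devices: list

class FirstInterface:
    """Models the first interface: intention identifiers occupy a first
    area; group identifiers are displayed outside it until dragged in."""

    def __init__(self) -> None:
        # Maps each intention to the group identifiers moved onto it.
        self.first_area: dict = {}

    def drag_group_to_intention(self, group: Group, intention: str) -> None:
        # The first operation of claim 1: move the group identifier to the
        # intention identifier, then instruct all X devices to execute it.
        self.first_area.setdefault(intention, []).append(group)
        for device in group.devices:
            device.execute(intention)

lamps = Group("living-room lamps", [Device("lamp-1"), Device("lamp-2")])
ui = FirstInterface()
ui.drag_group_to_intention(lamps, "turn on")
```

A single drag thus fans one intention out to every device in the group, which is the point of grouping devices by shared capability (claims 2-3) before binding them to intentions.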
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111355520.0A CN116136659A (en) | 2021-11-16 | 2021-11-16 | Smart device control method and electronic device |
PCT/CN2022/127999 WO2023088061A1 (en) | 2021-11-16 | 2022-10-27 | Smart device control method and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111355520.0A CN116136659A (en) | 2021-11-16 | 2021-11-16 | Smart device control method and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116136659A true CN116136659A (en) | 2023-05-19 |
Family
ID=86332893
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111355520.0A Pending CN116136659A (en) | 2021-11-16 | 2021-11-16 | Smart device control method and electronic device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116136659A (en) |
WO (1) | WO2023088061A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117707242A (en) * | 2023-07-11 | 2024-03-15 | 荣耀终端有限公司 | Temperature control methods and related devices |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5956030A (en) * | 1993-06-11 | 1999-09-21 | Apple Computer, Inc. | Computer system with graphical user interface including windows having an identifier within a control region on the display |
US20110301764A1 (en) * | 2008-12-10 | 2011-12-08 | Somfy Sas | Method of learning a device for controlling home-automation equipment of a building |
CN105355011A (en) * | 2014-08-18 | 2016-02-24 | 颂莱视听工程股份有限公司 | Wireless environment control system |
CN105607499A (en) * | 2016-01-05 | 2016-05-25 | 北京小米移动软件有限公司 | Equipment grouping method and apparatus |
CN106919305A (en) * | 2017-03-03 | 2017-07-04 | 广东星美灿照明科技股份有限公司 | A method for managing and controlling a smart home |
CN107273119A (en) * | 2017-05-31 | 2017-10-20 | 广东星美灿照明科技股份有限公司 | Computer-readable storage medium and smart terminal for users to configure controlled devices of a smart home system |
CN107896179A (en) * | 2016-10-01 | 2018-04-10 | 杭州鸿雁智能科技有限公司 | Group control method for home devices |
CN111897462A (en) * | 2020-07-17 | 2020-11-06 | 深圳市致趣科技有限公司 | Intelligent display control method and device in intelligent household APP |
CN112398710A (en) * | 2020-11-05 | 2021-02-23 | 木林森股份有限公司 | Intelligent device, group adding device, system and method, and storage medium |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105376125B (en) * | 2015-12-08 | 2018-12-18 | 深圳众乐智府科技有限公司 | Smart home system control method and device |
CN105611047A (en) * | 2015-12-16 | 2016-05-25 | 芜湖美智空调设备有限公司 | Shortcut control method and device based on mobile terminal |
KR20180052347A (en) * | 2016-11-10 | 2018-05-18 | 삼성전자주식회사 | Voice recognition apparatus and method |
CN108845503A (en) * | 2018-08-11 | 2018-11-20 | 深圳市百创网络科技有限公司 | The providing method and its system of Intelligent household scene service |
CN112799305A (en) * | 2019-11-13 | 2021-05-14 | 北京安云世纪科技有限公司 | Intelligent household control method and system |
CN110944236B (en) * | 2019-11-29 | 2021-11-30 | 维沃移动通信有限公司 | Group creation method and electronic device |
CN111752165B (en) * | 2020-07-10 | 2024-08-27 | 广州博冠智能科技有限公司 | Intelligent equipment control method and device of intelligent home system |
CN113055255A (en) * | 2020-12-25 | 2021-06-29 | 青岛海尔科技有限公司 | Scene configuration method and device of intelligent household appliance, storage medium and electronic equipment |
- 2021-11-16: CN application CN202111355520.0A filed (published as CN116136659A), status: Pending
- 2022-10-27: PCT application PCT/CN2022/127999 filed (published as WO2023088061A1), status: Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2023088061A1 (en) | 2023-05-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111752443B (en) | Method, related device and system for controlling page of display equipment | |
WO2020253695A1 (en) | Smart home device access method and electronic device | |
US10091020B2 (en) | Electronic device and gateway for network service, and operation method therefor | |
CN111614524A (en) | Method, device and system for linkage control of multiple intelligent devices | |
US20150257104A1 (en) | Method for controlling beacon signal of electronic device and electronic device thereof | |
CN110795179A (en) | A display method and electronic device | |
WO2021238933A1 (en) | Control method applied to electronic device, and electronic device | |
CN114727416A (en) | Wireless communication method and electronic device providing the same | |
WO2020155870A1 (en) | Device control method and devices | |
WO2020133467A1 (en) | Method for smart home appliance to access network and related device | |
US11412555B2 (en) | Mobile terminal | |
CN113810542B (en) | Control method applied to electronic equipment, electronic equipment and computer storage medium | |
KR102444897B1 (en) | Device and method for establishing communication connection | |
CN115509139A (en) | Equipment control method, related device and communication system | |
CN117092921B (en) | Scene setting method and electronic device | |
CN114035721B (en) | Touch screen display method, device and storage medium | |
CN115708344B (en) | Method for sharing remote control between devices, electronic device and storage medium | |
EP4195708A1 (en) | Movement trajectory generation method and apparatus | |
CN104950716A (en) | A device control method and device | |
WO2023071454A1 (en) | Scenario synchronization method and apparatus, and electronic device and readable storage medium | |
CN116136659A (en) | Smart device control method and electronic device | |
WO2022052713A1 (en) | Interaction method and apparatus, and electronic device | |
CN205647813U (en) | Home smart theater based on WiFi wireless networking | |
WO2024027713A1 (en) | Scenario configuration method, electronic device and system | |
WO2023226923A1 (en) | Method for controlling plc device, and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||