GB2525398A - Stand mixer controls - Google Patents
Stand mixer controls
- Publication number
- GB2525398A GB2525398A GB1407093.2A GB201407093A GB2525398A GB 2525398 A GB2525398 A GB 2525398A GB 201407093 A GB201407093 A GB 201407093A GB 2525398 A GB2525398 A GB 2525398A
- Authority
- GB
- United Kingdom
- Prior art keywords
- gesture
- user
- stand mixer
- database
- hand
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47J—KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
- A47J43/00—Implements for preparing or holding food, not provided for in other groups of this subclass
- A47J43/04—Machines for domestic use not covered elsewhere, e.g. for grinding, mixing, stirring, kneading, emulsifying, whipping or beating foodstuffs, e.g. power-driven
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B01—PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
- B01F—MIXING, e.g. DISSOLVING, EMULSIFYING OR DISPERSING
- B01F27/00—Mixers with rotary stirring devices in fixed receptacles; Kneaders
- B01F27/05—Stirrers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B01—PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
- B01F—MIXING, e.g. DISSOLVING, EMULSIFYING OR DISPERSING
- B01F27/00—Mixers with rotary stirring devices in fixed receptacles; Kneaders
- B01F27/60—Mixers with rotary stirring devices in fixed receptacles; Kneaders with stirrers rotating about a horizontal or inclined axis
- B01F27/65—Mixers with rotary stirring devices in fixed receptacles; Kneaders with stirrers rotating about a horizontal or inclined axis with buckets
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47J—KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
- A47J43/00—Implements for preparing or holding food, not provided for in other groups of this subclass
- A47J43/04—Machines for domestic use not covered elsewhere, e.g. for grinding, mixing, stirring, kneading, emulsifying, whipping or beating foodstuffs, e.g. power-driven
- A47J43/044—Machines for domestic use not covered elsewhere, e.g. for grinding, mixing, stirring, kneading, emulsifying, whipping or beating foodstuffs, e.g. power-driven with tools driven from the top side
- A47J2043/04454—Apparatus of counter top type
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47J—KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
- A47J36/00—Parts, details or accessories of cooking-vessels
- A47J36/32—Time-controlled igniting mechanisms or alarm devices
- A47J36/321—Time-controlled igniting mechanisms or alarm devices the electronic control being performed over a network, e.g. by means of a handheld device
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47J—KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
- A47J43/00—Implements for preparing or holding food, not provided for in other groups of this subclass
- A47J43/04—Machines for domestic use not covered elsewhere, e.g. for grinding, mixing, stirring, kneading, emulsifying, whipping or beating foodstuffs, e.g. power-driven
- A47J43/044—Machines for domestic use not covered elsewhere, e.g. for grinding, mixing, stirring, kneading, emulsifying, whipping or beating foodstuffs, e.g. power-driven with tools driven from the top side
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47J—KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
- A47J43/00—Implements for preparing or holding food, not provided for in other groups of this subclass
- A47J43/04—Machines for domestic use not covered elsewhere, e.g. for grinding, mixing, stirring, kneading, emulsifying, whipping or beating foodstuffs, e.g. power-driven
- A47J43/07—Parts or details, e.g. mixing tools, whipping tools
- A47J43/0705—Parts or details, e.g. mixing tools, whipping tools for machines with tools driven from the upper side
Abstract
A stand mixer 100 comprises a gesture control unit 302 for remotely controlling the stand mixer 100 by means of hand signals. The gesture control unit comprises a commands database 404, a sensing unit 402 and a command processing unit 406. The commands database may preferably comprise a plurality of commands corresponding to a range of gestures and voice commands, with each command corresponding to a specific function of the stand mixer. The commands database further comprises a gesture database maintaining the plurality of gestures. The sensing unit is configured for capturing data of at least one of a gesture performed by a user and a voice command given by the user. The sensing unit is configured to process the captured data. The processing includes mapping motion corresponding to the gesture performed by the user to a known gesture using the gesture database, and mapping the known gesture to an associated command using the commands database. The processing also includes mapping the voice command given by the user to the voice commands in the commands database. The command processing unit may preferably be configured for sending the associated command for performing a plurality of corresponding functions.
Description
GESTURE CONTROLLED STAND MIXER
DESCRIPTION
[0001] This invention relates to a stand mixer device and, more particularly, to a stand mixer control and display.
[0002] A stand mixer is a multipurpose kitchen appliance which reduces the physical work required during food preparation and cooking. The stand mixer is a motor-driven kitchen machine which is used to mix, or otherwise process, ingredients in a bowl by means of planetary movement of one or more shanked tools which extend into the bowl from a head unit that carries a downwardly-facing drive outlet, powered by the motor, to which the shank, or shanks, of the tool, or tools, are fitted. Historically, stand mixers offered mechanical controls such as knobs and switches. However, consumers found these mechanical controls inconvenient to operate. The mechanical controls often become stiff, so consumers have to put in additional manual effort to operate them. Further, due to the flexibility in the variety of tools and attachments offered by the equipment, the equipment itself is quite heavy.
Therefore, using these stand mixers for daily cooking is quite difficult due to the head-lift and tool-change requirements, which demand physical effort.
[0003] This resulted in infrared (IR) based remote controls and touch-sensitive input methods, which allowed consumers to operate the devices from within a specific distance. However, these controls also have certain disadvantages: operating the remote controls in the dark is not possible, and it is uncomfortable to use touch-based controls while cooking in the kitchen or while dining.
[0004] It is known that, using motion sensors/detectors, consumers may operate televisions, lights or other devices which recognise the presence of a person. However, this method is disadvantageous because the actions had to be configured and changed manually to adjust to specific user requirements; they were not dynamic. Further, these sensors only worked as a binary sensing means, to operate or not. Hence, there is a need to operate kitchen appliances such as the stand mixer when the hands are dirty, and to maintain hygiene while cooking.
[0005] Gesture control technology is known in the art. The technology helps users to operate devices using gestures without touching any control. The technology has been used in entertainment devices (televisions, laptops) and in automobiles. Therefore, a gesture-controlled kitchen appliance gives extra flexibility in addition to the already-existing multi-use feature of the stand mixer. However, the implementation of such gesture control technology in a kitchen environment or in small household equipment has unique demands if the technology is to be realised efficiently and without interference.
[0006] In view of the problems and shortcomings of the existing solutions, the present invention includes a system for recognising user commands without the user being in physical contact with the machine controls. This is achieved through gesture control and voice control. More specifically, the solution is provided in a way that mitigates the existing problems around implementing such solutions within the home environment for appliances.
[0007] It is an object of this invention to provide a gesture control unit that provides contactless control using hand motions. Since the user may operate the stand mixer by no-contact means, any discomfort is alleviated. Gesture control may also simplify the user interface: instead of controls requiring rotation or pushing, simple hand motions can achieve the same objective.
Further, the invention provides a dial-in mode for security. This allows a user to choose when to use voice and gesture control and when to use manual controls, providing safety at home when children are around.
[0008] According to the invention, a stand mixer 100 includes a gesture control unit 302 for remotely controlling the stand mixer 100. The gesture control unit 302 comprises a commands database 404, a sensing unit 402 and a command processing unit 406. The commands database 404 comprises a plurality of commands corresponding to a plurality of gestures, each command corresponding to a specific function of the stand mixer 100. The commands database 404 further comprises a gesture database maintaining the plurality of gestures. The sensing unit 402 is configured for capturing data of a gesture performed by a user and processing the captured data.
The processing comprises mapping motion corresponding to the gesture performed by the user to a known gesture using the gesture database, and mapping the known gesture to an associated command using the commands database. The command processing unit 406 is configured for sending the associated command for performing a corresponding function. The function is one of: switching on/off one or more of a high-speed blender drive outlet, a slow-speed mincer drive outlet and a planetary drive outlet; selecting speed of operation; and selecting a tool for the planetary drive outlet, wherein the tool is one of a K-beater, a whisk, a dough hook and a spatula.
[0009] Technical advantages of particular embodiments include the ability of the stand mixer to detect gestures and perform different functions, operations or tasks. The gestures may be mapped to a large number of commands by using a display to show the various user-selectable functions. Accordingly, the functionality of the device may be increased.
[0010] Other technical advantages will be readily apparent to one skilled in the art from the following figures, descriptions and claims. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some or none of the enumerated advantages.
[0011] In order that the invention may be clearly understood and readily carried into effect, certain embodiments thereof will now be described, by way of example only, with reference to the accompanying drawings, of which: Fig. 1 shows, in perspective view, one example of a stand mixer of the invention; Fig. 2 shows another perspective view of the stand mixer of the invention; Fig. 3 is a block diagram of a gesture-based control system according to the present invention; Fig. 4 is a block diagram of a gesture-based control system; Fig. 5 is a flowchart of a method for controlling a stand mixer according to the present invention; and Fig. 6 illustrates images of different hand gestures that may be used to operate a stand mixer according to the present invention.
[0012] Referring now to Figs. 1 and 2, in which corresponding features carry the same reference numbers, the stand mixer 100 comprises a pedestal 102 which supports a bowl platform 104 and a housing 106. The housing 106 encloses, either in an upright casing part 108 or in a header unit 110, an electric drive motor and gearing (neither shown) which provide motive power to a plurality of drive outlets to which various tools can be attached to perform a wide variety of tasks in the kitchen.
[0013] In this particular example, there is provided a high-speed blender drive outlet behind covers 112, a slow-speed mincer drive outlet behind cover 114 and a planetary drive outlet 116, disposed beneath a gearbox casing 118, the planetary drive being intended for food mixing and thus being disposed above the bowl location. It will readily be appreciated, however, that more, fewer and/or different drive outlets can be provided in accordance with the desired functionality of the stand mixer.
[0014] A shanked mixing tool, attached as is conventional to a socket comprised in the planetary drive outlet 116, depends in use into a mixing bowl placed on the bowl platform 104, and is configured to rotate about both the axis of the planetary drive outlet 116 and a central axis, thus performing a planetary mixing action. The tool may be one of a K-beater, a whisk, a dough hook and a spatula. The K-beater is used for making cakes, biscuits, pastry, icing, fillings, éclairs and mashed potato. The whisk is used for eggs, cream, batters, fatless sponges, meringues, cheesecakes, mousses and soufflés. The dough hook is used for yeast mixtures. The necessary relationships between the relative shapes and dimensions of the bowl and the mixing tool to ensure thorough and repeatable mixing of ingredients are well known and established through many years of use.
[0015] As shown in Figs. 1 and 2, the stand mixer 100 is, in this example, provided with a pair of latches 120, 122 within a recess 124 provided in the bowl platform 104, which latches co-operate with components on the base of the bowl to form a bayonet latching system which ensures firm and ready location of the bowl on its platform. Other latching systems, such as screw-threading for example, can be used as an alternative to bayonet latching if preferred.
[0016] The upright part 108 of the housing 106 is configured with a break line 126, to permit the header part 110 of the stand mixer to be hinged away from the bowl platform 104 end of the pedestal part 102, in order to facilitate the insertion and removal of the mixing tools and the bowl.
[0017] In the pedestal 102, beneath the bowl platform 104, is provided a heating means (not shown) capable of generating sufficient heat to cook ingredients in the bowl. Typically, and indeed preferably, the bowl is of metal and the heating means comprises an inductive heater.
Alternative or additional heating technologies can be employed, however, such as thick-film heaters, halogen heaters and suitably configured sheathed resistance heaters.
[0018] If desired, the heating means may comprise relatively low and relatively high wattage settings, to enable the stand mixer arrangement to be used for slow cooking as well as for attended cooking programmes. It will be appreciated that, in such circumstances, the relatively low wattage setting may be provided by energising just a portion of a relatively high wattage heater element. Alternatively, the lower wattage may be provided by a separate element, or by cyclic interruption of the power to a single element.
[0019] The stand mixer 100 also incorporates a sensor 128 located on the exterior surface at one side of the housing 106. The sensor 128 captures data of a gesture performed by a user in a predetermined region near the stand mixer 100. The sensor 128 may be configured to capture one or more of a video signal, an audio signal, an infrared signal, an electric field signal or the like.
A gesture performed by the user may include a specific motion performed by the user's hand in a specific configuration. Some examples of gestures performed by the user are explained in conjunction with Fig. 6 below. In an alternate embodiment, the sensor 128 is available on a smart communication device (for example, a mobile phone, tablet and the like). The smart communication device receives data from the sensor 128 and wirelessly communicates the same to the stand mixer 100. Accordingly, the stand mixer 100 is capable of wirelessly communicating with the smart communication device. Further, the smart communication device may also process the data to determine the user command.
[0020] The user may perform gestures corresponding to one or more available functions: switching on/off one or more of a high-speed blender drive outlet, a slow-speed mincer drive outlet and a planetary drive outlet; selecting speed of operation; and selecting a tool for the planetary drive outlet, wherein the tool is one of a K-beater, a whisk, a dough hook and a spatula.
Further, pre-stored settings may be activated using gestures. For example, when using the K-beater for creaming fat and sugar, the pre-stored setting may be to start on minimum speed (min) and then gradually increase to maximum speed (max). For beating eggs into creamed mixtures, the pre-stored setting may be to start and keep the speed at max. For folding in flour, fruit etc., the pre-stored setting may be to start and keep the speed at value "1". For cakes, the pre-stored setting may be to start on min speed and gradually increase to max. Similarly, for using the whisk, the pre-stored setting may be to start and keep the speed at max, and for using the dough hook, the pre-stored setting may be to start on min and gradually increase the speed to value "1". In another embodiment, the various tools may be included in the housing 106, wherein, based on a gesture performed by a user, a specific tool may be automatically pulled out of the housing 106 and placed in position for operation.
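The pre-stored settings described above amount to a small lookup table of speed profiles per task. The sketch below is an illustrative assumption (the profile keys and data structure are not from the patent); only the tool names and the "min"/"max"/"1" speed levels come from the text.

```python
# Hypothetical encoding of the pre-stored speed profiles a gesture may activate.
# Keys and structure are illustrative; speed levels follow the description.
SPEED_PROFILES = {
    "cream_fat_and_sugar": {"tool": "K-beater",   "start": "min", "ramp_to": "max"},
    "beat_in_eggs":        {"tool": "K-beater",   "start": "max", "ramp_to": "max"},
    "fold_in_flour":       {"tool": "K-beater",   "start": "1",   "ramp_to": "1"},
    "cakes":               {"tool": "K-beater",   "start": "min", "ramp_to": "max"},
    "whisking":            {"tool": "whisk",      "start": "max", "ramp_to": "max"},
    "kneading":            {"tool": "dough hook", "start": "min", "ramp_to": "1"},
}

def profile_for(task: str) -> dict:
    """Look up the pre-stored setting that a recognised gesture would activate."""
    return SPEED_PROFILES[task]
```

A gesture mapped to, say, the whisking task would then start the planetary drive at max and hold it there.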
[0021] The output of the sensor 128 is coupled to a processor for receiving the sensor input and performing gesture recognition. The processor is configured to map motion corresponding to the gesture performed by the user to a known gesture using a gesture database maintaining a plurality of gestures, then map the known gesture to an associated command using a commands database maintaining a plurality of commands, and then send the associated command for performing the associated function. This is explained in further detail in conjunction with Figs. 3, 4 and 5 below.
[0022] The stand mixer 100 may further include a display 130 to provide feedback to the user about a detected gesture. The stand mixer 100 may also include one or more LEDs 132 to provide feedback to the user about a detected gesture. The stand mixer 100 may further include one or more manual controls 134 that may be used when the gesture control does not work as expected.
[0023] The sensor 128 may be further configured to capture a voice command given by a user 306. Accordingly, the processor is configured to map a voice command given by the user 306 to a known voice command using a voice command database maintaining a plurality of voice commands, wherein the commands database 404 includes the voice command database. Further, the stand mixer 100 may include a dial-in mode that allows the user to select one or more of gesture control, voice control and manual control.
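The voice-command mapping above can be sketched as a simple database lookup. This is a minimal sketch under stated assumptions: the phrases and command names below are invented for illustration and are not taken from the patent.

```python
# Hypothetical voice-command database; phrases and command names are assumptions.
VOICE_COMMANDS = {
    "start mixing": "planetary_drive_on",
    "stop": "all_drives_off",
    "speed max": "set_speed_max",
}

def map_voice_command(phrase: str):
    """Normalise a captured phrase and map it to a known command, or None."""
    return VOICE_COMMANDS.get(phrase.strip().lower())
```

An unrecognised phrase simply yields no command, leaving the mixer's state unchanged.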
[0024] Referring now to Figs. 3 and 4, the stand mixer 100 includes the gesture control unit 302, which interacts with a control circuitry 304. The gesture control unit 302 is configured to detect a gesture and send a corresponding command to the control circuitry 304 to execute the corresponding function. A user 306 interacts with the stand mixer 100 by performing gestures.
[0025] The gesture control unit 302 further includes other enabling blocks, including the sensing unit 402, the commands database 404 and the command processing unit 406. The commands database 404 maintains a plurality of commands corresponding to a plurality of gestures, each command corresponding to a specific function of the stand mixer 100. The commands database 404 further comprises a gesture database maintaining a plurality of the gestures. Each gesture in the gesture database is defined by a trajectory of a user's hand. Alternatively, the gesture database may be part of the commands database 404.
[0026] The sensing unit 402 receives data captured by the sensor 128. The data corresponds to a gesture performed by the user 306. Then, the sensing unit 402 maps motion corresponding to the gesture performed by the user 306 to a known gesture using the gesture database.
Accordingly, the sensing unit 402 is configured to analyse signals in the captured data for the possible presence of a user's hand and, when a hand is detected, to follow the hand to record a trajectory, wherein the recorded trajectory is compared with trajectories stored in the gesture database to detect the known gesture corresponding to the gesture performed by the user 306.
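The trajectory comparison described above can be illustrated with a nearest-neighbour sketch. This is an assumption about one plausible realisation, not the patent's implementation: trajectories are taken to be equal-length lists of (x, y) hand positions, and the closest stored gesture within a distance threshold is reported.

```python
import math

def trajectory_distance(a, b):
    """Mean point-to-point Euclidean distance between two equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def match_gesture(recorded, gesture_db, threshold=0.5):
    """Compare a recorded hand trajectory with trajectories stored in the
    gesture database; return the closest known gesture, or None if nothing
    is close enough (the threshold value is an illustrative assumption)."""
    best_name, best_dist = None, float("inf")
    for name, stored in gesture_db.items():
        d = trajectory_distance(recorded, stored)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None
```

In practice a real sensing unit would first resample trajectories to a common length and could use a more robust measure such as dynamic time warping, but the database-comparison principle is the same.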
[0027] The gesture control unit 302 may further include a display unit 408 that sends information about the detected motion to the display 130. This provides feedback that helps the user 306 to perform the gesture correctly.
[0028] Next, the sensing unit 402 maps the known gesture to an associated command using the commands database 404. Based on the match, the command processing unit 406 sends the associated command to the control circuitry 304. Finally, the control circuitry 304 executes the command to perform a suitable action as desired by the user 306. Additionally, the executed commands may be displayed through the display unit 408 integrated with the gesture control unit 302.
[0029] In an alternate embodiment of the present disclosure, the display 130 shows available functions. The user 306 moves a cursor on the display 130 by performing a known gesture to select an available function. For example, the user 306 may move an open hand to move the cursor on the display 130 and select an available function. Once the cursor is in the right position, the user 306 may perform another gesture to activate the selected function.
[0030] Referring now to Fig. 5, a flowchart 500 illustrates a gesture control process utilising a number of features described herein, in accordance with a particular embodiment. At step 502, raw data of a particular gesture movement performed by the user 306 is received. The raw data is processed at step 504, where the actual trajectory is determined. At step 506, the determined trajectory is mapped to a known gesture in the gesture database. For example, a detected hand motion may be mapped to a hand wave gesture. Further, mapping the trajectory to a gesture may include accessing a user settings database, which may include user data comprising, for example, user precision and noise characteristics or thresholds, user-created gestures and any other user-specific data, including user identities. User-specific information allows different users of the stand mixer 100 to have different settings and motion input characteristics. For example, a child may have less precision than an adult when inputting gestures, such that the child may have fewer gestures available. Moreover, a more experienced user may have more functions available through gesture input.
[0031] At step 508, the sensed action or gesture is mapped to an associated command. This step may include accessing the commands database 404, which may include correlations between gestures and commands. Further, different users may have different mappings of gestures to commands and different user-created commands. Thus, the commands database 404 may also include user-specific mapping instructions or characteristics, user-created functions and any other function information which may be applicable to map a particular gesture to one or more commands. The commands database 404 may further include user identities to identify users providing gestures and to select appropriate commands.
[0032] At step 510, the control circuitry 304 executes the appropriately mapped one or more commands to perform a suitable function as desired by the user 306.
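Steps 502-510 can be sketched end to end as a single function. This is a minimal sketch under stated assumptions: the databases are plain dictionaries, and the trajectory is assumed to have already been quantised into a lookup key (the real step 504 would extract it from raw sensor data); all names below are illustrative, not from the patent.

```python
def control_stand_mixer(trajectory_key, user, gesture_db, commands_db,
                        user_settings, control_circuitry):
    """Steps 502-510: trajectory -> known gesture -> associated command -> execute."""
    # Step 506: map the trajectory to a known gesture, restricted to the
    # gestures available to this user (e.g. a child may have fewer gestures).
    allowed = user_settings.get(user, {}).get("allowed_gestures",
                                              set(gesture_db.values()))
    gesture = gesture_db.get(trajectory_key)
    if gesture is None or gesture not in allowed:
        return None  # no known gesture recognised for this user

    # Step 508: map the gesture to its associated command, preferring any
    # user-specific mapping over the default one in the commands database.
    overrides = user_settings.get(user, {}).get("command_overrides", {})
    command = overrides.get(gesture, commands_db.get(gesture))
    if command is None:
        return None

    # Step 510: the control circuitry executes the mapped command.
    control_circuitry(command)
    return command
```

The per-user `allowed_gestures` and `command_overrides` entries mirror the user settings database described at steps 506 and 508.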
[0033] Fig. 6 illustrates some exemplary gestures that may be stored in the gesture database of the present invention. For example, to trigger the gesture control unit 302 to initialize, a hand 602 may be raised in a region within the field of view 604 of the sensor 128 and a hand wave gesture 606 may be performed for 2 to 5 seconds.
[0034] Once the system is initialized, the user 306 may perform one or more gestures including a heart-shaped hand gesture 608, a right-angle-shaped hand gesture 610, a cross finger gesture 612, a hand rotating a knob gesture 614, a thumbs-up gesture 616, a clapping hands gesture 618, an open hand pushing-down gesture 620 and an open hand pushing-up gesture 622. Each of the gestures 606-622 may correspond to one or more functions of the stand mixer 100.
[0035] Alternatively, an open hand may be used to control a cursor by tracking free hand movement (the cursor is visible on the display 130). The display 130 shows the various available functions. The user 306 may select the appropriate function by moving the hand and placing the cursor on the setting on the display 130. Thereafter, the user 306 may perform a press-and-release to activate the setting: for example, holding an open hand over the appropriate setting, pressing forward until the function is selected and releasing by pulling back slightly to complete the activation.
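The press-and-release activation above can be sketched as a tiny state machine. This is an illustrative assumption, not the patent's implementation: the hand's forward displacement toward the sensor is taken as a sequence of depth samples, and the threshold values are invented for the example.

```python
def detect_activation(depth_samples, press_threshold=0.3, release_backoff=0.05):
    """Scan forward-displacement samples of the hand (arbitrary units);
    return True once a press past the threshold is followed by a slight
    pull-back (the release), completing the activation."""
    pressed = False
    for depth in depth_samples:
        if not pressed and depth >= press_threshold:
            pressed = True   # hand pushed forward onto the setting
        elif pressed and depth <= press_threshold - release_backoff:
            return True      # hand pulled back slightly: activation complete
    return False
```

A press that is never released (or a hand that never reaches the threshold) produces no activation, which matches the two-phase gesture the description requires.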
Claims (28)
- CLAIMS
What is claimed is:
- 1. A stand mixer (100) comprising a gesture control capability, wherein the stand mixer comprises: a sensor (128) for capturing data of a gesture performed by a user (306); a processor for processing the captured data, the processor configured to: map motion corresponding to the gesture performed by the user (306) to a known gesture using a gesture database maintaining a plurality of gestures; map the known gesture to an associated command using a commands database (404) maintaining a plurality of commands, wherein each command corresponds to a specific function of the stand mixer (100); and send the associated command; and a control circuitry (304) for receiving the associated command and performing the associated function; characterized in that the function is one of switching on/off one or more of a high-speed blender drive outlet, a slow-speed mincer drive outlet and a planetary drive outlet; selecting speed of operation; and selecting a tool for the planetary drive outlet.
- 2. The stand mixer (100) of claim 1, wherein the gesture performed by the user (306) includes a specific motion performed by the user's hand in a specific configuration.
- 3. The stand mixer (100) of claim 1, wherein the plurality of gestures in the gesture database includes two or more of a hand wave gesture (606), a heart-shaped hand gesture (608), a right-angle-shaped hand gesture (610), a cross finger gesture (612), a hand rotating a knob gesture (614), a thumbs-up gesture (616), a clapping hands gesture (618), an open hand pushing-down gesture (620) and an open hand pushing-up gesture (622).
- 4. The stand mixer (100) of claim 1, wherein the sensor (128) captures the data in a predetermined region near the stand mixer (100).
- 5. The stand mixer (100) of claim 1, wherein each gesture in the gesture database is defined by a trajectory of a user's hand.
- 6. The stand mixer (100) of claim 5, wherein the processor is configured to analyze signals in the captured data for the possible presence of a user's hand and, when a hand is detected, follow the hand to record a trajectory, wherein the recorded trajectory is compared with trajectories stored in the gesture database to detect the known gesture corresponding to the gesture performed by the user (306).
- 7. The stand mixer (100) of claim 1, wherein the processor is further configured to map motion to a known gesture by identifying the user (306) and matching the motion to the known gesture based on the user identity, wherein the commands database (404) comprises a plurality of gestures for each user using the stand mixer (100).
- 8. The stand mixer (100) of claim 1, wherein the sensor (128) is available on a smart communication device, wherein the smart communication device receives data from the sensor (128) and wirelessly communicates the same to the stand mixer (100), wherein the stand mixer (100) is capable of wirelessly communicating with the smart communication device.
- 9. The stand mixer (100) of claim 8, wherein the smart communication device further comprises the processor for processing the captured data, wherein the processor processes the captured data and wirelessly sends the associated command to the stand mixer (100).
- 10. The stand mixer (100) of claim 1, further comprising a display unit (408) to provide feedback on the recognised associated gesture.
- 11. The stand mixer (100) of claim 10, wherein the display unit (408) shows available functions, wherein the user (306) moves a cursor on the display unit (408) by performing a known gesture to select an available function and activate the selected function.
- 12. The stand mixer (100) of claim 1, wherein the sensor (128) is placed on a side face of the stand mixer (100), wherein the sensor (128) captures data in a predetermined region on the side of the stand mixer (100).
- 13. The stand mixer (100) of claim 1, wherein the sensor (128) is further configured to capture a voice command given by the user (306), wherein the processor is configured to map a voice command given by the user (306) to a known voice command using a voice command database maintaining a plurality of voice commands, wherein the commands database (404) includes the voice command database.
- 14. The stand mixer (100) of claim 1, further including a dial-in mode that allows the user to select one or more of gesture control, voice control and manual control.
- 15. A gesture control unit (302) for remote controlling a stand mixer (100), wherein the gesture control unit (302) comprises: a commands database (404) maintaining a plurality of commands corresponding to a plurality of gestures, wherein each command corresponds to a specific function of the stand mixer (100), and wherein the commands database (404) further comprises a gesture database maintaining a plurality of the gestures; a sensing unit (402) for: capturing data of a gesture performed by a user (306); and processing the captured data, comprising: mapping motion corresponding to the gesture performed by the user (306) to a known gesture using the gesture database; and mapping the known gesture to an associated command using the commands database (404); and a command processing unit (406) for sending the associated command for performing a corresponding function; characterized in that the function is one of switching on/off one or more of a high-speed blender drive outlet, a slow-speed mincer drive outlet and a planetary drive outlet; selecting speed of operation; and selecting a tool for the planetary drive outlet.
- 16. The gesture control unit (302) of claim 15, wherein each gesture in the gesture database is defined by a trajectory of a user's hand, wherein the sensing unit (402) is configured to analyze signals in the captured data for possible use of a user hand within it, and when a hand is detected, follow the hand to record a trajectory, wherein the recorded trajectory is compared with trajectories stored in the gesture database to detect the known gesture corresponding to the gesture performed by the user (306).
- 17. The gesture control unit (302) of claim 15, further comprising a display unit (408) to show the executed commands.
- 18. The gesture control unit (302) of claim 17, wherein the display unit (408) to provide feedback on the recognised associated gesture.
- 19. The gesture control unit (302) of claim 17, wherein the display unit (408) configured to display available functions, wherein the user (306) moves a cursor on the display unit (408) by performing a known gesture to select an available function.
- 20. The gesture control unit (302) of claim 15, the command processing unit (406) sending the associated command to a control circuitry (304) of the stand mixer (100), wherein the control circuitry (304) performing the corresponding function to the associated command.
- 21. The gesture control unit (302) of claim 15, wherein the gesture performed by the user (306) includes a specific motion performed by user's hand in a specific configuration, the plurality of gestures in the gesture database includes two or more of a hand wave gesture (606), a heart-shaped hand gesture (608), a right-angle-shaped hand gesture (610), a cross finger gesture (612), a hand rotating a knob gesture (614), a thumbs-up gesture (616), a clapping hands gesture (618), an open hand pushing-down gesture (620) and an open hand pushing-up gesture (622).
- 22. The gesture control unit (302) of claim 15, wherein the sensing unit (402) includes a sensor (128) placed on a side face of the stand mixer (100), wherein the sensor (128) configured to capture data of a gesture performed by the user (306) in a predetermined region near the stand mixer (100).
- 23. The gesture control unit (302) of claim 15, wherein the sensing unit (402) configured to map motion with a known gesture by identifying the user (306) and match the motion to the known gesture based on the user identity, wherein the commands database (404) comprises a plurality of gestures for each user using the stand mixer (100).
- 24. The gesture control unit (302) of claim 15, wherein the system is training free.
- 25. The gesture control unit (302) of claim 15 included in a smart communication device, wherein the smart communication device wirelessly communicates with the stand mixer (100).
- 26. The gesture control unit (302) of claim 15, wherein the sensing unit (402) further configured to capture a voice command given by the user (306) and map voice command given by the user (306) with a known voice command using a voice command database maintaining a plurality of voice commands, wherein the commands database (404) includes the voice command database.
- 27. The gesture control unit (302) of claim 15 further includes a dial in mode that allows the user to select one or more of gesture controlling, voice controlling and manual controlling.
- 28. A stand mixer substantially as herein described with reference to and/or as shown in the accompanying drawings.
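Claims 15 and 16 above describe a two-stage lookup: the sensing unit records the hand's trajectory, compares it against trajectories stored in a gesture database to identify a known gesture, and the commands database then maps that gesture to a mixer function. The sketch below illustrates that pipeline in Python; the resample-and-compare distance metric, the gesture templates, the command names and the acceptance threshold are all hypothetical illustrations, not the patent's actual implementation.

```python
import math

# Illustrative sketch only: template trajectories stand in for the patent's
# "gesture database", and a dict of command names for its "commands database".

def resample(points, n=16):
    """Resample a 2-D trajectory to n evenly spaced points along its path."""
    if len(points) < 2:
        return list(points) * n
    # cumulative arc length at each input point
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out = []
    for i in range(n):
        target = total * i / (n - 1)
        # locate the segment containing the target arc length, interpolate
        j = max(k for k in range(len(dists)) if dists[k] <= target)
        j = min(j, len(points) - 2)
        seg = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / seg
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def trajectory_distance(a, b, n=16):
    """Mean point-to-point distance between two resampled trajectories."""
    return sum(math.hypot(p[0] - q[0], p[1] - q[1])
               for p, q in zip(resample(a, n), resample(b, n))) / n

GESTURE_DB = {  # known gestures, each defined by a template trajectory
    "push_down": [(0.0, 1.0), (0.0, 0.5), (0.0, 0.0)],
    "push_up":   [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)],
    "wave":      [(0.0, 0.0), (1.0, 0.0), (0.0, 0.0), (1.0, 0.0)],
}

COMMANDS_DB = {  # each known gesture maps to a mixer function (names invented)
    "push_down": "decrease_speed",
    "push_up":   "increase_speed",
    "wave":      "planetary_drive_on_off",
}

def recognise(recorded, threshold=0.5):
    """Map a recorded hand trajectory to a command, or None if no close match."""
    name, dist = min(((g, trajectory_distance(recorded, t))
                      for g, t in GESTURE_DB.items()), key=lambda p: p[1])
    return COMMANDS_DB[name] if dist < threshold else None
```

Because recognition is plain template matching, adding a gesture is just one more database entry and no per-user training phase is needed, which is broadly consistent with the claim 24 statement that the system is training free.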
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1407093.2A GB2525398B (en) | 2014-04-22 | 2014-04-22 | Gesture controlled stand mixer |
| PCT/GB2015/051188 WO2015162421A1 (en) | 2014-04-22 | 2015-04-22 | Kitchen appliance having contactless operation |
| EP15719276.6A EP3133965B1 (en) | 2014-04-22 | 2015-04-22 | Kitchen appliance having contactless operation |
| EP20188258.6A EP3747323A1 (en) | 2014-04-22 | 2015-04-22 | Signal control unit for remote controlling a kitchen appliance |
| CN201580007169.2A CN105979832B (en) | 2014-04-22 | 2015-04-22 | Kitchen appliances with contactless operation |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1407093.2A GB2525398B (en) | 2014-04-22 | 2014-04-22 | Gesture controlled stand mixer |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| GB201407093D0 GB201407093D0 (en) | 2014-06-04 |
| GB2525398A true GB2525398A (en) | 2015-10-28 |
| GB2525398B GB2525398B (en) | 2020-12-23 |
Family
ID=50929043
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB1407093.2A Expired - Fee Related GB2525398B (en) | 2014-04-22 | 2014-04-22 | Gesture controlled stand mixer |
Country Status (1)
| Country | Link |
|---|---|
| GB (1) | GB2525398B (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112568692A (en) * | 2019-09-29 | 2021-03-30 | 浙江苏泊尔家电制造有限公司 | Control method of cooking appliance, cooking appliance and computer storage medium |
| EP3892170A1 (en) * | 2020-04-07 | 2021-10-13 | Koninklijke Philips N.V. | Control of a portable food processing apparatus |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1434424A (en) * | 2002-01-23 | 2003-08-06 | 廖华勇 | Sign language remote controller |
| US20050212760A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Gesture based user interface supporting preexisting symbols |
| US20130229508A1 (en) * | 2012-03-01 | 2013-09-05 | Qualcomm Incorporated | Gesture Detection Based on Information from Multiple Types of Sensors |
| US20140201688A1 (en) * | 2013-01-17 | 2014-07-17 | Bsh Home Appliances Corporation | User interface - gestural touch |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2007088320A1 (en) * | 2006-02-03 | 2007-08-09 | Kenwood Limited | Improvements in or relating to stand mixer arrangements |
| US8276506B2 (en) * | 2007-10-10 | 2012-10-02 | Panasonic Corporation | Cooking assistance robot and cooking assistance method |
| CN203016640U (en) * | 2012-10-31 | 2013-06-26 | 朱学德 | Voice control electric cooking stove |
| ES2625400T3 (en) * | 2012-11-29 | 2017-07-19 | Vorwerk & Co. Interholding Gmbh | Cooking machine |
- 2014-04-22: application GB1407093.2A granted as GB2525398B (en); status: not active (Expired - Fee Related)
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3184907A1 (en) * | 2015-12-22 | 2017-06-28 | BSH Hausgeräte GmbH | Domestic appliance |
| EP3187785A1 (en) * | 2015-12-22 | 2017-07-05 | BSH Hausgeräte GmbH | Domestic appliance comprising a gesture detection |
| CN109414132A (en) * | 2017-05-22 | 2019-03-01 | 李尚俊 | A kind of cold extraction coffee extractor adjusting acoustic vibration based on voice signal |
| GB2563255A (en) * | 2017-06-07 | 2018-12-12 | Kenwood Ltd | Kitchen appliance and system therefor |
| US11406224B2 (en) | 2017-06-07 | 2022-08-09 | Kenwood Limited | Kitchen appliance and system therefor |
| GB2563255B (en) * | 2017-06-07 | 2022-12-28 | Kenwood Ltd | Kitchen appliance and system therefor |
| GB2598709A (en) * | 2020-07-30 | 2022-03-16 | Kenwood Ltd | Food processing appliance and fascia therefor |
| GB2598709B (en) * | 2020-07-30 | 2024-10-16 | Kenwood Ltd | Food processing appliance and fascia therefor |
| US20240156294A1 (en) * | 2022-11-10 | 2024-05-16 | Noel Nyirenda | Cooking Appliance |
| EP4501187A1 (en) * | 2023-07-31 | 2025-02-05 | Whirlpool Corporation | Countertop appliance with physical and remote motor control |
Also Published As
| Publication number | Publication date |
|---|---|
| GB2525398B (en) | 2020-12-23 |
| GB201407093D0 (en) | 2014-06-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3133965B1 (en) | Kitchen appliance having contactless operation | |
| GB2525398A (en) | Stand mixer controls | |
| US10682016B2 (en) | Food processor | |
| US9968221B2 (en) | Food processor with a face recognition software | |
| AU2015263408B2 (en) | Electrically operated domestic appliance having a voice recognition device | |
| TWI590793B (en) | Electrical kitchen machine | |
| CN111596563B (en) | Intelligent smoke kitchen system and cooking guiding method thereof | |
| JP3681757B2 (en) | Cooking apparatus and operation method thereof | |
| GB2525589A (en) | Toaster controls | |
| CN110579985A (en) | control method, device and system | |
| CN211483994U (en) | Sound control electric cooker | |
| CN212394658U (en) | Intelligent frying and roasting device | |
| CN111603072A (en) | Conditioning machine driven by gestures | |
| RU145442U1 (en) | USER HOUSEHOLD APPLIANCES UNIT FOR THERMAL (THERMAL) PRODUCT PROCESSING | |
| CN115957879A (en) | Material processing equipment and control method thereof | |
| CN110448155A (en) | A kind of acoustic control electric cooker | |
| CN110888337A (en) | Control method of fingerprint electric cooker and computer readable storage medium fingerprint electric cooker | |
| CN110731693A (en) | Cooking menu display method and device and cooking appliance |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PCNP | Patent ceased through non-payment of renewal fee | Effective date: 20240422 |