WO2017010593A1 - Gesture recognition device - Google Patents
- Publication number
- WO2017010593A1 (PCT/KR2015/007407)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- face
- detection unit
- motion
- gesture detection
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Definitions
- The present invention relates to a gesture recognition device, and more particularly to a gesture recognition device that minimizes the power consumed for gesture recognition.
- More specifically, the present invention minimizes power consumption by performing the gesture recognition operation only when a user's motion and face have been recognized before the gesture is made; in a preferred embodiment, when a face is recognized, a gesture is recognized only within a certain region (the face region, or a region of a predetermined size spaced apart from the face region) set relative to the detected face position, minimizing the amount of computation required for gesture detection.
- Display apparatuses have appeared that combine a display device, such as a television receiver, a personal computer, or a tablet terminal, with a gesture recognition device that recognizes gestures made with an object operated by the user.
- In such display apparatuses, there is demand for a gesture recognition device that recognizes gestures expressed with the operator's own hand or fingers, so that the operator can control the display device naturally and smoothly without wearing special equipment such as a data glove.
- In recent years, gesture recognition using the Hidden Markov Model (HMM), continuous dynamic programming (DP), or the like has been performed in such gesture recognition devices.
- Such systems maintain a power-on state and perform the gesture recognition operation at all times so that a gesture can be recognized whenever it occurs. As a result, the gesture recognition operation runs even when there is no gesture input, and unnecessary power is consumed excessively.
- To solve this problem, a technology for reducing power consumption was developed in which the user directly operates a switch to control the on/off state, and the gesture recognition operation is performed only in the on state. However, this approach is cumbersome, since the user must manipulate the device directly just to control its on/off state.
- The present invention has been made to solve the above problems. One object of the present invention is to provide a gesture recognition device that can minimize the total power consumed for gesture detection by sensing, at low power, whether the user is about to make a gesture and operating the gesture detection module only when needed.
- Another object of the present invention is to provide a gesture recognition device that minimizes the amount of computation for gesture detection/recognition by setting the gesture detection area to a part of the image area rather than the entire image area, thereby also reducing the power consumed for gesture recognition.
- According to one aspect of the present invention, a gesture detection device includes a motion detector, a face detector, a gesture detector, and a result output unit.
- The motion detector detects a user's motion and, upon detecting motion, transmits a first wake-up signal to the face detector.
- The face detector normally operates in a first state; when it receives the first wake-up signal from the motion detector, it operates in a second state for a predetermined time, detects the user's face, returns to the first state, and, upon detecting a face, transmits a second wake-up signal to the gesture detector.
- The gesture detector normally operates in the first state; when it receives the second wake-up signal from the face detector, it operates in the second state for a predetermined time, detects the user's gesture, returns to the first state, and, upon detecting a gesture, outputs an operation signal corresponding to the detected gesture.
- The result output unit performs an operation corresponding to the operation signal received from the gesture detector.
- In this case, the first state may be an operating state that consumes less power than the second state.
- Preferably, the first state may be a power-off state or a sleep state.
- The face detector may determine that the user's face is detected only when the direction of the detected face is within a predetermined angle with respect to the face detector.
- The face detector may also determine that the user's face is detected only when the size of the detected face is greater than or equal to a first threshold and less than or equal to a second threshold.
- When the face detector detects a user's face, it selects the detected face as the gesture detection target and additionally transmits its location information to the gesture detector; the gesture detector sets the user's gesture detection area based on the location information provided by the face detector and performs gesture detection only on that area.
- In particular, when a plurality of faces are detected, the face detector selects one of the faces as the gesture detection target and additionally transmits its location information to the gesture detector; the gesture detector sets the user's gesture detection area based on the location information of the gesture detection target provided by the face detector and performs gesture detection only on that area.
- For example, when a plurality of faces are detected, the face detector may select the face closest to the motion information detected by the motion detector as the gesture detection target.
- As another example, when a plurality of faces are detected, the face detector may select a face that fully or partially overlaps the motion information detected by the motion detector as the gesture detection target.
- The gesture detector may set an area within a predetermined range, based on the position of the gesture detection target, as the gesture detection area.
- Alternatively, the gesture detector may set a region of a predetermined size, spaced a predetermined distance apart from the position of the gesture detection target, as the gesture detection area.
- The gesture detector may detect the shape and movement direction of the user's gesture and output an operation signal corresponding to them.
- Such a gesture detection device may further include an alarm unit configured to provide, when a face is detected through the face detector, alarm information during the predetermined time in which the gesture detector detects the user's gesture.
- According to another aspect of the present invention, a gesture detection device includes a motion detector, a face/gesture detector, and a result output unit. The motion detector detects a user's motion and, upon detecting motion, transmits a wake-up signal to the face/gesture detector.
- The face/gesture detector normally operates in a first state; when it receives the wake-up signal from the motion detector, it operates in a second state for a predetermined time, detects the user's face, then detects the user's gesture for a predetermined time and outputs an operation signal corresponding to the detected gesture, returning to the first state if no face or no gesture is detected within the predetermined time. The result output unit performs an operation corresponding to the operation signal received from the face/gesture detector.
- In this case, the first state may be an operating state that consumes less power than the second state.
- The face/gesture detector may include: a face detector configured, upon receiving the wake-up signal from the motion detector, to detect the user's face for a predetermined time and select the detected face as the gesture detection target; and a gesture detector configured to receive information on the selected gesture detection target from the face detector, set the user's gesture detection area based on that information, and perform gesture detection only on the gesture detection area.
- In particular, the gesture detector may set an area within a predetermined range, based on the position of the gesture detection target, as the gesture detection area.
- According to still another aspect of the present invention, a gesture detection device includes a motion/face detector, a gesture detector, and a result output unit. The motion/face detector detects a user's motion, detects the user's face when motion is detected, and transmits a wake-up signal to the gesture detector when a face is detected.
- The gesture detector normally operates in a first state; when it receives the wake-up signal from the motion/face detector, it operates in a second state for a predetermined time, detects the user's gesture, returns to the first state, and, upon detecting a gesture, outputs an operation signal corresponding to the detected gesture.
- The result output unit performs an operation corresponding to the operation signal received from the gesture detector.
- In this case, the first state may be an operating state that consumes less power than the second state.
- The motion/face detector selects the face detected upon face detection as the gesture detection target and additionally transmits the position information of the gesture detection target to the gesture detector; the gesture detector sets the user's gesture detection area based on the position information received from the motion/face detector and performs gesture detection only on that area.
- In this case, the gesture detector may set an area within a predetermined range, based on the position of the gesture detection target, as the gesture detection area.
- The gesture recognition apparatus according to a preferred embodiment of the present invention detects the user's motion and face to determine whether preparation for gesture input is complete, and operates the gesture recognition module only when that preparation is complete, which reduces both total power consumption and the amount of computation.
- In particular, since human motion detection can be driven at low power, a person's motion (and face) can be detected at low power to decide whether to perform the comparatively power-hungry gesture recognition operation, reducing power consumption relative to conventional systems.
- In addition, by first detecting a person's face before gesture detection and limiting the area in which gestures are detected, a gesture can be detected using only part of the image information rather than all of it, minimizing the amount of computation for gesture detection.
- FIG. 1 is a view showing a gesture recognition device according to a first example of the present invention
- FIG. 2 illustrates an example of determining that a face of a user is detected according to the present invention
- FIG. 3 is a view illustrating an example of selecting one of the faces as a gesture detection target when a plurality of faces are detected according to the present invention
- FIG. 4 illustrates an example of recognizing a gesture on a detected face area according to the present invention
- FIG. 5 illustrates an example of recognizing a gesture on an area spaced a predetermined distance from a detected face area according to the present invention
- FIG. 6 is a view showing an example of the shape of the gesture detected in accordance with the present invention.
- FIG. 7 is a view showing a gesture recognition device according to a second example of the present invention.
- FIG. 8 is a diagram illustrating a gesture recognizing apparatus according to a third example of the present invention.
- As the invention allows for various changes and numerous embodiments, particular embodiments are illustrated in the drawings and described in detail in the written description. However, this is not intended to limit the present invention to the particular embodiments; the invention should be understood to include all modifications, equivalents, and substitutes falling within its spirit and technical scope. In describing the present invention, detailed descriptions of related known technology are omitted where they might obscure the gist of the present invention.
- Terms such as first and second may be used to describe various components, but the components should not be limited by these terms; such terms are used only to distinguish one component from another.
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the present invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this application, terms such as "comprise" or "have" indicate the presence of a feature, number, step, operation, component, part, or combination thereof described in the specification, and do not exclude in advance the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
- Hereinafter, exemplary embodiments of the present invention are described in detail with reference to the accompanying drawings.
- FIG. 1 is a diagram illustrating a gesture recognizing apparatus according to a first example of the present invention.
- As shown in FIG. 1, the gesture recognition apparatus according to the first example of the present invention includes a motion detector 10, a face detector 20, a gesture detector 30, and a result output unit 40. Hereinafter, each component is described in detail.
- [Description of the Motion Detector]
- The motion detector 10 detects the user's motion. As a technical configuration for this, a human body detection sensor or an image sensor may be applied as the motion detector 10.
- In the case of a human body detection sensor, information about the human body is acquired at regular intervals through a passive infrared (PIR) method, a near-infrared active method, ultrasonic waves, microwaves, or the like; when the difference between successively acquired readings is determined to exceed a predetermined level, the user's motion is determined to have been detected.
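- As a rough illustration of this reading-difference logic, the following Python sketch polls a generic human body sensor and reports motion when consecutive readings differ by more than a threshold; the threshold, polling interval, and sensor-reading function are illustrative assumptions, not values from the patent.

```python
import time

MOTION_THRESHOLD = 5.0   # assumed difference level that counts as motion
POLL_INTERVAL_S = 0.5    # assumed acquisition interval

def watch_for_motion(read_sensor, on_motion, polls=100):
    """Poll a human body sensor at regular intervals and invoke on_motion
    when successive readings differ by more than the threshold."""
    previous = read_sensor()
    for _ in range(polls):
        time.sleep(POLL_INTERVAL_S)
        current = read_sensor()
        if abs(current - previous) > MOTION_THRESHOLD:
            on_motion()  # e.g., send the wake-up signal to the face detector
        previous = current

# Usage with a stand-in sensor:
# watch_for_motion(read_sensor=lambda: 20.0, on_motion=lambda: print("motion"))
```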
- In the case of an image sensor, any component capable of acquiring image information of a person may be applied, such as a complementary MOS (CMOS) image sensor, a charge-coupled device (CCD), or an IR image sensor.
- In this case, the image sensor may detect whether there is user motion based on a difference value or a matching result between frames of image information acquired at predetermined intervals.
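- A minimal sketch of this frame-differencing idea, assuming OpenCV and two frames captured a fixed interval apart; the per-pixel threshold and changed-pixel ratio are assumed values.

```python
import cv2

DIFF_PIXEL_THRESHOLD = 25    # assumed per-pixel intensity change
MOTION_PIXEL_RATIO = 0.02    # assumed fraction of changed pixels meaning "motion"

def frame_difference_motion(prev_frame, curr_frame) -> bool:
    """Return True if the fraction of pixels that changed between two
    frames, acquired a fixed interval apart, exceeds a preset ratio."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, changed = cv2.threshold(diff, DIFF_PIXEL_THRESHOLD, 255, cv2.THRESH_BINARY)
    ratio = cv2.countNonZero(changed) / changed.size
    return ratio > MOTION_PIXEL_RATIO
```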
- When the motion detector 10 detects motion through any of the various methods described above, it transmits a wake-up signal to the separate face detector 20.
- [Description of the Face Detector]
- The face detector 20 normally operates in a first state. When it receives a wake-up signal from the motion detector 10, it operates in a second state for a predetermined time and detects the user's face. At this time, the face detector 20 detects the user's face within a certain region for the predetermined time and then returns to the first state.
- In embodiments applicable to the present invention, a sleep state or a power-off state may be applied as the first state, and a normal operating state may be applied as the second state.
- That is, in the present invention, the first state may mean an operating state that consumes less power than the second state.
- In general, the power-off state means a state in which power to all components is cut off.
- The sleep state means a state in which only some components are controlled to operate while power to the remaining components is cut off, minimizing total power consumption.
- In either case, when the wake-up signal is received, power is supplied so that all components can operate.
- For example, as the first state, the face detector 20 may operate in a sleep state in which power is cut off to all components other than the one that receives the wake-up signal from the motion detector 10. Then, upon receiving the wake-up signal from the motion detector 10, it supplies power to the other components (such as the module that detects the user's face) so that they operate, and performs the user's face detection operation as the second state.
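- The two-state behavior described above might be sketched as follows; the class, its callbacks, and the active window length are illustrative assumptions, and the power-control hooks stand in for platform-specific logic.

```python
import time

class FaceDetectorUnit:
    """Sketch of the two-state behavior: stay in a low-power first state,
    run in the second state for a fixed window after a wake-up signal,
    then fall back to the first state."""

    ACTIVE_WINDOW_S = 3.0  # assumed "predetermined time" in the second state

    def __init__(self, detect_face, on_face_detected):
        self.detect_face = detect_face            # callable: () -> face or None
        self.on_face_detected = on_face_detected  # callable: (face) -> None

    def on_wake_up(self):
        """Called by the motion detector: power up, detect, power down."""
        self.power_up_detection_module()          # enter second state
        deadline = time.monotonic() + self.ACTIVE_WINDOW_S
        while time.monotonic() < deadline:
            face = self.detect_face()
            if face is not None:
                self.on_face_detected(face)       # e.g., second wake-up signal
                break
        self.power_down_detection_module()        # return to first state

    def power_up_detection_module(self):
        pass  # platform-specific power control (assumed)

    def power_down_detection_module(self):
        pass  # platform-specific power control (assumed)
```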
- As the face detector 20, any component capable of detecting the shape of a human face may be applied. For example, as with the motion detector 10, a human body detection sensor or an image sensor may be applied.
- [Examples of the Face Detector's Face Detection Criteria]
- In embodiments applicable to the present invention, the face detector 20 may determine that the user's face is detected only in the following cases.
- 1. When the direction of the user's face is within a predetermined angle
- FIG. 2 is a diagram illustrating an example of determining that the user's face is detected according to the present invention.
- As shown in FIG. 2, the face detector 20 may determine that the user's face is detected only when a face facing the front is detected. More specifically, the face detector 20 may determine that the user's face is detected only when the direction of the detected face is within a predetermined angle with respect to the face detector 20.
- To make this determination, various face detection algorithms can be applied. For example, it may be implemented by training a matching template for face recognition on frontal images.
- Alternatively, the user's face may be detected by estimating the direction of the user's face from the shape information of the detected face.
- In this way, the face detector 20 according to the present invention determines face detection from the overall direction of the face, which reduces the total amount of computation, and hence the power consumed, compared with conventional techniques that locate the eyes within the image information and then compute the gaze direction from that position.
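- As one concrete stand-in for such a frontal-face matching template, the sketch below uses OpenCV's stock Haar cascade, which is trained on frontal faces and therefore tends not to fire on faces turned beyond its training tolerance, loosely matching the "within a predetermined angle" criterion; the patent itself does not prescribe this particular detector.

```python
import cv2

# OpenCV ships a cascade trained on frontal faces; faces turned past
# roughly its training tolerance are simply not detected, which matches
# the "within a predetermined angle" criterion in spirit.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_frontal_faces(frame):
    """Return bounding boxes (x, y, w, h) of near-frontal faces only."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```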
- 2. When the distance between the face detector 20 and the user is within a certain distance
- In another embodiment, the face detector 20 may detect only the face of a user located within a certain distance.
- Various methods may be applied for this. For example, the distance between the user and the face detector 20 may be measured using a separate sensor, and the face detector 20 may be configured to perform the detection operation only when the distance value is within a predetermined range.
- Preferably, the distance between the user and the face detector 20 is estimated from the face information detected by the face detector 20, and the user's face detection operation is performed only when the distance value is within the predetermined range.
- To this end, the face detector 20 may count the number of pixels the face occupies at the given resolution and detect the user's face only when the pixel count is at least a predetermined value.
- 3. When the distance between the face detector 20 and the user is within a certain range
- In yet another embodiment, the face detector 20 may detect only the face of a user located within a predetermined range, that is, only the face of a user spaced from the face detector by at least a minimum distance A and at most a maximum distance B.
- For example, the face detector 20 may determine that the user's face is detected only when the size of the detected face is greater than or equal to a first threshold and less than or equal to a second threshold. To this end, the face detector 20 may count the number of pixels the face occupies at the given resolution and detect the user's face only when the pixel count falls within a predetermined range.
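- A minimal sketch of this pixel-count check, with assumed thresholds standing in for the first and second face-size thresholds:

```python
# Assumed pixel-count thresholds standing in for the first/second
# face-size thresholds; actual values depend on sensor resolution
# and the intended interaction distance.
MIN_FACE_PIXELS = 40 * 40     # below this, the user is judged too far away
MAX_FACE_PIXELS = 200 * 200   # above this, the user is judged too close

def face_within_range(face_box) -> bool:
    """Accept a detected face only if its pixel area lies between the
    two thresholds, i.e. the user is between distances A and B."""
    x, y, w, h = face_box
    area = w * h
    return MIN_FACE_PIXELS <= area <= MAX_FACE_PIXELS
```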
- In a preferred embodiment applicable to the present invention, when the user's face is detected by any of the methods above, the face detector 20 provides the position information of the detected face to the gesture detector 30.
- Upon receiving this information, the gesture detector 30 sets the gesture detection area based on the position information, which reduces the amount of computation needed for gesture detection.
- The gesture detector 30 is described in detail in the related description below.
- In this way, when the face detector 20 detects a face, it may select the detected face as the gesture detection target and provide its location information to the gesture detector 30.
- When a plurality of faces are detected, the face detector 20 may select one of them as the gesture detection target according to a predetermined rule and provide its location information to the gesture detector 30.
- [Examples of Selecting the Gesture Detection Target when Multiple Faces Are Detected]
- 1. The face located closest to the motion information
- FIG. 3 is a diagram illustrating an example of selecting one of the faces as the gesture detection target when a plurality of faces are detected according to the present invention.
- As shown in FIG. 3, in an embodiment applicable to the present invention, the face detector 20 may select the face located closest to the motion information detected by the motion detector 10 as the gesture detection target.
- Alternatively, the face detector 20 may select, based on the detected motion information, the face located closest in a specific direction as the gesture detection target.
- FIG. 3 illustrates an example of selecting the face located closest in the upward direction from the detected motion as the gesture detection target, but the face detector 20 according to the present invention is not limited to this embodiment.
- 2. A face overlapping the motion information
- As another example, the face detector 20 may select a face that fully or partially overlaps the motion information detected by the motion detector 10 as the gesture detection target. In other words, when the detected motion information falls on a specific face, the face detector 20 may select that face as the gesture detection target and provide its position information to the gesture detector 30.
- Through the methods above, the face detector 20 selects a gesture detection target, and the gesture detector 30, provided with this information, can perform the gesture detection operation based on the information about the detection target.
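- The two selection rules above might be combined as in the following sketch, which prefers a face overlapping the motion region and otherwise falls back to the face closest to it; the box format and tie-breaking are illustrative assumptions.

```python
def pick_gesture_target(face_boxes, motion_box):
    """Among several detected faces, prefer one overlapping the motion
    region; otherwise fall back to the face whose center is closest to
    the motion region's center. Boxes are (x, y, w, h)."""
    def center(b):
        x, y, w, h = b
        return (x + w / 2.0, y + h / 2.0)

    def overlaps(a, b):
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    overlapping = [f for f in face_boxes if overlaps(f, motion_box)]
    if overlapping:
        return overlapping[0]
    mx, my = center(motion_box)
    return min(face_boxes,
               key=lambda f: (center(f)[0] - mx) ** 2 + (center(f)[1] - my) ** 2)
```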
- The gesture detector 30 likewise normally operates in a first state.
- When the gesture detector 30 receives a wake-up signal from the face detector 20, it operates in a second state for a predetermined time and detects the user's gesture. At this time, the gesture detector 30 detects the user's gesture within the predetermined area for the predetermined time and then returns to the first state.
- Here too, a sleep state or a power-off state may be applied as the first state, and a normal operating state may be applied as the second state.
- That is, the first state may mean an operating state that consumes less power than the second state.
- Likewise, as the first state, the gesture detector 30 may operate in a sleep state in which power is cut off to all components other than the one that receives the wake-up signal from the face detector 20. Then, upon receiving the wake-up signal from the face detector 20, it may perform the user's gesture detection by supplying power to the other components (such as the module that detects the user's gesture) so that they operate.
- As the gesture detector 30, any component capable of detecting a gesture of the user's hand may be applied. Preferably, a vision sensor is applied so that the user's gesture can be detected more accurately.
- When the gesture detector 30 receives the (location) information about the gesture detection target from the face detector 20, it sets the gesture detection area based on that information and performs the gesture detection operation only within that area.
- That is, the gesture detector 30 does not perform gesture detection on all the acquired image information but only on part of the area, reducing both the amount of computation and power consumption.
- FIG. 4 illustrates an example of recognizing a gesture on a detected face area according to the present invention.
- As shown in FIG. 4, the gesture detector 30 may detect a gesture input within the area of the face selected as the gesture detection target. To this end, the gesture detector 30 may set an area within a predetermined range, based on the position of the gesture detection target, as the gesture detection area.
- FIG. 5 is a diagram illustrating an example of recognizing a gesture on an area spaced a predetermined distance from a detected face area according to the present invention.
- Alternatively, the gesture detector 30 may set a region of a predetermined size, spaced a predetermined distance apart from the position of the gesture detection target, as the gesture detection area.
- FIG. 5 illustrates an example in which a region spaced a predetermined distance downward from the position of the gesture detection target (the face) is set as the gesture detection area, but the present invention is not limited to this embodiment.
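- A sketch of setting such a gesture detection area from the selected face, here a region of assumed size spaced an assumed distance below the face and clipped to the frame:

```python
def gesture_roi_below_face(face_box, frame_shape, gap_ratio=0.3, scale=2.0):
    """Set the gesture detection area as a region of a fixed size spaced
    a fixed distance below the selected face, clipped to the frame.
    gap_ratio and scale are illustrative parameters, not patent values."""
    x, y, w, h = face_box
    frame_h, frame_w = frame_shape[:2]
    roi_w, roi_h = int(w * scale), int(h * scale)
    roi_x = max(0, x + w // 2 - roi_w // 2)
    roi_y = min(frame_h - 1, y + h + int(h * gap_ratio))
    roi_w = min(roi_w, frame_w - roi_x)   # clip to the right/bottom edges;
    roi_h = min(roi_h, frame_h - roi_y)   # a face near the border may shrink the ROI
    return roi_x, roi_y, roi_w, roi_h

# Usage: x, y, w, h = gesture_roi_below_face(face, frame.shape)
#        roi = frame[y:y+h, x:x+w]   # gesture detection runs only here
```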
- As described above, the gesture detector 30 sets the gesture detection area through various methods and performs the gesture detection operation only within that area, optimizing the amount of computation required for gesture detection. Subsequently, when a gesture is detected, it outputs an operation signal corresponding to the detected gesture to the result output unit 40.
- To this end, the gesture detector 30 may detect the shape and movement direction of the user's gesture and output an operation signal corresponding to them.
- For example, the gesture detector 30 may detect the shapes of the various gestures shown in FIG. 6 through image analysis.
- In addition, the gesture detector 30 may detect the movement direction of the detected gesture.
- For example, the moving direction of a hand may be detected using the difference information between image data acquired at predetermined time intervals.
- Alternatively, the movement direction of a hand may be detected by performing block matching to obtain a motion vector for each individual block and then averaging all the motion vectors.
- The direction of movement can also be tracked by computing the optical flow and determining in which direction the flow is moving as a whole.
- Of course, the present invention is not limited to the above embodiments, and other motion sensing methods may also be applied.
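- As an illustration of the optical-flow variant above, the following sketch averages a dense Farneback flow field over the gesture detection area to obtain an overall movement direction; the dead-zone value is an assumption.

```python
import cv2
import numpy as np

def dominant_motion_direction(prev_gray, curr_gray):
    """Estimate the overall movement direction inside the gesture ROI
    (two grayscale frames) by dense optical flow, averaging all flow
    vectors — one of the direction methods mentioned above."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    dx = float(np.mean(flow[..., 0]))
    dy = float(np.mean(flow[..., 1]))
    if abs(dx) < 0.5 and abs(dy) < 0.5:   # assumed dead zone: no clear motion
        return None
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```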
- The result output unit 40 performs an operation corresponding to the operation signal received from the gesture detector 30.
- To this end, the gesture detector 30 stores table information that maps each shape and movement direction of a detected gesture to a corresponding operation signal, and uses this table to provide the appropriate operation signal to the result output unit 40.
- For example, suppose gesture A is set to change the TV channel (increase or decrease the channel number), gesture B to adjust the TV volume (increase or decrease the volume), and gesture C to correspond to the TV power on/off control operation.
- In this case, if the user's gesture B is detected by the gesture detector 30, the gesture detector 30 provides an operation signal for volume control (volume increase or decrease) to the result output unit 40, and the result output unit 40 performs the corresponding operation.
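- The table lookup described above might look like the following sketch; the gesture shape names and signal names are hypothetical stand-ins for the A/B/C example.

```python
# Hypothetical mapping table; shapes, directions, and signal names are
# illustrative stand-ins for the patent's A/B/C gesture example.
GESTURE_TABLE = {
    ("gesture_a", "up"):    "CHANNEL_UP",
    ("gesture_a", "down"):  "CHANNEL_DOWN",
    ("gesture_b", "up"):    "VOLUME_UP",
    ("gesture_b", "down"):  "VOLUME_DOWN",
    ("gesture_c", None):    "POWER_TOGGLE",
}

def operation_signal(shape, direction):
    """Look up the operation signal for a detected (shape, direction)
    pair; None means the gesture is not mapped to any operation."""
    return GESTURE_TABLE.get((shape, direction))
```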
- The gesture detection device according to the present invention may further include an alarm unit (not shown).
- The alarm unit (not shown) may provide an alarm message informing the user that the gesture detector 30 is ready to detect the user's gesture.
- For example, the alarm unit may provide alarm information during the predetermined time in which the gesture detector 30 detects the user's gesture, indicating that the gesture detector 30 is ready to detect a gesture.
- As described above, the gesture recognition apparatus according to the first example includes a motion detector 10, a face detector 20, and a gesture detector 30 as separate components.
- The face detector 20 and the gesture detector 30 normally operate in a first state (a sleep state or a power-off state), operate in the second state only for a predetermined time when woken up, and then return to the first state.
- In this configuration, whether the user is preparing to input a gesture is determined through the low-power motion detector 10, and only when the user is ready are the face detector 20 and the gesture detector 30 woken up to perform the gesture detection operation, so the apparatus as a whole can be driven at low power.
- Furthermore, the amount of computation required for gesture detection may be reduced.
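- Putting the first example together, a high-level sketch of the wake-up chain might look as follows; the unit interfaces (sense_motion, wake_and_detect, set_detection_area, perform) are assumptions for illustration.

```python
class GestureRecognitionDevice:
    """Sketch of the first example's wake-up chain: the always-on,
    low-power motion detector gates the face detector, which in turn
    gates the gesture detector; only the final stage does heavy work."""

    def __init__(self, motion_unit, face_unit, gesture_unit, result_output):
        self.motion_unit = motion_unit
        self.face_unit = face_unit
        self.gesture_unit = gesture_unit
        self.result_output = result_output

    def run_once(self):
        if not self.motion_unit.sense_motion():          # low power, always on
            return
        face = self.face_unit.wake_and_detect()          # second state, timed
        if face is None:
            return                                       # back to first state
        roi = self.gesture_unit.set_detection_area(face)
        signal = self.gesture_unit.wake_and_detect(roi)  # timed, ROI only
        if signal is not None:
            self.result_output.perform(signal)           # e.g., change channel
```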
- In another embodiment, the face detector 20 and the gesture detector 30 may operate as a single component; in yet another embodiment, the motion detector 10 and the face detector 20 may operate as a single component.
- FIG. 7 is a diagram illustrating a gesture recognizing apparatus according to a second example of the present invention.
- As shown in FIG. 7, in the second example the face detector 20 and the gesture detector 30 of the first example are integrated and operate as a single component.
- In this case, a PIR sensor may be applied to the motion detector 110, and an image sensor may be applied to the face/gesture detector 125.
- Since the face/gesture detector 125 is woken up only when motion is detected, the apparatus can be implemented at low power.
- FIG. 8 is a diagram illustrating a gesture recognizing apparatus according to a third example of the present invention.
- As shown in FIG. 8, in the third example the motion detector 10 and the face detector 20 of the first example are integrated and operate as a single component.
- In this case, a PIR sensor may be applied to the motion/face detector 215, and an image sensor may be applied to the gesture detector 230.
- The motion/face detection operation is performed using the PIR sensor, which can be driven at low power, and the gesture detector 230 is woken up only when a face is detected, so the apparatus as a whole can be realized at low power.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Analysis (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
๋ณธ ๋ฐ๋ช ์ ์ ์ค์ณ(gesture) ์ธ์ ์ฅ์น์ ๊ดํ ๊ฒ์ผ๋ก, ์ ์ค์ณ ์ธ์์ ์ํ ์ ๋ ฅ ์๋ชจ๋ฅผ ์ต์ํํ๋ ์ ์ค์ณ ์ธ์ ์ฅ์น์ ๊ดํ ๊ฒ์ด๋ค.The present invention relates to a gesture recognition device, and more particularly to a gesture recognition device for minimizing power consumption for gesture recognition.
๋ณด๋ค ๊ตฌ์ฒด์ ์ผ๋ก, ๋ณธ ๋ฐ๋ช ์ ์ ์ค์ณ ๋์ ์ ์ฌ์ฉ์์ ์์ง์(motion) ๋ฐ ์ผ๊ตด(face)์ด ์ธ์๋ ๊ฒฝ์ฐ์๋ง ์ ์ค์ณ(gesture) ์ธ์ ๋์์ ์ํํจ์ผ๋ก์จ ์ ๋ ฅ ์๋ชจ๋์ ์ต์ํํ๊ณ , ๋ฐ๋์งํ ์๋ก ์ผ๊ตด์ด ์ธ์๋ ๊ฒฝ์ฐ ์ธ์๋ ์ผ๊ตด ์์น๋ฅผ ๊ธฐ์ค์ผ๋ก ์ผ์ ์์ญ(์ผ๊ตด ์์ญ ๋๋ ์ผ๊ตด ์์ญ๊ณผ ์ผ์ ๊ฑฐ๋ฆฌ ์ด๊ฒฉ๋์ด ํ์ฑ๋ ์ผ์ ํฌ๊ธฐ์ ์์ญ)์ ๋ํด์๋ง ์ ์ค์ณ๋ฅผ ์ธ์ํจ์ผ๋ก์จ ์ ์ค์ณ ๊ฒ์ถ์ ์ํ ์ฐ์ฐ๋์ ์ต์ํํ๋ ์ ์ค์ณ ์ธ์ ์ฅ์น์ ๊ดํ ๊ฒ์ด๋ค.More specifically, the present invention minimizes power consumption by performing a gesture recognition operation only when a user's motion and a face are recognized before the gesture operation, and in a preferred embodiment, when the face is recognized The present invention relates to a gesture recognition apparatus for minimizing the amount of computation for gesture detection by recognizing a gesture only in a certain region (a region of a face or a region having a predetermined size spaced apart from the face region) based on a face position.
ํ ๋ ๋น์ ์์๊ธฐ, ํผ์ค๋ ์ปดํจํฐ, ๋๋ ํ๋ธ๋ฆฟ ๋จ๋ง ๋ฑ์ ํ์ ์ฅ์น์, ์กฐ์์๊ฐ ๋์์ํค๋ ์ธ์ ๋์๋ฌผ์ ์ ์ค์ณ๋ฅผ ์ธ์ํ๋ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ฅผ ์กฐํฉ์ํจ ํ์ ๊ธฐ๊ธฐ๊ฐ ๋ฑ์ฅํ๊ณ ์๋ค. ์ด์ ๊ฐ์ ํ์ ๊ธฐ๊ธฐ์์, ์กฐ์์๊ฐ ๋ฐ์ดํฐ ๊ธ๋ก๋ธ ๋ฑ์ ํน์ํ ์ฅ์ ๊ตฌ๋ฅผ ์ฅ์ฐฉํ ํ์๊ฐ ์๊ณ , ๋ํ ์กฐ์์๊ฐ ์์ฐ์ค๋ฝ๊ณ ์ค๋ฌด์คํ๊ฒ ์์ ์ ์กฐ์์ ๋นํด ํ์ ๊ธฐ๊ธฐ์ ๋ํด ํํ ์ ์๋๋ก, ์กฐ์์์ ์์ด๋ ์๊ฐ๋ฝ ๋ฑ์ ์ด์ฉํ์ฌ ํํ๋ ์ ์ค์ณ๋ฅผ ์ธ์ํ๋ ์ ์ค์ณ ์ธ์ ์ฅ์น๊ฐ ์๋ง๋๊ณ ์๋ค. ๋ํ, ์ต๊ทผ์ ์ ์ค์ณ ์ธ์ ์ฅ์น์์๋, HMM(Hidden Markov Model ; ์๋ ๋ง๋ฅด์ฝ๋ธ ๋ชจ๋ธ) ๋๋ ์ฐ์ DP(Continuous Dynamic Programming) ๋ฑ์ ์ด์ฉํ ์ ์ค์ณ ์ธ์์ด ํํ์ฌ์ง๊ณ ์๋ค.BACKGROUND ART Display apparatuses in which a gesture recognition device for recognizing a gesture of an object to be operated by an operator is combined with a display device such as a television receiver, a personal computer, or a tablet terminal. In such a display device, the operator does not need to wear special jewelry such as a data glove, and the operator can use his or her hand or fingers to perform the operation on the display device smoothly and smoothly. There is a demand for a gesture recognizing apparatus for recognizing a gesture to be performed. In recent years, gesture recognition using the Hidden Markov Model (HMM), continuous dynamic programming (DP), or the like is performed in the gesture recognition apparatus.
์ด์ ๊ฐ์ ์์คํ ๋ค์ ํญ์ ์ ์ค์ณ๋ฅผ ์ธ์ํ ์ ์๋๋ก ์ ๋ ฅ ์จ(ON) ์ํ๋ฅผ ์ ์งํ๋ฉฐ ์ ์ค์ณ ์ธ์ ๋์์ ์ํํ๊ฒ ๋๋ค. ๋ฐ๋ผ์ ์๊ธฐ ์์คํ ์ ์ ์ค์ณ ์ ๋ ฅ์ด ์๋ ๊ฒฝ์ฐ์๋ ์ ์ค์ณ ์ธ์ ๋์์ด ์ํ๋์ด ๋ถํ์ํ ์ ๋ ฅ์ด ๊ณผ๋ํ๊ฒ ์๋ชจ๋๋ ๋ฌธ์ ์ ์ด ์์๋ค.Such systems may perform a gesture recognition operation while maintaining a power-on state to recognize a gesture at all times. Therefore, even if there is no gesture input, the system performs a gesture recognition operation so that unnecessary power is excessively consumed.
์ด์ ๊ฐ์ ๋ฌธ์ ์ ์ ํด๊ฒฐํ๊ธฐ ์ํด ์ฌ์ฉ์๊ฐ ์ง์ ์ค์์น๋ฅผ ์ ๋ ฅํ์ฌ ์จ/์คํ ์ํ๋ฅผ ์ ์ดํ๊ณ ์จ ์ํ์ธ ๊ฒฝ์ฐ์๋ง ์ ์ค์ณ ์ธ์ ๋์์ ์ํ์ผ ํจ์ผ๋ก์จ ์ ๋ ฅ ์๋ชจ๋์ ๊ฐ์์ํค๋ ๊ธฐ์ ์ด ๊ฐ๋ฐ๋์๋ค. ๋ค๋ง, ์๊ธฐ ๊ธฐ์ ์ ์ฌ์ฉ์๊ฐ ๋ณ๋๋ก ํด๋น ์ฅ์น์ ์จ/์คํ ์ํ๋ฅผ ์ ์ดํ๊ธฐ ์ํด ์ง์ ์ฅ์น๋ฅผ ์กฐ์ํด์ผ ํ๋ ๋ฒ๊ฑฐ๋ก์์ด ์์๋ค.In order to solve this problem, a technology for reducing power consumption has been developed by allowing a user to directly input a switch to control an on / off state and perform a gesture recognition operation only in an on state. However, the above technology has been cumbersome for the user to directly manipulate the device in order to separately control the on / off state of the device.
๋ณธ ๋ฐ๋ช ์ ์๊ธฐ์ ๊ฐ์ ๋ฌธ์ ์ ์ ํด๊ฒฐํ๊ธฐ ์ํด ์์ถ๋ ๊ฒ์ผ๋ก, ์ฌ์ฉ์์ ์ ์ค์ณ ๊ฐ์ง ์ฌ๋ถ๋ฅผ ์ ์ ๋ ฅ์ผ๋ก ๊ฐ์งํ๊ณ , ํ์์์๋ง ์ ์ค์ณ ๊ฐ์ง ๋ชจ๋์ ์๋์์ผ ์ ์ค์ณ๋ฅผ ๊ฐ์งํจ์ผ๋ก์จ ์ ์ค์ณ ๊ฐ์ง๋ฅผ ์ํด ์๋ชจ๋๋ ์ด ์ ๋ ฅ๋์ ์ต์ํํ ์ ์๋ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ฅผ ์ ๊ณตํ๊ณ ์ ํ๋ค. The present invention has been made to solve the above problems, it is possible to minimize the total amount of power consumed for gesture detection by detecting whether the user's gesture detection with low power, and the gesture detection module only when necessary to detect the gesture. Another object of the present invention is to provide a gesture recognition device.
๋ํ, ๋ณธ ๋ฐ๋ช ์ ์ ์ค์ณ๋ฅผ ๊ฐ์งํ๋ ์์ญ์ ์ ์ฒด ์ด๋ฏธ์ง ์์ญ์ด ์๋ ์ผ๋ถ ์ด๋ฏธ์ง ์์ญ์ผ๋ก ์ค์ ํจ์ผ๋ก์จ ์ ์ค์ณ ๊ฐ์ง/์ธ์์ ์ํ ์ฐ์ฐ๋์ ์ต์ํํ๊ณ , ์ด๋ฅผ ํตํด ์ ์ค์ณ ์ธ์์ ์๋ชจ๋๋ ์ ๋ ฅ๋ ๋ํ ๊ฐ์์ํฌ ์ ์๋ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ฅผ ์ ๊ณตํ๊ณ ์ ํ๋ค. In addition, the present invention provides a gesture recognition device that minimizes the amount of calculation for gesture detection / recognition by setting the area for detecting the gesture to a part of the image area instead of the entire image area, thereby reducing the amount of power consumed for gesture recognition. To provide.
๋ณธ ๋ฐ๋ช ์ ์ผ ์ธก๋ฉด์ ๋ฐ๋ผ ์์ง์ ๊ฐ์ง๋ถ, ์ผ๊ตด ๊ฐ์ง๋ถ, ์ ์ค์ณ ๊ฐ์ง๋ถ ๋ฐ ๊ฒฐ๊ณผ ์ถ๋ ฅ๋ถ๋ฅผ ํฌํจํ๋ ์ ์ค์ณ ๊ฐ์ง ์ฅ์น๋, ์ฌ์ฉ์์ ์์ง์(๋ชจ์ )์ ๊ฐ์งํ๊ณ , ์์ง์ ๊ฐ์ง์ ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋ก ์ 1 ์จ์ดํฌ์ (wake up) ์ ํธ๋ฅผ ์ ๋ฌํ๋ ์์ง์ ๊ฐ์ง๋ถ; ์ 1 ์ํ๋ก ๋์ํ๋ค ์๊ธฐ ์์ง์ ๊ฐ์ง๋ถ๋ก๋ถํฐ ์ 1 ์จ์ดํฌ์ ์ ํธ๋ฅผ ์์ ํ๊ฒ ๋๋ฉด ์ผ์ ์๊ฐ ๋์ ์ 2 ์ํ๋ก ๋์ํ๋ฉฐ ์ฌ์ฉ์์ ์ผ๊ตด์ ๊ฐ์งํ๊ณ ๋ค์ ์ 1 ์ํ๋ก ๋ณต๊ทํ๊ณ , ์ผ๊ตด ๊ฐ์ง์ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋ก ์ 2 ์จ์ดํฌ์ ์ ํธ๋ฅผ ์ ๋ฌํ๋ ์ผ๊ตด ๊ฐ์ง๋ถ; ์ 1 ์ํ๋ก ๋์ํ๋ค ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋ก๋ถํฐ ์ 2 ์จ์ดํฌ์ ์ ํธ๋ฅผ ์์ ํ๊ฒ ๋๋ฉด ์ผ์ ์๊ฐ ๋์ ์ 2 ์ํ๋ก ๋์ํ๋ฉฐ ์ฌ์ฉ์์ ์ ์ค์ณ๋ฅผ ๊ฐ์งํ๊ณ ๋ค์ ์ 1 ์ํ๋ก ๋ณต๊ทํ๊ณ , ์ ์ค์ณ ๊ฐ์ง์ ๊ฐ์ง๋ ์ ์ค์ณ์ ๋์๋๋ ๋์ ์ ํธ๋ฅผ ์ถ๋ ฅํ๋ ์ ์ค์ณ ๊ฐ์ง๋ถ; ๋ฐ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋ก๋ถํฐ ์์ ๋ ๋์ ์ ํธ์ ๋์ํ ๋์์ ์ํํ๋ ๊ฒฐ๊ณผ ์ถ๋ ฅ๋ถ;๋ฅผ ํฌํจํ๋ค.According to an aspect of the present invention, a gesture detection device including a motion detector, a face detector, a gesture detector, and a result output unit may detect a user's motion (motion) and, upon detecting a motion, first wake up to the face detector. a motion detector for transmitting a wake up signal; When the first wake-up signal is received from the motion detection unit, the device operates in the second state for a predetermined time, detects a user's face, returns to the first state, and detects a face to the gesture detection unit. A face detector for transmitting a second wake-up signal; When the second wake-up signal is received from the face detection unit, the device operates in the second state for a predetermined time, detects the user's gesture, returns to the first state, and detects the gesture. A gesture detector for outputting a corresponding operation signal; And a result output unit configured to perform an operation corresponding to the operation signal received from the gesture detector.
์ด๋, ์๊ธฐ ์ 1 ์ํ๋ ์ 2 ์ํ์ ๋นํด ์ ์ ๋์ ์ ๋ ฅ์ ์๋ชจํ๋ ๋์ ์ํ๊ฐ ์ ์ฉ๋ ์ ์๋ค.In this case, the first state may be an operating state that consumes less power than the second state.
๋ฐ๋์งํ๊ฒ๋, ์๊ธฐ ์ 1 ์ํ๋ ํ์ ์คํ(Power Off) ์ํ ๋๋ ์ฌ๋ฆฝ(sleep) ์ํ๊ฐ ์ ์ฉ๋ ์ ์๋ค.Preferably, the first state may be a power off state or a sleep state.
์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋, ๊ฐ์ง๋ ์ฌ์ฉ์์ ์ผ๊ตด ๋ฐฉํฅ์ด ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋ฅผ ๊ธฐ์ค์ผ๋ก ์ผ์ ๊ฐ๋ ์ด๋ด์ธ ๊ฒฝ์ฐ์๋ง ์ฌ์ฉ์์ ์ผ๊ตด์ด ๊ฐ์ง๋ ๊ฒ์ผ๋ก ํ๋จํ ์ ์๋ค.The face detection unit may determine that the user's face is detected only when the detected user's face direction is within a predetermined angle based on the face detection unit.
๋ํ, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋, ๊ฐ์ง๋ ์ฌ์ฉ์์ ์ผ๊ตด ํฌ๊ธฐ๊ฐ ์ 1 ๋ฌธํฑ๊ฐ ์ด์์ด๋ฉฐ ์ 2 ๋ฌธํฑ๊ฐ ์ดํ์ธ ๊ฒฝ์ฐ์๋ง ์ฌ์ฉ์์ ์ผ๊ตด์ด ๊ฐ์ง๋ ๊ฒ์ผ๋ก ํ๋จํ ์ ์๋ค.In addition, the face detector may determine that the user's face is detected only when the detected user's face size is greater than or equal to the first threshold and less than or equal to the second threshold.
์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋, ์ฌ์ฉ์์ ์ผ๊ตด ๊ฐ์ง์, ๊ฐ์ง๋ ์ผ๊ตด์ ์ ์ค์ณ ๊ฐ์ง ๋์์ผ๋ก ์ ์ ํ์ฌ ์ด์ ๋ํ ์์น ์ ๋ณด๋ฅผ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋ก ์ถ๊ฐ์ ์ผ๋ก ์ ๋ฌํ๊ณ , ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋ก๋ถํฐ ์ ๊ณต๋ ์ ์ค์ณ ๊ฐ์ง ๋์์ ๋ํ ์์น ์ ๋ณด์ ๊ธฐ๋ฐํ์ฌ ์ฌ์ฉ์์ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ ์ค์ ํ์ฌ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ ๋ํด์๋ง ์ ์ค์ณ ๊ฐ์ง๋ฅผ ์ํํ ์ ์๋ค.The face detector, when detecting a user's face, selects the detected face as a gesture detection target and additionally transmits the location information to the gesture detector, and the gesture detector detects a gesture detected by the face detector. The gesture detection area of the user may be set based on the location information to perform gesture detection only on the gesture detection area.
ํนํ, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋, ๋ณต์ ๊ฐ์ ์ผ๊ตด ๊ฐ์ง์, ์ด์ค ์ด๋ ํ๋์ ์ผ๊ตด์ ์ ์ค์ณ ๊ฐ์ง ๋์์ผ๋ก ์ ์ ํ์ฌ ์ด์ ๋ํ ์์น ์ ๋ณด๋ฅผ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋ก ์ถ๊ฐ์ ์ผ๋ก ์ ๋ฌํ๊ณ , ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋ก๋ถํฐ ์ ๊ณต๋ ์ ์ค์ณ ๊ฐ์ง ๋์์ ๋ํ ์์น ์ ๋ณด์ ๊ธฐ๋ฐํ์ฌ ์ฌ์ฉ์์ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ ์ค์ ํ์ฌ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ ๋ํด์๋ง ์ ์ค์ณ ๊ฐ์ง๋ฅผ ์ํํ ์ ์๋ค.In particular, when detecting a plurality of faces, the face detection unit selects one of the faces as a gesture detection target, and additionally transmits the location information to the gesture detection unit, and the gesture detection unit provides a gesture provided from the face detection unit. A gesture detection area of a user may be set based on location information on a detection target to perform gesture detection only on the gesture detection area.
์ผ ์๋ก, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋, ๋ณต์ ๊ฐ์ ์ผ๊ตด ๊ฐ์ง์, ์๊ธฐ ์์ง์ ๊ฐ์ง๋ถ๋ฅผ ํตํด ๊ฐ์ง๋ ์์ง์ ์ ๋ณด์ ๊ฐ์ฅ ๊ฐ๊น์ด ์ผ๊ตด์ ์ ์ค์ณ ๊ฐ์ง ๋์์ผ๋ก ์ ์ ํ ์ ์๋ค.For example, when detecting a plurality of faces, the face detector may select a face closest to the motion information detected by the motion detector as a gesture detection target.
๋ค๋ฅธ ์๋ก, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋, ๋ณต์ ๊ฐ์ ์ผ๊ตด ๊ฐ์ง์, ์๊ธฐ ์์ง์ ๊ฐ์ง๋ถ๋ฅผ ํตํด ๊ฐ์ง๋ ์์ง์ ์ ๋ณด์ ์ ๋ถ ๋๋ ์ผ๋ถ๊ฐ ์ค์ฒฉ๋๋ ์ผ๊ตด์ ์ ์ค์ณ ๊ฐ์ง ๋์์ผ๋ก ์ ์ ํ ์ ์๋ค.As another example, when detecting a plurality of faces, the face detector may select a face that overlaps all or a part of the motion information detected by the motion detector as a gesture detection target.
์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ๋์์ ์์น๋ฅผ ๊ธฐ์ค์ผ๋ก ์ผ์ ๋ฒ์ ์ด๋ด์ ์์ญ์ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ผ๋ก ์ค์ ํ ์ ์๋ค.The gesture detection unit may set an area within a predetermined range as a gesture detection area based on the position of the gesture detection target.
๋ํ, ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ๋์์ ์์น๋ก๋ถํฐ ์ผ์ ๊ฑฐ๋ฆฌ ์ด๊ฒฉ๋ ์ผ์ ํฌ๊ธฐ์ ์์ญ์ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ผ๋ก ์ค์ ํ ์ ์๋ค.The gesture detecting unit may set a region having a predetermined size spaced apart from the position of the gesture sensing target as a gesture sensing region.
์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์ฌ์ฉ์์ ์ ์ค์ณ์ ๋ชจ์ ๋ฐ ์์ง์ ๋ฐฉํฅ์ ๊ฐ์งํ๊ณ , ์ด์ ๋์๋๋ ๋์ ์ ํธ๋ฅผ ์ถ๋ ฅํ ์ ์๋ค.The gesture detector may detect a shape and a movement direction of a user's gesture and output an operation signal corresponding thereto.
์ด์ ๊ฐ์ ์ ์ค์ณ ๊ฐ์ง ์ฅ์น๋, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋ฅผ ํตํด ์ผ๊ตด ๊ฐ์ง์ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๊ฐ ์ฌ์ฉ์์ ์ ์ค์ณ๋ฅผ ๊ฐ์งํ๋ ์ผ์ ์๊ฐ ๋์ ์๋ ์ ๋ณด๋ฅผ ์ ๊ณตํ๋ ์๋๋ถ;๋ฅผ ๋ ํฌํจํ ์ ์๋ค.Such a gesture detecting apparatus may further include an alarm unit configured to provide alarm information for a predetermined time when the gesture detecting unit detects a user's gesture when detecting a face through the face detecting unit.
๋ณธ ๋ฐ๋ช ์ ๋ค๋ฅธ ์ธก๋ฉด์ ๋ฐ๋ผ ์์ง์ ๊ฐ์ง๋ถ, ์ผ๊ตด/์ ์ค์ณ ๊ฐ์ง๋ถ ๋ฐ ๊ฒฐ๊ณผ ์ถ๋ ฅ๋ถ๋ฅผ ํฌํจํ๋ ์ ์ค์ณ ๊ฐ์ง ์ฅ์น๋, ์ฌ์ฉ์์ ์์ง์(๋ชจ์ )์ ๊ฐ์งํ๊ณ , ์์ง์ ๊ฐ์ง์ ์๊ธฐ ์ผ๊ตด/์ ์ค์ณ ๊ฐ์ง๋ถ๋ก ์จ์ดํฌ์ (wake up) ์ ํธ๋ฅผ ์ ๋ฌํ๋ ์์ง์ ๊ฐ์ง๋ถ; ์ 1 ์ํ๋ก ๋์ํ๋ค ์๊ธฐ ์์ง์ ๊ฐ์ง๋ถ๋ก๋ถํฐ ์จ์ดํฌ์ ์ ํธ๋ฅผ ์์ ํ๊ฒ ๋๋ฉด ์ผ์ ์๊ฐ ๋์ ์ 2 ์ํ๋ก ๋์ํ๋ฉฐ ์ฌ์ฉ์์ ์ผ๊ตด์ ๊ฐ์งํ๊ณ , ์ผ๊ตด ๊ฐ์ง์ ์ผ์ ์๊ฐ ๋์ ์ฌ์ฉ์์ ์ ์ค์ณ๋ฅผ ๊ฐ์งํ์ฌ ๊ฐ์ง๋ ์ ์ค์ณ์ ๋์๋๋ ๋์ ์ ํธ๋ฅผ ์ถ๋ ฅํ๋, ์ผ์ ์๊ฐ ๋์ ์ฌ์ฉ์์ ์ผ๊ตด์ด ๊ฐ์ง๋์ง ์๊ฑฐ๋ ์ ์ค์ณ๊ฐ ๊ฐ์ง๋์ง ์์ผ๋ฉด ์ 1 ์ํ๋ก ๋ณต๊ทํ๋ ์ผ๊ตด/์ ์ค์ณ ๊ฐ์ง๋ถ; ๋ฐ ์๊ธฐ ์ผ๊ตด/์ ์ค์ณ ๊ฐ์ง๋ถ๋ก๋ถํฐ ์์ ๋ ๋์ ์ ํธ์ ๋์ํ ๋์์ ์ํํ๋ ๊ฒฐ๊ณผ ์ถ๋ ฅ๋ถ;๋ฅผ ํฌํจํ๋ค.According to another aspect of the present invention, a gesture detection device including a motion detector, a face / gesture detector, and a result output unit detects a user's motion (motion) and wakes up to the face / gesture detector when motion is detected. up) a motion detector for transmitting a signal; When the wake-up signal is received from the motion detection unit, the device operates in the second state for a predetermined time, detects a user's face, and detects a user's gesture for a predetermined time. A face / gesture detection unit that outputs a corresponding operation signal and returns to a first state when a user's face is not detected or a gesture is not detected for a predetermined time; And a result output unit configured to perform an operation corresponding to the operation signal received from the face / gesture detector.
์ด๋, ์๊ธฐ ์ 1 ์ํ๋ ์ 2 ์ํ์ ๋นํด ์ ์ ๋์ ์ ๋ ฅ์ ์๋ชจํ๋ ๋์ ์ํ๊ฐ ์ ์ฉ๋ ์ ์๋ค.In this case, the first state may be an operating state that consumes less power than the second state.
์๊ธฐ ์ผ๊ตด/์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์๊ธฐ ์์ง์ ๊ฐ์ง๋ถ๋ก๋ถํฐ ์จ์ดํฌ์ ์ ํธ๋ฅผ ์์ ํ๊ฒ ๋๋ฉด ์ผ์ ์๊ฐ ๋์ ์ฌ์ฉ์์ ์ผ๊ตด์ ๊ฐ์งํ๊ณ ๊ฐ์ง๋ ์ผ๊ตด์ ์ ์ค์ณ ๊ฐ์ง ๋์์ผ๋ก ์ ์ ํ๋ ์ผ๊ตด ๊ฐ์ง๋ถ; ๋ฐ ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ๋ก๋ถํฐ ์ ์ ๋ ์ ์ค์ณ ๊ฐ์ง ๋์์ ๋ํ ์ ๋ณด๋ฅผ ์ ๋ฌ๋ฐ๊ณ , ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ๋์์ ๋ํ ์ ๋ณด์ ๊ธฐ๋ฐํ์ฌ ์ฌ์ฉ์์ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ ์ค์ ํ๊ณ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ ๋ํด์๋ง ์ ์ค์ณ ๊ฐ์ง๋ฅผ ์ํํ๋ ์ ์ค์ณ ๊ฐ์ง๋ถ;๋ฅผ ํฌํจํ ์ ์๋ค.The face / gesture detector may include: a face detector configured to detect a user's face for a predetermined time and select the detected face as a gesture detection target when receiving a wake-up signal from the motion detector; And a gesture detection unit configured to receive information on the selected gesture detection target from the face detection unit, set a user's gesture detection region based on the information on the gesture detection target, and perform gesture detection only on the gesture detection region. It can include;
ํนํ, ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ๋์์ ์์น๋ฅผ ๊ธฐ์ค์ผ๋ก ์ผ์ ๋ฒ์ ์ด๋ด์ ์์ญ์ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ผ๋ก ์ค์ ํ ์ ์๋ค.In particular, the gesture detection unit may set an area within a predetermined range as a gesture detection area based on the position of the gesture detection target.
๋ณธ ๋ฐ๋ช ์ ๋ ๋ค๋ฅธ ์ธก๋ฉด์ ๋ฐ๋ผ ๋ชจ์ /์ผ๊ตด ๊ฐ์ง๋ถ, ์ ์ค์ณ ๊ฐ์ง๋ถ ๋ฐ ๊ฒฐ๊ณผ ์ถ๋ ฅ๋ถ๋ฅผ ํฌํจํ๋ ์ ์ค์ณ ๊ฐ์ง ์ฅ์น๋, ์ฌ์ฉ์์ ์์ง์(๋ชจ์ )์ ๊ฐ์งํ๊ณ , ์์ง์ ๊ฐ์ง์ ์ฌ์ฉ์์ ์ผ๊ตด์ ๊ฐ์งํ๋ฉฐ, ์ผ๊ตด ๊ฐ์ง์ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋ก ์จ์ดํฌ์ (wake up) ์ ํธ๋ฅผ ์ ๋ฌํ๋ ๋ชจ์ /์ผ๊ตด ๊ฐ์ง๋ถ; ์ 1 ์ํ๋ก ๋์ํ๋ค ์๊ธฐ ๋ชจ์ /์ผ๊ตด ๊ฐ์ง๋ถ๋ก๋ถํฐ ์จ์ดํฌ์ ์ ํธ๋ฅผ ์์ ํ๊ฒ ๋๋ฉด ์ผ์ ์๊ฐ ๋์ ์ 2 ์ํ๋ก ๋์ํ๋ฉฐ ์ฌ์ฉ์์ ์ ์ค์ณ๋ฅผ ๊ฐ์งํ๊ณ ๋ค์ ์ 1 ์ํ๋ก ๋ณต๊ทํ๊ณ , ์ ์ค์ณ ๊ฐ์ง์ ๊ฐ์ง๋ ์ ์ค์ณ์ ๋์๋๋ ๋์ ์ ํธ๋ฅผ ์ถ๋ ฅํ๋ ์ ์ค์ณ ๊ฐ์ง๋ถ; ๋ฐ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋ก๋ถํฐ ์์ ๋ ๋์ ์ ํธ์ ๋์ํ ๋์์ ์ํํ๋ ๊ฒฐ๊ณผ ์ถ๋ ฅ๋ถ;๋ฅผ ํฌํจํ๋ค.According to another aspect of the present invention, a gesture detection device including a motion / face detection unit, a gesture detection unit, and a result output unit detects a user's motion (motion), detects a user's face when detecting a motion, and detects a face. A motion / face detector configured to transfer a wake up signal to the gesture detector; Operating in the first state When receiving the wake-up signal from the motion / face detection unit, the device operates in the second state for a predetermined time, detects the user's gesture and returns to the first state, and detects the gesture when the gesture is detected. A gesture detector for outputting a corresponding operation signal; And a result output unit configured to perform an operation corresponding to the operation signal received from the gesture detector.
์ด๋, ์๊ธฐ ์ 1 ์ํ๋ ์ 2 ์ํ์ ๋นํด ์ ์ ๋์ ์ ๋ ฅ์ ์๋ชจํ๋ ๋์ ์ํ๊ฐ ์ ์ฉ๋ ์ ์๋ค.In this case, the first state may be an operating state that consumes less power than the second state.
์๊ธฐ ๋ชจ์ /์ผ๊ตด ๊ฐ์ง๋ถ๋, ์ฌ์ฉ์์ ์ผ๊ตด ๊ฐ์ง์ ๊ฐ์ง๋ ์ผ๊ตด์ ์ ์ค์ณ ๊ฐ์ง ๋์์ผ๋ก ์ ์ ํ๊ณ , ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ๋์์ ๋ํ ์์น ์ ๋ณด๋ฅผ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋ก ์ถ๊ฐ์ ์ผ๋ก ์ ๋ฌํ๊ณ , ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์๊ธฐ ๋ชจ์ /์ผ๊ตด ๊ฐ์ง๋ถ๋ก๋ถํฐ ์ ๋ฌ๋ฐ์ ์ ์ค์ณ ๊ฐ์ง ๋์์ ๋ํ ์์น ์ ๋ณด์ ๊ธฐ๋ฐํ์ฌ ์ฌ์ฉ์์ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ ์ค์ ํ๊ณ ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ ๋ํด์๋ง ์ ์ค์ณ ๊ฐ์ง๋ฅผ ์ํํ๋ ์ ์ค์ณ ๊ฐ์ง๋ถ;๋ฅผ ํฌํจํ ์ ์๋ค.The motion / face detection unit selects a face detected when a user's face is detected as a gesture detection target, additionally transmits position information on the gesture detection target to the gesture detection unit, and the gesture detection unit detects the motion / face detection. And a gesture detection unit configured to set a user's gesture detection area based on the location information of the gesture detection target received from the unit, and perform gesture detection only on the gesture detection area.
์ด๋, ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง๋ถ๋, ์๊ธฐ ์ ์ค์ณ ๊ฐ์ง ๋์์ ์์น๋ฅผ ๊ธฐ์ค์ผ๋ก ์ผ์ ๋ฒ์ ์ด๋ด์ ์์ญ์ ์ ์ค์ณ ๊ฐ์ง ์์ญ์ผ๋ก ์ค์ ํ ์ ์๋ค.In this case, the gesture detection unit may set an area within a predetermined range as a gesture detection area based on the position of the gesture detection target.
๋ณธ ๋ฐ๋ช ์ ๋ฐ๋์งํ ์ค์์์ ๋ฐ๋ฅธ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ ์ฌ์ฉ์์ ์์ง์ ๋ฐ ์ผ๊ตด์ ๊ฐ์งํ์ฌ ์ ์ค์ณ ์ ๋ ฅ ์ค๋น๊ฐ ์๋ฃ๋์๋์ง๋ฅผ ํ๋ณํ๊ณ , ์ ์ค์ณ ์ ๋ ฅ ์ค๋น๊ฐ ์๋ฃ๋ ๊ฒฝ์ฐ์๋ง ์ ์ค์ณ ์ธ์ ๋ชจ๋์ ๋์ ์ํด์ผ๋ก์จ ์ ์ฒด ์๋ชจ ์ ๋ ฅ๋ ๋ฐ ์ฐ์ฐ๋์ ๊ฐ์์ํค๋ ํจ๊ณผ๊ฐ ์๋ค.The gesture recognition apparatus according to the preferred embodiment of the present invention detects the user's movement and face to determine whether the gesture input preparation is completed, and operates the gesture recognition module only when the gesture input preparation is completed, thereby reducing the total power consumption and the amount of calculation. It is effective to let.
ํนํ, ์ฌ๋์ ์์ง์ ๊ฐ์ง ๋์์ ์ ์ ๋ ฅ์ผ๋ก ๊ตฌ๋ ๊ฐ๋ฅํ๋ฏ๋ก, ์ ์ ๋ ฅ์ผ๋ก ์ฌ๋์ ์์ง์(๋ฐ ์ผ๊ตด)์ ๊ฐ์งํ์ฌ ์๋์ ์ผ๋ก ๋ง์ ์ ๋ ฅ์ ํ์๋ก ํ๋ ์ ์ค์ณ ์ธ์ ๋์์ ์ํํ ์ง ์ฌ๋ถ๋ฅผ ํ๋ณํจ์ผ๋ก์จ ์ข ๋ ๋๋น ๋ง์ ์๋น ์ ๋ ฅ๋์ ๊ฐ์์ํฌ ์ ์๋ค๋ ํจ๊ณผ๊ฐ ์๋ค.In particular, since the motion detection operation of the human being can be driven with low power, it is possible to detect a human motion (and face) at low power and determine whether to perform a gesture recognition operation that requires a relatively large amount of power. There is an effect that can be reduced.
๋ํ, ์ ์ค์ณ ๊ฐ์ง(๊ฒ์ถ) ์ ์ ์ฌ๋์ ์ผ๊ตด์ ๋จผ์ ๊ฐ์ง(๊ฒ์ถ)ํ์ฌ ์ ์ค์ณ๋ฅผ ๊ฐ์งํ ์์ญ์ ํ์ (์ ํ)ํจ์ผ๋ก์จ ์ ์ฒด ์์ ์ ๋ณด๊ฐ ์๋ ์ผ๋ถ ์์ ์ ๋ณด๋ง์ ํ์ฉํ์ฌ ์ ์ค์ณ๋ฅผ ๊ฐ์ง(๊ฒ์ถ)ํ ์ ์๋ค๋ ํน์ง์ด ์์ผ๋ฉฐ, ์ด๋ก ์ธํด ์ ์ค์ณ ๊ฒ์ถ์ ์ํ ์ฐ์ฐ๋์ ์ต์ํ์ํฌ ์ ์๋ค๋ ํจ๊ณผ๊ฐ ์๋ค.In addition, by detecting (detecting) a face of a person before gesture detection (detection) first, by limiting (limiting) an area for detecting a gesture, a gesture can be detected (detected) by using only some image information instead of all image information. This has the effect of minimizing the amount of computation for gesture detection.
๋ 1์ ๋ณธ ๋ฐ๋ช ์ ์ 1 ์์ ๋ฐ๋ฅธ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ฅผ ๋ํ๋ธ ๋๋ฉด,1 is a view showing a gesture recognition device according to a first example of the present invention;
๋ 2๋ ๋ณธ ๋ฐ๋ช ์ ๋ฐ๋ผ ์ฌ์ฉ์์ ์ผ๊ตด์ด ๊ฐ์ง๋ ๊ฒ์ผ๋ก ํ๋จํ๋ ์ผ ์๋ฅผ ๋ํ๋ธ ๋๋ฉด,2 illustrates an example of determining that a face of a user is detected according to the present invention;
๋ 3์ ๋ณธ ๋ฐ๋ช ์ ๋ฐ๋ผ ๋ณต์ ๊ฐ์ ์ผ๊ตด ๊ฐ์ง์, ์ด์ค ์ด๋ ํ๋์ ์ผ๊ตด์ ์ ์ค์ณ ๊ฐ์ง ๋์์ผ๋ก ์ ์ ํ๋ ์ผ ์๋ฅผ ๋ํ๋ธ ๋๋ฉด,3 is a view illustrating an example of selecting one of the faces as a gesture detection target when detecting a plurality of faces according to the present invention;
๋ 4๋ ๋ณธ ๋ฐ๋ช ์ ๋ฐ๋ผ ๊ฐ์ง๋ ์ผ๊ตด ์์ญ ์์์ ์ ์ค์ณ๋ฅผ ์ธ์ํ๋ ์ผ ์๋ฅผ ๋ํ๋ธ ๋๋ฉด,4 illustrates an example of recognizing a gesture on a detected face area according to the present invention;
๋ 5๋ ๋ณธ ๋ฐ๋ช ์ ๋ฐ๋ผ ๊ฐ์ง๋ ์ผ๊ตด ์์ญ์ผ๋ก๋ถํฐ ์ผ์ ๊ฑฐ๋ฆฌ ์ด๊ฒฉ๋ ์์ญ ์์์ ์ ์ค์ณ๋ฅผ ์ธ์ํ๋ ์ผ ์๋ฅผ ๋ํ๋ธ ๋๋ฉด,5 illustrates an example of recognizing a gesture on an area spaced a predetermined distance from a detected face area according to the present invention;
๋ 6์ ๋ณธ ๋ฐ๋ช ์ ๋ฐ๋ผ ๊ฐ์ง๋๋ ์ ์ค์ณ์ ๋ชจ์์ ์ผ ์๋ฅผ ๋ํ๋ธ ๋๋ฉด,6 is a view showing an example of the shape of the gesture detected in accordance with the present invention,
๋ 7์ ๋ณธ ๋ฐ๋ช ์ ์ 2 ์์ ๋ฐ๋ฅธ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ฅผ ๋ํ๋ธ ๋๋ฉด,7 is a view showing a gesture recognition device according to a second example of the present invention;
๋ 8์ ๋ณธ ๋ฐ๋ช ์ ์ 3 ์์ ๋ฐ๋ฅธ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ฅผ ๋ํ๋ธ ๋๋ฉด์ด๋ค.8 is a diagram illustrating a gesture recognizing apparatus according to a third example of the present invention.
๋ณธ ๋ฐ๋ช ์ ๋ค์ํ ๋ณํ์ ๊ฐํ ์ ์๊ณ ์ฌ๋ฌ ๊ฐ์ง ์ค์์๋ฅผ ๊ฐ์ง ์ ์๋ ๋ฐ, ํน์ ์ค์์๋ค์ ๋๋ฉด์ ์์ํ๊ณ ์์ธํ ์ค๋ช ์ ์์ธํ๊ฒ ์ค๋ช ํ๊ณ ์ ํ๋ค. ๊ทธ๋ฌ๋, ์ด๋ ๋ณธ ๋ฐ๋ช ์ ํน์ ํ ์ค์ ํํ์ ๋ํด ํ์ ํ๋ ค๋ ๊ฒ์ด ์๋๋ฉฐ, ๋ณธ ๋ฐ๋ช ์ ์ฌ์ ๋ฐ ๊ธฐ์ ๋ฒ์์ ํฌํจ๋๋ ๋ชจ๋ ๋ณํ, ๊ท ๋ฑ๋ฌผ ๋ด์ง ๋์ฒด๋ฌผ์ ํฌํจํ๋ ๊ฒ์ผ๋ก ์ดํด๋์ด์ผ ํ๋ค. ๋ณธ ๋ฐ๋ช ์ ์ค๋ช ํจ์ ์์ด์ ๊ด๋ จ๋ ๊ณต์ง ๊ธฐ์ ์ ๋ํ ๊ตฌ์ฒด์ ์ธ ์ค๋ช ์ด ๋ณธ ๋ฐ๋ช ์ ์์ง๋ฅผ ํ๋ฆด ์ ์๋ค๊ณ ํ๋จ๋๋ ๊ฒฝ์ฐ ๊ทธ ์์ธํ ์ค๋ช ์ ์๋ตํ๋ค.As the invention allows for various changes and numerous embodiments, particular embodiments will be illustrated in the drawings and described in detail in the written description. However, this is not intended to be limited to the particular embodiment of the present invention, it should be understood to include all transformations, equivalents, and substitutes included in the spirit and scope of the present invention. In the following description of the present invention, if it is determined that the detailed description of the related known technology may obscure the gist of the present invention, the detailed description thereof will be omitted.
์ 1, ์ 2 ๋ฑ์ ์ฉ์ด๋ ๋ค์ํ ๊ตฌ์ฑ์์๋ค์ ์ค๋ช ํ๋๋ฐ ์ฌ์ฉ๋ ์ ์์ง๋ง, ์๊ธฐ ๊ตฌ์ฑ์์๋ค์ ์๊ธฐ ์ฉ์ด๋ค์ ์ํด ํ์ ๋์ด์๋ ์ ๋๋ค. ์๊ธฐ ์ฉ์ด๋ค์ ํ๋์ ๊ตฌ์ฑ์์๋ฅผ ๋ค๋ฅธ ๊ตฌ์ฑ์์๋ก๋ถํฐ ๊ตฌ๋ณํ๋ ๋ชฉ์ ์ผ๋ก๋ง ์ฌ์ฉ๋๋ค.Terms such as first and second may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another.
๋ณธ ์ถ์์์ ์ฌ์ฉํ ์ฉ์ด๋ ๋จ์ง ํน์ ํ ์ค์์๋ฅผ ์ค๋ช ํ๊ธฐ ์ํด ์ฌ์ฉ๋ ๊ฒ์ผ๋ก, ๋ณธ ๋ฐ๋ช ์ ํ์ ํ๋ ค๋ ์๋๊ฐ ์๋๋ค. ๋จ์์ ํํ์ ๋ฌธ๋งฅ์ ๋ช ๋ฐฑํ๊ฒ ๋ค๋ฅด๊ฒ ๋ปํ์ง ์๋ ํ, ๋ณต์์ ํํ์ ํฌํจํ๋ค. ๋ณธ ์ถ์์์, "ํฌํจํ๋ค" ๋๋ "๊ฐ์ง๋ค" ๋ฑ์ ์ฉ์ด๋ ๋ช ์ธ์์์ ๊ธฐ์ฌ๋ ํน์ง, ์ซ์, ๋จ๊ณ, ๋์, ๊ตฌ์ฑ์์, ๋ถํ ๋๋ ์ด๋ค์ ์กฐํฉํ ๊ฒ์ด ์กด์ฌํจ์ ์ง์ ํ๋ ค๋ ๊ฒ์ด์ง, ํ๋ ๋๋ ๊ทธ ์ด์์ ๋ค๋ฅธ ํน์ง๋ค์ด๋ ์ซ์, ๋จ๊ณ, ๋์, ๊ตฌ์ฑ์์, ๋ถํ ๋๋ ์ด๋ค์ ์กฐํฉํ ๊ฒ๋ค์ ์กด์ฌ ๋๋ ๋ถ๊ฐ ๊ฐ๋ฅ์ฑ์ ๋ฏธ๋ฆฌ ๋ฐฐ์ ํ์ง ์๋ ๊ฒ์ผ๋ก ์ดํด๋์ด์ผ ํ๋ค.The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of the present invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this application, the terms "comprise" or "have" are intended to indicate that there is a feature, number, step, operation, component, part, or combination thereof described in the specification, and one or more other features. It is to be understood that the present invention does not exclude the possibility of the presence or the addition of numbers, steps, operations, components, components, or a combination thereof.
์ดํ, ๋ณธ ๋ฐ๋ช ์ ์ค์์๋ฅผ ์ฒจ๋ถํ ๋๋ฉด๋ค์ ์ฐธ์กฐํ์ฌ ์์ธํ ์ค๋ช ํ๊ธฐ๋ก ํ๋ค.Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
๋ 1์ ๋ณธ ๋ฐ๋ช ์ ์ 1 ์์ ๋ฐ๋ฅธ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ฅผ ๋ํ๋ธ ๋๋ฉด์ด๋ค.1 is a diagram illustrating a gesture recognizing apparatus according to a first example of the present invention.
๋ 1์ ๋์๋ ๋ฐ์ ๊ฐ์ด, ๋ณธ ๋ฐ๋ช
์ ์ผ ์์ ๋ฐ๋ฅธ ์ ์ค์ณ ์ธ์ ์ฅ์น๋ ์์ง์(๋ชจ์
) ๊ฐ์ง๋ถ(10), ์ผ๊ตด ๊ฐ์ง๋ถ(20), ์ ์ค์ณ ๊ฐ์ง๋ถ(30) ๋ฐ ๊ฒฐ๊ณผ ์ถ๋ ฅ๋ถ(40)๋ฅผ ํฌํจํ๋ค. ์ดํ, ๊ฐ ๊ตฌ์ฑ ์์๋ณ๋ก ๊ตฌ๋ถํ์ฌ ์์ธํ ์ค๋ช
ํ๋ค.As shown in FIG. 1, the gesture recognition apparatus according to an embodiment of the present invention may include a motion (motion) detecting unit 10, a
[์์ง์ [movement ๊ฐ์ง๋ถ์Detector ๋ํ ์ค๋ช ]ย Description of
์์ง์ ๊ฐ์ง๋ถ(10)๋ ์ฌ์ฉ์์ ์์ง์(๋ชจ์ )์ ๊ฐ์งํ๋ค. ์ด๋ฅผ ์ํ ๊ธฐ์ ๊ตฌ์ฑ์ผ๋ก์จ ์์ง์ ๊ฐ์ง๋ถ(10)๋ก๋ ์ธ์ฒด๊ฐ์ง ์ผ์๋ ์์ ์ผ์๊ฐ ์ ์ฉ๋ ์ ์๋ค.The motion detector 10 detects a user's motion (motion). As a technical configuration for this, a human body sensor or an image sensor may be applied to the motion detector 10.
์ธ์ฒด๊ฐ์ง ์ผ์์ ๊ฒฝ์ฐ ์ ์ธ์ ํจ์๋ธ ๋ฐฉ์(PIR : Passive Infrared Ray), ๊ทผ์ ์ธ์ ์กํฐ๋ธ๋ฐฉ์, ์ด์ํ, ๋ง์ดํฌ๋กํ ๋ฑ์ ํตํด ์ผ์ ๊ฐ๊ฒฉ์ผ๋ก ์ธ์ฒด์ ๋ํ ์ ๋ณด๋ฅผ ํ๋ํ๊ณ , ํ๋๋๋ ์ ๋ณด๊ฐ ์ฐจ์ด๊ฐ์ด ์ผ์ ์ด์์ด๋ผ๊ณ ํ๋จ๋๋ฉด ์ฌ์ฉ์์ ์์ง์(๋ชจ์ )์ด ๊ฐ์ง๋ ๊ฒ์ผ๋ก ํ๋จํ ์ ์๋ค.In the case of the human body detection sensor, information about the human body is acquired at regular intervals through an infrared passive method (PIR), near infrared active method, ultrasonic waves, microwaves, and the like, and when it is determined that the difference between the acquired information is more than a predetermined level, It may be determined that the motion of the motion is detected.
์์ ์ผ์์ ๊ฒฝ์ฐ CMOS(Complementary MOS) ์์ ์ผ์, CCD(Charge-Coupled Device), IR ์ด๋ฏธ์ง ์ผ์ ๋ฑ ์ฌ๋์ ๋ํ ์์ ์ ๋ณด๋ฅผ ํ๋ํ ์ ์๋ ๋ชจ๋ ๊ตฌ์ฑ ์์๊ฐ ์ ์ฉ๋ ์ ์๋ค. ์ด ๊ฒฝ์ฐ์๋ ์๊ธฐ ์์ ์ผ์๋ ์ผ์ ๊ฐ๊ฒฉ์ผ๋ก ํ๋๋๋ ์์ ์ ๋ณด์ ์ฐจ๋ถ ๊ฐ ๋๋ ์ ํฉ ๊ฒฐ๊ณผ๋ฅผ ๋ฐํ์ผ๋ก ์ฌ์ฉ์์ ์์ง์(๋ชจ์ )์ด ์์๋์ง๋ฅผ ๊ฐ์งํ ์ ์๋ค.In the case of an image sensor, all components capable of acquiring image information of a person, such as a complementary MOS (CMOS) image sensor, a charge-coupled device (CCD), and an IR image sensor, may be applied. In this case, the image sensor may detect whether there is a motion (motion) of the user based on a difference value or a matching result of the image information acquired at a predetermined interval.
์์ง์ ๊ฐ์ง๋ถ(10)๋ ์๊ธฐ์ ๊ฐ์ ๋ค์ํ ๋ฐฉ๋ฒ๋ค์ ํตํด ์์ง์์ด ๊ฐ์ง๋ ๊ฒฝ์ฐ, ๋ณ๋์ ์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ก ์จ์ดํฌ์
(wake up) ์ ํธ๋ฅผ ์ ๋ฌํ๋ค.When the motion detection unit 10 detects a motion through the various methods as described above, the motion detection unit 10 transmits a wake up signal to the separate
[์ผ๊ตด [Face ๊ฐ์ง๋ถ์Detector ๋ํ ์ค๋ช ]ย Description of
์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ ํ์์๋ ์ 1 ์ํ๋ก ๋์ํ๋ค ์๊ธฐ ์์ง์ ๊ฐ์ง๋ถ(10)๋ก๋ถํฐ ์จ์ดํฌ์
(wake up) ์ ํธ๋ฅผ ์์ ํ๊ฒ ๋๋ฉด ์ผ์ ์๊ฐ ๋์ ์ 2 ์ํ๋ก ๋์ํ๋ฉฐ ์ฌ์ฉ์์ ์ผ๊ตด์ ๊ฐ์งํ๋ค. ์ด๋, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ ์ผ์ ์๊ฐ ๋์ ์ผ์ ์์ญ์ ๋ํ์ฌ ์ฌ์ฉ์์ ์ผ๊ตด์ ๊ฐ์งํ๊ณ ๋ ํ, ๋ค์ ์ 1 ์ํ๋ก ๋ณต๊ทํ๊ฒ ๋๋ค.The
๋ณธ ๋ฐ๋ช ์ ์ ์ฉ ๊ฐ๋ฅํ ์ค์์์์, ์ 1 ์ํ๋ก๋ ์ฌ๋ฆฝ(sleep) ์ํ, ํ์ ์คํ(power off) ์ํ ๋ฑ์ด ์ ์ฉ๋ ์ ์์ผ๋ฉฐ, ์ 2 ์ํ๋ก๋ ์ ์ ๋์ ์ํ๊ฐ ์ ์ฉ๋ ์ ์๋ค.In an embodiment applicable to the present invention, a sleep state, a power off state, or the like may be applied to the first state, and a normal operating state may be applied to the second state.
์ฆ, ๋ณธ์ ๋ฐ๋ช ์ ์์ด, ์ 1 ์ํ๋ ์ 2 ์ํ์ ๋นํด ์ ์ ๋์ ์ ๋ ฅ์ ์๋ชจํ๋ ๋์ ์ํ๋ฅผ ์๋ฏธํ ์ ์๋ค.That is, in the present invention, the first state may mean an operating state that consumes a smaller amount of power than the second state.
์ผ๋ฐ์ ์ผ๋ก, ํ์ ์คํ(power off) ์ํ๋ ๋ชจ๋ ๊ตฌ์ฑ ์์๋ค๋ก์ ์ ์์ ์ฐจ๋จํ ์ํ๋ฅผ ์๋ฏธํ๋ฉฐ, ์ฌ๋ฆฝ(sleep) ์ํ๋ ์ผ๋ถ ๊ตฌ์ฑ ์์๋ง ๋์ํ๋๋ก ์ ์ดํ๊ณ ๋ค๋ฅธ ๊ตฌ์ฑ ์์๋ค์ ์ ์์ ์ฐจ๋จํ์ฌ ์ ์ฒด ์๋ชจ ์ ๋ ฅ๋์ ์ต์ํํ๋ ์ํ๋ฅผ ์๋ฏธํ๋ค. ์ด๋, ์จ์ดํฌ์ ์ ํธ๋ฅผ ์์ ํ๊ฒ ๋๋ฉด ๋ชจ๋ ๊ตฌ์ฑ ์์์ ์ ์์ ๊ณต๊ธํ์ฌ ๋ชจ๋ ๊ตฌ์ฑ ์์๋ค์ด ๋์ํ ์ ์๋๋ก ์ ์ดํ๊ฒ ๋๋ค.In general, the power off state means that power to all components is turned off. The sleep state controls only some components to operate and the other components are turned off to consume the entire power. It means a state that minimizes the amount of power. At this time, when the wake-up signal is received, power is supplied to all components to control all components to operate.
์ผ ์๋ก, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ ํ์์๋ ์ 1 ์ํ๋ก์ ์๊ธฐ ์์ง์ ๊ฐ์ง๋ถ(10)๋ก๋ถํฐ ์จ์ดํฌ์
์ ํธ๋ฅผ ์์ ํ๋ ๊ตฌ์ฑ ์ธ์ ๋ค๋ฅธ ๊ตฌ์ฑ ์์๋ค๋ก์ ๊ณต๊ธ ์ ์์ ์ฐจ๋จํ๋ ์ฌ๋ฆฝ ์ํ๋ก ๋์ํ ์ ์๋ค. ์ด์ด, ์์ง์ ๊ฐ์ง๋ถ(10)๋ก๋ถํฐ ์จ์ดํฌ์
์ ํธ๋ฅผ ์์ ํ๊ฒ ๋๋ฉด ๊ธฐํ ๊ตฌ์ฑ ์์๋ค(์ฌ์ฉ์์ ์ผ๊ตด์ ๊ฒ์ถํ๋ ๋ชจ๋ ๋ฑ)๋ก ์ ์์ ๊ณต๊ธํ์ฌ ๋์์ผ ํ๋ฉฐ ์ 2 ์ํ๋ก์ ์ฌ์ฉ์์ ์ผ๊ตด ๊ฐ์ง ๋์์ ์ํํ๊ฒ ๋๋ค.For example, the
์ด์ ๊ฐ์ ์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ก๋ ์ฌ๋์ ์ผ๊ตด ํํ๋ฅผ ๊ฐ์งํ ์ ์๋ ๋ชจ๋ ๊ตฌ์ฑ ์์๊ฐ ์ ์ฉ๋ ์ ์๋ค. ์ผ ์๋ก, ์์ง์ ๊ฐ์ง๋ถ(10)์ ๊ฐ์ด, ์ธ์ฒด๊ฐ์ง ์ผ์, ์์ ์ผ์ ๋ชจ๋ ์ ์ฉ ๊ฐ๋ฅํ๋ค.As the
[์ผ๊ตด [Face ๊ฐ์ง๋ถ์Detector ์ผ๊ตด ๊ฐ์ง ํ๋จ ๊ธฐ์ค์ ์]ย Example of Face Detection Criteria]
๋ณธ ๋ฐ๋ช
์ ์ ์ฉ๊ฐ๋ฅํ ์ค์์์ ์์ด, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ ๋ค์๊ณผ ๊ฐ์ ๊ฒฝ์ฐ์๋ง ์ฌ์ฉ์์ ์ผ๊ตด์ด ๊ฐ์ง๋ ๊ฒ์ผ๋ก ํ๋จํ ์ ์๋ค.In the embodiment applicable to the present invention, the
1. ์ฌ์ฉ์์ ์ผ๊ตด ๋ฐฉํฅ์ด ์ผ์ ๊ฐ๋ 1. The user's face is at an angle ์ด๋ด์ธ ๊ฒฝ์ฐWithin
๋ 2๋ ๋ณธ ๋ฐ๋ช ์ ๋ฐ๋ผ ์ฌ์ฉ์์ ์ผ๊ตด์ด ๊ฐ์ง๋ ๊ฒ์ผ๋ก ํ๋จํ๋ ์ผ ์๋ฅผ ๋ํ๋ธ ๋๋ฉด์ด๋ค.2 is a diagram illustrating an example of determining that a face of a user is detected according to the present invention.
๋ 2์ ๋์๋ ๋ฐ์ ๊ฐ์ด, ์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ ์ฌ์ฉ์์ ์ผ๊ตด์ด ์ ๋ฉด์ ํฅํ๊ณ ์๋ ์ผ๊ตด์ด ๊ฐ์ง๋ ๊ฒฝ์ฐ์๋ง ์ฌ์ฉ์์ ์ผ๊ตด์ด ๊ฐ์ง๋ ๊ฒ์ผ๋ก ํ๋จํ ์ ์๋ค. ๋ณด๋ค ๊ตฌ์ฒด์ ์ผ๋ก, ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ ๊ฐ์ง๋ ์ฌ์ฉ์์ ์ผ๊ตด ๋ฐฉํฅ์ด ์๊ธฐ ์ผ๊ตด ๊ฐ์ง๋ถ(20)๋ฅผ ๊ธฐ์ค์ผ๋ก ์ผ์ ๊ฐ๋ ์ด๋ด์ธ ๊ฒฝ์ฐ์๋ง ์ฌ์ฉ์์ ์ผ๊ตด์ด ๊ฐ์ง๋ ๊ฒ์ผ๋ก ํ๋จํ ์ ์๋ค.As illustrated in FIG. 2, the
์ด๋ฅผ ํ๋จํ๊ธฐ ์ํด, ๋ค์ํ ์ผ๊ตด ๊ฒ์ถ ์๊ณ ๋ฆฌ์ฆ์ด ์ ์ฉ๋ ์ ์๋ค. ์๋ฅผ ๋ค์ด, ์ผ๊ตด ์ธ์์ ํ๊ธฐ ์ํ ๋งค์นญ ํ ํ๋ฆฟ(matching template)๋ฅผ ์ ๋ฐฉํฅ ์์์ ํตํด ํ์ต์ํด์ผ๋ก์จ ์ด๋ฅผ ๊ตฌํํ ์ ์๋ค. ๋๋ ๊ฐ์ง๋๋ ์ผ๊ตด์ ํํ ์ ๋ณด๋ฅผ ๋ฐํ์ผ๋ก ์ฌ์ฉ์์ ์ผ๊ตด ๋ฐฉํฅ์ ๊ฐ์งํ์ฌ ์ฌ์ฉ์์ ์ผ๊ตด์ ๊ฐ์งํ ์ ์๋ค.To determine this, various face detection algorithms can be applied. For example, this may be implemented by learning a matching template for face recognition through a forward image. Alternatively, the user's face may be detected by detecting the direction of the user's face based on the shape information of the face.
As described above, the face detector 20 according to the present invention determines whether a face is detected from the overall face direction. Compared with conventional techniques that calculate the hand position within the image information and then derive the hand direction from it, this reduces the total amount of computation and the resulting power consumption.
2. When the distance between the face detector and the user is within a predetermined distance
In another embodiment, the face detector 20 may detect only the face of a user located within a predetermined distance.
Various methods may be applied to this end. For example, the distance between the user and the face detector 20 may be measured using a separate detection sensor, and the face detector 20 may be made to perform the detection operation only when the measured distance is within a predetermined value.
Preferably, the distance between the user and the face detector 20 may be measured based on the face information detected by the face detector 20, and the face detection operation may be performed only when the measured distance is within a predetermined value. To this end, the face detector 20 may count the number of face pixels at the given resolution and may be controlled to detect (extract) the user's face only when the pixel count is equal to or greater than a predetermined value.
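One way to realize this criterion is to convert the detected face width in pixels into an approximate distance with a pinhole-camera model; the sketch below assumes illustrative values for the focal length, average face width, and distance limit, which would be calibrated per device:

```python
FOCAL_PX = 600.0      # camera focal length in pixels (assumed)
FACE_WIDTH_M = 0.16   # average adult face width in meters (assumed)
MAX_DISTANCE_M = 2.0  # the "predetermined distance" of this embodiment

def face_within_range(face_box):
    """face_box: (x, y, w, h); a wider face in pixels means a closer user."""
    x, y, w, h = face_box
    distance_m = FOCAL_PX * FACE_WIDTH_M / float(w)
    return distance_m <= MAX_DISTANCE_M
```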
3. When the distance between the face detector and the user is within a predetermined range
In yet another embodiment, the face detector 20 may detect only the face of a user located within a predetermined range. In other words, the face detector 20 may detect only a face spaced at least a minimum distance A and at most a maximum distance B from the face detector.
Various methods may be applied to this end. For example, the face detector 20 may determine that the user's face is detected only when the size of the detected face is equal to or greater than a first threshold and equal to or less than a second threshold. To this end, the face detector 20 may count the number of face pixels at the given resolution and may be controlled to detect (extract) the user's face only when the pixel count falls within a predetermined range.
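A minimal sketch of this range criterion, treating the face pixel area as a stand-in for the distance band between A and B; the threshold values are illustrative assumptions tied to sensor resolution:

```python
# Faces smaller than the lower bound are too far (beyond B); faces larger
# than the upper bound are too close (nearer than A). Values are assumed.
MIN_FACE_PIXELS = 40 * 40
MAX_FACE_PIXELS = 200 * 200

def face_in_distance_band(face_box):
    x, y, w, h = face_box
    return MIN_FACE_PIXELS <= w * h <= MAX_FACE_PIXELS
```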
In a preferred embodiment applicable to the present invention, when the user's face is detected through the various methods described above, the face detector 20 provides position information of the detected face to the gesture detector 30, and upon receiving it, the gesture detector 30 sets a gesture detection region based on the position information. Through this, the gesture detector 30 can reduce the amount of computation required for gesture detection. The gesture detector 30 is described in detail in the related description below.
As such, when the face detector 20 detects a face, it may select the detected face as the gesture detection target and provide its position information to the gesture detector 30.
In this case, when a plurality of faces are detected, the face detector 20 may select one of them as the gesture detection target according to a predetermined rule and provide its position information to the gesture detector 30.
[Example of Gesture Detection Target Selection when Multiple Faces Are Detected]
1. The face located closest to the motion information
FIG. 3 is a diagram illustrating an example of selecting one of a plurality of detected faces as the gesture detection target according to the present invention.
As shown in FIG. 3, in an embodiment applicable to the present invention, the face detector 20 may select, as the gesture detection target, the face located closest to the motion information detected through the motion detector 10.
More specifically, the face detector 20 may select, as the gesture detection target, the face located closest in a specific direction from the detected motion information. FIG. 3 illustrates an example of selecting the face located closest in the upward direction from the detected motion, but the face detector 20 according to the present invention is not limited to this embodiment.
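A minimal sketch of this selection rule, assuming faces and the motion region are given as (x, y, w, h) bounding boxes; the box format is an assumption for illustration:

```python
def center(box):
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)

def nearest_face_to_motion(motion_box, face_boxes):
    """Pick the face whose center is closest to the motion-region center."""
    mx, my = center(motion_box)
    return min(face_boxes,
               key=lambda f: (center(f)[0] - mx) ** 2
                           + (center(f)[1] - my) ** 2)
```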
2. The face within the region where the motion information is detected
In another embodiment, the face detector 20 may select, as the gesture detection target, a face that fully or partially overlaps the motion information detected through the motion detector 10. In other words, when the detected motion information is detected on a specific face, the face detector 20 may select that face as the gesture detection target and provide its position information to the gesture detector 30.
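A minimal sketch of this overlap rule using axis-aligned rectangle intersection; the (x, y, w, h) box format is again an assumption:

```python
def overlaps(a, b):
    """True when two (x, y, w, h) rectangles intersect fully or partially."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def face_overlapping_motion(motion_box, face_boxes):
    for face in face_boxes:
        if overlaps(motion_box, face):
            return face
    return None  # no face coincides with the motion region
```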
Through the above embodiments, the face detector 20 may select the gesture detection target, and the gesture detector 30, provided with this information, may perform the gesture detection operation based on the information on the detection target.
[Description of the Gesture Detector]
The gesture detector 30 normally operates in a first state. Upon receiving a wake-up signal from the face detector 20, it operates in a second state for a predetermined time and detects the user's gesture. At this time, the gesture detector 30 detects the user's gesture over a predetermined region for the predetermined time and then returns to the first state.
In an embodiment applicable to the present invention, a sleep state, a power-off state, or the like may be applied as the first state, and a normal operating state may be applied as the second state.
That is, in the present invention, the first state may mean an operating state that consumes a smaller amount of power than the second state.
For example, the gesture detector 30 may normally operate in a sleep state in which power is cut off to all components other than the one that receives the wake-up signal from the face detector 20. Upon receiving the wake-up signal from the face detector 20, it supplies power to the other components (such as the module that detects the user's gesture) and performs the gesture detection operation.
In this case, any component capable of detecting a gesture of the user's hand may be applied as the gesture detector 30. Preferably, a vision sensor is applied so that the user's gesture can be detected more accurately.
In a preferred embodiment applicable to the present invention, when the gesture detector 30 receives the (position) information on the gesture detection target from the face detector 20, it sets a gesture detection region based on that information and performs the gesture detection computation only within the region. That is, to reduce the amount of computation and power consumption, the gesture detector 30 may perform the gesture detection operation only on a partial region of the acquired image information rather than on all of it.
Embodiments applicable as the gesture detection region set by the gesture detector 30 are as follows.
[Example of the Gesture Detection Region]
1. Gesture detection on the face region
FIG. 4 is a diagram illustrating an example of recognizing a gesture on the detected face region according to the present invention.
As shown in FIG. 4, the gesture detector 30 according to the present invention may detect a gesture input on the face region selected as the gesture detection target. To this end, the gesture detector 30 may set a region within a predetermined range around the position of the gesture detection target as the gesture detection region.
2. Gesture detection in a specific region relative to the face
FIG. 5 is a diagram illustrating an example of recognizing a gesture in a region spaced a predetermined distance from the detected face region according to the present invention.
As illustrated in FIG. 5, the gesture detector 30 according to the present invention may set a region of a predetermined size, spaced a predetermined distance from the position of the gesture detection target, as the gesture detection region.
Specifically, FIG. 5 illustrates an example in which a region spaced a predetermined distance downward from the position of the gesture detection target (face) is set as the gesture detection region, but the present invention is not limited to this embodiment.
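A minimal sketch of both region variants, assuming the face position is given as an (x, y, w, h) bounding box; the margin, gap, and region-size values are illustrative assumptions:

```python
def roi_on_face(face_box, margin=20):
    """Variant 1 (FIG. 4): the face region itself, padded by a margin."""
    x, y, w, h = face_box
    return (x - margin, y - margin, w + 2 * margin, h + 2 * margin)

def roi_below_face(face_box, gap=40, roi_w=240, roi_h=180):
    """Variant 2 (FIG. 5): a fixed-size region offset below the face."""
    x, y, w, h = face_box
    cx = x + w // 2
    return (cx - roi_w // 2, y + h + gap, roi_w, roi_h)

def crop(frame, roi):
    """Restrict later gesture computation to the region of interest."""
    x, y, w, h = roi
    return frame[max(y, 0):y + h, max(x, 0):x + w]
```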
The gesture detector 30 may minimize the amount of computation required for gesture recognition by setting the gesture detection region through the various methods described above and performing the gesture detection operation only within that region. Then, when a gesture is detected, it outputs an operation signal corresponding to the detected gesture to the result output unit 40.
In an embodiment applicable to the present invention, the gesture detector 30 may detect the shape and movement direction of the user's gesture and output an operation signal corresponding thereto.
As an example, the gesture detector 30 may detect the shapes of the various gestures shown in FIG. 6 through image analysis.
In addition, the gesture detector 30 may also detect the movement direction of the detected gesture shape.
Various calculation algorithms may be applied to this end. As an example, the movement direction of the hand (gesture) may be detected using difference information between image data acquired at predetermined time intervals.
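A minimal sketch of this frame-difference approach, tracking how the centroid of the changed pixels shifts between frame pairs; the difference threshold is an illustrative assumption:

```python
import numpy as np

def change_centroid(prev_gray, curr_gray, thresh=30):
    """Centroid of pixels whose intensity changed between two frames."""
    diff = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
    ys, xs = np.nonzero(diff > thresh)
    if xs.size == 0:
        return None  # no motion above threshold
    return np.array([xs.mean(), ys.mean()])

def direction_between(frames):
    """frames: three grayscale frames taken at fixed time intervals."""
    c1 = change_centroid(frames[0], frames[1])
    c2 = change_centroid(frames[1], frames[2])
    if c1 is None or c2 is None:
        return None
    return c2 - c1  # (dx, dy) displacement of the moving region
```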
Alternatively, the movement direction of the hand (gesture) may be detected by performing block motion estimation to obtain motion vectors of individual blocks and then averaging all the motion vectors.
As another method, the movement direction may be tracked by computing the optical flow and detecting in which direction the optical flow moves as a whole.
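A minimal sketch of the optical-flow variant, using OpenCV's dense Farneback flow averaged over the gesture detection region; the parameter values are typical defaults assumed for illustration:

```python
import cv2
import numpy as np

def dominant_flow_direction(prev_gray, curr_gray):
    """Average dense optical flow and map it to a coarse direction label."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    dx = float(np.mean(flow[..., 0]))
    dy = float(np.mean(flow[..., 1]))
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```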
The present invention is not limited to the above embodiments, and other motion sensing methods may also be applied.
[Description of the Result Output Unit]
The result output unit 40 performs an operation corresponding to the operation signal received from the gesture detector 30. To this end, the gesture detector 30 stores table information on the operation signals corresponding to each gesture shape and movement direction, and when a specific gesture shape and movement direction are detected, it may provide the corresponding operation signal to the result output unit 40 using the table information.
For example, assume that gesture A is set to correspond to changing the TV channel (increasing or decreasing the channel number), gesture B to adjusting the TV volume (increasing or decreasing the volume), and gesture C to controlling the TV power on/off.
In this case, if the user's gesture B is detected by the gesture detector 30, the gesture detector 30 may provide an operation signal for volume adjustment (volume increase or decrease) to the result output unit 40, and the result output unit 40 may perform the corresponding operation.
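A minimal sketch of such a lookup table; the gesture labels and signal names mirror the A/B/C example above and are illustrative assumptions:

```python
# (shape, direction) -> operation signal for the result output unit.
ACTION_TABLE = {
    ("A", "up"):   "CHANNEL_UP",
    ("A", "down"): "CHANNEL_DOWN",
    ("B", "up"):   "VOLUME_UP",
    ("B", "down"): "VOLUME_DOWN",
    ("C", None):   "POWER_TOGGLE",
}

def operation_signal(shape, direction):
    """Return the stored signal, or None if the pair is not mapped."""
    return ACTION_TABLE.get((shape, direction))
```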
[Description of Additional Components]
In another embodiment applicable to the present invention, the gesture recognition device may further include an alarm unit (not shown). When a face is detected through the face detector 20 (that is, when a gesture detection target is selected), the alarm unit may provide the user with an alarm message indicating that the gesture detector 30 is ready to detect the user's gesture. In other words, the alarm unit may inform the user that the device is ready to detect a gesture by providing alarm information during the predetermined time in which the gesture detector 30 detects the user's gesture.
As such, the gesture recognition device according to the first example of the present invention may include a motion detector 10, a face detector 20, and a gesture detector 30 as separate components. The face detector 20 and the gesture detector 30 normally operate in the first state (such as a sleep state or a power-off state), operate in the second state only upon receiving a separate wake-up signal, and return to the first state after performing the related operation.
Through this, the low-power motion detector 10 determines whether the user is preparing to input a gesture, and only when the user is ready are the face detector 20 and the gesture detector 30 woken up to finally perform the gesture detection operation, so that the device as a whole can be driven at low power.
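A minimal sketch of this staged wake-up chain; the detector callables and the active-window length are hypothetical placeholders standing in for the components described above, not interfaces defined by the disclosure:

```python
import time

WINDOW_S = 3.0  # "predetermined time" each woken stage stays active

def run_window(step):
    """Run one woken stage until it returns a result or its window ends."""
    deadline = time.monotonic() + WINDOW_S
    while time.monotonic() < deadline:
        result = step()
        if result is not None:
            return result
    return None  # window elapsed; the stage returns to its sleep state

def run_pipeline(detect_motion, detect_face, detect_gesture, output):
    """Each stage sleeps until the previous, cheaper stage fires."""
    while True:
        if not detect_motion():          # always-on, low-power stage
            continue
        face = run_window(detect_face)   # woken by the motion stage
        if face is None:
            continue
        gesture = run_window(lambda: detect_gesture(face))  # woken by face
        if gesture is not None:
            output(gesture)
```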
In addition, by selecting the gesture detection target (face) and limiting the gesture detection region, the amount of computation required for gesture detection can be reduced.
In another embodiment of the present invention, the face detector 20 and the gesture detector 30 may operate as a single component, and in yet another embodiment, the motion detector 10 and the face detector 20 may operate as a single component.
Hereinafter, these embodiments are described in detail with reference to FIGS. 7 and 8.
FIG. 7 is a diagram illustrating a gesture recognition device according to a second example of the present invention.
As shown in FIG. 7, the face detector 20 and the gesture detector 30 of the first example may be integrated and operated as a single component.
In this case, a PIR sensor may be applied as the motion detector 110, and an image sensor may be applied as the face/gesture detector 125. Through this, the face/gesture detector 125 is woken up only when motion is detected, so that the device can be implemented at low power overall.
Other operations are the same as described above and are therefore omitted.
FIG. 8 is a diagram illustrating a gesture recognition device according to a third example of the present invention.
As shown in FIG. 8, the motion detector 10 and the face detector 20 of the first example may be integrated and operated as a single component.
In this case, a PIR sensor may be applied as the motion/face detector 215, and an image sensor may be applied as the gesture detector 230. Through this, the motion/face detection operation is performed using a PIR sensor that can be driven at low power, and the gesture detector 230 is woken up only when a face is detected, so that the device can be implemented at low power overall.
Other operations are the same as described above and are therefore omitted.
The preferred embodiments of the present invention have been described above. Those skilled in the art will appreciate that the present invention can be implemented in modified forms without departing from its essential characteristics. The disclosed embodiments should therefore be considered in a descriptive sense only and not for purposes of limitation. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the equivalent scope should be construed as being included in the present invention.
Claims (21)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/KR2015/007407 WO2017010593A1 (en) | 2015-07-16 | 2015-07-16 | Gesture recognition device |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/KR2015/007407 WO2017010593A1 (en) | 2015-07-16 | 2015-07-16 | Gesture recognition device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017010593A1 true WO2017010593A1 (en) | 2017-01-19 |
Family
ID=57757992
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2015/007407 Ceased WO2017010593A1 (en) | 2015-07-16 | 2015-07-16 | Gesture recognition device |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2017010593A1 (en) |
- 2015-07-16: WO PCT/KR2015/007407 patent WO2017010593A1 (en), status: not active (Ceased)
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130229508A1 (en) * | 2012-03-01 | 2013-09-05 | Qualcomm Incorporated | Gesture Detection Based on Information from Multiple Types of Sensors |
| WO2013133624A1 (en) * | 2012-03-06 | 2013-09-12 | ๋ชจ์ ผ์ค๋ฉ ์ฃผ์ํ์ฌ | Interface apparatus using motion recognition, and method for controlling same |
| KR20130109031A (en) * | 2012-03-26 | 2013-10-07 | ์ค๋ฆฌ์ฝคํ (์ฃผ) | Motion gesture recognition module and method for recognizing motion gesture thereof |
| EP2680191A2 (en) * | 2012-06-26 | 2014-01-01 | Google Inc. | Facial recognition |
| US20140368423A1 (en) * | 2013-06-17 | 2014-12-18 | Nvidia Corporation | Method and system for low power gesture recognition for waking up mobile devices |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107422859A (en) * | 2017-07-26 | 2017-12-01 | ๅนฟไธ็พ็ๅถๅท่ฎพๅคๆ้ๅ ฌๅธ | Regulation and control method, apparatus and computer-readable recording medium and air-conditioning based on gesture |
| CN107422859B (en) * | 2017-07-26 | 2020-04-03 | ๅนฟไธ็พ็ๅถๅท่ฎพๅคๆ้ๅ ฌๅธ | Gesture-based regulation and control method and device, computer-readable storage medium and air conditioner |
| KR20200121513A (en) * | 2019-04-16 | 2020-10-26 | ๊ฒฝ๋ถ๋ํ๊ต ์ฐํํ๋ ฅ๋จ | Device and method for recognizing motion using deep learning, recording medium for performing the method |
| KR102192051B1 (en) * | 2019-04-16 | 2020-12-16 | ๊ฒฝ๋ถ๋ํ๊ต ์ฐํํ๋ ฅ๋จ | Device and method for recognizing motion using deep learning, recording medium for performing the method |
| KR102437979B1 (en) * | 2022-02-22 | 2022-08-30 | ์ฃผ์ํ์ฌ ๋ง์ธ๋ํฌ์ง | Apparatus and method for interfacing with object orientation based on gesture |
| KR20230126150A (en) * | 2022-02-22 | 2023-08-29 | ์ฃผ์ํ์ฌ ๋ง์ธ๋ํฌ์ง | Method for interfacing based on gesture |
| KR102702165B1 (en) | 2022-02-22 | 2024-09-04 | ์ฃผ์ํ์ฌ ๋ง์ธ๋ํฌ์ง | Method for interfacing based on gesture |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15898354; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15898354; Country of ref document: EP; Kind code of ref document: A1 |