
WO2013015462A1 - Electronic device operated according to a user gesture, and method for controlling the operation of the electronic device - Google Patents

Electronic device operated according to a user gesture, and method for controlling the operation of the electronic device

Info

Publication number
WO2013015462A1
WO2013015462A1 · PCT/KR2011/005452 · KR2011005452W
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
user
speed
electronic device
user gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2011/005452
Other languages
English (en)
Korean (ko)
Inventor
이석희
이명구
홍성표
오상혁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Priority to PCT/KR2011/005452 priority Critical patent/WO2013015462A1/fr
Publication of WO2013015462A1 publication Critical patent/WO2013015462A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means

Definitions

  • the present invention relates to an electronic device that operates according to a user gesture and a method of controlling an operation of the electronic device according to a user gesture.
  • haptic feedback (for example, vibration feedback) can be provided to the user in the case of touch-based input.
  • a user gesture, such as a hand gesture and/or another gesture made in the user's natural space, generally does not use a dedicated device. Because the device optically senses and then analyzes the user's movement, it is difficult to give the user adequate feedback.
  • the present invention provides an electronic device that operates according to a user gesture and a method for controlling the operation of the electronic device according to a user gesture.
  • In particular, it is a technical object of the present invention to provide an electronic device, and a method of controlling the operation of the electronic device, that make it more convenient for the user to operate the electronic device according to a user gesture by implementing more appropriate user feedback for the user gesture.
  • an operation control method of an electronic device includes generating a control signal for displaying an object at a first position; receiving a user gesture; and generating a control signal to display the selected object while moving it according to an object speed determined in consideration of the input of the user gesture and an attribute of the object.
  • the attribute of the object may include at least one of a file size of a file corresponding to the object, a folder size of a folder corresponding to the object, an occupied resource size when an application corresponding to the object is executed, and an occupied resource size when an application required to execute a file corresponding to the object is executed.
  • the object speed may be determined by adjusting the gesture movement speed determined by the user gesture according to a value determined in consideration of the attribute of the object. In this case, the difference between the object speed and the gesture movement speed determined by the user gesture grows as the difference between the object's attribute value and a reference value increases.
  • a method of controlling an operation of an electronic device includes generating a control signal for displaying an object at a first position; receiving a user gesture; and generating a control signal to display the selected object while moving it according to an object movement trajectory pattern determined in consideration of the input of the user gesture and an attribute of the object.
  • the attribute of the object may include at least one of the type of the object, whether the object is currently executing, a file size corresponding to the object, a folder size corresponding to the object, an occupied resource size when an application corresponding to the object is executed, and an occupied resource size when an application required to execute a file corresponding to the object is executed.
  • the generating of the control signal may include generating the control signal to move the object according to the determined object movement trace pattern in the movement direction determined by the user gesture.
  • the movement trajectory pattern may include at least one of a zigzag pattern, a spiral pattern, a flashing pattern, and a teleportation pattern.
  • a method of controlling an operation of an electronic device includes receiving a user gesture; generating a control signal to display a first object while moving it according to a moving speed determined in consideration of the user gesture; and changing the moving speed of the object to an approach speed determined in consideration of attributes of a second object located on the path of the first object.
  • an attribute of the second object may include at least one of a type of the second object, a user's access right to the second object, and a correlation between the second object and the first object.
  • the correlation between the second object and the first object may include whether a file, a folder, and/or an application corresponding to the first object can be linked with a file, a folder, and/or an application corresponding to the second object.
  • the approach speed may be greater than the moving speed of the object.
  • the changing may be performed when the first object enters an area within a predetermined distance with respect to the second object.
  • a method of controlling an operation of an electronic device includes receiving a user gesture; generating a control signal to display a first object while moving it; and generating a control signal for changing the display state of a second object, located on the path of the first object, according to a display pattern determined in consideration of an attribute of the second object.
  • the display pattern of the second object may include at least one of a display image of the second object, rotation of the second object, blinking of the second object, and enlargement and / or reduction of the second object.
  • an electronic device is provided.
  • An electronic device includes a display unit for displaying information and/or images; a camera for receiving a user gesture; and a controller for generating a control signal for controlling the display unit to display an object at a first position and a control signal for controlling the display unit to display the selected object while moving it according to an object speed determined in consideration of the input of the user gesture and an attribute of the object.
  • the electronic device executes a function and/or operation corresponding to the user's gesture and, in particular when moving the selected object, varies the object speed to reflect the object's attributes, thereby giving the user more intuitive feedback.
  • likewise, the electronic device can give the user more intuitive feedback by varying the movement trajectory pattern of the object to reflect the object's attributes when executing the corresponding function and/or operation, in particular when moving the selected object.
  • FIG. 1 is a block diagram of a display apparatus according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an example of a user's gesture input to the display device of FIG. 1.
  • FIGS. 3 and 4 are views for explaining a stereoscopic image display method using binocular parallax related to embodiments of the present invention.
  • FIGS. 5 to 8 illustrate methods for displaying a stereoscopic image.
  • FIG. 9 is a flowchart illustrating a method of controlling the operation of an electronic device according to a first embodiment of the present invention.
  • FIG. 10 is a diagram for describing a method of analyzing a direction of a user gesture and moving an object accordingly.
  • FIG. 11 is a diagram for describing a method of analyzing a speed of a user gesture and moving an object accordingly.
  • FIG. 12 is a diagram for describing a method of analyzing a moving distance of a user gesture and moving an object accordingly.
  • FIG. 13 is a diagram for describing determining an object speed according to a size of a file corresponding to an object.
  • FIG. 14 is a diagram for describing determining an object speed according to a resource occupancy size of an application corresponding to an object.
  • FIGS. 15 and 16 illustrate the relationship between the movement speed of an object determined by a user gesture and the object speed finally determined in consideration of an attribute of the object.
  • FIG. 17 is a diagram for describing changing an image of an object according to an attribute of the object when the object is moved.
  • FIG. 18 is a flowchart illustrating a method of controlling the operation of an electronic device according to a second embodiment of the present invention.
  • FIG. 19 is a diagram illustrating examples of various movement trajectory patterns of an object according to a second exemplary embodiment of the present invention.
  • FIG. 20 is a flowchart illustrating a method of controlling the operation of an electronic device according to a third embodiment of the present invention.
  • FIG. 21 is a diagram illustrating an example in which an electronic device operates according to a third exemplary embodiment of the present invention.
  • FIG. 22 is a flowchart illustrating a method of controlling the operation of an electronic device according to a fourth embodiment of the present invention.
  • FIG. 23 is a diagram for describing a method of operating an electronic device according to a fourth embodiment of the present invention.
  • FIG. 1 is a block diagram of a display apparatus according to an embodiment of the present invention.
  • the electronic device 100 may include a communication unit 110, a user input unit 120, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190.
  • the components shown in FIG. 1 are those typically included in the display apparatus; a display apparatus including more or fewer components may of course be implemented.
  • the communication unit 110 may include one or more modules that enable communication between the electronic device 100 and the communication system or between the electronic device 100 and another device.
  • the communication unit 110 may include a broadcast receiver 111, an internet module 113, a short range communication module 114, and the like.
  • the broadcast receiver 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a pre-generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
  • the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
  • the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a communication network.
  • the broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
  • the broadcast receiver 111 may receive broadcast signals using various broadcast systems.
  • the broadcast signal and / or broadcast related information received through the broadcast receiver 111 may be stored in the memory 160.
  • the internet module unit 113 may mean a module for accessing the Internet.
  • the internet module unit 113 may be embedded or external to the electronic device 100.
  • the short range communication module 114 refers to a module for short range communication.
  • Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used as the short range communication technology.
  • the user input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121, a microphone 122, and the like.
  • the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
  • the processed image frame may be displayed on the display unit 151.
  • the camera 121 may be capable of 2D and/or 3D photographing, and may be configured as a single 2D or 3D camera or as a combination of the two.
  • the image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the communication unit 110. Two or more cameras 121 may be installed according to the configuration aspect of the electronic device 100.
  • the microphone 122 receives an external sound signal by a microphone in a call mode, a recording mode, a voice recognition mode, etc., and processes the external sound signal into electrical voice data.
  • the microphone 122 may implement various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.
  • various voice commands for driving the electronic device 100 and executing a function may be input from the user through the microphone 122.
  • the output unit 150 may include a display unit 151, a sound output unit 152, and the like.
  • the display unit 151 displays information processed by the electronic device 100. For example, a UI (User Interface) or GUI (Graphic User Interface) associated with the electronic device 100 is displayed.
  • the display unit 151 may be a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, or a three-dimensional (3D) display.
  • the display unit 151 may be configured as a transparent type or a light transmissive type. This may be referred to as a transparent display.
  • a representative example of the transparent display includes a transparent LCD.
  • the rear structure of the display unit 151 may also be configured as a light transmissive structure. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 151 of the body.
  • two or more display units 151 may exist.
  • the plurality of display units 151 may be spaced apart or integrally disposed on one surface of the electronic device 100, or may be disposed on different surfaces.
  • when the display unit 151 and a sensor for detecting a touch operation form a mutual layer structure (hereinafter abbreviated as a 'touch screen'), the display unit 151 can be used as an input device in addition to an output device.
  • the touch sensor may have, for example, a form of a touch film, a touch sheet, a touch pad, or the like.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal.
  • the touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.
  • when there is a touch input on the touch sensor, a corresponding signal is sent to the touch controller.
  • the touch controller processes the signal and then transmits the corresponding data to the controller 180.
  • the controller 180 can know which area of the display unit 151 is touched.
  • the sound output unit 152 may output audio data received from the communication unit 110 or stored in the memory 160.
  • the sound output unit 152 may output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed by the electronic device 100.
  • the sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.
  • the memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
  • the memory 160 may store data regarding vibration and sound of various patterns output when a touch is input on the touch screen.
  • the memory 160 may include at least one type of storage medium among flash memory, hard disk, multimedia card micro, card-type memory (for example, SD or XD memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disk, and optical disk.
  • the electronic device 100 may operate in association with a web storage that performs a storage function of the memory 160 on the Internet.
  • the interface unit 170 serves as a path with all external devices connected to the electronic device 100.
  • the interface unit 170 receives data from an external device or receives power and transmits the data to each component inside the electronic device 100 or transmits the data inside the electronic device 100 to an external device.
  • wired / wireless headset ports, external charger ports, wired / wireless data ports, memory card ports, ports for connecting devices with identification modules, audio input / output (I / O) ports, The video input / output (I / O) port, the earphone port, and the like may be included in the interface unit 170.
  • the controller 180 typically controls the overall operation of the display device. For example, it performs related control and processing for voice calls, data communication, video calls, and the like.
  • the controller 180 may include an image processor 182 for image processing.
  • the image processor 182 will be described in more detail in the corresponding part.
  • the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
  • The embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing functions. In some cases, such embodiments may be implemented by the controller 180.
  • embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed.
  • the software code may be implemented by a software application written in a suitable programming language.
  • the software code may be stored in the memory 160 and executed by the controller 180.
  • the user input unit 120, the output unit 150, and the like need not be included in the electronic device 100 and may, if necessary, be implemented as separate modules. In this case, the separately provided user input unit 120 and/or output unit 150 may communicate with or be connected to the electronic device 100 through the communication unit 110 and/or the interface unit 170.
  • FIG. 2 is a diagram illustrating an example of a user's gesture input to the display device of FIG. 1.
  • the electronic device 100 may photograph a gesture taken by the user U and execute an appropriate function corresponding thereto.
  • the electronic device 100 may be any of various electronic devices including a display unit 151 capable of displaying an image. For example, the electronic device may be a fixed type installed at a specific position because of its bulk, such as the TV shown in FIG. 2.
  • the electronic device 100 may be provided with a camera 121 capable of capturing a gesture of the user U.
  • the camera 121 may be an optical electronic device photographing a front direction of the electronic device 100.
  • the camera 121 may be a 2D camera for photographing a 2D image and / or a 3D camera for photographing a 3D image.
  • In FIG. 2, for convenience of understanding, one camera 121 is shown provided at the upper center of the electronic device 100, but the type, position, and number of cameras 121 may be changed as necessary.
  • the controller 180 can track the user U having the control right. Granting and tracking of the control right may be performed based on an image captured by the camera 121 provided in the electronic device 100. That is, the controller 180 can analyze the captured image to continuously determine whether a specific user U exists, whether the specific user U performs the gesture operation required to acquire the control right, and whether the specific user U is moving.
  • the controller 180 may analyze the gesture of the user who has the control right in the captured image. For example, even if a user makes a specific gesture, the corresponding function may not be executed if that user U does not have the control right; if the user U has the control right, the specific function corresponding to the specific gesture can be executed.
  • the gesture of the user U may be various operations using a body of the user U or the like.
  • sitting, standing up, running, or moving of the user U may each constitute a gesture.
  • gestures using the head, the feet, the hand H, and the like may also be used.
  • a hand gesture using the hand H of the user U among various gestures of the user U will be described as an example. However, this description is for convenience of understanding, and the present invention is not limited to the user's hand gesture itself.
  • the present invention can be applied when the display unit 151 is a three-dimensional display.
  • the display unit 151 is a three-dimensional display.
  • a method of displaying a stereoscopic image through the display unit 151 which is a 3D display will be described.
  • FIGS. 3 and 4 are views for explaining a stereoscopic image display method using binocular parallax related to embodiments of the present invention, and FIGS. 5 to 8 illustrate methods for displaying a stereoscopic image.
  • Binocular parallax refers to the difference in the way the left and right eyes of a person see objects.
  • the human brain synthesizes the image seen through the left eye and the image seen through the right eye, and the synthesized image makes a person perceive a three-dimensional effect.
  • a phenomenon in which a person feels a stereoscopic sense according to binocular parallax is referred to as 'stereoscopic vision', and an image causing stereoscopic vision is referred to as a 'stereoscopic image'.
  • when a specific object included in an image causes stereoscopic vision, that object is referred to as a 'stereoscopic object'.
  • the stereoscopic image display method according to binocular disparity is divided into glasses type which requires special glasses and glassesless type which do not need glasses.
  • the spectacles include a method using sunglasses with wavelength selectivity, a polarization glasses method using a light blocking effect according to a polarization difference, and a time-division glasses method alternately presenting left and right images within an afterimage time of an eye.
  • there is also a method in which filters having different transmittances are mounted on the left and right sides, so that a stereoscopic effect for left-right movement is obtained from the time difference of the visual system caused by the difference in transmittance.
  • the autostereoscopic method, in which a stereoscopic effect is generated at the image display surface rather than at the observer, includes a parallax barrier method, a lenticular lens method, and a microlens array method.
  • the display unit 151 includes a lenticular lens array 81a to display a stereoscopic image.
  • the lenticular lens array 81a is located between a display surface 83, on which pixels L to be input to the left eye 82a and pixels R to be input to the right eye 82b are alternately arranged along the horizontal direction, and the left and right eyes 82a and 82b, and provides optical directivity that discriminates between the pixels L to be input to the left eye 82a and the pixels R to be input to the right eye 82b.
  • an image passing through the lenticular lens array 81a is observed separately by the left eye 82a and the right eye 82b, and the human brain synthesizes the image seen through the left eye 82a and the image seen through the right eye 82b to observe a stereoscopic image.
  • the display unit 151 includes a vertical grid parallax barrier 81b to display a stereoscopic image.
  • the parallax barrier 81b is located between a display surface 83, on which pixels L to be input to the left eye 82a and pixels R to be input to the right eye 82b are alternately arranged along the horizontal direction, and the left and right eyes 82a and 82b, and causes the image to be viewed separately by the left eye 82a and the right eye 82b through its vertical lattice-shaped apertures. Therefore, the human brain synthesizes the image viewed through the left eye 82a and the image viewed through the right eye 82b to observe a stereoscopic image.
  • the parallax barrier 81b is turned on to separate the incident visual fields only when a stereoscopic image is to be displayed, and is turned off to pass the incident visual field without separating it when a planar image is to be displayed.
  • the present invention can display a stereoscopic image using binocular disparity using a variety of methods in addition to the above-described method.
  • the stereoscopic image may be displayed by a stereoscopic image display method using a polarization method and a time division method, which are classified into a spectacle type stereoscopic image implementation method.
  • FIG. 5 illustrates an example of displaying a stereoscopic image including a plurality of image objects 10 and 11.
  • the stereoscopic image illustrated in FIG. 5 may be an image acquired through the camera 121.
  • the stereoscopic image includes a first image object 10 and a second image object 11.
  • although two image objects 10 and 11 are assumed for convenience of description, more image objects may be included in the stereoscopic image.
  • the controller 180 may display an image acquired in real time through the camera 121 on the display unit 151 in a camera preview format.
  • the controller 180 may obtain at least one stereo disparity corresponding to each of the at least one image object.
  • the controller 180 may obtain the binocular disparity of the first image object 10 and the binocular disparity of the second image object 11 through the left eye image and the right eye image acquired by the camera 121.
  • FIG. 6 is a diagram for describing binocular disparity of an image object included in a stereoscopic image.
  • the first image object 10 may include a left eye image 10a viewed by the user as the left eye 20a and a right eye image 10b viewed by the right eye 20b.
  • the controller 180 may obtain a binocular parallax d1 corresponding to the first image object 10 through the left eye image 10a and the right eye image 10b.
  • the controller 180 converts a 2D image obtained through the camera 121 into a 3D image using a predetermined algorithm for converting a 2D image into a 3D image, and displays it on the display unit 151.
  • the controller 180 may also obtain the binocular disparity of the first image object 10 and the binocular disparity of the second image object 11 using the left eye image and the right eye image generated by the image conversion algorithm.
  • FIG. 7 is a diagram for comparing binocular disparity between the image objects 10 and 11 illustrated in FIG. 5.
  • the binocular disparity d1 of the first image object 10 and the binocular disparity d2 of the second image object 11 are different from each other.
  • since d2 is larger than d1, the second image object 11 may appear to be farther from the user than the first image object 10.
  • the controller 180 may obtain at least one graphic object corresponding to each of the at least one image object [S120].
  • the controller 180 may display the obtained at least one graphic object on the display unit 151 to have a corresponding binocular disparity.
  • FIG. 8 illustrates a first image object 10 that may have an effect as if it protrudes toward the user from the display unit 151.
  • To this end, the left eye image 10a and the right eye image 10b may be displayed on the display unit 151 with their positions exchanged relative to FIG. 7.
  • when the positions are exchanged, the left eye 20a and the right eye 20b see the images along crossing lines of sight. Accordingly, the user perceives the first image object 10 as displayed in front of the display unit 151, at the point where the lines of sight cross; that is, a positive depth is perceived relative to the display unit 151. This differs from the case of FIG. 7, in which the first image object 10 is perceived as displayed behind the display unit 151.
  • the controller 180 can display a stereoscopic image with positive or negative depth so that the user perceives various depths.
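  • As a worked illustration of the positive and negative depth described above, the following minimal sketch (not taken from the patent; the function name and sign convention are assumptions) computes left eye / right eye image offsets for a given binocular disparity.

```python
def stereo_offsets(disparity, in_front):
    """Return horizontal offsets (left_eye_dx, right_eye_dx) for an image
    object with the given binocular disparity. Uncrossed placement (left
    image shifted left, right image shifted right) makes the object appear
    behind the screen as in FIG. 7; swapping the offsets crosses the lines
    of sight so the object appears in front of the screen as in FIG. 8."""
    half = disparity / 2.0
    if in_front:                  # crossed disparity -> positive depth
        return (+half, -half)
    return (-half, +half)         # uncrossed disparity -> negative depth

# d2 > d1, so the second image object appears farther behind the screen.
d1, d2 = 8.0, 14.0
print(stereo_offsets(d1, in_front=False))  # (-4.0, 4.0)
print(stereo_offsets(d2, in_front=False))  # (-7.0, 7.0)
```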
  • FIG. 9 is a flowchart illustrating a method of controlling the operation of an electronic device according to a first embodiment of the present invention.
  • Referring to FIG. 9, a method of controlling an operation of an electronic device includes at least one of generating a control signal for displaying an object (S100), obtaining a user gesture (S110), analyzing the user gesture (S120), analyzing an attribute of the object (S130), determining an object speed based on the user gesture and the object attribute (S140), and generating a control signal to display the object while moving it according to the determined object speed (S150).
  • each step will be described in detail.
  • the controller 180, in particular the image processor 182, may generate a control signal for displaying an object on the display unit 151 (S100).
  • the object may correspond to various functions of the electronic device 100, various applications installed on the electronic device 100, contacts stored in the electronic device 100, various files, folders in which such files are stored, and the like, and may be visually displayed through the display unit 151.
  • when an object is executed, the various functions corresponding to it can be executed.
  • the object may be displayed as various images which are previously designated according to the type of application, the type of file, and / or the type of folder corresponding thereto.
  • the controller 180 can check the image corresponding to each object and/or the position at which each object is to be displayed, and can generate a control signal for properly displaying each object on the display unit 151 accordingly.
  • the control signal may be transmitted to the display unit 151 through the communication unit 110 and/or the interface unit 170.
  • the controller 180 may acquire a user gesture through the camera 121 (S110). Since a method of obtaining a user gesture through the camera 121 has been described in detail with reference to FIG. 2, a detailed description thereof will be omitted.
  • the controller 180 may analyze the user gesture acquired through step S110 (S120).
  • the controller 180 can analyze the type of the gesture, the start point of the gesture, the end point of the gesture, the direction of the gesture, the speed of the gesture, the movement distance of the gesture, and the like.
  • the controller 180 may also analyze the point indicated by the gesture; the object to be executed or moved by the gesture may be selected according to this point.
  • the controller 180 may execute a specific operation of the electronic device 100 corresponding to the obtained gesture by analyzing the type of the gesture. For example, when the user makes a first gesture, the controller 180 may perform an operation of selecting an object; when the user makes a second gesture, the controller 180 may execute a function corresponding to the object; and when the user makes a third gesture, the position at which the object is displayed may be changed. In this way, the electronic device 100 may map different operations to different gesture types, analyze the type of the input gesture, and execute the operation corresponding to it.
  • the controller 180 may analyze at least one of a direction, a speed, and a movement distance of the user gesture.
  • the analysis of the direction, speed, and movement distance of the gesture by the controller 180 may be performed using generally known methods.
  • the controller 180 may change the pattern of the corresponding operation in consideration of at least one of a direction, a speed, and a movement distance of the user gesture. For example, when performing an operation of changing a position where an object is displayed by a user gesture, the changed position of the object may be determined in consideration of at least one of a direction, a speed, and a movement distance of the user gesture.
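  • As a concrete illustration of the analysis in step S120, the sketch below derives the direction, speed, and movement distance of a gesture from timestamped hand positions; the sampling format and function name are assumptions, not part of the patent.

```python
import math

# A minimal sketch, assuming the camera pipeline yields timestamped 2D hand
# positions; the sample format and helper name are illustrative.

def analyze_gesture(samples):
    """samples: list of (t, x, y) hand positions from first to last frame.
    Returns the direction (radians), average speed, and total movement
    distance of the gesture, i.e. the properties analyzed in step S120."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    direction = math.atan2(y1 - y0, x1 - x0)
    # The movement distance is accumulated along the sampled path.
    distance = sum(
        math.hypot(bx - ax, by - ay)
        for (_, ax, ay), (_, bx, by) in zip(samples, samples[1:])
    )
    speed = distance / (t1 - t0) if t1 > t0 else 0.0
    return direction, speed, distance

samples = [(0.00, 10, 10), (0.05, 14, 11), (0.10, 20, 12)]
print(analyze_gesture(samples))
```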
  • FIG. 10 is a diagram for describing a method of analyzing a direction of a user gesture and moving an object accordingly.
  • the moving direction of the object OB may be determined in consideration of the gesture direction of the user's hand H.
  • FIG. 11 is a diagram for describing a method of analyzing a speed of a user gesture and moving an object accordingly.
  • the moving speed of the object may be determined in consideration of the speed of the gesture.
  • Referring to FIG. 11, the controller 180 can move the object OB displayed at the first position OP1 on the display unit 151 to the second position OP2, determining the gesture speed of the object OB in consideration of the speed of the user's hand H.
  • For example, when the moving speed of the user's hand H is HV1, the gesture speed of the object OB may be determined as OV1; when it is HV2, as OV2; and when it is HV3, as OV3 (OV1 > OV2 > OV3).
  • FIG. 12 is a diagram for describing a method of analyzing a moving distance of a user gesture and moving an object accordingly.
  • the moving distance of the object may be determined in consideration of the moving distance of the gesture.
  • Referring to FIG. 12, for hand movement distances HS1, HS2, and HS3 (HS1 < HS2 < HS3), the controller 180 can generate a control signal for displaying the object OB, initially at the first position OP1, while moving it to a position determined in consideration of the movement distance of the user's hand H.
  • For example, when the movement distance of the user's hand H is HS1, the object OB moves to the position OP3 (a movement distance of OS1); when it is HS2, to OP4 (OS2); and when it is HS3, to OP5 (OS3), where OS1 < OS2 < OS3.
  • the controller 180 may analyze an attribute of the object (S130).
  • the attributes of the object may include, for example, the file size of a file corresponding to the object, the folder size of a folder corresponding to the object, the resource size of the electronic device 100 occupied when an application (or software) corresponding to the object is executed, and the resource size of the electronic device 100 occupied when an application (or software) required to execute a file corresponding to the object is executed.
  • the controller 180 may analyze at least one of the above-described attributes of the object selected by the user's gesture.
  • Attributes of the respective objects may be stored in the memory 160.
  • the controller 180 may check at least one of the attributes of the selected object with reference to the memory 160.
  • the controller 180 may determine an object velocity, which will be described later, in consideration of the identified attributes of the at least one object.
  • the controller 180 may determine the object speed based on the user gesture and the object property (S140).
  • the object speed refers to the speed at which the object moves when the object is displayed while moving.
  • the controller 180 may determine the object speed in consideration of the property of the object checked in step S130 based on the gesture speed of the object determined in step S120 (S140).
  • FIG. 13 is a diagram for describing determining an object speed according to a size of a file corresponding to an object.
  • Referring to FIG. 13, the object speeds OV4, OV5, and OV6 become smaller as the sizes of the files corresponding to the objects OB1, OB2, and OB3 increase (OV4 > OV5 > OV6). That is, even when the user makes a gesture at the same speed, the object speed may vary depending on the size of the file corresponding to the selected object.
  • FIG. 14 is a diagram for describing determining an object speed according to a resource occupancy size of an application corresponding to an object.
  • Referring to FIG. 14, the object speeds OV7, OV8, and OV9 become smaller as the resource occupancy sizes of the applications corresponding to the objects OB4, OB5, and OB6 increase (OV7 > OV8 > OV9). That is, even when the user makes a gesture at the same speed, the object speed may vary according to the resource occupancy size of the application corresponding to the selected object.
  • the controller 180 may adjust the gesture speed of the object determined in operation S120 according to a predetermined method. This will be described by way of example with reference to FIGS. 15 and 16.
  • FIGS. 15 and 16 illustrate the relationship between the gesture speed of an object determined by a user gesture and the object speed finally determined in consideration of an attribute of the object.
  • Referring to FIG. 15, the controller 180 can determine the object speed by increasing the gesture speed by a constant a or by decreasing it by a constant b (CV2).
  • the object speed may be calculated by decreasing the gesture speed of the object determined in step S120 by the constant b.
  • the magnitude of b may vary depending on the size of the file corresponding to the object or the resource occupancy size of the application corresponding to the object. That is, as the file size or resource occupancy size increases, the magnitude of b may also increase.
  • the controller 180 may control the display unit 151 so that the selected object does not move.
  • Alternatively, the gesture speed of the object determined in step S120 may itself be determined as the object speed, without adjustment.
  • the median value of the file size and / or the median value of the resource occupancy size may be set in advance, and may be changed according to a user input or an operation state of the electronic device 100.
  • the median value for each attribute of the object may serve as a reference value for the large and small of the object attribute.
  • the object speed may be calculated by increasing the gesture speed of the object determined in step S120 by the constant a.
  • the magnitude of a may vary depending on the size of the file corresponding to the object or the resource occupancy size of the application corresponding to the object. In other words, as the file size or resource occupancy size decreases, the magnitude of a may increase.
  • Referring to FIG. 16, the relationship between the speed of the user gesture and the gesture speed of the object is indicated by the curve CV4 (slope x).
  • for all user gesture speeds, the controller 180 can determine the object speed by increasing the proportionality constant of the object's moving speed with respect to the user gesture speed (that is, the slope x) to a larger slope (CV6, slope z) or by decreasing it to a smaller slope (CV5, slope y).
  • For example, for a specific user gesture speed Vg, the object speed can be calculated by changing the gesture speed Va of the object determined in step S120 to the speed Vc given by the curve CV5 with the decreased slope.
  • the difference between the slopes x and y may vary depending on the size of the file corresponding to the object or the resource occupancy size of the application corresponding to the object.
  • Alternatively, the gesture speed of the object determined in step S120 may itself be determined as the object speed, without adjustment.
  • the median value of the file size and / or the median value of the resource occupancy size may be set in advance, and may be changed according to a user input or an operation state of the electronic device 100.
  • if the file size corresponding to the object and/or the resource occupancy size of the application corresponding to the object is small, the object speed may instead be calculated by changing the gesture speed Va of the object determined in step S120 to the speed Vb given by the curve CV6 with the increased slope.
  • the difference between the slopes x and z may vary depending on the size of the file corresponding to the object or the resource occupancy size of the application corresponding to the object. That is, as the size of the file or the size of resource occupancy decreases, the difference between x and z may increase.
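  • A minimal sketch of how step S140 could work under the description above: the attribute value (file size or resource occupancy) is compared against the preset reference (median) value, and the gesture speed is adjusted either by a constant offset, as in FIG. 15, or by rescaling the slope x, as in FIG. 16. The constants and function names are illustrative assumptions, not from the patent.

```python
REFERENCE_SIZE = 100.0  # preset median file size / resource occupancy

def object_speed_offset(gesture_speed, attr_size):
    """FIG. 15 style: add or subtract a constant whose magnitude grows with
    the distance of the attribute value from the reference value."""
    k = 0.01  # scale of the constant per unit of attribute difference
    delta = k * abs(attr_size - REFERENCE_SIZE)
    if attr_size > REFERENCE_SIZE:   # heavy object: decrease by constant b
        return max(gesture_speed - delta, 0.0)
    if attr_size < REFERENCE_SIZE:   # light object: increase by constant a
        return gesture_speed + delta
    return gesture_speed             # at the reference value: no adjustment

def object_speed_slope(gesture_speed, attr_size):
    """FIG. 16 style: rescale the proportionality constant (slope x), so a
    lighter object gets a steeper slope and a heavier one a shallower one."""
    scale = REFERENCE_SIZE / attr_size if attr_size else 1.0
    return gesture_speed * scale

print(object_speed_offset(50.0, 300.0))  # 48.0: heavy object moves slower
print(object_speed_slope(50.0, 25.0))    # 200.0: light object moves faster
```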
  • the controller 180 may generate a control signal to be displayed while moving the object according to the determined object speed (S150).
  • As described above, the electronic device 100 can give the user more intuitive visual feedback by varying the object speed to reflect the attributes of the object when executing the function and/or operation corresponding to the user gesture, in particular when moving the selected object.
  • the controller 180 may selectively change the display state of the image of the object OB according to the attributes of the object OB while displaying the object OB while moving. For example, the controller 180 may selectively change the type, size, color, etc. of the image of the moving object OB in consideration of the attributes of the object OB.
  • FIG. 17 is a diagram for describing changing an image of an object according to an attribute of the object when the object is moved.
  • FIG. 17A shows that the object OB is displayed at a specific position.
  • FIGS. 17B and 17C illustrate that the user gesture is input and the electronic device 100 displays the object OB while moving the object OB according to the above-described steps S100 to S150.
  • FIG. 17B illustrates a case where the file size and/or the resource occupancy size of the application is small, and FIG. 17C illustrates a case where the file size and/or the resource occupancy size of the application is large.
  • In the former case, the controller 180 moves the object while maintaining its original image OBI1. If the file size and/or the resource occupancy size of the application is large, the image of the object may be changed to OBI2 during the movement to further indicate visually that the file size and/or the resource occupancy size is large.
  • the controller 180 may change and display the image of the object OB to OBI2 only while the object OB is moved according to a user gesture.
  • Although FIG. 17 describes only an embodiment that changes the image of the object, other display attributes of the object image, such as its size and color, may also be changed, as described above.
  • FIG. 18 is a flowchart illustrating a method of controlling the operation of an electronic device according to a second embodiment of the present invention.
  • Referring to FIG. 18, a method of controlling an operation of an electronic device includes at least one of generating a control signal for displaying an object (S200), obtaining a user gesture (S210), analyzing the user gesture (S220), determining a movement trajectory pattern of the object based on the user gesture and the object attribute (S230), and generating a control signal to display the object while moving it according to the determined movement trajectory pattern (S240).
  • Steps S200, S210, and S220 of the operation control method of the electronic device according to the second embodiment of the present invention are the same as or similar to steps S100, S110, and S120 described in the first embodiment of the present invention, and thus a detailed description thereof will be omitted.
  • the controller 180 may determine the movement trajectory pattern of the object based on a user gesture and an object property after performing steps S200 to S220 or at the same time as performing steps S200 to S220 (S230).
  • the attributes of an object may include, for example, the type of the object (for example, whether it corresponds to a file or an application), whether the object is currently executing, and the like.
  • the controller 180 may analyze at least one of the above-described attributes of the object selected by the user's gesture.
  • Attributes of the respective objects may be stored in the memory 160.
  • the controller 180 may check at least one of the attributes of the selected object with reference to the memory 160.
  • the controller 180 may determine a moving path pattern of the object in consideration of the identified attributes of the at least one object.
  • Referring to FIG. 19, the controller 180 can control the object OB7 to move in a straight-line pattern PT1, the object OB8 in a zigzag pattern PT2, the object OB9 in a spiral pattern PT3, the object OB10 in a flashing pattern PT4, and the object OB11 in a teleportation pattern PT5.
  • the controller 180 may set various movement trajectory patterns of the object when the object moves according to the attributes of the object.
  • different movement trajectory patterns may correspond to different types of objects. That is, if the selected object corresponds to a file, the controller 180 can set its movement trajectory to the straight-line pattern PT1; if it corresponds to an application, to the zigzag pattern PT2; and if it corresponds to a folder, to the spiral pattern PT3.
  • different movement trajectory patterns may also correspond to the file size corresponding to the object and/or the resource occupancy size of the application corresponding to the object. That is, when the file size corresponding to the selected object and/or the resource occupancy size of the application corresponding to it is small, the controller 180 can set the movement trajectory to the teleportation pattern PT5; when it is large, to the zigzag pattern PT2; and when it is medium, to the straight-line pattern PT1.
  • different movement trajectory patterns may also correspond to whether the object is currently executing. That is, if the selected object is not currently executing, the controller 180 can set its movement trajectory to the straight-line pattern PT1, and if it is currently executing, to the flashing pattern PT4.
  • the movement trajectory pattern of the object may be variously set according to the attributes of various objects.
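  • A minimal sketch of the pattern selection and movement of steps S230 and S240 under the mapping described above (file: straight line PT1, application: zigzag PT2, folder: spiral PT3); the pattern parameters are illustrative assumptions.

```python
import math

# A minimal sketch (illustrative names and parameters) of choosing a movement
# trajectory pattern from an object attribute and generating waypoints for it.

PATTERNS_BY_TYPE = {
    "file": "straight",       # PT1
    "application": "zigzag",  # PT2
    "folder": "spiral",       # PT3
}

def trajectory(pattern, start, end, steps=20):
    """Yield (x, y) waypoints from start to end for the chosen pattern.
    A flashing pattern (PT4) would toggle visibility per step, and a
    teleportation pattern (PT5) would jump straight to the end point."""
    (x0, y0), (x1, y1) = start, end
    for i in range(steps + 1):
        t = i / steps
        x, y = x0 + (x1 - x0) * t, y0 + (y1 - y0) * t
        if pattern == "zigzag":     # PT2: alternate offsets around the line
            y += 15 * (1 if i % 2 else -1)
        elif pattern == "spiral":   # PT3: circular offset shrinking to zero
            r = 30 * (1 - t)
            x += r * math.cos(6 * math.pi * t)
            y += r * math.sin(6 * math.pi * t)
        yield (x, y)

path = list(trajectory(PATTERNS_BY_TYPE["folder"], (0, 0), (200, 100)))
print(path[0], path[-1])  # spiral start is offset; it ends exactly at (200, 100)
```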
  • the controller 180 may generate a control signal to be displayed while moving the object according to the determined movement trajectory pattern (S240).
  • the controller 180 may determine the moving direction, moving speed, and moving end point of the object according to the gesture analyzed in step S220. Since this was described in detail in the first embodiment of the present invention, a detailed description thereof will be omitted.
  • the controller 180 may control the display unit 151 so that the object moves according to the moving direction, moving speed, and the like analyzed in step S220, while its movement trajectory follows the pattern determined in step S230.
  • As described above, the electronic device 100 can give the user more intuitive visual feedback by varying the movement trajectory pattern of the object to reflect the object's attributes when executing the corresponding function and/or operation, in particular when moving the selected object.
  • A method of controlling the operation of an electronic device according to a third embodiment of the present invention will be described with reference to FIGS. 20 and 21.
  • FIG. 20 is a flowchart illustrating a method of controlling an operation of an electronic device according to a third embodiment of the present disclosure.
  • FIG. 21 is a diagram illustrating an example in which the electronic device operates according to the third embodiment of the present disclosure.
  • Referring to FIG. 20, the method for controlling the operation of the electronic device may include at least one of obtaining a user gesture (S300), analyzing the user gesture (S310), starting to move the first object at a predetermined speed (S320), analyzing an attribute of a second object on the movement path (S330), determining an approach speed of the first object in consideration of the analyzed attribute (S340), and changing the predetermined speed to the approach speed (S350).
  • each step will be described in detail.
  • Steps S300 and S310 according to the third embodiment of the present invention are the same as or similar to steps S110 and S120 according to the first embodiment of the present invention, and thus detailed description thereof will be omitted.
  • the controller 180 may start moving the first object at a predetermined speed (S320).
  • the predetermined speed may be determined by the speed of the user gesture analyzed in step S310. Alternatively, the predetermined speed may be determined by an attribute of the selected first object. Alternatively, the predetermined speed may be determined in consideration of both the speed of the user gesture and the attributes of the selected first object.
  • the selected first object OB is moved along the curve CV7 at a predetermined speed V from the position at which it was initially displayed to the first position P1.
  • the controller 180 may analyze an attribute of the second object on the movement path of the first object (S330).
  • the movement path of the first object may be determined by the direction and / or movement distance of the user gesture analyzed in step S310.
  • the selected first object OB is displayed while being moved along a predetermined movement path PH according to a user gesture.
  • a second object different from the first object OB may be located on the movement path PH.
  • the controller 180 may analyze the attribute of the second object.
  • An attribute of the second object may include the type of the second object, the user's access right to the second object, and the correlation between the second object and the first object.
  • the type of the second object means whether it corresponds to a file, a folder, or an application.
  • the access right for the second object may mean whether the user currently using the electronic device 100 has the right to access the second object.
  • the correlation between the second object and the first object may have different meanings as follows according to the type of the first object and / or the second object.
  • For example, when the second object corresponds to a file: i) if the first object corresponds to a file, whether the file of the first object and the file of the second object can be combined; ii) if the first object corresponds to a folder, whether the file of the second object can be moved to the folder of the first object; and iii) if the first object corresponds to an application, whether the file of the second object can run in conjunction with the application corresponding to the first object.
  • The controller 180 may determine the approach speed of the first object in consideration of the attributes of the second object analyzed in step S330 (S340), and change the predetermined speed to the determined approach speed (S350).
  • The controller 180 may determine whether the first object OB1 and the second object OB2 can interoperate with each other, based on the attributes of the second object analyzed in step S330.
  • For example, the controller 180 may determine whether a file, an application, and/or a folder corresponding to the first object OB1 can be moved to a folder corresponding to the second object OB2; whether a file corresponding to the first object OB1 can be combined with a file corresponding to the second object OB2; whether a file corresponding to the first object OB1 can be executed by an application corresponding to the second object OB2; whether an application corresponding to the first object OB1 can execute a file corresponding to the second object OB2; whether an application corresponding to the first object OB1 can be executed in cooperation with an application corresponding to the second object OB2; or whether a file and/or an application corresponding to the second object OB2 can be moved to a folder corresponding to the first object OB1.
  • According to the determination result, the controller 180 may change the approach speed of the first object OB1 with respect to the second object OB2.
  • For example, the controller 180 may increase the approach speed of the first object OB1 toward the second object OB2 in various ways, as illustrated by the curves CV8, CV9, and CV10 shown in FIG. 21.
  • Alternatively, the controller 180 may decrease the approach speed of the first object OB1 toward the second object OB2 in various ways, as illustrated by the curves CV8', CV9', and CV10' shown in FIG. 21.
  • The time point at which the changed approach speed is applied may be the time point at which the first object OB1 passes into an area within a predetermined distance from the second object OB2 (i.e., the time point at which the first object passes the first position P1 shown in FIG. 21).
  • That is, the first object OB1 may move at a speed determined by the user gesture and/or the attributes of the first object OB1 for a predetermined time, and may then move at the above-described approach speed after passing a specific time point (or a specific position).
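  • A minimal sketch of how steps S340 and S350 might be realized is shown below: the first object keeps the predetermined speed until it passes the first position P1 (a predetermined distance from the second object), after which an approach speed applies that is higher for interoperable targets (in the spirit of curves CV8 to CV10) and lower otherwise (curves CV8' to CV10'). The threshold value and the linear ramp standing in for the curves are assumptions.

```python
def speed_along_path(distance_to_target: float,
                     predetermined: float,
                     interoperable: bool,
                     threshold: float = 120.0) -> float:
    """Speed of the first object given its remaining distance to the second object."""
    if distance_to_target > threshold:
        # Before passing the first position P1: keep the predetermined speed.
        return predetermined
    # After P1: ramp the speed up toward compatible targets and down toward
    # incompatible ones. progress runs from 0 at P1 to 1 at the target.
    progress = 1.0 - distance_to_target / threshold
    factor = 1.0 + progress if interoperable else 1.0 - 0.8 * progress
    return predetermined * factor

# Example: halfway between P1 and the target, a compatible target speeds the
# object up 1.5x while an incompatible one slows it to 0.6x.
print(speed_along_path(60.0, 240.0, interoperable=True))   # 360.0
print(speed_along_path(60.0, 240.0, interoperable=False))  # 144.0
```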
  • For example, when one application is a mail application and the other application is a calendar application, the two applications may work together, for example to reflect the contents of a received mail in a schedule or, when there are attendees on a schedule, to send a confirmation email to the attendees.
  • As another example, when one application is a mail application and the other application is a text-to-speech (TTS) application, the two applications may work together, for example to convert the text of a received mail into speech.
  • A method of controlling the operation of an electronic device according to a fourth embodiment of the present invention will be described with reference to FIGS. 22 and 23.
  • FIG. 22 is a flowchart for describing a method of controlling an operation of an electronic device according to a fourth embodiment of the present disclosure.
  • FIG. 23 is a diagram for describing a method of operating an electronic device according to the fourth embodiment of the present disclosure.
  • The method for controlling the operation of the electronic device may include at least one of the following steps: obtaining a user gesture (S400), analyzing the user gesture (S410), moving the first object (S420), analyzing the attributes of the second object on the movement path of the first object (S430), determining the display pattern of the second object in consideration of the analyzed attributes (S440), and changing the display state of the second object (S450).
  • Steps S400, S410, S420, and S430 according to the fourth embodiment of the present invention are the same as or similar to steps S300, S310, S320, and S330 described in the third embodiment of the present invention, and thus detailed descriptions thereof will be omitted.
  • After performing steps S400 to S430, or while performing them, the controller 180 may determine the display pattern of the second object in consideration of the attributes of the second object analyzed in step S430 (S440).
  • Then, the display state of the second object may be changed according to the display pattern determined in step S440 (S450).
  • The controller 180 may determine whether the first object OB1 and the second object OB2 can interoperate with each other, based on the attributes of the second object analyzed in step S430.
  • For example, the controller 180 may determine whether a file, an application, and/or a folder corresponding to the first object OB1 can be moved to a folder corresponding to the second object OB2; whether a file corresponding to the first object OB1 can be combined with a file corresponding to the second object OB2; whether a file corresponding to the first object OB1 can be executed by an application corresponding to the second object OB2; whether an application corresponding to the first object OB1 can execute a file corresponding to the second object OB2; whether an application corresponding to the first object OB1 can be executed in cooperation with an application corresponding to the second object OB2; or whether a file and/or an application corresponding to the second object OB2 can be moved to a folder corresponding to the first object OB1.
  • According to the determination result, the controller 180 may change the display pattern of the second object OB2.
  • The display pattern of the second object may include the display image of the second object, rotation of the second object, blinking of the second object, and enlargement and/or reduction of the second object.
  • According to the above-described determination result, the controller 180 may change the display image of the second object OB2, display the second object OB2 while rotating it, flash the second object OB2, gradually enlarge the second object OB2, or gradually reduce the second object OB2.
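  • A minimal sketch of steps S440 and S450, assuming the determination result is reduced to a single interoperability flag. The pattern fields and the IM2/IM3 mapping follow the FIG. 23 example described below; everything else is an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass
class DisplayPattern:
    image: str      # which display image to use (e.g. IM1/IM2/IM3 in FIG. 23)
    blink: bool     # whether the second object should flash
    scale: float    # gradual enlargement (>1.0) or reduction (<1.0)
    rotate: bool    # whether to display the object while rotating it

def display_pattern(interoperable: bool) -> DisplayPattern:
    """Choose how the second object is drawn while the first object approaches."""
    if interoperable:
        # Like FIG. 23(b): swap IM1 for an "accepting" image and enlarge slightly.
        return DisplayPattern(image="IM2", blink=False, scale=1.2, rotate=False)
    # Like FIG. 23(c): swap IM1 for a "rejecting" image and flash it.
    return DisplayPattern(image="IM3", blink=True, scale=1.0, rotate=False)
```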
  • FIG. 23A illustrates a case where the first object OB1 corresponds to an application, and the second object OB2 corresponds to a folder.
  • In this case, the display image IM1 of the second object OB2 is displayed as an image representing a general folder.
  • FIG. 23B illustrates a case where it is determined that the application corresponding to the first object OB1 can be moved to the folder corresponding to the second object OB2. In this case, it can be seen that the display image of the second object OB2 is changed from IM1 to IM2.
  • FIG. 23C illustrates a case where it is determined that the application corresponding to the first object OB1 cannot be moved to the folder corresponding to the second object OB2. In this case, it can be seen that the display image of the second object OB2 is changed from IM1 to IM3.
  • As described above, the display pattern of the second object OB2 may be changed according to the attributes of the second object OB2 and/or the correlation between the second object OB2 and the first object OB1.
  • The controller 180 may change the display pattern of the second object OB2 when the first object OB1 enters an area within a predetermined distance from the second object OB2, or may change the display pattern of the second object OB2 at a different point in time.
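  • The proximity trigger described above could be scheduled as follows; this sketch fires the pattern change exactly once, on the update in which the first object crosses into the predetermined-distance area. The threshold value is an assumption.

```python
class PatternChangeTrigger:
    """Fires once when the first object enters the predetermined-distance area."""

    def __init__(self, threshold: float = 120.0):
        self.threshold = threshold
        self.fired = False

    def update(self, distance_to_second: float) -> bool:
        """Call on every frame; returns True only on the crossing frame."""
        if not self.fired and distance_to_second <= self.threshold:
            self.fired = True
            return True
        return False

# Example: the trigger fires exactly once as the object approaches.
trigger = PatternChangeTrigger()
print([trigger.update(d) for d in (300.0, 150.0, 110.0, 90.0)])
# -> [False, False, True, False]
```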
  • In this way, the attributes of the second object OB2 located on the movement path of the first object OB1 are analyzed, and the display pattern of the second object OB2 is changed accordingly. Attribute information of the second object OB2 and/or the relationship between the first object OB1 and the second object OB2 can thereby be provided to the user indirectly, and different visual feedback can be given for the user gesture according to the attributes of the second object.
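  • Tying the fourth embodiment together, the following minimal sketch drives a per-frame update in which the first object moves toward the second object and the second object's display image switches, in the spirit of FIG. 23, once the first object comes within the predetermined distance. The frame rate, threshold, and image names are assumptions for illustration.

```python
def update_frame(pos: float, target: float, speed: float,
                 interoperable: bool, threshold: float = 120.0,
                 dt: float = 1.0 / 60.0) -> tuple[float, str]:
    """Advance the first object and return (new_position, second_object_image)."""
    distance = abs(target - pos)
    image = "IM1"  # default image of the second object (FIG. 23(a))
    if distance <= threshold:
        image = "IM2" if interoperable else "IM3"  # FIG. 23(b)/(c)
    step = min(speed * dt, distance)
    return (pos + step if target >= pos else pos - step), image

# Example: run the update until the first object reaches the target.
pos, img = 0.0, "IM1"
while pos < 500.0:
    pos, img = update_frame(pos, 500.0, speed=600.0, interoperable=True)
print(pos, img)  # 500.0 IM2
```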
  • The first to fourth embodiments of the method for controlling the operation of the electronic device according to the present invention described above can be used individually or in combination with one another.
  • The steps constituting each embodiment may be used separately or in combination with the steps constituting another embodiment.
  • Each embodiment of the method for controlling the operation of the electronic device according to the present invention described above may be recorded on a computer-readable recording medium and provided as a program to be executed in a computer.
  • Each embodiment according to the present invention can be implemented through software.
  • When implemented in software, the constituent means of the present invention are code segments that perform the necessary tasks.
  • The program or code segments may be stored on a processor-readable medium or transmitted by a computer data signal combined with a carrier wave over a transmission medium or a communication network.
  • A computer-readable recording medium includes all kinds of recording devices that store data that can be read by a computer system. Examples of computer-readable recording devices include ROM, RAM, CD-ROM, DVD-ROM, DVD-RAM, magnetic tape, floppy disks, hard disks, and optical data storage devices. The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method for controlling the operation of an electronic device, comprising the steps of: generating a control signal for displaying an object at a first position; receiving a user gesture as an input; and generating a control signal for displaying a selected object while moving it at a speed determined by taking into consideration both the user gesture input and the attributes of the object.
PCT/KR2011/005452 2011-07-22 2011-07-22 Dispositif électronique actionné selon un geste d'utilisateur, et procédé de commande du fonctionnement du dispositif électronique Ceased WO2013015462A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2011/005452 WO2013015462A1 (fr) 2011-07-22 2011-07-22 Dispositif électronique actionné selon un geste d'utilisateur, et procédé de commande du fonctionnement du dispositif électronique

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2011/005452 WO2013015462A1 (fr) 2011-07-22 2011-07-22 Dispositif électronique actionné selon un geste d'utilisateur, et procédé de commande du fonctionnement du dispositif électronique

Publications (1)

Publication Number Publication Date
WO2013015462A1 true WO2013015462A1 (fr) 2013-01-31

Family

ID=47601279

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/005452 Ceased WO2013015462A1 (fr) 2011-07-22 2011-07-22 Dispositif électronique actionné selon un geste d'utilisateur, et procédé de commande du fonctionnement du dispositif électronique

Country Status (1)

Country Link
WO (1) WO2013015462A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103702156A (zh) * 2013-12-09 2014-04-02 乐视致新电子科技(天津)有限公司 一种自定义手势轨迹的方法及装置
CN104207760A (zh) * 2013-05-31 2014-12-17 义明科技股份有限公司 可携式电子装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080165140A1 (en) * 2007-01-05 2008-07-10 Apple Inc. Detecting gestures on multi-event sensitive devices
US20090228841A1 (en) * 2008-03-04 2009-09-10 Gesture Tek, Inc. Enhanced Gesture-Based Image Manipulation
US20100058240A1 (en) * 2008-08-26 2010-03-04 Apple Inc. Dynamic Control of List Navigation Based on List Item Properties
US20100095206A1 (en) * 2008-10-13 2010-04-15 Lg Electronics Inc. Method for providing a user interface using three-dimensional gestures and an apparatus using the same
KR20110072970A (ko) * 2009-12-23 2011-06-29 엘지전자 주식회사 영상표시장치 및 그 동작 방법

Similar Documents

Publication Publication Date Title
WO2019147021A1 (fr) Dispositif de fourniture de service de réalité augmentée et son procédé de fonctionnement
WO2012144666A1 (fr) Dispositif d'affichage et procédé de commande associé
WO2015190666A1 (fr) Terminal mobile et son procédé de commande
WO2015199292A1 (fr) Terminal mobile et son procédé de commande
WO2016024746A1 (fr) Terminal mobile
WO2017034116A1 (fr) Terminal mobile et procédé de commande de celui-ci
WO2015053449A1 (fr) Dispositif d'affichage d'image de type lunettes et son procédé de commande
WO2017191978A1 (fr) Procédé, appareil et support d'enregistrement pour traiter une image
WO2015068911A1 (fr) Terminal mobile et son procédé de commande
WO2017185316A1 (fr) Procédé et système de commande de vol de vue subjective pour véhicule aérien sans pilote et lunettes intelligentes
WO2018117349A1 (fr) Terminal mobile et procédé de commande associé
WO2015122590A1 (fr) Dispositif électronique et son procédé de commande
EP3311557A1 (fr) Terminal mobile et son procédé de commande
WO2015088166A1 (fr) Terminal mobile, et procédé de commande d'une unité d'entrée de face arrière du terminal
EP2982042A1 (fr) Terminal et son procédé de commande
WO2016175424A1 (fr) Terminal mobile, et procédé de commande associé
WO2013100323A1 (fr) Terminal mobile et système permettant de commander une holographie utilisée avec ce dernier
WO2012128399A1 (fr) Dispositif d'affichage et procédé de commande associé
WO2012102592A9 (fr) Dispositif d'affichage d'image et son procédé d'utilisation
WO2015064935A1 (fr) Dispositif électronique et son procédé de commande
WO2017183764A1 (fr) Terminal mobile et procédé de commande associé
WO2015174611A1 (fr) Terminal mobile et son procédé de commande
WO2020022548A1 (fr) Terminal mobile et procédé de commande associé
WO2013015466A1 (fr) Dispositif électronique pour affichage d'image tridimensionnelle et son procédé d'utilisation
WO2015178580A1 (fr) Terminal mobile et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11869953

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11869953

Country of ref document: EP

Kind code of ref document: A1