WO2017032217A1 - Multi-touch device and method
- Publication number
- WO2017032217A1 (PCT/CN2016/093964)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- touch operation
- distance
- gesture
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Description
- the present application relates to, but is not limited to, the field of mobile terminals, and in particular, to a multi-touch device and method.
- mobile terminals such as mobile phones, smart phones, PDAs (personal digital assistants), PADs (tablets), and PMPs (portable multimedia players) are increasingly widely used. Since large-screen mobile terminals are better suited to watching movies, playing games, and other entertainment, the screens of mobile terminals are becoming larger and larger, and at the same time one-handed operation of the mobile terminal is becoming more and more difficult.
- when an operation needs to be completed by multi-touch, the user often has to switch to two-handed operation, which has a great impact on the continuity of the user's operation and greatly reduces the user experience.
- for example, while shopping or travelling, one hand carries a bag and the other hand operates the mobile phone. In this situation, the electronic map cannot be enlarged or reduced by separating or closing two fingers; the user must stop and operate the mobile phone with both hands to zoom the electronic map, which greatly reduces the user experience.
- the embodiments of the invention can solve the problem that a mobile terminal of the related art cannot complete a multi-touch operation with one hand.
- the display module is configured to: when detecting an open command triggered by the back sensor, display a corresponding virtual point on the front touch screen for the user to perform a touch operation according to the virtual point;
- An acquiring module configured to: acquire, through the front touch screen, touch information of the touch operation;
- An identification module configured to: perform gesture recognition on the touch operation according to the touch information
- the execution module is configured to perform a corresponding function according to the result of the gesture recognition.
- the embodiment of the invention further provides a multi-touch method, including:
- the embodiment of the invention further provides a computer readable storage medium storing computer executable instructions which, when executed by a processor, implement the above multi-touch method.
- in the embodiments of the present invention, a corresponding instruction is triggered by the back sensor, and a corresponding virtual point is displayed on the front touch screen of the mobile terminal; the user performs a touch operation with reference to the virtual point, the mobile terminal acquires touch information through the front touch screen, performs gesture recognition according to the touch information, and performs a corresponding function according to the result of the gesture recognition. Through the cooperation of the back sensor and the front touch screen, when a multi-touch operation is required the user can realize multi-touch and complete the corresponding function control by operating the mobile terminal with one hand, so that the user does not need to operate the mobile terminal with both hands for multi-touch, which greatly improves the user experience.
- FIG. 1 is a schematic structural diagram of hardware of a mobile terminal that implements an embodiment of the present invention
- FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;
- FIG. 3 is a schematic diagram of functional modules of a first embodiment of a multi-touch device based on a back sensor according to the present invention
- FIG. 4 is a schematic diagram of functional modules of a second embodiment of a multi-touch device based on a back sensor according to the present invention.
- FIG. 5 is a schematic diagram of functional modules of a third embodiment of a multi-touch device based on a back sensor according to the present invention.
- FIG. 6 is a schematic diagram of reducing a display image of the front touch screen when the mobile terminal recognizes that the touch gesture is a close gesture according to an embodiment of the present disclosure
- FIG. 7 is a schematic diagram of amplifying the display image of the front touch screen when the mobile terminal recognizes that the touch gesture is a separate gesture according to an embodiment of the present disclosure
- FIG. 8 is a schematic flow chart of a first embodiment of a multi-touch method based on a back sensor according to the present invention.
- FIG. 9 is a schematic flow chart of a second embodiment of a multi-touch method based on a back sensor according to the present invention.
- FIG. 10 is a schematic flow chart of a third embodiment of a multi-touch method based on a back sensor according to the present invention.
- the mobile terminal can be implemented in a variety of forms.
- the terminal described in the embodiments of the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (Personal Digital Assistant), a PAD (tablet), a PMP (Portable Multimedia Player), and a navigation device, as well as fixed terminals such as digital TVs, desktop computers, and the like.
- in the following description, it is assumed that the terminal is a mobile terminal.
- those skilled in the art will appreciate that configurations according to the embodiments of the present invention can also be applied to fixed-type terminals, except for components that are specifically intended for mobile purposes.
- FIG. 1 is a schematic structural diagram of hardware of a mobile terminal that implements an embodiment of the present invention.
- the mobile terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
- Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
- Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
- the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
- the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
- the broadcast channel can include a satellite channel and/or a terrestrial channel.
- the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
- the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
- the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
- the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
- the broadcast signal may exist in various forms, for example, it may exist in the form of Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), Digital Video Broadcasting Handheld (DVB-H) Electronic Service Guide (ESG), and the like.
- the broadcast receiving module 111 can receive a signal broadcast by using a plurality of types of broadcast systems.
- the broadcast receiving module 111 can receive digital broadcasts by using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the Media Forward Link Only (MediaFLO) data broadcasting system, Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like.
- the broadcast receiving module 111 may also be constructed to be suitable for other broadcast systems that provide broadcast signals, as well as the above-described digital broadcast systems.
- the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
- the mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
- Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
- the wireless internet module 113 supports wireless internet access of the mobile terminal.
- the module can be internally or externally coupled to the terminal.
- the wireless Internet access technologies involved in the module may include WLAN (Wireless LAN, Wi-Fi), Wibro (Wireless Broadband), Wimax (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
- the short range communication module 114 is configured to support short range communication.
- Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and the like.
- the location information module 115 is configured to check or obtain location information of the mobile terminal.
- a typical example of a location information module is GPS (Global Positioning System).
- the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude.
- the method for calculating position and time information uses three satellites and corrects the calculated position and time information errors by using another satellite.
- the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
- the A/V input unit 120 is arranged to receive an audio or video signal.
- the A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capturing mode or an image capturing mode.
- the processed image frame can be displayed on the display unit 151.
- the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
- the microphone 122 can receive sound (audio data) via a microphone in an operation mode of a telephone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound as audio data.
- in the case of the telephone call mode, the processed audio (voice) data can be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112 and output.
- the microphone 122 may implement noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the process of receiving and transmitting audio signals.
- the user input unit 130 may generate key input data according to a command input by the user to control the operation of the mobile terminal.
- the user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, or the like due to being touched), a scroll wheel, a rocker, and the like.
- in particular, when the touch panel is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
- the sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), and the like.
- for example, when the mobile terminal 100 is a slide-type phone, the sensing unit 140 can sense whether the slide phone is opened or closed.
- the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
- Sensing unit 140 may include proximity sensor 141 which will be described below in connection with a touch screen.
- the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
- the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like.
- the identification module may store various information for authenticating the user of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
- the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
- the interface unit 170 may be arranged to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the mobile terminal 100, or may be configured to transfer data between the mobile terminal and an external device.
- the interface unit 170 may function as a path through which power is supplied from a base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal.
- a variety of command signals or power input from the base can be used as a signal for identifying whether the mobile terminal is accurately mounted on the base.
- Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
- the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
- the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) associated with a call or other communication (e.g., text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
- the display unit 151 can function as an input device and an output device.
- the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
- Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as a transparent display, and a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display or the like.
- the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown) .
- the touch screen can be set to detect touch input pressure as well as touch input position and touch input area.
- the audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
- the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
- the audio output module 152 can include a speaker, a buzzer, and the like.
- the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibration, and when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (i.e., vibration) to notify the user of it. By providing such a tactile output, the user is able to recognize the occurrence of an event even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide an output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
- the memory 160 may store a software program or the like for processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, etc.) that has been output or is to be output. Moreover, the memory 160 may store data regarding vibration and audio signals of various manners that are output when a touch is applied to the touch screen.
- the memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
- the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
- the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like.
- the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, which may be constructed within the controller 180 or may be configured to be separate from the controller 180.
- the controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
- the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required to operate each component and component.
- the embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
- the embodiments described herein may be implemented through the use of at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such an embodiment may be implemented in the controller 180.
- implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
- the software code can be implemented by a software application (or program) written in any suitable programming language, which can be stored in the memory 160 and executed by the controller 180.
- the mobile terminal has been described in terms of its function.
- hereinafter, a slide-type mobile terminal among a plurality of types of mobile terminals such as folding-type, bar-type, swing-type, and slide-type mobile terminals will be described as an example; however, the embodiments of the present invention can be applied to any type of mobile terminal and are not limited to the slide-type mobile terminal.
- the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
- a communication system in which a mobile terminal is operable according to an embodiment of the present invention will now be described with reference to FIG.
- Such communication systems may use different air interfaces and/or physical layers.
- air interfaces used by communication systems include, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.
- the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
- a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290.
- the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
- the backhaul line can be constructed in accordance with any of a number of well known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in FIG. 2 can include multiple BSCs 275.
- Each BS 270 can serve one or more partitions (or regions), with each partition covered by a multi-directional antenna or an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
- BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
- the term "base station” can be used to generally mean a single BSC 275 and at least one BS 270.
- a base station can also be referred to as a "cell station.”
- multiple partitions of a particular BS 270 may be referred to as multiple cellular stations.
- a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
- a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
- a GPS (Global Positioning System) satellite 300 helps locate at least one of the plurality of mobile terminals 100.
- a plurality of satellites 300 are depicted, but it will be appreciated that useful positioning information can be obtained using any number of satellites.
- the GPS module 115 as shown in Figure 1 is typically configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of GPS tracking technology or in addition to GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
- BS 270 receives a reverse link signal from mobile terminal 100.
- Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
- Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
- the obtained data is forwarded to the relevant BSC 275.
- the BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270.
- the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
- PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
- FIG. 3 is a schematic diagram of functional modules of a first embodiment of a multi-touch device based on a back sensor according to the present invention.
- the back sensor-based multi-touch device includes a display module 10, an acquisition module 20, an identification module 30, and an execution module 40;
- the display module 10 is configured to: when detecting an open command triggered by the back sensor, display a corresponding virtual point on the front touch screen for the user to perform a touch operation according to the virtual point;
- the back sensor is disposed on the back of the mobile terminal, and is configured to: trigger a corresponding instruction to enter the multi-touch mode, or trigger a corresponding instruction to exit the multi-touch mode.
- the mobile terminal enters a multi-touch mode when detecting an open command triggered by the back sensor, and displays a corresponding virtual point on the front touch screen for the user to perform a touch operation according to the virtual point.
- the back sensor may be a touch-sensitive component, such as one that detects a change in resistance, pressure, capacitance, or the like due to being touched, or may be an optical fingerprint sensor, a semiconductor capacitive sensor, a semiconductor thermal sensor, a semiconductor pressure-sensitive sensor, an ultrasonic sensor, or the like.
- when a pressing operation of the user on the back sensor is detected, the opening command is triggered to enter the multi-touch mode, and the front touch screen displays the corresponding virtual point for the user to perform a touch operation based on the virtual point of the front touch screen; when a release operation of the user on the back sensor is detected, the exit instruction is triggered to exit the multi-touch mode, and the virtual point displayed on the front touch screen is cancelled.
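- A minimal illustrative sketch (not part of the original text) of the press/release toggling described above: a handler switches the multi-touch mode and shows or hides the virtual point. All class, method, and attribute names here are hypothetical assumptions.

```python
class MultiTouchModeController:
    """Hypothetical sketch: toggle multi-touch mode from back-sensor events."""

    def __init__(self, front_screen):
        # front_screen is assumed to expose show_virtual_point()/hide_virtual_point().
        self.front_screen = front_screen
        self.multi_touch_active = False

    def on_back_sensor_pressed(self):
        # A press on the back sensor triggers the opening command.
        self.multi_touch_active = True
        self.front_screen.show_virtual_point()

    def on_back_sensor_released(self):
        # A release of the back sensor triggers the exit command.
        self.multi_touch_active = False
        self.front_screen.hide_virtual_point()
```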
- the back sensor-based multi-touch device may further include a startup module, where the startup module is configured to: when detecting that the terminal enters a multi-touch interface, activate the back sensor and receive an opening command triggered based on the back sensor.
- the multi-touch interface may be a preset operation interface that requires multi-touch to complete, for example, a picture viewing interface, which requires multi-touch to complete the zooming of the image, or a text reading interface, which requires multi-touch to complete text size scaling, and so on.
- the front touch screen is disposed on the front side of the mobile terminal, and is configured to: display information processed in the mobile terminal, and receive a touch input of the user to control various operations of the mobile terminal.
- the front touch screen may be further configured to: acquire touch information of a user touch operation, and display information processed in the mobile terminal by using an image.
- during a multi-touch operation, the virtual point is displayed at a fixed point on the front touch screen, which may be the projection position of the back sensor on the front touch screen, or may be the center of the front touch screen.
- Virtual points can be displayed on the front touch screen in the form of translucent circles, diamonds, and the like.
- the virtual point is used by the user to complete the multi-touch operation with reference to the virtual point. For example, the user touches the front touch screen with a single finger and slides in a direction away from the virtual point to form a separate gesture, or slides in a direction approaching the virtual point to form a close gesture.
- the acquiring module 20 is configured to: acquire, by using the front touch screen, touch information of the touch operation;
- the mobile terminal acquires touch information of the touch operation through the front touch screen.
- the touch operation is a touch operation performed by a user based on the front touch screen.
- the touch information is used to perform gesture determination on the corresponding touch operation, and may include, for example, the position of the touch point of the touch operation, the touch time, the movement track of the touch point, and the like.
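- For concreteness, the touch information could be kept in a small record holding the touch position, time, and movement track; this layout is only an assumed illustration and is not prescribed by the text.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) coordinates on the front touch screen

@dataclass
class TouchInfo:
    """Assumed container for the touch information used in gesture recognition."""
    start_point: Point                                 # where the finger first touched
    end_point: Point                                   # where the finger was lifted
    touch_time_ms: float                               # duration of the touch operation
    track: List[Point] = field(default_factory=list)   # sampled movement track of the touch point
```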
- the identification module 30 is configured to: perform gesture recognition on the touch operation according to the touch information;
- the mobile terminal performs gesture recognition on the touch operation according to the touch information.
- the mobile terminal may determine, according to the moving direction of the touch operation, whether the touch operation is a separate operation or a close operation; or, according to the movement track of the touch operation, determine whether the touch operation is a rotation operation based on whether the track is curved.
- the execution module 40 is configured to perform a corresponding function according to the result of the gesture recognition.
- the mobile terminal performs a corresponding function according to the result of the gesture recognition.
- the function corresponding to the result of the gesture recognition may be performed according to a preset mapping relationship between gestures and functions. For example, when the close gesture is recognized, the function of reducing the displayed image is performed; when the separate gesture is recognized, the function of enlarging the displayed image is performed.
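- One simple way to realize such a preset mapping is a lookup table from recognized gesture names to handler functions; the names and stub handlers below are placeholders, not APIs defined by the text.

```python
def reduce_image():
    print("reduce the displayed image")   # placeholder for the zoom-out function

def enlarge_image():
    print("enlarge the displayed image")  # placeholder for the zoom-in function

GESTURE_ACTIONS = {
    "close": reduce_image,      # close (pinch) gesture -> reduce the displayed image
    "separate": enlarge_image,  # separate (spread) gesture -> enlarge the displayed image
}

def execute(gesture: str) -> None:
    """Run the function mapped to the recognized gesture, if any."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        action()
```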
- the display module 10 is further configured to cancel the virtual point displayed on the front touch screen when detecting an exit instruction triggered by the back sensor.
- when the user performs a pressing operation on the back sensor, the opening command is triggered to enter the multi-touch mode, and the corresponding virtual point is displayed on the front touch screen for the user to perform a touch operation based on the virtual point of the front touch screen; when the user performs a release operation on the back sensor, the exit command is triggered to exit the multi-touch mode, and the virtual point displayed on the front touch screen is cancelled.
- in this embodiment, the corresponding command is triggered by the back sensor, and the corresponding virtual point is displayed on the front touch screen of the mobile terminal; the user performs a touch operation with reference to the virtual point, the mobile terminal acquires touch information through the front touch screen, performs gesture recognition according to the touch information, and performs the corresponding function according to the result of the gesture recognition. Through the cooperation of the back sensor and the front touch screen, the user can realize multi-touch and complete the corresponding function control by operating the mobile terminal with one hand when a multi-touch operation is required, so that the user does not need to operate the mobile terminal with both hands for multi-touch, which greatly improves the user experience.
- FIG. 4 is a schematic diagram of functional modules of a second embodiment of a multi-touch device based on a back sensor according to the present invention.
- based on the first embodiment of the above back sensor-based multi-touch device, the identification module 30 includes a first determining unit 31, a calculating unit 32, and a comparing unit 33;
- the first determining unit 31 is configured to: determine a start point and an end point of the touch operation according to the touch information;
- the calculating unit 32 is configured to: calculate a first distance between the starting point and the virtual point, and a second distance between the end point and the virtual point;
- the comparison unit 33 is configured to: compare the first distance with the second distance;
- the first determining unit 31 is further configured to: if the first distance is smaller than the second distance, determine that the touch operation is a close gesture; if the first distance is greater than the second distance, determine that the touch operation is a separate gesture; or, if the first distance is greater than the second distance, determine that the touch operation is a close gesture; if the first distance is smaller than the second distance, determine that the touch operation is a separate gesture.
- the mobile terminal determines a start point and an end point of the touch operation according to the touch information; the mobile terminal calculates, according to the start point, the end point, and the location information of the virtual point, a first distance between the start point and the virtual point and a second distance between the end point and the virtual point, and compares the first distance with the second distance. The mobile terminal determines that the touch operation is a close gesture when the first distance is smaller than the second distance, and determines that the touch operation is a separate gesture when the first distance is greater than the second distance; or, the mobile terminal determines that the touch operation is a close gesture when the first distance is greater than the second distance, and determines that the touch operation is a separate gesture when the first distance is smaller than the second distance. The mobile terminal may further determine that the touch operation is an invalid gesture when the first distance is equal to the second distance.
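- A sketch of the distance comparison just described, using Euclidean distances and the first of the two mappings given above (first distance smaller than second distance means a close gesture); the function and argument names are assumptions.

```python
import math

def distance(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_by_distance(start_point, end_point, virtual_point):
    """Classify a one-finger touch relative to the virtual point."""
    d1 = distance(start_point, virtual_point)  # first distance: start point to virtual point
    d2 = distance(end_point, virtual_point)    # second distance: end point to virtual point
    if d1 < d2:
        return "close"       # first mapping: d1 < d2 -> close gesture
    if d1 > d2:
        return "separate"    # d1 > d2 -> separate gesture
    return "invalid"         # equal distances may be treated as an invalid gesture
```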
- in this embodiment, the touch information of the touch operation is acquired through the front touch screen, the start point and the end point of the touch operation are determined according to the touch information, the distances from the start point and the end point to the virtual point are calculated respectively, gesture recognition is performed to determine whether the touch operation is a close gesture or a separate gesture, and the corresponding function is performed according to the result of the gesture recognition.
- the user can realize multi-touch and complete the corresponding function control by operating the mobile terminal with one hand when performing a multi-touch operation, so that the user does not need to operate the mobile terminal with both hands for multi-touch, thereby greatly improving the user experience.
- the identification module may include a second determining unit
- the second determining unit is configured to: determine a moving direction of the touch operation relative to the virtual point according to the touch information; and determine the gesture of the touch operation according to the moving direction of the touch operation relative to the virtual point.
- the second determining unit is configured to: if the moving direction of the touch operation is toward the virtual point, determine that the touch operation is a close gesture; if the moving direction of the touch operation relative to the virtual point is away from the virtual point, determine that the touch operation is a separate gesture.
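- A sketch of the direction-based variant: the sign of the projection of the finger's displacement onto the direction from the start point toward the virtual point tells whether the movement approaches or leaves it. The coordinate convention and the small threshold are assumptions.

```python
def classify_by_direction(start_point, end_point, virtual_point, eps=1e-6):
    """Decide whether the touch moves toward or away from the virtual point."""
    # Finger displacement and the direction from the start point to the virtual point.
    move = (end_point[0] - start_point[0], end_point[1] - start_point[1])
    to_vp = (virtual_point[0] - start_point[0], virtual_point[1] - start_point[1])
    projection = move[0] * to_vp[0] + move[1] * to_vp[1]  # dot product
    if projection > eps:
        return "close"      # movement directed toward the virtual point
    if projection < -eps:
        return "separate"   # movement directed away from the virtual point
    return "invalid"        # negligible or perpendicular movement
```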
- in this embodiment, the touch information of the touch operation is acquired through the front touch screen, the moving direction of the touch operation is determined according to the touch information, and the gesture of the touch operation is determined according to the moving direction relative to the virtual point, so as to determine whether the touch operation is a close gesture or a separate gesture; the corresponding function is then performed according to the result of the gesture recognition.
- the user can realize multi-touch and complete the corresponding function control by operating the mobile terminal with one hand when performing a multi-touch operation, so that the user does not need to operate the mobile terminal with both hands for multi-touch, thereby greatly improving the user experience.
- the identification module may include a third determining unit
- a third determining unit configured to: determine a movement trajectory of the touch operation according to the touch information; and determine a gesture of the touch operation according to a movement trajectory of the touch operation.
- the third determining unit is configured to: if the moving track is curved, determine that the touch operation is a rotation gesture.
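- One possible test for a curved movement track is to compare the length of the sampled path with the straight-line distance between its endpoints; the curvature threshold here is an arbitrary assumption.

```python
import math

def is_curved(track, curvature_ratio=1.2):
    """Return True when the sampled track deviates noticeably from a straight line."""
    if len(track) < 3:
        return False
    path_length = sum(
        math.hypot(track[i + 1][0] - track[i][0], track[i + 1][1] - track[i][1])
        for i in range(len(track) - 1)
    )
    chord = math.hypot(track[-1][0] - track[0][0], track[-1][1] - track[0][1])
    # A curved track is substantially longer than the straight chord between its ends.
    return chord > 0 and path_length / chord > curvature_ratio
```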
- in this embodiment, the touch information of the touch operation is acquired through the front touch screen, the movement track of the touch operation is determined according to the touch information, the gesture is determined according to the movement track so as to determine whether the touch operation is a rotation gesture, and the corresponding function is performed according to the result of the gesture recognition.
- FIG. 5 is a schematic diagram of functional modules of a third embodiment of a multi-touch device based on a back sensor according to the present invention.
- based on the second embodiment of the above back sensor-based multi-touch device, the execution module 40 includes a reduction unit 41 and an amplification unit 42;
- the reducing unit 41 is configured to: if the touch operation is a close gesture, determine a reduction ratio according to the distance difference between the first distance and the second distance, and reduce the image displayed on the front touch screen according to the reduction ratio;
- FIG. 6 is a schematic diagram of reducing a front touch screen display image when a mobile terminal recognizes that a touch gesture is a close gesture according to an embodiment of the present invention.
- the amplifying unit 42 is configured to: if the touch operation is a separate gesture, determine an enlargement ratio according to the distance difference between the first distance and the second distance, and enlarge the image displayed on the front touch screen according to the enlargement ratio.
- FIG. 7 is a schematic diagram of enlarging a display image of the front touch screen when the mobile terminal recognizes that the touch gesture is a separate gesture according to an embodiment of the present invention.
- in this embodiment, when the touch operation is recognized as a close gesture or a separate gesture, the zoom ratio is determined according to the distance difference between the first distance (from the start point of the touch operation to the virtual point) and the second distance (from the end point of the touch operation to the virtual point), and the corresponding zoom operation is performed according to that ratio, so that when a multi-touch operation is required, the user can realize multi-touch and perform the corresponding zoom function by operating the mobile terminal with one hand; the user does not need to operate the mobile terminal with both hands, which greatly improves the user experience.
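- A sketch of deriving a zoom factor from the distance difference between the first and second distances; the sensitivity constant and the clamping range are arbitrary assumptions made for illustration.

```python
def zoom_factor(gesture, first_distance, second_distance, sensitivity=0.005):
    """Map |first_distance - second_distance| (in pixels) to a scale factor.

    A close gesture yields a reduction factor (< 1), a separate gesture an
    enlargement factor (> 1); the result is clamped to a reasonable range.
    """
    factor = 1.0 + sensitivity * abs(first_distance - second_distance)
    if gesture == "close":
        factor = 1.0 / factor   # close gesture -> reduce the displayed image
    return max(0.2, min(factor, 5.0))
```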
- the execution module may further include a rotation unit
- the rotating unit is configured to: if the touch operation is a rotation gesture, determine a rotation angle according to the movement trajectory, and rotate an image displayed by the front touch screen according to the rotation angle.
- in this embodiment, when the touch operation is recognized as a rotation gesture, the rotation angle is determined according to the movement trajectory and the image displayed on the front touch screen is rotated according to that angle, so that when a multi-touch operation is required, the user can realize multi-touch and execute the corresponding rotation function by operating the mobile terminal with one hand; the user does not need to operate the mobile terminal with both hands, which greatly improves the user experience.
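- Correspondingly, the rotation angle could be estimated from the movement trajectory, for example as the angle swept around the virtual point between the first and last sampled points; this is only one possible choice, and the names are assumptions.

```python
import math

def rotation_angle(track, virtual_point):
    """Angle in degrees swept around the virtual point from the first to the last track point."""
    vx, vy = virtual_point
    start_angle = math.atan2(track[0][1] - vy, track[0][0] - vx)
    end_angle = math.atan2(track[-1][1] - vy, track[-1][0] - vx)
    angle = math.degrees(end_angle - start_angle)
    # Normalize to (-180, 180] so clockwise and counter-clockwise turns are symmetric.
    while angle <= -180.0:
        angle += 360.0
    while angle > 180.0:
        angle -= 360.0
    return angle
```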
- the embodiment of the invention further provides a multi-touch method based on the back sensor.
- FIG. 8 is a schematic flow chart of a first embodiment of a multi-touch method based on a back sensor according to the present invention.
- the back sensor-based multi-touch method includes:
- Step S10 when detecting an open command triggered by the back sensor, displaying a corresponding virtual point on the front touch screen for the user to perform a touch operation according to the virtual point;
- the back sensor is disposed on the back of the mobile terminal, and is configured to: trigger a corresponding instruction to enter the multi-touch mode, or trigger a corresponding instruction to exit the multi-touch mode.
- the mobile terminal enters a multi-touch mode when detecting an open command triggered by the back sensor, and displays a corresponding virtual point on the front touch screen for the user to perform a touch operation according to the virtual point.
- the back sensor may be a touch-sensitive component, such as one that detects a change in resistance, pressure, capacitance, or the like due to being touched, or may be an optical fingerprint sensor, a semiconductor capacitive sensor, a semiconductor thermal sensor, a semiconductor pressure-sensitive sensor, an ultrasonic sensor, or the like.
- when a pressing operation of the user on the back sensor is detected, the opening command is triggered to enter the multi-touch mode, and the front touch screen displays the corresponding virtual point for the user to perform a touch operation based on the virtual point of the front touch screen; when a release operation of the user on the back sensor is detected, the exit instruction is triggered to exit the multi-touch mode, and the virtual point displayed on the front touch screen is cancelled.
- when it is detected that the terminal enters a multi-touch interface, the back sensor is activated, and an opening command triggered based on the back sensor is received.
- the multi-touch interface may be a preset operation interface that requires multi-touch to complete, for example, a picture viewing interface, which requires multi-touch to complete the zooming of the image, or a text reading interface, which requires multi-touch to complete text size scaling, and so on.
- the front touch screen is disposed on the front side of the mobile terminal, and is configured to: display information processed in the mobile terminal, and receive a touch input of the user to control various operations of the mobile terminal.
- the front touch screen may be further configured to: acquire touch information of a user touch operation, and display information processed in the mobile terminal by using an image.
- during a multi-touch operation, the virtual point is displayed at a fixed point on the front touch screen, which may be the projection position of the back sensor on the front touch screen, or may be the center of the front touch screen.
- Virtual points can be displayed on the front touch screen in the form of translucent circles, diamonds, and the like.
- the virtual point is used by the user to complete the multi-touch operation with reference to the virtual point. For example, the user single-touches the front touch screen and slides away from the virtual point to form a separate gesture. Alternatively, sliding in a direction approaching the virtual point to form a collapse gesture.
- Step S20 acquiring touch information of the touch operation through the front touch screen
- the mobile terminal acquires touch information of the touch operation through the front touch screen.
- the touch operation is a touch operation performed by a user based on the front touch screen.
- the touch information is used to perform gesture determination on a corresponding touch operation, and the touch information may be touch information such as a position of the touch point of the touch operation, a touch time, a movement track of the touch point, and the like.
- Step S30 performing gesture recognition on the touch operation according to the touch information
- the mobile terminal performs gesture recognition on the touch operation according to the touch information.
- the mobile terminal may determine, according to the moving direction of the touch operation, whether the touch operation is a separate operation or a close operation; or, according to the movement track of the touch operation, determine whether the touch operation is a rotation operation based on whether the track is curved.
- Step S40 performing a corresponding function according to the result of the gesture recognition.
- the mobile terminal performs a corresponding function according to the result of the gesture recognition.
- the function corresponding to the result of the gesture recognition may be performed according to a preset mapping relationship between gestures and functions. For example, when the close gesture is recognized, the function of reducing the displayed image is performed; when the separate gesture is recognized, the function of enlarging the displayed image is performed.
- the virtual point displayed on the front touch screen may also be cancelled when detecting an exit instruction triggered by the back sensor.
- when the user performs a pressing operation on the back sensor, the opening command is triggered to enter the multi-touch mode, and the corresponding virtual point is displayed on the front touch screen for the user to perform a touch operation based on the virtual point of the front touch screen; when the user performs a release operation on the back sensor, the exit command is triggered to exit the multi-touch mode, and the virtual point displayed on the front touch screen is cancelled.
- in this embodiment, the corresponding command is triggered by the back sensor, and the corresponding virtual point is displayed on the front touch screen of the mobile terminal; the user performs a touch operation with reference to the virtual point, the mobile terminal acquires touch information through the front touch screen, performs gesture recognition according to the touch information, and performs the corresponding function according to the result of the gesture recognition. Through the cooperation of the back sensor and the front touch screen, the user can realize multi-touch and complete the corresponding function control by operating the mobile terminal with one hand when a multi-touch operation is required, so that the user does not need to operate the mobile terminal with both hands for multi-touch, which greatly improves the user experience.
- FIG. 9 is a schematic flow chart of a second embodiment of a multi-touch method based on a back sensor according to the present invention. Based on the first embodiment of the above-described back sensor-based multi-touch method, the step S30 includes:
- Step S31 determining a start point and an end point of the touch operation according to the touch information
- Step S32 calculating a first distance between the starting point and the virtual point, and a second distance between the end point and the virtual point;
- Step S33 comparing the first distance with the second distance
- Step S34 if the first distance is greater than the second distance, determining that the touch operation is a close gesture
- Step S35 if the first distance is smaller than the second distance, determining that the touch operation is a separate gesture.
- the step S34 and the step S35 may be: if the first distance is smaller than the second distance, determining that the touch operation is a close gesture; if the first distance is greater than the second distance, determining The touch operation is a separate gesture.
- the mobile terminal determines a start point and an end point of the touch operation according to the touch information; the mobile terminal calculates, according to the start point, the end point, and the location information of the virtual point, a first distance between the start point and the virtual point and a second distance between the end point and the virtual point, and compares the first distance with the second distance. The mobile terminal determines that the touch operation is a close gesture when the first distance is smaller than the second distance, and determines that the touch operation is a separate gesture when the first distance is greater than the second distance; or, the mobile terminal determines that the touch operation is a close gesture when the first distance is greater than the second distance, and determines that the touch operation is a separate gesture when the first distance is smaller than the second distance. The mobile terminal may further determine that the touch operation is an invalid gesture when the first distance is equal to the second distance.
- in this embodiment, the touch information of the touch operation is acquired through the front touch screen, the start point and the end point of the touch operation are determined according to the touch information, and the distances from the start point and the end point to the virtual point are determined respectively, so as to determine whether the touch operation is a close gesture or a separate gesture; the corresponding function is performed according to the result of the gesture recognition. The user can realize multi-touch and complete the corresponding function control by operating the mobile terminal with one hand when performing a multi-touch operation, so that the user does not need to operate the mobile terminal with both hands for multi-touch, which greatly improves the user experience.
- the step S30 may include:
- Step S34 determining a moving direction of the touch operation relative to the virtual point according to the touch information
- Step S35 Determine the gesture of the touch operation according to the moving direction of the touch operation relative to the virtual point.
- step S35 may include:
- if the moving direction of the touch operation is toward the virtual point, the touch operation is determined to be a close gesture; if the moving direction of the touch operation is away from the virtual point, the touch operation is determined to be a separate gesture.
- in this embodiment, the touch information of the touch operation is acquired through the front touch screen, the moving direction of the touch operation is determined according to the touch information, and the gesture of the touch operation is determined according to the moving direction relative to the virtual point, so as to determine whether the touch operation is a close gesture or a separate gesture; the corresponding function is then performed according to the result of the gesture recognition.
- the user can realize multi-touch and complete the corresponding function control by operating the mobile terminal with one hand when performing a multi-touch operation, so that the user does not need to operate the mobile terminal with both hands for multi-touch, thereby greatly improving the user experience.
- the step S30 may include:
- Step S36 determining a movement trajectory of the touch operation according to the touch information
- Step S37 determining a gesture of the touch operation according to the movement trajectory of the touch operation.
- step S37 may include: if the movement track is curved, determining that the touch operation is a rotation gesture.
- in this embodiment, the touch information of the touch operation is acquired through the front touch screen, the movement track of the touch operation is determined according to the touch information, the gesture is determined according to the movement track so as to determine whether the touch operation is a rotation gesture, and the corresponding function is performed according to the result of the gesture recognition.
- the user can realize multi-touch and complete the corresponding function control by operating the mobile terminal with one hand when performing a multi-touch operation, so that the user does not need to operate the mobile terminal with both hands for multi-touch, thereby greatly improving the user experience.
- FIG. 10 is a schematic flowchart diagram of a third embodiment of a multi-touch method based on a back sensor according to the present invention. Based on the second embodiment of the above-described back sensor-based multi-touch method, the step S40 includes:
- Step S41 if the touch operation is a close gesture, determining a reduction ratio according to a distance difference between the first distance and the second distance, and reducing an image displayed by the front touch screen according to the reduction ratio;
- when the mobile terminal recognizes that the touch operation is a close gesture, it subtracts the second distance from the first distance to obtain a distance difference, determines the corresponding reduction ratio according to the distance difference, and reduces the image displayed on the front touch screen according to the reduction ratio.
- FIG. 6 is a schematic diagram of reducing the display image of the front touch screen when the mobile terminal recognizes that the touch gesture is a close gesture according to an embodiment of the present invention.
- Step S42 if the touch operation is a separate gesture, determining an enlargement ratio according to the distance difference between the first distance and the second distance, and enlarging the image displayed on the front touch screen according to the enlargement ratio.
- FIG. 7 is a schematic diagram of enlarging a display image of the front touch screen when the mobile terminal recognizes that the touch gesture is a separate gesture according to an embodiment of the present invention.
- In this embodiment, when the touch operation is recognized as a close gesture or a separate gesture, the zoom ratio is determined according to the first distance between the start point of the touch operation and the virtual point, the second distance between the end point of the touch operation and the virtual point, and the distance difference between the first distance and the second distance, and the corresponding zoom operation is performed according to that zoom ratio, so that when the user needs multi-touch, the user can implement multi-touch and perform the corresponding zoom function by operating the mobile terminal with one hand, without needing to operate the mobile terminal with both hands, which greatly improves the user experience.
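- No formula is given for turning the distance difference into a reduction or enlargement ratio. One simple mapping, shown below purely as an assumed example, scales the image between 0.5x and 2x in proportion to how far the finger travels relative to the screen diagonal.

```python
def zoom_factor(first_distance, second_distance, screen_diagonal):
    """Map the difference between the start-to-virtual-point distance and the
    end-to-virtual-point distance to a multiplicative zoom factor.
    delta > 0 (separate gesture) enlarges, delta < 0 (close gesture) reduces."""
    delta = second_distance - first_distance
    step = max(-1.0, min(1.0, delta / screen_diagonal))   # clamp to one diagonal
    return 2.0 ** step                                    # 0.5x .. 2.0x

# Finger ends 400 px farther from the virtual point on a 2000 px diagonal:
print(round(zoom_factor(150.0, 550.0, 2000.0), 3))   # -> 1.149 (enlarge by ~15%)
# Finger ends 400 px closer to the virtual point:
print(round(zoom_factor(550.0, 150.0, 2000.0), 3))   # -> 0.871 (reduce by ~13%)
```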
- In other embodiments, the step S40 may further include:
- Step S43, if the touch operation is a rotation gesture, determining a rotation angle according to the movement trajectory, and rotating the image displayed by the front touch screen according to the rotation angle.
- In this embodiment, when the touch operation is recognized as a rotation gesture, the rotation angle is determined according to the movement trajectory and the image displayed by the front touch screen is rotated by that angle, so that when the user needs multi-touch, the user can implement multi-touch and perform the corresponding rotation function by operating the mobile terminal with one hand, without needing to operate the mobile terminal with both hands, which greatly improves the user experience.
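- How the rotation angle is derived from the movement trajectory is likewise left open. A straightforward choice, sketched below under that assumption, is the signed angle between the vectors from the virtual point to the start and to the end of the trajectory; the result could then be passed to whatever image-rotation call the platform provides (for example Canvas.rotate on Android).

```python
import math

def rotation_angle_deg(trajectory, virtual_point):
    """Signed rotation angle (degrees, counter-clockwise positive) swept between
    the first and last trajectory points, measured around the virtual point."""
    vx, vy = virtual_point
    sx, sy = trajectory[0][0] - vx, trajectory[0][1] - vy
    ex, ey = trajectory[-1][0] - vx, trajectory[-1][1] - vy
    cross = sx * ey - sy * ex          # sine-like term
    dot = sx * ex + sy * ey            # cosine-like term
    return math.degrees(math.atan2(cross, dot))

# Start at (100, 0), end at (0, 100), virtual point at the origin:
print(rotation_angle_deg([(100, 0), (70, 70), (0, 100)], (0, 0)))  # -> 90.0
```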
- The technical solutions of the embodiments of the present invention may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), which includes a number of instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
- In the embodiments of the present invention, the cooperation between the back sensor and the front touch screen enables the user, when a multi-touch operation is required, to implement multi-touch and complete the corresponding function control by operating the mobile terminal with one hand, so that the user does not need to operate the mobile terminal with both hands for multi-touch, which greatly improves the user experience.
Abstract
本申请公布一种多点触控装置及方法,所述多点触控装置包括:显示模块,设置为:在侦测到基于背置传感器触发的开启指令时,在前置触摸屏显示对应的虚拟点,供用户根据所述虚拟点进行触摸操作;获取模块,设置为:通过所述前置触摸屏获取所述触摸操作的触摸信息;识别模块,设置为:根据所述触摸信息对所述触摸操作进行手势识别;执行模块,设置为:按照所述手势识别的结果执行对应的功能。
Description
本申请涉及但不限于移动终端领域,尤其涉及一种多点触控装置及方法。
目前,随着通信技术和终端技术的快速发展,移动电话、智能电话、PDA(个人数字助理)、PAD(平板电脑)、PMP(便携式多媒体播放器)等移动终端的使用越来越广泛。由于大屏幕移动终端更适合观看电影、玩游戏等娱乐活动,导致移动终端屏幕越做越大,同时也导致了单手对移动终端的操作变得越来越困难。在需要通过多点触控完成的操作时,往往需要更换为双手进行操作,对用户的操作连贯性影响较大,大大降低了用户体验。而在某些场景下,例如:在逛街或者出行过程中,一只手提着包,另一只手操作手机,此时将无法通过两指的分离或者合拢实现放大或者缩小电子地图的操作,用户必须停下来,通过双手操作手机来实现电子地图的缩放,大大降低了用户操作体验。
上述内容仅用于辅助理解本申请的技术方案,并不代表承认上述内容是现有技术。
发明内容
以下是对本文详细描述的主题的概述。本概述并非是为了限制权利要求的保护范围。
本发明实施例可以解决相关技术的移动终端无法通过单手完成多点触控操作的问题。
本发明实施例提供的一种多点触控装置,包括:
显示模块,设置为:在侦测到基于背置传感器触发的开启指令时,在前置触摸屏显示对应的虚拟点,供用户根据所述虚拟点进行触摸操作;
获取模块，设置为：通过所述前置触摸屏获取所述触摸操作的触摸信息；
识别模块,设置为:根据所述触摸信息对所述触摸操作进行手势识别;
执行模块,设置为:按照所述手势识别的结果执行对应的功能。
本发明实施例还提供一种多点触控方法,包括:
在侦测到基于背置传感器触发的开启指令时,在前置触摸屏显示对应的虚拟点,供用户根据所述虚拟点进行触摸操作;
通过所述前置触摸屏获取所述触摸操作的触摸信息;
根据所述触摸信息对所述触摸操作进行手势识别;
按照所述手势识别的结果执行对应的功能。
本发明实施例还提供一种计算机可读存储介质,存储有计算机可执行指令,所述计算机可执行指令被处理器执行时实现上述方法。
本发明实施例通过背置传感器触发对应的指令,在移动终端前置触摸屏显示对应的虚拟点,供用户参照所述虚拟点进行触摸操作,所述移动终端获取基于前置触摸屏的触摸信息,以根据所述触摸信息进行手势识别,并按照所述手势识别的结果执行对应的功能;通过背置传感器和前置触摸屏的配合,使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现多点触控,并完成对应的功能控制,使得用户无需双手操作移动终端进行多点触控,大大提高了用户体验。
在阅读并理解了附图和详细描述后,可以明白其他方面。
附图概述
图1为实现本发明实施例的移动终端的硬件结构示意图;
图2为如图1所示的移动终端的无线通信系统示意图;
图3为本发明基于背部传感器的多点触控装置的第一实施例的功能模块示意图;
图4为本发明基于背部传感器的多点触控装置的第二实施例的功能模块示意图;
图5为本发明基于背部传感器的多点触控装置的第三实施例的功能模块示意图;
图6为本发明实施例中移动终端识别出触摸手势为合拢手势时缩小所述前置触摸屏显示图像的示意图;
图7为本发明实施例中移动终端识别出触摸手势为分离手势时放大所述前置触摸屏显示图像的示意图;
图8为本发明基于背部传感器的多点触控方法的第一实施例的流程示意图;
图9为本发明基于背部传感器的多点触控方法的第二实施例的流程示意图;
图10为本发明基于背部传感器的多点触控方法的第三实施例的流程示意图。
应当理解,此处所描述的实施例仅仅用以解释本申请,并不用于限定本申请。
现在将参考附图描述实现本发明实施例的移动终端。在后续的描述中,使用用于表示元件的诸如"模块"、"部件"或"单元"的后缀仅为了有利于本发明实施例的说明,其本身并没有特定的意义。因此,"模块"与"部件"可以混合地使用。
移动终端可以以多种形式来实施。例如,本发明实施例中描述的终端可以包括诸如移动电话、智能电话、笔记本电脑、数字广播接收器、PDA(个人数字助理)、PAD(平板电脑)、PMP(便携式多媒体播放器)、导航装置等等的移动终端以及诸如数字TV、台式计算机等等的固定终端。下面,假设终端是移动终端。然而,本领域技术人员将理解的是,除了特别用于移动目的的元件之外,根据本发明的实施方式的构造也能够应用于固定类型的终端。
图1为实现本发明实施例的移动终端的硬件结构示意图。
移动终端100可以包括无线通信单元110、A/V(音频/视频)输入单元120、用户输入单元130、感测单元140、输出单元150、存储器160、接口单元170、控制器180和电源单元190等等。图1示出了具有多种组件的移动终端,但是应理解的是,并不要求实施所有示出的组件。可以替代地实施更多或更少的组件。将在下面详细描述移动终端的元件。
无线通信单元110通常包括一个或多个组件,其允许移动终端100与无线通信系统或网络之间的无线电通信。例如,无线通信单元可以包括广播接收模块111、移动通信模块112、无线互联网模块113、短程通信模块114和位置信息模块115中的至少一个。
广播接收模块111经由广播信道从外部广播管理服务器接收广播信号和/或广播相关信息。广播信道可以包括卫星信道和/或地面信道。广播管理服务器可以是生成并发送广播信号和/或广播相关信息的服务器或者接收之前生成的广播信号和/或广播相关信息并且将其发送给终端的服务器。广播信号可以包括TV广播信号、无线电广播信号、数据广播信号等等。而且,广播信号可以进一步包括与TV或无线电广播信号组合的广播信号。广播相关信息也可以经由移动通信网络提供,并且在该情况下,广播相关信息可以由移动通信模块112来接收。广播信号可以以多种形式存在,例如,其可以以数字多媒体广播(DMB)的电子节目指南(EPG)、数字视频广播手持(DVB-H)的电子服务指南(ESG)等等的形式而存在。广播接收模块111可以通过使用多种类型的广播系统接收信号广播。特别地,广播接收模块111可以通过使用诸如多媒体广播-地面(DMB-T)、数字多媒体广播-卫星(DMB-S)、数字视频广播-手持(DVB-H),前向链路媒体(MediaFLO@)的数据广播系统、地面数字广播综合服务(ISDB-T)等等的数字广播系统接收数字广播。广播接收模块111可以被构造为适合提供广播信号的广播系统以及上述数字广播系统。经由广播接收模块111接收的广播信号和/或广播相关信息可以存储在存储器160(或者其它类型的存储介质)中。
移动通信模块112将无线电信号发送到基站(例如,接入点、节点B等等)、外部终端以及服务器中的至少一个和/或从其接收无线电信号。这样的无线电信号可以包括语音通话信号、视频通话信号、或者根据文本和/或多
媒体消息发送和/或接收的多种类型的数据。
无线互联网模块113支持移动终端的无线互联网接入。该模块可以内部或外部地耦接到终端。该模块所涉及的无线互联网接入技术可以包括WLAN(无线LAN)(Wi-Fi)、Wibro(无线宽带)、Wimax(全球微波互联接入)、HSDPA(高速下行链路分组接入)等等。
短程通信模块114设置为支持短程通信。短程通信技术的一些示例包括蓝牙TM、射频识别(RFID)、红外数据协会(IrDA)、超宽带(UWB)、紫蜂TM等等。
位置信息模块115设置为检查或获取移动终端的位置信息。位置信息模块的典型示例是GPS(全球定位系统)。根据当前的技术,GPS模块115计算来自三个或更多卫星的距离信息和准确的时间信息并且对于计算的信息应用三角测量法,从而根据经度、纬度和高度准确地计算三维当前位置信息。当前,用于计算位置和时间信息的方法使用三颗卫星并且通过使用另外的一颗卫星校正计算出的位置和时间信息的误差。此外,GPS模块115能够通过实时地连续计算当前位置信息来计算速度信息。
A/V输入单元120设置为接收音频或视频信号。A/V输入单元120可以包括相机121和麦克风122,相机121对在视频捕获模式或图像捕获模式中由图像捕获装置获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元151上。经相机121处理后的图像帧可以存储在存储器160(或其它存储介质)中或者经由无线通信单元110进行发送,可以根据移动终端的构造提供两个或更多相机121。麦克风122可以在电话通话模式、记录模式、语音识别模式等等运行模式中经由麦克风接收声音(音频数据),并且能够将这样的声音处理为音频数据。处理后的音频(语音)数据可以在电话通话模式的情况下转换为可经由移动通信模块112发送到移动通信基站的格式输出。麦克风122可以通过噪声消除(或抑制)算法以消除(或抑制)在接收和发送音频信号的过程中产生的噪声或者干扰。
用户输入单元130可以根据用户输入的命令生成键输入数据以控制移动终端的操作。用户输入单元130允许用户输入多种类型的信息,并且可以包括键盘、锅仔片、触摸板(例如,检测由于被接触而导致的电阻、压力、电
容等等的变化的触敏组件)、滚轮、摇杆等等。特别地,当触摸板以层的形式叠加在显示单元151上时,可以形成触摸屏。
感测单元140检测移动终端100的当前状态,(例如,移动终端100的打开或关闭状态)、移动终端100的位置、用户对于移动终端100的接触(即,触摸输入)的有无、移动终端100的取向、移动终端100的加速或减速移动和方向等等,并且生成用于控制移动终端100的操作的命令或信号。例如,当移动终端100实施为滑动型移动电话时,感测单元140可以感测该滑动型电话是打开还是关闭。另外,感测单元140能够检测电源单元190是否提供电力或者接口单元170是否与外部装置耦接。感测单元140可以包括接近传感器141将在下面结合触摸屏来对此进行描述。
接口单元170用作至少一个外部装置与移动终端100连接可以通过的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。识别模块可以是存储用于验证用户使用移动终端100的多种信息并且可以包括用户识别模块(UIM)、客户识别模块(SIM)、通用客户识别模块(USIM)等等。另外,具有识别模块的装置(下面称为"识别装置")可以采取智能卡的形式,因此,识别装置可以经由端口或其它连接装置与移动终端100连接。接口单元170可以设置为接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到移动终端100内的一个或多个元件或者可以设置为在移动终端和外部装置之间传输数据。
另外,当移动终端100与外部底座连接时,接口单元170可以用作允许通过其将电力从底座提供到移动终端100的路径或者可以用作允许从底座输入的多种命令信号通过其传输到移动终端的路径。从底座输入的多种命令信号或电力可以用作用于识别移动终端是否准确地安装在底座上的信号。输出单元150被构造为以视觉、音频和/或触觉方式提供输出信号(例如,音频信号、视频信号、警报信号、振动信号等等)。输出单元150可以包括显示单元151、音频输出模块152、警报单元153等等。
显示单元151可以显示在移动终端100中处理的信息。例如,当移动
终端100处于电话通话模式时，显示单元151可以显示与通话或其它通信(例如，文本消息收发、多媒体文件下载等等)相关的用户界面(UI)或图形用户界面(GUI)。当移动终端100处于视频通话模式或者图像捕获模式时，显示单元151可以显示捕获的图像和/或接收的图像、示出视频或图像以及相关功能的UI或GUI等等。
同时,当显示单元151和触摸板以层的形式彼此叠加以形成触摸屏时,显示单元151可以用作输入装置和输出装置。显示单元151可以包括液晶显示器(LCD)、薄膜晶体管LCD(TFT-LCD)、有机发光二极管(OLED)显示器、柔性显示器、三维(3D)显示器等等中的至少一种。这些显示器中的一些可以被构造为透明状以允许用户从外部观看,这可以称为透明显示器,典型的透明显示器可以例如为TOLED(透明有机发光二极管)显示器等等。根据特定想要的实施方式,移动终端100可以包括两个或更多显示单元(或其它显示装置),例如,移动终端可以包括外部显示单元(未示出)和内部显示单元(未示出)。触摸屏可设置为检测触摸输入压力以及触摸输入位置和触摸输入面积。
音频输出模块152可以在移动终端处于呼叫信号接收模式、通话模式、记录模式、语音识别模式、广播接收模式等等模式下时,将无线通信单元110接收的或者在存储器160中存储的音频数据转换音频信号并且输出为声音。而且,音频输出模块152可以提供与移动终端100执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出模块152可以包括扬声器、蜂鸣器等等。
警报单元153可以提供输出以将事件的发生通知给移动终端100。典型的事件可以包括呼叫接收、消息接收、键信号输入、触摸输入等等。除了音频或视频输出之外,警报单元153可以以不同的方式提供输出以通知事件的发生。例如,警报单元153可以以振动的形式提供输出,当接收到呼叫、消息或一些其它进入通信(incoming communication)时,警报单元153可以提供触觉输出(即,振动)以将其通知给用户。通过提供这样的触觉输出,即使在用户的移动电话处于用户的口袋中时,用户也能够识别出事件的发生。警报单元153也可以经由显示单元151或音频输出模块152提供通知事件的
发生的输出。
存储器160可以存储由控制器180执行的处理和控制操作的软件程序等等,或者可以暂时地存储己经输出或将要输出的数据(例如,电话簿、消息、静态图像、视频等等)。而且,存储器160可以存储关于当触摸施加到触摸屏时输出的多种方式的振动和音频信号的数据。
存储器160可以包括至少一种类型的存储介质,所述存储介质包括闪存、硬盘、多媒体卡、卡型存储器(例如,SD或DX存储器等等)、随机访问存储器(RAM)、静态随机访问存储器(SRAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、可编程只读存储器(PROM)、磁性存储器、磁盘、光盘等等。而且,移动终端100可以与通过网络连接执行存储器160的存储功能的网络存储装置协作。
控制器180通常控制移动终端的总体操作。例如,控制器180执行与语音通话、数据通信、视频通话等等相关的控制和处理。另外,控制器180可以包括用于再现(或回放)多媒体数据的多媒体模块181,多媒体模块181可以构造在控制器180内,或者可以构造为与控制器180分离。控制器180可以执行模式识别处理,以将在触摸屏上执行的手写输入或者图片绘制输入识别为字符或图像。
电源单元190在控制器180的控制下接收外部电力或内部电力并且提供操作每个元件和组件所需的适当的电力。
这里描述的实施方式可以以使用例如计算机软件、硬件或其任何组合的计算机可读介质来实施。对于硬件实施,这里描述的实施方式可以通过使用特定用途集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理装置(DSPD)、可编程逻辑装置(PLD)、现场可编程门阵列(FPGA)、处理器、控制器、微控制器、微处理器、被设计为执行这里描述的功能的电子单元中的至少一种来实施,在一些情况下,这样的实施方式可以在控制器180中实施。对于软件实施,诸如过程或功能的实施方式可以与允许执行至少一种功能或操作的单独的软件模块来实施。软件代码可以由以任何适当的编程语言编写的软件应用程序(或程序)来实施,软件代码可以存储在存储器160中并且由控制器180执行。
至此，已经按照其功能描述了移动终端。下面，为了简要起见，将描述诸如折叠型、直板型、摆动型、滑动型移动终端等等的多种类型的移动终端中的滑动型移动终端作为示例。因此，本发明实施例能够应用于任何类型的移动终端，并且不限于滑动型移动终端。
如图1中所示的移动终端100可以被构造为利用经由帧或分组发送数据的诸如有线和无线通信系统以及基于卫星的通信系统来操作。
现在将参考图2描述其中根据本发明实施例的移动终端能够操作的通信系统。
这样的通信系统可以使用不同的空中接口和/或物理层。例如,由通信系统使用的空中接口包括例如频分多址(FDMA)、时分多址(TDMA)、码分多址(CDMA)和通用移动通信系统(UMTS)(特别地,长期演进(LTE))、全球移动通信系统(GSM)等等。作为非限制性示例,下面的描述涉及CDMA通信系统,但是这样的教导同样适用于其它类型的系统。
参考图2，CDMA无线通信系统可以包括多个移动终端100、多个基站(BS)270、基站控制器(BSC)275和移动交换中心(MSC)280。MSC 280被构造为与公共电话交换网络(PSTN)290形成接口。MSC 280还被构造为与可以经由回程线路耦接到基站270的BSC 275形成接口。回程线路可以根据多个已知的接口中的任一种来构造，所述接口包括例如E1/T1、ATM、IP、PPP、帧中继、HDSL、ADSL或xDSL。将理解的是，如图2中所示的系统可以包括多个BSC 275。
每个BS 270可以服务一个或多个分区(或区域),由多向天线或指向特定方向的天线覆盖的每个分区放射状地远离BS 270。或者,每个分区可以由用于分集接收的两个或更多天线覆盖。每个BS 270可以被构造为支持多个频率分配,并且每个频率分配具有特定频谱(例如,1.25MHz,5MHz等等)。
分区与频率分配的交叉可以被称为CDMA信道。BS 270也可以被称为基站收发器子系统(BTS)或者其它等效术语。在这样的情况下,术语"基站"可以用于笼统地表示单个BSC 275和至少一个BS 270。基站也可以被称为"蜂窝站"。或者,特定BS270的多个分区可以被称为多个蜂窝站。
如图2中所示,广播发射器(BT)295将广播信号发送给在系统内操作的移动终端100。如图1中所示的广播接收模块111被设置在移动终端100处以接收由BT 295发送的广播信号。在图2中,示出了几个全球定位系统(GPS)卫星300。卫星300帮助定位多个移动终端100中的至少一个。
在图2中,描绘了多个卫星300,但是可以理解的是,可以利用任何数目的卫星获得有用的定位信息。如图1中所示的GPS模块115通常被构造为与卫星300配合以获得想要的定位信息。替代GPS跟踪技术或者在GPS跟踪技术之外,可以使用可以跟踪移动终端的位置的其它技术。另外,至少一个GPS卫星300可以选择性地或者额外地处理卫星DMB传输。
作为无线通信系统的一个典型操作,BS 270接收来自移动终端100的反向链路信号。移动终端100通常参与通话、消息收发和其它类型的通信。特定基站270接收的每个反向链路信号被在特定BS 270内进行处理。获得的数据被转发给相关的BSC 275。BSC提供通话资源分配和包括BS 270之间的软切换过程的协调的移动管理功能。BSC 275还将接收到的数据路由到MSC 280,其提供用于与PSTN 290形成接口的额外的路由服务。类似地,PSTN 290与MSC 280形成接口,MSC与BSC 275形成接口,并且BSC275相应地控制BS270以将正向链路信号发送到移动终端100。
基于上述移动终端硬件结构以及通信系统,提出本发明基于背部传感器的多点触控装置的实施例。
参照图3,图3为本发明基于背部传感器的多点触控装置的第一实施例的功能模块示意图。
在本实施例中,所述基于背部传感器的多点触控装置包括显示模块10、获取模块20、识别模块30及执行模块40;
所述显示模块10,设置为:在侦测到基于背置传感器触发的开启指令时,在前置触摸屏显示对应的虚拟点,供用户根据所述虚拟点进行触摸操作;
所述背置传感器设置在所述移动终端背部,设置为:触发对应的指令以进入多点触控模式,或者触发对应的指令以退出多点触控模式。所述移动终端在侦测到基于所述背置传感器触发的开启指令时,进入多点触控模式,所述移动终端在前置触摸屏显示对应的虚拟点,以供用户根据所述虚拟点进行触控操作。所述背置传感器可以是诸如检测由于被触摸导致的电阻、压力、电容等变化的触敏组件,或者也可以是光学指纹传感器、半导体电容传感器、半导体热敏传感器、半导体压感传感器和超声波传感器等等传感器。
可以通过监测所述背置传感器的输出电平,在侦测到用户基于所述背置传感器的按下操作时,触发所述开启指令,以进入多点触控模式,在前置触摸屏显示对应的虚拟点,供用户基于所述前置触摸屏的虚拟点进行触摸操作;在侦测到用户基于所述背置传感器的松开操作时,触发所述退出指令,以退出多点触控模式,取消在前置触摸屏显示的所述虚拟点。
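A small state machine captures the press/release behaviour described in the preceding paragraph. The sketch below is an illustration only: the class names, the boolean event argument and the FrontScreenStub are invented for the example, since the text above only specifies the observable behaviour (a press enters multi-touch mode and shows the virtual point, a release exits the mode and removes it).

```python
class FrontScreenStub:
    """Stand-in for the front touch screen: only the two hooks used here."""
    def show_virtual_point(self):
        print("virtual point shown on the front touch screen")

    def hide_virtual_point(self):
        print("virtual point hidden")


class BackSensorMultiTouchMode:
    """Enter multi-touch mode on a back-sensor press, leave it on release."""
    def __init__(self, front_screen):
        self.front_screen = front_screen
        self.active = False

    def on_back_sensor(self, pressed: bool):
        if pressed and not self.active:          # open command
            self.active = True
            self.front_screen.show_virtual_point()
        elif not pressed and self.active:        # exit command
            self.active = False
            self.front_screen.hide_virtual_point()


mode = BackSensorMultiTouchMode(FrontScreenStub())
mode.on_back_sensor(True)    # press   -> multi-touch mode on, virtual point displayed
mode.on_back_sensor(False)   # release -> multi-touch mode off, virtual point removed
```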
可选的,所述基于背部传感器的多点触控装置还可以包括启动模块;所述启动模块,设置为:在侦测到终端进入多点触控界面时,启动所述背置传感器,并接收基于所述背置传感器触发的开启指令。所述多点触控界面可以是预设的需多点触控完成的操作界面,例如:图片查看界面,需要多点触控以完成图片的缩放;文本阅读界面,需要多点触控以完成文字大小的缩放等等。通过自动识别所述移动终端是否进入多点触控界面,并自动启动所述背置传感器,以基于所述背置传感器触发对应的开启指令,使得用户操作更加便捷,提高了用户体验。
所述前置触摸屏设置在所述移动终端正面,设置为:显示在移动终端中处理的信息,并接收用户的触摸输入,以控制移动终端的多种操作。所述前置触摸屏还可以设置为:获取用户触摸操作的触摸信息,以及通过图像显示移动终端中处理的信息。
所述虚拟点为多点触控操作时，显示在前置触摸屏的固定点，可以在背置传感器在所述前置触摸屏的投影位置，或者也可以是所述前置触摸屏的中心，所述虚拟点可以在前置触摸屏以半透明的圆形、菱形等等图形形式进行显示。所述虚拟点用于供用户以所述虚拟点为参照完成多点触控操作，例如：用户单指触摸所述前置触摸屏，并向远离所述虚拟点的方向滑动，以形成分离手势；或者，向接近所述虚拟点的方向滑动，以形成合拢手势。
所述获取模块20,设置为:通过所述前置触摸屏获取所述触摸操作的触摸信息;
所述移动终端通过所述前置触摸屏获取所述触摸操作的触摸信息。所述触摸操作为用户基于所述前置触摸屏完成的触摸操作。所述触摸信息用于对对应的触摸操作进行手势判断,所述触摸信息可以是所述触摸操作的触摸点的位置、触摸时间、触摸点的移动轨迹等等触摸信息。
所述识别模块30,设置为:根据所述触摸信息对所述触摸操作进行手势识别;
所述移动终端根据所述触摸信息对所述触摸操作进行手势识别。所述移动终端可以根据所述触摸操作相对所述虚拟点的移动方向,以判断所述触摸操作为分离操作或者为合拢操作;或者,也可以根据所述触摸操作的移动轨迹,根据所述移动轨迹是否为弧形,以判断所述触摸操作是否为旋转操作。
所述执行模块40,设置为:按照所述手势识别的结果执行对应的功能。
所述移动终端按照所述手势识别的结果执行对应的功能。所述手势识别的结果对应的功能，可以根据预设的手势与对应的功能的映射关系，执行对应的功能。例如：在识别出合拢手势时，执行缩小显示图像的功能；在识别出分离手势时，执行放大显示图像的功能。
可选的,所述显示模块10,还设置为:在侦测到基于所述背置传感器触发的退出指令时,取消在前置触摸屏显示的所述虚拟点。可以在侦测到用户基于所述背置传感器的按下操作时,触发开启指令,以进入多点触控模式,在前置触摸屏显示对应的虚拟点,供用户基于所述前置触摸屏的虚拟点进行触摸操作;在侦测到用户基于所述背置传感器的松开操作时,触发所述退出指令,以退出多点触控模式,取消在前置触摸屏显示的所述虚拟点。
本实施例通过背置传感器触发对应的指令，在移动终端前置触摸屏显示对应的虚拟点，供用户参照所述虚拟点进行触摸操作，所述移动终端获取基于前置触摸屏的触摸信息，以根据所述触摸信息进行手势识别，并按照所述手势识别的结果执行对应的功能；通过背置传感器和前置触摸屏的配合，使得用户在需要多点触控进行操作时，通过单手操作移动终端即可实现多点触控，并完成对应的功能控制，使得用户无需双手操作移动终端进行多点触控，大大提高了用户体验。
参照图4,图4为本发明基于背部传感器的多点触控装置的第二实施例的功能模块示意图。基于上述基于背部传感器的多点触控装置的第一实施例,所述识别模块30包括第一确定单元31、计算单元32及比对单元33;
所述第一确定单元31,设置为:根据所述触摸信息确定所述触摸操作的起点和终点;
所述计算单元32,设置为:计算所述起点与所述虚拟点的第一距离、所述终点与所述虚拟点的第二距离;
所述比对单元33,设置为:将所述第一距离与所述第二距离进行比对;
所述第一确定单元31,还设置为:若所述第一距离小于所述第二距离,则确定所述触摸操作为合拢手势;若所述第一距离大于所述第二距离,则确定所述触摸操作为分离手势;或者,若所述第一距离大于所述第二距离,则确定所述触摸操作为合拢手势;若所述第一距离小于所述第二距离,则确定所述触摸操作为分离手势。
所述移动终端根据所述触摸信息确定所述触摸操作的起点和终点;所述移动终端根据所述起点、所述终点及所述虚拟点的位置信息,计算所述起点与所述虚拟点的第一距离、所述终点与所述虚拟点的第二距离;将所述第一距离与所述第二距离进行比对;所述移动终端在所述第一距离小于所述第二距离时,确定所述触摸操作为合拢手势;在所述第一距离大于所述第二距离时,确定所述触摸操作为分离手势;或者,所述移动终端在所述第一距离大于所述第二距离时,确定所述触摸操作为合拢手势;在所述第一距离小于所述第二距离时,确定所述触摸操作为分离手势。所述移动终端还可以在所述第一距离等于所述第二距离时,确定所述触摸操作为无效手势。
本实施例通过前置触摸屏获取触摸操作的触摸信息，根据所述触摸信息确定触摸操作的起点和终点，根据所述起点和所述终点分别与所述虚拟点的距离进行手势识别，以判断所述触摸操作为合拢手势或者分离手势，并按照所述手势识别的结果执行对应的功能。使得用户在需要多点触控进行操作时，通过单手操作移动终端即可实现多点触控，并完成对应的功能控制，使得用户无需双手操作移动终端进行多点触控，大大提高了用户体验。
在其它实施例中,所述识别模块可包括第二确定单元;
第二确定单元,设置为:根据所述触摸信息确定所述触摸操作相对所述虚拟点的移动方向;根据所述触摸操作相对所述虚拟点的移动方向,确定所述触摸操作的手势。
其中,所述第二确定单元,设置为:若所述触摸操作相对所述虚拟点的移动方向为接近所述虚拟点,则确定所述触摸操作为合拢手势;若所述触摸操作相对所述虚拟点的移动方向为远离所述虚拟点,则确定所述触摸操作为分离手势。
本实施例通过前置触摸屏获取触摸操作的触摸信息,根据所述触摸信息确定触摸操作的移动方向,根据所述触摸操作相对所述虚拟点的移动方向,确定所述触摸操作的手势,以判断所述触摸操作为合拢手势或者分离手势,并按照所述手势识别的结果执行对应的功能。使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现多点触控,并完成对应的功能控制,使得用户无需双手操作移动终端进行多点触控,大大提高了用户体验。
在其它实施例中,所述识别模块可包括第三确定单元;
第三确定单元,设置为:根据所述触摸信息确定所述触摸操作的移动轨迹;根据所述触摸操作的移动轨迹,确定所述触摸操作的手势。
所述第三确定单元,设置为:若所述移动轨迹为弧形,则确定所述触摸操作为旋转手势。
本实施例通过前置触摸屏获取触摸操作的触摸信息，根据所述触摸信息确定触摸操作的移动轨迹，根据所述移动轨迹进行手势识别，以判断所述触摸操作是否为旋转手势，并按照所述手势识别的结果执行对应的功能。使得用户在需要多点触控进行操作时，通过单手操作移动终端即可实现多点触控，并完成对应的功能控制，使得用户无需双手操作移动终端进行多点触控，大大提高了用户体验。
参照图5,图5为本发明基于背部传感器的多点触控装置的第三实施例的功能模块示意图。基于上述基于背部传感器的多点触控装置的第二实施例,所述执行模块40包括缩小单元41及放大单元42;
所述缩小单元41,设置为:若所述触摸操作为合拢手势,则根据所述第一距离与所述第二距离的距离差确定缩小比例,将所述前置触摸屏显示的图像按照所述缩小比例进行缩小;
若所述移动终端识别出所述触摸操作为合拢手势,则将所述第一距离减去所述第二距离得到距离差,根据所述距离差确定对应的缩小比例,将所述前置触摸屏显示的图像按照所述缩小比例进行缩小。参照图6,图6为本发明实施例中移动终端识别出触摸手势为合拢手势时缩小所述前置触摸屏显示图像的示意图。
所述放大单元42,设置为:若所述触摸操作为分离手势,则根据所述第一距离与所述第二距离的距离差确定放大比例,将所述前置触摸屏显示的图像按照所述放大比例进行放大。
若所述移动终端识别出所述触摸操作为分离手势,则将所述第二距离减去所述第一距离得到距离差,根据所述距离差确定对应的放大比例,将所述前置触摸屏显示的图像按照所述放大比例进行放大。参照图7,图7为本发明实施例中移动终端识别出触摸手势为分离手势时放大所述前置触摸屏显示图像的示意图。
本实施例通过在识别出所述触摸操作为合拢手势时,根据所述触摸操作的起点与所述虚拟点的第一距离,所述触摸操作的终点与所述虚拟点的第二距离,所述第一距离与所述第二距离之间的距离差来确定缩放比例,按照对应的缩放比例进行对应的缩放操作,使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现需要多点触控,并执行对应的缩放功能,使得用户无需双手操作移动终端,大大提高了用户体验。
在其它实施例中,所述执行模块还可包括旋转单元;
所述旋转单元,设置为:若所述触摸操作为旋转手势,则根据所述移动轨迹确定旋转角度,将所述前置触摸屏显示的图像按照所述旋转角度进行旋转。
本实施例通过在识别出所述触摸操作为旋转手势时，根据所述移动轨迹确定旋转角度，将所述前置触摸屏显示的图像按照所述旋转角度进行旋转，使得用户在需要多点触控进行操作时，通过单手操作移动终端即可实现需要多点触控，并执行对应的旋转功能，使得用户无需双手操作移动终端，大大提高了用户体验。
本发明实施例还提供一种基于背部传感器的多点触控方法。
参照图8,图8为本发明基于背部传感器的多点触控方法的第一实施例的流程示意图。
在本实施例中,所述基于背部传感器的多点触控方法包括:
步骤S10,在侦测到基于背置传感器触发的开启指令时,在前置触摸屏显示对应的虚拟点,供用户根据所述虚拟点进行触摸操作;
所述背置传感器设置在所述移动终端背部,设置为:触发对应的指令以进入多点触控模式,或者触发对应的指令以退出多点触控模式。所述移动终端在侦测到基于所述背置传感器触发的开启指令时,进入多点触控模式,所述移动终端在前置触摸屏显示对应的虚拟点,以供用户根据所述虚拟点进行触控操作。所述背置传感器可以是诸如检测由于被触摸导致的电阻、压力、电容等变化的触敏组件,或者也可以是光学指纹传感器、半导体电容传感器、半导体热敏传感器、半导体压感传感器和超声波传感器等等传感器。
可以通过监测所述背置传感器的输出电平,在侦测到用户基于所述背置传感器的按下操作时,触发所述开启指令,以进入多点触控模式,在前置触摸屏显示对应的虚拟点,供用户基于所述前置触摸屏的虚拟点进行触摸操作;在侦测到用户基于所述背置传感器的松开操作时,触发所述退出指令,以退出多点触控模式,取消在前置触摸屏显示的所述虚拟点。
可选的,所述步骤S10之前,还可以在侦测到终端进入多点触控界面时,启动所述背置传感器,并接收基于所述背置传感器触发的开启指令。所述多点触控界面可以是预设的需多点触控完成的操作界面,例如:图片查看界面,需要多点触控以完成图片的缩放;文本阅读界面,需要多点触控以完成文字大小的缩放等等。通过自动识别所述移动终端是否进入多点触控界面,并自动启动所述背置传感器,以基于所述背置传感器触发对应的开启指令,使得用户操作更加便捷,提高了用户体验。
所述前置触摸屏设置在所述移动终端正面,设置为:显示在移动终端中处理的信息,并接收用户的触摸输入,以控制移动终端的多种操作。所述前置触摸屏还可以设置为:获取用户触摸操作的触摸信息,以及通过图像显示移动终端中处理的信息。
所述虚拟点为多点触控操作时,显示在前置触摸屏的固定点,可以在背置传感器在所述前置触摸屏的投影位置,或者也可以是所述前置触摸屏的中心,所述虚拟点可以在前置触摸屏以半透明的圆形、菱形等等图形形式进行显示。所述虚拟点用于供用户以所述虚拟点为参照完成多点触控操作,例如:用户单指触摸所述前置触摸屏,并向远离所述虚拟点的方向滑动,以形成分离手势;或者,向接近所述虚拟点的方向滑动,以形成合拢手势。
步骤S20,通过所述前置触摸屏获取所述触摸操作的触摸信息;
所述移动终端通过所述前置触摸屏获取所述触摸操作的触摸信息。所述触摸操作为用户基于所述前置触摸屏完成的触摸操作。所述触摸信息用于对对应的触摸操作进行手势判断,所述触摸信息可以是所述触摸操作的触摸点的位置、触摸时间、触摸点的移动轨迹等等触摸信息。
步骤S30,根据所述触摸信息对所述触摸操作进行手势识别;
所述移动终端根据所述触摸信息对所述触摸操作进行手势识别。所述移动终端可以根据所述触摸操作相对所述虚拟点的移动方向,以判断所述触摸操作为分离操作或者为合拢操作;或者,也可以根据所述触摸操作的移动轨迹,根据所述移动轨迹是否为弧形,以判断所述触摸操作是否为旋转操作。
步骤S40,按照所述手势识别的结果执行对应的功能。
所述移动终端按照所述手势识别的结果执行对应的功能。所述手势识别的结果对应的功能，可以根据预设的手势与对应的功能的映射关系，执行对应的功能。例如：在识别出合拢手势时，执行缩小显示图像的功能；在识别出分离手势时，执行放大显示图像的功能。
可选的,所述步骤S40之后,还可以在侦测到基于所述背置传感器触发的退出指令时,取消在前置触摸屏显示的所述虚拟点。可以在侦测到用户基于所述背置传感器的按下操作时,触发开启指令,以进入多点触控模式,在前置触摸屏显示对应的虚拟点,供用户基于所述前置触摸屏的虚拟点进行触摸操作;在侦测到用户基于所述背置传感器的松开操作时,触发所述退出指令,以退出多点触控模式,取消在前置触摸屏显示的所述虚拟点。
本实施例通过背置传感器触发对应的指令,在移动终端前置触摸屏显示对应的虚拟点,供用户参照所述虚拟点进行触摸操作,所述移动终端获取基于前置触摸屏的触摸信息,以根据所述触摸信息进行手势识别,并按照所述手势识别的结果执行对应的功能;通过背置传感器和前置触摸屏的配合,使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现多点触控,并完成对应的功能控制,使得用户无需双手操作移动终端进行多点触控,大大提高了用户体验。
参照图9,图9为本发明基于背部传感器的多点触控方法的第二实施例的流程示意图。基于上述基于背部传感器的多点触控方法的第一实施例,所述步骤S30包括:
步骤S31,根据所述触摸信息确定所述触摸操作的起点和终点;
步骤S32,计算所述起点与所述虚拟点的第一距离、所述终点与所述虚拟点的第二距离;
步骤S33,将所述第一距离与所述第二距离进行比对;
步骤S34,若所述第一距离大于所述第二距离,则确定所述触摸操作为合拢手势;
步骤S35，若所述第一距离小于所述第二距离，则确定所述触摸操作为分离手势。
其中,步骤S34和步骤S35也可以是:若所述第一距离小于所述第二距离,则确定所述触摸操作为合拢手势;若所述第一距离大于所述第二距离,则确定所述触摸操作为分离手势。
所述移动终端根据所述触摸信息确定所述触摸操作的起点和终点;所述移动终端根据所述起点、所述终点及所述虚拟点的位置信息,计算所述起点与所述虚拟点的第一距离、所述终点与所述虚拟点的第二距离;将所述第一距离与所述第二距离进行比对;所述移动终端在所述第一距离小于所述第二距离时,确定所述触摸操作为合拢手势;在所述第一距离大于所述第二距离时,确定所述触摸操作为分离手势;或者,所述移动终端在所述第一距离大于所述第二距离时,确定所述触摸操作为合拢手势;在所述第一距离小于所述第二距离时,确定所述触摸操作为分离手势。所述移动终端还可以在所述第一距离等于所述第二距离时,确定所述触摸操作为无效手势。
本实施例通过前置触摸屏获取触摸操作的触摸信息,根据所述触摸信息确定触摸操作的起点和终点,根据所述起点和所述终点分别与所述虚拟点的距离进行手势识别,以判断所述触摸操作为合拢手势或者分离手势,并按照所述手势识别的结果执行对应的功能。使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现多点触控,并完成对应的功能控制,使得用户无需双手操作移动终端进行多点触控,大大提高了用户体验。
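Steps S31 to S35 above amount to a comparison of two distances, sketched below for illustration. The point format, the tolerance used for the "equal distances" case and the returned labels are assumptions, and the sketch follows the first of the two mappings described (first distance greater than the second meaning a close gesture, smaller meaning a separate gesture, roughly equal distances treated as an invalid gesture).

```python
import math

def classify_by_distances(start, end, virtual_point, tolerance=1.0):
    """Compare the start-to-virtual-point distance (first distance) with the
    end-to-virtual-point distance (second distance) to decide the gesture."""
    vx, vy = virtual_point
    first = math.hypot(start[0] - vx, start[1] - vy)    # S32: first distance
    second = math.hypot(end[0] - vx, end[1] - vy)       # S32: second distance
    if first > second + tolerance:                      # S34: ended nearer -> close
        return "close"
    if first + tolerance < second:                      # S35: ended farther -> separate
        return "separate"
    return "invalid"                                    # (roughly) equal distances

print(classify_by_distances((300, 300), (150, 150), (100, 100)))  # -> close
print(classify_by_distances((150, 150), (300, 300), (100, 100)))  # -> separate
```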
在其它实施例中,所述步骤S30可包括:
步骤S34,根据所述触摸信息确定所述触摸操作相对所述虚拟点的移动方向;
步骤S35,根据所述触摸操作相对所述虚拟点的移动方向,确定所述触摸操作的手势。
其中,步骤S35可包括:
若所述触摸操作相对所述虚拟点的移动方向为接近所述虚拟点,则确定所述触摸操作为合拢手势;若所述触摸操作相对所述虚拟点的移动方向为远离所述虚拟点,则确定所述触摸操作为分离手势。
本实施例通过前置触摸屏获取触摸操作的触摸信息,根据所述触摸信息确定触摸操作的移动方向,根据所述触摸操作相对所述虚拟点的移动方向,确定所述触摸操作的手势,以判断所述触摸操作为合拢手势或者分离手势,并按照所述手势识别的结果执行对应的功能。使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现多点触控,并完成对应的功能控制,使得用户无需双手操作移动终端进行多点触控,大大提高了用户体验。
在其它实施例中,所述步骤S30可包括:
步骤S36,根据所述触摸信息确定所述触摸操作的移动轨迹;
步骤S37,根据所述触摸操作的移动轨迹,确定所述触摸操作的手势。
其中,步骤S37可包括:
若所述移动轨迹为弧形,则确定所述触摸操作为旋转手势。
本实施例通过前置触摸屏获取触摸操作的触摸信息,根据所述触摸信息确定触摸操作的移动轨迹,根据所述移动轨迹进行手势识别,以判断所述触摸操作是否为旋转手势,并按照所述手势识别的结果执行对应的功能。使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现多点触控,并完成对应的功能控制,使得用户无需双手操作移动终端进行多点触控,大大提高了用户体验。
参照图10,图10为本发明基于背部传感器的多点触控方法的第三实施例的流程示意图。基于上述基于背部传感器的多点触控方法的第二实施例,所述步骤S40包括:
步骤S41,若所述触摸操作为合拢手势,则根据所述第一距离与所述第二距离的距离差确定缩小比例,将所述前置触摸屏显示的图像按照所述缩小比例进行缩小;
若所述移动终端识别出所述触摸操作为合拢手势，则将所述第一距离减去所述第二距离得到距离差，根据所述距离差确定对应的缩小比例，将所述前置触摸屏显示的图像按照所述缩小比例进行缩小。参照图6，图6为本发明实施例中移动终端识别出触摸手势为合拢手势时缩小所述前置触摸屏显示图像的示意图。
步骤S42,若所述触摸操作为分离手势,则根据所述第一距离与所述第二距离的距离差确定放大比例,将所述前置触摸屏显示的图像按照所述放大比例进行放大。
若所述移动终端识别出所述触摸操作为分离手势,则将所述第二距离减去所述第一距离得到距离差,根据所述距离差确定对应的放大比例,将所述前置触摸屏显示的图像按照所述放大比例进行放大。参照图7,图7为本发明实施例中移动终端识别出触摸手势为分离手势时放大所述前置触摸屏显示图像的示意图。
本实施例通过在识别出所述触摸操作为合拢手势时,根据所述触摸操作的起点与所述虚拟点的第一距离,所述触摸操作的终点与所述虚拟点的第二距离,所述第一距离与所述第二距离之间的距离差来确定缩放比例,按照对应的缩放比例进行对应的缩放操作,使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现需要多点触控,并执行对应的缩放功能,使得用户无需双手操作移动终端,大大提高了用户体验。
在其它实施例中,所述步骤S40还可包括:
步骤S43,若所述触摸操作为旋转手势,则根据所述移动轨迹确定旋转角度,将所述前置触摸屏显示的图像按照所述旋转角度进行旋转。
本实施例通过在识别出所述触摸操作为旋转手势时，根据所述移动轨迹确定旋转角度，将所述前置触摸屏显示的图像按照所述旋转角度进行旋转，使得用户在需要多点触控进行操作时，通过单手操作移动终端即可实现需要多点触控，并执行对应的旋转功能，使得用户无需双手操作移动终端，大大提高了用户体验。
需要说明的是，在本文中，术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含，从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素，而且还包括没有明确列出的其他要素，或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下，由语句“包括一个……”限定的要素，并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
上述本发明实施例序号仅仅为了描述,不代表实施例的优劣。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本发明实施例的技术方案可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括多个指令用以使得一台终端设备(可以是手机,计算机,服务器,空调器,或者网络设备等)执行本发明实施例所述的方法。
以上仅为本发明的可选实施例,并非因此限制本申请的专利范围,凡是利用本申请说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本申请的专利保护范围内。
本发明实施例通过背置传感器和前置触摸屏的配合,使得用户在需要多点触控进行操作时,通过单手操作移动终端即可实现多点触控,并完成对应的功能控制,使得用户无需双手操作移动终端进行多点触控,大大提高了用户体验。
Claims (20)
- 一种多点触控装置,包括:显示模块,设置为:在侦测到基于背置传感器触发的开启指令时,在前置触摸屏显示对应的虚拟点,供用户根据所述虚拟点进行触摸操作;获取模块,设置为:通过所述前置触摸屏获取所述触摸操作的触摸信息;识别模块,设置为:根据所述触摸信息对所述触摸操作进行手势识别;执行模块,设置为:按照所述手势识别的结果执行对应的功能。
- 如权利要求1所述的多点触控装置,其中,所述识别模块包括第一确定单元、计算单元及比对单元;所述第一确定单元,设置为:根据所述触摸信息确定所述触摸操作的起点和终点;所述计算单元,设置为:计算所述起点与所述虚拟点的第一距离、所述终点与所述虚拟点的第二距离;所述比对单元,设置为:将所述第一距离与所述第二距离进行比对;所述第一确定单元,还设置为:若所述第一距离小于所述第二距离,则确定所述触摸操作为合拢手势;若所述第一距离大于所述第二距离,则确定所述触摸操作为分离手势;或者,若所述第一距离大于所述第二距离,则确定所述触摸操作为合拢手势;若所述第一距离小于所述第二距离,则确定所述触摸操作为分离手势。
- 如权利要求2所述的多点触控装置,其中,所述执行模块包括缩小单元及放大单元;所述缩小单元,设置为:若所述触摸操作为合拢手势,则根据所述第一距离与所述第二距离的距离差确定缩小比例,将所述前置触摸屏显示的图像按照所述缩小比例进行缩小;所述放大单元,设置为:若所述触摸操作为分离手势,则根据所述第一距离与所述第二距离的距离差确定放大比例,将所述前置触摸屏显示的图像 按照所述放大比例进行放大。
- 如权利要求1所述的多点触控装置,其中,所述显示模块,还设置为:在侦测到基于所述背置传感器触发的退出指令时,取消在前置触摸屏显示的所述虚拟点。
- 如权利要求1所述的多点触控装置,还包括启动模块;所述启动模块,设置为:在侦测到终端进入多点触控界面时,启动所述背置传感器,并接收基于所述背置传感器触发的开启指令。
- 如权利要求1所述的多点触控装置,其中,所述识别模块包括第二确定单元;第二确定单元,设置为:根据所述触摸信息确定所述触摸操作相对所述虚拟点的移动方向;根据所述触摸操作相对所述虚拟点的移动方向,确定所述触摸操作的手势。
- 如权利要求6所述的多点触控装置,其中,所述第二确定单元,设置为:若所述触摸操作相对所述虚拟点的移动方向为接近所述虚拟点,则确定所述触摸操作为合拢手势;若所述触摸操作相对所述虚拟点的移动方向为远离所述虚拟点,则确定所述触摸操作为分离手势。
- 如权利要求1所述的多点触控装置,其中,所述识别模块包括第三确定单元;第三确定单元,设置为:根据所述触摸信息确定所述触摸操作的移动轨迹;根据所述触摸操作的移动轨迹,确定所述触摸操作的手势。
- 如权利要求8所述的多点触控装置,其中,第三确定单元,设置为:若所述移动轨迹为弧形,则确定所述触摸操作为旋转手势。
- 如权利要求9所述的多点触控装置,其中,所述执行模块包括旋转单元;所述旋转单元,设置为:若所述触摸操作为旋转手势,则根据所述移动 轨迹确定旋转角度,将所述前置触摸屏显示的图像按照所述旋转角度进行旋转。
- 一种多点触控方法,包括:在侦测到基于背置传感器触发的开启指令时,在前置触摸屏显示对应的虚拟点,供用户根据所述虚拟点进行触摸操作;通过所述前置触摸屏获取所述触摸操作的触摸信息;根据所述触摸信息对所述触摸操作进行手势识别;按照所述手势识别的结果执行对应的功能。
- 如权利要求11所述的多点触控方法,其中,所述根据所述触摸信息对所述触摸操作进行手势识别的步骤包括:根据所述触摸信息确定所述触摸操作的起点和终点;计算所述起点与所述虚拟点的第一距离、所述终点与所述虚拟点的第二距离;将所述第一距离与所述第二距离进行比对;若所述第一距离小于所述第二距离,则确定所述触摸操作为合拢手势;若所述第一距离大于所述第二距离,则确定所述触摸操作为分离手势或者,若所述第一距离大于所述第二距离,则确定所述触摸操作为合拢手势;若所述第一距离小于所述第二距离,则确定所述触摸操作为分离手势。
- 如权利要求12所述的多点触控方法,其中,所述按照所述手势识别的结果执行对应的功能的步骤包括:若所述触摸操作为合拢手势,则根据所述第一距离与所述第二距离的距离差确定缩小比例,将所述前置触摸屏显示的图像按照所述缩小比例进行缩小;若所述触摸操作为分离手势,则根据所述第一距离与所述第二距离的距离差确定放大比例,将所述前置触摸屏显示的图像按照所述放大比例进行放大。
- 如权利要求11所述的多点触控方法,其中,所述按照所述手势识 别的结果执行对应的功能的步骤之后,还包括:在侦测到基于所述背置传感器触发的退出指令时,取消在前置触摸屏显示的所述虚拟点。
- 如权利要求11所述的多点触控方法,其中,所述在侦测到基于背置传感器触发的开启指令时,在前置触摸屏显示对应的虚拟点,供用户根据所述虚拟点进行触摸操作的步骤之前,还包括:在侦测到终端进入多点触控界面时,启动所述背置传感器,并接收基于所述背置传感器触发的开启指令。
- 如权利要求11所述的多点触控方法,其中,所述根据所述触摸信息对所述触摸操作进行手势识别的步骤包括:根据所述触摸信息确定所述触摸操作相对所述虚拟点的移动方向;根据所述触摸操作相对所述虚拟点的移动方向,确定所述触摸操作的手势。
- 如权利要求16所述的多点触控方法,其中,所述根据所述触摸操作相对所述虚拟点的移动方向,确定所述触摸操作的手势的步骤包括:若所述触摸操作相对所述虚拟点的移动方向为接近所述虚拟点,则确定所述触摸操作为合拢手势;若所述触摸操作相对所述虚拟点的移动方向为远离所述虚拟点,则确定所述触摸操作为分离手势。
- 如权利要求11所述的多点触控方法,其中,所述根据所述触摸信息对所述触摸操作进行手势识别的步骤包括:根据所述触摸信息确定所述触摸操作的移动轨迹;根据所述触摸操作的移动轨迹,确定所述触摸操作的手势。
- 如权利要求18所述的多点触控方法,其中,所述根据所述触摸操作的移动轨迹,确定所述触摸操作的手势的步骤包括:若所述移动轨迹为弧形,则确定所述触摸操作为旋转手势。
- 如权利要求19所述的多点触控方法,其中,所述按照所述手势识别 的结果执行对应的功能的步骤包括:若所述触摸操作为旋转手势,则根据所述移动轨迹确定旋转角度,将所述前置触摸屏显示的图像按照所述旋转角度进行旋转。
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510535462.8 | 2015-08-27 | ||
| CN201510535462.8A CN105159579A (zh) | 2015-08-27 | 2015-08-27 | 基于背部传感器的多点触控装置及方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017032217A1 true WO2017032217A1 (zh) | 2017-03-02 |
Family
ID=54800453
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2016/093964 Ceased WO2017032217A1 (zh) | 2015-08-27 | 2016-08-08 | 多点触控装置及方法 |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN105159579A (zh) |
| WO (1) | WO2017032217A1 (zh) |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105159579A (zh) * | 2015-08-27 | 2015-12-16 | 努比亚技术有限公司 | 基于背部传感器的多点触控装置及方法 |
| CN105653153A (zh) * | 2015-12-24 | 2016-06-08 | 努比亚技术有限公司 | 移动终端应用的字体同步方法及装置 |
| CN106095264B (zh) * | 2016-05-26 | 2019-10-01 | 努比亚技术有限公司 | 特效显示装置及方法 |
| CN106227451A (zh) * | 2016-07-26 | 2016-12-14 | 维沃移动通信有限公司 | 一种移动终端的操作方法及移动终端 |
| CN106325525A (zh) * | 2016-09-27 | 2017-01-11 | 成都西可科技有限公司 | 一种热敏唤醒的终端和热敏唤醒终端的方法 |
| CN107715454B (zh) * | 2017-09-01 | 2018-12-21 | 网易(杭州)网络有限公司 | 信息处理方法、装置、电子设备及存储介质 |
| CN109918007A (zh) * | 2019-01-25 | 2019-06-21 | 努比亚技术有限公司 | 一种终端显示方法、终端及计算机可读存储介质 |
| US11385784B2 (en) * | 2019-01-31 | 2022-07-12 | Citrix Systems, Inc. | Systems and methods for configuring the user interface of a mobile device |
| CN109885372A (zh) * | 2019-02-25 | 2019-06-14 | 努比亚技术有限公司 | 穿戴式设备的主界面切换方法、穿戴式设备及存储介质 |
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014041441A (ja) * | 2012-08-21 | 2014-03-06 | Itsuo Kumazawa | 背面触覚情報提示型情報入力装置およびその方法 |
| CN104123024A (zh) * | 2013-04-27 | 2014-10-29 | 华为技术有限公司 | 一种终端设备及设备控制方法 |
| CN104460851A (zh) * | 2013-09-24 | 2015-03-25 | 深圳桑菲消费通信有限公司 | 一种双面触控电子设备以及一种触控方法 |
| CN104331182A (zh) * | 2014-03-06 | 2015-02-04 | 广州三星通信技术研究有限公司 | 具有辅助触摸屏的便携式终端 |
| CN105159579A (zh) * | 2015-08-27 | 2015-12-16 | 努比亚技术有限公司 | 基于背部传感器的多点触控装置及方法 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112204511A (zh) * | 2018-08-31 | 2021-01-08 | 深圳市柔宇科技股份有限公司 | 输入控制方法及电子装置 |
| CN113721911A (zh) * | 2021-08-25 | 2021-11-30 | 网易(杭州)网络有限公司 | 虚拟场景的显示比例的控制方法、介质和设备 |
| CN113721911B (zh) * | 2021-08-25 | 2023-09-26 | 网易(杭州)网络有限公司 | 虚拟场景的显示比例的控制方法、介质和设备 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN105159579A (zh) | 2015-12-16 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16838477; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.07.2018) |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 16838477; Country of ref document: EP; Kind code of ref document: A1 |