
WO2020015529A1 - Control method for terminal device, and terminal device - Google Patents

Control method for terminal device, and terminal device

Info

Publication number
WO2020015529A1
WO2020015529A1 (PCT/CN2019/094532; CN2019094532W)
Authority
WO
WIPO (PCT)
Prior art keywords
sliding
terminal device
interface
target virtual
virtual key
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2019/094532
Other languages
English (en)
Chinese (zh)
Inventor
曹春阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Publication of WO2020015529A1 publication Critical patent/WO2020015529A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • Embodiments of the present disclosure relate to the technical field of terminals, and in particular, to a method for controlling a terminal device and a terminal device.
  • A full-screen mobile phone in the related technology cancels the physical keys (such as the Home key, the return key, and the menu key) below the screen of the mobile phone.
  • Instead, virtual keys having the same functions as the physical keys are displayed in place of the physical keys.
  • However, when the mobile phone displays an application interface in full screen, the mobile phone does not display the above-mentioned virtual keys.
  • The mobile phone only calls up the virtual keys when it detects the user's sliding operation on the edge of the screen, and then executes the function corresponding to the virtual key selected by the user.
  • As a result, the operation of triggering the functions corresponding to the virtual keys is complicated and difficult to perform.
  • In a first aspect, an embodiment of the present disclosure provides a method for controlling a terminal device. The method includes: receiving a sliding input of a user on a first interface when the terminal device displays the first interface in full screen; and in response to the sliding input, executing a function corresponding to a target virtual key when the sliding track of the sliding input has the same abscissa or ordinate as preset coordinates of the target virtual key.
  • The target virtual key is not displayed on the first interface, and the preset coordinates are the coordinates of the target virtual key on the display screen when the target virtual key is displayed on the display screen.
  • An embodiment of the present disclosure further provides a terminal device, including:
  • a receiving module, configured to receive a sliding input of a user on a first interface when the terminal device displays the first interface in full screen; and
  • an execution module, configured to, in response to the sliding input received by the receiving module, execute a function corresponding to a target virtual key when the sliding track of the sliding input has the same abscissa or ordinate as preset coordinates of the target virtual key;
  • where the target virtual key is not displayed on the first interface, and the preset coordinates are the coordinates of the target virtual key on the display screen when the target virtual key is displayed on the display screen.
  • An embodiment of the present disclosure further provides a terminal device including a processor, a memory, and a computer program stored on the memory and executable on the processor. When the computer program is executed by the processor, the steps of the method for controlling a terminal device according to the first aspect are implemented.
  • An embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the steps of the method for controlling a terminal device according to the first aspect are implemented.
  • FIG. 1 is a schematic architecture diagram of an Android operating system according to an embodiment of the present disclosure
  • FIG. 2 is a schematic flowchart of a method for controlling a terminal device according to an embodiment of the present disclosure
  • FIG. 3 is a first schematic diagram of an interface to which a method for controlling a terminal device according to an embodiment of the present disclosure is applied
  • FIG. 4 is a second schematic diagram of an interface to which a method for controlling a terminal device according to an embodiment of the present disclosure is applied
  • FIG. 5 is a third schematic diagram of an interface to which a method for controlling a terminal device according to an embodiment of the present disclosure is applied
  • FIG. 6 is a fourth schematic diagram of an interface to which a method for controlling a terminal device according to an embodiment of the present disclosure is applied
  • FIG. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure.
  • FIG. 8 is a second schematic structural diagram of a terminal device according to an embodiment of the present disclosure.
  • In the embodiments of the present disclosure, words such as "first" and "second" are used to distinguish identical or similar items having substantially the same function or use.
  • Those skilled in the art can understand that the words "first" and "second" do not limit the quantity or the execution order.
  • the terminal device may be a mobile terminal device or a non-mobile terminal device.
  • A mobile terminal device may be a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.;
  • a non-mobile terminal device may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine; the embodiments of the present disclosure are not specifically limited thereto.
  • the terminal device in the embodiment of the present disclosure is a terminal device having an under-screen fingerprint recognition function.
  • The execution subject of the method for controlling a terminal device provided in the embodiments of the present disclosure may be the above-mentioned terminal device (including a mobile terminal device or a non-mobile terminal device), or may be a functional module and/or functional entity in the terminal device capable of implementing the method; this may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
  • In the following, a terminal device is taken as an example to describe the method for controlling a terminal device provided by the embodiments of the present disclosure.
  • the terminal in the embodiment of the present disclosure may be a terminal having an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present disclosure.
  • In the following, an Android operating system is used as an example to introduce the software environment to which the method for controlling a terminal device provided by the embodiments of the present disclosure is applied.
  • FIG. 1 is a schematic architecture diagram of a possible Android operating system provided by an embodiment of the present disclosure.
  • the architecture of the Android operating system includes 4 layers, respectively: an application program layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, it can be a Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application, and developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime library layer includes a library (also called a system library) and an Android operating system operating environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest layer of the Android operating system software layer.
  • the kernel layer is based on the Linux kernel and provides core system services and hardware-related drivers for the Android operating system.
  • In the embodiments of the present disclosure, a developer can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the method for controlling a terminal device provided by the embodiments of the present disclosure, so that the method can run based on the Android operating system shown in FIG. 1. That is, the processor or the terminal device can implement the method for controlling a terminal device provided by the embodiments of the present disclosure by running the software program in the Android operating system.
  • The method for controlling a terminal device according to the embodiments of the present disclosure is described below with reference to the flowchart shown in FIG. 2.
  • As shown in FIG. 2, the method for controlling a terminal device provided by the embodiments of the present disclosure specifically includes step 201 and step 202:
  • Step 201: When the terminal device displays the first interface in full screen, the terminal device receives a sliding input from the user on the first interface.
  • the above-mentioned first interface may be any running interface of any application installed in the terminal device, or may be a functional interface of any function in the terminal device, which is not limited in this disclosure.
  • Step 202: In response to the sliding input, the terminal device executes a function corresponding to the target virtual key when the sliding track of the sliding input has the same abscissa or ordinate as the preset coordinates of the target virtual key.
  • the target virtual key is not displayed on the first interface
  • the preset coordinates are the coordinates on the display screen of the target virtual key when the target virtual key is displayed on the display screen of the terminal device.
  • The above-mentioned target virtual key may be a control hidden in a side task bar of the terminal device (for example, a control in a task bar that is hidden after the user slides upward, downward, leftward, or rightward from the edge of the screen of the terminal device), or a hidden navigation control (for example, a hidden virtual start key (Home key), virtual return key, or virtual menu key that would otherwise be displayed at the bottom of the interface).
  • For example, as shown in FIG. 3, the terminal device can display the playback interface of "Video 1" (that is, interface 31 in FIG. 3) in full screen.
  • By sliding on the interface, the user can call up and display the virtual "Back" button (that is, 32 in FIG. 4), the virtual "Menu" button (that is, 33 in FIG. 4), and the virtual "Home" button (that is, 34 in FIG. 4).
  • The preset coordinates of the above-mentioned virtual "Back" button are the coordinates of the virtual "Back" button in FIG. 4, the preset coordinates of the above-mentioned virtual "Menu" button are the coordinates of the virtual "Menu" button in FIG. 4, and the preset coordinates of the above-mentioned virtual "Home" button are the coordinates of the virtual "Home" button in FIG. 4.
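  • The following Kotlin sketch illustrates the core of step 202 described above: each hidden virtual key keeps the preset coordinates it would occupy if it were displayed, and a sliding input whose track shares an abscissa or ordinate with those coordinates triggers the key's function without the key ever being shown. This is an illustrative sketch, not code from the patent: the data types, the pixel tolerance that stands in for "the same" coordinate, and the helper names are assumptions.

```kotlin
import kotlin.math.abs

// One point of a sliding track, in screen pixels.
data class TrackPoint(val x: Float, val y: Float)

// A virtual key that is currently hidden: it keeps the coordinates it would have on the
// display screen if it were displayed, plus the function it corresponds to.
data class HiddenVirtualKey(
    val name: String,
    val presetX: Float,
    val presetY: Float,
    val action: () -> Unit
)

// "Same" abscissa or ordinate is approximated here with a pixel tolerance (an assumption);
// steps A2/A3 below replace this with an angle-based criterion.
fun trackMatchesKey(track: List<TrackPoint>, key: HiddenVirtualKey, tolerancePx: Float = 24f): Boolean =
    track.any { p -> abs(p.x - key.presetX) <= tolerancePx || abs(p.y - key.presetY) <= tolerancePx }

// Step 202: execute the function of the first hidden key whose preset coordinates match the track.
// If several keys match, a real implementation would disambiguate, for example by the
// X-coordinate regions shown in the FIG. 6 example later in the description.
fun onSlidingInput(track: List<TrackPoint>, hiddenKeys: List<HiddenVirtualKey>) {
    hiddenKeys.firstOrNull { trackMatchesKey(track, it) }?.action?.invoke()
}
```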
  • Optionally, when the terminal device executes step 202, step 202 may also be implemented by the following step:
  • Step A1: In response to the sliding input, in a case where the sliding track of the sliding input has the same abscissa or ordinate as the preset coordinates of the target virtual key and the sliding direction of the sliding input matches a preset sliding direction, the terminal device executes the function corresponding to the target virtual key.
  • The above-mentioned preset sliding direction is the sliding direction of the sliding input performed by the user when the target virtual key is to be displayed on the first interface.
  • That is, when the user wants to display the target virtual key on the first interface, the user can perform a corresponding sliding input on the first interface.
  • For example, as shown in FIG. 5 (a), after the user slides from the first edge (b1 in FIG. 5) of the first interface toward the second edge (b2 in FIG. 5), the target virtual keys (such as the "Menu" button, the "Home" button, and the "Back" button) are displayed at the first edge, as shown in FIG. 5 (b).
  • The trigger condition mentioned above includes: the sliding direction of the sliding input matches the preset direction, where the preset direction is the preset sliding direction in step A1.
  • Matching the sliding direction of the sliding input with the preset direction specifically means that the sliding direction of the sliding input is the same as the preset direction, or that the included angle between the sliding direction of the sliding input and the preset direction is less than or equal to a first threshold. For example, if the included angle between the sliding direction of the sliding input and the preset direction is less than or equal to 45 degrees, the sliding direction of the sliding input is considered to match the preset direction.
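  • As a sketch of the direction check in step A1, the snippet below treats the sliding direction as matching the preset sliding direction when the smallest angle between the two directions does not exceed the first threshold (45 degrees in the example above). The function names and the way the direction is derived from the track endpoints are illustrative assumptions, not the patent's implementation.

```kotlin
import kotlin.math.abs
import kotlin.math.atan2

// Direction of a swipe, in degrees, from its start point to its end point.
fun slideDirectionDeg(startX: Float, startY: Float, endX: Float, endY: Float): Double =
    Math.toDegrees(atan2((endY - startY).toDouble(), (endX - startX).toDouble()))

// The sliding direction "matches" the preset direction when the smallest angle between the
// two directions is less than or equal to the first threshold.
fun matchesPresetDirection(slideDeg: Double, presetDeg: Double, firstThresholdDeg: Double = 45.0): Boolean {
    var diff = abs(slideDeg - presetDeg) % 360.0
    if (diff > 180.0) diff = 360.0 - diff
    return diff <= firstThresholdDeg
}
```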
  • Optionally, before the above-described step 202, the method further includes the following content:
  • Step A2: In a case where the included angle between the sliding track of the sliding input and the horizontal axis of the preset coordinates of the target virtual key is less than or equal to the first threshold, the terminal device determines that the ordinate of the sliding track of the sliding input is the same as the ordinate of the preset coordinates of the target virtual key; and/or,
  • Step A3: In a case where the included angle between the sliding track of the sliding input and the vertical axis of the preset coordinates of the target virtual key is less than or equal to the first threshold, the terminal device determines that the abscissa of the sliding track of the sliding input is the same as the abscissa of the preset coordinates of the target virtual key.
  • For example, when the included angle between the sliding track of the sliding input and the horizontal axis of the preset coordinates is less than or equal to 45 degrees, the ordinate of the sliding track is considered to be the same as the ordinate of the preset coordinates; when the included angle between the sliding track of the sliding input and the vertical axis of the preset coordinates is less than or equal to 45 degrees, the abscissa of the sliding track is considered to be the same as the abscissa of the preset coordinates.
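  • The snippet below sketches steps A2 and A3: a track whose angle to the horizontal axis is within the first threshold is treated as having the same ordinate as the preset coordinates, and a track whose angle to the vertical axis is within the threshold is treated as having the same abscissa. Only the angle condition stated above is modeled; whether the track must additionally pass near the preset coordinates is not spelled out here, and the names and values are assumptions.

```kotlin
import kotlin.math.abs
import kotlin.math.atan2

const val FIRST_THRESHOLD_DEG = 45.0

// Angle between the overall track direction (dx, dy) and the horizontal axis, in [0, 90] degrees.
fun angleToHorizontalAxisDeg(dx: Float, dy: Float): Double =
    Math.toDegrees(atan2(abs(dy).toDouble(), abs(dx).toDouble()))

// Step A2: a near-horizontal track counts as having the same ordinate as the preset coordinates.
fun ordinateSameAsPreset(dx: Float, dy: Float): Boolean =
    angleToHorizontalAxisDeg(dx, dy) <= FIRST_THRESHOLD_DEG

// Step A3: a near-vertical track counts as having the same abscissa as the preset coordinates.
fun abscissaSameAsPreset(dx: Float, dy: Float): Boolean =
    90.0 - angleToHorizontalAxisDeg(dx, dy) <= FIRST_THRESHOLD_DEG
```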
  • Example 1: The method for controlling the terminal device provided by this embodiment is described below by way of example.
  • In this example, the X axis of the terminal device is taken to be the direction along the two long sides of the terminal device, and the Y axis is taken to be the direction along the two short sides of the terminal device.
  • the terminal device displays a playback interface of “Video 1” (ie, interface 31 in FIG. 3) in full screen.
  • The virtual "Back" button of a terminal device in the related technology is usually displayed in the bottom right area of the display interface; that is, the X-axis coordinate range of the virtual "Back" button corresponds to area A in FIG. 6 (that is, a1 in FIG. 6). Therefore, if the user wants to exit the playback interface of "Video 1" (that is, interface 31 in FIG. 6), the user can slide horizontally along the X axis to the left within area A in FIG. 6; that is, when the user's sliding track on interface 31 is the same as "sliding track 1" in FIG. 6, the current playback interface of "Video 1" is exited.
  • The virtual "Menu" button of a terminal device in the related technology is usually displayed in the bottom middle area of the display interface; that is, the X-axis coordinate range of the virtual "Menu" button corresponds to area B in FIG. 6 (that is, a2 in FIG. 6). Therefore, if the user wants to call up the menu interface, the user can slide horizontally along the X axis to the left within area B in FIG. 6; that is, when the user's sliding track on interface 31 is the same as "sliding track 2" in FIG. 6, the menu interface is displayed.
  • The virtual "Home" button of a terminal device in the related technology is usually displayed in the bottom middle area of the display interface; that is, the X-axis coordinate range of the virtual "Home" button corresponds to area C in FIG. 6 (that is, a3 in FIG. 6). Therefore, if the user wants to return to the main interface of the terminal device, the user can slide horizontally along the X axis to the left within area C in FIG. 6; that is, when the user's sliding track on interface 31 is the same as "sliding track 3" in FIG. 6, the main interface of the terminal device is displayed.
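  • As a sketch of the FIG. 6 example, the snippet below routes a horizontal swipe to "Back", "Menu" or "Home" according to which X-coordinate region (area A, B or C) the swipe starts in. The screen width, the region boundaries and their ordering are invented for illustration; the patent only identifies the areas as a1, a2 and a3 in FIG. 6.

```kotlin
enum class NavAction { BACK, MENU, HOME }

// One X-coordinate region and the hidden virtual key it stands for.
data class KeyRegion(val xStart: Float, val xEnd: Float, val action: NavAction)

// Hypothetical layout for a screen that is 2340 pixels wide along the X axis.
val keyRegions = listOf(
    KeyRegion(1560f, 2340f, NavAction.BACK),  // area A: where the hidden "Back" key would sit
    KeyRegion(780f, 1560f, NavAction.MENU),   // area B: hidden "Menu" key
    KeyRegion(0f, 780f, NavAction.HOME)       // area C: hidden "Home" key
)

// A horizontal swipe starting at startX triggers the action of the region it falls into, if any.
fun actionForHorizontalSwipe(startX: Float): NavAction? =
    keyRegions.firstOrNull { startX >= it.xStart && startX < it.xEnd }?.action
```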
  • It should be noted that the sliding direction of the above sliding tracks (that is, along the X axis) is only an example; the sliding direction of the sliding track may also be along the Y axis, and this disclosure does not limit this.
  • In this way, when the terminal device displays the first interface in full screen, the terminal device receives a user's sliding input on the first interface, and executes the function corresponding to the target virtual key if the sliding track of the sliding input has the same abscissa or ordinate as the preset coordinates of the target virtual key. Therefore, when the target virtual key is not displayed on the first interface, there is no need to call up the target virtual key first; that is, without further displaying the target virtual key on the first interface, the function corresponding to the target virtual key can be directly executed, which reduces the complexity of the user's operation and improves the user's operation efficiency.
  • The embodiments of the present disclosure may be implemented in at least one of the following two manners.
  • In the first manner, the terminal device may enable the under-screen fingerprint collection function of the terminal device, so as to determine whether a sliding input is used to trigger a virtual key function by collecting the fingerprint of the finger with which the user triggers the sliding input.
  • Specifically, the above step 201 includes the following content: the terminal device receives a sliding input of a first finger of the user on the first interface. Based on this, the terminal device may implement the following process when executing step 202: in response to the sliding input, in a case where the sliding track of the sliding input has the same abscissa or ordinate as the preset coordinates of the target virtual key and the fingerprint of the first finger matches a preset fingerprint, the terminal device executes the function corresponding to the target virtual key.
  • The fingerprint of the first finger may be part or all of the fingerprint of the first finger.
  • When receiving the above-mentioned sliding input, the terminal device collects the fingerprint of the first finger that triggered the sliding input.
  • The fingerprint entered by the user may be detected by a fingerprint collection device integrated in the terminal device, or may be detected by a fingerprint collection device integrated in an external device, which then sends the detected fingerprint to the terminal device; in the latter case, a communication connection is established between the external device and the terminal device.
  • Matching the fingerprint of the first finger with the preset fingerprint means that the fingerprint of the first finger is the same as the preset fingerprint, or that the similarity between the fingerprint of the first finger and the preset fingerprint is greater than or equal to a preset threshold.
  • For example, the preset threshold may be set to 95%; that is, if the similarity between the fingerprint of the first finger and the preset fingerprint is greater than or equal to 95%, the fingerprint of the first finger collected by the terminal device is considered to match the preset fingerprint.
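  • The snippet below sketches the fingerprint gate of this first manner: the hidden key's function is executed only when the sliding track matches the key and the fingerprint collected from the sliding finger matches the preset fingerprint, using the 95% similarity threshold from the example above. The byte-array fingerprint representation and the injected similarity function are assumptions; no real fingerprint API is implied.

```kotlin
const val FINGERPRINT_SIMILARITY_THRESHOLD = 0.95

// "Matches" means identical or sufficiently similar, as described above.
fun fingerprintMatches(
    collected: ByteArray,
    preset: ByteArray,
    similarity: (ByteArray, ByteArray) -> Double
): Boolean =
    collected.contentEquals(preset) || similarity(collected, preset) >= FINGERPRINT_SIMILARITY_THRESHOLD

// Step 202 with the fingerprint condition added: both checks must pass before the
// function corresponding to the target virtual key is executed.
fun executeIfTrackAndFingerprintMatch(
    trackMatchesKey: Boolean,
    collected: ByteArray,
    preset: ByteArray,
    similarity: (ByteArray, ByteArray) -> Double,
    keyAction: () -> Unit
) {
    if (trackMatchesKey && fingerprintMatches(collected, preset, similarity)) {
        keyAction()
    }
}
```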
  • Example 2: It is assumed that the user has pre-entered an index finger fingerprint in the terminal device.
  • As shown in FIG. 3, after the user opens the application "Video 1" and searches for "Video 1", the terminal device can display the playback interface of "Video 1" in full screen (that is, interface 31 in FIG. 3). At this time, after the user's index finger touches the playback interface of "Video 1", the terminal device matches the collected index finger fingerprint with the pre-stored index finger fingerprint. If the two match, the function corresponding to the virtual key is executed according to the user's subsequent sliding track.
  • For the specific execution process, refer to the description in Example 1, which is not repeated here.
  • In the second manner, the terminal device determines whether the sliding input of one touch point is used to trigger the function of a virtual key by detecting whether at least two touch points appear on the display screen at the same time.
  • Specifically, before the above-described step 202, the method further includes the following step:
  • Step B1: The terminal device receives a pressing input from the user on the first interface.
  • Based on this, step 201 specifically includes the following process: the terminal device receives a sliding input of the user on the first interface during the process of receiving the pressing input. That is, while the user is performing the pressing input on the first interface, the user also performs a sliding input on the first interface.
  • The above-mentioned pressing input is a continuous pressing input whose position does not change, or a pressing input whose movement displacement along its movement track is smaller than a second threshold.
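  • The snippet below sketches this second manner: a sliding input counts as a virtual-key trigger only while a second, essentially stationary pressing input is being held on the first interface, that is, a pointer whose total displacement stays below the second threshold. The pointer bookkeeping and the threshold value are assumptions for illustration.

```kotlin
import kotlin.math.sqrt

const val SECOND_THRESHOLD_PX = 20f

// Minimal record of one touch point from its down position to its latest position.
data class TouchPointer(
    val id: Int,
    val downX: Float,
    val downY: Float,
    var lastX: Float,
    var lastY: Float
) {
    fun displacement(): Float {
        val dx = lastX - downX
        val dy = lastY - downY
        return sqrt(dx * dx + dy * dy)
    }
}

// True when at least two touch points are present and one of them is a (near-)stationary
// pressing input while another one is sliding.
fun isPressAndSlide(pointers: List<TouchPointer>): Boolean {
    if (pointers.size < 2) return false
    val pressing = pointers.any { it.displacement() < SECOND_THRESHOLD_PX }
    val sliding = pointers.any { it.displacement() >= SECOND_THRESHOLD_PX }
    return pressing && sliding
}
```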
  • FIG. 7 is a schematic diagram of a possible structure of a terminal device provided by an embodiment of the present disclosure.
  • As shown in FIG. 7, the terminal device 400 includes a receiving module 401 and an execution module 402, where:
  • the receiving module 401 is configured to receive a sliding input of a user on the first interface when the terminal device displays the first interface in a full screen.
  • the execution module 402 is configured to, in response to the sliding input received by the receiving module 401, execute a function corresponding to the target virtual key when the sliding track of the sliding input has the same abscissa or ordinate as the preset coordinates of the target virtual key.
  • the target virtual key is not displayed on the first interface, and the preset coordinates are the coordinates on the display screen when the target virtual key is displayed on the display screen.
  • Optionally, the receiving module 401 is specifically configured to receive a sliding input of a first finger of the user on the first interface; and the execution module 402 is specifically configured to execute the function corresponding to the target virtual key in a case where the fingerprint of the first finger matches a preset fingerprint.
  • Optionally, the receiving module 401 is further configured to receive a pressing input of the user on the first interface; and the receiving module 401 is further configured to receive the sliding input of the user on the first interface during the process of receiving the pressing input.
  • Optionally, the execution module 402 is specifically configured to execute the function corresponding to the target virtual key when the sliding direction of the sliding input received by the receiving module 401 matches a preset sliding direction, where the preset sliding direction is the sliding direction of the sliding input performed by the user when the target virtual key is to be displayed on the first interface.
  • Optionally, the terminal device 400 further includes a determining module 403, where:
  • the determining module 403 is configured to determine that the ordinate of the sliding track is the same as the ordinate of the preset coordinates when the included angle between the sliding track and the horizontal axis of the preset coordinates is less than or equal to the first threshold, and to determine that the abscissa of the sliding track is the same as the abscissa of the preset coordinates when the included angle between the sliding track and the vertical axis of the preset coordinates is less than or equal to the first threshold.
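  • A compact Kotlin sketch of the module split described for the terminal device 400 is given below. The interfaces and the wiring between them are illustrative assumptions; the patent only names the receiving, execution and determining modules and the conditions they evaluate.

```kotlin
// A sliding track, simplified here to its overall displacement along the two axes.
data class SlideTrack(val dx: Float, val dy: Float)

interface DeterminingModule {
    fun ordinateSameAsPreset(track: SlideTrack): Boolean
    fun abscissaSameAsPreset(track: SlideTrack): Boolean
}

interface ExecutionModule {
    fun executeTargetKeyFunction()
}

// The receiving module hands each sliding input to the other modules: if either coordinate of
// the track is determined to be "the same" as the preset coordinates, the execution module
// runs the function corresponding to the target virtual key (step 202).
class TerminalDevice400(
    private val determining: DeterminingModule,
    private val execution: ExecutionModule
) {
    fun onSlidingInputReceived(track: SlideTrack) {
        if (determining.ordinateSameAsPreset(track) || determining.abscissaSameAsPreset(track)) {
            execution.executeTargetKeyFunction()
        }
    }
}
```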
  • In this way, when the terminal device displays the first interface in full screen, the terminal device receives a sliding input of the user on the first interface, and executes the function corresponding to the target virtual key when the sliding track of the sliding input has the same abscissa or ordinate as the preset coordinates of the target virtual key. Therefore, when the target virtual key is not displayed on the first interface, there is no need to further call up the target virtual key; that is, without further displaying the target virtual key on the first interface, the function corresponding to the target virtual key can be directly executed, which reduces the user's operation complexity and improves the user's operation efficiency.
  • the terminal device provided in the embodiment of the present disclosure can implement the processes implemented by the terminal device in the foregoing method embodiments. To avoid repetition, details are not described herein again.
  • FIG. 8 is a schematic diagram of a hardware structure of a terminal device that implements various embodiments of the present disclosure.
  • The terminal device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111.
  • Those skilled in the art can understand that the terminal device 100 may include more or fewer components than shown in the figure, combine some components, or use a different arrangement of components.
  • the terminal device 100 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a car terminal, a wearable device, and a pedometer.
  • The user input unit 107 is configured to receive a sliding input of the user on the first interface when the terminal device displays the first interface in full screen; the processor 110 is configured to, in response to the sliding input received by the user input unit 107, execute the function corresponding to the target virtual key if the sliding track of the sliding input has the same abscissa or ordinate as the preset coordinates of the target virtual key; where the target virtual key is not displayed on the first interface, and the preset coordinates are the coordinates of the target virtual key on the display screen when the target virtual key is displayed on the display screen.
  • In this way, when the terminal device displays the first interface in full screen, the terminal device receives a sliding input of the user on the first interface, and executes the function corresponding to the target virtual key when the sliding track of the sliding input has the same abscissa or ordinate as the preset coordinates of the target virtual key. Therefore, when the target virtual key is not displayed on the first interface, there is no need to further call up the target virtual key; that is, without further displaying the target virtual key on the first interface, the function corresponding to the target virtual key can be directly executed, which reduces the user's operation complexity and improves the user's operation efficiency.
  • It should be understood that, in the embodiments of the present disclosure, the radio frequency unit 101 may be used to receive and send signals during the transmission and reception of information or during a call. Specifically, downlink data from a base station is received and then handed to the processor 110 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
  • the terminal device 100 provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive email, browse web pages, and access streaming media.
  • the audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into audio signals and output them as sound. Moreover, the audio output unit 103 may also provide audio output (for example, a call signal receiving sound, a message receiving sound, etc.) related to a specific function performed by the terminal device 100.
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used for receiving audio or video signals.
  • The input unit 104 may include a graphics processor (GPU) 1041 and a microphone 1042.
  • the graphics processor 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frames may be displayed on the display unit 106.
  • the image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • In the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
  • the terminal device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • The proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear.
  • As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes).
  • The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
  • the display unit 106 is configured to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 107 may be configured to receive inputted numeric or character information and generate key signal inputs related to user settings and function control of the terminal device 100.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • The touch panel 1071, also known as a touch screen, can collect the user's touch operations on or near it (such as operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts, a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110.
  • In addition, the touch panel 1071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, and details are not described herein again.
  • the touch panel 1071 may be overlaid on the display panel 1061.
  • After the touch panel 1071 detects a touch operation on or near it, the touch operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • Although in FIG. 8 the touch panel 1071 and the display panel 1061 are shown as two independent components to implement the input and output functions of the terminal device 100, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device 100, which is not specifically limited here.
  • the interface unit 108 is an interface through which an external device is connected to the terminal device 100.
  • The external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • The interface unit 108 may be used to receive an input (for example, data information or power) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transfer data between the terminal device 100 and an external device.
  • the memory 109 may be used to store software programs and various data.
  • The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data (such as audio data or a phone book) created according to the use of the mobile phone.
  • In addition, the memory 109 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 110 is the control center of the terminal device 100. It connects various parts of the entire terminal device 100 by using various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, so as to monitor the terminal device 100 as a whole.
  • The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, the user interface, an application program, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
  • The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to various components. Optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system.
  • the terminal device 100 includes some functional modules that are not shown, and details are not described herein again.
  • Optionally, an embodiment of the present disclosure further provides a terminal device, including a processor, a memory, and a computer program stored on the memory and executable on the processor. When the computer program is executed by the processor, each process of the foregoing embodiment of the method for controlling a terminal device is implemented, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • An embodiment of the present disclosure also provides a computer-readable storage medium.
  • A computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the processes of the foregoing embodiment of the method for controlling a terminal device are implemented, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • the computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
  • The methods in the above embodiments can be implemented by means of software plus a necessary universal hardware platform, and of course also by hardware, but in many cases the former is a better implementation.
  • Based on such an understanding, the technical solution of the present disclosure, essentially or in the part contributing to the related technology, can be embodied in the form of a software product. The software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or a CD-ROM) and includes a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to execute the methods described in the embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a method for controlling a terminal device and a terminal device, which are applied to the technical field of terminals and used to solve the problem in the related technology that, when a terminal device displays an operating interface of an application program in full screen, the operation of calling up a virtual key is complicated and difficult to perform. The method includes: receiving a sliding input of a user on a first interface when a terminal device displays the first interface in full screen; and in response to the sliding input, executing a function corresponding to a target virtual key when a sliding track of the sliding input has the same abscissa or ordinate as preset coordinates of the target virtual key, where the target virtual key is not displayed on the first interface, and the preset coordinates are the coordinates of the target virtual key on a display screen when the target virtual key is displayed on the display screen.
PCT/CN2019/094532 2018-07-16 2019-07-03 Control method for terminal device, and terminal device Ceased WO2020015529A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810780053.8A CN109189514B (zh) 2018-07-16 2018-07-16 一种终端设备的控制方法及终端设备
CN201810780053.8 2018-07-16

Publications (1)

Publication Number Publication Date
WO2020015529A1 true WO2020015529A1 (fr) 2020-01-23

Family

ID=64936665

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/094532 Ceased WO2020015529A1 (fr) Control method for terminal device, and terminal device

Country Status (2)

Country Link
CN (1) CN109189514B (fr)
WO (1) WO2020015529A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109189514B (zh) * 2018-07-16 2021-03-12 维沃移动通信有限公司 一种终端设备的控制方法及终端设备
CN110851810A (zh) * 2019-10-31 2020-02-28 维沃移动通信有限公司 响应方法及电子设备
CN111158577B (zh) * 2019-12-31 2021-10-01 奇安信科技集团股份有限公司 远程操作处理方法及装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834436A (zh) * 2015-05-10 2015-08-12 汪念鸿 一种实现触摸屏终端功能键的方法
CN105005409A (zh) * 2015-06-30 2015-10-28 汪念鸿 一种利用多点手势实现触摸屏终端功能键的方法
US20150363001A1 (en) * 2014-06-13 2015-12-17 Thomas Malzbender Techniques For Using Gesture Recognition To Effectuate Character Selection
CN107291226A (zh) * 2017-06-12 2017-10-24 深圳市创梦天地科技股份有限公司 基于触摸手势的控制方法及装置、终端
CN107728923A (zh) * 2017-10-20 2018-02-23 维沃移动通信有限公司 一种操作的处理方法及移动终端
CN109189514A (zh) * 2018-07-16 2019-01-11 维沃移动通信有限公司 一种终端设备的控制方法及终端设备

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104991638A (zh) * 2015-05-21 2015-10-21 努比亚技术有限公司 一种输入操作装置
CN105159491A (zh) * 2015-08-26 2015-12-16 深圳市金立通信设备有限公司 一种触控屏控制方法及终端
CN107808084A (zh) * 2017-10-18 2018-03-16 维沃移动通信有限公司 一种触控操作方法及移动终端
CN108108110A (zh) * 2017-12-11 2018-06-01 维沃移动通信有限公司 一种屏幕控制方法、屏幕控制装置及移动终端
CN108256304A (zh) * 2017-12-25 2018-07-06 深圳信炜科技有限公司 电子设备
CN108090337A (zh) * 2017-12-25 2018-05-29 深圳信炜科技有限公司 电子设备的指纹识别方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150363001A1 (en) * 2014-06-13 2015-12-17 Thomas Malzbender Techniques For Using Gesture Recognition To Effectuate Character Selection
CN104834436A (zh) * 2015-05-10 2015-08-12 汪念鸿 一种实现触摸屏终端功能键的方法
CN105005409A (zh) * 2015-06-30 2015-10-28 汪念鸿 一种利用多点手势实现触摸屏终端功能键的方法
CN107291226A (zh) * 2017-06-12 2017-10-24 深圳市创梦天地科技股份有限公司 基于触摸手势的控制方法及装置、终端
CN107728923A (zh) * 2017-10-20 2018-02-23 维沃移动通信有限公司 一种操作的处理方法及移动终端
CN109189514A (zh) * 2018-07-16 2019-01-11 维沃移动通信有限公司 一种终端设备的控制方法及终端设备

Also Published As

Publication number Publication date
CN109189514B (zh) 2021-03-12
CN109189514A (zh) 2019-01-11

Similar Documents

Publication Publication Date Title
CN109542282B (zh) 一种界面显示方法及终端设备
WO2021218902A1 (fr) Procédé et appareil de commande d'affichage et dispositif électronique
WO2020258929A1 (fr) Procédé de commutation d'interface de dossier et dispositif terminal
WO2020019979A1 (fr) Procédé de commande d'affichage et terminal
CN111163260B (zh) 一种摄像头启动方法及电子设备
WO2020173235A1 (fr) Procédé de commutation de tâches et dispositif terminal
WO2020057257A1 (fr) Procédé de basculement d'interface d'application et terminal mobile
US11526320B2 (en) Multi-screen interface control method and terminal device
WO2021057290A1 (fr) Procédé de commande d'informations et dispositif électronique
WO2021104193A1 (fr) Procédé d'affichage d'interface et dispositif électronique
CN109857495A (zh) 一种显示控制方法及终端设备
CN109829707B (zh) 一种界面显示方法及终端设备
CN110109604A (zh) 一种应用界面显示方法及移动终端
CN107728923A (zh) 一种操作的处理方法及移动终端
WO2021098696A1 (fr) Procédé de commande tactile et dispositif électronique
CN108874906B (zh) 一种信息推荐方法及终端
CN108762606B (zh) 一种屏幕解锁方法及终端设备
WO2020078234A1 (fr) Procédé de commande d'affichage et terminal
WO2020192297A1 (fr) Procédé de commutation d'interface d'écran et dispositif terminal
CN108984099B (zh) 一种人机交互方法及终端
WO2020015529A1 (fr) Control method for terminal device, and terminal device
CN110096203A (zh) 一种截图方法及移动终端
CN111190517B (zh) 分屏显示方法及电子设备
WO2021036529A1 (fr) Procédé de commande et dispositif terminal
CN111338525A (zh) 一种电子设备的控制方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19838599

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19838599

Country of ref document: EP

Kind code of ref document: A1