WO2017113379A1 - Menu display method for a user interface, and handheld terminal - Google Patents
Menu display method for a user interface, and handheld terminal
- Publication number
- WO2017113379A1 (PCT/CN2015/100296)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control
- handheld terminal
- displayed
- determining
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- the present invention relates to the field of electronic technologies, and in particular, to a menu display method for a user interface and a handheld terminal.
- Handheld terminals have become an indispensable part of people's daily lives, and their importance is evident from any perspective.
- The trend in handheld terminals is toward ever larger screens, while the size of the human palm is fixed.
- As a result, many handheld terminals require two-handed operation for the user to reach every control on the screen; yet the user sometimes has to free one hand for other tasks and can operate the mobile terminal with only one hand.
- In that case, the area a finger can reach is limited and cannot cover the entire screen.
- Moreover, many users now play audio and video files on handheld terminals. To provide a better viewing experience, handheld terminals generally offer a full-screen immersive mode.
- In full-screen immersive mode, the system menus (the status bar and virtual keys) and the application menus are all dynamically hidden.
- The application running in full-screen immersive mode can use the entire screen space (that is, its content is displayed full screen on the terminal's display unit), giving the user a cleaner, more refreshing experience.
- When the immersive state is exited, immersive menus (including system menus and application menus) appear at the top and bottom of the phone screen. Users can control the current application or system, and trigger the corresponding functions, by operating these immersive menus.
- Because the immersive menus are generally placed at the top and/or bottom of the display, part of the menu area is always hard to reach when the user operates the terminal with one hand, so the menus are inconvenient to operate.
- The present invention provides a menu display method for a user interface and a handheld terminal.
- The method and apparatus provided by the present invention solve the problem in the prior art that an unreasonable menu display method for the user interface makes operation inconvenient for the user.
- a menu display method for a user interface comprising:
- When the handheld terminal, in full-screen immersive mode, detects a first touch operation that satisfies a first preset condition, acquiring the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode, where the to-be-displayed overlay interface, when displayed, is superimposed on the content currently displayed by the handheld terminal;
- when it is determined that the user is currently holding the handheld terminal with one hand, determining a display reference point corresponding to the one-handed holding manner, where the distance between the display reference point and the side of the handheld terminal held by the user is less than a set threshold, and the holding manner includes a left-hand grip or a right-hand grip;
- determining the display reference point corresponding to the one-handed holding manner includes: determining the side of the handheld terminal held by the user, and determining a point on the handheld terminal as the display reference point based on that side;
- determining a point on the handheld terminal as the display reference point based on the side may further include: acquiring the touch track of the first touch operation, determining the sliding direction of the touch track from the position of its end point relative to its start point, and determining a point on the handheld terminal as the display reference point based on the side and the sliding direction;
- determining the side of the handheld terminal held by the user includes: providing a touch sensor on the side of the handheld terminal, and determining the held side when the touch sensor on that side detects a touch signal;
- determining the controls to be moved from among the controls includes: detecting the distance between the to-be-displayed position of each control and the display reference point and, when that distance is greater than the set threshold, taking the control as a control to be moved; or outputting the to-be-displayed overlay interface, detecting whether there is a second touch operation that satisfies a second preset condition, and determining the controls to be moved according to the second touch operation;
- after the to-be-displayed overlay interface is replaced with the new overlay interface, the method further includes: when the user operates a first control among the position-adjusted controls, acquiring the current display coordinates of the first control in the new overlay interface; determining, according to the current display coordinates, the corresponding original coordinates of the first control in the to-be-displayed overlay interface; and calling the corresponding function according to the original coordinates;
- optionally, the method further includes: displaying the controls to be moved in the new overlay interface as a floating control.
- a handheld terminal comprising:
- an input unit, configured to detect, when the handheld terminal is in full-screen immersive mode, whether there is a first touch operation that satisfies a first preset condition;
- a processor, configured to: if there is a first touch operation that satisfies the first preset condition, acquire the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode; determine the controls displayed in the to-be-displayed overlay interface; when it is determined that the user is currently holding the handheld terminal with one hand, determine a display reference point corresponding to the one-handed holding manner; determine controls to be moved from among the controls; and adjust the display position of the controls to be moved, generate a new overlay interface according to the adjusted control positions, and replace the to-be-displayed overlay interface with the new overlay interface;
- in the new overlay interface, the distance between the display position of a moved control and the display reference point is less than a set threshold, and a moved control causes the handheld terminal to perform the same function before and after the user's position adjustment; the to-be-displayed overlay interface, when displayed, is superimposed on the current display content; the distance between the display reference point and the side of the handheld terminal held by the user is less than a set threshold, and the holding manner includes a left-hand grip or a right-hand grip;
- the processor determining the display reference point corresponding to the one-handed holding manner includes determining the side of the handheld terminal held by the user, and determining a point on the handheld terminal as the display reference point based on that side;
- the processor determining a point on the handheld terminal as the display reference point based on the side specifically includes acquiring the touch track of the first touch operation, determining the sliding direction of the touch track according to the position of its end point relative to its start point, and determining a point on the handheld terminal as the display reference point based on the side and the sliding direction;
- the input unit is further configured to detect a touch signal, so that the processor determines, according to the touch signal, the side of the handheld terminal held by the user;
- the processor determining the controls to be moved from among the controls includes: detecting the distance between the to-be-displayed position of each control and the display reference point and, when that distance is greater than the set threshold, determining the control to be a control to be moved; or outputting the to-be-displayed overlay interface, detecting whether there is a second touch operation that satisfies a second preset condition, and determining the controls to be moved from among the controls according to the second touch operation;
- after the processor replaces the to-be-displayed overlay interface with the new overlay interface, when the user operates a first control among the position-adjusted controls, the current display coordinates of the first control in the new overlay interface are acquired, the corresponding original coordinates of the first control in the to-be-displayed overlay interface are determined according to the current display coordinates, and the corresponding function is called according to the original coordinates;
- the processor is further configured to display the controls to be moved in the new overlay interface as a floating control.
- With the method and device provided by the embodiments of the present invention, after the user taps the screen, the menus/buttons/options that the application would normally display are moved to an area that is easy to reach with a finger: the operating system processes the to-be-displayed overlay interface and then presents it.
- A moved control has the same function as the corresponding control before the move; when the user taps the moved control, the function of the control at its original position is triggered normally. The menus provided by the embodiments of the invention are therefore easier for the user to reach, and the interface remains clean and attractive.
- FIG. 1 is a schematic flowchart of a menu display method for a user interface according to an embodiment of the present invention;
- FIG. 2 is a schematic comparison of the controls to be moved before and after they are moved when the user holds the handheld terminal in the right hand, according to an embodiment of the present invention;
- FIG. 3 is a schematic comparison of the controls to be moved before and after they are moved when the user holds the handheld terminal in the left hand, according to an embodiment of the present invention;
- FIG. 4 is a schematic comparison of the controls before and after the user scrolls them with a sliding operation, according to an embodiment of the present invention;
- FIG. 5 is a schematic diagram of determining the position to which controls are moved from a touch track, according to an embodiment of the present invention;
- FIG. 6 and FIG. 7 are schematic diagrams of moved controls displayed in floating form, according to an embodiment of the present invention;
- FIG. 8 is a schematic structural diagram of a handheld terminal according to an embodiment of the present invention;
- FIG. 9 is a schematic structural diagram of another handheld terminal according to an embodiment of the present invention.
- In the prior art, an application in full-screen immersive mode displays its content full screen without showing any menus, buttons, or options (nor the status bar and navigation bar).
- When the user taps the screen, the application overlays the menus, buttons, or options to be displayed on top of the current full-screen content for the user to use.
- Based on this characteristic of full-screen immersive mode, and in order to move the menus/buttons/options the application would display to an area the user's finger can easily reach after the user taps the screen, an embodiment of the present invention provides a menu display method for a user interface. The method processes the overlay interface that is to be superimposed on the current display interface and then superimposes the processed interface on the current display. As shown in FIG. 1, the method provided by the embodiment of the present invention includes the following steps:
- Step 101: When the handheld terminal, in full-screen immersive mode, detects a first touch operation that satisfies a first preset condition, acquire the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode, where the to-be-displayed overlay interface, when displayed, is superimposed on the current display content.
- In this embodiment, the first touch operation that satisfies the first preset condition may be a preset operation that the handheld terminal can recognize. In addition, the controls concerned are controls in the application menu.
- Before the application's overlay interface is shown on the screen, the system first loads the to-be-displayed overlay interface into memory. Only after loading is complete can the system parse the to-be-displayed overlay interface to determine the controls in the interface, and the system can then obtain information about the controls, including each control's ID, name, position, and size.
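- As an illustrative sketch only (the patent does not prescribe any particular data model), the control information parsed from the loaded overlay interface could be represented roughly as below; the class and field names are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical record of one control parsed from the to-be-displayed overlay interface. */
class ControlInfo {
    final String id;              // control ID
    final String name;            // control name
    double x, y;                  // to-be-displayed position (screen coordinates)
    final double width, height;   // control size

    ControlInfo(String id, String name, double x, double y, double width, double height) {
        this.id = id; this.name = name;
        this.x = x; this.y = y;
        this.width = width; this.height = height;
    }
}

/** Sketch: after loading and parsing the overlay interface, the system exposes its control list. */
class OverlayInterface {
    final List<ControlInfo> controls = new ArrayList<>();

    void add(ControlInfo control) { controls.add(control); }
}
```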
- Step 102: Determine the controls displayed in the to-be-displayed overlay interface.
- In the existing immersive mode, all controls associated with the currently displayed content are hidden to give the user the best viewing experience. In general, when the user operates the terminal, the application corresponding to the current display places all of its controls in the overlay interface. Based on this, the solution provided by the present invention can adjust the positions of the controls before the overlay interface is displayed, so that the controls are easier to operate after adjustment.
- To keep the interface clean, each control is displayed only once in the overlay interface, whether or not its position is adjusted.
- Step 103: When it is determined that the user is currently holding the handheld terminal with one hand, determine a display reference point corresponding to the one-handed holding manner, where the distance between the display reference point and the side of the handheld terminal held by the user is less than a set threshold, and the holding manner includes a left-hand grip or a right-hand grip.
- The display reference point is used to position the controls after they are moved, so in order to move controls that are hard to reach to a position the user can operate comfortably, the display reference point can be placed where the user holds the terminal. The display reference point corresponding to the one-handed holding manner can be determined as follows.
- Once the position at which the user holds the handheld terminal is known, it can be used to decide where the controls to be moved should go so that the user can operate them. A position can therefore be chosen as the display reference point based on the held side edge, the bottom edge, or the corner vertex formed by that side edge and the bottom edge. This embodiment does not limit the specific implementation, as long as the result is convenient for the user to operate. The following describes the solution of the embodiment using the side of the handheld terminal held by the user as an example:
- A. determine the side of the handheld terminal held by the user;
- B. determine a point on the handheld terminal as the display reference point based on that side.
- A specific way to determine the side of the handheld terminal held by the user is: a touch sensor is provided on the side of the handheld terminal, and the held side is determined when the touch sensor on that side detects a touch signal.
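- A minimal sketch of how the held side might be inferred from side-mounted touch sensors, assuming each sensor simply reports whether it currently detects contact (the two-sensor layout and names are assumptions, not taken from the patent):

```java
/** Which side of the terminal the user is holding, as reported by the side touch sensors. */
enum HeldSide { LEFT, RIGHT, UNKNOWN }

class GripDetector {
    /**
     * Decide the held side from the left and right side sensor signals.
     * If both or neither report contact, the grip is treated as ambiguous.
     */
    static HeldSide detect(boolean leftSensorTouched, boolean rightSensorTouched) {
        if (leftSensorTouched && !rightSensorTouched)  return HeldSide.LEFT;
        if (rightSensorTouched && !leftSensorTouched)  return HeldSide.RIGHT;
        return HeldSide.UNKNOWN; // ambiguous or no contact detected
    }
}
```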
- Step 104: Determine the controls to be moved from among the controls.
- In this embodiment, all of the controls in the overlay interface can be moved to form a new, uniformly formatted and visually clean interface; alternatively, from a practical point of view, only the controls that are inconvenient for the user to operate may be moved (areas that are hard to operate on different terminal screens can be determined from existing industry statistics). So after all the controls included in the overlay interface have been determined, either a subset of them or all of them can be selected as the controls to be moved.
- If only a subset of the controls is to be moved, determining the controls to be moved from among the controls includes:
- A. detecting the distance between the to-be-displayed position of each control and the display reference point and, when the distance between a control's to-be-displayed position and the display reference point is greater than the set threshold, determining that control to be a control to be moved; or
- Because the controls in this embodiment are displayed at fixed positions in the to-be-displayed overlay interface, each control has a parameter or attribute that determines its display position. In this embodiment, that parameter or attribute can therefore be used to determine where each control would be displayed on the screen. Although the control is not yet actually shown on the display device at this point, its to-be-displayed position can be determined from the related parameter or attribute.
- B. outputting the to-be-displayed overlay interface, detecting whether there is a second touch operation that satisfies a second preset condition, and determining the controls to be moved from among the controls according to the second touch operation.
- The ultimate purpose of adjusting the controls is user convenience. To suit the needs of each user, the to-be-displayed overlay can therefore be shown as a preview; the user can then judge from the displayed content which controls are inconvenient to operate and select, by touch, the controls whose positions need to be moved.
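- A minimal sketch of option A above (selecting by distance to the display reference point); the helper names and the Euclidean distance measure are illustrative assumptions:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch: pick the controls whose to-be-displayed position lies outside the reachable area. */
class MoveSelector {
    /** A control's to-be-displayed position, taken from its layout parameter or attribute. */
    static class Candidate {
        final String id;
        final double x, y;
        Candidate(String id, double x, double y) { this.id = id; this.x = x; this.y = y; }
    }

    /** Controls farther from the display reference point than the threshold are marked to be moved. */
    static List<Candidate> selectToMove(List<Candidate> all,
                                        double refX, double refY, double threshold) {
        List<Candidate> toMove = new ArrayList<>();
        for (Candidate c : all) {
            double distance = Math.hypot(c.x - refX, c.y - refY);
            if (distance > threshold) {
                toMove.add(c);
            }
        }
        return toMove;
    }
}
```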
- Step 105: Adjust the display position of the controls to be moved, generate a new overlay interface according to the adjusted control positions, and replace the to-be-displayed overlay interface with the new overlay interface. In the new overlay interface, the distance between the display position of a moved control and the display reference point is less than a set threshold, and a moved control causes the handheld terminal to perform the same function before and after the user's position adjustment.
- In the new overlay interface the positions of the controls are easier for the user to reach. To achieve this, the control positions can be set as follows: collect the set of points the user's thumb touches on the screen while holding the phone, estimate the length of the thumb, and from that compute the range of angles through which the thumb can rotate about its base and the longest distance it can stretch, finally obtaining the area the thumb can cover.
- That is, based on the display reference point determined above (in this example it may be the base of the thumb), the controls to be moved are displayed within the computed reachable area, i.e. the distance between a moved control's display position and the display reference point is smaller than the set threshold (as shown in FIG. 2, where FIG. 2a is before the controls are moved and FIG. 2b is after).
- The situation shown in FIG. 2 corresponds to the user holding the handheld terminal in the right hand. When the user holds it in the left hand, the controls can instead be moved to the left side of the handheld terminal for display, as shown in FIG. 3.
- After being moved, the controls to be moved may be displayed separately or combined into a menu bar, such as the arc-shaped menu shown in FIG. 2b. (The specific example in FIG. 2 is only one optimized example of the embodiment and does not limit the solution provided by the embodiment of the present invention to the form shown in FIG. 2; in a specific application environment the menu may be given various shapes, such as a rectangle or an ellipse, according to design requirements and user convenience.)
- With this menu format, the controls in the menu can be scrolled left/right and/or up/down by user gestures (for example, if the user inputs a rightward sliding touch operation, the display positions of the controls in the menu are adjusted accordingly, as shown in FIG. 4, where FIG. 4a is before scrolling and FIG. 4b is after scrolling), or they can scroll automatically (similar to a marquee effect). Optionally, the menu at the bottom of the screen is not moved but only shrunk, and only the menu at the top of the screen is moved down next to the bottom menu.
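- An arc-shaped menu like the one in FIG. 2b could, for example, be laid out by spacing the moved controls evenly along a circular arc centred on the display reference point, with a radius smaller than the set threshold. The sketch below is only one possible layout; the radius, angular range, and coordinate convention are assumptions:

```java
/** Sketch: place N moved controls along an arc around the display reference point. */
class ArcMenuLayout {
    /**
     * Returns an array of [x, y] positions for {@code count} controls, evenly spaced
     * on an arc of the given radius between startAngle and endAngle (radians).
     * The radius should be smaller than the set threshold so every moved control
     * stays within reach of the display reference point.
     */
    static double[][] layout(double refX, double refY, double radius,
                             double startAngle, double endAngle, int count) {
        double[][] positions = new double[count][2];
        double step = (count > 1) ? (endAngle - startAngle) / (count - 1) : 0.0;
        for (int i = 0; i < count; i++) {
            double angle = startAngle + i * step;
            positions[i][0] = refX + radius * Math.cos(angle);
            positions[i][1] = refY + radius * Math.sin(angle);
        }
        return positions;
    }
}
```

- For a right-hand grip with the reference point near the lower-right corner, sweeping the angle roughly between 90 and 180 degrees would place the arc above and to the left of the thumb; a left-hand grip would mirror the range.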
- Optionally, in this embodiment the first touch operation input by the user is a specific sliding operation. Based on this sliding operation, the handheld terminal can both decide to start moving controls and determine, from the track of the sliding operation, where the controls should be moved to. In that case, determining a point on the handheld terminal as the display reference point based on the side may be implemented as follows:
- acquire the touch track of the first touch operation, and determine the sliding direction of the touch track from the position of its end point relative to its start point;
- determine a point on the handheld terminal as the display reference point based on the side and the sliding direction.
- In this embodiment, to make the determined display reference point easier for the user to reach, the touch track of the first touch operation can be combined with the side held by the user to determine the final display reference point, as in the embodiment shown in FIG. 5.
- In FIG. 5a (in which control 1, control 2, and control 3 are inconvenient for the user to reach), the user slides a finger on the handheld terminal from the upper left toward the lower right. From this operation the handheld terminal can determine that the user wants to move the controls in the upper-left corner of the handheld terminal toward the lower-right corner (the interface after the move is shown in FIG. 5b), and the direction of the move can accordingly be determined from the sliding direction of the touch track, i.e. from upper left to lower right.
- In this embodiment, for a better visual effect, after the display reference point and the controls to be moved have been determined, the process of moving the controls from their original positions to the display reference point can also be shown as an animation.
- After the controls in the overlay interface have been moved and regrouped in the manner described above, it must still be ensured that the function of each control does not change. Therefore, in this embodiment, after a control is moved, the moved control needs to be functionally mapped to the corresponding control before the move, so that when the user taps the moved control, the function of the control at its original position is triggered normally; the processed overlay interface is then presented. A specific implementation can be:
- when the user operates a first control among the position-adjusted controls, acquire the current display coordinates of the first control in the new overlay interface;
- determine, according to the current display coordinates, the corresponding original coordinates of the first control in the to-be-displayed overlay interface;
- call the corresponding function according to the original coordinates.
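- The functional mapping above can be pictured with the following minimal sketch: a tap on a moved control is looked up by its current coordinates, mapped back to the original coordinates, and the action registered at the original position is invoked. The map-based dispatch and names are assumptions used only for illustration:

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch: map a moved control's new coordinates back to its original coordinates and function. */
class ControlFunctionMapper {
    /** current display position (new overlay) -> original position (to-be-displayed overlay) */
    private final Map<String, double[]> currentToOriginal = new HashMap<>();
    /** original position -> the action the control at that original position performs */
    private final Map<String, Runnable> originalActions = new HashMap<>();

    private static String key(double x, double y) { return x + "," + y; }

    void register(double curX, double curY, double origX, double origY, Runnable action) {
        currentToOriginal.put(key(curX, curY), new double[] { origX, origY });
        originalActions.put(key(origX, origY), action);
    }

    /** Called when the user taps a moved control at its current display coordinates. */
    void onTap(double curX, double curY) {
        double[] orig = currentToOriginal.get(key(curX, curY));
        if (orig == null) {
            return; // not a moved control
        }
        Runnable action = originalActions.get(key(orig[0], orig[1]));
        if (action != null) {
            action.run(); // trigger the function of the control at its original position
        }
    }
}
```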
- Optionally, so that the controls are easy to reach whether the terminal is held with the left or the right hand, the controls to be moved can also be displayed in the new overlay interface as a floating control (in this embodiment, a floating control is a control that can be moved freely according to the user's drag operation). As shown in FIG. 6, the three controls that are inconvenient to operate can be combined into the circular floating control of FIG. 6b. (FIG. 6 shows the handheld terminal operated in portrait orientation, and FIG. 7 in landscape orientation.)
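- How a floating control might follow a drag gesture is sketched below in plain Java; the event method names and fields are assumptions for illustration, not a specific platform API:

```java
/** Sketch: a floating control that follows the user's drag operation freely. */
class FloatingControl {
    private double x, y;                 // current centre of the floating control
    private double lastTouchX, lastTouchY;

    FloatingControl(double startX, double startY) { this.x = startX; this.y = startY; }

    void onDragStart(double touchX, double touchY) {
        lastTouchX = touchX;
        lastTouchY = touchY;
    }

    /** Move the control by the same offset as the finger since the previous event. */
    void onDragMove(double touchX, double touchY) {
        x += touchX - lastTouchX;
        y += touchY - lastTouchY;
        lastTouchX = touchX;
        lastTouchY = touchY;
    }

    double getX() { return x; }
    double getY() { return y; }
}
```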
- FIG. 8 illustrates a handheld terminal in accordance with an embodiment of the present invention.
- The handheld terminal includes an input unit 801, a processor 802, an output unit 803, a communication unit 804, a storage unit 805, a peripheral interface 806, and a power supply 807. These units communicate over one or more buses. Those skilled in the art will understand that the structure of the handheld terminal shown in the figure does not limit the present invention: it may be a bus topology or a star topology, and it may include more or fewer components than illustrated, combine certain components, or arrange the components differently.
- The handheld terminal may be any mobile or portable handheld terminal, including but not limited to a mobile phone, a mobile computer, a tablet computer, a personal digital assistant (PDA), a media player, or a combination of two or more of the above.
- PDA personal digital assistant
- the input unit is used to implement user interaction with the handheld terminal and/or information input into the handheld terminal.
- the input unit can receive numeric or character information input by the user to generate a signal input related to user settings or function control.
- the input unit may be a touch panel, or may be other human-computer interaction interfaces, such as a physical input key, a microphone, etc., and may be other external information capture devices, such as a camera.
- A touch panel, also known as a touch screen, collects the user's touch operations on or near it.
- For example, the user may act on or near the touch panel with a finger, a stylus, or any other suitable object or accessory, and the corresponding connected device is driven according to a preset program.
- the touch panel may include two parts: a touch detection device and a touch controller.
- the touch detection device detects a touch operation of the user, converts the detected touch operation into an electrical signal, and transmits the electrical signal to the touch controller;
- the touch controller receives the electrical signal from the touch detection device, and Convert it to the contact coordinates and send it to the processor.
- the touch controller can also receive and execute commands from the processing unit.
- touch panels can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
- the physical input keys used by the input unit may include, but are not limited to, a physical keyboard, function keys (such as a volume control button, a switch button, etc.), a trackball, a mouse, a joystick, and the like. Or a variety.
- An input unit in the form of a microphone can collect speech input by the user or from the environment and convert it into commands, in the form of electrical signals, that the processing unit can execute.
- In some other embodiments of the present invention, the input unit may also be various sensor components, such as Hall devices, used to detect physical quantities of the handheld terminal, such as force, torque, pressure, stress, position, displacement, velocity, acceleration, angle, angular velocity, number of revolutions, rotational speed, and the time at which the operating state changes, and to convert them into electrical signals for detection and control.
- sensor components may also include gravity sensors, three-axis accelerometers, gyroscopes, electronic compasses, ambient light sensors, proximity sensors, temperature sensors, humidity sensors, pressure sensors, heart rate sensors, fingerprint readers, and the like.
- the output unit includes, but is not limited to, an image output unit and a sound output unit.
- the image output unit is used to output text, pictures, and/or video.
- The image output unit may include a display panel, for example a display panel configured as an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, or a field emission display (FED).
- Alternatively, the image output unit may comprise a reflective display, such as an electrophoretic display, or a display using interferometric modulation of light.
- the image output unit may comprise a single display or multiple displays of different sizes.
- the touch panel used by the input unit can also serve as a display panel of the output unit.
- the touch panel detects a touch or proximity gesture operation thereon, it is transmitted to the processing unit to determine the type of the touch event, and then the processing unit provides a corresponding visual output on the display panel according to the type of the touch event.
- the input unit and the output unit are two independent components to implement the input and output functions of the handheld terminal, in some embodiments, the touch panel and the display panel may be integrated to implement the handheld terminal. Input and output functions.
- the image output unit may display various graphical user interfaces (GUIs) as virtual control components, including but not limited to windows, scroll axes, icons, and scrapbooks, for the user to touch. The way to operate.
- GUIs graphical user interfaces
- the image output unit includes a filter and an amplifier for filtering and amplifying the video output by the processing unit.
- the audio output unit includes a digital to analog converter for converting the audio signal output by the processing unit from a digital format to an analog format.
- The processor is the control center of the handheld terminal. It connects the various parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and/or processes data by running or executing the software programs and/or modules stored in the storage unit and calling the data stored in the storage unit.
- The processor may be composed of integrated circuits (ICs), for example a single packaged IC, or several packaged ICs with the same or different functions connected together.
- For example, the processor may include only a central processing unit (CPU), or it may be a combination of a GPU, a digital signal processor (DSP), and a control chip (for example, a baseband chip) in the communication management module.
- The CPU may have a single computation core or may include multiple computation cores.
- the communication unit is configured to establish a communication channel, and enable the handheld terminal to perform voice communication, text communication, and data communication with the remote handheld terminal or the server through the communication channel.
- the communication unit may include a wireless local area network (Wireless Local Area Network) module, a Bluetooth module, a baseband module, and the like, and a radio frequency (RF) circuit corresponding to the communication module.
- RF radio frequency
- W-CDMA Wideband Code Division Multiple Access
- HSDPA High Speed Downlink Packet Access
- The communication module is used to control communication of the components in the handheld terminal and can support direct memory access.
- The various communication modules in the communication unit generally appear in the form of integrated-circuit chips and can be selectively combined; it is not necessary to include all communication modules and their corresponding antenna groups.
- For example, the communication unit may include only a baseband chip, a radio frequency chip, and a corresponding antenna to provide communication functionality in a cellular communication system.
- The handheld terminal can be connected to a cellular network or the Internet via a wireless communication connection established by the communication unit, such as wireless local area network access or WCDMA access.
- The radio frequency circuit is used for receiving and transmitting signals during information transmission and reception or during a call. For example, after downlink information from the base station is received, it is passed to the processing unit for processing, and uplink data is sent to the base station.
- The radio frequency circuit includes well-known circuits for performing these functions, including but not limited to an antenna system, a radio frequency transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and so on.
- The RF circuit can communicate with networks and other devices through wireless communication.
- The wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), HSDPA (High Speed Downlink Packet Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
- GSM Global System of Mobile communication
- GPRS General Packet Radio Service
- CDMA Code Division Multiple Access
- WCDMA Wideband Code Division Multiple Access
- HSDPA High Speed Downlink Packet Access
- LTE Long Term Evolution
- SMS Short Messaging Service
- the storage unit can be used to store software programs and modules, and the processing unit executes various functional applications of the handheld terminal and implements data processing by running software programs and modules stored in the storage unit.
- the storage unit mainly includes a program storage area and a data storage area, wherein the program storage area can store an operating system, an application required for at least one function, such as a sound playing program, an image playing program, and the like; and the data storage area can be stored according to the handheld terminal. Use the created data (such as audio data, phone book, etc.).
- the storage unit may include a volatile memory, such as nonvolatile random access memory (NVRAM) or phase-change random access memory (PRAM).
- NVRAM nonvolatile random access memory
- PRAM phase change random access memory
- MRAM magnetoresistive random access memory
- The storage unit may also include non-volatile memory, such as at least one magnetic disk storage device, an electrically erasable programmable read-only memory (EEPROM), or a flash memory device such as NOR flash or NAND flash.
- the non-volatile memory stores operating systems and applications executed by the processing unit.
- the processing unit loads the running program and data from the non-volatile memory into the memory and stores the digital content in a plurality of storage devices.
- the operating system includes various components and/or drivers for controlling and managing conventional system tasks such as memory management, storage device control, power management, and the like, as well as facilitating communication between various hardware and software.
- the operating system may be an Android system of Google Corporation, Apple's iOS system or Microsoft's Windows system / Windows Phone system, or an embedded operating system such as Vxworks.
- the application includes any application installed on the handheld terminal, including but not limited to browsers, email, instant messaging services, word processing, keyboard virtualization, widgets, encryption, digital rights management, voice recognition, Voice copying, positioning (such as those provided by GPS), music playback, and more.
- the power supply is used to power different parts of the handheld terminal to maintain its operation.
- the power source may be a built-in battery, such as a common lithium ion battery, a nickel hydride battery, etc., and an external power source that directly supplies power to the handheld terminal, such as an AC adapter.
- The power source may also be defined more broadly and may further include, for example, a power management system, a charging system, a power failure detection circuit, a power converter or inverter, and a power status indicator (such as a light-emitting diode), as well as any other components associated with the generation, management, and distribution of power in the handheld terminal.
- the input unit 801 is configured to detect, when the handheld terminal is in the full-screen immersive mode, whether there is a first touch operation that satisfies the first preset condition;
- The input unit 801 is mainly used for receiving and detecting input information. Its specific implementation may include multiple physical structures; for detecting touch operations, the physical structure may be a touch screen or a similar component capable of recognizing touch operations.
- The processor 802 calls the program in the storage unit 805 and, if there is a first touch operation that satisfies the first preset condition, acquires the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode, and determines the controls displayed in the to-be-displayed overlay interface.
- When it is determined that the user is currently holding the handheld terminal with one hand, the processor determines a display reference point corresponding to the one-handed holding manner, determines the controls to be moved from among the controls, adjusts the display position of the controls to be moved, generates a new overlay interface according to the adjusted control positions, and replaces the to-be-displayed overlay interface with the new overlay interface.
- In the new overlay interface, the distance between the display position of a moved control and the display reference point is less than a set threshold, and a moved control causes the handheld terminal to perform the same function before and after the user's position adjustment.
- The to-be-displayed overlay interface, when displayed, is superimposed on the current display content, and the distance between the display reference point and the side of the handheld terminal held by the user is less than a set threshold.
- the holding manner includes a left hand grip or a right hand grip.
- the processor is specifically configured to determine a side edge of the handheld terminal that is held by the user; and determine a point on the handheld terminal as the display reference point based on the side edge.
- Specifically, the processor is configured to acquire the touch track of the first touch operation, determine the sliding direction of the touch track according to the position of its end point relative to its start point, and determine a point on the handheld terminal as the display reference point based on the side and the sliding direction.
- the input unit is further configured to detect the touch signal, so that the processor determines, according to the touch signal, that the user holds the side of the handheld terminal.
- the corresponding physical structure may be a physical structure that can recognize the touch signal, such as a touch sensor.
- The processor is specifically configured to detect the distance between the to-be-displayed position of each control and the display reference point and, when the distance between a control's to-be-displayed position and the display reference point is greater than the set threshold, determine that control to be a control to be moved; or to output the to-be-displayed overlay interface, detect whether there is a second touch operation that satisfies the second preset condition, and determine the controls to be moved from among the controls according to the second touch operation.
- the processor is further configured to: when the user operates the first control in the position-adjusted control after the position adjustment, obtain the current display coordinate of the first control in the new overlay interface; The current display coordinates determine corresponding original coordinates of the first control in the overlay layer to be displayed; and the corresponding function is called according to the original coordinates.
- the processor is further configured to display the to-be-moved control in a manner of a floating control in the new overlay interface.
- an embodiment of the present invention further provides a handheld terminal, where the handheld terminal includes:
- an obtaining module 901, configured to: when the handheld terminal, in full-screen immersive mode, detects a first touch operation that satisfies the first preset condition, acquire the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode, where the to-be-displayed overlay interface, when displayed, is superimposed on the current display content;
- a first control determining module 902 configured to determine a control displayed in the overlay interface to be displayed
- a reference point determining module 903 configured to determine a display reference point corresponding to the one-handed holding mode when determining that the user is currently in the one-handed holding manner of the handheld terminal; wherein the display reference point and the user grip The distance between the sides of the handheld terminal is less than a set threshold, and the holding manner includes a left hand grip or a right hand grip;
- a second control determining module 904 configured to determine, from the control, a control to be moved
- the adjusting module 905 is configured to adjust a display position of the to-be-moved control, and generate a new overlay interface according to the adjusted control position, and replace the to-be-displayed overlay interface with the new overlay interface; In the new overlay interface, the distance between the display position of the control to be moved and the display reference point is less than a set threshold; the function performed by the handheld terminal of the to-be-moved control before and after the user operates the position adjustment the same.
- The reference point determining module 903 determining the display reference point corresponding to the one-handed holding manner includes:
- determining the side of the handheld terminal held by the user, and determining a point on the handheld terminal as the display reference point based on that side.
- the reference point determining module 903 is specifically configured to acquire a touch track of the first touch operation, and determine a sliding direction corresponding to the touch track according to a position of an end point of the touch track relative to a starting point; A point is determined on the handheld terminal as the display reference point based on the side and the sliding direction.
- the side of the handheld terminal is provided with a touch sensor
- the reference point determining module 903 is further configured to determine, by the touch sensor on the side edge, that the touch signal is used by the user to hold the handheld terminal. Side.
- The second control determining module 904 is specifically configured to detect the distance between the to-be-displayed position of each control and the display reference point and, when the distance between a control's to-be-displayed position and the display reference point is greater than the set threshold, determine that control to be a control to be moved; or to output the to-be-displayed overlay interface, detect whether there is a second touch operation that satisfies the second preset condition, and determine the controls to be moved from among the controls according to the second touch operation.
- the handheld terminal further includes:
- a function mapping module, configured to: when the user operates a first control among the position-adjusted controls, acquire the current display coordinates of the first control in the new overlay interface; determine, according to the current display coordinates, the corresponding original coordinates of the first control in the to-be-displayed overlay interface; and call the corresponding function according to the original coordinates.
- With the method and apparatus provided by the embodiments of the present invention, the operating system moves the menus/buttons/options that the application would display after the user taps the screen to an area that is easy to reach with a finger:
- the to-be-displayed overlay interface is processed and then presented.
- A moved control has the same function as the corresponding control before the move; when the user taps the moved control, the function of the control at its original position is triggered normally. The menus provided by the embodiments of the invention are therefore easier for the user to reach, and the interface remains clean and attractive.
- These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture that includes an instruction apparatus.
- The instruction apparatus implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
- These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing.
- The instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A menu display method for a user interface, and a handheld terminal. The method includes: when a handheld terminal, in full-screen immersive mode, detects a first touch operation that satisfies a first preset condition, acquiring the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode (101), and determining the controls displayed in the to-be-displayed overlay interface (102); when it is determined that the user is currently holding the handheld terminal with one hand, determining a display reference point corresponding to the one-handed holding manner (103); determining controls to be moved from among the controls (104); adjusting the display position of the controls to be moved, generating a new overlay interface according to the adjusted control positions (105), and replacing the to-be-displayed overlay interface with the new overlay interface, where, in the new overlay interface, the distance between the display position of a control to be moved and the display reference point is less than a set threshold. The method solves the problem in the prior art that an unreasonable menu display method for the user interface makes operation inconvenient for the user.
Description
The present invention relates to the field of electronic technologies, and in particular to a menu display method for a user interface and a handheld terminal.
Handheld terminals have become an indispensable part of people's daily lives, and their importance is evident from any perspective. The trend in handheld terminals is toward ever larger screens, while the size of the human palm is fixed, so many handheld terminals now require two-handed operation for the user to reach every control on the screen. Sometimes, however, the user has to free one hand for other tasks and can operate the mobile terminal with only one hand; in that case the area a finger can reach is limited and cannot cover the entire screen.
Moreover, many users now play audio and video files on handheld terminals, and to provide a better viewing experience handheld terminals generally offer a full-screen immersive mode. In full-screen immersive mode, the system menus, including the status bar and the virtual keys, and the application menus are all dynamically hidden. The application corresponding to the full-screen immersive mode can use the entire screen space (that is, the application's content is displayed full screen on the terminal's display unit), giving the user a cleaner, more refreshing experience.
When the immersive state is exited, immersive menus (including system menus and application menus) appear at the top and bottom of the phone screen. The user can control the current application or the system, and trigger the corresponding functions, by operating the immersive menus.
Because the immersive menus are generally placed at the top and/or bottom of the display, part of the menu area is always hard to reach when the user operates the terminal with one hand, so existing menus are inconvenient to operate.
Summary of the Invention
The present invention provides a menu display method for a user interface and a handheld terminal. The method and apparatus provided by the present invention solve the problem in the prior art that an unreasonable menu display method for the user interface makes operation inconvenient for the user.
According to a first aspect, a menu display method for a user interface is provided, the method including:
when a handheld terminal, in full-screen immersive mode, detects a first touch operation that satisfies a first preset condition, acquiring the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode, where the to-be-displayed overlay interface, when displayed, is superimposed on the content currently displayed by the handheld terminal;
determining the controls displayed in the to-be-displayed overlay interface;
when it is determined that the user is currently holding the handheld terminal with one hand, determining a display reference point corresponding to the one-handed holding manner, where the distance between the display reference point and the side of the handheld terminal held by the user is less than a set threshold, and the holding manner includes a left-hand grip or a right-hand grip;
determining controls to be moved from among the controls;
adjusting the display position of the controls to be moved, generating a new overlay interface according to the adjusted control positions, and replacing the to-be-displayed overlay interface with the new overlay interface, where, in the new overlay interface, the distance between the display position of a control to be moved and the display reference point is less than a set threshold, and a control to be moved causes the handheld terminal to perform the same function before and after the user's position adjustment.
With reference to the first aspect, in a first possible implementation, determining the display reference point corresponding to the one-handed holding manner includes:
determining the side of the handheld terminal held by the user;
determining a point on the handheld terminal as the display reference point based on the side.
With reference to the first possible implementation of the first aspect, in a second possible implementation, determining a point on the handheld terminal as the display reference point based on the side includes:
acquiring the touch track of the first touch operation, and determining the sliding direction of the touch track according to the position of its end point relative to its start point;
determining a point on the handheld terminal as the display reference point based on the side and the sliding direction.
With reference to the first or second possible implementation of the first aspect, in a third possible implementation, determining the side of the handheld terminal held by the user includes:
providing a touch sensor on the side of the handheld terminal, and determining the side of the handheld terminal held by the user when the touch sensor on that side detects a touch signal.
With reference to the first aspect or any of the first to third possible implementations of the first aspect, in a fourth possible implementation, determining the controls to be moved from among the controls includes:
detecting the distance between the to-be-displayed position of each control and the display reference point and, when the distance between a control's to-be-displayed position and the display reference point is greater than the set threshold, determining that control to be a control to be moved; or
outputting the to-be-displayed overlay interface, detecting whether there is a second touch operation that satisfies a second preset condition, and determining the controls to be moved from among the controls according to the second touch operation.
With reference to the first aspect or any of the first to fourth possible implementations of the first aspect, in a fifth possible implementation, after the to-be-displayed overlay interface is replaced with the new overlay interface, the method further includes:
when the user operates a first control among the position-adjusted controls to be moved, acquiring the current display coordinates of the first control in the new overlay interface;
determining, according to the current display coordinates, the corresponding original coordinates of the first control in the to-be-displayed overlay interface;
calling the corresponding function according to the original coordinates.
With reference to the first aspect or any of the first to fifth possible implementations of the first aspect, in a sixth possible implementation, the method further includes: displaying the controls to be moved in the new overlay interface as a floating control.
According to a second aspect, a handheld terminal is provided, the handheld terminal including:
an input unit, configured to detect, when the handheld terminal is in full-screen immersive mode, whether there is a first touch operation that satisfies a first preset condition;
a processor, configured to: if there is a first touch operation that satisfies the first preset condition, acquire the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode; determine the controls displayed in the to-be-displayed overlay interface; when it is determined that the user is currently holding the handheld terminal with one hand, determine a display reference point corresponding to the one-handed holding manner; determine controls to be moved from among the controls; adjust the display position of the controls to be moved, generate a new overlay interface according to the adjusted control positions, and replace the to-be-displayed overlay interface with the new overlay interface, where, in the new overlay interface, the distance between the display position of a control to be moved and the display reference point is less than a set threshold; a control to be moved causes the handheld terminal to perform the same function before and after the user's position adjustment; the to-be-displayed overlay interface, when displayed, is superimposed on the current display content; the distance between the display reference point and the side of the handheld terminal held by the user is less than a set threshold; and the holding manner includes a left-hand grip or a right-hand grip.
With reference to the second aspect, in a first possible implementation, the processor determining the display reference point corresponding to the one-handed holding manner specifically includes determining the side of the handheld terminal held by the user, and determining a point on the handheld terminal as the display reference point based on the side.
With reference to the first possible implementation of the second aspect, in a second possible implementation, the processor determining a point on the handheld terminal as the display reference point based on the side specifically includes acquiring the touch track of the first touch operation, determining the sliding direction of the touch track according to the position of its end point relative to its start point, and determining a point on the handheld terminal as the display reference point based on the side and the sliding direction.
With reference to the first or second possible implementation of the second aspect, in a third possible implementation, the input unit is further configured to detect a touch signal, so that the processor determines, according to the touch signal, the side of the handheld terminal held by the user.
With reference to the second aspect or any of the first to third possible implementations of the second aspect, in a fourth possible implementation, the processor determining the controls to be moved from among the controls specifically includes detecting the distance between the to-be-displayed position of each control and the display reference point and, when the distance between a control's to-be-displayed position and the display reference point is greater than the set threshold, determining that control to be a control to be moved; or outputting the to-be-displayed overlay interface, detecting whether there is a second touch operation that satisfies a second preset condition, and determining the controls to be moved from among the controls according to the second touch operation.
With reference to the second aspect or any of the first to fourth possible implementations of the second aspect, in a fifth possible implementation, after replacing the to-be-displayed overlay interface with the new overlay interface, the processor is further configured to: when the user operates a first control among the position-adjusted controls to be moved, acquire the current display coordinates of the first control in the new overlay interface; determine, according to the current display coordinates, the corresponding original coordinates of the first control in the to-be-displayed overlay interface; and call the corresponding function according to the original coordinates.
With reference to the second aspect or any of the first to fifth possible implementations of the second aspect, in a sixth possible implementation, the processor is further configured to display the controls to be moved in the new overlay interface as a floating control.
One or both of the above technical solutions have at least the following technical effects:
With the method and apparatus provided by the embodiments of the present invention, in order to move the menus/buttons/options that the application would display after the user taps the screen to an area that is easy to reach with a finger, the operating system processes the to-be-displayed overlay interface and then presents it. A moved control has the same function as the corresponding control before the move, and when the user taps the moved control, the function of the control at its original position is triggered normally. The menus provided by the embodiments of the invention are therefore easier for the user to reach, and the interface remains clean and attractive.
FIG. 1 is a schematic flowchart of a menu display method for a user interface according to an embodiment of the present invention;
FIG. 2 is a schematic comparison of the controls to be moved before and after they are moved when the user holds the handheld terminal in the right hand, according to an embodiment of the present invention;
FIG. 3 is a schematic comparison of the controls to be moved before and after they are moved when the user holds the handheld terminal in the left hand, according to an embodiment of the present invention;
FIG. 4 is a schematic comparison of the controls before and after the user scrolls them with a sliding operation, according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of determining the position to which controls are moved from a touch track, according to an embodiment of the present invention;
FIG. 6 and FIG. 7 are schematic diagrams of moved controls displayed in floating form, according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a handheld terminal according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of another handheld terminal according to an embodiment of the present invention.
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
In the prior art, an application in full-screen immersive mode displays its content full screen without showing any menus, buttons, or options (nor the status bar and navigation bar). When the user taps the screen, the application overlays the menus, buttons, or options to be displayed on top of the current full-screen content for the user to use. Based on this characteristic of full-screen immersive mode, and in order to move the menus/buttons/options the application would display after the user taps the screen to an area the user's finger can easily reach, an embodiment of the present invention provides a menu display method for a user interface. The method processes the overlay interface that is to be superimposed on the current display interface and then superimposes the processed interface on the current display. As shown in FIG. 1, the method provided by the embodiment of the present invention includes the following steps:
Step 101: When the handheld terminal, in full-screen immersive mode, detects a first touch operation that satisfies a first preset condition, acquire the to-be-displayed overlay interface of the application corresponding to the full-screen immersive mode, where the to-be-displayed overlay interface, when displayed, is superimposed on the current display content.
In this embodiment, the first touch operation that satisfies the first preset condition may be a preset operation that the handheld terminal can recognize. In addition, the controls concerned are controls in the application menu.
Before the application's to-be-displayed overlay interface is shown on the screen, the system first loads it into memory. Only after loading is complete can the system parse the to-be-displayed overlay interface to determine the controls in the interface, and the system can then obtain information about the controls in the interface, including each control's ID, name, position, and size.
Step 102: Determine the controls displayed in the to-be-displayed overlay interface.
In the existing immersive mode, all controls associated with the current display content are hidden to give the user the best viewing experience. In general, when the user operates the terminal, the application corresponding to the current display places all of its controls in the overlay interface. Based on this, the solution provided by the present invention can adjust the positions of the controls before the overlay interface is displayed, so that the controls are easier to operate after adjustment. Of course, to keep the display interface clean, each control is displayed only once in the overlay interface, whether or not its position is adjusted.
步骤103,当确定用户当前对所述手持终端是单手握持方式,则确定与所述单手握持方式对应的显示基准点;其中,所述显示基准点与用户握持的所
述手持终端的侧边之间的距离小于设定阈值,所述握持方式包括左手握持或右手握;
在该实例中,显示基准点是用于定位控件移动之后的位置的,所以为了将不方便用户操控的控件移动到用户便于操作的位置,可以将显示基准点设置在用户握持终端的位置。具体确定与所述单手握持方式对应的显示基准点的方式可以是:
在该实例中,在确定用户握持手持终端的位置后,就可基于该位置确定待移动的控件移动到什么位置用户方便操作,所以只要确定用户握持手持终端的位置后,就可以基于用户握持手持终端的侧边、底边或者与该侧边和底边对应的拐角顶点确定某一位置作为显示基准点。在该实施例中并不限定具体的实现方式,只要能达到方便用户操作即可。以下将用户握持手持终端的侧边作为实例对本发明实施例的方案进行说明:
A,确定用户握持的所述手持终端的侧边;
B,基于所述侧边在所述手持终端上确定一点作为所述显示基准点。
其中,确定用户握持所述手持终端的侧边的具体实现方式可以是:
所述手持终端的侧边设置有触摸传感器,通过所述侧边上的所述触摸传感器检测到触摸信号确定用户握持所述手持终端的侧边。
步骤104,从所述控件中确定待移动控件;
在该实施例中,可以将叠加层界面中的所有控件都进行移动从而形成一个格式统一并且美观的新界面;另外,也可以基于实用角度出发,只是移动用户不便于操作的控件(根据业界已有的统计数据,确定不同终端屏幕上不容易操控的区域)。所以在确定叠加层界面包括的所有控件之后,可以从中选择一部分控件也可以将所有控件作为待移动控件。
其中,如果要从控件中选择一部分控件进行移动,则从所述控件中确定
待移动控件包括:
A,检测所述控件中的每一个的待显示位置与所述显示基准点之间的距离值,当任一控件的待显示位置与所述显示基准点之间的距离值大于所述设定阈值,则确定所述任一控件为所述待移动控件;或者
在该实现方式中,因为该实施例中所涉及的控件在待显示叠加层界面中是显示在以固定位置的,所以每个控件都有确定显示位置的参数或者属性,所以在该实施例中,根据该参数或者属性就可以确定每个控件在屏幕上显示时,应该显示在什么位置。虽然该控件此时实际未被显示在移动终端的显示装置上,但可以跟相关的参数或者属性确定出其待显示在显示装置上的位置,即待显示位置。
B,输出所述待显示叠加层界面,并检测是否有满足第二预设条件的第二触控操作,根据所述第二触控操作从所述控件中确定所述待移动控件。
在该实施例中，对控件进行调整的最终目的是方便用户使用，所以为了适应每个用户的需求，可以将待显示叠加层界面以预览的形式进行显示，然后用户根据显示的内容，判定哪些位置的控件不方便操作，从而通过触控操作从控件中选择需要进行位置移动的待移动控件。
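结合上文方式A，下面给出一个按距离阈值筛选待移动控件的示意性草图（其中的简化控件描述、阈值单位与取值均为说明用途的假设）：

```java
import java.util.ArrayList;
import java.util.List;

// 示意性草图：筛选出待显示位置与显示基准点距离超过设定阈值的控件作为待移动控件。
public class MovableControlSelector {

    // Control 为说明用途而假设的简化控件描述
    public static class Control {
        public final String id;
        public final double centerX, centerY; // 控件待显示位置的中心点坐标
        public Control(String id, double centerX, double centerY) {
            this.id = id;
            this.centerX = centerX;
            this.centerY = centerY;
        }
    }

    // 返回与显示基准点 (refX, refY) 的距离大于 thresholdPx 的控件列表
    public static List<Control> selectByDistance(List<Control> controls,
                                                 double refX, double refY,
                                                 double thresholdPx) {
        List<Control> toMove = new ArrayList<>();
        for (Control c : controls) {
            double dist = Math.hypot(c.centerX - refX, c.centerY - refY);
            if (dist > thresholdPx) {
                toMove.add(c); // 距离显示基准点过远，判定为不便操作的待移动控件
            }
        }
        return toMove;
    }
}
```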
步骤105,对所述待移动控件的显示位置进行调整,并根据调整后的控件位置生成新的叠加层界面,使用所述新的叠加层界面替换所述待显示叠加层界面;其中,在新的叠加层界面中,所述待移动控件的显示位置与所述显示基准点之间的距离小于设定阈值;用户操作位置调整前后的所述待移动控件所述手持终端执行的功能相同。
在新的叠加层界面中，各控件的位置更方便用户操控。为了达到这个目的，设置控件位置时的具体实现可以是：
统计用户握持手机时拇指在屏幕上的触控点集合,估算出用户拇指长度,
从而计算出用户拇指以手指根部为基点能够旋转的角度范围，以及拇指能够伸展的最长距离，最终计算出拇指能够覆盖的范围。即基于步骤103确定的显示基准点（在该实例中可以是手指的根部），将待移动控件显示在最终计算出的能够覆盖的范围内，即控件移动后的显示位置与显示基准点之间的距离小于设定阈值（如图2所示，其中图2的a图为控件移动之前，图2的b图为控件移动之后）。图2所示的情况是用户右手握持手持终端的具体实现情况，对于用户左手握持时，则可以对应地将控件移动到手持终端左侧的位置进行显示，具体如图3所示。
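下面给出一个估算拇指可覆盖范围并在该范围内沿圆弧排布待移动控件的示意性草图（统计方法、角度范围与系数等参数均为说明用途的假设，并非本发明限定的实现）：

```java
// 示意性草图：根据历史触控点集合估算拇指可伸展的最长距离，
// 并在以手指根部（即显示基准点）为圆心的可覆盖范围内沿圆弧排布待移动控件。
public class ThumbReachEstimator {

    // 以各触控点到手指根部的最大距离近似拇指可伸展的最长距离
    public static double estimateReachRadius(int baseX, int baseY, int[][] touchPoints) {
        double maxDist = 0;
        for (int[] p : touchPoints) {
            maxDist = Math.max(maxDist, Math.hypot(p[0] - baseX, p[1] - baseY));
        }
        return maxDist;
    }

    // 为第 index 个待移动控件（共 total 个）在扇形范围内沿圆弧计算一个新的显示位置
    public static int[] positionOnArc(int baseX, int baseY, double radius,
                                      int index, int total, boolean rightHand) {
        // 右手握持时扇形朝左上方展开，左手握持时镜像处理；角度取值仅为示例
        double startDeg = rightHand ? 100 : 80;
        double endDeg = rightHand ? 170 : 10;
        double deg = startDeg + (endDeg - startDeg) * index / Math.max(1, total - 1);
        double rad = Math.toRadians(deg);
        // 半径乘以0.8，使控件落在可覆盖范围之内而非边缘
        int x = baseX + (int) (radius * 0.8 * Math.cos(rad));
        int y = baseY - (int) (radius * 0.8 * Math.sin(rad));
        return new int[]{x, y};
    }
}
```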
在该实施例中，待移动的控件移动之后可以各自独立显示，也可以组合在一起形成一个菜单栏，如图2中b所示的圆弧状菜单（图2所示的具体实例只是实现本发明实施例的一种最优化的实例，并不限定本发明实施例所提供的方案只能通过图2所示的方式实现。在具体的应用环境中，可以根据设计需要以及用户操作方便等原因将该菜单设置成矩形、椭圆形等各种形状）。在该菜单形式的情况下，菜单中的控件可以通过用户手势左右和/或上下滚动显示（如果用户输入向右滑动的触控操作，则可以对应地调整菜单中控件的显示位置，具体效果图如图4所示，图4中的a为滚动之前，图4中的b为滚动之后），也可以自动滚动显示（类似于走马灯效果）；可选的，屏幕下方的菜单不移动位置、只是缩小显示范围，而只将屏幕上方的菜单移动到屏幕下方菜单附近。
可选的，在该实施例中，用户输入的第一触控操作是一个特定的滑动操作，手持终端可以基于该滑动操作确定开启控件移动，并根据滑动操作对应的轨迹确定控件具体需要移动到的位置，则基于所述侧边在所述手持终端上确定一点作为所述显示基准点的具体实现可以是：
获取所述第一触控操作的触控轨迹,并根据所述触控轨迹的终点相对于
起始点的位置确定所述触控轨迹对应的滑动方向;
基于所述侧边和所述滑动方向在所述手持终端上确定一点作为所述显示基准点。
在该实施例中,为了使确定的显示基准点更方便用户操控,可以将第一触控操作的触控轨迹与用户握持的侧边进行结合之后确定最终的显示基准点,如图5所示的实施例:
在图5的a部分中（该图中包括不方便用户操控的控件1、控件2和控件3），用户用手指在手持终端上输入了一个由左上到右下的滑动操作，手持终端则可以根据该操作确定用户需要将手持终端左上角的控件移动到右下角（移动之后的界面如图5的b部分所示），则对应地可以根据触控轨迹的滑动方向确定控件移动的方向也是从左上移动到右下。根据分析则可以确定，将不方便操控的控件移动到手持终端右下的某个位置，用户会更方便操控。所以具体实现可以是：
(1)确定手持终端中与用户握持方式对应的侧边，并确定该侧边对应的手持终端的两个拐角顶点；确定第一触控操作的触控轨迹对应的滑动方向与该侧边的交点，并从两个拐角顶点中确定与该交点距离较近的拐角顶点作为所述显示基准点（如图5的b部分所示）。
(2)确定手持终端中与用户握持方式对应的侧边;确定第一触控操作的触控轨迹对应的滑动方向与该侧边的交点,并将该交点作为显示基准点。
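以方式(1)为例，下面给出一个结合滑动轨迹与握持侧边确定显示基准点的示意性草图（坐标约定、类名与方法名均为说明用途的假设）：

```java
// 示意性草图：取滑动方向（由起点指向终点的射线）与握持侧边的交点，
// 再取距离该交点较近的拐角顶点作为显示基准点。
public class SwipeBasedReferencePoint {

    // sideX 为握持侧边的 x 坐标（假设左侧边为 0，右侧边为 screenWidth-1）
    public static int[] cornerNearSwipe(int startX, int startY, int endX, int endY,
                                        int sideX, int screenHeight) {
        // 计算滑动射线与竖直侧边 x = sideX 的交点纵坐标；竖直滑动时退化为起点纵坐标
        double t = (endX - startX) == 0 ? 0 : (double) (sideX - startX) / (endX - startX);
        double crossY = startY + t * (endY - startY);

        // 侧边对应的两个拐角顶点为 (sideX, 0) 与 (sideX, screenHeight-1)
        double dTop = Math.abs(crossY);
        double dBottom = Math.abs(crossY - (screenHeight - 1));

        int cornerY = (dTop < dBottom) ? 0 : screenHeight - 1;
        return new int[]{sideX, cornerY};
    }
}
```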
在该实施例中，为了达到更好的显示效果，还可以在确定显示基准点和需要移动的控件后，以动图的形式显示控件从原位置移动到显示基准点的过程。
基于上述方式将叠加层界面中的控件进行位置移动以及重组之后,还要保证每个控件的功能不发生变化,所以该实施例中在控件移动之后,需要对
移动后的控件与移动前的相应的控件进行功能映射,以便用户点击移动后的控件时,原有位置控件的相应功能能够正常触发;然后将处理后的叠加层界面呈现出来。具体实现可以是:
当用户操作位置调整后的所述待移动控件中的第一控件时,获取所述第一控件在所述新的叠加层界面中的当前显示坐标;
根据所述当前显示坐标确定所述第一控件在所述待显示叠加层界面中对应的原始坐标;
根据所述原始坐标调用对应功能。
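下面给出一个"新显示区域到原始坐标"功能映射的示意性草图（类名与接口均为说明用途的假设，仅用于说明坐标映射与功能调用的流程）：

```java
import java.util.ArrayList;
import java.util.List;

// 示意性草图：生成新的叠加层界面时登记"新显示区域 -> 原始坐标"的映射；
// 用户点击移动后的控件时，先根据当前显示坐标查出原始坐标，再按原始坐标触发原有功能。
public class FunctionMapper {

    public interface OriginalDispatcher {
        void dispatchAt(int originalX, int originalY); // 按原始坐标调用对应功能
    }

    private static class Mapping {
        final int left, top, right, bottom; // 控件在新叠加层界面中的显示区域
        final int originalX, originalY;     // 控件在待显示叠加层界面中的原始坐标
        Mapping(int l, int t, int r, int b, int ox, int oy) {
            left = l; top = t; right = r; bottom = b; originalX = ox; originalY = oy;
        }
        boolean contains(int x, int y) {
            return x >= left && x <= right && y >= top && y <= bottom;
        }
    }

    private final List<Mapping> mappings = new ArrayList<>();

    // 位置调整时登记映射关系
    public void register(int left, int top, int right, int bottom,
                         int originalX, int originalY) {
        mappings.add(new Mapping(left, top, right, bottom, originalX, originalY));
    }

    // 用户点击新界面时，根据当前显示坐标转换为原始坐标后再分发
    public boolean onTap(int x, int y, OriginalDispatcher dispatcher) {
        for (Mapping m : mappings) {
            if (m.contains(x, y)) {
                dispatcher.dispatchAt(m.originalX, m.originalY);
                return true;
            }
        }
        return false; // 未命中映射，按普通点击处理
    }
}
```

在实际实现中，该映射也可以以控件标识为键来维护；这里按当前显示坐标到原始坐标的对应关系进行说明，以与上文流程保持一致。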
可选的，为了使用户左右手单手握持时都方便操控，在该实例中待移动控件还可以以悬浮控件（在本实施例中悬浮控件是指根据用户的拖动操作可以自由移动的控件）的方式显示在所述新的叠加层界面中，如图6所示，在该实例中可以将不方便操作的三个控件组合成图6的b部分中圆形的悬浮控件（其中，图6为手持终端竖屏操作，图7为手持终端横屏操作）。
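下面给出一个悬浮控件随用户拖动操作自由移动的位置更新示意性草图（不依赖任何具体平台的窗口或视图接口，字段与方法均为说明用途的假设）：

```java
// 示意性草图：悬浮控件按照触控点的位移更新自身位置，实现"随拖动自由移动"的效果。
public class FloatingControl {
    private int x, y;         // 悬浮控件当前位置
    private int lastX, lastY; // 上一次触控点位置

    public FloatingControl(int initialX, int initialY) {
        this.x = initialX;
        this.y = initialY;
    }

    public void onDragStart(int touchX, int touchY) {
        lastX = touchX;
        lastY = touchY;
    }

    // 每次拖动事件到来时，按触控点的位移更新悬浮控件位置
    public void onDragMove(int touchX, int touchY) {
        x += touchX - lastX;
        y += touchY - lastY;
        lastX = touchX;
        lastY = touchY;
    }

    public int getX() { return x; }
    public int getY() { return y; }
}
```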
实施例
图8所示为根据本发明一个具体实施方式的手持终端。该手持终端包括输入单元801、处理器802、输出单元803、通信单元804、存储单元805、外设接口806以及电源807。这些单元通过一条或多条总线进行通信。本领域技术人员可以理解,图中示出的手持终端的结构并不构成对本发明的限定,它既可以是总线形结构,也可以是星型结构,还可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。在本发明实施方式中,所述手持终端可以是任何移动或便携式手持终端,包括但不限于移动电话、移动电脑、平板电脑、个人数字助理(Personal Digital Assistant,PDA)、媒体播放器,以及上述两项或两项以上的组合等。
输入单元用于实现用户与手持终端的交互和/或信息输入到手持终端中。
例如,输入单元可以接收用户输入的数字或字符信息,以产生与用户设置或功能控制有关的信号输入。在本发明具体实施方式中,输入单元可以是触控面板,也可以是其他人机交互界面,例如实体输入键、麦克风等,还可是其他外部信息撷取装置,例如摄像头等。触控面板,也称为触摸屏或触控屏,可收集用户在其上触摸或接近的操作动作。比如用户使用手指、触笔等任何适合的物体或附件在触控面板上或接近触控面板的位置的操作动作,并根据预先设定的程式驱动相应的连接装置。可选的,触控面板可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸操作,并将检测到的触摸操作转换为电信号,以及将所述电信号传送给触摸控制器;触摸控制器从触摸检测装置上接收所述电信号,并将它转换成触点坐标,再送给处理器。所述触摸控制器还可以接收处理单元发来的命令并执行。此外,可以采用电阻式、电容式、红外线(Infrared)以及表面声波等多种类型实现触控面板。在本发明的其他实施方式中,输入单元所采用的实体输入键可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆等中的一种或多种。麦克风形式的输入单元可以收集用户或环境输入的语音并将其转换成电信号形式的、处理单元可执行的命令。
在本发明的其他一些实施方式中,所述输入单元还可以是各类传感器件,例如霍尔器件,用于侦测手持终端的物理量,例如力、力矩、压力、应力、位置、位移、速度、加速度、角度、角速度、转数、转速以及工作状态发生变化的时间等,转变成电量来进行检测和控制。其他的一些传感器件还可以包括重力感应计、三轴加速计、陀螺仪、电子罗盘、环境光传感器、接近传感器、温度传感器、湿度传感器、压力传感器、心率传感器、指纹识别器等。
输出单元包括但不限于影像输出单元和声音输出单元。影像输出单元用于输出文字、图片和/或视频。所述影像输出单元可包括显示面板,例如采用
LCD(Liquid Crystal Display,液晶显示器)、OLED(Organic Light-Emitting Diode,有机发光二极管)、场发射显示器(field emission display,简称FED)等形式来配置的显示面板。或者所述影像输出单元可以包括反射式显示器,例如电泳式(electrophoretic)显示器,或利用光干涉调变技术(Interferometric Modulation of Light)的显示器。所述影像输出单元可以包括单个显示器或不同尺寸的多个显示器。在本发明的具体实施方式中,上述输入单元所采用的触控面板亦可同时作为输出单元的显示面板。例如,当触控面板检测到在其上的触摸或接近的手势操作后,传送给处理单元以确定触摸事件的类型,随后处理单元根据触摸事件的类型在显示面板上提供相应的视觉输出。虽然在图8中,输入单元与输出单元是作为两个独立的部件来实现手持终端的输入和输出功能,但是在某些实施例中,可以将触控面板与显示面板集成一体而实现手持终端的输入和输出功能。例如,所述影像输出单元可以显示各种图形化用户接口(Graphical User Interface,简称GUI)以作为虚拟控制组件,包括但不限于窗口、卷动轴、图标及剪贴簿,以供用户通过触控方式进行操作。
在本发明具体实施方式中,影像输出单元包括滤波器及放大器,用来将处理单元所输出的视频滤波及放大。音频输出单元包括数字模拟转换器,用来将处理单元所输出的音频信号从数字格式转换为模拟格式。
处理器为手持终端的控制中心，利用各种接口和线路连接整个移动终端的各个部分，通过运行或执行存储在存储单元内的软件程序和/或模块，以及调用存储在存储单元内的数据，以执行移动终端的各种功能和/或处理数据。所述处理器可以由集成电路（Integrated Circuit，简称IC）组成，例如可以由单颗封装的IC所组成，也可以由连接多颗相同功能或不同功能的封装IC而组成。举例来说，处理器可以仅包括中央处理器（Central Processing Unit，简称CPU），也可以是GPU、数字信号处理器（Digital Signal Processor，简称
DSP)、及通信管理模块中的控制芯片(例如基带芯片)的组合。在本发明实施方式中,CPU可以是单运算核心,也可以包括多运算核心。
通信单元用于建立通信信道,使手持终端通过所述通信信道与远端手持终端或服务器进行语音通信、文字通信、数据通信。所述通信单元可以包括无线局域网(Wireless Local Area Network,简称wireless LAN)模块、蓝牙模块、基带(Base Band)模块等通信模块,以及所述通信模块对应的射频(Radio Frequency,简称RF)电路,用于进行无线局域网络通信、蓝牙通信、红外线通信及/或蜂窝式通信系统通信,例如宽带码分多重接入(Wideband Code Division Multiple Access,简称W-CDMA)及/或高速下行封包存取(High Speed Downlink Packet Access,简称HSDPA)。所述通信模块用于控制手持终端中的各组件的通信,并且可以支持直接内存存取(Direct Memory Access)。
在本发明的不同实施方式中,所述通信单元中的各种通信模块一般以集成电路芯片(Integrated Circuit Chip)的形式出现,并可进行选择性组合,而不必包括所有通信模块及对应的天线组。例如,所述通信单元可以仅包括基带芯片、射频芯片以及相应的天线以在一个蜂窝通信系统中提供通信功能。经由所述通信单元建立的无线通信连接,例如无线局域网接入或WCDMA接入,所述手持终端可以连接至蜂窝网(Cellular Network)或因特网(Internet)。
射频电路用于信息收发或通话过程中接收和发送信号。例如，将基站的下行信息接收后，交给处理单元处理；另外，将涉及上行的数据发送给基站。通常，所述射频电路包括用于执行这些功能的公知电路，包括但不限于天线系统、射频收发机、一个或多个放大器、调谐器、一个或多个振荡器、数字信号处理器、编解码（Codec）芯片组、用户身份模块（SIM）卡、存储器等等。此外，射频电路还可以通过无线通信与网络和其他设备通信。所述无线通信可以使用任一通信标准或协议，包括但不限于GSM（Global System of
Mobile communication,全球移动通讯系统)、GPRS(General Packet Radio Service,通用分组无线服务)、CDMA(Code Division Multiple Access,码分多址)、WCDMA(Wideband Code Division Multiple Access,宽带码分多址)、高速下行链路分组接入技术(High Speed Downlink Packet Access,HSDPA)、LTE(Long Term Evolution,长期演进)、电子邮件、SMS(Short Messaging Service,短消息服务)等。
存储单元可用于存储软件程序以及模块,处理单元通过运行存储在存储单元的软件程序以及模块,从而执行手持终端的各种功能应用以及实现数据处理。存储单元主要包括程序存储区和数据存储区,其中,程序存储区可存储操作系统、至少一个功能所需的应用程序,比如声音播放程序、图像播放程序等等;数据存储区可存储根据手持终端的使用所创建的数据(比如音频数据、电话本等)等。在本发明具体实施方式中,存储单元可以包括易失性存储器,例如非挥发性动态随机存取内存(Nonvolatile Random Access Memory,简称NVRAM)、相变化随机存取内存(Phase Change RAM,简称PRAM)、磁阻式随机存取内存(Magetoresistive RAM,简称MRAM)等,还可以包括非易失性存储器,例如至少一个磁盘存储器件、电子可擦除可编程只读存储器(Electrically Erasable Programmable Read-Only Memory,简称EEPROM)、闪存器件,例如反或闪存(NOR flash memory)或是反及闪存(NAND flash memory)。非易失存储器储存处理单元所执行的操作系统及应用程序。所述处理单元从所述非易失存储器加载运行程序与数据到内存并将数字内容储存于大量储存装置中。所述操作系统包括用于控制和管理常规系统任务,例如内存管理、存储设备控制、电源管理等,以及有助于各种软硬件之间通信的各种组件和/或驱动器。
在本发明实施方式中,所述操作系统可以是Google公司的Android系统、
Apple公司开发的iOS系统或Microsoft公司开发的Windows系统/Windows Phone系统等,或者是Vxworks这类的嵌入式操作系统。
所述应用程序包括安装在手持终端上的任何应用,包括但不限于浏览器、电子邮件、即时消息服务、文字处理、键盘虚拟、窗口小部件(Widget)、加密、数字版权管理、语音识别、语音复制、定位(例如由全球定位系统提供的功能)、音乐播放等等。
电源用于给手持终端的不同部件进行供电以维持其运行。作为一般性理解,所述电源可以是内置的电池,例如常见的锂离子电池、镍氢电池等,也包括直接向手持终端供电的外接电源,例如AC适配器等。在本发明的一些实施方式中,所述电源还可以作更为广泛的定义,例如还可以包括电源管理系统、充电系统、电源故障检测电路、电源转换器或逆变器、电源状态指示器(如发光二极管),以及与手持终端的电能生成、管理及分布相关联的其他任何组件。
基于图8所示的结构,为了实现图1所示实施例的方案,具体实现可以是:
输入单元801,当手持终端在全屏沉浸模式下用于检测是否有满足第一预设条件的第一触控操作;
在该实施例中,该输入单元801主要用于接收和检测输入的信息,具体实现时可以包括多种实体结构,在该处实现触控操作检测的可以是触控屏等可以识别触控操作的实体结构。
处理器802调用存储单元805中的程序,如果有满足第一预设条件的第一触控操作,实现获取所述全屏沉浸模式对应应用程序的待显示叠加层界面;并确定所述待显示叠加层界面中显示的控件;当确定用户当前对所述手持终端是单手握持方式,则确定与所述单手握持方式对应的显示基准点;从所述
控件中确定待移动控件;对所述待移动控件的显示位置进行调整,并根据调整后的控件位置生成新的叠加层界面,使用所述新的叠加层界面替换所述待显示叠加层界面;在新的叠加层界面中,所述待移动控件的显示位置与所述显示基准点之间的距离小于设定阈值;用户操作位置调整前后的所述待移动控件所述手持终端执行的功能相同;其中,所述待显示的叠加层界面显示时,叠加在所述当前显示内容之上显示;所述显示基准点与用户握持的所述手持终端的侧边之间的距离小于设定阈值,所述握持方式包括左手握持或右手握持。
可选的,处理器具体用于确定用户握持的所述手持终端的侧边;基于所述侧边在所述手持终端上确定一点作为所述显示基准点。
为了实现基于所述侧边在所述手持终端上确定一点作为所述显示基准点,具体的该处理器具体用于获取所述第一触控操作的触控轨迹,并根据所述触控轨迹的终点相对于起始点的位置确定所述触控轨迹对应的滑动方向;基于所述侧边和所述滑动方向在所述手持终端上确定一点作为所述显示基准点。
可选的,输入单元还用于检测触摸信号,使得处理器根据该触摸信号确定用户握持所述手持终端的侧边。
在该处输入单元实现触摸信号检测,则对应的实体结构可以是触摸传感器等可以识别触摸信号的实体结构。
可选的,处理器具体用于检测所述控件中的每一个的待显示位置与所述显示基准点之间的距离值,当任一控件的待显示位置与所述显示基准点之间的距离值大于所述设定阈值,则确定所述任一控件为所述待移动控件;或者输出所述待显示叠加层界面,并检测是否有满足第二预设条件的第二触控操作,根据所述第二触控操作从所述控件中确定所述待移动控件。
可选的,处理器还用于当用户操作位置调整后的所述待移动控件中的第一控件时,获取所述第一控件在所述新的叠加层界面中的当前显示坐标;根据所述当前显示坐标确定所述第一控件在所述待显示叠加层界面中对应的原始坐标;根据所述原始坐标调用对应功能。
可选的，所述处理器还用于将所述待移动控件以悬浮控件的方式显示在所述新的叠加层界面中。
实施例
如图9所示,本发明实施例还提供一种手持终端,所述手持终端包括:
获取模块901,用于当手持终端在全屏沉浸模式下检测到了满足第一预设条件的第一触控操作,获取所述全屏沉浸模式对应应用程序的待显示叠加层界面;其中,所述待显示的叠加层界面显示时,叠加在所述当前显示内容之上显示;
第一控件确定模块902,用于确定所述待显示叠加层界面中显示的控件;
基准点确定模块903,用于当确定用户当前对所述手持终端是单手握持方式,则确定与所述单手握持方式对应的显示基准点;其中,所述显示基准点与用户握持的所述手持终端的侧边之间的距离小于设定阈值,所述握持方式包括左手握持或右手握持;
第二控件确定模块904,用于从所述控件中确定待移动控件;
调整模块905,用于对所述待移动控件的显示位置进行调整,并根据调整后的控件位置生成新的叠加层界面,使用所述新的叠加层界面替换所述待显示叠加层界面;其中,在新的叠加层界面中,所述待移动控件的显示位置与所述显示基准点之间的距离小于设定阈值;用户操作位置调整前后的所述待移动控件所述手持终端执行的功能相同。
可选的,基准点确定模块903确定与所述单手握持方式对应的显示基准
点包括:
确定用户握持所述手持终端的侧边;基于所述侧边在所述手持终端上确定一点作为所述显示基准点。
进一步,该基准点确定模块903具体用于获取所述第一触控操作的触控轨迹,并根据所述触控轨迹的终点相对于起始点的位置确定所述触控轨迹对应的滑动方向;基于所述侧边和所述滑动方向在所述手持终端上确定一点作为所述显示基准点。
可选的,所述手持终端的侧边设置有触摸传感器,则该基准点确定模块903还用于通过所述侧边上的所述触摸传感器检测到触摸信号确定用户握持所述手持终端的侧边。
可选的,第二控件确定模块904具体用于检测所述控件中的每一个的待显示位置与所述显示基准点之间的距离值,当任一控件的待显示位置与所述显示基准点之间的距离值大于所述设定阈值,则确定所述任一控件为所述待移动控件;或者输出所述待显示叠加层界面,并检测是否有满足第二预设条件的第二触控操作,根据所述第二触控操作从所述控件中确定所述待移动控件。
可选的,该手持终端还包括:
功能映射模块,用于当用户操作位置调整后的所述待移动控件中的第一控件时,获取所述第一控件在所述新的叠加层界面中的当前显示坐标;根据所述当前显示坐标确定所述第一控件在所述待显示叠加层界面中对应的原始坐标;根据所述原始坐标调用对应功能。
本申请实施例中的上述一个或多个技术方案,至少具有如下的技术效果:
本发明实施例提供的方法和装置,为了实现在用户点击屏幕之后,应用程序本应显示的菜单/按钮/选项移动到方便手指操控的区域,需要操作系统对
待显示的叠加层界面进行处理,处理之后再呈现出来。移动之后的控件与移动前的相应的控件功能相同,用户点击移动后的控件时,原有位置控件的相应功能能够正常触发。所以本发明实施例提供的菜单更方便用户触控,而且界面清爽美观。
本发明是参照根据本发明实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
显然,本领域的技术人员可以对本发明进行各种改动和变型而不脱离本发明的精神和范围。这样,倘若本发明的这些修改和变型属于本发明权利要求及其等同技术的范围之内,则本发明也意图包含这些改动和变型在内。
Claims (14)
- 一种用户界面的菜单显示方法,其特征在于,所述方法包括:当手持终端在全屏沉浸模式下检测到了满足第一预设条件的第一触控操作,获取所述全屏沉浸模式对应应用程序的待显示叠加层界面;其中,所述待显示的叠加层界面显示时,叠加在所述手持终端当前显示内容之上显示;确定所述待显示叠加层界面中显示的控件;当确定用户当前对所述手持终端是单手握持方式,则确定与所述单手握持方式对应的显示基准点;其中,所述显示基准点与用户握持的所述手持终端的侧边之间的距离小于设定阈值,所述握持方式包括左手握持或右手握持;从所述控件中确定待移动控件;对所述待移动控件的显示位置进行调整,并根据调整后的控件位置生成新的叠加层界面,使用所述新的叠加层界面替换所述待显示叠加层界面;其中,在新的叠加层界面中,所述待移动控件的显示位置与所述显示基准点之间的距离小于设定阈值;用户操作位置调整前后的所述待移动控件所述手持终端执行的功能相同。
- 如权利要求1所述的方法,其特征在于,所述确定与所述单手握持方式对应的显示基准点包括:确定用户握持的所述手持终端的侧边;基于所述侧边在所述手持终端上确定一点作为所述显示基准点。
- 如权利要求2所述的方法,其特征在于,基于所述侧边在所述手持终端上确定一点作为所述显示基准点包括:获取所述第一触控操作的触控轨迹,并根据所述触控轨迹的终点相对于起始点的位置确定所述触控轨迹对应的滑动方向;基于所述侧边和所述滑动方向在所述手持终端上确定一点作为所述显示基准点。
- 如权利要求2或3所述的方法,其特征在于,确定用户握持所述手持终端的侧边包括:所述手持终端的侧边设置有触摸传感器,通过所述侧边上的所述触摸传感器检测到触摸信号确定用户握持所述手持终端的侧边。
- 如权利要求1~4任一所述的方法,其特征在于,从所述控件中确定待移动控件包括:检测所述控件中的每一个的待显示位置与所述显示基准点之间的距离值,当任一控件的待显示位置与所述显示基准点之间的距离值大于所述设定阈值,则确定所述任一控件为所述待移动控件;或者输出所述待显示叠加层界面,并检测是否有满足第二预设条件的第二触控操作,根据所述第二触控操作从所述控件中确定所述待移动控件。
- 如权利要求1~5任一所述的方法,其特征在于,使用所述新的叠加层界面替换所述待显示叠加层界面之后,进一步包括:当用户操作位置调整后的所述待移动控件中的第一控件时,获取所述第一控件在所述新的叠加层界面中的当前显示坐标;根据所述当前显示坐标确定所述第一控件在所述待显示叠加层界面中对应的原始坐标;根据所述原始坐标调用对应功能。
- 如权利要求1~6任一所述的方法,其特征在于,该方法还包括:所述待移动控件以悬浮控件的方式显示在所述新的叠加层界面中。
- 一种手持终端,其特征在于,所述手持终端包括:输入单元,当手持终端在全屏沉浸模式下用于检测是否有满足第一预设 条件的第一触控操作;处理器,如果有满足第一预设条件的第一触控操作,用于获取所述全屏沉浸模式对应应用程序的待显示叠加层界面;并确定所述待显示叠加层界面中显示的控件;当确定用户当前对所述手持终端是单手握持方式,则确定与所述单手握持方式对应的显示基准点;从所述控件中确定待移动控件;对所述待移动控件的显示位置进行调整,并根据调整后的控件位置生成新的叠加层界面,使用所述新的叠加层界面替换所述待显示叠加层界面;在新的叠加层界面中,所述待移动控件的显示位置与所述显示基准点之间的距离小于设定阈值;用户操作位置调整前后的所述待移动控件所述手持终端执行的功能相同;其中,所述待显示的叠加层界面显示时,叠加在所述当前显示内容之上显示;所述显示基准点与用户握持的所述手持终端的侧边之间的距离小于设定阈值,所述握持方式包括左手握持或右手握持。
- 如权利要求8所述的手持终端,其特征在于,所述处理器确定与所述单手握持方式对应的显示基准点具体包括确定用户握持的所述手持终端的侧边;基于所述侧边在所述手持终端上确定一点作为所述显示基准点。
- 如权利要求9所述的手持终端,其特征在于,所述处理器基于所述侧边在所述手持终端上确定一点作为所述显示基准点具体包括获取所述第一触控操作的触控轨迹,并根据所述触控轨迹的终点相对于起始点的位置确定所述触控轨迹对应的滑动方向;基于所述侧边和所述滑动方向在所述手持终端上确定一点作为所述显示基准点。
- 如权利要求9或10所述的手持终端,其特征在于,所述输入单元还用于检测触摸信号,使得处理器根据该触摸信号确定用户握持所述手持终端的侧边。
- 如权利要求8~11任一所述的手持终端,其特征在于,所述处理器从 所述控件中确定待移动控件具体包括检测所述控件中的每一个的待显示位置与所述显示基准点之间的距离值,当任一控件的待显示位置与所述显示基准点之间的距离值大于所述设定阈值,则确定所述任一控件为所述待移动控件;或者输出所述待显示叠加层界面,并检测是否有满足第二预设条件的第二触控操作,根据所述第二触控操作从所述控件中确定所述待移动控件。
- 如权利要求8~12任一所述的手持终端,其特征在于,所述处理器使用所述新的叠加层界面替换所述待显示叠加层界面之后,还用于当用户操作位置调整后的所述待移动控件中的第一控件时,获取所述第一控件在所述新的叠加层界面中的当前显示坐标;根据所述当前显示坐标确定所述第一控件在所述待显示叠加层界面中对应的原始坐标;根据所述原始坐标调用对应功能。
- 如权利要求8~13任一所述的手持终端，其特征在于，所述处理器还用于将所述待移动控件以悬浮控件的方式显示在所述新的叠加层界面中。
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201580085533.7A CN108475156A (zh) | 2015-12-31 | 2015-12-31 | 一种用户界面的菜单显示方法及手持终端 |
| PCT/CN2015/100296 WO2017113379A1 (zh) | 2015-12-31 | 2015-12-31 | 一种用户界面的菜单显示方法及手持终端 |
| US16/067,128 US20190018555A1 (en) | 2015-12-31 | 2015-12-31 | Method for displaying menu on user interface and handheld terminal |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2015/100296 WO2017113379A1 (zh) | 2015-12-31 | 2015-12-31 | 一种用户界面的菜单显示方法及手持终端 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017113379A1 true WO2017113379A1 (zh) | 2017-07-06 |
Family
ID=59224258
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2015/100296 Ceased WO2017113379A1 (zh) | 2015-12-31 | 2015-12-31 | 一种用户界面的菜单显示方法及手持终端 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20190018555A1 (zh) |
| CN (1) | CN108475156A (zh) |
| WO (1) | WO2017113379A1 (zh) |
Families Citing this family (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102481643B1 (ko) * | 2017-01-31 | 2022-12-28 | 삼성전자주식회사 | 디스플레이 제어 방법 및 전자 장치 |
| JP6932267B2 (ja) * | 2018-08-21 | 2021-09-08 | 株式会社ソニー・インタラクティブエンタテインメント | コントローラ装置 |
| CN110597427B (zh) * | 2019-09-10 | 2021-07-20 | Oppo广东移动通信有限公司 | 应用管理方法、装置、计算机设备以及存储介质 |
| CN111124247A (zh) * | 2019-12-26 | 2020-05-08 | 上海传英信息技术有限公司 | 控制界面显示方法、移动终端及存储介质 |
| CN111273984A (zh) * | 2020-01-20 | 2020-06-12 | 深圳震有科技股份有限公司 | 一种数值控件的扩展方法、存储介质及终端设备 |
| CN113448479B (zh) * | 2020-03-25 | 2024-03-12 | Oppo广东移动通信有限公司 | 单手操作模式开启方法、终端及计算机存储介质 |
| CN112083858A (zh) * | 2020-08-31 | 2020-12-15 | 珠海格力电器股份有限公司 | 控件的显示位置调整方法及装置 |
| CN114253433B (zh) * | 2020-09-24 | 2024-09-24 | 荣耀终端有限公司 | 一种动态元素控制方法、电子设备和计算机可读存储介质 |
| CN112995401A (zh) * | 2021-02-25 | 2021-06-18 | 北京字节跳动网络技术有限公司 | 控件显示方法、装置、设备及介质 |
| CN113110783B (zh) * | 2021-04-16 | 2022-05-20 | 北京字跳网络技术有限公司 | 控件的显示方法、装置、电子设备和存储介质 |
| CN115793928A (zh) * | 2021-09-09 | 2023-03-14 | 北京字跳网络技术有限公司 | 页面切换方法、装置、设备及存储介质 |
| USD1057761S1 (en) * | 2022-06-02 | 2025-01-14 | Evernorth Strategic Development, Inc. | Display screen with moveable icons |
| CN115501581B (zh) * | 2022-09-29 | 2025-07-04 | 网易(杭州)网络有限公司 | 一种游戏控制方法、装置、计算机设备及存储介质 |
Family Cites Families (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB0624885D0 (en) * | 2006-12-13 | 2007-01-24 | Compurants Ltd | Restaurant concept |
| KR20090022297A (ko) * | 2007-08-30 | 2009-03-04 | 삼성전자주식회사 | 디스플레이 제어 방법, 이를 이용한 디스플레이 장치 및디스플레이 시스템 |
| KR20110069476A (ko) * | 2009-12-17 | 2011-06-23 | 주식회사 아이리버 | 사용자 그립 상태를 반영하여 조작가능한 핸드헬드 전자기기 및 조작방법 |
| EP2393000B1 (en) * | 2010-06-04 | 2019-08-07 | Lg Electronics Inc. | Mobile terminal capable of providing multiplayer game and method of controlling operation of the mobile terminal |
| WO2011150886A2 (zh) * | 2011-06-23 | 2011-12-08 | 华为终端有限公司 | 手持式终端设备用户界面自动切换方法及手持式终端设备 |
| JP2013218428A (ja) * | 2012-04-05 | 2013-10-24 | Sharp Corp | 携帯型電子機器 |
| KR101979666B1 (ko) * | 2012-05-15 | 2019-05-17 | 삼성전자 주식회사 | 표시부에 출력되는 입력 영역 운용 방법 및 이를 지원하는 단말기 |
| KR102044829B1 (ko) * | 2012-09-25 | 2019-11-15 | 삼성전자 주식회사 | 휴대단말기의 분할화면 처리장치 및 방법 |
| US20140137036A1 (en) * | 2012-11-15 | 2014-05-15 | Weishan Han | Operation Window for Portable Devices with Touchscreen Displays |
| CN103309604A (zh) * | 2012-11-16 | 2013-09-18 | 中兴通讯股份有限公司 | 一种终端及终端屏幕显示信息控制方法 |
| US8769431B1 (en) * | 2013-02-28 | 2014-07-01 | Roy Varada Prasad | Method of single-handed software operation of large form factor mobile electronic devices |
| US20140362119A1 (en) * | 2013-06-06 | 2014-12-11 | Motorola Mobility Llc | One-handed gestures for navigating ui using touch-screen hover events |
| JP5759660B2 (ja) * | 2013-06-21 | 2015-08-05 | レノボ・シンガポール・プライベート・リミテッド | タッチ・スクリーンを備える携帯式情報端末および入力方法 |
| EP3079340B1 (en) * | 2013-12-03 | 2019-03-06 | Huawei Technologies Co., Ltd. | Processing method and apparatus, and terminal |
| KR20150071130A (ko) * | 2013-12-18 | 2015-06-26 | 삼성전자주식회사 | 휴대단말기에서 스크롤을 제어하는 방법 및 장치 |
| US9851883B2 (en) * | 2014-02-17 | 2017-12-26 | Xerox Corporation | Method and apparatus for adjusting and moving a user interface for single handed use on an endpoint device |
| KR102238330B1 (ko) * | 2014-05-16 | 2021-04-09 | 엘지전자 주식회사 | 디스플레이 장치 및 그의 동작 방법 |
| US20170199662A1 (en) * | 2014-05-26 | 2017-07-13 | Huawei Technologies Co., Ltd. | Touch operation method and apparatus for terminal |
| WO2016058092A1 (en) * | 2014-10-16 | 2016-04-21 | Griffin Innovation | Mobile device systems and methods |
| CN105528169A (zh) * | 2014-10-23 | 2016-04-27 | 中兴通讯股份有限公司 | 一种触摸屏设备和对触摸屏设备进行操作的方法 |
| US10082936B1 (en) * | 2014-10-29 | 2018-09-25 | Amazon Technologies, Inc. | Handedness determinations for electronic devices |
| US10444977B2 (en) * | 2014-12-05 | 2019-10-15 | Verizon Patent And Licensing Inc. | Cellphone manager |
| US20160162149A1 (en) * | 2014-12-05 | 2016-06-09 | Htc Corporation | Mobile electronic device, method for displaying user interface, and recording medium thereof |
| EP3210098A1 (en) * | 2015-01-28 | 2017-08-30 | Huawei Technologies Co., Ltd. | Hand or finger detection device and a method thereof |
| WO2016138661A1 (zh) * | 2015-03-05 | 2016-09-09 | 华为技术有限公司 | 终端的用户界面的处理方法、用户界面和终端 |
| CN106796474B (zh) * | 2015-05-19 | 2020-07-24 | 华为技术有限公司 | 一种用于识别用户操作模式的方法及移动终端 |
| GB2557084A (en) * | 2015-08-20 | 2018-06-13 | Motorola Solutions Inc | Method and apparatus for changing a mode of a device from a right-hand mode to a left-hand mode, and vice versa, or to a normal mode to a handedness mode |
| EP3349115A4 (en) * | 2015-09-29 | 2018-11-14 | Huawei Technologies Co., Ltd. | Human machine interaction method and device for user terminal, and user terminal |
| US10782793B2 (en) * | 2017-08-10 | 2020-09-22 | Google Llc | Context-sensitive hand interaction |
- 2015-12-31 WO PCT/CN2015/100296 patent/WO2017113379A1/zh not_active Ceased
- 2015-12-31 CN CN201580085533.7A patent/CN108475156A/zh active Pending
- 2015-12-31 US US16/067,128 patent/US20190018555A1/en not_active Abandoned
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104077067A (zh) * | 2013-03-28 | 2014-10-01 | 深圳市快播科技有限公司 | 基于具有触摸屏的装置的播放方法及系统 |
| CN104714731A (zh) * | 2013-12-12 | 2015-06-17 | 中兴通讯股份有限公司 | 终端界面的显示方法及装置 |
| US20150212656A1 (en) * | 2014-01-29 | 2015-07-30 | Acer Incorporated | Portable apparatus and method for adjusting window size thereof |
| CN104185053A (zh) * | 2014-08-05 | 2014-12-03 | 百度在线网络技术(北京)有限公司 | 音视频播放方法和装置 |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108549516A (zh) * | 2018-04-12 | 2018-09-18 | 北京奇艺世纪科技有限公司 | 一种界面布局调整方法及装置 |
| CN111078114A (zh) * | 2019-12-26 | 2020-04-28 | 上海传英信息技术有限公司 | 单手控制方法、控制装置及终端设备 |
| CN111580920A (zh) * | 2020-05-14 | 2020-08-25 | 网易(杭州)网络有限公司 | 应用程序的界面显示方法、装置及电子设备 |
| CN111580920B (zh) * | 2020-05-14 | 2022-07-19 | 网易(杭州)网络有限公司 | 应用程序的界面显示方法、装置及电子设备 |
| CN114661404A (zh) * | 2022-03-31 | 2022-06-24 | Oppo广东移动通信有限公司 | 调节控件的控制方法、装置、电子设备以及存储介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| US20190018555A1 (en) | 2019-01-17 |
| CN108475156A (zh) | 2018-08-31 |
Legal Events

| Code | Title | Description |
|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15912003; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15912003; Country of ref document: EP; Kind code of ref document: A1 |