
WO2019071419A1 - Multi-screen-based information display method, and terminal - Google Patents

Multi-screen-based information display method, and terminal

Info

Publication number
WO2019071419A1
WO2019071419A1 · PCT/CN2017/105490 · CN2017105490W
Authority
WO
WIPO (PCT)
Prior art keywords
display
display screen
external display
input
display area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/105490
Other languages
English (en)
Chinese (zh)
Inventor
张献中
黄成钟
郑雪瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Transsion Communication Co Ltd
Original Assignee
Shenzhen Transsion Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Transsion Communication Co Ltd filed Critical Shenzhen Transsion Communication Co Ltd
Priority to CN201780095813.5A priority Critical patent/CN111201507B/zh
Priority to PCT/CN2017/105490 priority patent/WO2019071419A1/fr
Publication of WO2019071419A1 publication Critical patent/WO2019071419A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present application relates to the field of electronic display technologies, and in particular, to a method and a terminal for displaying information based on multiple screens.
  • Mobile terminals integrate communication, entertainment, reading, and work. More and more people like to use mobile terminals to simultaneously browse news, play games, watch movies, or use social applications such as WeChat and QQ; as a result, mobile terminals connected to at least one external display screen have been enthusiastically sought after and widely used.
  • the present application provides a multi-screen based information display method and terminal.
  • This conveniently enables the user to quickly select an appropriate external display screen to display information.
  • the present application provides a multi-screen based information display method, the method comprising: detecting a first input for a target icon displayed in a first display screen; respectively displaying, in at least one display area, information for indicating at least one external display screen; detecting a second input for the target icon; and displaying the object represented by the target icon on the external display screen corresponding to the display area for which the second input is directed.
  • the application provides a terminal, where the terminal includes:
  • a first detecting unit configured to detect a first input (such as a long press) for the target icon, where the target icon is displayed in the first display screen;
  • a first display unit configured to respectively display information for indicating the at least one external display screen in at least one display area
  • a second detecting unit configured to detect a second input (such as dragging, sliding, moving, gesture, or releasing operation) for the target icon;
  • a second display unit configured to display an object represented by the target icon in an external display corresponding to the display area for which the second input is directed.
  • the present application provides another terminal, including a processor, an input device, an output device, and a memory, which are connected to each other, wherein the memory is used to store application code supporting the terminal in executing the above method, and the processor is configured to perform the method of the first aspect above.
  • the present application provides a computer readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the method of the first aspect described above.
  • the terminal detects a first input for the target icon, where the target icon is displayed in the first display screen; then, information for indicating the at least one external display screen is respectively displayed in the at least one display area; then a second input for the target icon is detected; finally, the object represented by the target icon is displayed on the external display screen corresponding to the display area for which the second input is directed.
  • the terminal provides display areas that display information for representing at least one external display screen, thereby facilitating the user to select, according to the information of the corresponding external display screen in the display area, an appropriate external display screen to display the object represented by the icon.
  • further, according to the type of the object represented by the target icon, the information of the external display screens matching that object is displayed differently, which helps the user quickly select, according to the information of the external display screens in the display areas, an appropriate external display screen to display the object represented by the icon.
  • in addition, a display box popped up in the display area outputs information of the external display screens that match the object represented by the target icon, likewise helping the user quickly select, according to the information of the external display screens in the display areas, an appropriate external display screen to display the object represented by the icon. This greatly enhances the user experience.
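The four-step flow described above can be sketched in code. This is an illustrative sketch only; the class, method, and field names are assumptions, since the application describes behavior rather than an API.

```python
class MultiScreenDisplayController:
    """Illustrative sketch of the four-step flow: detect a first input on a
    target icon, show external-display information in display areas, detect a
    second input, then show the icon's object on the chosen external display."""

    def __init__(self, external_displays):
        # Hypothetical map of display-area id -> external display name,
        # e.g. {203: "External Display 3"} as in FIG. 2A.
        self.external_displays = external_displays
        self.area_info = {}

    def on_first_input(self, target_icon):
        # Steps 1-2: a first input (e.g. a long press) on the target icon is
        # detected, and information indicating each external display screen is
        # displayed in its display area.
        self.area_info = {area: f"name: {name}"
                          for area, name in self.external_displays.items()}
        return self.area_info

    def on_second_input(self, target_icon, area_id):
        # Steps 3-4: a second input (e.g. a drag onto a display area) is
        # detected, and the object the icon represents is routed to the
        # external display corresponding to that display area.
        if area_id not in self.external_displays:
            return None
        return (self.external_displays[area_id], target_icon.obj)


class Icon:
    def __init__(self, obj):
        self.obj = obj


controller = MultiScreenDisplayController({203: "External Display 3"})
icon = Icon("movie.mp4")
controller.on_first_input(icon)
print(controller.on_second_input(icon, 203))  # ('External Display 3', 'movie.mp4')
```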
  • FIG. 1 is a schematic flow chart of a multi-screen based information display method provided by the present application.
  • FIG. 2A is an interface display diagram of another multi-screen based information display method provided by the present application.
  • FIG. 2B is an interface display diagram of still another multi-screen based information display method provided by the present application.
  • FIG. 2C is an interface display diagram of still another multi-screen based information display method provided by the present application.
  • FIG. 2D is an interface display diagram of still another multi-screen based information display method provided by the present application.
  • FIG. 2E is an interface display diagram of still another multi-screen based information display method provided by the present application.
  • FIG. 2F is an interface display diagram of still another multi-screen based information display method provided by the present application.
  • FIG. 2G is an interface display diagram of still another multi-screen based information display method provided by the present application.
  • FIG. 2H is an interface display diagram of still another multi-screen based information display method provided by the present application.
  • FIG. 3 is a functional block diagram of a terminal provided by the present application.
  • FIG. 4 is a schematic block diagram of a terminal provided by the present application.
  • the term “if” can be interpreted as “when”, “once”, “in response to determining”, or “in response to detecting”, depending on the context. Similarly, the phrase “if determined” or “if [the condition or event described] is detected” may be interpreted in context to mean “once determined”, “in response to determining”, “once [the condition or event described] is detected”, or “in response to detecting [the condition or event described]”.
  • the terminal described in the embodiments of the present application includes, but is not limited to, other portable devices such as a mobile phone, a laptop computer or a tablet computer with a touch sensitive surface (for example, a touch screen display and/or a touch pad).
  • in some embodiments, the device is not a portable communication device, but a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).
  • the terminal including a display and a touch sensitive surface is described.
  • the terminal can include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
  • the terminal supports a variety of applications, such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk burning applications, spreadsheet applications, gaming applications, phone applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
  • Various applications that can be executed on the terminal can use at least one common physical user interface device such as a touch sensitive surface.
  • One or more functions of the touch sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed within the application and/or within the respective application.
  • in this way, the common physical architecture of the terminal (e.g., the touch-sensitive surface) can support the variety of applications with a user interface that is intuitive to the user.
  • FIG. 1 is a schematic flowchart of a multi-screen based information display method provided by the present application. As shown in FIG. 1 , the method includes:
  • the terminal detects a first input for the target icon, where the target icon is displayed in the first display screen.
  • the target icon can represent a file, program, web page or command, and the like.
  • the user can execute a command by clicking, double-clicking, dragging, or long-pressing an icon (for example, opening the application represented by the icon).
  • the terminal includes a first display screen, and the target icon is the icon targeted by the first input, which may be any icon in the first display screen.
  • the first display screen may be at least one touch screen of the terminal.
  • the first input may be a touch operation, such as a click, a double tap, or a long press, and the terminal may detect the first input through the touch screen.
  • the first input may be a somatosensory operation (including a gesture operation), such as an operation in which the hand selects the target in the air.
  • the terminal can detect the first input through an infrared ranging sensor in the camera.
  • the first input may also be a voice input; for example, for the specific voice input “Open WeChat in an external display”, the target icon of the voice input is the WeChat icon.
  • the terminal can detect the first input through the voice recognition module (i.e., the voice recognition module recognizes each voice message).
  • the first input may also be other forms of user input, which is not limited herein.
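The first-input channels enumerated above (touch screen, infrared ranging sensor for somatosensory gestures, voice recognition module) could be unified by a single dispatcher. The sketch below is a hypothetical illustration; the event fields and the voice-parsing rule are assumptions, not part of the application.

```python
# Hypothetical dispatch of the first input across the detection channels the
# application enumerates. Returns (input kind, target icon) or None.
def classify_first_input(event):
    if event.get("source") == "touchscreen" and \
            event.get("gesture") in ("click", "double_tap", "long_press"):
        return ("touch", event["icon"])
    if event.get("source") == "infrared" and event.get("gesture") == "air_select":
        # Somatosensory operation detected via the infrared ranging sensor.
        return ("somatosensory", event["icon"])
    if event.get("source") == "voice":
        # e.g. "Open WeChat in an external display" -> target icon is WeChat.
        text = event.get("text", "")
        if text.startswith("Open ") and " in an external display" in text:
            app = text[len("Open "):text.index(" in an external display")]
            return ("voice", app)
    return None


print(classify_first_input(
    {"source": "touchscreen", "gesture": "long_press", "icon": "WeChat"}))
# ('touch', 'WeChat')
print(classify_first_input(
    {"source": "voice", "text": "Open WeChat in an external display"}))
# ('voice', 'WeChat')
```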
  • the terminal includes a first display screen and at least one external display screen, and the display area may include at least one of a two-dimensional display area or a three-dimensional space display area on the first display screen.
  • a scheme for displaying information for representing the at least one external display screen in at least one display area is further explained below in conjunction with FIGS. 2A-2H.
  • FIG. 2A exemplarily shows a scheme in which information for representing the at least one external display screen is separately displayed in at least one display area of the first display screen.
  • the information of the external display screen may include parameters such as an appearance picture, a brand name, a size, a color, and a shape of the external display.
  • the first display screen 201 is displayed with an icon 202 and includes at least one display area 203 on which the name "External Display 3" of the external display screen 3 is displayed.
  • the examples are merely illustrative of the application and should not be construed as limiting.
  • when an external display screen reaches its display threshold, the information of that external display screen is removed from the display.
  • the display threshold of the external display is related to the display capability of the external display.
  • the display capability is how many objects can be displayed simultaneously on the external display.
  • for example, the display capability of the external display 3 is that it can display two text files simultaneously. That is to say, the display threshold of the external display 3 is displaying two documents at the same time; when the external display 3 simultaneously displays two documents, the external display 3 reaches its display threshold.
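The display-threshold rule above can be sketched as follows. This is a minimal illustration under assumed field names; the application only states that a screen at its threshold has its information removed from the display areas.

```python
# Sketch of the display-threshold rule: each external display has a display
# capability (how many objects it can show at once); once that threshold is
# reached, its information no longer appears among the selectable areas.
class ExternalDisplay:
    def __init__(self, name, threshold):
        self.name = name
        self.threshold = threshold      # e.g. external display 3 can show 2 documents
        self.shown_objects = []

    def at_threshold(self):
        return len(self.shown_objects) >= self.threshold


def selectable_displays(displays):
    """Only displays below their threshold remain shown in display areas."""
    return [d.name for d in displays if not d.at_threshold()]


d3 = ExternalDisplay("External Display 3", threshold=2)
d4 = ExternalDisplay("External Display 4", threshold=1)
d3.shown_objects = ["doc1.txt", "doc2.txt"]   # threshold reached
print(selectable_displays([d3, d4]))          # ['External Display 4']
```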
  • This implementation is not only applicable to the embodiment corresponding to FIG. 2A, but also to the embodiment corresponding to FIG. 2B-2H.
  • FIG. 2A is only for explaining the present application and should not be construed as limiting.
  • the object represented by the icon can be a file, a program, a web page or a command, and the like.
  • the first display screen 201 displays an icon 202 and includes a first position 204 and a second position 205.
  • the first position 204 is distributed with six display areas such as the display area 203, and the second position 205 is distributed with six display areas such as the display area 209.
  • Information for indicating a first external display screen is displayed at the first location 204, and information for indicating a second external display screen is displayed at the second location 205, wherein the first external display screen includes six external display screens, namely external display 1, external display 2, external display 3, external display 4, external display 5, and external display 6, and the second external display screen includes six external display screens, namely external display 7, external display 8, external display 9, external display 10, external display 11, and external display 12.
  • the object represented by the target icon is displayed in the first external display screen, and the object represented by the target icon is not displayed in the second external display screen.
  • the first display screen 201 displays an icon 207 and includes an inner ring display area 210 and an outer ring display area 211.
  • the first position may be the inner ring display area 210
  • the second position may be the outer ring display area 211
  • the inner ring display area 210 displays information for indicating the first external display screen, and the outer ring display area 211 displays information for indicating the second external display screen; the object represented by the target icon is displayed in the first external display screen and is not displayed in the second external display screen.
  • 2B and 2E are only used to explain the present application and should not be construed as limiting. In practical applications, the number, size, shape, and the like of the display area may be determined according to actual needs, and are not limited herein.
  • information for indicating the first external display and information for indicating the second external display may also be displayed so as to be distinguished from each other.
  • the differentiated display may mean that the information is displayed in different manners, such as different fonts and colors.
  • the first display screen 201 is displayed with an icon 202 and includes at least one display area 203, and the name "external display screen 3" of the external display screen 3 is highlighted (i.e., differentially displayed) in the display area 203.
  • At least one of the display areas may also be distributed in an annular display area.
  • the first display screen 201 is displayed with an icon 207 and includes a first ring 209, and the first ring 209 is distributed with eight display areas such as the display area 203.
  • the display area 203 is displayed with the name "external display 3" of the external display 3.
  • FIG. 2D is only used to explain the present application and should not be construed as limiting.
  • the plurality of display areas may also be distributed in a plurality of nested annular display areas.
  • the first display screen 201 is displayed with an icon 207 and includes an inner ring display area 210 and an outer ring display area 211.
  • the inner ring display area 210 displays information for indicating the first external display screen
  • the outer ring display area 211 displays information for indicating the second external display screen
  • the object of the target icon representation is displayed in the first external display screen.
  • the object represented by the target icon is not displayed in the second external display screen.
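The nested annular layout of FIG. 2E (display areas evenly spaced on an inner and an outer ring around the icon) can be illustrated with a small piece of geometry. This is purely an illustrative sketch; the application does not specify coordinates or spacing.

```python
import math

# Purely illustrative geometry for a nested annular layout: n display areas
# evenly spaced on a ring of the given radius around a center point (the icon).
def ring_positions(center, radius, n):
    cx, cy = center
    return [(round(cx + radius * math.cos(2 * math.pi * k / n), 2),
             round(cy + radius * math.sin(2 * math.pi * k / n), 2))
            for k in range(n)]


inner = ring_positions((0, 0), 1.0, 6)   # inner ring: first external display group
outer = ring_positions((0, 0), 2.0, 6)   # outer ring: second external display group
print(inner[0], outer[0])  # (1.0, 0.0) (2.0, 0.0)
```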
  • the terminal may record the duration of the first input in real time and, according to the duration, differentially display in real time the information of the external display in the display area corresponding to that duration; the information of the external display may be the name, size, color, and brand type of the external display.
  • the first display screen 201 is displayed with an icon 207 and includes a first ring 209, and the first ring 209 is distributed with eight display areas such as display areas 203, 219, and 220.
  • the duration of the first input is the time that the user's finger 218 presses the target icon 207; for example, the user's finger 218 presses the target icon 207 for a duration of 1 s.
  • then, the name of the external display screen in the display area 219 is highlighted (i.e., displayed differently from the names of the external displays in the other display areas of the first ring 209).
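The duration-based differentiation of FIG. 2F could work by advancing the highlighted display area as the press continues. The sketch below is a hedged illustration: the one-area-per-second pacing is an assumption, and only display areas 203, 219, and 220 are named in the figure, so the remaining area ids in the ring are placeholders.

```python
# Hypothetical sketch: as the long press continues, the highlighted display
# area advances through the ring, so a press of a given duration highlights a
# different external display's information.
def highlighted_area(areas, duration_s, seconds_per_area=1.0):
    """Return the display area highlighted after duration_s of pressing."""
    index = int(duration_s // seconds_per_area) % len(areas)
    return areas[index]


# Eight display areas on the first ring; ids after 220 are placeholders.
ring = [203, 219, 220, 230, 231, 232, 233, 234]
print(highlighted_area(ring, 1.0))   # 219 (matches the 1 s example above)
print(highlighted_area(ring, 8.5))   # 203 (wraps around the ring)
```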
  • the display area may also be a three-dimensional space display area
  • the terminal may respectively display information for indicating the at least one external display screen in the at least one three-dimensional space display area.
  • the at least one three-dimensional space display area may be generated above the terminal by the terminal through 3D projection.
  • the terminal 200 includes a first display screen 201.
  • the first display screen 201 is displayed with an icon 207.
  • the first projection area 221 is generated by the terminal 200 above the terminal 200 through 3D projection, and the first projection area 221 includes a plurality of display areas such as the three-dimensional space display area 222.
  • the first input may be a somatosensory operation for each three-dimensional space display area. Assuming the first input is a long-press input of the user's finger 218, when the user's finger 218 presses the target icon 207, the terminal 200 projects the first projection area 221 above the position of the terminal 200.
  • the three-dimensional space display area 222 displays an appearance picture and size information of the external display screen 9.
  • FIG. 2G is for illustrative purposes only and should not be construed as limiting. In practical applications, the number, size, shape, and the like of the display areas may be determined according to actual needs, and are not limited herein.
  • the terminal may differentially display, according to the type of the object represented by the target icon, the information of the external display screens that match that object; the information of the external display screen may be the size, color, and brand type of the external display screen.
  • the object represented by the target icon 207 selected by the user's finger 218 is the video playback software.
  • the information of the external display screen matching the video playing software is highlighted (i.e., differentially displayed); by highlighting, in the three-dimensional space display area 222, the external display screen 3 that matches the video playing software, the terminal 200 can recommend that the user open the video playback software on that external display, because the external display 3 is a wide screen, which is more suitable for watching a movie.
  • This implementation is not only applicable to the embodiment corresponding to FIG. 2H, but also to the corresponding embodiment of FIGS. 2A-2F.
  • FIG. 2H is only used to explain the present application and should not be construed as limiting. In actual applications, the number, size, shape, and the like of the display areas can be determined according to actual needs, and there is no limitation here.
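The object-type matching described above (e.g., video software matched to a wide screen) can be sketched as a set of rules over display attributes. The rules and attribute names below are illustrative assumptions; the application only says that matching screens' information is displayed differently as a recommendation.

```python
# Hypothetical matching of external displays to the type of object an icon
# represents; matching screens would be highlighted as recommendations.
MATCH_RULES = {
    "video": lambda d: d["aspect_ratio"] >= 16 / 9,   # wide screens suit movies
    "document": lambda d: d["size_inch"] >= 20,       # large screens suit reading
}


def recommended_displays(object_type, displays):
    rule = MATCH_RULES.get(object_type)
    if rule is None:
        return []
    return [d["name"] for d in displays if rule(d)]


displays = [
    {"name": "External Display 3", "aspect_ratio": 21 / 9, "size_inch": 34},
    {"name": "External Display 9", "aspect_ratio": 4 / 3, "size_inch": 19},
]
print(recommended_displays("video", displays))   # ['External Display 3']
```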
  • the second input is further explained below in conjunction with the respective embodiments of Figures 2A-2H.
  • the second input may be at least one of: a touch operation (such as a click operation and a slide operation, etc.), a somatosensory operation (including a gesture operation, wherein the gesture operation may be a selection gesture and a sliding gesture, etc.), voice input Operation, image input operation, etc.
  • the second input can be used to select which display area's corresponding external display screen displays the object represented by the target icon.
  • the second input may be a drag operation for the target icon.
  • the user drags the target icon 202 onto the display area 203; this drag is the second input.
  • the second input may be a sliding operation for the target icon.
  • the target icon 207 is slid in the direction of the display area 208.
  • the second input may be a release operation for the target icon.
  • for example, the user performs a release operation after long-pressing the target icon 207.
  • the second input may be a somatosensory operation (including a gesture operation) for the target icon.
  • the second input may also be other forms of input, and is not limited herein.
  • the target icon can represent a file, program, web page or command, etc., and the icon helps the user to quickly execute the command and open the program file.
  • assuming the object represented by the target icon 202 is a movie, the object is displayed in the external display screen 3 corresponding to the display area 203 for which the second input (e.g., a drag) is directed; that is, the movie can be played in the external display screen 3 corresponding to the display area 203 for which the second input is directed.
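The drag-based second input of FIG. 2A amounts to a hit test: the drop point is tested against each display area's bounds, and the object is routed to the external display of the area that was hit. The rectangle coordinates below are illustrative assumptions.

```python
# Illustrative hit test for the second input: which display area (and hence
# which external display) does the drop point of the drag fall into?
def area_for_drop(drop_xy, areas):
    """areas: mapping of external display name -> (x0, y0, x1, y1) bounds."""
    x, y = drop_xy
    for name, (x0, y0, x1, y1) in areas.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None


areas = {"External Display 3": (100, 0, 160, 40)}   # hypothetical bounds
print(area_for_drop((120, 20), areas))   # External Display 3
print(area_for_drop((10, 10), areas))    # None: drop missed every area
```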
  • a display box for the user to select is displayed in the display area, and information of the external display screen is output in the display box.
  • in summary, the present application first detects a first input (e.g., a long press) for the target icon, where the target icon is displayed in the first display screen, and then displays information for indicating at least one external display screen in at least one display area. That is to say, the terminal provides display areas that display information for indicating at least one external display screen, thereby facilitating the user to select, according to the information of the external display screens in the display areas, an appropriate external display screen to display the object represented by the icon. Further, according to the type of the object represented by the target icon, the information of the external display screens matching that object is displayed differently, which helps the user quickly select an appropriate external display screen to display the object represented by the icon.
  • in addition, a display box popped up in the display area outputs information of the external display screens that match the object represented by the target icon, likewise helping the user quickly select an appropriate external display screen to display the object represented by the icon. This greatly enhances the user experience.
  • FIG. 3 is a functional block diagram of a terminal provided by the present application.
  • the functional blocks of the terminal may implement the present application by hardware, software, or a combination of hardware and software.
  • the functional blocks described in FIG. 3 can be combined or separated into several sub-blocks to implement the present application. Accordingly, the above description in this application may support any possible combination or separation or further definition of the functional modules described below.
  • the terminal 300 may include: a first detecting unit 301, a first display unit 302, a second detecting unit 303, and a second display unit 304, wherein:
  • a first detecting unit 301 configured to detect a first input (such as a long press, drag, slide, or a somatosensory operation) for the target icon, where the target icon is displayed in the first display screen;
  • the first display unit 302 is configured to respectively display information for indicating the at least one external display screen in at least one display area;
  • a second detecting unit 303 configured to detect a second input (such as drag, slide, somatosensory or release operation) for the target icon;
  • the second display unit 304 is configured to display the object represented by the target icon in an external display corresponding to the display area for which the second input is directed.
  • the first detecting unit 301 is configured to detect a first input for the target icon, where the target icon is displayed in the first display screen. Specifically:
  • the terminal includes a first display screen, and the target icon is the icon targeted by the first input, which may be any icon in the first display screen.
  • the first detecting unit 301 may detect the first input through the touch screen.
  • the first input may be a somatosensory operation (including a gesture operation), such as an operation in which the hand selects the target in the air.
  • the terminal can detect the first input through an infrared ranging sensor in the camera.
  • the first input may also be a voice input
  • the terminal may detect the first input by using the voice recognition module.
  • the first display unit 302 is configured to respectively display information for indicating the at least one external display screen in at least one display area. Specifically:
  • the terminal includes a first display screen and at least one external display screen, and the display area may include at least one of a two-dimensional display area or a three-dimensional space display area on the first display screen.
  • when an external display screen reaches its display threshold, the information of that external display screen is removed from the display.
  • the display threshold of the external display is related to the display capability of the external display.
  • the display capability is how many objects can be displayed simultaneously on the external display.
  • the object represented by the icon can be a file, a program, a web page or a command, and the like.
  • information for indicating the first external display and information for indicating the second external display may also be displayed so as to be distinguished from each other.
  • the differentiated display may mean that the information is displayed in a different manner, such as a different font or color.
  • At least one of the display areas may also be distributed in an annular display area.
  • multiple display areas may also be distributed in a plurality of nested annular display areas.
  • the terminal may record the duration of the first input in real time, and according to the duration, distinguish the information of the external display in the display area corresponding to the time in real time, and the information of the external display may be The name, size, color, and brand type of the external display.
  • the display area may also be a three-dimensional space display area
  • the first display unit 302 may respectively display information for indicating the at least one external display screen in the at least one three-dimensional space display area.
  • the at least one three-dimensional space display area may be generated above the terminal by the terminal through 3D projection.
  • the first display unit 302 may differentially display, according to the type of the object represented by the target icon, the information of the external display screens that match that object; the information of the external display screen may be the size, color, and brand type of the external display.
  • the second detecting unit 303 is configured to detect a second input (such as a drag, slide, move, gesture, or release operation) for the target icon. Specifically:
  • the second input is further explained below in conjunction with the respective embodiments of Figures 2A-2H.
  • the second input may be at least one of: a touch operation (such as a click operation and a slide operation, etc.), a somatosensory operation (including a gesture operation, wherein the gesture operation may be a selection gesture and a sliding gesture, etc.), voice input Operation, image input operation, etc.
  • the second input can be used to select which display area's corresponding external display screen displays the object represented by the target icon.
  • the second input may be a drag operation for the target icon.
  • the second input may be a sliding operation for the target icon.
  • the second input may be a release operation for the target icon.
  • the second input may be a somatosensory operation (including a gesture operation) for the target icon.
  • the second display unit 304 is configured to display the object represented by the target icon on the external display screen corresponding to the display area for which the second input is directed. Specifically:
  • a display box for the user to select is displayed in the display area, and a decision prompt of "whether or not to continue displaying the object represented by the icon on the external display screen corresponding to the display area for which the second input is directed" is output in the display box. If the user selects yes, the object represented by the icon is displayed by that external display screen in place of an object it is already displaying; if not, the object represented by the icon is displayed by an external display screen that is not displaying an object represented by an icon.
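The decision flow just described can be sketched as follows. This is a hedged reconstruction under assumptions: the original passage is garbled, and the replace-oldest policy used when the user confirms is an illustrative choice, not something stated by the application.

```python
# Hypothetical sketch: when the chosen external display is already at its
# display threshold, a display box asks the user whether to continue;
# confirming replaces the oldest shown object (an assumed policy).
def place_object(display, obj, user_confirms):
    """display: dict with a 'shown' list and a 'threshold'. Returns 'shown'."""
    if len(display["shown"]) < display["threshold"]:
        display["shown"].append(obj)         # below threshold: just display it
    elif user_confirms:                      # user answered yes in the display box
        display["shown"].pop(0)              # make room by dropping the oldest
        display["shown"].append(obj)
    return display["shown"]                  # unchanged if the user declined


d3 = {"shown": ["doc1", "doc2"], "threshold": 2}   # already at its threshold
print(place_object(d3, "movie", user_confirms=True))   # ['doc2', 'movie']
```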
  • FIG. 4 is a schematic block diagram of a terminal 400 provided by the present application.
  • the terminal device may include a mobile phone, a tablet, a personal digital assistant (PDA), a mobile Internet device (MID), or a smart wearable device (such as a smart watch or a smart bracelet).
  • terminal 400 can include a baseband chip 410, a memory 415 (one or more computer readable storage media), a radio frequency (RF) module 416, and a peripheral system 417. These components can communicate over one or more communication buses 414.
  • the peripheral system 417 is mainly used to implement interaction between the terminal 400 and the user/external environment, and mainly includes the input and output devices of the terminal 400.
  • the peripheral system 417 can include a touch screen controller 418, a camera controller 419, an audio controller 420, and a sensor management module 421. Each controller may be coupled to a respective peripheral device such as touch screen 423, camera 424, audio circuit 425, and sensor 426.
  • the touch screen 423 can be a touch screen configured with a self-capacitive floating touch panel or a touch screen configured with an infrared floating touch panel.
  • camera 424 can be a 3D camera. It should be noted that the peripheral system 417 may also include other I/O peripherals.
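The controller-to-peripheral coupling described above can be pictured as a simple composition. The sketch below (illustrative Python with invented class and key names, not the claimed hardware design) wires each controller in the peripheral system 417 to its peripheral, mirroring touch screen controller 418 → touch screen 423, camera controller 419 → camera 424, and so on:

```python
# Illustrative model of the peripheral system wiring: each controller in
# peripheral system 417 is coupled to exactly one peripheral device.

class PeripheralSystem:
    def __init__(self):
        self._couplings = {}

    def couple(self, controller, peripheral):
        """Record that `controller` is coupled to `peripheral`."""
        self._couplings[controller] = peripheral

    def peripheral_for(self, controller):
        """Return the peripheral coupled to `controller`, or None."""
        return self._couplings.get(controller)

ps = PeripheralSystem()
ps.couple("touch_screen_controller_418", "touch_screen_423")
ps.couple("camera_controller_419", "camera_424")
ps.couple("audio_controller_420", "audio_circuit_425")
ps.couple("sensor_management_module_421", "sensor_426")
```

Additional I/O peripherals, as the text notes, would simply be further couplings in the same table.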
  • the baseband chip 410 can integrate one or more processors 411, a clock module 412, and a power management module 413.
  • the clock module 412 integrated in the baseband chip 410 is primarily used to generate the clocks required for data transfer and timing control for the processor 411.
  • the power management module 413 integrated in the baseband chip 410 is mainly used to provide a stable, high-accuracy voltage for the processor 411, the radio frequency module 416, and the peripheral system 417.
  • the radio frequency (RF) module 416 is used to receive and transmit radio frequency signals, and mainly integrates the receiver and transmitter of the terminal 400.
  • the radio frequency (RF) module 416 communicates with communication networks and other communication devices via radio frequency signals.
  • the radio frequency (RF) module 416 can include, but is not limited to: an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chip, a SIM card, a storage medium, and the like.
  • a radio frequency (RF) module 416 can be implemented on a separate chip.
  • Memory 415 is coupled to processor 411 for storing various software programs and/or sets of instructions.
  • memory 415 can include high speed random access memory, and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • the memory 415 can store an operating system (hereinafter referred to as the system), for example an embedded operating system such as ANDROID, IOS, WINDOWS, or LINUX.
  • the memory 415 can also store a network communication program that can be used to communicate with one or more additional devices, one or more terminal devices, and one or more network devices.
  • the memory 415 can also store a user interface program, which can vividly display the content of an application through a graphical operation interface, and receive user control operations on the application through input controls such as menus, dialog boxes, and keys.
  • the memory 415 can also store one or more applications. As shown in FIG. 4, these applications may include: social applications (such as Facebook), image management applications (such as a photo album), map applications (such as Google Maps), browsers (such as Safari or Google Chrome), and the like.
  • the terminal 400 is only an example provided by the embodiments of the present application; the terminal 400 may have more or fewer components than those illustrated, may combine two or more components, or may have a different configuration of components.
  • a computer readable storage medium is also provided, storing a computer program which, when executed by a processor, implements the information display method described in the foregoing embodiments.
  • the computer readable storage medium may be an internal storage unit of the terminal described in any of the foregoing embodiments, such as a hard disk or a memory of the terminal.
  • the computer readable storage medium may also be an external storage device of the terminal, such as a plug-in hard disk equipped on the terminal, a smart memory card (SMC), a secure digital (SD) card, a flash card, and the like.
  • the computer readable storage medium may also include both an internal storage unit of the terminal and an external storage device.
  • the computer readable storage medium is for storing the computer program and other programs and data required by the terminal.
  • the computer readable storage medium can also be used to temporarily store data that has been output or is about to be output.
  • the terminal embodiments described above are only illustrative.
  • the division of the units is only a logical function division; there may be other division manners in actual implementation.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, terminals, or units, and may be an electrical, mechanical, or other form of connection.
  • the units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments of the present application.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • when the integrated unit is implemented in the form of a software functional unit and sold or used as a standalone product, it can be stored in a computer readable storage medium.
  • based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium.
  • the software product includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various embodiments of the present application.
  • the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to a multi-screen-based information display method and a terminal. The terminal comprises a first display screen and is further connected to at least one external display screen. The method comprises: detecting a first input (for example, a long press) directed at a target icon, the target icon being displayed on the first display screen; respectively displaying, in at least one display area, information used to indicate the at least one external display screen; detecting a second input (for example, a drag, slide, gesture, or release operation) directed at the target icon; and displaying an object represented by the target icon on the external display screen corresponding to the display area at which the second input is directed. The multi-screen-based information display method and terminal of the present application considerably improve user experience.
PCT/CN2017/105490 2017-10-10 2017-10-10 Multi-screen-based information display method, and terminal Ceased WO2019071419A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780095813.5A CN111201507B (zh) 2017-10-10 2017-10-10 Information display method and terminal
PCT/CN2017/105490 WO2019071419A1 (fr) 2017-10-10 2017-10-10 Multi-screen-based information display method, and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/105490 WO2019071419A1 (fr) 2017-10-10 2017-10-10 Multi-screen-based information display method, and terminal

Publications (1)

Publication Number Publication Date
WO2019071419A1 true WO2019071419A1 (fr) 2019-04-18

Family

ID=66100200

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/105490 Ceased WO2019071419A1 (fr) 2017-10-10 2017-10-10 Multi-screen-based information display method, and terminal

Country Status (2)

Country Link
CN (1) CN111201507B (fr)
WO (1) WO2019071419A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110084986A1 (en) * 2009-10-13 2011-04-14 Samsung Electronics Co. Ltd. Method for displaying background screen in mobile terminal
CN103365536A (zh) * 2012-03-26 2013-10-23 三星电子株式会社 Method and apparatus for managing screens in a portable terminal
CN104199552A (zh) * 2014-09-11 2014-12-10 福州瑞芯微电子有限公司 Multi-screen display method, device, and system
CN104915096A (zh) * 2015-05-29 2015-09-16 努比亚技术有限公司 Application interface display method and apparatus
CN105302285A (zh) * 2014-08-01 2016-02-03 福州瑞芯微电子股份有限公司 Multi-screen display method, device, and system
CN105975142A (zh) * 2015-10-26 2016-09-28 乐视移动智能信息技术(北京)有限公司 Icon moving method and apparatus
CN106325650A (zh) * 2015-06-19 2017-01-11 深圳创锐思科技有限公司 3D dynamic display method based on human-computer interaction, and mobile terminal
CN106648329A (zh) * 2016-12-30 2017-05-10 维沃移动通信有限公司 Application icon display method and mobile terminal

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100474724B1 (ko) * 2001-08-04 2005-03-08 삼성전자주식회사 Apparatus having a touch screen and method of using the apparatus with an external display device connected thereto
US8238979B2 (en) * 2009-04-14 2012-08-07 Qualcomm Incorporated System and method for mobile device display power savings

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110084986A1 (en) * 2009-10-13 2011-04-14 Samsung Electronics Co. Ltd. Method for displaying background screen in mobile terminal
CN103365536A (zh) * 2012-03-26 2013-10-23 三星电子株式会社 Method and apparatus for managing screens in a portable terminal
CN105302285A (zh) * 2014-08-01 2016-02-03 福州瑞芯微电子股份有限公司 Multi-screen display method, device, and system
CN104199552A (zh) * 2014-09-11 2014-12-10 福州瑞芯微电子有限公司 Multi-screen display method, device, and system
CN104915096A (zh) * 2015-05-29 2015-09-16 努比亚技术有限公司 Application interface display method and apparatus
CN106325650A (zh) * 2015-06-19 2017-01-11 深圳创锐思科技有限公司 3D dynamic display method based on human-computer interaction, and mobile terminal
CN105975142A (zh) * 2015-10-26 2016-09-28 乐视移动智能信息技术(北京)有限公司 Icon moving method and apparatus
CN106648329A (zh) * 2016-12-30 2017-05-10 维沃移动通信有限公司 Application icon display method and mobile terminal

Also Published As

Publication number Publication date
CN111201507B (zh) 2023-11-03
CN111201507A (zh) 2020-05-26

Similar Documents

Publication Publication Date Title
US10915225B2 (en) User terminal apparatus and method of controlling the same
US10705702B2 (en) Information processing device, information processing method, and computer program
US11188192B2 (en) Information processing device, information processing method, and computer program for side menus
AU2017358278B2 (en) Method of displaying user interface related to user authentication and electronic device for implementing same
CN108334371B Method and apparatus for editing an object
WO2019062910A1 Copy-and-paste method, data processing apparatus, and user device
CN105765520A Device and method for providing a lock screen
JP2015518604A Text selection and input
US9792183B2 (en) Method, apparatus, and recording medium for interworking with external terminal
CN105900056A Hover-sensitive control for an auxiliary display
CN105635519B Video processing method, apparatus, and system
US20150180998A1 (en) User terminal apparatus and control method thereof
US10319338B2 (en) Electronic device and method of extracting color in electronic device
KR102203131B1 File management method and electronic device therefor
CN111597359A Information stream sharing method, apparatus, device, and storage medium
US20150355823A1 (en) Electronic device and method of editing icon in electronic device
US9542094B2 (en) Method and apparatus for providing layout based on handwriting input
KR20160096645A Coupling of a device to a computing device
US10185457B2 (en) Information processing apparatus and a method for controlling the information processing apparatus
JP2025512309A Sharing of captured content
CN105843504B Window adjustment method and electronic device
CN111201507B Information display method and terminal
WO2019051738A1 List switching method, and terminal
AU2021105134A4 (en) User interfaces for selecting media items
CN117675994A Device connection method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17928289

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17928289

Country of ref document: EP

Kind code of ref document: A1