
WO2023016463A1 - Display control method and apparatus, electronic device, and medium


Info

Publication number
WO2023016463A1
WO2023016463A1 · PCT/CN2022/111192
Authority
WO
WIPO (PCT)
Prior art keywords
screen
display
input
window
split
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2022/111192
Other languages
English (en)
Chinese (zh)
Inventor
汪国全
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Publication of WO2023016463A1
Current legal status: Ceased


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72433User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for voice messaging, e.g. dictaphones
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72436User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for text messaging, e.g. short messaging services [SMS] or e-mails
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • The present application belongs to the field of display technology, and in particular relates to a display control method and apparatus, an electronic device, and a medium.
  • Users use electronic devices increasingly frequently. For example, users can browse content of interest (such as video, audio, images, or text) on an electronic device.
  • The purpose of the embodiments of the present application is to provide a display control method and apparatus, an electronic device, and a medium, which can solve the problem of cumbersome operations when a user views different contents on an electronic device.
  • The embodiment of the present application provides a display control method. The method includes: receiving a first input from the user on a first object in a target interface, where the first object includes at least one of the following: a control, a widget; and, in response to the first input, displaying the first object in a first window.
  • The embodiment of the present application provides a display control device, which includes a receiving module and a display module. The receiving module receives a first input from the user on a first object in a target interface, where the first object includes at least one of the following: a control, a widget. The display module displays the first object in a first window in response to the first input received by the receiving module.
  • An embodiment of the present application provides an electronic device. The electronic device includes a processor, a memory, and a program or instruction stored in the memory and runnable on the processor; when the program or instruction is executed by the processor, the steps of the method described in the first aspect are implemented.
  • An embodiment of the present application provides a readable storage medium, on which a program or instruction is stored; when the program or instruction is executed by a processor, the steps of the method described in the first aspect are implemented.
  • The embodiment of the present application provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used to run programs or instructions so as to implement the method described in the first aspect.
  • the display control device may display the first object in the first window.
  • the first object includes at least one of the following: controls and widgets.
  • After receiving the user's first input on a control or widget in the target interface, the display control device in this application can display that control or widget in the first window, that is, display the control or widget on a split screen. In this way, the display mode of the electronic device is more flexible.
  • FIG. 1 is a schematic flowchart of a display control method provided by an embodiment of the present application;
  • FIG. 2 is a first schematic diagram of an application interface of a display control method provided by an embodiment of the present application;
  • FIG. 3 is a second schematic diagram of an application interface of a display control method provided by an embodiment of the present application;
  • FIG. 4 is a third schematic diagram of an application interface of a display control method provided by an embodiment of the present application;
  • FIG. 5 is a fourth schematic diagram of an application interface of a display control method provided by an embodiment of the present application;
  • FIG. 6 is a fifth schematic diagram of an application interface of a display control method provided by an embodiment of the present application;
  • FIG. 7 is a sixth schematic diagram of an application interface of a display control method provided by an embodiment of the present application;
  • FIG. 8 is a schematic structural diagram of a display control device provided by an embodiment of the present application;
  • FIG. 9 is a first schematic structural diagram of an electronic device provided by an embodiment of the present application;
  • FIG. 10 is a second schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • The terms “first”, “second”, and the like in the specification and claims of the present application are used to distinguish similar objects, and are not used to describe a specific order or sequence. It is to be understood that the data so used are interchangeable under appropriate circumstances, such that the embodiments of the application can be practiced in sequences other than those illustrated or described herein.
  • The objects distinguished by “first”, “second”, and so on are generally of one type, and the number of objects is not limited; for example, there may be one first object or multiple first objects.
  • “And/or” in the specification and claims means at least one of the connected objects, and the character “/” generally means that the related objects are in an “or” relationship.
  • Fig. 1 is a schematic flowchart of a display control method provided by the embodiment of the present application, including steps 201 and 202:
  • Step 201 The display control device receives a first input from a user on a first object in a target interface.
  • the above-mentioned first object includes at least one of the following items: controls and widgets.
  • the above-mentioned target interface may be a program interface of an application program, may also be a desktop, or may be any possible interface, which is not limited in the embodiment of the present application.
  • the objects in the embodiments of the present application may also include at least one of the following: characters, images, web view (WebView) pages, program interfaces, and application icons. It should be noted that the object includes but is not limited to the aforementioned seven types of objects.
  • the above-mentioned controls may be video controls, or may be audio controls, which is not limited in this embodiment of the present application.
  • There may be one or more first objects, which is not limited in the embodiment of the present application.
  • In the case that the target interface is a program interface, the first object may include a control in the program interface; in the case that the target interface is a desktop, the first object may also include a program interface.
  • the above application program may be a news application program, a chat application program, a map application program, or any possible application program, which is not limited in this embodiment of the present application.
  • the above-mentioned program interface may be any interface in the application program.
  • the program interface may be a main interface of a news application program, or the program interface may be a chat list interface of a chat application program, etc., which is not limited in this embodiment of the present application.
  • the above-mentioned first input may be: the user's click input on the first object, or the user's drag input on the first object, or a voice command input by the user, or a specific gesture input by the user , which can be determined according to actual usage requirements, which is not limited in this embodiment of the present application.
  • the specific gesture in the embodiment of the present application may be any one of a click gesture, a sliding gesture, a drag gesture, a pressure recognition gesture, a long press gesture, an area change gesture, a double-press gesture, and a double-click gesture;
  • the click input can be single-click input, double-click input, or any number of click inputs, etc., and can also be long-press input or short-press input.
  • The above step 201 may specifically include the following steps: the display control device receives a first sub-input from the user on the first sub-object, and receives a second sub-input from the user on the second sub-object.
  • Step 202 In response to the first input, the display control device displays the first object in the first window.
  • There may be one or more first windows, which is not limited in the embodiment of the present application.
  • one first window corresponds to at least one first object.
  • the first window displaying the first object may be set by default by the system, or set by the user, which is not limited in this embodiment of the present application.
  • Taking a case in which the screen of the mobile phone includes two display windows arranged horizontally, namely a first display window and a second display window, as an example: when the user wants to display an audio control on a split screen, the user can long-press the audio control.
  • the mobile phone can pop up a window, and the window displays two options, namely the "first display window” option and the "second display window” option.
  • If the user wants to display the audio control in the first display window, he can click the "first display window" option.
  • the mobile phone may display audio controls on the first display window (ie, the above-mentioned first window).
  • When the first object is a control in a program interface, the control needs to be encapsulated in an Activity form and then pushed onto the stack of the first window; the default display size is the size of the control itself.
  • The size can also be dynamically adjusted according to the current size of the first window.
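The sizing rule just described (default to the control's own size, then adjust to the current window) can be sketched as follows. This is an illustrative sketch, not code from the patent; the names `Size` and `fit_control_size` are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Size:
    width: int
    height: int

def fit_control_size(control: Size, window: Size) -> Size:
    """Default to the control's own size; when the control exceeds the
    current window, shrink it proportionally so it still fits."""
    if control.width <= window.width and control.height <= window.height:
        return Size(control.width, control.height)
    # Uniform scale factor: the tighter of the two dimensions wins,
    # so the control's aspect ratio is preserved.
    scale = min(window.width / control.width, window.height / control.height)
    return Size(int(control.width * scale), int(control.height * scale))
```

For instance, a 400×300 control placed in a 200×400 window is scaled by 0.5 to 200×150.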
  • the method may further include the following step A1:
  • Step A1 The display control device receives the second input from the user.
  • The above-mentioned second input may be: the user's click input on the screen, a voice command input by the user, or a specific gesture input by the user, which may be determined according to actual usage requirements and is not limited in this embodiment of the present application.
  • the display control device may perform any one of step A2 and step A3:
  • Step A2 In response to the above-mentioned second input, the display control device displays at least one split-screen object on the target interface.
  • the at least one split-screen object includes the first object, and the target interface does not include the at least one split-screen object before the display control device receives the second input.
  • the display control device displays at least one split-screen object in a preset manner.
  • the aforementioned preset manners include at least one of the following: highlighting, shaking, and blinking.
  • the above preset manners include but not limited to the above three display manners.
  • the display control device displays at least one split-screen object in a preset manner, that is, prompts the user that the at least one split-screen object is in a draggable state.
  • the display control device may be a mobile terminal, such as a mobile phone.
  • the target interface is the news display interface of the news application program.
  • A news interface 32 is displayed on the screen 31 of the mobile phone.
  • Three pieces of news information are displayed on the news interface 32, namely news information 32a, news information 32b, and news information 32c.
  • the user wants to display news information on a split screen, the user can use three fingers to slide up and then slide down on the screen.
  • the mobile phone can control the three news information to be in a split-screen draggable state, that is, to shake and display the news information 32a, the news information 32b and the news information 32c.
  • Step A3 In response to the above-mentioned second input, the display control device displays split-screen prompt information.
  • the above split-screen prompt information indicates at least one split-screen object.
  • the above split-screen prompt information may include at least one of the following: text and pictures.
  • the display control device may keep displaying the above-mentioned target interface, or may cancel displaying the target interface, which is not limited in this embodiment of the present application.
  • the method may further include the following step 203:
  • Step 203 In response to the above-mentioned first input, the display control device cancels display of the above-mentioned target interface.
  • When the display control device determines that the first object is the last object in the target interface dragged by the user, it may cancel the display of the target interface.
  • The display control device may execute step 203 before executing step 202, or may execute step 202 while executing step 203, which is not limited in this embodiment of the present application.
  • the mobile phone can control the three news information to be in a split-screen draggable state, that is, to shake and display the news information 32a, the news information 32b and the news information 32c. Then, the user can sequentially drag the news information 32a, the news information 32b and the news information 32c out of the screen.
  • the mobile phone can cancel the display of the news interface 32 and divide the screen 31 into three display windows, namely a display window 31a, a display window 31b and a display window 31c. Among them, news information 32a is displayed in the display window 31a, news information 32b is displayed in the display window 31b, and news information 32c is displayed in the display window 31c.
  • step 202 may specifically include the following step 202a:
  • Step 202a The display control device displays the above-mentioned first object in the first window, and keeps displaying the above-mentioned target interface in the fourth window.
  • first window and fourth window may be set by default by the system, or may be set by the user, which is not limited in this embodiment of the present application.
  • The mobile phone can divide the screen 31 into two display windows, namely a display window 41a and a display window 41b, display the news interface 32 in the display window 41a, and display the news information 32b in the display window 41b.
  • The mobile phone can display the news information 32a and the news information 32c in the news interface 32 by shaking, and cancel the shaking display of the news information 32b.
  • the size of the first window is associated with feature information of the first object, and the feature information includes at least one of the following: object type, and proportion of the object on the target interface.
  • the above-mentioned object type may be an application program, a control, or a widget, which is not limited in this embodiment of the present application.
  • the display control device may determine the display window corresponding to the first object (ie, the first window) according to the characteristic information corresponding to the first object.
  • the display control device may determine whether the first object has corresponding historical display information (that is, user usage habits). If the first object has corresponding historical display information, the display window corresponding to the first object is determined according to the historical display information.
  • the display control device can record the display window of the corresponding object each time the screen is split, so as to record the user's usage habits.
  • When the user drags the first object again, the display control device can refer to the user's previous usage habits to determine the display window corresponding to the first object.
  • the display control device may determine the proportion of the display window corresponding to the first object in the screen according to the proportion of the first object in the target interface. For example, the proportion of the first object in the target interface is the same as the proportion of the display window corresponding to the first object in the screen.
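The two selection rules above (prefer recorded usage habits, otherwise reuse the object's proportion of the target interface) can be sketched as a single lookup. This is a hypothetical sketch; `window_fraction` and the history dictionary are illustrative names, not part of the patent.

```python
def window_fraction(history: dict, object_id: str, object_fraction: float) -> float:
    """Choose the screen fraction for an object's split-screen window:
    prefer the fraction recorded from a previous split (user habit);
    otherwise reuse the object's proportion of the target interface."""
    if object_id in history:
        return history[object_id]
    return object_fraction
```

For example, with no recorded habit, an object occupying 30% of the target interface gets a window occupying 30% of the screen.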
  • the display control device may display the first object in the first window after receiving the user's first input on the first object in the target interface.
  • the first object includes at least one of the following: controls and widgets.
  • After receiving the user's first input on a control or widget in the target interface, the display control device in this application can display that control or widget in the first window, that is, display the control or widget on a split screen. In this way, the display mode of the electronic device is more flexible.
  • the above-mentioned target interface further includes a second object.
  • the method may also include the following steps 204a to 204c:
  • Step 204a In response to the above-mentioned first input, the display control device acquires target information.
  • the above target information includes at least one of the following: object information and screen information.
  • object information includes at least one of the following: the category of the object, the historical display position of the object, and the display content of the object.
  • screen information is the information of the screen displaying the target interface, and the screen information includes at least one of the following: screen size, screen status (such as portrait or landscape state).
  • the category of the above-mentioned object may be an application program, a control, or a widget, which is not limited in this embodiment of the present application.
  • the display content of the above object may be audio or video, etc., which is not limited in this embodiment of the present application.
  • Step 204b The display control device determines the layout mode of the first object and the second object according to the above object information.
  • the above-mentioned determining the layout manner of the first object and the second object includes determining the display windows corresponding to the first object and the second object, and the relative display positions of the display windows corresponding to the first object and the second object.
  • the layout manner of the first object and the second object may be that the display window corresponding to the first object and the display window corresponding to the second object are displayed adjacently.
  • the adjacent display may be horizontal adjacent display or vertical adjacent display, which is not limited in this embodiment of the present application.
  • The display control device may determine the degree of correlation between the first object and the second object according to the target information corresponding to the first object and the target information corresponding to the second object; in the case that the degree of correlation satisfies a preset condition, the display window corresponding to the first object and the display window corresponding to the second object are displayed adjacently.
  • the aforementioned preset condition may be that the degree of correlation is greater than or equal to the first preset threshold, and may also be that the degree of correlation is less than or equal to the second preset threshold, which is not limited in this embodiment of the present application. In this way, the process of determining the layout of the first object and the second object by the display control device can be made more flexible.
  • For example, the display control device may calculate the degree of correlation between the category of the first object and the category of the second object as A, which is greater than the preset threshold X. At this time, the display control device may display the display window corresponding to the first object adjacent to the display window corresponding to the second object.
  • In the case that the target information includes multiple pieces of information, the display control device may calculate the correlation degree corresponding to each piece of information, and obtain the degree of correlation (or affinity) between the first object and the second object by weighting and summing the correlation degrees corresponding to the multiple pieces of information. For example, the display control device may calculate the degree of correlation between the category of the first object and the category of the second object as A, and calculate the degree of correlation between the display content of the first object and the display content of the second object as B. Then, the display control device may perform a weighted summation of A and B to obtain the final degree of correlation between the first object and the second object.
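The weighted-summation step can be sketched as follows, with per-attribute correlation degrees (such as A for category and B for display content) combined into a final score that is compared against the preset threshold. Function names and the dictionary representation are illustrative assumptions, not the patent's API.

```python
def correlation(scores: dict, weights: dict) -> float:
    """Weighted sum of per-attribute correlation degrees,
    e.g. scores = {"category": A, "content": B}."""
    return sum(weights[key] * value for key, value in scores.items())

def display_adjacently(scores: dict, weights: dict, threshold: float) -> bool:
    """Place the two objects' windows next to each other when the
    combined correlation degree satisfies the preset condition."""
    return correlation(scores, weights) >= threshold
```

With A = 0.8, B = 0.6 and equal weights of 0.5, the combined degree is 0.7, so a threshold of 0.65 would trigger adjacent display.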
  • Step 204c The display control device displays the second object according to the above layout.
  • the display control device may display the second object in the third window according to the above layout manner.
  • The process by which the display control device determines the display window corresponding to the second object (that is, the third window) according to the characteristic information corresponding to the second object can refer to the process by which the display control device determines the first window corresponding to the first object according to the characteristic information corresponding to the first object; the description will not be repeated here.
  • the display control device may also display the second object in the first window.
  • displaying the above-mentioned first object in the first window in the above-mentioned step 202 may specifically include the following step 202b:
  • Step 202b The display control device displays the above-mentioned first object in the first window according to the above-mentioned layout method.
  • the display control device displays the third window adjacent to the first window according to the above layout manner.
  • The following uses examples to describe how the display control device displays the first object and the second object according to the above-mentioned layout manner.
  • the display control device may determine whether the first object and the second object have corresponding historical display information (that is, user usage habits). If the first object and the second object have corresponding historical display information, the display windows corresponding to the first object and the second object are determined according to the historical display information.
  • the display control device may record the display window of the corresponding object each time the screen is split, so as to record the user's usage habits.
  • When the user drags the first object and the second object again, the user's previous usage habits may be used to determine the display window corresponding to each object.
  • the display control device may determine whether the category of the first object and the category of the second object belong to the same category. When it is determined that the first object and the second object belong to the same category, the display control device may display the display window corresponding to the first object at an adjacent position of the display window corresponding to the second object. For example, in the case that both the category of the first object and the category of the second object are widgets, the display control device may display the display window of the second object adjacent to the display window corresponding to the first object.
  • The display control device may determine whether the display content of the first object is the same as that of the second object. When it is determined that the display content of the first object is the same as that of the second object, the display control device may display the display window corresponding to the first object at a position adjacent to the display window corresponding to the second object. For example, when the display content of the first object and the display content of the second object are both videos, the display control device may display the display window of the second object adjacent to the display window corresponding to the first object.
  • the display control device determines the size of the display window corresponding to the second object and the size of the display window corresponding to the first object according to the screen size. For example, the sum of the size of the display window corresponding to the second object and the size of the display window corresponding to the first object is smaller than or equal to the screen size.
  • In a fifth example, the display control device determines the display windows corresponding to the first object and the second object in combination with the screen state. For example, a strip-shaped widget may be placed horizontally at the bottom of the screen in portrait mode, and vertically on the right side of the screen in landscape mode.
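The orientation rule in the example above, together with the screen-size constraint described before it, can be sketched as follows; `place_strip_widget` and `fits_on_screen` are illustrative names under assumed pixel units, not the patent's method.

```python
def place_strip_widget(screen_w: int, screen_h: int) -> str:
    """A strip-shaped widget lies along the bottom in portrait mode
    and along the right edge in landscape mode."""
    portrait = screen_h >= screen_w
    return "bottom-horizontal" if portrait else "right-vertical"

def fits_on_screen(window_sizes, screen_area: int) -> bool:
    """The combined area of all display windows must not exceed
    the screen area (the size constraint above)."""
    return sum(w * h for w, h in window_sizes) <= screen_area
```

For a 1080×2400 portrait screen the widget is placed along the bottom; rotating to 2400×1080 moves it to the right edge.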
  • In this way, the display control device can determine the layout manner of the first object and the second object according to the target information, display the second object according to the layout manner, and display the first object in the first window, so that the first object and the second object are displayed flexibly.
  • the display control device can quickly save the combination mode in which objects are displayed, so that the user can later quickly display the objects in the saved combination mode.
  • the method may further include the following steps 205a to 205c:
  • Step 205a The display control device receives a third input from the user.
  • the above-mentioned third input may be: the user's click input on the screen, or the user's click input on the target control, or a voice command input by the user, or a specific gesture input by the user, which may be specifically determined according to actual usage requirements; this is not limited in this embodiment of the present application.
  • Step 205b In response to the above third input, the display control device generates a split-screen identifier.
  • the above split-screen identifier indicates the target object in the target interface and the layout mode of the target object.
  • the aforementioned target object includes the aforementioned first object.
  • the aforementioned target objects may be all or part of the objects in the target interface.
  • the identifiers in this embodiment of the application are words, symbols, images, and the like used to indicate information; controls or other containers may be used as carriers for displaying the information. The identifiers include, but are not limited to, word identifiers, symbol identifiers, and image identifiers.
  • Step 205c The display control device receives a fourth input from the user on the split-screen identifier.
  • the above-mentioned fourth input can be: the user's click input on the split-screen identifier, or a voice command input by the user, or a specific gesture input by the user, which may be specifically determined according to actual usage requirements; this embodiment of the present application does not limit this.
  • Step 205d In response to the fourth input, the display control device displays the target object indicated by the split-screen identifier according to the layout mode indicated by the split-screen identifier.
  • through the third input, the user can trigger the display control device to generate a split-screen identifier that indicates the split-screen combination mode of the target object, so that the user can later, through the split-screen identifier, quickly trigger the display control device to display the target object indicated by the split-screen identifier according to the layout mode indicated by the split-screen identifier.
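  • steps 205a to 205d can be sketched as a save/restore pair: generating a split-screen identifier records the target objects and their layout, and activating the identifier redisplays them. The dictionary-based identifier store and the function names here are illustrative assumptions, not the embodiment's actual data structures.

```python
# In-memory store of generated split-screen identifiers (an assumption).
saved_layouts = {}

def generate_split_screen_id(name, target_objects, layout_mode):
    """Step 205b: record which target objects are split-screened and the
    layout mode in which they are arranged."""
    saved_layouts[name] = {"objects": list(target_objects),
                           "layout": layout_mode}
    return name

def apply_split_screen_id(name):
    """Step 205d: redisplay the recorded target objects according to the
    recorded layout mode."""
    entry = saved_layouts[name]
    return [f"display {obj} ({entry['layout']})" for obj in entry["objects"]]
```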
  • the display control device may process the first window after displaying the first window.
  • the method may further include the following steps 206a and 206b:
  • Step 206a The display control device receives a fifth input from the user on the above-mentioned first window.
  • the above-mentioned fifth input may be: a click input by the user on the first window, or a voice command input by the user, or a specific gesture input by the user, which may be specifically determined according to actual usage requirements.
  • Step 206b In response to the above-mentioned fifth input, the display control device executes the target processing.
  • the above-mentioned target processing includes at least one of the following: moving the first window; when the display area of the screen where the above-mentioned target interface is located includes the second window, merging the first window and the second window; deleting the first window; adjusting the size of the first window.
  • Example 1 referring to FIG. 3 , when the user wants to move the news information 32b above the news information 32a, the user can drag the display window 31b upward. At this time, as shown in (a) of FIG. 5, the mobile phone displays the display window 31b in the display position of the display window 31a, and displays the display window 31a in the display position of the display window 31b.
  • Example 2 referring to FIG. 3 , when the user wants to combine the news information 32b and the news information 32a, the user can double-click the display window 31a and the display window 31b. At this time, as shown in (b) in FIG. 5, the mobile phone merges the display window 31a and the display window 31b to obtain a new display window 33, and the news information 32b and the news information 32a are displayed in the new display window 33.
  • Example 3 referring to FIG. 3, when the user wants to delete the news information 32b, the user can press and hold the screen; at this time, an "×" mark is displayed in the upper right corner of the display window 31a, the display window 31b, and the display window 31c. Then, the user can click the "×" corner mark displayed in the upper right corner of the display window 31b. At this time, as shown in (c) in FIG. 5, the mobile phone can delete the display window 31b, and adaptively increase the display sizes of the display window 31a and the display window 31c.
  • Example 4 in conjunction with FIG. 3, when the user wants to adjust the display size of the display window 31b, the user can slide two fingers towards each other on the display window 31b; at this time, the mobile phone can reduce the display size of the display window 31b, and adaptively increase the display sizes of the display window 31a and the display window 31c. Alternatively, the user can slide two fingers apart on the display window 31b; at this time, as shown in (d) in FIG. 5, the mobile phone can increase the display size of the display window 31b, and adaptively reduce the display sizes of the display window 31a and the display window 31c.
  • after the user triggers the display control device to split-screen display the first object in the target interface, the user can also trigger the display control device to process the split-screen window of the first object according to requirements, so that the process of split-screen displaying objects by the display control device is more flexible.
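  • the adaptive resizing in Example 3 (deleting one window and redistributing its space among the remaining windows) can be sketched as follows. The list-of-dicts window bookkeeping and the equal-share redistribution are illustrative assumptions; the embodiment does not prescribe a specific redistribution rule.

```python
def delete_window(windows, name, screen_size):
    """Remove one split-screen window by name and adaptively enlarge the
    remaining windows so they again fill the screen, as in Example 3."""
    remaining = [w for w in windows if w["name"] != name]
    share = screen_size // len(remaining)  # equal redistribution (assumed)
    for w in remaining:
        w["size"] = share
    return remaining
```

The same bookkeeping could back the other target processing operations (moving, merging, two-finger resizing), each updating the window list and recomputing sizes.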
  • the user may continue to trigger the display control device to display other objects in a split screen after triggering the display control device to display objects in the target interface in split screen according to requirements.
  • the method may further include the following steps C1 to C4:
  • Step C1 The display control device receives a sixth input from the user.
  • the above-mentioned sixth input may be: the user's input on the screen, or a voice command input by the user, or a specific gesture input by the user, which can be determined according to actual usage requirements; this embodiment of the present application does not limit this.
  • the above-mentioned user's input to the screen may specifically be a user's three-finger sliding input on the screen.
  • Step C2 In response to the above sixth input, the display control device adds a fourth window in the screen.
  • the above-mentioned fourth window includes at least one icon.
  • an icon may correspond to an application program or a widget.
  • Step C3 The display control device receives a seventh input from the user on the target icon in the at least one icon.
  • the above-mentioned seventh input may be: the user's click input on the target icon, or a voice command input by the user, or a specific gesture input by the user, which may be specifically determined according to actual usage requirements. This is not limited.
  • Step C4 In response to the seventh input, the display control device displays an interface corresponding to the target icon in the fourth window.
  • the screen 51 of the mobile phone is divided into three display windows, namely a display window 51a, a display window 51b and a display window 51c.
  • the video image of the local user is displayed in the display window 51a
  • the video image of the peer user 1 is displayed in the display window 51b
  • the text chat interface between the local user and the peer user 2 is displayed in the display window 51c.
  • the user wants to watch a video
  • the user can swipe up on the screen with three fingers.
  • the mobile phone adds a display window 51d below the screen 51
  • the mobile phone desktop is displayed in the display window 51d
  • the mobile phone desktop displays 4 application program icons and 1 pendant icon.
  • the 4 application icons are respectively the application icon of the video application, the application icon of the map application, the application icon of the photo application, and the application icon of the settings application; the 1 widget icon is the icon of widget 1. The user can then tap the video application's icon and find the video he or she wants to watch. Finally, as shown in (b) in FIG. 7 , the mobile phone displays the playback interface of video 1 in the display window 51d.
  • the display control device can not only display objects in the target interface in split screens, but also display other objects in split screens, thus enriching the split-screen display modes of the display control device.
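  • steps C1 to C4 in the video-call example above can be sketched as adding a fourth window that holds icons and then opening a tapped icon's interface inside it. The `screen` list, window naming, and function names are illustrative assumptions for the sketch.

```python
def add_icon_window(screen, icons):
    """Step C2: in response to the sixth input (e.g. a three-finger swipe),
    append a new window containing at least one icon (desktop content)."""
    screen.append({"window": f"window_{len(screen) + 1}",
                   "icons": icons,
                   "content": "desktop"})
    return screen

def open_icon(screen, icon):
    """Step C4: display the interface corresponding to the tapped target
    icon in the newly added (fourth) window."""
    screen[-1]["content"] = f"{icon} interface"
    return screen[-1]["content"]
```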
  • the display control method provided in the embodiment of the present application may be executed by a display control device, or a control module in the display control device for executing the display control method.
  • the display control device provided in the embodiment of the present application is described by taking the display control device executing the display control method as an example.
  • Fig. 8 is a schematic diagram of a possible structure of a display control device provided by the embodiment of the present application.
  • the display control device 600 includes: a receiving module 601 and a display module 602, wherein: the receiving module 601 is configured to receive the user's first input on the first object in the target interface, the first object including at least one of the following: a control, a widget; and the display module 602 is configured to display the first object in the first window in response to the first input received by the receiving module 601.
  • the receiving module 601 is further configured to receive a second input from the user; the display module 602 is further configured to: display at least one split-screen object on the target interface in response to the second input received by the receiving module 601, where the at least one split-screen object includes the above-mentioned first object and, before the second input is received, the target interface does not include the at least one split-screen object; or, in response to the second input received by the receiving module 601, display split-screen prompt information, where the split-screen prompt information indicates at least one split-screen object; or, in response to the second input received by the receiving module 601, display at least one split-screen object in a preset manner.
  • the display control device 600 includes: an acquisition module 603 and a determination module 604; the target interface further includes a second object; the acquisition module 603 is configured to acquire target information in response to the first input received by the receiving module, the target information including at least one of the following: object information, screen information; the determination module 604 is configured to determine the layout of the first object and the second object according to the target information acquired by the acquisition module 603; and the display module 602 is further configured to display the second object according to the layout determined by the determination module 604, and to display the first object in the first window according to the layout; wherein the object information includes at least one of the following: the category of the object, the historical display position of the object, and the display content of the object; the screen information is information of the screen displaying the target interface, and includes at least one of the following: screen size and screen status.
  • the display control device can determine the layout mode of the first object and the second object according to the target information, display the second object according to the layout mode, and display the first object in the first window, so that the display control device can flexibly display the first object and the second object.
  • the display control device 600 includes: a generating module 605; the receiving module 601 is further configured to receive a third input from the user; the generating module 605 is configured to generate a split-screen identifier in response to the third input received by the receiving module 601, the split-screen identifier indicating the target object in the target interface and the layout of the target object; the receiving module 601 is further configured to receive the user's fourth input on the split-screen identifier; and the display module 602 is further configured to, in response to the fourth input received by the receiving module 601, display the target object indicated by the split-screen identifier according to the layout manner indicated by the split-screen identifier.
  • through the third input, the user can trigger the display control device to generate a split-screen identifier that indicates the split-screen combination mode of the target object, so that the user can later, through the split-screen identifier, quickly trigger the display control device to display the target object indicated by the split-screen identifier according to the layout mode indicated by the split-screen identifier.
  • the display control device 600 includes: an executing module 606; a receiving module 601, further configured to receive a fifth input from the user on the first window; an executing module 606, configured to respond to the receiving module 601 receiving According to the fifth input received, the target processing is executed, and the target processing includes at least one of the following: moving the first window; in the case where the display area of the screen where the target interface is located includes the second window, merging the first window and the second window; Delete the first window; resize the first window.
  • after the user triggers the display control device to split-screen display the first object in the target interface, the user can also trigger the display control device to process the split-screen window of the first object according to requirements, so that the process of split-screen displaying objects by the display control device is more flexible.
  • the size of the first window is associated with feature information of the first object, and the feature information includes at least one of the following: object type, and proportion of the object on the target interface.
  • when the target interface is a program interface of an application program, the first object includes controls in the program interface; when the target interface is a desktop, the first object further includes a program interface.
  • modules that must be included in the display control device 600 are indicated by solid-line boxes, such as the receiving module 601; modules that may or may not be included in the display control device 600 are indicated by dotted-line boxes, such as the determination module 604.
  • the display control device may display the first object in the first window after receiving the user's first input on the first object in the target interface.
  • the first object includes at least one of the following: controls and widgets.
  • after receiving the user's first input on a control or widget in the target interface, the display control device in this application can display the control or widget in the first window, that is, display the control or widget in split screen. In this way, the display mode of the electronic device is more flexible.
  • the display control device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal.
  • the device may be a mobile electronic device or a non-mobile electronic device.
  • the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like; this embodiment of the present application is not specifically limited thereto.
  • the display control device in the embodiment of the present application may be a device with an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in this embodiment of the present application.
  • the display control device provided in the embodiment of the present application can realize various processes realized by the method embodiments in FIG. 1 to FIG. 7 , and details are not repeated here to avoid repetition.
  • the embodiment of the present application further provides an electronic device 700, including a processor 701, a memory 702, and a program or instructions stored in the memory 702 and executable on the processor 701. When the program or instructions are executed by the processor 701, each process of the above display control method embodiment can be implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the electronic devices in the embodiments of the present application include the above-mentioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 10 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
  • the electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and other components.
  • the electronic device 100 may also include a power supply (such as a battery) for supplying power to the various components; the power supply may be logically connected to the processor 110 through a power management system, so that functions such as charging management, discharging management, and power consumption management are realized through the power management system.
  • the structure of the electronic device shown in FIG. 10 does not constitute a limitation to the electronic device.
  • the electronic device may include more or fewer components than shown in the figure, combine certain components, or arrange components differently; details will not be repeated here.
  • the user input unit 107 is configured to receive the user's first input on the first object in the target interface, the first object including at least one of the following: a control, a widget; the display unit 106 is configured to display the first object in the first window in response to the first input received by the user input unit 107.
  • the user input unit 107 is further configured to receive a second input from the user; the display unit 106 is further configured to: display at least one split-screen object on the target interface in response to the second input received by the user input unit 107, where the at least one split-screen object includes the above-mentioned first object and, before the second input is received, the target interface does not include the at least one split-screen object; or, in response to the second input received by the user input unit 107, display split-screen prompt information, where the split-screen prompt information indicates at least one split-screen object; or, in response to the second input received by the user input unit 107, display at least one split-screen object in a preset manner.
  • the target interface further includes a second object
  • the processor 110 is configured to acquire target information in response to the first input received by the user input unit 107, where the target information includes at least one of the following: object information, screen information; and to determine, according to the target information, the layout of the first object and the second object.
  • the display unit 106 is also configured to display the second object according to the layout determined by the processor 110; and display the first object in the first window according to the layout.
  • the object information includes at least one of the following: the category of the object, the historical display position of the object, and the display content of the object.
  • the screen information is the information of the screen displaying the target interface, and the screen information includes at least one of the following: screen size, screen status.
  • the user input unit 107 is further configured to receive a third input from the user; the processor 110 is configured to generate a split-screen identifier in response to the third input received by the user input unit 107, the split-screen identifier indicating the target object in the target interface and the layout mode of the target object; the user input unit 107 is further configured to receive the user's fourth input on the split-screen identifier; and the display unit 106 is further configured to, in response to the fourth input received by the user input unit 107, display the target object indicated by the split-screen identifier according to the layout mode indicated by the split-screen identifier.
  • the user input unit 107 is further configured to receive a fifth input from the user on the first window; the processor 110 is configured to execute target processing in response to the fifth input received by the user input unit 107, and the target processing includes At least one of the following: moving the first window; when the display area of the screen where the target interface is located includes the second window, merging the first window and the second window; deleting the first window; adjusting the size of the first window.
  • the size of the first window is associated with feature information of the first object, and the feature information includes at least one of the following: object type, and proportion of the object on the target interface.
  • when the target interface is a program interface of an application program, the first object includes controls in the program interface; when the target interface is a desktop, the first object further includes a program interface.
  • the electronic device after the electronic device receives a user's first input on the first object in the target interface, the electronic device may display the first object in the first window.
  • the first object includes at least one of the following: controls and widgets.
  • after receiving the user's first input on a control or widget in the target interface, the display control device in this application can display the control or widget in the first window, that is, display the control or widget in split screen. In this way, the screen-splitting mode of the electronic device is more flexible.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processing unit 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera).
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072 .
  • the touch panel 1071 is also called a touch screen.
  • the touch panel 1071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 1072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • Memory 109 may be used to store software programs as well as various data, including but not limited to application programs and operating systems.
  • the processor 110 may integrate an application processor and a modem processor, wherein the application processor mainly handles the operating system, user interfaces, and application programs, while the modem processor mainly handles wireless communication. It can be understood that the modem processor may not be integrated into the processor 110.
  • the embodiment of the present application also provides a readable storage medium.
  • the readable storage medium stores programs or instructions.
  • when the program or instructions are executed by the processor, each process of the above display control method embodiments can be implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the processor is the processor in the electronic device described in the above embodiments.
  • the readable storage medium includes computer readable storage medium, such as computer read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk or optical disk, etc.
  • the embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run programs or instructions to implement each process of the above display control method embodiment, achieving the same technical effect; to avoid repetition, details are not repeated here.
  • the chip mentioned in the embodiments of the present application may also be referred to as a system-on-chip or a system-on-a-chip.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses a display control method and apparatus, an electronic device, and a medium. The method comprises: receiving a first input from a user on a first object in a target interface, the first object comprising a control and/or a widget; and in response to the first input, displaying the first object in a first window.
PCT/CN2022/111192 2021-08-13 2022-08-09 Procédé et appareil de commande d'affichage, et dispositif électronique et support Ceased WO2023016463A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110932873.6A CN113783995A (zh) 2021-08-13 2021-08-13 显示控制方法、装置、电子设备和介质
CN202110932873.6 2021-08-13

Publications (1)

Publication Number Publication Date
WO2023016463A1 true WO2023016463A1 (fr) 2023-02-16

Family

ID=78837786

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/111192 Ceased WO2023016463A1 (fr) 2021-08-13 2022-08-09 Procédé et appareil de commande d'affichage, et dispositif électronique et support

Country Status (2)

Country Link
CN (1) CN113783995A (fr)
WO (1) WO2023016463A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113783995A (zh) * 2021-08-13 2021-12-10 维沃移动通信有限公司 显示控制方法、装置、电子设备和介质
CN114489400B (zh) * 2022-01-13 2025-09-30 维沃移动通信有限公司 界面控制方法、装置、电子设备及介质
CN114442881A (zh) * 2022-01-25 2022-05-06 维沃移动通信有限公司 一种信息显示方法及装置、电子设备和可读存储介质
CN114879872A (zh) * 2022-05-12 2022-08-09 维沃移动通信有限公司 显示方法、装置、电子设备及存储介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110471725A (zh) * 2019-07-02 2019-11-19 华为技术有限公司 一种分屏方法及电子设备
US20210096698A1 (en) * 2019-09-27 2021-04-01 Development Guild DDI, Inc. Systems and methods for indicating organizational relationships between objects
CN113783995A (zh) * 2021-08-13 2021-12-10 维沃移动通信有限公司 显示控制方法、装置、电子设备和介质

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101957173B1 (ko) * 2012-09-24 2019-03-12 삼성전자 주식회사 터치 디바이스에서 멀티윈도우 제공 방법 및 장치
CN108984065A (zh) * 2018-07-10 2018-12-11 Oppo广东移动通信有限公司 分屏显示的处理方法、装置、存储介质及电子设备
CN109800045B (zh) * 2019-01-14 2021-10-29 维沃移动通信有限公司 一种显示方法及终端
CN110727382A (zh) * 2019-09-06 2020-01-24 华为技术有限公司 一种分屏显示方法及电子设备
CN111597006A (zh) * 2020-05-19 2020-08-28 Oppo广东移动通信有限公司 应用分屏方法、装置、存储介质及电子设备
CN112540740B (zh) * 2020-12-09 2025-03-11 维沃移动通信有限公司 分屏显示方法、装置、电子设备和可读存储介质

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110471725A (zh) * 2019-07-02 2019-11-19 华为技术有限公司 一种分屏方法及电子设备
US20210096698A1 (en) * 2019-09-27 2021-04-01 Development Guild DDI, Inc. Systems and methods for indicating organizational relationships between objects
CN113783995A (zh) * 2021-08-13 2021-12-10 维沃移动通信有限公司 显示控制方法、装置、电子设备和介质

Also Published As

Publication number Publication date
CN113783995A (zh) 2021-12-10


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22855444

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09.07.2024)

122 Ep: pct application non-entry in european phase

Ref document number: 22855444

Country of ref document: EP

Kind code of ref document: A1