
CN107783749A - A kind of display methods of view data, device and mobile terminal - Google Patents

A kind of display methods of view data, device and mobile terminal Download PDF

Info

Publication number
CN107783749A
CN107783749A
Authority
CN
China
Prior art keywords
display screen
image data
display
pipeline
target image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711108423.5A
Other languages
Chinese (zh)
Inventor
梅正怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd filed Critical Hisense Mobile Communications Technology Co Ltd
Priority to CN201711108423.5A priority Critical patent/CN107783749A/en
Publication of CN107783749A publication Critical patent/CN107783749A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1431Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Embodiments of the invention provide an image data display method, an image data display apparatus, and a mobile terminal. The method includes: obtaining a layer corresponding to a first display screen and a layer corresponding to a second display screen; determining a first pipeline allocated by a display controller to the first display screen and a second pipeline allocated to the second display screen; calling a graphics processor and/or the first pipeline of the display controller to synthesize the layer corresponding to the first display screen into first target image data, and outputting the first target image data to the first display screen through the first pipeline of the display controller for display; and calling the graphics processor to synthesize the layer corresponding to the second display screen into second target image data, and outputting the second target image data to the second display screen through the second pipeline of the display controller for display, so that the first display screen and the second display screen can display image data simultaneously.

Description

Image data display method and device and mobile terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a method for displaying image data, an apparatus for displaying image data, and a mobile terminal.
Background
With the development of mobile communication technology, mobile terminals such as mobile phones are more and more popular, and great convenience is brought to life, study and work of people.
Users' demands for mobile terminals vary, and some mobile terminals are configured with two screens, one being a main screen and the other a sub-screen.
When the user operates the mobile terminal, one or more relevant layers are acquired according to the current scene, and the mobile terminal calls certain image hardware resources to synthesize the one or more layers into image data for display on a screen.
At present, if the main screen needs to display image data, it exclusively occupies the image hardware resources to perform layer composition; similarly, if the sub-screen needs to display image data, it exclusively occupies the image hardware resources to perform layer composition.
Therefore, while the main screen is composing layers the sub-screen cannot compose, and vice versa, so only one screen can display image data at any given time, and screen operability is poor.
Disclosure of Invention
The embodiment of the invention provides a display method and device of image data and a mobile terminal, and aims to solve the problem that only one screen displays the image data at the same time.
According to an aspect of the present invention, there is provided a method for displaying image data, which is applied in a mobile terminal having a first display screen, a second display screen, a graphics processor and a display controller, the method comprising:
obtaining a layer corresponding to the first display screen and a layer corresponding to the second display screen;
determining a first pipe assigned to the first display screen and a second pipe assigned to the second display screen by the display controller;
calling the first pipeline of the graphics processor and/or the display controller to synthesize layers corresponding to the first display screen into first target image data, and outputting the first target image data to the first display screen through the first pipeline of the display controller for displaying;
and calling the graphics processor to synthesize the layer corresponding to the second display screen into second target image data, and outputting the second target image data to the second display screen for displaying through the second pipeline of the display controller.
Optionally, the determining the first pipeline allocated by the display controller to the first display screen and the second pipeline allocated to the second display screen includes:
inquiring a first pipeline allocated to the first display screen and a second pipeline allocated to the second display screen in the display controller according to a preset mapping relation between the display screen and the pipelines;
or,
and according to a preset allocation rule, allocating one part of pipelines of the display controller to the first display screen as a first pipeline, and allocating the other part of pipelines of the display controller to the second display screen as a second pipeline.
Optionally, the invoking the first pipeline of the graphics processor and/or the display controller to synthesize the layer corresponding to the first display screen into first target image data includes:
calling the graphics processor to synthesize the layer corresponding to the first display screen into first target image data, and storing the first target image data in a preset memory;
and extracting the first target image data from the memory.
Optionally, the invoking the first pipeline of the graphics processor and/or the display controller to synthesize the layer corresponding to the first display screen into first target image data includes:
and inputting each layer corresponding to the first display screen into a respective first pipeline of the display controller to synthesize the first target image data.
Optionally, the invoking the first pipeline of the graphics processor and/or the display controller to synthesize the layer corresponding to the first display screen into first target image data includes:
calling the graphics processor to synthesize a part of layers corresponding to the first display screen into first intermediate image data, and storing the first intermediate image data in a preset memory;
extracting the first intermediate image data from the memory;
and calling the first pipeline of the display controller, and synthesizing the first intermediate image data and the layer corresponding to the other part of the first display screen into first target image data.
Optionally, the invoking the graphics processor to synthesize the layer corresponding to the second display screen into second target image data includes:
calling the graphic processor to synthesize the layer corresponding to the second display screen into second intermediate image data, and storing the second intermediate image data in a preset memory;
and extracting the second intermediate image data from the memory and decoding it into second target image data suitable for processing by a timing control circuit.
According to another aspect of the present invention, there is provided an apparatus for displaying image data, which is applied in a mobile terminal having a first display screen, a second display screen, a graphic processor and a display controller, the apparatus comprising:
the layer acquiring module is used for acquiring a layer corresponding to the first display screen and a layer corresponding to the second display screen;
a pipeline determining module for determining a first pipeline allocated to the first display screen and a second pipeline allocated to the second display screen by the display controller;
the first layer synthesis module is used for calling the graphics processor and/or the first pipeline of the display controller to synthesize the layer corresponding to the first display screen into first target image data;
the first target image data output module is used for outputting the first target image data to the first display screen for display through the first pipeline of the display controller;
the second layer synthesis module is used for calling the graphics processor to synthesize the layer corresponding to the second display screen into second target image data;
and the second target image data output module is used for outputting the second target image data to the second display screen for displaying through the second pipeline of the display controller.
Optionally, the pipeline determining module comprises:
the relationship distribution submodule is used for inquiring a first pipeline distributed to the first display screen in the display controller and a second pipeline distributed to the second display screen according to a preset mapping relationship between the display screen and the pipeline;
or,
and the rule distribution submodule is used for distributing one part of pipelines of the display controller to the first display screen as a first pipeline and distributing the other part of pipelines of the display controller to the second display screen as a second pipeline according to a preset distribution rule.
Optionally, the first image layer synthesizing module includes:
the first calling sub-module is used for calling the graphics processor to synthesize the layer corresponding to the first display screen into first target image data and storing the first target image data in a preset memory;
and the first extraction submodule is used for extracting the first target image data from the memory.
Optionally, the first image layer synthesizing module includes:
and the second calling sub-module is used for inputting each layer corresponding to the first display screen into a respective first pipeline of the display controller to synthesize first target image data.
Optionally, the first image layer synthesizing module includes:
the third calling sub-module is used for calling the graphics processor to synthesize a part of layers corresponding to the first display screen into first intermediate image data, and storing the first intermediate image data in a preset memory;
the second extraction submodule is used for extracting the first intermediate image data from the memory;
and the fourth calling submodule is used for calling the first pipeline of the display controller and synthesizing the first intermediate image data and the layer corresponding to the other part of the first display screen into first target image data.
Optionally, the second image layer synthesizing module includes:
the fifth calling submodule is used for calling the graphics processor to synthesize the layer corresponding to the second display screen into second intermediate image data, and the second intermediate image data is stored in a preset memory;
and the decoding submodule is used for extracting the second intermediate image data from the memory and decoding it into second target image data suitable for processing by a timing control circuit.
According to another aspect of the present invention, there is provided a mobile terminal comprising a first display screen, a second display screen, a graphics processor, a display controller, a processor, a memory and a computer program stored on the memory and executable on the processor;
the computer program realizes the steps of the method for displaying image data when executed by the processor.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, the mobile terminal has a first display screen, a second display screen, a graphics processor and a display controller. It simultaneously obtains the layer corresponding to the first display screen and the layer corresponding to the second display screen, and determines the first pipeline allocated by the display controller to the first display screen and the second pipeline allocated to the second display screen. It calls the graphics processor and/or the first pipeline of the display controller to synthesize the layer corresponding to the first display screen into first target image data, which is output to the first display screen for display through the first pipeline of the display controller; at the same time, it calls the graphics processor to synthesize the layer corresponding to the second display screen into second target image data, which is output to the second display screen for display through the second pipeline of the display controller. Because the pipeline resources of the display controller are allocated to the first display screen and the second display screen at the same time, neither screen monopolizes them; the layers corresponding to both screens can therefore be synthesized simultaneously, and image data can be displayed on the first display screen and the second display screen simultaneously.
Drawings
FIG. 1 is a flow chart illustrating the steps of a method for displaying image data according to an embodiment of the present invention;
FIGS. 2A and 2B are diagrams illustrating layer composition according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a graphics processor compositing layers corresponding to a first display screen, according to an embodiment of the invention;
FIG. 4 is a diagram illustrating a display controller synthesizing layers corresponding to a first display screen according to an embodiment of the invention;
FIG. 5 is a diagram illustrating a graphics processor and display controller blending layers corresponding to a first display screen according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a graphics processor compositing layers corresponding to a second display screen, in accordance with an embodiment of the invention;
fig. 7 is a block diagram showing a configuration of an image data display device according to an embodiment of the present invention;
fig. 8 is a block diagram showing a partial structure of a mobile phone related to a terminal provided by an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to fig. 1, a flowchart illustrating steps of a method for displaying image data according to an embodiment of the present invention is shown.
In a specific implementation, the embodiments of the present invention may be applied to a mobile terminal, such as a mobile phone, a tablet computer, a personal digital assistant, a wearable device (such as glasses, a watch, etc.), and the like.
The operating systems of these mobile terminals may include Android (Android), IOS, Windows Phone, Windows, and so on.
In the embodiment of the present invention, the mobile terminal has at least two display screens, that is, the mobile terminal may have at least a first display screen and a second display screen, and the two display screens are generally located on opposite sides of the terminal.
One of the display screens is generally used as the main screen: a high-resolution screen (e.g., 1280×720 or 1920×1080), such as a Thin Film Transistor (TFT) liquid crystal screen, located on the front of the mobile terminal and used to display various information of the mobile terminal.
The other display screen is generally used as the sub-screen: a low-resolution screen (e.g., 480×320 or 640×480), such as a TFT liquid crystal screen or an ink screen, located on the back of the mobile terminal and used to assist in displaying various states of the mobile terminal, including battery status, contacts, unread messages, missed calls, a dial-up keypad, weather information, e-book reading, and the like.
In this embodiment of the present invention, the first display screen may be a main screen, and the second display screen may be a sub-screen, or the first display screen may be a sub-screen, and the second display screen may be a main screen, and so on, which is not limited in this embodiment of the present invention.
For example, if the first display screen is a liquid crystal screen, the second display screen may be an ink screen.
For another example, if the first display screen is an ink screen, the second display screen may be a liquid crystal screen.
Of course, the first display screen and the second display screen of the mobile terminal may be in other configurations besides the main screen and the sub-screen, for example, both the first display screen and the second display screen may be high-resolution screens, or both the first display screen and the second display screen may be low-resolution screens, and so on, which is not limited in this embodiment of the present invention.
In addition, the mobile terminal has image hardware resources such as a Graphics Processing Unit (GPU) and a display controller (Overlay), and can be used for layer composition.
Layer composition can be illustrated with the Launcher (desktop) of the Android system. As shown in fig. 2A, there are four layers in a certain scene: a status bar 201 (displaying information such as battery level and time) and a navigation bar 202, both drawn by SystemUI (the system user interface); wallpaper 203, provided by the wallpaper service; and an icon layer 204, drawn by the Launcher application. As shown in fig. 2B, in the composed result the navigation bar 202 is at the bottom, the status bar 201 is at the top, and the middle mixed layer 205 is formed by blending the wallpaper 203 (of which only a part is visible) with the icon layer 204. Layer composition thus means displaying these four layers on the display screen according to their predetermined display areas.
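The layer-composition idea above can be sketched as follows. This is a hypothetical Python illustration, not the patent's implementation: each layer carries a z-order and a display area, and layers are blended bottom-to-top with the standard source-over alpha rule.

```python
# Minimal sketch of layer composition (illustrative only): each layer has a
# z-order, a display region, and RGBA pixels; layers are blended bottom-to-top
# with the "source over" alpha rule.

from dataclasses import dataclass

@dataclass
class Layer:
    z: int          # stacking order, lower = further back
    x: int          # top-left of the layer's display area
    y: int
    w: int
    h: int
    pixels: list    # rows of (r, g, b, a) tuples, a in 0..1

def compose(layers, screen_w, screen_h):
    """Blend layers onto an opaque black frame, bottom-to-top."""
    frame = [[(0, 0, 0) for _ in range(screen_w)] for _ in range(screen_h)]
    for layer in sorted(layers, key=lambda l: l.z):
        for row in range(layer.h):
            for col in range(layer.w):
                sy, sx = layer.y + row, layer.x + col
                if not (0 <= sy < screen_h and 0 <= sx < screen_w):
                    continue                     # clip to the screen
                r, g, b, a = layer.pixels[row][col]
                dr, dg, db = frame[sy][sx]
                # source-over: out = src * a + dst * (1 - a)
                frame[sy][sx] = (r * a + dr * (1 - a),
                                 g * a + dg * (1 - a),
                                 b * a + db * (1 - a))
    return frame
```

In the Launcher scene of fig. 2A, the wallpaper would be a low-z opaque layer and the icon layer a higher-z partially transparent layer, producing the mixed layer 205 of fig. 2B.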
The display controller (Overlay) provides a plurality of pipelines (pipes) and a plurality of LayerMixers (layer compositors). The number of pipes provided by the Overlay differs between chips: some lower-performance chips may provide 4 pipes, while some higher-performance chips may provide 6-9 pipes.
Generally, an RGB pipe and a VG pipe form a pipe pair, where the RGB pipe carries RGB image data corresponding to the UI (User Interface), and the VG pipe carries RGB or YUV data corresponding to the Camera or Video.
A pipe is an input element of a LayerMixer: the layers in the pipes can be fed into a LayerMixer for composition, and the output of the LayerMixer corresponds to a display device such as a Liquid Crystal Display (LCD), a Television (TV), or a High Definition Multimedia Interface (HDMI) sink.
It should be noted that, in a Graphics Processing Unit (GPU), besides layer composition, other processing may be performed on the layers, such as texture processing, drawing, and the like, which is not limited in this embodiment of the present invention.
As shown in fig. 1, the method may specifically include the following steps:
step 101, obtaining a layer corresponding to the first display screen and a layer corresponding to the second display screen.
In a specific implementation, the first display screen and the second display screen may display the same content at the same time, or may display different content, so that the layer corresponding to the first display screen and the layer corresponding to the second display screen that are obtained at the same time may be the same layer or may be different layers, which is not limited in this embodiment of the present invention.
Taking a browser as an example, after receiving a page Document, the browser parses a markup language in the Document into a DOM (Document Object Model) tree, combines the DOM tree and a CSS (Cascading Style Sheets) to form a rendering tree of a browser-constructed page, where the rendering tree includes a large number of rendering elements, each rendering element is divided into layers, and each layer is loaded to image hardware resources such as a graphics processor and a display controller to perform processing such as rendering texture and composition.
Step 102, determining a first pipeline allocated to the first display screen and a second pipeline allocated to the second display screen by the display controller.
When the first display screen and the second display screen display simultaneously, one part of the pipes of the display controller can be allocated to the first display screen (this part is called the first pipeline), and another part can be allocated to the second display screen (this part is called the second pipeline).
It should be noted that the first pipeline and the second pipeline may be all pipelines of the display controller, or may be partial pipelines of the display controller, which is not limited in this embodiment of the present invention.
In one embodiment of the present invention, step 102 may include the following sub-steps:
and a substep S11, querying a first pipeline allocated to the first display screen and a second pipeline allocated to the second display screen in the display controller according to a preset mapping relationship between the display screens and the pipelines.
In the embodiment of the present invention, the pipelines of the display controller may be pre-allocated to each display screen (including the first display screen and the second display screen), and the mapping relationship between each display screen (including the first display screen and the second display screen) and the pipeline of the display controller may be respectively marked.
When the first display screen and the second display screen are displayed simultaneously, the pipeline corresponding to the first display screen can be inquired as the first pipeline according to the mapping relation of the prior mark, and the pipeline corresponding to the second display screen can be inquired as the second pipeline.
In another embodiment of the present invention, step 102 may include the following sub-steps:
and a substep S12, allocating a part of the pipes of the display controller to the first display screen as a first pipe and allocating another part of the pipes of the display controller to the second display screen as a second pipe according to a preset allocation rule.
In the embodiment of the invention, when the first display screen and the second display screen are displayed simultaneously, the pipeline of the display controller can be distributed to the first display screen and the second display screen in real time according to a certain distribution rule.
In an example of an allocation rule, allocation may be performed according to the number of layers.
For example, if the number of layers corresponding to the first display screen is large and the number of layers corresponding to the second display screen is small, more pipelines may be allocated to the first display screen and fewer pipelines may be allocated to the second display screen.
In another example of an allocation rule, the allocation may be in accordance with load balancing.
For example, if the pipe allocated to the first display screen is idle and the pipe allocated to the second display screen is busy, a portion of the pipe allocated to the first display screen may be allocated to the second display screen.
Of course, the above allocation rules are merely examples; when implementing the embodiment of the present invention, a person skilled in the art may set other allocation rules according to actual needs, which is not limited by the embodiment of the present invention.
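The layer-count allocation rule described above can be sketched as follows. This is a hypothetical Python illustration under stated assumptions (proportional split with a floor of one pipe per screen); the patent does not prescribe a specific formula.

```python
def allocate_pipes(total_pipes, layers_first, layers_second):
    """Split the Overlay's pipes between two screens in proportion to their
    layer counts, guaranteeing each screen at least one pipe.

    Returns (pipes_for_first_screen, pipes_for_second_screen)."""
    if total_pipes < 2:
        raise ValueError("need at least one pipe per screen")
    # Proportional share for the first screen, rounded to the nearest pipe.
    share = round(total_pipes * layers_first / (layers_first + layers_second))
    # Clamp so each screen keeps at least one pipe.
    first = min(max(share, 1), total_pipes - 1)
    return first, total_pipes - first
```

A load-balancing rule could be layered on top by periodically recomputing this split with measured pipe utilization in place of layer counts.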
Step 103, invoking the first pipeline of the graphics processor and/or the display controller to synthesize the layer corresponding to the first display screen into first target image data.
And 104, outputting the first target image data to the first display screen for display through the first pipeline of the display controller.
In a specific implementation, different compositing modes may be selected according to the scene for composing the layer corresponding to the first display screen: composition by the graphics processor alone, by the display controller alone, or by the graphics processor and the display controller together.
The synthesized first target image data can be output to the first display screen through the first pipeline of the display controller and displayed in the first display screen.
In one embodiment of the present invention, step 103 may comprise the following sub-steps:
and a substep S21, invoking the graphics processor to synthesize the layer corresponding to the first display screen into first target image data, and storing the first target image data in a preset memory.
And a substep S22 of extracting the first target image data from the memory.
In the embodiment of the present invention, if the layers corresponding to the first display screen do not change, they may be composed by the graphics processor alone, with the first pipeline that the display controller allocates to the first display screen serving only as the transmission channel.
For example, the standby interface of the mobile terminal has several layers, such as a status bar (at the top of the screen, displaying signal strength, battery level, etc.), a navigation bar (at the bottom of the screen, displaying virtual keys), the wallpaper, and the Launcher (application icons). When the mobile terminal is idle with the screen on, these layers do not change, and at this time they may be composed by the graphics processor.
As shown in fig. 3, Overlay 303 has n pipes (i.e., pipe 1, pipe 2, … …, pipe n, n being a positive integer), where pipe 1 is assigned to the second display screen and pipes 2 to n are assigned to the first display screen 304.
The layers corresponding to the first display screen (i.e., layer 1, layer 2, … …, layer m, where m is a positive integer) are input to the GPU 301, which synthesizes them into the first target image data and stores it in a certain memory (e.g., a frame buffer) 302.
The first target image data is read from the memory (e.g., the frame buffer) 302 through a dedicated bus and may be transmitted to the first display screen 304 through a first pipe (e.g., pipe 2) that Overlay 303 allocated to the first display screen 304; the first display screen 304 then displays the first target image data.
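The GPU-only path of fig. 3 can be sketched as follows. This is a hypothetical Python illustration (the class and function names are invented for clarity, not taken from the patent): the GPU composes all layers into a frame buffer, and the Overlay pipe merely transmits the finished frame.

```python
# Hypothetical sketch of the GPU-only path: the GPU composes all layers into a
# frame buffer, and the first pipe of the display controller acts purely as a
# transmission channel to the first display screen.

class FrameBuffer:
    def __init__(self):
        self.data = None

class GPU:
    def compose(self, layers, fb):
        # Stand-in for real GPU composition: record which layers were merged.
        fb.data = "composite(" + "+".join(layers) + ")"

class OverlayPipe:
    def __init__(self, pipe_id):
        self.pipe_id = pipe_id

    def transmit(self, fb, screen):
        screen.shown = fb.data      # pure pass-through, no composition here

class Screen:
    def __init__(self):
        self.shown = None

def display_static_frame(layers, pipe, screen):
    """GPU-only composition for layers that do not change between frames."""
    fb = FrameBuffer()
    GPU().compose(layers, fb)       # step 1: compose into the frame buffer
    pipe.transmit(fb, screen)       # step 2: Overlay pipe is only a channel
    return screen.shown
```

The key design point this illustrates is that the Overlay contributes no compositing work in this mode, so its pipe count does not limit the number of layers.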
In another embodiment of the present invention, step 103 may comprise the following sub-steps:
and a substep S31, inputting each layer corresponding to the first display screen into a respective first pipeline of the display controller, so as to synthesize the first target image data.
In the embodiment of the present invention, because the composition power consumption of the display controller is low, if the layers corresponding to the first display screen change constantly, they may be composed by the display controller alone, with the first pipelines that the display controller allocates to the first display screen also serving as the transmission channels.
It should be noted that one pipeline of the display controller can compose one layer; therefore, in the embodiment of the present invention, this mode requires the number of layers corresponding to the first display screen to be less than or equal to the number of first pipelines, so that each layer can be input into its own first pipeline to synthesize the first target image data.
For example, the display controller has n pipelines (n is a positive integer), and when only one layer or the number of layers corresponding to the first display screen is less than or equal to n-1 and the layers corresponding to the first display screen are all changed, all layers can be synthesized by using the display controller.
As shown in fig. 4, Overlay 401 has n pipes (i.e., pipe 1, pipe 2, pipe 3, … …, pipe n, n being a positive integer), where pipe 1 is assigned to the second display screen and pipes 2 to n are assigned to the first display screen 402.
The layers corresponding to the first display screen (i.e., layer 1, layer 2, … …, layer m, where m is a positive integer) are input to Overlay 401, which calls the corresponding first pipelines (e.g., pipe 2, pipe 3, … …, pipe n): layer 1 is input to pipe 2, layer 2 to pipe 3, … …, and layer m to pipe n. The layers are thereby synthesized into the first target image data and transmitted to the first display screen 402, which displays it.
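The Overlay-only path of fig. 4 can be sketched as follows. This is a hypothetical Python illustration (names invented for clarity): each changing layer is routed to its own pipe and the LayerMixer composes them directly, bypassing the GPU, which fails if the layers outnumber the available first pipelines.

```python
# Hypothetical sketch of the Overlay-only path: one layer per pipe, composed
# directly by the LayerMixer, with the GPU left idle.

def overlay_compose(layers, pipes):
    """Map each layer onto one pipe, then mix them.

    Raises ValueError when the layer count exceeds the pipe count, which is
    the condition that forces a fallback to GPU or hybrid composition."""
    if len(layers) > len(pipes):
        raise ValueError("layer count exceeds the Overlay's pipe count")
    routed = list(zip(pipes, layers))          # one layer per pipe
    mixed = "mix(" + "+".join(layer for _, layer in routed) + ")"
    return routed, mixed
```

This mirrors the constraint stated above: with pipe 1 reserved for the second display screen, at most n-1 layers of the first display screen can be composed this way.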
In another embodiment of the present invention, step 103 may comprise the following sub-steps:
and a substep S41, invoking the graphics processor to synthesize a part of layers corresponding to the first display screen into first intermediate image data, and storing the first intermediate image data in a preset memory.
And a substep S42 of extracting the first intermediate image data from the memory.
And a substep S43, calling the first pipeline of the display controller, and synthesizing the first intermediate image data and the layer corresponding to the other part of the first display screen into first target image data.
In the embodiment of the present invention, if some of the layers corresponding to the first display screen change while the others do not change, or the number of layers corresponding to the first display screen exceeds the synthesis upper limit of the display controller, the layers corresponding to the first display screen may be synthesized jointly by the graphics processor and the display controller, with the first pipeline allocated to the first display screen by the display controller used as the transmission channel.
For example, when the user performs a sliding operation on the mobile terminal, the status bar, the navigation bar, and the wallpaper are generally unchanged layers, while the application icons on the Launcher change. In this case, the status bar, the navigation bar, and the wallpaper are synthesized by the graphics processor, and the resulting first intermediate image data is transmitted, together with the Launcher layer, to the display controller for further synthesis.
It should be noted that, for the layers corresponding to the first display screen that do not change, the graphics processor may synthesize them into the first intermediate image data and store that data in the memory. When synthesizing the first target image data of the next frame, the first intermediate image data may be read directly from the memory without repeating the synthesis, which reduces the workload of the graphics processor and thus the power consumption of the mobile terminal.
As shown in fig. 5, Overlay503 has n pipelines (i.e., pipeline 1, pipeline 2, pipeline 3, …, pipeline n, where n is a positive integer), where pipeline 1 is assigned to the second display screen and pipelines 2 through n are assigned to the first display screen 504.
A part of the layers corresponding to the first display screen (i.e., layer 1, layer 2, …, layer m, where m is a positive integer) is input to the GPU501, and the GPU501 synthesizes these layers into first intermediate image data and stores it in a memory (e.g., a frame buffer) 502.
The first intermediate image data is read from the memory (e.g., frame buffer) 502 through a dedicated bus. Together with the layers corresponding to the other part of the first display screen (i.e., layer m+1, …, layer m+n-2), it is input to Overlay503, which calls the first pipelines (e.g., pipeline 2, pipeline 3, …, pipeline n) allocated to the first display screen 504 and synthesizes the first intermediate image data and the remaining layers into first target image data. The first target image data is transmitted to the first display screen 504, which displays it.
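The mixed GPU/display-controller path above can be sketched as follows. This is a hypothetical model of the caching behavior only: the function names (`gpu_compose`, `compose_frame`), the cache key, and the dictionary-based layer representation are illustrative assumptions, not part of the patented implementation.

```python
# Hypothetical sketch of the hybrid path: unchanged layers are pre-composed
# once by the GPU into first intermediate image data and cached in a frame
# buffer, so subsequent frames reuse the cached result and only the changing
# layers are blended again, reducing GPU workload and power consumption.

frame_buffer_cache = {}

def gpu_compose(layers):
    # Stand-in for GPU composition: merge layer dicts in z-order.
    out = {}
    for layer in layers:
        out.update(layer)
    return out

def compose_frame(static_layers, dynamic_layers, cache_key="first_screen"):
    # Reuse the cached intermediate data when the static layers are unchanged.
    if cache_key not in frame_buffer_cache:
        frame_buffer_cache[cache_key] = gpu_compose(static_layers)
    intermediate = frame_buffer_cache[cache_key]
    # The display controller then blends the intermediate data with the
    # still-changing layers to form the first target image data.
    return gpu_compose([intermediate] + dynamic_layers)

static = [{"status_bar": 1}, {"wallpaper": 2}]
frame1 = compose_frame(static, [{"launcher": "page1"}])
frame2 = compose_frame(static, [{"launcher": "page2"}])  # cache hit: GPU skipped
```

The second frame reads the cached intermediate data instead of re-synthesizing the status bar and wallpaper, which is the power-saving effect the embodiment describes.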
And 105, calling the graphics processor to synthesize the layer corresponding to the second display screen into second target image data.
And 106, outputting the second target image data to the second display screen for displaying through the second pipeline of the display controller.
In a specific implementation, regarding the synthesis of the layers corresponding to the second display screen, the driving modes of the first display screen and the second display screen are generally different, and the data displayed by the second display screen needs to be decoded into a specific form.
When synthesis is performed by the graphics processor, the synthesized display data can be fetched from the memory for decoding; when the display controller performs the synthesis, the synthesized display data is output directly to the display screen and cannot be fetched for decoding. Therefore, the graphics processor is used to synthesize the layers corresponding to the second display screen separately.
The synthesized second target image data can be output to a second display screen through a second pipeline of the display controller and displayed in the second display screen.
In one embodiment of the present invention, step 105 may comprise the sub-steps of:
and a substep S61, invoking the graphics processor to synthesize the layer corresponding to the second display screen into second intermediate image data, and storing the second intermediate image data in a preset memory.
And a substep S62, extracting the second intermediate image data from the memory, and decoding the second intermediate image data into second target image data suitable for the processing of the timing control circuit.
In the embodiment of the present invention, the graphics processor is used to synthesize the layers corresponding to the second display screen separately, and the synthesized data is decoded into TCON (timing controller) data, with the second pipeline allocated to the second display screen by the display controller used as the transmission channel.
As shown in fig. 6, Overlay603 has n pipelines (i.e., pipeline 1, pipeline 2, …, pipeline n, where n is a positive integer), where pipeline 1 is assigned to the second display screen 604 and pipelines 2 through n are assigned to the first display screen.
The layers corresponding to the second display screen (i.e., layer 1, layer 2, …, layer m, where m is a positive integer) are input to the GPU601, and the GPU601 synthesizes them into second intermediate image data and stores it in a memory (e.g., a frame buffer) 602.
The second intermediate image data is read from the memory (e.g., frame buffer) 602 through a dedicated bus and decoded into TCON data (i.e., the second target image data). The second target image data is transmitted to the second display screen 604 through the second pipeline (e.g., pipeline 1) allocated to the second display screen 604 by the Overlay603, and the second display screen 604 displays the second target image data.
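The second-screen path above can be sketched as follows. The 4-level grayscale quantization standing in for the TCON decode step is purely an illustrative assumption (e-ink panels, for example, are often driven with per-pixel grayscale/waveform data rather than RGB); real TCON formats are panel-specific.

```python
# Illustrative sketch of the second-screen path: the GPU composes the layers
# into second intermediate image data, which is then decoded into a form the
# timing controller (TCON) of the second screen can drive. The decode step
# below (quantizing 0-255 luminance to 4 grayscale levels) is a hypothetical
# stand-in for a real, panel-specific TCON format.

def gpu_compose(layers):
    # Merge per-pixel luminance layers in z-order (stand-in for real GPU work).
    out = dict(layers[0])
    for layer in layers[1:]:
        out.update(layer)
    return out

def decode_to_tcon(pixels, levels=4):
    # Hypothetical decode: map 0-255 luminance onto `levels` grayscale steps.
    step = 256 // levels
    return {pos: min(value // step, levels - 1)
            for pos, value in pixels.items()}

# Two layers addressed by (row, col); the top layer overrides pixel (0, 1).
intermediate = gpu_compose([{(0, 0): 0, (0, 1): 255}, {(0, 1): 128}])
tcon_data = decode_to_tcon(intermediate)  # second target image data
```

Only after this decode is the data pushed through the second pipeline of the display controller, which matches the requirement that the display controller's direct output cannot be intercepted for decoding.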
In the embodiment of the invention, the mobile terminal is provided with a first display screen, a second display screen, a graphics processor, and a display controller. The terminal obtains the layer corresponding to the first display screen and the layer corresponding to the second display screen, and determines the first pipeline allocated to the first display screen and the second pipeline allocated to the second display screen by the display controller. It calls the graphics processor and/or the first pipeline of the display controller to synthesize the layer corresponding to the first display screen into first target image data, which is output to the first display screen for display through the first pipeline of the display controller; at the same time, it calls the graphics processor to synthesize the layer corresponding to the second display screen into second target image data, which is output to the second display screen for display through the second pipeline of the display controller. Because the pipeline resources of the display controller are allocated to the first display screen and the second display screen simultaneously, neither display screen monopolizes those resources, so the layers corresponding to the two display screens can be synthesized at the same time, and image data can be displayed on the first display screen and the second display screen at the same time.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 7, a block diagram of a display device for image data according to an embodiment of the present invention is shown, and is applied in a mobile terminal, where the mobile terminal has a first display screen, a second display screen, a graphics processor, and a display controller, and the device may specifically include the following modules:
the layer obtaining module 701 is configured to obtain the layer corresponding to the first display screen and the layer corresponding to the second display screen;
a pipeline determining module 702, configured to determine a first pipeline allocated to the first display screen and a second pipeline allocated to the second display screen by the display controller;
a first layer composition module 703, configured to invoke the first pipeline of the graphics processor and/or the display controller to combine the layer corresponding to the first display screen into first target image data;
a first target image data output module 704, configured to output the first target image data to the first display screen through the first pipeline of the display controller for display;
a second layer composition module 705, configured to invoke the graphics processor to compose a layer corresponding to the second display screen into second target image data;
a second target image data output module 706, configured to output the second target image data to the second display screen for display through the second pipeline of the display controller.
In one embodiment of the present invention, the pipeline determining module 702 comprises:
the relationship distribution submodule is used for querying, according to a preset mapping relationship between display screens and pipelines, the first pipeline allocated to the first display screen and the second pipeline allocated to the second display screen in the display controller;
or,
and the rule distribution submodule is used for distributing one part of pipelines of the display controller to the first display screen as a first pipeline and distributing the other part of pipelines of the display controller to the second display screen as a second pipeline according to a preset distribution rule.
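The two pipeline-determination strategies above (a preset mapping relationship, or a preset allocation rule that splits the pipelines) can be sketched as follows. The mapping table, function names, and the 1:(n-1) split are illustrative assumptions, not values prescribed by the embodiment.

```python
# Sketch of the two pipeline-determination strategies: a fixed mapping from
# display screens to pipelines, or a rule that divides the display controller's
# pipelines between the two screens. Both concrete values are hypothetical.

PIPE_MAPPING = {"second_screen": [1], "first_screen": [2, 3, 4]}

def pipes_by_mapping(screen):
    # Query the preset mapping relationship between screens and pipelines.
    return PIPE_MAPPING[screen]

def pipes_by_rule(n_pipes, second_screen_count=1):
    # Preset allocation rule: give the second screen a fixed share of the
    # pipelines and allocate the remainder to the first screen.
    pipes = list(range(1, n_pipes + 1))
    return pipes[second_screen_count:], pipes[:second_screen_count]

first, second = pipes_by_rule(4)
```

Either strategy yields the disjoint first-pipeline/second-pipeline sets that prevent one screen from monopolizing the display controller.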
In an embodiment of the present invention, the first image layer synthesizing module 703 includes:
the first calling sub-module is used for calling the graphics processor to synthesize the layer corresponding to the first display screen into first target image data and storing the first target image data in a preset memory;
and the first extraction submodule is used for extracting the first target image data from the memory.
In another embodiment of the present invention, the first image layer synthesizing module 703 includes:
and the second calling sub-module is used for inputting the layers corresponding to the first display screen into the first pipelines of the display controller respectively, so as to synthesize first target image data.
In another embodiment of the present invention, the first image layer synthesizing module 703 includes:
the third calling sub-module is used for calling the graphics processor to synthesize a part of layers corresponding to the first display screen into first intermediate image data, and storing the first intermediate image data in a preset memory;
the second extraction submodule is used for extracting the first intermediate image data from the memory;
and the fourth calling submodule is used for calling the first pipeline of the display controller and synthesizing the first intermediate image data and the layer corresponding to the other part of the first display screen into first target image data.
In an embodiment of the present invention, the second layer composition module 705 includes:
the fifth calling submodule is used for calling the graphics processor to synthesize the layer corresponding to the second display screen into second intermediate image data, and the second intermediate image data is stored in a preset memory;
and the decoding submodule is used for extracting the second intermediate image data from the memory and decoding the second intermediate image data into second target image data suitable for the processing of the time sequence control circuit.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
As shown in fig. 8, for convenience of description, only the parts related to the embodiment of the present invention are shown; for specific technical details that are not disclosed, please refer to the method part of the embodiment of the present invention. The mobile terminal may be any terminal device including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, and the like. The following takes a mobile phone as an example:
fig. 8 is a block diagram showing a partial structure of a mobile phone related to a terminal provided by an embodiment of the present invention. Referring to fig. 8, the handset includes: radio Frequency (RF) circuitry 810, memory 820, input unit 830, display unit 840, sensor 850, audio circuitry 860, wireless fidelity (WiFi) module 870, processor 880, and power supply 890. Those skilled in the art will appreciate that the handset configuration shown in fig. 8 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 8:
the RF circuit 810 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, for processing downlink information of a base station after receiving the downlink information to the processor 880; in addition, the data for designing uplink is transmitted to the base station. In general, RF circuit 810 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 810 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to global system for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 820 may be used to store software programs and modules, and the processor 880 executes various functional applications and data processing of the cellular phone by operating the software programs and modules stored in the memory 820. The memory 820 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 820 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device.
The input unit 830 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 830 may include a touch panel 831 and other input devices 832. The touch panel 831, also referred to as a touch screen, can collect touch operations performed by a user on or near the touch panel 831 (e.g., operations performed by the user on the touch panel 831 or near the touch panel 831 using any suitable object or accessory such as a finger, a stylus, etc.) and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 831 may include two portions, i.e., a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts it to touch point coordinates, and sends the touch point coordinates to the processor 880, and can receive and execute commands from the processor 880. In addition, the touch panel 831 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 830 may include other input devices 832 in addition to the touch panel 831. In particular, other input devices 832 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 840 may be used to display information input by the user or information provided to the user and various menus of the cellular phone. The Display unit 840 may include a first Display screen 841 and a second Display screen 842, and optionally, the first Display screen 841 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like, and the second Display screen 842 may be configured in the form of Electronic Ink (Electronic Ink), or the like.
Further, touch panel 831 can overlay first display 841, and when touch panel 831 detects a touch operation thereon or nearby, the touch panel can transmit the touch operation to processor 880 to determine the type of touch event, and then processor 880 can provide corresponding visual output on first display 841 according to the type of touch event. Although in fig. 8, the touch panel 831 and the first display 841 are two separate components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 831 and the first display 841 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 850, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the first display screen 841 according to the brightness of ambient light, and a proximity sensor that turns off the first display screen 841 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that recognize the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor may also be configured on the mobile phone, and are not described further here.
The audio circuit 860, the speaker 861, and the microphone 862 may provide an audio interface between the user and the handset. The audio circuit 860 can transmit the electrical signal converted from the received audio data to the speaker 861, which converts it into a sound signal for output; on the other hand, the microphone 862 converts collected sound signals into electrical signals, which are received by the audio circuit 860 and converted into audio data. The audio data is then processed by the processor 880 and transmitted via the RF circuit 810 to, for example, another mobile phone, or output to the memory 820 for further processing.
WiFi is a short-distance wireless transmission technology; through the WiFi module 870, the mobile phone can help the user send and receive e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 8 shows the WiFi module 870, it is understood that it is not an essential part of the handset and may be omitted as needed within a scope that does not change the essence of the invention.
The processor 880 is a control center of the mobile phone, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 820 and calling data stored in the memory 820, thereby integrally monitoring the mobile phone. Optionally, processor 880 may include one or more processing units; preferably, the processor 880 may integrate an application processor, a modem processor, a Graphic Processor (GPU), and a display controller (Overlay), wherein the application processor mainly processes an operating system, a user interface, an application program, and the like, the modem processor mainly processes wireless communication, and the Graphic Processor (GPU) and the display controller (Overlay) mainly perform image processing. It will be appreciated that the modem processor described above may not be integrated into processor 880.
The handset also includes a power supply 890 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 880 via a power management system to manage charging, discharging, and power consumption.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In the embodiment of the present invention, the processor 880 included in the terminal further has the following functions:
obtaining the layer corresponding to the first display screen and the layer corresponding to the second display screen;
determining a first pipe assigned to the first display screen and a second pipe assigned to the second display screen by the display controller;
calling the first pipeline of the graphics processor and/or the display controller to synthesize layers corresponding to the first display screen into first target image data, and outputting the first target image data to the first display screen through the first pipeline of the display controller for displaying;
and calling the graphics processor to synthesize the layer corresponding to the second display screen into second target image data, and outputting the second target image data to the second display screen for displaying through the second pipeline of the display controller.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The method, the device and the mobile terminal for displaying image data provided by the present invention are described in detail above, and a specific example is applied in the text to explain the principle and the implementation of the present invention, and the description of the above embodiment is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A display method of image data is applied to a mobile terminal, wherein the mobile terminal is provided with a first display screen, a second display screen, a graphics processor and a display controller, and the method comprises the following steps:
obtaining a layer corresponding to the first display screen and a layer corresponding to the second display screen;
determining a first pipe assigned to the first display screen and a second pipe assigned to the second display screen by the display controller;
calling the first pipeline of the graphics processor and/or the display controller to synthesize layers corresponding to the first display screen into first target image data, and outputting the first target image data to the first display screen through the first pipeline of the display controller for displaying;
and calling the graphics processor to synthesize the layer corresponding to the second display screen into second target image data, and outputting the second target image data to the second display screen for displaying through the second pipeline of the display controller.
2. The method of claim 1, wherein determining a first pipe assigned to the first display screen and a second pipe assigned to the second display screen by the display controller comprises:
inquiring a first pipeline allocated to the first display screen and a second pipeline allocated to the second display screen in the display controller according to a preset mapping relation between the display screen and the pipelines;
or,
and according to a preset allocation rule, allocating one part of pipelines of the display controller to the first display screen as a first pipeline, and allocating the other part of pipelines of the display controller to the second display screen as a second pipeline.
3. The method according to claim 1 or 2, wherein the invoking the first pipeline of the graphics processor and/or the display controller to synthesize the layer corresponding to the first display screen into first target image data includes:
calling the graphics processor to synthesize the layer corresponding to the first display screen into first target image data, and storing the first target image data in a preset memory;
and extracting the first target image data from the memory.
4. The method according to claim 1 or 2, wherein the invoking the first pipeline of the graphics processor and/or the display controller to synthesize the layer corresponding to the first display screen into first target image data includes:
and respectively inputting the layers corresponding to the first display screen into the first pipelines of the display controller to synthesize first target image data.
5. The method according to claim 1 or 2, wherein the invoking the first pipeline of the graphics processor and/or the display controller to synthesize the layer corresponding to the first display screen into first target image data includes:
calling the graphics processor to synthesize a part of layers corresponding to the first display screen into first intermediate image data, and storing the first intermediate image data in a preset memory;
extracting the first intermediate image data from the memory;
and calling the first pipeline of the display controller, and synthesizing the first intermediate image data and the layer corresponding to the other part of the first display screen into first target image data.
6. The method according to claim 1 or 2, wherein invoking the graphics processor to synthesize the layers corresponding to the second display screen into second target image data comprises:
invoking the graphics processor to synthesize the layers corresponding to the second display screen into second intermediate image data, and storing the second intermediate image data in a preset memory;
and extracting the second intermediate image data from the memory and decoding it into second target image data suitable for processing by a timing control circuit.
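The decoding step of claim 6 — converting the GPU's intermediate output into a form a timing control circuit can consume — can be sketched as below. The concrete choice of RGBA input and packed RGB888 output is an assumption for illustration; the patent does not specify pixel formats, and the function name is hypothetical.

```python
def decode_for_tcon(intermediate):
    """Decode intermediate RGBA pixels into a packed RGB byte stream.

    Drops the alpha channel (composition is already finished at this
    point) and emits contiguous R, G, B bytes per pixel, a layout a
    timing control circuit could scan out directly.
    """
    out = bytearray()
    for r, g, b, _alpha in intermediate:
        out += bytes((r, g, b))
    return bytes(out)
```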
7. An apparatus for displaying image data, applied to a mobile terminal having a first display screen, a second display screen, a graphics processor and a display controller, the apparatus comprising:
a layer acquiring module, configured to acquire layers corresponding to the first display screen and layers corresponding to the second display screen;
a pipeline determining module, configured to determine a first pipeline allocated by the display controller to the first display screen and a second pipeline allocated to the second display screen;
a first layer synthesis module, configured to invoke the graphics processor and/or the first pipeline of the display controller to synthesize the layers corresponding to the first display screen into first target image data;
a first target image data output module, configured to output the first target image data to the first display screen for display through the first pipeline of the display controller;
a second layer synthesis module, configured to invoke the graphics processor to synthesize the layers corresponding to the second display screen into second target image data;
and a second target image data output module, configured to output the second target image data to the second display screen for display through the second pipeline of the display controller.
8. The apparatus of claim 7, wherein the pipeline determining module comprises:
a relation allocation submodule, configured to query, according to a preset mapping relation between display screens and pipelines, a first pipeline allocated to the first display screen and a second pipeline allocated to the second display screen in the display controller;
or,
a rule allocation submodule, configured to allocate, according to a preset allocation rule, one part of the pipelines of the display controller to the first display screen as the first pipeline, and the other part of the pipelines of the display controller to the second display screen as the second pipeline.
9. The apparatus according to claim 7 or 8, wherein the second layer synthesis module comprises:
a fifth invoking submodule, configured to invoke the graphics processor to synthesize the layers corresponding to the second display screen into second intermediate image data, and store the second intermediate image data in a preset memory;
and a decoding submodule, configured to extract the second intermediate image data from the memory and decode it into second target image data suitable for processing by a timing control circuit.
10. A mobile terminal, comprising a first display screen, a second display screen, a graphics processor, a display controller, a processor, a memory, and a computer program stored on the memory and executable on the processor;
wherein the computer program, when executed by the processor, implements the steps of the method for displaying image data according to any one of claims 1 to 6.
CN201711108423.5A 2017-11-09 2017-11-09 Image data display method and apparatus, and mobile terminal Pending CN107783749A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711108423.5A CN107783749A (en) 2017-11-09 2017-11-09 Image data display method and apparatus, and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711108423.5A CN107783749A (en) 2017-11-09 2017-11-09 Image data display method and apparatus, and mobile terminal

Publications (1)

Publication Number Publication Date
CN107783749A (en) 2018-03-09

Family

ID=61432634

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711108423.5A Pending CN107783749A (en) Image data display method and apparatus, and mobile terminal

Country Status (1)

Country Link
CN (1) CN107783749A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080195797A1 (en) * 2007-02-13 2008-08-14 Itay Sherman Interface for extending functionality of memory cards
CN202217260U (en) * 2011-09-08 2012-05-09 福州瑞芯微电子有限公司 Multiple screen display controller
CN102622979A (en) * 2012-03-13 2012-08-01 东南大学 LCD controller and display control method thereof
CN103106058A (en) * 2013-01-25 2013-05-15 Tcl集团股份有限公司 Double-screen display method and intelligent display terminal based on android platform
CN103686304A (en) * 2013-12-09 2014-03-26 华为技术有限公司 Layer synthesis method, device and terminal equipment
CN106331831A (en) * 2016-09-07 2017-01-11 珠海市魅族科技有限公司 Method and device for image processing
CN106933525A (en) * 2017-03-09 2017-07-07 青岛海信移动通信技术股份有限公司 Method and apparatus for displaying an image

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109144580A (en) * 2018-07-17 2019-01-04 努比亚技术有限公司 Application process control method, dual-screen terminal and computer-readable storage medium
CN109445860A (en) * 2018-10-17 2019-03-08 京东方科技集团股份有限公司 Method for booting an electronic device system, electronic device, and readable storage medium
CN110362186A (en) * 2019-07-17 2019-10-22 Oppo广东移动通信有限公司 Layer processing method and device, electronic equipment and computer readable medium
CN110362186B (en) * 2019-07-17 2021-02-02 Oppo广东移动通信有限公司 Layer processing method and device, electronic equipment and computer readable medium
CN112083905A (en) * 2020-09-16 2020-12-15 青岛海信移动通信技术股份有限公司 Electronic equipment and layer drawing method thereof

Similar Documents

Publication Publication Date Title
US11237789B2 (en) Display method and apparatus in portable terminal
US8468469B1 (en) Zooming user interface interactions
CN107533450B (en) Display method and terminal equipment
CN106708538B (en) Interface display method and device
CN113360238A (en) Message processing method and device, electronic equipment and storage medium
US20120192113A1 (en) Portable electronic device
CN106919707B (en) Page display method and terminal based on H5
US20220244819A1 (en) Message viewing method and terminal
WO2018161534A1 (en) Image display method, dual screen terminal and computer readable non-volatile storage medium
CN106406892A (en) A shortcut function display method and device for applications and a terminal apparatus
CN111026484A (en) Application sharing method, first electronic device and computer-readable storage medium
CN108513671B (en) Display method and terminal for 2D application in VR equipment
KR20120075826A (en) Method and apparatus for scrolling for electronic device
CN104571979B (en) Method and apparatus for realizing split view
CN106658064B (en) Virtual gift display method and device
CN109407920B (en) A state icon display method, a state icon processing method and related equipment
KR20210057790A (en) Information processing method and terminal
CN107704185B (en) Split screen desktop display method, terminal and computer readable storage medium
US20170046040A1 (en) Terminal device and screen content enlarging method
CN110032309A (en) A split screen method and terminal device
CN107783749A (en) Image data display method and apparatus, and mobile terminal
CN103488450A (en) Method, device and terminal equipment for projecting picture
CN106527656A (en) Display method, device and terminal equipment
CN107809531A (en) Schedule creation method and mobile terminal
TW201511540A (en) Apparatus and method of showing progress bar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20180309)