
CN119274456A - Terminal device and display method - Google Patents


Info

Publication number
CN119274456A
CN119274456A (application CN202310837630.3A)
Authority
CN
China
Prior art keywords
data
processing unit
display
processing
terminal device
Prior art date
Legal status
Pending
Application number
CN202310837630.3A
Other languages
Chinese (zh)
Inventor
杨涛
周逸徉
段利华
欧阳振兴
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202310837630.3A priority Critical patent/CN119274456A/en
Priority to PCT/CN2024/074054 priority patent/WO2025011005A1/en
Publication of CN119274456A publication Critical patent/CN119274456A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00: Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26: Power supply means, e.g. regulation thereof
    • G06F1/32: Means for saving power
    • G06F1/3203: Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234: Power saving characterised by the action undertaken
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20: ... for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34: ... by control of light from an independent source
    • G09G3/3433: ... using light modulating elements actuated by an electric field and being other than liquid crystal devices and electrochromic devices
    • G09G3/344: ... based on particles moving in a fluid or in a gas, e.g. electrophoretic devices
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/02: Constructional features of telephone sets
    • H04M1/0202: Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026: Details of the structure or mounting of specific components
    • H04M1/0266: Details of the structure or mounting of specific components for a display module assembly

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract


The present application provides a terminal device and a display method, applied in the field of terminal technology, to address the problem of high power consumption during display. The terminal device includes a first processing unit, a second processing unit, and a display screen. The first processing unit determines whether the current scene is a scene controlled by the second processing unit; if so, the first processing unit generates processing data, the second processing unit synchronizes the processing data from the first processing unit and generates first display driving data from it, and the display screen displays according to the first display driving data. If the current scene is not a scene controlled by the second processing unit, the first processing unit generates second display driving data, and the display screen displays according to that data. The present application is applied in the display process of the terminal device.

Description

Terminal device and display method
Technical Field
The application relates to the technical field of terminals, in particular to terminal equipment and a display method.
Background
With the continuous development of terminal technology, the types and functions of terminal devices are increasingly rich, and reducing power consumption has become as much a focus as improving performance. During use of a terminal device, display power consumption is a prominent part of the overall power consumption problem. How to reduce the display power consumption of a terminal device without affecting the display effect is therefore a technical problem to be solved.
Disclosure of Invention
The application provides a terminal device and a display method, which can reduce the display power consumption of the terminal device and improve its battery endurance while ensuring that the display effect is not affected.
In order to achieve the technical purpose, the embodiment of the application provides the following technical scheme:
In a first aspect, a terminal device is provided, comprising a first processing unit, a second processing unit, and a display screen. The first processing unit is configured to determine whether the current scene is a scene controlled by the second processing unit, and to generate processing data if it is. The second processing unit is configured to synchronize the processing data from the first processing unit and to generate first display driving data from the processing data; the display screen displays according to the first display driving data. If the current scene is not a scene controlled by the second processing unit, the first processing unit generates second display driving data, and the display screen displays according to the second display driving data. The first processing unit is a high-power-consumption processing unit, and the second processing unit is a low-power-consumption processing unit.
Based on the above scheme, in a scene controlled by the second processing unit, the high-power first processing unit (such as an AP) first generates the processing data, and the low-power second processing unit (such as a CP) then synchronizes that data, generates the display driving data, and sends it to the display. In other words, the high-power first processing unit prepares the data required for display, while the low-power second processing unit sends the processed data to the display screen for display. Part of the display load is thereby moved from the high-power first processing unit to the low-power second processing unit, reducing the display load of the former and increasing its opportunity to sleep. In addition, the low-power second processing unit gains the capability to take over display interaction in the application scene, widening its range of applicable scenes; since its power consumption is lower, the overall power consumption of the terminal device is further reduced and its battery endurance improved.
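The scene-based dispatch described above can be sketched as follows. This is a minimal illustration, not the application's implementation: the scene names and the `Unit` enum are assumptions for demonstration.

```python
from enum import Enum

class Unit(Enum):
    AP = "first processing unit (high power)"
    CP = "second processing unit (low power)"

# Hypothetical scene identifiers; the application names reading/browsing
# and bright-screen standby as scenes the low-power unit may control.
CP_CONTROLLED_SCENES = {"reading_browsing", "bright_screen_standby"}

def dispatch_display(scene: str) -> Unit:
    """Return the unit that generates the final display driving data."""
    if scene in CP_CONTROLLED_SCENES:
        # The AP draws the processing data; the CP synchronizes it and
        # generates the first display driving data for the screen.
        return Unit.CP
    # Otherwise the AP generates the second display driving data itself.
    return Unit.AP
```

In scenes outside the set, the path is unchanged from a conventional design: the AP drives the display directly.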
In one possible implementation manner, the first display driving data is generated by the second processing unit and is used for driving image data displayed by the display screen, and the second display driving data is generated by the first processing unit and is used for driving image data displayed by the display screen.
According to the first aspect, in one possible implementation, the display screen is an electronic ink screen, the processing data comprises preprocessing data, and the preprocessing data is first primitive data drawn and generated by the first processing unit. The second processing unit is further configured to generate first data to be displayed from the preprocessing data, and to perform a conversion operation on the first data to be displayed to generate the first display driving data.
In one possible implementation, the first data to be displayed is a first primitive control obtained by combining the preprocessing data.
In some examples, the second processing unit combines the pre-processed data according to a preset method to generate the first data to be displayed.
In other examples, the second processing unit generates the first data to be displayed from the preprocessing data, reads the pixel data in the first data to be displayed in parallel, converts the pixel data into gray-scale data, and generates the first display driving data from the gray-scale data.
Based on this design, the high-power first processing unit draws and generates the preprocessing data, and the low-power second processing unit combines and converts it to generate the first display driving data. The subsequent display load is handled by the low-power second processing unit, reducing the display load of the high-power first processing unit and effectively lowering the overall power consumption of the terminal device.
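The pixel-to-gray-scale conversion step above can be sketched as follows. This is a plausible stand-in, not the application's formula: the BT.601 luma weights and 16-level (4-bit) quantization are common e-ink conventions assumed here.

```python
def rgb_to_grayscale(pixels):
    """Convert (R, G, B) tuples (0-255) to gray levels using the
    ITU-R BT.601 luma weights (assumed; not fixed by the application)."""
    return [round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in pixels]

def quantize_for_eink(levels, bits=4):
    """Electrophoretic panels commonly drive 16 gray levels (4 bits);
    map 0-255 gray levels down to 0-15 driving codes."""
    step = 256 // (1 << bits)
    return [min(level // step, (1 << bits) - 1) for level in levels]

# Example: white and black pixels become the extreme driving codes.
codes = quantize_for_eink(rgb_to_grayscale([(255, 255, 255), (0, 0, 0)]))
```

Per the description, the second processing unit may perform this conversion over pixel data read in parallel, which this sequential sketch does not model.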
According to the first aspect, in one possible implementation, the display screen is an electronic ink screen, the processing data comprises conversion data, and the conversion data is second primitive data generated after the first processing unit performs a conversion operation on the first primitive data it has drawn. The second processing unit is further configured to generate second data to be displayed from the conversion data, and to determine the second data to be displayed as the first display driving data.
In one possible implementation, the second data to be displayed is a second primitive control obtained by combining the conversion data.
In some examples, the second processing unit combines the converted data according to a preset method to generate the second data to be displayed.
Based on this design, the high-power first processing unit draws and converts the data to generate the conversion data, and the low-power second processing unit combines the conversion data to generate the second data to be displayed. The low-power second processing unit completes the display-sending operation, which increases data processing speed and reduces the power consumption of the terminal device.
In a possible implementation manner of the first aspect, the first processing unit is further configured to classify each primitive data in the current display image, and the first processing unit is further configured to draw and generate processing data according to the classified primitive data.
In some examples, the primitive data includes constant pattern primitives, enumerable pattern primitives, and combined pattern primitives. The primitive is the most basic data unit displayed by the terminal device, and can be understood as effective data actually displayed in the image.
Based on this design, the high-power first processing unit does not need to draw and process the whole image; it only needs to draw and generate the processing data corresponding to each primitive contained in the image. This greatly reduces the data volume and optimizes memory usage without affecting the image display effect, so that the display load can be executed on processing data of smaller size. The amount of computation and memory occupied can thus be effectively reduced, further lowering the overall display power consumption of the terminal device and improving its battery endurance.
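The primitive classification step can be sketched as follows. The three classes (constant, enumerable, combined) come from the description above; the record fields `static` and `values` are hypothetical illustrations, not the application's data model.

```python
def classify_primitive(prim: dict) -> str:
    """Classify one primitive of the current display image."""
    if prim.get("static"):               # fixed pattern, e.g. a frame or icon
        return "constant"
    if prim.get("values") is not None:   # enumerable pattern, e.g. clock digits
        return "enumerable"
    return "combined"                    # composed from other primitives

# Illustrative primitives of a reading/standby screen (names hypothetical).
primitives = [
    {"name": "battery_icon", "static": True},
    {"name": "clock_digit", "values": list(range(10))},
    {"name": "page_body"},
]
classes = [classify_primitive(p) for p in primitives]
```

Classifying first lets the first processing unit draw only per-primitive processing data rather than the full frame, which is the memory saving the paragraph above describes.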
In one possible implementation manner, according to the first aspect, the converting operation includes gray scale processing or timing processing.
According to the first aspect, in one possible implementation manner, the display screen is a non-electronic ink screen, the processing data includes preprocessing data, the preprocessing data is first primitive data generated by drawing by a first processing unit, and the second processing unit is further configured to combine the preprocessing data according to a preset method to generate first display driving data.
Based on this design, when the display screen of the terminal device is a non-electronic-ink screen and the current scene is controlled by the low-power second processing unit, the high-power first processing unit generates processing data of smaller size, and the low-power second processing unit combines that data to generate the first display driving data, effectively reducing the computation required by the display load. In addition, the low-power second processing unit gains the capability to take over display interaction in the application scene, widening its range of applicable scenes, reducing the display power consumption of the terminal device, and improving its battery endurance.
In a possible implementation according to the first aspect, the second processing unit is further configured to take over the display control right from the first processing unit if, after synchronizing the processing data from the first processing unit, the display control right still belongs to the first processing unit.
In a possible implementation manner according to the first aspect, the display control right is a right to control the display subsystem.
It should be understood that the display control right is the right to control the display subsystem, which is responsible for extracting image data from memory and transmitting the processed image data to the display screen. Only one processing unit of the terminal device can hold the display control right at a time.
Based on this design, in a scene controlled by the second processing unit, if the display control right belongs to the first processing unit, it is switched from the high-power first processing unit to the low-power second processing unit. The low-power second processing unit takes over the display control right, processes the data synchronized from the high-power first processing unit, and sends the processed data to the display screen for display, thereby reducing the power consumption of the terminal device and improving its battery endurance.
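The single-owner handover rule above can be sketched as follows. This is a minimal model under assumed names ("AP"/"CP"), not the application's mechanism.

```python
class DisplaySubsystem:
    """Models the rule that only one processing unit holds the display
    control right at a time. Unit names "AP"/"CP" are illustrative."""

    def __init__(self, owner: str = "AP"):
        self.owner = owner  # unit currently holding the control right

    def take_over(self, new_owner: str):
        """Hand the display control right to new_owner.

        Returns the previous owner if ownership changed, else None."""
        if self.owner != new_owner:
            previous, self.owner = self.owner, new_owner
            return previous
        return None

display = DisplaySubsystem()                # control right starts with the AP
handed_over_from = display.take_over("CP")  # CP takes over after data sync
```

A take-over is a no-op when the requesting unit already owns the right, matching the check "if the display control right belongs to the first processing unit".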
In one possible implementation, the scenes controlled by the second processing unit include a reading/browsing scene or a bright-screen standby scene.
In a second aspect, a display method is provided, applied to a terminal device comprising a first processing unit, a second processing unit, and a display screen. If the current scene is a scene controlled by the second processing unit, the first processing unit generates processing data; the second processing unit then synchronizes the processing data from the first processing unit and generates first display driving data from it, and display is performed according to the first display driving data, where the first display driving data is generated by the second processing unit and is used to drive the image data displayed by the display screen. If the current scene is not a scene controlled by the second processing unit, the first processing unit generates second display driving data and display is performed according to it, where the second display driving data is generated by the first processing unit and is used to drive the image data displayed by the display screen.
In a possible implementation manner according to the second aspect, the first processing unit is a high power consumption processing unit, and the second processing unit is a low power consumption processing unit.
In a possible implementation according to the second aspect, the scenes controlled by the second processing unit include a reading/browsing scene or a bright-screen standby scene.
In some examples, the terminal device may determine whether the current scene is controlled by the second processing unit according to the application being used by the user.
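That determination can be sketched as a lookup from the foreground application to a scene. The application-to-scene mapping below is entirely hypothetical; the description only states that the application in use may be the deciding signal.

```python
# Hypothetical mapping from foreground application to display scene.
APP_TO_SCENE = {
    "ebook_reader": "reading_browsing",
    "home_screen_idle": "bright_screen_standby",
}

# Scenes the low-power second processing unit may control.
CP_SCENES = {"reading_browsing", "bright_screen_standby"}

def is_cp_controlled(foreground_app: str) -> bool:
    """True if the current scene should be controlled by the low-power unit."""
    return APP_TO_SCENE.get(foreground_app) in CP_SCENES
```

Unknown applications fall through to the default (AP-controlled) path.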
According to the second aspect, in one possible implementation, the display screen is an electronic ink screen, the processing data comprises preprocessing data, and the preprocessing data is first primitive data drawn and generated by the first processing unit. Generating the first display driving data from the processing data by the second processing unit comprises: generating, by the second processing unit, first data to be displayed from the preprocessing data, and performing a conversion operation on the first data to be displayed to generate the first display driving data.
In a possible implementation according to the second aspect, the first data to be displayed is a first primitive control obtained by combining the preprocessing data.
According to the second aspect, in one possible implementation, the display screen is an electronic ink screen, the processing data comprises conversion data, and the conversion data is second primitive data generated after the first processing unit performs a conversion operation on the first primitive data it has drawn. Generating the first display driving data from the processing data by the second processing unit comprises: generating, by the second processing unit, second data to be displayed from the conversion data, and determining the second data to be displayed as the first display driving data.
In a possible implementation according to the second aspect, the second data to be displayed is a second primitive control obtained by combining the conversion data.
According to the second aspect, in one possible implementation, the conversion operation includes gray-scale processing or timing processing.
In a possible implementation according to the second aspect, generating the processing data by the first processing unit comprises classifying, by the first processing unit, each primitive data in the currently displayed image, and drawing and generating the processing data according to the classified primitive data.
According to the second aspect, in one possible implementation, the display screen is a non-electronic-ink screen, the processing data comprises preprocessing data, and the preprocessing data is first primitive data drawn and generated by the first processing unit. Generating the first display driving data from the processing data by the second processing unit comprises combining, by the second processing unit, the preprocessing data according to a preset method to generate the first display driving data.
In a possible implementation according to the second aspect, after the second processing unit synchronizes the processing data from the first processing unit, the method further comprises taking over the display control right from the first processing unit if the display control right belongs to the first processing unit.
In a possible implementation manner according to the second aspect, the display control right is a right to control the display subsystem.
In a third aspect, there is provided a terminal device comprising a processor and a memory coupled to the processor, the memory for storing computer readable instructions which, when read from the memory by the processor, cause the terminal device to perform the method of the second aspect or any one of the embodiments of the second aspect.
In a fourth aspect, a system on a chip is provided, comprising at least one processor and at least one interface circuit. The at least one interface circuit is configured to perform a transceiving function and to send instructions to the at least one processor; when executing the instructions, the at least one processor performs the method of the second aspect or any one of its implementations.
In a fifth aspect, there is provided a computer readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method of the second aspect or any one of the embodiments of the second aspect.
In a sixth aspect, there is provided a computer program product for causing a computer to perform the method of the second aspect or any one of the embodiments of the second aspect when the computer program product is run on the computer.
For the technical effects corresponding to the second to sixth aspects and any implementation thereof, reference may be made to the technical effects of the first aspect and any of its implementations, which are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a terminal device in different usage scenarios according to an embodiment of the present application;
Fig. 2 is a schematic flow chart of a display frame of a terminal device according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a system architecture of a terminal device according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a data collaborative management module according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a state collaborative management module according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of a LiteTCON lightweight image processing module according to an embodiment of the present application;
Fig. 8 is a schematic flow chart of a display method according to an embodiment of the present application;
Fig. 9 is a schematic diagram of an interface according to an embodiment of the present application;
Fig. 10 is a schematic diagram of generating display driving data according to an embodiment of the present application;
Fig. 11 is a schematic diagram of memory optimization according to an embodiment of the present application;
Fig. 12 is a schematic diagram of a display effect according to an embodiment of the present application;
Fig. 13 is a schematic diagram of a beneficial effect provided by embodiments of the present application;
Fig. 14 is a schematic diagram of yet another beneficial effect provided by embodiments of the present application;
Fig. 15 is a flowchart of another display method according to an embodiment of the present application;
Fig. 16 is a schematic structural diagram of still another terminal device according to an embodiment of the present application;
Fig. 17 is a schematic structural diagram of still another terminal device according to an embodiment of the present application;
Fig. 18 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
In the description of the present application, "/" indicates an "or" relationship between associated objects; for example, A/B may mean A or B. "And/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" covers the three cases of A alone, A and B together, and B alone, where A and B may each be singular or plural, unless otherwise stated.
In the description of the present application, unless otherwise indicated, "a plurality" means two or more. "At least one of" the following items means any combination of those items, including any combination of single items or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
In addition, to facilitate a clear description of the technical solutions of the embodiments of the present application, the words "first", "second", etc. are used to distinguish between identical or similar items having substantially the same function and effect. Those skilled in the art will appreciate that "first", "second", etc. do not limit quantity or execution order, and do not necessarily indicate a difference.
In embodiments of the application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "for example" should not be construed as preferred or more advantageous than other embodiments or designs. Rather, such words are intended to present related concepts in a concrete fashion that is readily understood.
The particular features, structures, or characteristics described in the application may be combined in any suitable manner in one or more embodiments. In the various embodiments of the present application, the sequence numbers of the processes do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments.
Some optional features of the embodiments of the present application may, in some scenarios, be implemented independently of other features to solve corresponding technical problems and achieve corresponding effects, and may also, in some scenarios, be combined with other features as required.
In the present application, the same or similar parts of the embodiments may refer to each other unless otherwise specified. In the embodiments of the application, unless otherwise specified or logically conflicting, terms and/or descriptions in different embodiments are consistent and may be mutually referenced. The embodiments of the present application do not limit the scope of the present application.
In addition, the network architectures and service scenarios described in the embodiments of the present application are intended to describe the technical solutions of the embodiments more clearly, and do not constitute a limitation on the technical solutions provided by the embodiments. A person of ordinary skill in the art will appreciate that, as network architectures evolve and new service scenarios emerge, the technical solutions provided by the embodiments of the present application remain applicable to similar technical problems.
In order to better understand the technical solution of the present application, technical terms and related concepts possibly related to the embodiments of the present application are first described below.
1. Heterogeneous collaborative display architecture
A heterogeneous collaborative display architecture is an architectural model for controlling device display. In some scenarios, a heterogeneous collaborative display architecture may refer to a device that includes different processing units (also called computing modules, computing units, processing cores, etc.). By way of example, the processing units may include, but are not limited to, application processors (application processor, AP), coprocessors (coprocessor, CP), graphics processing units (graphics processing unit, GPU), and the like. Different operating systems (OSs) may be deployed on the processing units; for example, one operating system may be deployed on the AP, an embedded real-time operating system (such as Huawei LiteOS) may be deployed on the CP, and a further operating system may be deployed on the GPU.
In some embodiments, when a device employing a heterogeneous collaborative display architecture provides a display function, the different processing units need to cooperate with each other, for example, by communicating and by synchronizing data or status; the execution logic by which they cooperate to complete the display function may be referred to as heterogeneous collaborative display.
2. Electronic ink screen
An electronic ink screen, generally referred to as an ink screen, is a screen that uses electrophoretic display (electrophoretic display, EPD) technology to implement interface display. The display effect of an electronic ink screen closely resembles that of conventional paper, so it is also called "electronic paper".
An electronic ink screen consists of two substrates whose surfaces carry a plurality of display units, each of which can be regarded as a pixel. A plurality of small volumes of electronic ink adhere to the substrate surface. The electronic ink consists of a plurality of liquid microcapsules; each pixel corresponds to at least one liquid microcapsule, and positively and negatively charged particles are sealed inside each microcapsule. The positive and negative particles have different colors; for example, the positive particles are white and the negative particles are black. When a level is applied to the substrate, the positive and negative particles in the liquid microcapsule are respectively attracted to or repelled toward the two ends of the microcapsule, so that each pixel presents a black or white appearance.
An electronic ink screen does not need a backlight: ambient light strikes the screen and is reflected to the user's eyes. This mimics the characteristics of ink on paper and keeps the displayed text comfortable and clear under any light source.
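The black-or-white behavior described above can be captured in a tiny model. This is an illustrative sketch only, assuming (as in the example above) that the white particles are positively charged and the black particles negatively charged; the function and level names are hypothetical.

```python
def pixel_color(viewing_electrode_level: str) -> str:
    """Illustrative model of one e-ink microcapsule.

    Assumption (from the example above): positive particles are white,
    negative particles are black. A negative level on the viewing-side
    electrode attracts the positive (white) particles to the surface.
    """
    if viewing_electrode_level == "negative":
        return "white"  # positive/white particles pulled toward the viewer
    if viewing_electrode_level == "positive":
        return "black"  # negative/black particles pulled toward the viewer
    raise ValueError("level must be 'positive' or 'negative'")
```

A pixel therefore holds its appearance until the applied level changes, which is why e-ink consumes power only on refresh.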
3. Lightweight timing controller (lite timing controller, LiteTCON)
The lightweight timing controller is a timing controller that controls the screen display of an electronic ink screen. When the screen of an ordinary terminal device (such as a mobile phone) switches to a new image, the OS performs drawing, rendering, and compositing of the image, and the composited image is sent directly to the screen for display to complete the switch. However, due to the physical characteristics of the electronic ink screen, each pixel changes according to its corresponding level change. Therefore, on an electronic ink screen, the image composited by the OS can only be displayed normally after strict timing processing by the lightweight timing controller.
4. CP full control display mode
The CP full control display mode is a mode in which a low-power processing unit controls the display interaction of the terminal device. Ordinarily, because the AP has very strong processing and computing capability, the AP performs the display interaction of the terminal device (for example, in a reading display scenario, displaying reading content in the reading area of a display page and displaying the next page in response to a user page-turn operation). The CP is a low-power general-purpose processing unit with limited computing power that can only complete simple computing tasks (for example, in a reading display scenario, displaying the time and battery level in the status bar of a display page). In the CP full control display mode, a display effect equivalent to that of the AP can be achieved through the CP, and the CP completes the display interaction of the terminal device (for example, in the reading display scenario, displaying reading content in the reading area, displaying the next page in response to a user page-turn operation, and displaying the time, battery level, and the like in the status bar).
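The division of labor described above can be sketched as a simple routing rule. This is a minimal sketch under an assumed task vocabulary; the function and task names are illustrative, not part of the patent.

```python
def route_display_task(task: str, cp_full_control: bool) -> str:
    """Return which processing unit handles a display task (sketch)."""
    simple_tasks = {"status_bar_time", "status_bar_battery"}
    if cp_full_control:
        # in CP full control display mode, the CP does all display interaction
        return "CP"
    # otherwise the AP handles complex interaction, the CP only simple tasks
    return "CP" if task in simple_tasks else "AP"
```

For example, a page-turn in the reading scenario routes to the AP normally, but to the CP once the full control mode is active.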
5. Display control right
The display subsystem (display subsystem, DSS) is a general term for display-related hardware, responsible for retrieving image data from memory and transmitting the processed image data to a display screen. The DSS may be controlled by the operating system in a processing unit, and at any given time the DSS may be controlled by only one processing unit; the right of that processing unit to control the DSS may be referred to as the display control right.
Illustratively, a device employs a heterogeneous architecture whose processing units include an AP and a CP. In some embodiments, the DSS may be controlled by the AP while the device is in a high performance mode. When the device switches from the high performance mode to a low power mode, the DSS may switch from AP control to CP control, with the CP taking over the display control right. The process of switching control of the DSS between two processing units may be referred to as display control right switching.
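The handover between AP and CP control can be sketched as follows. This is a minimal sketch under the assumption that control follows the power mode directly; the class, function, and mode names are illustrative.

```python
class DSS:
    """Sketch: a display subsystem controlled by exactly one unit at a time."""

    def __init__(self, owner: str = "AP"):
        self.owner = owner  # unit currently holding the display control right

    def switch_control(self, new_owner: str) -> None:
        if new_owner not in ("AP", "CP"):
            raise ValueError("unknown processing unit")
        self.owner = new_owner  # display control right switching


def on_power_mode_change(dss: DSS, mode: str) -> None:
    # high performance mode -> AP controls; low power mode -> CP takes over
    dss.switch_control("AP" if mode == "high_performance" else "CP")
```

The single-owner invariant is the key point: only the unit named in `owner` may drive the DSS at any moment.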
At present, there are many types of terminal devices, and the electronic ink screen can be applied to terminal devices with interface display functions, such as mobile phones and tablet computers. Taking a terminal device using an electronic ink screen as an example, fig. 1 shows a schematic diagram of usage duration in different scenarios of the terminal device provided by an embodiment of the application.
It can be understood that the scenarios in fig. 1 are scenarios in which the user is using the terminal device, and do not include the screen-locked standby scenario in which the user is not using the electronic ink screen terminal device.
As shown in fig. 1, when a terminal device using an electronic ink screen is in use, the time spent in reading and browsing scenarios (e.g., reading documents, browsing notes) is longer than in other scenarios (e.g., entertainment, social networking); reading and browsing account for more than 50% of usage time. Accordingly, the display power consumption of the electronic ink screen terminal device is mainly concentrated in reading and browsing scenarios, that is, the display power consumption generated in reading and browsing scenarios is higher than in other scenarios.
In the related art, an electronic ink screen terminal device generally adopts a heterogeneous collaborative display architecture. When the display service function is implemented, display loads such as drawing, rendering, and compositing of images are generally handed uniformly to the AP in the system for execution (for example, the display interaction while an application runs on the electronic ink screen terminal device is completed by the AP). The CP is responsible only for simple display tasks (such as a simple control like a clock), has limited application scenarios, and cannot complete the display interaction of a running application. However, although the AP has strong computing capability and can provide a better display effect, its power consumption is high, so the overall system power consumption during display is high.
It will be appreciated that operations related to image display are referred to as display load in the embodiments of the present application. Illustratively, table 1 provides a comparison of the display performance of electronic ink screen terminal devices of different brands.
TABLE 1
From the information shown in Table 1, the chip energy efficiency ordering from high to low is: chip 2 > chip 3 > chip 1.
According to the information shown in table 1, the display load of brand 1 is deployed on the AP, and the computing power and energy efficiency of the chip 1 used by brand 1 are low. Therefore, brand 1 has low power consumption and long battery life, but a poor display effect (reading is not clear and afterimage problems occur). Moreover, brand 1 operates its own ecosystem; its system is singular and closed, so brand 1 has relatively limited functions, cannot readily install applications from other systems, has an operation mode different from that of mobile platforms, and imposes a high learning cost on users.
As shown in table 1, the display loads of brand 2 and brand 3 are likewise deployed on the AP, and the computing power of both chip 2 and chip 3 is high. Therefore, brand 2 and brand 3 have the same display effect: both can provide high-definition reading and page turning, and no afterimage interference occurs when the electronic ink screen switches images. However, chip 2 and chip 3 differ in design process; chip 2 has a better design process, so chip 2 has higher energy efficiency than chip 3 and an advantage in base power consumption. That is, in terms of overall system power consumption, brand 2 is superior to brand 3. Moreover, both brand 2 and brand 3 are compatible with the Android system, so both can run Android, install various applications, and operate in the same manner as Android, giving users a low learning cost.
Having described the display performance of electronic ink screen terminal devices in the related art, the following describes the processing flow of the display load when such a device displays an image in the related art.
Fig. 2 is a schematic flow chart of a display framework of an electronic ink screen terminal device according to an embodiment of the present application. As shown in fig. 2, the terminal device adopts a heterogeneous collaborative display architecture whose processing units include an AP with relatively high computing capability and a CP with relatively low power consumption. The terminal device system deploys the UI display load entirely onto the AP.
As shown in fig. 2, the system of the terminal device may include an application layer, a system layer, a driver layer, and a hardware layer.
It will be appreciated that in the embodiment of the present application, various operations performed by the terminal device that need to consume power consumption of the terminal device are referred to as loads.
In the embodiment of the application, the application layer includes system applications (application, APP) that come with the terminal device and/or third party applications.
A system application may also be referred to as an embedded application, that is, an application program forming part of the functions implemented by the terminal device. A third party application may also be referred to as a downloadable application, that is, an application that can provide its own internet protocol multimedia subsystem (internet protocol multimedia subsystem, IMS) connection. A downloadable application may be pre-installed in the terminal device or may be downloaded by a user and installed in the terminal device.
In the embodiment of the application, the application layer is used for determining the display content of the UI of the terminal equipment. As shown in fig. 2, the application layer includes a reading application, a work application, a notes application, and the like. If the reading application is started, the terminal device UI displays the reading content (such as a novel document, a chapter, a novel name, time and the like) corresponding to the reading application.
In an embodiment of the application, the system layer includes a UI display load including UI drawing, UI rendering, synthesizer (composer) synthesis, grayscale clipping, and TCON image processing.
In the embodiment of the application, the system layer is used for drawing and rendering the display content of the terminal equipment UI determined by the application layer.
It can be understood that, because on an electronic ink screen each pixel presents a black or white appearance according to its corresponding level change, the electronic ink screen, compared with other types of display screens, additionally requires gray-scale clipping and timing controller (timing controller, TCON) image processing.
In the embodiment of the application, UI drawing is mainly used to complete image position calculation and the drawing operation of each layer in an image. Specifically, UI drawing may complete the drawing operation through processing of UI events, animation, measurement, layout, drawing, and the like. When a UI element such as a view (control) is operated by a user in a specific manner, the terminal device generates an event corresponding to that UI element, abbreviated as a UI event. Animation may be used to achieve UI animation effects. Measurement and layout may be used to determine the size and position of each control included in a layer to be drawn, drawing completes the drawing of the controls, and so on.
In the embodiment of the application, UI rendering is mainly used to complete the rendering operation of each layer in the image. Specifically, UI rendering may complete the rendering operation by synchronizing a vertical synchronization (vertical synchronization, VSYNC) signal, updating the rendering cache, and the like. In some embodiments, the VSYNC signal may be used to trigger the drawing, rendering, and compositing of the next frame image. Updating the rendering cache may be used to store the rendered layers in the cache. Alternatively, the related computation in UI rendering may be done by a central processing unit (central processing unit, CPU) or a graphics processing unit (graphics processing unit, GPU).
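The VSYNC-paced pipeline can be illustrated with a small simulation. This is a sketch under the assumption that each VSYNC tick triggers draw, render, and compose for the next frame; the function name and stage labels are illustrative.

```python
def run_frames(n_frames: int, frame_rate_hz: int = 60):
    """Simulate a VSYNC-paced draw/render/compose pipeline (sketch)."""
    period_ms = 1000 / frame_rate_hz  # per-frame budget: ~16.6 ms at 60 Hz
    log = []
    for frame in range(n_frames):
        # each VSYNC tick triggers the next frame's pipeline stages in order
        for stage in ("draw", "render", "compose"):
            log.append((frame, stage))
    return period_ms, log
```

The per-frame budget is why a 60 Hz UI forces the AP to finish all display loads within roughly 16.6 ms, as discussed later.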
In an embodiment of the application, the synthesizer (composer) synthesizes one or more layers of the rendered image.
In the embodiment of the application, gray-scale clipping is mainly used for performing gray-scale processing, clipping and other operations on a synthesized frame of image to obtain an image to be displayed.
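As an illustration of the gray-scale step, the sketch below converts RGB pixels to a small set of gray levels. The 16-level quantization and the luminance weights are assumptions for illustration, not values from the patent.

```python
def grayscale_clip(rgb_pixels, levels: int = 16):
    """Convert RGB pixels to quantized gray levels (illustrative sketch)."""
    out = []
    for r, g, b in rgb_pixels:
        gray = round(0.299 * r + 0.587 * g + 0.114 * b)  # standard luminance
        step = 256 // levels
        out.append(min(gray // step, levels - 1))  # clip into the level range
    return out
```

The output values are what the subsequent TCON stage would translate into drive waveforms for the panel.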
In the embodiment of the application, TCON image processing is mainly used to perform timing processing and image processing on the image to be displayed to obtain display driving data, and to transmit the display driving data to the display screen.
Wherein the display driving data may also be referred to as timing control data. The display drive data may include waveforms and timings required to display pixels on the electronic ink screen.
It can be understood that the electronic ink screen terminal device implements interface display using electrophoretic display technology: by applying different levels, the positively and negatively charged particles in the liquid microcapsules are attracted or repelled, changing each pixel to display white or black and thereby displaying different pictures. Therefore, the electronic ink screen terminal device needs to perform timing processing and image processing on the image to be displayed and convert it into display driving data. The electronic ink screen applies different voltages according to the waveforms and timing in the display driving data and controls the display color of each corresponding pixel, thereby driving the screen to display different content.
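The conversion into waveform-and-timing data can be sketched as a per-pixel drive sequence. Real waveform tables are panel-specific calibration data; this simplified version, with hypothetical names and encoding, just drives each pixel toward its target gray level for a fixed number of phases.

```python
def build_drive_waveform(current_level: int, target_level: int, phases: int = 4):
    """Sketch: per-pixel drive sequence from current/target gray level.

    Encoding (an illustrative assumption): +1 drives toward white,
    -1 drives toward black, 0 means no drive for that phase.
    """
    if target_level == current_level:
        return [0] * phases  # pixel already shows the target level
    direction = 1 if target_level > current_level else -1
    return [direction] * phases
```

A pixel that needs no change receives an all-zero sequence, which is what lets unchanged regions of an e-ink page cost no drive power.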
In an embodiment of the present application, the driver layer includes a general purpose memory manager (UMion), inter-process communication (IPC), shared memory (shared memory), and a Mobile industry processor interface (mobile industry processor interface, MIPI).
In the embodiment of the application, the general memory manager can provide a general memory management interface to manage different types of memories. Inter-process communication is used to transfer or exchange information between different processes. The shared memory is used for sharing data between different processes. The mobile industry processor interface is an interface between the processor and the display screen. The mobile industry processor interface is mainly used for sending display driving data generated by the system layer to a display screen (such as an electronic ink screen), and the display screen displays pictures according to the display driving data.
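Sharing data between processes without copying, as the shared memory component above does, can be illustrated with Python's `multiprocessing.shared_memory`. The function and segment names here are illustrative, not from the patent.

```python
from multiprocessing import shared_memory

def write_frame(name: str, data: bytes) -> shared_memory.SharedMemory:
    """Publish a buffer into a named shared-memory segment (sketch)."""
    shm = shared_memory.SharedMemory(name=name, create=True, size=len(data))
    shm.buf[:len(data)] = data
    return shm  # caller keeps the handle so it can unlink the segment later

def read_frame(name: str, size: int) -> bytes:
    """Attach to the named segment and copy `size` bytes out (sketch)."""
    shm = shared_memory.SharedMemory(name=name)
    data = bytes(shm.buf[:size])
    shm.close()
    return data
```

The reader attaches by name rather than receiving the bytes over a channel, which is the point of shared memory: only a name crosses the process boundary.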
In the embodiment of the present application, the hardware layer includes an AP, double data rate memory (double data rate SDRAM, also referred to as DDR memory), a display screen (display), and a CP.
In the embodiment of the application, the AP is a high-power core that can serve as the main-control central processing unit (central processing unit, CPU); it is connected to peripherals (such as DDR memory and the display screen) and is mainly responsible for running the electronic ink screen terminal device system and its applications. Illustratively, in response to the reading application, the AP is responsible for rendering, compositing, gray-scale clipping, and timing processing of the novel text portion of the display content to obtain display driving data, and transmits the display driving data to the display screen through the MIPI. The DDR memory may be used to store operational data, as well as data exchanged with external memory such as a hard disk. The display screen displays the corresponding picture according to the display driving data transmitted over the MIPI. The CP is a low-power core; a microcontroller (micro control unit, MCU) serves as the CP, is connected to various sensors (such as an acceleration sensor), and is mainly responsible for simple computing tasks and tasks with high real-time requirements. Illustratively, in response to the reading application, the CP generates the image corresponding to the time in the display content.
Because the terminal device frequently needs to execute the UI display load, deploying the UI display load on the high-power AP, as described above, makes the power consumption of the terminal device high and its battery life poor.
It can be seen that, although the electronic ink screen terminal device in the related art adopts a heterogeneous collaborative display architecture, due to the high computing power of the AP, the terminal device deploys all display loads of the user interface (UI) on the AP, and the AP runs the various loads to complete display interaction. Although the AP has strong computing capability, it needs to run in a high-frequency state for a long time to complete the display-related loads, so the power consumption of the terminal device remains high. For example, if the UI frame rate of the terminal device is 60 hertz (Hz), the AP must execute all display-related loads within 16.6 milliseconds, and system power consumption is high. In addition, the low-power CP core in the heterogeneous collaborative display architecture sits idle and is not fully utilized; its application scenarios are limited, causing a waste of resources.
Based on the above, an embodiment of the present application provides a display method applied to a terminal device including a first processing unit and a second processing unit. In the method, the terminal device determines whether the current scene is a scene controlled by the second processing unit. If so, the first processing unit generates processed data, and the second processing unit synchronizes the data from the first processing unit, determines first display driving data according to the processed data, and then transmits the first display driving data to the display screen. Otherwise, the first processing unit generates second display driving data and sends it to the display screen. In the technical solution provided by the embodiment of the application, when the terminal is in a scene controlled by the second processing unit, the first processing unit performs the first processing to obtain processed data, and the second processing unit synchronizes the data, generates display driving data from the processed data, and performs the send-for-display operation. The first processing unit may be a high-power processing unit such as an AP, and the second processing unit may be a low-power processing unit such as a CP. In this way, the terminal device moves part of the UI display load from the AP to the CP, reducing the UI display load on the AP and increasing the probability that the AP can sleep. Since the CP is a low-power core whose execution power consumption is lower than that of the AP, the power consumption of the terminal device can be reduced and its battery life improved.
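The two branches of the method can be sketched as follows, with the AP as the first processing unit and the CP as the second; the step strings are purely illustrative labels, not API names from the patent.

```python
class DisplayPipeline:
    """Sketch of the branching described above (labels are illustrative)."""

    def run(self, cp_controlled_scene: bool) -> list:
        if cp_controlled_scene:
            # second-processing-unit-controlled scene: AP pre-processes,
            # CP syncs, builds driving data, and sends it for display
            return [
                "AP: generate processed data",
                "CP: synchronize data from AP",
                "CP: determine first display driving data",
                "CP: transmit to display screen",
            ]
        # otherwise the AP handles the whole path itself
        return [
            "AP: generate second display driving data",
            "AP: send to display screen",
        ]
```

In the first branch the AP's involvement ends after the hand-off, which is what creates the opportunity for it to sleep.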
The technical solution provided by the embodiment of the present application may be applied to the terminal device 100 or applied to a system including the terminal device 100.
Fig. 3 shows a schematic structural diagram of the terminal device 100. The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a sensor module 130, and an electronic ink screen 140, among others. Wherein the sensor module 130 may include a pressure sensor 130A, a touch sensor 130K, and the like.
The structure illustrated in the embodiment of the present application does not constitute a specific limitation on the terminal device 100. In other embodiments of the application, the terminal device 100 may include more or fewer components than illustrated, may combine certain components, may split certain components, or may arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include a number of different processing units; for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), a coprocessor (coprocessor, CP), an image signal processor (image signal processor, ISP), a controller, a memory, a digital signal processor (digital signal processor, DSP), a baseband processor, a neural-network processing unit (neural-network processing unit, NPU), and the like. The different processing units may be separate devices or may be integrated in one or more processors.
In some embodiments of the present application, the processor 110 may include one or more high-power processing units, such as APs, and one or more low-power processing units, such as CPs. In some examples, if the power consumption of a processing unit is greater than or equal to a power consumption threshold, the processing unit is a high-power processing unit; if its power consumption is below the threshold, it is a low-power processing unit. Optionally, the power consumption threshold may be set by a developer according to actual requirements. The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution.
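The threshold rule above can be written out directly; the unit names and example power figures below are hypothetical.

```python
def classify_units(power_mw: dict, threshold_mw: float) -> dict:
    """Label each processing unit high- or low-power per the threshold rule.

    A unit at or above the threshold counts as high-power; below it, low-power.
    """
    return {name: ("high" if p >= threshold_mw else "low")
            for name, p in power_mw.items()}
```

With a hypothetical 100 mW threshold, `classify_units({"AP": 500.0, "CP": 20.0}, 100.0)` labels the AP high-power and the CP low-power.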
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The terminal device 100 implements display functions through the GPU, the electronic ink screen 140, the AP, the CP, and the like. The GPU is a microprocessor for image processing and is connected to the electronic ink screen 140 and the AP. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information. The AP processes images in cooperation with the CP. The electronic ink screen 140 is used to display images. In some embodiments, the terminal device 100 may include 1 or N electronic ink screens 140, where N is a positive integer greater than 1. In some embodiments of the application, the display screen may be used to display an image after composition of one or more layers. The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the terminal device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area.
The storage program area may store, among other things, an operating system, an application program required for at least one function (such as an image display function), and the like. The storage data area may store data created during use of the terminal device 100, etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The sensor module 130 may include a pressure sensor 130A and a touch sensor 130K, also referred to as a "touch panel". The pressure sensor 130A is used for sensing a pressure signal, and may convert the pressure signal into an electrical signal.
In some embodiments, pressure sensor 130A may be disposed on electronic ink screen 140. When a force is applied to the pressure sensor 130A, the capacitance between the electrodes changes. The terminal device 100 determines the intensity of the pressure according to the change of the capacitance. When a touch operation is applied to the electronic ink screen 140, the terminal device 100 detects the intensity of the touch operation according to the pressure sensor 130A. The terminal device 100 may also calculate the position of the touch from the detection signal of the pressure sensor 130A. In some embodiments, touch operations that act on the same touch location, but at different touch operation strengths, may correspond to different operation instructions.
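Mapping touch strength to different instructions at the same position can be sketched as a threshold lookup. The thresholds and instruction names are illustrative assumptions, not values from the patent.

```python
def touch_instruction(position, pressure: float) -> str:
    """Sketch: same position, different pressure -> different instructions."""
    del position  # same touch location in every case; only pressure differs
    if pressure < 0.3:
        return "tap"           # light touch
    if pressure < 0.7:
        return "preview"       # medium press
    return "context_menu"      # firm press
```

So a light touch and a firm press on the same point can yield distinct operation instructions, as described above.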
The touch sensor 130K is also referred to as a "touch panel". The touch sensor 130K may be disposed on the electronic ink screen 140, and the touch sensor 130K and the electronic ink screen 140 form a touch screen, which is also referred to as a "touch screen". The touch sensor 130K is used to detect a touch operation acting thereon or thereabout, and for example, a user may contact the electronic ink screen 140 through an active capacitive pen, a passive capacitive pen, an electromagnetic pen, or the like, thereby inputting a touch operation to the electronic ink screen 140. The touch sensor 130K may detect the touch operation input to the electronic ink screen 140 by the user through an active capacitive pen, a passive capacitive pen, an electromagnetic pen, or the like.
The touch sensor 130K may also communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through electronic ink screen 140. In other embodiments, the touch sensor 130K may also be disposed on the surface of the terminal device 100, different from the location of the electronic ink screen 140.
For example, in an embodiment of the present application, the touch sensor 130K may be configured to detect a refresh operation that the user inputs through the touch screen to indicate refreshing of the user interface, and to transmit the detected refresh operation to the processor, so that the processor controls refreshing of the content displayed on the electronic ink screen 140.
In other embodiments of the application, the terminal device 100 may include more or fewer components than shown in fig. 3, or may combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The embodiment of the application does not limit the structure and the form of the terminal device 100.
By way of example, taking the case in which the processing units installed in the terminal device 100 include an AP and a CP, fig. 4 shows a schematic system architecture of the terminal device 100 according to an embodiment of the present application.
As shown in fig. 4, the system of the terminal device 100 may include an application layer, a system layer, a driving layer, and a hardware layer.
In the embodiment of the application, the application layer comprises a system application and/or a third party application of the terminal equipment. The application layer is used for determining the display content of the terminal equipment UI. Specific description of the application layer can be found above.
In the embodiment of the application, the system layer comprises a UI display load, a touch interaction load, an AP heterogeneous cooperative network service module and a CP heterogeneous cooperative network service module.
Specifically, the UI display load includes UI drawing, UI rendering, synthesizer (composer) synthesis, grayscale clipping, and TCON image processing. The UI display load is deployed on the AP. A specific description of the UI display load can be found above. The touch interaction load is used for detecting and responding to the touch operation of the user. The AP heterogeneous cooperative network service module is deployed on the AP and comprises a data collaborative management module and a state collaborative management module. The CP heterogeneous cooperative network service module is deployed on the CP and comprises a data collaborative management module, a state collaborative management module, and a LiteTCON lightweight image processing module.
In the embodiment of the application, the terminal device determines whether the current scene is a CP control scene; if so, the AP heterogeneous cooperative network service module and the CP heterogeneous cooperative network service module cooperatively perform the display processing. If the current scene is not a CP control scene, the AP performs the display processing; a specific implementation is described in detail in fig. 2 above.
Compared with the system layer of the system architecture in the related art, the system layer of the system architecture in the embodiment of the application adds the AP heterogeneous cooperative network service module and the CP heterogeneous cooperative network service module. The functions and structures of each module in these newly added service modules are described in detail below.
In the embodiment of the application, the data collaborative management module is used for managing data collaboration between the AP and the CP and for sending the determined display image to the display screen through the MIPI. The data collaborative management module is mainly responsible for operations such as data agreement, data update, and data destruction between the AP and the CP.
For example, fig. 5 shows a schematic structural diagram of a data collaborative management module according to an embodiment of the present application. The AP-side data collaborative management module (which may also be described as an AP data collaborative management module 500) includes a data drawing interface 501, a data management module 502, and a data synchronization module 503. The CP-side data collaborative management module (which may also be described as a CP data collaborative management module 504) includes a data display interface 505, a data management module 502, and a data synchronization module 503.
The data drawing interface 501 on the AP side includes a constant mode primitive drawing unit 5011, an enumeration-able mode primitive drawing unit 5012, and a combined mode primitive drawing unit 5013.
The constant pattern primitive drawing unit 5011 is configured to draw corresponding constant pattern primitive data according to a constant pattern primitive in a current display image, the enumeration pattern primitive drawing unit 5012 is configured to draw corresponding enumeration pattern primitive data according to an enumeration pattern primitive in the current display image, and the combination pattern primitive drawing unit 5013 is configured to draw corresponding combination pattern primitive data according to a combination pattern primitive in the current display image.
The data drawing interface 501 is used to generate processing data by drawing. The processing data includes preprocessing data or conversion data.
Specifically, the data drawing interface 501 is further configured to integrate the primitive data drawn by the primitive drawing units (the constant mode primitive drawing unit 5011, the enumeration mode primitive drawing unit 5012, and the combination mode primitive drawing unit 5013) into the preprocessing data.
Alternatively, after obtaining the preprocessing data, the data drawing interface 501 may further perform a conversion operation on the preprocessing data to obtain conversion data.
In the embodiment of the present application, whether the data drawing interface 501 generates the processing data is related to the usage scenario of the terminal device 100. When the terminal device 100 is currently in the CP control scene (such as a reading browse scene or a bright screen standby scene), the terminal device 100 performs the display interaction between the CP and the AP, and the data drawing interface 501 generates the processing data.
Illustratively, when the terminal device 100 is in a social scene, the display control right belongs to the AP. The data rendering interface 501 generates processing data when the terminal device 100 switches from a social scenario to a reading browsing scenario.
For example, if both the previous scene and the current scene are CP control scenes, the display control right was already switched from the AP to the CP in the first scene, so the CP retains the display control right in the second scene. The display control right is not switched back from the CP to the AP until the terminal usage scene is no longer a CP control scene, at which point the AP takes over display control.
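The hand-over rule described above can be sketched as a minimal Python function. The scene labels and the set of CP control scenes below are illustrative assumptions, not identifiers from the embodiment.

```python
# Hypothetical sketch of the display-control-right rule: the controlling
# processing unit depends only on whether the current scene is a CP
# control scene. Scene labels are assumptions for illustration.
CP_CONTROL_SCENES = {"reading_browse", "bright_screen_standby"}

def controller_for_scene(scene: str) -> str:
    """Return which processing unit holds the display control right."""
    return "CP" if scene in CP_CONTROL_SCENES else "AP"
```

Under this rule, switching between two consecutive CP control scenes leaves the control right with the CP, and any other scene hands it back to the AP, matching the behavior described above.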
The CP-side data display interface 505 includes a constant mode primitive control unit 5051, an enumerable mode primitive control unit 5052, and a combined mode primitive control unit 5053.
The constant mode primitive control unit 5051 generates a constant mode primitive control according to the constant mode primitive data in the processing data, the enumeration mode primitive control unit 5052 generates an enumeration mode primitive control according to the enumeration mode primitive data in the processing data, and the combination mode primitive control unit 5053 generates a combination mode primitive control according to the combination mode primitive data in the processing data.
The constant mode primitive controls comprise a first constant mode primitive control and a second constant mode primitive control. The enumerated mode primitive controls include a first enumerated mode primitive control and a second enumerated mode primitive control. The combined mode primitive control includes a first combined mode primitive control and a second combined mode primitive control.
Optionally, the first constant mode primitive control, the first enumeratable mode primitive control, and the first combined mode primitive control are first primitive controls generated by each primitive control unit in the data display interface 505 according to the pre-processing data.
Optionally, the second constant mode primitive control, the second enumeratable mode primitive control, and the second combined mode primitive control are second primitive controls generated by each primitive control unit in the data display interface 505 according to the conversion data.
The data display interface 505 is further configured to generate data to be displayed according to the primitive control. The data to be displayed comprises first data to be displayed or second data to be displayed.
Specifically, the data display interface 505 is further configured to integrate the first primitive control generated by each primitive control unit into the first data to be displayed.
Or the data display interface 505 is further configured to integrate the second primitive control generated by each primitive control unit into the second data to be displayed.
The data management module 502 on the AP side includes a primitive management unit 5021 and a primitive issuing unit 5022. The primitive management unit 5021 is used for managing the processing data (preprocessing data or conversion data) generated by the data drawing interface. The primitive issuing unit 5022 is configured to issue the processing data (preprocessing data or conversion data) generated by the data drawing interface to the data synchronization module, so that the AP side synchronizes the processing data to the CP side.
The CP-side data management module 502 includes a primitive management unit 5021 and a primitive update unit 5023. The primitive management unit 5021 is configured to manage primitive controls (a first primitive control or a second primitive control) generated by each unit in the data display interface 505. The primitive updating unit 5023 is configured to update each primitive control (the first primitive control or the second primitive control).
The data synchronization module 503 is configured to synchronize data between the AP side and the CP side.
It can be appreciated that the embodiment of the present application is not limited to a specific implementation manner of data coordination interaction between the AP and the CP.
In this way, the data collaborative management module on the AP side and the data collaborative management module on the CP side manage and synchronize data between the AP and the CP through the shared memory.
It should be understood that the data collaborative management module included in the terminal device 100 shown in fig. 5 is only one possible division, and in practical application, the data collaborative management module in the terminal device 100 may include more or less modules, or may have other module division manners, which is not limited in this aspect of the application.
In the embodiment of the present application, the state cooperative management module is configured to manage the synchronized collaboration state of the AP and the CP, and is further configured to determine the processor that currently controls display for the terminal device 100. For example, the state cooperative management module determines whether the processor that currently takes over and controls display on the display screen is the AP or the CP, and coordinates the states of the AP and the CP accordingly. The state cooperative management module is mainly responsible for operations such as collaboration state enabling, collaboration state closing, collaboration state synchronization, and collaboration state updating between the AP and the CP.
For example, fig. 6 shows a schematic structural diagram of a state cooperative management module according to an embodiment of the present application. The state cooperative management module on the AP side (which may also be described as an AP state cooperative management module 600) and the state cooperative management module on the CP side (which may also be described as a CP state cooperative management module 601) each include an enable state interface 602, a cooperative state management module 603, and a cooperative state synchronization module 604.
Specifically, the enable state interface 602 is responsible for opening and closing the heterogeneous collaborative display function, and serves as the entry point of the UI heterogeneous collaborative display.
Illustratively, after the enable state interface 602 is opened, the AP and the CP may cooperatively interact: the AP switches the display control right to the CP, the CP takes over the display control right, and the CP sends display data to the display screen. After the enable state interface 602 is closed, the AP and the CP do not perform cooperative interaction; the display control right belongs to the AP, and the AP is responsible for the display processing and for sending display data to the display screen.
Specifically, the collaboration state management module 603 is responsible for managing and updating the collaboration state between the AP and the CP. The collaboration states include a collaboration-on state, a data drawing state, a data transmission state, a collaboration-off state, and the like. The collaboration-on state indicates that the AP and/or the CP have collaboration enabled, the data drawing state indicates that the AP and/or the CP are drawing display data, the data transmission state indicates that the AP and/or the CP are transmitting data, and the collaboration-off state indicates that the AP and/or the CP have collaboration disabled.
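The four collaboration states listed above can be captured as a small enumeration. A minimal Python sketch; the identifiers are assumptions for illustration, since the embodiment fixes no names:

```python
from enum import Enum, auto

class CollabState(Enum):
    """Collaboration states managed by the collaboration state management
    module 603 (identifiers are illustrative, not from the embodiment)."""
    COLLAB_ON = auto()      # AP and/or CP have collaboration enabled
    DATA_DRAWING = auto()   # AP and/or CP are drawing display data
    DATA_TRANSFER = auto()  # AP and/or CP are transmitting data
    COLLAB_OFF = auto()     # collaboration disabled; the AP controls display
```

The collaborative state synchronization modules on the two sides would then exchange such a state value to keep the AP and the CP consistent.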
Specifically, the collaborative state synchronization module 604 is responsible for synchronizing the states of the APs and CPs. Illustratively, the collaborative state synchronization module 604 on the AP side may complete state synchronization with the collaborative state synchronization module 604 on the CP side via IPC.
It should be understood that the state cooperative management module included in the terminal device 100 shown in fig. 6 is only one possible division, and in practical application, the state cooperative management module in the terminal device 100 may include more or less modules, or may have other module division manners, which is not limited in this aspect of the present application.
In an embodiment of the present application, a LiteTCON lightweight image processing module is deployed on the CP. The LiteTCON lightweight image processing module has a lightweight timing controller (not shown) for providing lightweight ink-screen image processing capability on the CP side. Specifically, the LiteTCON lightweight image processing module is used for processing the data to be displayed generated by the CP-side data collaborative management module, and for transmitting the display driving data obtained after processing to the display screen through the MIPI, so that the display screen displays a corresponding picture according to the processed display driving data.
For example, fig. 7 shows a schematic structural diagram of a LiteTCON lightweight image processing module according to an embodiment of the present application. The LiteTCON lightweight image processing module 700 includes an image input module 701, a processing mode selection module 702, a pass-through processing module 703, an image display module 704, a high definition processing module 705, a parallel reading module 706, a pixel conversion module 707, and a parallel processing module 708.
In the embodiment of the present application, the image input module 701 is configured to receive data to be displayed generated from the CP-side data collaborative management module, and send the data to the processing mode selection module 702.
Specifically, the data received by the image input module 701 may be first data to be displayed or second data to be displayed.
In the embodiment of the present application, the processing mode selection module 702 is configured to determine a processing mode according to the data to be displayed transmitted by the image input module 701, and transmit the data to be displayed to the corresponding processing module.
Specifically, if the transmitted data is the first data to be displayed, the processing mode is determined to be the high definition mode; if the transmitted data is the second data to be displayed, the processing mode is determined to be the pass-through mode.
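The selection rule maps the kind of incoming data directly to a processing mode. A minimal sketch; the string labels for the two data kinds and the mode names are assumptions for illustration:

```python
def select_processing_mode(data_kind: str) -> str:
    """Route data to a LiteTCON processing mode, as described above:
    'first'  - first data to be displayed, still needs image processing,
    'second' - second data to be displayed, already display driving data.
    Labels are hypothetical stand-ins for the two data types."""
    if data_kind == "first":
        return "high_definition"  # handled by the high definition processing module
    if data_kind == "second":
        return "pass_through"     # handled by the pass-through processing module
    raise ValueError(f"unknown data kind: {data_kind!r}")
```

The pass-through branch thus skips all pixel conversion on the CP, while the high definition branch routes the data on to the parallel reading and pixel conversion stages.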
In this embodiment of the present application, the pass-through processing module 703 is the processing module corresponding to the pass-through mode, and is configured to determine the second data to be displayed as display driving data and transmit the display driving data to the image sending and displaying module 704.
In the embodiment of the present application, the image sending and displaying module 704 is configured to send the display driving data processed in different processing modes to the display screen.
Specifically, the image sending and displaying module 704 is connected to the display screen through the MIPI. When the processing mode is the pass-through mode, the display driving data sent by the pass-through processing module 703 is transmitted to the display screen through the MIPI; when the processing mode is the high definition mode, the display driving data sent by the parallel processing module 708 is transmitted to the display screen through the MIPI.
In the embodiment of the present application, the high-definition processing module 705 is a processing module corresponding to a high-definition mode, and is configured to transmit first data to be displayed to the parallel reading module 706.
In an embodiment of the present application, the parallel reading module 706 is configured to read pixel data in the first data to be displayed in parallel, and transmit the pixel data to the pixel conversion module 707.
It can be understood that, in general, the manner in which the AP side reads and writes pixel data from the memory is to read one pixel datum, process it, and then write it back. However, the computing power of the CP is far lower than that of the AP; if the CP side used the same read/write mode as the AP side, the data read/write time would be too long and the CP side could not read quickly enough. Therefore, to overcome the slow data reading and writing on the CP side, in the embodiment of the present application the CP side uses a parallel data reading method: it reads a large block of pixel data, processes the block, and then writes the processed block back. For example, a whole row of pixel data is read, the whole row is processed, and the processed row is written back.
Thus, the data processing speed is increased by the parallel processing mode.
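The contrast between the two access patterns can be sketched in Python with a flat pixel list standing in for memory; the row-at-a-time version performs one bulk read and one bulk write per row instead of one transaction per pixel. The function names are illustrative only:

```python
def process_pixel_by_pixel(pixels, fn):
    """AP-style access: read one pixel, process it, write it back."""
    for i in range(len(pixels)):
        pixels[i] = fn(pixels[i])
    return pixels

def process_row_by_row(pixels, width, fn):
    """CP-style parallel access: read a whole row (width pixels), process
    it, and write the processed row back, reducing the number of separate
    memory transactions."""
    for start in range(0, len(pixels), width):
        row = pixels[start:start + width]                    # one bulk read
        pixels[start:start + width] = [fn(p) for p in row]   # one bulk write
    return pixels
```

Both strategies produce the same result; only the access granularity, and hence the number of memory transactions, differs.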
In an embodiment of the present application, the pixel conversion module 707 is configured to perform image processing on the pixel data, convert the pixel data into gray-scale data, and transmit the gray-scale data to the parallel processing module 708.
In an embodiment of the present application, the parallel processing module 708 is configured to generate display driving data according to the gray scale data, and transmit the display driving data to the image sending and displaying module 704.
It can be understood that, when the CP side sends display, the data displayed on the display screen is display driving data. In the embodiment of the application, the CP side executes corresponding operations according to different types of data transmitted by the AP side. In the through mode, the CP side directly transmits the second data to be displayed to the display screen as display driving data. In the high-definition mode, the CP side performs image processing operations such as conversion operation on the first data to be displayed to obtain display driving data, and then the display driving data is transmitted to the display screen.
It should be understood that the LiteTCON lightweight image processing modules included in the terminal device 100 shown in fig. 7 are only one possible division, and in practical applications, the LiteTCON lightweight image processing modules in the terminal device 100 may include more or fewer modules, or may have other division manners of modules, which is not limited by the present application.
Therefore, after the system layer performs interactive processing through the AP heterogeneous cooperative network service module and the CP heterogeneous cooperative network service module, it can be determined whether the display control right resides with the AP or with the CP, which enriches the display function of the CP. When the display control right resides with the CP, the CP is responsible for sending display data, so the overall power consumption of the system can be reduced.
In the embodiment of the application, the AP side driving layer comprises a general memory manager (ion), inter-process communication (IPC), shared memory, and the MIPI.
For example, the general memory manager ion of the AP-side driver layer may provide a general memory management interface to manage different types of memory. Inter-process communication is used to transfer or exchange information between different processes. The shared memory is used for sharing data between different processes (such as sharing display data between an AP and a CP, etc.). The MIPI is an interface between the AP and the display screen and is used for sending first display data generated by the AP side system layer to the display screen so that the display screen can display the first display data.
In the embodiment of the application, the CP side driving layer comprises a shared memory and MIPI.
The MIPI is connected with the display screen, and is used for sending second display data generated by a system layer at the CP side to the display screen, and the display screen displays the second display data.
In the embodiment of the present application, the hardware layer includes an AP, a double data rate memory (double data rate SDRAM, also referred to as DDR memory), a display screen (display), and a CP. The AP and the CP cooperatively interact to determine which unit holds the display control right. If the display control right belongs to the AP, the AP generates an image and sends the image to be displayed. If the display control right belongs to the CP, the AP generates an image, the CP synchronously acquires the image generated by the AP, and the CP sends display data to the display screen according to the acquired image.
It should be understood that the system architecture of the terminal device 100 shown in fig. 4 is only one possible division manner, and in practical applications, the system architecture of the terminal device 100 may include more or fewer modules, or may have other module division manners, which is not limited by the present application.
It will be understood that, in the embodiments of the present application, the terminal device may perform some or all of the steps in the embodiments; these steps or operations are merely examples, and the embodiments of the present application may also perform other operations or variations of the various operations. Furthermore, the various steps may be performed in an order different from that presented in the embodiments of the application, and it is possible that not all of the operations in the embodiments need to be performed.
By way of example, the technical solutions referred to in the following embodiments may be implemented in the terminal device 100 described above. The following describes the display method provided by the embodiment of the application in detail by combining the drawings and the application scene.
The technical solutions related to the following embodiments are described by using the first processing unit as an AP and the second processing unit as a CP.
Fig. 8 is a schematic flow chart of a display method according to an embodiment of the present application, and the method includes the following steps S801 to S805:
S801, the terminal device determines whether the current scene is a CP control scene.
If yes, step S802 is executed, and if no, step S805 is executed.
In the embodiment of the application, a CP control scene is a scene in which display is controlled by the CP. The CP control scene may be, for example, a reading browsing scene or a bright-screen standby scene, that is, a scene in which the user uses the terminal device to read text or pictures, or a scene in which the user does not operate the terminal device while its screen is on. For example, the user reads a document, a novel, or a note, or browses a web page using the terminal device.
In one possible implementation, the terminal device may determine whether the current scene is a CP control scene according to the application used by the user. For example, if the terminal device starts a reading application in response to a user operation and displays reading content, the current scene may be determined to be a CP control scene. The embodiment of the application does not limit the specific implementation of determining whether the current scene is a CP control scene.
It can be understood that, in the embodiment of the present application, based on the experimental data in fig. 1, the reading browsing scene is preconfigured as a CP control scene, and the CP takes over the display control right of the reading browsing scene. Moreover, when the terminal device is in the bright-screen standby scene, the content displayed by the terminal device is unchanged, and the CP can likewise take over the display control right of the bright-screen standby scene. That is, the low-power-consumption processing unit CP takes over the display control rights of the reading browsing scene and the bright-screen standby scene, so that the system power consumption can be effectively reduced. Other scenes may also be preconfigured as CP control scenes as terminal devices are upgraded and changed.
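One possible shape of the scene check at S801, assuming a lookup from the foreground application plus an idle flag for bright-screen standby; the application names, scene labels, and helper are hypothetical illustrations, not part of the embodiment:

```python
# Hypothetical mapping from the foreground application to a usage scene.
APP_TO_SCENE = {
    "reader_app": "reading_browse",
    "browser_app": "reading_browse",
    "chat_app": "social",
}

def is_cp_control_scene(foreground_app: str, screen_on_idle: bool) -> bool:
    """Bright-screen standby (screen on, no user operation) is a CP control
    scene; otherwise the scene is looked up from the running application."""
    if screen_on_idle:
        return True
    return APP_TO_SCENE.get(foreground_app) == "reading_browse"
```

When this check returns true, the flow proceeds to S802 and the cooperative display path is taken; otherwise the AP performs the display processing alone (S805).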
S802, generating processing data by the AP of the terminal equipment.
In the embodiment of the application, the processing data comprises preprocessing data or conversion data, and the preprocessing data is primitive data which can be displayed on the CP side.
The preprocessing data are primitive data obtained after the AP performs preprocessing operation on the current display image. The conversion data is primitive data obtained after the AP performs conversion operation on the preprocessing data.
The primitive data comprises constant mode primitives, enumeration mode primitives and combination mode primitives. The primitive is the most basic data unit displayed by the terminal equipment. For example, the number "13" is displayed on the current terminal device, and then the primitives of "13" are "1" and "3".
The constant mode primitive is a primitive that is fixed when displayed, such as the Bluetooth icon in the status bar of the terminal device's display interface, or each character (one character is one primitive) in the document content shown in the display interface. An enumerable mode primitive is a primitive that changes when displayed but takes only a limited number of values, such as each digit of the time in the terminal device's display interface. A combined mode primitive is a primitive formed by combining a constant mode primitive and an enumerable mode primitive.
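The three primitive categories can be illustrated with a toy classifier; the dictionary descriptor and its field names are assumptions for illustration only:

```python
def classify_primitive(primitive: dict) -> str:
    """Classify a primitive as 'constant', 'enumerable', or 'combined'.
    A primitive with no set of possible values is fixed when displayed;
    one with a limited value set is enumerable; one that additionally
    contains a constant part combines the two categories. The descriptor
    fields are hypothetical."""
    if "possible_values" not in primitive:
        return "constant"    # e.g. a Bluetooth icon or a document character
    if primitive.get("has_constant_part"):
        return "combined"    # constant part plus enumerable part
    return "enumerable"      # e.g. a time digit drawn from 0-9
```

A status-bar icon would thus classify as constant, a clock digit as enumerable, and a widget mixing both as combined.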
In some embodiments of the present application, the AP of the terminal device performs a preprocessing operation on the current display image to obtain preprocessed data.
The preprocessing operation comprises preprocessing operations such as classifying the primitive data, drawing the primitive data and the like.
Illustratively, the AP classifies each primitive datum in the current display image by primitive category, and draws the classified primitive data to generate the corresponding preprocessing data.
In other embodiments of the present application, the AP of the terminal device performs a preprocessing operation on the current display image to obtain preprocessed data, and then performs a conversion operation on the preprocessed data to obtain converted data.
The conversion operation includes image conversion processing operations such as gray-scale processing and time sequence processing.
For example, after the AP side performs the preprocessing operation on the current display image to obtain the preprocessing data, the AP side may further read the pixel data in the preprocessing data, convert the pixel data into gray-scale data, and generate the conversion data according to the gray-scale data.
The gray-scale data may be any one of three modes of black/white/gray.
In a possible implementation, the AP side reads one pixel datum in the preprocessing data, determines, according to the pixel datum, the corresponding gray-scale data from a preset gray-scale conversion table, and performs timing processing on the gray-scale data to generate the conversion data.
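That table lookup can be sketched as a small quantizer. The three-level black/white/gray table (matching the gray-scale modes mentioned above) and the 8-bit luminance input are assumptions for illustration:

```python
# Hypothetical preset gray-scale conversion table: 8-bit luminance -> level.
GRAYSCALE_TABLE = {0: "black", 128: "gray", 255: "white"}

def to_grayscale(pixel: int) -> str:
    """Map one pixel datum to the nearest entry of the conversion table."""
    nearest = min(GRAYSCALE_TABLE, key=lambda level: abs(level - pixel))
    return GRAYSCALE_TABLE[nearest]
```

The subsequent timing processing would then arrange the resulting gray levels into the waveform sequence expected by the ink screen.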
It can be understood that when the AP generates the processing data, if only the preprocessing operation is performed on the current display image, the resulting preprocessing data is synchronized to the CP side. In that case, the data sent from the AP side has not been converted, so the CP side needs to perform the conversion processing after receiving the preprocessing data, and then send the converted data for display. Alternatively, if, when generating the processing data, the AP performs the preprocessing operation on the current display image and then performs the conversion operation on the resulting preprocessing data, the resulting conversion data is synchronized to the CP side. In that case, the data sent from the AP side has already been converted, so after receiving it the CP side may combine the conversion data according to a preset method and then send it directly for display.
In the embodiment of the application, the AP does not need to draw the current display image, and only needs to draw and generate the preprocessing data corresponding to each primitive data according to the primitive data contained in the current display image.
Illustratively, as shown in fig. 9, the currently displayed image 900 includes a signal icon 901 and a time 902. After determining the current display image 900, the AP classifies the primitive data in the current display image 900 by primitive type to obtain two classes of primitive data: constant mode primitive data (the signal icon) and enumerable mode primitive data (the digits of the time). The AP then draws the corresponding preprocessing data according to the classified primitive data: for example, according to the signal icon it draws constant mode primitive data related to signal icons, such as a Bluetooth icon and an alarm clock icon, and according to the time digits it draws the enumerable mode primitive data for the digits 0, 1, 2, 3, 4, 5, 6, 7, 8, and 9. The related primitive data generated by drawing the different types of primitive data is integrated to obtain the preprocessing data.
Therefore, the preprocessed data obtained after AP preprocessing includes only the related primitive data, and only this preprocessed data needs to be transmitted and stored, which greatly reduces the data size and optimizes memory storage.
In the embodiment of the application, an AP heterogeneous cooperative network service module is deployed on an AP, and comprises a data cooperative management module and a state cooperative management module. The CP heterogeneous cooperative network service module is deployed on the CP and comprises a data cooperative management module, a state cooperative management module and a LiteTCON lightweight image processing module.
The data collaborative management module at the AP side is used for interacting with the data collaborative management module at the CP side and is responsible for managing data collaboration between the AP and the CP. The state collaborative management module at the AP side is used for interacting with the state collaborative management module at the CP side and is responsible for managing the state collaboration between the AP and the CP. The LiteTCON lightweight image processing module at the CP side is used for providing lightweight ink screen image processing capability at the CP side so that the CP takes over display control rights and completes the display interaction.
In step S802, after determining that the current scene is the CP control scene, the data collaborative management module on the AP side of the terminal device generates processing data according to the current display image.
Specifically, the data collaborative management module at the AP side comprises a data drawing interface, a data management module and a data synchronization module.
The data drawing interface comprises a constant mode graphic element drawing unit, an enumeration mode graphic element drawing unit and a combination mode graphic element drawing unit. The constant mode graphic element drawing unit is used for drawing corresponding constant mode graphic element data according to the constant mode graphic element in the current display image, the enumeration mode graphic element drawing unit is used for drawing corresponding enumeration mode graphic element data according to the enumeration mode graphic element in the current display image, and the combination mode graphic element drawing unit is used for drawing corresponding combination mode graphic element data according to the combination mode graphic element in the current display image.
The data drawing interface is also used for integrating the primitive data drawn by each primitive drawing unit into preprocessing data.
The data drawing interface is also used for converting the preprocessed data to obtain converted data.
The data management module comprises a primitive management unit and a primitive issuing unit. Wherein the primitive management unit is used for managing the processing data (preprocessing data or conversion data) generated by the drawing of the data drawing interface. The primitive issuing unit is used for issuing the processing data (the preprocessing data or the conversion data) generated by the drawing of the data drawing interface to the data synchronizing module so as to synchronize the processing data (the preprocessing data or the conversion data) to the CP side by the AP side.
The data synchronization module is used for synchronizing processing data (preprocessing data or conversion data) generated by the AP side to the CP side.
In the embodiment of the application, after the processing data is generated, the data collaborative management module at the AP side stores the processing data in the shared memory, so that the data collaborative management module at the CP side obtains the processing data from the shared memory.
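The shared-memory handoff described above can be sketched with Python's standard `multiprocessing.shared_memory` module; the region name and the raw-bytes payload framing are illustrative assumptions, not the embodiment's actual protocol.

```python
from multiprocessing import shared_memory

# Processing data the AP side wants to hand to the CP side.
payload = b"preprocessed-primitive-data"

# AP side: create a named shared-memory region and store the data in it.
shm = shared_memory.SharedMemory(name="ap_cp_proc_data", create=True,
                                 size=len(payload))
shm.buf[:len(payload)] = payload

# CP side: attach to the same region by name and read the data back.
cp_view = shared_memory.SharedMemory(name="ap_cp_proc_data")
received = bytes(cp_view.buf[:len(payload)])

# Both sides release their mappings; the creator removes the region.
cp_view.close()
shm.close()
shm.unlink()
```

In a real AP/CP system the two sides are distinct processors sharing physical memory, so the synchronization is zero-copy; the sketch only illustrates the write-then-read-by-name pattern.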
S803, the terminal equipment synchronizes the processing data from the AP to the CP.
In the embodiment of the application, after generating the processing data, the AP transmits the processing data to the CP. Accordingly, the CP receives the processing data from the AP.
Specifically, the data collaborative management module at the CP side includes a data display interface, a data management module, and a data synchronization module.
The data synchronization module is used for receiving the processing data sent by the AP side. That is, the data synchronization module at the AP side transmits the processing data to the data synchronization module at the CP side, thereby synchronizing the processing data from the AP to the CP.
In the embodiment of the present application, the manner of synchronizing data by the data synchronization module at the AP side and the data synchronization module at the CP side includes, but is not limited to, shared memory interaction.
In the embodiment of the application, after the CP receives the processing data from the AP, response information is returned to the AP so that the AP can determine that the CP receives the processing data.
In some embodiments of the present application, after the terminal device synchronizes the processing data from the AP to the CP, if the display control right belongs to the AP, the terminal device switches the display control right from the AP to the CP, the CP takes over the display control right, and S804 is then executed.
For example, if the previous scene is not the CP control scene and the current scene is the CP control scene, then when the terminal device displays the previous scene, the AP performs display control and the display control right belongs to the AP. Because the current scene is the CP control scene, the terminal device switches the display control right from the AP to the CP, and the CP takes over the display control right.
In the embodiment of the application, the terminal equipment switches the display control right of the AP to the CP according to the synchronous cooperative state.
In the embodiment of the application, after receiving the response information sent by the CP, the AP side starts the heterogeneous collaborative display function, changes the collaborative state, synchronizes the changed collaborative state to the CP, and switches the display control right of the AP to the CP. Correspondingly, the CP receives and synchronizes the changed cooperative state sent by the AP, takes over the display control right according to the synchronized cooperative state, and carries out the display control right by the CP.
Specifically, the state collaborative management module at the AP side and the state collaborative management module at the CP side each include an enabling state interface, a collaborative state management module, and a collaborative state synchronization module.
The enabling state interface is responsible for opening and closing the heterogeneous collaborative display function and is an entrance of the UI heterogeneous collaborative display. The cooperative state management module is responsible for managing and updating the cooperative state between the AP and the CP. The collaborative state synchronization module is responsible for synchronizing the states of the AP and the CP.
For example, after receiving the response information sent by the CP, the AP opens the enabling state interface of the AP-side state collaborative management module, and in response to the opening operation of the AP-side enabling state interface, the AP-side collaborative state management module changes the collaborative state to collaborative-state-on. The AP-side collaborative state synchronization module synchronizes the changed collaborative state (collaborative-state-on) to the collaborative state synchronization module of the CP-side state collaborative management module. Correspondingly, the CP-side collaborative state management module changes the collaborative state to collaborative-state-on and opens the CP-side enabling state interface, at which point the CP takes over the display control right.
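The enable-and-synchronize handshake that hands the display control right from the AP to the CP can be sketched as follows; the class, method, and state names are illustrative assumptions.

```python
class StateCoordinator:
    """One side's state collaborative management module (AP or CP)."""

    def __init__(self, side):
        self.side = side
        self.collab_state = "off"
        self.peer = None  # the coordinator on the other processing unit

    def enable(self):
        # Enabling-state interface: turn on heterogeneous collaborative
        # display locally, then synchronize the changed state to the peer.
        self.collab_state = "on"
        self.peer.on_state_sync(self.collab_state)

    def on_state_sync(self, state):
        # Collaborative-state synchronization: adopt the peer's state.
        # Once the CP sees "on", it takes over the display control right.
        self.collab_state = state


ap, cp = StateCoordinator("AP"), StateCoordinator("CP")
ap.peer, cp.peer = cp, ap

ap.enable()  # AP enables after receiving the CP's response information
display_owner = "CP" if cp.collab_state == "on" else "AP"
```

The reverse handoff when exiting the CP control scene is symmetric: the CP side changes the state, synchronizes it, and the AP takes the display control right back.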
Optionally, when the terminal device exits from the CP control scene, the CP side closes the heterogeneous collaborative display function, changes the collaborative state, synchronizes the changed collaborative state to the AP, and switches the CP display control right to the AP. Correspondingly, the AP receives and synchronizes the changed cooperative state sent by the CP, takes over the display control right according to the synchronized cooperative state, and carries out the display control right by the AP.
It should be understood that the CP is a low power consumption processing unit, and the CP takes over the display control right, so that the running time of the AP can be reduced, the probability of AP dormancy is increased, and the power consumption is greatly saved.
It can be understood that, when the display control right belongs to the AP, the present application does not limit the execution order of synchronizing the processing data from the AP to the CP and switching the display control right from the AP to the CP; the two may be executed synchronously or sequentially.
In other embodiments of the present application, after the terminal synchronizes the processing data from the AP to the CP, if the display control right belongs to the CP, S804 may be directly performed.
For example, if the previous scene and the current scene are both CP control scenes, the display control right was already switched from the AP to the CP in the previous scene, so the CP still holds the display control right in the current scene. The display control right is not switched from the CP back to the AP until the usage scene of the terminal device is no longer a CP control scene, after which the AP performs display control.
S804, the CP of the terminal equipment determines display driving data according to the processing data, and transmits the display driving data to a display screen of the terminal equipment.
Wherein the display driving data may also be referred to as timing control data. The display drive data may include waveforms and timings required to display pixels on the terminal device.
In some embodiments of the present application, if the processing data synchronized by the CP is the preprocessing data, the CP of the terminal device determines the display driving data according to the processing data as follows: the CP generates first data to be displayed according to the preprocessing data, reads pixel data in the first data to be displayed in parallel, converts the pixel data into gray-scale data, and generates the display driving data according to the gray-scale data.
In one possible implementation manner, after synchronizing the preprocessed data sent by the AP side, the CP combines the preprocessed data according to a preset method to generate first data to be displayed.
Illustratively, based on the example of S802 and fig. 9 above, the preprocessing data includes the constant mode primitive data (the signal icon) and the enumeration mode primitive data ("0", "1", "2", "3", "4", "5", "6", "7", "8", "9" and ":"). The CP side determines the primitive data to be displayed in the preprocessed data, and encapsulates and combines the primitive data to be displayed to generate the first data to be displayed. That is, the CP side packages and combines the signal icon of the constant mode primitive data and the "0", "8" and ":" of the enumeration mode primitive data to generate the first data to be displayed.
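The encapsulate-and-combine step can be sketched as follows; the glyph placeholders and the (mode, key) description of the required primitives are assumptions for illustration.

```python
# Preprocessed primitive data synchronized from the AP side: the constant
# signal icon plus the full enumeration set (digits and colon).
enum_glyphs = {str(d): f"<glyph {d}>" for d in range(10)}
enum_glyphs[":"] = "<glyph :>"
preprocessed = {
    "constant": {"signal_icon": "<signal bitmap>"},
    "enumeration": enum_glyphs,
}


def combine(required):
    # required: (mode, key) pairs naming the primitives the current
    # image actually displays; pick them out of the preprocessed data
    # and package them, in order, as the first data to be displayed.
    return [preprocessed[mode][key] for mode, key in required]


# The example image shows the signal icon and the time "08:08".
frame = combine([("constant", "signal_icon"),
                 ("enumeration", "0"), ("enumeration", "8"),
                 ("enumeration", ":"),
                 ("enumeration", "0"), ("enumeration", "8")])
```

Because every value each enumeration primitive may take is already present, the CP can compose any later clock reading from the same preprocessed data without involving the AP again.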
Specifically, after the data synchronization module of the data collaborative management module at the CP side synchronizes the preprocessed data generated at the AP side, the data display interface of the data collaborative management module at the CP side draws according to the preprocessed data to generate first data to be displayed.
The data display interface of the CP side comprises a constant mode primitive control unit, an enumeration mode primitive control unit and a combination mode primitive control unit.
The constant mode primitive control unit generates a first constant mode primitive control according to constant mode primitive data in the preprocessing data, the enumeration mode primitive control unit generates a first enumeration mode primitive control according to enumeration mode primitive data in the preprocessing data, and the combination mode primitive control unit generates a first combination mode primitive control according to combination mode primitive data in the preprocessing data.
The data display interface is further used for integrating a first primitive control generated by each primitive control unit (a constant mode primitive control unit, an enumerated mode primitive control unit and a combined mode primitive control unit) into first data to be displayed.
The data management module at the CP side comprises a primitive management unit and a primitive updating unit. The primitive management unit is used for managing the first primitive controls (the first primitive controls comprise the first constant mode primitive control, the first enumeration mode primitive control and the first combination mode primitive control) generated by each unit in the data display interface. The primitive updating unit is used for updating each first primitive control.
In one possible implementation, the CP reads the pixel data in the first data to be displayed in parallel by direct memory access (direct memory access, DMA).
In one possible implementation manner, after the terminal device CP side reads the pixel data in the first data to be displayed in parallel, the pixel data is converted into gray-scale data according to a preset gray-scale conversion table.
The preset gray level conversion table can be determined according to experimental tests, historical data or experience.
Specifically, after the pixel data in the data to be displayed is read in parallel, the gray-scale data corresponding to the pixel data is determined from the preset gray-scale conversion table according to the red-green-blue-transparency (red green blue alpha, RGBA) value of the pixel data. In this way, the RGBA value of the pixel data is used as an index, the gray-scale data corresponding to the pixel data is determined through a table lookup, and the pixel data is thereby converted into gray-scale data.
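The table-lookup conversion can be sketched as follows. A real preset table would be indexed by the RGBA value itself; to keep this sketch small it first derives a luminance index from the RGBA components (ITU-R BT.601 integer weights) and uses that as the table index — this indexing scheme is an assumption, not the embodiment's table layout.

```python
# Preset gray-scale table: one entry per 8-bit luminance value, mapping it
# to one of 16 gray levels (a common electronic ink panel depth).
GRAY_TABLE = [lum * 16 // 256 for lum in range(256)]


def to_grayscale(pixels):
    # pixels: iterable of (R, G, B, A) tuples, as read in parallel
    # (e.g. via DMA) from the first data to be displayed.
    out = []
    for r, g, b, a in pixels:
        # Integer BT.601 luminance, used as the lookup index; the alpha
        # channel is ignored in this sketch.
        lum = (77 * r + 150 * g + 29 * b) >> 8
        out.append(GRAY_TABLE[lum])
    return out


levels = to_grayscale([(0, 0, 0, 255),        # black  -> level 0
                       (255, 255, 255, 255),  # white  -> level 15
                       (128, 128, 128, 255)]) # gray   -> level 8
```

The point of the lookup-table approach is that conversion becomes one indexed read per pixel, which suits a low-power-consumption core with limited arithmetic budget.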
In one possible implementation manner, after the terminal device CP side converts the pixel data into the gray-scale data, the conversion operation is performed according to the current display image and the gray-scale data to generate the display driving data.
The conversion operation includes image conversion processing operations such as gray-scale processing and time sequence processing.
Illustratively, as shown in fig. 10, a schematic diagram of generating display driving data is provided according to an embodiment of the present application. S1 in the figure is the current display image data of the terminal device interface (which can be understood as the first frame image). In the second frame image processing period, the CP side performs a conversion operation according to the currently displayed image data S1, the gray-scale data, and the screen refresh rate of the terminal device to generate display driving data S2 (which can be understood as a second frame image). The gray-scale data is obtained from the AP's original processing data read from the memory in parallel by the CP. Subsequently, the display screen of the terminal device displays the display driving data S2. After the terminal device displays the display driving data S2, S2 becomes the current display image data of the next processing period. That is, in the third frame image processing period, the CP side performs the conversion operation according to the display driving data S2, the gray-scale data, and the screen refresh rate of the terminal device to generate display driving data S3 (which can be understood as a third frame image). In this manner, the CP side finally generates n frames of display driving data S2, S3, ..., Sn+1, and sends them to the display screen frame by frame for display. In this way, the display driving data to be displayed is generated from the currently displayed image data, the gray-scale data, and the screen refresh rate of the terminal device.
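The frame-by-frame generation of S2 through Sn+1 can be sketched as follows. The per-pixel update rule here (move one gray level per period toward the target) is a simplified stand-in; real ink-screen timing control applies waveform tables that depend on panel temperature and transition history.

```python
def next_driving_frame(current, target_gray):
    # Produce the next frame from the currently displayed frame: drive
    # each pixel one gray level toward its target value per period.
    return [cur + (tgt > cur) - (tgt < cur)
            for cur, tgt in zip(current, target_gray)]


def generate_sequence(s1, target_gray, n):
    # s1: current display image data (the first frame); target_gray:
    # gray-scale data of the image to reach; n: number of periods.
    frames, current = [], s1
    for _ in range(n):                 # processing periods 2 .. n+1
        current = next_driving_frame(current, target_gray)
        frames.append(current)         # S2, S3, ..., Sn+1
    return frames


# Three pixels: one goes 0->15, one 0->8, one already at its target.
seq = generate_sequence([0, 0, 15], [15, 8, 15], n=15)
```

The sketch shows the essential recurrence of fig. 10: each generated frame becomes the "current display image data" input of the next processing period.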
In the embodiment of the application, after the CP side generates the first data to be displayed, the first data to be displayed is transmitted to the LiteTCON lightweight image processing module, and the image input module transmits the first data to be displayed to the processing mode selecting module.
In the embodiment of the application, the LiteTCON lightweight image processing module at the CP side of the terminal equipment is provided with the lightweight time sequence controller, so that the CP side has the image conversion capability, and the conversion operation is completed in the low-power-consumption core, so that the calculation amount and the memory occupation amount of time sequence control in the conversion operation are reduced.
The processing mode selection module of the LiteTCON lightweight image processing module determines that the current processing mode is the high-definition mode according to the first data to be displayed, and transmits the first data to be displayed to the high-definition processing module.
The high-definition processing module of the LiteTCON lightweight image processing module is a processing module corresponding to a high-definition mode and is used for transmitting the first data to be displayed to the parallel reading module.
The parallel reading module of the LiteTCON lightweight image processing module is used for reading pixel data in the first data to be displayed in parallel and transmitting the pixel data to the pixel conversion module.
The pixel conversion module of the LiteTCON lightweight image processing module is used for performing image processing on the pixel data, converting the pixel data into gray-scale data, and transmitting the gray-scale data to the parallel processing module.
The parallel processing module of the LiteTCON lightweight image processing module is used for generating display driving data according to the gray scale data and transmitting the display driving data to the image transmitting and displaying module.
In other embodiments of the present application, if the processing data synchronized by the CP is the conversion data, the CP of the terminal device determines the display driving data according to the processing data as follows: the CP generates second data to be displayed according to the conversion data, and determines the second data to be displayed as the display driving data.
In one possible implementation manner, after synchronizing the conversion data sent by the AP side, the CP combines the conversion data according to a preset method to generate second to-be-displayed data.
Specifically, after the data synchronization module of the data collaborative management module at the CP side synchronizes the conversion data generated at the AP side, the data display interface of the data collaborative management module at the CP side draws and generates second data to be displayed according to the conversion data.
The constant mode primitive control unit in the CP-side data display interface generates a second constant mode primitive control according to the constant mode primitive data in the conversion data, the enumeration mode primitive control unit generates a second enumeration mode primitive control according to the enumeration mode primitive data in the conversion data, and the combination mode primitive control unit generates a second combination mode primitive control according to the combination mode primitive data in the conversion data.
The data display interface is further used for integrating second primitive controls generated by the primitive control units (the constant mode primitive control unit, the enumerated mode primitive control unit and the combined mode primitive control unit) into second data to be displayed.
The data management module at the CP side comprises a primitive management unit and a primitive updating unit. The primitive management unit is used for managing the second primitive controls (the second primitive controls comprise the second constant mode primitive control, the second enumeration mode primitive control and the second combination mode primitive control) generated by each unit in the data display interface. The primitive updating unit is used for updating each second primitive control.
It can be understood that the data synchronized by the CP side from the AP side is converted data that has undergone conversion processing, so the CP side does not need to perform conversion processing on the converted data, and the CP side may directly send and display the second data to be displayed generated after packaging and combining the converted data according to a preset method.
In the embodiment of the application, after the second data to be displayed is generated at the CP side, the second data to be displayed is transmitted to the image input module of the LiteTCON lightweight image processing module, and the image input module transmits the second data to be displayed to the processing mode selecting module.
The processing mode selection module of the LiteTCON lightweight image processing module determines that the current processing mode is the direct mode according to the second data to be displayed, and transmits the second data to be displayed to the direct processing module.
The direct processing module of the LiteTCON lightweight image processing module determines the second data to be displayed as the display driving data and transmits the display driving data to the image sending and displaying module.
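The selection between the high-definition path and the direct (pass-through) path can be sketched as follows; the data tags and the collapsed high-definition stages are illustrative assumptions.

```python
def litetcon_process(display_data):
    # Sketch of the LiteTCON processing-mode selection module.
    if display_data["kind"] == "converted":
        # Direct mode: the AP already performed the conversion operation,
        # so the data to be displayed *is* the display driving data.
        return display_data["payload"]

    # High-definition mode: parallel read -> pixel-to-gray conversion ->
    # parallel processing (the stages are collapsed here for brevity).
    pixels = display_data["payload"]          # (R, G, B, A) tuples
    gray = [sum(p[:3]) * 16 // (3 * 256) for p in pixels]
    return gray


# Conversion data from the AP takes the pass-through path unchanged.
driving = litetcon_process({"kind": "converted", "payload": [3, 7]})
```

The design choice the sketch highlights: deciding once per frame which path to take lets the CP skip all per-pixel work whenever the AP has already produced conversion data.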
Illustratively, as shown in fig. 11, a memory optimization schematic is provided in an embodiment of the present application. As shown in fig. 11, the screen resolution of the terminal device is 1920×1421, and the data size required to process one frame of image is 1920×1421×32/8 = 10.4 MB.
In fig. 11, 1100 is the image data transmitted from the AP side to the display screen when the display control right belongs to the AP side in the related art; its data size is 1920×1421×32/8 = 10.4 MB, and 1100 is stored discretely in memory in a non-contiguous manner. However, the actually effective image data of the display screen (e.g., the primitives) is only a small fraction of 1100. For example, 1101 is the actually effective image data (indicated by a dashed box in fig. 11), with a data size of 276×1421×4 = 1.5 MB.
In fig. 11, 1102 is the image data transmitted from the CP side to the display screen when the display control right belongs to the CP side in the technical solution provided in the embodiment of the present application. In the embodiment of the present application, the AP side first determines the effective image data in the image and draws and processes only the effective image data, so the data size actually processed is 276×1421×4 = 1.5 MB. Moreover, 1102 is stored contiguously in memory, which increases the memory processing speed, reduces the memory occupation, and optimizes the memory storage effect.
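The accounting in fig. 11 can be reproduced with a short calculation (32 bits, i.e. 4 bytes, per RGBA pixel):

```python
def frame_bytes(width, height, bits_per_pixel=32):
    # Bytes needed for one frame: width x height pixels at the given depth.
    return width * height * bits_per_pixel // 8


full = frame_bytes(1920, 1421)        # whole-screen frame sent by the AP
effective = frame_bytes(276, 1421)    # effective primitive region only

full_mib = full / 2**20               # ~10.4, matching the 10.4M in fig. 11
effective_mib = effective / 2**20     # ~1.5, matching the 1.5M in fig. 11
```

The roughly sevenfold reduction comes entirely from processing only the effective region, before any gain from contiguous storage.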
In some embodiments of the present application, transmitting the display driving data to the display screen of the terminal device includes: the CP side transmits the display driving data to the display screen of the terminal device through MIPI.
In the embodiment of the application, the image sending and displaying module of the LiteTCON lightweight image processing module receives the display driving data sent by the parallel processing module, or the image sending and displaying module receives the display driving data sent by the direct processing module. And the image sending and displaying module transmits the received display driving data to the display screen through MIPI.
In the embodiment of the application, after the display screen receives the display driving data, an image is displayed according to the display driving data.
S805, the terminal equipment adopts the AP to perform display control.
In the embodiment of the application, if the terminal device determines that the current scene is not the CP control scene, the display control right belongs to the AP, and the AP performs display control. In one possible implementation, the terminal device uses the AP for image processing and image rendering. Illustratively, through the processing flow shown in fig. 2 above, the AP performs various UI display loads such as drawing, rendering, compositing, gray-scale clipping and TCON image processing to obtain the display driving data. The AP side then transmits the display driving data to the display screen of the terminal device through MIPI, and the display screen displays an image according to the display driving data.
Based on the above technical solution, after the terminal device determines that the current scene is a CP control scene, the AP side performs preprocessing to generate preprocessing data and synchronizes it to the CP side, and the CP side completes the display interaction according to the preprocessing data. That is, the terminal device processes the data required for display in the CP control scene, and the CP sends the processed data to the display screen for display. By sharing the computing power of the AP side, the CP side uses the preprocessing data generated by the AP side, so that UI display loads previously deployed on the AP (such as the sending-to-display operation) are deployed on the CP, reducing the UI display load on the AP. In addition, a lightweight timing controller is deployed on the CP, so that the low-power-consumption CP has the capability of taking over display interaction in the application scene, widening the CP application scenes. The power consumption of the CP core is lower than that of the AP, and after the CP takes over the display control right, the probability of AP dormancy increases, so that the power consumption of the terminal device can be reduced and the endurance capability of the terminal device can be improved.
Fig. 12 is a schematic diagram illustrating a display effect provided by an embodiment of the present application. As in the first interface 1201 in fig. 12, when the display control right belongs to the AP side, the terminal device displays an image. The second interface 1202 is an image displayed by the terminal device when the display control right belongs to the CP side. It can be seen that the display effects of the two are not different, the content is clear and free of burrs, and the page turning is free of afterimages.
By way of example, FIG. 13 illustrates a schematic diagram of a benefit effect provided by embodiments of the present application. It will be appreciated that fig. 13 illustrates a reading scenario in which the terminal device turns pages once a minute. In this scenario, the technical scheme provided by the embodiment of the application can reduce the power consumption of the terminal equipment by about 20 milliamperes.
As shown in (a) of fig. 13, when the display control right belongs to the AP and the AP handles sending to display, the current consumption of the terminal device is 107.05 milliamperes, for example, in the period from 1 minute to 2 minutes.
As shown in (b) of fig. 13, with the technical solution provided in the embodiment of the present application, that is, when the display control right belongs to the CP and the CP takes over sending to display, the current consumption of the terminal device is 82.35 milliamperes, for example, in the period from 3 minutes to 4 minutes, which is a reduction of about 20 milliamperes.
By way of example, FIG. 14 illustrates a schematic diagram of another benefit effect provided by embodiments of the present application. It will be appreciated that fig. 14 is an illustration of a terminal device on-screen standby scenario. In this scenario, the technical scheme provided by the embodiment of the application can reduce the power consumption of the terminal equipment by about 10 milliamperes.
As shown in (a) of fig. 14, when the display control right belongs to the AP and the AP handles sending to display, the current consumption of the terminal device is 93.37 milliamperes, for example, in the period from 2 minutes to 3 minutes.
As shown in (b) of fig. 14, with the technical solution provided in the embodiment of the present application, that is, when the display control right belongs to the CP and the CP takes over sending to display, the current consumption of the terminal device is 82.87 milliamperes, for example, in the period from 2 minutes to 3 minutes, which is a reduction of about 10 milliamperes.
It can be understood that the above description uses the terminal device as an example of the electronic ink screen terminal device, and the technical solution provided by the embodiment of the present application may also be applied to a terminal device whose display screen is not an electronic ink screen. When the display screen of the terminal device is not the electronic ink screen, gray scale processing, time sequence processing and the like are not needed.
By way of example, the display screen other than the electronic ink screen may be a liquid crystal display (liquid crystal display, LCD) screen, an organic light-emitting diode (organic light-emitting diode, OLED) screen, an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED) screen, a low-temperature polycrystalline oxide (low temperature polycrystalline oxide, LTPO) screen, a flexible display screen, or any other product or component having a display function.
It should be understood that an image displayed on a display screen other than the electronic ink screen only needs to be drawn, rendered and synthesized before being displayed, and the synthesized image is sent directly to display. Unlike the electronic ink screen, the synthesized image does not need to undergo gray-scale processing and timing processing before being sent to display.
As shown in fig. 15, an exemplary flowchart of another display method according to an embodiment of the present application includes the following steps S1501 to S1505:
S1501, the terminal device determines whether the current scene is a CP control scene.
If yes, step S1502 is executed, and if no, step S1505 is executed.
The specific implementation manner of the terminal device to determine whether the CP control scene is currently the CP control scene is referred to S801 above, and will not be described herein.
S1502, the AP of the terminal equipment generates preprocessing data.
In the embodiment of the application, when the terminal equipment is currently in the CP control scene, the AP only needs to perform preprocessing operation on the current display image to obtain the preprocessed data.
In the embodiment of the application, an AP heterogeneous cooperative network service module is deployed on an AP of a terminal device, and comprises a data cooperative management module and a state cooperative management module. The CP is provided with a CP heterogeneous cooperative network service module, and the CP heterogeneous cooperative network service module comprises a data cooperative management module and a state cooperative management module.
For the specific implementation of the AP generating the preprocessed data, refer to S802 above; details are not repeated here.
S1503, the terminal device synchronizes the preprocessing data from the AP to the CP.
In the embodiment of the application, the terminal device synchronizes the preprocessed data from the AP to the CP so that the CP can send the preprocessed data for display.
For the specific implementation of synchronizing the preprocessed data from the AP to the CP, refer to S803 above; details are not repeated here.
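The AP-to-CP synchronization of S1503 (handled by the data cooperative management modules) can be pictured as a one-way channel between the two cores. The sketch below is a toy stand-in: the real inter-core transport is not specified in the embodiment, and `Queue` here merely models "AP publishes, CP consumes".

```python
from queue import Queue

# Hypothetical stand-in for the AP->CP data cooperative management path.
ap_to_cp = Queue()

def ap_send(preprocessed):
    """AP side: publish the preprocessed data for the CP (S1503)."""
    ap_to_cp.put(preprocessed)

def cp_receive():
    """CP side: synchronize the preprocessed data from the AP."""
    return ap_to_cp.get()
```

On real hardware this channel would typically be shared memory or an inter-processor mailbox rather than an in-process queue.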
In some embodiments of the present application, after the terminal device synchronizes the preprocessed data from the AP to the CP, if the display control right belongs to the AP, the terminal device first switches the display control right from the AP to the CP, and then S1504 is executed.
In other embodiments of the present application, after the terminal device synchronizes the preprocessed data from the AP to the CP, if the display control right already belongs to the CP, S1504 is executed directly.
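The control-right handling in the two embodiments above reduces to one small rule; the function and owner names below are illustrative, not part of the embodiment.

```python
def handover_for_send_display(control_owner: str) -> str:
    """Ensure the CP holds the display control right before S1504 runs."""
    if control_owner == "AP":
        return "CP"       # first switch the display control right from AP to CP
    return control_owner  # already the CP: proceed to S1504 directly
```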
S1504, the CP of the terminal equipment determines display driving data according to the preprocessing data, and transmits the display driving data to a display screen of the terminal equipment.
In one possible implementation, after the preprocessed data sent by the AP side is synchronized, the CP combines the preprocessed data according to a preset method to generate the display driving data, and the CP transmits the display driving data to the display screen of the terminal device through MIPI.
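A minimal sketch of this combination step, under the assumption (not stated in the embodiment) that each piece of preprocessed data is an `(x, y, bitmap)` primitive and the preset method is "paint primitives in order onto a blank frame"; the MIPI transmission itself is omitted.

```python
def compose_display_data(preprocessed, width, height):
    """Combine preprocessed primitives into one frame of display driving data."""
    frame = [[0] * width for _ in range(height)]   # blank frame buffer
    for x0, y0, bitmap in preprocessed:            # combine primitives in order
        for dy, row in enumerate(bitmap):
            for dx, px in enumerate(row):
                frame[y0 + dy][x0 + dx] = px       # later primitives overwrite
    return frame
```

In a real CP this buffer would then be handed to the MIPI driver for transmission to the panel.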
S1505, the terminal device adopts the AP to perform display control.
For the specific implementation of the terminal device performing display control with the AP, refer to S805 above; details are not repeated here.
In this way, the CP and the AP of the terminal device cooperatively complete display. Part of the UI display load is deployed on the CP, so that the CP is able to take over display interaction in the application scene and the low-power-consumption CP core completes the display sending. On the basis of ensuring the same display effect on the CP side and the AP side, this reduces the overall power consumption of the terminal device during display and improves its battery life.
The foregoing mainly describes the solutions provided by the embodiments of the present application from the perspective of the method. It will be appreciated that, to implement the above functions, the terminal device includes corresponding hardware structures and/or software modules for performing the respective functions. The various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or as a combination of hardware and computer software. Whether a function is implemented as hardware or as computer software driving hardware depends on the particular application and the design constraints of the solution. Those skilled in the art may implement the described functionality in different ways for each particular application, but such implementations should not be considered beyond the scope of the embodiments of the present application.
The embodiments of the application may divide the terminal device into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated in one processing unit. The integrated units may be implemented in hardware or as software functional modules. It should be noted that, in the embodiments of the present application, the division of units is illustrative and is merely a logical function division; other division manners may be used in actual implementation.
As shown in fig. 16, a schematic structural diagram of still another terminal device provided in an embodiment of the present application, where a terminal device 1600 may be used to implement the methods described in the above method embodiments. The terminal device 1600 may include, for example, a processing module 1601, an acquisition module 1602, and a display module 1603.
The processing module 1601 is configured to support the terminal device 1600 in performing the processing functions in any of fig. 4 to 15.
The acquisition module 1602 is configured to support the terminal device 1600 in performing the acquisition functions in any of fig. 4 to 15, such as acquiring the currently displayed image.
The display module 1603 may be configured to display images according to the display driving data. And/or, the display module 1603 is further configured to support the terminal device 1600 in performing other display operations performed by the terminal device in the embodiments of the present application.
Optionally, the terminal device 1600 shown in fig. 16 may further include a communication module (not shown in fig. 16) for supporting the terminal device 1600 to perform the steps of communication between the terminal device and other devices in the embodiment of the present application.
Optionally, the terminal device 1600 shown in fig. 16 may further include a storage module (not shown in fig. 16) in which programs or instructions are stored. When the processing module 1601 executes the program or instructions, the terminal device 1600 shown in fig. 16 is enabled to perform the method shown in the method embodiment described above.
The technical effects of the terminal device 1600 shown in fig. 16 may refer to the technical effects of the method described in the above-mentioned method embodiment, and will not be described herein. The processing module 1601 involved in the terminal device 1600 shown in fig. 16 may be implemented by a processor or processor-related circuit components, and may be a processor or a processing module. The communication module may be implemented by a transceiver or transceiver-related circuit component, which may be a transceiver or transceiver module. The display module 1603 may be implemented by a display related component.
As shown in fig. 17, a schematic structural diagram of still another terminal device provided in an embodiment of the present application, where a terminal device 1700 may be used to implement the methods described in the above method embodiments. The terminal device 1700 may include, for example, a first processing unit 1701, a second processing unit 1702, and a display screen 1703.
In the embodiment of the present application, the first processing unit 1701 is configured to determine whether the current scene is a control scene of the second processing unit.
In the embodiment of the application, the first processing unit 1701 is further configured to generate processing data if the current scene is the second-processing-unit control scene; the first processing unit 1701 is further configured to switch the display control right from the first processing unit 1701 to the second processing unit 1702; the second processing unit 1702 is configured to generate first display driving data according to the processing data; and the display screen 1703 is configured to display according to the first display driving data.
In the embodiment of the application, the first processing unit 1701 is further configured to generate second display driving data if the current scene is not the second processing unit control scene, and the display screen 1703 is further configured to display according to the second display driving data.
In the embodiment of the present application, the first processing unit 1701 is a high-power processing unit, and the second processing unit 1702 is a low-power processing unit.
In the embodiment of the application, if the display screen is an electronic ink screen, the processing data includes pre-processing data, and the pre-processing data is first primitive data drawn and generated by the first processing unit 1701, the second processing unit 1702 is further configured to generate first data to be displayed according to the pre-processing data, and the second processing unit 1702 is further configured to perform a conversion operation on the first data to be displayed to generate first display driving data.
In the embodiment of the application, the first data to be displayed is a first primitive control obtained by combining the preprocessed data.
In the embodiment of the application, if the display screen is an electronic ink screen, the processing data includes conversion data, the conversion data is second primitive data generated after the first processing unit 1701 performs conversion operation on the first primitive data drawn and generated by the first processing unit, and the second processing unit 1702 is further configured to generate second data to be displayed according to the conversion data, and the second processing unit 1702 is further configured to determine the second data to be displayed as the first display driving data.
In the embodiment of the application, the second data to be displayed is a second primitive control obtained by combining the conversion data.
In the embodiment of the application, the first processing unit 1701 is further configured to classify each primitive data in the current display image, and the first processing unit 1701 is further configured to draw and generate processing data according to the classified primitive data.
In the embodiment of the application, the conversion operation includes gray-scale processing or timing processing.
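As an illustration of the gray-scale half of this conversion operation, the sketch below quantizes 8-bit pixel values down to the small number of gray levels an electronic ink panel can render. The 16-level default is an assumption for demonstration; the embodiment does not fix the number of gray levels.

```python
def to_gray_levels(pixels, levels=16):
    """Quantize 8-bit pixel values (0-255) to e-ink gray levels (0..levels-1)."""
    step = 256 // levels                           # width of one gray-level bin
    return [min(p // step, levels - 1) for p in pixels]
```

Timing processing, the other half of the conversion operation, would further schedule these levels into the panel's refresh waveform and is panel-specific.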
In the embodiment of the present application, if the display screen is a non-electronic ink screen, the processing data includes pre-processing data, and the pre-processing data is first primitive data generated by the first processing unit 1701, then the second processing unit 1702 is further configured to combine the pre-processing data according to a preset method, so as to generate first display driving data.
In the embodiment of the present application, the second processing unit 1702 is further configured to synchronize the processing data from the first processing unit 1701 before the display control right is switched from the first processing unit 1701 to the second processing unit 1702.
In the embodiment of the application, the second-processing-unit control scene includes a reading/browsing scene or a bright-screen standby scene.
As shown in fig. 18, an embodiment of the present application further provides a chip system including at least one processor 1801 and at least one interface circuit 1802. The processor 1801 and the interface circuit 1802 may be interconnected by wires. For example, the interface circuit 1802 may be used to receive signals from other devices. For another example, the interface circuit 1802 may be used to send signals to other devices (e.g., the processor 1801). The interface circuit 1802 may, for example, read instructions stored in a memory and send the instructions to the processor 1801. The instructions, when executed by the processor 1801, may cause the terminal device to perform the steps performed by the terminal device in the above embodiments. Of course, the chip system may also include other discrete devices, which are not particularly limited in the embodiments of the present application.
Optionally, there may be one or more processors in the chip system. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
Optionally, there may be one or more memories in the chip system. The memory may be integrated with the processor or separate from the processor; the application is not limited in this respect. The memory may be a non-transitory memory, such as a ROM, which may be integrated with the processor on the same chip or separately provided on different chips; the type of memory and the manner in which the memory and the processor are arranged are not particularly limited in the present application.
The chip system may be, for example, a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
It should be understood that the steps in the above-described method embodiments may be accomplished by integrated logic circuitry in hardware in a processor or instructions in the form of software. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution.
The embodiment of the application also provides a computer storage medium, wherein computer instructions are stored in the computer storage medium, and when the computer instructions run on the terminal equipment, the terminal equipment is caused to execute the method described in the embodiment of the method.
Embodiments of the present application provide a computer program product comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method of the method embodiments described above.
In addition, the embodiment of the application also provides a device which can be a chip, a component or a module, and the device can comprise a processor and a memory which are connected, wherein the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory so as to enable the device to execute the method in the method embodiments.
The terminal device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The embodiments may be combined or referenced to each other without conflict. The above-described apparatus embodiments are merely illustrative, e.g., the division of modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a readable storage medium. Based on such an understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The storage medium includes a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or any other medium capable of storing program code.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (19)

1. A terminal device, comprising a first processing unit, a second processing unit and a display screen, wherein:
the first processing unit is configured to determine whether a current scene is a second-processing-unit control scene;
the first processing unit is further configured to generate processing data if the current scene is the second-processing-unit control scene; the second processing unit is configured to synchronize the processing data from the first processing unit and to generate first display driving data according to the processing data; the display screen is configured to display according to the first display driving data, the first display driving data being image data generated by the second processing unit for driving the display screen;
the first processing unit is further configured to generate second display driving data if the current scene is not the second-processing-unit control scene; the display screen is configured to display according to the second display driving data, the second display driving data being image data generated by the first processing unit for driving the display screen;
wherein the first processing unit is a high-power-consumption processing unit and the second processing unit is a low-power-consumption processing unit.

2. The terminal device according to claim 1, wherein the display screen is an electronic ink screen, the processing data comprises preprocessing data, and the preprocessing data is first primitive data drawn and generated by the first processing unit;
the second processing unit is further configured to generate first data to be displayed according to the preprocessing data, the first data to be displayed being a first primitive control obtained by combining the preprocessing data;
the second processing unit is further configured to perform a conversion operation on the first data to be displayed to generate the first display driving data.

3. The terminal device according to claim 1, wherein the display screen is an electronic ink screen, the processing data comprises conversion data, and the conversion data is second primitive data generated after the first processing unit performs a conversion operation on the first primitive data drawn and generated by the first processing unit;
the second processing unit is further configured to generate second data to be displayed according to the conversion data, the second data to be displayed being a second primitive control obtained by combining the conversion data;
the second processing unit is further configured to determine the second data to be displayed as the first display driving data.

4. The terminal device according to any one of claims 1 to 3, wherein:
the first processing unit is further configured to classify each primitive data in a currently displayed image;
the first processing unit is further configured to draw and generate the processing data according to the classified primitive data.

5. The terminal device according to claim 2 or 3, wherein the conversion operation comprises gray-scale processing or timing processing.

6. The terminal device according to claim 1, wherein the display screen is a non-electronic-ink screen, the processing data comprises preprocessing data, and the preprocessing data is first primitive data drawn and generated by the first processing unit;
the second processing unit is further configured to combine the preprocessing data according to a preset method to generate the first display driving data.

7. The terminal device according to any one of claims 1 to 6, wherein the second processing unit is further configured to, after synchronizing the processing data from the first processing unit, take over display control right from the first processing unit if the display control right belongs to the first processing unit, the display control right being the right to control a display subsystem.

8. The terminal device according to any one of claims 1 to 7, wherein the second-processing-unit control scene comprises a reading/browsing scene or a bright-screen standby scene.

9. A display method, applied to a terminal device comprising a first processing unit, a second processing unit and a display screen, the method comprising:
determining whether a current scene is a second-processing-unit control scene;
if the current scene is the second-processing-unit control scene, after the first processing unit generates processing data, synchronizing, by the second processing unit, the processing data from the first processing unit, generating first display driving data according to the processing data, and displaying according to the first display driving data, the first display driving data being image data generated by the second processing unit for driving the display screen;
if the current scene is not the second-processing-unit control scene, generating second display driving data by the first processing unit and displaying according to the second display driving data, the second display driving data being image data generated by the first processing unit for driving the display screen;
wherein the first processing unit is a high-power-consumption processing unit and the second processing unit is a low-power-consumption processing unit.

10. The method according to claim 9, wherein the display screen is an electronic ink screen, the processing data comprises preprocessing data, and the preprocessing data is first primitive data drawn and generated by the first processing unit; the generating, by the second processing unit, first display driving data according to the processing data comprises:
generating, by the second processing unit, first data to be displayed according to the preprocessing data, the first data to be displayed being a first primitive control obtained by combining the preprocessing data; and
performing a conversion operation on the first data to be displayed to generate the first display driving data.

11. The method according to claim 9, wherein the display screen is an electronic ink screen, the processing data comprises conversion data, and the conversion data is second primitive data generated after the first processing unit performs a conversion operation on the first primitive data drawn and generated by the first processing unit; the generating, by the second processing unit, first display driving data according to the processing data comprises:
generating, by the second processing unit, second data to be displayed according to the conversion data, the second data to be displayed being a second primitive control obtained by combining the conversion data; and
determining the second data to be displayed as the first display driving data.

12. The method according to any one of claims 9 to 11, wherein the generating processing data by the first processing unit comprises:
classifying, by the first processing unit, each primitive data in a currently displayed image; and
drawing and generating the processing data according to the classified primitive data.

13. The method according to claim 10 or 11, wherein the conversion operation comprises gray-scale processing or timing processing.

14. The method according to claim 9, wherein the display screen is a non-electronic-ink screen, the processing data comprises preprocessing data, and the preprocessing data is first primitive data drawn and generated by the first processing unit; the generating, by the second processing unit, first display driving data according to the processing data comprises:
combining, by the second processing unit, the preprocessing data according to a preset method to generate the first display driving data.

15. The method according to any one of claims 9 to 14, wherein after the second processing unit synchronizes the processing data from the first processing unit, the method further comprises:
if display control right belongs to the first processing unit, taking over, by the second processing unit, the display control right from the first processing unit, the display control right being the right to control a display subsystem.

16. The method according to any one of claims 9 to 15, wherein the second-processing-unit control scene comprises a reading/browsing scene or a bright-screen standby scene.

17. A chip system, comprising at least one processor and at least one interface circuit, wherein the at least one interface circuit is configured to perform a transceiving function and send instructions to the at least one processor, and the at least one processor executes the instructions to perform the method according to any one of claims 9 to 16.

18. A computer-readable storage medium, comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method according to any one of claims 9 to 16.

19. A computer program product which, when run on a computer, causes the computer to perform the method according to any one of claims 9 to 16.
CN202310837630.3A 2023-07-07 2023-07-07 Terminal device and display method Pending CN119274456A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310837630.3A CN119274456A (en) 2023-07-07 2023-07-07 Terminal device and display method
PCT/CN2024/074054 WO2025011005A1 (en) 2023-07-07 2024-01-25 Terminal device and display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310837630.3A CN119274456A (en) 2023-07-07 2023-07-07 Terminal device and display method

Publications (1)

Publication Number Publication Date
CN119274456A true CN119274456A (en) 2025-01-07

Family

ID=94113451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310837630.3A Pending CN119274456A (en) 2023-07-07 2023-07-07 Terminal device and display method

Country Status (2)

Country Link
CN (1) CN119274456A (en)
WO (1) WO2025011005A1 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6421920B2 (en) * 2014-09-03 2018-11-14 カシオ計算機株式会社 Display device, display control method thereof, and control program
KR20160046620A (en) * 2014-10-21 2016-04-29 삼성전자주식회사 Display driver circuit and display system
CN108227895A (en) * 2017-07-26 2018-06-29 珠海市魅族科技有限公司 One kind puts out screen display methods and terminal, computer installation and readable storage medium storing program for executing
CN110187856B (en) * 2019-05-31 2021-02-19 联想(北京)有限公司 Control method, electronic device, and computer-readable medium
CN114816607B (en) * 2021-01-29 2024-11-22 华为技术有限公司 Screen-off display method, terminal device and chip
KR20230022774A (en) * 2021-08-09 2023-02-16 삼성전자주식회사 Electronic device for displaying image and operating method thereof
CN116027877A (en) * 2021-10-25 2023-04-28 华为终端有限公司 Screen-off display method and terminal equipment
CN116710875A (en) * 2021-12-31 2023-09-05 华为技术有限公司 Chip system and control method
CN116088781A (en) * 2022-12-28 2023-05-09 联想(北京)有限公司 Display control method and display device

Also Published As

Publication number Publication date
WO2025011005A1 (en) 2025-01-16

Similar Documents

Publication Publication Date Title
CN114518817B (en) Display method, electronic device and storage medium
CN114648951B (en) Method for controlling dynamic change of screen refresh rate and electronic equipment
CN114697446B (en) Refresh rate switching method, electronic device and storage medium
CN117711355A (en) Screen refresh rate switching method and electronic equipment
WO2021013019A1 (en) Picture processing method and apparatus
CN114780012B (en) Method for displaying lock screen wallpaper of electronic equipment and related device
CN113805744A (en) Window display method and electronic device
CN113656118B (en) A screen-off display method and electronic device
CN115904563A (en) Data processing method and device in application program starting and storage medium
CN116091329B (en) Image processing method, device, equipment and storage medium
CN117133232A (en) Display method, chip and electronic equipment
WO2023001163A1 (en) Screen refreshing method and device capable of improving dynamic effect performance
CN115639920B (en) Drawing method, electronic device, and readable storage medium
CN117806745A (en) Interface generation method and electronic device
CN116719587B (en) Screen display method, electronic device and computer readable storage medium
CN119274456A (en) Terminal device and display method
CN120418772A (en) Display method, terminal device and computer readable storage medium
CN117058291A (en) Video memory switching method and electronic equipment
CN119790425A (en) Data processing method, device, equipment and storage medium
CN119271280B (en) Screen bright processing method, device, chip, electronic device and medium
US12387655B2 (en) Zframe data display method, electronic device, and storage medium
CN119207336A (en) Display method and device
CN117724779B (en) Method for generating interface image and electronic device
CN120281969A (en) Dynamic frequency raising method for CPU, electronic equipment and storage medium
US20250182349A1 (en) Data processing method and related apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination