
CN115167796A - Display content acquisition method, device, terminal equipment and medium - Google Patents


Info

Publication number
CN115167796A
CN115167796A (application number CN202210766622.XA)
Authority
CN
China
Prior art keywords
image
analyzed
data
module
rgb data
Prior art date
Legal status
Pending
Application number
CN202210766622.XA
Other languages
Chinese (zh)
Inventor
孙晓明
修平
战磊
Current Assignee
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd
Priority to CN202210766622.XA
Publication of CN115167796A

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/147 Digital output to display device using display panels
    • G06F 3/1475 Digital output to display device using display panels with conversion of CRT control signals to flat panel control signals, e.g. adapting the palette memory
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

In some embodiments of the present application, when the page content of a specified layer jumps, the synthesis mode of the Hwc module is set to GPU synthesis, the image to be analyzed that the Hwc module synthesizes based on the content of each layer is acquired, and the data type of the image to be analyzed is set to readable so that the RGB data in the image can be obtained. The target data of the image to be analyzed is then determined according to a preset data analysis algorithm and the RGB data corresponding to a preset area in the image, thereby meeting the requirement of obtaining the target data corresponding to the preset area in real time.

Description

Display content acquisition method, device, terminal equipment and medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a display content obtaining method, an apparatus, a terminal device, and a medium.
Background
Compared with an ordinary sensor, the light-sensing reading of an under-screen light sensor is more easily disturbed by the current screen brightness and display content. Here, the screen brightness generally refers to the brightness value, in the current environment, of the screen of the terminal device in which the light sensor is installed, and the display content generally refers to the RGB data corresponding to what that screen is currently displaying.
In the Android system, the SurfaceFlinger module generally synthesizes the display contents of all layers into one image according to each layer's visible area, Z-order, and the screen size, and then sends the synthesized image to the screen for display; however, the synthesized image is in a private data format and cannot be read directly. In the related art, display content is generally acquired through screenshots or by creating a virtual screen, but these approaches are inefficient and occupy virtual-screen resources.
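The layer synthesis just described can be roughly illustrated as painting layers into one buffer in ascending Z-order, so that higher layers overwrite lower ones within their visible areas. This is only a minimal sketch under a flat-color model; the struct and function names are assumptions for illustration, not SurfaceFlinger's actual interface.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// Minimal stand-in for Z-order layer synthesis: each layer is a flat-color
// rectangle; layers are painted lowest-Z first, so higher layers win.
struct Layer {
    int z;                // stacking order; larger z is drawn on top
    int x0, y0, x1, y1;   // visible area (half-open) within the screen
    int color;            // flat fill standing in for the layer's content
};

std::vector<int> compose(std::vector<Layer> layers, int width, int height) {
    std::sort(layers.begin(), layers.end(),
              [](const Layer& a, const Layer& b) { return a.z < b.z; });
    std::vector<int> screen(static_cast<std::size_t>(width) * height, 0);
    for (const Layer& l : layers)
        for (int y = l.y0; y < l.y1; ++y)
            for (int x = l.x0; x < l.x1; ++x)
                screen[static_cast<std::size_t>(y) * width + x] = l.color;
    return screen;
}
```

The real module also handles transparency, scaling, and hardware overlays; the point here is only the visible-area, Z-order overwrite rule.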
Therefore, how to improve the efficiency of acquiring the display content becomes an urgent problem to be solved.
Disclosure of Invention
The application provides a display content acquisition method, a display content acquisition device, a terminal device and a medium, and aims to solve the problems that in the prior art, the display content acquisition efficiency is low and virtual screen resources are occupied.
In a first aspect, the present application provides a display content obtaining method, including:
when it is determined that the page content of the layer corresponding to the non-status bar and the non-navigation bar jumps, setting the synthesis mode of the Hardware Composer (Hwc) module of the hardware abstraction layer (HAL) to graphics processing unit (GPU) synthesis; acquiring the image to be analyzed that is synthesized by the Hwc module based on the content of each layer, and setting the data type of the image to be analyzed to readable;
and determining target data of the image to be analyzed according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed.
In a second aspect, some embodiments of the present application further provide a display content obtaining apparatus, including:
the setting module is used for setting the synthesis mode of the Hardware Composer (Hwc) module of the hardware abstraction layer (HAL) to graphics processing unit (GPU) synthesis when it is determined that the page content of the layer corresponding to the non-status bar and the non-navigation bar jumps;
the acquisition module is used for acquiring the image to be analyzed which is synthesized by the Hwc module based on the content of each layer;
the setting module is further used for setting the data type of the image to be analyzed to be readable;
and the determining module is used for determining target data of the image to be analyzed according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed.
In a third aspect, some embodiments of the present application further provide a terminal device, where the terminal device includes:
a display, a processor, and a memory;
the display is used for displaying a screen display area;
the memory to store the processor-executable instructions;
the processor is configured to execute the instructions to implement the display content acquisition method as described in any one of the above.
In a fourth aspect, some embodiments of the present application further provide a computer-readable storage medium storing a computer program, which when executed by a processor, implements the steps of the display content acquiring method as described in any one of the above.
Some embodiments of the present application provide a display content obtaining method, apparatus, terminal device, and medium. The method includes: when it is determined that the page content of the layer corresponding to the non-status bar and the non-navigation bar jumps, setting the synthesis mode of the Hwc module of the hardware abstraction layer (HAL) to GPU synthesis; acquiring the image to be analyzed that is synthesized by the Hwc module based on the content of each layer, and setting the data type of the image to be analyzed to readable; and determining the target data of the image to be analyzed according to a preset data analysis algorithm and the RGB data corresponding to a preset area in the image to be analyzed. In some embodiments of the application, when the page content of the designated layer jumps, the synthesis mode of the Hwc module is set to GPU synthesis, the image synthesized by the Hwc module from the content of each layer is acquired, and its data type is set to readable so that the RGB data in the image can be obtained; the target data is then determined according to the preset data analysis algorithm and the RGB data corresponding to the preset area, thereby meeting the requirement of obtaining the target data corresponding to the preset area in real time.
Drawings
In order to illustrate the technical solutions of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below cover only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 shows a schematic structural diagram of a terminal device 100;
fig. 2 is a block diagram of a software structure of a terminal device 100 according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a display content acquisition process provided by some embodiments of the present application;
fig. 4 is a schematic diagram of a screen layer of a terminal device according to some embodiments of the present application;
fig. 5a is a schematic diagram of a screen layer of a terminal device according to some embodiments of the present application;
fig. 5b is a schematic diagram illustrating a screen layer content jump of a terminal device according to some embodiments of the present application;
FIG. 6 is a schematic diagram of a target data determination process provided by some embodiments of the present application;
FIG. 7 is a schematic diagram of a display content acquisition process provided by some embodiments of the present application;
FIG. 8 is a schematic diagram illustrating a process for performing light-sensing data calibration based on a display content acquisition process according to some embodiments of the present disclosure;
fig. 9 is a schematic structural diagram of a display content acquiring apparatus according to some embodiments of the present application;
fig. 10 is a schematic structural diagram of a terminal device according to some embodiments of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application clearer, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. All other embodiments that can be derived from the embodiments given herein by a person of ordinary skill in the art are intended to be within the scope of the present disclosure.
Some embodiments of the present application provide a display content obtaining method, apparatus, terminal device, and medium. The method includes: when it is determined that the page content of the layer corresponding to the non-status bar and the non-navigation bar jumps, setting the synthesis mode of the Hwc module of the hardware abstraction layer (HAL) to graphics processing unit (GPU) synthesis; acquiring the image to be analyzed that is synthesized by the Hwc module based on the content of each layer, and setting the data type of the image to be analyzed to readable; and determining the target data of the image to be analyzed according to a preset data analysis algorithm and the RGB data corresponding to a preset area in the image to be analyzed. In this way, when the page content of the designated layer jumps, the RGB data in the synthesized image can be read, and the target data corresponding to the preset area can be obtained in real time.
Fig. 1 shows a schematic structural diagram of a terminal device 100. It should be understood that the terminal device 100 shown in fig. 1 is only an example, and the terminal device 100 may have more or less components than those shown in fig. 1, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
A block diagram of a hardware configuration of a terminal device 100 according to an exemplary embodiment is exemplarily shown in fig. 1. As shown in fig. 1, the terminal device 100 includes: a Radio Frequency (RF) circuit 110, a memory 120, a display unit 130, a camera 140, a sensor 150, an audio circuit 160, a Wireless Fidelity (Wi-Fi) module 170, a processor 180, a bluetooth module 181, and a power supply 190.
The RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then send the downlink data to the processor 180 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
The memory 120 may be used to store software programs and data. The processor 180 performs various functions of the terminal device 100 and data processing by executing software programs or data stored in the memory 120. The memory 120 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. The memory 120 stores an operating system that enables the terminal device 100 to operate. The memory 120 may store an operating system and various application programs, and may also store program codes for executing the display content acquiring method of the terminal device according to some embodiments of the present application.
The display unit 130 may be used to receive input numeric or character information and generate signal input related to user settings and function control of the terminal device 100, and specifically, the display unit 130 may include a touch screen 131 disposed on the front surface of the terminal device 100 and capable of collecting touch operations, such as button clicking, by the user thereon or nearby.
The display unit 130 may also be used to display a Graphical User Interface (GUI) of information input by or provided to the user and various menus of the terminal apparatus 100. Specifically, the display unit 130 may include a display screen 132 disposed on the front surface of the terminal device 100. The display screen 132 may be configured in the form of a liquid crystal display, a light emitting diode, or the like. The display unit 130 may be configured to display a screen display area of the terminal device in the present application.
The touch screen 131 may cover the display screen 132, or the touch screen 131 and the display screen 132 may be integrated to implement the input and output functions of the terminal device 100, and after the integration, the touch screen may be referred to as a touch display screen for short. In the present application, the display unit 130 may display the application programs and the corresponding operation steps.
The camera 140 may be used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing elements convert the light signals into electrical signals which are then passed to the processor 180 for conversion into digital image signals.
The terminal device 100 may further comprise at least one sensor 150, such as an acceleration sensor 151, a distance sensor 152, a fingerprint sensor 153, a temperature sensor 154. The terminal device 100 may also be configured with other sensors such as a gyroscope, barometer, hygrometer, thermometer, infrared sensor, light sensor, motion sensor, and the like.
The audio circuit 160, speaker 161, and microphone 162 may provide an audio interface between the user and the terminal device 100. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, which converts it into a sound signal for output. The terminal device 100 may also be configured with a volume button for adjusting the volume of the sound signal. Conversely, the microphone 162 converts the collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data; the audio data is then output to the RF circuit 110 to be sent to, for example, another terminal device, or output to the memory 120 for further processing.
Wi-Fi is a short-range wireless transmission technology. Through the Wi-Fi module 170, the terminal device 100 can help the user send and receive e-mails, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access.
The processor 180 is a control center of the terminal device 100, connects various parts of the entire terminal device using various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing software programs stored in the memory 120 and calling data stored in the memory 120. In some embodiments, processor 180 may include one or more processing units; the processor 180 may also integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a baseband processor, which mainly handles wireless communications. It will be appreciated that the baseband processor described above may not be integrated into the processor 180. In the present application, the processor 180 may run an operating system, an application program, a user interface display, a touch response, and a display content obtaining method of a terminal device according to some embodiments of the present application. Further, the processor 180 is coupled with the display unit 130.
And the bluetooth module 181 is configured to perform information interaction with other bluetooth devices having a bluetooth module through a bluetooth protocol. For example, the terminal device 100 may establish a bluetooth connection with a wearable electronic device (e.g., a smart watch) that is also equipped with a bluetooth module through the bluetooth module 181, so as to perform data interaction.
The terminal device 100 also includes a power supply 190 (such as a battery) for powering the various components. The power supply may be logically connected to the processor 180 through a power management system to manage charging, discharging, power consumption, etc. through the power management system. The terminal device 100 may further be configured with a power button for powering on and off the terminal device, and locking the screen.
Fig. 2 is a block diagram of a software structure of a terminal device 100 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system may be divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system libraries, and a kernel layer, from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application packages may include applications such as Phone, MMS, Wi-Fi, WeChat, Messages, Alarm, Gallery, Calendar, and WLAN.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, dialed and received calls, browsing history and bookmarks, phone books, short messages, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying a picture.
The phone manager is used to provide the communication function of the terminal device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, etc., to the application.
The notification manager allows the application to display notification information (e.g., the message content of a short message) in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is given, the terminal device vibrates, an indicator light flickers, and the like.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager (Surface Manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), and 2D graphics engines (e.g., SGL).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of many commonly used audio and video formats, as well as still image files. It may support audio and video encoding formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer lies between hardware and software. It comprises at least a display driver, a camera driver, an audio driver, and a sensor driver.
The terminal device 100 in the embodiment of the present application may be an electronic device including, but not limited to, a smart phone, a tablet computer, a wearable electronic device (e.g., a smart watch), a notebook computer, and the like.
Fig. 3 is a schematic diagram of a display content acquiring process provided in some embodiments of the present application, where the process specifically includes the following steps:
s301: when the page content of the Layer corresponding to the non-status bar and the non-navigation bar is determined to jump, a synthesis mode of a Hardware synthesis (Hwc) module of a Hardware Abstraction Layer (HAL) is set to be a Graphics Processing Unit (GPU) synthesis.
Some embodiments of the present application provide a display content obtaining process suitable for a terminal device, where the terminal device may be the terminal device shown in fig. 1 or fig. 2.
When a user uses a terminal device, the image observed on the screen is generally composed of multiple layers. Fig. 4 is a schematic diagram of the screen layers of a terminal device provided in some embodiments of the present application. As shown in fig. 4, the image displayed on the screen comprises three layers: layer 1, layer 2, and layer 3, where layer 1 corresponds to the status bar, layer 2 corresponds to the display pages of various applications (APPs), and layer 3 corresponds to the navigation bar. When any layer changes, the SurfaceFlinger module of the terminal device synthesizes the contents of all layers into one image and sends the synthesized image to the screen for display.
In some embodiments of the present application, in order to save resources, the display content may be analyzed only when it is determined that the page content of the layer corresponding to the non-status bar and the non-navigation bar jumps. Specifically, when the name of that layer differs from the layer name recorded the last time the SurfaceFlinger module performed layer synthesis, the page content of the layer may be considered to have jumped.
Of course, in some embodiments of the present application, whether the page content of the layer jumps may also be determined from the content the layer displays. Fig. 5a is a schematic diagram of the screen layers of a terminal device according to some embodiments of the present application. As shown in fig. 5a, the layer corresponding to the non-status bar and the non-navigation bar is layer 2, and layer 2 is currently displaying a product list page. Fig. 5b is a schematic diagram illustrating a screen layer content jump: as shown in fig. 5b, layer 2 now displays the detail page of product 2 from fig. 5a, where product 2 is the second product in the list page of fig. 5a. The transition of layer 2 from the page content in fig. 5a to that in fig. 5b is a case of the page content jumping.
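The layer-name comparison described above can be sketched as follows. This is a minimal, hypothetical illustration (the struct and method names are not from the patent): the detector remembers the layer name seen at the previous synthesis pass and reports a jump when the current name differs.

```cpp
#include <cassert>
#include <string>

// Hypothetical sketch: detect a page-content jump by comparing the current
// layer name with the one recorded at the previous SurfaceFlinger synthesis.
struct JumpDetector {
    std::string lastLayerName;  // layer name at the previous synthesis pass

    // Returns true when the non-status-bar / non-navigation-bar layer's name
    // changed since the last pass, i.e. the page content jumped.
    bool pageJumped(const std::string& currentLayerName) {
        bool jumped =
            !lastLayerName.empty() && currentLayerName != lastLayerName;
        lastLayerName = currentLayerName;  // remember for the next pass
        return jumped;
    }
};
```

A content-based check, as in fig. 5a and 5b, would compare pixel data instead of names, at a higher cost per frame.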
Since synthesized images are all in a private data format and cannot be analyzed directly, in some embodiments of the present application, in order to obtain the display content, the synthesis mode of the Hwc module of the HAL may be set to GPU synthesis when it is determined that the page content of the layer corresponding to the non-status bar and the non-navigation bar jumps.
S302: and acquiring an image to be analyzed which is synthesized by the Hwc module based on the content of each layer, and setting the data type of the image to be analyzed to be readable.
In some embodiments of the present application, an image to be analyzed that is synthesized by the Hwc module based on the content of each layer may be obtained, and the data type of the image to be analyzed is set to be readable.
Specifically, the data type of the image to be analyzed can be set to readable in real time through the following interface function: native_window_set_usage(DEFAULT_USAGE).
S303: and determining target data of the image to be analyzed according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed.
In some embodiments of the present application, after the data type of the image to be analyzed is set to readable, the RGB data of the image to be analyzed can be read directly from the display path. The target data of the image to be analyzed can then be determined according to a preset analysis algorithm and the RGB data corresponding to a preset area in the image. A person skilled in the art may configure the preset analysis algorithm as needed; for example, the algorithm may compute statistics such as the chromaticity of the preset area of the image to be analyzed. In some embodiments of the present application, the preset area may be of any shape, such as a rectangle, a diamond, or a triangle, and its size may be determined according to the size of the light sensor's sensing range. In some embodiments of the present application, the target data may include multiple values or only a single value.
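As one hedged example of such a preset data analysis algorithm, the sketch below averages the R, G, and B channels over a rectangular preset region of an RGBA8888 pixel buffer. The function name, buffer layout, and rectangular region model are illustrative assumptions, not the patent's implementation.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Rgb { double r, g, b; };

// Average the R, G, B channels over a rectangular preset region of an
// RGBA8888 buffer (4 bytes per pixel, row-major). Illustrative only; in
// practice the region would match the light sensor's sensing range.
Rgb averageRegion(const std::vector<uint8_t>& rgba, int imageWidth,
                  int x0, int y0, int regionW, int regionH) {
    double r = 0, g = 0, b = 0;
    for (int y = y0; y < y0 + regionH; ++y) {
        for (int x = x0; x < x0 + regionW; ++x) {
            std::size_t i =
                (static_cast<std::size_t>(y) * imageWidth + x) * 4;
            r += rgba[i];      // red channel
            g += rgba[i + 1];  // green channel
            b += rgba[i + 2];  // blue channel (alpha at i + 3 is ignored)
        }
    }
    const double n = static_cast<double>(regionW) * regionH;
    return {r / n, g / n, b / n};
}
```

The same loop structure would carry a chromaticity or histogram computation; only the per-pixel accumulation changes.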
In some embodiments of the present application, after determining target data for an image to be analyzed, the target data may be stored in a shared storage area for use by other devices, components, or modules.
In some embodiments of the application, when the page content of the designated layer jumps, the synthesis mode of the Hwc module is set to GPU synthesis, the image to be analyzed that the Hwc module synthesizes based on the content of each layer is acquired, and the data type of the image to be analyzed is set to readable so that the RGB data in the image can be obtained; the target data of the image to be analyzed is then determined according to the preset data analysis algorithm and the RGB data corresponding to the preset area in the image, thereby meeting the requirement of obtaining the target data corresponding to the preset area in real time.
In order to improve the accuracy of obtaining the display content, on the basis of the foregoing embodiments, in some embodiments of the present application, after the data type of the image to be analyzed is set to be readable, before the target data of the image to be analyzed is determined according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed, the method further includes:
and extracting the RGB data of the image to be analyzed, and locking the RGB data.
In some embodiments of the present application, after the data type of the image to be analyzed is set to be readable, RGB data of the image to be analyzed may be extracted.
Specifically, the image to be analyzed may be named tex, and the data type corresponding to tex is ExternalTexture; that is, the handle of the ExternalTexture type of the image to be analyzed is tex. In some embodiments of the present application, the RGB data of the image to be analyzed may be extracted based on the getBuffer interface function of the ExternalTexture.
In order to prevent the RGB data of the image to be analyzed from being tampered with, in some embodiments of the present application, after the RGB data of the image to be analyzed is extracted, the extracted RGB data may also be locked.
Specifically, in some embodiments of the present application, the extracted RGB data of the image to be analyzed may be named buf, and the data type corresponding to buf is GraphicBuffer; that is, the handle of the GraphicBuffer type of the extracted RGB data of the image to be analyzed is buf. In some embodiments of the present application, the extracted RGB data of the image to be analyzed may be locked based on the lock interface function of the GraphicBuffer, so as to prevent data tampering.
After the extracted RGB data is locked, the subsequent step of determining the target data of the image to be analyzed according to the preset data analysis algorithm and the RGB data corresponding to the preset region in the image to be analyzed may be continuously performed.
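The lock, analyze, and unlock sequence can be illustrated with a minimal sketch. The `SharedBuffer` class below is a hypothetical stand-in for a GraphicBuffer-like handle, not the actual Android interface; it only mirrors the rule that the data may be read while locked and must be unlocked afterwards so the display path can proceed:

```python
class SharedBuffer:
    """Hypothetical stand-in for a GraphicBuffer-like handle: the RGB
    data may only be read while the buffer is locked, mirroring the
    lock/unlock interface functions described above."""

    def __init__(self, rgb_data):
        self._rgb = rgb_data
        self._locked = False

    def lock(self):
        self._locked = True   # prevents the data from being overwritten

    def unlock(self):
        self._locked = False  # the display path may continue

    def read(self):
        if not self._locked:
            raise RuntimeError("buffer must be locked before reading")
        return self._rgb


buf = SharedBuffer([(231, 152, 128)])
buf.lock()           # analogous to the lock interface function
data = buf.read()    # analyze the RGB data while it is locked
buf.unlock()         # release the lock so display continues normally
```

The point of the sketch is the ordering: the analysis step sits strictly between lock and unlock, which is why the unlock step described next is required for normal display.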
In order to ensure that the terminal device correctly displays the image to be analyzed, on the basis of the foregoing embodiments, in some embodiments of the present application, after determining the target data of the image to be analyzed, the method further includes:
and unlocking the RGB data of the image to be analyzed.
Since the RGB data is locked before the target data of the image to be analyzed is determined according to the preset data analysis algorithm and the RGB data corresponding to the preset area in the image to be analyzed, so as to prevent the RGB data from being tampered with, in some embodiments of the present application, the lock on the RGB data of the image to be analyzed may be released after the target data of the image to be analyzed is determined, so as to ensure that the terminal device can display the image to be analyzed normally.
Specifically, in some embodiments of the present application, the extracted RGB data of the image to be analyzed may be unlocked based on the unlock interface function of the GraphicBuffer, so that the terminal device displays the image to be analyzed normally.
In order to further improve the efficiency of obtaining the display content, on the basis of the foregoing embodiments, in some embodiments of the present application, the determining, according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed, target data of the image to be analyzed includes:
aiming at each pixel point in a target analysis area in the image to be analyzed, respectively acquiring and storing the numerical value of each color channel of the RGB data corresponding to the pixel point;
aiming at each color channel of RGB data, determining an average value corresponding to the color channel according to the numerical value of each pixel point correspondingly stored by the color channel and the number of the pixel points contained in the target analysis area;
and determining the target data of the image to be analyzed according to the average value corresponding to each color channel.
In order to further improve the efficiency of obtaining the display content, in some embodiments of the present application, after the data type of the image to be analyzed is set to be readable, for each pixel point in a target analysis region in the image to be analyzed, a numerical value of each color channel of the RGB data corresponding to the pixel point may be obtained and stored, where the target analysis region is determined according to a preset region.
Specifically, in some embodiments of the present application, the target analysis region in the image to be analyzed may be determined according to pre-stored position information of the preset region, where the position information of the preset region may be the coordinates of any vertex of the preset region together with the width and height of the preset region. For example, the pre-saved position information of the preset region may be the (x, y) coordinates of the vertex at the upper left corner of the preset region, together with the width and height of the preset region. With one vertex coordinate of the preset region and the width and height of the preset region known, the target analysis region in the image to be analyzed can be determined.
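As an illustration only (the function and variable names below are hypothetical, not part of the embodiments), deriving the target analysis region from a stored top-left vertex plus width and height can be sketched as follows, with the rectangle clamped to the image bounds:

```python
def target_region(x, y, width, height, img_w, img_h):
    """Derive the target analysis region from the stored top-left
    vertex (x, y) plus the preset width and height, clamped to the
    image bounds of a (img_w x img_h) image to be analyzed."""
    x0 = max(0, x)
    y0 = max(0, y)
    x1 = min(img_w, x + width)
    y1 = min(img_h, y + height)
    return x0, y0, x1, y1  # half-open rectangle [x0, x1) x [y0, y1)
```

For example, a preset region stored as vertex (10, 20) with width 100 and height 50 inside a 1080 x 2400 frame yields the region (10, 20, 110, 70); clamping matters only when the preset region would extend past the frame edge.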
After the target analysis area is determined, the RGB data of each pixel point in the target analysis area in the image to be analyzed may be traversed, and the values of the R, G, and B color channels of each pixel point may be accumulated into TotalR, TotalG, and TotalB, respectively.
For each color channel of the RGB data, the average value corresponding to the color channel is calculated according to the values of the pixel points stored for that color channel and the number of pixel points contained in the target analysis area.
Specifically, in some embodiments of the present application, the sum of the values accumulated in TotalR may be divided by the number of pixel points contained in the target analysis region to obtain the average value of the R color channel.
Similarly, for the G and B color channels, the sum of the values stored corresponding to each color channel may be determined, and each sum is divided by the number of pixels included in the target analysis region to obtain the average value of the corresponding color channel.
After the average value corresponding to each color channel of the RGB data is determined, the target data of the image to be analyzed may be determined according to the average value corresponding to each color channel.
In some embodiments of the present application, when determining the target data of the image to be analyzed, the average values may be compared and the largest average value determined as the target data of the image to be analyzed. Alternatively, the average value corresponding to each color channel may be stored in the target data as target sub-data; that is, the target data includes the average value of each component of the RGB data, namely the average value corresponding to the R color channel, the average value corresponding to the G color channel, and the average value corresponding to the B color channel.
Specifically, assume that the average value for the R color channel is 231, the average value for the G color channel is 152, and the average value for the B color channel is 128. The target data of the image to be analyzed, determined according to the average value corresponding to each color channel, may then include three target sub-data: 231, 152, and 128.
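The per-channel averaging described above can be sketched in a few lines. This is an illustration only; pixel access and the name `channel_averages` are assumptions, not the embodiments' actual interfaces:

```python
def channel_averages(pixels, region):
    """Average the R, G and B channels over a target analysis region.

    pixels: 2-D list of (R, G, B) tuples indexed as pixels[y][x].
    region: (x0, y0, x1, y1), a half-open rectangle inside the image.
    """
    x0, y0, x1, y1 = region
    total_r = total_g = total_b = 0  # TotalR / TotalG / TotalB above
    count = (x1 - x0) * (y1 - y0)    # number of pixel points in region
    for y in range(y0, y1):
        for x in range(x0, x1):
            r, g, b = pixels[y][x]
            total_r += r
            total_g += g
            total_b += b
    # Each accumulated sum divided by the pixel count gives that
    # channel's average; the three averages can serve as target sub-data.
    return total_r // count, total_g // count, total_b // count
```

With a 2 x 2 region whose pixels are all (231, 152, 128), the function returns (231, 152, 128), matching the example above where the target data holds the three target sub-data 231, 152, and 128.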
The following describes a process for determining target data according to some embodiments of the present application with reference to a specific embodiment, and fig. 6 is a schematic diagram of a process for determining target data according to some embodiments of the present application, where the process includes the following steps:
S601: acquiring the position information of the preset area, and determining the target analysis area corresponding to the preset area in the image to be analyzed.
S602: cyclically obtaining the RGB data of each pixel point in the target analysis area.
S603: determining the average value of each color channel of the RGB data of the pixel points in the target analysis area.
S604: determining the target data of the image to be analyzed according to the average value corresponding to each color channel.
In order to further ensure that the terminal device correctly displays the image to be analyzed, on the basis of the foregoing embodiments, in some embodiments of the present application, after the target data of the image to be analyzed is determined, the method further includes:
setting the synthesis mode of the Hwc module of the HAL as Hwc synthesis; and setting the data type of the image to be analyzed to be unreadable.
In order to further ensure that the terminal device correctly displays the image to be analyzed, after the target data of the image to be analyzed is determined, in some embodiments of the present application, the data format of the image to be analyzed in the display path may be restored. Specifically, the synthesis mode of the Hwc module of the HAL may be restored to Hwc synthesis so that the SurfaceFlinger module smoothly performs the next layer synthesis, and the data type of the image to be analyzed may be set to be unreadable so that the terminal device can display the image to be analyzed normally.
Through the above display content acquisition process, the synthesis mode can be controlled in real time so that the image to be analyzed is synthesized by the GPU, and the RGB data of the image to be analyzed can be read directly without any data migration, which is both secure and efficient. The display content of one frame of image can be obtained within 1 ms, while the Android system of the terminal device updates each frame of image every 16 ms, so the screen display content is obtained in real time without affecting display performance.
The display content acquisition process provided in some embodiments of the present application may be applied to a SurfaceFlinger module, and is described below with reference to a specific embodiment. Fig. 7 is a schematic diagram of the display content acquisition process provided in some embodiments of the present application. As shown in fig. 7, the SurfaceFlinger module performs layer synthesis in real time according to the content changes of each layer, and updates the image displayed on the screen of the terminal device. A layer name recognition sub-module is added in the Output sub-module of the SurfaceFlinger module, and is used for recognizing whether the layer name of a layer changes, so as to detect whether the page content of the layer changes. When the layer name recognition sub-module detects that a layer name changes, the RenderSurface sub-module sets the data type of the image to be analyzed to be readable in real time based on the provided window_set_use update interface, and switches the synthesis mode of the Hwc module; when the page content of a layer corresponding to a non-status bar and a non-navigation bar jumps, the synthesis mode of the Hwc module is set to GPU synthesis. A data analysis sub-module is also added in the Output sub-module of the SurfaceFlinger module, and is used for reading and analyzing the RGB data corresponding to the preset area in the image to be analyzed.
The display content acquiring process provided in some embodiments of the present application can be applied to light sensation data calibration of a light sensor, and the light sensation data calibration of the light sensor is described below with reference to another specific embodiment, and fig. 8 is a schematic diagram of a process for performing light sensation data calibration based on the display content acquiring process provided in some embodiments of the present application, as shown in fig. 8, the process includes the following steps:
s801: and carrying out layer composition according to the content change of each layer in real time, and updating the image displayed on the screen of the terminal equipment.
S802: and when the image layer names of the image layers corresponding to the non-state column and the non-navigation column are identified to be changed, setting the synthesis mode of the Hwc module of the HAL to be GPU synthesis, and setting the data type of the image to be analyzed to be readable.
S803: and extracting RGB data of the image to be analyzed, and locking the RGB data.
S804: and calculating target data of a target analysis area of the image to be analyzed according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed.
S805: and storing the target data into the shared area, releasing the locking of the RGB data of the image to be analyzed, setting the synthesis mode of the Hwc module of the HAL as Hwc synthesis, and setting the data type of the image to be analyzed as unreadable, thereby continuing the display process.
S806: the light sensor connected with the terminal equipment acquires stored target data, namely RGB data corresponding to display content, from the shared area, acquires corresponding screen brightness, corrects light sensation data of the light sensor according to the acquired screen brightness and the RGB data corresponding to the display content, reports the corrected light sensation data to the terminal equipment, and enables the terminal equipment to adjust the screen brightness according to the corrected light sensation data.
In some embodiments of the present application, since the RGB data corresponding to the display content can be obtained at a high frame rate, the light sensor can compensate its light sensing data based on the RGB data at the position of the light sensor and the current screen brightness, without affecting the display performance.
The process of correcting the light sensing data of the light sensor based on the screen brightness and the RGB data corresponding to the display content has been described in detail in the related art, and is not repeated in the embodiments of the present application.
Fig. 9 is a schematic structural diagram of a display content acquiring apparatus according to some embodiments of the present application, as shown in fig. 9, the apparatus includes:
a setting module 901, configured to set a synthesis mode of a hardware synthesis Hwc module of a hardware abstraction layer HAL as a graphics processing unit GPU synthesis when it is determined that page contents of layers corresponding to a non-status bar and a non-navigation bar jump;
an obtaining module 902, configured to obtain an image to be analyzed that is synthesized by the Hwc module based on the content of each layer;
the setting module 901 is further configured to set the data type of the image to be analyzed to be readable;
the determining module 903 is configured to determine target data of the image to be analyzed according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed.
In a possible implementation manner, the obtaining module 902 is further configured to extract RGB data of the image to be analyzed;
the setting module 901 is further configured to lock the RGB data.
In a possible embodiment, the setting module 901 is further configured to unlock RGB data of the image to be analyzed.
In a possible implementation manner, the determining module 903 is specifically configured to, for each pixel point in a target analysis region in the image to be analyzed, respectively obtain and store a numerical value of each color channel of RGB data corresponding to the pixel point; aiming at each color channel of RGB data, determining an average value corresponding to the color channel according to the value of each pixel point correspondingly stored in the color channel and the number of the pixel points contained in the target analysis area; and determining the target data of the image to be analyzed according to the average value corresponding to each color channel.
In a possible embodiment, the setting module 901 is further configured to set the combining manner of the Hwc module of the HAL to Hwc combining; and setting the data type of the image to be analyzed to be unreadable.
Based on the same inventive concept, fig. 10 is another schematic structural diagram of a terminal device according to some embodiments of the present application. As shown in fig. 10, the terminal device includes: one or more processors 1001 (two are shown in fig. 10) and a communication interface 1002.
The processor 1001 stores therein a computer program, which, when executed by the processor 1001, causes the processor 1001 to execute the steps of the display content acquisition method in any of the embodiments described above.
Optionally, the terminal device further includes a memory 1003. The memory 1003 may include a read-only memory and a random access memory, and provides operation instructions and data to the processor. A portion of the memory may also include non-volatile random access memory (NVRAM).
In some embodiments, as shown in FIG. 10, memory 1003 stores elements, execution modules or data structures, or a subset thereof, or an expanded set thereof.
As shown in fig. 10, in some embodiments of the present application, the corresponding operation is performed by calling an operation instruction stored in the memory 1003 (the operation instruction may be stored in an operating system).
As shown in fig. 10, the processor 1001, which may also be referred to as a central processing unit (CPU), controls the processing operations of the terminal device.
As shown in fig. 10, the memory 1003 may include both read-only memory and random access memory, and provides instructions and data to the processor. A portion of the memory 1003 may also include NVRAM. The components of the terminal device, such as the communication interface and the memory, are coupled together by a bus system 1004. In addition to a data bus, the bus system 1004 may include a power bus, a control bus, a status signal bus, and the like. For clarity of illustration, the various buses are designated in fig. 10 as the bus system 1004.
On the basis of the foregoing embodiments, the present application further provides a computer-readable storage medium, in which a computer program executable by a processor is stored, and when the program runs on the processor, the processor is caused to execute the steps of the display content acquiring method in any one of the foregoing embodiments.
Since the principle of solving the problem of the computer readable medium is similar to that of the display content acquiring method, after the processor executes the computer program in the computer readable medium, the implementation steps can be referred to the above embodiments, and repeated parts are not described again.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A display content acquisition method, characterized in that the method comprises:
when it is determined that the page content of the layer corresponding to a non-status bar and a non-navigation bar jumps, setting the synthesis mode of a hardware synthesis Hwc module of a hardware abstraction layer HAL to graphics processor GPU synthesis; acquiring an image to be analyzed synthesized by the Hwc module based on the content of each layer, and setting the data type of the image to be analyzed to be readable;
and determining target data of the image to be analyzed according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed.
2. The method as claimed in claim 1, wherein after the data type of the image to be analyzed is set to be readable, before the target data of the image to be analyzed is determined according to a preset data analysis algorithm and RGB data corresponding to a preset region in the image to be analyzed, the method further comprises:
and extracting the RGB data of the image to be analyzed, and locking the RGB data.
3. The method of claim 2, wherein after determining the target data for the image to be analyzed, the method further comprises:
and unlocking the RGB data of the image to be analyzed.
4. The method as claimed in claim 1, wherein the determining the target data of the image to be analyzed according to a preset data analysis algorithm and RGB data corresponding to a preset region in the image to be analyzed comprises:
aiming at each pixel point in a target analysis area in the image to be analyzed, respectively acquiring and storing the numerical value of each color channel of the RGB data corresponding to the pixel point;
aiming at each color channel of RGB data, determining an average value corresponding to the color channel according to the numerical value of each pixel point correspondingly stored by the color channel and the number of the pixel points contained in the target analysis area;
and determining the target data of the image to be analyzed according to the average value corresponding to each color channel.
5. The method of claim 1, wherein after determining the target data for the image to be analyzed, the method further comprises:
setting the synthesis mode of the Hwc module of the HAL as Hwc synthesis; and setting the data type of the image to be analyzed to be unreadable.
6. A display content acquisition apparatus, characterized in that the apparatus comprises:
the setting module is used for setting the synthesis mode of a hardware synthesis Hwc module of the hardware abstraction layer HAL as graphic processor GPU synthesis when the jumping of the page content of the layer corresponding to the non-status bar and the non-navigation bar is determined;
the acquisition module is used for acquiring the image to be analyzed which is synthesized by the Hwc module based on the content of each layer;
the setting module is further used for setting the data type of the image to be analyzed to be readable;
and the determining module is used for determining target data of the image to be analyzed according to a preset data analysis algorithm and RGB data corresponding to a preset area in the image to be analyzed.
7. The apparatus of claim 6, wherein the obtaining module is further configured to extract RGB data of the image to be analyzed;
the setting module is further used for locking the RGB data.
8. The apparatus according to claim 6, wherein the determining module is specifically configured to, for each pixel point in a target analysis region in the image to be analyzed, respectively obtain and store a numerical value of each color channel of the RGB data corresponding to the pixel point; aiming at each color channel of RGB data, determining an average value corresponding to the color channel according to the numerical value of each pixel point correspondingly stored by the color channel and the number of the pixel points contained in the target analysis area; and determining the target data of the image to be analyzed according to the average value corresponding to each color channel.
9. A terminal device, characterized in that the terminal device comprises:
a display, a processor, and a memory;
the display is used for displaying a screen display area;
the memory to store the processor-executable instructions;
the processor is configured to execute the instructions to implement the display content acquisition method of any one of claims 1-5.
10. A computer-readable storage medium, characterized in that it stores a computer program which, when executed by a processor, implements the steps of the display content acquisition method according to any one of claims 1 to 5.
CN202210766622.XA 2022-06-30 2022-06-30 Display content acquisition method, device, terminal equipment and medium Pending CN115167796A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210766622.XA CN115167796A (en) 2022-06-30 2022-06-30 Display content acquisition method, device, terminal equipment and medium


Publications (1)

Publication Number Publication Date
CN115167796A true CN115167796A (en) 2022-10-11

Family

ID=83489560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210766622.XA Pending CN115167796A (en) 2022-06-30 2022-06-30 Display content acquisition method, device, terminal equipment and medium

Country Status (1)

Country Link
CN (1) CN115167796A (en)

Similar Documents

Publication Publication Date Title
US12340082B2 (en) Card display method, electronic device, and computer readable storage medium
CN111597000B (en) Small window management method and terminal
CN111343339B (en) Mobile terminal and image display method thereof
CN113223464A (en) Ink screen image display method and ink screen terminal
CN111508039A (en) Word processing method of ink screen and communication terminal
US20240236227A9 (en) Information display method and electronic device
CN112925596B (en) Mobile terminal and display method of display object thereof
CN113360122B (en) Mobile terminal and text display method thereof
CN112184595A (en) Mobile terminal and image display method thereof
CN112612386B (en) Mobile terminal and display method of application card thereof
CN111726605B (en) Resolving power determining method and device, terminal equipment and storage medium
CN113709026A (en) Method, device, storage medium and program product for processing instant communication message
CN111193874B (en) Image display parameter adjusting method and mobile terminal
CN113038141B (en) Video frame processing method and electronic equipment
CN114067758B (en) Mobile terminal and image display method thereof
CN111479075B (en) Photographing terminal and image processing method thereof
CN114063945B (en) Mobile terminal and image display method thereof
CN115167796A (en) Display content acquisition method, device, terminal equipment and medium
CN114594894A (en) Interface element marking method, terminal device and storage medium
CN111399955B (en) Mobile terminal and interface display method of application program thereof
CN113900740A (en) Method and device for loading multiple list data
CN114639358A (en) Ink screen refreshing method, terminal device, storage medium and program product
CN114546219A (en) Picture list processing method and related device
CN113934340A (en) Terminal device and progress bar display method
CN113760164A (en) Display device and its response method to control operation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: Shandong City, Qingdao Province, Jiangxi City Road No. 11

Applicant after: Qingdao Hisense Mobile Communication Technology Co.,Ltd.

Address before: Shandong City, Qingdao Province, Jiangxi City Road No. 11

Applicant before: HISENSE MOBILE COMMUNICATIONS TECHNOLOGY Co.,Ltd.

Country or region before: China