
CN110908754A - Image display method and electronic equipment - Google Patents


Info

Publication number
CN110908754A
CN110908754A
Authority
CN
China
Prior art keywords
image
images
target
determining
party
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911077866.1A
Other languages
Chinese (zh)
Inventor
曹新英
周伟伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911077866.1A
Publication of CN110908754A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide an image display method and an electronic device. The method includes: determining a first image; performing image processing on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; determining a target image among the at least two second images; and displaying the target image in an image preview interface. Instead of opening different third-party image processing applications one by one to obtain second images with different retouching results, the user is shown each target image processed by the different third-party image processing applications directly in the gallery, where the images can be viewed and selected. The operation steps are simple and the user experience is improved.

Description

Image display method and electronic equipment
Technical Field
The invention relates to the technical field of terminal equipment, in particular to an image display method and electronic equipment.
Background
At present, when people take pictures with a terminal device, the captured images often need to be processed with a third-party image processing application.
However, there are many third-party image processing applications in the prior art, and different third-party image processing applications produce different effects when processing an image.
When a user wants to process an image, the different third-party image processing applications have to be opened one by one to obtain images with different retouching results, and a selection then has to be made among those images; this increases the number of operations the user must perform and degrades the user experience.
Disclosure of Invention
The embodiment of the invention provides an image display method and an electronic device, and aims to solve the problem in the prior art that processing an image requires complex operations.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides an image display method, which is applied to a terminal device, where the method includes: determining a first image; respectively performing image processing on the first image in at least two third-party image processing application programs to generate at least two corresponding second images; determining a target image of the at least two second images; and displaying the target image in an image preview interface.
In a second aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes: a first determining module, configured to determine a first image; a first generating module, configured to perform image processing on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images; a second determining module, configured to determine a target image of the at least two second images; and a display module, configured to display the target image in an image preview interface.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image display method.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program is executed by a processor to implement the steps of the image display method.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images; a target image is determined among the at least two second images; and the target image is displayed in an image preview interface. Instead of opening different third-party image processing applications one by one to obtain second images with different retouching results, the user is shown each target image processed by the different third-party image processing applications directly in the gallery, where the images can be viewed and selected. The operation steps are simple and the user experience is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without inventive labor.
FIG. 1 is a flowchart illustrating steps of an image display method according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of an image display method according to a second embodiment of the present invention;
FIG. 3 is a block diagram of an electronic device according to a third embodiment of the present invention;
FIG. 4 is a schematic diagram of a hardware structure of a mobile terminal according to a fifth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Example 1
Referring to fig. 1, a flowchart illustrating steps of an image display method according to a first embodiment of the present invention is shown.
The image display method provided by the embodiment of the invention comprises the following steps:
step 101: a first image is determined.
A viewing instruction of the user for the first image is received; the first image may be any image in the gallery or an image captured by the user.
Step 102: Image processing is performed on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images.
In the terminal device, different third-party image processing applications are installed.
Different third-party image processing application programs correspond to different processing modes, i.e., different filters, so the different filters of the different third-party image processing applications are used to process the first image, obtaining second images with different effects.
Step 103: a target image of the at least two second images is determined.
For the second images with different effects, scores are calculated, and the second images with higher scores are determined as the target images.
Alternatively, the second images with different effects are compared against a standard atlas, i.e., a set of high-quality standard images recognized in the industry: images consistent with the image information of the first image are acquired from the standard atlas, the parameters of those selected images are aggregated and their mean value is calculated, the Euclidean distance between the parameters of each second image and the mean value is calculated, and the second images are sorted in ascending order of Euclidean distance to determine the target images; the smaller the Euclidean distance, the greater the similarity between the second image and the standard images.
Alternatively, the second images processed by the different third-party image processing applications are sorted according to how frequently the user uses those applications, and the second images whose corresponding usage frequency is greater than a preset frequency are acquired as target images.
Step 104: The target image is displayed in the image preview interface.
The user can view each target image in the preview interface. Each target image may be displayed in the upper part of the preview interface, with icons corresponding to the different third-party image processing applications displayed below; when the user clicks a different icon, the target image produced by the third-party image processing application corresponding to the clicked icon is displayed in the upper part of the preview interface.
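For illustration only, the flow of steps 101 to 104 can be sketched as follows in Python; the helper callables (the per-application processing functions, select_target_images, show_preview) are hypothetical stand-ins, not APIs defined by this embodiment or any real framework.

```python
# Minimal sketch of steps 101-104. All helper callables are hypothetical
# stand-ins for invoking third-party image processing applications and the
# gallery preview interface.
from typing import Callable, Dict

Image = bytes  # placeholder for raw image data


def display_processed_images(
    first_image: Image,
    third_party_apps: Dict[str, Callable[[Image], Image]],
    select_target_images: Callable[[Dict[str, Image]], Dict[str, Image]],
    show_preview: Callable[[Dict[str, Image]], None],
) -> None:
    # Step 102: process the first image in each third-party application,
    # producing one second image per application.
    second_images = {name: process(first_image)
                     for name, process in third_party_apps.items()}
    # Step 103: choose target images by a selection rule (scoring network,
    # standard-atlas distance, or usage frequency, as described herein).
    target_images = select_target_images(second_images)
    # Step 104: show the target images in the image preview interface.
    show_preview(target_images)
```

The selection rule passed in as select_target_images corresponds to any of the strategies described for step 103.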
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images; a target image is determined among the at least two second images; and the target image is displayed in an image preview interface. Instead of opening different third-party image processing applications one by one to obtain second images with different retouching results, the user is shown each target image processed by the different third-party image processing applications directly in the gallery, where the images can be viewed and selected. The operation steps are simple and the user experience is improved.
Example 2
Referring to fig. 2, a flowchart illustrating steps of an image display method according to a second embodiment of the present invention is shown.
The image display method provided by the embodiment of the invention comprises the following steps:
step 201: a first image is determined.
A viewing instruction of the user for the first image is received; the first image may be any image in the gallery or an image captured by the user.
Step 202: Image processing is performed on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images.
In the terminal device, different third-party image processing applications are installed.
Different third-party image processing application programs correspond to different processing modes, i.e., different filters, so the different filters of the different third-party image processing applications are used to process the first image, obtaining second images with different effects.
Step 203: a standard set of images is obtained from a server and image information for the first image is identified.
The standard image set is a set of high-quality standard images recognized in the industry.
Using image recognition technology, the image information in the first image is identified; the image information may be scene information, person information, background information, and the like.
Step 204: Each third image consistent with the image information is acquired from the standard image set.
Because the standard image set contains many images with different image information, the image information of the first image is matched against the image information of each image in the standard image set, and each successfully matched image is obtained as a third image.
Step 205: first image parameter values for each third image are determined.
Wherein the first image parameter values include: exposure value, contrast value, and hue value.
Step 206: For each second image and each third image, Euclidean distance values between the second image parameter values of the second image and the first image parameter values are respectively determined.
For each second image and each third image, the differences between the second image parameter values of the second image and the first image parameter values are calculated, and the Euclidean distance value is calculated from these differences.
The first image parameter values of each third image are aggregated, and the average of the first image parameter values over all third images is determined. For example, when there are three image parameter values, the resulting mean vector is q = (q1, q2, q3), where q1, q2, and q3 are the exposure value, the contrast value, and the hue value, respectively.
Similarly, parameter statistics are computed for the second images produced by the plurality of third-party image processing applications. For example, if there are k third-party image processing applications, there are k second images, giving k vectors k1 = (k11, k12, k13), k2 = (k21, k22, k23), ..., kk = (kk1, kk2, kk3). The Euclidean distance between each of these vectors and the vector q is calculated separately.
Step 207: Each second image corresponding to a Euclidean distance value larger than the preset Euclidean distance value is acquired as a target image.
The second images are sorted in ascending order of Euclidean distance value; the smaller the Euclidean distance, the greater the similarity to the third images in the standard image set. A preset number of the top-ranked second images are acquired as target images.
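As a minimal sketch of the calculation in steps 205 to 207, assuming each image is summarized by the three parameter values named above (exposure, contrast, and hue) and following the ascending-order ranking just described, the computation could look like this; the function name and the use of NumPy are illustrative assumptions.

```python
# Sketch of steps 205-207: rank the second images by Euclidean distance to
# the mean parameter vector of the matched standard (third) images.
import numpy as np


def rank_second_images(
    third_image_params: np.ndarray,   # shape (m, 3): exposure, contrast, hue of each third image
    second_image_params: np.ndarray,  # shape (k, 3): one row per second image
    top_n: int = 3,
) -> np.ndarray:
    # Mean vector q over the standard images' parameter values.
    q = third_image_params.mean(axis=0)
    # Euclidean distance between each second image's parameter vector and q.
    distances = np.linalg.norm(second_image_params - q, axis=1)
    # A smaller distance means greater similarity to the standard atlas, so
    # the top_n second images with the smallest distances are kept.
    return np.argsort(distances)[:top_n]
```

The returned indices identify the second images that would be taken as target images.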
The determination of each target image from each second image in steps 203 to 207 may also be achieved by:
the first mode is as follows: inputting at least two second images into an image scoring network respectively; obtaining the score of each second image output by the image scoring network; and taking the second image with the score larger than the first preset score as a target image. In this way, optionally, the target images may be displayed in order from the largest value to the smallest value of the score.
It should be noted that, a person skilled in the art may set the first preset score according to actual situations, where the first preset score may be 70 points, 80 points, 90 points, and so on.
Specifically, the method comprises the following steps: each second image is input into the image scoring network; the score of each second image output by the image scoring network is obtained; the images with scores larger than the preset score are acquired as fourth images; the third-party image processing application corresponding to each fourth image is determined; for each distinct third-party image processing application, the fourth images produced by that application are collected as fifth images; and the fourth image with the highest score among the fifth images of each application is taken as a target image.
With this method of determining the target image, the filter effect of the generated target image better matches the images in the standard atlas, the user does not need to perform further processing on the image, and the user experience is improved.
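A rough sketch of the first mode is given below; score_image is a hypothetical stand-in for the image scoring network, which is not specified here, and the grouping mirrors the fourth-image/fifth-image description above.

```python
# Sketch of the first mode: score each second image, keep those above a first
# preset score (fourth images), group them per third-party application (fifth
# images), and keep the highest-scoring one per application as a target image.
from typing import Callable, Dict, List


def select_by_score(
    second_images: Dict[str, List[bytes]],   # application name -> its second images
    score_image: Callable[[bytes], float],   # stand-in for the image scoring network
    first_preset_score: float = 80.0,
) -> Dict[str, bytes]:
    target_images: Dict[str, bytes] = {}
    for app_name, images in second_images.items():
        scored = [(score_image(img), img) for img in images]
        qualified = [(s, img) for s, img in scored if s > first_preset_score]
        if qualified:
            # Highest-scoring qualified image of this application.
            target_images[app_name] = max(qualified, key=lambda pair: pair[0])[1]
    return target_images
```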
The second mode may be: determining a first usage frequency and a first weight value of each third-party image processing application; determining, for the different filter parameters in the third-party image processing application, a second usage frequency and a second weight value corresponding to each filter parameter; determining a target score corresponding to a filter parameter in the third-party image processing application according to the first usage frequency, the first weight value, the second usage frequency, and the second weight value; and taking the second image corresponding to a filter parameter whose target score is larger than a second preset score as a target image.
By determining the target image according to the first usage frequency and first weight value of the third-party application and the second usage frequency and second weight value of its different filters, the filters of the third-party image processing applications that the user frequently uses are favored, and the generated target images better match the user's usage habits.
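The embodiment does not specify how the first usage frequency, the first weight value, the second usage frequency, and the second weight value are combined into the target score; the sketch below assumes a simple weighted sum, purely for illustration.

```python
# Sketch of the second mode: score each (application, filter) pair from usage
# frequencies and weight values, and keep the second images whose target score
# exceeds a second preset score. The weighted-sum formula is an assumption.
from typing import Dict, List, Tuple


def select_by_usage(
    app_usage: Dict[str, Tuple[float, float]],                # app -> (first usage frequency, first weight value)
    filter_usage: Dict[Tuple[str, str], Tuple[float, float]], # (app, filter) -> (second usage frequency, second weight value)
    second_preset_score: float,
) -> List[Tuple[str, str]]:
    selected: List[Tuple[str, str]] = []
    for (app_name, filter_name), (freq2, weight2) in filter_usage.items():
        freq1, weight1 = app_usage[app_name]
        target_score = freq1 * weight1 + freq2 * weight2  # assumed combination rule
        if target_score > second_preset_score:
            # The second image generated with this filter becomes a target image.
            selected.append((app_name, filter_name))
    return selected
```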
Step 208: and displaying the target image in the image preview interface.
After step 208, there may optionally be step 209 and step 210.
Step 209: A slide-up touch operation of the user on any target image is received.
Step 210: The target image is deleted from the image preview interface according to the slide-up touch operation.
The user can delete, save, and modify the displayed target images in the preview interface; when the user performs the slide-up touch operation on any target image in the preview screen, the preview result image of the current third-party image processing application, i.e., the current target image, is deleted.
The user can view each target image in the preview interface. Each target image may be displayed in the upper part of the preview interface, with icons corresponding to the different third-party image processing applications displayed below; when the user clicks a different icon, the target image produced by the third-party image processing application corresponding to the clicked icon is displayed in the upper part of the preview interface.
When the user long-presses the icon of a third-party image processing application, the interface of that third-party image processing application is opened, and the user can use it to retouch the target image.
The user can also click a 'save' button in the image preview interface, or press any target image and drag it downwards, to save the selected target image.
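Purely as an illustration, the preview gestures described above could be mapped to actions as in the following sketch; the gesture names and the preview and application interfaces are hypothetical.

```python
# Sketch of the preview gestures: slide up to delete the current target image
# (steps 209-210), long-press the application icon to open that third-party
# application for further retouching, and tap "save" or drag the image down
# to save it. All object interfaces here are hypothetical.
def handle_preview_gesture(gesture: str, preview, target_image, app) -> None:
    if gesture == "slide_up":
        preview.remove(target_image)            # delete from the preview interface
    elif gesture == "long_press_icon":
        app.open_editor(target_image)           # continue retouching in the third-party app
    elif gesture in ("tap_save", "drag_down"):
        preview.save_to_gallery(target_image)   # save the selected target image
```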
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images; a target image is determined among the at least two second images; and the target image is displayed in an image preview interface. Instead of opening different third-party image processing applications one by one to obtain second images with different retouching results, the user is shown each target image processed by the different third-party image processing applications directly in the gallery, where the images can be viewed and selected. The operation steps are simple and the user experience is improved.
Having described the image display method according to the embodiments of the present invention, the following describes the electronic device according to the embodiments of the present invention with reference to the accompanying drawings.
Example 3
Referring to fig. 3, a block diagram of an electronic device according to a third embodiment of the present invention is shown.
The electronic device provided by the embodiment of the invention comprises: a first determining module 301, configured to determine a first image; a first generating module 302, configured to perform image processing on the first image in at least two third-party image processing applications, respectively, and generate at least two corresponding second images; a second determining module 303, configured to determine a target image in the at least two second images; and a first display module 304, configured to display the target image in an image preview interface.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images; a target image is determined among the at least two second images; and the target image is displayed in an image preview interface. Instead of opening different third-party image processing applications one by one to obtain second images with different retouching results, the user is shown each target image processed by the different third-party image processing applications directly in the gallery, where the images can be viewed and selected. The operation steps are simple and the user experience is improved.
Example 4
The electronic device provided by the embodiment of the invention comprises: a first determining module 401, configured to determine a first image; a first generating module 402, configured to perform image processing on the first image in at least two third-party image processing applications, respectively, and generate at least two corresponding second images; a second determining module 403, configured to determine a target image in the at least two second images; and a first display module 404, configured to display the target image in an image preview interface.
The second determining module 403 includes: the input sub-module 4031 is used for inputting each second image into the image scoring network; a first obtaining sub-module 4032, configured to obtain a score of each second image output by the image scoring network; a first determining sub-module 4033, configured to use the second image with the score greater than a first preset score as the target image.
Preferably, the electronic device further includes: a second display module 405, configured to display the target images in descending order of score after the first determination sub-module 4033 takes the second image with the score larger than a first preset score as the target image.
Preferably, the second determining module 403 includes: a second obtaining sub-module 4034, configured to obtain a standard image set from a server and identify image information of the first image; a third obtaining sub-module 4035, configured to obtain, from the standard image set, each third image that is consistent with the image information; a second determining sub-module 4036, configured to determine first image parameter values of each third image, where the first image parameter values include: an exposure value, a contrast value, and a hue value; a third determining sub-module 4037, configured to determine, for each second image and each third image, Euclidean distance values between the second image parameter values of the second image and the first image parameter values, respectively; and a fourth obtaining sub-module 4038, configured to obtain each second image corresponding to a Euclidean distance value greater than the preset Euclidean distance value, as a target image.
Preferably, the second determining module 403 includes: a fourth determining submodule 4039 for determining, for each of the third-party image processing applications, a first frequency of use and a first weight value for the third-party image processing application; a fifth determining submodule 40310, configured to determine, for different filter parameters in the third-party image processing application, a second usage frequency and a second weight value corresponding to the filter parameter respectively; a sixth determining submodule 40311, configured to determine, according to the first usage frequency, the first weight value, the second usage frequency, and the second weight value, a target score corresponding to the filter parameter in the third-party image processing application; a seventh determining submodule 40312, configured to use the second image corresponding to the filter parameter with the target score being greater than a second preset score as the target image.
Preferably, the electronic device further includes: a receiving module 406, configured to receive a slide-up touch operation of the user on any one of the target images after the first display module 404 displays each target image in the image preview interface; and a deleting module 407, configured to delete the target image from the image preview interface according to the slide-up touch operation.
The electronic device provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 and fig. 2; details are not described here again to avoid repetition.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images; a target image is determined among the at least two second images; and the target image is displayed in an image preview interface. Instead of opening different third-party image processing applications one by one to obtain second images with different retouching results, the user is shown each target image processed by the different third-party image processing applications directly in the gallery, where the images can be viewed and selected. The operation steps are simple and the user experience is improved.
Example 5
Referring to fig. 4, a schematic diagram of a hardware structure of a mobile terminal according to a fifth embodiment of the present invention is shown.
the mobile terminal 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 4 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The radio frequency unit 501 is configured to receive a touch operation of a user on a first image.
A processor 510, configured to determine, according to the touch operation, each third-party image processing application in the terminal device; performing image processing on the first image in each third-party image processing application program to generate each second image; acquiring each target image from each second image according to a preset acquisition rule; and displaying each target image in the image preview interface.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images; a target image is determined among the at least two second images; and the target image is displayed in an image preview interface. Instead of opening different third-party image processing applications one by one to obtain second images with different retouching results, the user is shown each target image processed by the different third-party image processing applications directly in the gallery, where the images can be viewed and selected. The operation steps are simple and the user experience is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used for receiving and sending signals during a message sending and receiving process or a call process; specifically, it receives downlink data from a base station and delivers the downlink data to the processor 510 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 502, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output related to a specific function performed by the mobile terminal 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sounds and process them into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501.
The mobile terminal 500 also includes at least one sensor 505, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 5061 and/or a backlight when the mobile terminal 500 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. Touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 5071 using a finger, stylus, or any suitable object or attachment). The touch panel 5071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061. When the touch panel 5071 detects a touch operation on or near it, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and the processor 510 then provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 4 the touch panel 5071 and the display panel 5061 are shown as two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited here.
The interface unit 508 is an interface through which an external device is connected to the mobile terminal 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 500 or may be used to transmit data between the mobile terminal 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 510 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the mobile terminal. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The mobile terminal 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 via a power management system, so that functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the mobile terminal 500 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor 510, a memory 509, and a computer program stored in the memory 509 and capable of running on the processor 510, where the computer program, when executed by the processor 510, implements each process of the above image display method embodiments and can achieve the same technical effect; to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, it implements each process of the image display method embodiments and can achieve the same technical effect; to avoid repetition, details are not described here again. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An image display method applied to an electronic device, the method comprising:
determining a first image;
respectively carrying out image processing on the first image in at least two third-party image processing application programs to generate at least two corresponding second images;
determining a target image of the at least two second images;
and displaying the target image in an image preview interface.
2. The method according to claim 1, wherein the step of determining the target image of the at least two second images specifically comprises:
inputting the at least two second images into an image scoring network respectively;
obtaining the score of each second image output by the image scoring network;
and taking the second image with the score larger than a first preset score as the target image.
3. The method according to claim 2, wherein after the step of taking the second image with a score larger than a first preset score as the target image, the method further comprises:
and displaying the target images in an order from large to small according to the scores.
4. The method of claim 1, wherein the step of determining the target image of the at least two second images comprises:
acquiring a standard image set from a server and identifying image information of the first image;
acquiring each third image consistent with the image information from the standard image set;
determining a first image parameter value for each of the third images, wherein the first image parameter values include: an exposure value, a contrast value, and a hue value;
respectively determining Euclidean distance values of second image parameter values of the second image and the first image parameter values for each second image and each third image;
and acquiring each second image corresponding to the Euclidean distance value larger than the preset Euclidean distance value as a target image.
5. The method according to claim 1, wherein the step of determining the target image of the at least two second images specifically comprises:
for each third-party image processing application program, determining a first use frequency and a first weight value of the third-party image processing application program;
respectively determining a second use frequency and a second weight value corresponding to different filter parameters in the third-party image processing application program;
determining a target score corresponding to the filter parameter in the third-party image processing application program according to the first use frequency, the first weight value, the second use frequency and the second weight value;
and taking the second image corresponding to the filter parameter with the target score larger than a second preset score as the target image.
6. The method of claim 1, wherein after the step of displaying each of the target images in an image preview interface, the method further comprises:
receiving an upward-sliding touch operation of a user on any one target image;
and deleting the target image on the image preview interface according to the up-sliding touch operation.
7. An electronic device, characterized in that the electronic device comprises:
a first determining module for determining a first image;
a first generating module for performing image processing on the first image in at least two third-party image processing application programs, respectively, to generate at least two corresponding second images;
a second determining module for determining a target image of the at least two second images; and
a first display module for displaying the target image in an image preview interface.
8. The electronic device of claim 7, wherein the second determining module comprises:
the input submodule is used for inputting each second image into the image scoring network;
the first obtaining submodule is used for obtaining the score of each second image output by the image scoring network;
and the first determining sub-module is used for taking the second image with the score larger than a first preset score as the target image.
9. The electronic device of claim 8, wherein the electronic device comprises:
and the second display module is used for displaying the target images in an order from large to small according to the scores after the first determining sub-module takes the second image with the score larger than a first preset score as the target image.
10. The electronic device of claim 7, wherein the second determining module comprises:
the second obtaining submodule is used for obtaining the standard image set from the server and identifying the image information of the first image;
the third acquisition sub-module is used for acquiring each third image consistent with the image information from the standard image set;
a second determining sub-module, configured to determine a first image parameter value of each of the third images, where the first image parameter value includes: an exposure value, a contrast value, and a hue value;
a third determining submodule, configured to determine, for each second image and each third image, euclidean distance values between second image parameter values of the second images and the first image parameter values, respectively;
and the fourth obtaining submodule is used for obtaining each second image corresponding to the Euclidean distance value larger than the preset Euclidean distance value as the target image.
CN201911077866.1A 2019-11-06 2019-11-06 Image display method and electronic equipment Pending CN110908754A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911077866.1A CN110908754A (en) 2019-11-06 2019-11-06 Image display method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911077866.1A CN110908754A (en) 2019-11-06 2019-11-06 Image display method and electronic equipment

Publications (1)

Publication Number Publication Date
CN110908754A true CN110908754A (en) 2020-03-24

Family

ID=69814820

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911077866.1A Pending CN110908754A (en) 2019-11-06 2019-11-06 Image display method and electronic equipment

Country Status (1)

Country Link
CN (1) CN110908754A (en)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040032508A1 (en) * 1998-07-10 2004-02-19 Kia Silverbrook Cascading image modification using multiple digital cameras incorporating image processing
US20110227939A1 (en) * 2007-08-06 2011-09-22 Panasonic Corporation Image generation device, image display device, image generation method, image display method, image generation program, and image display program
CN102426511A (en) * 2010-11-16 2012-04-25 微软公司 System level search user interface
CN103793875A (en) * 2014-02-25 2014-05-14 厦门美图之家科技有限公司 Image processing system capable of supporting third party application
CN105279161A (en) * 2014-06-10 2016-01-27 腾讯科技(深圳)有限公司 Filter sequencing method and filter sequencing device for picture processing application
CN104407769A (en) * 2014-10-28 2015-03-11 小米科技有限责任公司 Picture processing method, device and equipment
CN105357451A (en) * 2015-12-04 2016-02-24 Tcl集团股份有限公司 Image processing method and apparatus based on filter special efficacies
CN105511738A (en) * 2016-01-26 2016-04-20 努比亚技术有限公司 Method and device for regulating image processing menu
WO2018049952A1 (en) * 2016-09-14 2018-03-22 厦门幻世网络科技有限公司 Photo acquisition method and device
CN106651761A (en) * 2016-12-27 2017-05-10 维沃移动通信有限公司 Method for adding filters to pictures, and mobile terminal
CN106775902A (en) * 2017-01-25 2017-05-31 北京奇虎科技有限公司 A kind of method and apparatus of image procossing, mobile terminal
CN106775332A (en) * 2017-02-07 2017-05-31 珠海市魅族科技有限公司 Image processing method and picture processing system
CN107729160A (en) * 2017-09-29 2018-02-23 努比亚技术有限公司 Application control method, mobile terminal and computer-readable recording medium
CN107728877A (en) * 2017-09-29 2018-02-23 维沃移动通信有限公司 One kind applies recommendation method and mobile terminal
CN110221794A (en) * 2018-03-02 2019-09-10 阿里巴巴集团控股有限公司 A kind of object displaying method and terminal
CN110058754A (en) * 2019-03-29 2019-07-26 维沃移动通信有限公司 A kind of option display method and terminal device
CN110035184A (en) * 2019-04-22 2019-07-19 珠海格力电器股份有限公司 Image processing method and device and folding screen terminal
CN110223292A (en) * 2019-06-20 2019-09-10 厦门美图之家科技有限公司 Image evaluation method, device and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN110891144B (en) Image display method and electronic equipment
CN109743498B (en) Shooting parameter adjusting method and terminal equipment
CN109078319B (en) Game interface display method and terminal
CN108491123B (en) Method for adjusting application program icon and mobile terminal
CN110109593B (en) Screen capturing method and terminal equipment
CN109005336B (en) Image shooting method and terminal equipment
CN108427873B (en) A biometric identification method and mobile terminal
CN108763317B (en) A kind of method and terminal device for assisting selection of pictures
CN109495616B (en) Photographing method and terminal equipment
CN109388456B (en) Head portrait selection method and mobile terminal
CN111401463B (en) Method for outputting detection result, electronic equipment and medium
CN110096203B (en) A screenshot method and mobile terminal
CN109286726B (en) A content display method and terminal device
CN109544172B (en) Display method and terminal equipment
CN109246351B (en) Composition method and terminal equipment
CN108833791B (en) A shooting method and device
CN108174110B (en) A kind of photographing method and flexible screen terminal
CN110636225B (en) Photographing method and electronic device
CN109246474A (en) A kind of video file edit methods and mobile terminal
CN110007821B (en) Operation method and terminal equipment
CN109462727B (en) Filter adjusting method and mobile terminal
CN109639981B (en) Image shooting method and mobile terminal
CN110851042A (en) Interface display method and electronic equipment
CN108628534B (en) Character display method and mobile terminal
CN108965701B (en) Jitter correction method and terminal equipment

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20200324)