Image display method and electronic equipment
Technical Field
The invention relates to the technical field of terminal equipment, in particular to an image display method and electronic equipment.
Background
At present, when people use a terminal device to take pictures, the captured images often need to be processed by means of a third-party image processing application.
However, there are many third-party image processing applications, and different applications produce different effects when processing an image.
When a user wants to process an image, the different third-party image processing applications need to be opened one by one to obtain images with different retouching results, and a selection is then made among those images. This increases the number of operations the user must perform and degrades the user experience.
Disclosure of Invention
The embodiment of the invention provides an image display method and electronic equipment, aiming to solve the problem in the prior art that processing an image requires complex operations.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides an image display method applied to a terminal device, where the method includes: determining a first image; performing image processing on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; determining a target image of the at least two second images; and displaying the target image in an image preview interface.
In a second aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes: a first determining module, configured to determine a first image; a first generating module, configured to perform image processing on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; a second determining module, configured to determine a target image of the at least two second images; and a display module, configured to display the target image in an image preview interface.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image display method.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program is executed by a processor to implement the steps of the image display method.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; a target image is determined from the at least two second images; and the target image is displayed in an image preview interface. Each target image processed by a different third-party image processing application can thus be displayed directly from the gallery, without the user opening the applications one by one to obtain second images with different retouching results. The user can view and select among the target images with simple operation steps, which improves the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a flowchart illustrating steps of an image display method according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of an image display method according to a second embodiment of the present invention;
FIG. 3 is a block diagram of an electronic device according to a third embodiment of the present invention;
FIG. 4 is a schematic diagram of a hardware structure of a mobile terminal according to a fifth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Example one
Referring to fig. 1, a flowchart illustrating steps of an image display method according to a first embodiment of the present invention is shown.
The image display method provided by the embodiment of the invention comprises the following steps:
step 101: a first image is determined.
A viewing instruction of a user for a first image is received, where the first image may be any image in the gallery, or an image captured by the user.
Step 102: and respectively carrying out image processing on the first image in at least two third-party image processing application programs to generate at least two corresponding second images.
In the terminal device, different third-party image processing applications are installed.
Different third-party image processing applications correspond to different processing modes, that is, different filters. Therefore, the first image is processed with the different filters of the different applications to obtain second images with different effects.
Step 103: a target image of the at least two second images is determined.
For the second images with different effects, a score is calculated for each second image, and the second images with higher scores are determined as the target images.
Alternatively, the second images with different effects are compared against a standard atlas, that is, a standard atlas whose high quality is acknowledged in the industry. Images consistent with the image information of the first image are acquired from the standard atlas, the parameters of those selected images are counted, and the mean value of the parameters is calculated. The Euclidean distance between the parameters of each second image and the mean value is then calculated, and the second images are sorted in order of increasing Euclidean distance to determine the target images; the smaller the Euclidean distance, the greater the similarity between a second image and the standard images.
Alternatively, the second images processed by the different third-party image processing applications are sorted according to the frequency with which the user uses those applications, and the second images whose corresponding frequency is greater than a preset frequency are acquired as target images.
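By way of illustration only, the frequency-based selection described above may be sketched as follows; the application names, usage counts, and the preset frequency are hypothetical, not part of the claimed method:

```python
# Illustrative sketch of frequency-based target-image selection.
# Application names, usage counts, and the preset frequency are hypothetical.
def select_by_frequency(second_images, usage_counts, preset_frequency):
    """second_images maps application name -> processed second image;
    usage_counts maps application name -> how often the user uses it."""
    # Sort applications by usage frequency, most frequently used first.
    ranked = sorted(second_images,
                    key=lambda app: usage_counts.get(app, 0), reverse=True)
    # Keep only images from applications used more than the preset frequency.
    return [second_images[app] for app in ranked
            if usage_counts.get(app, 0) > preset_frequency]

images = {"AppA": "imgA", "AppB": "imgB", "AppC": "imgC"}
counts = {"AppA": 12, "AppB": 3, "AppC": 7}
targets = select_by_frequency(images, counts, preset_frequency=5)
```

With these hypothetical counts, only the images from the applications used more than five times remain, ordered from the most to the least frequently used application.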
Step 104: and displaying the target image in the image preview interface.
The user can view each target image in the preview interface. Each target image may be displayed in the upper part of the preview interface, with icons corresponding to the different third-party image processing applications displayed below it; when the user taps a different icon, the target image produced by the application corresponding to the tapped icon is displayed in the upper part of the preview interface.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; a target image is determined from the at least two second images; and the target image is displayed in an image preview interface. Each target image processed by a different third-party image processing application can thus be displayed directly from the gallery, without the user opening the applications one by one to obtain second images with different retouching results. The user can view and select among the target images with simple operation steps, which improves the user experience.
Example two
Referring to fig. 2, a flowchart illustrating steps of an image display method according to a second embodiment of the present invention is shown.
The image display method provided by the embodiment of the invention comprises the following steps:
step 201: a first image is determined.
A viewing instruction of a user for a first image is received, where the first image may be any image in the gallery, or an image captured by the user.
Step 202: and respectively carrying out image processing on the first image in at least two third-party image processing application programs to generate at least two corresponding second images.
In the terminal device, different third-party image processing applications are installed.
Different third-party image processing applications correspond to different processing modes, that is, different filters. Therefore, the first image is processed with the different filters of the different applications to obtain second images with different effects.
Step 203: a standard set of images is obtained from a server and image information for the first image is identified.
The standard image set is a standard atlas whose very high quality is acknowledged in the industry.
Image information in the first image is identified using image recognition technology, where the image information may be scene information, person information, background information, and the like.
Step 204: and acquiring each third image consistent with the image information from the standard image set.
Because the standard image set contains multiple images with different image information, the image information of the first image is matched against the image information of each image in the standard image set, and each successfully matched image is obtained as a third image.
Step 205: first image parameter values for each third image are determined.
Wherein the first image parameter values include: exposure value, contrast value, and hue value.
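The text does not state how the exposure, contrast, and hue values are computed. As an illustration only, one could approximate them from raw pixels as mean brightness, brightness standard deviation, and mean HSV hue; the formulas below are assumptions, not the claimed method:

```python
import colorsys

def image_parameters(pixels):
    """Approximate (exposure, contrast, hue) for a list of (r, g, b)
    pixels in 0..255. These formulas are illustrative assumptions."""
    # Exposure: mean brightness across all pixels.
    brightness = [(r + g + b) / 3.0 for r, g, b in pixels]
    exposure = sum(brightness) / len(brightness)
    # Contrast: standard deviation of the brightness values.
    contrast = (sum((b - exposure) ** 2 for b in brightness)
                / len(brightness)) ** 0.5
    # Hue: mean HSV hue, scaled to 0..360 degrees.
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0] * 360
            for r, g, b in pixels]
    hue = sum(hues) / len(hues)
    return exposure, contrast, hue

# Two hypothetical pixels: pure red and mid gray.
params = image_parameters([(255, 0, 0), (128, 128, 128)])
```

In practice the pixels would come from the decoded second image; any per-image triple of this kind can serve as the parameter vector used in the distance computation of the following steps.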
Step 206: determining, for each second image, a Euclidean distance value between the second image parameter values of that second image and the first image parameter values.
For each second image, the differences between the second image parameter values of that second image and the first image parameter values determined from the third images are calculated, and the Euclidean distance value is calculated from these differences.
The first image parameter values of each third image are counted, and the mean of the first image parameter values over the third images is determined. For example, when there are three image parameter values, the generated mean vector is q = (q1, q2, q3), where q1, q2, and q3 are the exposure value, the contrast value, and the hue value, respectively.
Similarly, corresponding parameter statistics are performed on the second images processed by the plurality of third-party image processing applications. For example, if there are k third-party image processing applications, there are k second images, and k vectors are generated: k1 = (k11, k12, k13), k2 = (k21, k22, k23), …, kk = (kk1, kk2, kk3). The Euclidean distance between each of these vectors and the vector q is calculated separately.
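By way of illustration only, the distance computation above may be sketched as follows; the numeric parameter vectors are hypothetical, q stands for the mean vector of the matched standard images, and each entry of k_vectors stands for one second image:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length parameter vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# q: hypothetical mean (exposure, contrast, hue) of the matched standard images.
q = (120.0, 40.0, 30.0)
# One hypothetical vector per second image, keyed by the application that made it.
k_vectors = {"AppA": (118.0, 42.0, 28.0), "AppB": (90.0, 60.0, 75.0)}

distances = {app: euclidean(v, q) for app, v in k_vectors.items()}
# Smaller distance means closer to the standard set; rank ascending.
ranking = sorted(distances, key=distances.get)
```

With these hypothetical numbers, "AppA" is ranked first because its parameter vector is closer to q, matching the rule that a smaller Euclidean distance indicates greater similarity to the standard image set.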
Step 207: acquiring each second image whose Euclidean distance value is smaller than a preset Euclidean distance value as a target image.
The second images are sorted in order of increasing Euclidean distance value; the smaller the Euclidean distance, the greater the similarity to the third images in the standard image set. A preset number of the top-ranked second images are acquired as target images.
The determination of each target image from each second image in steps 203 to 207 may also be achieved by:
the first mode is as follows: inputting at least two second images into an image scoring network respectively; obtaining the score of each second image output by the image scoring network; and taking the second image with the score larger than the first preset score as a target image. In this way, optionally, the target images may be displayed in order from the largest value to the smallest value of the score.
It should be noted that, a person skilled in the art may set the first preset score according to actual situations, where the first preset score may be 70 points, 80 points, 90 points, and so on.
Specifically, the method includes the following steps: inputting each second image into the image scoring network; obtaining the score of each second image output by the image scoring network; acquiring the images whose scores are greater than the preset score as fourth images; determining the third-party image processing application corresponding to each fourth image; for each third-party image processing application, acquiring the fourth images produced by that same application as fifth images; and acquiring, from the fifth images, the one with the highest score as a target image.
With this method of determining the target images, the filter effects of the generated target images better match the images of the standard atlas, the user does not need to perform further processing on the images, and the user experience is improved.
The second way may be: determining a first usage frequency and a first weight value of each third-party image processing application program; respectively determining a second use frequency and a second weight value corresponding to different filter parameters in a third-party image processing application program; determining a target score corresponding to a filter parameter in a third-party image processing application program according to the first using frequency, the first weight value, the second using frequency and the second weight value; and taking the second image corresponding to the filter parameter with the target score larger than the second preset score as the target image.
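The text does not give the rule that combines the two frequencies and weights into a target score. One natural reading, sketched here purely as an assumption, is a weighted sum of the application usage frequency and the filter usage frequency; all numbers below are hypothetical:

```python
def target_score(app_frequency, app_weight, filter_frequency, filter_weight):
    """Hypothetical combining rule: weighted sum of the first usage
    frequency (application) and the second usage frequency (filter)."""
    return app_frequency * app_weight + filter_frequency * filter_weight

# Hypothetical data: the application was used 10 times (first weight 0.6)
# and its "vintage" filter 4 times (second weight 0.4).
score = target_score(10, 0.6, 4, 0.4)
second_preset_score = 5.0
is_target = score > second_preset_score
```

Under this assumed rule, a second image is kept as a target image whenever its filter's combined score exceeds the second preset score, so filters of frequently used applications are naturally favored.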
By determining the target images according to the first usage frequency and first weight value of each third-party application, and the second usage frequency and second weight value of the different filters of that application, the filters of the third-party image processing applications the user uses frequently can be favored, so the generated target images better match the user's usage habits.
Step 208: and displaying the target image in the image preview interface.
After step 208, there may optionally be step 209 and step 210.
Step 209: receiving a slide-up touch operation of the user on any target image.
Step 210: deleting the target image on the image preview interface according to the slide-up touch operation.
The user can delete, save, or further modify the displayed target images in the preview interface. When the user slides up on any target image in the preview screen, the preview result of the current third-party image processing application, that is, the current target image, is deleted.
The user can view each target image in the preview interface. Each target image may be displayed in the upper part of the preview interface, with icons corresponding to the different third-party image processing applications displayed below it; when the user taps a different icon, the target image produced by the application corresponding to the tapped icon is displayed in the upper part of the preview interface.
When the user long-presses a third-party image processing application icon, the interface of the current third-party image processing application is entered, and the user can use that application to retouch the target image.
The user can also click a 'save' button in the image preview interface, or press any target image and drag the target image downwards to save the selected target image.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; a target image is determined from the at least two second images; and the target image is displayed in an image preview interface. Each target image processed by a different third-party image processing application can thus be displayed directly from the gallery, without the user opening the applications one by one to obtain second images with different retouching results. The user can view and select among the target images with simple operation steps, which improves the user experience.
Having described the image display method according to the embodiments of the present invention, the electronic device according to the embodiments of the present invention is described below with reference to the accompanying drawings.
EXAMPLE III
Referring to fig. 3, a block diagram of an electronic device according to a third embodiment of the present invention is shown.
The electronic device provided by the embodiment of the invention includes: a first determining module 301, configured to determine a first image; a first generating module 302, configured to perform image processing on the first image in at least two third-party image processing applications, respectively, and generate at least two corresponding second images; a second determining module 303, configured to determine a target image in the at least two second images; and a first display module 304, configured to display the target image in an image preview interface.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; a target image is determined from the at least two second images; and the target image is displayed in an image preview interface. Each target image processed by a different third-party image processing application can thus be displayed directly from the gallery, without the user opening the applications one by one to obtain second images with different retouching results. The user can view and select among the target images with simple operation steps, which improves the user experience.
Example four
The terminal device provided by the embodiment of the invention comprises: a first determining module 401, configured to determine a first image; a first generating module 402, configured to perform image processing on the first image in at least two third-party image processing applications, respectively, and generate at least two corresponding second images; a second determining module 403, configured to determine a target image in the at least two second images; a first display module 404, configured to display the target image in an image preview interface.
The second determining module 403 includes: the input sub-module 4031 is used for inputting each second image into the image scoring network; a first obtaining sub-module 4032, configured to obtain a score of each second image output by the image scoring network; a first determining sub-module 4033, configured to use the second image with the score greater than a first preset score as the target image.
Preferably, the electronic device further includes: a second display module 405, configured to display the target images in descending order of score after the first determination sub-module 4033 takes the second image with the score larger than a first preset score as the target image.
Preferably, the second determining module 403 includes: a second obtaining submodule 4034, configured to obtain a standard image set from a server and identify image information of the first image; a third obtaining sub-module 4035, configured to obtain, from the standard image set, each third image that is consistent with the image information; a second determining submodule 4036, configured to determine first image parameter values of each of the third images, where the first image parameter values include: an exposure value, a contrast value, and a hue value; a third determining submodule 4037, configured to determine, for each second image, Euclidean distance values between the second image parameter values of the second image and the first image parameter values; and a fourth obtaining sub-module 4038, configured to obtain each second image corresponding to a qualifying Euclidean distance value as a target image.
Preferably, the second determining module 403 includes: a fourth determining submodule 4039 for determining, for each of the third-party image processing applications, a first frequency of use and a first weight value for the third-party image processing application; a fifth determining submodule 40310, configured to determine, for different filter parameters in the third-party image processing application, a second usage frequency and a second weight value corresponding to the filter parameter respectively; a sixth determining submodule 40311, configured to determine, according to the first usage frequency, the first weight value, the second usage frequency, and the second weight value, a target score corresponding to the filter parameter in the third-party image processing application; a seventh determining submodule 40312, configured to use the second image corresponding to the filter parameter with the target score being greater than a second preset score as the target image.
Preferably, the terminal device further includes: a receiving module 406, configured to receive a user's slide-up touch operation on any one of the target images after the first display module 404 displays each target image in an image preview interface; and the deleting module 407 is configured to delete the target image on the image preview interface according to the up-sliding touch operation.
The terminal device provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 and fig. 2, and is not described herein again to avoid repetition.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; a target image is determined from the at least two second images; and the target image is displayed in an image preview interface. Each target image processed by a different third-party image processing application can thus be displayed directly from the gallery, without the user opening the applications one by one to obtain second images with different retouching results. The user can view and select among the target images with simple operation steps, which improves the user experience.
EXAMPLE five
Referring to fig. 4, a schematic diagram of a hardware structure of a mobile terminal according to a fifth embodiment of the present invention is shown.
the mobile terminal 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 4 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The radio frequency unit 501 is configured to receive a touch operation of a user on a first image.
A processor 510, configured to determine, according to the touch operation, each third-party image processing application in the terminal device; performing image processing on the first image in each third-party image processing application program to generate each second image; acquiring each target image from each second image according to a preset acquisition rule; and displaying each target image in the image preview interface.
In the embodiment of the invention, a first image is determined; image processing is performed on the first image in at least two third-party image processing applications, respectively, to generate at least two corresponding second images; a target image is determined from the at least two second images; and the target image is displayed in an image preview interface. Each target image processed by a different third-party image processing application can thus be displayed directly from the gallery, without the user opening the applications one by one to obtain second images with different retouching results. The user can view and select among the target images with simple operation steps, which improves the user experience.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 510; in addition, the uplink data is transmitted to the base station. In general, radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 502, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output related to a specific function performed by the mobile terminal 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input unit 504 may include a Graphics Processing Unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sounds and process them into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501 for output.
The mobile terminal 500 also includes at least one sensor 505, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 5061 and/or a backlight when the mobile terminal 500 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. Touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 5071 using a finger, stylus, or any suitable object or attachment). The touch panel 5071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061. When the touch panel 5071 detects a touch operation on or near it, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and the processor 510 then provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 4 the touch panel 5071 and the display panel 5061 are two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
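As a hedged illustration of how the processor 510 might determine "the type of the touch event" before choosing a visual output, a completed touch can be classified by its duration and travel distance. The thresholds below are assumptions for the sketch, not values given in the disclosure:

```python
def classify_touch_event(duration_ms, distance_px,
                         long_press_ms=500, swipe_px=30):
    """Classify a completed touch so the processor can select the
    corresponding visual output. Thresholds are illustrative only."""
    if distance_px >= swipe_px:
        return "swipe"        # significant travel dominates duration
    if duration_ms >= long_press_ms:
        return "long_press"   # held in place past the threshold
    return "tap"              # short, stationary contact
```

The processor would then dispatch on the returned type, e.g. rendering a pressed-button highlight for "tap" or scrolling content for "swipe".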
The interface unit 508 is an interface through which an external device is connected to the mobile terminal 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 500 or may be used to transmit data between the mobile terminal 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile terminal, and the like. Further, the memory 509 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
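The program-area/data-area split described above can be modeled with a toy structure; the class and field names below are illustrative assumptions, not part of the disclosure:

```python
class TerminalMemory:
    """Toy model of the memory 509 partition: a storage program area
    for the operating system and applications, and a storage data
    area for content created during use (audio data, phonebook, etc.)."""

    def __init__(self):
        self.program_area = {"os": None, "apps": {}}  # storage program area
        self.data_area = {}                           # storage data area

    def install_app(self, name, binary):
        # Applications required by specific functions live in the program area.
        self.program_area["apps"][name] = binary

    def store_data(self, key, value):
        # User-created data (e.g. a phonebook) lives in the data area.
        self.data_area[key] = value
```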
The processor 510 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the mobile terminal. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The mobile terminal 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 via a power management system, so that functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the mobile terminal 500 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor 510, a memory 509, and a computer program stored in the memory 509 and capable of running on the processor 510, where the computer program, when executed by the processor 510, implements each process of the above-mentioned image display method embodiment and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, it implements each process of the above image display method embodiment and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.