CN104811469B - Emotion sharing method and device for mobile terminal and mobile terminal thereof - Google Patents
- Publication number
- CN104811469B (application CN201410043862.2A)
- Authority
- CN
- China
- Prior art keywords
- contact
- emotion
- user
- information
- emotional
- Prior art date
- 2014-01-29
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Telephone Function (AREA)
- Information Transfer Between Computers (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An emotion sharing method and device for a mobile terminal, and a mobile terminal thereof, are disclosed. The emotion sharing method includes: receiving emotion information of at least one contact stored in the mobile terminal; and updating, based on the received emotion information of the at least one contact, the emotion tag of each of the at least one contact, wherein the emotion tag of a contact is displayed in an interface containing the contact information.
Description
Technical Field
The present invention relates to the field of mobile terminals and, more particularly, to an emotion sharing method and apparatus that contribute to an improved communication effect, and a mobile terminal including the same.
Background
Existing emotion recognition technologies are typically based on voice (speech), facial expression (camera), social tools, biometric characteristics, big data analytics, and the like. A typical approach to emotion recognition is to judge the user's emotion from voice, i.e., by collecting information during the user's conversations. Alternatively, a person's facial expression can be captured through a camera, and the person's emotion can be analyzed and judged from it. The emotion of a wearer can also be analyzed by monitoring biological data, such as skin electrical signals, using a sensor. In addition, emotion recognition can be implemented through big data analysis, for example, by statistically analyzing usage data such as the time, frequency, and content of a user's internet access to find regularities and thereby recognize the user's emotion. By analyzing the user's mood, undesirable tendencies such as depression and suicidal tendencies can also be discovered.
Timely communication based on emotion sharing is of great significance. For example, as of the summer of 2013, statistics showed that the incidence of depression in China was about 3%-5%, with more than 26 million people currently suffering from depression. Over the past 50 years, the suicide rate in China has increased by 60%: about 287,000 people die by suicide and about 2 million attempt suicide each year, and about 95% of those who attempt suicide suffer from mental disorders. However, the rate at which hospitals at or above the city level identify depression is less than 20%. Under these circumstances, monitoring the emotional states of the people concerned in real time through emotion sharing makes it possible to communicate with them in time, which is beneficial to their emotional adjustment.
Currently, emotion sharing is typically achieved through social software or instant messaging tools. However, this approach has disadvantages. For example, with social tools or instant messengers, emotion sharing relies on active user input: the user needs to log in to the social tool or instant messenger, input text or other content, and then determine and share his or her emotion. In this case, the emotion is not updated in a timely manner. Furthermore, sharing emotions with different groups of contacts often requires using multiple tools, making the whole process rather cumbersome. Moreover, to learn the updated emotions of others through a social tool or instant messaging tool, the user needs to log in to the corresponding tool and check the others' updates, so this way of acquiring others' emotion information is not timely, and updated emotion information is easily missed.
Disclosure of Invention
Therefore, an object of the present invention is to provide an emotion sharing method and apparatus capable of effectively performing emotion sharing in time so that a user can quickly and simply select a communication object, and a mobile terminal including the same.
According to an aspect of the present invention, there is provided an emotion sharing method for a mobile terminal, including: receiving emotion information of at least one contact stored in the mobile terminal; and updating the emotion tag of each of the at least one contact based on the received emotion information of the at least one contact, wherein the emotion tag of a contact is displayed in an interface containing the contact information.
Preferably, the emotion sharing method further includes: determining emotion information of a user of the mobile terminal, and storing the determined emotion information of the user.
Preferably, the emotion sharing method further includes: modifying the stored emotional information of the user in response to a modification request input by the user.
Preferably, the emotion sharing method further includes: transmitting the emotional information of the user to at least one contact selected by the user.
Preferably, the mobile terminal of the at least one contact determines the emotional information of the at least one contact by voice recognition of the user, facial expression recognition of the user, analysis of information contained in social tools, analysis of biological data, or statistical analysis of usage data.
Preferably, the mobile terminal determines emotional information of the user through voice recognition of the user, expression recognition of the user, analysis of information contained in the social tool, analysis of biological data, or statistical analysis of usage data.
Preferably, the emotion sharing method further includes: displaying the emotion information of the user.
Preferably, the step of receiving emotional information of at least one contact stored in a phone book of the mobile terminal comprises: receiving emotional information of at least one contact selected by a user.
Preferably, the emotion tag comprises an icon and/or a textual description indicating an emotional state of the at least one contact.
Preferably, the emotion sharing method further includes: if the emotion information of the at least one contact is not received again within a predetermined time period after receiving the emotion information of the at least one contact, displaying, around the emotion tag, auxiliary information indicating the obtaining time of the emotion information of the at least one contact.
Preferably, the emotion sharing method further includes: outputting an update notification indicating that the emotion tag of the at least one contact has been updated, according to the user's selection.
Preferably, the update notification is output in a pop-up window or in a manner of playing a sound.
Preferably, the emotion sharing method further includes: matching the received emotion information of the at least one contact with the emotion information of the user, and recommending that the user communicate with a matched contact.
Preferably, the emotion sharing method further includes: in response to a selection by the user after viewing the emotion tag of the contact, initiating communication with the selected contact.
According to another aspect of the present invention, there is provided an emotion sharing method for a mobile terminal, including: determining emotion information of a user of the mobile terminal, and storing the determined emotion information; the determined emotion information is transmitted to at least one contact stored in the mobile terminal.
Preferably, the mobile terminal determines emotional information of the user through voice recognition of the user, expression recognition of the user, analysis of information contained in the social tool, analysis of biological data, or statistical analysis of usage data.
Preferably, the emotion sharing method further includes: receiving emotion information of at least one contact stored in the mobile terminal; and updating the emotion tag of each of the at least one contact based on the received emotion information of the at least one contact, wherein the emotion tag of a contact is displayed in an interface containing the contact information.
Preferably, the emotion sharing method further includes: outputting an update notification indicating that the emotion tag of the at least one contact has been updated, according to the user's selection.
Preferably, the emotion sharing method further includes: matching the received emotion information of the at least one contact with the determined emotion information, and recommending that the user communicate with a matched contact.
According to another aspect of the present invention, there is provided an emotion sharing apparatus for a mobile terminal, including: a receiving module for receiving emotion information of at least one contact stored in the mobile terminal; and an updating module for updating the emotion tag of each of the at least one contact based on the received emotion information of the at least one contact, wherein the emotion tag of a contact is displayed in an interface containing the contact information.
Preferably, the emotion sharing apparatus further includes: a determining and storing module for determining emotion information of the user of the mobile terminal and storing the determined emotion information of the user.
Preferably, the emotion sharing apparatus further includes: a modification module for modifying the stored emotion information of the user in response to a modification request input by the user.
Preferably, the emotion sharing apparatus further includes: a transmission control module for controlling transmission of the emotion information of the user to at least one contact selected by the user.
Preferably, the mobile terminal of the at least one contact determines the emotional information of the at least one contact by voice recognition of the user, facial expression recognition of the user, analysis of information contained in social tools, analysis of biological data, or statistical analysis of usage data.
Preferably, the mobile terminal determines emotional information of the user through voice recognition of the user, expression recognition of the user, analysis of information contained in the social tool, analysis of biological data, or statistical analysis of usage data.
Preferably, the emotion sharing apparatus further includes: a display control module for controlling display of the emotion information of the user.
Preferably, the receiving module receives emotion information of at least one contact selected by the user.
Preferably, the emotion tag comprises an icon and/or a textual description indicating an emotional state of the at least one contact.
Preferably, the emotion sharing apparatus further includes: a display control module for controlling display of auxiliary information indicating the obtaining time of the emotion information of the at least one contact if the emotion information of the at least one contact is not received again within a predetermined time period after receiving the emotion information of the at least one contact.
Preferably, the emotion sharing apparatus further includes: a notification output module that outputs an update notification indicating that the emotion tag of the at least one contact has been updated, according to a selection of a user.
Preferably, the notification output module controls output of the update notification in a pop-up window or by playing a sound.
Preferably, the emotion sharing apparatus further includes: a matching and recommending module for matching the received emotion information of the at least one contact with the emotion information of the user and recommending that the user communicate with a matched contact.
According to another aspect of the present invention, there is provided an emotion sharing apparatus for a mobile terminal, including: a determining and storing module for determining emotion information of the user of the mobile terminal and storing the determined emotion information of the user; and a transmission control module for controlling transmission of the emotion information of the user to at least one contact stored in the mobile terminal.
Preferably, the mobile terminal determines emotional information of the user through voice recognition of the user, expression recognition of the user, analysis of information contained in the social tool, analysis of biological data, or statistical analysis of usage data.
Preferably, the emotion sharing apparatus further includes: a receiving module for receiving emotion information of at least one contact stored in the mobile terminal; and an updating module for updating the emotion tag of each of the at least one contact based on the received emotion information of the at least one contact, wherein the emotion tag of a contact is displayed in an interface containing the contact information.
Preferably, the emotion sharing apparatus further includes: a notification output module that outputs an update notification indicating that the emotion tag of the at least one contact has been updated, according to a selection of a user.
Preferably, the emotion sharing apparatus further includes: a matching and recommending module for matching the received emotion information of the at least one contact with the emotion information of the user and recommending that the user communicate with a matched contact.
According to another aspect of the present invention, there is provided a mobile terminal including the emotion sharing apparatus as described above.
According to the invention, emotion information can be automatically shared and updated, so that the user of the mobile terminal can quickly and simply select a communication object, which reduces the operational complexity of the mobile terminal and improves its operating efficiency.
Drawings
These and/or other aspects and advantages of the invention will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a block diagram illustrating a mobile terminal according to an exemplary embodiment of the present invention;
fig. 2 is a diagram showing an example of an "emotion sharing" setting interface for an emotion sharing apparatus according to an exemplary embodiment of the present invention;
fig. 3 is a flowchart illustrating an emotion sharing method according to an exemplary embodiment of the present invention;
fig. 4A is a diagram showing emotion tags displayed in a contact list interface;
fig. 4B is a diagram showing emotion tags displayed in a contact details interface;
fig. 4C is a diagram showing emotion tags displayed in a call interface;
fig. 4D is a diagram showing emotion tags displayed in a call record interface;
fig. 4E is a diagram showing emotion tags displayed in a short message interface;
fig. 4F is a diagram showing emotion tags displayed in a ChatOn application interface;
fig. 4G is a diagram showing emotion tags displayed in a WeChat application interface;
fig. 5 is a diagram illustrating another example of an "emotion sharing" setting interface for an emotion sharing apparatus according to an exemplary embodiment of the present invention;
fig. 6 is a diagram showing an example of outputting an update notification in the form of a pop-up window;
FIG. 7 is a diagram showing an example of recommending communications;
fig. 8 is a block diagram illustrating an emotion sharing apparatus according to an exemplary embodiment of the present invention;
fig. 9 is a block diagram illustrating an XMPP server according to an exemplary embodiment of the present invention.
Detailed Description
The present invention is described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular is intended to include the plural unless the context clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Here, the mobile terminal means a portable phone, a smart phone, or the like capable of performing a call function while on the move.
Hereinafter, the present invention will be explained in detail with reference to the accompanying drawings.
Fig. 1 is a block diagram illustrating a mobile terminal according to an exemplary embodiment of the present invention.
Referring to fig. 1, the mobile terminal 100 includes a Central Processing Unit (CPU) 110, an input unit 120, a camera unit 130, a broadcast receiving unit 140, a memory 150, a wireless communication unit 160, a speaker 171, a microphone 172, a display unit 180, and an emotion sharing device 190.
The CPU110 may control the overall operation of the mobile terminal and may include a decoder (not shown) for decoding audio and video streams. The CPU110 may process and control communications (e.g., voice communications and data communications), decode and output audio streams and video streams, and may perform various control functions in response to a request of the emotion sharing apparatus 190. In addition, the CPU110 may control the display unit 180 to display various user interfaces, for example, an input interface including a virtual keyboard.
In the case where the display unit 180 is implemented as a touch screen, the input unit 120 may have only a small number of keys, for example, a power key for turning the mobile terminal on and off. The display unit 180 may also be implemented as other types of screens, for example, a non-touch screen, a flexible screen, a foldable screen, and the like.
The camera unit 130 is an optional unit and may include a lens, an imaging sensor such as a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) sensor, an analog-to-digital (a/D) converter, and the like. The camera unit 130 may acquire an image signal by capturing an image of a subject, may convert the acquired image signal into a digital signal, and may transmit the digital signal to the CPU 110. The CPU110 may perform image processing (such as noise removal and color processing) on the digital signal provided by the camera unit 130. The CPU110 may display an image of the object captured by the camera unit 130 on the display unit 180.
The broadcast receiving unit 140 may receive broadcast programs through a broadcast channel such as a terrestrial broadcast channel, a satellite broadcast channel, or a bidirectional channel such as the internet. In the case where the broadcast receiving unit 140 receives a broadcast program through a broadcast channel, the broadcast receiving unit 140 may include: a tuner (not shown) for tuning and demodulating an input broadcast signal to output a transport stream; and a demultiplexer (not shown) for demultiplexing the transport stream output by the tuner into a video stream and an audio stream.
The memory 150 may store a program for controlling the overall operation of the mobile terminal 100. For example, memory 150 may store various configuration programs used by CPU110, as well as local application programs and the like. In addition, the memory 150 temporarily stores data generated during operation and other data. For example, the memory 150 may store video files, audio files, text, and the like.
The wireless communication unit 160 may perform wireless communication with other mobile terminals or various servers. For example, after the mobile terminal 100 performs user authentication using the user identification information stored in the memory 150, it accesses a wireless network so that the wireless communication unit 160 can perform wireless communication with other mobile terminals via the mobile communication network. The wireless communication unit 160 may be configured to support a variety of communication protocols, such as Bluetooth, Zigbee, WLAN, Wi-Fi, HomeRF, UWB, wireless 1394, or a combination thereof. A non-portable electronic device, on the other hand, may include a wired communication unit such as a network card, and an electronic apparatus that does not need to communicate may omit the unit.
The speaker 171 plays back audio data transmitted and received during a call, audio data contained in received messages, audio data from playback of audio files stored in the memory 150, and the like. The microphone 172 may receive an audio signal from outside the mobile terminal 100 and may transmit the received audio signal to the CPU 110.
The display unit 180 may display various user interfaces. In the case where the display unit 180 is implemented as a touch screen, the touch screen may sense a user's touch through a touch sensor disposed thereunder and transmit a touch signal generated due to the user's touch to the CPU110, so that the CPU110 performs a corresponding operation. For example, the touch screen may display a virtual keyboard, which may include virtual number keys and virtual function keys as objects. When the user clicks a specific object, the touch screen transmits a corresponding touch signal to the CPU 110.
The emotion sharing device 190 may be a separate hardware component or a software component running in the mobile terminal. First, the operating system of the mobile terminal may add to the setting menu an "emotion sharing" setting interface that includes a "turn on emotion sharing function" option. When the user selects the "turn on emotion sharing function" option, the emotion sharing apparatus 190 may receive, from other mobile terminals, emotion information related to contacts stored in the mobile terminal and update the emotion tags of those contacts based on the received emotion information. The contacts stored in the mobile terminal may be contacts in a phone book or contacts in an instant messenger (e.g., ChatOn, Facebook, QQ, WeChat, etc.). According to an exemplary embodiment of the present invention, the emotion sharing apparatus 190 may control the emotion tag to be displayed in an interface containing the contact information, for example, around the contact information, and the user may select the interfaces on which a contact's emotion tag is displayed: for example, on every interface containing the contact information, or only on specific interfaces containing the contact information. In addition, the "emotion sharing" setting interface includes an "emotion recognition mode selection" option. For example, the emotion sharing device 190 may determine the emotion information of the user through voice recognition of the user, facial expression recognition of the user, analysis of information contained in social tools (ChatOn, Facebook, QQ, WeChat, etc.), analysis of biological data, or statistical analysis of usage data, and the user of the mobile terminal may select the manner used to determine his or her emotion information. Fig. 2 is a diagram illustrating an example of an "emotion sharing" setting interface for the emotion sharing apparatus 190 according to an exemplary embodiment of the present invention. The emotion sharing device 190 may also receive emotion information actively input by the user.
After the emotion sharing device 190 updates the emotion tags of contacts, the user may view them either actively or in response to an update notification. If, after viewing the emotion tags, the user decides that communication with one of the contacts is needed, the emotion sharing device 190 initiates communication with the selected contact in response to the user's selection. The updated emotion tags of contacts thus provide a reference for the user's communication.
The operation and structure of the emotion sharing device 190 are described in detail below with reference to fig. 3 to 9.
Fig. 3 is a flowchart illustrating an emotion sharing method according to an exemplary embodiment of the present invention. According to an exemplary embodiment of the present invention, an emotion sharing method includes the steps of: receiving emotion information of at least one contact stored in the mobile terminal; and updating the emotion tag of the at least one contact based on the received emotion information of the at least one contact, wherein the emotion tag is displayed around the contact information. According to another exemplary embodiment of the present invention, an emotion sharing method includes the steps of: determining emotion information of a user of the mobile terminal and storing the determined emotion information; and transmitting the determined emotion information to at least one contact stored in the mobile terminal.
Referring to fig. 3, after the user of the mobile terminal selects the "turn on emotion sharing function" option from the setting menu, in step S301, emotion information of the user of the mobile terminal is determined and stored. According to an exemplary embodiment of the present invention, the determined emotion information of the user may be stored in the memory 150 or in a memory provided inside the emotion sharing apparatus 190. Here, according to the user's selection in the setting menu, the emotion information of the user may be determined through voice recognition of the user, facial expression recognition of the user, analysis of information contained in a social tool, analysis of biological data, or statistical analysis of usage data. For example, the contents of the user's calls, recordings, voice messages, and the like can be recognized to extract the user's emotion information. Similarly, the user's emotion information may be analyzed by performing expression recognition during video calls, photographing (i.e., self-photographing, which requires recognizing that the photographer is the phone's user or asking the user to confirm), eyeball recognition, or the like. The user's emotion information can also be determined by analyzing the information the user inputs into the social tools the user uses (e.g., ChatOn, WeChat, Facebook, etc.). Further, the user's emotion information can be analyzed from skin electrical signals using a device such as a wristband sensor. Finally, the user's emotion information can be recognized by statistically analyzing usage data, such as the time, frequency, and content of the user's internet access from the mobile terminal, to find regularities. In each case, whenever emotion information different from the previously stored emotion information is obtained, the emotion information is immediately updated and the updated emotion information is stored.
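The update-on-change behavior described above can be sketched as follows; this is illustrative only, and the class and method names are assumptions rather than anything prescribed by the patent:

```java
// Minimal sketch of step S301's update-on-change logic: whichever recognition
// channel (voice, expression, sensor, usage analysis) produces a new emotion,
// the stored value is replaced and persisted only when it actually differs.
public class EmotionStore {
    private String storedEmotion; // last persisted emotion category

    /** Called by any recognizer with the most recently recognized emotion. */
    public synchronized boolean onEmotionRecognized(String recognized) {
        if (recognized == null || recognized.equals(storedEmotion)) {
            return false;              // unchanged: nothing to update or share
        }
        storedEmotion = recognized;    // update immediately, as in step S301
        persist(recognized);           // e.g. write to memory 150
        return true;                   // caller may now trigger step S302 (sharing)
    }

    private void persist(String emotion) {
        // Placeholder: the concrete storage (memory 150 or a memory inside the
        // emotion sharing apparatus 190) is an implementation detail.
    }
}
```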
Alternatively, the user of the mobile terminal may modify his/her own emotion information through a modification interface provided by the emotion sharing apparatus 190, and the modified emotion information may be stored. In addition, the display unit 180 may be controlled to display the determined emotion information of the user or the modified emotion information of the user.
Next, in step S302, the emotion information of the user may be transmitted to at least one contact stored in the mobile terminal. For example, the emotion information of the user may be transmitted to a contact, several contacts, a contact group, or several contact groups selected by the user from the phone book or an instant messenger of the mobile terminal. According to an embodiment of the present invention, the wireless communication unit 160 may be controlled to transmit the emotion information. Alternatively, the emotion information of the user may be transmitted to the at least one contact selected by the user through an intermediate server. For example, the emotion information may be communicated via server-client communication supported by the XMPP protocol. An XMPP server capable of providing a cloud service may store basic information of the user (e.g., whether the "turn on emotion sharing function" option has been selected) and the contacts the user has set to share emotion information with. The XMPP server can send an XMPP message containing the user's emotion information to the contacts set by the user to share emotion information, notifying those contacts of the user's emotion change. The XMPP server will be described in detail later.
In step S303, the mobile terminal receives emotion information of at least one contact stored in the mobile terminal. The at least one contact may be pre-selected by the user. As above, the emotion information may be received through the wireless communication unit 160, relayed by the XMPP server. The mobile terminal of the at least one contact determines that contact's emotion information through voice recognition of the contact (i.e., the user of that mobile terminal), facial expression recognition of the contact, analysis of information contained in social tools, analysis of biological data, or statistical analysis of usage data.
Subsequently, in step S304, the mobile terminal updates the emotion tag of the at least one contact based on the received emotion information of the at least one contact. Here, the emotion tag may be displayed around the contact information. The emotion information may be classified into the four basic emotion categories of the international standard, i.e., happiness, anger, fear, and sadness. Alternatively, to increase the number of emotion categories and facilitate mutual understanding between contacts, the emotion information may be further classified, for example, into the seven emotions of traditional Chinese culture: joy, anger, worry, pensiveness, sadness, fear, and shock. However, the present invention is not limited thereto, and the emotion information may be classified into other sets of categories.
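As a rough illustration of this classification, the categories could be encoded as a simple enumeration; the identifiers below are assumptions, since the patent does not prescribe concrete names:

```java
// Illustrative encoding of the emotion categories described above.
public enum EmotionCategory {
    // Seven-emotion scheme: joy, anger, worry, pensiveness, sadness, fear, shock
    JOY, ANGER, WORRY, PENSIVENESS, SADNESS, FEAR, SHOCK;

    // The four-category international scheme (happiness, anger, fear, sadness)
    // maps onto the subset JOY, ANGER, FEAR, SADNESS.
}
```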
According to an exemplary embodiment of the present invention, the emotion tag may be displayed in graphical form. The emotion tag may include an icon and/or a textual description indicating the emotional state of the contact. The emotion tag may be displayed in an interface containing the contact information, for example, on the right side of the contact information. However, the present invention is not limited thereto: the emotion tag may be displayed at various positions around the contact information, or in a separately provided area (an emotion tag display area) in the interface containing the contact information. Specifically, fig. 4A shows emotion tags displayed in the contact list interface, fig. 4B in the contact details interface, fig. 4C in the call interface, fig. 4D in the call record interface, fig. 4E in the short message interface, fig. 4F in the ChatOn application interface, and fig. 4G in the WeChat application interface. In particular, auxiliary information is also displayed in the contact details interface and the short message interface. If emotion information is not obtained again from a contact within a predetermined time (e.g., without limitation, one hour) after the contact's emotion information was obtained, auxiliary information indicating the obtaining time of that emotion information (e.g., how long ago it was obtained, or the specific time it was obtained) may be displayed. The auxiliary information may be displayed around the emotion tag, for example, on its right side. By displaying the emotion tag in an interface containing the contact information, a user who intends to communicate with a contact first sees the contact's emotion tag and learns the contact's latest emotion information, which can improve the communication effect and avoid raising inappropriate topics.
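The staleness check behind this auxiliary information can be sketched in a few lines; the one-hour threshold matches the example value in the text, while the class and method names are illustrative:

```java
// Sketch of the staleness check: the obtaining-time hint is shown only if no
// fresh emotion information has arrived within the predetermined period.
public final class AuxiliaryInfoPolicy {
    private static final long PERIOD_MILLIS = 60 * 60 * 1000L; // one hour (example)

    public static boolean shouldShowObtainingTime(long obtainedAtMillis, long nowMillis) {
        return nowMillis - obtainedAtMillis > PERIOD_MILLIS;
    }
}
```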
In step S305, if the user has selected in the setting menu to output emotion notifications, an update notification indicating that the emotion tag of at least one contact has been updated may be output when the emotion information of the at least one contact is received and the emotion tag is updated. According to an exemplary embodiment of the present invention, the "emotion sharing" setting interface further includes a "receive emotion notification" option. Fig. 5 is a diagram illustrating another example of an "emotion sharing" setting interface for the emotion sharing apparatus 190 according to an exemplary embodiment of the present invention. If the user selects the "receive emotion notification" option, an emotion information update notification may be output upon receiving emotion information. The contacts for which emotion notifications are output may be further selected; in other words, the user may choose to output emotion information update notifications only for particular contacts. The "emotion sharing" setting interface also includes an "emotion notification manner" option, through which the user can determine how the update notification is output. For example, the display unit 180 and/or the speaker 171 may be controlled to output the update notification; that is, the update notification may be output in a pop-up window or by playing a sound according to the user's selection, and the type of sound played may be chosen by the user. When the update notification is output in a pop-up window, it may include the contact information and emotion tag, as well as any necessary textual description, so that users can pay attention to each other in time. The duration of the notification may also be set; for example, the pop-up window disappears after five seconds, or remains until the user confirms it. Fig. 6 is a diagram showing an example of outputting an update notification in the form of a pop-up window.
Alternatively, in step S306, the mobile terminal may match the received emotion information of at least one contact against the emotion information of the user and recommend that the user communicate with a matched contact. Here, the display unit 180 may be controlled to output a recommendation notification. For example, if the user's mood matches a contact's mood, say both are very happy or both are feeling low, a notification is sent to the user suggesting communication with the matched contact. Fig. 7 is a diagram showing an example of recommending communication. In fig. 7, the user and the contact "friend B" are both very happy, so the user is recommended to communicate with "friend B". Communication may then be initiated with the matched contact in response to the user selecting the recommended communication.
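The matching rule itself is left open by the text; a minimal sketch, assuming "matching" simply means equal emotion categories (e.g., both happy, or both feeling low), might look like this, with all names being illustrative:

```java
// Sketch of step S306: select the contacts whose current emotion matches
// the user's, so that communication with them can be recommended.
import java.util.List;
import java.util.stream.Collectors;

public class MatchingAndRecommending {
    public record Contact(String name, String emotion) {}

    /** Returns the contacts whose emotion equals the user's (assumed rule). */
    public List<Contact> match(String userEmotion, List<Contact> contacts) {
        return contacts.stream()
                .filter(c -> c.emotion().equalsIgnoreCase(userEmotion))
                .collect(Collectors.toList());
    }
}
```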
In the flow corresponding to fig. 3, steps S301 and S302 by themselves constitute the process of sharing emotion information, while steps S303 to S306 by themselves constitute the process of receiving emotion information and updating emotion tags. If the mobile terminal serves only as an emotion information sharing terminal, only steps S301 and S302 are executed; if it serves only as an emotion information receiving terminal, only steps S303 to S306 are executed; if it serves as both, both processes are executed, and there is no fixed order between them.
Fig. 8 is a block diagram illustrating an emotion sharing apparatus according to an exemplary embodiment of the present invention. According to an exemplary embodiment of the present invention, an emotion sharing apparatus includes: a receiving module for receiving emotion information of at least one contact stored in the mobile terminal; and an updating module for updating the emotion tag of each of the at least one contact based on the received emotion information of the at least one contact, wherein the emotion tag of a contact is displayed in an interface containing the contact information. According to another exemplary embodiment of the present invention, an emotion sharing apparatus includes: a determining and storing module for determining emotion information of the user of the mobile terminal and storing the determined emotion information of the user; and a transmission control module for controlling transmission of the emotion information of the user to at least one contact stored in the mobile terminal.
Referring to fig. 8, the emotion sharing apparatus may include a receiving module 801, an updating module 802, a determining and storing module 803, a modification module 804, a transmission control module 805, a display control module 806, a notification output module 807, and a matching and recommending module 808. When the emotion sharing apparatus serves as both an emotion information sharing terminal and an emotion information receiving terminal, it may include the receiving module 801, the updating module 802, the determining and storing module 803, and the transmission control module 805, and may optionally include the modification module 804, the display control module 806, the notification output module 807, and the matching and recommending module 808. When the apparatus serves only as an emotion information sharing terminal, it may include the determining and storing module 803 and the transmission control module 805, and may optionally include the modification module 804, the display control module 806, and the matching and recommending module 808. When the apparatus serves only as an emotion information receiving terminal, it may include the receiving module 801 and the updating module 802, and may optionally include the notification output module 807 and the matching and recommending module 808.
As described above, after the user of the mobile terminal selects the "turn on emotion sharing function" option from the setting menu, the determining and storing module 803 determines emotion information of the user of the mobile terminal and stores the determined emotion information. The modification module 804 modifies the stored emotion information of the user in response to a modification request input by the user. The transmission control module 805 controls transmission of the emotion information of the user to at least one contact selected by the user and stored in the mobile terminal. The display control module 806 controls display of the emotion information of the user.
The receiving module 801 receives emotion information of at least one contact stored in the mobile terminal, and the updating module 802 updates the emotion tag of the at least one contact based on the received emotion information, the at least one contact being selected by the user. As described above, the emotion tag includes an icon and/or a textual description indicating the emotional state of the at least one contact and may be displayed around the contact information. The mobile terminal of a contact determines that contact's emotion information through voice recognition of the contact, facial expression recognition of the contact, analysis of information contained in social tools, analysis of biological data, or statistical analysis of usage data; the user's mobile terminal determines the user's emotion information in the same ways.
The display control module 806 may control display of the emotion tags in an interface containing contact information. The emotion tag may be displayed on the right side of the contact information in the interface or in a separately provided area in the interface. When the emotion tag is displayed, if the emotion information of at least one contact is not received again within a predetermined time period after it was received, the display control module 806 may control display of auxiliary information indicating the obtaining time of that emotion information. The auxiliary information may be displayed around the emotion tag, for example, on its right side. The notification output module 807 may output an update notification indicating that a contact's emotion tag has been updated, in a pop-up window or by playing a sound according to the user's selection. The matching and recommending module 808 may match the received emotion information of contacts against the emotion information of the user and recommend that the user communicate with a matched contact.
Fig. 9 is a block diagram illustrating an XMPP server according to an exemplary embodiment of the present invention.
Referring to fig. 9, the XMPP server communicates with a plurality of mobile terminals through a data network or a wireless network. The XMPP server can operate as a cloud service so that mobile terminals can conveniently access it from any area. The XMPP server may include a user relationship database 901, a message listening and parsing engine 902, a logical processing engine 903, and a notification sending engine 904.
The user relationship database 901 may store users and their relationship information. For example, after the user of the mobile terminal selects the "open emotion sharing function" option, when the user selects one or more contacts that can share emotion, the mobile terminal may transmit information of itself and information of the one or more contacts (e.g., a phone number of the mobile terminal and a phone number of the contact) to the XMPP server, and the information will be stored in the user relationship database 901. In this way, when the user wishes to share the emotion information, the mobile terminal only needs to transmit the information of itself and the emotion information to the XMPP server, and does not need to transmit the information of the contact person, so that the speed of sharing the emotion information can be significantly increased.
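A minimal sketch of the lookup this database supports follows, assuming a simple mapping from a sharer's number to the numbers entitled to receive updates; the class and field names are illustrative, not taken from the patent:

```java
// Sketch of the user relationship database 901: who should be notified
// when a given user's emotion changes.
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class UserRelationshipDatabase {
    private final Map<String, Set<String>> sharesWith = new ConcurrentHashMap<>();

    /** Registered once by the terminal, so later updates need not list contacts. */
    public void register(String user, Set<String> contacts) {
        sharesWith.put(user, Set.copyOf(contacts));
    }

    /** Recipients to notify when this user's emotion changes. */
    public Set<String> recipientsOf(String user) {
        return sharesWith.getOrDefault(user, Set.of());
    }
}
```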
The message listening and parsing engine 902 listens for and parses the various messages transmitted by mobile terminals; the logical processing engine 903 extracts the mobile terminal's information and emotion information from the parsed messages; and the notification sending engine 904 transmits the emotion information to the mobile terminals of the corresponding contacts.
The messages received and sent by the XMPP server are XMPP messages. For example, suppose the user "+8613901088801" has selected the "turn on emotion sharing function" option and is set to share his own emotion with contacts "+8613901088802" and "+8613901088803".
A sample message that updates the user's emotion (i.e., the one message sent by the mobile terminal to the server) is as follows:
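The patent's original XML sample is not reproduced in this text; as a stand-in, a hypothetical XMPP stanza for such an update might look like the following, where the domain, namespace, and element names are invented for illustration:

```xml
<!-- Hypothetical sketch only: domain, namespace, and element names are
     invented; the patent's actual sample is not reproduced in this text. -->
<message from="+8613901088801@emotion.example.com"
         to="emotionshare@emotion.example.com"
         type="chat">
  <emotion xmlns="urn:example:emotion-sharing">
    <state>joy</state>
    <timestamp>2014-01-29T10:15:00+08:00</timestamp>
  </emotion>
</message>
```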
the message sample for emotion notification (i.e., the server sends two pieces of information to both mobile terminals) is as follows:
several scenarios to which the emotion sharing method and apparatus according to the exemplary embodiments of the present invention are applied are described below.
Scene one:
user a has made a mistake, has made a trouble or loss with friend B, and wants to regress by phone and friend. According to the exemplary embodiment of the invention, when the user a turns over the phone book or the call record wants to communicate with the friend B, the user a will see the emotion tag of the friend B first, so that the user a can select the friend B to either ignore the mood or regress the timing of the happiness, thereby allowing the other party to easily achieve understanding and improving the relationship between the two parties. Meanwhile, the situation that when B emotions are not good, communication is not smooth, and a gap is generated is avoided.
Scene two:
user C is very busy and often on business. According to the exemplary embodiment of the present invention, the user C can choose to receive the emotion information of the girlfriend D, so that the emotion change of the girlfriend can be easily known, and when the emotion of the girlfriend is bad, the user C can choose to communicate with the girlfriend in time through a telephone or other ways to enhance the emotion, thereby avoiding the psychological fall of the girlfriend being cold.
Scene three:
User E has been under great pressure recently and is emotionally unstable, often having thoughts like "I can't go on much longer", "it's too suffocating", and "life really has no meaning". One evening, user E receives calls from three best friends in succession; each shares a happy experience with her, encourages or comforts her, and invites her out for the weekend. All of this greatly lifts E's mood. In fact, E's friends were able to learn her mental state by receiving the emotion information shared from her mobile terminal, and could therefore communicate with her in time.
According to the embodiments of the invention, emotion information can be automatically shared and updated, so that the user of the mobile terminal can quickly and simply select a communication object, which reduces the operational complexity of the mobile terminal and improves its operating efficiency.
The emotion sharing apparatus according to the exemplary embodiments of the present invention described above may be implemented in hardware or firmware, or as software or computer code, or a combination thereof. The software or computer code may be stored in a non-transitory recording medium (read-only memory (ROM), random access memory (RAM), compact disc (CD)-ROM, magnetic tape, floppy disk, optical data storage device, or a carrier wave such as data transmission through the internet), or may be computer code that is initially stored on a remote recording medium, a computer-readable recording medium, or a non-transitory machine-readable medium, downloaded over a network, and then stored on a local recording medium. The methods described herein may thus be implemented using a general-purpose computer, a digital computer, or a special-purpose processor, with such software, computer code, software modules, software objects, instructions, applications, applets, apps, etc. stored on a recording medium, or in programmable or special-purpose hardware such as an ASIC or FPGA. As understood in the art, the computer, processor, microprocessor controller, or programmable hardware includes volatile and/or non-volatile memory and storage components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code which, when accessed and executed by the computer, processor, or hardware, implements the processing methods described herein. Further, when a general-purpose computer accesses code for implementing the processes shown herein, execution of the code transforms the general-purpose computer into a special-purpose computer for performing those processes. The program may be transferred electronically via any medium (e.g., communication signals transmitted over a wired/wireless connection and equivalents thereof). The program and the computer-readable recording medium may also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
Although a few embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.
Claims (28)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410043862.2A CN104811469B (en) | 2014-01-29 | 2014-01-29 | Emotion sharing method and device for mobile terminal and mobile terminal thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410043862.2A CN104811469B (en) | 2014-01-29 | 2014-01-29 | Emotion sharing method and device for mobile terminal and mobile terminal thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104811469A CN104811469A (en) | 2015-07-29 |
CN104811469B true CN104811469B (en) | 2021-06-04 |
Family
ID=53695958
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410043862.2A Active CN104811469B (en) | 2014-01-29 | 2014-01-29 | Emotion sharing method and device for mobile terminal and mobile terminal thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104811469B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104427078A (en) * | 2013-09-06 | 2015-03-18 | Lenovo (Beijing) Co., Ltd. | Information output method and electronic device |
CN105159979A (en) * | 2015-08-27 | 2015-12-16 | Guangdong Genius Technology Co., Ltd. | Friend recommendation method and device |
CN105512945B (en) * | 2015-12-24 | 2020-01-10 | Xiaomi Technology Co., Ltd. | Social network interaction information processing method and device |
US10762429B2 (en) * | 2016-05-18 | 2020-09-01 | Microsoft Technology Licensing, LLC | Emotional/cognitive state presentation |
EP3549002A4 (en) * | 2016-11-30 | 2020-07-15 | Microsoft Technology Licensing, LLC | Sentiment-based interaction method and apparatus |
KR102387400B1 (ko) * | 2017-08-08 | 2022-04-15 | LINE Corporation | Method and system for recognizing emotions during a call and utilizing the recognized emotions |
CN109194807A (en) * | 2018-10-16 | 2019-01-11 | Gree Electric Appliances Inc. of Zhuhai | Telephone number management method and device and terminal equipment |
CN110134577A (en) * | 2019-04-30 | 2019-08-16 | Shanghai Zhangmen Technology Co., Ltd. | Method and apparatus for displaying user emotion |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101523935A (en) * | 2006-09-29 | 2009-09-02 | TI Square Technology Co., Ltd. | Method and apparatus for transmitting the result of voice analysis |
CN102523502A (en) * | 2011-12-15 | 2012-06-27 | 四川长虹电器股份有限公司 | Intelligent television interaction system and interaction method |
US20130144937A1 (en) * | 2011-12-02 | 2013-06-06 | Samsung Electronics Co., Ltd. | Apparatus and method for sharing user's emotion |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101917512A (en) * | 2010-07-26 | 2010-12-15 | Yulong Computer Telecommunication Technologies (Shenzhen) Co., Ltd. | Method and system for displaying a head portrait of a contact, and mobile terminal |
CN102790732B (en) * | 2012-07-18 | 2015-10-21 | Shanghai Liangming Technology Development Co., Ltd. | Method, client, and system for matching states in instant messaging |
CN102780651A (en) * | 2012-07-21 | 2012-11-14 | Shanghai Liangming Technology Development Co., Ltd. | Method, client, and system for inserting emotion data into instant messaging messages |
Also Published As
Publication number | Publication date |
---|---|
CN104811469A (en) | 2015-07-29 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
EXSB | Decision made by sipo to initiate substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |