WO2024075150A1 - Information processing device, information processing method, program, and recording medium - Google Patents
- Publication number: WO2024075150A1 (PCT/JP2022/036944)
- Authority: WIPO (PCT)
- Prior art keywords
- content
- user
- user terminal
- information
- reaction
- Prior art date
- Legal status: Ceased
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/258—Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
Definitions
- This disclosure relates to an information processing device, information processing method, program, and recording medium that acquire content from a provider terminal via a network and provide the acquired content to a user terminal via the network.
- Patent Document 1 describes an example of an information processing device that provides content provided by an information provider to a user terminal used by a user via a network.
- the information processing device described in Patent Document 1 is a server device.
- This server device is characterized by having an acquisition means for acquiring input information including at least one of information regarding tips given to players and impressions from spectators (users) who watched the player's match (content), an evaluation means for evaluating the input information acquired by the acquisition means for each player, and a provision means for providing the evaluation information evaluated by the evaluation means to the outside.
- the acquisition means described in Patent Document 1 includes a display control means for displaying an input screen for inputting input information on a device owned by a spectator, and a receiving means for receiving the input information input via the input screen. Furthermore, it is described that the input screen includes a button for inputting tips, and a button for continuously inputting impression points that indicate the impression the spectator has on a player. Specifically, spectators watch the game at the venue or via a television broadcast, and depending on the degree of impression they feel as a result of watching the game, can access the site server via a personal computer (user terminal) or mobile terminal (user terminal) and input the information via a browser screen.
- the inventors of this application recognized the problem that, in the server device described in Patent Document 1, users operate their user terminals to input their own impression points, and therefore the impression points acquired by the server device are determined subjectively by the users.
- in addition to the invention of Patent Document 1, it has been difficult to guarantee the authenticity of reviews of various products and services using only text and images.
- the purpose of this disclosure is to provide an information processing device, information processing method, program, and recording medium that can prevent the reaction information acquired from a user terminal from including the user's subjective opinion.
- the present disclosure relates to an information processing device having a content acquisition unit that acquires content from a provider terminal via a network, and a content provision unit that provides the content acquired by the content acquisition unit to a user terminal via the network, the information processing device having a reaction recognition unit that acquires reaction information of the user based on at least one of biometric authentication information of the user when the user uses the content on the user terminal or video information capturing the facial expression of the user, and a processing execution unit that performs a predetermined process that associates the content with the user terminal based on the reaction information acquired by the reaction recognition unit.
- FIG. 1 is a block diagram showing a configuration of an information processing system including an information processing device and a recording medium.
- FIG. 2 is a flowchart illustrating an example of an information processing method performed in the information processing system.
- FIG. 3A is a flowchart showing another example of processing performed in the information processing system.
- FIG. 3B is a flowchart showing another example of processing performed in the information processing system.
- FIG. 4A is a flowchart showing yet another example of processing performed in the information processing system.
- FIG. 4B is a flowchart showing yet another example of processing performed in the information processing system.
- the present disclosure relates to an information processing device, an information processing method, a program, and a recording medium for obtaining content from a provider terminal via a network, providing the obtained content to a user terminal via the network, and obtaining facial expression and biometric information and associating the user with the content.
- the information processing system described below includes these information processing devices, information processing methods, programs, and recording media. Below, several embodiments included in the information processing system will be described with reference to the drawings. In the drawings for explaining the embodiments of the information processing system, the same parts are generally designated by the same reference numerals, and repeated description will be omitted.
- the information processing system 10 shown in FIG. 1 includes a server 11 as an information processing device, a provider terminal 12, a user terminal 13, a biometric authentication device 36, and a recording medium 70.
- the server 11 and the provider terminal 12 are configured to be able to communicate with each other via a network 14, and the server 11 and the user terminal 13 are configured to be able to communicate with each other via the network 14.
- the biometric authentication device 36 is a device that detects the biometric information of the user of the user terminal 13 and outputs a signal according to the detection result.
- a single provider terminal 12 is shown in FIG. 1, but multiple provider terminals 12 can be connected to the server 11 independently.
- a single user terminal 13 is shown, but multiple user terminals 13, 13A can be connected to the server 11 independently.
- the server 11 is managed and operated by an information administrator.
- the provider terminal 12 is operated and used by a provider who publishes content on the network 14.
- the user terminal 13 is operated and used by a user who uses the content published on the network 14.
- the content includes digital content such as video including audio information, still images, music, games, etc.
- the use of content includes purchase, download, playback, display, operation, etc. Downloading includes streaming.
- the use of content may be either using the content in parallel with the acquisition of the content on the user terminal 13, or using the content on the user terminal 13 a predetermined time after acquiring and storing the content on the user terminal 13.
- the recording medium 70 includes a magnetic tape, a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, etc.
- the recording medium 70 records, that is, stores, programs and applications that are started in the server 11 to perform processing, judgment, etc.
- the server 11 is a computer capable of performing various processes, judgments, decisions, etc., based on input signals and pre-stored information such as programs, applications, data, etc.
- the server 11 has a control unit 15, a memory unit 16, a communication unit 17, and a reading unit 71.
- the control unit 15 has an input port, an output port, and an arithmetic processing circuit.
- the control unit 15 can transmit and receive signals between the memory unit 16 and the communication unit 17.
- the memory unit 16 stores information such as programs, applications, data, etc. in advance.
- the memory unit 16 also stores information transmitted from the provider terminal 12, information transmitted from the user terminal 13, contract contents exchanged between the provider terminal 12 and the user terminal 13, etc.
- the communication unit 17 has a function of transmitting signals output from the control unit 15 to the provider terminal 12 and the user terminal 13 via the network 14, and a function of sending signals received from the provider terminal 12 and the user terminal 13 via the network 14 to the control unit 15. Furthermore, the communication unit 17 can also transmit and receive signals to and from computers other than the provider terminal 12 and the user terminal 13 via the network 14.
- the reading unit 71 reads programs, applications, etc. from the recording medium 70, and sends the read programs, applications, etc. to the control unit 15.
- the control unit 15 has a content acquisition unit 18, a content provision unit 19, a reaction recognition unit 20, a processing execution unit 21, a content recommendation unit 22, a reaction memory unit 39, a customer management unit 40, and a compensation management unit 41.
- the content acquisition unit 18 has a function of acquiring content received from the provider terminal 12 and processing the content.
- the content provision unit 19 has a function of publishing the acquired content on the network 14, and a function of providing content to the user terminal 13 in response to a request from the user terminal 13.
- the reaction recognition unit 20 has a function of recognizing and judging reaction information received from the user terminal 13. The meaning of the reaction information will be described later.
- the reaction memory unit 39 stores the reaction information processed by the reaction recognition unit 20.
- the processing execution unit 21 performs a predetermined process that associates content with the user terminal 13 based on the reaction information. The predetermined process will be described later.
- the content recommendation unit 22 has a function of recommending to the user terminal 13 content that has not been provided to the user terminal 13 before.
- the customer management unit 40 has a function of managing the provider information of the provider terminal 12 and the user information of the user terminal 13.
- the customer management unit 40 also has a function of determining the characteristics and trends of the content used for each user terminal 13, and accumulating and analyzing reaction information to the content used for each user terminal 13.
- the compensation management unit 41 manages the account numbers included in the user information, the account numbers included in the provider information, and the deposit account set up by the server 11 on the network 14.
- the compensation management unit 41 also manages the compensation granted to the user terminal 13 from the deposit account, the fees paid from the user terminal 13, and the fees paid from the provider terminal 12 to the server 11.
- the control unit 15 can store the programs, applications, etc. that the reading unit 71 reads from the recording medium 70 in the memory unit 16.
- the control unit 15 activates the programs, applications, etc. that the reading unit 71 reads from the recording medium 70, thereby causing the content provision unit 19, content acquisition unit 18, reaction recognition unit 20, processing execution unit 21, content recommendation unit 22, reaction memory unit 39, customer management unit 40, and compensation management unit 41 to function.
- the control unit 15 performs processing, comparison, and judgment based on signals input via the communication unit 17, information stored in the reaction memory unit 39, information stored in the customer management unit 40, information in the compensation management unit 41, and information stored in the memory unit 16.
- the control unit 15 also stores the results of the executed processing, comparison, and judgment in the memory unit 16.
- the server 11 can start programs, applications, etc. recorded on the recording medium 70 while the reading unit 71 is connected to the recording medium 70.
- the server 11 can also start programs, applications, etc. read from the recording medium 70 while the reading unit 71 is not connected to the recording medium 70 after reading the programs, applications, etc. recorded on the recording medium 70 with the reading unit 71 and storing them in the memory unit 16.
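- as a rough illustration of how these functional units could fit together, the following Python sketch is offered; all class and method names are hypothetical, the logic is reduced to in-memory dictionaries, and nothing here is taken from the publication beyond the names of the units themselves:

```python
# Hypothetical sketch of the functional units hosted by the control unit 15.
# Class and method names are illustrative only; the publication does not
# specify any implementation.

class MemoryUnit:
    """Stands in for the memory unit 16: a simple in-memory store."""
    def __init__(self):
        self.records = {}

    def store(self, key, value):
        self.records[key] = value


class ControlUnit:
    """Stands in for the control unit 15, wiring the sub-units together."""
    def __init__(self, memory):
        self.memory = memory
        self.contents = {}    # content acquisition unit 18 / provision unit 19
        self.reactions = {}   # reaction memory unit 39
        self.customers = {}   # customer management unit 40
        self.accounts = {}    # compensation management unit 41

    def acquire_content(self, content_id, data, provider_id):
        # content acquisition unit 18: receive content from a provider terminal
        self.contents[content_id] = {"data": data, "provider": provider_id}
        self.memory.store(("content", content_id), data)

    def provide_content(self, content_id):
        # content provision unit 19: hand content to a requesting user terminal
        return self.contents[content_id]["data"]

    def record_reaction(self, user_id, content_id, reaction):
        # reaction recognition unit 20 + reaction memory unit 39
        self.reactions.setdefault((user_id, content_id), []).append(reaction)


if __name__ == "__main__":
    cu = ControlUnit(MemoryUnit())
    cu.acquire_content("c1", b"...video bytes...", provider_id="p1")
    cu.record_reaction("u1", "c1", {"laughed": True})
    print(cu.provide_content("c1"))
```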
- the provider terminal 12 may be any of a desktop personal computer, a notebook personal computer, a smartphone, a tablet terminal, etc.
- the provider terminal 12 has a control unit 23, an operation unit 24, a display unit 25, a storage unit 26, and a communication unit 27.
- the control unit 23 has an arithmetic processing circuit, an input port, and an output port.
- the control unit 23 can transmit and receive signals between the storage unit 26 and the communication unit 27.
- the storage unit 26 stores information such as programs, applications, and data in advance.
- the storage unit 26 also stores information transmitted from the server 11, the contents of the contract exchanged between the provider terminal 12 and the user terminal 13, and the results of processing and judgment performed by the control unit 23.
- the communication unit 27 has a function of transmitting a signal output from the control unit 23 to the server 11 via the network 14, and a function of sending a signal received from the server 11 via the network 14 to the control unit 23.
- when the communication unit 27 sends a signal to the server 11, the signal is transmitted in association with the provider information of the provider terminal 12.
- the communication unit 27 can also transmit and receive signals to and from computers other than the server 11 via the network 14.
- the operation unit 24 is constructed with at least one element of a touch switch, a display screen such as a liquid crystal panel, a keyboard, a mouse, a scanner, etc.
- when the operation unit 24 is operated, a signal corresponding to the operation is input to the control unit 23.
- the display unit 25 is constructed with a touch panel, a liquid crystal panel, etc.
- the display unit 25 displays information input at the provider terminal 12, information acquired from the server 11 and the user terminal 13, etc.
- the content displayed on the display unit 25 is controlled and switched by the control unit 23. If the provider terminal 12 is a smartphone or a tablet terminal, the operation unit 24 corresponds to an operation menu (operation buttons) displayed on the display screen of the display unit 25.
- the control unit 23 performs processing, comparison, judgment, etc. based on the signal input via the communication unit 27, the signal corresponding to the operation of the operation unit 24, and the information stored in the storage unit 26.
- the user terminal 13 may be any of a desktop personal computer, a notebook personal computer, a smartphone, a tablet terminal, etc.
- the user terminal 13 has a control unit 28, an operation unit 29, a display unit 30, a memory unit 31, a communication unit 32, a speaker 33, a camera 34, and a microphone 35.
- the control unit 28 has an arithmetic processing circuit, an input port, and an output port.
- the control unit 28 can transmit and receive signals between the memory unit 31, the communication unit 32, the speaker 33, the camera 34, and the microphone 35.
- the memory unit 31 stores information such as programs, applications, and data in advance.
- the memory unit 31 also stores information transmitted from the server 11, the contents of the contract exchanged between the provider terminal 12 and the user terminal 13, and the results of processing and judgment performed by the control unit 28.
- the communication unit 32 has a function of transmitting a signal output from the control unit 28 to the server 11 via the network 14, and a function of sending a signal received from the server 11 via the network 14 to the control unit 28.
- when the communication unit 32 transmits a signal to the server 11, the signal is transmitted in association with the user information of the user terminal 13. Furthermore, the communication unit 32 can also transmit and receive signals to and from computers other than the server 11 via the network 14.
- the operation unit 29 is constructed with at least one element of a touch switch, a display screen such as a liquid crystal panel, a keyboard, a mouse, a scanner, etc.
- when the operation unit 29 is operated, a signal corresponding to the operation is input to the control unit 28.
- the display unit 30 is constructed with a touch panel, a liquid crystal panel, etc.
- the display unit 30 displays information input at the user terminal 13, information acquired from the server 11 and the provider terminal 12, and images included in the acquired content.
- the images include still images and videos.
- the content displayed on the display unit 30 is controlled and switched by the control unit 28. If the user terminal 13 is a smartphone or a tablet terminal, the operation unit 29 corresponds to an operation menu (operation buttons) displayed on the display screen of the display unit 30.
- the camera 34 may be either directly attached to the main body of the user terminal 13, or connected to the user terminal 13 via a signal cable.
- the camera 34 captures the user's face and facial expression, and a signal of the captured video information is input to the control unit 28.
- the image captured by the camera 34 may be either a still image or a video.
- a signal including shooting information such as the shooting date and shooting time is attached to the video captured by the camera 34.
- the microphone 35 collects sounds such as laughter and cheers made by the user, and a signal corresponding to the collected sound and volume is input to the control unit 28.
- the speaker 33 is controlled by the control unit 28, and outputs the sound and music included in the content obtained from the server 11.
- the biometric authentication device 36 is a reaction information detection device that detects reaction information of a user who uses content and outputs a signal.
- the biometric authentication device 36 detects the reaction information of the user from the biometric authentication information and behavioral characteristics of the user.
- the biometric authentication device 36 includes a heart rate monitor 37 and a blood pressure monitor 38 in addition to the above-mentioned camera 34 and microphone 35.
- the heart rate monitor 37 measures the user's heart rate per predetermined time and outputs a signal.
- the blood pressure monitor 38 measures the user's blood pressure and outputs a signal.
- the signal output from the heart rate monitor 37 and the signal output from the blood pressure monitor 38 are processed by the control unit 28.
- the user's laughter and cheers, the user's facial expression when laughing, the user's facial expression when moved, etc. are the user's behavioral characteristics.
- the user's heart rate and blood pressure are the user's biometric authentication information.
- the control unit 28 performs processing and judgment based on signals input via the communication unit 32, signals corresponding to the operation of the operation unit 29, signals input from the camera 34, signals input from the microphone 35, signals input from the heart rate monitor 37, signals input from the blood pressure monitor 38, and programs and applications stored in the memory unit 31.
- the control unit 28 also sends signals corresponding to the results of the processing and judgment to the network 14 via the communication unit 32.
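- a minimal sketch of how the user terminal 13 might bundle the signals from the camera 34, microphone 35, heart rate monitor 37, and blood pressure monitor 38 into a single reaction-information message for the server 11 is shown below; the field names and units are assumptions, since the publication does not define a message format:

```python
# Hypothetical reaction-information message assembled on the user terminal 13.
# Field names are assumptions for illustration only.
import json
import time


def build_reaction_message(user_id, content_id, face_frame_id,
                           audio_volume_db, heart_rate_bpm, blood_pressure):
    """Collect one sampling of reaction-related signals."""
    return {
        "user_id": user_id,                  # sent in association with the user information
        "content_id": content_id,
        "timestamp": time.time(),
        "face_frame": face_frame_id,         # reference to a captured video frame
        "audio_volume_db": audio_volume_db,  # loudness of laughter or cheering
        "heart_rate_bpm": heart_rate_bpm,
        "blood_pressure_mmhg": blood_pressure,  # (systolic, diastolic)
    }


if __name__ == "__main__":
    msg = build_reaction_message("u1", "c1", "frame_0042", 62.0, 95, (128, 82))
    print(json.dumps(msg))  # this payload would be sent to the server in step S22
```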
- the server 11 executes the processing and judgment steps by starting up the programs and applications acquired from the recording medium 70.
- the user terminal 13 inputs user information in step S10 and transmits it to the server 11.
- the user information includes the identification number of the user terminal 13, the user's name, address, date of birth, age, password, user ID, telephone number, email address, bank or post office account number, account number for electronic money or cryptocurrency, etc.
- Cryptocurrency includes virtual currency.
- in step S10, a signal including the image information of the user's face photographed by the camera 34, the user's heart rate per specified time, the user's blood pressure, etc. is transmitted from the user terminal 13 to the server 11.
- the provider terminal 12 inputs provider information in step S11 and transmits it to the server 11.
- the provider information includes the identification number of the provider terminal 12, the provider's name (title), address, password, user ID, telephone number, email address, bank or post office account number, account number for electronic money or crypto assets, etc.
- Crypto assets include virtual currencies.
- the server 11 processes and stores the information transmitted from the user terminal 13 in step S12. The server 11 also processes and stores the information transmitted from the provider terminal 12.
- the user terminal 13 inputs the contract details in step S13 and transmits them to the server 11, and the provider terminal 12 inputs the contract details in step S14 and transmits them to the server 11.
- the server 11 acquires and stores the contract details exchanged between the user terminal 13 and the provider terminal 12 in step S15.
- the contract details include the following. For example, it includes a contract whereby the user terminal 13 pays a fee (usage fee) to the provider terminal 12 when content is used at the user terminal 13 and the user uses the content and shows a reaction. It also includes a contract whereby the user terminal 13 does not pay a fee to the provider terminal 12 when the user uses the content and shows no reaction.
- a reaction when using the content includes at least one of the following: the user smiles, the user laughs, or the user cheers.
- in step S15, the server 11 performs a process to receive the compensation included in the contract. Note that steps S10, S11, and S12 may be performed after steps S13, S14, and S15 have been performed.
- the provider terminal 12 transmits the content to the server 11 in step S16.
- the server 11 acquires the content in step S17 and stores it in the storage unit 16.
- the server 11 also publishes the acquired content on the network 14 in step S17 so that it can be viewed from outside.
- in step S18, the user terminal 13 can check a list of content published on the network 14.
- the list of content is displayed on the display unit 30.
- a request to acquire content selected from the list can be sent to the server 11 in step S18.
- the server 11 that has received the request to acquire the content sends the content to the user terminal 13 in step S19.
- the server 11 associates the history of receiving requests to acquire content from the user terminal 13 and the history of sending content to the user terminal 13 with the user information of the user terminal 13 and stores them in the storage unit 16 in step S19.
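- the history kept in step S19 could, for example, be organized as simple per-user event logs; the structure below is an illustrative assumption only:

```python
# Hypothetical sketch of the step S19 history: acquisition requests and
# content deliveries associated with the user information of each terminal.
from collections import defaultdict
from datetime import datetime, timezone

history = defaultdict(list)  # user_id -> list of request/delivery events


def log_event(user_id, content_id, event):
    history[user_id].append({
        "content_id": content_id,
        "event": event,              # "request_received" or "content_sent"
        "at": datetime.now(timezone.utc).isoformat(),
    })


log_event("u1", "c1", "request_received")
log_event("u1", "c1", "content_sent")
print(history["u1"])
```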
- in step S20, the user terminal 13 acquires the content from the server 11 and uses the content.
- Using the content includes displaying still images or videos on the display unit 30, outputting music and audio contained in the content from the speaker 33, and the like. This allows the user to enjoy the provided content.
- the biometric authentication device 36 detects the user's reaction information and outputs a signal.
- the reaction information detected by the biometric authentication device 36 is processed in the user terminal 13 in step S21.
- the user terminal 13 transmits the reaction information to the server 11 in step S22.
- the server 11 acquires the reaction information from the user terminal 13 in step S23, and the reaction recognition unit 20 processes and judges the reaction information, and stores the judgment result in the memory unit 16.
- the reaction recognition unit 20 processes the reaction information including the change in the facial expression of the user of the user terminal 13, the heart rate of the user of the user terminal 13 stored in step S12, the blood pressure of the user of the user terminal 13 stored in step S12, and laughter and cheers emitted by the user, thereby making it possible to determine whether or not the user of the user terminal 13 has reacted, the number of times the user of the user terminal 13 has reacted within a specified period of time, the timing at which the user of the user terminal 13 reacted, etc.
- the presence or absence of a reaction of the user of user terminal 13 refers to whether the user of user terminal 13 laughed, whether the user of user terminal 13 cheered, and whether the user of user terminal 13 became excited.
- Whether the user of user terminal 13 laughed can be determined from a change in facial expression and the presence or absence of laughter.
- the number of times the user of user terminal 13 reacted within a specified time period refers to the number of times the user of user terminal 13 laughed, the number of times the user of user terminal 13 cheered, and the number of times the user of user terminal 13 became excited within the specified time period.
- the timing at which the user of user terminal 13 reacted refers to the timing at which the user reacted between the start and end of use of the content.
- the reaction recognition unit 20 can determine whether the user of the user terminal 13 has laughed or not by performing face recognition processing by comparing the image captured by the camera 34 with the image of the face of the user of the user terminal 13 previously stored in step S12. Since face recognition processing is publicly known as described in JP 2005-275605 A, JP 2005-352892 A, JP 2007-148968 A, etc., a detailed explanation will be omitted.
- the reaction recognition unit 20 determines the level of laughter of the user of the user terminal 13 from the volume of the laughter of the user of the user terminal 13 when using the content. Specifically, it is determined that the louder the laughter, the higher the level of laughter.
- the reaction recognition unit 20 compares the heart rate of the user of the user terminal 13 when not using the content with the heart rate of the user of the user terminal 13 when using the content, and determines whether the user of the user terminal 13 is excited, the level of excitement of the user of the user terminal 13, etc.
- the reaction recognition unit 20 also compares the blood pressure of the user of the user terminal 13 when not using the content with the blood pressure of the user of the user terminal 13 when using the content, and judges whether or not the user of the user terminal 13 is excited, and the level of excitement of the user of the user terminal 13. For example, it judges that the greater the increase in heart rate, the higher the level of excitement, and that the greater the increase in blood pressure, the higher the level of excitement.
- the server 11 also creates evaluation information of the user of the user terminal 13 regarding the content based on the reaction information acquired from the user terminal 13, and stores the evaluation information in the storage unit 16. The evaluation information associates the content with the presence or absence of a reaction of the user of the user terminal 13 corresponding to the content, and the reaction level of the user of the user terminal 13.
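- the judgments described above could be realized with simple heuristics such as the following sketch; the thresholds, field names, and scoring are assumptions, and only the general idea (laughter judged from facial expression and loudness, excitement judged from increases over the baseline values stored in step S12) comes from this disclosure:

```python
# Hedged sketch of the kind of judgment the reaction recognition unit 20
# could make. Thresholds and field names are illustrative assumptions.

def judge_reaction(baseline, sample, laugh_volume_threshold_db=50.0):
    """Return whether a reaction occurred and a coarse reaction level."""
    laughed = sample["smile_detected"] and sample["audio_volume_db"] > 0
    laugh_level = 0
    if laughed:
        # the louder the laughter, the higher the level of laughter
        laugh_level = 1 + int(sample["audio_volume_db"] > laugh_volume_threshold_db)

    hr_increase = sample["heart_rate_bpm"] - baseline["heart_rate_bpm"]
    bp_increase = sample["systolic_mmhg"] - baseline["systolic_mmhg"]
    excited = hr_increase > 10 or bp_increase > 10   # assumed thresholds
    excitement_level = max(hr_increase, bp_increase) // 10 if excited else 0

    return {
        "reacted": laughed or excited,
        "laugh_level": laugh_level,
        "excitement_level": int(excitement_level),
        "timestamp": sample["timestamp"],   # timing of the reaction within the content
    }


baseline = {"heart_rate_bpm": 70, "systolic_mmhg": 118}
sample = {"smile_detected": True, "audio_volume_db": 64.0,
          "heart_rate_bpm": 88, "systolic_mmhg": 131, "timestamp": 312.5}
print(judge_reaction(baseline, sample))
```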
- in step S24, the server 11 judges whether the contract contents can be executed.
- the server 11 makes the judgment in step S24 based on the judgment result of the reaction information. For example, if there is a reaction from the user of the user terminal 13, it judges that "the contract contents will be executed," and if there is no reaction from the user of the user terminal 13, it judges that "the contract contents will not be executed.”
- if at least one item of the reaction information including a smiling face, a laughing voice, a cheer, an increase in heart rate or blood pressure, etc. is detected, it is judged that there is a reaction from the user of the user terminal 13.
- if none of the reaction information including a smiling face, a laughing voice, a cheer, an increase in heart rate or blood pressure, etc. is detected, it is judged that there is no reaction from the user of the user terminal 13.
- the server 11 may determine that the contract contents are to be executed if the number of times the user of the user terminal 13 responds within a specified time period, or the response level of the user of the user terminal 13, exceeds a threshold value previously stored in the storage unit 16. In contrast, the server 11 may determine that the contract contents are not to be executed if the number of times the user of the user terminal 13 responds within a specified time period, or the response level of the user of the user terminal 13, is equal to or lower than a threshold value previously stored in the storage unit 16.
- the server 11 may change the "degree of execution of the contract contents" according to the number of times the user of the user terminal 13 reacts within a specified time period, or the reaction level of the user of the user terminal 13.
- the "degree of execution of the contract contents” includes increasing the amount of compensation paid the higher the reaction level, increasing the amount of cryptocurrency paid the higher the reaction level, increasing the number of points awarded the higher the reaction level, etc.
- in step S24, the server 11 performs the procedures specified in the contract contents. That is, the server 11 performs processes such as payment and settlement of cash or cryptocurrency to the user's account in the user terminal 13, granting points to the user terminal 13, and payment and settlement to the provider's account.
- in step S25, the server 11 transmits information to the user terminal 13 and the provider terminal 12 indicating that the procedures specified in the contract contents have been performed or that they have not been performed. Furthermore, the information transmitted to the provider terminal 12 in step S25 may include evaluation information of the user of the user terminal 13 regarding the content.
- the user terminal 13 acquires information from the server 11 in step S26.
- the provider terminal 12 acquires information from the server 11 in step S27.
- the reaction information acquired by the server 11 from the user terminal 13 is detected by the biometric authentication device 36. Therefore, it is possible to prevent the reaction information acquired by the server 11 from including the subjective opinion of the user.
- the provider terminal 12 can acquire evaluation information on the content provided to the user terminal 13 in step S27. Therefore, the provider who manages the provider terminal 12 can use the evaluation information acquired from the server 11 when promoting its own products, i.e., content, on social networking services (SNS), TikTok (registered trademark), YouTube (registered trademark), etc. on the network 14. Furthermore, the provider can use the reaction information acquired from the server 11 for business-to-business transactions via other computers connected to the network 14. Furthermore, the server 11 can also send a message to the provider terminal 12 in step S25 stating, "If the reaction of the user terminal is good, we will publish your company's advertisement on the network free of charge.”
- the information processing system 10 can also perform the processing of Fig. 3A on the premise that steps S17, S19, and S23 of Fig. 2 have been performed.
- the processing and judgment steps executed by the server 11 are executed by the server 11 by activating a program or application acquired from the recording medium 70.
- in step S30, the server 11 selects other content that is presumed to provide reaction information similar to the reaction information of the content acquired by the user terminal 13, from among the content that the user terminal 13 has not previously acquired.
- the server 11 then recommends the other content selected in step S30 to the user terminal 13 in step S31.
- the user terminal 13 can display the recommended information on the display unit 30, and issue a request to acquire other content in step S18. This makes it easier for the user of the user terminal 13 to search for content that matches their preferences.
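- one possible way to realize the selection in step S30 is to rank unseen content by how closely its stored reaction profile matches the reaction information obtained for the content the user terminal 13 did acquire, as in the sketch below; the distance measure and profile fields are assumptions:

```python
# Hypothetical sketch of step S30: among content the user terminal has not
# acquired, pick items whose stored reaction profiles are closest to the
# user's own reaction information. Similarity measure is an assumption.

def recommend(user_profile, candidate_profiles, already_acquired, top_n=1):
    """Return content ids ranked by reaction-profile similarity."""
    def distance(profile):
        return (abs(profile["laugh_count"] - user_profile["laugh_count"])
                + abs(profile["excitement_level"] - user_profile["excitement_level"]))

    candidates = [(cid, prof) for cid, prof in candidate_profiles.items()
                  if cid not in already_acquired]
    candidates.sort(key=lambda item: distance(item[1]))
    return [cid for cid, _ in candidates[:top_n]]


user_profile = {"laugh_count": 6, "excitement_level": 2}
candidate_profiles = {
    "c2": {"laugh_count": 5, "excitement_level": 2},
    "c3": {"laugh_count": 0, "excitement_level": 0},
}
print(recommend(user_profile, candidate_profiles, already_acquired={"c1"}))  # -> ['c2']
```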
- the following process may be performed.
- when a user terminal 13A is connected to the server 11, the user terminal 13A can execute the contents of steps S10, S13, S18, S20, S21, S22, and S26 in FIG. 2.
- the server 11 also executes the contents of steps S12, S15, S19, S23, S24, and S25 based on the information transmitted from the user terminal 13A.
- the control unit 15 of the server 11 processes and stores the reaction information transmitted from the multiple user terminals 13A in step S23.
- the users of the multiple user terminals 13A are stratified into multiple age groups. Furthermore, the average heart rate and the average blood pressure are calculated and stored for each age group.
- the presence or absence of a reaction from the user, the number of times the user laughed, the number of times the user reacted within a specified time, the timing of the user's reaction, and the like are stored.
- the control unit 15 can execute the following process in step S23.
- by comparing the average heart rate for each age group of users acquired by the server 11 from the user terminals 13A and the average blood pressure for each age group of users acquired by the server 11 from the user terminals 13A with the heart rate and blood pressure of the user transmitted from the user terminal 13 in step S22, the control unit 15 can determine whether or not the user of the user terminal 13 has reacted, whether or not the user of the user terminal 13 is excited, the level of excitement of the user of the user terminal 13, etc.
- the average heart rate and average blood pressure for each age group published by a public institution may be stored in advance in the memory unit 16 of the server 11.
- the control unit 15 of the server 11 compares the average heart rate for each age group stored in the memory unit 16 and the average blood pressure for each age group stored in the memory unit 16 with the user's heart rate and blood pressure transmitted from the user terminal 13 in step S22 to determine whether or not the user of the user terminal 13 has responded, and can also determine whether or not the user of the user terminal 13 is excited, the level of excitement of the user of the user terminal 13, etc.
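- the comparison against per-age-group averages could be sketched as follows; the average values and margins are illustrative assumptions, not figures from the publication or from any public institution:

```python
# Hedged sketch of judging excitement against stored per-age-group averages
# (accumulated from the user terminals 13A or published by a public
# institution). All numbers below are illustrative assumptions.

AGE_GROUP_AVERAGES = {
    "20s": {"heart_rate_bpm": 68, "systolic_mmhg": 115},
    "40s": {"heart_rate_bpm": 72, "systolic_mmhg": 124},
}


def judge_excitement(age_group, heart_rate_bpm, systolic_mmhg, margin=0.15):
    avg = AGE_GROUP_AVERAGES[age_group]
    hr_ratio = heart_rate_bpm / avg["heart_rate_bpm"]
    bp_ratio = systolic_mmhg / avg["systolic_mmhg"]
    excited = hr_ratio > 1 + margin or bp_ratio > 1 + margin
    level = round(max(hr_ratio, bp_ratio) - 1.0, 2)
    return {"excited": excited, "excitement_level": max(level, 0.0)}


print(judge_excitement("40s", heart_rate_bpm=90, systolic_mmhg=130))
```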
- the information processing system 10 can also perform the processing of FIG. 3B on the premise that steps S17, S19, and S23 of FIG. 2 have been performed.
- the server 11 executes the processing and judgment steps by starting a program and application acquired from the recording medium 70.
- the server 11 can detect the presence or absence of a user terminal 13 that has acquired similar reaction information among a plurality of user terminals 13 that have used the same content.
- the similar reaction information includes laughing at the same timing, cheering at the same timing, getting excited at the same timing, the difference in the number of laughs being a predetermined number or less, the difference in the number of cheers being a predetermined number or less, the difference in the number of excited times being a predetermined number or less, and the like, during the use of the content. When at least one of these is detected, it can be determined that there is a user terminal 13 that has acquired similar reaction information.
- in step S41, the server 11 can transmit auxiliary information such as "There is another user terminal that has had similar reaction information about the content you used" to each of the multiple user terminals 13 that have obtained similar reaction information by using the same content.
- the multiple user terminals 13 can obtain the auxiliary information in step S42 and display the auxiliary information on the display unit 30.
- a plurality of user terminals 13, 13A can search the network 14 for other user terminals 13A that have shown similar reactions to the same content, which becomes a clue for connecting the user terminals 13, 13A with each other.
- a plurality of user terminals 13, 13A can search for partners, for example, through an application programming interface (API), a social networking service (SNS), a matching app, etc.
- users can easily find user terminals 13 that show similar reactions when using content.
- the level of information to be sent to each user terminal 13, 13A, for example the level of connection, may be changed according to the degree to which similar reaction information regarding the same content has been obtained. For example, the louder the laughter, the longer the duration of laughter, or the greater the number of times the user laughs, the stronger the level of connection.
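- a minimal sketch of how similar reaction information and the resulting connection level might be computed is given below; the timing tolerance and count tolerance are assumptions standing in for the "predetermined number" mentioned above:

```python
# Hedged sketch of the FIG. 3B idea: two user terminals are treated as having
# similar reaction information for the same content when they reacted at
# roughly the same timings, or when their reaction counts differ by no more
# than a predetermined number. Tolerances below are assumptions.

def similar_reactions(a, b, timing_tolerance_s=5.0, count_tolerance=2):
    """a and b are dicts with 'laugh_timings' (seconds) and 'laugh_count'."""
    same_timing = any(abs(ta - tb) <= timing_tolerance_s
                      for ta in a["laugh_timings"] for tb in b["laugh_timings"])
    close_count = abs(a["laugh_count"] - b["laugh_count"]) <= count_tolerance
    return same_timing or close_count


def connection_level(a, b, timing_tolerance_s=5.0):
    # the more closely the reaction timings agree, the stronger the connection
    return sum(1 for ta in a["laugh_timings"]
               for tb in b["laugh_timings"] if abs(ta - tb) <= timing_tolerance_s)


a = {"laugh_timings": [12.0, 40.5, 71.0], "laugh_count": 3}
b = {"laugh_timings": [13.5, 70.0], "laugh_count": 2}
print(similar_reactions(a, b), connection_level(a, b))  # -> True 2
```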
- the information processing system 10 can execute the process of FIG. 4A.
- the server 11 executes the process and judgment steps by starting up a program and application acquired from the recording medium 70.
- in step S50, the server 11 randomly selects a user terminal 13 based on the content usage history of the first user terminal 13 and the history of reaction information when the content is used in the first user terminal 13.
- the user terminal 13 to be selected is a second user terminal 13A that has no history of acquiring the content acquired in the first user terminal 13 and is different from the first user terminal 13.
- the second user terminal 13A may be either a single terminal or a plurality of terminals.
- in step S51, the server 11 performs a process of transmitting information recommending new content to the second user terminal 13A selected in step S50.
- the second user terminal 13A obtains the recommendation information and, in step S18, can transmit a request to obtain the content. This makes it easier for the user of the second user terminal 13A to search for new content that matches their preferences.
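- the selection in steps S50 and S51 could be sketched as follows, assuming the second terminals are chosen from those with no acquisition history for the content and ranked by overlap with the first terminal's reaction history, with random tie-breaking; everything beyond the publication's wording is an assumption:

```python
# Hypothetical sketch of step S50: pick second user terminals that have no
# acquisition history for the content, preferring terminals whose reaction
# history overlaps that of the first user terminal. Ranking rule is assumed.
import random


def select_second_terminals(first_history, other_histories, content_id, k=1):
    """Pick up to k terminals with no acquisition history for content_id."""
    candidates = [tid for tid, hist in other_histories.items()
                  if content_id not in hist]
    # shuffle first so the stable sort breaks ties at random
    random.shuffle(candidates)
    candidates.sort(
        key=lambda tid: -len(set(other_histories[tid]) & set(first_history)))
    return candidates[:k]


first = {"c1": 3, "c2": 1}        # content_id -> reaction level
others = {"t2": {"c2": 2}, "t3": {"c1": 3, "c9": 1}, "t4": {"c1": 2}}
print(select_second_terminals(first, others, content_id="c1"))  # -> ['t2']
```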
- the information processing system 10 can execute the process of Fig. 4B on the premise that steps S17, S19, and S23 of Fig. 2 have been performed.
- the process and judgment steps executed by the server 11 are executed by the server 11 activating a program and an application acquired from the recording medium 70.
- in step S60, the server 11 selects other content that has not received a content acquisition request from the specified user terminal 13 and is presumed to produce reaction information similar to the reaction information for the content acquired by the specified user terminal 13.
- the server 11 then recommends the other content to the specified user terminal 13 in step S61. It also sends a message to the specified user terminal 13 stating, "If the recommended content is not interesting, we will provide you with compensation.”
- if there is no reaction from the user in step S23, the server 11 can make a decision to grant compensation to the specified user terminal 13 in step S62. Therefore, the user of the specified user terminal 13 can obtain the compensation. Note that if there is a reaction from the user in step S23, the server 11 makes a decision not to grant compensation to the specified user terminal 13 in step S62.
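- a minimal sketch of the step S62 decision described above, with an assumed compensation amount:

```python
# Hedged sketch of step S62: grant compensation only when the recommended
# content produced no reaction. The amount is an illustrative assumption.

def decide_no_reaction_compensation(reaction_detected, amount=50):
    """Implements the message "if the recommended content is not interesting,
    we will provide you with compensation"."""
    if reaction_detected:
        return {"grant": False, "amount": 0}
    return {"grant": True, "amount": amount}


print(decide_no_reaction_compensation(reaction_detected=False))  # compensation granted
print(decide_no_reaction_compensation(reaction_detected=True))   # no compensation
```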
- the predetermined processing includes executing the contract exchanged between the provider terminal 12 and the user terminal 13 in step S24 of FIG. 2, the processing of step S31 of FIG. 3A, the processing of step S41 of FIG. 3B, the processing of step S51 of FIG. 4A, the processing of step S61 of FIG. 4B, etc.
- the server 11 is an example of an information processing device.
- the content acquisition unit 18 can also be understood as a content acquisition circuit or a content acquirer.
- the content provision unit 19 can also be understood as a content provision circuit or a content provider.
- the reaction recognition unit 20 can also be understood as a reaction processing circuit or a reaction processor.
- the processing execution unit 21 can also be understood as a processing execution circuit or a processing executor.
- the customer management unit 40 can also be understood as a customer management circuit or a customer manager.
- the content recommendation unit 22 can also be understood as a content recommendation circuit or a content recommender.
- the biometric authentication device 36 can also be recognized as a reaction detection circuit or a reaction detector.
- Step S17 in FIG. 2 is an example of a content acquisition step.
- Step S19 is an example of a content provision step.
- Step S23 is an example of a reaction recognition step.
- Step S24 is an example of a process execution step.
- the content acquisition step can also be understood as a content acquisition means.
- the content provision step can also be understood as a content provision means.
- the reaction recognition step can also be understood as a reaction recognition means.
- the process execution step can also be understood as a process execution means.
- the information processing system may be understood as an information management system.
- the information processing device may be understood as an information management device.
- the biometric authentication device may be either one that is provided in the user terminal itself, or an external device provided separately from the user terminal.
- the information processing device is a computer that performs various processes, judgments, decisions, etc., based on information including input signals, pre-stored programs, applications, data, etc.
- the computer as the information processing device includes at least one of a server, a workstation, a mainframe, a supercomputer, etc.
- This disclosure can be used in an information processing device, information processing method, program, and recording medium that provide information prepared by an information provider to a user terminal used by a user via a network.
Abstract
Description
本開示は、提供者端末からネットワークを介してコンテンツを取得し、取得したコンテンツを、ネットワークを介して利用者端末に提供する情報処理装置、情報処理方法、プログラム及び記録媒体に関する。 This disclosure relates to an information processing device, information processing method, program, and recording medium that acquires content from a provider terminal via a network and provides the acquired content to a user terminal via the network.
情報提供者から提供されるコンテンツを、利用者が利用する利用者端末に、ネットワークを介して提供する情報処理装置の一例が、特許文献1に記載されている。特許文献1に記載されている情報処理装置は、サーバ装置である。このサーバは、選手の試合(コンテンツ)を観戦した観客(利用者)による、選手に対するおひねり及び感動に関する情報の少なくとも1つを含む入力情報を取得する取得手段と、取得手段によって取得した入力情報を選手ごとに評価する評価手段と、評価手段によって評価された評価情報を外部へ提供する提供手段と、を備えることを特徴とするサーバ装置である。 Patent Document 1 describes an example of an information processing device that provides content provided by an information provider to a user terminal used by a user via a network. The information processing device described in Patent Document 1 is a server device. This server device is characterized by having an acquisition means for acquiring input information including at least one of information regarding tips given to players and impressions from spectators (users) who watched the player's match (content), an evaluation means for evaluating the input information acquired by the acquisition means for each player, and a provision means for providing the evaluation information evaluated by the evaluation means to the outside.
特許文献1に記載されている取得手段は、入力情報を入力するための入力画面を観客が所有する装置へ表示させる表示制御手段と、入力画面を介して入力された入力情報を受け付ける受け付け手段と、を備える。さらに、入力画面には、おひねりを入力するためのボタンと、選手に対する感動を示す感動ポイントを連続入力可能なボタンと、が含まれる、と記載されている。具体的には、観客等は試合会場やテレビ放送等により試合を観戦し、観戦の結果、感動した度合に応じて各自がパーソナルコンピュータ(利用者端末)や携帯端末(利用者端末)を介してサイトサーバへアクセスし、ブラウザ画面を介してそれらの情報を入力することができる。 The acquisition means described in Patent Document 1 includes a display control means for displaying an input screen for inputting input information on a device owned by a spectator, and a receiving means for receiving the input information input via the input screen. Furthermore, it is described that the input screen includes a button for inputting tips, and a button for continuously inputting impression points that indicate the impression the spectator has on a player. Specifically, spectators watch the game at the venue or via a television broadcast, and depending on the degree of impression they feel as a result of watching the game, can access the site server via a personal computer (user terminal) or mobile terminal (user terminal) and input the information via a browser screen.
本願発明者は、特許文献1に記載されているサーバ装置は、利用者が利用者端末を操作して各自で感動ポイントを入力するため、サーバ装置が取得する感動ポイントの判断は、利用者の主観である、という課題を認識した。 The inventors of this application recognized the problem that, in the server device described in Patent Document 1, users operate their user terminals to input their own impression points, and therefore the impression points acquired by the server device are determined subjectively by the users.
特許文献1のような発明の他、各種商品やサービスのレビューについてもテキストや画像のみでは真意を担保することが困難なところがあった。本開示の目的は、利用者端末から取得する反応情報に、利用者の主観が含まれることを抑制可能な情報処理装置、情報処理方法、プログラム及び記録媒体を提供することにある。 In addition to the invention of Patent Document 1, it has been difficult to ensure the true meaning of reviews of various products and services using only text and images. The purpose of this disclosure is to provide an information processing device, information processing method, program, and recording medium that can prevent the reaction information acquired from a user terminal from including the user's subjective opinion.
本開示は、提供者端末からネットワークを介してコンテンツを取得するコンテンツ取得部と、前記コンテンツ取得部が取得した前記コンテンツを、前記ネットワークを介して利用者端末へ提供するコンテンツ提供部と、を有する情報処理装置であって、利用者が前記利用者端末で前記コンテンツを利用した場合における前記利用者の生体認証情報、または、前記利用者の顔の表情を撮影した映像情報のうち、少なくとも一方の情報に基づいて、前記利用者の反応情報を取得する反応認識部と、前記反応認識部が取得した前記反応情報に基づいて、前記コンテンツと前記利用者端末とを関連付けた所定の処理を行う処理実行部と、を有する、情報処理装置である。 The present disclosure relates to an information processing device having a content acquisition unit that acquires content from a provider terminal via a network, and a content provision unit that provides the content acquired by the content acquisition unit to a user terminal via the network, the information processing device having a reaction recognition unit that acquires reaction information of the user based on at least one of biometric authentication information of the user when the user uses the content on the user terminal or video information capturing the facial expression of the user, and a processing execution unit that performs a predetermined process that associates the content with the user terminal based on the reaction information acquired by the reaction recognition unit.
本開示によれば、情報処理装置が利用者端末から取得する反応情報に、利用者の主観が含まれることを抑制可能である。 According to the present disclosure, it is possible to prevent the reaction information acquired by an information processing device from a user terminal from including the subjectivity of the user.
(概要)
本開示は、提供者端末からネットワークを介してコンテンツを取得し、取得したコンテンツを、ネットワークを介して利用者端末に提供すること、表情及び生体情報を取得し利用者とコンテンツを関連付けること、を目的とした情報処理装置、情報処理方法、プログラム及び記録媒体に関する。次に説明する情報処理システムは、これらの情報処理装置、情報処理方法、プログラム及び記録媒体を含む。以下、情報処理システムに含まれるいくつかの実施形態を図面に基づいて説明する。情報処理システムの実施形態を説明するための図において、同一部には原則として同一の符号を付し、その繰り返しの説明は省略する。
(overview)
The present disclosure relates to an information processing device, an information processing method, a program, and a recording medium for obtaining content from a provider terminal via a network, providing the obtained content to a user terminal via the network, and obtaining facial expression and biometric information and associating the user with the content. The information processing system described below includes these information processing devices, information processing methods, programs, and recording media. Below, several embodiments included in the information processing system will be described with reference to the drawings. In the drawings for explaining the embodiments of the information processing system, the same parts are generally designated by the same reference numerals, and repeated description will be omitted.
図1に示す情報処理システム10は、情報処理装置としてのサーバ11、提供者端末12、利用者端末13、生体認証装置36及び記録媒体70を含む。サーバ11と提供者端末12とが、ネットワーク14を介して相互に通信できるように構成され、サーバ11と利用者端末13とが、ネットワーク14を介して相互に通信できるように構成されている。 The information processing system 10 shown in FIG. 1 includes a server 11 as an information processing device, a provider terminal 12, a user terminal 13, a biometric authentication device 36, and a recording medium 70. The server 11 and the provider terminal 12 are configured to be able to communicate with each other via a network 14, and the server 11 and the user terminal 13 are configured to be able to communicate with each other via the network 14.
生体認証装置36は、利用者端末13の利用者の生体情報を検出し、検出結果に応じた信号を出力する装置である。図1では、便宜上、単数の提供者端末12が示されているが、複数の提供者端末12をそれぞれ単独でサーバ11へ接続できる。また、単数の利用者端末13が示されているが、複数の利用者端末13,13Aをそれぞれ単独でサーバ11へ接続できる。 The biometric authentication device 36 is a device that detects the biometric information of the user of the user terminal 13 and outputs a signal according to the detection result. For convenience, a single provider terminal 12 is shown in FIG. 1, but multiple provider terminals 12 can be connected to the server 11 independently. Also, a single user terminal 13 is shown, but multiple user terminals 13, 13A can be connected to the server 11 independently.
本開示において、サーバ11は、情報管理者により管理及び運営される。提供者端末12は、ネットワーク14でコンテンツを公開する提供者により操作及び利用される。利用者端末13は、ネットワーク14で公開されているコンテンツを利用する利用者により操作及び利用される。コンテンツは、音声情報を含む動画、静止画、音楽、ゲーム等のデジタルコンテンツを含む。コンテンツの利用は、購入、ダウンロード、再生、表示、動作等を含む。ダウンロードは、ストリーミングを含む。コンテンツの利用は、利用者端末13でコンテンツの取得と並行して、コンテンツを利用すること、または、利用者端末13でコンテンツを取得して記憶した後、所定時間後にコンテンツを利用者端末13で利用すること、の何れでもよい。 In this disclosure, the server 11 is managed and operated by an information administrator. The provider terminal 12 is operated and used by a provider who publishes content on the network 14. The user terminal 13 is operated and used by a user who uses the content published on the network 14. The content includes digital content such as video including audio information, still images, music, games, etc. The use of content includes purchase, download, playback, display, operation, etc. Downloading includes streaming. The use of content may be either using the content in parallel with the acquisition of the content on the user terminal 13, or using the content on the user terminal 13 a predetermined time after acquiring and storing the content on the user terminal 13.
(記録媒体の説明)
記録媒体70は、記録媒体70は、磁気テープ、磁気ディスク、光ディスク、光磁気ディスク、半導体メモリ等を含む。記録媒体70には、サーバ11で処理、判断等を行うために起動されるプログラム及びアプリケーションが記録、つまり、記憶されている。
(Description of Recording Medium)
The recording medium 70 includes a magnetic tape, a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, etc. The recording medium 70 records, that is, stores, programs and applications that are started in the server 11 to perform processing, judgment, etc.
(サーバの説明)
サーバ11は、入力される信号、予め記憶されているプログラム、アプリケーション、データ等の情報に基づいて、各種の処理、判断、決定等を行うことができるコンピュータである。サーバ11は、制御部15、記憶部16、通信部17及び読取部71を有する。制御部15は、入力ポート、出力ポート及び演算処理回路を有する。制御部15は、記憶部16及び通信部17との間で相互に信号の送信及び受信を行うことができる。記憶部16には、予めプログラム、アプリケーション、データ等の情報が記憶されている。また、記憶部16には、提供者端末12から送信される情報、利用者端末13から送信される情報、提供者端末12と利用者端末13と間で取り交わされる契約内容、等が記憶される。
(Server description)
The server 11 is a computer capable of performing various processes, judgments, decisions, etc., based on input signals and pre-stored information such as programs, applications, data, etc. The server 11 has a control unit 15, a memory unit 16, a communication unit 17, and a reading unit 71. The control unit 15 has an input port, an output port, and an arithmetic processing circuit. The control unit 15 can transmit and receive signals between the memory unit 16 and the communication unit 17. The memory unit 16 stores information such as programs, applications, data, etc. in advance. The memory unit 16 also stores information transmitted from the provider terminal 12, information transmitted from the user terminal 13, contract contents exchanged between the provider terminal 12 and the user terminal 13, etc.
通信部17は、制御部15から出力された信号をネットワーク14を介して提供者端末12及び利用者端末13へ送信する機能と、提供者端末12及び利用者端末13からネットワーク14を介して受信する信号を制御部15へ送る機能と、を有する。さらに、通信部17は、提供者端末12及び利用者端末13以外のコンピュータとの間で、ネットワーク14を介して相互に信号の送信及び受信を行うこともできる。読取部71は、記録媒体70からプログラム、アプリケーション等を読み取り、読み取ったプログラム、アプリケーション等を制御部15へ送る。 The communication unit 17 has a function of transmitting signals output from the control unit 15 to the provider terminal 12 and the user terminal 13 via the network 14, and a function of sending signals received from the provider terminal 12 and the user terminal 13 via the network 14 to the control unit 15. Furthermore, the communication unit 17 can also transmit and receive signals to and from computers other than the provider terminal 12 and the user terminal 13 via the network 14. The reading unit 71 reads programs, applications, etc. from the recording medium 70, and sends the read programs, applications, etc. to the control unit 15.
制御部15は、コンテンツ取得部18、コンテンツ提供部19、反応認識部20、処理実行部21、コンテンツ推奨部22、反応記憶部39、顧客管理部40、対価管理部41を有する。コンテンツ取得部18は、提供者端末12から受信したコンテンツを取得し、かつ、コンテンツを処理する機能を有する。コンテンツ提供部19は、取得したコンテンツをネットワーク14上で公開する機能と、利用者端末13の要求に応じてコンテンツを利用者端末13へ提供する機能と、を有する。 The control unit 15 has a content acquisition unit 18, a content provision unit 19, a reaction recognition unit 20, a processing execution unit 21, a content recommendation unit 22, a reaction memory unit 39, a customer management unit 40, and a consideration management unit 41. The content acquisition unit 18 has a function of acquiring content received from the provider terminal 12 and processing the content. The content provision unit 19 has a function of publishing the acquired content on the network 14, and a function of providing content to the user terminal 13 in response to a request from the user terminal 13.
反応認識部20は、利用者端末13から受信する反応情報を認識及び判断する機能を有する。なお、反応情報の意味は、後述する。反応記憶部39は、反応認識部20で処理された反応情報を記憶する。処理実行部21は、反応情報に基づいて、コンテンツと利用者端末13とを関連付けた所定の処理を行う。所定の処理は、後述する。コンテンツ推奨部22は、利用者端末13へ提供されたことが無いコンテンツを、利用者端末13に対し推奨する機能を有する。 The reaction recognition unit 20 has a function of recognizing and judging reaction information received from the user terminal 13. The meaning of the reaction information will be described later. The reaction memory unit 39 stores the reaction information processed by the reaction recognition unit 20. The processing execution unit 21 performs a predetermined process that associates content with the user terminal 13 based on the reaction information. The predetermined process will be described later. The content recommendation unit 22 has a function of recommending to the user terminal 13 content that has not been provided to the user terminal 13 before.
顧客管理部40は、提供者端末12の提供者情報、及び利用者端末13の利用者情報を管理する機能を有する。また、顧客管理部40は、利用者端末13毎に、利用されるコンテンツの特徴及び傾向の判断、利用者端末13毎に、利用されたコンテンツに対する反応情報等の集積及び分析等を行う機能を有する。対価管理部41は、利用者情報に含まれる口座番号の管理、提供者情報に含まれる口座番号の管理、サーバ11がネットワーク14上で設定した預託口座の管理等を行う。また、対価管理部41は、預託口座から利用者端末13へ付与される対価の管理、利用者端末13から支払われる料金の管理、提供者端末12からサーバ11へ支払われる料金の管理等を行う。 The customer management unit 40 has a function of managing the provider information of the provider terminal 12 and the user information of the user terminal 13. The customer management unit 40 also has a function of determining the characteristics and trends of the content used for each user terminal 13, and accumulating and analyzing reaction information to the content used for each user terminal 13. The compensation management unit 41 manages the account numbers included in the user information, the account numbers included in the provider information, and the deposit account set up by the server 11 on the network 14. The compensation management unit 41 also manages the compensation granted to the user terminal 13 from the deposit account, the fees paid from the user terminal 13, and the fees paid from the provider terminal 12 to the server 11.
The control unit 15 can store the programs, applications, etc. that the reading unit 71 reads from the recording medium 70 in the storage unit 16. By starting the programs, applications, etc. read from the recording medium 70 by the reading unit 71, the control unit 15 causes the content provision unit 19, content acquisition unit 18, reaction recognition unit 20, processing execution unit 21, content recommendation unit 22, reaction storage unit 39, customer management unit 40, and compensation management unit 41 to function. The control unit 15 performs processing, comparison, and judgment based on signals input via the communication unit 17, information stored in the reaction storage unit 39, information held by the customer management unit 40, information held by the compensation management unit 41, and information stored in the storage unit 16. The control unit 15 also stores the results of the executed processing, comparison, and judgment in the storage unit 16.
Note that the server 11 can start the programs, applications, etc. recorded on the recording medium 70 while the reading unit 71 is connected to the recording medium 70. Alternatively, the server 11 can read the programs, applications, etc. recorded on the recording medium 70 with the reading unit 71, store them in the storage unit 16, and then start them even while the reading unit 71 is no longer connected to the recording medium 70.
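The division of the control unit 15 into functional units can be pictured as a simple composition of objects. The following Python sketch only illustrates that structure; the class and method names are hypothetical and the disclosure does not prescribe any particular implementation.

```python
# Illustrative structure of control unit 15 and its functional units.
# All names are hypothetical; only the division of roles follows the text.

class ContentAcquisition:      # content acquisition unit 18
    def acquire(self, content): ...
    def process(self, content): ...

class ContentProvision:        # content provision unit 19
    def publish(self, content_id): ...
    def provide(self, content_id, user_terminal_id): ...

class ReactionRecognition:     # reaction recognition unit 20
    def judge(self, reaction_info): ...

class ProcessingExecution:     # processing execution unit 21
    def run(self, content_id, user_terminal_id, judgment): ...

class ControlUnit:
    """Rough analogue of control unit 15: it wires the functional units
    together and routes signals arriving via communication unit 17."""
    def __init__(self):
        self.content_acquisition = ContentAcquisition()
        self.content_provision = ContentProvision()
        self.reaction_recognition = ReactionRecognition()
        self.processing_execution = ProcessingExecution()
```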
(Explanation of provider terminal)
The provider terminal 12 may be any of a desktop personal computer, a notebook personal computer, a smartphone, a tablet terminal, and the like. The provider terminal 12 has a control unit 23, an operation unit 24, a display unit 25, a storage unit 26, and a communication unit 27. The control unit 23 has an arithmetic processing circuit, an input port, and an output port. The control unit 23 can transmit and receive signals to and from the storage unit 26 and the communication unit 27. The storage unit 26 stores information such as programs, applications, and data in advance. The storage unit 26 also stores information transmitted from the server 11, the contents of the contract exchanged between the provider terminal 12 and the user terminal 13, and the results of processing and judgment performed by the control unit 23.
The communication unit 27 has a function of transmitting signals output from the control unit 23 to the server 11 via the network 14, and a function of sending signals received from the server 11 via the network 14 to the control unit 23. When the communication unit 27 sends a signal to the server 11, the signal is transmitted in association with the provider information of the provider terminal 12. Furthermore, the communication unit 27 can also transmit and receive signals to and from computers other than the server 11 via the network 14.
The operation unit 24 is constructed from at least one of a touch switch, a display screen such as a liquid crystal panel, a keyboard, a mouse, a scanner, and the like. When the provider operates the operation unit 24, a signal corresponding to the operation is input to the control unit 23. The display unit 25 is constructed from a touch panel, a liquid crystal panel, or the like. The display unit 25 displays information input at the provider terminal 12, information acquired from the server 11 and the user terminal 13, and the like. The content displayed on the display unit 25 is controlled and switched by the control unit 23. If the provider terminal 12 is a smartphone or a tablet terminal, the operation unit 24 corresponds to an operation menu (operation buttons) displayed on the display screen of the display unit 25. The control unit 23 performs processing, comparison, judgment, etc. based on signals input via the communication unit 27, signals corresponding to operations of the operation unit 24, and the information stored in the storage unit 26.
(Explanation of user terminal)
The user terminal 13 may be any of a desktop personal computer, a notebook personal computer, a smartphone, a tablet terminal, and the like. The user terminal 13 has a control unit 28, an operation unit 29, a display unit 30, a storage unit 31, a communication unit 32, a speaker 33, a camera 34, and a microphone 35. The control unit 28 has an arithmetic processing circuit, an input port, and an output port. The control unit 28 can transmit and receive signals to and from the storage unit 31, the communication unit 32, the speaker 33, the camera 34, and the microphone 35. The storage unit 31 stores information such as programs, applications, and data in advance. The storage unit 31 also stores information transmitted from the server 11, the contents of the contract exchanged between the provider terminal 12 and the user terminal 13, and the results of processing and judgment performed by the control unit 28.
The communication unit 32 has a function of transmitting signals output from the control unit 28 to the server 11 via the network 14, and a function of sending signals received from the server 11 via the network 14 to the control unit 28. When the communication unit 32 transmits a signal to the server 11, the signal is transmitted in association with the user information of the user terminal 13. Furthermore, the communication unit 32 can also transmit and receive signals to and from computers other than the server 11 via the network 14.
The operation unit 29 is constructed from at least one of a touch switch, a display screen such as a liquid crystal panel, a keyboard, a mouse, a scanner, and the like. When the user operates the operation unit 29, a signal corresponding to the operation is input to the control unit 28. The display unit 30 is constructed from a touch panel, a liquid crystal panel, or the like. The display unit 30 displays information input at the user terminal 13, information acquired from the server 11 and the provider terminal 12, and images included in the acquired content. The images include still images and videos. The content displayed on the display unit 30 is controlled and switched by the control unit 28. If the user terminal 13 is a smartphone or a tablet terminal, the operation unit 29 corresponds to an operation menu (operation buttons) displayed on the display screen of the display unit 30.
The camera 34 may be either directly attached to the main body of the user terminal 13 or connected to the user terminal 13 via a signal cable. The camera 34 captures the user's face and facial expression, and a signal of the captured video information is input to the control unit 28. The images captured by the camera 34 may be either still images or video. A signal including shooting information such as the shooting date and shooting time is attached to the images captured by the camera 34. The microphone 35 collects sounds such as laughter and cheers made by the user, and a signal corresponding to the collected sound and its volume is input to the control unit 28. The speaker 33 is controlled by the control unit 28 and outputs the audio and music included in the content acquired from the server 11.
(Explanation of the biometric authentication device)
The biometric authentication device 36 is a reaction information detection device that detects reaction information of a user who uses content and outputs a signal. The biometric authentication device 36 detects the reaction information of the user from the biometric authentication information and behavioral characteristics of the user. The biometric authentication device 36 includes a heart rate monitor 37 and a blood pressure monitor 38 in addition to the above-mentioned camera 34 and microphone 35. The heart rate monitor 37 measures the user's heart rate per predetermined time and outputs a signal. The blood pressure monitor 38 measures the user's blood pressure and outputs a signal. The signal output from the heart rate monitor 37 and the signal output from the blood pressure monitor 38 are processed by the control unit 28. The user's laughter and cheers, the user's facial expression when laughing, the user's facial expression when moved, etc. are the user's behavioral characteristics. The user's pulse rate and blood pressure are the user's biometric authentication information.
The control unit 28 performs processing and judgment based on signals input via the communication unit 32, signals corresponding to operations of the operation unit 29, signals input from the camera 34, signals input from the microphone 35, signals input from the heart rate monitor 37, signals input from the blood pressure monitor 38, and the programs and applications stored in the storage unit 31. The control unit 28 also sends signals corresponding to the results of the processing and judgment to the network 14 via the communication unit 32.
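One way to picture the signal the control unit 28 assembles from the camera 34, microphone 35, heart rate monitor 37, and blood pressure monitor 38 is as a time-stamped reaction record transmitted with the user information. The sketch below is only an illustration; the dataclass and field names are assumptions, not part of the disclosure.

```python
# Minimal sketch of a reaction-information record assembled on the user
# terminal 13 while content is being used. All field names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ReactionSample:
    timestamp_s: float        # seconds from the start of content playback
    heart_rate_bpm: float     # from heart rate monitor 37
    blood_pressure: float     # from blood pressure monitor 38 (e.g. systolic)
    audio_volume_db: float    # loudness picked up by microphone 35
    face_frame_id: str        # reference to a frame captured by camera 34

@dataclass
class ReactionInfo:
    user_id: str              # ties the record to the user information
    content_id: str           # content being used
    samples: List[ReactionSample] = field(default_factory=list)

    def add_sample(self, sample: ReactionSample) -> None:
        self.samples.append(sample)
```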
(Processing example)
An example of processing performed in the information processing system 10 will be described with reference to Fig. 2. In the processing example shown in Fig. 2, the server 11 executes the processing and judgment steps by starting the programs and applications acquired from the recording medium 70. The user terminal 13 inputs user information in step S10 and transmits it to the server 11. The user information includes the identification number of the user terminal 13, the user's name, address, date of birth, age, password, user ID, telephone number, email address, bank or post office account number, account numbers for electronic money and crypto assets, and the like. Crypto assets include virtual currencies.
Furthermore, while content is not being used on the user terminal 13, the user's face (facial expression) is photographed by the camera 34, the user's heart rate is measured by the heart rate monitor 37, and the user's blood pressure is measured by the blood pressure monitor 38. In step S10, a signal including the image information of the user's face photographed by the camera 34, the user's heart rate per predetermined time, the user's blood pressure, and the like is transmitted from the user terminal 13 to the server 11.
The provider terminal 12 inputs provider information in step S11 and transmits it to the server 11. The provider information includes the identification number of the provider terminal 12, the provider's name (title), address, password, user ID, telephone number, email address, bank or post office account number, account numbers for electronic money and crypto assets, and the like. Crypto assets include virtual currencies. In step S12, the server 11 processes and stores the information transmitted from the user terminal 13. The server 11 also processes and stores the information transmitted from the provider terminal 12.
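If one were to serialize the registration information of steps S10 and S11, it might look like the following. This is only a sketch of the listed fields; the key names, the JSON layout, and all values are illustrative assumptions.

```python
# Hypothetical serialization of the registration data sent in steps S10/S11.
import json

user_info = {
    "terminal_id": "UT-0001",        # identification number of user terminal 13
    "name": "Taro Yamada",           # illustrative value
    "address": "Tokyo, Japan",
    "date_of_birth": "1990-01-01",
    "age": 34,
    "user_id": "taro90",
    "phone": "+81-90-0000-0000",
    "email": "taro@example.com",
    "bank_account": "1234567",
    "crypto_account": "wallet-001",  # electronic money / crypto asset account
}

provider_info = {
    "terminal_id": "PT-0001",        # identification number of provider terminal 12
    "name": "Example Studio",
    "address": "Osaka, Japan",
    "user_id": "studio01",
    "phone": "+81-6-0000-0000",
    "email": "studio@example.com",
    "bank_account": "7654321",
}

payload = json.dumps({"user": user_info, "provider": provider_info})
```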
The user terminal 13 inputs the contract details in step S13 and transmits them to the server 11, and the provider terminal 12 inputs the contract details in step S14 and transmits them to the server 11. In step S15, the server 11 acquires and stores the contract details exchanged between the user terminal 13 and the provider terminal 12. The contract details include the following. For example, they include a contract whereby the user terminal 13 pays compensation (a usage fee) to the provider terminal 12 when content is used at the user terminal 13 and the user shows a reaction while using the content. They also include a contract whereby the user terminal 13 does not pay compensation to the provider terminal 12 when the user shows no reaction while using the content. Here, "a reaction while using the content" includes at least one of the user's face smiling, the user laughing aloud, and the user cheering.
Furthermore, the contract details include a contract that changes the compensation paid from the user terminal 13 to the provider terminal 12, when the user shows a reaction while using the content, depending on the level of the reaction. For example, the higher the level of the reaction, the larger the compensation. The compensation paid from the user terminal 13 to the provider terminal 12 includes cash, crypto assets, points, and the like. Furthermore, in step S15, the server 11 performs a process of receiving, as a deposit, the compensation specified in the contract details. Note that steps S10, S11, and S12 may be performed after steps S13, S14, and S15 have been performed.
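The contract terms exchanged in steps S13 to S15 can be thought of as a small rule: pay when a reaction is detected, pay nothing otherwise, and scale the amount with the reaction level. A minimal sketch under that reading follows; the linear scaling and the numbers are assumptions, since the disclosure does not fix a formula.

```python
# Sketch of contract terms stored by the server 11 in step S15.
# The linear scaling by reaction level is an assumption for illustration.
from dataclasses import dataclass

@dataclass
class ContractTerms:
    base_fee: float              # fee paid to the provider when a reaction occurs
    per_level_bonus: float       # extra fee added per reaction-level step
    currency: str = "JPY"        # could equally be crypto assets or points

    def fee_for(self, reacted: bool, reaction_level: int) -> float:
        """No reaction means no payment; otherwise the fee grows with the level."""
        if not reacted:
            return 0.0
        return self.base_fee + self.per_level_bonus * reaction_level

terms = ContractTerms(base_fee=100.0, per_level_bonus=20.0)
print(terms.fee_for(reacted=True, reaction_level=3))   # 160.0
```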
After the server 11 has performed the processes of steps S12 and S15, the provider terminal 12 transmits the content to the server 11 in step S16. The server 11 acquires the content in step S17 and stores it in the storage unit 16. The server 11 also publishes on the network 14, in step S17, that the acquired content can be viewed from outside.
In step S18, the user terminal 13 can check a list of the content published on the network 14. The list of content is displayed on the display unit 30. In addition, when an operation to select any of the content is performed with the operation unit 29 of the user terminal 13, a request to acquire the selected content can be sent to the server 11 in step S18. The server 11 that has received the request to acquire the content then sends the content to the user terminal 13 in step S19. In addition, in step S19, the server 11 stores in the storage unit 16 the history of requests to acquire content received from the user terminal 13 and the history of content sent to the user terminal 13, in association with the user information of the user terminal 13.
In step S20, the user terminal 13 acquires the content from the server 11 and uses the content. Using the content includes displaying still images or videos on the display unit 30, outputting the music and audio contained in the content from the speaker 33, and the like. This allows the user to enjoy the provided content. In addition, while the user is using the content, the biometric authentication device 36 detects the user's reaction information and outputs a signal.
The reaction information detected by the biometric authentication device 36 is processed in the user terminal 13 in step S21. The user terminal 13 transmits the reaction information to the server 11 in step S22. In step S23, the server 11 acquires the reaction information from the user terminal 13, the reaction recognition unit 20 processes and judges the reaction information, and the judgment result is stored in the storage unit 16. Specifically, by processing the reaction information, which includes changes in the facial expression of the user of the user terminal 13, the heart rate of the user stored in step S12, the blood pressure of the user stored in step S12, and the laughter and cheers emitted by the user, the reaction recognition unit 20 can judge whether the user of the user terminal 13 reacted, the number of times the user reacted within a predetermined time, the timing at which the user reacted, and the like. The presence or absence of a reaction of the user of the user terminal 13 refers to whether the user laughed, whether the user cheered, and whether the user became excited. Whether the user laughed can be determined from changes in facial expression and the presence or absence of laughter. The number of times the user reacted within a predetermined time refers to the number of times the user laughed, the number of times the user cheered, and the number of times the user became excited within the predetermined time. The timing at which the user reacted refers to the timing of the user's reaction between the start and the end of use of the content.
Changes in the facial expression of the user of the user terminal 13, and whether the user laughed, can be determined by the reaction recognition unit 20 performing face recognition processing that compares the images captured by the camera 34 with the image of the user's face stored in advance in step S12. Face recognition processing is publicly known, as described in JP 2005-275605 A, JP 2005-352892 A, JP 2007-148968 A, and the like, so a detailed explanation is omitted. The reaction recognition unit 20 judges the level of the user's laughter from the loudness (volume) of the laughter of the user of the user terminal 13 while the content is being used. Specifically, the louder the laughter, the higher the judged level of laughter. The reaction recognition unit 20 also compares the heart rate of the user of the user terminal 13 when the content is not being used with the heart rate of the user when the content is being used, and judges whether the user of the user terminal 13 is excited, the level of the user's excitement, and the like.
The reaction recognition unit 20 also compares the blood pressure of the user of the user terminal 13 when the content is not being used with the blood pressure of the user when the content is being used, and judges whether the user of the user terminal 13 is excited, the level of the user's excitement, and the like. For example, it judges that the greater the increase in heart rate, the higher the level of excitement, and that the greater the increase in blood pressure, the higher the level of excitement. In step S23, the server 11 also creates evaluation information of the user of the user terminal 13 regarding the content based on the reaction information acquired from the user terminal 13, and stores the evaluation information in the storage unit 16. The evaluation information associates the content with the presence or absence of a reaction of the user of the user terminal 13 to the content and with the reaction level of the user of the user terminal 13.
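The judgments described for the reaction recognition unit 20 (laughter level from loudness, excitement from the rise in heart rate and blood pressure over the no-content baseline) amount to simple comparisons against baselines and thresholds. The following sketch illustrates that idea only; the threshold values and the way levels are bucketed are assumptions.

```python
# Illustrative reaction-recognition logic (reaction recognition unit 20).
# Baselines come from step S12; thresholds here are made-up example values.

def laughter_level(volume_db: float) -> int:
    """Louder laughter means a higher laughter level (0 = no laughter detected)."""
    if volume_db < 40:
        return 0
    if volume_db < 55:
        return 1
    if volume_db < 70:
        return 2
    return 3

def excitement_level(baseline_hr: float, hr: float,
                     baseline_bp: float, bp: float) -> int:
    """Excitement grows with the rise of heart rate and blood pressure
    relative to the baseline measured while no content was in use."""
    hr_rise = max(0.0, hr - baseline_hr)
    bp_rise = max(0.0, bp - baseline_bp)
    level = 0
    if hr_rise > 10 or bp_rise > 10:
        level = 1
    if hr_rise > 25 or bp_rise > 20:
        level = 2
    return level

def reacted(laugh: int, excite: int, cheered: bool, smiled: bool) -> bool:
    """Step S23-style judgment: any detected sign counts as a reaction."""
    return laugh > 0 or excite > 0 or cheered or smiled
```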
In step S24, the server 11 judges whether the contract details are to be executed. The server 11 makes the judgment of step S24 based on the judgment result for the reaction information. For example, if there was a reaction from the user of the user terminal 13, it judges "execute the contract details," and if there was no reaction from the user of the user terminal 13, it judges "do not execute the contract details." Here, if at least one of a smiling face, laughter, a cheer, an increase in heart rate, an increase in blood pressure, or the like is detected in the reaction information, it is judged that there was a reaction from the user of the user terminal 13. Conversely, if none of a smiling face, laughter, a cheer, an increase in heart rate, an increase in blood pressure, or the like is detected in the reaction information, it is judged that there was no reaction from the user of the user terminal 13.
The server 11 may also judge "execute the contract details" if the number of times the user of the user terminal 13 reacted within a predetermined time, or the reaction level of the user of the user terminal 13, exceeds a threshold value stored in advance in the storage unit 16. Conversely, the server 11 may judge "do not execute the contract details" if the number of times the user of the user terminal 13 reacted within a predetermined time, or the reaction level of the user of the user terminal 13, is equal to or below the threshold value stored in advance in the storage unit 16.
Furthermore, when the server 11 judges "execute the contract details" in step S24, it may change the "degree of execution of the contract details" according to the number of times the user of the user terminal 13 reacted within a predetermined time or the reaction level of the user of the user terminal 13. Here, the "degree of execution of the contract details" includes making the amount of compensation paid larger the higher the reaction level, making the amount of crypto assets paid larger the higher the reaction level, making the number of points granted larger the higher the reaction level, and the like.
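Step S24 can then be read as: compare the reaction count or level against a threshold held in the storage unit 16, and scale the degree of execution with the level. A minimal sketch under that reading follows; the threshold value and the mapping from level to degree are illustrative assumptions.

```python
# Sketch of the step S24 judgment. The threshold value and the mapping from
# reaction level to "degree of execution" are illustrative assumptions.

REACTION_THRESHOLD = 3   # assumed value stored in advance in the storage unit 16

def judge_contract_execution(reaction_count: int, reaction_level: int):
    """Return whether the contract is executed and a degree factor that the
    compensation management unit 41 could apply to the agreed compensation."""
    if reaction_count <= REACTION_THRESHOLD and reaction_level <= REACTION_THRESHOLD:
        return False, 0          # "do not execute the contract details"
    degree = 1 + reaction_level  # higher level leads to more compensation/points
    return True, degree

executed, degree = judge_contract_execution(reaction_count=5, reaction_level=2)
print(executed, degree)          # True 3
```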
When the server 11 judges "execute the contract details" in step S24, the server 11 performs, in step S24, the procedures specified in the contract details. That is, it performs processes such as payment and settlement of cash or crypto assets to the account of the user of the user terminal 13, granting of points to the user terminal 13, and payment and settlement to the provider's account. In step S25, the server 11 transmits, to the user terminal 13 and the provider terminal 12, information indicating that the procedures specified in the contract details were performed or that they were not performed. Furthermore, the information transmitted to the provider terminal 12 in step S25 may include the evaluation information of the user of the user terminal 13 regarding the content. The user terminal 13 acquires the information from the server 11 in step S26. The provider terminal 12 acquires the information from the server 11 in step S27.
In this way, the reaction information that the server 11 acquires from the user terminal 13 has been detected by the biometric authentication device 36. Therefore, the inclusion of the user's subjective judgment in the reaction information acquired by the server 11 can be suppressed. In addition, the provider terminal 12 can acquire, in step S27, evaluation information on the content provided to the user terminal 13. The provider who manages the provider terminal 12 can therefore use the evaluation information acquired from the server 11 when promoting its own products, that is, its content, on the network 14 through social networking services (SNS), TikTok (registered trademark), YouTube (registered trademark), and the like. Furthermore, the provider can use the reaction information acquired from the server 11 for business-to-business transactions via other computers connected to the network 14. The server 11 can also send, in step S25, a message to the provider terminal 12 such as "If the reaction of the user terminals is good, we will publish your company's advertisement on the network free of charge."
(Other Processing Example 1)
Furthermore, the information processing system 10 can also perform the processing of Fig. 3(A) on the premise that steps S17, S19, and S23 of Fig. 2 have been performed. In the processing example shown in Fig. 3(A), the server 11 executes the processing and judgment steps by starting the programs and applications acquired from the recording medium 70.
In step S30, the server 11 selects, from among the content that the user terminal 13 has no history of acquiring, other content that is presumed to yield reaction information similar to the reaction information for the content acquired by the user terminal 13. The server 11 then recommends the other content selected in step S30 to the user terminal 13 in step S31. The user terminal 13 can display the recommended information on the display unit 30 and issue a request to acquire the other content in step S18. This makes it easier for the user of the user terminal 13 to find content that matches his or her preferences.
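Step S30 needs some notion of "content expected to yield similar reaction information". One simple, assumed realization is to describe each content item by an average reaction profile over past users and recommend unseen items closest to the profiles the terminal reacted to; nothing in the disclosure fixes this particular measure or the profile fields.

```python
# Illustrative step S30: pick unseen content whose average reaction profile is
# closest to content the user terminal already reacted to. The profile fields
# and the distance measure are assumptions for this sketch.

def distance(p: dict, q: dict) -> float:
    keys = ("laugh_count", "cheer_count", "excitement_level")
    return sum((p[k] - q[k]) ** 2 for k in keys) ** 0.5

def recommend(used_profiles: list, catalog: dict, used_ids: set, top_n: int = 3):
    """catalog maps content_id to the average reaction profile of past users."""
    scored = []
    for content_id, profile in catalog.items():
        if content_id in used_ids:
            continue                      # only content with no acquisition history
        best = min(distance(profile, p) for p in used_profiles)
        scored.append((best, content_id))
    return [cid for _, cid in sorted(scored)[:top_n]]

catalog = {
    "A": {"laugh_count": 8, "cheer_count": 1, "excitement_level": 2},
    "B": {"laugh_count": 0, "cheer_count": 5, "excitement_level": 3},
}
print(recommend([{"laugh_count": 7, "cheer_count": 0, "excitement_level": 2}],
                catalog, used_ids={"C"}))   # ['A', 'B']
```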
In the control example of Fig. 2, the following processing may also be performed. When a user terminal 13A is connected to the server 11, the user terminal 13A can execute the contents of steps S10, S13, S18, S20, S21, S22, and S26 of Fig. 2. The server 11 executes the contents of steps S12, S15, S19, S23, S24, and S25 based on the information transmitted from the user terminal 13A. In step S23, the control unit 15 of the server 11 processes and stores the reaction information transmitted from a plurality of user terminals 13A. Here, the users of the plurality of user terminals 13A are stratified into a plurality of age groups. Furthermore, the average heart rate and the average blood pressure are calculated and stored for each age group. In addition, for each item of content provided to the plurality of user terminals 13A, the presence or absence of a reaction from the user, the number of times the user laughed, the number of times the user reacted within a predetermined time, the timing of the user's reactions, and the like are stored.
Then, when the user terminal 13 is connected to the server 11, the control unit 15 can execute the following processing in step S23. By comparing the average heart rate for each age group of users that the server 11 acquired from the user terminals 13A, and the average blood pressure for each age group of users that the server 11 acquired from the user terminals 13A, with the heart rate and blood pressure of the user transmitted from the user terminal 13 in step S22, it is possible to judge whether there was a reaction from the user of the user terminal 13, and to judge whether the user of the user terminal 13 is excited, the level of the user's excitement, and the like.
Furthermore, the average heart rate and average blood pressure for each age group published by a public institution may be stored in advance in the storage unit 16 of the server 11. The control unit 15 of the server 11 can then compare the average heart rate for each age group stored in the storage unit 16 and the average blood pressure for each age group stored in the storage unit 16 with the user's heart rate and blood pressure transmitted from the user terminal 13 in step S22, judge whether there was a reaction from the user of the user terminal 13, and judge whether the user of the user terminal 13 is excited, the level of the user's excitement, and the like.
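Whether the baselines come from the server's own accumulated per-age-group averages or from published public statistics, the comparison in step S23 is the same: measure the rise of the incoming heart rate and blood pressure over the average for the user's age group. A small sketch follows, with made-up averages and thresholds.

```python
# Illustrative comparison against per-age-group baselines (step S23).
# The averages and thresholds below are made-up example values.

AGE_GROUP_BASELINES = {           # could be accumulated from terminals 13A
    "20s": {"heart_rate": 68.0, "blood_pressure": 115.0},
    "40s": {"heart_rate": 70.0, "blood_pressure": 122.0},
    "60s": {"heart_rate": 72.0, "blood_pressure": 130.0},
}

def excitement_from_baseline(age_group: str, heart_rate: float, blood_pressure: float) -> int:
    base = AGE_GROUP_BASELINES[age_group]
    hr_rise = heart_rate - base["heart_rate"]
    bp_rise = blood_pressure - base["blood_pressure"]
    if hr_rise <= 5 and bp_rise <= 5:
        return 0                  # no reaction detected
    return 2 if (hr_rise > 20 or bp_rise > 15) else 1

print(excitement_from_baseline("40s", heart_rate=95.0, blood_pressure=135.0))  # 2
```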
(Other Processing Example 2)
Furthermore, the information processing system 10 can also perform the processing of Fig. 3(B) on the premise that steps S17, S19, and S23 of Fig. 2 have been performed. In the processing example shown in Fig. 3(B), the server 11 executes the processing and judgment steps by starting the programs and applications acquired from the recording medium 70. In step S40, the server 11 can detect whether, among the plurality of user terminals 13 that used the same content, there are user terminals 13 for which similar reaction information was acquired. Similar reaction information includes laughing at the same timing during use of the content, cheering at the same timing, becoming excited at the same timing, the difference in the number of laughs being a predetermined number or less, the difference in the number of cheers being a predetermined number or less, the difference in the number of times of excitement being a predetermined number or less, and the like. When at least one of these is detected, it can be judged that there are user terminals 13 for which similar reaction information was acquired.
Then, in step S41, the server 11 can transmit, to each of the plurality of user terminals 13 for which similar reaction information was acquired for the same content, auxiliary information such as "There is another user terminal that showed similar reaction information for the content you used." In step S42, the plurality of user terminals 13 can acquire the auxiliary information and display it on the display unit 30.
Performing the processing of Fig. 3(B) gives a plurality of user terminals 13, 13A a clue for finding, on the network 14, other user terminals 13A that showed similar reactions to the same content and for connecting with one another. The plurality of user terminals 13, 13A can search for partners through, for example, an application programming interface (API), a social networking service (SNS), a matching app, or the like. Users can therefore easily find user terminals 13 that show similar reactions when the content is used. Note that the degree of the information sent to each user terminal 13, 13A, for example the degree of connection, may be changed according to the degree to which similar reaction information was acquired for the same content. For example, the louder the laughter, the stronger the degree of connection; the longer the user laughed, the stronger the degree of connection; and the more times the user laughed, the stronger the degree of connection.
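The matching in step S40 compares, per content item, the timing and counts of reactions across terminals, and the note above allows the "strength" of the suggested connection to grow with how closely they match. The sketch below is one possible reading; the tolerance values and the strength formula are assumptions.

```python
# Illustrative step S40 matching of terminals with similar reaction information
# for the same content, plus an assumed connection-strength score.

def similar(r1: dict, r2: dict, count_tol: int = 2, time_tol_s: float = 3.0) -> bool:
    """r1/r2: {'laugh_times': [...], 'laugh_count': int, 'cheer_count': int}"""
    if abs(r1["laugh_count"] - r2["laugh_count"]) <= count_tol:
        return True
    if abs(r1["cheer_count"] - r2["cheer_count"]) <= count_tol:
        return True
    # "laughed at the same timing": any pair of laugh timestamps close enough
    return any(abs(a - b) <= time_tol_s
               for a in r1["laugh_times"] for b in r2["laugh_times"])

def connection_strength(r1: dict, r2: dict) -> float:
    """Assumed scoring: more shared laughs means a stronger suggested connection."""
    shared = min(r1["laugh_count"], r2["laugh_count"])
    return shared / (1 + abs(r1["laugh_count"] - r2["laugh_count"]))

a = {"laugh_times": [12.0, 40.5], "laugh_count": 2, "cheer_count": 0}
b = {"laugh_times": [12.8, 90.0], "laugh_count": 3, "cheer_count": 1}
if similar(a, b):
    print(connection_strength(a, b))   # 1.0
```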
(Other Processing Example 3)
On the premise that steps S17, S19, and S23 of Fig. 2 have been performed, the information processing system 10 can execute the processing of Fig. 4(A). In the processing example shown in Fig. 4(A), the server 11 executes the processing and judgment steps by starting the programs and applications acquired from the recording medium 70. In step S50, the server 11 randomly selects a user terminal based on the usage history of the content on the first user terminal 13 and the history of reaction information when the content was used on the first user terminal 13. The user terminal to be selected is a second user terminal 13A that has no history of acquiring the content acquired by the first user terminal 13 and that is different from the first user terminal 13. The second user terminal 13A may be a single terminal or a plurality of terminals.
Then, in step S51, the server 11 performs a process of transmitting information recommending the new content to the second user terminal 13A selected in step S50. The second user terminal 13A obtains the recommended information and, in step S18, can transmit a request to acquire the content. This makes it easier for the user of the second user terminal 13A to search for new content that matches his or her preferences.
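Step S50 only requires choosing, at random, terminals that have never acquired the content the first terminal used and reacted to. A minimal sketch of that selection follows; how many terminals are chosen, and the history structure, are assumptions.

```python
# Illustrative step S50: randomly pick second user terminals 13A that have no
# acquisition history for the content used on the first user terminal 13.
import random

def pick_second_terminals(content_id: str, first_terminal: str,
                          acquisition_history: dict, k: int = 2):
    """acquisition_history maps terminal_id to the set of acquired content_ids."""
    candidates = [t for t, acquired in acquisition_history.items()
                  if t != first_terminal and content_id not in acquired]
    return random.sample(candidates, min(k, len(candidates)))

history = {"UT-1": {"A", "B"}, "UT-2": {"C"}, "UT-3": set(), "UT-4": {"A"}}
print(pick_second_terminals("A", first_terminal="UT-1", acquisition_history=history))
```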
(Other Processing Example 4)
The information processing system 10 can execute the processing of Fig. 4(B) on the premise that steps S17, S19, and S23 of Fig. 2 have been performed. In the processing example shown in Fig. 4(B), the server 11 executes the processing and judgment steps by starting the programs and applications acquired from the recording medium 70.
In step S60, the server 11 selects other content for which it has received no acquisition request from the specified user terminal 13 and which is presumed to yield reaction information similar to the reaction information for the content acquired by the specified user terminal 13. In step S61, the server 11 then recommends the other content to the specified user terminal 13. It also sends a message to the specified user terminal 13 such as "If the recommended content is not interesting, we will grant you compensation."
Then, processing similar to steps S18, S20, S21, and S22 of Fig. 2 is performed at the specified user terminal 13. The server 11 then proceeds to step S23, and if there was no reaction from the user, the server 11 can decide in step S62 to grant compensation to the specified user terminal 13. The user of the specified user terminal 13 can therefore obtain the compensation. Note that if a reaction from the user was detected in step S23, the server 11 decides in step S62 not to grant compensation to the specified user terminal 13.
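Under this reading, step S62 is the inverse of the usual payment rule: compensation is granted only when the recommended content drew no reaction. A short sketch follows; the amount is an illustrative value.

```python
# Illustrative step S62 decision for "Other Processing Example 4":
# grant compensation only when the recommended content produced no reaction.

def step_s62(reaction_detected: bool, promised_amount: float = 50.0) -> float:
    """Return the compensation granted to the user terminal 13 (0.0 if none)."""
    return 0.0 if reaction_detected else promised_amount

print(step_s62(reaction_detected=False))  # 50.0 -> the content was not interesting
print(step_s62(reaction_detected=True))   # 0.0
```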
An example of the technical meaning of the matters described in this embodiment is as follows. The predetermined process includes executing the contract exchanged between the provider terminal 12 and the user terminal 13 in step S24 of Fig. 2, the processing of step S31 of Fig. 3(A), the processing of step S41 of Fig. 3(B), the processing of step S51 of Fig. 4(A), the processing of step S61 of Fig. 4(B), and the like.
The server 11 is an example of an information processing device. The content acquisition unit 18 can also be understood as a content acquisition circuit or a content acquirer. The content provision unit 19 can also be understood as a content provision circuit or a content provider. The reaction recognition unit 20 can also be understood as a reaction processing circuit or a reaction processor. The processing execution unit 21 can also be understood as a processing execution circuit or a processing executor. The customer management unit 40 can also be understood as a customer management circuit or a customer manager. The content recommendation unit 22 can also be understood as a content recommendation circuit or a content recommender. The biometric authentication device 36 can also be understood as a reaction detection circuit or a reaction detector.
The processing shown in Fig. 2, the processing shown in Figs. 3(A) and 3(B), and the processing shown in Figs. 4(A) and 4(B) are each an example of an information processing method. Step S17 of Fig. 2 is an example of a content acquisition step. Step S19 is an example of a content provision step. Step S23 is an example of a reaction recognition step. Step S24 is an example of a processing execution step. The content acquisition step can also be understood as content acquisition means. The content provision step can also be understood as content provision means. The reaction recognition step can also be understood as reaction recognition means. The processing execution step can also be understood as processing execution means.
This embodiment is not limited to what is disclosed using the drawings, and various modifications are possible without departing from the gist of the present invention. For example, the information processing system may be understood as an information management system. The information processing device may likewise be understood as an information management device. In this embodiment, the biometric authentication device may be either a device provided in the user terminal itself or an external device provided separately from the user terminal. The information processing device is a computer that performs various processes, judgments, decisions, and the like based on information including input signals, pre-stored programs, applications, data, and the like. The computer serving as the information processing device includes at least one of a server, a workstation, a mainframe, a supercomputer, and the like.
The present disclosure can be used in an information processing device, information processing method, program, and recording medium that provide information prepared by an information provider to a user terminal used by a user via a network.
11...server, 12...provider terminal, 18...content acquisition unit, 19...content provision unit, 20...reaction recognition unit, 21...processing execution unit, 22...content recommendation unit, 40...customer management unit, 70...recording medium
Claims (10)

1. An information processing device comprising:
a content acquisition unit that acquires content from a provider terminal via a network;
a content providing unit that provides the content acquired by the content acquisition unit to a user terminal via the network;
a reaction recognition unit that acquires reaction information of a user based on at least one of biometric authentication information of the user when the user uses the content on the user terminal and video information capturing a facial expression of the user; and
a processing execution unit that performs a predetermined process of associating the content with the user terminal based on the reaction information acquired by the reaction recognition unit.

2. The information processing device according to claim 1, further comprising a customer management unit that stores a contract concluded between the provider terminal and the user terminal, wherein the predetermined process performed by the processing execution unit includes executing the contract concluded between the provider terminal and the user terminal.

3. The information processing device according to claim 1, further comprising a content recommendation unit that recommends to the user terminal other content estimated to yield reaction information similar to the reaction information obtained when the content is used on the user terminal.

4. The information processing device according to claim 1, wherein the content providing unit is capable of providing the content to a plurality of the user terminals via the network, and the information processing device further comprises a content recommendation unit that recommends the content used on a first user terminal among the plurality of user terminals to a second user terminal other than the first user terminal, based on a usage history of the content on the first user terminal and reaction information obtained when the content is used on the first user terminal.

5. The information processing device according to claim 1, wherein the content providing unit is capable of providing the content to a plurality of the user terminals via the network, and the predetermined process performed by the processing execution unit includes transmitting, to each of the user terminals that used the same content and for which similar reaction information was acquired, auxiliary information indicating that there is another user terminal for which similar reaction information was acquired for the provided content.

6. The information processing device according to claim 1, wherein the biometric authentication information of the user includes at least one of the user's heart rate and the user's blood pressure.

7. The information processing device according to claim 1, wherein the content includes digital content of at least one of video and music.

8. An information processing method executed by an information processing device connected to a provider terminal and a user terminal via a network, the method comprising:
a content acquisition step of acquiring content from the provider terminal via the network;
a content providing step of providing the content acquired in the content acquisition step to the user terminal via the network;
a reaction recognition step of acquiring reaction information of a user based on at least one of biometric authentication information of the user when the user uses the content on the user terminal and video information capturing a facial expression of the user; and
a processing execution step of performing a predetermined process of associating the content with the user terminal based on the reaction information acquired in the reaction recognition step.

9. A program that causes an information processing device connected to a provider terminal and a user terminal via a network to function as:
a content acquisition unit that acquires content from the provider terminal via the network;
a content providing unit that provides the content acquired by the content acquisition unit to the user terminal via the network;
a reaction recognition unit that acquires reaction information of a user based on at least one of biometric authentication information of the user when the user uses the content on the user terminal and video information capturing a facial expression of the user; and
a processing execution unit that performs a predetermined process of associating the content with the user terminal based on the reaction information acquired by the reaction recognition unit.

10. A recording medium on which is recorded a program that causes an information processing device connected to a provider terminal and a user terminal via a network to function as:
a content acquisition unit that acquires content from the provider terminal via the network;
a content providing unit that provides the content acquired by the content acquisition unit to the user terminal via the network;
a reaction recognition unit that acquires reaction information of a user based on at least one of biometric authentication information of the user when the user uses the content on the user terminal and video information capturing a facial expression of the user; and
a processing execution unit that performs a predetermined process of associating the content with the user terminal based on the reaction information acquired by the reaction recognition unit.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/036944 WO2024075150A1 (en) | 2022-10-03 | 2022-10-03 | Information processing device, information processing method, program, and recording medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024075150A1 true WO2024075150A1 (en) | 2024-04-11 |
Family
ID=90607727
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/036944 (WO2024075150A1, Ceased) | Information processing device, information processing method, program, and recording medium | 2022-10-03 | 2022-10-03 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024075150A1 (en) |
- 2022-10-03: PCT/JP2022/036944 filed as WO2024075150A1 (en), status: Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011217209A (en) * | 2010-03-31 | 2011-10-27 | Sony Corp | Electronic apparatus, content recommendation method, and program |
| JP2012009957A (en) * | 2010-06-22 | 2012-01-12 | Sharp Corp | Evaluation information report device, content presentation device, content evaluation system, evaluation information report device control method, evaluation information report device control program, and computer-readable recording medium |
| JP2020039029A (en) * | 2018-09-03 | 2020-03-12 | グリー株式会社 | Video distribution system, video distribution method, and video distribution program |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22961342; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24.07.2025) |