WO2014007020A1 - Information processing device and method, and program - Google Patents
Information processing device and method, and program
- Publication number
- WO2014007020A1 (PCT/JP2013/065544)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- image
- posting
- unit
- program
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/4728—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
Definitions
- the present invention relates to an information processing apparatus, an information processing method, and a program to be executed by the information processing apparatus.
- Patent Document 2 discloses a technique that can detect a person's face as a subject in an image and display the person's name. Using the technique disclosed in Patent Document 2, the user can quickly start writing a comment to be posted about a person whose name he or she cannot immediately recall even when looking at the person's face.
- JP 2000-324416 A
- JP 2003-150617 A
- However, a portable terminal has a small operation area for inputting characters, and it takes time to input not only comments but also the names of people and program titles. By the time the user finishes writing a comment, the scene may already be over, making it difficult to post in a timely manner.
- One object of the present invention is to provide an information processing apparatus, an information processing method, and a program to be executed by a computer that allow posting about a program being distributed or broadcast easily and quickly.
- An information processing apparatus according to the present invention includes: a display unit that displays a program being distributed or broadcast and a posting column for the program; an operation unit that detects an operation on an image displayed by the display unit and outputs information indicating the position of the detection region, which is the region where the operation is detected; an image recognition unit that extracts an image corresponding to the position of the detection region from the program image displayed on the display unit; a communication unit that communicates via a network with a server that distributes the information displayed in the posting column; and a control unit that, upon receiving the image extracted by the image recognition unit, causes the display unit to display posting information related to the image, and that transmits the posting information to the server via the communication unit when it recognizes, based on the movement-path information of the detection region received from the operation unit, that an operation for moving the image of the posting information to the posting column has been performed.
- An information processing method according to the present invention is executed by an information processing apparatus having a display unit and an operation unit. The method displays a program being distributed or broadcast and a posting column for the program on the display unit; when the operation unit outputs information indicating the position of the detection region, which is the region where an operation on the image displayed on the display unit is detected, extracts an image corresponding to the position of the detection region from the program image displayed on the display unit; and, when it recognizes from the movement path of the detection region that an operation for moving the image of the posting information to the posting column has been performed, transmits the posting information to the server that distributes the information displayed in the posting column.
- A program according to the present invention causes an information processing apparatus having a display unit and an operation unit to execute: a procedure for displaying a program being distributed or broadcast and a posting column for the program on the display unit; a procedure for, when the operation unit outputs information indicating the position of the detection region, which is the region where an operation on the image displayed on the display unit is detected, extracting an image corresponding to the position of the detection region from the program image displayed on the display unit; and a procedure for, when it is recognized from the movement path of the detection region that an operation for moving the image of the posting information to the posting column has been performed, transmitting the posting information to the server that distributes the information displayed in the posting column.
- FIG. 1 is a block diagram showing a configuration example of an information processing apparatus according to an embodiment of the present invention.
- FIG. 2 is a block diagram for explaining an information processing system for posting to a broadcast program using the portable terminal of the present embodiment.
- FIG. 3 is a block diagram illustrating a configuration example of the mobile terminal according to the present embodiment.
- FIG. 4 is an external perspective view showing an example of the mobile terminal shown in FIG. 3.
- FIG. 5 is an exploded perspective view showing a state in which a part of the mobile terminal shown in FIG. 4 is disassembled.
- FIG. 6 is a diagram illustrating an example of an image displayed on the display unit.
- FIG. 7 is a diagram for explaining coordinates set in common to the touch panel, the display unit, and the image recognition unit.
- FIG. 8A is a diagram illustrating an example of an image when a candidate list is displayed as a search result.
- FIG. 8B is a diagram for explaining a method for the user to select post information from the candidate list shown in FIG. 8A.
- FIG. 9 is a diagram illustrating an example where the operation article passes through the mark area.
- FIG. 10 is a diagram illustrating an example of post information that can be input in the case of a route via the mark A.
- FIG. 11 is a diagram illustrating an example when the control unit displays an image for character input on the display unit.
- FIG. 12 is a diagram illustrating an example of a series of operations from selection of a person image to transmission of post information.
- FIG. 13 is a flowchart showing an operation procedure of the mobile terminal according to the present embodiment.
- FIG. 14A is a diagram showing a specific example of a user operation from selection of information about a person in a program to posting.
- FIG. 14B is a diagram showing a specific example of a user operation from selection of information about a person in a program to posting.
- FIG. 14C is a diagram showing a specific example of user operations from selection of information about a person in a program to posting.
- FIG. 14D is a diagram illustrating a specific example of a user operation from selection of information about a person in a program to posting.
- FIG. 14E is a diagram showing a specific example of user operations from selection of information about a person in a program to posting.
- FIG. 14F is a diagram illustrating a specific example of user operations from selection of information about a person in a program to posting.
- FIG. 14G is a diagram illustrating a specific example of user operations from selection of information about a person in a program to posting.
- FIG. 15A is a diagram for explaining a case where link information of an official home page of a person in a program can be attached to posted information.
- FIG. 15B is a diagram for explaining a case where link information of an official home page of a person in a program can be attached to posted information.
- FIG. 1 is a block diagram showing a configuration example of an information processing apparatus according to an embodiment of the present invention.
- the information processing apparatus 1 includes a display unit 7, an operation unit 5, an image recognition unit 6, a communication unit 8, and a control unit 2.
- the display unit 7 displays a program being distributed or a program being broadcast and a posting column for the program.
- the operation unit 5 detects an operation on the image displayed on the display unit 7 and outputs information indicating the position of a detection region that is a region where the operation is detected.
- the image recognition unit 6 extracts an image corresponding to the detection area from the image of the program displayed on the display unit 7.
- the communication unit 8 communicates with a server that distributes information displayed in the posting column via a network.
- When the control unit 2 receives the image extracted by the image recognition unit 6, it causes the display unit 7 to display posting information related to the image. When the control unit 2 recognizes, based on the movement-path information of the detection region received from the operation unit 5, that an operation for moving the image of the posting information to the posting column has been performed, it transmits the posting information to the server via the communication unit 8.
- With this configuration, when the user selects an image in the program, posting information related to the image is displayed. If the user then uses the operation unit 5 to move the image of the posting information to the posting column, the information processing apparatus transmits the posting information to the server. The user can therefore post about the program being broadcast easily and quickly, simply by moving the image of the information displayed on the display unit 7 to the posting column, as the sketch below illustrates.
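- The following Python sketch is illustrative only and is not part of the disclosed embodiment; the class and method names (`InformationProcessingApparatus`, `on_touch`, `on_drag_end`) are hypothetical. It models the touch, extract, display, and drag-to-post flow of FIG. 1 under those assumptions.

```python
# Hypothetical sketch of the FIG. 1 posting flow; all names are illustrative.

class InformationProcessingApparatus:
    def __init__(self, image_recognizer, display, communicator):
        self.image_recognizer = image_recognizer  # image recognition unit 6
        self.display = display                    # display unit 7
        self.communicator = communicator          # communication unit 8
        self.post_info = None

    def on_touch(self, x, y):
        """Operation unit 5 reported a detection region: extract the touched
        image and display posting information related to it."""
        image = self.image_recognizer.extract(x, y)
        if image is not None:
            self.post_info = self.find_related_info(image)
            self.display.show(self.post_info)

    def on_drag_end(self, path):
        """If the movement path ends inside the posting column, send the post."""
        if self.post_info is not None and self.display.in_posting_column(path[-1]):
            self.communicator.send(self.post_info)

    def find_related_info(self, image):
        # Placeholder for the storage/network search described later on.
        return f"posting information related to {image!r}"
```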
- The information processing apparatus of the present invention may be a mobile phone including a smartphone, a simple mobile phone (Personal Handy-phone System), a PDA (Personal Digital Assistant), an electronic book reader, a portable game machine, or the like.
- FIG. 2 is a block diagram for explaining an information processing system for posting to a broadcast program using the mobile terminal of the present embodiment.
- the mobile terminal 11 of the present embodiment is connected to the base station 23 by wireless communication.
- the base station 23 is connected to the server 21 via the network 25.
- The mobile terminal 11 receives radio waves for reproducing a program from a broadcasting device (not shown) of the broadcasting station 33 via a cable and the radio tower 31.
- the radio wave transmitted from the radio tower 31 is assumed to be a terrestrial digital broadcast wave, and the network 25 is assumed to be the Internet.
- the server 21 accepts postings for programs broadcast by the broadcasting station 33 and distributes posting messages to the mobile terminal 11.
- In the present embodiment, the case where a single mobile terminal 11 is connected to the network 25 is described, but the number of mobile terminals is not limited to one and may be plural.
- In the following description, information related to a posting that the mobile terminal 11 transmits to the server 21 is referred to as posting information, and information that the server 21 distributes for display in the posting column of the display unit of the mobile terminal 11 is referred to as a posted message.
- In the present embodiment, the server 21 distributes posted messages for the program to the mobile terminal 11 via the network 25. Alternatively, a broadcasting device (not shown) of the broadcasting station 33 may be connected to the network 25 and broadcast the posted messages received from the server 21 together with the program.
- The mobile terminal 11 is described here as receiving a program via radio waves, but it may instead play a program distributed via the network 25. Likewise, although the mobile terminal 11 is described as connecting to the network 25 wirelessly via the base station 23, it may be connected by wire. A plurality of servers that publish websites are also connected to the network 25, but they are omitted from FIG. 2.
- FIG. 3 is a block diagram illustrating a configuration example of the mobile terminal according to the present embodiment.
- the mobile terminal 11 includes a storage unit 14, a touch panel 15, an image recognition unit 16, a display unit 17, a communication unit 18, a television unit 19, and a control unit 12.
- the control unit 12 is connected to each of the storage unit 14, the touch panel 15, the image recognition unit 16, the display unit 17, the communication unit 18, and the television unit 19 via the bus 13. The configuration of each unit shown in FIG. 3 will be described.
- the television unit 19 includes a television tuner that receives radio waves transmitted from the radio tower 31, and transmits a program reproduced based on the received radio waves to the control unit 12.
- the television unit 19 transmits “data broadcast information” included in the digital broadcast wave to the control unit 12.
- the data broadcast information includes weather forecasts and news information.
- In the present embodiment, it is assumed that the data broadcast information that the television unit 19 notifies to the control unit 12 includes program information, such as the name of the program being broadcast and the names of the people appearing in it, and time information, such as the start time, end time, and current time of the program.
- the storage unit 14 stores in advance a plurality of types of post information including at least characters or symbols.
- As posting information related to the evaluation of a posting target, a first group of favorable evaluation expressions such as "Like", "Nice!", and "GOOD!" and a second group of unfavorable evaluation expressions, such as expressions of anger and fear, are stored in the storage unit 14 in advance.
- An image of a mark A for selecting the first group and an image of a mark B for selecting the second group are also stored in the storage unit 14.
- the post information may be a pictograph combining characters and symbols.
- The storage unit 14 also stores information on the results of searches performed by the control unit 12 on websites on the network 25. In the present embodiment, the case where information searched for by the user in the past is registered in the storage unit 14 is described.
- For example, when the user inputs the name of an entertainer into the portable terminal 11 and executes a search, data in which the name information and the face image are paired is registered in the storage unit 14. The data may include not only the name of the entertainer but also the titles of dramas the entertainer appeared in, the URL (Uniform Resource Locator) of the official homepage, and the like.
- Not only persons but also objects such as buildings and products may be registered in the storage unit 14 as combinations of name information and an appearance image; a minimal sketch of this stored data follows.
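- The dictionary layout, the sample URL, and the helper name `register_search_result` below are assumptions for illustration, not the disclosed storage format.

```python
# Illustrative data model for storage unit 14; values are assumed examples.

STORED_POST_INFO = {
    "mark_A": ["Like", "Nice!", "GOOD!"],   # first group: favorable evaluations
    "mark_B": ["anger", "fear"],            # second group: unfavorable evaluations
}

# Past search results: name information paired with an appearance image,
# optionally with extra data such as drama titles or an official-homepage URL.
search_registry = {}

def register_search_result(name, image_bytes, extra=None):
    search_registry[name] = {"image": image_bytes, "extra": extra or {}}

register_search_result(
    "Takuya Kimura",
    image_bytes=b"...face image data...",
    extra={"official_url": "http://example.invalid/official"},  # hypothetical URL
)
```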
- the communication unit 18 is connected to the network 25 through the base station 23 by communicating with the base station 23 wirelessly.
- the communication unit 18 receives the posted message distributed by the server 21 via the base station 23 and the network 25, the communication unit 18 passes the posted message to the control unit 12.
- When the communication unit 18 receives from the control unit 12 a posting transmission signal including the posting information and an instruction to transmit the posting information to the server 21, the communication unit 18 transmits the posting information to the server 21 via the base station 23 and the network 25.
- the communication unit 18 acquires a web page from a website published on the network 25 in accordance with an instruction from the control unit 12.
- FIG. 4 is an external perspective view showing an example of the mobile terminal shown in FIG.
- FIG. 5 is an exploded perspective view showing a state in which a part of the mobile terminal shown in FIG. 4 is disassembled.
- the touch panel 15 corresponds to the operation unit 5 shown in FIG.
- the display unit 17 is, for example, an LCD (Liquid Crystal Display). As shown in FIG. 5, the touch panel 15 is superimposed on the display unit 17, and a configuration in which these are integrated is mounted on the main body 10 of the mobile terminal 11 as shown in FIG. 4.
- The touch panel 15 is composed of a transparent substrate on which a sensor is provided, so the user can view, through the touch panel 15, the image displayed on the display unit 17 beneath it.
- FIG. 6 is a diagram showing an example of an image displayed on the display unit. As shown in FIG. 6, the display unit 17 is divided into three areas, from the left side of the figure: a television area 71 in which the program being broadcast is displayed, a mark area 72 in which the marks A and B are displayed, and a posting area 73 in which posted messages are displayed.
- Common coordinates are set in advance on the substrate surface of the touch panel 15 and the screen of the display unit 17.
- When the touch panel 15 detects an operation article touching the substrate surface, it transmits information on the coordinates of the detection area, which is the detected region, to the control unit 12. In the present embodiment, the operation article is assumed to be a finger.
- The coordinates set in common on the touch panel 15 and the display unit 17 are described in detail later.
- A touch operation is an operation by which the user selects an image displayed on the display unit 17.
- When the touch panel 15 detects a touch operation in which the user touches, with a finger, the region of the substrate surface corresponding to the image to be selected, the touch panel 15 transmits the coordinates of the detection area corresponding to the image to the control unit 12.
- When the touch panel 15 detects a touch operation on the program image displayed in the television area 71, it also transmits the coordinates of the detection area to the image recognition unit 16.
- A drag operation is an operation for moving an image selected by a touch operation within the image displayed on the display unit 17.
- When the touch panel 15 detects a drag operation in which the user slides the finger touching the substrate surface to the position to which the selected image is to be moved, the touch panel 15 transmits the coordinates of the movement path of the detection area to the control unit 12, as in the sketch below.
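- A minimal sketch of this reporting behaviour, assuming a simple event stream of (kind, x, y) tuples; the function and callback names are hypothetical.

```python
# Assumed sketch of the touch/drag reporting described above.
# events: iterable of (kind, x, y) tuples, kind in {"down", "move", "up"}.

def handle_sensor_events(events, control_unit, image_recognizer, in_tv_area):
    path = []
    for kind, x, y in events:
        if kind == "down":                       # touch operation
            path = [(x, y)]
            control_unit.on_touch(x, y)          # coordinates of detection area
            if in_tv_area(x, y):                 # program image was touched
                image_recognizer.on_touch(x, y)
        elif kind == "move":                     # drag operation
            path.append((x, y))
        elif kind == "up":
            control_unit.on_drag_end(path)       # movement path of detection area
```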
- the image recognizing unit 16 has a frame memory (not shown) that stores images that are sequentially displayed in the television area 71 of the display unit 17.
- The image recognition unit 16 receives information on the coordinates of the detection area from the touch panel 15, and if the coordinates of the detection area do not change for a certain period of time, it takes the image stored in the frame memory at that moment as a still image. The certain period is, for example, 0.5 seconds.
- the image recognition unit 16 extracts a human face or object from the image corresponding to the detection area, and passes the extracted image data to the control unit 12.
- a technique for extracting a face portion from a person image is disclosed in Patent Document 2, and detailed description thereof is omitted in this embodiment.
- When the extracted image is an object, the image recognition unit 16 detects the edges of the object, recognizes its outer shape, and transmits image data cut out along the outer shape to the control unit 12.
- When neither a person's face nor an object can be extracted, the image recognition unit 16 notifies the control unit 12 that the selected image is neither a person nor an object. The sketch below condenses this behaviour.
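- In the following sketch, `detect_face` and `detect_object` stand in for the extraction techniques of Patent Document 2 and the edge-detection method above; they are assumptions, as is the callback signature.

```python
# Condensed sketch of the dwell-then-classify behaviour of image
# recognition unit 16; helper functions are stand-ins.

DWELL_SECONDS = 0.5  # the "certain period of time" given in the embodiment

def recognize(frame_memory, coord, dwell_time, detect_face, detect_object, notify):
    if dwell_time < DWELL_SECONDS:
        return                                 # detection region still moving
    still = frame_memory.snapshot()            # freeze the current program frame
    face = detect_face(still, coord)
    if face is not None:
        notify(kind="person", image=face)      # face image data to control unit
        return
    obj = detect_object(still, coord)          # edge detection + outline cut-out
    if obj is not None:
        notify(kind="object", image=obj)
    else:
        notify(kind="none", image=None)        # neither a person nor an object
```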
- FIG. 7 is a diagram for explaining the two-dimensional coordinates set in common on the touch panel 15, the display unit 17, and the frame memory (not shown) of the image recognition unit 16. The x-axis is defined in the left-right direction of the figure, and the y-axis in the up-down direction.
- For example, when the user touches the star image shown in FIG. 7, which represents an image of the program displayed in the television area 71, the touch panel 15 detects (x1, y1) as the coordinates of the detection area and transmits the coordinate information (x1, y1) to the image recognition unit 16.
- Because the same coordinates are set in the frame memory, the image recognition unit 16 extracts the outline of the object or the face of the person located at those coordinates.
- Next, the configuration of the control unit 12 will be described. To explain the operation of the control unit 12 clearly, it is described together with the operations the user performs on the touch panel 15 and the screen of the display unit 17.
- The control unit 12 includes a CPU (Central Processing Unit) (not shown) that executes processing according to a program, and a memory (not shown) that stores the program.
- The control unit 12 controls the display unit 17 so as to display the program received from the television unit 19 in the television area 71, the marks A and B in the mark area 72, and the posted messages received from the communication unit 18 in the posting area 73. At that time, the control unit 12 also transmits the image data of the program received from the television unit 19 to the image recognition unit 16. An example of the screen displayed by the display unit 17 at this time is shown in FIG. 6.
- Upon receiving image data from the image recognition unit 16, the control unit 12 uses the image as key data, that is, data for a search, and checks whether data including the key data is registered in the storage unit 14. If such data exists in the storage unit 14, the control unit 12 reads the name registered together with the image and displays it in the television area 71 of the display unit 17, superimposed on the program image.
- If no data including the key data is found in the storage unit 14, the control unit 12 performs a network search via the communication unit 18 and obtains, from websites on the network 25, web pages that include the key data. The control unit 12 then reads the character strings described in the acquired web pages, counts the occurrences of each character string, and produces a candidate list in which the character strings are arranged in descending order of count. The control unit 12 displays the candidate list in the television area 71 of the display unit 17, superimposed on the program image; a sketch of this counting rule follows.
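- As sketched below, the candidate-list rule reduces to counting each character string across the fetched pages and ordering by descending count; the tokenizer and the `limit` parameter are assumptions added for illustration.

```python
# Assumed sketch of candidate-list construction from fetched web pages.

from collections import Counter

def build_candidate_list(web_pages, extract_strings, limit=10):
    """web_pages: fetched page contents; extract_strings: tokenizer (assumed)."""
    counts = Counter()
    for page in web_pages:
        counts.update(extract_strings(page))
    # Largest count first, as in the embodiment's ordering rule.
    return [s for s, _ in counts.most_common(limit)]

# Example with pre-extracted strings and an identity tokenizer:
pages = [["Takuya Kimura", "Nichiro Furuhata", "Takuya Kimura"],
         ["Nichiro Furuhata"]]
print(build_candidate_list(pages, extract_strings=lambda p: p))
# -> ['Takuya Kimura', 'Nichiro Furuhata']
```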
- Patent Document 2 discloses a method of collating a face image with a reference image that serves as a reference for image search. This collation method can be executed by the control unit 12, so a detailed description is omitted in the present embodiment. Specifically, with an image stored in the storage unit 14 or an image published on a website on the network 25 as the reference image, the image recognition unit 16 may extract the face portion from the image selected by the user through a touch operation, and the control unit 12 may collate the extracted face image with the reference image. Even when the image received from the image recognition unit 16 is an object, the control unit 12 performs the collation in the same manner as for a human face image.
- FIG. 8A is a diagram illustrating an example of an image when a candidate list is displayed on the display unit as a search result of the network search executed by the control unit.
- FIG. 8A shows a case where plural types of character strings are described in order in the candidate list.
- When the control unit 12 performs the network search, information on the program name received from the television unit 19 is used as key data in addition to the image received from the image recognition unit 16.
- In FIG. 8A, the name of the performer, "Takuya Kimura", is displayed as the first candidate in the candidate list, and the role name in the program, "Nichiro Furuhata", as the second candidate. Another candidate is the role's title in the program, "Chief of Criminal Affairs Section 1".
- Because the program name is included in the key data, names and words associated with the program can be displayed.
- FIG. 8B is a diagram for explaining a method for the user to select post information from the candidate list shown in FIG. 8A.
- In FIG. 8B, a black circle indicates the detection area when the user performs a touch operation to make a selection, and a broken line indicates the movement path of the detection area.
- When the user slides the finger to the position of a desired character string, the touch panel 15 detects the movement path of the detection area and transmits the coordinates at which the detection area stopped to the control unit 12.
- The control unit 12 recognizes the character string corresponding to those coordinates as posting information. The control unit 12 then displays the recognized posting information on the display unit 17 so as to be distinguished from the other character strings, and afterwards deletes the other character strings so that only the posting information is displayed on the display unit 17.
- This posting information is defined as the first posting information.
- FIG. 8B illustrates a case where the user has selected the second candidate “Nichiro Furuhata” from the candidate list illustrated in FIG. 8A.
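- A hit-test sketch of this selection step: the coordinates where the drag stopped are mapped to the candidate string whose bounding box contains them. The box coordinates are assumed values for illustration.

```python
# Map the drag-stop coordinates to a candidate string (boxes are assumed).

def select_candidate(stop_xy, candidate_boxes):
    """candidate_boxes: {string: (x_min, y_min, x_max, y_max)}."""
    x, y = stop_xy
    for text, (x0, y0, x1, y1) in candidate_boxes.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return text  # recognized as the first posting information
    return None

boxes = {"Takuya Kimura": (10, 40, 120, 60),
         "Nichiro Furuhata": (10, 70, 120, 90)}
print(select_candidate((50, 80), boxes))  # -> "Nichiro Furuhata"
```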
- The operation of the control unit 12 when the user selects posting information from the information collected from the storage unit 14 or the network 25 has been described so far. Next, the operation of the control unit 12 when the user selects from the evaluation information of the posting target registered in advance in the storage unit 14 is described.
- This posting information expressing an evaluation of the posting target is referred to as the second posting information.
- FIG. 9 is a diagram for explaining a method in which the user selects the second posting information from a plurality of types of character strings registered in advance.
- the routes 51 to 53 shown in FIG. 9 indicate detection region routes for selecting the second posted information after the user selects the first posted information.
- In FIG. 9, the provisional starting point of the movement path of the detection area is the position of the person image.
- As the detection area moves, the control unit 12 displays the image of the first posting information selected by the user on the display unit 17; the first posting information itself is omitted from the drawing.
- When the user wants to select a character string registered in the first group in order to post a favorable evaluation of the posting target, the user slides the finger on the surface of the touch panel 15 along the path 51 to the mark A.
- When the control unit 12 receives the coordinates of the position of the mark A from the touch panel 15, the control unit 12 reads the character strings registered in the first group from the storage unit 14 and displays them on the display unit 17.
- Similarly, when the user wants to select a character string registered in the second group in order to post an unfavorable evaluation of the posting target, the user slides the finger on the surface of the touch panel 15 along the path 53 to the mark B.
- When the control unit 12 receives the coordinates of the position of the mark B from the touch panel 15, the control unit 12 reads the character strings registered in the second group from the storage unit 14 and displays them on the display unit 17.
- FIG. 10 is a diagram illustrating an example of the posting information that can be input when the route passes via the mark A.
- As shown in FIG. 10, the control unit 12 displays a plurality of types of character strings, such as "Like", "Nice!", and "GOOD!", on the display unit 17. The user can select one of these character strings as the second posting information.
- When the user slides the finger to the position of the desired character string, the control unit 12 recognizes the character string selected by the user as the second posting information. The control unit 12 then displays the second posting information on the display unit 17 so as to be distinguished from the other character strings, and afterwards deletes the other character strings so that only the second posting information is displayed on the display unit 17. A sketch of the mark routing follows.
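- The mark routing described above can be pictured as the following sketch, where the mark coordinates and region bounds are hypothetical values.

```python
# Reaching mark A or mark B selects which group of pre-registered
# evaluation strings is offered (regions are assumed values).

MARK_REGIONS = {"A": (200, 0, 240, 40), "B": (200, 60, 240, 100)}

def group_for_position(xy, storage):
    x, y = xy
    for mark, (x0, y0, x1, y1) in MARK_REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return storage[f"mark_{mark}"]  # first group for A, second for B
    return None

storage = {"mark_A": ["Like", "Nice!", "GOOD!"], "mark_B": ["anger", "fear"]}
print(group_for_position((210, 20), storage))  # -> ["Like", "Nice!", "GOOD!"]
```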
- When the detection area reaches the mark area 72 without matching the position of either mark, the control unit 12 causes the display unit 17 to display a character input image (see step 113 below).
- FIG. 11 is a diagram showing an example in which the control unit displays an image for character input on the display unit. Although FIG. 11 shows the case where the character input image 74 is displayed in the television area 71, the place where the character input image 74 is displayed is not limited to the television area 71.
- As shown in FIG. 11, the control unit 12 causes the display unit 17 to display "Nichiro Furuhata" (see FIG. 8B), the first posting information selected by the user, in the comment input field 75. The user therefore does not need to input the character string "Nichiro Furuhata" and may input only the comment that follows.
- FIG. 12 is a diagram illustrating an example of a series of operations from when a user selects a posting target to when posting information is transmitted.
- As shown in FIG. 12, when the user performs a touch operation on a person in the program, the control unit 12 displays the candidate list for the first posting information on the display unit 17. Subsequently, when the user selects the character string "Nichiro Furuhata" from the candidate list, the control unit 12 causes the display unit 17 to display the character string "Nichiro Furuhata". When the user drags the character string "Nichiro Furuhata" to the mark A along the broken line shown in FIG. 12, the control unit 12 displays the plurality of types of character strings corresponding to the mark A on the display unit 17, as described with reference to FIG. 10.
- Next, when the user selects "Nice!" from the plurality of character strings of the first group, the control unit 12 creates the posting information "Nichiro Furuhata Nice!", which combines the first posting information and the second posting information, and displays it on the display unit 17. Then, when the user drags the image of "Nichiro Furuhata Nice!" displayed on the display unit 17 to the posting area 73, the control unit 12 recognizes, based on the coordinate information of the detection area received from the touch panel 15, that the posting information has been moved to the posting area 73, and transmits a posting transmission signal to the communication unit 18.
- In this way, by sliding the finger on the touch panel 15 from the television area 71 as a starting point to the posting area 73, sequentially selecting character strings in the television area 71 and the mark area 72, and dragging the selected character strings to the posting area 73, the user can post the selected character strings in combination. The user can also change the content of the evaluation of the posting target depending on which mark in the mark area 72 the finger passes, as the sketch following this paragraph illustrates.
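- Putting the pieces together, this sketch combines the first and second posting information and transmits only when the drag ends inside the bounds of the posting area 73; the area coordinates are assumed values.

```python
# Combine first and second posting information; send when dropped in the
# (assumed) posting area 73.

POSTING_AREA = (250, 0, 320, 100)  # hypothetical x/y bounds of posting area 73

def finish_gesture(first_info, second_info, drop_xy, send):
    x, y = drop_xy
    x0, y0, x1, y1 = POSTING_AREA
    post = f"{first_info} {second_info}" if second_info else first_info
    if x0 <= x <= x1 and y0 <= y <= y1:
        send(post)          # posting transmission signal to communication unit 18
        return True
    return False

finish_gesture("Nichiro Furuhata", "Nice!", (280, 50), send=print)
# prints: Nichiro Furuhata Nice!
```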
- FIG. 13 is a flowchart showing an operation procedure of the mobile terminal according to the present embodiment.
- FIGS. 14A to 14G are diagrams showing a specific example of user operations from the selection of information about a person in a program to posting. Here, a case where the user posts about a person appearing in a program being broadcast is described.
- First, the user touches the position of the person's image with a finger for a certain period of time.
- The touch panel 15 detects that the finger, as the operation article, has been in contact for the certain time or longer (step 101) and passes the coordinates of the detection area to the image recognition unit 16.
- When the image recognition unit 16 receives the coordinates of the detection region from the touch panel 15, it collates the image data stored in the frame memory (not shown) with the coordinates of the detection region and determines whether or not the selected image is a person (step 102). When the image recognition unit 16 determines that the selected image is a person, it extracts the person's face and passes the face image data to the control unit 12 (step 103).
- If in step 102 the selected image is not a person, the image recognition unit 16 determines whether the selected image is an object (step 104). If the selected image is an object, the image recognition unit 16 passes the image data from which the object has been extracted to the control unit 12 (step 105). If in step 104 the selected image is not an object either, the image recognition unit 16 notifies the control unit 12 that the selected image is neither a person nor an object. In this case, the control unit 12 acquires program information and time information from the television unit 19 (step 106).
- Next, the control unit 12 checks whether an image matching the image extracted by the image recognition unit 16 is stored in the storage unit 14 (step 107).
- If a matching image is stored, the control unit 12 reads the information registered as a pair with that image, sets the read information as the candidate list, and proceeds to step 110.
- If no matching image is stored, the control unit 12 executes a network search using the person image as key data (step 108) and stores the search result in the storage unit 14 (step 109). The control unit 12 then creates the candidate list from the search result.
- When the process has passed through step 106, the program information and time information serve as the candidate list. A process through step 106 occurs, for example, when the user selects and posts the program name.
- Subsequently, the control unit 12 displays the created candidate list on the display unit 17 (step 110).
- When the user slides the finger to the position of a desired character string in the candidate list, the touch panel 15 transmits the coordinates of the position where the detection area stopped to the control unit 12.
- The control unit 12 recognizes the character string corresponding to those coordinates as the first posting information and causes the display unit 17 to display only that posting information.
- the control unit 12 determines whether or not the coordinates of the mark position are received from the touch panel 15 (step 112).
- If the control unit 12 receives coordinates within the mark area 72 from the touch panel 15 but the coordinates do not match the position of either mark A or mark B, the control unit 12 displays a character input image on the display unit 17 (step 113). In this case, the control unit 12 creates posting information in which the characters input by the user are combined with the first posting information.
- If in step 112 the control unit 12 receives the coordinates of a mark position from the touch panel 15, the control unit 12 reads the character strings of the group corresponding to that mark from the storage unit 14 and displays them on the display unit 17 as a candidate list (step 114).
- That is, when the mark A is selected, the control unit 12 displays the first group of character strings on the display unit 17, and when the mark B is selected, the second group.
- After step 114, when the user slides the finger to one of the plurality of character strings displayed in the candidate list while dragging the first posting information, the touch panel 15 transmits the coordinates of the position where the detection area stopped to the control unit 12.
- The control unit 12 recognizes the character string corresponding to those coordinates as the second posting information, creates posting information combining the first posting information and the second posting information, and displays it on the display unit 17.
- Next, the control unit 12 determines, based on the coordinate information of the detection area received from the touch panel 15, whether or not an operation for moving the posting information to the posting area 73 has been performed (step 116).
- When it recognizes that such an operation has been performed, the control unit 12 transmits the posting information to the server 21 via the communication unit 18 (step 117). The flow as a whole is restated in the sketch below.
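- The flow of FIG. 13 can be condensed into the pseudocode below; the `terminal` object and its methods are hypothetical, and the step numbers 111 and 115 for the two selection steps are inferred from context rather than stated in the text.

```python
# Condensed restatement of the FIG. 13 flow (steps 101-117) as pseudocode.

def posting_flow(terminal):
    coord = terminal.wait_long_touch()                  # step 101
    kind, image = terminal.recognize(coord)             # steps 102-105
    if kind in ("person", "object"):
        hit = terminal.storage.find(image)              # step 107
        if hit is not None:
            candidates = hit.paired_info
        else:
            result = terminal.network_search(image)     # step 108
            terminal.storage.save(image, result)        # step 109
            candidates = result
    else:
        candidates = terminal.program_and_time_info()   # step 106
    terminal.display(candidates)                        # step 110
    first = terminal.wait_selection(candidates)         # step 111 (inferred)
    mark = terminal.wait_mark()                         # step 112
    if mark is None:
        second = terminal.text_input()                  # step 113
    else:
        group = terminal.storage.group_for(mark)        # step 114
        second = terminal.wait_selection(group)         # step 115 (inferred)
    if terminal.dropped_in_posting_area():              # step 116
        terminal.send(f"{first} {second}")              # step 117
```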
- The step numbers in the following description correspond to the steps in the flowchart of FIG. 13. FIGS. 14A to 14G indicate the positions where the user's finger is touching the touch panel 15, in order to clearly show the image or character string selected by the user.
- When the user touches the person displayed in the television area 71 for a certain time or longer (step 101), the portable terminal 11 displays the search results for the selected person on the display unit 17 as a candidate list, as shown in FIG. 14B (step 110).
- The candidate list shown in FIG. 14B contains the character strings "Takuya Kimura", "Nichiro Furuhata", "Chief of Criminal Affairs Section 1", and so on.
- The mobile terminal 11 displays the user-selected "Nichiro Furuhata" enclosed in a frame on the display unit 17, so that the user can recognize that "Nichiro Furuhata" has been selected.
- Here, the case where the character string selected by the user is enclosed in a frame has been described, but the selected character string may instead be displayed in a color different from that of the other character strings.
- The mobile terminal 11 then causes the display unit 17 to delete the character strings other than the selected "Nichiro Furuhata".
- When the user slides the finger toward the mark A, the character string "Nichiro Furuhata" moves with it to the mark A.
- When the finger reaches the position of the mark A, the mobile terminal 11 displays the character strings "Like", "Nice!", "GOOD!", and so on, on the display unit 17, as shown in FIG. 14E (step 114). Thereafter, the user slides the finger to the position of "Nice!" while touching the touch panel 15. Accordingly, as illustrated in FIG. 14F, the mobile terminal 11 creates the posting information "Nichiro Furuhata Nice!", combining "Nichiro Furuhata" and "Nice!".
- When the user then drags "Nichiro Furuhata Nice!" to the posting area 73, the mobile terminal 11 transmits the posting information "Nichiro Furuhata Nice!" to the server 21 (step 117). Thereafter, when the mobile terminal 11 receives the posted message "Nichiro Furuhata Nice!" from the server 21, it displays the posted message in the posting area 73 of the display unit 17, as shown in FIG. 14G.
- For example, when the broadcast program is a sports program, the user can press the image of a target player with a finger, pass over the character string "Like" displayed corresponding to the mark, and finally drag the character string to the posting area, thereby quickly making a post such as "Good, XX player".
- In this way, by dragging character strings with a finger along the route of "person" → "mark" → "posting area", the user can input posting information composed of the character strings he or she selected, realizing quick character input.
- The user can thus obtain character strings associated with the program easily and quickly, simply by touching the program video and dragging the selected character strings, and can post without missing the scene.
- In the present embodiment, the case where the key data is an image of a person appearing in the program has been described. However, by including program information and sponsor information in the key data when searching for images, such as product names, or stadium names and place names if the program is a sports program, character strings associated with the program can be obtained more accurately.
- The character strings obtained by the search are displayed in a selectable manner. If a character string is the name of a celebrity, the URL of the person's official homepage may be added to the posting as link information; if it is a product name, the URL of the product manufacturer's site may be added. When the character string is a place name, map information indicating its position on a map may be added to the posting information, as in the sketch below.
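- One way to picture this link attachment; the classification labels and the `lookup` accessor are assumptions added for illustration.

```python
# Attach link information according to the kind of string (assumed API).

def attach_link(post_text, kind, lookup):
    """kind: 'celebrity' | 'product' | 'place'; lookup: search-result accessor."""
    if kind == "celebrity":
        link = lookup.official_homepage(post_text)   # person's official homepage
    elif kind == "product":
        link = lookup.manufacturer_site(post_text)   # product manufacturer's site
    elif kind == "place":
        link = lookup.map_url(post_text)             # map position of the place
    else:
        link = None
    return f"{post_text} {link}" if link else post_text
```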
- FIGS. 15A and 15B are diagrams for explaining a case where link information for the official homepage of a person in a program can be attached to the posting information.
- The control unit 12 refers to the result of the network search, and if it includes information on the URL of the official homepage of "Nichiro Furuhata", the control unit 12 displays on the display unit 17 an image in which URL attachment can be selected, as shown in FIG. 15A.
- In the present embodiment, the case where the image recognition unit 16 extracts a person's face or an object from the program image has been described. However, the image recognition unit 16 may also extract the character string of a telop on the program, in the same manner as the method for extracting an object. A specific example of this character-string extraction method follows.
- For each character, the character information and its image are registered in advance in the storage unit 14 as a set.
- The image recognition unit 16 detects the edges of the telop on the program, extracts the telop portion from the program image, and searches the storage unit 14 for images matching the telop character images in order from the first character of the telop to the last.
- When a matching image is registered, the image recognition unit 16 reads the character information paired with that image and arranges the read characters in order from the first character of the telop to the last. In this way, the image recognition unit 16 recognizes the character string of the telop on the program, as in the matching sketch below.
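- A per-character template-matching sketch of this telop recognition; `match_score` and the threshold are stand-ins for whatever image comparison an implementation would actually use.

```python
# Match each telop character image against the registered reference images.

def read_telop(char_images, templates, match_score, threshold=0.9):
    """char_images: telop cut into per-character images, left to right.
    templates: {character: registered reference image} from storage unit 14."""
    text = []
    for img in char_images:
        best_char, best = None, threshold
        for ch, ref in templates.items():
            score = match_score(img, ref)   # e.g. normalized correlation
            if score > best:
                best_char, best = ch, score
        if best_char is not None:
            text.append(best_char)
    return "".join(text)  # characters arranged from first to last telop character
```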
- Further, the control unit 12 may convert a registered expression according to the broadcasting region by referring to the program information acquired from the television unit 19. For example, if the expression "this gag is samui" is registered in advance in the storage unit 14 as a character string of the mark B, and the program is broadcast by a broadcasting station in the region from Tohoku to Hokkaido, the expression is converted into the corresponding regional expression.
- Note that a program for causing a computer to execute the information processing method described in the present embodiment may be stored in a computer-readable recording medium. In that case, by installing the program from the recording medium into another information processing apparatus, the other information processing apparatus can be caused to execute the information processing method.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- User Interface Of Digital Computer (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2012-149545 | 2012-07-03 | ||
| JP2012149545 | 2012-07-03 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014007020A1 (fr) | 2014-01-09 |
Family
ID=49881780
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2013/065544 (WO2014007020A1, ceased) | Information processing device and method, and program | 2012-07-03 | 2013-06-05 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2014007020A1 (fr) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106131593A (zh) * | 2016-06-28 | 2016-11-16 | 乐视控股(北京)有限公司 | 内容处理方法及装置 |
| JP2019169976A (ja) * | 2019-06-09 | 2019-10-03 | 田村 昌和 | ライブ配信者などをアシストするライブ配信サーバ、ライブ配信方法及びプログラム |
| CN111385436A (zh) * | 2018-12-28 | 2020-07-07 | 富士施乐株式会社 | 控制装置、控制方法和计算机可读记录介质 |
| CN112602330A (zh) * | 2019-10-29 | 2021-04-02 | 海信视像科技股份有限公司 | 电子设备及非易失性存储介质 |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010178337A (ja) * | 2009-02-02 | 2010-08-12 | Samsung Electronics Co Ltd | 問答サービス方法、問答サービス機能を有する放送受信機及び記録媒体 |
| JP2011151741A (ja) * | 2010-01-25 | 2011-08-04 | Nippon Hoso Kyokai <Nhk> | 選択肢生成提示装置及び選択肢生成提示プログラム |
- 2013-06-05: WO application PCT/JP2013/065544 published as WO2014007020A1 (fr); status: not active, ceased
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010178337A (ja) * | 2009-02-02 | 2010-08-12 | Samsung Electronics Co Ltd | 問答サービス方法、問答サービス機能を有する放送受信機及び記録媒体 |
| JP2011151741A (ja) * | 2010-01-25 | 2011-08-04 | Nippon Hoso Kyokai <Nhk> | 選択肢生成提示装置及び選択肢生成提示プログラム |
Non-Patent Citations (2)
| Title |
|---|
| "Social Terebi Service no Tameno Comment Kaiseki Gijutsu", GIKEN KOKAI 2010 TENJI SHIRYO, May 2010 (2010-05-01) * |
| YAMAMOTO ET AL.: "Web-based Video Annotation and its Applications", FIT2004 DAI 3 KAI FORUM ON INFORMATION TECHNOLOGY IPPAN KOEN RONBUNSHU SEPARATE, vol. 3, 20 August 2004 (2004-08-20), pages 7 - 10 * |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106131593A (zh) * | 2016-06-28 | 2016-11-16 | 乐视控股(北京)有限公司 | 内容处理方法及装置 |
| CN111385436A (zh) * | 2018-12-28 | 2020-07-07 | 富士施乐株式会社 | 控制装置、控制方法和计算机可读记录介质 |
| JP2019169976A (ja) * | 2019-06-09 | 2019-10-03 | 田村 昌和 | ライブ配信者などをアシストするライブ配信サーバ、ライブ配信方法及びプログラム |
| CN112602330A (zh) * | 2019-10-29 | 2021-04-02 | 海信视像科技股份有限公司 | 电子设备及非易失性存储介质 |
| US12039228B2 (en) | 2019-10-29 | 2024-07-16 | Hisense Visual Technology Co., Ltd. | Electronic device and non-transitory storage medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9355496B2 (en) | Image processing apparatus, image processing method, and medium to display augmented reality objects | |
| US10839605B2 (en) | Sharing links in an augmented reality environment | |
| US9549143B2 (en) | Method and mobile terminal for displaying information, method and display device for providing information, and method and mobile terminal for generating control signal | |
| JP5833551B2 (ja) | ビデオデバイスにおいてインターネットで検索するためのシステムおよび方法 | |
| CN109391834B (zh) | 一种播放处理方法、装置、设备和存储介质 | |
| CN111436006B (zh) | 一种视频上展示信息的方法、装置、设备和存储介质 | |
| US10147399B1 (en) | Adaptive fiducials for image match recognition and tracking | |
| US20140111542A1 (en) | Platform for recognising text using mobile devices with a built-in device video camera and automatically retrieving associated content based on the recognised text | |
| US9986283B2 (en) | Display device, method of providing personalized object, and method of providing information | |
| EP2530675A2 (fr) | Appareil de traitement d'informations, procédé de traitement d'informations, et programme de traitement d'informations | |
| KR101783115B1 (ko) | 명령 처리를 위한 텔레스트레이션 시스템 | |
| CN102855464A (zh) | 信息处理设备、元数据设置方法以及程序 | |
| CN104954829A (zh) | 显示装置及其操作方法 | |
| KR102591292B1 (ko) | 디스플레이 장치와 방법 및 광고 서버 | |
| KR101727040B1 (ko) | 전자 장치 및 메뉴 제공 방법 | |
| US20150261868A1 (en) | Method For Processing Information And Apparatus Thereof | |
| CN113342221A (zh) | 评论信息引导方法、装置、存储介质及电子设备 | |
| JP2020052986A (ja) | サーバシステム、アプリケーションプログラム配信サーバ、閲覧用端末、コンテンツ閲覧方法、アプリケーションプログラム、配信方法、アプリケーションプログラム配信方法 | |
| WO2014007020A1 (fr) | Dispositif et procédé de traitement d'informations et programme | |
| JP2020053026A (ja) | サーバシステム、アプリケーションプログラム配信サーバ、閲覧用端末、コンテンツ閲覧方法、アプリケーションプログラム、配信方法、アプリケーションプログラム配信方法 | |
| AU2012205152B2 (en) | A platform for recognising text using mobile devices with a built-in device video camera and automatically retrieving associated content based on the recognised text | |
| KR20120029229A (ko) | 투명 디스플레이 장치 및 지역 정보 제공 방법 | |
| JP5476730B2 (ja) | 携帯端末 | |
| KR20150097250A (ko) | 태그 정보를 이용한 스케치 검색 시스템, 사용자 장치, 서비스 제공 장치, 그 서비스 방법 및 컴퓨터 프로그램이 기록된 기록매체 | |
| KR20140136310A (ko) | 영상 표시 장치 및 그것의 제어 방법 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13813293; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 13813293; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |