WO2010021102A1 - Related scene assignment device and related scene assignment method - Google Patents
Related scene assignment device and related scene assignment method
- Publication number
- WO2010021102A1 (PCT/JP2009/003836)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- search
- time
- scene
- image data
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
- G11B27/322—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
Definitions
- The present invention relates to a related scene assignment device and a related scene assignment method for supporting the task of adding, to a search result, a scene of the moving image content related to that search result, when the search result of information related to a certain scene is shared with others.
- In recent years, with the spread of personal computers (PCs), an increasing number of users search for information related to video content, for example a currently broadcast television program, using an Internet search engine while viewing the content.
- The information to be searched for depends on the program. For example, when the program being viewed is a travel program, it is information on a currently displayed place or store.
- When the program is a quiz program, it is information relating to the answer to the quiz currently being presented.
- When the program is an animal program, it is information regarding the name of the currently displayed animal and the places where the animal can be seen.
- When the program is a sports program such as soccer or baseball, it is information on the currently displayed player, or information on actions or rules.
- When such a search result is shared with others, it is considered effective to also share the scene that triggered the search.
- As a conventional technique, there is a digital video reproduction apparatus that extracts subtitles included in content in order to identify the scene that triggered a search (for example, see Patent Document 1).
- When receiving a content recording request from a user, the digital video reproduction apparatus creates a table in which subtitle character data is associated with presentation time information.
- When receiving a search request, the digital video reproduction apparatus searches the table for subtitle characters related to the characters input by the user, and plays back the video from the presentation time of the retrieved subtitle characters.
- However, the technique of Patent Document 1 uses caption information. For this reason, for content that does not include subtitles, there is a problem in that it is impossible to extract the scene of the broadcast program that triggered the information search. For example, under the present circumstances, subtitles are hardly ever provided for live broadcast programs such as sports programs (soccer, baseball, and the like). For this reason, the applicable range of the above-described digital video reproduction apparatus is limited.
- Furthermore, the above-described digital video reproduction apparatus performs the search based on characters input by the user. For this reason, the user needs to grasp a keyword displayed in the scene that triggered the information search and input it. Moreover, when the input keyword appears in a plurality of subtitles, a plurality of scenes are detected, and the number of search results grows accordingly. There is therefore a problem in that it takes time and effort to find the scene that actually triggered the information search desired by the user.
- An object of the present invention is to provide a related scene assigning apparatus and a related scene assigning method capable of extracting the scene that triggered an information search performed by a user, even when a text tag such as a subtitle is not assigned to each scene of the content.
- In order to achieve the above object, the related scene assigning apparatus of the present invention is an apparatus that associates with a search result a related scene, which is image data related to the search, and includes: an image storage unit that stores image data and the reproduction time of the image data; an information search execution unit that searches for information according to a search condition input by a user; an operation history storage unit that stores a history of operations in which the search condition is associated with the time when the search condition was accepted; a search start time estimation unit that estimates, based on the history of operations related to the scene assignment target search result, which is the search result specified by the user among the information searched by the information search execution unit, a search start point and, from it, a search start time, that is, the time when input of the search condition for obtaining the scene assignment target search result was started; and a related scene extraction unit that associates the image data reproduced at a time including the search start time estimated by the search start time estimation unit with the scene assignment target search result.
- The present invention can be realized not only as a related scene assigning device including such characteristic processing units, but also as a related scene assigning method that has, as its steps, the processing performed by the characteristic processing units included in the related scene assigning device. It can also be realized as a program that causes a computer to execute the characteristic steps included in the related scene assigning method. Such a program can be distributed via a recording medium such as a CD-ROM (Compact Disc-Read Only Memory) or via a communication network such as the Internet.
- According to the related scene assigning apparatus of the present invention, it becomes possible to extract the scene that triggered an information search even for video content in which no information about each scene, such as text information, is assigned. Furthermore, with the related scene assigning apparatus according to the present invention, it is not necessary to input a keyword solely for scene extraction. For this reason, the burden on the user of assigning a related scene to an information search result can be reduced.
- FIG. 1 is an external view showing a configuration of a search system according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram showing a configuration of the search system according to Embodiment 1 of the present invention.
- FIG. 3 is a flowchart showing a user operation procedure according to the first embodiment of the present invention.
- FIG. 4 is a flowchart of processing executed by the related scene assigning apparatus according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram showing an example of operation history information in Embodiment 1 of the present invention.
- FIG. 6A is a diagram showing an example of an output screen of the related scene assigning apparatus according to Embodiment 1 of the present invention.
- FIG. 6B is a diagram showing an example of an output screen of the related scene assigning apparatus according to Embodiment 1 of the present invention.
- FIG. 6C is a diagram showing an example of an output screen of the related scene assigning apparatus according to Embodiment 1 of the present invention.
- FIG. 6D is a diagram showing an example of an output screen of the related scene assigning apparatus according to Embodiment 1 of the present invention.
- FIG. 7 is a functional block diagram showing a detailed configuration of the search start time estimation unit in Embodiment 1 of the present invention.
- FIG. 8 is a detailed flowchart of the search start point estimation process according to Embodiment 1 of the present invention.
- FIG. 9 is a diagram showing an example of similarity information stored in the similarity storage unit according to Embodiment 1 of the present invention.
- FIG. 10 is a diagram showing an example of an output screen of the related scene assigning apparatus according to Embodiment 1 of the present invention.
- FIG. 11A is a diagram showing an example of an output screen of the shared television in Embodiment 1 of the present invention.
- FIG. 11B is a diagram showing an example of an output screen of the shared television in Embodiment 1 of the present invention.
- FIG. 12 is a diagram illustrating an example of similarity information stored in the similarity storage unit according to Embodiment 1 of the present invention.
- FIG. 13 is a block diagram showing the configuration of the search system according to Embodiment 2 of the present invention.
- FIG. 14 is a detailed flowchart of related scene extraction processing according to Embodiment 2 of the present invention.
- FIG. 15 is a diagram showing an example of an output screen of the related scene assigning apparatus according to Embodiment 2 of the present invention.
- In the following description, it is assumed that, while watching a TV program, a user performs an information search triggered by a scene, using a device capable of information search that accesses a search server existing on the Internet, such as a PC or a mobile phone (hereinafter referred to as an "information search device").
- a method of estimating a scene of a TV program that has triggered information search when an information search result by an information search apparatus is displayed on a shared TV and shared with others will be described.
- a method for supporting the task of adding the estimated scene to the information search result will be described. For example, consider a case where the user immediately searches for offside when there is an offside while watching a soccer broadcast program with his family. In such a situation, a search system for associating a search result related to offside with an offside scene will be described.
- FIG. 1 is an external view showing a configuration of a search system according to Embodiment 1 of the present invention.
- the search system includes a related scene assigning device 100, a shared television 114, and a portable terminal 1601 that are connected to each other via a computer network 1602 such as a LAN (Local Area Network).
- FIG. 2 is a block diagram showing the configuration of the search system according to Embodiment 1 of the present invention.
- the search system is a system for performing information search and displaying the information search result by giving it to the scene of the TV program that triggered the information search.
- the related scene assigning apparatus 100 is an apparatus that performs an information search and assigns an information search result to a TV program scene that has triggered the information search.
- The related scene assignment device 100 includes an input unit 101, an information search execution unit 102, a search result storage unit 103, an operation history collection unit 104, an operation history storage unit 105, a timer 106, a search start time estimation unit 107, a related scene extraction unit 108, an image acquisition unit 109, an image storage unit 110, a search result output unit 111, an output unit 112, and a search result transmission unit 113.
- The related scene assigning apparatus 100 is configured as a general computer including a CPU (Central Processing Unit), a memory, a communication interface, and the like. Each processing unit is functionally realized by the CPU executing a program loaded into the memory.
- Each storage unit is realized by a memory or an HDD (Hard Disk Drive).
- The input unit 101 is a processing unit, such as buttons, a touch panel, or a cross key, that receives input from the user.
- the information search execution unit 102 is a processing unit that executes information search by accessing a search server existing on the Internet.
- the search result storage unit 103 is a storage device that stores information search results obtained by the information search execution unit 102.
- The search result stored in the search result storage unit 103 is the search result for which the user has designated scene assignment.
- the operation history collection unit 104 is a processing unit that collects an operation history of information search performed by the information search execution unit 102.
- the operation history collection unit 104 collects a history of operations such as a keyword input by the user at the time of search, a word included in the selected item, a URL (Uniform Resource Locator) indicated by the selected item, and a user instruction.
- the operation history storage unit 105 is a storage device that stores the operation history collected by the operation history collection unit 104.
- the timer 106 is a processing unit that acquires the current time.
- The search start time estimation unit 107 is a processing unit that estimates the time when the information search by the user was started. That is, the search start time estimation unit 107 uses the search result in the search result storage unit 103 and the operation history in the operation history storage unit 105 to estimate the search start operation that initiated the information search corresponding to the search result stored in the search result storage unit 103. Further, the search start time estimation unit 107 estimates the time when that information search was started, using the time when the search start operation was executed. In this way, from the history of search conditions input to obtain the search result specified by the user, the search start time estimation unit 107 estimates the time when the user started to input search conditions for the purpose of obtaining that search result.
- The related scene extraction unit 108 is a processing unit that extracts the scene around the time when the information search estimated by the search start time estimation unit 107 was started, and associates it, as a related scene, with the search result stored in the search result storage unit 103.
- moving image content is assumed, but the same processing can be performed for still image content.
- the image acquisition unit 109 is a processing unit that acquires a moving image that triggers a search.
- the moving image acquired by the image acquisition unit 109 is broadcast TV program content or accumulated moving image content.
- the image storage unit 110 is a storage device that stores the moving image data acquired by the image acquisition unit 109 and the reproduction time thereof.
- The search result output unit 111 is a processing unit that combines the search result stored in the search result storage unit 103 with the related scene extracted by the related scene extraction unit 108 and outputs the combined result to the output unit 112.
- the output unit 112 is a display device such as a display that displays the result output from the search result output unit 111.
- the search result transmission unit 113 is a processing unit that transmits data obtained as a result of combining the search result created by the search result output unit 111 and the related scene to an external device.
- the shared television 114 is a television that can receive the data transmitted from the search result transmission unit 113 and display the received data.
- the shared TV 114 is a large TV set installed in a living room.
- the shared television 114 displays the search result data to which the related scene is added by the related scene adding device 100, so that information can be shared among a plurality of users.
- the mobile terminal 1601 is a mobile terminal that can receive the data transmitted from the search result transmission unit 113 and display the received data.
- the mobile terminal 1601 is a device that is used while being carried by the user and is a mobile phone or the like.
- In the search system configured as described above, the scene that triggered a search performed by the user while viewing the shared television 114 is estimated from the operation history and the search result. Further, the estimated scene is added to the search result, and the result is transmitted to the shared television 114.
- FIG. 3 is a flowchart showing a user operation procedure.
- FIG. 4 is a flowchart of processing executed by the related scene assigning apparatus 100.
- the moving image content of the program that the user is viewing on the shared television 114 is acquired by the image acquisition unit 109 and stored in the image storage unit 110.
- the image acquisition unit 109 and the image storage unit 110 are provided in the related scene assigning apparatus 100, but the present invention is not limited to this.
- moving image content may be stored in an external recording device, and the related scene extraction unit 108 may extract a scene from the moving image content stored in the recording device.
- First, the user uses the input unit 101 to give an information search instruction to the related scene assignment device 100, triggered by a scene on the television (S301). For example, when viewing a travel program, the user issues a search instruction for information on a currently displayed place or store. When viewing a quiz program, the user issues a search instruction for information related to the answer to the quiz currently being presented. When viewing an animal program, the user issues a search instruction for information regarding the name of the currently displayed animal or the places where the animal can be seen. When viewing a sports program such as soccer or baseball, the user issues a search instruction for information on the currently displayed player, or on the player's actions or the rules. In the present embodiment, it is assumed that an offside occurs while the user is watching soccer, and that the user searches for information on the offside rule by accessing a search server existing on the Internet.
- Next, the information search execution unit 102 of the related scene assignment device 100 executes the search process. Further, the operation history collection unit 104 stores the operation history of the information search by the user in the operation history storage unit 105 (S401). It should be noted that the user may not only search for information triggered by a television scene, but may also simply search for information of interest. Therefore, the operation history collection unit 104 stores all information search operations performed by the information search execution unit 102 in the operation history storage unit 105. The stored operation history may be deleted, for example, at the end of each day, or when there has been no user operation for a certain time or more, so as not to exceed the capacity of the operation history storage unit 105. Alternatively, the oldest operation histories may be deleted in order.
- FIG. 5 shows an example of operation history information stored in the operation history storage unit 105.
- the operation history information includes an operation history including an operation number 501, an operation time 502, a display URL 503, a search word 504, a selection item 505, and other operations 506.
- the operation number 501 is a number for specifying the operation history, and numbers are assigned in ascending order from the oldest operation time 502.
- the operation time 502 is information indicating the time when the user performs an operation on the related scene assignment device 100 using the input unit 101.
- the display URL 503 is information indicating the URL of the Web page displayed on the output unit 112 at the time of operation.
- the search word 504 is information indicating the search word input by the user. The search word may be input by the user using the keyboard or button of the input unit 101, or may be a search keyword presented by the related scene assignment device 100.
- The selection item 505 is information indicating the item that the user selected, from the list of search results displayed on the output unit 112 or from a Web page, in order to transition to a Web page with another URL.
- The other operation 506 is information indicating an operation that the information search execution unit 102 can accept, such as a search execution operation, an operation for returning to the previous Web page, or an operation for creating a new window.
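- As an illustration only (the patent specifies no data format), the operation history entries of FIG. 5 might be modeled as follows; all class, field, and method names are hypothetical, chosen to mirror the fields 501 to 506.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class OperationHistoryEntry:
    # Mirrors FIG. 5: operation number 501, operation time 502, display URL 503,
    # search word 504, selection item 505, other operation 506.
    number: int
    time: datetime
    display_url: str
    search_word: Optional[str] = None
    selected_item: Optional[str] = None
    other_operation: Optional[str] = None

class OperationHistoryStore:
    """Illustrative stand-in for the operation history storage unit 105."""

    def __init__(self) -> None:
        self.entries: List[OperationHistoryEntry] = []

    def append(self, entry: OperationHistoryEntry) -> None:
        self.entries.append(entry)

    def search_operations(self) -> List[OperationHistoryEntry]:
        # Keep only operations in which a search word was input or a selection
        # item was selected, as in the selection performed in step S802.
        return [e for e in self.entries
                if e.search_word is not None or e.selected_item is not None]
```

- Under this assumed model, an entry such as operation number 7 of FIG. 5 would be stored with `search_word="offside"`, and a back operation such as number 9 with only `other_operation` set.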
- FIGS. 6A to 6D show examples of screens displayed when information about offside is searched for.
- the user inputs the keyword “offside” on the keyword input type search input receiving page as shown in FIG. 6A (operation number 7 in FIG. 5).
- a search result as shown in FIG. 6B is displayed.
- Next, the user selects the selection item (link) "offside (soccer) - Wiki" from the search results (operation number 8 in FIG. 5).
- a page as shown in FIG. 6C is displayed.
- Next, by displaying the page on the shared television 114, the user examines whether or not the page is suitable for sharing among a plurality of users.
- the user determines that this page is not suitable for sharing, and performs an operation for returning to the screen (FIG. 6B) on which the previous search result was displayed (operation number 9 in FIG. 5). Then, the user reselects “rule book (offside)” from the search results shown in FIG. 6B (operation number 10 in FIG. 5). As a result of the selection, a page as shown in FIG. 6D is displayed. As a result of examining this page, the user decides to share this page among a plurality of users, and makes a request for related scene assignment to the related scene assignment apparatus 100.
- The search word or selection item "offside" may be input using a keyboard or a numeric keypad. In Japanese, it may therefore be entered as Roman-character input such as "o, f, u, s, a, i, d, o"; in this embodiment, however, inputs are delimited at meaningful breaks and recorded as single operations in the history, as shown in FIG. 5.
- Next, the user decides on the search result that the user wants to share, that is, the search result to which a related scene is to be added (hereinafter referred to as the "scene assignment target search result") (S302).
- a request for related scene assignment from the user to the related scene assignment device 100 is made by performing a predetermined operation using the input unit 101 while the search result is displayed on the output unit 112.
- the information search execution unit 102 stores the scene addition target search result in the search result storage unit 103 (S402).
- Next, the search start time estimation unit 107 estimates the time when the search for the scene assignment target search result was started (hereinafter referred to as the "search start time") from the operation that started the search for the scene assignment target search result (hereinafter referred to as the "search start point") (S403).
- the related scene extraction unit 108 extracts a related scene from the image storage unit 110 based on the search start time (S404).
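- As a hedged illustration of this extraction step (S404) — the patent gives no implementation, so the data layout and the margin parameter below are assumptions — the scene around the estimated search start time could be selected like this:

```python
from datetime import datetime, timedelta
from typing import List, Tuple

def extract_related_scene(
    recorded: List[Tuple[datetime, str]],        # (reproduction time, segment id), as kept by image storage unit 110
    search_start: datetime,                      # estimated search start time (S403)
    margin: timedelta = timedelta(seconds=30),   # assumed window around the search start time
) -> List[str]:
    """Return the segments reproduced around the estimated search start time,
    i.e. the scene presumed to have triggered the search."""
    lo, hi = search_start - margin, search_start + margin
    return [seg for t, seg in recorded if lo <= t <= hi]
```

- A usage example: with segments recorded at 20:14:50 and 20:15:10 and a search start time of 20:15:00, both segments fall inside the 30-second window and are returned as the related scene.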
- search start point estimation process it is determined from the user's operation history whether the user is searching for the same item as the search for the scene assignment target search result, and the determination result is used to estimate the search start point.
- More specifically, the search start time estimation unit 107 estimates the search start point by exploiting the fact that, when the scene assignment target search result and the information input or browsed by the user at the time of search are semantically close, the user can be assumed to have been searching for the same content. This process will be described in detail below.
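- The backward estimation described above can be sketched as follows. This is only an illustrative reading of the idea: the similarity function and the threshold for deciding that two operations concern the same item are assumptions, not values specified by the patent.

```python
from datetime import datetime
from typing import Callable, List, Tuple

def estimate_search_start(
    history: List[Tuple[datetime, str]],      # (operation time, input/selected text), oldest first
    target_text: str,                         # text of the scene assignment target search result
    similarity: Callable[[str, str], float],  # e.g. cosine similarity of word vectors
    threshold: float = 0.5,                   # assumed cut-off for "same item being searched"
) -> datetime:
    """Walk backwards from the most recent operation; while each earlier
    operation is similar enough to the target search result, assume the user
    was still searching for the same item. The earliest such operation is the
    search start point, and its time the search start time."""
    start_time = history[-1][0]
    for op_time, op_text in reversed(history):
        if similarity(op_text, target_text) < threshold:
            break
        start_time = op_time
    return start_time
```

- For instance, with a history of "weather tokyo", then "offside", then "rule book (offside)", and a target about the offside rule, the walk stops at the unrelated weather query, so the "offside" operation is taken as the search start point.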
- FIG. 7 is a functional block diagram showing a detailed configuration of the search start time estimation unit 107.
- FIG. 8 is a detailed flowchart of the search start point estimation process (S403 in FIG. 4) executed by the search start time estimation unit 107.
- As shown in FIG. 7, the search start time estimation unit 107 includes a text corpus collection unit 701, a word information storage unit 702, a word similarity calculation unit 703, a page information collection unit 704, a page similarity calculation unit 705, a search state determination unit 706, a similarity storage unit 707, and a search start point estimation unit 708.
- The text corpus collection unit 701 is a processing unit that collects a text corpus to be used for quantifying the semantic similarity between words, and creates information for calculating the similarity between words or between a word and a document.
- the word information storage unit 702 is a storage device that stores information for calculating the similarity created by the text corpus collection unit 701.
- the word similarity calculation unit 703 is a processing unit that calculates the similarity between words or between words and documents using information stored in the word information storage unit 702.
- the page information collection unit 704 is a processing unit that collects information on a page viewed by a user or information on a page related to a search result of a scene addition target.
- the page similarity calculation unit 705 is a processing unit that calculates the similarity between designated pages based on the page information collected by the page information collection unit 704.
- the search state determination unit 706 is a processing unit that determines whether the same item is being searched between operations based on the operation history and the scene assignment target search result.
- the similarity storage unit 707 is a storage device that stores information used to determine whether or not the same item is being searched.
- the search start point estimation unit 708 is a processing unit that estimates the search start point and the search start time using the determination result in the search state determination unit 706.
- First, the text corpus collection unit 701 collects a large number of documents and extracts words useful for search, such as nouns and verbs (hereinafter referred to as "index words"). Then, the text corpus collection unit 701 creates a dimension-compressed matrix by performing singular value decomposition on the index word / document matrix in which the extracted index words and the documents are represented as a matrix. Next, using the dimension-compressed matrix, the text corpus collection unit 701 expresses each index word and each document as a vector of the compressed dimensions, thereby calculating index word vectors and document vectors, and stores them in the word information storage unit 702.
- dimensional compression is performed in order to make it possible to express a more meaningful similarity and to improve the calculation speed during search.
- the similarity between words or between words and documents can be quantified without performing dimension compression.
- a vector may be created without performing dimension compression, and the similarity may be calculated based on the vector.
- As other methods for obtaining the semantic similarity between words, a method in which the user creates semantic similarities in advance, or a method in which the semantic similarity is calculated using a dictionary such as a thesaurus, may also be used.
- the distance between words can be defined by the number of links between words.
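- The singular value decomposition and dimension compression of the index word / document matrix described above correspond to what is commonly known as latent semantic analysis. A minimal NumPy sketch follows; the toy corpus, the number of retained dimensions k, and all variable names are illustrative assumptions, not part of the patent.

```python
import numpy as np

# Toy index word / document matrix: rows = index words, columns = documents.
# Entry (i, j) is the count of word i in document j.
index_words = ["offside", "rule", "soccer", "recipe", "cake"]
A = np.array([
    [2, 1, 0],   # "offside" appears in the two soccer documents
    [1, 2, 0],   # "rule"
    [1, 1, 0],   # "soccer"
    [0, 0, 2],   # "recipe" appears only in the cooking document
    [0, 0, 1],   # "cake"
], dtype=float)

# Singular value decomposition, then keep only the k largest singular values
# (the dimension compression performed by the text corpus collection unit 701).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
word_vectors = U[:, :k] * s[:k]   # compressed index word vectors

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words co-occurring in the same documents end up close in the compressed space.
i = {w: n for n, w in enumerate(index_words)}
sim_offside_rule = cosine(word_vectors[i["offside"]], word_vectors[i["rule"]])
sim_offside_cake = cosine(word_vectors[i["offside"]], word_vectors[i["cake"]])
```

- In this toy example, "offside" and "rule" share documents and so receive nearly identical compressed vectors, while "offside" and "cake" occupy orthogonal directions; this is the quantified semantic similarity the word similarity calculation unit 703 would use.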
- the document collected by the text corpus collection unit 701 may be a general document set prepared in advance by the system developer, but is not limited to this.
- For example, the text corpus collection unit 701 may collect a document set obtained from pages having a high degree of similarity to the scene assignment target search result, obtained by searching related documents based on that search result. Further, a document set obtained from pages viewed by the user within a certain period may be collected. By creating the matrix necessary for calculating the semantic similarity between words from these specialized document sets, the distance between words at the time of the user's actual search can be represented accurately. For this reason, the search start point estimation unit 708 can estimate the search start point with higher accuracy.
- the matrix necessary for calculating the semantic similarity between the words need only be created once.
- On the other hand, when the text corpus is acquired based on the scene assignment target search result, the matrix must be created each time.
- the distance between words can be automatically defined by using, for example, a text corpus acquired from the Web or the like as the text corpus.
- When the search start point is identified using a search history recorded during broadcasting of a news program or a program on current affairs, it may be more appropriate to create the matrix using a text corpus acquired from the Web or the like.
- the matrix can be created using a dictionary such as a thesaurus built in advance.
- it is also conceivable to create the above-mentioned matrix using information on the date when the recorded program was recorded and a text corpus in use at that date and time.
- the search state determination unit 706 acquires information related to the operation history that is the target for determining the search state (S802). That is, the search state determination unit 706 selects, from the operation history information illustrated in FIG. 5 and stored in the operation history storage unit 105, an operation history in which a search word was input or a selection item was selected.
- the search state determination unit 706 acquires, from the selected operation histories, the input or selected word set from the past operation histories that are closest in time to the operation that determined the scene assignment target search result.
- the aforementioned operation history is the one whose operation number 501 in FIG. 5 is closest to that of the operation that determined the scene assignment target search result. In the specific example shown in FIG. 5, the acquired word set is "rule book (offside)" of operation number 10.
- the search state determination unit 706 compares information related to the operation for which the scene assignment target search result has been determined with information related to the operation history for determining the search state acquired in the operation history information acquisition process (S802) (S803). That is, the search state determination unit 706 acquires the scene assignment target search result stored in the search result storage unit 103 and extracts text information included in the search result. The search state determination unit 706 causes the word similarity calculation unit 703 to vectorize the extracted text information using the dimension-compressed matrix. A vector generated as a result of this vectorization is referred to as a “search result vector”. Similarly, the search state determination unit 706 causes the word similarity calculation unit 703 to vectorize the word set acquired in the operation history information acquisition process (S802) using the dimension-compressed matrix.
- a vector generated as a result of this vectorization is called an “input word vector”.
- the search state determination unit 706 causes the word similarity calculation unit 703 to obtain the similarity between the input word vector and the search result vector, and stores the obtained similarity in the similarity storage unit 707. The similarity between vectors is obtained using, for example, the cosine measure (the cosine of the angle formed by the two vectors) or the inner product, both often used in document retrieval.
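The cosine measure mentioned above can be sketched as follows. The plain Python lists stand in for the dimension-compressed vectors the word similarity calculation unit 703 actually produces; the vector values are illustrative assumptions.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0 or norm_v == 0:
        return 0.0  # a zero vector carries no information
    return dot / (norm_u * norm_v)

# Hypothetical 4-dimensional word vectors.
input_word_vector = [1.0, 0.5, 0.0, 0.2]
search_result_vector = [0.9, 0.4, 0.1, 0.0]
similarity = cosine_similarity(input_word_vector, search_result_vector)
```

The inner product mentioned as an alternative is simply the `dot` value without the normalization by vector lengths.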
- FIG. 9 shows an example of similarity information calculated by the search state determination unit 706 and stored in the similarity storage unit 707.
- the similarity information includes a similarity history including an operation number 901, an operation time 902, a search word 903, a selection item 904, a first similarity 905, and a second similarity 906.
- the operation number 901, operation time 902, search word 903, and selection item 904 correspond to the operation number 501, operation time 502, search word 504, and selection item 505 of the operation history shown in FIG.
- the first similarity 905 indicates the similarity between the search result vector and the input word vector created by the word set indicated by the search word 903 or the selection item 904 for each operation history.
- the second similarity 906 indicates the similarity between the input word vector included in the operation history one step later in time and the input word vector created from the word set indicated by the search word 903 or the selection item 904.
- in the comparison process (S803), the first similarity is calculated; the second similarity may be calculated together with, or instead of, the first similarity.
- the search state determination unit 706 determines whether or not the similarity (first similarity) between the search result vector and the input word vector is equal to or less than a threshold (S803). If the first similarity is greater than the threshold (NO in S803), it is determined that the search for the same item as the search for the scene assignment target search result continues. Then, the search state determination unit 706 acquires information related to the operation history that is the next target for determining the search state (S802). That is, the search state determination unit 706 targets an operation history in which a search word is input or a selection item is selected from the operation history information stored in the operation history storage unit 105.
- from among the target operation histories, the search state determination unit 706 selects the past operation history closest in time to the operation history that was the target of word set acquisition in the previous operation history information acquisition process (S802).
- the search state determination unit 706 acquires an input or selected word set from the selected operation history. In the specific example shown in FIG. 5, “offside (soccer)” of operation number 8 is acquired.
- the search start point estimation unit 708 determines the search start time by identifying the search start point using the method described below (S804).
- assume that the threshold value is 0.5.
- the first similarity 905 from the operation number 10 to the operation number 7 is larger than the threshold value (NO in S803). Therefore, in the operation from the operation number 10 to the operation number 7, it is determined that the search for the same item as the search for the scene assignment target search result is continued.
- when the selection item "change" is selected in the operation indicated by operation number 5, the first similarity 905 falls below the threshold for the first time (YES in S803). For this reason, it is determined that the search for the same item as the search for the scene assignment target search result has ended.
- the search start point estimation unit 708 estimates the operation history of the operation number 7 having the smallest operation history number as the search start point.
- the search start point estimation unit 708 determines the time when the search start point operation was performed as the search start time.
- the search start time is the operation time 902 included in the operation history corresponding to the search start point.
- in this example, "20:21:20", the operation time 902 included in the operation history of operation number 7, is set as the search start time.
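The determination loop of S802 to S804 can be sketched as the backward walk below. The operation numbers and times follow the example above, but the similarity values and the 0.5 threshold are illustrative assumptions.

```python
def estimate_search_start(histories, threshold=0.5):
    """Walk back through the operation histories (newest first) while the
    first similarity stays above the threshold; the oldest history still
    above it is estimated as the search start point (S802-S804)."""
    start = None
    for op_number, op_time, first_similarity in histories:
        if first_similarity <= threshold:   # YES in S803: same-item search ended
            break
        start = (op_number, op_time)        # NO in S803: still the same item
    return start

# Values modeled on the example of FIG. 9 (similarities are illustrative).
histories = [
    (10, "20:25:10", 0.9),
    (8,  "20:23:05", 0.8),
    (7,  "20:21:20", 0.7),
    (5,  "20:15:40", 0.2),  # "change" selected; similarity drops below 0.5
]
print(estimate_search_start(histories))  # → (7, '20:21:20')
```

The returned operation time corresponds to the search start time used by the related scene extraction unit.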
- the related scene extraction unit 108 extracts the scene at the search start time from the image storage unit 110 (S404).
- the related scene extraction unit 108 extracts, as a related scene, moving image information of length Δt ending at the search start time. That is, the related scene extraction unit 108 extracts moving image information included in the range of (search start time − Δt) to (search start time) as a related scene.
- for Δt, a constant value determined in advance by the system developer is used. Note that Δt may be made variable and set by the user.
- the search result output unit 111 combines the related scene of length Δt extracted by the related scene extraction unit 108 with the scene assignment target search result, and outputs the combined result to the output unit 112 (S405).
- FIG. 10 is a diagram illustrating an example of a screen output by the output unit 112.
- a scene assignment target search result 1001 and a related scene 1002 are combined and displayed on the screen.
- an icon group 1003 and a menu button 1006 are displayed on the screen, which are used when a search result to which the related scene 1002 is assigned is sent to another terminal.
- the above-described menu button 1006 is a button for designating what kind of display is performed on the destination terminal when sending to another terminal.
- when the user decides to send the search result to another terminal, the user presses one of the icons (icon 1004 or 1005) included in the icon group 1003 and the menu button 1006. By this operation, the user issues a request to the search result transmission unit 113 to transmit the search result with the related scene attached (S303).
- the search result transmission unit 113 transmits the search result to which the related scene is assigned to the designated transmission destination based on the user request (S303) (S406).
- the user performs a search result transmission operation to which a related scene is assigned using, for example, an icon group 1003 displayed on the screen shown in FIG.
- the user selects the icon 1004 when he wants to transmit the search result to the shared television 114, and selects the icon 1005 when he wants to transmit the search result to another portable terminal 1601 such as a PC or a mobile phone.
- when a search result with a related scene is transmitted to a terminal on which moving image content is being broadcast or played back, it may be a problem if the moving image content displayed on that terminal is hidden.
- the output unit 112 displays the menu button 1006 on the screen, and allows selection of the display method after transmission.
- when "full screen" is selected, the search result transmission unit 113 transmits data for displaying the search result with the related scene over the entire screen of the transmission destination terminal.
- when "multi-screen" is selected, the search result transmission unit 113 transmits data for displaying the search result with the related scene on the screen of the transmission destination terminal while the moving image content being played back or broadcast there remains partially displayed. Examples of these screens are shown in FIGS. 11A and 11B.
- FIG. 11A is a diagram illustrating a display screen example of the transmission destination terminal when the user selects the menu button “full screen” among the menu buttons 1006 displayed on the screen illustrated in FIG. 10.
- FIG. 11B is a diagram showing a display screen example when the user selects the menu button “multi-screen” among the menu buttons 1006 displayed on the screen shown in FIG. A plurality of screens are displayed on the screen of the transmission destination terminal.
- the screen of the transmission destination terminal displays a display screen 1101 of the moving image content played back or broadcast on that terminal, and a screen 1102 containing the search results with related scenes transmitted to the terminal so far.
- the search results to which the related scenes included in the screen 1102 are assigned are displayed together with the search start time 1103 for each search result.
- information related to the sender of the search result may be displayed together.
- FIG. 1 is a diagram showing a display example of a search result to which a related scene is assigned. As shown in the figure, the result searched by the related scene assigning apparatus 100 and the related scene are transmitted to the shared television 114 and the portable terminal 1601 via the computer network 1602 and displayed.
- as described above, the related scene assigning apparatus 100 extracts the related scene of the moving image content that triggered the search, using the user's information search operation history and the information search results. For this reason, related scenes can be extracted even for moving image content in which no text information is assigned to each scene. Further, it is not necessary to input a keyword solely for extracting related scenes. This reduces the user's burden of attaching a related scene to an information search result.
- in the above, the similarity used is the first similarity, that is, the similarity between the input word vector generated from the word set input or selected by the user and the search result vector generated from the scene assignment target search result.
- the determination as to whether or not the user is searching for the same item as the scene assignment target search result is not limited to this method.
- for example, the similarity between the input word vector and a search keyword vector, obtained by vectorizing the search keyword input to obtain the scene assignment target search result by the same method used to generate the input word vector, may be used.
- Modification 1: In the first modification, the search state determination unit 706 uses the second similarity described above to determine whether the user is searching for the same item; this point differs from Embodiment 1. That is, the first modification uses the feature that, while the same item is being searched, the similarity between the words being input or selected is high. Using this feature, the search state determination unit 706 calculates the second similarity by comparing input word vectors in adjacent operation histories. If the second similarity is greater than the threshold, it is determined that the same item is being searched across the adjacent operation histories. For example, consider determining the search state for operation number 8 in the specific example of similarity information shown in FIG. 9.
- the search state determination unit 706 calculates the second similarity, which is the similarity between the input word vector of "rule book (offside)", the input at operation number 10 one step later in time, and the input word vector of "offside (soccer)", the input at operation number 8. The second similarity is then stored in the field of the second similarity 906.
- the search state determination unit 706 determines whether or not the same item is being searched by comparing the calculated second similarity with a threshold value. For example, in the search state at operation numbers 8 and 7, the second similarity is greater than the threshold, but in the search state at operation number 5, the second similarity is less than or equal to the threshold. In this case, the same item is searched for operation numbers 8 and 7, but when the operation number is 5, another item is searched. For this reason, the oldest operation number 7 becomes the search starting point.
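A sketch of Modification 1's adjacent comparison follows. The embodiment compares input word vectors; here a simple Jaccard overlap over word sets stands in for that vector similarity, and the word sets and the 0.3 threshold are illustrative assumptions.

```python
def jaccard(a, b):
    """Stand-in word-set similarity (the apparatus uses vector similarity)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def start_by_second_similarity(word_sets, threshold=0.3):
    """word_sets: per-operation input word sets, newest first. While the
    similarity between temporally adjacent inputs stays above the
    threshold, the same item is assumed to be searched; return the index
    of the oldest operation still in that run."""
    start = 0
    for i in range(len(word_sets) - 1):
        if jaccard(word_sets[i], word_sets[i + 1]) <= threshold:
            break  # a different item was being searched before this point
        start = i + 1
    return start

# Modeled loosely on FIG. 9: operations 10, 8, 7, 5 (newest first).
word_sets = [
    {"rule book", "offside"},   # operation 10
    {"offside", "soccer"},      # operation 8
    {"offside"},                # operation 7
    {"weather", "tomorrow"},    # operation 5: a different item
]
print(start_by_second_similarity(word_sets))  # → 2 (operation 7)
```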
- the method for comparing adjacent input word vectors is described, but the present invention is not limited to this.
- for example, whether the same item is being searched may be determined by comparing, against a threshold, the similarity between the input word vector of the operation that displayed the scene assignment target search result and each input word vector in the operation histories before that time.
- Modification 2: In the second modification, the method for determining whether the user is searching for the same item differs from Embodiment 1. More specifically, in the comparison process (S803 in FIG. 8), the search state determination unit 706 makes this determination using the feature that, while the same item is being searched, the words included in the text of the pages being browsed are highly similar. That is, the search state determination unit 706 calculates the similarity between adjacent browsed pages by comparing them, and determines that the same item is being searched between those pages if the similarity is larger than the threshold. Specifically, the page information collection unit 704 acquires, from the operation history stored in the operation history storage unit 105, the display URL of the page the user was browsing.
- the information search execution unit 102 acquires page information of the display URL.
- the search state determination unit 706 of the search start time estimation unit 107 extracts words from the text information included in the acquired page information.
- the search state determination unit 706 requests the page similarity calculation unit 705 to calculate the similarity between the pages, and the user searches for the same matter using the similarity calculated by the page similarity calculation unit 705. Judgment is made.
- the similarity between pages used for this determination may be obtained in the same manner as when using the input or selected word sets. That is, the similarity between the vector representing the text information included in the scene assignment target search result page and the vector representing the text information included in each page the user browsed may be used as the similarity between pages. Alternatively, the similarity between the vectors representing the text information included in adjacent browsed pages may be used. Furthermore, instead of calculating the similarity using a matrix as described above, the number of words common to the two pages may simply be counted; if that number is equal to or less than a threshold, it may be determined that the same item is not being searched.
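The simple shared-word count mentioned last can be sketched as follows; the page texts and the threshold of 3 are illustrative assumptions.

```python
import re

def common_word_count(page_text_a, page_text_b):
    """Count the distinct words that appear in the text of both pages."""
    words_a = set(re.findall(r"\w+", page_text_a.lower()))
    words_b = set(re.findall(r"\w+", page_text_b.lower()))
    return len(words_a & words_b)

def same_item(page_a, page_b, threshold=3):
    """If the shared-word count is at or below the threshold, judge that
    the two browsed pages do not concern the same item."""
    return common_word_count(page_a, page_b) > threshold

page1 = "offside rule in soccer explained with diagrams of the offside line"
page2 = "the soccer offside rule: when a player is beyond the line"
print(same_item(page1, page2))  # → True
```

In practice, stop words ("the", "a") would likely be removed before counting, but that refinement is omitted for brevity.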
- the similarity information includes a similarity history including an operation number 1201, an operation time 1202, a display URL 1203, a third similarity 1204, and a fourth similarity 1205.
- the operation number 1201, the operation time 1202, and the display URL 1203 correspond to the operation number 501 of the operation history, the operation time 502, and the display URL 503 shown in FIG.
- the third similarity 1204 indicates the similarity between the page of the scene assignment target search result and the page browsed with each operation number.
- the fourth similarity 1205 indicates the similarity between the page browsed at each operation number and the page browsed in the operation history one step later in time.
- in the above, the similarity was determined using the text information included in the pages. When the similarity is determined using Web pages, a URL is associated with each page, so whether the contents are the same may also be determined from the URL information.
- Modification 3: In the third modification, the method for determining whether the user is searching for the same item differs from Embodiment 1. More specifically, in the comparison process (S803 in FIG. 8), the search state determination unit 706 makes this determination using the feature that, while the same item is being searched, some of the keywords the user inputs are common between operations.
- the search state determination unit 706 compares the input words of the previous operation with those of the current operation; when only some words have been changed, or when words have been added, it can determine that the same item is being searched across the operations.
- words are considered added when, for example, the input word newly added in the current operation is combined with the input words from the previous operation using predetermined operators (for example, the AND operator and the OR operator) to perform a keyword search.
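A minimal sketch of Modification 3's common-keyword test; the word lists are hypothetical, and real input would come from the stored operation history.

```python
def continues_same_search(previous_words, current_words):
    """Modification 3: if some input keywords are shared between the
    previous and the current operation (a word was changed, or words were
    added, e.g. combined with AND/OR operators), judge that the same
    item is still being searched."""
    return bool(set(previous_words) & set(current_words))

print(continues_same_search(["offside"], ["offside", "rule"]))    # → True  (word added)
print(continues_same_search(["offside", "soccer"], ["weather"]))  # → False (all words changed)
```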
- the main difference between the present embodiment and the first embodiment is that, in the related scene extraction process (S404 in FIG. 4), the related scene extraction unit further determines ⁇ t.
- Other components and the processes executed by them are the same as those in the first embodiment. Therefore, in the present embodiment, the description will focus on the differences from the first embodiment.
- FIG. 13 is a block diagram showing the configuration of the search system according to Embodiment 2 of the present invention.
- This search system uses the related scene assigning device 200 in place of the related scene assigning device 100 in the configuration of the search system shown in FIG.
- the related scene assigning apparatus 200 includes a related scene extraction unit 1308 in place of the related scene extraction unit 108 in the configuration of the related scene assigning apparatus 100 according to the first embodiment shown in FIG.
- an electronic dictionary storage unit 1315 and a program information acquisition unit 1316 are added to the related scene assigning apparatus 100 according to Embodiment 1.
- the related scene assignment device 200 is configured by a general computer including a CPU, a memory, a communication interface, and the like. Each processing unit is functionally realized by executing a program for realizing each processing unit included in the related scene assigning apparatus 200 on the CPU. Each storage unit is realized by a memory or an HDD (Hard Disk Drive).
- the electronic dictionary storage unit 1315 is a storage device that stores explanations of proper nouns and of words describing actions; for example, it stores explanations of the names of people, animals, and places, and explanations of sports rules and actions.
- the electronic dictionary storage unit 1315 stores information related to words and parts of speech of the words.
- the program information acquisition unit 1316 is a processing unit that acquires information related to programs stored in the image storage unit 110 from broadcasts, and acquires program information such as EPG (Electronic Program Guide) data, for example.
- program information includes information such as a program name, broadcast date and time, genre, performers, and program contents.
- FIG. 14 is a detailed flowchart of the related scene extraction process (S404 in FIG. 4).
- after determining the length Δt of the related scene (S1401 to S1412), the related scene extraction unit 1308 extracts the related scene of length Δt from the image storage unit 110 (S1413).
- the related scene extraction unit 1308 determines a word considered important among the information of the program from which the scene is to be extracted or among the words input or selected by the user when searching for the scene assignment target search result (hereinafter referred to as a "search important word"), and a word representing the page of the scene assignment target search result (hereinafter referred to as a "page representative word") (S1401).
- the related scene extraction unit 1308 assigns larger weights to words that appear frequently in the word sets input or selected by the user and to words input at the search start point, and calculates a score for each word.
- the related scene extraction unit 1308 determines a predetermined number of words having the highest calculated score as search important words.
- the related scene extraction unit 1308 determines a word having a high appearance frequency or a word used as a title of the page in the text information included in the scene assignment target search result page as a page representative word.
- the related scene extracting unit 1308 sets the related scene length ⁇ t to an initial value of 10 seconds (S1402). Next, the related scene extraction unit 1308 uses the electronic dictionary storage unit 1315 to determine the part-of-speech of the search important word and the page representative word determined in the above process (S1403).
- the related scene extraction unit 1308 refers to the result of the part-of-speech determination process (S1403) and determines whether a proper noun, such as a place name, a person's name, or an animal name, is included among the search important words and page representative words (S1404). If it is determined that a proper noun is included (YES in S1404), the related scene extraction unit 1308 sets Δt shorter (for example, Δt is set to 0 so that a still image is used instead of a moving image) (S1405).
- the related scene extraction unit 1308 refers to the result in the part of speech determination process (S1403), and includes the search important word and the page representative word. It is determined whether or not a word representing an action is included (S1406). If it is determined that a word representing an action is included (YES in S1406), the related scene extraction unit 1308 sets a certain length (for example, 3 minutes) to ⁇ t (S1407).
- the related scene extraction unit 1308 determines that “offside” is the search important word (S1401). Since “offside” is a word representing an action (NO in S1404, YES in S1406), the related scene extraction unit 1308 sets a time (for example, 3 minutes) when the word is an action as ⁇ t ( S1407).
- the related scene extraction unit 1308 acquires program information at the search start time that triggers the search from the program information acquired by the program information acquisition unit 1316 (S1408).
- the related scene extraction unit 1308 determines whether or not the genre of the program indicated by the acquired program information is a quiz program (S1409).
- if the genre is a quiz program (YES in S1409), the user is often searching about a quiz question.
- the related scene extraction unit 1308 sets an average time (for example, 4 minutes) required for explanation of quiz questions and answers to ⁇ t (S1410).
- the related scene extraction unit 1308 determines whether the genre of the program indicated by the acquired program information is a news program (S1411). If the genre is a news program (YES in S1411), the user often performs a search for a news topic. For this reason, the related scene extraction unit 1308 sets an average time (for example, 2 minutes) required for explaining one topic in the news to ⁇ t (S1412). With the above processing, the value of ⁇ t is determined.
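The Δt decision flow (S1402 to S1412) can be sketched as the rule chain below. The part-of-speech labels, genre strings, and the ordering of the genre check after the part-of-speech check are assumptions drawn from the flowchart description; the durations are the example values from the text.

```python
def decide_delta_t(words_pos, genre):
    """Sketch of the Δt decision flow. words_pos maps each search
    important word / page representative word to its part of speech;
    genre is the program genre from the EPG data."""
    delta_t = 10                 # initial value: 10 seconds (S1402)
    if "proper_noun" in words_pos.values():
        delta_t = 0              # still image instead of a moving image (S1405)
    elif "action" in words_pos.values():
        delta_t = 3 * 60         # word describing an action: 3 minutes (S1407)
    if genre == "quiz":
        delta_t = 4 * 60         # quiz question plus answer: 4 minutes (S1410)
    elif genre == "news":
        delta_t = 2 * 60         # one news topic: 2 minutes (S1412)
    return delta_t

# The "offside" example: a word describing an action, in a sports program.
print(decide_delta_t({"offside": "action"}, "sports"))  # → 180
```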
- the related scene extraction unit 1308 extracts, as a related scene, the moving image information included in the range of (search start time − Δt) to (search start time) from the image storage unit 110 (S1413).
- when Δt is 0, the related scene extraction unit 1308 extracts a still image at the search start time.
- with a margin α set to a positive value, moving image information included in the range of (search start time − Δt − α) to (search start time + α) may be extracted.
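The extraction range with the margin α can be computed as follows; the concrete date is arbitrary, and the Δt and α values (3 minutes and 1 minute) are illustrative.

```python
from datetime import datetime, timedelta

def related_scene_range(search_start, delta_t, alpha=0):
    """Extraction range for the related scene: with a positive margin α,
    (search start time - Δt - α) to (search start time + α)."""
    begin = search_start - timedelta(seconds=delta_t + alpha)
    end = search_start + timedelta(seconds=alpha)
    return begin, end

start = datetime(2009, 8, 1, 20, 21)                 # search start time 20:21
begin, end = related_scene_range(start, 3 * 60, 60)  # Δt = 3 min, α = 1 min
print(begin.strftime("%H:%M"), end.strftime("%H:%M"))  # → 20:17 20:22
```

With α = 0 this reduces to the basic (search start time − Δt) to (search start time) range.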
- the value of ⁇ t may be determined only from the genre of the program without using the word part of speech. That is, the value of ⁇ t is defined corresponding to the genre of the program, and the value of ⁇ t is determined according to the genre of the target program. This eliminates the need to determine important search words and page representative words and specify the type of words such as part of speech. For this reason, ⁇ t can be determined by a simpler process. Further, when the similarity between the search important word and the page representative word is small and it is not clear which word should be used to determine ⁇ t, a method using only this genre may be used.
- FIG. 15 is a diagram illustrating an example of a screen output to the output unit 112 by the search result output process (S405 in FIG. 4).
- an operation bar 1507 for moving image playback is further displayed in the screen example shown in FIG.
- the length of the related scene is expressed by the length of the bar.
- the time position of the related scene 1002 currently displayed is represented by a symbol 1508.
- assume that Δt is 3 minutes and α is 1 minute.
- the search start time (the time at operation number 7 in FIG. 5) is 20:21.
- a moving image of the section from 20:17 (search start time − Δt − α) to 20:22 (search start time + α) is extracted as a related scene.
- the related scene 1002 is displayed on the screen as a moving image of the section from 20:18 (search start time − Δt) to 20:21 (search start time).
- the time interval ⁇ t of the related scene can be automatically changed according to the search content. For this reason, it is possible to add a related scene having an appropriate length to the search result.
- in the above embodiments, the search start time is set to the time at which the search for the scene assignment target search result was started. In practice, however, it takes the user some time to begin searching after viewing the scene. For this reason, the search start time may instead be set to a time a predetermined interval before the time at which the search for the scene assignment target search result was started.
- the present invention is generally applicable to search processing based on moving images, it can be applied not only to the television broadcasting described in the embodiment, but also to moving image content on the Internet and video shot privately.
- the sharing method can be used in various situations, such as attaching the result to an e-mail in addition to displaying it on the shared display described in the embodiments, so the applicability is very broad.
- the present invention can be applied to a corresponding scene assignment device, an information search device, and the like that associate an information search result with a scene of a video content when performing an information search while viewing the video content.
Description
In this embodiment, the user accesses a search server on the Internet from a PC, a mobile phone, or the like. We assume a case in which, using such a device capable of information search (hereinafter referred to as an "information search device"), the user performs an information search triggered by a scene while viewing a television program. This embodiment describes a method of estimating the scene of the television program that triggered the information search when the information search result obtained by the information search device is displayed on a shared television and shared with others. It also describes a method of supporting the work of attaching the estimated scene to the information search result. For example, consider a case in which an offside occurs while a family is watching a live soccer broadcast, and the user immediately searches for information about offside. A search system for associating the search result concerning offside with the offside scene in such a situation is described.
In Embodiment 1 above, a constant value determined in advance by the system developer was used as the length Δt of the extracted related scene. In Embodiment 2, based on the idea that the length of Δt should differ according to what the user searched for, the length of Δt is determined using the words the user input or selected and the search results. This makes it possible to automatically determine a Δt of appropriate length according to the search result.
101 Input unit
102 Information search execution unit
103 Search result storage unit
104 Operation history collection unit
105 Operation history storage unit
106 Timer
107 Search start time estimation unit
108 Related scene extraction unit
109 Image acquisition unit
110 Image storage unit
111 Search result output unit
112 Output unit
113 Search result transmission unit
114 Shared television
1601 Portable terminal
Claims (14)
- A related scene addition apparatus that associates, with a search result, a related scene that is image data related to the search, the apparatus comprising:
an image storage unit storing image data and a playback time of the image data;
an information search execution unit that searches for information in accordance with a search condition input by a user;
an operation history storage unit storing operation histories each associating a search condition with the time at which that search condition was accepted;
a search origin time estimation unit that estimates a search origin time, which is the time at which input of search conditions for obtaining a scene-addition target search result was started, based on a search origin that is the operation history related to the scene-addition target search result, the scene-addition target search result being the search information designated by the user from among the information searched by the information search execution unit; and
a related scene extraction unit that associates, with the scene-addition target search result, image data played back during a time including the search origin time estimated by the search origin time estimation unit.
- The related scene addition apparatus according to claim 1, wherein the search origin time estimation unit estimates the search origin time, which is the time at which input of search conditions for obtaining the scene-addition target search result was started, based on a search origin that is the operation history at the oldest time among the operation histories related to the scene-addition target search result, the scene-addition target search result being the search information designated by the user from among the information searched by the information search execution unit.
- The related scene addition apparatus according to claim 2, wherein the search origin time estimation unit includes:
a word similarity calculation unit that calculates, for each operation history, a similarity between the scene-addition target search result and a search condition input before the time at which a first search condition, which is the search condition input to search for the scene-addition target search result, was input; and
a search origin estimation unit that identifies, among the similarities calculated by the word similarity calculation unit that are greater than a predetermined value, the similarity calculated using the search condition input at the time farthest from the time at which the first search condition was input, identifies the operation history at that farthest time as the search origin, and estimates the search origin time based on that search origin.
- The related scene addition apparatus according to claim 2, wherein the search origin time estimation unit includes:
a word similarity calculation unit that calculates, for each operation history, a similarity between a first search condition, which is the search condition input to search for the scene-addition target search result, and a search condition accepted before the first search condition was input; and
a search origin estimation unit that identifies, among the similarities calculated by the word similarity calculation unit that are greater than a predetermined value, the similarity calculated using the search condition accepted at the time farthest from the time at which the first search condition was accepted, identifies the operation history at that farthest time as the search origin, and estimates the search origin time based on that search origin.
- The related scene addition apparatus according to claim 2, wherein the search origin time estimation unit includes:
a word similarity calculation unit that calculates, for each pair of temporally adjacent operation histories input before the time at which a first search condition, which is the search condition input to search for the scene-addition target search result, was input, a similarity between the search conditions included in that pair of operation histories; and
a search origin estimation unit that identifies, among the similarities calculated by the word similarity calculation unit that are less than or equal to a predetermined value, the similarity calculated using the search condition input at the time closest to the time at which the first search condition was input, identifies the operation history at that closest time as the search origin, and estimates the search origin time based on that search origin.
- The related scene addition apparatus according to claim 2, wherein the search origin time estimation unit includes:
a word similarity calculation unit that determines, for each pair of temporally adjacent operation histories input before the time at which a first search condition, which is the search condition input to search for the scene-addition target search result, was input, whether a word common to the search conditions included in that pair of operation histories exists; and
a search origin estimation unit that identifies, among the pairs of operation histories determined to contain no common word, the operation history containing the search condition input at the time closest to the time at which the first search condition was input as the search origin, and estimates the search origin time based on that search origin.
- The related scene addition apparatus according to claim 2, wherein the search origin time estimation unit includes:
a word similarity calculation unit that calculates, for each operation history, a similarity between the scene-addition target search result and a search result based on a search condition input before the time at which a first search condition, which is the search condition input to search for the scene-addition target search result, was input; and
a search origin estimation unit that identifies, among the similarities calculated by the word similarity calculation unit that are greater than a predetermined value, the similarity calculated using the search condition input at the time farthest from the time at which the first search condition was input, identifies the operation history at that farthest time as the search origin, and estimates the search origin time based on that search origin.
- The related scene addition apparatus according to claim 2, wherein the search origin time estimation unit includes:
a word similarity calculation unit that calculates, for each pair of temporally adjacent operation histories input before the time at which a first search condition, which is the search condition input to search for the scene-addition target search result, was input, a similarity between the search results based on the search conditions included in that pair of operation histories; and
a search origin estimation unit that identifies, among the similarities calculated by the word similarity calculation unit that are less than or equal to a predetermined value, the similarity calculated using the search condition input at the time closest to the time at which the first search condition was input, identifies the operation history at that closest time as the search origin, and estimates the search origin time based on that search origin.
- The related scene addition apparatus according to any one of claims 1 to 8, further comprising an electronic dictionary storage unit storing words and information on the parts of speech of the words,
wherein the related scene extraction unit determines the part of speech of a word related to the scene-addition target search result by referring to the information stored in the electronic dictionary storage unit, determines a time width according to the determined part of speech, extracts, from the moving image data stored in the image storage unit, moving image data or still image data played back within the time width including the search origin time estimated by the search origin time estimation unit, and associates the extracted moving image data or still image data with the scene-addition target search result.
- The related scene addition apparatus according to any one of claims 1 to 8, further comprising a program information acquisition unit that acquires information on the type of the moving image data stored in the image storage unit,
wherein the related scene extraction unit determines the type of the moving image data played back at the search origin time by referring to the information acquired by the program information acquisition unit, determines a time width according to the determined type of the moving image data, extracts, from the moving image data stored in the image storage unit, moving image data or still image data played back within the time width including the search origin time estimated by the search origin time estimation unit, and associates the extracted moving image data or still image data with the scene-addition target search result.
- The related scene addition apparatus according to any one of claims 1 to 10, further comprising a search result output unit that outputs, to an external device, the scene-addition target search result with which the moving image data or the still image data has been associated by the related scene extraction unit.
- A search system comprising a display device that displays moving image data, and a related scene addition apparatus that associates, with a search result, a related scene that is moving image data or still image data that triggered the search,
wherein the related scene addition apparatus includes:
an image storage unit storing the same moving image data as displayed on the display device, together with the playback time of that moving image data;
an information search execution unit that accepts a search condition input by a user and searches for information in accordance with that search condition;
an operation history storage unit storing operation histories each associating a search condition accepted by the information search execution unit with the time at which that search condition was accepted;
a search origin time estimation unit that identifies, based on the relation between a scene-addition target search result, which is the search result designated by the user from among the search results of the information search execution unit, and the operation histories stored in the operation history storage unit, a search origin that is the operation history stored in the operation history storage unit related to the scene-addition target search result, and estimates, based on that search origin, a search origin time that is the time at which input of search conditions for obtaining the scene-addition target search result was started;
a related scene extraction unit that extracts, from the moving image data stored in the image storage unit, moving image data or still image data played back during a time including the search origin time estimated by the search origin time estimation unit, and associates the extracted moving image data or still image data with the scene-addition target search result; and
a search result output unit that outputs, to the display device, the scene-addition target search result with which the moving image data or the still image data has been associated by the related scene extraction unit, and
wherein the display device receives, from the search result output unit, the scene-addition target search result with which the moving image data or the still image data has been associated by the related scene extraction unit, and displays the received scene-addition target search result.
- A related scene addition method for a related scene addition apparatus that associates, with a search result, a related scene that is moving image data or still image data that triggered the search,
the related scene addition apparatus including:
an image storage unit storing moving image data and the playback time of that moving image data; and
an operation history storage unit storing operation histories each associating a search condition input by a user with the time at which that search condition was accepted,
the method comprising:
an information search execution step of accepting a search condition input by the user and searching for information in accordance with that search condition;
a search origin time estimation step of identifying, based on the relation between a scene-addition target search result, which is the search result designated by the user from among the search results of the information search execution step, and the operation histories stored in the operation history storage unit, a search origin that is the operation history stored in the operation history storage unit related to the scene-addition target search result, and estimating, based on that search origin, a search origin time that is the time at which input of search conditions for obtaining the scene-addition target search result was started; and
a related scene extraction step of extracting, from the moving image data stored in the image storage unit, moving image data or still image data played back during a time including the search origin time estimated in the search origin time estimation step, and associating the extracted moving image data or still image data with the scene-addition target search result.
- A computer-executable program for associating, with a search result, a related scene that is moving image data or still image data that triggered the search,
wherein a memory stores moving image data and the playback time of that moving image data, and the memory further stores operation histories each associating a search condition input by a user with the time at which that search condition was accepted, and
the program causes a computer to execute:
an information search execution step of accepting a search condition input by the user and searching for information in accordance with that search condition;
a search origin time estimation step of identifying, based on the relation between a scene-addition target search result, which is the search result designated by the user from among the search results of the information search execution step, and the operation histories stored in the memory, a search origin that is the operation history stored in the memory related to the scene-addition target search result, and estimating, based on that search origin, a search origin time that is the time at which input of search conditions for obtaining the scene-addition target search result was started; and
a related scene extraction step of extracting, from the moving image data stored in the memory, moving image data or still image data played back during a time including the search origin time estimated in the search origin time estimation step, and associating the extracted moving image data or still image data with the scene-addition target search result.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/865,879 US8174579B2 (en) | 2008-08-22 | 2009-08-10 | Related scene addition apparatus and related scene addition method |
| JP2009545752A JP4487018B2 (ja) | 2008-08-22 | 2009-08-10 | Related scene addition apparatus and related scene addition method |
| CN200980119475XA CN102084645B (zh) | 2008-08-22 | 2009-08-10 | Related scene addition apparatus and related scene addition method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2008-214654 | 2008-08-22 | ||
| JP2008214654 | 2008-08-22 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010021102A1 true WO2010021102A1 (ja) | 2010-02-25 |
Family
ID=41706995
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2009/003836 Ceased WO2010021102A1 (ja) | 2008-08-22 | 2009-08-10 | Related scene addition apparatus and related scene addition method |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US8174579B2 (ja) |
| JP (1) | JP4487018B2 (ja) |
| CN (1) | CN102084645B (ja) |
| WO (1) | WO2010021102A1 (ja) |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102072989B1 (ko) * | 2013-01-14 | 2020-03-02 | Samsung Electronics Co., Ltd. | Mark-up composition apparatus and method for supporting multi-screen |
| US10915543B2 (en) | 2014-11-03 | 2021-02-09 | SavantX, Inc. | Systems and methods for enterprise data search and analysis |
| US9990441B2 (en) * | 2014-12-05 | 2018-06-05 | Facebook, Inc. | Suggested keywords for searching content on online social networks |
| DE102015208060A1 (de) | 2015-02-12 | 2016-08-18 | Ifm Electronic Gmbh | Method for operating a pulse generator for capacitive sensors, and pulse generator |
| CN104866308A (zh) * | 2015-05-18 | 2015-08-26 | Baidu Online Network Technology (Beijing) Co., Ltd. | Scene image generation method and device |
| US11328128B2 (en) | 2017-02-28 | 2022-05-10 | SavantX, Inc. | System and method for analysis and navigation of data |
| WO2018160605A1 (en) | 2017-02-28 | 2018-09-07 | SavantX, Inc. | System and method for analysis and navigation of data |
| WO2018174884A1 (en) * | 2017-03-23 | 2018-09-27 | Rovi Guides, Inc. | Systems and methods for calculating a predicted time when a user will be exposed to a spoiler of a media asset |
| WO2020213757A1 (ko) * | 2019-04-17 | 2020-10-22 | LG Electronics Inc. | Word similarity determination method |
| WO2022198474A1 (en) * | 2021-03-24 | 2022-09-29 | Sas Institute Inc. | Speech-to-analytics framework with support for large n-gram corpora |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003150928A (ja) * | 2001-11-15 | 2003-05-23 | Nikon Gijutsu Kobo:Kk | Image management apparatus and image storage apparatus |
| JP2005033619A (ja) * | 2003-07-08 | 2005-02-03 | Matsushita Electric Ind Co Ltd | Content management apparatus and content management method |
| JP2005115504A (ja) * | 2003-10-06 | 2005-04-28 | Mega Chips Corp | Image search system and image database |
| JP2007049739A (ja) * | 2006-10-11 | 2007-02-22 | Hitachi Ltd | Information storage device storing attribute information of media scenes, information display device, and information storage method |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11353325 (ja) | 1998-06-10 | 1999-12-24 | Hitachi Ltd | Synchronized display system for video and related information |
| CN1371502B (zh) * | 1999-06-30 | 2010-05-05 | Sharp Corporation | Moving image search information recording apparatus and moving image search apparatus |
| US7336775B2 (en) | 2001-10-30 | 2008-02-26 | Nikon Corporation | Image storage apparatus, image storage supporting apparatus, image storage system, image management apparatus and image saving apparatus |
| JP2003304523A (ja) * | 2002-02-08 | 2003-10-24 | Ntt Docomo Inc | Information distribution system, information distribution method, information distribution server, content distribution server, and terminal |
| JP2004080476 (ja) | 2002-08-20 | 2004-03-11 | Sanyo Electric Co Ltd | Digital video playback apparatus |
| JP2005333280A (ja) * | 2004-05-19 | 2005-12-02 | Dowango:Kk | Program-linked system |
| JP4252030B2 (ja) * | 2004-12-03 | 2009-04-08 | Sharp Corporation | Storage device and computer-readable recording medium |
2009
- 2009-08-10 WO PCT/JP2009/003836 patent/WO2010021102A1/ja not_active Ceased
- 2009-08-10 US US12/865,879 patent/US8174579B2/en not_active Expired - Fee Related
- 2009-08-10 CN CN200980119475XA patent/CN102084645B/zh not_active Expired - Fee Related
- 2009-08-10 JP JP2009545752A patent/JP4487018B2/ja not_active Expired - Fee Related
Also Published As
| Publication number | Publication date |
|---|---|
| US20110008020A1 (en) | 2011-01-13 |
| JPWO2010021102A1 (ja) | 2012-01-26 |
| JP4487018B2 (ja) | 2010-06-23 |
| CN102084645B (zh) | 2013-06-19 |
| CN102084645A (zh) | 2011-06-01 |
| US8174579B2 (en) | 2012-05-08 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP4487018B2 (ja) | | Related scene addition apparatus and related scene addition method |
| US9008489B2 (en) | | Keyword-tagging of scenes of interest within video content |
| US11197036B2 (en) | | Multimedia stream analysis and retrieval |
| KR100684484B1 (ko) | | Method and apparatus for linking a video segment to another video segment or information source |
| US9582582B2 (en) | | Electronic apparatus, content recommendation method, and storage medium for updating recommendation display information containing a content list |
| CN1326075C (zh) | | Automatic video retriever genie |
| JP3917648B2 (ja) | | Associative dictionary creation apparatus |
| JP4370850B2 (ja) | | Information processing apparatus and method, program, and recording medium |
| WO2010113619A1 (ja) | | Content recommendation device, method, and program |
| US20120066235A1 (en) | | Content processing device |
| JP5250381B2 (ja) | | Index video generation device, moving image search device, and moving image search system |
| US11968428B2 (en) | | Navigating content by relevance |
| JP5335500B2 (ja) | | Content search device and computer program |
| US20100281046A1 (en) | | Method and web server of processing a dynamic picture for searching purpose |
| JP5916790B2 (ja) | | Moving image playback device, candidate extraction method, and program |
| JP4734048B2 (ja) | | Information search device, information search method, and information search program |
| JP5008250B2 (ja) | | Information processing apparatus and method, program, and recording medium |
| JP2009230306A (ja) | | Video recording and playback apparatus |
| JP4794610B2 (ja) | | Related information addition apparatus and method |
| JP2005284392A (ja) | | Digest distribution list generation server and digest distribution list generation program |
| KR101480411B1 (ko) | | Method and system for facilitating information searching on electronic devices |
| JP2009048334A (ja) | | Video identification processing device, image identification processing device, and computer program |
| JP4961760B2 (ja) | | Content output device and content output method |
| JP2025034996A (ja) | | Program selection device and program selection program |
| JP2008099012A (ja) | | Content playback system and content storage system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200980119475.X; Country of ref document: CN |
| | WWE | Wipo information: entry into national phase | Ref document number: 2009545752; Country of ref document: JP |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09808038; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 12865879; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 09808038; Country of ref document: EP; Kind code of ref document: A1 |