WO2010012312A1 - Method and apparatus for identifying additional content relating to a media file - Google Patents
- Publication number
- WO2010012312A1 (PCT/EP2008/060175)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- media file
- content
- data
- user
- additional content
- Prior art date
- Legal status (the status listed is an assumption and is not a legal conclusion)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23614—Multiplexing of additional data and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4348—Demultiplexing of additional data and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/812—Monomedia components thereof involving advertisement data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8455—Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8543—Content authoring using a description language, e.g. Multimedia and Hypermedia information coding Expert Group [MHEG], eXtensible Markup Language [XML]
Definitions
- the invention relates to a method and apparatus for identifying additional content for communication to a user in association with the playback of a media file.
- a method and apparatus for generating data used to identify additional content are also provided as well as a system for communicating additional content to a user.
- This method of providing advertisements is not only limited in terms of the relevance of the advertisements to the viewer, but also has the drawback that the video stream format of the advertising is required to match the video stream format of the programme, thus adding a further constraint on the flexibility of the system.
- the present invention aims to address the drawbacks associated with known arrangements.
- a method of identifying additional content for communication to a user in association with the playback of a media file to the user comprising retrieving data relating to the media file, the data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file and identifying the additional content on the basis of the at least one content descriptor. Additional content such as advertisements can therefore be automatically selected based on a content descriptor for the media file and communicated to a user during playback of the media file.
- the data can further comprise time information associated with the at least one content descriptor, the identified additional content to be communicated to a user in association with the playback of the media file and in accordance with the time information.
- the time information can, for instance, be used to specify the time during playback of the media file at which the additional content is communicated to the user.
- the at least one content descriptor can relate to content of the media file at a temporal location in the media file, the temporal location being indicated by the associated time information.
- the media file can comprise at least one of audio and visual data and the method can further comprise playing back the media file at a display apparatus and communicating the identified additional content to the user in association with the play back and in accordance with the time information.
- the method can further comprise receiving the media file and the data relating to the media file in the form of one or more data streams received from one or more streaming servers.
- the time information can comprise start and finish times, the additional content being communicated to the user at a time between the associated start time and the associated finish time.
- the time information can further comprise at least one insertion time associated with at least one content descriptor, the insertion time indicative of a temporal location in the media file at which the media file is to be paused while additional content corresponding to the at least one content descriptor associated with the insertion time is communicated to the user.
- the data can further comprise screen region information associated with the time information, the display apparatus interpreting the screen region information and providing a visual indication of the relevant screen region during display of the additional content.
- the additional content can comprise at least one of image data, audio data and video data and at least one of the image and video data can further provide a link enabling a user to access further information relating to the additional content.
- a method of generating data used to identify additional content for communication to a user in association with the playback of a media file to the user comprising storing data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file and is arranged for use in identifying the additional content.
- the method of generating data may further comprise storing time information associated with the at least one content descriptor, the identified additional content to be communicated to a user in association with the playback of the media file and in accordance with the time information.
- the method of generating data may further comprise performing computer …
- an apparatus for identifying additional content for communication to a user in association with the playback of a media file to the user comprising a storage unit arranged to store data relating to the media file, the data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file, and a processor for identifying the additional content on the basis of the at least one content descriptor.
- the storage unit can be further arranged to store time information associated with the at least one content descriptor, the additional content identified by the processor to be communicated to a user in association with the playback of the media file in accordance with the time information.
- the media file can comprise at least one of audio and visual data, the apparatus further configured to play back the media file and to communicate the identified additional content to the user in association with the play back and in accordance with the time information.
- the apparatus can be arranged to receive the media file and the data relating to the media file in the form of one or more data streams received from one or more streaming servers.
- the apparatus can be further configured to stream the media file to a display apparatus and to stream the identified additional content to the display apparatus for communication to the user in association with the play back of the media file and in accordance with the time information.
- an apparatus for generating data used to identify additional content for communication to a user in association with the playback of a media file to the user comprising means for generating and storing data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file and is arranged for use in identifying the additional content.
- a system for communicating additional content to a user in association with the playback of a media file to the user comprising a server configuration for providing a media file, data relating to the media file, the data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file, and additional content corresponding to the at least one content descriptor, and a display apparatus and an associated processor configured to receive the media file from the server configuration, request the data relating to the media file from the server configuration, receive said data and extract the at least one content descriptor, request additional content corresponding to the at least one content descriptor from the server configuration, receive the additional content and communicate the additional content to a user in accordance with said time information and in association with the playback of the media file at the display apparatus.
- a computer-readable storage medium storing data defining additional content for communication to a user in association with the playback of a media file to the user, the data comprising at least one content descriptor and associated time information, wherein the content descriptor defines the additional content and relates to content of the media file at a time during playback of the media file, the time being indicated by the associated time information.
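The claimed identification step, stripped of the apparatus around it, amounts to matching content descriptors from the index data against a catalogue of additional content. A minimal sketch follows; the function name, the dictionary shapes and the in-memory catalogue are all illustrative assumptions, not structures taken from the patent itself.

```python
# Sketch of the claimed identification step: given index data containing
# content descriptors (keywords), select the additional content whose
# keyword matches. All names and data here are illustrative assumptions.

def identify_additional_content(index_data, content_catalogue):
    """Return additional-content items whose keyword matches a content
    descriptor present in the index data for the media file."""
    descriptors = {entry["descriptor"] for entry in index_data}
    return [item for item in content_catalogue
            if item["keyword"] in descriptors]

index_data = [
    {"descriptor": "fashion", "start": 10, "finish": 115},
    {"descriptor": "casual shoe", "start": 94, "finish": 121},
]
catalogue = [
    {"keyword": "fashion", "file": "fashion_ad.mp4"},
    {"keyword": "sports car", "file": "car_ad.mp4"},
]
print(identify_additional_content(index_data, catalogue))
# → [{'keyword': 'fashion', 'file': 'fashion_ad.mp4'}]
```

Time information is deliberately carried alongside each descriptor so that, once content is identified, it can also be scheduled against playback as the later embodiments describe.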
- Figure 1 schematically illustrates a system for identifying and displaying additional content, according to an embodiment of the present invention;
- Figure 2 schematically illustrates a system for manually or automatically capturing index data for a data file, according to an embodiment of the present invention;
- Figure 3 schematically illustrates the structure of the manual capture apparatus of Figure 2;
- Figure 4 schematically illustrates the structure of both the automatic capture apparatus and the index data storage system of Figure 2;
- Figure 5 schematically illustrates the relationship between index data and the timeline of a media file, according to an embodiment of the present invention;
- Figure 6 illustrates an example index data file in XML format, according to an embodiment of the present invention;
- Figure 7 is a flow diagram illustrating the steps performed in requesting and displaying additional content relating to a media file, according to an embodiment of the present invention;
- Figure 8 is a flow diagram illustrating the steps performed in using index data to control the display of additional content at a display apparatus, according to an embodiment of the present invention;
- Figure 9 is a flow diagram illustrating the steps performed in manually capturing index data, according to an embodiment of the present invention;
- Figure 10 schematically illustrates a graphical user interface for manually capturing index data, according to an embodiment of the present invention;
- Figure 11 schematically illustrates a user interface for viewing a data file along with additional data linked to a particular screen region of the displayed data file, according to an embodiment of the present invention;
- Figure 12 schematically illustrates a user interface for viewing a data file along with additional data, according to an embodiment of the present invention; and
- Figure 13 schematically illustrates a system for providing index data with a data file, according to an embodiment of the present invention.
- a system for defining and identifying additional content for communication to a user during playback of a media file includes a media server 100, an additional content server 101 and an index data server 102, each being connected to a display apparatus 103 via a computer network, in this example the internet 104.
- a media storage unit 105 is connected to the media server 100
- an additional content storage unit 106 is connected to the additional content server 101
- an index data storage system 107 is connected to the index data server 102.
- Any of the storage units or system 105, 106, 107 can alternatively or additionally be connected directly to the internet 104 to transfer stored data, for instance under the control of a respective server 100, 101, 102.
- the media server 100 is a streaming media server, providing a streamed audio-visual data file, hereinafter referred to as a "data stream", to the display apparatus 103.
- the additional content server 101 is an advertisement server providing advertisements to be displayed in conjunction with a data stream. Streaming media servers and advertisement servers such as these will be familiar to those skilled in the art, and are not therefore described in detail here.
- the display apparatus 103 is a typical desktop computer, but an alternative device such as a mobile telephone, laptop computer, personal digital assistant or television may readily be substituted as appropriate.
- referring to Figure 2, a system according to an embodiment of the present invention for manually or automatically capturing index data for a media stream is illustrated, the system including the index data storage system 107 as shown in Figure 1, connected via the internet 104 to both a manual index data capture apparatus 200 and an automatic index data capture apparatus 202.
- although the index data storage system 107 is shown connected via the internet 104 to both the manual apparatus 200 and the automatic apparatus 202, the person skilled in the art will readily appreciate that other connections may be substituted as appropriate, such as wired local area network (LAN) or wireless (WLAN) connections.
- either or both of the manual index data capture apparatus 200 and the automatic index data capture apparatus 202 are provided in a single physical device, which can either automatically generate and record index data for a media file, or be operated by a user to manually generate and record index data for a media file.
- FIG 3 illustrates the arrangement of components of the manual capture apparatus 200.
- the manual capture apparatus 200 includes a network interface unit 304, connected to each of a storage unit 303, user interface 300 and a processing unit 301, the processing unit 301 being further connected to a memory 302.
- the user interface 300 is provided in the form of a standard keyboard, mouse and computer monitor combination, but in alternative embodiments the manual capture apparatus may be controlled via a web browser from a remote computer.
- the network interface unit 304 is also connected to the internet 104, through which the manual capture apparatus 200 may connect to other devices as seen in Figure 2.
- FIG. 4 illustrates the structure of both the automatic index data capture system 202 and the index data storage system 107.
- Each system includes a network interface unit 404, connected to a storage unit 403 and a processing unit 401, the processing unit 401 being further connected to a memory 402.
- the network interface unit 404 is connected to the internet 104 in the present example; in alternative embodiments, other network connections may be substituted as appropriate.
- the storage unit 403 in the present example is a magnetic hard disk, although any form of machine-readable memory may readily be substituted.
- the automatic index data capture apparatus 202 uses image processing software stored in the storage unit 403 to analyse the content of a data stream received via the network interface unit 404, and accordingly generate index terms and index times for the data stream. This information is saved as index data and transmitted to the index data storage system 107 via the network interface unit 404.
- the storage unit 403 is also used for storing any other software programs for controlling the automatic index data capture system 202, the programs being executed by the processing unit 401 operating in conjunction with the memory 402.
- the index data storage system 107 receives index data via the network interface unit 404, and stores the index data in the storage unit 403.
- the processing unit 401 searches the storage unit 403 and fetches the appropriate index data file, which is then transmitted to the source of the request via the network interface unit 404.
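The storage and retrieval behaviour just described reduces to a keyed store: index data is saved against a media file identifier and fetched again when a request arrives. The sketch below is an illustrative assumption; the dictionary simply stands in for the storage unit 403 and the identifiers are invented.

```python
# Minimal sketch of the index data storage system's behaviour: index data
# files are stored keyed by a media file identifier and fetched on request.
# The in-memory dictionary stands in for the storage unit 403.

class IndexDataStore:
    def __init__(self):
        self._store = {}

    def save(self, media_id, index_data):
        self._store[media_id] = index_data

    def fetch(self, media_id):
        # Returns None when no index data exists for the media file.
        return self._store.get(media_id)

store = IndexDataStore()
store.save("deep-pink-catwalk-live", {"tags": ["fashion"]})
assert store.fetch("deep-pink-catwalk-live") == {"tags": ["fashion"]}
```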
- Index data comprises both index terms 508, 509, 510 and index times 501, 502, 503, 504, 505, 506.
- the data stream 507 comprises both video 511 and audio 512 streams, and the index times 501, 502, 503, 504, 505, 506 correspond directly to times within the audio and video streams.
- the index times 501, 502, 503, 504, 505, 506 are chosen either by a manual or automatic process, to correspond to segments of the data stream 507 containing material relating to the associated one of the index terms 508, 509, 510.
- Snapshot images 513, 514, 515 are provided from the video stream 511 for illustrative purposes only, by way of example of the type of content material for which it may be desired to provide index terms and index times.
- the index times 501, 502, 503, 504, 505, 506 in the present example further comprise start times 501, 502, 505, finish times 503, 506, and an insertion time 504.
- Each start time 501 has a corresponding finish time 506, and an associated index term 508. Additional content corresponding to the index term 508 is displayed to a user along with the data stream 507 during the period of the data stream 507 as indicated by the start and finish times 501, 506.
- the index data further comprises an insertion time 504 provided for an index term 508
- streaming of the data stream 507 is paused at the insertion time 504, and additional content is displayed to the user in place of the data stream.
- this additional content takes the form of a conventional audio-visual advertisement 516.
- once the advertisement 516 has finished, streaming and display of the data stream 507 resumes.
- the present invention allows additional content files to be presented either simultaneously with a data stream 507 via the provision of start and finish times, or sequentially via the provision of an insertion time.
- although the index term "fashion" 508 is shown having both start/finish times 501, 506 and an insertion time 504, it is entirely possible for any given index term to have only start and finish times provided, or only insertion times provided.
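One way to model the Figure 5 relationships in code is to attach optional start/finish spans and optional insertion times to each index term. This structure is an assumption for illustration; the times below are invented and the reference numerals in Figure 5 are labels, not time values.

```python
# Illustrative data model (an assumption, not the patent's own structure):
# an index term may carry start/finish spans, insertion times, or both.
from dataclasses import dataclass, field

@dataclass
class IndexTerm:
    keyword: str
    # (start, finish) pairs: additional content shown alongside the stream.
    spans: list = field(default_factory=list)
    # insertion times: the stream is paused and content shown in its place.
    insertions: list = field(default_factory=list)

fashion = IndexTerm("fashion", spans=[(10.0, 115.0)], insertions=[63.0])
shoes = IndexTerm("casual shoe", spans=[(94.0, 121.0)])
assert fashion.insertions and not shoes.insertions
```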
- An index term, also referred to as a content descriptor, is in the present example a keyword relating to the content of a media file at any given time or over a time period, or, in cases where screen region information is provided as will be described below, relating to the content of a particular screen region of a media file at a given time.
- the keyword is used to identify the additional content to be communicated to the user.
- FIG 6 illustrates index data 600 corresponding to the timeline illustrated in Figure 5, in an XML file format.
- the index data 600 comprises a media reference section 601 which provides information about the media file to which the index data relates.
- the media file is a video stream with the title "Deep Pink Catwalk Live", encoded in the MP4 format and with a length of 2:00 minutes.
- the index data 600 further comprises a tag dictionary 602 which contains a listing of the index terms 603, followed by a timeline section 604 which contains a plurality of context tags and insertion tags 612.
- a context tag 608 comprises an index term 609, index times 610, and screen region information 611.
- the screen region information 611 is optional; for example, context tag 606 includes an index term and index times, but no screen region information.
- the screen region information 611 contains both x and y coordinates, as well as height and width dimensions, allowing a bounding box to be drawn overlaying the displayed video file.
- the tag dictionary 602 contains an index term "fashion" 605
- the timeline section 604 contains both context tag 606 and insertion tag 607 associated with this index term 605.
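The exact schema of the Figure 6 XML file is not reproduced in the text, so the element and attribute names in the sketch below are assumptions; only the overall shape (a media reference, a tag dictionary, then a timeline of context and insertion tags with optional screen-region coordinates) follows the description above.

```python
# Parsing a hypothetical index data file shaped like Figure 6: a tag
# dictionary lists the index terms, and a timeline carries context tags
# (with start/finish times and optional screen regions) and insertion tags.
import xml.etree.ElementTree as ET

INDEX_XML = """
<indexdata>
  <media title="Deep Pink Catwalk Live" format="MP4" length="2:00"/>
  <tagdictionary>
    <tag id="1">fashion</tag>
    <tag id="2">casual shoe</tag>
  </tagdictionary>
  <timeline>
    <context tag="1" start="0:10" finish="1:55"/>
    <context tag="2" start="1:34" finish="2:01">
      <region x="120" y="80" width="60" height="40"/>
    </context>
    <insertion tag="1" time="1:03"/>
  </timeline>
</indexdata>
"""

root = ET.fromstring(INDEX_XML)
terms = {t.get("id"): t.text for t in root.find("tagdictionary")}
contexts = [(terms[c.get("tag")], c.get("start"), c.get("finish"))
            for c in root.find("timeline").findall("context")]
print(terms)     # {'1': 'fashion', '2': 'casual shoe'}
print(contexts)  # [('fashion', '0:10', '1:55'), ('casual shoe', '1:34', '2:01')]
```

The optional `<region>` child mirrors the screen region information 611: x and y coordinates plus width and height, enough to draw a bounding box over the displayed video.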
- a display apparatus would therefore display additional content corresponding to the index term "fashion" during the index times provided in context tag 606.
- at the time indicated by insertion tag 607, the display apparatus would pause the media file and display an additional content file comprising an audio-visual advertisement corresponding to the index term "fashion".
- the method steps in Figure 7 are divided into those performed at an additional content server 101, those performed at a display apparatus 103, and those performed at an index storage system 107.
- the display apparatus 103 identifies a media file from a media server 100 for displaying to a user (step 701), for instance in response to a user request to view the media file received at a user interface associated with the display apparatus.
- the media server 100 is a streaming media server, and accordingly the media file is a data stream.
- the display apparatus requests index data corresponding to the media file from the index data storage system 107 (step 702), via the index data server 102.
- the index data storage system 107 receives the request from index data server 102 (step 703), and retrieves index data from the storage unit 403 (step 704).
- the index data is then transmitted to the display apparatus 103 (step 705), via the index data server.
- the index data is transmitted in its entirety in this step; however, in other embodiments, the index data may be streamed to the display apparatus 103 in synchronisation with the media stream.
- the index data storage system can be arranged to receive the request for index data directly from the display apparatus 103 via the network interface unit 404, and to respond directly, without the need for the index data server 102.
- the display apparatus 103 receives the index data via the network interface unit 304 (step 706).
- the display apparatus 103 then extracts index terms from the index data (step 707).
- the index data is formatted according to the XML standard as shown in Figure 6, with the index terms 603 being contained in a tag dictionary 602.
- the display apparatus 103 transmits via the network interface unit 304 a request for additional content corresponding to the index terms 603 to the additional content server 101 (step 708). Since in the present example the additional content server 101 is an advertisement server, this step comprises requesting advertisements corresponding to the index terms.
- the additional content server 101 receives the request via the network interface unit 404 (step 709) and retrieves the relevant additional content files from the storage unit 403 (step 710). In the case where a plurality of additional content files in the storage unit 403 all correspond to the same index term, the additional content server 101 randomly selects one of the plurality of additional content files for transmission to the display apparatus 103. The relevant additional content files are transmitted via the network interface unit 404 to the display apparatus 103 (step 711).
- the display apparatus 103 receives the additional content via the network interface unit 304 (step 712), and finally displays the additional content in association with the original media file based on the index data (step 713).
- the method illustrated in Figure 7 applies to a system such as that illustrated in Figure 1, comprising an additional content server 101, a display apparatus 103 and an index data storage system 107.
- the person skilled in the art may readily combine several of these devices into a single device as appropriate, for example by providing a single server connected to the media storage unit 105 and index data storage system 107 and performing all tasks previously performed by the separate streaming media server 100 and index data streaming system 102.
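The Figure 7 exchange above can be compressed into a few function calls that stand in for the network requests; the function names, store contents and media identifier are illustrative assumptions. As described above, when several additional content files share an index term, one is picked at random.

```python
# Compressed sketch of the Figure 7 flow. Plain function calls stand in
# for the requests between display apparatus, index data storage system
# and additional content server; all names and data are illustrative.
import random

ADVERT_STORE = {
    "fashion": ["fashion_ad_1.mp4", "fashion_ad_2.mp4"],
    "casual shoe": ["shoe_ad.png"],
}

def request_index_data(media_id):            # steps 702-706
    return {"tags": ["fashion", "casual shoe"]}

def request_additional_content(terms):       # steps 708-712
    # Random choice among multiple files for the same term (step 710).
    return {t: random.choice(ADVERT_STORE[t])
            for t in terms if t in ADVERT_STORE}

index_data = request_index_data("deep-pink-catwalk-live")
terms = index_data["tags"]                   # step 707: extract index terms
content = request_additional_content(terms)
assert content["casual shoe"] == "shoe_ad.png"
assert content["fashion"] in ADVERT_STORE["fashion"]
```

Combining servers into one device, as the paragraph above notes, changes only which process answers these calls, not the sequence itself.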
- Figure 8 illustrates how the display apparatus 103 uses index data to control the display of additional content along with a media file, the method of Figure 8 corresponding to step 713 in Figure 7.
- the index data is that shown in Figure 6, but the method illustrated in Figure 8 may readily be applied to any set of index data.
- the media file is received from a streaming media server, accordingly the media file is a data stream.
- the method of Figure 8 applies in the case in which the index data is downloaded in its entirety once the media file has been identified (see Figure 7), but a method similar to that described in Figure 8 may be applied in embodiments in which the index data is streamed in synchronisation with a media file.
- the display apparatus 103 begins streaming the media file from the media server 100 (step 801) and subsequently displaying the streamed media file (step 802). During this process, it is checked whether the end of the media file has been reached (step 803); if so, the process ends, but if not, the display apparatus 103 determines the current time in the media file (step 804).
- the current time as determined in step 804 is compared to the index start times (step 805).
- the index data is formatted according to the XML standard as shown in Figure 6, with index start and finish times 610 being provided in the timeline section 604. If an index start time matching the current time is found, the process continues to step 806, but if not, the process continues directly to step 809. In the present example, if the current time is 1:34, it is found that there is a matching index start time 610 (step 805).
- the index term associated with the index start time found during step 805 is retrieved (step 806).
- the corresponding index term 609 is "casual shoe".
- the additional content corresponding to the index term is then retrieved (step 807), and displayed along with the media file (step 808).
- the additional content would be that which corresponds to the index term 609 "casual shoe".
- the additional content can, for instance, be an image file advertising a particular brand of casual shoes.
- the display apparatus 103 continues to display the additional content and proceeds to step 809.
- the current time is compared to the index finish times (step 809); if an index finish time matching the current time is found, the process continues to step 810, but if not, the process continues directly to step 812. In the present example, if the current time is 2:01, it is found that there is a matching index finish time 610 (step 809).
- the index term associated with the index finish time found during step 809 is retrieved (step 810).
- the corresponding index term 609 is "casual shoe".
- the display apparatus 103 stops displaying any additional content corresponding to the index term found during step 810 (step 811); in the present example, this would be any additional content corresponding to the index term 609 "casual shoe", for example the image file advertising a particular brand of casual shoes.
- the process then continues to step 812.
- the current time is compared to the insertion times (step 812); if an insertion time matching the current time is found, the process continues to step 813, but if not, the process returns to step 803. In the present example, if the current time is 1:03, it is found that there is a matching insertion time 607 (step 812).
- the index term associated with the insertion time found during step 812 is retrieved (step 813).
- the corresponding index term is "fashion".
- the display apparatus 103 then pauses display of the media file (step 814) and displays the additional content associated with the index term found during step 813 (step 815).
- the additional content is provided as a conventional audio-visual advertisement of predetermined length, corresponding to the index term "fashion". Accordingly, the audio-visual advertisement is displayed until the end of the advertisement is reached (step 815), at which point display of the media file resumes (step 816), the process then returning to step 803.
- it is again checked whether the end of the media file has been reached (step 803); if so, the process is complete; if not, the process repeats the steps as described above.
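The Figure 8 loop lends itself to a short sketch: at each tick the current time is checked against the start, finish and insertion times (steps 805, 809 and 812). Times are whole seconds here for simplicity, and the event log stands in for actual display operations; both are illustrative assumptions.

```python
# Sketch of the Figure 8 playback loop. At each tick the current time is
# compared against index start, finish and insertion times; the returned
# event log stands in for the display operations.

def playback_events(duration, starts, finishes, insertions):
    events = []
    for t in range(duration + 1):            # steps 803-804: tick and check end
        if t in starts:                      # step 805: matching start time?
            events.append((t, "show", starts[t]))           # steps 806-808
        if t in finishes:                    # step 809: matching finish time?
            events.append((t, "hide", finishes[t]))         # steps 810-811
        if t in insertions:                  # step 812: matching insertion time?
            events.append((t, "pause+insert", insertions[t]))  # steps 813-816
    return events

log = playback_events(
    duration=121,
    starts={94: "casual shoe"},
    finishes={121: "casual shoe"},
    insertions={63: "fashion"},
)
print(log)
# → [(63, 'pause+insert', 'fashion'), (94, 'show', 'casual shoe'),
#    (121, 'hide', 'casual shoe')]
```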
- although the additional content displayed between the start and finish times and/or at the insertion time comprises advertisements in the form of images, other forms of data may be substituted.
- a video or audio file comprising commentary on a certain portion of the media file may be displayed or inserted through the use of the start and finish times or an insertion time.
- a static advertisement or message, for instance in the form of text, may be presented for a predetermined length of time between the start and finish times or while the media file is paused.
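The timing checks of steps 803 to 816 can be sketched as follows. This is a hypothetical illustration only: the function name, data layout and the "fashion" finish time are assumptions introduced here, not part of the described apparatus; the insertion time 1:03 and finish time 2:01 are taken from the worked example, expressed in seconds.

```python
# Hypothetical sketch of the display-loop timing checks (steps 803-816);
# names and data layout are illustrative assumptions, not the apparatus itself.

def events_at(current_time, index_data):
    """Return the display actions triggered at current_time (in seconds).

    Each index_data entry holds an index term plus optional "start",
    "finish" and "insert" times.
    """
    actions = []
    for entry in index_data:
        if entry.get("start") == current_time:    # steps 804-808: show content
            actions.append(("show", entry["term"]))
        if entry.get("finish") == current_time:   # steps 809-811: stop content
            actions.append(("hide", entry["term"]))
        if entry.get("insert") == current_time:   # steps 812-816: pause media,
            actions.append(("pause_for_advert", entry["term"]))  # play advert
    return actions

# Times from the worked example: "fashion" start 0:23 and insertion 1:03,
# "casual shoe" finish 2:01 (the "fashion" finish time is assumed).
index_data = [
    {"term": "fashion", "start": 23, "finish": 119, "insert": 63},
    {"term": "casual shoe", "start": 95, "finish": 121},
]
print(events_at(63, index_data))
print(events_at(121, index_data))
```

In a real display apparatus the "pause_for_advert" action would block playback until the audio-visual advertisement completes, as described for steps 814 to 816.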
- index data may be manually captured for a media file, according to an embodiment of the present invention.
- the manual index data capture apparatus 200 begins playing the media file (step 901).
- if the user selects to pause the media file (step 902), the process continues to step 903; if not, the manual capture apparatus 200 continues to display the media file and proceeds to step 913. If the user selects to pause the media file, the media file is paused (step 903), and the current time in the media file is recorded (step 904) before continuing to step 905.
- it is then determined whether the user selects the "tag start" command (step 905). If so, the process continues to step 906, but if not, the process skips to step 907. If the user selects the "tag start" command, the current time as recorded in step 904 is recorded as an index start time for a corresponding index term as selected by the user in response to a prompt displayed to the user, for instance in the form of a drop-down list of available index terms (step 906). The process then continues to step 907.
- it is then determined whether the user selects the "tag stop" command (step 907). If so, the process continues to step 908, but if not, the process skips to step 909. If the user selects the "tag stop" command, the current time as recorded in step 904 is recorded as an index finish time for a corresponding index term as selected by the user in response to a prompt displayed to the user (step 908). The process then continues to step 909.
- it is then determined whether the user selects the "tag insertion time" command (step 909). If so, the process continues to step 910, but if not, the process skips to step 911. If the user selects the "tag insertion time" command, the current time as recorded in step 904 is recorded as an insertion time for a corresponding index term as selected by the user in response to a prompt displayed to the user (step 910). The process then continues to step 911.
- it is then determined whether the user selects to resume playing the media file (step 911). If not, the process returns to step 905 and the user may continue to add index start/finish times or insertion times. If the user selects to resume playing the media file, playing of the media file is resumed (step 912), and it is checked whether the end of the media file has been reached (step 913). If so, the process continues to step 914, but if not, the process returns to step 902, allowing the user to pause the media file again if and when required, and add further index start/finish times and insertion times.
- the manual index data capture apparatus 200 stores the index data in the index data storage system 107 (step 915).
- a user selects to begin playing the media file (step 901).
- the user selects to pause the media file (step 902).
- the process then continues through steps 903 and 904 as described above, and the user selects the "tag start" command (step 905) and specifies the index term "fashion" (step 906), recording the current time (i.e. 0:23) as an index start time for the index term "fashion".
- the process continues through steps 907 and 909 to step 911, when the user selects to resume playing the media file. It is then determined that the media file is not yet finished, since it is currently only 23 seconds through a 2-minute long media file (step 913).
- the user then continues through steps 902 to 913 as above to record the remaining index times, insertion times and associated index terms as required.
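The tagging commands recorded during steps 904 to 910 can be sketched as follows. The IndexCapture class is a hypothetical illustration introduced here; the method and field names are assumptions, not taken from the described apparatus, and the example times follow the "fashion" walkthrough above (the finish time is assumed).

```python
# Hypothetical sketch of the manual tagging commands (steps 904-910);
# the class and its names are illustrative assumptions only.

class IndexCapture:
    def __init__(self):
        self.records = []

    def tag_start(self, current_time, term):
        # Step 906: record an index start time for the chosen index term.
        self.records.append({"term": term, "start": current_time})

    def tag_stop(self, current_time, term):
        # Step 908: attach an index finish time to the open start record,
        # i.e. the record for this term that has no finish time yet.
        for rec in self.records:
            if rec["term"] == term and "start" in rec and "finish" not in rec:
                rec["finish"] = current_time
                return

    def tag_insert(self, current_time, term):
        # Step 910: record an insertion time for the chosen index term.
        self.records.append({"term": term, "insert": current_time})

capture = IndexCapture()
capture.tag_start(23, "fashion")   # paused at 0:23, as in the example
capture.tag_insert(63, "fashion")  # insertion time 1:03
capture.tag_stop(119, "fashion")   # finish time (assumed for illustration)
print(capture.records)
```

At step 915 the accumulated records would then be stored in the index data storage system 107.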
- a method of automatically capturing index data may be used. Such a method would be performed by an automatic index data capture system 202 as shown in Figure 2.
- the automatic index data capture system uses image recognition software to classify the content of the media file at various times, and accordingly record index terms, index start/finish times and insertion times using method steps similar to those illustrated in Figure 9.
- referring to Figure 10, a graphical user interface (GUI) 1001 for manually capturing index data is illustrated.
- the GUI 1001 comprises a display area 1002 in which the video from the media file is displayed, stop 1003 and start 1004 buttons allowing the user to pause and resume playing of the media file, and a timeline 1005 showing the current time in the media file. It is determined whether the stop button 1003 is selected by the user in step 902 of Figure 9, and whether the start button is selected by the user in step 911 of Figure 9. The current time as displayed in the timeline 1005 is recorded as the current time in step 904 of Figure 9.
- the GUI 1001 further comprises tag start buttons 1006, 1008 and tag stop buttons 1010, 1012.
- the tag start button 1006 is associated with a text input box 1007, into which a user can type a new index term to be associated with the index start time recorded when button 1006 is selected.
- Tag start button 1008 is associated with an existing index term 1009 "shoes" which has previously been recorded for this media file. According to the present example, additional buttons 1008 may be provided for each existing index term 1009, but alternative GUI methods may be readily substituted to accommodate multiple existing index terms, such as drop-down menus.
- the user selects a tag start button 1006, 1008 at step 905 in Figure 9, the current time being recorded as an index start time along with the associated index term 1007, 1009 in step 906 of Figure 9.
- tag stop buttons 1010, 1012 are provided for each index term 1011, 1013 for which an index start time exists with no corresponding index finish time, i.e. those index terms for which an index finish time is still required.
- as with the tag start buttons, where there are multiple such index terms 1011, 1013, separate tag stop buttons may be provided for each index term as in the present example, or in alternative embodiments other methods such as drop-down menus may be used.
- the user selects a tag stop button 1010, 1012 at step 907 in Figure 9, the current time being recorded as an index finish time along with the associated index term 1011, 1013 in step 908 of Figure 9.
- the GUI 1001 further comprises a tag insert button 1014.
- the tag insert button 1014 is associated with a drop-down list 1015 comprising each index term previously recorded for the media file; in alternative embodiments, a text box such as 1007 may be provided for the tag insert button, allowing a user to specify a new index term to be associated with the tag insert button 1014.
- the user selects the tag insert button 1014 at step 909, the current time being recorded as an insertion time along with the associated index term specified using the drop-down menu 1015.
- the user selects the stop button 1003 and then while the media file is paused, proceeds to select one or more of the tag start 1006, 1008, tag stop 1010, 1012, and tag insert 1014 buttons as required. Once all index start/finish times and insertion times have been recorded for the current time, the user resumes playing the media file by selecting the play button 1004.
- the stop 1003 and start 1004 buttons may be replaced by a single pause button, which changes appearance to a play button when selected, in a manner familiar to those skilled in the art.
- the GUI 1001 also comprises a save button 1016, which the user selects when they have finished recording index start/finish times and insertion times for the current media file, in order to save the recorded index data to the index data storage system 107.
- in one embodiment, if index finish times are still required when the user selects the save button 1016, the GUI presents the user with an error message requesting that they record the required index finish times.
- in an alternative embodiment, the manual index data capture apparatus 200 automatically records the required index finish times as times corresponding to the end of the media file before proceeding to save the index data as normal, and does not present the user with an error message.
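The save-time behaviour of that alternative embodiment can be sketched in a few lines; the function name and record layout are assumptions for illustration, with times in seconds.

```python
# Hypothetical sketch: on save, any index start time without a matching
# finish time is completed with the end of the media file, so the user
# is not shown an error message. Names are illustrative assumptions.

def complete_finish_times(records, media_length):
    """Fill in missing index finish times with the media file's end time."""
    for rec in records:
        if "start" in rec and "finish" not in rec:
            rec["finish"] = media_length
    return records

records = [{"term": "shoes", "start": 40}]
print(complete_finish_times(records, 120))  # 2-minute media file
```

The first embodiment would instead reject the save and prompt the user to supply the missing finish times manually.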
- a GUI 1100 is illustrated for viewing a media file along with additional content at the display apparatus 103, according to an embodiment of the present invention.
- the media file is a data stream received from a streaming media server.
- the GUI 1100 comprises a display area for the data stream 1101, stop 1102 and start 1103 buttons allowing the user to stop or start playback of the data stream, and a timeline 1104 showing the current time in the data stream and the total length of the data stream.
- the GUI 1100 further comprises display areas for displaying additional content 1105, 1106; in the present example, the additional content 1105, 1106 takes the form of static graphical images which, when clicked on using a mouse or otherwise selected by a user, are configured to link the user to further information associated with the additional content, for instance a web page for a particular brand.
- Alternative data formats for the additional content are possible, such as audio or video files, which may be substituted as appropriate.
- the additional content 1105, 1106 is displayed to the user during the times indicated by the index start and finish times, although in some embodiments of the invention it is not necessary that the additional content is displayed for the entire length of time between the start and finish times.
- the GUI 1100 further comprises a bounding box 1107. This is drawn according to screen region information included in the index data.
- the context tag 608 specifies screen region information 611 for the index term.
- the display apparatus is configured to draw a bounding box 1107 overlain on the data stream 1101, using the dimensions and coordinate information provided in the screen region information 611.
- the bounding box 1107 is only displayed for a predetermined time after the index start time, for example 2 seconds.
- the bounding box 1107 is displayed from the index start time until the index finish time 610.
- the duration for which the bounding box is displayed may be chosen according to the media file in question, in particular according to how rapidly the content of the particular media file changes.
- the screen region information may be manually or automatically captured in conjunction with the capture methods and GUIs described herein.
- the additional content being displayed at the current time in the media file comprises two adverts 1105 and 1106, both relating to the same index term.
- the user interface 1100 may be arranged to display additional content comprising any number of individual files, in such formats as plain text, image, video or audio. Additional content corresponding to multiple index terms may also be displayed simultaneously when appropriate; for example, in the case of the index data illustrated in Figure 5, multiple index terms are active at a single time, i.e. the "fashion" index times 508 overlap with the "handbag" and "casual shoe" index times 509, 510.
- the user interface 1100 may further be configured to provide a visual indication along with the bounding box 1107, to indicate the additional content to which the bounding box applies.
- Various methods may be used, for example highlighting the relevant additional content while the bounding box is displayed, or drawing a line or arrow connecting the bounding box to the relevant additional content.
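Determining which index terms are active at the current time, so that all of their additional content can be shown simultaneously, reduces to an overlapping-interval check. The sketch below is a hypothetical illustration; the function name and the "handbag" interval are assumptions, with times in seconds.

```python
# Hypothetical sketch: find every index term whose start/finish interval
# covers the current playback time, so overlapping terms such as
# "fashion" and "handbag" are displayed together. Names are assumptions.

def active_terms(current_time, index_data):
    """Return the index terms active at current_time (seconds)."""
    return [entry["term"] for entry in index_data
            if "start" in entry and "finish" in entry
            and entry["start"] <= current_time <= entry["finish"]]

index_data = [
    {"term": "fashion", "start": 23, "finish": 119},
    {"term": "handbag", "start": 50, "finish": 70},
    {"term": "casual shoe", "start": 95, "finish": 121},
]
print(active_terms(60, index_data))   # "fashion" overlaps "handbag"
print(active_terms(100, index_data))  # "fashion" overlaps "casual shoe"
```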
- Figure 12 illustrates an alternative embodiment of a GUI 1200 for viewing a data stream along with additional content at the display apparatus 103.
- the GUI 1200 comprises a display area for the data stream 1201, stop 1202 and start 1203 buttons allowing the user to stop or start playback of the data stream, and a timeline 1204 showing the current time in the data stream and the total length of the data stream.
- the additional content 1205, 1206 is displayed below the data stream 1201 rather than alongside it.
- Alternative arrangements may be readily substituted depending on the size and aspect ratio of the area available for the GUI, and the relative required sizes of display areas for the data stream and additional content.
- the index data storage system 107 is connected directly to the media server 100, and the display apparatus 103 communicates with the media server 100 via the internet 104 to receive both a media file and index data corresponding to the media file.
- although the system for providing index data has been described as providing index data corresponding to a data stream received from a streaming media server, alternative data transfer methods may be used.
- the system may be arranged for the display apparatus to fully download the media file before displaying it, in which case the display apparatus requests and fully downloads the corresponding index data and additional content before displaying the media file.
- the index data server may be arranged as an index data streaming server configured to stream the index data to the display apparatus in synchronisation with the media stream, or the media server can be arranged to stream both the media file and the index data to the display apparatus.
- the index data may be streamed as a separate data stream or embedded in the media file stream and extracted at the display apparatus.
- a limit may be placed on the maximum number of additional content files which can be displayed at any given time at the display apparatus.
- the display apparatus may randomly select which of the additional content files to display. Alternatively, the display apparatus may cycle display of the multiple additional content files.
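Both selection strategies for enforcing the display limit can be sketched together; the function and its parameters are hypothetical illustrations, not part of the described apparatus.

```python
# Hypothetical sketch of limiting the number of additional content files
# shown at once, choosing either randomly or in rotation. Names are
# illustrative assumptions.
import itertools
import random

def select_content(files, limit, strategy="random"):
    """Pick at most `limit` additional content files for display."""
    if len(files) <= limit:
        return list(files)
    if strategy == "random":
        # Randomly select which files to display.
        return random.sample(files, limit)
    # "cycle": take the next `limit` files in rotation; a real display
    # apparatus would keep the rotation state between refreshes.
    rotation = itertools.cycle(files)
    return [next(rotation) for _ in range(limit)]

print(select_content(["advert_a", "advert_b", "advert_c"], 2, "cycle"))
```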
- the additional content in the embodiments described herein comprises advertisements
- the same system for providing index data may readily be adapted to situations in which alternative forms of additional content are provided.
- the additional content may comprise commentary on certain sections of a data stream, or in the case where the data stream comprises an educational documentary, the additional content may comprise additional educational material relating to certain sections of the documentary.
- where the data stream comprises sports clips, the additional content may comprise additional information relating to individuals or teams featured in the sports clip currently playing.
- Alternative forms of additional content may be substituted as required.
- although index data has been described herein with reference to the XML format, alternative formats such as database tables or indexed text files may readily be substituted as appropriate.
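As an illustration of the XML option, a display apparatus might parse an index data file as sketched below. The element and attribute names here are assumptions for illustration only; they are not the actual schema of the Figure 6 example file.

```python
# Hypothetical sketch of reading XML index data; the element and attribute
# names are assumptions, not the schema of the Figure 6 example.
import xml.etree.ElementTree as ET

INDEX_XML = """\
<indexdata media="example_stream">
  <tag term="fashion" start="23" finish="119" insert="63"/>
  <tag term="casual shoe" start="95" finish="121"/>
</indexdata>
"""

def parse_index(xml_text):
    """Read the index terms and times out of the XML into plain dicts."""
    root = ET.fromstring(xml_text)
    return [dict(tag.attrib) for tag in root]

for entry in parse_index(INDEX_XML):
    print(entry["term"], entry.get("start"), entry.get("finish"))
```

A database-table or indexed-text-file backend would expose the same term/time records through a different retrieval layer.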
- although a particular GUI for use at the display apparatus has been described, the person skilled in the art will recognise that many alternative arrangements are possible.
- the additional content may be presented in the form of overlays, pop-ups, highlights and so on.
Abstract
The invention relates to storing and interpreting data relating to a media file to identify additional content for communication to a user in association with the playback, to the user, of the media file. A method and apparatus for generating the data for use in the identification are described, the data including at least one content descriptor used to identify the additional content and relating to the content of the media file. A method of identifying additional content includes retrieving data relating to the media file, the data including the content descriptor, and identifying the additional content on the basis of the content descriptor.
Description
Method and Apparatus for Identifying Additional Content Relating to a Media File
Field of the Invention
The invention relates to a method and apparatus for identifying additional content for communication to a user in association with the playback of a media file. A method and apparatus for generating data used to identify additional content are also provided as well as a system for communicating additional content to a user.
Background
In recent years, the availability of devices with high-speed and high-bandwidth internet connectivity, such as personal computers, mobile telephones and personal data assistants (PDAs), has increased rapidly. Media publishers have, as a result, become increasingly interested in distributing media in an electronic form via the internet. One of the most common methods of distributing media over the internet is to provide a video and audio stream to the end user, which is displayed even as it is being downloaded to the user's device.
However, one negative aspect for publishers of providing streaming media content instead of physical media such as compact discs is that the use of streaming media in this way limits the publisher's ability to charge for the downloaded material. As a result, publishers are becoming increasingly reliant on advertising to subsidise the provision of streaming media content. For advertising to provide maximum revenue, it must be relevant to both the viewer and the viewed content. Existing solutions cannot achieve this, as they typically provide advertisements in a manner similar to the familiar television commercial spot model, where advertisements that have been manually selected to correspond to programme demographics are inserted during programming. This method of providing advertisements is not only limited in terms of the relevance of the advertisements to the viewer, but also has the drawback that the video stream format of the advertising is required to match the video stream format of the programme, thus adding a further constraint on the flexibility of the system.
Summary of the Invention
The present invention aims to address the drawbacks associated with known arrangements.
According to the invention, there is provided a method of identifying additional content for communication to a user in association with the playback of a media file to the user, the method comprising retrieving data relating to the media file, the data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file and identifying the additional content on the basis of the at least one content descriptor. Additional content such as advertisements can therefore be automatically selected based on a content descriptor for the media file and communicated to a user during playback of the media file.
The data can further comprise time information associated with the at least one content descriptor, the identified additional content to be communicated to a user in association with the playback of the media file and in accordance with the time information. In this manner, the time information can, for instance, be used to specify the time during playback of the media file at which the additional content is communicated to the user.
The at least one content descriptor can relate to content of the media file at a temporal location in the media file, the temporal location being indicated by the associated time information.
The media file can comprise at least one of audio and visual data and the method can further comprise playing back the media file at a display apparatus and communicating the identified additional content to the user in association with the play back and in accordance with the time information.
The method can further comprise receiving the media file and the data relating to the media file in the form of one or more data streams received from one or more streaming servers.
The time information can comprise start and finish times, the additional content being communicated to the user at a time between the associated start time and the associated finish time.
The time information can further comprise at least one insertion time associated with at least one content descriptor, the insertion time indicative of a temporal location in the media file at which the media file is to be paused while additional content corresponding to the at least one content descriptor associated with the insertion time is communicated to the user.
The data can further comprise screen region information associated with the time information, the display apparatus interpreting the screen region information and providing a visual indication of the relevant screen region during display of the additional content.
The additional content can comprise at least one of image data, audio data and video data and at least one of the image and video data can further provide a link enabling a user to access further information relating to the additional content.
According to the invention, there is further provided a method of generating data used to identify additional content for communication to a user in association with the playback of a media file to the user, the method comprising storing data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file and is arranged for use in identifying the additional content.
The method of generating data may further comprise storing time information associated with the at least one content descriptor, the identified additional content to be communicated to a user in association with the playback of the media file and in accordance with the time information.
The method of generating data may further comprise performing computer processing of the content of the media file and automatically generating the content descriptor based on the computer processing.
According to the invention, there is further provided a computer program which, when executed, causes a method according to the invention to be performed.
According to the invention, there is also provided an apparatus for identifying additional content for communication to a user in association with the playback of a media file to the user, the apparatus comprising a storage unit arranged to store data relating to the media file, the data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file, and a processor for identifying the additional content on the basis of the at least one content descriptor.
The storage unit can be further arranged to store time information associated with the at least one content descriptor, the additional content identified by the processor to be communicated to a user in association with the playback of the media file in accordance with the time information.
The media file can comprise at least one of audio and visual data, the apparatus further configured to play back the media file and to communicate the identified additional content to the user in association with the play back and in accordance with the time information.
The apparatus can be arranged to receive the media file and the data relating to the media file in the form of one or more data streams received from one or more streaming servers.
The apparatus can be further configured to stream the media file to a display apparatus and to stream the identified additional content to the display apparatus for communication to the user in association with the play back of the media file and in accordance with the time information.
According to the invention, there is further provided an apparatus for generating data used to identify additional content for communication to a user in association with the playback of a media file to the user, the apparatus comprising means for generating and storing data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file and is arranged for use in identifying the additional content.
According to the invention, there is further provided a system for communicating additional content to a user in association with the playback of a media file to the user, the system comprising a server configuration for providing a media file, data relating to the media file, the data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file, and additional content corresponding to the at least one content descriptor, and a display apparatus and an associated processor configured to receive the media file from the server configuration, request the data relating to the media file from the server configuration, receive said data and extract the at least one content descriptor, request additional content corresponding to the at least one content descriptor from the server configuration, receive the additional content and communicate the additional content to a user in accordance with said time information and in association with the playback of the media file at the display apparatus.
According to the invention, there is also provided a computer-readable storage medium storing data defining additional content for communication to a user in association with the playback of a media file to the user, the data comprising at least one content descriptor and associated time information, wherein the content descriptor defines the additional content and relates to content of the media file at a time during playback of the media file, the time being indicated by the associated time information.
Brief Description of the Drawings
Embodiments of the invention will now be described, by way of reference to the accompanying drawings, in which:
Figure 1 schematically illustrates a system for identifying and displaying additional content, according to an embodiment of the present invention; Figure 2 schematically illustrates a system for manually or automatically capturing index data for a data file, according to an embodiment of the present invention; Figure 3 schematically illustrates the structure of the manual capture apparatus of Figure 2;
Figure 4 schematically illustrates the structure of both the automatic capture apparatus and the index data storage system of Figure 2; Figure 5 schematically illustrates the relationship between index data and the timeline of a media file, according to an embodiment of the present invention; Figure 6 illustrates an example index data file in XML format, according to an embodiment of the present invention;
Figure 7 is a flow diagram illustrating the steps performed in requesting and displaying additional content relating to a media file, according to an embodiment of the present invention;
Figure 8 is a flow diagram illustrating the steps performed in using index data to control the display of additional content at a display apparatus, according to an embodiment of the present invention; Figure 9 is a flow diagram illustrating the steps performed in manually capturing index data according to an embodiment of the present invention;
Figure 10 schematically illustrates a graphical user interface for manually capturing index data, according to an embodiment of the present invention; Figure 11 schematically illustrates a user interface for viewing a data file along with additional data linked to a particular screen region of the displayed data file, according to an embodiment of the present invention;
Figure 12 schematically illustrates a user interface for viewing a data file along with additional data, according to an embodiment of the present invention; and Figure 13 schematically illustrates a system for providing index data with a data file, according to an embodiment of the present invention.
Detailed Description
Referring to Figure 1, a system for defining and identifying additional content for communication to a user during playback of a media file includes a media server 100, an additional content server 101 and an index data server 102, each being connected to a display apparatus 103 via a computer network, in this example the internet 104. A media storage unit 105 is connected to the media server 100, an additional content storage unit 106 is connected to the additional content server 101 and an index data storage system 107 is connected to the index data server 102.
Any of the storage units or system 105, 106, 107 can alternatively or additionally be connected directly to the internet 104 to transfer stored data, for instance under the control of a respective server 100, 101, 102.
In the present example, the media server 100 is a streaming media server, providing a streamed audio-visual data file, hereinafter referred to as a "data stream", to the display apparatus 103. The additional content server 101 is an advertisement server providing advertisements to be displayed in conjunction with a data stream. Streaming media servers and advertisement servers such as these will be familiar to those skilled in the art, and are not therefore described in detail here. Furthermore, in the present example, the display apparatus 103 is a typical desktop computer, but an alternative device such as a mobile telephone, laptop computer, personal data assistant or television may readily be substituted as appropriate.
Referring to Figure 2, a system according to an embodiment of the present invention for manually or automatically capturing index data for a media stream is illustrated, the system including the index data storage system 107 as shown in Figure 1, connected via the internet 104 to both a manual index data capture apparatus 200 and an automatic index data capture apparatus 202. Although in the present example the index data storage system 107 is shown to be connected via the internet 104 to both the manual apparatus 200 and the automatic apparatus 202, the person skilled in the art will readily appreciate that other connections may be substituted as appropriate, such as wired local area network (LAN) or wireless (WLAN) connections.
In alternative embodiments, either or both of the manual index data capture apparatus 200 and the automatic index data capture apparatus 202 are provided in a single physical device, which can either automatically generate and record index data for a media file, or be operated by a user to manually generate and record index data for a media file.
Figure 3 illustrates the arrangement of components of the manual capture apparatus 200. The manual capture apparatus 200 includes a network interface unit 304, connected to each of a storage unit 303, user interface 300 and a processing unit 301, the processing unit 301 being further connected to a memory 302. In the present example, the user interface 300 is provided in the form of a standard keyboard, mouse and computer monitor combination, but in alternative embodiments the manual capture apparatus may be controlled via a web browser from a remote computer. The network interface unit 304 is also connected to the internet 104, through which the manual capture apparatus 200 may connect to other devices as seen in Figure 2.
Figure 4 illustrates the structure of both the automatic index data capture system 202 and the index data storage system 107. Each system includes a network interface unit 404, connected to a storage unit 403 and a processing unit 401, the processing unit 401 being further connected to a memory 402. The network interface unit 404 is connected to the internet 104 in the present example; in alternative embodiments, other network connections may be substituted as appropriate. The storage unit 403 in the present example is a magnetic hard disk, although any form of machine-readable memory may readily be substituted.
The automatic index data capture apparatus 202 uses image processing software stored in the storage unit 403 to analyse the content of a data stream received via the network interface unit 404, and accordingly generate index terms and index times for the data stream. This information is saved as index data and transmitted to the index data storage system 107 via the network interface unit 404. The storage unit 403 is also used for storing any other software programs for controlling the automatic index data capture system 202, the programs being executed by the processing unit 401 operating in conjunction with the memory 402.
Also referring to Figure 4, the index data storage system 107 receives index data via the network interface unit 404, and stores the index data in the storage unit 403. On receiving a request for index data corresponding to a particular data stream, the processing unit 401 searches the storage unit 403 and fetches the appropriate index data file, which is then transmitted to the source of the request via the network interface unit 404.
Referring now to Figure 5, the relationship between index data and the timeline of a data stream 507 is schematically illustrated. Index data comprises both index terms 508, 509, 510 and index times 501, 502, 503, 504, 505, 506. In the present example, the data stream 507 comprises both video 511 and audio 512 streams, and the index times 501, 502, 503, 504, 505, 506 correspond directly to times within the audio and video streams. The index times 501, 502, 503, 504, 505, 506 are chosen either by a manual or automatic process, to correspond to segments of the data stream 507 containing material relating to the associated one of the index terms 508, 509, 510. Snapshot images 513, 514, 515 are provided from the video stream 511 for illustrative purposes only, by way of example of the type of content material for which it may be desired to provide index terms and index times.
The index times 501, 502, 503, 504, 505, 506 in the present example further comprise start times 501, 502, 505, finish times 503, 506, and an insertion time 504. Each start time 501 has a corresponding finish time 506, and an associated index term 508. Additional content corresponding to the index term 508 is displayed to a user along with the data stream 507 during the period of the data stream 507 as indicated by the start and finish times 501, 506.
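For illustration only, the index data described above — index terms with paired start/finish times, plus optional insertion times — can be modelled as a small set of record types. The class and field names below are illustrative choices, not terminology from the specification:

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class ContextTag:
    index_term: str    # e.g. "fashion" (index term 508)
    start_time: float  # seconds from the start of the data stream
    finish_time: float # content is shown between start and finish

@dataclass
class InsertionTag:
    index_term: str      # content shown while the stream is paused
    insertion_time: float # e.g. 63.0 for the 1:03 insertion time

@dataclass
class IndexData:
    media_title: str
    context_tags: list[ContextTag] = field(default_factory=list)
    insertion_tags: list[InsertionTag] = field(default_factory=list)

# A context tag with both a start and a finish time, as required above.
tag = ContextTag("fashion", 23.0, 120.0)
idx = IndexData("Deep Pink Catwalk Live", context_tags=[tag])
```

Each start time carries its finish time in the same record, which makes the constraint that every start time has a corresponding finish time structural rather than merely conventional.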
In the event that the index data further comprises an insertion time 504 provided for an index term 508, streaming of the data stream 507 is paused at the insertion time 504, and additional content displayed to the user in place of the data stream. In the present example, this additional content takes the form of a conventional audio-visual advertisement 516. When the advertisement 516 has finished, streaming and display of the data stream 507 resumes. In this way, the present invention allows additional content files to be presented either simultaneously with
a data stream 507 via the provision of start and finish times, or sequentially via the provision of an insertion time.
There is no limit on the number of insertion times and start/finish times which may be provided in the index data for any given data stream, so long as each insertion time and pair of start/finish times has an associated index term. Furthermore, although the present example illustrates an index term "fashion" 508 having both start/finish times 501, 506 and an insertion time 504, it is entirely possible for any given index term to have only start and finish times provided, or only insertion times provided.
An index term, also referred to as a content descriptor, is in the present example a keyword relating to the content of a media file at any given time or over a time period, or, in cases where screen region information is provided as will be described below, relating to the content of a particular screen region of a media file at a given time. The keyword is used to identify the additional content to be communicated to the user.
Figure 6 illustrates index data 600 corresponding to the timeline illustrated in Figure 5, in an XML file format. The index data 600 comprises a media reference section 601 which provides information about the media file to which the index data relates. In the present example, the media file is a video stream with the title "Deep Pink Catwalk Live", encoded in the MP4 format and with a length of 2:00 minutes. The index data 600 further comprises a tag dictionary 602 which contains a listing of the index terms 603, followed by a timeline section 604 which contains a plurality of context tags and insertion tags 612.
A context tag 608 comprises an index term 609, index times 610, and screen region information 611. The screen region information 611 is optional; for example, context tag 606 includes an index term and index times, but no screen region information. The screen region information 611 contains both x and y coordinates, as well as height and width dimensions, allowing a bounding box to be drawn overlaying the displayed video file.
According to the present example, the tag dictionary 602 contains an index term "fashion" 605, and the timeline section 604 contains both context tag 606 and insertion tag 607 associated with this index term 605. A display apparatus would therefore display additional content corresponding to the index term "fashion" during the index times provided in context tag 606, i.e. from 0:23 into the playback of the media file until 2:00 into the display of the file. Furthermore, on reaching the insertion time 1:03 indicated by the insertion tag 607, the display apparatus would pause the media file and display an additional content file comprising an audio-visual advertisement corresponding to the index term "fashion".
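The XML layout of Figure 6 is not reproduced here, so the sketch below assumes a plausible schema with a tag dictionary, context tags (optionally carrying screen region attributes), and insertion tags; the actual element and attribute names in the figure may differ:

```python
import xml.etree.ElementTree as ET

# Assumed schema modelled on the description of Figure 6.
INDEX_XML = """
<indexdata>
  <media title="Deep Pink Catwalk Live" format="MP4" length="2:00"/>
  <tagdictionary>
    <term>fashion</term>
    <term>casual shoe</term>
  </tagdictionary>
  <timeline>
    <context term="fashion" start="0:23" finish="2:00"/>
    <context term="casual shoe" start="1:34" finish="2:01"
             x="120" y="80" width="60" height="90"/>
    <insertion term="fashion" time="1:03"/>
  </timeline>
</indexdata>
"""

def to_seconds(mmss: str) -> int:
    """Convert a m:ss timestamp such as 1:03 to seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

root = ET.fromstring(INDEX_XML)
terms = [t.text for t in root.find("tagdictionary")]
insertions = [(i.get("term"), to_seconds(i.get("time")))
              for i in root.find("timeline").findall("insertion")]
```

The tag dictionary gives the display apparatus the full list of index terms up front (used in step 707 of Figure 7), independently of where those terms appear in the timeline section.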
A method by which additional content relating to a media file may be requested and displayed according to an embodiment of the present invention, will now be explained with reference to Figure 7. The method steps are described with reference to the devices and components illustrated in Figures 1, 3 and 4, with reference numerals provided where appropriate.
The method steps in Figure 7 are divided into those performed at an additional content server 101, those performed at a display apparatus 103, and those performed at an index storage system 107. The display apparatus 103 identifies a media file from a media server 100 for displaying to a user (step 701), for instance in response to a user request to view the media file received at a user interface associated with the display apparatus. In the present example, the media server 100 is a streaming media server, and accordingly the media file is a data stream. The display apparatus requests index data corresponding to the media file from the index data storage system 107 (step 702), via the index data server 102.
The index data storage system 107 receives the request from index data server 102 (step 703), and retrieves index data from the storage unit 403 (step 704). The index data is then transmitted to the display apparatus 103 (step 705), via the index data server. In the present example, the index data is transmitted in its entirety in this
step; however, in other embodiments, the index data may be streamed to the display apparatus 103 in synchronisation with the media stream.
Although the request for index data is received via the index data server 102 in the above described embodiment, alternatively, the index data storage system can be arranged to receive the request for index data directly from the display apparatus 103 via the network interface unit 404, and to respond directly, without the need for the index data server 102.
The display apparatus 103 receives the index data via the network interface unit 304 (step 706). The display apparatus 103 then extracts index terms from the index data (step 707). In the present example, the index data is formatted according to the XML standard as shown in Figure 6, with the index terms 603 being contained in a tag dictionary 602. Continuing the explanation of the method steps with reference to Figure 7, the display apparatus 103 transmits via the network interface unit 304 a request for additional content corresponding to the index terms 603 to the additional content server 101 (step 708). Since in the present example the additional content server 101 is an advertisement server, this step comprises requesting advertisements corresponding to the index terms.
The additional content server 101 receives the request via the network interface unit 404 (step 709) and retrieves the relevant additional content files from the storage unit 403 (step 710). In the case where a plurality of additional content files in the storage unit 403 all correspond to the same index term, the additional content server 101 randomly selects one of the plurality of additional content files for transmission to the display apparatus 103. The relevant additional content files are transmitted via the network interface unit 404 to the display apparatus 103 (step 711).
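The random selection performed by the additional content server when several stored files match one index term can be sketched as follows; the catalogue contents and file names are purely illustrative:

```python
import random

# Illustrative store of additional content files keyed by index term.
CATALOGUE = {
    "fashion": ["fashion_ad_1.mp4", "fashion_ad_2.mp4", "fashion_ad_3.mp4"],
    "casual shoe": ["shoe_ad.png"],
}

def select_additional_content(index_term: str) -> str:
    """Pick one file for the term; random when several match (step 710)."""
    candidates = CATALOGUE[index_term]
    return random.choice(candidates)
```

When only one file corresponds to the index term, the random choice degenerates to returning that file.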
The display apparatus 103 receives the additional content via the network interface unit 304 (step 712), and finally displays the additional content in association with the original media file based on the index data (step 713).
According to the present example, the method illustrated in Figure 7 applies to a system such as that illustrated in Figure 1, comprising an additional content server 101, a display apparatus 103 and an index data storage system 107. However, the person skilled in the art may readily combine several of these devices into a single device as appropriate, for example by providing a single server connected to the media storage unit 105 and index data storage system 107 and performing all tasks previously performed by the separate streaming media server 100 and index data server 102.
Referring now to Figure 8, a method is described by which the display apparatus 103 uses index data to control the display of additional content along with a media file, the method of Figure 8 corresponding to step 713 in Figure 7. In the present example, the index data is that shown in Figure 6, but the method illustrated in Figure 8 may readily be applied to any set of index data. Furthermore, since in the present example the media file is received from a streaming media server, the media file is accordingly a data stream. The method of Figure 8 applies in the case in which the index data is downloaded in its entirety once the media file has been identified (see Figure 7), but a method similar to that described in Figure 8 may be applied in embodiments in which the index data is streamed in synchronisation with a media file.
The display apparatus 103 begins streaming the media file from the media server 100 (step 801) and subsequently displaying the streamed media file (step 802). During this process, it is checked whether the end of the media file has been reached (step 803); if so, the process ends, but if not, the display apparatus 103 determines the current time in the media file (step 804).
The current time as determined in step 804 is compared to the index start times (step 805). According to the present example the index data is formatted according to the XML standard as shown in Figure 6, with index start and finish times 610 being provided in the timeline section 604. If an index start time matching the current time is found, the process continues to step 806, but if not, the process
continues directly to step 809. In the present example, if the current time is 1:34, it is found that there is a matching index start time 610 (step 805).
The index term associated with the index start time found during step 805 is retrieved (step 806). For the start and finish times 610 the corresponding index term 609 is "casual shoe". The additional content corresponding to the index term is then retrieved (step 807), and displayed along with the media file (step 808). In the present example, the additional content would be that additional content which corresponds to the index term 609 "casual shoe". The additional content can, for instance, be an image file advertising a particular brand of casual shoes. The display apparatus 103 continues to display the additional content and proceeds to step 809.
The current time is compared to the index finish times (step 809); if an index finish time matching the current time is found, the process continues to step 810, but if not, the process continues directly to step 812. In the present example, if the current time is 2:01, it is found that there is a matching index finish time 610 (step 809).
The index term associated with the index finish time found during step 809 is retrieved (step 810). For the start and finish times 610 the corresponding index term 609 is "casual shoe". The display apparatus 103 stops displaying any additional content corresponding to the index term found during step 810 (step 811); in the present example, this would be any additional content corresponding to the index term 609 "casual shoe", for example the image file advertising a particular brand of casual shoes. The process then continues to step 812.
The current time is compared to the insertion times (step 812); if an insertion time matching the current time is found, the process continues to step 813, but if not, the process returns to step 803. In the present example, if the current time is 1:03, it is found that there is a matching insertion time 607 (step 812).
The index term associated with the insertion time found during step 812 is retrieved (step 813). For the insertion time 607 the corresponding index term is "fashion".
The display apparatus 103 then pauses display of the media file (step 814) and displays the additional content associated with the index term found during step 813 (step 815). In the present example, the additional content is provided as a conventional audio-visual advertisement of predetermined length, corresponding to the index term "fashion". Accordingly, the audio-visual advertisement is displayed until the end of the advertisement is reached (step 815), at which point display of the media file resumes (step 816), the process then returning to step 803.
It is again checked whether the end of the media file has been reached (step 803); if so, the process is complete, if not, the process repeats the steps as described above.
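The per-iteration checks of Figure 8 (steps 804 to 813) can be sketched as a single function called once per determined current time. Times are in seconds (so 1:34 is 94 and 2:01 is 121, per the example); the dictionary keys and the `active` set are illustrative names, not terminology from the specification:

```python
def tick(current, context_tags, insertion_tags, active):
    """Return (terms to show, terms to hide, term to pause for, or None)."""
    # Step 805/806: index start times matching the current time.
    show = [c["term"] for c in context_tags if c["start"] == current]
    # Step 809/810: index finish times matching the current time.
    hide = [c["term"] for c in context_tags if c["finish"] == current]
    # Step 812/813: insertion times matching the current time.
    pause = next((i["term"] for i in insertion_tags
                  if i["time"] == current), None)
    active |= set(show)
    active -= set(hide)
    return show, hide, pause

active = set()
ctx = [{"term": "casual shoe", "start": 94, "finish": 121}]
ins = [{"term": "fashion", "time": 63}]
```

Displaying the matched additional content (steps 807/808), removing it (step 811), and pausing/resuming the stream (steps 814 to 816) would be driven by the three returned values.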
Although in the present example, the additional content displayed between the start and finish times and/or at the insertion time comprise advertisements in the form of images, it will be readily appreciated that other forms of data may be substituted. For example, a video or audio file comprising commentary on a certain portion of the media file may be displayed or inserted through the use of the start and finish times or an insertion time. Alternatively, a static advertisement or message, for instance in the form of text, may be presented for a predetermined length of time between the start and finish times or while the media file is paused.
Referring now to Figure 9, a method is illustrated by which index data may be manually captured for a media file, according to an embodiment of the present invention.
The manual index data capture apparatus 200 begins playing the media file (step 901). If the user selects to pause the media file (step 902), the process continues to step 903; if not, then the manual capture apparatus 200 continues to display the media file and proceeds to step 913. If the user selects to pause the media file, the media file is paused (step 903), and the current time in the media file is recorded (step 904) before continuing to step 905.
It is then determined whether the user selects the "tag start" command (step 905). If so, the process continues to step 906, but if not, the process skips to step 907. If
the user selects the "tag start" command, the current time as recorded in step 904 is recorded as an index start time for a corresponding index term as selected by the user in response to a prompt displayed to the user, for instance in the form of a drop-down list of available index terms (step 906). The process then continues to step 907.
It is then determined whether the user selects the "tag stop" command (step 907). If so, the process continues to step 908, but if not, the process skips to step 909. If the user selects the "tag stop" command, the current time as recorded in step 904 is recorded as an index finish time for a corresponding index term as selected by the user in response to a prompt displayed to the user (step 908). The process then continues to step 909.
It is then determined whether the user selects the "tag insertion time" command (step 909). If so, the process continues to step 910, but if not, the process skips to step 911. If the user selects the "tag insertion time" command, the current time as recorded in step 904 is recorded as an insertion time for a corresponding index term as selected by the user in response to a prompt displayed to the user (step 910). The process then continues to step 911.
It is then determined whether the user selects to resume playing the media file (step 911). If not, the process returns to step 905 and the user may continue to add index start/finish times or insertion times. If the user selects to resume playing the media file, playing of the media file is resumed (step 912), and it is checked whether the end of the media file has been reached (step 913). If so, the process continues to step 914, but if not the process returns to step 902 allowing the user to pause the media file again if and when it is required, and add further index start/finish times and insertion times.
The user is asked whether they wish to save the recorded index start/finish times and insertion times, along with associated index terms, as index data for the current media file (step 914). If not, the process is completed, but if so, the manual index data capture apparatus 200 stores the index data in the index data storage system 107 (step 915).
In the present example, to record the index data illustrated in Figure 6 a user selects to begin playing the media file (step 901). At 0:23, the user selects to pause the media file (step 902). The process then continues through steps 903 and 904 as described above, and the user selects the "tag start" command (step 905) and specifies the index term "fashion" (step 906), recording the current time (i.e. 0:23) as an index start time for the index term "fashion". There are no other index times or insertion times to be recorded at 0:23, and so the process continues through steps 907 and 909 to step 911, when the user selects to resume playing the media file. It is then determined that the media file is not yet finished, since it is currently only 23 seconds through a 2-minute long media file (step 913). The user then continues through steps 902 to 913 as above to record the remaining index times, insertion times and associated index terms as required.
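The tagging commands of steps 905 to 910 can be sketched as a small recorder that accumulates index data while the file is paused. The class and method names are illustrative; times are in seconds (0:23 is 23, 1:03 is 63):

```python
class IndexRecorder:
    """Accumulates index times while the media file is paused."""

    def __init__(self):
        self.context = []      # completed (term, start, finish) triples
        self.open_starts = {}  # term -> start time awaiting a finish
        self.insertions = []   # (term, insertion time) pairs

    def tag_start(self, term, current):   # "tag start" command, step 906
        self.open_starts[term] = current

    def tag_stop(self, term, current):    # "tag stop" command, step 908
        start = self.open_starts.pop(term)
        self.context.append((term, start, current))

    def tag_insertion(self, term, current):  # "tag insertion time", step 910
        self.insertions.append((term, current))

rec = IndexRecorder()
rec.tag_start("fashion", 23)     # user pauses at 0:23 and tags a start
rec.tag_insertion("fashion", 63) # user pauses at 1:03 and tags an insertion
rec.tag_stop("fashion", 120)     # user pauses at 2:00 and tags the finish
```

The `open_starts` mapping also identifies index terms that still need a finish time, which is the condition the save step must check before writing the index data.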
As an alternative to the manual index data capture method illustrated in Figure 9, a method of automatically capturing index data may be used. Such a method would be performed by an automatic index data capture system 202 as shown in Figure 2. In this embodiment, the automatic index data capture system uses image recognition software to classify the content of the media file at various times, and accordingly record index terms, index start/finish times and insertion times using method steps similar to those illustrated in Figure 9.
A graphical user interface (GUI) for manually capturing index data corresponding to a media file, according to an embodiment of the present invention, will now be described with reference to Figure 10. Such a GUI may be used to perform the method illustrated in Figure 9, and the features of the GUI are described below with reference to the relevant method steps of Figure 9.
The GUI 1001 comprises a display area 1002 in which the video from the media file is displayed, stop 1003 and start 1004 buttons allowing the user to pause and resume playing of the media file, and a timeline 1005 showing the current time in the media
file. It is determined whether the stop button 1003 is selected by the user in step 902 of Figure 9, and whether the start button is selected by the user in step 911 of Figure 9. The current time as displayed in the timeline 1005 is recorded as the current time in step 904 of Figure 9.
The GUI 1001 further comprises tag start buttons 1006, 1008 and tag stop buttons 1010, 1012. In the present example, the tag start button 1006 is associated with a text input box 1007, into which a user can type a new index term to be associated with the index start time recorded when button 1006 is selected. Tag start button 1008 is associated with an existing index term 1009 "shoes" which has previously been recorded for this media file. According to the present example, additional buttons 1008 may be provided for each existing index term 1009, but alternative GUI methods may be readily substituted to accommodate multiple existing index terms, such as drop-down menus. The user selects a tag start button 1006, 1008 at step 905 in Figure 9, the current time being recorded as an index start time along with the associated index term 1007, 1009 in step 906 of Figure 9.
According to the present example, tag stop buttons 1010, 1012 are provided for each index term 1011, 1013 for which an index start time exists with no corresponding index finish time, i.e. those index terms for which an index finish time is still required. As with the tag start buttons, in the case where there are multiple such index terms 1011, 1013, separate tag stop buttons may be provided for each index term as in the present example, or in alternative embodiments other methods such as drop-down menus may be used. The user selects a tag stop button 1010, 1012 at step 907 in Figure 9, the current time being recorded as an index finish time along with the associated index term 1011, 1013 in step 908 of Figure 9.
The GUI 1001 further comprises a tag insert button 1014. According to the present example, the tag insert button 1014 is associated with a drop-down list 1015 comprising each index term previously recorded for the media file; in alternative embodiments, a text box such as 1007 may be provided for the tag insert button, allowing a user to specify a new index term to be associated with the tag insert button 1014. The user selects the tag insert button 1014 at step 909, the current
time being recorded as an insertion time along with the associated index term specified using the drop-down menu 1015.
To record an index start/finish time or insertion time, the user selects the stop button 1003 and then, while the media file is paused, proceeds to select one or more of the tag start 1006, 1008, tag stop 1010, 1012, and tag insert 1014 buttons as required. Once all index start/finish times and insertion times have been recorded for the current time, the user resumes playing the media file by selecting the start button 1004. In alternative embodiments, the stop 1003 and start 1004 buttons may be replaced by a single pause button, which changes appearance to a play button when selected in a manner familiar to those skilled in the art.
The GUI 1001 also comprises a save button 1016, which the user selects when they have finished recording index start/finish times and insertion times for the current media file, in order to save the recorded index data to the index data storage system 107.
In the event that a user chooses to save index data comprising at least one index start time without an associated index finish time, according to the present invention the GUI presents the user with an error message requesting them to record the required index finish times. In an alternative embodiment, the manual index data capture apparatus 200 automatically records the required index finish times as times corresponding to the end of the media file before proceeding to save the index data as normal, and does not present the user with an error message.
Referring now to Figure 11, a GUI 1100 is illustrated for viewing a media file along with additional content at the display apparatus 103, according to an embodiment of the present invention. In the present example, the media file is a data stream received from a streaming media server. The GUI 1100 comprises a display area for the data stream 1101, stop 1102 and start 1103 buttons allowing the user to stop or start playback of the data stream, and a timeline 1104 showing the current time in the data stream and the total length of the data stream. The GUI 1100 further comprises display areas for displaying additional content 1105, 1106; in the present
example, the additional content 1105, 1106 takes the form of static graphical images which, when clicked on using a mouse or otherwise selected by a user, are configured to link the user to further information associated with the additional content, for instance a web page for a particular brand. Alternative data formats for the additional content are possible, such as audio or video files, which may be substituted as appropriate. The additional content 1105, 1106 is displayed to the user during the times indicated by the index start and finish times, although in some embodiments of the invention it is not necessary that the additional content is displayed for the entire length of time between the start and finish times.
The GUI 1100 further comprises a bounding box 1107. This is drawn according to screen region information included in the index data. In the present example, for the index data illustrated in Figure 6, the context tag 608 specifies screen region information 611 for the index term. Accordingly, when the index start time 610 is reached in the data stream 1101, the display apparatus is configured to draw a bounding box 1107 overlain on the data stream 1101, using the dimensions and coordinate information provided in the screen region information 611. In the present example, the bounding box 1107 is only displayed for a predetermined time after the index start time, for example 2 seconds. However, according to another embodiment, the bounding box 1107 is displayed from the index start time until the index finish time 610. Alternatively, the duration for which the bounding box is displayed may be chosen according to the media file in question, in particular according to how rapidly the content of the particular media file changes. The screen region information may be manually or automatically captured in conjunction with the capture methods and GUIs described herein.
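The decision of whether, and where, to draw the bounding box 1107 at a given moment can be sketched as follows, using the 2-second display window from the example above; the region dictionary keys mirror the x/y/width/height screen region information, and the concrete values are illustrative:

```python
BOX_DISPLAY_SECONDS = 2  # predetermined time after the index start time

def bounding_box_visible(current, start, region,
                         display_for=BOX_DISPLAY_SECONDS):
    """Return the (x, y, w, h) box to overlay on the stream, or None."""
    if region is None:
        return None  # context tag carries no screen region information
    if start <= current < start + display_for:
        return (region["x"], region["y"], region["width"], region["height"])
    return None

# Illustrative screen region information for one context tag.
region = {"x": 120, "y": 80, "width": 60, "height": 90}
```

Displaying the box until the index finish time instead, as in the other embodiment described above, would simply replace the `start + display_for` bound with the finish time.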
Furthermore, in the present example, the additional content being displayed at the current time in the media file comprises two adverts 1105 and 1106, both relating to the same index term. However, in alternative embodiments, the user interface 1100 may be arranged to display additional content comprising any number of individual files, in such formats as plain text, image, video or audio. Additional content corresponding to multiple index terms may also be displayed simultaneously when appropriate; for example in the case of the index data illustrated in Figure 5,
multiple index terms are active at a single time, i.e. the index times for the "fashion" index term 508 overlap with those for the "handbag" 509 and "casual shoe" 510 index terms.
For the case in which a plurality of additional content files corresponding to a plurality of index terms are displayed at a single time, the user interface 1100 may further be configured to provide a visual indication along with the bounding box 1107, to indicate the additional content to which the bounding box applies. Various methods may be used, for example highlighting the relevant additional content while the bounding box is displayed, or drawing a line or arrow connecting the bounding box to the relevant additional content.
Figure 12 illustrates an alternative embodiment of a GUI 1200 for viewing a data stream along with additional content at the display apparatus 103. As in the GUI 1100 of Figure 11, the GUI 1200 comprises a display area for the data stream 1201, stop 1202 and start 1203 buttons allowing the user to stop or start playback of the data stream, and a timeline 1204 showing the current time in the data stream and the total length of the data stream. However, in the GUI 1200 the additional content 1205, 1206 are displayed below the data stream 1201 rather than alongside it. Alternative arrangements may be readily substituted depending on the size and aspect ratio of the area available for the GUI, and the relative required sizes of display areas for the data stream and additional content.
Referring now to Figure 13, an alternative embodiment of a system for providing index data with a media file is illustrated, according to an embodiment of the present invention. In this embodiment, the index data storage system 107 is connected directly to the media server 100, and the display apparatus 103 communicates with the media server 100 via the internet 104 to receive both a media file and index data corresponding to the media file.
Although various embodiments of the present invention have been described herein with reference to the Figures, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope of the invention as defined by the accompanying claims.
Whilst the system for providing index data has been described as providing index data corresponding to a data stream received from a streaming media server, alternative data transfer methods may be used. For example, the system may be arranged for the display apparatus to fully download the media file before displaying it, in which case the display apparatus requests and fully downloads the corresponding index data and additional content before displaying the media file. Alternatively, the index data server may be arranged as an index data streaming server configured to stream the index data to the display apparatus in synchronisation with the media stream, or the media server can be arranged to stream both the media file and the index data to the display apparatus. In the latter example, the index data may be streamed as a separate data stream or embedded in the media file stream and extracted at the display apparatus.
Additionally, it will be appreciated by the person skilled in the art that although certain devices are shown to communicate via the internet or via a direct link, alternative communication links may be substituted where appropriate.
In an alternative embodiment of the index data providing system, a limit may be placed on the maximum number of additional content files which can be displayed at any given time at the display apparatus. In the event where, at a given time in the data stream, the index data specifies a greater number of index terms for which additional content should be displayed than this maximum number, the display apparatus may randomly select which of the additional content files to display. Alternatively, the display apparatus may cycle display of the multiple additional content files.
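The random-selection variant of this display limit can be sketched as below; the value of the maximum, and the sorting used to make the sample reproducible over a set, are illustrative choices:

```python
import random

MAX_SLOTS = 2  # illustrative maximum number of additional content files

def choose_displayed(active_terms, max_slots=MAX_SLOTS):
    """Pick which additional content to show when too many terms are active."""
    if len(active_terms) <= max_slots:
        return list(active_terms)
    # Random subset when the index data specifies more terms than slots.
    return random.sample(sorted(active_terms), max_slots)
```

The cycling alternative described above would instead rotate through the active terms on a timer, showing each subset in turn.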
Additionally, whilst the additional content in the embodiments described herein comprises advertisements, the same system for providing index data may readily be adapted to situations in which alternative forms of additional content are provided. For example, the additional content may comprise commentary on certain sections of a data stream, or in the case where the data stream comprises an educational documentary, the additional content may comprise additional educational material
relating to certain sections of the documentary. Similarly, if the data stream comprises sports clips, the additional content may comprise additional information relating to individuals or teams featured in the sports clip currently playing. Alternative forms of additional content may be substituted as required.
Furthermore, although index data has been described herein with reference to the XML format, alternative formats such as database tables or indexed text files may readily be substituted as appropriate.
Although the GUI for use at the display apparatus has been described as displaying the additional content files alongside the data stream, the person skilled in the art will recognise that many alternative arrangements are possible. For example, the additional content may be presented in the form of overlays, pop-ups, highlights and so on.
Claims
1. A method of identifying additional content for communication to a user in association with the playback of a media file to the user, the method comprising: retrieving data relating to the media file, the data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file; and identifying the additional content on the basis of the at least one content descriptor.
2. A method according to claim 1, wherein the data further comprises time information associated with the at least one content descriptor, the identified additional content to be communicated to a user in association with the playback of the media file and in accordance with the time information.
3. A method according to claim 2, wherein the at least one content descriptor relates to content of the media file at a temporal location in the media file, the temporal location being indicated by the associated time information.
4. A method according to claim 2 or 3, wherein the media file comprises at least one of audio and visual data, the method further comprising: playing back the media file at a display apparatus and communicating the identified additional content to the user in association with the play back and in accordance with the time information.
5. A method according to claim 4, further comprising receiving the media file and the data relating to the media file in the form of one or more data streams received from one or more streaming servers.
6. A method according to claim 4 or 5, wherein the time information comprises start and finish times, the additional content being communicated to the user at a time between the associated start time and the associated finish time.
7. A method according to claim 4, 5 or 6, wherein the time information further comprises at least one insertion time associated with at least one content descriptor, the insertion time indicative of a temporal location in the media file at which the media file is to be paused while additional content corresponding to the at least one content descriptor associated with the insertion time is communicated to the user.
8. A method according to any one of claims 4 to 7, wherein the data further comprises screen region information associated with the time information, the display apparatus interpreting the screen region information and providing a visual indication of the relevant screen region during display of the additional content.
9. A method according to any one of the preceding claims, wherein the additional content comprises at least one of image data, text data, audio data and video data.
10. A method according to any one of the preceding claims, wherein at least one of the image, text and video data further provides a link enabling a user to access further information relating to the additional content.
11. A method of generating data used to identify additional content for communication to a user in association with the playback of a media file to the user, the method comprising: storing data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file and is arranged for use in identifying the additional content.
12. A method according to claim 11, further comprising: storing time information associated with the at least one content descriptor, the identified additional content to be communicated to a user in association with the playback of the media file and in accordance with the time information.
13. A method according to claim 11 or 12, further comprising performing computer processing of the content of the media file and automatically generating the content descriptor based on the computer processing.
14. A computer program which, when executed, causes a method according to any one of the preceding claims to be performed.
15. Apparatus for identifying additional content for communication to a user in association with the playback of a media file to the user, the apparatus comprising: a storage unit arranged to store data relating to the media file, the data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file; and a processor for identifying the additional content on the basis of the at least one content descriptor.
16. Apparatus according to claim 15, wherein the storage unit is further arranged to store time information associated with the at least one content descriptor, the additional content identified by the processor to be communicated to a user in association with the playback of the media file and in accordance with the time information.
17. Apparatus according to claim 16, wherein the media file comprises at least one of audio and visual data, the apparatus further configured to play back the media file and to communicate the identified additional content to the user in association with the play back and in accordance with the time information.
18. Apparatus according to claim 17, wherein the apparatus is arranged to receive the media file and the data relating to the media file in the form of one or more data streams received from one or more streaming servers.
19. Apparatus according to claim 16, the apparatus further configured to stream the media file to a display apparatus and to stream the identified additional content to the display apparatus for communication to the user in association with the play back of the media file and in accordance with the time information.
20. Apparatus for generating data used to identify additional content for communication to a user in association with the playback of a media file to the user, the apparatus comprising: means for generating and storing data comprising at least one content descriptor, wherein the at least one content descriptor relates to content of the media file and is arranged for use in identifying the additional content.
21. A system for communicating additional content to a user in association with the playback of a media file to the user, the system comprising: a server configuration for providing a media file, data relating to the media file, the data comprising at least one content descriptor and time information associated therewith, wherein the at least one content descriptor relates to content of the media file, and additional content corresponding to the at least one content descriptor; and a display apparatus and an associated processor configured to receive the media file from the server configuration, request the data relating to the media file from the server configuration, receive said data and extract the at least one content descriptor, request additional content corresponding to the at least one content descriptor from the server configuration, receive the additional content and communicate the additional content to a user in accordance with said time information and in association with the playback of the media file at the display apparatus.
22. A computer-readable storage medium storing data defining additional content for communication to a user in association with the playback of a media file to the user, the data comprising at least one content descriptor and associated time information, wherein the content descriptor defines the additional content and relates to content of the media file at a time during playback of the media file, the time being indicated by the associated time information.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/EP2008/060175 WO2010012312A1 (en) | 2008-08-01 | 2008-08-01 | Method and apparatus for identifying additional content relating to a media file |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010012312A1 (en) | 2010-02-04 |
Family
ID=40589796
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2008/060175 Ceased WO2010012312A1 (en) | 2008-08-01 | 2008-08-01 | Method and apparatus for identifying additional content relating to a media file |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2010012312A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20050229227A1 (en) * | 2004-04-13 | 2005-10-13 | Evenhere, Inc. | Aggregation of retailers for televised media programming product placement |
| GB2435114A (en) * | 2006-02-08 | 2007-08-15 | Rapid Mobile Media Ltd | Providing targeted additional content |
| US20070250901A1 (en) * | 2006-03-30 | 2007-10-25 | Mcintire John P | Method and apparatus for annotating media streams |
| US20080040768A1 (en) * | 2006-08-14 | 2008-02-14 | Alcatel | Approach for associating advertising supplemental information with video programming |
| US20080163379A1 (en) * | 2000-10-10 | 2008-07-03 | Addnclick, Inc. | Method of inserting/overlaying markers, data packets and objects relative to viewable content and enabling live social networking, N-dimensional virtual environments and/or other value derivable from the content |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11457256B2 (en) | System and method for video conversations | |
| US12536602B2 (en) | Systems, methods, and user interface for navigating media playback using scrollable text | |
| US9715901B1 (en) | Video preview generation | |
| US9800941B2 (en) | Text-synchronized media utilization and manipulation for transcripts | |
| US9380282B2 (en) | Providing item information during video playing | |
| US9124950B2 (en) | Providing item information notification during video playing | |
| US20080281689A1 (en) | Embedded video player advertisement display | |
| US20130014155A1 (en) | System and method for presenting content with time based metadata | |
| US20090254823A1 (en) | Bookmark Interpretation Service | |
| KR101328270B1 (en) | Annotation method and augmenting video process in video stream for smart tv contents and system thereof | |
| US10575039B2 (en) | Delivering media content | |
| JP5143592B2 (en) | Content reproduction apparatus, content reproduction method, content reproduction system, program, and recording medium | |
| US20250260866A1 (en) | Systems and methods for automatically generating content items from identified events | |
| US20260012667A1 (en) | Smart automatic skip mode | |
| CN116781971B (en) | Video playback method and device | |
| US20090328102A1 (en) | Representative Scene Images | |
| WO2010012312A1 (en) | Method and apparatus for identifying additional content relating to a media file | |
| Vasudevan et al. | Content re-monetisation: How to have your cake and eat it too |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 08786790; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 08786790; Country of ref document: EP; Kind code of ref document: A1 |