
HK1110691B - Apparatus and method for reproducing storage medium that stores metadata for providing enhanced search function - Google Patents

Apparatus and method for reproducing storage medium that stores metadata for providing enhanced search function

Info

Publication number
HK1110691B
Authority
HK
Hong Kong
Prior art keywords
scene
search
metadata
data
title
Prior art date
Application number
HK08104870.9A
Other languages
Chinese (zh)
Other versions
HK1110691A1 (en)
Inventor
千慧祯
朴成煜
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020050001749A (external priority: KR100782810B1)
Application filed by Samsung Electronics Co., Ltd.
Publication of HK1110691A1
Publication of HK1110691B


Description

Apparatus and method for reproducing a storage medium that stores metadata for providing an enhanced search function
Technical Field
The present invention relates to reproducing audio-visual (AV) data recorded on a storage medium, and more particularly, to a storage medium containing metadata for providing an enhanced search function, and to an apparatus and method for reproducing AV data from such a storage medium.
Background
Storage media, such as DVDs and Blu-ray discs (BDs), store audio-visual (AV) data composed of video, audio, and/or subtitles that are compression-encoded according to digital video and audio compression standards, such as the MPEG (Moving Picture Experts Group) standards. The storage medium also stores additional information, such as encoding characteristics of the AV data or the order in which the AV data is to be reproduced. Generally, moving pictures recorded on a storage medium are sequentially reproduced in a predetermined order. However, in reproducing AV data, a moving picture can also be reproduced in units of chapters.
Fig. 1 illustrates a structure of AV data recorded on a typical storage medium. As shown in fig. 1, a storage medium (e.g., the medium 250 shown in fig. 2) is generally formed in multiple layers to manage the structure of AV data recorded on the medium. The data structure 100 includes: one or more clips 110, which are recording units of multimedia images (AV data); one or more playlists 120, which are reproduction units of multimedia images (AV data); a movie object 130 including navigation commands for reproducing multimedia images (AV data); and an index table 140 for specifying a movie object to be first reproduced and a title of the movie object 130.
The clip 110 is implemented as one object including a clip AV stream 112, which is an AV data stream for a high-picture-quality movie, and clip information 114 about attributes corresponding to the AV data stream. For example, the AV data stream may be compressed according to a standard such as MPEG. However, not all aspects of the invention require the AV data stream 112 of such a clip 110 to be compressed. In addition, the clip information 114 may include audio/video characteristics of the AV data stream 112, an entry point map in which position information of randomly accessible entry points is recorded in units of a predetermined section, and the like.
Each playlist 120 includes a playlist mark composed of marks indicating positions of clips 110 corresponding to the playlist 120. Each playlist 120 also includes a series of reproduction intervals of the clips 110, each reproduction interval being referred to as a playitem 122. Accordingly, AV data can be reproduced in units of playlists 120 and in the order of the playitems 122 listed in each playlist 120.
The movie objects 130 are formed of navigation command programs that start reproduction of the playlists 120 and switch between or manage the reproduction of the playlists 120 according to the preference of the user.
The index table 140 is a table located at the top layer of the storage medium to define a plurality of titles and menus and includes start position information of all the titles and menus so that a title or a menu selected by a user operation such as title search or menu call can be reproduced. The index table 140 further includes start position information of a title or menu that is automatically reproduced first when the storage medium is placed in the reproducing apparatus.
Technical problem
However, in such a storage medium, there is no method of jumping to an arbitrary scene and reproducing the scene according to a search condition (e.g., scene, character, place, sound, or item) desired by a user. In other words, a typical storage medium does not provide a function of moving to and reproducing a portion of AV data according to a search condition (e.g., scene, character, place, sound, or item) set by a user, and thus, such a storage medium cannot provide various search functions.
Since AV data is compression-encoded according to the MPEG-2 standard, multiplexed, and then recorded on a conventional storage medium, it is difficult to manufacture a storage medium containing the metadata required to search a moving picture. Further, once a storage medium is manufactured, it is almost impossible to edit or reuse the AV data or metadata stored on it.
In addition, currently defined playlist marks cannot distinguish between multiple angles or multiple paths. Therefore, even when AV data supports multiple angles or multiple paths, it is difficult to provide various enhanced search functions for the AV data.
Technical solution
Aspects and exemplary embodiments of the present invention provide an apparatus and method of reproducing a storage medium storing metadata for providing an enhanced search function by using various search keywords of audio-visual (AV) data. In addition, the present invention also provides an apparatus and method for reproducing a storage medium storing metadata for efficiently providing an enhanced search function with respect to AV data of various formats.
Advantageous effects
As described above, the present invention provides a storage medium storing metadata for providing an enhanced search function by using various search keywords for AV data, an apparatus and a method for reproducing the storage medium. The present invention can also provide an enhanced search function related to AV data of various formats.
In other words, metadata for providing an enhanced search function is defined by an author in units of scenes, with each scene including information on at least one search keyword. Further, each scene includes information on an entry point and/or a duration, an angle, and the like. Accordingly, an enhanced search function can be performed using various search keywords.
In addition, search results may be reproduced according to various scenarios, and an enhanced search function may be provided for movie titles that support multiple angles or multiple paths. Also, metadata can be created in multiple languages, thereby enabling enhanced search functionality that provides support for multiple languages.
Drawings
Fig. 1 illustrates a structure of AV data recorded on a typical storage medium;
fig. 2 is a block diagram of an exemplary reproducing apparatus that reproduces a storage medium storing metadata for providing an enhanced search function according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating a method of reproducing a recording medium storing metadata for providing an enhanced search function according to an embodiment of the present invention;
fig. 4 illustrates an exemplary screen displayed in an example of searching for a desired scene using metadata for title scene search;
fig. 5 illustrates a relationship between metadata and audio-visual (AV) data for title scene search according to an embodiment of the present invention;
FIG. 6 illustrates a directory of metadata according to an embodiment of the present invention;
FIG. 7 illustrates naming rules for an exemplary metadata file in accordance with an embodiment of the present invention;
FIG. 8 illustrates a structure of metadata according to an embodiment of the present invention;
FIG. 9 illustrates a detailed structure of metadata illustrated in FIG. 8;
fig. 10 illustrates an application range of a title providing an enhanced search function;
FIG. 11 illustrates the application of metadata according to an embodiment of the present invention;
FIG. 12 illustrates the application of metadata according to another embodiment of the present invention;
fig. 13 illustrates an example of highlight play using metadata according to an embodiment of the present invention;
FIG. 14 illustrates a multi-angle title providing an enhanced search function using metadata according to an embodiment of the present invention;
fig. 15 illustrates a reproducing process of an exemplary reproducing apparatus according to an embodiment of the present invention.
Best mode for carrying out the invention
According to an aspect of the present invention, there is provided a reproducing apparatus that reproduces audio-visual (AV) data stored in an information storage medium. The reproduction apparatus includes: a search unit for searching for scenes matching a search keyword by performing an enhanced search function on AV data with reference to metadata containing information on at least one search keyword for each scene of the AV data; and a reproducing unit for reproducing the AV data corresponding to the at least one scene found by the searching unit.
The apparatus may further comprise: and a user interface for receiving a search keyword input by a user and displaying search results regarding the search keyword.
The enhanced search function may be enabled when the AV data is reproduced according to a main playback path defined by an author, and disabled when the AV data is reproduced according to a side playback path selected by a user.
When scenes are found, the reproducing unit may display the found scenes on the user interface, receive the user's selection of one of the found scenes, and reproduce the AV data corresponding to the selected scene.
The reproducing unit may reproduce AV data corresponding to a scene directly preceding or following the selected scene according to an input of a user.
When there is a found scene, the reproducing unit may sequentially reproduce AV data corresponding to the found scene.
When the AV data supports multi-angle, the reproducing apparatus may reproduce the AV data corresponding to a predetermined angle or an angle input by a user using information about angles included in the metadata.
Metadata may be defined for each of the scenes. The reproducing unit may find a start position of at least one found scene by using an entry point indicating the start position of the at least one found scene. The search results may be displayed with the corresponding thumbnail.
According to another aspect of the present invention, there is provided a method of reproducing AV data stored in an information storage medium. The method comprises the following steps: searching for scenes matching the search keyword by performing an enhanced search function on the AV data with reference to metadata containing information on at least one search keyword for each scene of the AV data; AV data corresponding to the found scenes is reproduced.
In addition to the exemplary embodiments and aspects described above, further aspects and embodiments of the invention will become apparent by reference to the drawings and by study of the following descriptions.
Detailed Description
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present invention by referring to the figures.
Fig. 2 is a block diagram of an exemplary reproducing apparatus that reproduces a storage medium storing metadata for providing an enhanced search function according to an embodiment of the present invention. Referring to fig. 2, the reproducing apparatus 200 includes a reading unit 210, a reproducing unit 220, a searching unit 230, and a user interface 240.
The reading unit 210 reads audio-visual (AV) data and metadata for providing an enhanced search function from a storage medium 250, such as a blu-ray disc (BD). The reproducing unit 220 decodes and reproduces AV data. In detail, when the user inputs a search keyword, the reproducing unit 220 receives information on a scene matching the search keyword from the searching unit 230 and reproduces the scene. When there are a plurality of scenes matching the search keyword, the reproducing unit 220 displays all scenes matching the search keyword on the user interface 240, and reproduces one or more scenes selected by the user or sequentially reproduces all scenes. The reproducing unit 220 may also be referred to as a play control engine.
The search unit 230 receives a search keyword from the user interface 240 and searches for scenes matching the search keyword. Then, the search unit 230 transmits the search result to the user interface 240 to display the search result in the form of a list, or transmits the search result to the reproducing unit 220 to reproduce the search result. As shown in fig. 2, the search results may be displayed as a list of scenes matching the search keyword.
The user interface 240 receives a search keyword input by a user or displays a search result. Further, when the user selects a scene from the search results (i.e., a list of scenes found and displayed on the user interface 240), the user interface 240 receives information about the selection.
Fig. 3 is a flowchart illustrating a method of reproducing a recording medium storing metadata for providing an enhanced search function according to an embodiment of the present invention. Referring to the reproduction method 300 shown in fig. 3, in block 310, a user inputs a search keyword using the user interface 240 shown in fig. 2. The search keyword may be a scene type, a character, an actor, a term, a place, a sound, or any word defined by an author. For example, when the movie "The Matrix" is reproduced, all scenes in which the character "Neo" appears can be searched for. Likewise, all scenes in which the term "mobile phone" appears can be searched for.
Next, at block 320, all scenes matching the input search keyword are searched for with reference to the metadata file. The metadata file defines a plurality of scenes and includes information on a search keyword related to each scene and an entry point of each scene. The structure of the metadata file will be described in detail later. At block 330, a portion of AV data corresponding to the found scene is searched for and reproduced by using the entry point of the found scene. In this way, an enhanced search may be performed on AV data using various search keywords. Hereinafter, the enhanced search function will also be referred to as a "title scene search function".
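The search step at block 320 can be sketched as a simple filter over per-scene keyword information. The Scene structure, field names, and sample data below are invented for illustration, not taken from any published metadata schema:

```python
from dataclasses import dataclass, field

# Illustrative scene record: the real metadata carries per-scene search
# keyword information and an entry point, as described above.
@dataclass
class Scene:
    entry_point: int                              # start position of the scene
    keywords: dict = field(default_factory=dict)  # e.g. {"character": ["Neo"]}

def search_scenes(metadata, category, keyword):
    """Block 320: return all scenes whose keyword info matches the query."""
    return [s for s in metadata if keyword in s.keywords.get(category, [])]

metadata = [
    Scene(0, {"character": ["Neo", "Trinity"]}),
    Scene(1200, {"character": ["Morpheus"]}),
    Scene(2400, {"character": ["Neo"], "item": ["mobile phone"]}),
]
found = search_scenes(metadata, "character", "Neo")  # scenes at 0 and 2400
```

Block 330 then uses each found scene's entry point to locate and reproduce the matching portion of the AV data.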
Fig. 4 illustrates an exemplary screen 400 displayed in an example of searching for a desired scene using metadata for title scene search. The metadata for title scene search includes search information on each scene in the AV data recorded on a storage medium 250, such as the Blu-ray disc (BD) shown in fig. 2. Referring to fig. 4, while a movie title such as "The Matrix" or "The Lord of the Rings" is being reproduced in stage #1, the user selects the title scene search function to search for scenes related to a desired search keyword using the user interface 240 (e.g., a remote controller) shown in fig. 2.
In stage #2, the user selects one of a plurality of search keyword categories displayed on the user interface 240, and in stage #3 selects one search keyword from the selected category. For example, when the user selects "Item" as the search keyword category and "Tower" as the search keyword corresponding to "Item", in stage #4 the scenes in which "Tower" appears are searched for in the movie title, and the search results are displayed together with corresponding thumbnail images. When the user selects one of the search results (i.e., found scenes), the selected scene is reproduced in stage #5. Using a command on the user interface 240, such as "skip to next search result" or "skip to previous search result", the next or previous found scene can be searched for and reproduced in stage #6.
A "highlight play" function for sequentially reproducing all the scenes found may also be provided. In the highlight play, all search results are sequentially reproduced. As a result, it is not necessary to wait until the user selects one of the search results. When a user selects one of the search keywords related to the content, a search result regarding the selected search keyword is obtained. The search results form a highlight of the content related to the selected search keyword.
The structure of metadata for title scene search will be described in detail below.
Fig. 5 illustrates a relationship between AV data and metadata 500 for title scene search in a storage medium according to an embodiment of the present invention. Referring to fig. 5, a storage medium (such as the medium 250 shown in fig. 2) according to an embodiment of the present invention stores metadata 500 in addition to the AV data shown in fig. 1. The metadata 500 may be stored in a file separate from a movie playlist, which is a reproduction unit. A metadata file 510 is created for each playlist 520, the metadata file 510 including a plurality of scenes 512, which are sections of the playlist 520 defined by an author. Each scene 512 includes an entry point indicating the start position of the scene. In an exemplary embodiment of the present invention, each scene 512 may also include its duration.
Each entry point is converted into an address of a scene in the clip AV stream 112 included in each clip 110 using an Entry Point (EP) map included in the clip information 114. Accordingly, the start position of each scene contained in the clip AV stream 112, which is real AV data, can be found using the entry point. Each scene 512 also contains information about a search keyword related to the scene (hereinafter referred to as search keyword information). For example, the search keyword information may include the following:
scenario 1 is a war scenario in which,
the roles are A, B and C, and,
the actors are a, b and c,
the field is x.
Thus, the user can search for scenes matching a desired search keyword based on the search keyword information of each scene 512. Also, the start position of the found scene in the clip AV stream 112 may be determined using the entry point of the found scene, and then the found scene may be reproduced.
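The entry-point-to-address conversion via the EP map can be sketched as a nearest-preceding-entry lookup. The (time, address) pairs below are invented for illustration; the real EP map in the clip information 114 is more elaborate:

```python
import bisect

# Illustrative EP map: (presentation time, byte address) pairs sorted by time.
ep_map = [(0, 0), (600, 150_000), (1200, 310_000), (1800, 470_000)]

def entry_point_to_address(entry_point):
    """Map a scene's entry point to the nearest preceding randomly
    accessible position in the clip AV stream."""
    times = [t for t, _ in ep_map]
    idx = bisect.bisect_right(times, entry_point) - 1
    return ep_map[idx][1]

addr = entry_point_to_address(1300)  # falls back to the entry at time 1200
```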
Fig. 6 illustrates a directory of metadata 500 according to an embodiment of the present invention. Referring to fig. 6, the metadata 500 related to the AV data shown in fig. 5 is stored in files under respective directories. In detail, the index table is stored in an index.bdmv file. Further, clip information is stored in xxxxx.clpi files under the CLIPINF directory, clip AV streams are stored in xxxxx.m2ts files under the STREAM directory, and other data is stored in files under the AUXDATA directory.
The metadata 500 for title scene search is stored in files under a META directory, separate from the AV data. The metadata file for the disc library is dlmt_xxx.xml, and the metadata file for title scene search is esmt_xxx_yyyyy.xml. According to an embodiment of the present invention, the metadata 500 is recorded in XML, a markup language that allows easy editing and reuse. Accordingly, even after the storage medium is manufactured, the metadata recorded on it can be edited and reused.
Fig. 7 illustrates naming rules for an exemplary metadata file 510 according to an embodiment of the present invention. Referring to fig. 7, the name of the metadata file 510 starts with the prefix esmt indicating the metadata 500. The next 3 characters indicate a language code according to the ISO 639-2 standard, and the next 5 characters indicate the corresponding playlist number. As described above and shown in fig. 5, a metadata file 510 is created for each playlist 520. In addition, the menu displayed during title scene search may support a plurality of languages using language codes according to the ISO 639-2 standard.
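Under the naming rules of fig. 7, assembling a metadata file name can be sketched as follows. This is a minimal illustration; the zero-padding of the 5-character playlist number is an assumption, not taken from the specification:

```python
def metadata_filename(language_code, playlist_number):
    """Fig. 7 naming rule: prefix 'esmt', a 3-character ISO 639-2 language
    code, then the 5-character playlist number (zero-padding assumed)."""
    if len(language_code) != 3:
        raise ValueError("ISO 639-2 language codes have 3 characters")
    return f"esmt_{language_code}_{playlist_number:05d}.xml"

name = metadata_filename("eng", 1)  # 'esmt_eng_00001.xml'
```

A per-language metadata file per playlist is what allows the title scene search menu to be offered in several languages.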
Fig. 8 illustrates the structure of an exemplary metadata file 510 according to an embodiment of the present invention. As described with reference to fig. 5, each metadata file 510 includes a plurality of scenes 512. Referring to fig. 8, each scene 512 corresponds to a plurality of search keywords, such as scene type, character, actor, etc. The value of each search keyword may be expressed using attributes or sub-elements of the search keyword according to XML rules.
Fig. 9 illustrates a detailed structure of the exemplary metadata file 510 illustrated in fig. 8. Referring to fig. 9, each scene 512 for title scene search includes a scene type element, a character element, an actor element, or an "author defined" element as an author-defined search keyword. Further, each scene 512 includes an 'entry_point' indicating the start position of the scene and a 'duration' indicating the period for which the scene is reproduced. When multi-angle is supported, each scene 512 further includes an 'angle_num' indicating a specific angle. The 'duration' and 'angle_num' fields are optional.
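A hypothetical metadata fragment matching the structure of fig. 9, and a sketch of extracting the optional 'duration' and 'angle_num' fields, might look like the following. The element and attribute names follow the description above but are not an exact copy of any published schema:

```python
import xml.etree.ElementTree as ET

# Invented XML fragment shaped like the fig. 9 structure: per-scene search
# keyword elements plus entry_point, and optional duration and angle_num.
doc = """
<metadata>
  <scene entry_point="0" duration="600" angle_num="3">
    <scene_type>war</scene_type>
    <character>Neo</character>
  </scene>
  <scene entry_point="1200">
    <character>Morpheus</character>
  </scene>
</metadata>
"""

root = ET.fromstring(doc)
scenes = []
for s in root.iter("scene"):
    scenes.append({
        "entry_point": int(s.get("entry_point")),
        # duration and angle_num are optional, so tolerate their absence
        "duration": int(s.get("duration")) if s.get("duration") else None,
        "angle_num": int(s.get("angle_num")) if s.get("angle_num") else None,
        "characters": [c.text for c in s.findall("character")],
    })
```

Because the metadata is plain XML, such a file can be parsed, edited, and rewritten after the disc is authored, which is the editability advantage noted above.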
Now, an example of performing a title scene search using the metadata 500 will be described below.
In detail, fig. 10 illustrates the application range of a title providing an enhanced search function according to an embodiment of the present invention. As shown in fig. 5, a storage medium 250, such as a Blu-ray disc (BD), may store a movie title for reproducing moving pictures, such as movies, and an interactive title containing a program for providing interactive functions to a user. The metadata 500 for title scene search provides an enhanced search function while a moving picture is being reproduced; thus, the metadata 500 applies only to movie titles. The type of a title can be identified by its "Title_playback_type" field. If the "Title_playback_type" field of the title is 0b, the title is a movie title. If the field is 1b, the title is an interactive title. Accordingly, the title scene search according to an embodiment of the present invention can be performed only when the "Title_playback_type" field is 0b.
Referring to fig. 10, when a storage medium 250, such as a Blu-ray disc (BD), is loaded into the exemplary reproducing apparatus 200 shown in fig. 2, title #1 is accessed using the index table. When the navigation command "Play playlist #1" contained in movie object #1 of title #1 is executed, playlist #1 is reproduced. As shown in fig. 10, playlist #1 is composed of at least one playitem. The author can arbitrarily define a chapter or a scene regardless of playitem boundaries.
A playlist that is automatically reproduced according to the index table when the storage medium 250 is loaded into the exemplary reproducing apparatus 200 shown in fig. 2 is referred to as a main playback path playlist. A playlist that is reproduced by another movie object called by the user, for example using a button object, while the main playback path playlist is being reproduced is referred to as a side playback path playlist. The side playback path playlist is not within the scope of the author-defined scenes or chapters. Therefore, according to an embodiment of the present invention, the title scene search function is enabled for the main playback path playlist and disabled for the side playback path playlist.
In summary, the application range of the title providing the enhanced search function has the following limitations:
1. title scene search is applied to movie titles.
2. Metadata for title scene search is defined in units of playlists. Since a movie title may contain one or more playlists, one or more metadata may be defined for the playlists.
3. The title scene search is applied to the main playback path playlist, but not to the side playback path playlist.
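The limitations above can be condensed into a single predicate. The function and constant names below are illustrative stand-ins for the "Title_playback_type" field and path type described earlier, not API names from any specification:

```python
# Illustrative stand-ins: the real field is Title_playback_type (0b = movie
# title, 1b = interactive title); path handling is reduced to a boolean here.
MOVIE_TITLE = 0
INTERACTIVE_TITLE = 1

def title_scene_search_enabled(title_playback_type, on_main_playback_path):
    """Title scene search applies only to movie titles reproduced
    along the main playback path."""
    return title_playback_type == MOVIE_TITLE and on_main_playback_path

title_scene_search_enabled(MOVIE_TITLE, True)        # enabled
title_scene_search_enabled(INTERACTIVE_TITLE, True)  # disabled
title_scene_search_enabled(MOVIE_TITLE, False)       # disabled on side path
```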
Fig. 11 illustrates an application of metadata 500 according to an embodiment of the present invention. Referring to fig. 11, scenes used in metadata 500 are defined. A scene is a basic unit used in the metadata 500 for title scene search, and is a basic unit of content included in a playlist. The author may specify the entry point in the playlist on a global time axis (global time axis). The content between two adjacent entry points is a scene.
When a user searches for content using a search keyword, the search result is represented as a set of entry points included in the scenes whose search keyword information matches the search keyword. These entry points are arranged in time order and sent to the play control engine, i.e., the reproducing unit 220 shown in fig. 2. The play control engine may search for a plurality of scenes related to the same search keyword and reproduce those scenes.
Referring to fig. 11, the entry point of each search keyword is represented as a circle. For example, when the user selects scene type #1 as a search keyword, the search result includes scene #1, scene #3, and scene #n. The user may then select some of scene #1, scene #3, and scene #n to reproduce. In addition, the user may navigate to and reproduce the next or previous search result using a User Operation (UO), such as 'skip to next()' or 'skip to previous()', through the user interface 240 shown in fig. 2.
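The 'skip to next/previous search result' navigation over time-ordered entry points can be sketched as a cursor over a sorted list. The class name and entry point values are invented for this sketch:

```python
# Cursor over time-ordered search results; entry point values are illustrative.
class SearchResultNavigator:
    def __init__(self, entry_points):
        self.entry_points = sorted(entry_points)  # arrange in time order
        self.index = 0

    def current(self):
        return self.entry_points[self.index]

    def skip_to_next(self):
        """UO 'skip to next()': advance unless already at the last result."""
        self.index = min(self.index + 1, len(self.entry_points) - 1)
        return self.current()

    def skip_to_previous(self):
        """UO 'skip to previous()': go back unless at the first result."""
        self.index = max(self.index - 1, 0)
        return self.current()

nav = SearchResultNavigator([2400, 0, 1200])
nav.skip_to_next()      # 1200
nav.skip_to_next()      # 2400
nav.skip_to_previous()  # 1200
```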
Fig. 12 illustrates an application of metadata 500 according to another embodiment of the present invention. Referring to fig. 12, a scene is defined using a duration in addition to the entry point described above: the interval between the entry point and the point at the end of the duration is the scene. When the user selects a scene, the search result may be reproduced according to three cases.
Case 1: simple play
Regardless of the duration, the playlist is reproduced from the entry point of the scene selected by the user from the search result to the end of the playlist, unless a user input occurs. For example, when the user selects scene type #1, playlist #1 is reproduced from the entry point of scene #1 until the end of playlist #1.
Case 2: highlight play
The playlist is reproduced from the entry point of the scene selected by the user from the search result until the end of the duration of the selected scene. Then, the reproducing unit 220 jumps to the next found scene and reproduces it. For example, when the user selects scene type #2, only scene #1 and scene #3, the search results, are reproduced. In other words, only the highlights of playlist #1 related to the search keyword scene type #2 are reproduced. Fig. 13 shows another example of highlight play. Referring to fig. 13, the search results are sequentially reproduced, so it is not necessary to stop and wait for user input after reproducing each found scene. In other words, after one of the plurality of search results for the actor 'a' is reproduced, the next search result is reproduced in sequence. In this way, only the highlights featuring the actor 'a' are reproduced; each search result is expressed using an entry point and a duration for highlight play. The search results may be linked and sequentially reproduced using the entry point and duration information.
Case 3: scene-based playback
The search result is reproduced scene by scene. In other words, a scene selected by the user from the search result is reproduced from its entry point for as long as its duration. After the duration, reproduction is stopped until a user input is received. Case 3 is similar to case 2, except that reproduction stops at the end of the selected scene.
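The three cases can be sketched as planning playback intervals over (entry point, duration) pairs. This is a schematic model with invented values, not the actual play control engine:

```python
def plan_playback(results, selected, case, playlist_end):
    """Return the (start, end) intervals reproduced for a selected search
    result. 'results' is a time-ordered list of (entry_point, duration)
    pairs; this schematic stands in for the real play control engine."""
    if case == 1:   # simple play: from the selected entry point to playlist end
        return [(results[selected][0], playlist_end)]
    if case == 2:   # highlight play: selected scene, then each later result
        return [(ep, ep + dur) for ep, dur in results[selected:]]
    if case == 3:   # scene-based play: only the selected scene, then stop
        ep, dur = results[selected]
        return [(ep, ep + dur)]
    raise ValueError("case must be 1, 2, or 3")

results = [(0, 600), (1200, 300), (2400, 450)]  # illustrative scene list
plan_playback(results, 1, 1, 3000)  # case 1: [(1200, 3000)]
plan_playback(results, 0, 2, 3000)  # case 2: each result for its duration
plan_playback(results, 2, 3, 3000)  # case 3: [(2400, 2850)]
```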
Fig. 14 illustrates an exemplary multi-angle title providing an enhanced search function using metadata 500 according to an embodiment of the present invention. Referring to fig. 14, an example of a multi-path title composed of multiple angles is shown. The multi-path title is composed of 5 playitems, and among the 5 playitems, the second playitem is composed of 3 angles and the fourth playitem is composed of 4 angles. In a playlist supporting multi-angle, scene #1 and scene #2 matching the search keyword scene type #1 and scene #3 and scene #4 matching the search keyword scene type #2 are found. Each scene is defined by an entry point and a duration.
Since overlapping entry points can be distinguished by the "angle_num" shown in fig. 14, found scenes may overlap each other. However, when entry points do not overlap each other, scenes found as a result of the enhanced search cannot overlap each other. When the user wants to reproduce the search result according to case 2, the reproducing apparatus sequentially reproduces the scenes along the dotted arrows shown in fig. 14.
Referring to fig. 14, a scene covering a part of a playitem or a plurality of playitems is shown. In each scene, the metadata 500 of AV data thereof is defined.
In the case of playitems that support multi-angle (e.g., the second and fourth playitems), the metadata 500 is applied to the AV data corresponding to one of the supported angles. For example, in the case of scene #1, a part of the first and second playitems is defined as the reproduction section, and angle_num has a value of 3. The value of angle_num applies only to playitems supporting multi-angle; a playitem that does not support multi-angle is reproduced at the default angle. Player Status Register (PSR) 3, a status register of the reproducing apparatus 200 (e.g., as shown in fig. 2), specifies the default angle. Thus, when scene #1 is reproduced, playitem #1, which does not support multi-angle, is reproduced at the default angle, and playitem #2, which supports multi-angle, is reproduced at angle 3 according to the value of the angle_num attribute. In this case, the search keyword for title scene search defined for scene #1 is applied to angle 3 of playitem #2. As described above, when the metadata 500 includes angle_num, a title supporting multi-angle can also provide various enhanced search functions according to a designated search keyword.
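The per-playitem angle decision described above can be sketched as follows. DEFAULT_ANGLE stands in for the value held in PSR 3, and the function name is illustrative:

```python
from typing import Optional

DEFAULT_ANGLE = 1  # stand-in for the default angle held in PSR 3

def angle_for_playitem(supports_multi_angle: bool,
                       scene_angle_num: Optional[int]) -> int:
    """angle_num from a scene's metadata applies only to playitems that
    support multi-angle; all other playitems use the default angle."""
    if supports_multi_angle and scene_angle_num is not None:
        return scene_angle_num
    return DEFAULT_ANGLE

angle_for_playitem(False, 3)  # playitem #1: no multi-angle, default angle
angle_for_playitem(True, 3)   # playitem #2: multi-angle, angle 3
```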
Fig. 15 illustrates a reproducing process of an exemplary reproducing apparatus according to an embodiment of the present invention. Referring to fig. 15, the reproducing apparatus 200 shown in fig. 2 provides the title scene search function when reproducing a movie title. When a storage medium 250, such as a Blu-ray disc (BD), is loaded into the reproducing apparatus 200 and reproduction of a movie title starts (operation 1510), the title scene search function is activated (operation 1520). As described with reference to fig. 14, when a movie title that supports multi-angle is reproduced, a title scene search may be performed by changing angles (operation 1530). Also, if a multi-path playlist is supported (operation 1522), the title scene search function is activated when the playlist changes to a main playback path playlist (operation 1534) and disabled when the playlist changes to a side playback path playlist (operation 1532). Further, when the title changes to an interactive title (rather than a movie title), the title scene search function is disabled (operation 1538).
Exemplary embodiments of the enhanced search method according to the present invention can also be written as computer programs and executed on general-purpose digital computers that run the programs from a computer-readable medium. Codes and code segments constituting the computer program can be easily derived by computer programmers skilled in the art. The computer-readable medium can be any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and carrier waves (such as data transmission through the Internet). The computer-readable medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention. For example, any computer readable medium or data storage device may be used as long as the metadata is contained in the playlist in the manner shown in fig. 5 to 15. Furthermore, the metadata may also be structured in a different manner than in fig. 5. Also, the reproducing apparatus shown in fig. 2 may be implemented as a component of the recording apparatus, or alternatively, as a separate apparatus for performing a recording and/or reproducing function with respect to the storage medium. Also, the CPU may be implemented as a chipset having firmware or, alternatively, as a general or special purpose computer programmed to perform the methods described above (e.g., the methods described with reference to fig. 3 and 10-15). Therefore, it is intended that the invention not be limited to the various example embodiments disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (11)

1. A reproducing apparatus that reproduces audio-visual (AV) data stored on an information storage medium, the apparatus comprising:
a reading unit that reads AV data and metadata from an information storage medium, the metadata being for performing a title scene search of the AV data by scenes using information of at least one search keyword, a scene being a unit that describes contents in a playlist;
a reproducing unit reproducing the AV data using the metadata,
wherein the metadata contains entry points indicating start points of scenes, the entry points being sequentially arranged in a list.
2. The apparatus of claim 1, wherein metadata is defined for each scene.
3. The apparatus of claim 1, wherein the reproducing unit finds a starting point of the at least one found scene using an entry point in the metadata.
4. The apparatus of claim 1, further comprising: and a user interface for receiving a search keyword input by a user and displaying search results regarding the search keyword.
5. The apparatus of claim 4, wherein the search results are displayed with respective thumbnails.
6. The apparatus of claim 1, wherein the title scene search function is activated when a movie title is reproduced.
7. The apparatus of claim 1, wherein the title scene search function is enabled when the AV data is reproduced along a main play path defined by an author, and is disabled when the AV data is reproduced along a side play path defined by a user.
8. The apparatus of claim 4, wherein when there are found scenes, the reproducing unit displays the found scenes on the user interface, receives information on selection of one of the found scenes by the user, and reproduces AV data corresponding to the selected scene.
9. The apparatus of claim 8, wherein the reproducing unit reproduces AV data corresponding to a scene directly preceding or following the selected scene based on an input of a user.
10. The apparatus of claim 4, wherein when there is a found scene, the reproducing unit sequentially reproduces AV data corresponding to the found scene.
11. The apparatus of claim 4, wherein the reproducing apparatus reproduces AV data corresponding to a predetermined angle or an angle input by a user using information on an angle contained in the metadata, when the AV data supports multiple angles.
HK08104870.9A 2005-01-07 2006-01-06 Apparatus and method for reproducing storage medium that stores metadata for providing enhanced search function HK1110691B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR1020050001749A KR100782810B1 (en) 2005-01-07 2005-01-07 Method and apparatus for reproducing a storage medium having recorded metadata for providing an extended search function
KR10-2005-0001749 2005-01-07
PCT/KR2006/000051 WO2006073276A1 (en) 2005-01-07 2006-01-06 Apparatus and method for reproducing storage medium that stores metadata for providing enhanced search function

Publications (2)

Publication Number Publication Date
HK1110691A1 HK1110691A1 (en) 2008-07-18
HK1110691B true HK1110691B (en) 2012-03-09

Similar Documents

Publication Publication Date Title
US8630531B2 (en) Apparatus and method for reproducing storage medium that stores metadata for providing enhanced search function
US8437606B2 (en) Storage medium storing metadata for providing enhanced search function
KR101029073B1 (en) An storage medium having metadata for providing enhanced search and a reproducing apparatus
HK1110691B (en) Apparatus and method for reproducing storage medium that stores metadata for providing enhanced search function
KR101029078B1 (en) A method for reproducing data from an information storage medium
HK1113852B (en) Apparatus and method for processing data from information storage medium