US20130287364A1 - Data generating device and data generating method, and data processing device and data processing method - Google Patents
- Publication number
- US20130287364A1 (application US13/927,853)
- Authority: US (United States)
- Prior art keywords
- track
- data
- group information
- audio
- section
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G11B27/3027—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording, wherein the used signal is digitally coded
- H04N9/79—Processing of colour television signals in connection with recording
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G11B27/34—Indicating arrangements
- H04N21/4334—Recording operations
- H04N21/47—End-user applications
- H04N21/4854—End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
- H04N21/4856—End-user interface for client configuration for language selection, e.g. for the menu or subtitles
- H04N21/8106—Monomedia components thereof involving special audio data, e.g. different tracks for different languages
- H04N21/816—Monomedia components thereof involving special video data, e.g. 3D video
- H04N21/84—Generation or processing of descriptive data, e.g. content descriptors
- H04N21/85406—Content authoring involving a specific file format, e.g. MP4 format
- Y10S707/913—Multimedia
Definitions
- The present disclosure relates to a data generating device and a data generating method, and a data processing device and a data processing method, and particularly to a data generating device and a data generating method, and a data processing device and a data processing method, which allow a desired combination of a plurality of kinds of data to be easily selected as reproduction objects.
- MP4 is a file format for storing data coded by an MPEG-4 (Moving Picture Experts Group phase 4) system or the like, and is defined in ISO/IEC 14496.
- MP4 is described in, for example, JP-T-2006-507553, JP-T-2005-524128, JP-T-2005-527885, JP-T-2005-525627, and Japanese Patent Laid-Open No. 2004-227633 (hereinafter referred to as Patent Documents 1 to 5, respectively).
- An MP4 file can store AV (Audio Video) data including a main image (video), audio, and a secondary image.
- A track is a unit of AV data that can be managed independently.
- Conventionally, however, tracks of different kinds (for example a main image, audio, and a secondary image) have not been able to be grouped; that is, a plurality of kinds of tracks have not been able to be grouped.
- The user therefore needs to select a desired track for each kind in order to change the tracks as reproduction objects.
- For example, when the user desires to listen to audio for English and view subtitles for Japanese as a secondary image together with a movie as a main image, and the tracks of the movie and audio for Japanese are set as initial values of reproduction objects, the user needs to select the track of audio for English as the track of audio as a reproduction object, and select the track of subtitles for Japanese as the track of a secondary image as a reproduction object, which is troublesome. There is thus a desire to enable a desired combination of a plurality of kinds of tracks to be easily selected as reproduction objects.
- The present disclosure has been made in view of such a situation, and it is desirable to enable a desired combination of a plurality of kinds of data to be easily selected as reproduction objects.
- According to a first embodiment of the present disclosure, there is provided a data generating device including: a coding section coding a plurality of kinds of data and generating coded data; an information generating section generating a plurality of pieces of group information indicating combinations of a plurality of kinds of the data; and a file generating section generating a coded data storage file including the coded data of the plurality of kinds of the data and the plurality of pieces of the group information.
- A data generating method according to the first embodiment of the present disclosure corresponds to the data generating device according to the first embodiment of the present disclosure.
- In the first embodiment of the present disclosure, a plurality of kinds of data are coded to generate coded data, a plurality of pieces of group information indicating combinations of a plurality of kinds of the data are generated, and a coded data storage file including the coded data of the plurality of kinds of the data and the plurality of pieces of the group information is generated.
- According to a second embodiment of the present disclosure, there is provided a data processing device including: an obtaining section obtaining a coded data storage file including coded data of a plurality of kinds of data and a plurality of pieces of group information indicating combinations of a plurality of kinds of the data; a display controlling section making a screen for selecting a combination indicated by the group information displayed on the basis of the plurality of pieces of the group information; a selecting section selecting a combination of data as a reproduction object from the combinations indicated by the plurality of pieces of the group information according to an input from a user to the screen; and a decoding section decoding the coded data of all the data included in the combination selected by the selecting section.
- A data processing method according to the second embodiment of the present disclosure corresponds to the data processing device according to the second embodiment of the present disclosure.
- In the second embodiment of the present disclosure, a coded data storage file including coded data of a plurality of kinds of data and a plurality of pieces of group information indicating combinations of a plurality of kinds of the data is obtained, a screen for selecting a combination indicated by the group information is displayed on the basis of the plurality of pieces of the group information, a combination of data as a reproduction object is selected from the combinations indicated by the plurality of pieces of the group information according to an input from a user to the screen, and the coded data of all the data included in the selected combination is decoded.
- According to the first embodiment of the present disclosure, it is possible to generate a file that enables a desired combination of a plurality of kinds of data to be easily selected as a reproduction object.
- According to the second embodiment of the present disclosure, a desired combination of a plurality of kinds of data can be easily selected as a reproduction object.
- FIG. 1 is a block diagram showing an example of configuration of one embodiment of a recording device as a data generating device to which the present technology is applied;
- FIG. 2 is a diagram showing an example of configuration of an MP4 file;
- FIG. 3 is a diagram showing an example of description of a presentation track group box;
- FIG. 4 is a diagram showing a first example of tracks;
- FIG. 5 is a diagram showing an example of group information in a case where the tracks shown in FIG. 4 are recorded;
- FIG. 6 is a diagram showing a second example of tracks;
- FIG. 7 is a diagram showing an example of group information in a case where the tracks shown in FIG. 6 are recorded;
- FIG. 8 is a diagram showing a third example of tracks;
- FIG. 9 is a diagram of assistance in explaining a black band part of a screen;
- FIG. 10 is a diagram showing an example of group information in a case where the tracks shown in FIG. 8 are recorded;
- FIG. 11 is a flowchart of assistance in explaining a recording process;
- FIG. 12 is a block diagram showing an example of configuration of one embodiment of a reproducing device as a data processing device to which the present technology is applied;
- FIG. 13 is a diagram showing an example of a menu screen;
- FIG. 14 is a diagram showing an example of a screen displayed when a main part button is selected;
- FIG. 15 is a diagram showing an example of a screen displayed when a presentation button is selected;
- FIG. 16 is a flowchart of assistance in explaining a reproducing process;
- FIG. 17 is a flowchart of assistance in explaining a track changing process; and
- FIG. 18 is a diagram showing an example of configuration of one embodiment of a computer.
- FIG. 1 is a block diagram showing an example of configuration of an embodiment of a recording device as a data generating device to which the present technology is applied.
- The recording device 10 of FIG. 1 includes a recording processing section 11, recording media 12, a user input section 13, and a control section 14.
- The recording device 10 generates and records an MP4 file of AV data.
- The recording processing section 11 includes a data input section 21, a data coding section 22, and a recording section 23.
- The data input section 21 in the recording processing section 11 obtains AV data in track units from the outside of the recording device 10.
- The AV data includes 2D main images, 3D main images, images for conversion from 2D main images to 3D main images (which images will hereinafter be referred to as conversion images), audio for various languages, 2D secondary images for various languages, 3D secondary images for various languages, and the like.
- A 3D main image includes, for example, a main image for a left eye and a main image for a right eye.
- A conversion image is one of a main image for a left eye and a main image for a right eye.
- A 2D main image is used as the other at a time of reproduction of the conversion image.
- Secondary images include subtitles, comment images, menu screens, and the like.
- The data input section 21 (information generating means) generates group information indicating a combination of at least two kinds of tracks among a main image, audio, and a secondary image according to an instruction from the control section 14.
- The data input section 21 supplies the AV data and the group information to the data coding section 22.
- The data coding section 22 includes a preprocessing section 31, an encoding section 32, and a file generating section 33.
- The data coding section 22 generates an MP4 file.
- The preprocessing section 31 in the data coding section 22 applies the preprocessing of a predetermined system to the AV data, in track units, of 3D main images and 3D secondary images as 3D images supplied from the data input section 21.
- The predetermined system includes a frame sequential system, a side by side system, a top and bottom system, and the like.
- The preprocessing of the frame sequential system is processing for alternately outputting the AV data of an image for a left eye and an image for a right eye which form a 3D image.
- The preprocessing of the side by side system is processing for generating, from the AV data of a 3D image, the AV data of images in which the image for a left eye forming the 3D image is disposed in one of a left region and a right region on a screen and the image for a right eye is disposed in the other region, and outputting the AV data of the images.
- The preprocessing of the top and bottom system is processing for generating, from the AV data of a 3D image, the AV data of images in which the image for a left eye forming the 3D image is disposed in one of an upper region and a lower region on a screen and the image for a right eye is disposed in the other region, and outputting the AV data of the images.
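The side by side and top and bottom packings described above can be sketched with frames represented as plain 2D lists of pixels. This is only an illustration of the geometry, not the device's actual pixel pipeline:

```python
def side_by_side(left, right):
    """Pack the left-eye image into the left half of the output frame
    and the right-eye image into the right half, row by row."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def top_and_bottom(left, right):
    """Pack the left-eye image into the upper region of the output frame
    and the right-eye image into the lower region."""
    return left + right

left = [[0, 0], [0, 0]]   # 2x2 left-eye frame
right = [[1, 1], [1, 1]]  # 2x2 right-eye frame
print(side_by_side(left, right))    # [[0, 0, 1, 1], [0, 0, 1, 1]]
print(top_and_bottom(left, right))  # [[0, 0], [0, 0], [1, 1], [1, 1]]
```

The frame sequential system, by contrast, would keep the two eye images as separate frames and emit them alternately rather than packing them into one frame.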
- The preprocessing section 31 supplies the encoding section 32, as they are, with the AV data other than the AV data of 3D main images and 3D secondary images, and with the group information.
- The encoding section 32 codes the AV data in track units supplied from the preprocessing section 31 by a system in accordance with MP4.
- For example, the encoding section 32 codes the AV data of a main image by an MPEG-1 system, an MPEG-2 system, an MPEG-4 system or the like, and codes the AV data of a secondary image by a JPEG (Joint Photographic Experts Group) system, a PNG (Portable Network Graphics) system or the like.
- The encoding section 32 codes the AV data of audio by an AAC (Advanced Audio Coding) system, an MP3 (Moving Picture Experts Group Audio Layer-3) system or the like.
- The encoding section 32 supplies the AV stream in track units obtained as a result of the coding to the file generating section 33.
- The encoding section 32 also supplies the file generating section 33 with the group information as it is.
- The file generating section 33 (file generating means) generates an MP4 file using the AV stream in track units and the group information supplied from the encoding section 32, as well as the management information for each track supplied from the control section 14.
- The file generating section 33 supplies the generated MP4 file to the recording section 23.
- The recording section 23 supplies the MP4 file from the file generating section 33 to the recording media 12 to have the MP4 file recorded on the recording media 12.
- The recording media 12 are formed by a flash memory, an HDD (Hard Disk Drive), a DVD (Digital Versatile Disk), and the like.
- The user input section 13 includes an operating button and the like.
- The user input section 13 receives an instruction from a user, and supplies the instruction to the control section 14.
- The control section 14 performs processing such as control on each part of the recording processing section 11.
- For example, the control section 14 determines a coding system for each track according to an instruction from the user input section 13, and controls the encoding section 32 so as to code each track by the coding system.
- The control section 14 also generates management information, including a track ID (an ID unique to each track), information indicating the contents of the track, the coding system, and the like, according to an instruction from the user input section 13, and supplies the management information to the file generating section 33.
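As a sketch, the per-track management information generated by the control section 14 can be modeled as a simple record. The field names here are illustrative, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class TrackManagementInfo:
    track_id: int        # ID unique to the track
    contents: str        # information indicating the contents of the track
    coding_system: str   # coding system determined for the track

# For example, the main-image track of a movie coded by the MPEG-4 system:
info = TrackManagementInfo(track_id=1, contents="2D main image", coding_system="MPEG-4")
print(info)
```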
- FIG. 2 is a diagram showing an example of configuration of an MP4 file generated by the file generating section 33 .
- The MP4 file has an object-oriented data structure.
- Each object is referred to as a box (Box).
- The MP4 file of FIG. 2 includes a file type box (ftyp), a movie box (moov), and a real data box (mdat).
- The movie box includes a presentation track group box (PTGP) in which group information is disposed, a box (trak) for each track in which the management information for the track is disposed, and the like.
- The AV stream is disposed in track units in the real data box.
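The boxes above follow the ISO base media file format convention, in which each box begins with a 4-byte big-endian size (covering the header itself) followed by a 4-byte type code. A minimal sketch of building and walking such boxes, with the sample bytes constructed in the snippet itself:

```python
import struct

def make_box(box_type: str, payload: bytes) -> bytes:
    """Build a box: 4-byte big-endian size (header + payload), then the type code."""
    return struct.pack(">I4s", 8 + len(payload), box_type.encode("ascii")) + payload

def walk_boxes(data: bytes):
    """Yield (type, payload) for each top-level box in an ISO/MP4 byte string."""
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        yield box_type.decode("ascii"), data[offset + 8 : offset + size]
        offset += size

# A toy file with the three top-level boxes of FIG. 2.
sample = make_box("ftyp", b"mp42") + make_box("moov", b"") + make_box("mdat", b"\x00\x01")
print([t for t, _ in walk_boxes(sample)])  # ['ftyp', 'moov', 'mdat']
```

A real moov box would in turn contain the ptgp and trak boxes as nested children, parsed with the same size/type rule.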
- FIG. 3 is a diagram showing an example of description of a presentation track group box.
- A description "presentation_unit_size" in the fourth row indicates the data size of the group information.
- A description "number_of_track_IDs" in the fifth row indicates the number of track IDs of the tracks included in the combination indicated by the group information.
- A description "track_ID" in the seventh row indicates the track ID of a track included in the combination indicated by the group information.
- A description "metadata" in the ninth row represents metadata indicating the contents of the tracks included in the combination indicated by the group information. This metadata is, for example, character data coded by a UTF-16BE system, and is data ending with a NULL character.
- Thus, the size of the group information, the number of all tracks included in the combination indicated by the group information and the track IDs of those tracks, and the metadata are described for each piece of group information in the presentation track group box.
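Putting those fields together, one piece of group information could be serialized roughly as follows. The field order follows the description above, but the 32-bit field widths and the assumption that presentation_unit_size covers its own bytes are guesses on my part; FIG. 3 itself defines the exact layout:

```python
import struct

def pack_group(track_ids, metadata: str) -> bytes:
    """Pack one piece of group information: size, track-ID count, track IDs,
    then UTF-16BE metadata ending with a NULL character.
    32-bit field widths are assumed, not taken from the patent figure."""
    meta = (metadata + "\x00").encode("utf-16-be")
    body = struct.pack(">I", len(track_ids))
    body += b"".join(struct.pack(">I", t) for t in track_ids)
    body += meta
    # presentation_unit_size is assumed here to cover the whole entry,
    # including the 4 bytes of the size field itself.
    return struct.pack(">I", 4 + len(body)) + body

entry = pack_group([1, 2], "Dubbed in Japanese (5.1 ch)")
print(struct.unpack_from(">II", entry, 0))  # (72, 2): entry size, number_of_track_IDs
```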
- FIGS. 4 to 10 are diagrams of assistance in explaining examples of tracks and group information recorded on the recording media 12.
- FIG. 4 is a diagram showing a first example of tracks recorded on the recording media 12.
- The track having the track ID "1" is the track of a main image, and is formed by the image data of a 2D American movie.
- The track having the track ID "2" is a first audio track, and is formed by the audio data of 5.1-ch audio for Japanese.
- The track having the track ID "3" is a second audio track, and is formed by the audio data of 2-ch audio for Japanese.
- The track having the track ID "4" is a third audio track, and is formed by the audio data of 5.1-ch audio for English.
- The track having the track ID "5" is the track of a first secondary image, and is formed by the image data of 2D subtitles for Japanese.
- The track having the track ID "6" is the track of a second secondary image, and is formed by the image data of a 2D comment image for Japanese.
- This comment image is, for example, an image showing a Japanese translation of a comment by a director of the 2D American movie.
- FIG. 5 is a diagram showing an example of group information in a case where the tracks shown in FIG. 4 are recorded on the recording media 12.
- The first group information includes the track IDs "1" and "2" and the metadata "Dubbed in Japanese (5.1 ch)." That is, the first group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," and the track of the 5.1-ch audio for Japanese, which has the track ID "2."
- The image and audio reproduced on the basis of this first group information are a 5.1-ch Japanese-dubbed version of the 2D American movie.
- The second group information includes the track IDs "1" and "3" and the metadata "Dubbed in Japanese (2 ch)." That is, the second group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," and the track of the 2-ch audio for Japanese, which has the track ID "3."
- The image and audio reproduced on the basis of this second group information are a 2-ch Japanese-dubbed version of the 2D American movie.
- The third group information includes the track IDs "1" and "4" and the metadata "English (5.1 ch)." That is, the third group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," and the track of the 5.1-ch audio for English, which has the track ID "4." The image and audio reproduced on the basis of this third group information are the 2D American movie.
- The fourth group information includes the track IDs "1," "4," and "5" and the metadata "English (5.1 ch, Japanese Subtitle)." That is, the fourth group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," the track of the 5.1-ch audio for English, which has the track ID "4," and the track of the 2D subtitles for Japanese, which has the track ID "5."
- The image and audio reproduced on the basis of this fourth group information are a Japanese subtitle version of the 2D American movie.
- The fifth group information includes the track IDs "1," "4," and "6" and the metadata "English (5.1 ch, Japanese Comment)." That is, the fifth group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," the track of the 5.1-ch audio for English, which has the track ID "4," and the track of the 2D comment image for Japanese, which has the track ID "6."
- The image and audio reproduced on the basis of this fifth group information are the 2D American movie with a Japanese comment.
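The five combinations of FIG. 5 can be written out as plain records; a player can then present the metadata strings as a menu and map the user's choice straight to the track IDs to decode. A sketch with illustrative names:

```python
# Group information for the tracks of FIG. 4, as listed in FIG. 5.
GROUPS = [
    {"track_ids": [1, 2],    "metadata": "Dubbed in Japanese (5.1 ch)"},
    {"track_ids": [1, 3],    "metadata": "Dubbed in Japanese (2 ch)"},
    {"track_ids": [1, 4],    "metadata": "English (5.1 ch)"},
    {"track_ids": [1, 4, 5], "metadata": "English (5.1 ch, Japanese Subtitle)"},
    {"track_ids": [1, 4, 6], "metadata": "English (5.1 ch, Japanese Comment)"},
]

def tracks_for_choice(groups, index):
    """Return the track IDs to decode for the menu entry the user selected."""
    return groups[index]["track_ids"]

# Choosing the fourth entry selects movie + English 5.1-ch audio + Japanese subtitles.
print(tracks_for_choice(GROUPS, 3))  # [1, 4, 5]
```

This is the point of the group information: the user makes one selection per combination instead of one selection per kind of track.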
- FIG. 6 is a diagram showing a second example of tracks recorded on the recording media 12.
- The track having the track ID "1" is the track of a main image, and is formed by the image data of a 2D American movie.
- The track having the track ID "2" is a first audio track, and is formed by the audio data of 2-ch audio for Japanese.
- The track having the track ID "3" is a second audio track, and is formed by the audio data of 2-ch audio for English.
- The track having the track ID "4" is the track of a first secondary image, and is formed by the image data of 2D forced subtitles for Japanese audio.
- The forced subtitles are subtitles in a main language that are displayed when the audio selected as a reproduction object includes audio in a language different from the main language, and that correspond to the audio in the different language.
- For example, the forced subtitles are subtitles in Japanese that are displayed when the audio selected as a reproduction object is audio for Japanese and audio in a language other than Japanese occurs in the middle of the audio for Japanese; the subtitles correspond to the audio in the language other than Japanese.
- The 2D forced subtitles for Japanese audio are 2D forced subtitles in Japanese that are displayed when audio for Japanese is set as a reproduction object.
- The track having the track ID "5" is the track of a second secondary image, and is formed by the image data of 2D subtitles for Japanese and 2D forced subtitles for Japanese.
- FIG. 7 is a diagram showing an example of group information in a case where the tracks shown in FIG. 6 are recorded on the recording media 12.
- The first group information includes the track IDs "1," "2," and "4" and the metadata "Dubbed in Japanese (Forced Subtitle)." That is, the first group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," the track of the 2-ch audio for Japanese, which has the track ID "2," and the track of the 2D forced subtitles for Japanese audio, which has the track ID "4."
- The image and audio reproduced on the basis of this first group information are a Japanese-dubbed version of the 2D American movie with the forced subtitles.
- The second group information includes the track IDs "1," "3," and "5" and the metadata "English Audio (Japanese Subtitles+Forced Subtitles)." That is, the second group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," the track of the 2-ch audio for English, which has the track ID "3," and the track of the 2D subtitles for Japanese and the 2D forced subtitles for Japanese, which has the track ID "5."
- The image and audio reproduced on the basis of this second group information are a Japanese subtitle version of the 2D American movie with the forced subtitles.
- FIG. 8 is a diagram showing a third example of tracks recorded on the recording media 12.
- The track having the track ID "1" is the track of a first main image, and is formed by the image data of a 2D American movie.
- The track having the track ID "2" is the track of a second main image, and is formed by the image data of a conversion image of the American movie.
- The track having the track ID "3" is a first audio track, and is formed by the audio data of 2-ch audio for Japanese.
- The track having the track ID "4" is a second audio track, and is formed by the audio data of 2-ch audio for English.
- The track having the track ID "5" is the track of a first secondary image, and is formed by the image data of 2D subtitles for Japanese.
- The track having the track ID "6" is the track of a second secondary image, and is formed by the image data of 3D subtitles for Japanese.
- The track having the track ID "7" is the track of a third secondary image, and is formed by the image data of 3D subtitles for Japanese displayed in a black band part of a screen (hereinafter referred to as 3D Japanese black band subtitles).
- A black band part of a screen is a black display region 40 disposed in an upper part or a lower part of the screen when a main image is a movie in a CinemaScope size or the like, as shown in FIG. 9.
- FIG. 10 is a diagram showing an example of group information in a case where the tracks shown in FIG. 8 are recorded on the recording media 12.
- The first group information includes the track IDs "1" and "3" and the metadata "[2D] Dubbed in Japanese." That is, the first group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," and the track of the 2-ch audio for Japanese, which has the track ID "3."
- The image and audio reproduced on the basis of this first group information are a Japanese-dubbed version of the 2D American movie.
- The second group information includes the track IDs "1," "4," and "5" and the metadata "[2D] English Audio and Japanese Subtitles." That is, the second group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," the track of the 2-ch audio for English, which has the track ID "4," and the track of the 2D subtitles for Japanese, which has the track ID "5."
- The image and audio reproduced on the basis of this second group information are a Japanese subtitle version of the 2D American movie.
- The third group information includes the track IDs "1," "2," and "3" and the metadata "[3D] Dubbed in Japanese." That is, the third group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," the track of the conversion image of the American movie, which has the track ID "2," and the track of the 2-ch audio for Japanese, which has the track ID "3."
- The image and audio reproduced on the basis of this third group information are a Japanese-dubbed version of the 3D American movie.
- The fourth group information includes the track IDs "1," "2," "4," and "6" and the metadata "[3D] English Audio and Japanese Subtitles." That is, the fourth group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," the track of the conversion image of the American movie, which has the track ID "2," the track of the 2-ch audio for English, which has the track ID "4," and the track of the 3D subtitles for Japanese, which has the track ID "6."
- The image and audio reproduced on the basis of this fourth group information are a Japanese subtitle version of the 3D American movie.
- The fifth group information includes the track IDs "1," "2," "4," and "7" and the metadata "[3D] English Audio and Japanese Subtitles 2." That is, the fifth group information indicates a combination of the track of the image of the 2D American movie, which has the track ID "1," the track of the conversion image of the American movie, which has the track ID "2," the track of the 2-ch audio for English, which has the track ID "4," and the track of the 3D Japanese black band subtitles, which has the track ID "7."
- The image and audio reproduced on the basis of this fifth group information are a Japanese black band subtitle version of the 3D American movie.
- group information indicating predetermined combinations is recorded on the recording media 12 .
- the group information in FIG. 5 does not include group information indicating a combination of the tracks having the track IDs “1,” “2,” and “5.” That is, because there are few users who desire to listen to the audio for Japanese and view the 2D subtitles for Japanese together with the image of the 2D American movie, the group information indicating the combination of the track of the image of the 2D American movie, the track of the audio for Japanese, and the track of the 2D subtitles for Japanese is not recorded on the recording media 12 .
- the group information in FIG. 7 does not include group information indicating combinations that do not include the track having the track ID “4” or “5.” That is, because there are few users who do not need forced subtitles, the group information indicating the combinations that do not include the tracks including the forced subtitles is not recorded on the recording media 12 .
- the group information serves as information indicating a combination of tracks to be set as reproduction objects on a reproducing device, described later, for reproducing the recording media 12 .
- the user can select and specify desired group information more quickly than in a case where group information indicating all combinations is recorded on the recording media 12 .
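The five pieces of group information described above can be modeled as a short list of (track IDs, metadata) pairs. The sketch below is illustrative only; the class and field names are hypothetical and are not part of the MP4 specification or of the device described here.

```python
# Illustrative model of the group information of FIG. 5; all names are
# hypothetical.  Each entry pairs the combined track IDs with the
# metadata string presented to the user.
from dataclasses import dataclass
from typing import List

@dataclass
class GroupInformation:
    track_ids: List[int]  # tracks reproduced together
    metadata: str         # label shown on the selection screen

group_information = [
    GroupInformation([1, 3], "[2D] Dubbed in Japanese"),
    GroupInformation([1, 4, 5], "[2D] English Audio and Japanese Subtitles"),
    GroupInformation([1, 2, 3], "[3D] Dubbed in Japanese"),
    GroupInformation([1, 2, 4, 6], "[3D] English Audio and Japanese Subtitles"),
    GroupInformation([1, 2, 4, 7], "[3D] English Audio and Japanese Subtitles 2"),
]

# Every combination reuses track 1 (the 2D main image); only the 3D
# combinations additionally contain track 2 (the conversion image).
three_d = [g.metadata for g in group_information if 2 in g.track_ids]
```

Note that only combinations a producer expects users to want appear in the list, which is what keeps the selection screen short.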
- FIG. 11 is a flowchart of assistance in explaining a recording process by the recording device 10 of FIG. 1 .
- In step S11, the data input section 21 obtains AV data such as 2D main images, 3D main images, conversion images, audio for various languages, 2D secondary images for various languages, 3D secondary images for various languages, and the like from the outside of the recording device 10 .
- the data input section 21 supplies the AV data to the data coding section 22 .
- In step S12, the data input section 21 generates group information according to an instruction from the control section 14 , and supplies the group information to the data coding section 22 .
- In step S13, the preprocessing section 31 in the data coding section 22 applies the preprocessing of a predetermined system, in track units, to the AV data of the 3D main images and the 3D secondary images as 3D images supplied from the data input section 21 .
- the preprocessing section 31 supplies the AV data other than that of the 3D main images and the 3D secondary images, together with the group information supplied from the data input section 21 , to the encoding section 32 as they are.
- In step S14, the encoding section 32 codes the AV data in track units supplied from the preprocessing section 31 by a system in accordance with MP4.
- the encoding section 32 supplies the AV stream in track units obtained as a result of the coding to the file generating section 33 .
- the encoding section 32 supplies the file generating section 33 with the group information as it is.
- In step S15, the file generating section 33 generates an MP4 file using the AV stream in track units and the group information supplied from the encoding section 32 and the management information for each track supplied from the control section 14 .
- the file generating section 33 supplies the generated MP4 file to the recording section 23 .
- In step S16, the recording section 23 supplies the MP4 file supplied from the file generating section 33 to the recording media 12 so that the MP4 file is recorded on the recording media 12 .
- the process is then ended.
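Steps S11 to S16 above form a straightforward pipeline. The following sketch traces that flow with stub functions; every name is a hypothetical stand-in for the corresponding section in FIG. 1, and "prgr" is an invented placeholder key for the presentation track group box, not a registered box type.

```python
# Hypothetical sketch of the recording process of FIG. 11.
def obtain_av_data():                     # S11: data input section 21
    return {1: "2D main image", 3: "2-ch Japanese audio"}

def generate_group_information():         # S12: per instruction of control 14
    return [{"track_ids": [1, 3], "metadata": "[2D] Dubbed in Japanese"}]

def preprocess(av_data):                  # S13: applied only to 3D tracks;
    return av_data                        # nothing to do for 2D data here

def encode(av_data):                      # S14: coding in accordance with MP4
    return {tid: ("stream", d) for tid, d in av_data.items()}

def generate_mp4_file(stream, groups, management):  # S15: file generating 33
    return {"moov": {"prgr": groups, "trak": management}, "mdat": stream}

def record(mp4_file):                     # S16: write to the recording media 12
    return mp4_file

management_info = {1: "main 2D", 3: "audio ja"}     # from control section 14
mp4 = record(generate_mp4_file(encode(preprocess(obtain_av_data())),
                               generate_group_information(),
                               management_info))
```

The essential point of the design is visible in the last step: the group information is carried inside the generated file itself, next to the per-track management information.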
- the recording device 10 disposes the group information in the MP4 file.
- the reproducing device to be described later can present the group information to the user.
- the user can easily specify a combination of tracks indicated by the group information as reproduction objects.
- the user of the recording device 10 can thus make the user of the reproducing device, described later, recognize a combination of appropriate tracks intended by the user of the recording device 10 .
- FIG. 12 is a block diagram showing an example of configuration of an embodiment of a reproducing device as a data processing device to which the present technology is applied.
- the reproducing device 50 of FIG. 12 includes recording media 12 , a reproduction processing section 51 , a user input section 52 , and a control section 53 .
- the reproducing device 50 reproduces an MP4 file from the recording media 12 on which the MP4 file has been recorded by the recording device 10 of FIG. 1 .
- the reproduction processing section 51 includes a readout section 61 , a data decoding section 62 , a display section 63 , and a speaker 64 .
- the readout section 61 (obtaining means) in the reproduction processing section 51 reads and obtains the MP4 file recorded on the recording media 12 .
- the readout section 61 supplies the MP4 file to the data decoding section 62 .
- the data decoding section 62 includes a file analyzing section 71 , a decoding section 72 , and a display information generating section 73 .
- the file analyzing section 71 in the data decoding section 62 analyzes the MP4 file supplied from the readout section 61 , and obtains information disposed in each of a file type box and a movie box. The file analyzing section 71 then supplies the control section 53 with group information disposed in a presentation track group box of the movie box and management information for each track which management information is disposed in a box for each track. In addition, according to an instruction from the control section 53 , the file analyzing section 71 analyzes the MP4 file, obtains an AV stream of tracks as reproduction objects disposed in a real data box, and supplies the AV stream to the decoding section 72 .
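The analysis described above amounts to walking the box tree of the file: locate the movie box, pull the presentation track group box and the per-track boxes out of it, and later pull the requested streams out of the real data box. A minimal sketch, assuming a nested-dictionary representation of the file; the "prgr" box type is a placeholder name for the presentation track group box, not a registered four-character code.

```python
def find_box(box, box_type):
    """Depth-first search for the first box of the given type."""
    if box.get("type") == box_type:
        return box
    for child in box.get("children", []):
        hit = find_box(child, box_type)
        if hit is not None:
            return hit
    return None

# Toy file layout mirroring the boxes named in the text.
mp4_file = {"type": "file", "children": [
    {"type": "ftyp"},                              # file type box
    {"type": "moov", "children": [                 # movie box
        {"type": "trak", "track_id": 1},           # per-track management info
        {"type": "trak", "track_id": 3},
        {"type": "prgr", "groups": [               # presentation track group box
            {"track_ids": [1, 3], "metadata": "[2D] Dubbed in Japanese"}]},
    ]},
    {"type": "mdat", "streams": {1: b"...", 3: b"..."}},  # real data box
]}

moov = find_box(mp4_file, "moov")
prgr = find_box(moov, "prgr")   # group information handed to control section 53
```
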
- the decoding section 72 decodes the AV stream supplied from the file analyzing section 71 by a system corresponding to the coding system of the encoding section 32 in FIG. 1 under control of the control section 53 .
- the decoding section 72 supplies AV data obtained as a result of the decoding to the display information generating section 73 .
- the display information generating section 73 effects display of a menu screen on the display section 63 on the basis of the AV data of the menu screen supplied from the decoding section 72 according to an instruction from the control section 53 .
- the display information generating section 73 (display controlling means) effects display of a screen for selecting tracks to be set as reproduction objects on the display section 63 on the basis of the management information or the group information supplied from the control section 53 .
- the display information generating section 73 applies the postprocessing of a system corresponding to the preprocessing section 31 in FIG. 1 to the AV data of a 3D main image and a 3D secondary image as 3D images supplied from the decoding section 72 , and generates the AV data of an image for a left eye and an image for a right eye. In addition, the display information generating section 73 generates the AV data of an image for a left eye and an image for a right eye using the AV data of a 2D main image and a conversion image supplied from the decoding section 72 .
- the display information generating section 73 effects display of a 3D main image and a 3D secondary image corresponding to the AV data on the display section 63 .
- the display information generating section 73 effects display of a 2D main image and a 2D secondary image corresponding to the AV data on the display section 63 .
- the display information generating section 73 outputs the audio corresponding to the AV data to the speaker 64 .
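As described above, the display information generating section 73 produces a left-eye/right-eye pair in two ways: by postprocessing a 3D image, or by combining the 2D main image with the conversion image. The exact combining mechanism is not specified here, so the sketch below simply pairs the two images as the two eye views; that pairing, and all names, are assumptions for illustration only.

```python
# Hypothetical sketch: forming a stereo pair for display.
def make_stereo_pair(main_2d, conversion=None, stereo=None, postprocess=None):
    if stereo is not None:            # 3D track: apply the postprocessing
        return postprocess(stereo)    # of the system matching FIG. 1
    if conversion is not None:        # 2D main image + conversion image
        return (main_2d, conversion)  # left eye, right eye (assumed pairing)
    return (main_2d, main_2d)         # plain 2D: same image to both eyes

left, right = make_stereo_pair("2D main frame", conversion="conversion frame")
```
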
- the user input section 52 includes an operating button and the like.
- the user input section 52 receives an instruction from a user, and supplies the instruction to the control section 53 .
- the control section 53 performs processing such as control and the like on each part of the reproduction processing section 51 .
- the control section 53 supplies the management information or the group information to the display information generating section 73 according to an instruction from the user input section 52 .
- the control section 53 determines tracks as reproduction objects on the basis of an instruction from the user input section 52 and the management information or the group information.
- the control section 53 then instructs the file analyzing section 71 to obtain the AV stream of the tracks as reproduction objects.
- FIGS. 13 to 15 are diagrams showing an example of a screen displayed on the display section 63 .
- FIGS. 13 to 15 are diagrams showing an example of a screen in a case where the tracks shown in FIG. 8 and the group information shown in FIG. 10 are recorded on the recording media 12 .
- FIG. 13 is a diagram showing an example of a menu screen.
- the menu screen 100 includes a main part button 101 , an audio button 102 , a subtitle button 103 , and a presentation button 104 .
- the main part button 101 is selected to display a screen for selecting the track of an image of the American movie as a reproduction object.
- the audio button 102 is selected to display a screen for selecting the track of audio as a reproduction object.
- the subtitle button 103 is selected to display a screen for selecting the track of subtitles as a reproduction object.
- the presentation button 104 is selected to display a screen for selecting group information indicating a combination of tracks as reproduction objects.
- the display section 63 displays a screen 110 shown in FIG. 14 .
- the control section 53 supplies management information for the tracks of main images to the display information generating section 73 according to the instruction to select the main part button 101 , which instruction is supplied from the user input section 52 .
- the display information generating section 73 effects display of a 2D reproduction button 111 indicating the contents of the track having the track ID “1” and a 3D reproduction button 112 indicating the contents of the track having the track ID “2” on the basis of information indicating the contents of the tracks, which information is included in the management information for the tracks of the main images supplied from the control section 53 .
- the 2D reproduction button 111 is selected to set the track of the image of the 2D American movie which track has the track ID “1” as a reproduction object.
- the 3D reproduction button 112 is selected to display the image of the 3D American movie, with the track of the image of the 2D American movie which track has the track ID “1” and the track of the conversion image which track has the track ID “2” set as reproduction objects.
- buttons indicating the contents of respective tracks of audio or secondary images are displayed on the basis of information indicating the contents of the tracks which information is included in the management information for the tracks of the audio or the secondary images.
- the display section 63 displays a screen 120 shown in FIG. 15 .
- the control section 53 supplies group information to the display information generating section 73 according to the instruction to select the presentation button 104 which instruction is supplied from the user input section 52 .
- the display information generating section 73 effects display of a “[2D] Dubbed in Japanese” button 121 corresponding to the metadata of the first group information, a “[2D] English Audio and Japanese Subtitles” button 122 corresponding to the metadata of the second group information, a “[3D] Dubbed in Japanese” button 123 corresponding to the metadata of the third group information, a “[3D] English Audio and Japanese Subtitles” button 124 corresponding to the metadata of the fourth group information, and a “[3D] English Audio and Japanese Subtitles 2” button 125 corresponding to the metadata of the fifth group information.
- the “[2D] Dubbed in Japanese” button 121 is selected to set the combination of the tracks indicated by the first group information, that is, the tracks having the track IDs “1” and “3,” as reproduction objects.
- the “[2D] English Audio and Japanese Subtitles” button 122 is selected to set the combination of the tracks indicated by the second group information, that is, the tracks having the track IDs “1,” “4,” and “5,” as reproduction objects.
- the “[3D] Dubbed in Japanese” button 123 is selected to set the combination of the tracks indicated by the third group information, that is, the tracks having the track IDs “1,” “2,” and “3,” as reproduction objects.
- the “[3D] English Audio and Japanese Subtitles” button 124 is selected to set the combination of the tracks indicated by the fourth group information, that is, the tracks having the track IDs “1,” “2,” “4,” and “6,” as reproduction objects.
- the “[3D] English Audio and Japanese Subtitles 2” button 125 is selected to set the combination of the tracks indicated by the fifth group information, that is, the tracks having the track IDs “1,” “2,” “4,” and “7,” as reproduction objects.
- FIG. 16 is a flowchart of assistance in explaining a reproducing process by the reproducing device 50 of FIG. 12 .
- This reproducing process is started when the user gives an instruction to reproduce the recording media 12 by operating the user input section 52 , for example.
- In step S30 in FIG. 16 , according to an instruction from the control section 53 corresponding to the reproducing instruction from the user, the readout section 61 reads an MP4 file recorded on the recording media 12 , and supplies the MP4 file to the data decoding section 62 .
- In step S31, the file analyzing section 71 analyzes the MP4 file supplied from the readout section 61 , and determines whether there is a presentation track group box in a movie box of the MP4 file.
- the file analyzing section 71 supplies group information disposed in the presentation track group box to the control section 53 .
- the file analyzing section 71 also supplies management information for each track which management information is disposed in a box for each track in the movie box to the control section 53 .
- In step S32, the control section 53 selects predetermined group information from the group information supplied from the file analyzing section 71 .
- a first group information selecting method is, for example, a method of selecting the group information at the first position.
- the user of the recording device 10 (the producer of the recording media 12 ) disposes the group information that the producer intends at the first position, so that this group information is selected at a time of reproduction of the recording media 12 .
- a second group information selecting method is for example a method of selecting group information including the track of audio for the language of a country in which the reproducing device 50 is used.
- the control section 53 recognizes track IDs included in each piece of group information, and recognizes the contents of tracks having the track IDs from management information for the tracks.
- the control section 53 selects group information including the track ID of a track whose contents are audio for the language of the country in which the reproducing device 50 is used.
- group information detected first is selected, for example.
- a third group information selecting method is for example a method of selecting group information including the track of subtitles for the language of a country in which the reproducing device 50 is used.
- the control section 53 recognizes track IDs included in each piece of group information, and recognizes the contents of tracks having the track IDs from management information for the tracks.
- the control section 53 selects group information including the track ID of a track whose contents are subtitles for the language of the country in which the reproducing device 50 is used.
- group information detected first is selected, for example.
- the language of the country in which the reproducing device 50 is used, which is referred to by the second selecting method and the third selecting method, is specified by the user via the user input section 52 , for example.
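The three selecting methods of step S32 can be compared side by side. In this sketch the per-track management information is reduced to one content string per track ID; the function names and the content encoding are assumptions for illustration.

```python
def select_first(groups):
    """First method: the group disposed at the first position wins."""
    return groups[0]

def select_by_contents(groups, management, wanted):
    """Second and third methods: the first group containing a track whose
    contents match (e.g. audio, or subtitles, for the language of the
    country in which the device is used) is selected."""
    for group in groups:
        if any(management.get(tid) == wanted for tid in group["track_ids"]):
            return group
    return groups[0]   # fall back to the first group when nothing matches

groups = [
    {"track_ids": [1, 4, 5],
     "metadata": "[2D] English Audio and Japanese Subtitles"},
    {"track_ids": [1, 3], "metadata": "[2D] Dubbed in Japanese"},
]
management = {1: "main image", 3: "audio:ja", 4: "audio:en", 5: "subtitles:ja"}

first = select_first(groups)
dubbed = select_by_contents(groups, management, "audio:ja")
```

The fallback branch reflects the text's behavior of taking the first detected group when several (or none) qualify.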
- In step S33, the control section 53 sets all track IDs included in the group information selected in step S32 as the track IDs of tracks as reproduction objects. Then, the control section 53 instructs the file analyzing section 71 to obtain an AV stream of the tracks having the track IDs, and advances the process to step S35.
- the file analyzing section 71 supplies management information for each track, which management information is disposed in the box for each track in the movie box, to the control section 53 .
- In step S34, the control section 53 sets predetermined track IDs, which are set in advance as initial values of track IDs, as the track IDs of tracks as reproduction objects.
- the initial values of the track IDs are, for example, disposed in the movie box, and supplied to the control section 53 via the file analyzing section 71 .
- the control section 53 instructs the file analyzing section 71 to obtain the AV stream of the tracks having the set track IDs, and then advances the process to step S35.
- In step S35, according to the instruction from the control section 53 , the file analyzing section 71 analyzes the MP4 file, obtains the AV stream of the tracks as reproduction objects disposed in the real data box, and supplies the AV stream to the decoding section 72 .
- In step S36, the decoding section 72 decodes the AV stream supplied from the file analyzing section 71 by a system corresponding to the coding system of the encoding section 32 in FIG. 1 under control of the control section 53 .
- the decoding section 72 supplies AV data obtained as a result of the decoding to the display information generating section 73 .
- In step S37, on the basis of the AV data supplied from the decoding section 72 , the display information generating section 73 effects display of a main image and a secondary image corresponding to the AV data on the display section 63 , and outputs audio corresponding to the AV data to the speaker 64 .
- the process is then ended.
- FIG. 17 is a flowchart of assistance in explaining a track changing process for changing tracks as reproduction objects, which process is performed by the reproducing device 50 in FIG. 12 .
- This track changing process is started when the user gives an instruction to display the menu screen by operating the user input section 52 , for example.
- In step S51 in FIG. 17 , the display information generating section 73 effects display of the menu screen 100 on the display section 63 on the basis of the AV data of the menu screen supplied from the decoding section 72 , according to an instruction from the control section 53 corresponding to the user's instruction to display the menu screen 100 .
- In step S52, the control section 53 determines whether the presentation button 104 within the menu screen 100 is selected. When it is determined in step S52 that the presentation button 104 is selected, the control section 53 supplies the group information to the display information generating section 73 .
- In step S53, the display information generating section 73 effects display of the “[2D] Dubbed in Japanese” button 121 , the “[2D] English Audio and Japanese Subtitles” button 122 , the “[3D] Dubbed in Japanese” button 123 , the “[3D] English Audio and Japanese Subtitles” button 124 , and the “[3D] English Audio and Japanese Subtitles 2” button 125 in FIG. 15 on the basis of the metadata included in the group information supplied from the control section 53 .
- In step S54, the control section 53 determines whether one of the buttons 121 to 125 is selected.
- When it is determined in step S54 that none of the buttons 121 to 125 is selected, the control section 53 waits until one of them is selected.
- When it is determined in step S54 that one of the buttons 121 to 125 is selected, the process proceeds to step S55.
- In step S55, the control section 53 changes the track IDs of tracks set as reproduction objects to the track IDs included in the group information corresponding to the selected one of the buttons 121 to 125 .
- the control section 53 thus selects, as a combination of tracks as reproduction objects, the combination indicated by the group information corresponding to the selected button. The process is then ended.
- When it is determined in step S52 that the presentation button 104 is not selected, the control section 53 in step S56 determines whether the main part button 101 is selected.
- When it is determined in step S56 that the main part button 101 is selected, the control section 53 supplies management information for the tracks of main images to the display information generating section 73 .
- In step S57, the display information generating section 73 effects display of the 2D reproduction button 111 and the 3D reproduction button 112 in FIG. 14 on the basis of information indicating the contents of the tracks of the main images, which information is included in the management information supplied from the control section 53 .
- In step S58, the control section 53 determines whether one of the 2D reproduction button 111 and the 3D reproduction button 112 is selected. When it is determined in step S58 that neither button is selected, the control section 53 waits until one of the 2D reproduction button 111 and the 3D reproduction button 112 is selected.
- When it is determined in step S58 that one of the 2D reproduction button 111 and the 3D reproduction button 112 is selected, the process proceeds to step S59.
- In step S59, the control section 53 changes the track ID of the track of a main image set as a reproduction object to the track ID of the track of the main image corresponding to the 2D reproduction button 111 or the 3D reproduction button 112 that is selected. The process is then ended.
- When it is determined in step S56 that the main part button 101 is not selected, on the other hand, the control section 53 in step S60 determines whether the audio button 102 is selected.
- When it is determined in step S60 that the audio button 102 is selected, the control section 53 in step S61 supplies management information for the tracks of audio to the display information generating section 73 .
- In step S61, the display information generating section 73 effects display of various buttons on the basis of information indicating the contents of the tracks of the audio, which information is included in the management information supplied from the control section 53 .
- In step S62, the control section 53 determines whether one of the buttons displayed in step S61 is selected. When it is determined in step S62 that none of the buttons is selected, the control section 53 waits until one of them is selected.
- When it is determined in step S62 that one of the buttons displayed in step S61 is selected, the process proceeds to step S63.
- In step S63, the control section 53 changes the track ID of the track of audio set as a reproduction object to the track ID of the track of audio corresponding to the selected button. The process is then ended.
- When it is determined in step S60 that the audio button 102 is not selected, on the other hand, the control section 53 in step S64 determines whether the subtitle button 103 is selected.
- When it is determined in step S64 that the subtitle button 103 is not selected, the process returns to step S52 to repeat the processes of steps S52, S56, S60, and S64 until the presentation button 104 , the main part button 101 , the audio button 102 , or the subtitle button 103 is selected.
- When it is determined in step S64 that the subtitle button 103 is selected, on the other hand, the control section 53 in step S65 supplies management information for the tracks of subtitles to the display information generating section 73 .
- In step S65, the display information generating section 73 effects display of various buttons on the basis of information indicating the contents of the tracks of the subtitles, which information is included in the management information supplied from the control section 53 .
- In step S66, the control section 53 determines whether one of the buttons displayed in step S65 is selected. When it is determined in step S66 that none of the buttons is selected, the control section 53 waits until one of them is selected.
- When it is determined in step S66 that one of the buttons displayed in step S65 is selected, the process proceeds to step S67.
- In step S67, the control section 53 changes the track ID of the track of subtitles set as a reproduction object to the track ID of the track of subtitles corresponding to the selected button. The process is then ended.
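The track changing process of FIG. 17 behaves differently depending on what is selected: a presentation selection replaces the entire combination at once, while a main-image, audio, or subtitle selection swaps out only the track of that kind. A simplified sketch under that reading; the kind labels in the management information are assumptions for illustration.

```python
def change_tracks(current_ids, selection, management):
    kind, value = selection
    if kind == "presentation":              # steps S52 to S55
        return list(value["track_ids"])     # adopt the whole combination
    # steps S56 to S67: keep tracks of the other kinds, swap only this one
    kept = [tid for tid in current_ids
            if not management[tid].startswith(kind)]
    return kept + [value]

management = {1: "main:2D", 3: "audio:ja", 4: "audio:en", 5: "subtitles:2D-ja"}

ids = [1, 3]                                   # "[2D] Dubbed in Japanese"
ids = change_tracks(ids, ("audio", 4), management)       # swap the audio track
ids = change_tracks(ids, ("presentation",
                          {"track_ids": [1, 4, 5]}), management)
```
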
- the reproducing device 50 obtains an MP4 file having group information disposed therein, and on the basis of the group information, displays the screen 120 for selecting group information indicating a combination of tracks as reproduction objects.
- the MP4 file is recorded on the recording media 12 .
- the MP4 file may be transmitted via a predetermined network.
- the series of processes described above can be performed not only by hardware but also by software.
- a program constituting the software is installed onto a general-purpose computer or the like.
- FIG. 18 shows an example of configuration of one embodiment of a computer onto which the program for performing the series of processes described above is installed.
- the program can be recorded in advance in a storage section 208 or a ROM (Read Only Memory) 202 as a recording medium included in the computer.
- the program can be stored (recorded) on removable media 211 .
- removable media 211 can be provided as so-called packaged software.
- the removable media 211 include for example a flexible disk, a CD-ROM (Compact Disk Read Only Memory), an MO (Magneto Optical) disk, a DVD (Digital Versatile Disk), a magnetic disk, and a semiconductor memory.
- the program can be not only installed from the removable media 211 as described above onto the computer via a drive 210 but also downloaded to the computer via a communication network or a broadcasting network and installed into the built-in storage section 208 .
- the program can be for example transferred from a download site to the computer by radio via an artificial satellite for digital satellite broadcasting or transferred to the computer by wire via a network such as a LAN (Local Area Network), the Internet or the like.
- the computer includes a CPU (Central Processing Unit) 201 .
- the CPU 201 is connected with an input-output interface 205 via a bus 204 .
- the CPU 201 executes the program stored in the ROM 202 according to the command.
- the CPU 201 loads the program stored in the storage section 208 into a RAM (Random Access Memory) 203 , and then executes the program.
- the CPU 201 thereby performs the processes according to the above-described flowcharts or processes performed by the configurations of the above-described block diagrams. Then, the CPU 201 for example outputs a result of a process from an output section 207 via the input-output interface 205 , transmits the result of the process from a communicating section 209 via the input-output interface 205 , or records the result of the process in the storage section 208 via the input-output interface 205 , as required.
- the input section 206 includes a keyboard, a mouse, a microphone and the like.
- the input section 206 corresponds to the user input section 13 and the user input section 52 .
- the output section 207 includes an LCD (Liquid Crystal Display), a speaker and the like.
- the output section 207 corresponds to for example the display section 63 and the speaker 64 .
- the recording media 12 may be inserted into the computer via a drive not shown in the figure, or may be included in the computer as a part of the storage section 208 .
- processes performed by the computer according to the program do not necessarily need to be performed in time series in the order described in the flowcharts. That is, processes performed by the computer according to the program also include processes performed in parallel or individually (for example parallel processing or processing according to objects).
- the program may be processed by one computer (processor), or may be subjected to a distributed processing by a plurality of computers. Further, the program may be transferred to a remote computer to be executed.
- the steps describing the program stored on a program recording medium include not only processes performed in time series in the described order but also processes not necessarily performed in time series but performed in parallel or individually.
Abstract
Disclosed herein is a data generating device including: a coding section coding a plurality of kinds of data, and generating coded data; an information generating section generating a plurality of pieces of group information indicating combinations of a plurality of kinds of the data; and a file generating section generating a coded data storage file including the coded data of the plurality of kinds of the data and the plurality of pieces of the group information.
Description
- This application is a continuation of U.S. application Ser. No. 13/179,697, filed Jul. 11, 2011, the entire contents of which are incorporated herein by reference. U.S. application Ser. No. 13/179,697 claims the benefit of priority under 35 U.S.C. § 119 from Japanese Patent Application No. 2010-173600, filed Aug. 2, 2010.
- The present disclosure relates to a data generating device and a data generating method, and a data processing device and a data processing method, and particularly to a data generating device and a data generating method, and a data processing device and a data processing method, which allow a desired combination of a plurality of kinds of data to be easily selected as reproduction objects.
- MP4 is a file format for storing data coded by an MPEG-4 (Moving Picture Experts Group phase 4) system or the like, and is defined in ISO/IEC 14496. MP4 is described in JP-T-2006-507553, JP-T-2005-524128, JP-T-2005-527885, JP-T-2005-525627, and Japanese Patent Laid-Open No. 2004-227633 (referred to as Patent Documents 1 to 5, respectively, hereinafter), for example.
- In the past, AV (Audio Video) data such as a main image (Video), audio (Audio), a secondary image (Subtitle), and the like has been stored in an MP4 file on a track-by-track basis. Incidentally, a track is a unit of AV data that can be managed independently. In addition, in an MP4 file, tracks of the same kind (for example main images, audio, secondary images, and the like) can be grouped.
- However, tracks of different kinds have not been able to be grouped, so a combination of a plurality of kinds of tracks could not be handled as a group. As a result, when the tracks of the respective kinds set as initial values of the reproduction objects are not the tracks intended by a user, the user needs to select a desired track for each kind and thereby change the tracks set as reproduction objects.
- For example, when the user desires to listen to audio for English and view subtitles for Japanese as a secondary image together with a movie as a main image, and the tracks of the movie and audio for Japanese are set as initial values of reproduction objects, the user needs to select the track of audio for English as the track of audio as a reproduction object, and select the track of subtitles for Japanese as the track of a secondary image as a reproduction object, which is troublesome. There is thus a desire to enable a desired combination of a plurality of kinds of tracks to be easily selected as reproduction objects.
- The present disclosure has been made in view of such a situation. It is desirable to enable a desired combination of a plurality of kinds of data to be easily selected as reproduction objects.
- According to a first embodiment of the present disclosure, there is provided a data generating device including: a coding section coding a plurality of kinds of data, and generating coded data; an information generating section generating a plurality of pieces of group information indicating combinations of a plurality of kinds of the data; and a file generating section generating a coded data storage file including the coded data of the plurality of kinds of the data and the plurality of pieces of the group information.
- A data generating method according to the first embodiment of the present disclosure corresponds to the data generating device according to the first embodiment of the present disclosure.
- In the first embodiment of the present disclosure, a plurality of kinds of data are coded, coded data is generated, a plurality of pieces of group information indicating combinations of a plurality of kinds of the data are generated, and a coded data storage file including the coded data of the plurality of kinds of the data and the plurality of pieces of the group information is generated.
- According to a second embodiment of the present disclosure, there is provided a data processing device including: an obtaining section obtaining a coded data storage file including coded data of a plurality of kinds of data and a plurality of pieces of group information indicating combinations of a plurality of kinds of the data; a display controlling section making a screen for selecting a combination indicated by the group information displayed on a basis of the plurality of pieces of the group information; a selecting section selecting a combination of data as a reproduction object from the combinations indicated by the plurality of pieces of the group information according to an input from a user to the screen; and a decoding section decoding the coded data of all the data included in the combination selected by the selecting section.
- A data processing method according to the second embodiment of the present disclosure corresponds to the data processing device according to the second embodiment of the present disclosure.
- In the second embodiment of the present disclosure, a coded data storage file including coded data of a plurality of kinds of data and a plurality of pieces of group information indicating combinations of a plurality of kinds of the data is obtained, a screen for selecting a combination indicated by the group information is displayed on a basis of the plurality of pieces of the group information, a combination of data as a reproduction object is selected from the combinations indicated by the plurality of pieces of the group information according to an input from a user to the screen, and the coded data of all the data included in the selected combination is decoded.
- According to the first embodiment of the present disclosure, it is possible to generate a file that enables a desired combination of a plurality of kinds of data to be easily selected as a reproduction object.
- According to the second embodiment of the present disclosure, a desired combination of a plurality of kinds of data can be easily selected as a reproduction object.
-
FIG. 1 is a block diagram showing an example of configuration of one embodiment of a recording device as a data generating device to which the present technology is applied; -
FIG. 2 is a diagram showing an example of configuration of an MP4 file; -
FIG. 3 is a diagram showing an example of description of a presentation track group box; -
FIG. 4 is a diagram showing a first example of tracks; -
FIG. 5 is a diagram showing an example of group information in a case where the tracks shown in FIG. 4 are recorded; -
FIG. 6 is a diagram showing a second example of tracks; -
FIG. 7 is a diagram showing an example of group information in a case where the tracks shown in FIG. 6 are recorded; -
FIG. 8 is a diagram showing a third example of tracks; -
FIG. 9 is a diagram of assistance in explaining a black band part of a screen; -
FIG. 10 is a diagram showing an example of group information in a case where the tracks shown in FIG. 8 are recorded; -
FIG. 11 is a flowchart of assistance in explaining a recording process; -
FIG. 12 is a block diagram showing an example of configuration of one embodiment of a reproducing device as a data processing device to which the present technology is applied; -
FIG. 13 is a diagram showing an example of a menu screen; -
FIG. 14 is a diagram showing an example of a screen displayed when a main part button is selected; -
FIG. 15 is a diagram showing an example of a screen displayed when a presentation button is selected; -
FIG. 16 is a flowchart of assistance in explaining a reproducing process; -
FIG. 17 is a flowchart of assistance in explaining a track changing process; and -
FIG. 18 is a diagram showing an example of configuration of one embodiment of a computer. - [Example of Configuration of An Embodiment of Recording Device]
-
FIG. 1 is a block diagram showing an example of configuration of an embodiment of a recording device as a data generating device to which the present technology is applied. - The
recording device 10 of FIG. 1 includes a recording processing section 11, recording media 12, a user input section 13, and a control section 14. The recording device 10 generates and records an MP4 file of AV data. - Specifically, the
recording processing section 11 includes a data input section 21, a data coding section 22, and a recording section 23. - The
data input section 21 in the recording processing section 11 obtains AV data in track units from the outside of the recording device 10. The AV data includes 2D main images, 3D main images, images for conversion from 2D main images to 3D main images (which images for the conversion will hereinafter be referred to as conversion images), audio for various languages, 2D secondary images for various languages, 3D secondary images for various languages, and the like. - Incidentally, a 3D main image includes for example a main image for a left eye and a main image for a right eye. The same is true for a 3D secondary image. A conversion image is one of a main image for a left eye and a main image for a right eye. A 2D main image is used as the other at a time of reproduction of the conversion image. Secondary images include subtitles, comment images, menu screens, and the like.
- The data input section 21 (information generating means) generates group information indicating a combination of at least two kinds of tracks among main images, audio, and secondary images according to an instruction from the
control section 14. The data input section 21 supplies the AV data and the group information to the data coding section 22. - The
data coding section 22 includes a preprocessing section 31, an encoding section 32, and a file generating section 33. The data coding section 22 generates an MP4 file. - Specifically, the
preprocessing section 31 in the data coding section 22 applies the preprocessing of a predetermined system to the AV data in track units of 3D main images and 3D secondary images as 3D images supplied from the data input section 21. The predetermined system includes a frame sequential system, a side by side system, a top and bottom system, and the like.
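The three systems just named can be sketched as simple frame-layout transforms. This is only an illustrative model, not code from the disclosure: frames are assumed to be lists of pixel rows, naive row/column decimation stands in for a real downscale filter, and the left-eye image is placed in the left (or upper) region although the text allows either arrangement.

```python
def side_by_side(left, right):
    """Pack the left-eye frame into one horizontal half of the screen and
    the right-eye frame into the other; [::2] is a stand-in downscale."""
    return [l_row[::2] + r_row[::2] for l_row, r_row in zip(left, right)]

def top_and_bottom(left, right):
    """Pack the left-eye image into the upper region and the right-eye
    image into the lower region, decimating rows instead of columns."""
    return left[::2] + right[::2]

def frame_sequential(left_frames, right_frames):
    """Alternately output the left-eye and right-eye frames."""
    out = []
    for l, r in zip(left_frames, right_frames):
        out += [l, r]
    return out
```

All three take the same left-eye/right-eye input, which is why the preprocessing section can select among them per track.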
- In addition, the
preprocessing section 31 supplies the encoding section 32 with AV data other than the AV data of 3D main images and 3D secondary images and the group information as they are. - The encoding section 32 (coding means) codes the AV data in track units which AV data is supplied from the
preprocessing section 31 by a system in accordance with MP4. For example, the encoding section 32 codes the AV data of a main image by an MPEG-1 system, an MPEG-2 system, an MPEG-4 system or the like, and codes the AV data of a secondary image by a JPEG (Joint Photographic Experts Group) system, a PNG (Portable Network Graphics) system or the like. In addition, the encoding section 32 codes the AV data of audio by an AAC (Advanced Audio Coding) system, an MP3 (Moving Picture Experts Group Audio Layer-3) system or the like. - The
encoding section 32 supplies an AV stream in track units which AV stream is obtained as a result of the coding to the file generating section 33. The encoding section 32 also supplies the file generating section 33 with the group information as it is. - The file generating section 33 (file generating means) generates an MP4 file using the AV stream in track units and the group information that are supplied from the
encoding section 32 as well as management information for each track which management information is supplied from the control section 14. The file generating section 33 supplies the generated MP4 file to the recording section 23. - The
recording section 23 supplies the MP4 file supplied from the file generating section 33 to the recording media 12 to make the MP4 file recorded on the recording media 12. - The
recording media 12 are formed by a flash memory, an HDD (Hard Disk Drive), a DVD (Digital Versatile Disk), and the like. - The
user input section 13 includes an operating button and the like. The user input section 13 receives an instruction from a user, and supplies the instruction to the control section 14. - The
control section 14 performs processing such as control and the like on each part of the recording processing section 11. For example, the control section 14 determines a coding system for each track according to an instruction from the user input section 13, and controls the encoding section 32 so as to code each track by the coding system. In addition, the control section 14 generates management information including a track ID, which is an ID unique to each track, information indicating the contents of the track, the coding system, and the like according to an instruction from the user input section 13, and supplies the management information to the file generating section 33. - [Example of Configuration of MP4 File]
-
FIG. 2 is a diagram showing an example of configuration of an MP4 file generated by the file generating section 33. - As shown in
FIG. 2, the MP4 file has an object-oriented data structure. Each object is referred to as a box (Box). - The MP4 file of
FIG. 2 includes a file type box (ftyp), a movie box (moov), and a real data box (mdat). - Information on a file type is disposed in the file type box.
- Management information for the AV streams in track units which are disposed in the real data box is disposed in the movie box. Specifically, the movie box includes a presentation track group box (PTGP) in which the group information is disposed, a box (trak) for each track in which the management information for the track is disposed, and the like.
- The AV stream is disposed in track units in the real data box.
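The box layout described above (a file type box, a movie box, and a real data box, with further boxes nested inside the movie box) can be walked with a few lines of code. The sketch below follows the standard ISO base media box header of ISO/IEC 14496-12 — a 32-bit big-endian size covering the whole box, then a four-character type — and omits 64-bit sizes and other corner cases; it is a generic walker, not the specific parser of the disclosure.

```python
import struct

def iter_boxes(data, offset=0, end=None):
    """Yield (type, payload) for each box in an MP4 byte string.  Boxes
    nest, so the same walker also works on a box's payload (for example,
    on the movie box to find the per-track 'trak' boxes)."""
    end = len(data) if end is None else end
    while offset + 8 <= end:
        size, box_type = struct.unpack_from(">I4s", data, offset)
        yield box_type, data[offset + 8:offset + size]
        offset += size

def make_box(box_type, payload):
    """Serialize one box: 32-bit size field, 4-byte type, then payload."""
    return struct.pack(">I4s", 8 + len(payload), box_type) + payload
```

A toy file built from `make_box(b"ftyp", ...)`, `make_box(b"moov", ...)`, and `make_box(b"mdat", ...)` round-trips through `iter_boxes`.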
- [Example of Description of Presentation Track Group Box]
-
FIG. 3 is a diagram showing an example of description of a presentation track group box. - A description “for (i=1; i≦number_of_presentations; i++)” in a third row of the description of the presentation track group box shown in
FIG. 3 indicates that the description in the fourth to ninth rows is repeated a number of times equal to the number of pieces of group information. That is, the description in the fourth to ninth rows is group information. - Specifically, a description “presentation_unit_size” in the fourth row indicates the data size of the group information. A description “number_of_track_IDs” in the fifth row indicates the number of track IDs of tracks included in a combination indicated by the group information. A description “for (j=1; j≦number_of_track_IDs; j++)” in the sixth row indicates that the description in the seventh row is repeated a number of times equal to the number of track IDs of tracks included in the combination indicated by the group information. A description “track_ID” in the seventh row indicates the track ID of a track included in the combination indicated by the group information. A description “metadata” in the ninth row represents metadata indicating the contents of the tracks included in the combination indicated by the group information. This metadata is, for example, character data coded by a UTF-16BE system, and is data ending with a NULL character.
- As described above, the size of the group information, the number of all tracks included in the combination indicated by the group information and the track IDs of the tracks, and the metadata are described for each piece of group information in the presentation track group box.
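Under the layout just summarized, a presentation track group box payload can be serialized and parsed as follows. The field widths are assumptions — the text does not specify them — so 32-bit big-endian unsigned integers are used for presentation_unit_size, number_of_track_IDs, and each track_ID, while the metadata is UTF-16BE character data ending with a NULL character as stated.

```python
import struct

def build_group(track_ids, metadata):
    """Serialize one piece of group information: presentation_unit_size,
    number_of_track_IDs, the track_ID fields, then the metadata as
    UTF-16BE character data ending with a NULL character.
    Field widths are assumed, not taken from the disclosure."""
    meta = (metadata + "\x00").encode("utf-16-be")
    body = struct.pack(f">I{len(track_ids)}I", len(track_ids), *track_ids) + meta
    return struct.pack(">I", 4 + len(body)) + body

def parse_presentation_track_groups(payload):
    """Parse every piece of group information in a presentation track
    group box payload laid out as in FIG. 3 (same width assumptions)."""
    groups, pos = [], 0
    while pos < len(payload):
        (unit_size,) = struct.unpack_from(">I", payload, pos)
        (n_ids,) = struct.unpack_from(">I", payload, pos + 4)
        track_ids = list(struct.unpack_from(f">{n_ids}I", payload, pos + 8))
        meta_raw = payload[pos + 8 + 4 * n_ids:pos + unit_size]
        metadata = meta_raw.decode("utf-16-be").rstrip("\x00")
        groups.append({"track_IDs": track_ids, "metadata": metadata})
        pos += unit_size
    return groups
```

Because presentation_unit_size covers the whole entry, the parser can skip from one piece of group information to the next without understanding the metadata.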
- [Example of Tracks And Group Information]
-
FIGS. 4 to 10 are diagrams of assistance in explaining examples of tracks and group information recorded on the recording media 12. -
FIG. 4 is a diagram showing a first example of tracks recorded on the recording media 12. - In the example of
FIG. 4, six tracks having track IDs of 1 to 6 are recorded on the recording media 12. The track having the track ID “1” is the track of a main image, and is formed by the image data of a 2D American movie. The track having the track ID “2” is a first audio track, and is formed by the audio data of 5.1-ch audio for Japanese. The track having the track ID “3” is a second audio track, and is formed by the audio data of 2-ch audio for Japanese. The track having the track ID “4” is a third audio track, and is formed by the audio data of 5.1-ch audio for English.
-
FIG. 5 is a diagram showing an example of group information in a case where the tracks shown in FIG. 4 are recorded on the recording media 12. - In the example of
FIG. 5, five pieces of group information are recorded on the recording media 12. The first group information includes track IDs “1” and “2” and metadata “Dubbed in Japanese (5.1 ch).” That is, the first group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1” and the track of the 5.1-ch audio for Japanese which track has the track ID “2.” The image and audio reproduced on the basis of this first group information is a 5.1-ch Japanese-dubbed version of the 2D American movie.
- The third group information includes track IDs “1” and “4” and metadata “English (5.1 ch).” That is, the third group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1” and the track of the 5.1-ch audio for English which track has the track ID “4.” The image and audio reproduced on the basis of this third group information is the 2D American movie.
- The fourth group information includes track IDs “1,” “4,” and “5” and metadata “English (5.1 ch, Japanese Subtitle).” That is, the fourth group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1,” the track of the 5.1-ch audio for English which track has the track ID “4,” and the track of the 2D subtitles for Japanese which track has the track ID “5.” The image and audio reproduced on the basis of this fourth group information is a Japanese subtitle version of the 2D American movie.
- The fifth group information includes track IDs “1,” “4,” and “6” and metadata “English (5.1 ch, Japanese Comment).” That is, the fifth group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1,” the track of the 5.1-ch audio for English which track has the track ID “4,” and the track of the 2D comment image for Japanese which track has the track ID “6.” The image and audio reproduced on the basis of this fifth group information is the 2D American movie with a Japanese comment.
-
FIG. 6 is a diagram showing a second example of tracks recorded on the recording media 12. - In the example of
FIG. 6, five tracks having track IDs of 1 to 5 are recorded on the recording media 12. The track having the track ID “1” is the track of a main image, and is formed by the image data of a 2D American movie. The track having the track ID “2” is a first audio track, and is formed by the audio data of 2-ch audio for Japanese. The track having the track ID “3” is a second audio track, and is formed by the audio data of 2-ch audio for English.
- The track having the track ID “5” is the track of a second secondary image, and is the image data of 2D subtitles for Japanese and 2D forced subtitles for Japanese.
-
FIG. 7 is a diagram showing an example of group information in a case where the tracks shown in FIG. 6 are recorded on the recording media 12. - In the example of
FIG. 7, two pieces of group information are recorded on the recording media 12. The first group information includes track IDs “1,” “2,” and “4” and metadata “Dubbed in Japanese (Forced Subtitle).” That is, the first group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1,” the track of the 2-ch audio for Japanese which track has the track ID “2,” and the track of the 2D forced subtitles for Japanese audio which track has the track ID “4.” The image and audio reproduced on the basis of this first group information is a Japanese-dubbed version of the 2D American movie with the forced subtitles.
-
FIG. 8 is a diagram showing a third example of tracks recorded on the recording media 12. - In the example of
FIG. 8, seven tracks having track IDs of 1 to 7 are recorded on the recording media 12. The track having the track ID “1” is the track of a first main image, and is formed by the image data of a 2D American movie. The track having the track ID “2” is the track of a second main image, and is formed by the image data of a conversion image of the American movie.
- The track having the track ID “5” is the track of a first secondary image, and is formed by the image data of 2D subtitles for Japanese. The track having the track ID “6” is the track of a second secondary image, and is formed by the image data of 3D subtitles for Japanese. The track having the track ID “7” is the track of a third secondary image, and is formed by the image data of 3D subtitles for Japanese which subtitles are displayed in a black band part of a screen (which subtitles will hereinafter be referred to as 3D Japanese black band subtitles).
- Incidentally, a black band part of a screen is a
black display region 40 disposed in an upper part or a lower part of the screen when a main image is a movie in a CinemaScope size or the like, as shown in FIG. 9. -
FIG. 10 is a diagram showing an example of group information in a case where the tracks shown in FIG. 8 are recorded on the recording media 12. - In the example of
FIG. 10, five pieces of group information are recorded on the recording media 12. The first group information includes track IDs “1” and “3” and metadata “[2D] Dubbed in Japanese.” That is, the first group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1” and the track of the 2-ch audio for Japanese which track has the track ID “3.” The image and audio reproduced on the basis of this first group information is a Japanese-dubbed version of the 2D American movie. - The second group information includes track IDs “1,” “4,” and “5” and metadata “[2D] English Audio and Japanese Subtitles.” That is, the second group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1,” the track of the 2-ch audio for English which track has the track ID “4,” and the track of the 2D subtitles for Japanese which track has the track ID “5.” The image and audio reproduced on the basis of this second group information is a Japanese subtitle version of the 2D American movie.
- The third group information includes track IDs “1,” “2,” and “3” and metadata “[3D] Dubbed in Japanese.” That is, the third group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1,” the track of the conversion image of the American movie which track has the track ID “2,” and the track of the 2-ch audio for Japanese which track has the track ID “3.” The image and audio reproduced on the basis of this third group information is a Japanese-dubbed version of the 3D American movie.
- The fourth group information includes track IDs “1,” “2,” “4,” and “6” and metadata “[3D] English Audio and Japanese Subtitles.” That is, the fourth group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1,” the track of the conversion image of the American movie which track has the track ID “2,” the track of the 2-ch audio for English which track has the track ID “4,” and the track of the 3D subtitles for Japanese which track has the track ID “6.” The image and audio reproduced on the basis of this fourth group information is a Japanese subtitle version of the 3D American movie.
- The fifth group information includes track IDs “1,” “2,” “4,” and “7” and metadata “[3D] English Audio and
Japanese Subtitles 2.” That is, the fifth group information indicates a combination of the track of the image of the 2D American movie which track has the track ID “1,” the track of the conversion image of the American movie which track has the track ID “2,” the track of the 2-ch audio for English which track has the track ID “4,” and the track of the 3D Japanese black band subtitles which track has the track ID “7.” The image and audio reproduced on the basis of this fifth group information is a Japanese black band subtitle version of the 3D American movie. - As described above, only group information indicating predetermined combinations, rather than group information indicating all combinations, is recorded on the
recording media 12. - For example, the group information in
FIG. 5 does not include group information indicating a combination of the tracks having the track IDs “1,” “2,” and “5.” That is, because there are few users who desire to listen to the audio for Japanese and view the 2D subtitles for Japanese together with the image of the 2D American movie, the group information indicating the combination of the track of the image of the 2D American movie, the track of the audio for Japanese, and the track of the 2D subtitles for Japanese is not recorded on the recording media 12. - In addition, the group information in
FIG. 7 does not include group information indicating combinations that do not include the track having the track ID “4” or “5.” That is, because there are few users who do not need forced subtitles, the group information indicating the combinations that do not include the tracks including the forced subtitles is not recorded on the recording media 12. - Thus, when a user specifies group information as information indicating a combination of tracks as reproduction objects on a reproducing device for reproducing the
recording media 12, which reproducing device will be described later, the user can select and specify desired group information quickly as compared with a case where group information indicating all combinations is recorded on the recording media 12. - [Description of process of data generating device]
-
FIG. 11 is a flowchart of assistance in explaining a recording process by the recording device 10 of FIG. 1. - In step S11, the
data input section 21 obtains AV data such as 2D main images, 3D main images, conversion images, audio for various languages, 2D secondary images for various languages, 3D secondary images for various languages, and the like from the outside of the recording device 10. The data input section 21 supplies the AV data to the data coding section 22. - In step S12, the
data input section 21 generates group information according to an instruction from the control section 14, and supplies the group information to the data coding section 22. - In step S13, the
preprocessing section 31 in the data coding section 22 applies the preprocessing of a predetermined system to the AV data in track units of the 3D main images and the 3D secondary images as 3D images supplied from the data input section 21. In addition, the preprocessing section 31 supplies the AV data other than the AV data of the 3D main images and the 3D secondary images and the group information supplied from the data input section 21 to the encoding section 32 as they are. - In step S14, the
encoding section 32 codes the AV data in track units which AV data is supplied from the preprocessing section 31 by a system in accordance with MP4. The encoding section 32 supplies an AV stream in track units which AV stream is obtained as a result of the coding to the file generating section 33. In addition, the encoding section 32 supplies the file generating section 33 with the group information as it is. - In step S15, the
file generating section 33 generates an MP4 file using the AV stream in track units and the group information supplied from the encoding section 32 and the management information for each track supplied from the control section 14. The file generating section 33 supplies the generated MP4 file to the recording section 23. - In step S16, the
recording section 23 supplies the MP4 file supplied from the file generating section 33 to the recording media 12 so that the MP4 file is recorded on the recording media 12. The process is then ended. - As described above, the
recording device 10 disposes the group information in the MP4 file. Thus, the reproducing device to be described later can present the group information to the user. As a result, merely by selecting desired group information from the presented group information, the user can easily specify the combination of tracks indicated by that group information as reproduction objects. In addition, the user of the recording device 10 can make the user of the reproducing device to be described later recognize an appropriate combination of tracks intended by the user of the recording device 10. - [Example of Configuration of An Embodiment of Reproducing Device]
-
FIG. 12 is a block diagram showing an example of configuration of an embodiment of a reproducing device as a data processing device to which the present technology is applied. - The reproducing
device 50 of FIG. 12 includes the recording media 12, a reproduction processing section 51, a user input section 52, and a control section 53. The reproducing device 50 reproduces an MP4 file from the recording media 12 on which the MP4 file has been recorded by the recording device 10 of FIG. 1. - Specifically, the
reproduction processing section 51 includes a readout section 61, a data decoding section 62, a display section 63, and a speaker 64. - The readout section 61 (obtaining means) in the
reproduction processing section 51 reads and obtains the MP4 file recorded on the recording media 12. The readout section 61 supplies the MP4 file to the data decoding section 62. - The
data decoding section 62 includes a file analyzing section 71, a decoding section 72, and a display information generating section 73. - The
file analyzing section 71 in the data decoding section 62 analyzes the MP4 file supplied from the readout section 61, and obtains the information disposed in each of a file type box and a movie box. The file analyzing section 71 then supplies the control section 53 with the group information disposed in a presentation track group box of the movie box and the management information for each track, which is disposed in a box for each track. In addition, according to an instruction from the control section 53, the file analyzing section 71 analyzes the MP4 file, obtains the AV stream of the tracks as reproduction objects disposed in a real data box, and supplies the AV stream to the decoding section 72. - The decoding section 72 (decoding means) decodes the AV stream supplied from the
file analyzing section 71 by a system corresponding to the coding system of the encoding section 32 in FIG. 1 under control of the control section 53. The decoding section 72 supplies the AV data obtained as a result of the decoding to the display information generating section 73. - The display
information generating section 73 effects display of a menu screen on the display section 63 on the basis of the AV data of the menu screen supplied from the decoding section 72, according to an instruction from the control section 53. In addition, the display information generating section 73 (display controlling means) effects display of a screen for selecting the tracks to be set as reproduction objects on the display section 63 on the basis of the management information or the group information supplied from the control section 53. - The display
information generating section 73 applies postprocessing of a system corresponding to the preprocessing section 31 in FIG. 1 to the AV data of a 3D main image and a 3D secondary image supplied from the decoding section 72 as 3D images, and generates the AV data of an image for a left eye and an image for a right eye. In addition, the display information generating section 73 generates the AV data of an image for a left eye and an image for a right eye using the AV data of a 2D main image and a conversion image supplied from the decoding section 72. - On the basis of the generated AV data of the image for the left eye and the image for the right eye, the display
information generating section 73 effects display of a 3D main image and a 3D secondary image corresponding to the AV data on the display section 63. In addition, on the basis of the AV data of images other than the 3D main image, the 3D secondary image, and the conversion image supplied from the decoding section 72, the display information generating section 73 effects display of a 2D main image and a 2D secondary image corresponding to the AV data on the display section 63. Further, on the basis of the AV data of audio, the display information generating section 73 outputs the audio corresponding to the AV data to the speaker 64. - The
user input section 52 includes an operating button and the like. The user input section 52 receives an instruction from a user, and supplies the instruction to the control section 53. - The
control section 53 performs processing such as control on each part of the reproduction processing section 51. For example, the control section 53 supplies the management information or the group information to the display information generating section 73 according to an instruction from the user input section 52. In addition, the control section 53 determines the tracks as reproduction objects on the basis of an instruction from the user input section 52 and the management information or the group information. The control section 53 then instructs the file analyzing section 71 to obtain the AV stream of the tracks as reproduction objects. - [Example of Screen Displayed on Display Section]
-
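The screens described in this section present the group information shown in FIG. 10. As a concrete data sketch (the dict layout is illustrative, not the actual box format), the five pieces of group information pair the metadata label shown to the user with the track-ID combination set as reproduction objects when it is selected:

```python
# The five pieces of group information of FIG. 10, as described in this
# section: each pairs the on-screen metadata label with the combination
# of track IDs set as reproduction objects. The dict layout is an
# illustrative assumption, not the actual group-information box format.
GROUP_INFORMATION = [
    {"metadata": "[2D] Dubbed in Japanese", "track_ids": [1, 3]},
    {"metadata": "[2D] English Audio and Japanese Subtitles", "track_ids": [1, 4, 5]},
    {"metadata": "[3D] Dubbed in Japanese", "track_ids": [1, 2, 3]},
    {"metadata": "[3D] English Audio and Japanese Subtitles", "track_ids": [1, 2, 4, 6]},
    {"metadata": "[3D] English Audio and Japanese Subtitles 2", "track_ids": [1, 2, 4, 7]},
]

def tracks_for(label):
    # Return the reproduction-object track IDs for a selected button label.
    for group in GROUP_INFORMATION:
        if group["metadata"] == label:
            return group["track_ids"]
    raise KeyError(label)
```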
FIGS. 13 to 15 are diagrams showing an example of a screen displayed on the display section 63. - Incidentally,
FIGS. 13 to 15 are diagrams showing an example of a screen in a case where the tracks shown in FIG. 8 and the group information shown in FIG. 10 are recorded on the recording media 12. -
FIG. 13 is a diagram showing an example of a menu screen. - As shown in
FIG. 13, the menu screen 100 includes a main part button 101, an audio button 102, a subtitle button 103, and a presentation button 104. - The
main part button 101 is selected to display a screen for selecting the track of an image of the American movie as a reproduction object. The audio button 102 is selected to display a screen for selecting a track of audio as a reproduction object. The subtitle button 103 is selected to display a screen for selecting a track of subtitles as a reproduction object. The presentation button 104 is selected to display a screen for selecting group information indicating a combination of tracks as reproduction objects. - When the user gives an instruction to select the
main part button 101 on the menu screen 100 of FIG. 13 by operating the user input section 52, the display section 63 displays a screen 110 shown in FIG. 14. - Specifically, the
control section 53 supplies the management information for the tracks of main images to the display information generating section 73 according to the instruction to select the main part button 101 supplied from the user input section 52. The display information generating section 73 effects display of a 2D reproduction button 111 indicating the contents of the track having the track ID “1” and a 3D reproduction button 112 indicating the contents of the track having the track ID “2,” on the basis of the information indicating the contents of the tracks, which is included in the management information for the tracks of the main images supplied from the control section 53. - The
2D reproduction button 111 is selected to set the track of the image of the 2D American movie, which has the track ID “1,” as a reproduction object. The 3D reproduction button 112 is selected to display the image of the 3D American movie with the track of the image of the 2D American movie, which has the track ID “1,” and the track of the conversion image of the American movie, which has the track ID “2,” as reproduction objects.
- Incidentally, though not shown in the figures, as in the case where the
main part button 101 is selected, when the audio button 102 or the subtitle button 103 is selected, buttons indicating the contents of the respective tracks of audio or secondary images are displayed on the basis of the information indicating the contents of those tracks, which is included in the management information for the tracks of the audio or the secondary images. - On the other hand, when the user gives an instruction to select the
presentation button 104 on the menu screen 100 of FIG. 13 by operating the user input section 52, the display section 63 displays a screen 120 shown in FIG. 15. - Specifically, the
control section 53 supplies the group information to the display information generating section 73 according to the instruction to select the presentation button 104 supplied from the user input section 52. On the basis of the metadata included in the group information supplied from the control section 53, the display information generating section 73 effects display of a “[2D] Dubbed in Japanese” button 121 corresponding to the metadata of the first group information, a “[2D] English Audio and Japanese Subtitles” button 122 corresponding to the metadata of the second group information, a “[3D] Dubbed in Japanese” button 123 corresponding to the metadata of the third group information, a “[3D] English Audio and Japanese Subtitles” button 124 corresponding to the metadata of the fourth group information, and a “[3D] English Audio and Japanese Subtitles 2” button 125 corresponding to the metadata of the fifth group information. - The “[2D] Dubbed in Japanese”
button 121 is selected to set the combination of the tracks indicated by the first group information, that is, the tracks having the track IDs “1” and “3,” as reproduction objects. The “[2D] English Audio and Japanese Subtitles” button 122 is selected to set the combination of the tracks indicated by the second group information, that is, the tracks having the track IDs “1,” “4,” and “5,” as reproduction objects. The “[3D] Dubbed in Japanese” button 123 is selected to set the combination of the tracks indicated by the third group information, that is, the tracks having the track IDs “1,” “2,” and “3,” as reproduction objects. - The “[3D] English Audio and Japanese Subtitles”
button 124 is selected to set the combination of the tracks indicated by the fourth group information, that is, the tracks having the track IDs “1,” “2,” “4,” and “6,” as reproduction objects. The “[3D] English Audio and Japanese Subtitles 2” button 125 is selected to set the combination of the tracks indicated by the fifth group information, that is, the tracks having the track IDs “1,” “2,” “4,” and “7,” as reproduction objects. - [Description of Process of Reproducing Device]
-
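The reproducing process described below begins, in step S31, by looking for a presentation track group box inside the movie box of the MP4 file. A minimal sketch of such a box scan follows; it relies only on the standard MP4 box layout (a 4-byte big-endian size covering the whole box, then a 4-byte type code), and uses `b"prtg"` as a placeholder four-character code for the presentation track group box, since no particular code is fixed here.

```python
import struct

def iter_boxes(data, offset=0, end=None):
    # Iterate (type, payload) over MP4 boxes: each box starts with a
    # 4-byte big-endian size (covering the whole box) and a 4-byte type.
    end = len(data) if end is None else end
    while offset + 8 <= end:
        size, = struct.unpack_from(">I", data, offset)
        box_type = data[offset + 4:offset + 8]
        yield box_type, data[offset + 8:offset + size]
        offset += size

def find_group_box(mp4_bytes, group_box_type=b"prtg"):
    # Step S31: look inside the movie box ('moov') for a presentation
    # track group box. 'prtg' is a placeholder four-character code, an
    # assumption made for this sketch only.
    for box_type, payload in iter_boxes(mp4_bytes):
        if box_type == b"moov":
            for inner_type, inner_payload in iter_boxes(payload):
                if inner_type == group_box_type:
                    return inner_payload
    return None
```

When `find_group_box` returns `None`, the process falls through to the default track IDs of step S34 described below.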
FIG. 16 is a flowchart of assistance in explaining a reproducing process by the reproducing device 50 of FIG. 12. This reproducing process is started when the user gives an instruction to reproduce the recording media 12 by operating the user input section 52, for example. - In step S30 in
FIG. 16, according to an instruction from the control section 53 corresponding to the reproducing instruction from the user, the readout section 61 reads an MP4 file recorded on the recording media 12, and supplies the MP4 file to the data decoding section 62. - In step S31, the
file analyzing section 71 analyzes the MP4 file supplied from the readout section 61, and determines whether there is a presentation track group box in a movie box of the MP4 file. When it is determined in step S31 that there is a presentation track group box, the file analyzing section 71 supplies the group information disposed in the presentation track group box to the control section 53. The file analyzing section 71 also supplies the control section 53 with the management information for each track, which is disposed in a box for each track in the movie box. - Then, in step S32, the
control section 53 selects predetermined group information from the group information supplied from the file analyzing section 71. - A first group information selecting method is, for example, a method of selecting the group information at a first position. In this case, the user of the recording device 10 (producer of the recording media 12) disposes the group information intended by the user of the
recording device 10 at the first position. Thereby, the group information intended by the user of the recording device 10 can be made to be selected at a time of reproduction of the recording media 12. - A second group information selecting method is, for example, a method of selecting group information including a track of audio for the language of a country in which the reproducing
device 50 is used. When this method is used, the control section 53 recognizes the track IDs included in each piece of group information, and recognizes the contents of the tracks having those track IDs from the management information for the tracks. The control section 53 then selects group information including the track ID of a track whose contents are audio for the language of the country in which the reproducing device 50 is used. Incidentally, when there are a plurality of such pieces of group information, the group information detected first is selected, for example. - A third group information selecting method is, for example, a method of selecting group information including a track of subtitles for the language of a country in which the reproducing
device 50 is used. When this method is used, the control section 53 recognizes the track IDs included in each piece of group information, and recognizes the contents of the tracks having those track IDs from the management information for the tracks. The control section 53 then selects group information including the track ID of a track whose contents are subtitles for the language of the country in which the reproducing device 50 is used. Incidentally, when there are a plurality of such pieces of group information, the group information detected first is selected, for example. - Incidentally, the language of the country in which the reproducing
device 50 is used, which is used by the second selecting method and the third selecting method, is specified by the user via the user input section 52, for example. - In step S33, the
control section 53 sets all the track IDs included in the group information selected in step S32 as the track IDs of the tracks as reproduction objects. Then, the control section 53 instructs the file analyzing section 71 to obtain the AV stream of the tracks having those track IDs, and advances the process to step S35. - On the other hand, when it is determined in step S31 that there is no presentation track group box, the
file analyzing section 71 supplies the management information for each track, which is disposed in the box for each track in the movie box, to the control section 53. - Then, in step S34, the
control section 53 sets predetermined track IDs, which are set in advance as initial values of track IDs, as the track IDs of the tracks as reproduction objects. The initial values of the track IDs are, for example, disposed in the movie box, and supplied to the control section 53 via the file analyzing section 71. After the process of step S34, the control section 53 instructs the file analyzing section 71 to obtain the AV stream of the tracks having the set track IDs, and then advances the process to step S35. - In step S35, according to the instruction from the
control section 53, the file analyzing section 71 analyzes the MP4 file, obtains the AV stream of the tracks as reproduction objects disposed in the real data box, and supplies the AV stream to the decoding section 72. - In step S36, the
decoding section 72 decodes the AV stream supplied from the file analyzing section 71 by a system corresponding to the coding system of the encoding section 32 in FIG. 1 under control of the control section 53. The decoding section 72 supplies the AV data obtained as a result of the decoding to the display information generating section 73. - In step S37, on the basis of the AV data supplied from the
decoding section 72, the display information generating section 73 effects display of a main image and a secondary image corresponding to the AV data on the display section 63, and outputs the audio corresponding to the AV data to the speaker 64. The process is then ended. -
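The group information selection of step S32 admits the three methods described above. The sketch below combines them into a single fallback cascade for illustration only (audio in the user's language first, then subtitles, then the first position); the `track_contents` mapping stands in for what the control section 53 derives from the management information for the tracks, and all names are assumptions rather than the device's actual implementation.

```python
# Illustrative combination of the three group-information selecting
# methods of step S32. `groups` is a list of dicts with "track_ids";
# `track_contents` maps a track ID to a (kind, language) pair derived
# from the management information. All names are assumptions.

def select_group(groups, track_contents, language):
    # Second method: first group containing an audio track for `language`.
    for group in groups:
        for tid in group["track_ids"]:
            if track_contents.get(tid) == ("audio", language):
                return group
    # Third method: first group containing subtitles for `language`.
    for group in groups:
        for tid in group["track_ids"]:
            if track_contents.get(tid) == ("subtitles", language):
                return group
    # First method: fall back to the group at the first position.
    return groups[0]
```

In step S33, every track ID in the returned group then becomes a reproduction object.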
FIG. 17 is a flowchart of assistance in explaining a track changing process for changing the tracks as reproduction objects, which is performed by the reproducing device 50 in FIG. 12. This track changing process is started when the user gives an instruction to display the menu screen by operating the user input section 52, for example. - Incidentally, while the track changing process in the case where the tracks shown in
FIG. 8 and the group information shown in FIG. 10 are recorded on the recording media 12 will be described with reference to FIG. 17, a similar process is performed also in cases where other tracks and other group information are recorded on the recording media 12, the only difference being the kinds of buttons displayed. - In step S51 in
FIG. 17, the display information generating section 73 effects display of the menu screen 100 on the display section 63 on the basis of the AV data of the menu screen supplied from the decoding section 72, according to an instruction from the control section 53 corresponding to the user's instruction to display the menu screen 100. - In step S52, the
control section 53 determines whether the presentation button 104 within the menu screen 100 is selected. When it is determined in step S52 that the presentation button 104 is selected, the control section 53 supplies the group information to the display information generating section 73. - Then, in step S53, the display
information generating section 73 effects display of the “[2D] Dubbed in Japanese” button 121, the “[2D] English Audio and Japanese Subtitles” button 122, the “[3D] Dubbed in Japanese” button 123, the “[3D] English Audio and Japanese Subtitles” button 124, and the “[3D] English Audio and Japanese Subtitles 2” button 125 in FIG. 15 on the basis of the metadata included in the group information supplied from the control section 53. - In step S54, the
control section 53 determines whether one of the buttons 121 to 125 displayed in step S53 is selected. - When it is determined in step S54 that none of the buttons 121 to 125 is selected, the control section 53 waits until one of them is selected. - When it is determined in step S54 that one of the buttons 121 to 125 is selected, the process proceeds to step S55. - In step S55, the
control section 53 changes the track IDs of the tracks set as reproduction objects to the track IDs included in the group information corresponding to whichever of the buttons 121 to 125 is selected. - That is, the control section 53 (selecting means) selects, as the combination of tracks as reproduction objects, the combination indicated by the group information corresponding to the selected one of the buttons 121 to 125. The process is then ended. - On the other hand, when it is determined in step S52 that the
presentation button 104 is not selected, the control section 53 in step S56 determines whether the main part button 101 is selected. - When it is determined in step S56 that the
main part button 101 is selected, the control section 53 supplies the management information for the tracks of main images to the display information generating section 73. - Then, in step S57, the display
information generating section 73 effects display of the 2D reproduction button 111 and the 3D reproduction button 112 in FIG. 14 on the basis of the information indicating the contents of the tracks of the main images, which is included in the management information for the tracks of the main images supplied from the control section 53. - In step S58, the
control section 53 determines whether one of the 2D reproduction button 111 and the 3D reproduction button 112 is selected. When it is determined in step S58 that neither of the 2D reproduction button 111 and the 3D reproduction button 112 is selected, the control section 53 waits until one of them is selected. - When it is determined in step S58 that one of the
2D reproduction button 111 and the 3D reproduction button 112 is selected, the process proceeds to step S59. - In step S59, the
control section 53 changes the track ID of the track of a main image set as a reproduction object to the track ID of the track of the main image corresponding to the 2D reproduction button 111 or the 3D reproduction button 112 that is selected. The process is then ended. - When it is determined in step S56 that the
main part button 101 is not selected, on the other hand, the control section 53 in step S60 determines whether the audio button 102 is selected. - When it is determined in step S60 that the
audio button 102 is selected, the control section 53 in step S61 supplies the management information for the tracks of audio to the display information generating section 73. - Then, in step S61, the display
information generating section 73 effects display of various buttons on the basis of the information indicating the contents of the tracks of the audio, which is included in the management information for the tracks of the audio supplied from the control section 53. - In step S62, the
control section 53 determines whether one of the buttons displayed in step S61 is selected. When it is determined in step S62 that none of the buttons displayed in step S61 is selected, the control section 53 waits until one of them is selected. - When it is determined in step S62 that one of the buttons displayed in step S61 is selected, the process proceeds to step S63. In step S63, the
control section 53 changes the track ID of the track of audio set as a reproduction object to the track ID of the track of audio corresponding to the selected button. The process is then ended. - When it is determined in step S60 that the
audio button 102 is not selected, on the other hand, the control section 53 in step S64 determines whether the subtitle button 103 is selected. - When it is determined in step S64 that the
subtitle button 103 is not selected, the process returns to step S52 to repeat the processes of steps S52, S56, S60, and S64 until the presentation button 104, the main part button 101, the audio button 102, or the subtitle button 103 is selected. - When it is determined in step S64 that the
subtitle button 103 is selected, on the other hand, the control section 53 in step S65 supplies the management information for the tracks of subtitles to the display information generating section 73. - Then, in step S65, the display
information generating section 73 effects display of various buttons on the basis of the information indicating the contents of the tracks of the subtitles, which is included in the management information for the tracks of the subtitles supplied from the control section 53. - In step S66, the
control section 53 determines whether one of the buttons displayed in step S65 is selected. When it is determined in step S66 that none of the buttons displayed in step S65 is selected, the control section 53 waits until one of them is selected. - When it is determined in step S66 that one of the buttons displayed in step S65 is selected, the process proceeds to step S67. In step S67, the
control section 53 changes the track ID of the track of subtitles set as a reproduction object to the track ID of the track of subtitles corresponding to the selected button. The process is then ended. - As described above, the reproducing
device 50 obtains an MP4 file having the group information disposed therein, and on the basis of the group information, displays the screen 120 for selecting group information indicating a combination of tracks as reproduction objects. Thus, merely by selecting the button corresponding to desired group information on the screen 120, the user can easily select the combination of tracks indicated by that group information as reproduction objects. - Incidentally, in the present embodiment, the MP4 file is recorded on the
recording media 12. However, the MP4 file may be transmitted via a predetermined network. - [Description of Computer To Which Present Technology Is Applied]
- Next, the series of processes described above can be not only performed by hardware but also performed by software. When the series of processes is performed by software, a program constituting the software is installed onto a general-purpose computer or the like.
-
FIG. 18 shows an example of configuration of one embodiment of a computer onto which the program for performing the series of processes described above is installed. - The program can be recorded in advance in a
storage section 208 or a ROM (Read Only Memory) 202 as a recording medium included in the computer. - Alternatively, the program can be stored (recorded) on
removable media 211. Suchremovable media 211 can be provided as so-called packaged software. In this case, theremovable media 211 include for example a flexible disk, a CD-ROM (Compact Disk Read Only Memory), an MO (Magneto Optical) disk, a DVD (Digital Versatile Disk), a magnetic disk, and a semiconductor memory. - Incidentally, the program can be not only installed from the
removable media 211 as described above onto the computer via adrive 210 but also downloaded to the computer via a communication network or a broadcasting network and installed into the built-instorage section 208. Specifically, the program can be for example transferred from a download site to the computer by radio via an artificial satellite for digital satellite broadcasting or transferred to the computer by wire via a network such as a LAN (Local Area Network), the Internet or the like. - The computer includes a CPU (Central Processing Unit) 201. The
CPU 201 is connected with an input-output interface 205 via abus 204. When a command is input to theCPU 201 via the input-output interface 205 by an operation of aninput section 206 by the user or the like, theCPU 201 executes the program stored in theROM 202 according to the command. Alternatively, theCPU 201 loads the program stored in thestorage section 208 into a RAM (Random Access Memory) 203, and then executes the program. - The
CPU 201 thereby performs the processes according to the above-described flowcharts or the processes performed by the configurations of the above-described block diagrams. Then, the CPU 201, for example, outputs a result of a process from an output section 207 via the input-output interface 205, transmits the result from a communicating section 209 via the input-output interface 205, or records the result in the storage section 208 via the input-output interface 205, as required.
input section 206 includes a keyboard, a mouse, a microphone, and the like. The input section 206 corresponds to the user input section 13 and the user input section 52. The output section 207 includes an LCD (Liquid Crystal Display), a speaker, and the like. The output section 207 corresponds to, for example, the display section 63 and the speaker 64. The recording media 12 may be inserted into the computer via a drive not shown in the figure, or may be included in the computer as a part of the storage section 208.
- In addition, the program may be processed by one computer (processor), or may be subjected to a distributed processing by a plurality of computers. Further, the program may be transferred to a remote computer to be executed.
- It is to be noted that in the present specification, the steps describing the program stored on a program recording medium include not only processes performed in time series in the described order but also processes not necessarily performed in time series but performed in parallel or individually.
- In addition, embodiments of the present disclosure are not limited to the foregoing embodiments, and various changes can be made without departing from the spirit of the present disclosure.
- The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-173600 filed in the Japan Patent Office on Aug. 2, 2010, the entire content of which is hereby incorporated by reference.
Claims (6)
1. A data generating device comprising:
circuitry configured to:
store a plurality of kinds of coded data, each kind of coded data corresponding to an individual track of audio/video (A/V) content; and
generate a plurality of pieces of group information and metadata corresponding to each of the plurality of pieces of group information, each piece of group information identifying a combination of the individual tracks of the A/V content, wherein
the individual tracks of the A/V content include at least a main image track, a secondary image track, a first audio track and a second audio track, and
the circuitry is configured to generate a first piece of group information identifying a combination of at least the main image track and the first audio track, and generate a second piece of group information identifying a combination of the main image track, the secondary image track and the second audio track.
2. The data generating device according to claim 1, wherein
the coded data forms part of an MP4 file, and
the plurality of pieces of group information are included in a movie box of the MP4 file.
3. The data generating device according to claim 1, wherein
the plurality of kinds of coded data are image data of a main image, image data of a secondary image, and audio data.
4. The data generating device according to claim 1, wherein
the plurality of kinds of coded data are image data of a three-dimensional main image, image data of a three-dimensional secondary image, and audio data.
5. A data generating method executed by a data generating device comprising:
storing a plurality of kinds of coded data, each kind of coded data corresponding to an individual track of audio/video (A/V) content; and
generating a plurality of pieces of group information and metadata corresponding to each of the plurality of pieces of group information, each piece of group information identifying a combination of the individual tracks of the A/V content, wherein
the individual tracks of the A/V content include at least a main image track, a secondary image track, a first audio track and a second audio track, and
the generating the plurality of pieces of group information includes generating a first piece of group information identifying a combination of at least the main image track and the first audio track, and generating a second piece of group information identifying a combination of the main image track, the secondary image track and the second audio track.
6. A non-transitory computer-readable medium including computer program instructions which, when executed by a data generating device, cause the data generating device to:
store a plurality of kinds of coded data, each kind of coded data corresponding to an individual track of audio/video (A/V) content; and
generate a plurality of pieces of group information and metadata corresponding to each of the plurality of pieces of group information, each piece of group information identifying a combination of the individual tracks of the A/V content, wherein
the individual tracks of the A/V content include at least a main image track, a secondary image track, a first audio track and a second audio track, and
the generating the plurality of pieces of group information includes generating a first piece of group information identifying a combination of at least the main image track and the first audio track, and generating a second piece of group information identifying a combination of the main image track, the secondary image track and the second audio track.
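The track-grouping scheme recited in the claims can be sketched in code. The following Python fragment is a hypothetical illustration only: the names `TrackGroup` and `build_group_info` and the numeric track identifiers are invented for this sketch and are not part of the patent or of the MP4 format. It generates two pieces of group information, one combining the main image track with the first audio track, and one combining the main image, secondary image and second audio tracks, each carrying metadata describing the combination.

```python
from dataclasses import dataclass, field

# Hypothetical track identifiers. In an actual MP4 file these would be the
# track IDs assigned in each track header box; the values here are invented.
MAIN_IMAGE, SECONDARY_IMAGE, AUDIO_1, AUDIO_2 = 1, 2, 3, 4

@dataclass
class TrackGroup:
    """One piece of group information: a combination of track IDs plus
    free-form metadata describing the combination."""
    track_ids: tuple
    metadata: dict = field(default_factory=dict)

def build_group_info():
    """Generate the two pieces of group information described in claim 1."""
    first = TrackGroup(
        track_ids=(MAIN_IMAGE, AUDIO_1),
        metadata={"description": "main image + first audio"},
    )
    second = TrackGroup(
        track_ids=(MAIN_IMAGE, SECONDARY_IMAGE, AUDIO_2),
        metadata={"description": "main image + secondary image + second audio"},
    )
    return [first, second]

groups = build_group_info()
# A playback device would select one group and decode only its member tracks.
```

Per claim 2, such group information would be carried in the movie box of the MP4 file, so a player can choose a track combination before decoding any media data.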
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/927,853 US20130287364A1 (en) | 2010-08-02 | 2013-06-26 | Data generating device and data generating method, and data processing device and data processing method |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010173600A JP5652642B2 (en) | 2010-08-02 | 2010-08-02 | Data generation apparatus, data generation method, data processing apparatus, and data processing method |
| JP2010-173600 | 2010-08-02 | ||
| US13/179,697 US8504591B2 (en) | 2010-08-02 | 2011-07-11 | Data generating device and data generating method, and data processing device and data processing method |
| US13/927,853 US20130287364A1 (en) | 2010-08-02 | 2013-06-26 | Data generating device and data generating method, and data processing device and data processing method |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/179,697 Continuation US8504591B2 (en) | 2010-08-02 | 2011-07-11 | Data generating device and data generating method, and data processing device and data processing method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20130287364A1 true US20130287364A1 (en) | 2013-10-31 |
Family
ID=44503581
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/179,697 Expired - Fee Related US8504591B2 (en) | 2010-08-02 | 2011-07-11 | Data generating device and data generating method, and data processing device and data processing method |
| US13/927,853 Abandoned US20130287364A1 (en) | 2010-08-02 | 2013-06-26 | Data generating device and data generating method, and data processing device and data processing method |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/179,697 Expired - Fee Related US8504591B2 (en) | 2010-08-02 | 2011-07-11 | Data generating device and data generating method, and data processing device and data processing method |
Country Status (4)
| Country | Link |
|---|---|
| US (2) | US8504591B2 (en) |
| EP (1) | EP2416321B1 (en) |
| JP (1) | JP5652642B2 (en) |
| CN (1) | CN102347046B (en) |
Families Citing this family (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103152607B (en) * | 2013-01-10 | 2016-10-12 | Shanghai Sihua Technology Co., Ltd. | Super-fast rough-cut method for video |
| CN114242082B (en) * | 2014-05-30 | 2025-11-04 | Sony Corporation | Information processing device and information processing method |
| KR20240065194A (en) * | 2014-06-30 | 2024-05-14 | Sony Group Corporation | Information processor and information-processing method |
| WO2016002513A1 (en) * | 2014-07-01 | 2016-01-07 | Sony Corporation | Information processing device and method |
| CN111951814B (en) * | 2014-09-04 | 2025-03-07 | Sony Corporation | Transmission device, transmission method, receiving device and receiving method |
| JP6724783B2 (en) * | 2014-09-12 | 2020-07-15 | Sony Corporation | Transmission device, transmission method, reception device, and reception method |
| CN113242448B (en) * | 2015-06-02 | 2023-07-14 | Sony Corporation | Sending device and method, media processing device and method, and receiving device |
Citations (55)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6128434A (en) * | 1993-10-29 | 2000-10-03 | Kabushiki Kaisha Toshiba | Multilingual recording medium and reproduction apparatus |
| US6360234B2 (en) * | 1997-08-14 | 2002-03-19 | Virage, Inc. | Video cataloger system with synchronized encoders |
| US6442333B1 (en) * | 1997-12-25 | 2002-08-27 | Pioneer Electronic Corporation | Information reproducing apparatus |
| US20020168179A1 (en) * | 2001-05-10 | 2002-11-14 | Shinichi Kikuchi | Digital recording/reproducing apparatus |
| US20030093790A1 (en) * | 2000-03-28 | 2003-05-15 | Logan James D. | Audio and video program recording, editing and playback systems using metadata |
| US6611655B1 (en) * | 1999-04-02 | 2003-08-26 | Matsushita Electric Industrial Co., Ltd. | Optical disc, recording device and reproducing device |
| US20030221014A1 (en) * | 2002-05-24 | 2003-11-27 | David Kosiba | Method for guaranteed delivery of multimedia content based on terminal capabilities |
| US20040146285A1 (en) * | 2002-05-28 | 2004-07-29 | Yoshinori Matsui | Moving picture data reproducing device with improved random access |
| US20040268386A1 (en) * | 2002-06-08 | 2004-12-30 | Gotuit Video, Inc. | Virtual DVD library |
| US20050084006A1 (en) * | 2003-10-16 | 2005-04-21 | Shawmin Lei | System and method for three-dimensional video coding |
| US20050152256A1 (en) * | 2004-01-08 | 2005-07-14 | Shinichi Kikuchi | Information recording medium, information recording method, information playback method, information recording apparatus, and information playback apparatus |
| US20050196148A1 (en) * | 2004-02-10 | 2005-09-08 | Seo Kang S. | Recording medium having a data structure for managing font information for text subtitles and recording and reproducing methods and apparatuses |
| US6970638B1 (en) * | 1999-06-22 | 2005-11-29 | Funai Electric Co., Ltd. | Recording medium reproducing apparatus |
| US20060093324A1 (en) * | 1996-04-04 | 2006-05-04 | Pioneer Electronic Corporation | Information record medium, apparatus for recording the same and apparatus for reproducing the same |
| US7062758B2 (en) * | 2001-12-04 | 2006-06-13 | Hitachi, Ltd. | File conversion method, file converting device, and file generating device |
| US20060127051A1 (en) * | 2004-06-16 | 2006-06-15 | Yasufumi Tsumagari | Information recording medium, information playback method, and information playback apparatus |
| US20060210245A1 (en) * | 2003-02-21 | 2006-09-21 | Mccrossan Joseph | Apparatus and method for simultaneously utilizing audio visual data |
| US20070050517A1 (en) * | 2005-08-24 | 2007-03-01 | Kazumi Doi | Content editing apparatus and content reproducing apparatus |
| US20070147770A1 (en) * | 2005-12-26 | 2007-06-28 | Kabushiki Kaisha Toshiba | Moving image reproducing apparatus |
| US20080063369A1 (en) * | 2006-04-17 | 2008-03-13 | Kim Kun S | Method and apparatus for reproducing data, recording medium, and method and apparatus for recording data |
| US20080109449A1 (en) * | 2004-09-15 | 2008-05-08 | Samsung Electronics Co., Ltd. | Information storage medium for storing metadata supporting multiple languages, and systems and methods of processing metadata |
| US20080129864A1 (en) * | 2006-12-01 | 2008-06-05 | General Instrument Corporation | Distribution of Closed Captioning From a Server to a Client Over a Home Network |
| US20080219641A1 (en) * | 2007-03-09 | 2008-09-11 | Barry Sandrew | Apparatus and method for synchronizing a secondary audio track to the audio track of a video source |
| US20080252719A1 (en) * | 2007-04-13 | 2008-10-16 | Samsung Electronics Co., Ltd. | Apparatus, method, and system for generating stereo-scopic image file based on media standards |
| US20080294691A1 (en) * | 2007-05-22 | 2008-11-27 | Sunplus Technology Co., Ltd. | Methods for generating and playing multimedia file and recording medium storing multimedia file |
| US20090024644A1 (en) * | 2004-10-13 | 2009-01-22 | Electronics And Telecommunications Research Institute | Extended Multimedia File Structure and Multimedia File Producting Method and Multimedia File Executing Method |
| US20090046597A1 (en) * | 2007-08-13 | 2009-02-19 | Samsung Electronics Co., Ltd. | Method and apparatus for generating and accessing metadata in media file format |
| US20090094113A1 (en) * | 2007-09-07 | 2009-04-09 | Digitalsmiths Corporation | Systems and Methods For Using Video Metadata to Associate Advertisements Therewith |
| US20090208190A1 (en) * | 2006-05-18 | 2009-08-20 | Pioneer Corporation | Information reproducing apparatus and method, managing apparatus and method, information reproducing system, and computer program |
| US7634727B2 (en) * | 2005-04-26 | 2009-12-15 | Microsoft Corporation | System for abstracting audio-video codecs |
| US20100015339A1 (en) * | 2008-03-07 | 2010-01-21 | Evonik Degussa Gmbh | Silane-containing corrosion protection coatings |
| US20100021125A1 (en) * | 2006-09-20 | 2010-01-28 | Claudio Ingrosso | Methods and apparatus for creation, distribution and presentation of polymorphic media |
| US20100042924A1 (en) * | 2006-10-19 | 2010-02-18 | Tae Hyeon Kim | Encoding method and apparatus and decoding method and apparatus |
| US20100098389A1 (en) * | 2007-03-22 | 2010-04-22 | Masaaki Shimada | Video reproducing apparatus and method |
| US20100124409A1 (en) * | 2008-11-14 | 2010-05-20 | Samsung Electronics Co., Ltd. | Method of selecting content reproducing apparatus and content reproducing apparatus selector |
| US20100138418A1 (en) * | 2008-11-28 | 2010-06-03 | Samsung Electronics Co., Ltd. | Method and apparatus for reproducing content by using metadata |
| US20100153395A1 (en) * | 2008-07-16 | 2010-06-17 | Nokia Corporation | Method and Apparatus For Track and Track Subset Grouping |
| US20100183278A1 (en) * | 2009-01-16 | 2010-07-22 | David Robert Black | Capturing and inserting closed captioning data in digital video |
| US20100310233A1 (en) * | 1998-06-24 | 2010-12-09 | Samsung Electronics, Co., Ltd. | Recording medium for storing information for still picture, recording and/or reproducing method and apparatus therefor |
| US20110020774A1 (en) * | 2009-07-24 | 2011-01-27 | Echostar Technologies L.L.C. | Systems and methods for facilitating foreign language instruction |
| US20110030031A1 (en) * | 2009-07-31 | 2011-02-03 | Paul Lussier | Systems and Methods for Receiving, Processing and Organizing of Content Including Video |
| US20110064146A1 (en) * | 2009-09-16 | 2011-03-17 | Qualcomm Incorporated | Media extractor tracks for file format track selection |
| US20110164859A1 (en) * | 2010-01-05 | 2011-07-07 | Kuang-Tsai Hao | Electronic audio/video story construction method |
| US7996871B2 (en) * | 2004-09-23 | 2011-08-09 | Thomson Licensing | Method and apparatus for using metadata for trick play mode |
| US7996449B2 (en) * | 2004-11-11 | 2011-08-09 | Samsung Electronics Co., Ltd. | Storage medium storing audio-visual data including metadata, reproducing apparatus, and method of searching for audio-visual data using the metadata |
| US8059941B2 (en) * | 2004-05-10 | 2011-11-15 | Via Technologies Inc. | Multiplex DVD player |
| US20120263438A1 (en) * | 2008-05-01 | 2012-10-18 | Mobitv, Inc. | Search system using media metadata tracks |
| US8331769B2 (en) * | 2002-09-12 | 2012-12-11 | Panasonic Corporation | Recording medium, playback device, program, playback method, and recording method |
| US8566393B2 (en) * | 2009-08-10 | 2013-10-22 | Seawell Networks Inc. | Methods and systems for scalable video chunking |
| US8631047B2 (en) * | 2010-06-15 | 2014-01-14 | Apple Inc. | Editing 3D video |
| US8655854B2 (en) * | 2010-07-27 | 2014-02-18 | Avid Technology, Inc. | Hierarchical multimedia program composition |
| US8719437B1 (en) * | 2009-08-13 | 2014-05-06 | Avvasi Inc. | Enabling streaming to a media player without native streaming support |
| US8782268B2 (en) * | 2010-07-20 | 2014-07-15 | Microsoft Corporation | Dynamic composition of media |
| US8838594B2 (en) * | 2006-12-27 | 2014-09-16 | International Business Machines Corporation | Automatic method to synchronize the time-line of video with audio feature quantity |
| US8966371B2 (en) * | 2006-09-11 | 2015-02-24 | Apple Inc. | Metadata for providing media content |
Family Cites Families (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE69624732T2 (en) * | 1995-08-21 | 2003-03-13 | Matsushita Electric Industrial Co., Ltd. | Device and method for reproducing optical disks which enable dynamic switching of reproduced data |
| KR100247345B1 (en) * | 1997-01-28 | 2000-03-15 | Yun Jong-yong | DVD audio disc reproducing apparatus and method |
| JPH11185463A (en) * | 1997-12-19 | 1999-07-09 | Sony Corp | Data recording medium and menu control method and apparatus |
| US7613727B2 (en) | 2002-02-25 | 2009-11-03 | Sony Corporation | Method and apparatus for supporting advanced coding formats in media files |
| AU2003213555B2 (en) | 2002-02-25 | 2008-04-10 | Sony Electronics, Inc. | Method and apparatus for supporting AVC in MP4 |
| EP1481555A1 (en) | 2002-02-25 | 2004-12-01 | Sony Electronics Inc. | Method and apparatus for supporting avc in mp4 |
| US20030163477A1 (en) | 2002-02-25 | 2003-08-28 | Visharam Mohammed Zubair | Method and apparatus for supporting advanced coding formats in media files |
| US7788277B2 (en) * | 2002-07-24 | 2010-08-31 | General Instrument Corporation | Methods and apparatus for rapid capture of program identifier data in a broadband transcoder multiplexer |
| JP3937223B2 (en) | 2003-01-21 | 2007-06-27 | Sony Corporation | Recording apparatus, reproducing apparatus, recording method, and reproducing method |
| KR100587324B1 (en) * | 2003-06-14 | 2006-06-08 | LG Electronics Inc. | Digital Multimedia Broadcasting Service Method, Transceiver, and Data Structure |
| KR101004505B1 (en) * | 2003-06-18 | 2010-12-31 | Panasonic Corporation | Playback device, recording medium, recording method, playback method |
| KR100939860B1 (en) * | 2004-06-18 | 2010-01-29 | Panasonic Corporation | Reproduction device, and reproduction method |
| JP5035583B2 (en) * | 2004-12-06 | 2012-09-26 | Sony Corporation | Recording apparatus and recording method, reproducing apparatus and reproducing method, recording/reproducing apparatus, recording/reproducing method, and program |
| CN101111894A (en) * | 2005-01-25 | 2008-01-23 | Nero AG | Method for preparing DVD-Video format data, method for reconstructing DVD-Video data and structure of DVD-Video data |
| JP4968506B2 (en) | 2005-03-04 | 2012-07-04 | Sony Corporation | Reproduction device, reproduction method, and program |
| JP5530720B2 (en) * | 2007-02-26 | 2014-06-25 | Dolby Laboratories Licensing Corporation | Speech enhancement method, apparatus, and computer-readable recording medium for entertainment audio |
| JP5121935B2 (en) * | 2007-10-13 | 2013-01-16 | Samsung Electronics Co., Ltd. | Apparatus and method for providing stereoscopic 3D video content for LASeR-based terminals |
| BRPI0818398B1 (en) * | 2007-10-19 | 2021-02-23 | Samsung Electronics Co., Ltd | three-dimensional image data storage method |
| KR101530713B1 (en) * | 2008-02-05 | 2015-06-23 | Samsung Electronics Co., Ltd. | Apparatus and method for generating and displaying video files |
| CN102355590B (en) * | 2008-09-30 | 2014-11-12 | Matsushita Electric Industrial Co., Ltd. | Recording medium, playback device, system LSI, playback method, glasses, and display device for 3D images |
| JP5369727B2 (en) | 2009-02-02 | 2013-12-18 | NEC Corporation | Operation status notification method |
| KR20100113266A (en) * | 2009-04-13 | 2010-10-21 | Samsung Electronics Co., Ltd. | Apparatus and method for manufacturing three-dimensional image message in electronic terminal |
- 2010
  - 2010-08-02 JP JP2010173600A patent/JP5652642B2/en not_active Expired - Fee Related
- 2011
  - 2011-07-07 EP EP11173118.8A patent/EP2416321B1/en active Active
  - 2011-07-11 US US13/179,697 patent/US8504591B2/en not_active Expired - Fee Related
  - 2011-07-26 CN CN201110209390.XA patent/CN102347046B/en not_active Expired - Fee Related
- 2013
  - 2013-06-26 US US13/927,853 patent/US20130287364A1/en not_active Abandoned
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10109285B2 (en) | 2014-09-08 | 2018-10-23 | Sony Corporation | Coding device and method, decoding device and method, and program |
| US10446160B2 (en) | 2014-09-08 | 2019-10-15 | Sony Corporation | Coding device and method, decoding device and method, and program |
| US10856042B2 (en) | 2014-09-30 | 2020-12-01 | Sony Corporation | Transmission apparatus, transmission method, reception apparatus and reception method for transmitting a plurality of types of audio data items |
| US11871078B2 (en) | 2014-09-30 | 2024-01-09 | Sony Corporation | Transmission method, reception apparatus and reception method for transmitting a plurality of types of audio data items |
| US12283282B2 (en) | 2014-09-30 | 2025-04-22 | Sony Group Corporation | Transmission apparatus, transmission method, reception apparatus and reception method for transmitting a plurality of types of audio data items |
| US10142757B2 (en) | 2014-10-16 | 2018-11-27 | Sony Corporation | Transmission device, transmission method, reception device, and reception method |
| US10475463B2 (en) | 2015-02-10 | 2019-11-12 | Sony Corporation | Transmission device, transmission method, reception device, and reception method for audio streams |
| US11051083B2 (en) * | 2017-09-15 | 2021-06-29 | Sony Corporation | Image processing apparatus and file generation apparatus |
| US11818406B2 (en) * | 2020-07-23 | 2023-11-14 | Western Digital Technologies, Inc. | Data storage server with on-demand media subtitles |
Also Published As
| Publication number | Publication date |
|---|---|
| CN102347046B (en) | 2016-08-17 |
| EP2416321B1 (en) | 2020-02-19 |
| US20120030253A1 (en) | 2012-02-02 |
| JP2012033243A (en) | 2012-02-16 |
| CN102347046A (en) | 2012-02-08 |
| JP5652642B2 (en) | 2015-01-14 |
| US8504591B2 (en) | 2013-08-06 |
| EP2416321A1 (en) | 2012-02-08 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8504591B2 (en) | | Data generating device and data generating method, and data processing device and data processing method |
| Li et al. | | Fundamentals of multimedia |
| US20210258514A1 (en) | | Multimedia Distribution System for Multimedia Files with Packed Frames |
| US9514783B2 (en) | | Video editing with connected high-resolution video camera and video cloud server |
| US12289483B2 (en) | | Encoding device and method, reproduction device and method, and program |
| JP6402632B2 (en) | | Data generation device, data generation method, data reproduction device, and data reproduction method |
| JP6402633B2 (en) | | File generation apparatus, file generation method, file reproduction apparatus, and file reproduction method |
| US20140099066A1 (en) | | Content processing apparatus for processing high resolution content and content processing method thereof |
| JP6617719B2 (en) | | Information processing apparatus, information recording medium, information processing method, and program |
| TWI630820B (en) | | File generation device, file generation method, file reproduction device, and file reproduction method |
| KR100963005B1 (en) | | How to create a file in accordance with the free point of view service |
| JPWO2020137854A1 (en) | | Information processing equipment and information processing method |
| CN112312219A (en) | | Streaming media video playing and generating method and equipment |
| US20230156257A1 (en) | | Information processing apparatus, information processing method, and storage medium |
| JP2022063882A (en) | | Information processing device and method, and reproduction device and method |
| Macbryde | | Edit without Tears with Final Cut Pro: Elevate your video editing skills with professional workflows and techniques |
| KR101512813B1 (en) | | computer-readable recording medium having a program recorded media and playing a media performing said method and reproduction method, and screen switching by recognition of image expression |
| KR20110126927A (en) | | Video equipment and thumbnail information processing method |
| CN102790897A (en) | | Apparatus and method for converting 2d content into 3d content, and computer-readable storage medium thereof |
| KR20130008244A (en) | | Image processing apparatus and control method thereof |
| JPWO2010041576A1 (en) | | Content distribution system |
| KR20160009921A (en) | | Method for sharing video file based on slide show |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |