WO2014002539A1 - Data processing apparatus, and method for data processing - Google Patents

Data processing apparatus, and method for data processing

Info

Publication number
WO2014002539A1
WO2014002539A1 (PCT/JP2013/057937, JP2013057937W)
Authority
WO
WIPO (PCT)
Prior art keywords
content
depth information
depth map
recording
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2013/057937
Other languages
English (en)
Inventor
Hiroshi Fujimoto
Tatsuhiro Nishioka
Akinori Komaki
Masaru Ohba
Megumi AIKAWA
Tsugutoyo Osaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Publication of WO2014002539A1 publication Critical patent/WO2014002539A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 - Image signal generators
    • H04N 13/261 - Image signal generators with monoscopic-to-stereoscopic image conversion
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/189 - Recording image signals; Reproducing recorded image signals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 - Image signal generators
    • H04N 13/271 - Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Definitions

  • Embodiments described herein relate generally to an information processing apparatus and information processing method.
  • a television which can display a 3D image (to be referred to as a stereoscopic TV hereinafter) displays a 3D image based on input 3D content.
  • the stereoscopic TV generates depth information based on input two-dimensional (2D) content, converts a 2D image into a 3D image based on this depth information, and outputs the converted 3D image.
  • a temporal limitation is set on generation of depth information to assure that processing is carried out in real time to maintain synchronization with content reproduction processing. Also, in terms of the performance of a CPU and a memory size, a resource limitation is set on generation of depth information. Therefore, it is not easy to generate depth information with high precision. Since various limitations are set on generation of depth information, it is not easy to provide a high-quality 3D image satisfactory to the user.
  • FIG. 1 is a schematic block diagram showing an example of the arrangement of an information processing apparatus according to an embodiment.
  • FIG. 2 is a flowchart showing an example of generation processing of various depth maps, recording processing, and reproduction processing of a 3D image based on various depth maps according to the embodiment.
  • FIG. 3 is a block diagram showing an example of the generation processing of various depth maps, the recording processing, and the reproduction processing of a 3D image based on various depth maps according to the embodiment.
  • FIG. 4 is a view showing a display example of a content recording list according to the embodiment.
  • an information processing apparatus includes a generator and a recorder.
  • the generator is configured to generate first depth information of content before reproduction of the content.
  • the recorder is configured to record the generated first depth information.
  • FIG. 1 is a schematic block diagram showing an example of the arrangement of an information processing apparatus according to an embodiment.
  • An information processing apparatus 100 generates depth information (a depth map) based on the resolution of a 2D image included in content and one or more depth estimation algorithms.
  • the information processing apparatus 100 includes a control unit 1, content acquisition unit 2, content recording unit 3, depth map generation unit 4, depth map recording unit 5, 3D image conversion unit 6, and 3D image display unit 7. Note that the content recording unit 3 and depth map recording unit 5 need not be independent units.
  • the content acquisition unit 2 may include a receiving unit which receives broadcast content.
  • the content acquisition unit 2 may include a communication unit which downloads network distributed content.
  • the content acquisition unit 2 may include a reading unit which reads package content from various media.
  • the 3D image display unit 7 is not an essential component; the apparatus 100 may include a 3D image output unit in place of the 3D image display unit 7.
  • when the content acquisition unit 2 of the information processing apparatus 100 receives broadcast content or network distributed content, the depth map generation unit 4 generates first depth information (to be referred to as a first high-quality depth map hereinafter), the content recording unit 3 records the content, and the depth map recording unit 5 records the first high-quality depth map.
  • the depth map generation unit 4 starts generation of the first depth map during the recording processing of the content.
  • the depth map generation unit 4 may execute calculation processing of the first depth map so as to end generation of the first depth map simultaneously with the end of recording of the content.
  • the depth map generation unit 4 may execute the calculation processing of the first depth map so that generation of the first depth map is continued for a predetermined period of time after the end of recording of the content.
  • the first high-quality depth map allows generation of a 3D image having a higher quality than third depth information (to be referred to as a standard depth map hereinafter), generation of which is started during reproduction of the content. Since generation of the first high-quality depth map is started before reproduction of the content and during recording of the content, the first high-quality depth map hardly suffers temporal and resource limitations in generation of the map compared to the standard depth map, which is generated in parallel with reproduction of the content. For this reason, the first high-quality depth map can be generated by calculations with a heavier load and by spending a longer time than the standard depth map. As a result, the first high-quality depth map allows generation of a 3D image with a higher quality than the standard depth map.
  • the information processing apparatus 100 generates the first high-quality depth map based on the content while recording the content, and records the generated first high-quality depth map.
  • when the content recording unit 3 decodes a 2D image included in the content during the recording processing, the depth map generation unit 4 can generate a first high-quality depth map corresponding to the 2D image using results of various kinds of image processing executed in the recording processing of the content. That is, compared to a case in which the recording processing and depth map generation processing are executed separately, the efficiency of the depth map generation processing can be improved by executing the recording processing and the depth map generation processing in parallel. Note that the content recording processing imposes a lighter processing load than the content reproduction processing, so the processing load is lighter than in a case in which the depth map (standard depth map) is generated while reproducing the content.
  • the information processing apparatus 100 may generate a second high-quality depth map based on the recorded content and may record the generated second high-quality depth map. The information processing apparatus 100 can generate the second high-quality depth map using parameters of higher levels than those (the resolution of the image and estimation algorithms) used at the time of generating the first high-quality depth map (an illustrative parameter sketch follows this list). For example, the depth map generation unit 4 can generate the second high-quality depth map during a designated time zone (for example, midnight), and the depth map recording unit 5 can record the second high-quality depth map during the designated time zone.
  • since generation of the second high-quality depth map is started before reproduction of the content and after recording of the content, the second high-quality depth map hardly suffers any resource limitation compared to the first high-quality depth map, generation of which is started during recording of the content. For this reason, the second high-quality depth map can be generated by calculations with a heavier load than the first high-quality depth map. As a result, the second high-quality depth map allows generation of a 3D image having a higher quality than the first high-quality depth map.
  • the second high-quality depth map may be recorded in place of the first high-quality depth map.
  • the information processing apparatus 100 may replace the first high-quality depth map with the second high-quality depth map without deleting the first high-quality depth map, or may delete the first high-quality depth map and replace it with the second high-quality depth map in terms of management.
  • the content recording unit 3 of the information processing apparatus 100 records content to be recorded. Note that the content recording unit 3 records the content together with content recording identification information (for example, an ID 1001).
  • the depth map generation unit 4 starts generation of a first high-quality depth map by real-time processing from a 2D image included in the content to be recorded, based on first depth estimation parameters (BLOCK 12), and the depth map recording unit 5 records the first high-quality depth map in association with the ID 1001. Furthermore, after the content and the first high-quality depth map are recorded, the depth map generation unit 4 starts generation of a second high-quality depth map from the 2D image included in the recorded content, based on second depth estimation parameters of higher levels than the first depth estimation parameters (BLOCKS 15 and 16), and the depth map recording unit 5 records the second high-quality depth map in association with the ID 1001. The depth map recording unit 5 records the second high-quality depth map in place of the first high-quality depth map (BLOCK 17); a sketch of this recording flow follows this list.
  • the information processing apparatus 100 starts generation of the first high-quality depth map corresponding to a 2D image included in the content during recording of the content before the user views the content, and records the generated first high-quality depth map. Furthermore, the information processing apparatus 100 starts generation of the second high-quality depth map corresponding to the 2D image included in the content before the user views the content and after recording of the content (after recording of the first high-quality depth map), and records the generated second high-quality depth map.
  • the information processing apparatus 100 receives a reproduction instruction of the content (that of a 3D image) from the user (YES in BLOCK 18), and the control unit 1 searches for a first high-quality depth map associated with the ID 1001 based on the ID 1001 of the content. For example, when the control unit 1 can detect the first high-quality depth map associated with the ID 1001 (but cannot find the second high-quality depth map) (YES in BLOCK 19), the depth map generation unit 4 reads the first high-quality depth map (BLOCK 20), the 3D image conversion unit 6 converts a 2D image included in the recorded content into a 3D image based on the first high-quality depth map (BLOCK 21), and the 3D image display unit 7 outputs (displays) the converted 3D image (BLOCK 22) (BLOCK 33 → BLOCK 34 → BLOCK 35 in FIG. 3).
  • when the control unit 1 can detect the second high-quality depth map associated with the ID 1001 based on the ID 1001 of the content (YES in BLOCK 19), the depth map generation unit 4 reads the second high-quality depth map (BLOCK 20), the 3D image conversion unit 6 converts a 2D image included in the content into a 3D image based on the second high-quality depth map (BLOCK 21), and the 3D image display unit 7 outputs (displays) the converted 3D image (BLOCK 22) (BLOCK 36 → ... in FIG. 3).
  • when the information processing apparatus 100 receives a reproduction instruction (that of a 3D image) of the content from the user (YES in BLOCK 18), if neither the first high-quality depth map nor the second high-quality depth map is recorded (NO in BLOCK 19), the depth map generation unit 4 generates a standard depth map (simpler than the first high-quality depth map) from a 2D image included in the content by real-time processing in parallel with the reproduction processing of the content, the 3D image conversion unit 6 converts the 2D image included in the content into a 3D image based on the standard depth map (BLOCK 21), and the 3D image display unit 7 outputs (displays) the converted 3D image (BLOCK 22); a sketch of this reproduction flow, including the fallback to the standard depth map, follows this list.
  • when the information processing apparatus 100 receives a reproduction instruction (that of a 2D image) of the content from the user (NO in BLOCK 18), it outputs (displays) a 2D image included in the content intact.
  • when the information processing apparatus 100 records the first and second high-quality depth maps, and receives, for example, a first high-quality reproduction instruction from the user, it may convert a 2D image included in the content into a 3D image based on the first high-quality depth map, and may output the converted 3D image. Alternatively, when the information processing apparatus 100 receives a second high-quality reproduction instruction from the user, it may convert a 2D image included in the content into a 3D image based on the second high-quality depth map, and may output the converted 3D image.
  • alternatively, when the information processing apparatus 100 records the first and second high-quality depth maps and receives, for example, a standard reproduction instruction from the user, it may convert a 2D image included in the content into a 3D image based on an already recorded standard depth map, and may output the converted 3D image.
  • content recording information may include first service information S1 (additional information) and second service information S2 (additional information), allowing the user to recognize whether or not content is ready to undergo high-quality reproduction, or to know the remaining time until content is ready to undergo high-quality reproduction.
  • the control unit 1 performs control to display the content recording list in response to a display instruction of the content recording list from the user, and the 3D image display unit 7 displays the content recording list in response to this control.
  • the first service information S1 indicates that a second high-quality depth map corresponding to content A is recorded. That is, this information indicates that content A can be reproduced as a high-quality 3D image.
  • the user can recognize that he or she can view content appended with the first service information S1 as a high-quality 3D image. For example, when the user inputs a reproduction instruction of content A from the content recording list, content A is reproduced as a 3D image processed using the second high-quality depth map.
  • the second service information S2 indicates a remaining time required until a second high-quality depth map corresponding to content B is recorded (or a date and time of completion of recording of the second high-quality depth map).
  • when the depth map generation unit 4 generates a second high-quality depth map corresponding to content B, it can detect a required generation time of the second high-quality depth map from a 2D image included in content B and parameters required for generation of the second high-quality depth map.
  • the control unit 1 calculates, from the detected required generation time, a remaining time until the second high-quality depth map is recorded (or the date and time of completion of recording of the second high-quality depth map), and performs control to display the content recording list including the second service information S2.
  • the 3D image display unit 7 displays the content recording list. For example, when the user inputs a reproduction reservation instruction of content B from the content recording list, and when content B is ready to be reproduced based on the second high-quality depth map, content B is reproduced as a 3D image processed using the second high-quality depth map; a sketch of assembling such a recording list entry follows this list.
  • a plurality of information processing apparatuses 100 may be inhibited from sharing the first or second high-quality depth map.
  • for example, upon recording content, information processing apparatus 100a records the content together with content recording identification information (for example, an ID 100a1). Also, information processing apparatus 100a associates the content and a first high-quality depth map with each other via this ID 100a1. Furthermore, information processing apparatus 100a associates the content and a second high-quality depth map with each other via this ID 100a1.
  • similarly, information processing apparatus 100b records the content together with content recording identification information (for example, an ID 100b1). Information processing apparatus 100b associates the content and a first high-quality depth map with each other via this ID 100b1. Furthermore, information processing apparatus 100b associates the content and a second high-quality depth map with each other via this ID 100b1.
  • even when information processing apparatuses 100a and 100b record the same content, information processing apparatus 100a associates the content and the first and second high-quality depth maps with each other via the ID 100a1, but information processing apparatus 100b associates the content and the first and second high-quality depth maps with each other via the ID 100b1.
  • even if information processing apparatus 100b receives the first or second high-quality depth map from information processing apparatus 100a while it records the content but does not record the first and second high-quality depth maps, since the content recorded in information processing apparatus 100b is managed using the ID 100b1, information processing apparatus 100b cannot use the first or second high-quality depth map managed using the ID 100a1; a sketch of this ID check follows this list.
  • the information processing apparatus 100 can receive broadcast content or network distributed content, can further receive a standard depth map, first high-quality depth map, or second high-quality depth map corresponding to the received content, can convert a 2D image included in the received content into a 3D image based on the received standard depth map, first high-quality depth map, or second high-quality depth map, and can display that 3D image.
  • the information processing apparatus 100 can record the received standard depth map, first high-quality depth map, or second high-quality depth map.
  • when temporary use of a depth map is permitted, the information processing apparatus 100 executes payment processing for use of the depth map, and also receives a standard depth map, first high-quality depth map, or second high-quality depth map. Also, when recording of a depth map is permitted by paying a fee higher than that required for temporary use of a depth map, the information processing apparatus 100 executes payment processing for depth map recording, and receives a standard depth map, first high-quality depth map, or second high-quality depth map.
  • the information processing apparatus 100 can receive a standard depth map, first high-quality depth map, or second high-quality depth map in advance before broadcast or distribution of content. Alternatively, the information processing apparatus 100 can receive a standard depth map, first high-quality depth map, or second high-quality depth map during broadcast or distribution of content. Alternatively, the information processing apparatus 100 can receive a standard depth map, first high-quality depth map, or second high-quality depth map after recording of content. In any case, since no depth map generation operation is required, no wait time for generation of a depth map arises.
  • the received content and the received depth map each include management information. The information processing apparatus 100 associates the received content and the received depth map with each other based on the management information of the received content and that of the received depth map, converts a 2D image included in the received content into a 3D image based on the received depth map, and displays the 3D image; a sketch of this association follows this list.
  • the information processing apparatus 100 can, for example, display a commercial included in the content as a 2D image based on this received depth map.
  • the information processing apparatus 100 can read package content from each of various media, can receive a standard depth map, first high-quality depth map, or second high-quality depth map corresponding to the read content, can convert a 2D image included in the package content into a 3D image based on the received depth map, and can display the converted 3D image, as described in the second embodiment.
  • the information processing apparatus 100 can receive content from an image site (for example, a movie site). The site can generate a standard depth map, first high-quality depth map, or second high-quality depth map corresponding to the content. The information processing apparatus 100 can receive the standard depth map, first high-quality depth map, or second high-quality depth map corresponding to the received content, can convert a 2D image included in the received content into a 3D image based on the received depth map, and can display the converted 3D image, as described in the second embodiment.
  • the information processing apparatus 100 generates a standard depth map, first high-quality depth map, or second high-quality depth map corresponding to a 2D image of content recorded in advance in an external medium, and manages an ID of the external medium and that of the generated depth map in association with each other. Alternatively, the information processing apparatus 100 manages an ID of the content recorded in the external medium and that of the generated depth map in association with each other.
  • upon reception of a reproduction instruction of the external medium from the user, the information processing apparatus 100 reads the generated depth map based on the ID of the external medium and that of the generated depth map, converts a 2D image of the content recorded in the external medium into a 3D image based on the read depth map, and displays the converted 3D image.
  • in response to a depth map generation instruction from the user, the information processing apparatus 100 generates a standard depth map, first high-quality depth map, or second high-quality depth map, so that the depth map corresponding to the content designated by the user can be registered.
  • the information processing apparatus 100 can avoid temporal and resource limitations, and can generate a depth map using a plurality of estimation algorithms at a resolution equivalent to a source image.
  • the user can view a more appropriate 3D image (with higher definition and closer to an actual image).
  • according to at least one embodiment described above, an information processing apparatus and an information processing method capable of providing a high-quality 3D image satisfactory to the user can be provided.
  • the various modules of the embodiments described herein can be implemented as software applications, hardware and/or software modules, or components on one or more computers, such as servers. While the various modules are illustrated separately, they may share some or all of the same underlying logic or code.
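
The following is a minimal, illustrative sketch (in Python) of the three depth-map tiers discussed above. The class name DepthEstimationParams, the parameter fields (resolution_scale, algorithms, frame_deadline_s), the algorithm names, and all numeric values are assumptions introduced for illustration; only the idea that the standard map is bound by a real-time deadline while the first and second high-quality maps may use progressively heavier settings comes from the description.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass(frozen=True)
class DepthEstimationParams:
    resolution_scale: float            # fraction of the source-image resolution
    algorithms: Tuple[str, ...]        # estimation algorithms combined per frame
    frame_deadline_s: Optional[float]  # per-frame deadline; None = no real-time limit


# Standard depth map: generated in parallel with reproduction, so it must keep
# up with playback and uses a reduced resolution and a single cheap algorithm.
STANDARD = DepthEstimationParams(0.25, ("motion_parallax",), 1.0 / 30.0)

# First high-quality depth map: generated during recording, so heavier settings
# are possible, although it still shares resources with the recording pipeline.
FIRST_HQ = DepthEstimationParams(0.5, ("motion_parallax", "edge_analysis"), None)

# Second high-quality depth map: generated after recording (for example, during
# a designated time zone such as midnight), free of temporal and resource limits.
SECOND_HQ = DepthEstimationParams(
    1.0, ("motion_parallax", "edge_analysis", "scene_classification"), None
)
```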
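
The recording-side flow (BLOCKS 11 to 17) could be organized roughly as below. The DepthMapStore class and the recorder and depth_generator interfaces are hypothetical stand-ins, not part of the patent; only the ordering of steps, including recording the second high-quality map in place of the first, follows the description.

```python
import threading


class DepthMapStore:
    """Keeps depth maps keyed by the content recording ID (e.g. "1001")."""

    def __init__(self):
        self._maps = {}  # content_id -> {"first_hq": ..., "second_hq": ...}

    def save(self, content_id, tier, depth_map):
        self._maps.setdefault(content_id, {})[tier] = depth_map

    def load(self, content_id, tier):
        return self._maps.get(content_id, {}).get(tier)

    def replace(self, content_id, old_tier, new_tier, depth_map):
        # Record the new map in place of the old one (cf. BLOCK 17).
        self._maps.setdefault(content_id, {}).pop(old_tier, None)
        self._maps[content_id][new_tier] = depth_map


def record_with_depth_maps(content_id, stream, recorder, depth_generator, store):
    # Record the content together with its recording ID ("ID 1001" in the text).
    recorder.start_recording(content_id, stream)

    # BLOCK 12: start generating the first high-quality map in parallel with
    # the recording processing, using the first depth estimation parameters.
    def build_first_hq():
        store.save(content_id, "first_hq",
                   depth_generator.generate(stream, params="first"))

    worker = threading.Thread(target=build_first_hq)
    worker.start()

    recorder.wait_until_finished(content_id)
    worker.join()

    # BLOCKS 15 and 16: after recording, regenerate from the recorded content
    # with higher-level parameters, then record the result in place of the
    # first high-quality map (BLOCK 17).
    second_hq = depth_generator.generate(recorder.read(content_id), params="second")
    store.replace(content_id, "first_hq", "second_hq", second_hq)
```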
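
The reproduction-side selection (BLOCKS 18 to 22) might then look like the following sketch, which reuses the hypothetical DepthMapStore from the previous sketch. The converter, display, and depth_generator objects are assumed interfaces; the preference order (second high-quality map, then first, then a standard map generated in real time) follows the description.

```python
def reproduce_as_3d(content_id, frames, store, depth_generator, converter, display):
    # BLOCKS 19 and 20: prefer the second high-quality map, then the first.
    depth_map = (store.load(content_id, "second_hq")
                 or store.load(content_id, "first_hq"))
    if depth_map is None:
        # Neither high-quality map is recorded: generate a standard (simpler)
        # depth map by real-time processing, in parallel with reproduction.
        depth_map = depth_generator.generate(frames, params="standard")
    # BLOCK 21: convert the recorded 2D image into a 3D image.
    stereo_frames = converter.to_3d(frames, depth_map)
    # BLOCK 22: output (display) the converted 3D image.
    display.show(stereo_frames)
```

The fallback ensures a 3D image is always available, while previously recorded high-quality maps are used whenever they exist.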
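
One possible way to assemble a content recording list entry carrying the first service information S1 or the second service information S2 is sketched below. The field names, the estimate_generation_time call, and the time formatting are assumptions; only the distinction between "high-quality reproduction is ready" (S1) and "remaining time until it is ready" (S2) comes from the description.

```python
from datetime import datetime, timedelta


def recording_list_entry(content_id, title, store, depth_generator):
    entry = {"id": content_id, "title": title}
    if store.load(content_id, "second_hq") is not None:
        # S1: the second high-quality depth map is recorded, so the content
        # can be reproduced as a high-quality 3D image right away.
        entry["service_info"] = "S1: high-quality 3D reproduction ready"
    else:
        # S2: estimate the remaining time until the second high-quality depth
        # map is recorded (or the date and time of completion of its recording).
        remaining = depth_generator.estimate_generation_time(content_id, params="second")
        ready_at = datetime.now() + timedelta(seconds=remaining)
        entry["service_info"] = f"S2: ready at {ready_at:%Y-%m-%d %H:%M}"
    return entry
```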
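
The per-apparatus ID management that prevents sharing of depth maps between apparatuses 100a and 100b could be checked as in the following sketch. The dictionary layout and the helper name are assumptions; the example IDs 100a1 and 100b1 follow the description.

```python
def can_use_received_depth_map(local_content_id, received_map):
    """A received depth map is usable only when it is managed under the same
    content recording ID as the locally recorded content."""
    return received_map["content_id"] == local_content_id


# Apparatus 100a manages its recording of the content under ID 100a1, while
# apparatus 100b manages its own recording of the same content under ID 100b1,
# so a map generated by 100a cannot be used by 100b.
assert can_use_received_depth_map("100a1", {"content_id": "100a1"})
assert not can_use_received_depth_map("100b1", {"content_id": "100a1"})
```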
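
Finally, the association of received content with a separately received depth map via management information (second embodiment) might be expressed as below. The dictionary keys management_info and data are assumptions; only the matching step itself follows the description.

```python
def attach_received_depth_map(received_content, received_depth_map):
    # Associate the received content and the received depth map with each other
    # based on their management information before 2D-to-3D conversion.
    if received_content["management_info"] != received_depth_map["management_info"]:
        raise ValueError("received depth map does not belong to this content")
    received_content["depth_map"] = received_depth_map["data"]
    return received_content
```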

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
PCT/JP2013/057937 2012-06-27 2013-03-13 Data processing apparatus, and method for data processing Ceased WO2014002539A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-144206 2012-06-27
JP2012144206A JP2014011474A (ja) 2012-06-27 Information processing apparatus and information processing method

Publications (1)

Publication Number Publication Date
WO2014002539A1 (fr) 2014-01-03

Family

ID=49782719

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/057937 Ceased WO2014002539A1 (fr) 2012-06-27 2013-03-13 Data processing apparatus, and method for data processing

Country Status (2)

Country Link
JP (1) JP2014011474A (fr)
WO (1) WO2014002539A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010068315A (ja) * 2008-09-11 2010-03-25 Mitsubishi Electric Corp 映像記録装置及び方法、並びに映像再生装置及び方法
WO2012029299A1 (fr) * 2010-08-31 2012-03-08 パナソニック株式会社 Dispositif de capture d'images, dispositif de lecture et procédé de traitement d'images
WO2012081332A1 (fr) * 2010-12-16 2012-06-21 シャープ株式会社 Dispositif, procédé et programme de traitement d'image

Also Published As

Publication number Publication date
JP2014011474A (ja) 2014-01-20

Similar Documents

Publication Publication Date Title
US20200322669A1 (en) Video recording of a display device
KR20050031870A (ko) 이미지 파일 콘테이너
JP2011142585A (ja) 画像処理装置、情報記録媒体、および画像処理方法、並びにプログラム
EP3422702B1 (fr) Dispositif de génération de fichiers, procédé de génération de fichiers, dispositif de reproduction et procédé de reproduction
EP3883250A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations
US8693687B2 (en) Method and apparatus of processing three-dimensional video content
US20120293616A1 (en) Apparatus and method for converting 2d content into 3d content, and computer-readable storage medium thereof
EP3185544A1 (fr) Appareil de traitement d'informations, support d'enregistrement d'informations, procédé de traitement d'informations, et programme
US20090317062A1 (en) Image processing method and apparatus
JP5424930B2 (ja) 画像編集装置およびその制御方法およびプログラム
EP3952275B1 (fr) Dispositif de génération de fichier, procédé de génération de fichier, dispositif de reproduction de fichier, procédé de reproduction de fichier et programme
EP3422731B1 (fr) Dispositif de génération de fichiers, procédé de génération de fichiers, dispositif de reproduction et procédé de reproduction
CN115361495B (zh) 实时拍摄信息的传输方法、管理方法、装置及系统
WO2014002539A1 (fr) Data processing apparatus, and method for data processing
EP3193335A1 (fr) Dispositif de traitement de l'information, procédé de traitement de l'information, programme et support d'enregistrement
US8879872B2 (en) Method and apparatus for restoring resolution of multi-view image
US12192332B2 (en) File processing device and file processing method including file that stores encrypted image
US11431957B2 (en) File generation apparatus, file generation method, processing apparatus, and non-transitory computer-readable storage medium
US20130314400A1 (en) Three-dimensional image generating apparatus and three-dimensional image generating method
US20180070048A1 (en) Management of media content on a storage medium
KR20180104694A (ko) 다수 사용자들 간의 미디어 콘텐츠 공유 방법 및 시스템
EP3319089B1 (fr) Dispositif de traitement d'informations, système de traitement d'informations, procédé de traitement d'informations, et programme
EP4632612A1 (fr) Dispositif et procédé de traitement d'image
Kim et al. A Study on Identification of the Source of Videos Recorded by Smartphones
US11122252B2 (en) Image processing device, display device, information recording medium, image processing method, and program for virtual reality content

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13808633

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13808633

Country of ref document: EP

Kind code of ref document: A1