WO2005101819A1 - Display device - Google Patents
Display device
- Publication number
- WO2005101819A1 (PCT/JP2004/005218, application JP2004005218W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- frame
- correction
- video
- target area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/20—Circuitry for controlling amplitude response
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44012—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N5/45—Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
Definitions
- the present invention relates to a display device for displaying a composite image obtained by combining monomedia such as moving images, characters / graphics, and still images.
- a conventional display device that displays a composite image obtained by combining monomedia such as moving images, characters/graphics, and still images is disclosed, for example, in JP-A-2001-175239.
- in that device, the display attributes of non-attention windows are changed, for example by lowering the luminance or frame rate of non-attention windows or reducing the window size, to provide a composite video display that is easy for viewers to see.
- the present invention has been made in order to solve the above-described problem.
- An area having a difference between frames is set as a window of interest, and high-quality image processing is performed on the window of interest.
- An object of the present invention is to provide a display device that realizes high-quality display in accordance with the intention of the person who gave the window presentation instruction for the entire screen. Disclosure of the Invention
- a display device according to the present invention comprises: video presentation means for inputting a plurality of monomedia data and presentation style data describing the presentation style of a frame of each monomedia data, generating scaling/synthesis control information for synthesizing the monomedia data, and synthesizing each monomedia data to produce a composite video frame; high image quality processing means for obtaining, based on the scaling/synthesis control information, the correction target area of predetermined monomedia data in the composite video frame, obtaining the inter-frame difference in this correction target area, generating correction data, and performing high image quality processing on the correction target area using the generated correction data to generate a display video frame; and video display means for displaying the generated display video frame.
- FIG. 1 is a block diagram showing a configuration of a display device according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram showing the internal configuration of the image presentation means of the display device according to Embodiment 1 of the present invention.
- FIG. 3 is a block diagram showing the internal configuration of the high-quality image display means of the display device according to Embodiment 1 of the present invention.
- FIG. 4 is a diagram showing an example of a synthesized video frame synthesized by the video presentation means of the display device according to the first embodiment of the present invention.
- FIG. 5 is a diagram showing an example of a frame output by the video display means of the display device according to Embodiment 1 of the present invention. BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 is a block diagram showing a configuration of a display device according to Embodiment 1 of the present invention.
- the display device 1 includes a video presenting unit 10, an image quality improving unit 20, and a video display unit 30.
- the video presenting means 10 receives video data 101 and data broadcasting service data 102 from a digital broadcasting service center, inputs a plurality of monomedia data such as moving image data, character/graphic data, and still image data together with presentation style data describing the presentation style of a frame of each monomedia data, generates scaling/synthesis control information 111 for synthesizing the monomedia data, and synthesizes each monomedia data to generate a composite video frame 103. Based on the scaling/synthesis control information 111, the image quality improving means 20 determines the correction target area of the predetermined monomedia data in the composite video frame 103, obtains the inter-frame difference in the correction target area, generates correction data, and performs high image quality processing on the correction target area using the correction data to generate a display video frame 104.
- the video display means 30 displays the display video frame 104 output from the image quality improving means 20 on a video display panel or the like.
- FIG. 2 is a block diagram showing the internal configuration of the video presenting means 10.
- the video presenting means 10 includes a data broadcasting browser 11, a moving picture plane buffer 14, a character/graphic plane buffer 15, a still picture plane buffer 16, and scaling/synthesizing means 17.
- the data broadcasting browser 11 has graphics reproducing means 12 and style analyzing means 13.
- the data broadcasting browser 11 separates the monomedia data, such as character/graphic data and still image data, contained in the input data broadcasting service data 102 from the presentation style data that describes the presentation style of the frame, reproduces the separated monomedia, analyzes the separated presentation style data, and generates scaling/synthesis control information 111 indicating the scaling/synthesis method for each piece of monomedia data such as video data, character/graphic data, and still image data.
- the graphics reproducing means 12 is built in the data broadcasting browser 11, and reproduces mono-media data such as character / graphic data and still image data contained in the data broadcasting service data 102.
- the style analysis means 13 is provided in the data broadcasting browser 11, analyzes the presentation style data describing the presentation style of the frame included in the data broadcasting service data 102, and generates scaling/synthesis control information 111 indicating the scaling/synthesis method for each piece of monomedia data such as video data, character/graphic data, and still image data.
- the moving picture plane buffer 14 stores the input video data 101.
- FIG. 3 is a block diagram showing the internal configuration of the image quality improving means 20.
- the image quality improving means 20 includes correction area management means 21, encoding means 22, a delay frame buffer 23, previous frame decoding means 24, current frame decoding means 25, correction data generating means 26, and image correction means 27.
- the correction area management means 21 receives the scaling/synthesis control information 111 for each monomedia from the video presenting means 10, obtains from it the correction target area of the predetermined monomedia data in the composite video frame 103, determines the necessary minimum compression ratio for that correction target area, and generates correction target area/compression ratio information 112.
- the encoding means 22 inputs the composite video frame 103 and encodes the correction target area specified by the correction target area/compression ratio information 112 from the correction area management means 21 at the specified compression ratio.
- the delay frame buffer 23 stores the encoded data from the encoding means 22 and delays it by one frame. The memory capacity of the delay frame buffer 23 is assumed to be less than one frame of the composite video frame 103, for example only 1/4 of a frame. This is because the amount of data for one frame of the composite video frame 103 is very large, and a huge memory capacity would be required to store a full frame.
- the previous frame decoding means 24 decodes the encoded data, delayed by one frame, stored in the delay frame buffer 23 at the compression ratio indicated by the correction target area/compression ratio information 112 from the correction area management means 21.
- the current frame decoding means 25 decodes the encoded data from the encoding means 22 at the compression ratio specified by the correction target area/compression ratio information 112 from the correction area management means 21.
- the correction data generation means 26 compares, for the correction target area designated by the correction target area/compression ratio information 112 from the correction area management means 21, the decoded data from the previous frame decoding means 24 with the decoded data from the current frame decoding means 25 to determine the inter-frame difference, and generates correction data corresponding to the calculated inter-frame difference.
- the image correction means 27 corrects the correction target area in the composite video frame 103, specified by the correction target area/compression ratio information 112 from the correction area management means 21, with the correction data from the correction data generation means 26 to enhance the image quality, and generates a display video frame 104 that is output to the video display means 30.
- the previous frame decoding means 24 and the current frame decoding means 25 are equipped with the same processing functions.
- the video data 101 for one frame transmitted from the digital broadcasting service center is stored in the video plane buffer 14 of the video presenting means 10.
- the data broadcasting browser 11 of the video presenting means 10 inputs the data broadcasting service data 102 transmitted from the digital broadcasting service center and separates the monomedia data, such as character/graphic data and still image data, from the presentation style data that describes the presentation style of the frame.
- in the presentation style data, scale information, arrangement information, text information such as character colors and fonts, superimposition information of still images, and the like for presenting each piece of monomedia data in a frame are described in text form.
- the graphics reproducing means 12 reproduces the separated character/graphic data and still image data monomedia, and stores them in the character/graphic plane buffer 15 and the still image plane buffer 16, respectively.
- the style analysis means 13 analyzes the presentation style data in the data broadcasting service data 102 and generates, for each piece of monomedia data such as video data, character/graphic data, and still image data, scaling/synthesis control information 111 indicating the scaling/synthesis method. In the scaling/synthesis control information 111, scale information, arrangement information, synthesis information, and the like for presenting each piece of monomedia data are described in the data structure of the display device 1.
- the scaling/synthesizing means 17, based on the scaling/synthesis control information 111 from the style analysis means 13, scales and combines the moving image data, character/graphic data, and still image data stored in the moving picture plane buffer 14, the character/graphic plane buffer 15, and the still image plane buffer 16, respectively, to generate the composite video frame 103.
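The scale-and-paste operation performed by the scaling/synthesizing means can be sketched as follows. This is a toy illustration only: nearest-neighbour scaling, the `compose` function name, and the (y, x, h, w) control-record layout are assumptions, not the patented implementation.

```python
# Toy sketch of scaling/synthesis: each monomedia plane is scaled and
# placed onto the composite frame according to its control information.
import numpy as np

def compose(frame_shape, planes):
    """planes: list of (plane, (y, x, h, w)) - scale `plane` to h x w with
    nearest-neighbour sampling and paste it at (y, x) on the composite frame."""
    frame = np.zeros(frame_shape, dtype=np.uint8)
    for plane, (y, x, h, w) in planes:
        ys = np.arange(h) * plane.shape[0] // h  # source rows to sample
        xs = np.arange(w) * plane.shape[1] // w  # source columns to sample
        frame[y:y + h, x:x + w] = plane[np.ix_(ys, xs)]
    return frame

# e.g. a moving-image plane in the upper-left and a still-image strip below it
video = np.full((16, 16), 200, dtype=np.uint8)
still = np.full((8, 8), 50, dtype=np.uint8)
out = compose((32, 32), [(video, (0, 0, 24, 24)), (still, (24, 0, 8, 32))])
print(out[0, 0], out[30, 5])  # 200 50
```

A real receiver would composite full-colour planes with transparency; the single-channel layout here only illustrates the per-plane scale-then-place flow.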
- FIG. 4 is a diagram showing an example of a composite video frame 103 synthesized by the scaling/synthesizing means 17 of the video presenting means 10.
- the video presenting means 10 repeatedly performs the above processing for each frame of the input video data 101 to generate a composite video frame 103.
- it is assumed that which monomedia data area of the composite video frame 103 input to the image quality improving means 20 is to be treated as a correction target is set in advance in the correction area management means 21. For example, an area of monomedia data in which there is an inter-frame difference between successive frames, such as moving image data, is set as a correction target. The correction area management means 21 then inputs the scaling/synthesis control information 111 for each piece of monomedia data from the video presenting means 10; when the scaling/synthesis control information 111 for the monomedia data set as the correction target is input, it obtains the correction target area of that correction target, calculates the necessary minimum compression ratio from the obtained correction target area and the memory capacity of the delay frame buffer 23, generates the correction target area/compression ratio information 112, and notifies the encoding means 22, previous frame decoding means 24, current frame decoding means 25, correction data generation means 26, and video correction means 27.
- the calculation of the compression ratio will be described concretely using an example in which the memory capacity of the delay frame buffer 23 is 1/4 of one frame of the composite video frame 103.
- if the area to be corrected is an entire frame, the compression ratio for that frame is 1/4; that is, an area of two pixels vertically by two pixels horizontally (hereinafter written as 2 × 2 pixels, etc.) is compression-coded down to the capacity of 1 × 1 pixel.
- if the area to be corrected is 1/4 of one frame, the compression ratio for it is 1, which means the data can be treated as uncompressed.
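The rule above can be sketched as a small function; the function name and the example frame size are illustrative assumptions, while the buffer fraction of 1/4 matches the worked example.

```python
# Minimal sketch of the minimum-compression-ratio rule: the correction
# target area must fit into a delay frame buffer holding only a fixed
# fraction of one composite video frame.
def minimum_compression_ratio(area_pixels: int, frame_pixels: int,
                              buffer_fraction: float = 0.25) -> float:
    """Return the least-lossy compression ratio that lets the correction
    target area fit into the buffer (1.0 means it fits uncompressed)."""
    buffer_pixels = frame_pixels * buffer_fraction
    ratio = buffer_pixels / area_pixels
    return min(ratio, 1.0)  # never "compress" to more than the original size

frame = 720 * 480  # example frame size (an assumption, not from the patent)
print(minimum_compression_ratio(frame, frame))       # whole frame -> 0.25
print(minimum_compression_ratio(frame // 4, frame))  # quarter frame -> 1.0
```

A smaller correction target area thus allows a weaker (less lossy) compression, which is the basis of the quality argument made later in the text.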
- the encoding means 22 of the image quality improving means 20 receives the first composite video frame 103 from the video presenting means 10, cuts it out, for example, every 8 × 8 bits, performs fixed-length encoding on the correction target area indicated by the correction target area/compression ratio information 112 from the correction area management means 21 at the specified compression ratio, and stores the first encoded data in the delay frame buffer 23.
- the encoding means 22 repeatedly executes this fixed-length encoding of, for example, 8 × 8 bits, and stores in the delay frame buffer 23 the first encoded data for the correction target area of one full frame of the first composite video frame 103.
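The block-wise fixed-length encoding loop described above can be sketched as follows. The toy "keep the top-left corner" encoder merely stands in for whatever fixed-rate code an implementation would use; the block size of 8 follows the text, while all names here are assumptions.

```python
# Illustrative sketch (not the patented coder): cut the correction target
# area of a frame into fixed-size blocks and store one fixed-length code
# per block in the one-frame delay buffer.
import numpy as np

BLOCK = 8  # the text cuts the frame out "every 8 x 8"

def encode_block(block: np.ndarray, keep: int) -> np.ndarray:
    # Toy fixed-length "encoder": keep only the top-left keep x keep
    # samples (a stand-in for any fixed-rate code).
    return block[:keep, :keep].copy()

def encode_correction_area(frame: np.ndarray, area, ratio: float):
    """Encode every BLOCK x BLOCK block inside area=(y0, x0, y1, x1)
    at a fixed rate derived from `ratio`; return the delay-buffer list."""
    keep = max(1, int(BLOCK * ratio ** 0.5))  # block side after compression
    y0, x0, y1, x1 = area
    buffer = []
    for y in range(y0, y1, BLOCK):
        for x in range(x0, x1, BLOCK):
            buffer.append(encode_block(frame[y:y + BLOCK, x:x + BLOCK], keep))
    return buffer

frame = np.zeros((32, 32), dtype=np.uint8)
buf = encode_correction_area(frame, (0, 0, 16, 16), ratio=0.25)
print(len(buf), buf[0].shape)  # 4 blocks, each reduced to 4 x 4 samples
```

With ratio 1 (correction area of 1/4 frame, per the example above) each block is stored whole; with ratio 1/4 each 2 × 2 region effectively collapses to one sample, matching the worked example.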
- the first synthesized video frame 103 from the video presentation means 10 is also input to the video correction means 27.
- the video correction means 27 outputs the input first composite video frame 103 to the video display means 30 as the display video frame 104 without modification, and the video display means 30 displays this uncorrected display video frame 104.
- next, the case where the second composite video frame 103 is input as the second frame from the video presenting means 10 will be described.
- the previous frame decoding means 24 takes out the first encoded data stored in the delay frame buffer 23, performs fixed-length decoding at the compression ratio specified by the correction target area/compression ratio information 112 from the correction area management means 21, and outputs the first decoded data of, for example, 8 × 8 bits to the correction data generating means 26.
- the encoding means 22 inputs the second composite video frame 103, which is the second frame from the video presentation means 10, cuts it out, for example, every 8 × 8 bits, performs fixed-length encoding on the correction target area specified by the correction target area/compression ratio information 112 from the correction area management means 21 at the specified compression ratio, stores the second encoded data in the empty area of the delay frame buffer 23, and also outputs the second encoded data to the current frame decoding means 25.
- the current frame decoding means 25 performs fixed-length decoding of the second encoded data at the compression ratio indicated by the correction target area/compression ratio information 112 from the correction area management means 21, and outputs the 8 × 8-bit second decoded data to the correction data generating means 26.
- the correction data generating means 26 compares, for example, the 8 × 8-bit first decoded data output from the previous frame decoding means 24 with the 8 × 8-bit second decoded data output from the current frame decoding means 25 to obtain the inter-frame difference, and generates optimal correction data from the obtained inter-frame difference.
- for example, when the correction data generation means 26 is configured to generate correction data specialized for gradation, it generates correction data that optimizes the gradation value of the target pixel in the correction target area in accordance with the calculated inter-frame difference.
- the correction data generating means 26 generates correction data relating to values of display attributes such as gradation and luminance obtained from the inter-frame difference of the correction target area.
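The inter-frame-difference step can be sketched as follows. The 0.5 gain is an arbitrary assumption, since the patent only states that correction data is generated corresponding to the difference; gradation correction of this general shape is used in, for example, LCD overdrive circuits.

```python
# Minimal sketch of the difference step: compare the decoded previous and
# current blocks and emit a per-pixel gradation correction proportional
# to the inter-frame difference (zero where the two frames match).
import numpy as np

def correction_data(prev_block: np.ndarray, cur_block: np.ndarray,
                    gain: float = 0.5) -> np.ndarray:
    """Return gradation correction values for one block, scaled from the
    inter-frame difference (the gain here is an illustrative assumption)."""
    diff = cur_block.astype(np.int16) - prev_block.astype(np.int16)
    return (gain * diff).astype(np.int16)

prev = np.full((8, 8), 100, dtype=np.uint8)  # previous-frame decoded block
cur = np.full((8, 8), 140, dtype=np.uint8)   # current-frame decoded block
corr = correction_data(prev, cur)
print(int(corr[0, 0]))  # 20: half of the 40-level gradation jump
```

Static areas (zero difference) thus get zero correction, which is why only areas with an inter-frame difference need to pass through the encode/decode path at all.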
- the decoding of the first encoded data stored in the delay frame buffer 23 by the previous frame decoding means 24, the decoding of the second encoded data from the encoding means 22 by the current frame decoding means 25, and the correction data generation by the correction data generation means 26 are repeatedly executed for every 8 × 8 bits, and the correction data generation means 26 generates the correction data for one frame and outputs it to the image correction means 27.
- the image correction means 27 corrects the correction target area in the second composite video frame, designated by the correction target area/compression ratio information 112 from the correction area management means 21, with the correction data from the correction data generation means 26 to enhance the image quality, and generates a display video frame 104 that is output to the video display means 30.
- the video display means 30 displays the corrected display video frame 104 output from the image quality improving means 20, thereby realizing high-quality image display.
- in a digital broadcasting data broadcasting service, for example, as shown in FIG. 4, moving image, still image, and character/graphic areas are broadcast, received, and reproduced as digital information, and are arranged on one screen and synthesized by the video presenting means 10, which includes the data broadcasting browser 11, to improve the broadcasting service.
- in digital broadcasting, instead of synthesizing and broadcasting frames at a broadcasting station as in analog broadcasting, monomedia data individually broadcast from the digital broadcasting service center is decoded individually on the receiving side and composed on a single screen by scaling/synthesis.
- moving image data is set as a correction target in the correction area management means 21.
- the character/graphic data may also be set as a correction target.
- the presentation style data may be included in the content, as in a digital broadcasting data broadcasting service, or the presentation style of the frame may be determined by the instruction of a user of, for example, a personal computer; in the latter case, the presentation style data may be generated in the display device 1 or the personal computer and provided to the style analysis means 13.
- when the presentation style of the frame is changed, the style analysis means 13 of the data broadcast browser 11 changes the scaling/synthesis control information 111.
- alternatively, the user gives instructions to the style analysis means 13 by operating a personal computer or the like to change the scaling/synthesis control information 111.
- the style analysis means 13 of the video presentation means 10 analyzes the presentation style data when the changed presentation style data is acquired, and generates changed scaling/synthesis control information 111 for each piece of monomedia data. Further, the correction area management means 21 of the image quality improving means 20 receives the changed scaling/synthesis control information 111 and generates changed correction target area/compression ratio information 112. The image quality improving means 20 then performs the high image quality processing on the composite video frame 103 in the same manner as described above, so that high image quality processing corresponding to the change of the presentation style can be performed.
- the frame may transition to a full-screen video display frame.
- FIG. 5 is a diagram showing an example of a frame output from the scaling/synthesizing means 17 of the video presenting means 10, which is a frame of data broadcast display OFF / full-screen moving image display.
- when the display changes significantly in this way, the concept of the high image quality processing described above would require executing the processing over the entire screen; however, the viewer cannot perceive a slight improvement in image quality at such a transition, so the processing is unnecessary.
- the correction area management means 21 regenerates the correction target area/compression ratio information 112 when the changed scaling/synthesis control information 111 is input, compares the correction target area/compression ratio information 112 before the change with the newly generated correction target area/compression ratio information 112 for the current frame, and, when there is a significant change, outputs the new correction target area/compression ratio information 112 to each component in the image quality improving means 20. When the correction data generation means 26 detects from the correction target area/compression ratio information 112 before and after the change that the presentation style of the frame has changed significantly, it does not perform the inter-frame difference detection from the previous frame or the correction data generation for the frame corresponding to the changed correction target area/compression ratio information 112.
- in this case, the correction data generation means 26 does not perform the inter-frame difference detection processing or the correction data generation processing; any configuration may be used as long as the high image quality processing is not carried out. For example, if there is no previous frame, such as at the beginning of display, the encoded data stored in the delay frame buffer 23 cannot be used, so a signal indicating that this is the first input frame may be prepared and used together with the scaling/synthesis control information 111.
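The skip decision described above can be sketched as follows. The area-overlap threshold used here to decide what counts as a "significant change", and the use of `None` for a missing first frame, are assumptions for illustration.

```python
# Sketch of the skip decision: when the correction target area changes
# significantly between the old and new correction-target-area/compression-
# ratio information (or there is no previous frame at all), the inter-frame
# difference and correction data are not generated for that frame.
def should_skip_correction(old_area, new_area, min_overlap: float = 0.5) -> bool:
    """Areas are (y0, x0, y1, x1); skip correction when the new area
    overlaps the old one by less than `min_overlap` of its own size."""
    if old_area is None:  # first frame: no previous data to diff against
        return True
    y0 = max(old_area[0], new_area[0]); x0 = max(old_area[1], new_area[1])
    y1 = min(old_area[2], new_area[2]); x1 = min(old_area[3], new_area[3])
    inter = max(0, y1 - y0) * max(0, x1 - x0)
    new_size = (new_area[2] - new_area[0]) * (new_area[3] - new_area[1])
    return inter / new_size < min_overlap

print(should_skip_correction(None, (0, 0, 240, 320)))  # first frame: True
# data-broadcast window expanding to full screen: overlap ~22%, so skip
print(should_skip_correction((0, 0, 240, 320), (0, 0, 480, 720)))
```

With identical areas before and after, the overlap is 100% and correction proceeds as usual; a transition like FIG. 5's full-screen switch trips the threshold and the frame is displayed uncorrected.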
- as described above, according to this embodiment, the video presenting means 10 receives a plurality of monomedia data and presentation style data describing the presentation style of the frame of each monomedia data, generates scaling/synthesis control information 111 for composing each piece of monomedia data, and produces a composite video frame by composing them; the image quality enhancement means 20 obtains, based on the scaling/synthesis control information 111, the correction target area of predetermined monomedia data in the composite video frame 103, obtains the inter-frame difference in the correction target area, generates correction data, and generates a display video frame 104 by processing the correction target area to high image quality. This yields the effect that high-quality display can be realized in accordance with the intention of the person who gave the window presentation instruction for the entire screen.
- further, the image quality enhancement means 20 only needs to perform encoding/decoding processing on the correction target area having an inter-frame difference, so the processing can be performed at high speed; moreover, depending on the size of the correction target area, a weaker (less lossy) compression ratio can be used without changing the memory capacity of the delay frame buffer 23 in the image quality improving means 20. As a result, image quality deterioration due to encoding/decoding can be reduced, yielding the effect that higher-quality image display can be performed on the video display means 30.
- further, when the presentation style is changed, the style analyzing means 13 of the video presenting means 10 generates the changed scaling/synthesis control information 111, the correction area management means 21 of the image quality improving means 20 generates the changed correction target area/compression ratio information 112, and the image quality improving means 20 performs the high image quality processing; thus, even when the presentation style is changed, a high-quality display in accordance with the intention of the person who gave the window presentation instruction for the entire screen can be achieved.
- further, when the presentation style is largely changed, the image quality improving means 20 does not execute the image quality improving processing, which would have little visual effect; this has the effect of increasing the efficiency of the processing.
- as described above, the display device according to the present invention obtains the correction target area having an inter-frame difference and performs high image quality processing on it, and is therefore suitable for realizing high-quality display in accordance with the intention of a window presentation instruction for the entire screen.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Controls And Circuits For Display Device (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Digital Computer Display Output (AREA)
Abstract
Description
Claims
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CNB2004800427285A CN100531340C (zh) | 2004-04-12 | 2004-04-12 | 显示装置 |
| JP2006512214A JPWO2005101819A1 (ja) | 2004-04-12 | 2004-04-12 | 表示装置 |
| EP04726899A EP1715681A4 (en) | 2004-04-12 | 2004-04-12 | DISPLAY |
| US10/589,903 US7830400B2 (en) | 2004-04-12 | 2004-04-12 | Display unit |
| PCT/JP2004/005218 WO2005101819A1 (ja) | 2004-04-12 | 2004-04-12 | 表示装置 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2004/005218 WO2005101819A1 (ja) | 2004-04-12 | 2004-04-12 | 表示装置 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2005101819A1 true WO2005101819A1 (ja) | 2005-10-27 |
Family
ID=35150349
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2004/005218 Ceased WO2005101819A1 (ja) | 2004-04-12 | 2004-04-12 | 表示装置 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US7830400B2 (ja) |
| EP (1) | EP1715681A4 (ja) |
| JP (1) | JPWO2005101819A1 (ja) |
| CN (1) | CN100531340C (ja) |
| WO (1) | WO2005101819A1 (ja) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011013294A (ja) * | 2009-06-30 | 2011-01-20 | Toshiba Corp | 情報処理装置および輝度制御方法 |
| JP2016154314A (ja) * | 2015-02-20 | 2016-08-25 | シャープ株式会社 | 画像処理装置、テレビジョン受像機、制御方法、プログラム、および記録媒体 |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1895466A1 (en) * | 2006-08-30 | 2008-03-05 | BRITISH TELECOMMUNICATIONS public limited company | Providing an image for display |
| CN102959947B (zh) * | 2010-07-06 | 2016-03-23 | 松下电器(美国)知识产权公司 | 画面合成装置及画面合成方法 |
| MX358934B (es) * | 2014-06-26 | 2018-09-10 | Panasonic Ip Man Co Ltd | Dispositivo de salida de datos, metodo de salida de datos y metodo de generacion de datos. |
| US10212428B2 (en) * | 2017-01-11 | 2019-02-19 | Microsoft Technology Licensing, Llc | Reprojecting holographic video to enhance streaming bandwidth/quality |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2000005899A1 (fr) * | 1998-07-22 | 2000-02-03 | Mitsubishi Denki Kabushiki Kaisha | Systeme de codage d'images |
| JP2002323876A (ja) * | 2001-04-24 | 2002-11-08 | Nec Corp | 液晶表示装置における画像表示方法及び液晶表示装置 |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR100223171B1 (ko) * | 1996-07-30 | 1999-10-15 | 윤종용 | 감마보정장치 |
| US6097757A (en) * | 1998-01-16 | 2000-08-01 | International Business Machines Corporation | Real-time variable bit rate encoding of video sequence employing statistics |
| JP3466951B2 (ja) * | 1999-03-30 | 2003-11-17 | 株式会社東芝 | 液晶表示装置 |
| JP4081934B2 (ja) | 1999-09-17 | 2008-04-30 | ソニー株式会社 | データ配信方法及び装置、並びに、データ受信方法及び装置 |
| JP2001175239A (ja) | 1999-12-21 | 2001-06-29 | Canon Inc | マルチ画面表示装置、マルチ画面表示システム、マルチ画面表示方法、及び記憶媒体 |
| CA2336577A1 (en) | 2000-02-16 | 2001-08-16 | Siemens Corporate Research, Inc. | Systems and methods for generating and playback of annotated multimedia presentation |
| JP4691812B2 (ja) * | 2001-03-29 | 2011-06-01 | ソニー株式会社 | 係数データの生成装置および生成方法、それを使用した情報信号の処理装置および処理方法 |
| JP3617516B2 (ja) | 2001-10-31 | 2005-02-09 | 三菱電機株式会社 | 液晶駆動回路、液晶駆動方法、及び液晶ディスプレイ装置 |
| EP1328114A1 (en) * | 2002-01-10 | 2003-07-16 | Canal+ Technologies Société Anonyme | Image resolution management in a receiver/decoder |
| JP3673257B2 (ja) | 2002-06-14 | 2005-07-20 | 三菱電機株式会社 | 画像データ処理装置、画像データ処理方法、及び液晶ディスプレイ装置 |
| US20040019582A1 (en) * | 2002-07-29 | 2004-01-29 | Merlyn Brown | Electronic interactive community directory and portable player unit |
-
2004
- 2004-04-12 EP EP04726899A patent/EP1715681A4/en not_active Withdrawn
- 2004-04-12 WO PCT/JP2004/005218 patent/WO2005101819A1/ja not_active Ceased
- 2004-04-12 CN CNB2004800427285A patent/CN100531340C/zh not_active Expired - Fee Related
- 2004-04-12 US US10/589,903 patent/US7830400B2/en not_active Expired - Fee Related
- 2004-04-12 JP JP2006512214A patent/JPWO2005101819A1/ja active Pending
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP1715681A4 * |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011013294A (ja) * | 2009-06-30 | 2011-01-20 | Toshiba Corp | Information processing device and luminance control method |
| JP2016154314A (ja) * | 2015-02-20 | 2016-08-25 | Sharp Corporation | Image processing device, television receiver, control method, program, and recording medium |
Also Published As
| Publication number | Publication date |
|---|---|
| EP1715681A1 (en) | 2006-10-25 |
| CN1939052A (zh) | 2007-03-28 |
| US20070171235A1 (en) | 2007-07-26 |
| US7830400B2 (en) | 2010-11-09 |
| CN100531340C (zh) | 2009-08-19 |
| JPWO2005101819A1 (ja) | 2008-03-06 |
| EP1715681A4 (en) | 2008-11-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US7006156B2 (en) | | Image data output device and receiving device |
| CN108235055B (zh) | | Transparent video implementation method and device in an AR scene |
| TWI354981B (en) | | Method and related device of increasing efficiency |
| US20110129156A1 (en) | | Block-Edge Detecting Method and Associated Device |
| JP2003338991A (ja) | | Image display device and image display method |
| US8154654B2 (en) | | Frame interpolation device, frame interpolation method and image display device |
| US20090189912A1 (en) | | Animation judder compensation |
| WO2005101819A1 (ja) | | Display device |
| JP6045405B2 (ja) | | Video processing device, display device, television receiver, and video processing method |
| JP2002199277A (ja) | | Image data output device |
| US8063916B2 (en) | | Graphics layer reduction for video composition |
| US20070008348A1 (en) | | Video signal processing apparatus and video signal processing method |
| JP3115176U (ja) | | Plasma television and image display device |
| JP2006295746A (ja) | | Video signal output device and video signal output method |
| JP2010055001A (ja) | | Video signal processing device and video signal processing method |
| KR20060135736A (ko) | | Changing the aspect ratio of images to be displayed on a screen |
| CN100375509C (zh) | | Method for scaling the presentation size of sub-picture data, video processing circuit, and DVD playback system |
| JP4677755B2 (ja) | | Image filter circuit and interpolation processing method |
| EP1848203B2 (en) | | Method and system for video image aspect ratio conversion |
| JP4640587B2 (ja) | | Video display device, video processing device, and video processing method |
| JP2011040974A (ja) | | Display control device and control method thereof |
| JP4534975B2 (ja) | | Playback device, playback method, recording method, video display device, and recording medium |
| CN101521768A (zh) | | Method for adjusting mobile phone television images and mobile terminal |
| JP2003102006A (ja) | | Moving image compression device and moving image compression method |
| US7148931B2 (en) | | Apparatus and method for signal processing in digital video system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200480042728.5; Country of ref document: CN |
| | AK | Designated states | Kind code of ref document: A1; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
| | AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| | WWE | Wipo information: entry into national phase | Ref document number: 2006512214; Country of ref document: JP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2007171235; Country of ref document: US; Ref document number: 10589903; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2004726899; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWW | Wipo information: withdrawn in national office | Ref document number: DE |
| | WWP | Wipo information: published in national office | Ref document number: 2004726899; Country of ref document: EP |
| | WWP | Wipo information: published in national office | Ref document number: 10589903; Country of ref document: US |