US20110134252A1 - Information processing apparatus and control method thereof - Google Patents
Information processing apparatus and control method thereof
Info
- Publication number
- US20110134252A1 (application No. US 12/960,384)
- Authority
- US
- United States
- Prior art keywords
- video
- area
- illuminance
- display
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N5/44504—Circuit details of the additional information generator, e.g. details of the character or graphics signal generator, overlay mixing circuits
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/10—Intensity circuits
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440263—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/57—Control of contrast or brightness
- H04N5/58—Control of contrast or brightness in dependence upon ambient light
Definitions
- the present invention relates to a technology that determines a video display position on a screen (display).
- Japanese Patent Application Laid-Open No. 6-308892 discusses a well-known technology for changing luminance on a screen depending on the intensity of ambient light around the screen to easily view a character or a drawing displayed on the screen.
- the present invention is directed to ensure the visibility of a character or a drawing displayed on a part of a screen.
- the following advantages are obtained.
- When strong light enters a part of a screen, a user can easily view a character or a drawing displayed on that part of the screen without performing an operation to move the character or drawing to another area of the screen where strong light does not enter.
- FIGS. 1A to 1C illustrate functional block diagrams of an example of a functional configuration of an information processing apparatus.
- FIGS. 2A to 2C schematically illustrate processing for changing a video display position based on a measured illuminance value.
- FIG. 3 illustrates a flowchart of an example of processing for setting a video display position.
- FIGS. 4A to 4C schematically illustrate processing for changing the video display position and the size based on the measured illuminance value.
- FIG. 5 illustrates a flowchart of example processing for setting the video display position.
- FIG. 6 illustrates a flowchart of example processing for setting the video display position.
- FIGS. 7A to 7D schematically illustrate processing for determining a value based on the time change in illuminance value.
- FIG. 8 illustrates a flowchart of example processing for setting the video display position.
- FIGS. 9A and 9B schematically illustrate change of the layout.
- FIG. 1A illustrates a functional block diagram of a functional configuration of an information processing apparatus 100 according to a first exemplary embodiment.
- the information processing apparatus 100 is communicably connected to a display unit 151 , an illuminance measurement unit 152 , and a video image storage unit 153 via cables.
- the information processing apparatus 100 includes an illuminance acquisition unit 101 , a determination unit 102 , a video input unit 103 , a display position setting unit 104 , and a video output unit 105 .
- the display unit 151 is a liquid crystal display (LCD) or a plasma display panel (PDP), and displays a video image corresponding to a video signal output from the video output unit 105 .
- the display unit 151 may be a projection area of a projection device such as a liquid crystal on silicon (LCOS) projector.
- the illuminance measurement unit 152 is a well-known illuminometer, and measures the illuminance of one of a plurality of partial areas obtained by dividing a display area of the display unit 151 .
- the illuminance measurement unit 152 may include illuminometers disposed at a constant interval within a display area of the display unit 151 .
- the illuminance measurement unit 152 may instead be a luminance meter that measures the luminance of the partial area of the display unit 151.
- FIGS. 2A to 2C schematically illustrate processing for measuring the illuminance of the partial area by the illuminance measurement unit 152 .
- 144 partial areas, obtained by dividing the display area of the display unit 151 into 16 (wide) × 9 (high) areas, are used for measuring the illuminance.
- the numerical value shown in each partial area is obtained by quantizing the illuminance (hereinbelow referred to as an illuminance value) measured by the illuminance measurement unit 152.
- an area (ambient light irradiation area 202) corresponding to a part of a display area 201 is irradiated with ambient light 204, and the illuminance value of the ambient light irradiation area 202 is higher than that of the other areas. More specifically, a high illuminance value indicates high illuminance measured by the illuminance measurement unit 152.
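The grid measurement described above can be sketched in code. This is an illustrative model, not from the patent: the 16×9 division and the threshold of 4 follow the embodiment, while all names and the data layout are assumptions.

```python
# Illustrative model of the measurement grid (names and layout are
# assumptions; the 16x9 division and threshold of 4 follow the text).
GRID_W, GRID_H = 16, 9  # 144 partial areas

def irradiated_cells(illuminance, threshold=4):
    """Return (x, y) indices of partial areas whose quantized
    illuminance value meets or exceeds the threshold."""
    return [(x, y)
            for y in range(GRID_H)
            for x in range(GRID_W)
            if illuminance[y][x] >= threshold]

# Example: ambient light striking a 2x2 patch of partial areas in the
# lower-right corner of the display area.
grid = [[0] * GRID_W for _ in range(GRID_H)]
for y in (7, 8):
    for x in (14, 15):
        grid[y][x] = 8

cells = irradiated_cells(grid)  # the ambient light irradiation area
```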
- the video image storage unit 153 is a content distribution server or a hard disk drive (HDD), and stores video data to be output to the display unit 151.
- the illuminance acquisition unit 101 is an interface (I/F) that obtains the illuminance value.
- the determination unit 102 is a micro processing unit (MPU) that determines whether the illuminance value of the partial area is equal to or less than a predetermined threshold serving as a reference value. The predetermined threshold is not limited to a constant, and may be calculated from the time change in illuminance.
- the video input unit 103 is an interface (I/F) that inputs video data output from the video image storage unit 153 .
- the display position setting unit 104 is a micro processing unit (MPU), and sets the video display position corresponding to the video data input to the video input unit 103 based on the determination result of the determination unit 102 .
- the processing for setting the video display position includes, e.g., processing for changing the coordinates at which a video image is displayed, and processing for affine-transforming the video shape.
- the processing of the display position setting unit 104 is specifically described later.
- the video output unit 105 is an interface (I/F) that outputs a video signal for displaying a video image at the position set on the display unit 151 by the display position setting unit 104 .
- FIG. 3 illustrates a flowchart of example processing for setting the video display position in the information processing apparatus 100 .
- the video input unit 103 inputs video data output by the video image storage unit 153 .
- the display position setting unit 104 sets the video display position corresponding to the input video data.
- the illuminance acquisition unit 101 acquires the illuminance value of the partial area acquired by the illuminance measurement unit 152 .
- In step S304, the determination unit 102 determines whether the illuminance value of the partial area is a predetermined threshold value or less (the reference value or less).
- when the determination unit 102 determines in step S304 that the illuminance value of the partial area is the predetermined threshold value or less (YES in step S304), the determination unit 102 executes the processing in step S305.
- when the determination unit 102 determines in step S304 that the illuminance value of the partial area is not the predetermined threshold value or less (NO in step S304), the determination unit 102 executes the processing in step S307.
- In step S305, the display position setting unit 104 determines whether the video display position corresponding to the video data is located on a partial area in which the illuminance value is the predetermined threshold value or more.
- when the display position setting unit 104 determines in step S305 that the video display position is located on a partial area in which the illuminance value is the predetermined threshold value or more (YES in step S305), the display position setting unit 104 executes the processing in step S306.
- when the display position setting unit 104 determines in step S305 that the video display position is not on a partial area in which the illuminance value is the predetermined threshold value or more (NO in step S305), the display position setting unit 104 executes the processing in step S307.
- In step S306, the display position setting unit 104 changes the display position so that the video display position is not located on a partial area in which the illuminance value is the predetermined threshold value or more (the reference value or more).
- the processing for changing the display position is specifically described later.
- In step S307, the video output unit 105 outputs a video signal to the display unit 151.
- the output video signal displays the video image at the position set by the display position setting unit 104 in step S302, or at the position changed by the display position setting unit 104 in step S306.
- a series of processing ends.
- the video image storage unit 153 outputs new video data to the video input unit 103 , and the processing from step S 301 of the new video data is executed again.
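The loop of steps S301 to S307 can be summarized as a sketch. All function names here are assumptions; the actual relocation of step S306 is passed in as a stub so the control flow stays visible.

```python
# Sketch of the decision flow in steps S301-S307 (function names are
# assumptions, not from the patent text).

def rect_overlaps_cells(rect, cells):
    """rect = (x0, y0, x1, y1), inclusive grid coordinates."""
    x0, y0, x1, y1 = rect
    return any(x0 <= x <= x1 and y0 <= y <= y1 for x, y in cells)

def decide_position(rect, illuminance, threshold=4,
                    relocate=lambda rect, cells: rect):
    cells = [(x, y) for y, row in enumerate(illuminance)
             for x, v in enumerate(row) if v >= threshold]
    if cells and rect_overlaps_cells(rect, cells):  # S304 and S305
        rect = relocate(rect, cells)                # S306: change position
    return rect                                     # S307: output this rect

# Example: video area EFGH = (8, 3)-(15, 8); one bright cell forces a move.
grid = [[0] * 16 for _ in range(9)]
grid[5][10] = 7
new_rect = decide_position((8, 3, 15, 8), grid,
                           relocate=lambda r, c: (0, 0, 7, 5))
```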
- In step S303, the illuminance values illustrated in FIG. 2A are obtained. More specifically, illuminance values ranging from 7 to 9 are measured in the ambient light irradiation area 202. In the partial areas other than the ambient light irradiation area 202, the measured illuminance value is 0.
- In step S304, the determination unit 102 determines whether the illuminance value of the partial area is a predetermined threshold value (4 according to the present exemplary embodiment) or less. Referring to FIG. 2A, an illuminance value of 4 or more is measured in the ambient light irradiation area 202, and the processing in step S305 is therefore executed.
- In step S305, the display position setting unit 104 determines whether the video display position is located on the ambient light irradiation area 202 in which an illuminance value of 4 or more is measured.
- the processing in step S 305 is specifically described with reference to FIG. 2B .
- Apex coordinates of a display area 201 (rectangular area ABCD) as a video displayable area are defined as A(0,0), B(15,0), C(15,8), and D(0,8).
- in a video output area 205, the video image whose position was set in step S302 is output.
- apex coordinates of the video output area 205 are E(8,3), F(15,3), G(15,8), and H(8,8). More specifically, in step S305, the display position setting unit 104 determines whether there is a partial area having an illuminance value of 4 or more within the rectangular area EFGH.
- the processing executed by the display position setting unit 104 in step S306 is specifically described with reference to FIG. 2C.
- operation is performed to change the coordinate of the video output area 205 .
- the operation is performed to change the coordinate so that the video output area 205 (video output area 206 ) after changing the coordinate comes outside the ambient light irradiation area 202 and is positioned at the coordinate that minimizes the amount of movement from the coordinate of the original video output area 205 .
- the operation for changing the coordinate is performed as follows: a point (coordinate J) is calculated that satisfies the condition of setting the video output area 205 (video output area 206) outside the ambient light irradiation area 202 and minimizes the radius r of a concentric circle centered on an apex (e.g., coordinate G) of the video output area 205.
- the apex used as the center is the apex of the video output area 205 that is closest to the border of the ambient light irradiation area 202, i.e., the apex that has moved into the ambient light irradiation area 202.
- alternatively, the center of gravity of the video output area 205 may first be set as the center of the concentric circle before the processing is executed.
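One possible (assumed) implementation of the step-S306 coordinate change is a brute-force search over on-screen placements that avoid every irradiated cell, picking the placement closest to the original. Note the patent measures movement as the radius of a concentric circle around an apex; this sketch uses the displacement of the top-left corner as a stand-in.

```python
# Assumed implementation of the step-S306 move: search all on-screen
# placements of the video rectangle that avoid every irradiated cell and
# minimize the displacement of the top-left corner (the patent minimizes
# a concentric-circle radius around an apex instead).

def relocate_min_move(rect, cells, grid_w=16, grid_h=9):
    x0, y0, x1, y1 = rect
    w, h = x1 - x0, y1 - y0
    best, best_d2 = None, None
    for ny in range(grid_h - h):
        for nx in range(grid_w - w):
            if any(nx <= cx <= nx + w and ny <= cy <= ny + h
                   for cx, cy in cells):
                continue  # candidate still overlaps the irradiated area
            d2 = (nx - x0) ** 2 + (ny - y0) ** 2  # squared movement
            if best_d2 is None or d2 < best_d2:
                best, best_d2 = (nx, ny, nx + w, ny + h), d2
    return best  # None if no placement avoids the irradiated area
```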
- the illuminance measurement unit 152 may acquire the illuminance while the display unit 151 is not displaying the video image, so that the illuminance value measured by the illuminance measurement unit 152 is not influenced by light emitted by the display unit 151.
- the illuminance measurement unit 152 may acquire the illuminance while the display unit 151 is displaying the video image.
- the information processing apparatus further includes an illuminance correction unit 121 , and may input to the determination unit 102 a value obtained by subtracting the illuminance value of the light emitted by the display unit 151 from the illuminance value measured by the illuminance measurement unit 152 , as the illuminance value of the ambient light.
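The correction described above amounts to a per-area subtraction. A minimal sketch, with the arithmetic and zero-clamping assumed (the patent only states that the display's contribution is subtracted):

```python
# Minimal sketch of the illuminance correction (assumed arithmetic): the
# ambient-light estimate is the measured value minus the display's own
# contribution, clamped at zero, per partial area.

def correct_ambient(measured, emitted):
    return [[max(m - e, 0) for m, e in zip(m_row, e_row)]
            for m_row, e_row in zip(measured, emitted)]
```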
- when the movement processing for setting the position of the video area outside the ambient light irradiation area is performed, a part of the video area 703 may be moved to a position outside the display area 701, and that part of the video area 703 consequently cannot be displayed.
- in this case, the size of the video area can be changed to fit the entire video area 703 within the display area 701.
- FIG. 5 illustrates a flowchart of a processing flow for properly changing the size of the video area in addition to the processing according to the first exemplary embodiment.
- since the processing in steps S501 to S506 and the processing in step S509 are similar to the processing in steps S301 to S306 and the processing in step S307 in FIG. 3, a description thereof is omitted, and only the different points are described.
- In step S507, the display position setting unit 104 determines whether the video area after changing the display position extends outside the display area.
- when the display position setting unit 104 determines in step S507 that the video area after changing the display position extends outside the display area (YES in step S507), the display position setting unit 104 executes the processing in step S508.
- when the display position setting unit 104 determines in step S507 that the video area after changing the display position does not extend outside the display area (NO in step S507), the display position setting unit 104 executes the processing in step S509.
- In step S508, the display position setting unit 104 changes the size of the video area so that the video area after changing the display position fits within the display area. The display position setting unit 104 then executes the processing in step S509.
- the video image can be displayed outside the ambient light irradiation area where the visibility is deteriorated, and a desired video image can be displayed within the display area.
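Steps S507 and S508 can be sketched as follows. The scaling rule (preserve the aspect ratio and the top-left corner) is an assumption, since the patent only states that the size is changed so the area fits within the display.

```python
# Sketch of steps S507-S508 (the scaling rule is an assumption): if the
# moved rectangle extends past the display area, shrink it, keeping its
# aspect ratio and top-left corner, until it fits on screen.

def fit_within_display(rect, disp_w=16, disp_h=9):
    x0, y0, x1, y1 = rect
    w, h = x1 - x0, y1 - y0
    if x1 <= disp_w - 1 and y1 <= disp_h - 1:
        return rect  # S507: already inside, nothing to do
    # S508: shrink by the factor needed to fit both dimensions
    scale = min((disp_w - 1 - x0) / w, (disp_h - 1 - y0) / h)
    return (x0, y0, x0 + round(w * scale), y0 + round(h * scale))
```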
- the light emitted by the display unit 151 does not influence the measured illuminance.
- processing for measuring the illuminance of the ambient light when the light emitted by the display unit 151 influences the measured illuminance is described.
- FIG. 1B illustrates a functional block diagram of a functional configuration of the information processing apparatus 110 according to the second exemplary embodiment.
- the same reference numeral denotes an element with the same function as the information processing apparatus 100 , and a description thereof is omitted.
- a luminance detection unit 111 is a micro processing unit (MPU) and detects the luminance of the light emitted by the display unit 151 . According to the present exemplary embodiment, the luminance value (value obtained by quantizing the measured luminance) is calculated from the video signal output to the display unit 151 .
- a determination unit 112 is a micro processing unit (MPU); when the calculated luminance value is the threshold value or less, the determination unit 112 determines whether the illuminance value of the partial area is a predetermined threshold value or less.
- FIG. 6 illustrates a flowchart of an example of processing for setting the video display position in the information processing apparatus 110 .
- the processing in steps S 601 to S 607 is similar to the processing in steps S 301 to S 307 in FIG. 3 , a description thereof is thus omitted, and only different points are described.
- In step S600, the luminance detection unit 111 determines whether the luminance value of the display unit 151 is a predetermined threshold value or less. When the luminance detection unit 111 determines in step S600 that the luminance value is the predetermined threshold value or less (YES in step S600), the luminance detection unit 111 executes the processing in step S601.
- when the luminance detection unit 111 determines in step S600 that the luminance value is not the predetermined threshold value or less (NO in step S600), the series of processing ends.
- the present processing is sequentially executed at predetermined time intervals or predetermined frame intervals. Performing the above-described processing improves the visibility of video content by displaying the video image outside the ambient light irradiation area, even if the area where visibility is deteriorated changes with the passage of time.
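The periodic execution with the step-S600 luminance gate might be glued together like this. The names and callback structure are assumptions; the gate itself follows the text.

```python
# Assumed glue code for the S600 gate: on each tick, run the repositioning
# pass (steps S601-S607) only when the display's own emitted luminance is
# low enough not to corrupt the ambient-light measurement.

def tick(luminance_value, run_pass, luminance_threshold=2):
    if luminance_value > luminance_threshold:  # NO in S600: skip this cycle
        return False
    run_pass()  # steps S601-S607
    return True
```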
- the area for displaying the video image is changed depending on the illuminance level of the display unit 151 .
- the area for displaying the video image is changed based on the change in illuminance of the display unit 151 . Since the information processing apparatus according to the present exemplary embodiment is similar to the information processing apparatus 100 , a description thereof is omitted.
- the determination unit 102 determines whether the time change of the illuminance value measured in the partial area is a predetermined threshold value or less. The determination unit 102 stores the illuminance values from a number of past measurements, and obtains the difference from the current illuminance value.
- a flowchart illustrating an example of processing for setting the video display position is similar to FIG. 3 , and a description thereof is thus omitted.
- the determination unit 102 does not determine whether the illuminance value of the partial area is a predetermined threshold value or less, and determines whether a value based on the time change in illuminance value of the partial area is a predetermined threshold value or less.
- FIGS. 7A to 7D schematically illustrate processing for determining a value based on the time change in illuminance value.
- FIG. 7A illustrates the illuminance value (illuminance value a) of the partial area measured at time (t−2), two measurements before the current one.
- FIG. 7B illustrates the illuminance value (illuminance value b) of the partial area measured at time (t−1), one measurement before the current one.
- FIG. 7C illustrates the illuminance value (illuminance value c) of the partial area measured at time t, the current time.
- W 1 and W 2 are weighting coefficients.
- a partial area where the change rate L in illuminance value is a predetermined threshold value or more is an illuminance change area.
- FIG. 7D illustrates the change rate L in illuminance when W 1 and W 2 are 10.
- the determination unit 102 observes that an area 702 corresponding to the lower-right portion of the display area 701 has a rate of illuminance change that is the threshold value or more. By performing this processing, the video image is displayed while avoiding an area that ambient light may intermittently enter, thereby improving the visibility.
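The text does not spell out the formula for the change rate L. A plausible reading, assumed here, is a weighted sum of the two most recent frame-to-frame differences, with a, b, c the quantized illuminance at times t−2, t−1, t and W1 = W2 = 10 as in FIG. 7D.

```python
# Assumed formula for the change rate L (not stated explicitly in the
# patent): a weighted sum of absolute frame-to-frame differences.

def change_rate(a, b, c, w1=10, w2=10):
    return w1 * abs(b - a) + w2 * abs(c - b)

def changing_cells(grid_a, grid_b, grid_c, threshold=40):
    """Partial areas whose change rate L meets or exceeds the threshold."""
    return [(x, y)
            for y, (row_a, row_b, row_c) in enumerate(zip(grid_a, grid_b, grid_c))
            for x, (a, b, c) in enumerate(zip(row_a, row_b, row_c))
            if change_rate(a, b, c) >= threshold]
```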
- the area for displaying the video image is changed depending on the illuminance level.
- the video layout is changed depending on the illuminance level without changing the display area of the video image.
- the information processing apparatus is similar to the information processing apparatus 100 , and a description thereof is thus omitted.
- the display position setting unit 104 sets not only the video display position but also the position of a user interface (UI) or subtitle that is superimposed on the video image.
- the display position setting unit 104 determines the display position according to the priority of the video image or the UI or subtitle superimposed on the video image.
- the priorities may be added to the video data stored in advance in the video image storage unit 153 , or the priority may be added to the video image in the video input unit 103 based on the operation of the user input from a user interface (not shown) of the information processing apparatus 100 .
- FIGS. 9A and 9B schematically illustrate change of the video layout based on the measured illuminance value.
- FIG. 9A illustrates data broadcasting that is displayed.
- the display area 701 displays a menu 705 of the data broadcasting and data broadcasting 706 as well as the video area 703 for displaying a video image of broadcasting.
- a part of the display area 701 includes the ambient light irradiation area 702 to which the ambient light is emitted.
- the priority is determined with respect to the video image by the user operation as follows.
- the user requests data broadcasting display from the information processing apparatus 100 via a user interface.
- it is highly likely that the user desires to view the data broadcasting, and the display priority of the data broadcasting is therefore set high.
- FIG. 8 illustrates a flowchart of example processing for setting the video display position in the information processing apparatus 100.
- the processing in steps S 801 to S 804 is similar to the processing in steps S 301 to step S 304 .
- the processing in steps S 808 to S 809 is similar to the processing in steps S 306 to S 307 . Based on these, only the different points are described.
- In step S805, the display position setting unit 104 acquires the priority added to the video image in the video input unit 103 based on the user operation.
- the display area 701 includes three display contents: the video area 703, the menu 705 of the data broadcasting, and the data broadcasting 706.
- the display position setting unit 104 acquires three priorities of the input video images.
- the values of the priority of the data broadcasting 706 , the menu 705 of the data broadcasting, and the video area 703 are 1, 2, and 3, respectively, and the display position setting unit 104 determines that the priority order is the data broadcasting 706 , the menu 705 of the data broadcasting, and the video area 703 .
- In step S806, the display position setting unit 104 determines the display position of the display content with the highest remaining priority.
- a video image with the highest priority is the data broadcasting 706 , and the display position of the data broadcasting 706 is therefore determined.
- an area showing an illuminance value of the threshold value or less and a low coordinate value is assigned as the display area.
- the data broadcasting 706 is assigned to the lowest coordinate value (0,0) outside the ambient light irradiation area.
- In step S807, the display position setting unit 104 determines whether all the display positions have been determined. Since the assignment of the display areas of the menu 705 of the data broadcasting and the video area 703 has not yet ended, the processing returns to step S806.
- In step S806, the display position setting unit 104 next determines the display position of the menu 705 of the data broadcasting, which has the next highest priority. An area adjacent to the data broadcasting 706, with the lowest coordinate value outside the ambient light irradiation area, is assigned to the menu 705.
- Steps S 806 and S 807 are repeated.
- the processing proceeds to step S808, and the video output unit 105 then outputs the video signal.
- the display area 701 is displayed as illustrated in FIG. 9B .
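The priority-driven assignment of steps S805 to S807 can be sketched as a greedy placement. The cell-granularity placement and the tie-breaking are assumptions; the patent assigns whole areas, not single cells.

```python
# Assumed greedy sketch of steps S805-S807: sort display contents by
# priority (lower number = higher priority) and give each the free cell
# with the lowest coordinate outside the irradiated area, row-major.

def assign_layout(contents, irradiated, grid_w=16, grid_h=9):
    """contents: list of (name, priority). Returns {name: (x, y)}."""
    free = [(x, y) for y in range(grid_h) for x in range(grid_w)
            if (x, y) not in irradiated]
    layout = {}
    for name, _priority in sorted(contents, key=lambda nc: nc[1]):
        layout[name] = free.pop(0)  # lowest remaining coordinate
    return layout

# Example: three contents, ambient light covering the first two cells
# of the top row, so the highest-priority content shifts right.
layout = assign_layout(
    [("video", 3), ("menu", 2), ("data_broadcast", 1)],
    irradiated={(0, 0), (1, 0)})
```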
- aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method whose steps are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s).
- the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Life Sciences & Earth Sciences (AREA)
- Business, Economics & Management (AREA)
- Environmental & Geological Engineering (AREA)
- Environmental Sciences (AREA)
- Remote Sensing (AREA)
- Ecology (AREA)
- Biodiversity & Conservation Biology (AREA)
- Emergency Management (AREA)
- Computer Graphics (AREA)
- Controls And Circuits For Display Device (AREA)
- Digital Computer Display Output (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- 1. Field of the Invention
- The present invention relates to a technology that determines a video display position on a screen (display).
- 2. Description of the Related Art
- Recently, a technology has become widespread which changes luminance, color temperature, and color shading depending on the surrounding environment to provide an easily viewable video image on an LCD (liquid crystal display) or a PDP (plasma display panel). In particular, Japanese Patent Application Laid-Open No. 6-308892 discusses a well-known technology for changing the luminance of a screen depending on the intensity of ambient light around the screen so that a character or a drawing displayed on the screen can be viewed easily.
- However, when strong light such as sunlight directly enters the screen, the visibility of a character or a drawing displayed on the screen cannot be sufficiently improved merely by changing the luminance of the screen. When strong light enters a part of the screen, the user must therefore move the character or drawing to another area of the screen that the strong light does not reach in order to view it easily.
- The present invention is directed to ensuring the visibility of a character or a drawing displayed on a part of a screen.
- According to an aspect of the present invention, an information processing apparatus that sets a position of a video image displayed on a display unit includes: a measurement unit configured to measure illuminance of one or more partial areas forming the display unit; and a setting unit configured to set the video display position to an area except for a partial area where the illuminance is a reference value or more.
- According to the present invention, the following advantage is obtained. When strong light enters a part of the screen, the user can easily view a character or a drawing displayed there without having to move the character or drawing to another area of the screen that the strong light does not reach.
- Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
- The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
-
FIGS. 1A to 1C illustrate functional block diagrams of an example of a functional configuration of an information processing apparatus. -
FIGS. 2A to 2C schematically illustrate processing for changing a video display position based on a measured illuminance value. -
FIG. 3 illustrates a flowchart of an example of processing for setting a video display position. -
FIGS. 4A to 4C schematically illustrate processing for changing the video display position and the size based on the measured illuminance value. -
FIG. 5 illustrates a flowchart of example processing for setting the video display position. -
FIG. 6 illustrates a flowchart of an example of processing for setting the video display position. -
FIGS. 7A to 7D schematically illustrate processing for determining a value based on the time change in illuminance value. -
FIG. 8 illustrates a flowchart of example processing for setting the video display position. -
FIGS. 9A and 9B schematically illustrate change of the layout. - Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
-
FIG. 1A illustrates a functional block diagram of a functional configuration of an information processing apparatus 100 according to a first exemplary embodiment. The information processing apparatus 100 is communicably connected to a display unit 151, an illuminance measurement unit 152, and a video image storage unit 153 via cables. The information processing apparatus 100 includes an illuminance acquisition unit 101, a determination unit 102, a video input unit 103, a display position setting unit 104, and a video output unit 105. - The display unit 151 is a liquid crystal display (LCD) or a plasma display panel (PDP), and displays a video image corresponding to a video signal output from the video output unit 105. The display unit 151 may be a projection area of a projection device such as a liquid crystal on silicon (LCOS) projector. The illuminance measurement unit 152 is a well-known illuminometer, and measures the illuminance of one of a plurality of partial areas obtained by dividing a display area of the display unit 151. The illuminance measurement unit 152 may include illuminometers disposed at a constant interval within the display area of the display unit 151. The illuminance measurement unit 152 may also be an illuminometer that measures the luminance of a partial area of the display unit 151. -
FIGS. 2A to 2C schematically illustrate processing for measuring the illuminance of the partial areas by the illuminance measurement unit 152. According to the present exemplary embodiment, as illustrated in FIG. 2A, 144 areas obtained by dividing the display area of the display unit 151 into 16 (in width) × 9 (in height) areas are used for measuring the illuminance. Referring to FIG. 2A, the numerical value described in each partial area is obtained by quantizing the illuminance (hereinbelow referred to as an illuminance value) measured by the illuminance measurement unit 152. Referring to FIG. 2A, an area (ambient light irradiation area 202) corresponding to a part of a display area 201 is irradiated with ambient light 204, and the illuminance value of the ambient light irradiation area 202 is higher than the illuminance value of the other areas. More specifically, a high illuminance value indicates high illuminance measured by the illuminance measurement unit 152. - The video
image storage unit 153 is, for example, a server that distributes contents or a hard disk drive (HDD), and stores video data to be output to the display unit 151. The illuminance acquisition unit 101 is an interface (I/F) that obtains the illuminance value. The determination unit 102 is a micro processing unit (MPU) that determines whether the illuminance value of a partial area is equal to or less than a predetermined threshold serving as a reference value. The predetermined threshold is not limited to a constant, and may be calculated from the time change in illuminance. The video input unit 103 is an interface (I/F) that inputs the video data output from the video image storage unit 153. - The display
position setting unit 104 is a micro processing unit (MPU), and sets the video display position corresponding to the video data input to the video input unit 103 based on the determination result of the determination unit 102. The processing for setting the video display position includes, e.g., processing for changing the coordinates at which a video image is displayed, and processing for affine-transforming the video shape. The processing of the display position setting unit 104 is specifically described later. The video output unit 105 is an interface (I/F) that outputs a video signal for displaying a video image at the position set on the display unit 151 by the display position setting unit 104. -
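As a concrete illustration of the measurement and determination just described, the following minimal sketch models the display as the 16 (wide) × 9 (high) grid of partial areas of FIG. 2A and flags the areas whose quantized illuminance value is the reference value or more. The grid size and the threshold of 4 follow the embodiment; all function names and the example values are assumptions of this rewrite, not part of the patent.

```python
# Model of FIG. 2A: the display area divided into 16x9 partial areas,
# each holding a quantized illuminance value from the illuminometers.
GRID_W, GRID_H = 16, 9
THRESHOLD = 4  # reference value used by the determination unit

def make_grid(default=0):
    """Return a 9-row x 16-column grid of illuminance values."""
    return [[default] * GRID_W for _ in range(GRID_H)]

def irradiated_cells(grid, threshold=THRESHOLD):
    """Cells whose illuminance value is the threshold or more."""
    return {(x, y)
            for y, row in enumerate(grid)
            for x, value in enumerate(row)
            if value >= threshold}

# Example: ambient light raises the illuminance of a lower-right block.
grid = make_grid()
for y in range(5, 9):
    for x in range(11, 16):
        grid[y][x] = 8  # quantized illuminance under the ambient light

print(len(irradiated_cells(grid)))  # 20 cells form the irradiation area
```

The set of flagged cells plays the role of the ambient light irradiation area 202 in the position-setting processing described below.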
FIG. 3 illustrates a flowchart of example processing for setting the video display position in the information processing apparatus 100. In step S301, the video input unit 103 inputs the video data output by the video image storage unit 153. In step S302, the display position setting unit 104 sets the video display position corresponding to the input video data. In step S303, the illuminance acquisition unit 101 acquires the illuminance value of each partial area acquired by the illuminance measurement unit 152. - In step S304, the
determination unit 102 determines whether the illuminance value of the partial area is a predetermined threshold value or less (the reference value or less). When the determination unit 102 determines in step S304 that the illuminance value of the partial area is the predetermined threshold value or less (YES in step S304), the determination unit 102 executes the processing in step S305. When the determination unit 102 determines in step S304 that the illuminance value of the partial area is not the predetermined threshold value or less (NO in step S304), the determination unit 102 executes the processing in step S307. - In step S305, the display
position setting unit 104 determines whether the video display position corresponding to the video data is located on a partial area in which the illuminance value is the predetermined threshold value or more. When the display position setting unit 104 determines in step S305 that the video display position is located on a partial area in which the illuminance value is the predetermined threshold value or more (YES in step S305), the display position setting unit 104 executes the processing in step S306. On the other hand, when the display position setting unit 104 determines in step S305 that the video display position is not on a partial area in which the illuminance value is the predetermined threshold value or more (NO in step S305), the display position setting unit 104 executes the processing in step S307. - In step S306, the display
position setting unit 104 changes the display position so that the video display position is not located on a partial area in which the illuminance value is the predetermined threshold value or more (the reference value or more). The processing for changing the display position is specifically described later. In step S307, the video output unit 105 outputs a video signal to the display unit 151. The output video signal displays the video image at the position set by the display position setting unit 104 in step S302, or at the position changed by the display position setting unit 104 in step S306. After the processing in step S307 is executed, the series of processing ends. The video image storage unit 153 outputs new video data to the video input unit 103, and the processing from step S301 is executed again for the new video data. - Next, the processing in steps S304, S305, and S306 is specifically described with reference to
FIGS. 2A to 2C. In step S303, the illuminance values illustrated in FIG. 2A are obtained. More specifically, illuminance values ranging from 7 to 9 are measured in the ambient light irradiation area 202. In the partial areas other than the ambient light irradiation area 202, the measured illuminance value is 0. In step S304, the determination unit 102 determines whether the illuminance value of each partial area is a predetermined threshold value (4 according to the present exemplary embodiment) or less. Referring to FIG. 2A, an illuminance value of 4 or more is measured in the ambient light irradiation area 202, and the processing in step S305 is therefore executed. - In step S305, the display
position setting unit 104 determines whether the video display position is located on the ambient light irradiation area 202 in which an illuminance value of 4 or more is measured. Hereinbelow, the processing in step S305 is specifically described with reference to FIG. 2B. Apex coordinates of the display area 201 (rectangular area ABCD), which is the video displayable area, are defined as A(0,0), B(15,0), C(15,8), and D(0,8). In a video output area 205, the video image whose position is set in step S302 is output. The apex coordinates of the video output area 205 (rectangular area EFGH) are E(8,3), F(15,3), G(15,8), and H(8,8). More specifically, in step S305, the display position setting unit 104 determines whether there is a partial area having an illuminance value of 4 or more in the rectangular area EFGH. - The processing executed by the display
position setting unit 104 in step S306 is specifically described with reference to FIG. 2C. In this processing, the display position setting unit 104 performs an operation to change the coordinates of the video output area 205. According to the present exemplary embodiment, the coordinates are changed so that the video output area after the change (video output area 206) comes outside the ambient light irradiation area 202 and is positioned at the coordinates that minimize the amount of movement from the coordinates of the original video output area 205. When an apex (e.g., coordinate G) of the video output area 205 is included in the ambient light irradiation area 202, the operation for changing the coordinates is performed as follows: a point (coordinate J) is calculated that satisfies the condition that the video output area after the change (video output area 206) lies outside the ambient light irradiation area 202 and that minimizes the radius r of a concentric circle centered on the apex (coordinate G). - When a part of the video output area 205 lies in the ambient
light irradiation area 202 and no apex of the video output area 205 is included in the ambient light irradiation area 202, the following processing is performed: the apex of the video output area 205 that is closest to the border of the ambient light irradiation area 202 is set as the center of the concentric circle, and the same processing is performed. When a plurality of apexes of the video output area 205 are included in the ambient light irradiation area 202, the apex farthest from any apex not included in the ambient light irradiation area 202 is set as the center of the concentric circle, and the processing is performed. When all the apexes of the video output area 205 are included in the ambient light irradiation area 202, the center of gravity of the video output area 205 is set as the center of the concentric circle, and the processing is executed. - In
FIG. 2C, the position of the video output area 206 (rectangular area HIJK) is the point moved from the original video output area 205 by −2 in the horizontal direction and −2 in the vertical direction. Thus, the video image can be displayed outside the ambient light irradiation area where the visibility is deteriorated. According to the present exemplary embodiment, the illuminance measurement unit 152 may acquire the illuminance while the display unit 151 is not displaying the video image, so that the illuminance value measured by the illuminance measurement unit 152 is not influenced by the light emitted by the display unit 151. - The
illuminance measurement unit 152 may also acquire the illuminance while the display unit 151 is displaying the video image. In that case, as illustrated in FIG. 1C, the information processing apparatus further includes an illuminance correction unit 121, and may input to the determination unit 102, as the illuminance value of the ambient light, a value obtained by subtracting the illuminance value of the light emitted by the display unit 151 from the illuminance value measured by the illuminance measurement unit 152. -
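The overlap check and movement processing of steps S305 and S306 can be sketched as follows. This is a minimal approximation: the patent's concentric-circle search is replaced here by an exhaustive scan over candidate positions that picks the smallest displacement, and the grid, the example areas, and all names are assumptions of this rewrite.

```python
import math

GRID_W, GRID_H = 16, 9

def overlaps(rect, hot_cells):
    """Step S305 check: does the rectangle (x, y, w, h) cover any partial
    area whose illuminance value is the reference value or more?"""
    x, y, w, h = rect
    return any((cx, cy) in hot_cells
               for cy in range(y, y + h)
               for cx in range(x, x + w))

def move_outside(rect, hot_cells):
    """Step S306 sketch: among all positions that keep the rectangle on the
    display and off the hot cells, pick the one with the smallest
    displacement from the original position."""
    x, y, w, h = rect
    best, best_d = None, math.inf
    for ny in range(GRID_H - h + 1):
        for nx in range(GRID_W - w + 1):
            if overlaps((nx, ny, w, h), hot_cells):
                continue
            d = math.hypot(nx - x, ny - y)
            if d < best_d:
                best, best_d = (nx, ny, w, h), d
    return best

# Illustrative example (not the exact FIG. 2 data): the video area occupies
# the lower-right region, and ambient light covers its lower-right corner.
hot = {(cx, cy) for cx in range(13, 16) for cy in range(6, 9)}
video = (8, 3, 8, 6)  # x, y, width, height in grid cells
moved = move_outside(video, hot)  # shifted up, clear of the hot block
```

A real implementation would search around the apex included in the irradiation area, as the description above specifies; the brute-force scan is only meant to show the "minimize the amount of movement under the outside-the-area constraint" idea.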
FIG. 4B , a part of avideo area 703 is moved to the position outside adisplay area 701 and a part of thevideo area 703 cannot be consequently displayed. In this case, as illustrated inFIG. 4C , the change of the size of the video area can be executed to set theentire video area 703 within thedisplay area 701. -
FIG. 5 illustrates a flowchart of processing that appropriately changes the size of the video area in addition to the processing according to the first exemplary embodiment. Referring to FIG. 5, the processing in steps S501 to S506 and in step S509 is similar to the processing in steps S301 to S306 and in step S307 in FIG. 3; a description thereof is therefore omitted, and only the different points are described. - In step S507, the display
position setting unit 104 determines whether or not the video area after the change of the display position is outside the display area. When the display position setting unit 104 determines in step S507 that the video area after the change is outside the display area (YES in step S507), the display position setting unit 104 executes the processing in step S508. On the other hand, when the display position setting unit 104 determines in step S507 that the video area after the change is not outside the display area (NO in step S507), the display position setting unit 104 executes the processing in step S509. In step S508, the display position setting unit 104 changes the size of the video area so that the video area after the change of the display position falls within the display area, and then executes the processing in step S509. As a consequence, the video image can be displayed outside the ambient light irradiation area where the visibility is deteriorated, and the desired video image can be displayed within the display area. - According to the first exemplary embodiment, the light emitted by the
display unit 151 does not influence the measured illuminance. According to the present exemplary embodiment, processing for measuring the illuminance of the ambient light when the light emitted by the display unit 151 does influence the measured illuminance is described. -
FIG. 1B illustrates a functional block diagram of a functional configuration of the information processing apparatus 110 according to the second exemplary embodiment. The same reference numerals denote elements with the same functions as in the information processing apparatus 100, and a description thereof is omitted. A luminance detection unit 111 is a micro processing unit (MPU) and detects the luminance of the light emitted by the display unit 151. According to the present exemplary embodiment, the luminance value (a value obtained by quantizing the measured luminance) is calculated from the video signal output to the display unit 151. A determination unit 112 is a micro processing unit (MPU) and, when the calculated luminance value is a threshold value or less, determines whether the illuminance value of the partial area is a predetermined threshold value or less. More specifically, when the light emitted by the display unit 151 does not influence the illuminance (e.g., when the luminance at a scene change is low), the determination unit 112 determines whether the illuminance value of the partial area is a predetermined threshold value or less. -
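The gating performed by the determination unit 112 amounts to a simple luminance check before the illuminance determination is allowed to run. A minimal sketch follows; the quantized luminance scale and threshold value are assumptions (the patent does not specify them), as are the names.

```python
LUMINANCE_THRESHOLD = 2  # assumed quantized scale; not specified in the text

def should_measure(display_luminance, threshold=LUMINANCE_THRESHOLD):
    """Proceed with the illuminance-based repositioning only when the
    display's own emitted luminance is the threshold value or less, so the
    ambient-light measurement is not polluted by the screen itself."""
    return display_luminance <= threshold

# A dark scene (e.g., around a scene change) lets the measurement run;
# a bright scene skips this round and waits for the next interval.
print(should_measure(1), should_measure(7))  # True False
```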
FIG. 6 illustrates a flowchart of an example of processing for setting the video display position in the information processing apparatus 110. Referring to FIG. 6, the processing in steps S601 to S607 is similar to the processing in steps S301 to S307 in FIG. 3; a description thereof is thus omitted, and only the different points are described. In step S600, the luminance detection unit 111 determines whether the luminance value of the display unit 151 is a predetermined threshold value or less. When the luminance detection unit 111 determines in step S600 that the luminance value is the predetermined threshold value or less (YES in step S600), the luminance detection unit 111 executes the processing in step S601. On the other hand, when the luminance detection unit 111 determines in step S600 that the luminance value is not the predetermined threshold value or less (NO in step S600), the series of processing ends. The present processing is executed sequentially at predetermined time intervals or frame intervals. By performing the above-described processing, the video image is displayed outside the ambient light irradiation area even if the ambient light irradiation area where the visibility is deteriorated changes with the passage of time, thereby improving the visibility of the video content. - According to the first and second exemplary embodiments, depending on the illuminance level of the
display unit 151, the area for displaying the video image is changed. According to the present exemplary embodiment, the area for displaying the video image is changed based on the change in illuminance of thedisplay unit 151. Since the information processing apparatus according to the present exemplary embodiment is similar to theinformation processing apparatus 100, a description thereof is omitted. However, thedetermination unit 102 determines whether the time change of the illuminance value measured in the partial area is a predetermined threshold value or less. Thedetermination unit 102 stores the illuminance values corresponding to the number of past measurement times, and obtains the difference from the current illuminance value. - According to the third exemplary embodiment, a flowchart illustrating an example of processing for setting the video display position is similar to
FIG. 3, and a description thereof is thus omitted. However, in step S304, the determination unit 102 determines not whether the illuminance value of the partial area is a predetermined threshold value or less, but whether a value based on the time change in the illuminance value of the partial area is a predetermined threshold value or less. -
FIGS. 7A to 7D schematically illustrate processing for determining a value based on the time change in illuminance value. FIG. 7A illustrates the illuminance value of each partial area measured at time (t−2), two measurements before the current one (illuminance value a). FIG. 7B illustrates the illuminance value of each partial area measured at time (t−1), one measurement before the current one (illuminance value b). FIG. 7C illustrates the illuminance value of each partial area measured at time t, the current time (illuminance value c). - According to the present exemplary embodiment, with the following
Expression 1, a value (change rate L) based on the time change in illuminance value of the partial area is calculated. -
L=W1*(|b−c|/c)+W2*(|a−c|/c) (Expression 1) - Here, W1 and W2 are weighting coefficients.
- A partial area where the change rate L in illuminance value is a predetermined threshold value or more is an illuminance change area.
FIG. 7D illustrates the change rate L in illuminance when W1 and W2 are 10. Further, when a threshold for determination is 4, in step S304, thedetermination unit 102 observes that anarea 702 corresponding to the lower right portion of thedisplay area 701 has a rate of illuminance change that is the threshold value or more. By performing the processing, the video image is displayed while avoiding an area where ambient light may enter or may not enter the display area, thereby improving the visibility. - According to the first and second exemplary embodiments, the area for displaying the video image is changed depending on the illuminance level. According to the present exemplary embodiment, the video layout is changed depending on the illuminance level without changing the display area of the video image.
- The information processing apparatus according to the fourth exemplary embodiment is similar to the
information processing apparatus 100, and a description thereof is thus omitted. The display position setting unit 104 sets not only the video display position but also the position of a user interface (UI) or a subtitle superimposed on the video image. According to the present exemplary embodiment, the display position setting unit 104 determines the display position according to the priority of the video image or of the UI or subtitle superimposed on the video image. The priorities may be added in advance to the video data stored in the video image storage unit 153, or the priority may be added to the video image in the video input unit 103 based on a user operation input from a user interface (not shown) of the information processing apparatus 100. -
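The priority-ordered placement used in this embodiment can be sketched as follows. For brevity the sketch treats each display content as occupying a single partial area and places it at the free cell with the lowest coordinate value outside the ambient light irradiation area; the content names and priorities follow the FIG. 9 example, while the function names are assumptions of this rewrite.

```python
def assign_positions(contents, free_cells):
    """Place contents in priority order (a lower value means a higher
    priority), each at the free cell with the lowest coordinate value
    outside the ambient light irradiation area (scanning row by row)."""
    order = sorted(contents, key=lambda item: item[1])   # (name, priority)
    free = sorted(free_cells, key=lambda c: (c[1], c[0]))  # low y, then low x
    placed = {}
    for name, _ in order:
        placed[name] = free.pop(0)  # lowest remaining coordinate
    return placed

# FIG. 9A/9B example: the data broadcasting has the highest priority (1).
contents = [("video area 703", 3),
            ("menu 705", 2),
            ("data broadcasting 706", 1)]
free = {(x, y) for x in range(16) for y in range(9)} - \
       {(x, y) for x in range(11, 16) for y in range(5, 9)}  # irradiated block
layout = assign_positions(contents, free)
print(layout["data broadcasting 706"])  # (0, 0), as in the description
```

Higher-priority contents thereby land farthest from the irradiated region, matching the layout change from FIG. 9A to FIG. 9B described below.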
FIGS. 9A and 9B schematically illustrate the change of the video layout based on the measured illuminance values. FIG. 9A illustrates data broadcasting being displayed. The display area 701 displays a menu 705 of the data broadcasting and data broadcasting 706 as well as the video area 703 for displaying a video image of the broadcasting. A part of the display area 701 includes the ambient light irradiation area 702 to which the ambient light is emitted. According to the present exemplary embodiment, the priority is determined with respect to the video images by the user operation as follows. The user requests the data broadcasting display from the information processing apparatus 100 via a user interface. When the user requests the data broadcasting display, the possibility that the user desires to view the data broadcasting is high, and the display priority of the data broadcasting is therefore set to be high. - Since the display priority of the data broadcasting is set to be high, the
video input unit 103 sets the values of the priority of the data broadcasting 706, the menu 705 of the data broadcasting, and the video area 703 to 1, 2, and 3, respectively, thereby adding the priorities. The lower the value of the priority, the higher the priority. FIG. 8 illustrates a flowchart of example processing for setting the video display position in the information processing apparatus 110. The processing in steps S801 to S804 is similar to the processing in steps S301 to S304. The processing in steps S808 to S809 is similar to the processing in steps S306 to S307. Therefore, only the different points are described. - In step S805, the display
position setting unit 104 acquires the priorities added to the video images in the video input unit 103 based on the user operation. As described above, the display area 701 includes three display contents: the video area 703, the menu 705 of the data broadcasting, and the data broadcasting 706. In step S805, the display position setting unit 104 acquires the three priorities of the input video images. The values of the priority of the data broadcasting 706, the menu 705 of the data broadcasting, and the video area 703 are 1, 2, and 3, respectively, and the display position setting unit 104 therefore determines that the priority order is the data broadcasting 706, the menu 705 of the data broadcasting, and then the video area 703. - In step S806, the display
position setting unit 104 determines the display position of the display content with the highest priority. The video image with the highest priority is the data broadcasting 706, and the display position of the data broadcasting 706 is therefore determined first. In the determination method of the display position setting unit 104, an area showing an illuminance value of the threshold value or less and a low coordinate value is assigned as the display area. The data broadcasting 706 is therefore assigned to the lowest coordinate value (0,0) outside the ambient light irradiation area. - In step S807, the display
position setting unit 104 determines whether or not all the display positions have been determined. Since the display areas of the menu 705 of the data broadcasting and the video area 703 have not yet been assigned, the processing returns to step S806. In step S806, the display position setting unit 104 next determines the display position of the menu 705 of the data broadcasting, which has the next highest priority. An area adjacent to the data broadcasting 706, with the lowest coordinate value outside the ambient light irradiation area, is assigned to the menu 705 of the data broadcasting. - Steps S806 and S807 are repeated. After the positions of all the display contents are determined, the processing proceeds to step S808, and the
video output unit 105 outputs the video signal. Then, the display area 701 is displayed as illustrated in FIG. 9B. By performing this processing, when the user requests display of the data broadcasting, the data broadcasting is laid out outside the ambient light irradiation area so that the user can easily view the contents of the data broadcasting. The layout of the display contents is changed as described above, thereby displaying the display contents with high importance outside the ambient light irradiation area and improving the visibility. - Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (e.g., a computer-readable medium). -
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
- This application claims priority from Japanese Patent Application No. 2009-278958 filed Dec. 8, 2009 and No. 2010-227549 filed Oct. 7, 2010 which are hereby incorporated by reference herein in their entirety.
Claims (4)
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2009-278958 | 2009-12-08 | ||
| JP2009278958 | 2009-12-08 | ||
| JP2010227549A JP5761953B2 (en) | 2009-12-08 | 2010-10-07 | Information processing apparatus and control method thereof |
| JP2010-227549 | 2010-10-07 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20110134252A1 true US20110134252A1 (en) | 2011-06-09 |
| US8953048B2 US8953048B2 (en) | 2015-02-10 |
Family
ID=44081644
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/960,384 Expired - Fee Related US8953048B2 (en) | 2009-12-08 | 2010-12-03 | Information processing apparatus and control method thereof |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US8953048B2 (en) |
| JP (1) | JP5761953B2 (en) |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6102215B2 (en) * | 2011-12-21 | 2017-03-29 | 株式会社リコー | Image processing apparatus, image processing method, and program |
| US20150163410A1 (en) * | 2013-12-10 | 2015-06-11 | Semiconductor Energy Laboratory Co., Ltd. | Display Device and Electronic Device |
| US10636384B2 (en) | 2014-04-04 | 2020-04-28 | Sony Corporation | Image processing apparatus and image processing method |
| JP6365361B2 (en) * | 2015-03-12 | 2018-08-01 | トヨタ自動車株式会社 | Information display device |
| JP2019090858A (en) * | 2017-11-10 | 2019-06-13 | キヤノン株式会社 | Display device, display controller and display control method |
| WO2024142522A1 (en) * | 2022-12-27 | 2024-07-04 | パナソニックIpマネジメント株式会社 | Video processing system, video processing method, and program |
Family Cites Families (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06308892A (en) | 1993-04-23 | 1994-11-04 | Matsushita Electric Ind Co Ltd | Display device |
| JPH086532A (en) * | 1994-06-20 | 1996-01-12 | Hitachi Ltd | Multi-screen display device |
| JPH1165810A (en) * | 1997-08-27 | 1999-03-09 | Nippon Telegr & Teleph Corp <Ntt> | Screen display automatic switching method, recording medium recording this method, and terminal device |
| JP2000089738A (en) * | 1998-09-14 | 2000-03-31 | Mitsubishi Electric Corp | Multi-panel large-screen display device and its degraded display method |
| JP2002215136A (en) * | 2001-01-18 | 2002-07-31 | Kawai Musical Instr Mfg Co Ltd | Electronic musical instrument display |
| JP2007183449A (en) * | 2006-01-10 | 2007-07-19 | Sanyo Electric Co Ltd | Liquid crystal display device and timing signal adjustment method therefor |
| JP2008022115A (en) * | 2006-07-11 | 2008-01-31 | Sharp Corp | Digital television receiver |
| JP2008233379A (en) * | 2007-03-19 | 2008-10-02 | Sharp Corp | Liquid crystal display |
| JP2008285105A (en) * | 2007-05-21 | 2008-11-27 | Tokai Rika Co Ltd | Information display device |
| JP2009049512A (en) * | 2007-08-14 | 2009-03-05 | Toshiba Corp | Screen display processing apparatus and method |
| JP2009181501A (en) * | 2008-01-31 | 2009-08-13 | Toshiba Corp | Mobile communication equipment |
| JP2011013515A (en) * | 2009-07-03 | 2011-01-20 | J&K Car Electronics Corp | Display device, program, and display method |
- 2010-10-07 JP JP2010227549A patent/JP5761953B2/en not_active Expired - Fee Related
- 2010-12-03 US US12/960,384 patent/US8953048B2/en not_active Expired - Fee Related
Patent Citations (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5751283A (en) * | 1996-07-17 | 1998-05-12 | Microsoft Corporation | Resizing a window and an object on a display screen |
| US6654035B1 (en) * | 1997-12-15 | 2003-11-25 | International Business Machines Corporation | Computer system and method of manipulating a graphical user interface component on a computer display through collision with a pointer |
| US6657637B1 (en) * | 1998-07-30 | 2003-12-02 | Matsushita Electric Industrial Co., Ltd. | Moving image combining apparatus combining computer graphic image and at least one video sequence composed of a plurality of video frames |
| US7206029B2 (en) * | 2000-12-15 | 2007-04-17 | Koninklijke Philips Electronics N.V. | Picture-in-picture repositioning and/or resizing based on video content analysis |
| US20090289951A1 (en) * | 2001-07-26 | 2009-11-26 | Seiko Epson Corporation | Environment-compliant image display system, projector, and program |
| US20030214586A1 (en) * | 2002-05-18 | 2003-11-20 | Lg.Philips Lcd Co., Ltd. | Image quality analysis method and system for a display device |
| US20030231161A1 (en) * | 2002-06-17 | 2003-12-18 | Fuji Photo Film Co., Ltd. | Image display device |
| US20050200912A1 (en) * | 2004-02-26 | 2005-09-15 | Hitoshi Yamakado | Image arrangement for electronic album |
| US20050259112A1 (en) * | 2004-05-24 | 2005-11-24 | Hajime Suzukawa | Information processing apparatus and display control method for information processing apparatus |
| US20060197851A1 (en) * | 2005-03-07 | 2006-09-07 | Paul Vlahos | Positioning a subject with respect to a background scene in a digital camera |
| US20060265592A1 (en) * | 2005-04-28 | 2006-11-23 | Masaki Tsuchida | Television broadcast receiver and television broadcast receiving method |
| US20070285379A1 (en) * | 2006-06-09 | 2007-12-13 | Samsung Electronics Co., Ltd. | Liquid crystal display and method of adjusting brightness for the same |
| US20080111922A1 (en) * | 2006-07-28 | 2008-05-15 | International Business Machines Corporation | Mapping of presentation material |
| US20090256974A1 (en) * | 2006-08-29 | 2009-10-15 | Panasonic Corporation | Image display method and image display device |
| US7631974B2 (en) * | 2006-08-29 | 2009-12-15 | Panasonic Corporation | Image display method and image display device |
| US20090141180A1 (en) * | 2007-11-30 | 2009-06-04 | Sony Corporation | Transmitting device, receiving device, and method for transmitting operational information in receiving device |
| US20090273661A1 (en) * | 2008-04-30 | 2009-11-05 | Mauchly J William | Method of lighting |
| US8400547B2 (en) * | 2008-11-05 | 2013-03-19 | Sony Corporation | Imaging apparatus and display control method in imaging apparatus |
| US20120062621A1 (en) * | 2009-08-28 | 2012-03-15 | Mitsubishi Electric Corporation | Brightness adjusting device |
Cited By (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130033467A1 (en) * | 2010-04-06 | 2013-02-07 | Yukihide Kohtoku | Display device, liquid crystal module, and image display system |
| US8841865B2 (en) * | 2011-03-21 | 2014-09-23 | Lg Electronics Inc. | Lighting system and method for controlling the same |
| US20120242254A1 (en) * | 2011-03-21 | 2012-09-27 | Changho Kim | Lighting system and method for controlling the same |
| EP2722003A1 (en) * | 2012-10-18 | 2014-04-23 | Dental Imaging Technologies Corporation | Light box effect for viewing digital radiographic images |
| US9142196B2 (en) | 2012-10-18 | 2015-09-22 | Dental Imaging Technologies Corporation | Light box effect for viewing digital radiographic images |
| EP2854122A1 (en) * | 2013-09-25 | 2015-04-01 | Samsung Electronics Co., Ltd | Adjusting light emitting pixels |
| US10056021B2 (en) | 2013-09-25 | 2018-08-21 | Samsung Electronics Co., Ltd. | Method and apparatus for adjusting light-emitting pixels using light-receiving pixels |
| US20160140907A1 (en) * | 2014-01-22 | 2016-05-19 | Sakai Display Products Corporation | Display Apparatus |
| US10127889B2 (en) * | 2015-06-03 | 2018-11-13 | Samsung Electronics Co., Ltd. | Display system for enhancing visibility and methods thereof |
| WO2016195301A1 (en) * | 2015-06-03 | 2016-12-08 | Samsung Electronics Co., Ltd. | Display system for enhancing visibility and methods thereof |
| US20160358582A1 (en) * | 2015-06-03 | 2016-12-08 | Samsung Electronics Co., Ltd. | Display system for enhancing visibility and methods thereof |
| CN105630447A (en) * | 2015-12-24 | 2016-06-01 | 小米科技有限责任公司 | Method and device for adjusting word display |
| WO2017131410A1 (en) * | 2016-01-29 | 2017-08-03 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling the same |
| CN108604432A (en) * | 2016-01-29 | 2018-09-28 | 三星电子株式会社 | Electronic equipment and method for controlling it |
| CN108604432B (en) * | 2016-01-29 | 2022-04-05 | 三星电子株式会社 | Electronic device and method for controlling the same |
| US11574611B2 (en) | 2016-01-29 | 2023-02-07 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling the same |
| US20180350323A1 (en) * | 2017-06-01 | 2018-12-06 | Qualcomm Incorporated | Adjusting color palettes used for displaying images on a display device based on ambient light levels |
| US10446114B2 (en) * | 2017-06-01 | 2019-10-15 | Qualcomm Incorporated | Adjusting color palettes used for displaying images on a display device based on ambient light levels |
| US11263999B2 (en) * | 2017-06-08 | 2022-03-01 | Canon Kabushiki Kaisha | Image processing device and control method therefor |
| CN110069102A (en) * | 2019-04-29 | 2019-07-30 | 努比亚技术有限公司 | A kind of display area regulation method, equipment and computer readable storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5761953B2 (en) | 2015-08-12 |
| US8953048B2 (en) | 2015-02-10 |
| JP2011141864A (en) | 2011-07-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8953048B2 (en) | Information processing apparatus and control method thereof | |
| US20170229099A1 (en) | Display apparatus and method for controlling display apparatus | |
| US8917277B2 (en) | Animation control device, animation control method, program, and integrated circuit | |
| JP5768639B2 (en) | Pointer control device, projector and program | |
| US9640142B2 (en) | Apparatus for detecting region of interest and method thereof | |
| US9906762B2 (en) | Communication apparatus, method of controlling communication apparatus, non-transitory computer-readable storage medium | |
| US9172868B2 (en) | Imaging device, imaging method and storage medium for combining images consecutively captured while moving | |
| CA2894197A1 (en) | Method of implementing screen adaptation for owner-drawn elements and apparatus | |
| EP3043343A1 (en) | Information processing device, information processing method, and program | |
| US20140143691A1 (en) | User interface generating apparatus and associated method | |
| JP2013074525A (en) | Projector control device and program | |
| US10812764B2 (en) | Display apparatus, display system, and method for controlling display apparatus | |
| JP5152317B2 (en) | Presentation control apparatus and program | |
| US20160321968A1 (en) | Information processing method and electronic device | |
| TWI547938B (en) | Display device and image display method | |
| JP6091133B2 (en) | Projection type display device, control method used therefor, and program | |
| CN102625066A (en) | Image processing apparatus and image processing method | |
| CN103853318A (en) | User interface generating device and relevant method | |
| CN114979599A (en) | Laser projection apparatus and projected image correction method | |
| JP2011053397A (en) | Image display apparatus and image adjusting method | |
| JP2010015501A (en) | Image display device | |
| CN106713966B (en) | The display control method and display control program of terminal | |
| KR102284358B1 (en) | Projector having at least two data communication channel and method for controlling the same | |
| KR20170010473A (en) | Device for displaying image of digital photo frame, method for displaying of the digital photo frame | |
| JP6304135B2 (en) | Pointer control device, projector and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: CANON KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FURUKAWA, TAKESHI;REEL/FRAME:026035/0661 Effective date: 20101118 |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551) Year of fee payment: 4 |
|
| FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
| FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20230210 |