WO2010007988A1 - Data transmission device, data transmission method, viewing environment control device, viewing environment control system, and viewing environment control method - Google Patents
- Publication number
- WO2010007988A1 (PCT/JP2009/062737)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- viewing environment
- peripheral devices
- illumination
- audio
- Prior art date
- Legal status
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/16—Analogue secrecy systems; Analogue subscription systems
- H04N7/173—Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4131—Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4345—Extraction or processing of SI, e.g. extracting service information from an MPEG stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8126—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
- H04N21/8133—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8166—Monomedia components thereof involving executable data, e.g. software
- H04N21/8173—End-user applications, e.g. Web browser, game
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8166—Monomedia components thereof involving executable data, e.g. software
- H04N21/8186—Monomedia components thereof involving executable data, e.g. software specially adapted to be executed by a peripheral of the client device, e.g. by a reprogrammable remote control
Definitions
- The present invention relates to a data transmission device, a data transmission method, a viewing environment control device, a viewing environment control system, and a viewing environment control method, and in particular to controlling peripheral devices in a user's viewing environment space in order to realize viewing of video/audio content with a high sense of presence.
- Patent Document 1 discloses a technique for linking an image displayed on such a display with illumination light of an illumination device.
- In order to provide a high sense of presence, Patent Document 1 describes a lighting system that controls a plurality of lighting devices in conjunction with the video being displayed, and a method of generating illumination control data for those devices from feature amounts (representative color, average luminance) of the video data. Specifically, a feature amount of the video data is detected in a predetermined screen area corresponding to the installation position of each lighting device, and illumination control data for that lighting device is generated from the detected feature amount.
- It is also described that the illumination control data is not limited to data calculated from feature amounts of the video data: it may be distributed alone or together with the video data, distributed via the Internet, or distributed by carrier wave.
- Patent Document 1 describes that illumination control data may be distributed from the outside via the Internet or the like.
- However, such illumination control data corresponds only to a predetermined layout of lighting devices (their installation positions in a virtual viewing environment space). Since device layouts vary widely from user to user, there is a problem that appropriate illumination control cannot be performed according to the arrangement layout of each user's actual devices.
- The present invention has been made in view of the above problems of the prior art, and an object thereof is to provide a data transmission device, a data transmission method, a viewing environment control device, a viewing environment control system, and a viewing environment control method that enable appropriate control of peripheral devices according to the layout of those devices in the actual viewing environment space.
- A first technical means of the present invention is a data transmission device that transmits video data and/or audio data, comprising transmission means for transmitting, in addition to the video data and/or audio data, identification information specifying an arrangement pattern of peripheral devices in a virtual viewing environment space and viewing environment control data for the peripheral devices in that space.
- A second technical means is characterized in that the arrangement pattern of peripheral devices indicated by the identification information includes an arrangement pattern in which peripheral devices are installed on the ceiling in the virtual viewing environment space.
- A third technical means is characterized in that the arrangement pattern of peripheral devices indicated by the identification information includes an arrangement pattern in which peripheral devices are installed on the left side around the video display device that displays the video data in the virtual viewing environment space.
- A fourth technical means is characterized in that the arrangement pattern of peripheral devices indicated by the identification information includes an arrangement pattern in which peripheral devices are installed on the right side around the video display device that displays the video data in the virtual viewing environment space.
- A fifth technical means is characterized in that the arrangement pattern of peripheral devices indicated by the identification information includes an arrangement pattern in which peripheral devices are installed on the back around the video display device that displays the video data in the virtual viewing environment space.
- A sixth technical means is characterized in that the viewing environment control data includes position information indicating the installation positions of the peripheral devices constituting the arrangement pattern.
- A seventh technical means is characterized in that the viewing environment control data includes position information indicating the installation directions of the peripheral devices constituting the arrangement pattern.
- An eighth technical means, according to any one of the first to seventh technical means, is characterized in that the viewing environment control data includes information indicating a driving priority order for the peripheral devices.
- A ninth technical means is characterized in that the viewing environment control data includes mode information indicating a description method of the viewing environment control data for the peripheral devices.
- the tenth technical means is the ninth technical means, characterized in that the mode information includes information indicating that a drive control value for the peripheral device is described by an absolute value.
- An eleventh technical means is the ninth technical means, characterized in that the mode information includes information indicating that a drive control value for the peripheral device is described by a difference value from the drive control value for another designated peripheral device.
- A twelfth technical means is the ninth technical means, characterized in that the mode information includes information indicating that a drive control value for the peripheral device is described by a ratio value with the drive control value for another designated peripheral device.
- A thirteenth technical means is characterized in that the mode information includes information indicating that a drive control value for the peripheral device is the same as the drive control value for another designated peripheral device.
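The ninth to thirteenth technical means describe four ways a drive control value can be encoded relative to a reference channel. As a minimal sketch (the mode names and the `resolve_value` helper are assumptions for illustration, not from the patent text), a receiver might resolve them as follows:

```python
# Hypothetical sketch of resolving a drive control value from mode information.
# Mode constants and function names are illustrative only.

ABSOLUTE, DIFFERENCE, RATIO, SAME = range(4)

def resolve_value(mode, value, reference=None):
    """Return the effective drive control value for a peripheral device.

    mode      -- how the value is described (cf. the ninth technical means)
    value     -- the described drive control value (unused for SAME)
    reference -- drive control value of the designated reference channel
    """
    if mode == ABSOLUTE:
        return value                 # tenth means: absolute value
    if mode == DIFFERENCE:
        return reference + value     # eleventh means: difference from reference Ch
    if mode == RATIO:
        return reference * value     # twelfth means: ratio with reference Ch
    if mode == SAME:
        return reference             # thirteenth means: same as reference Ch
    raise ValueError("unknown mode")
```

For example, `resolve_value(DIFFERENCE, -20, reference=200)` yields 180.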
- A fourteenth technical means is a data transmission device comprising: storage means for storing, in association with video data and/or audio data, identification information indicating an arrangement pattern of peripheral devices in the virtual viewing environment space and viewing environment control data for the peripheral devices in that space; and transmission means for transmitting, on receiving a transmission request from an external device, the identification information and viewing environment control data relating to the requested video data and/or audio data to the external device that issued the request.
- the fifteenth technical means according to any one of the first to fourteenth technical means is characterized in that the peripheral device in the virtual viewing environment space is a lighting device.
- the sixteenth technical means according to any one of the first to fourteenth technical means is characterized in that the peripheral device in the virtual viewing environment space is a blower.
- A seventeenth technical means is a viewing environment control device comprising: receiving means for receiving video data and/or audio data together with identification information indicating an arrangement pattern of peripheral devices in the virtual viewing environment space and viewing environment control data for the peripheral devices in that space; storage means for storing device arrangement information representing the arrangement pattern of peripheral devices in the actual viewing environment space; and conversion means for converting, using the received identification information and the stored device arrangement information, the viewing environment control data into drive control data for driving and controlling the peripheral devices in the actual viewing environment space.
- the eighteenth technical means is characterized in that, in the seventeenth technical means, the viewing environment control data includes position information indicating installation positions of peripheral devices constituting the arrangement pattern.
- A nineteenth technical means is characterized in that the viewing environment control data includes position information indicating the installation directions of the peripheral devices constituting the arrangement pattern.
- A twentieth technical means is a viewing environment control system comprising the seventeenth technical means, a video/audio reproduction device that reproduces the video data and/or audio data, and peripheral devices installed around the video/audio reproduction device.
- A twenty-first technical means is a data transmission method for transmitting video data and/or audio data, wherein identification information indicating an arrangement pattern of peripheral devices in the virtual viewing environment space and viewing environment control data for the peripheral devices in that space are added to the video data and/or audio data and transmitted.
- A twenty-second technical means is a data transmission method comprising: storing, in association with video data and/or audio data, identification information indicating an arrangement pattern of peripheral devices in the virtual viewing environment space and viewing environment control data for the peripheral devices in that space; receiving a transmission request from an external device; and transmitting the identification information and viewing environment control data relating to the requested video data and/or audio data to the external device that issued the request.
- A twenty-third technical means is a viewing environment control method comprising: a step of receiving video data and/or audio data; a step of receiving identification information indicating an arrangement pattern of peripheral devices in the virtual viewing environment space and viewing environment control data for the peripheral devices in that space; a step of storing device arrangement information representing the arrangement pattern of peripheral devices in the actual viewing environment space; and a step of converting, using the received identification information and the stored device arrangement information, the viewing environment control data into drive control data for driving and controlling the peripheral devices in the actual viewing environment space.
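The steps of the twenty-third technical means can be outlined as a small pipeline. This is an illustrative sketch only; the class name, field names, and the simple channel-filtering conversion are assumptions, not the patent's implementation:

```python
# Illustrative outline of the viewing environment control method
# (receive -> store actual arrangement -> convert to drive control data).

class ViewingEnvironmentController:
    def __init__(self, actual_arrangement):
        # Step: store device arrangement information for the ACTUAL viewing space.
        self.actual_arrangement = actual_arrangement  # e.g. {"Ch1": "display-back"}

    def receive(self, av_data, channel_type_id, control_data):
        # Steps: receive video/audio data, identification information, and
        # viewing environment control data, then convert.
        drive_data = self.convert(channel_type_id, control_data)
        return av_data, drive_data

    def convert(self, channel_type_id, control_data):
        # Step: convert control data defined for the VIRTUAL arrangement
        # (identified by channel_type_id) into drive control data for the
        # devices actually present. Here, simply keep channels that exist locally.
        return {ch: v for ch, v in control_data.items()
                if ch in self.actual_arrangement}
```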
- According to the present invention, identification information indicating an arrangement pattern of peripheral devices arranged in a virtual viewing environment space, and viewing environment control data for the peripheral devices in that space, are provided together with the content.
- Hereinafter, the data transmission device, the data transmission method, the viewing environment control device, and the viewing environment control system according to an embodiment of the present invention will be described, mainly using a lighting device as an example of the peripheral device arranged in the viewing environment space.
- However, the peripheral device is not limited to a lighting device as long as it is a device that controls the viewing environment; the invention can also be applied to an air conditioner, a blower, a vibration device, a scent generator, and the like.
- FIG. 1 is a block diagram showing a schematic configuration example of a data transmission apparatus according to an embodiment of the present invention.
- the data transmission device 10 in this embodiment includes a data multiplexing unit 11 and a transmission unit 12.
- the input video data is compression encoded and output to the data multiplexing unit 11.
- Various compression schemes such as ISO/IEC 13818-2 (MPEG-2 Video), ISO/IEC 14496-2 (MPEG-4 Visual), and ISO/IEC 14496-10 (MPEG-4 AVC) are available for video encoding.
- the input audio data is compression-coded and output to the data multiplexing unit 11.
- Various compression schemes such as ISO / IEC 13818-7 (MPEG-2 AAC) and ISO / IEC 14496-3 (MPEG-4 Audio) can be used for audio encoding.
- the identification information and the illumination control data are compression encoded and output to the data multiplexing unit 11. Details of the identification information and the illumination control data will be described later.
- For describing the identification information and the illumination control data, an XML (Extensible Markup Language) format, for example, is used.
- For compression encoding, the BiM (Binary format for MPEG-7) method in ISO/IEC 15938-1 (MPEG-7 Systems) can be used.
- Alternatively, the data may be output in XML format without being compressed.
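As an illustrative sketch of such an XML description, the following uses Python's standard library; the element and attribute names (`ViewingEnvironmentControl`, `ChannelTypeID`, `Illumination`, etc.) are invented for illustration and are not a schema specified by the patent:

```python
# Minimal sketch: serialize identification information plus per-channel
# illumination control data (RGB values) as XML. The schema is hypothetical.
import xml.etree.ElementTree as ET

def build_control_xml(channel_type_id, channels):
    root = ET.Element("ViewingEnvironmentControl")
    ET.SubElement(root, "ChannelTypeID").text = str(channel_type_id)
    for ch, (r, g, b) in channels.items():
        # One element per illumination channel, RGB as attributes.
        ET.SubElement(root, "Illumination",
                      channel=ch, r=str(r), g=str(g), b=str(b))
    return ET.tostring(root, encoding="unicode")

xml_text = build_control_xml(1, {"Ch1": (255, 220, 180), "Ch2": (200, 200, 200)})
```

A description like this could then be BiM-compressed or transmitted as plain XML, as noted above.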
- The encoded video data, audio data, identification information, and illumination control data are multiplexed by the data multiplexing unit 11 and transmitted or stored via the transmission unit 12.
- For multiplexing, an MPEG-2 transport stream packet (TSP), an IP packet, an RTP packet, or the like, as defined in ISO/IEC 13818-1 (MPEG-2 Systems), can be used.
- For example, the viewing environment control data (identification information and illumination control data) can be described in an extension header portion following the header in which the information defined by MPEG-2 is described.
- The video data and the audio data can then be transmitted in the payload following the extension header. Alternatively, the identification information and the illumination control data may themselves be transmitted in a payload, like the video data and audio data.
- different data streams of video data, audio data, identification information, and illumination control data may be multiplexed and transmitted.
- The identification information indicates an arrangement pattern of peripheral devices in the virtual viewing environment space. In the case of lighting devices, it may include the arrangement locations of the lighting devices, information on the lighting method used, information on where each device illuminates, and information indicating the irradiation direction, the irradiation angle, and the like.
- FIG. 2 shows an example in which the number of channels of lighting devices, the installation location of the lighting device for each channel number, and the lighting method are defined as identification information (channel type ID) for the arrangement pattern of the lighting devices.
- In FIG. 2, when the channel type ID is “1”, there are two lighting devices, Ch1 and Ch2: Ch1 illuminates the back surface (periphery) of the video display device (display) (indirect illumination), and Ch2 illuminates downward from the ceiling (direct illumination).
- FIG. 3 shows the arrangement pattern of the lighting devices 32 (Ch1 and Ch2) with respect to the display 30 defined when the channel type ID is “1”: Ch1 is arranged at the lower part of the display 30, and Ch2 on the ceiling.
- FIG. 4 shows the viewing environment space corresponding to a channel type ID of “1”: the rear surface (periphery) of the display 30 is illuminated by Ch1, and the entire space is illuminated from the ceiling by Ch2.
- In FIG. 2, when the channel type ID is “2”, there are two lighting devices, Ch1 and Ch2: Ch1 illuminates the left side of the back of the display (indirect illumination), and Ch2 illuminates the right side of the back of the display (indirect illumination).
- FIG. 5 shows the arrangement pattern of the lighting devices 32 (Ch1 and Ch2) with respect to the display 30 defined when the channel type ID is “2”: Ch1 is arranged on the left side of the display 30, and Ch2 on its right side.
- FIG. 6 shows the viewing environment space corresponding to a channel type ID of “2”: the left rear surface of the display 30 is illuminated by Ch1, and the right rear surface of the display 30 by Ch2.
- The identification information (channel type ID) is allocated 8 bits (256 patterns), so that other illumination arrangement patterns can also be defined.
- the number of illumination channels is not limited to two, and the arrangement pattern may be defined for one or more channels.
- For example, an arrangement in which the lighting devices are installed directly on the left and right sides of the display, as in the illustrated case, may be defined as channel type ID “3”.
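The channel-type definitions above can be sketched as a lookup table. Only IDs “1” and “2” are specified in the text; the dictionary layout and field names below are assumptions for illustration:

```python
# Sketch of a channel type ID table following FIGS. 2-6 (hypothetical layout).

CHANNEL_TYPE_TABLE = {
    1: {  # two devices: back of display (indirect) + ceiling (direct)
        "Ch1": {"place": "display-back", "method": "indirect"},
        "Ch2": {"place": "ceiling",      "method": "direct"},
    },
    2: {  # two devices: back-left and back-right of the display, both indirect
        "Ch1": {"place": "display-back-left",  "method": "indirect"},
        "Ch2": {"place": "display-back-right", "method": "indirect"},
    },
    # The 8-bit ID leaves room for up to 256 patterns in total.
}

def lookup_arrangement(channel_type_id):
    """Return the virtual arrangement for an ID, or None if undefined."""
    return CHANNEL_TYPE_TABLE.get(channel_type_id)
```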
- The illumination control data, described in detail later, is the viewing environment control data for the lighting devices arranged in the virtual viewing environment space: it is control data for driving and controlling the lighting device of each channel of the arrangement pattern defined by the identification information. The arrangement pattern indicated by the identification information can therefore be said to represent the viewing environment on which the illumination control data is based.
- The illumination control data is provided in conjunction with the video/audio data, but it need not be added for each frame of the video data; it may be added for each scene or shot having a story connection, or added to the video data regularly or irregularly at an appropriate interval.
- In the above description, four types of data (video data, audio data, identification information, and illumination control data) are multiplexed and transmitted as broadcast data, but multiplexing is not an essential requirement; an appropriate transmission method may be selected as necessary. The respective data may be transmitted separately without being multiplexed, or the video data and audio data may be multiplexed while the identification information and illumination control data are transmitted independently.
- Alternatively, the identification information and the illumination control data may be accumulated in an external server device accessible via the Internet or the like, and a URL (Uniform Resource Locator) or similar information identifying the accumulated data may be multiplexed with the video data and transmitted.
- When the identification information and the illumination control data are transmitted on a network different from that of the video/audio multiplexed data, the information associating them with the video data is not limited to the above-mentioned URL; any information that can identify the correspondence, such as a CRID (Content Reference ID) in the TV-Anytime standard or a content name, may be used.
- only the identification information and the illumination control data may be recorded on a separate recording medium and distributed.
- video / audio data may be distributed on a large-capacity recording medium such as a Blu-ray Disc or DVD, and identification information and illumination control data may be distributed on a small semiconductor recording medium.
- the identification information and the illumination control data are handled as separate data, but it goes without saying that they may be described in one data format including the data contents of both.
- FIG. 7 is a block diagram showing a schematic configuration example of a viewing environment control system according to an embodiment of the present invention.
- In FIG. 7, 20 is a data receiving device, 30 is a video display device (hereinafter referred to as a display), 31 is an audio reproducing device, and 32 is a lighting device.
- The data receiving device 20 includes a receiving unit 21, a data separation unit 22, delay generation units 23a and 23b, an illumination dimming data generation unit 24 serving as a drive control data generation unit, and an illumination arrangement information storage unit 25 serving as a unit for storing device arrangement information.
- The data receiving device 20 receives, at the receiving unit 21, broadcast data in which video data, audio data, identification information, and illumination control data are multiplexed, and the data separation unit 22 separates the video data, audio data, identification information, and illumination control data from the broadcast data.
- The video data and audio data separated by the data separation unit 22 are passed, together with a TC (time code) indicating their start times, to the delay generation units 23a and 23b: the video data is sent to the video display device 30 via the delay generator 23a, and the audio data is sent to the audio reproduction device 31 via the delay generator 23b.
- the identification information and the illumination control data separated in the data separation unit 22 are sent to the illumination dimming data generation unit 24.
- The illumination arrangement information storage unit 25 stores arrangement information of each lighting device 32 installed in the viewer's viewing environment space (real space); this information is sent to the illumination dimming data generation unit 24 as appropriate in accordance with a command from that unit.
- For example, when two lighting devices 32 are installed around the video display device 30, with lighting device Ch1 being floor-standing indirect illumination and lighting device Ch2 being ceiling-mounted direct illumination, the illumination arrangement information storage unit 25 needs to store information on the number of lighting devices, their relative positions with respect to the video display device 30, and their lighting methods, so that each lighting device 32 can be controlled individually according to its installation position.
- Specifically, the illumination arrangement information storage unit 25 may be configured to hold, for each lighting device identifier, information on the relative position with respect to the video display device 30 and the lighting method.
- The illumination dimming data generation unit 24 generates, based on the identification information and illumination control data separated by the data separation unit 22 and on the arrangement information of the lighting devices 32 in the actual viewing environment space acquired from the illumination arrangement information storage unit 25, approximate drive control data (illumination dimming data, here RGB data) for driving and controlling the lighting devices 32 actually installed in the viewer's space, and outputs it to the lighting devices 32. For example, when left and right illumination control data are received but only one lighting device is present at the center, the average value of the two illumination control data is used as an approximation of the actual illumination control data.
- The delay generation units 23a and 23b delay the video data and audio data separated by the data separation unit 22 by the time required for this conversion, so that they are synchronized with the illumination dimming data.
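The averaging approximation described above (left and right virtual channels mapped onto a single real device) can be sketched as follows; the function name and data layout are illustrative assumptions:

```python
# Sketch: average per-channel RGB control values into one drive value when the
# actual viewing space has a single centre lighting device.

def approximate_single_device(control_data):
    """control_data -- {"Ch1": (r, g, b), "Ch2": (r, g, b), ...} for the
    virtual arrangement (e.g. channel type ID "2")."""
    n = len(control_data)
    sums = [0, 0, 0]
    for rgb in control_data.values():
        for i in range(3):
            sums[i] += rgb[i]
    # Component-wise average, rounded to integer RGB drive values.
    return tuple(round(s / n) for s in sums)

drive_rgb = approximate_single_device({"Ch1": (200, 100, 50), "Ch2": (100, 200, 150)})
# drive_rgb is the single averaged value sent to the real device
```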
- For the lighting device 32, for example, LED light sources of R (red), G (green), and B (blue), each of whose emission can be controlled independently, can be used; these three primary-color LED light sources emit illumination light having a desired color and brightness.
- The lighting device 32 only needs to be able to control the illumination color and brightness of the surrounding environment of the video display device 30, and is not limited to the combination of LED light sources emitting predetermined colors described above: a white LED and a color filter may be used, or a combination of a white light bulb or fluorescent tube and a color filter, a color lamp, or the like may be applied.
- The expression of the illumination control data is not limited to R (red), G (green), and B (blue) values; it may be expressed using, for example, an illumination color temperature (unit: K).
- FIG. 7 shows a case where the illumination device 32 is driven by RGB data.
- the data receiving device 20 of the viewing environment control system may be provided integrally with the video display device 30 and the audio playback device 31, or may be provided separately from each other.
- the data receiving device 20 in the present embodiment approximates the drive control data based on the identification information and the illumination control data acquired from the outside, and thereby the illumination device 32 installed in the actual viewing environment space. Can be appropriately controlled.
- FIG. 9 shows an example of the description content of the illumination control data.
- a channel type ID (identification information representing the arrangement pattern of one or more lighting devices in the virtual viewing environment space);
- priority information (shown in FIG. 10);
- mode information (shown in FIG. 11);
- a reference channel (reference Ch);
- illumination control values such as illumination color temperature information.
- Photometric quantities such as luminous intensity in candela (cd), luminous flux in lumen (lm), and illuminance in lux (lx) may also be used.
- As the color expression, not only the color temperature but also an XYZ color system, an RGB color system, a YCbCr color system, or the like may be used. All of these pieces of information are useful for producing the atmosphere and presence of a scene of the video data with illumination light.
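Where another color system is preferred, control values can be converted between systems. As a sketch, here is a standard full-range BT.601 RGB-to-YCbCr conversion (a well-known formula, not taken from the patent):

```python
# Full-range BT.601 RGB -> YCbCr conversion (standard coefficients).

def rgb_to_ycbcr(r, g, b):
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b + 128
    cr =  0.500 * r - 0.419 * g - 0.081 * b + 128
    return round(y), round(cb), round(cr)
```

Neutral grays map to Cb = Cr = 128, as expected for a chroma-centered representation.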
- In general, the human visual field is classified into a discrimination visual field 101, an effective visual field 102, a guidance visual field 103, and an auxiliary visual field 104 according to visual function.
- the discrimination visual field 101 is a range in which high-density information such as graphic identification can be accurately received
- the effective visual field 102 is a range in which natural information can be received only by eye movement although the discrimination ability is lower than that of the discrimination visual field 101.
- the guidance visual field 103 has only a recognition ability that allows the presence of a presenting stimulus and simple identification, but has an influence when determining overall external information.
- the auxiliary visual field 104 is a range in which only the presence of a stimulus can be determined.
- the current high-definition television is designed to display an image in a range covering the effective visual field 102. That is, no information such as video or illumination is presented in the guidance visual field 103 and the auxiliary visual field 104. It can therefore be expected that the sense of presence is further enhanced by irradiating illumination into the guidance visual field 103 and the auxiliary visual field 104.
- for this purpose, the back surface of the display 30 is illuminated so that the guidance visual field and the auxiliary visual field around the display 30 are covered with illumination, thereby providing a sense of reality.
- FIG. 10 is a diagram illustrating an example of priority information.
- for example, the priority is set in five levels (low, slightly low, normal, slightly high, high), making it possible to light only the illumination with high priority. As a result, even when the number and placement of lighting devices on the receiving side are limited and the lighting arrangement pattern in the virtual viewing environment space differs from the lighting arrangement in the actual viewing environment space, by referring to the illumination value of the lighting with the highest priority, the receiving side can realize the illumination situation that the transmitting side desires to light preferentially.
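The priority mechanism above can be sketched as a simple selection step on the receiving side: with fewer real lights than virtual channels, keep only the highest-priority channels. The function and data names are hypothetical.

```python
# Hypothetical sketch: with limited lights on the receiving side, light only the
# highest-priority channels described by the transmitting side.
def select_channels(controls, available_lights):
    """controls: list of (channel_id, priority); keep the top `available_lights` by priority."""
    ranked = sorted(controls, key=lambda c: c[1], reverse=True)
    return [cid for cid, _ in ranked[:available_lights]]

# Four virtual channels but only two real lights: the two highest priorities win.
chans = [("a", 5), ("b", 3), ("c", 4), ("d", 1)]
print(select_channels(chans, 2))  # ['a', 'c']
```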
- FIG. 11 is a diagram illustrating an example of mode information.
- the mode information indicates how the brightness and color of a plurality of lighting devices are described; for example, it specifies how the illumination values of the other lighting devices are described relative to the reference illumination RefID.
- when the mode is "Abs", absolute values of the brightness and color of the illumination are described for each lighting device.
- when the mode is "Rel", a difference value or a ratio value of the brightness and color of the illumination relative to the reference illumination RefID is described for each lighting device.
- when the mode is "Equ", "Equ" is described on the assumption that the lighting device has the same values as the reference illumination RefID.
- for example, by using the mode "Rel", it is possible to describe that the ambient illuminance (unit: lx) is raised by 100 lx from the reference illumination, that the color temperature is lowered by 1000 K, that the ambient illuminance is raised by 10%, or that the color temperature is lowered by 20%. Compared with describing the brightness and color of each lighting device in the mode "Abs", this is effective in reducing the amount of data representing the brightness and color of a plurality of lighting devices.
- similarly, when the brightness and color of the reference illumination and the surrounding illumination are equal, using the mode "Equ" is effective in reducing the amount of data representing the brightness and color of a plurality of lighting devices.
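The three modes can be summarized as one resolution step on the receiving side. The sketch below handles a single quantity such as illuminance and treats "Rel" as a difference value (the patent also allows a ratio form); the function name is hypothetical.

```python
# Hypothetical sketch of resolving "Abs"/"Rel"/"Equ" mode values against a
# reference channel, for a single quantity such as illuminance (lx).
def resolve(mode, value, ref_value):
    if mode == "Abs":   # absolute value; the reference is ignored
        return value
    if mode == "Rel":   # difference from the reference (a ratio form is also possible)
        return ref_value + value
    if mode == "Equ":   # same value as the reference
        return ref_value
    raise ValueError(mode)

ref = 200.0                        # reference illumination: 200 lx (Abs)
print(resolve("Rel", 50.0, ref))   # 250.0 -> "+50 lx from the reference"
print(resolve("Equ", None, ref))   # 200.0 -> same as the reference
```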
- FIG. 12 is a diagram for explaining an example in which the illumination control data is described in the XML format
- FIG. 13 is an explanatory diagram showing an XML schema corresponding to the illumination control data.
- since the channel type ID is described as "2", it can be seen that this illumination control data corresponds to the lighting device arrangement of the channel type ID "2" in the identification information of FIG. 2, that is, it describes control data for the lighting devices in the arrangement shown in FIG. 5.
- the channel IDs identifying the illumination control data of the two lighting devices are "a" and "b", respectively, and the Ch numbers of the lighting devices corresponding to the illumination control data are "1" and "2". That is, the position information "1" indicates an attribute of the lighting device Ch1 shown in FIG. 5, and the position information "2" indicates an attribute of the lighting device Ch2.
- the priority is 5 for the illumination control data of the channel IDs “a” and “b”.
- the illumination control data of the channel ID “a” has an illuminance value of 200 lx and a color temperature of 3000 K
- the illumination control data of the channel ID "b" has an illuminance value of +50 lx relative to the illumination control data of the channel ID "a", and since there is no particular description of the color temperature, it takes the same value, 3000 K.
- reference channels for mode information such as Rel and Equ can be referenced by adding a channel ID to each ControlData; however, the reference destination is not limited to the channel ID of each ControlData, and, for example, position information may be referenced instead.
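To make the FIG. 12 example concrete, the snippet below parses an XML fragment shaped like the control data just described. Note that the element and attribute names here are illustrative only; the actual schema is the one shown in FIG. 13.

```python
import xml.etree.ElementTree as ET

# Illustrative XML only; the real element/attribute names are given by the
# schema of FIG. 13 in the patent.
doc = """
<IlluminationControl ChannelTypeID="2">
  <ControlData id="a" position="1" priority="5" mode="Abs" illuminance="200" colorTemp="3000"/>
  <ControlData id="b" position="2" priority="5" mode="Rel" refID="a" illuminance="50"/>
</IlluminationControl>
"""
root = ET.fromstring(doc)
print(root.get("ChannelTypeID"))         # "2" -> left/right arrangement of FIG. 5
for cd in root.findall("ControlData"):
    print(cd.get("id"), cd.get("mode"))  # a Abs / b Rel
```

Channel "b" carries only a relative illuminance; per the text above, its color temperature falls back to that of the referenced channel "a" (3000 K).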
- the data receiving device 20 receives the video data and/or audio data, the identification information, and the illumination control data included in the broadcast data, and the illumination dimming data generation unit 24 generates drive control data for driving and controlling the actual lighting devices 32, based on the identification information, the illumination control data, and the arrangement information of the actual lighting devices 32 acquired from the illumination arrangement information storage unit 25. This method will now be described. First, the illumination dimming data generation unit 24 compares the virtual lighting device arrangement pattern indicated by the identification information with the arrangement information of the lighting devices 32 in the actual viewing environment space acquired from the illumination arrangement information storage unit 25. If the number, placement, and illumination method of the lighting devices match, the illumination control data created assuming the virtual viewing environment space is converted, without correction, into data for driving and controlling the actual lighting devices.
- if they do not match, the illumination control data (for example, illumination brightness and illumination color temperature) is adapted to the actual lighting device arrangement by using the data of a lighting device whose illuminated location is similar in position and size, or by calculating a weighted average value of the illumination control data.
- for example, suppose the broadcast data includes the arrangement pattern identification information indicated by the channel type ID "1" and the corresponding illumination control data, while the lighting devices in the actual viewing environment space of the viewer are arranged as in the pattern of the channel type ID "2". In this case, when the virtual lighting device arrangement pattern indicated by the identification information is compared with the arrangement information of the lighting devices 32 in the actual viewing environment space acquired from the illumination arrangement information storage unit 25, the number, placement, and illumination method of the lighting devices do not match. However, comparing the positions and sizes of the individual illuminated locations, the lighting device Ch1 in the virtual viewing environment space illuminates the periphery of the back of the display, and in the actual viewing environment space this location is illuminated by the lighting devices Ch1 and Ch2 (see FIGS. 4 and 6). Therefore, the illumination control data of the lighting device Ch1 can be applied to the lighting devices Ch1 and Ch2 in the actual viewing environment space.
- conversely, when the virtual viewing environment space has the arrangement pattern indicated by the channel type ID "2" and the actual viewing environment space of the viewer has the lighting device arrangement corresponding to the channel type ID "1", the illumination control data to be applied to the lighting device Ch1 in the actual viewing space can be calculated from the values of the illumination control data of the lighting devices Ch1 and Ch2 of the channel type ID "2". In this case, a weighted average value of the brightness and color temperature of the illumination control data, taking into account the priorities of the lighting devices Ch1 and Ch2, may be used as the respective illumination control data. The calculation result for the lighting device Ch1 in the actual viewing space may then also be applied as the illumination control data for the lighting device Ch2 in the actual viewing space.
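The priority-weighted averaging mentioned above can be sketched as follows. The weighting-by-priority choice and all names are illustrative; the patent only says a weighted average "may be used" without fixing the weights.

```python
# Hypothetical sketch: merge two virtual channels into one actual light using a
# priority-weighted average, as when virtual pattern "2" (left/right lights)
# must drive an actual pattern "1" setup (single back light Ch1).
def weighted_average(values, weights):
    total = sum(weights)
    return sum(v * w for v, w in zip(values, weights)) / total

virtual = {
    "Ch1": {"lx": 200.0, "priority": 5},  # left light, high priority
    "Ch2": {"lx": 250.0, "priority": 3},  # right light, lower priority
}
actual_ch1_lx = weighted_average(
    [virtual["Ch1"]["lx"], virtual["Ch2"]["lx"]],
    [virtual["Ch1"]["priority"], virtual["Ch2"]["priority"]],
)
print(round(actual_ch1_lx, 2))  # 218.75 -> leans toward the higher-priority channel
```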
- FIG. 14 is a diagram illustrating a configuration of a table that is referred to when an arrangement pattern of a plurality of lighting devices and a position of each lighting device are described.
- the reference table includes a first table (T16) indicating the arrangement pattern of the lighting devices in the viewing environment space and a second table (T16a, T16b, T16c, T16d) indicating the positions of the lighting devices.
- in transmitting the illumination control data, the data transmission device 10 transmits a value indicating the lighting arrangement pattern (ChannelTypeID), a value indicating the position of each lighting device (Position), and the control data of each lighting device.
- the illumination control data refers to control parameters such as illumination brightness, color temperature, and time information.
- the illumination position may be defined by an external standard (hereinafter referred to as "standard A") or a specification (hereinafter referred to as "specification B").
- in this case as well, the lighting device to be controlled is determined by referring to, for example, the first table indicating the arrangement pattern of the lighting devices in the viewing environment space and the second table indicating their positions.
- by defining typical, frequently used lighting arrangement patterns in advance as the first and second tables, the viewer can configure the lighting environment (setting up lighting devices corresponding to the illumination control data, etc.) based on the arrangement patterns.
- the content creator can also design the illumination control value based on the predefined arrangement pattern, so that the burden of creating illumination control data can be reduced.
- for example, arrangement type 1 defines an arrangement in which the display and the left and right lighting devices lie in a straight line (FIG. 15A). Even though the left and right placement is the same, when an arrangement at 30° in front of the display surface (FIG. 15B) is to be specified, a second table defining a new arrangement pattern is defined.
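The two-level table lookup described above can be sketched as follows. The table contents here are invented placeholders; the real entries are those of FIG. 14 (tables T16, T16a-T16d).

```python
# Hypothetical sketch of the two-level lookup: the first table maps a
# ChannelTypeID to an arrangement pattern (second table), and the chosen second
# table maps Position values to placement descriptions. Values are invented.
FIRST_TABLE = {1: "T16a", 2: "T16b"}
SECOND_TABLES = {
    "T16a": {1: "below display", 2: "ceiling"},          # cf. channel type ID "1"
    "T16b": {1: "left of display", 2: "right of display"},  # cf. channel type ID "2"
}

def placement(channel_type_id, position):
    second = SECOND_TABLES[FIRST_TABLE[channel_type_id]]
    return second[position]

print(placement(2, 1))  # left of display
```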
- FIG. 16 is a diagram illustrating a configuration of a table that is referred to when an arrangement pattern of a plurality of fans and a position of each fan are described.
- as with the lighting devices, the position of each blower in the viewing environment space is described using a first table (T17) indicating the arrangement pattern and second tables (T17a, T17b, T17c, T17d) indicating the position.
- FIG. 17 is a diagram showing an example in which an arrangement pattern and positions of illumination and blowers are described in an XML document.
- description of specific control parameters such as time information, illumination brightness and color temperature, and wind speed of the blower is omitted.
- the position (Position) is described in units of lighting / blower control data (Effect element units).
- FIG. 17A is an example described in units of illumination / blower control data.
- FIG. 17B is an example described in a plurality of illumination / blower control data units (GroupOfEffect element units).
- FIG. 17C is an example described for the entire control data (SEM element unit).
- the arrangement pattern (ChannelTypeID) and position (Position) are described as XML attributes (Attributes), but may be described as XML elements (elements). Alternatively, the ChannelTypeID may be described and referred to in another XML document.
- when the data receiving device 20 receives the ChannelTypeID indicating the lighting arrangement pattern and the Position indicating the position of each light, it refers to the first table and the second table and determines the lighting position associated with the control data.
- FIG. 18 is an operation flow diagram relating to the control target illumination determination in the illumination dimming data generation unit 24.
- arrangement pattern information (ChannelTypeID) is acquired (step S191), and a second table used for determining position information is selected based on the first table (step S192).
- position information (Position) is acquired (step S193), and an illumination position associated with the control data is determined from this and the second table (step S194).
- control data is acquired (step S195), and the device corresponding to the position determined in step S194 is controlled (step S196).
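The operation flow of FIG. 18 (steps S191 to S196) can be sketched end to end as below. The table contents, class, and function names are illustrative only.

```python
# Hypothetical sketch of the FIG. 18 flow in the illumination dimming data
# generation unit: resolve the second table, the position, then drive the device.
class Light:
    def __init__(self):
        self.state = None
    def apply(self, data):
        self.state = data

def determine_and_control(channel_type_id, position, control_data,
                          first_table, second_tables, devices):
    second = second_tables[first_table[channel_type_id]]  # S191-S192: pick second table
    location = second[position]                           # S193-S194: resolve the position
    devices[location].apply(control_data)                 # S195-S196: control that device

first = {2: "T16b"}
second = {"T16b": {1: "left", 2: "right"}}
devices = {"left": Light(), "right": Light()}
determine_and_control(2, 1, {"lx": 200}, first, second, devices)
print(devices["left"].state)  # {'lx': 200}
```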
- FIG. 19 is a diagram for explaining an example of the light irradiation direction in the illumination device.
- FIG. 19A is a view of the viewing space viewed from above
- FIG. 19B is a view of the viewing space viewed from the horizontal direction (lateral).
- the light irradiation direction is described by a first angle (horizontal angle) and a second angle (vertical angle), with the line on the floor connecting the display and the viewer as the reference.
- the wind direction of the blower can be described by the horizontal angle and the vertical angle.
- a normal line from the viewer to the display or a line connecting the viewer and the center of the display may be used as a reference.
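As a small geometric illustration of the angle pair of FIG. 19, the pair can be converted into a unit direction vector. The axis convention below (x along the display-viewer reference line, z up) is an assumption for illustration; the patent fixes only the reference line, not an axis convention.

```python
import math

# Hypothetical sketch: convert the horizontal/vertical irradiation angles into a
# unit direction vector, taking (0 deg, 0 deg) along the floor line from the
# display toward the viewer.
def direction(horizontal_deg, vertical_deg):
    h, v = math.radians(horizontal_deg), math.radians(vertical_deg)
    return (math.cos(v) * math.cos(h),  # along the display-viewer line
            math.cos(v) * math.sin(h),  # to the side
            math.sin(v))                # upward

x, y, z = direction(0, 90)   # vertical angle 90 deg: pointing straight up
print(round(z, 6))           # 1.0
```

The same description applies to the wind direction of a blower, as noted above.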
- the arrangement pattern and the position description method in the present embodiment are not limited to lighting and a blower, but can be similarly applied to peripheral devices such as a scent generator and a sound effect generator.
- FIG. 20 is a diagram illustrating another configuration example of a table referred to when describing the position of each peripheral device.
- the reference table consists of a first table that defines positions (Position) such as "left", "right", and "front", and a second table that defines a list of available positions and the arrangement details of each available position.
- Position in the first table in FIG. 20 is an example, and the positions included in a specific wall surface of the viewing environment space may be collectively defined as “left surface”, “right surface”, and “front surface”.
- left front, left, and left rear in the first table may be collectively defined as the left surface, or right front, right, and right rear in the first table may be collectively defined as the right surface.
- similarly, the definition of ChannelTypeID in the second table of FIG. 20 is an example, and ChannelTypeIDs such as "all positions are available" and "placement details are user-defined" may also be defined.
- when it is desired to add a left-right arrangement such as that shown in FIG. 15B, the tables can easily be extended without affecting the existing definitions by additionally defining, in the second table, an entry such as "3: position at 30° with respect to the display surface, 7: position at 30° with respect to the display surface" for a new ChannelTypeID.
- as in Embodiment 2, the above table configuration is not limited to lighting devices and is similarly applicable to peripheral devices such as blowers, scent generators, and sound effect generators.
- Embodiment 4. In Embodiments 1, 2, and 3 of the present invention described above, the identification information and the illumination control data are added to the broadcast data as additional data and transmitted. However, even when no additional data is added to the broadcast data, an optimal viewing environment for reproducing the video data and/or audio data can be realized by transmitting and receiving the identification information and illumination control data corresponding to the video data and/or audio data to be viewed from an external server device or the like.
- FIG. 21 is a block diagram illustrating a configuration example of a main part of the external server device according to the fourth embodiment of the present invention.
- the external server device 40 in this embodiment corresponds to the data transmission device of the present invention and includes a receiving unit 41 that receives, from the data receiving device side, a transmission request for the identification information and illumination control data regarding specific video data and/or audio data (content); an illumination control data storage unit 42 that stores identification information and illumination control data for each piece of video data and/or audio data (content); and a transmission unit 43 that transmits the requested identification information and illumination control data to the requesting data receiving device.
- the lighting control data stored in the lighting control data storage unit 42 of the present embodiment describes the start time code of an arbitrary segment (for example, scene or shot) intended by the content creator or the like.
- the transmission unit 43 transmits the illumination control data of the video data and/or audio data (content) for which a transmission request was received, together with the TC (time code) indicating the start time of each segment of the video data and/or audio data, to the requesting data receiving device.
- alternatively, an ID may be added to each arbitrary segment (for example, a scene or a shot) intended by the content creator or the like, and the transmission unit 43 may transmit the illumination control data of the requested video data and/or audio data (content), together with the segment IDs, to the requesting data receiving device.
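The Embodiment 4 exchange described above can be sketched as a simple request/response: the receiver asks for a content item, and the server answers with the identification information plus control entries keyed by segment start time code (TC). All data structures and names here are invented for illustration.

```python
# Hypothetical sketch of the external server's side of the Embodiment 4 exchange.
SERVER_DB = {
    "content-123": {
        "channel_type_id": 2,  # identification information: arrangement pattern
        "segments": [
            {"tc": "00:00:00", "controls": {"a": 200.0}},  # scene 1
            {"tc": "00:12:30", "controls": {"a": 120.0}},  # scene 2 (dimmer)
        ],
    }
}

def handle_request(content_id):
    """Return (identification information, per-segment illumination control data)."""
    entry = SERVER_DB[content_id]
    return entry["channel_type_id"], entry["segments"]

ctid, segments = handle_request("content-123")
print(ctid, segments[1]["tc"])  # 2 00:12:30
```

On the receiving side, each segment's control data would be applied when playback reaches the segment's start TC (or, in the variant above, when its segment ID is reached).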
- FIG. 22 is a block diagram illustrating an exemplary configuration of a main part of a viewing environment control system according to Embodiment 4 of the present invention.
- in the figure, 50 denotes a data receiving device, 60 a video display device, 61 an audio reproduction device, and 62 a lighting device.
- the data receiving device 50 includes a receiving unit 51 that receives and demodulates the broadcast data input from the transmission path and performs error correction, and a data separation unit 52 that separates and extracts, from the output data of the receiving unit 51, the video data to be output to the video display device 60 and the audio data to be output to the audio reproduction device 61.
- the data receiving device 50 further includes a transmission unit 57 that, based on an instruction from the illumination dimming data generation unit 56, transmits a transmission request for the identification information and illumination control data corresponding to the video data (content) to be displayed to the external server device 40 via a communication network, and a receiving unit 54 that receives the requested identification information and illumination control data from the external server device 40 via the communication network.
- the illumination arrangement information storage unit 55 stores the arrangement information of each lighting device 62 installed in the viewing environment space (real space) of the viewer, and sends it to the illumination dimming data generation unit 56 as appropriate in accordance with instructions from that unit. Since the illumination arrangement information storage unit 55 is the same as the illumination arrangement information storage unit 25 of Embodiment 1, detailed description is omitted.
- the illumination dimming data generation unit 56 generates illumination dimming data (RGB data) for appropriately controlling the lighting devices 62 installed in the viewing environment space, based on the identification information and illumination control data received from the receiving unit 54 and the arrangement information of the lighting devices 62 acquired from the illumination arrangement information storage unit 55, and outputs it to the lighting devices 62. Except for issuing transmission requests for the identification information and illumination control data, it is the same as the illumination dimming data generation unit 24 of Embodiment 1, so its description is omitted.
- since the illumination dimming data sent to the lighting devices 62 must match the output timing of the video data and audio data, delay generation units 53a and 53b are provided that delay the video data and audio data separated by the data separation unit 52 by the time required to convert the illumination control data into illumination dimming data corresponding to the actual viewing environment, thereby synchronizing them with the illumination dimming data.
- even when the identification information and the illumination control data are not added to the broadcast data, the identification information and illumination control data corresponding to the video data and/or audio data (program content) are obtained from the external server device, and the viewing environment illumination is controlled based on them. Therefore, as in Embodiment 1, it is possible to suppress an increase in the amount of data, to control the switching of the viewing environment illumination at an arbitrary timing according to the intention of the video producer, and to realize illumination control of an optimal viewing environment.
- in each of the embodiments described above, the identification information (channel type ID) and the illumination control data are always transmitted and received as a pair, so it can be determined on which virtual lighting device arrangement pattern the illumination control data was created. The illumination control data can therefore be converted into illumination dimming data appropriate for the arrangement of the lighting devices in the actual viewing environment space.
- the data transmission device, data transmission method, viewing environment control device, viewing environment control system, and viewing environment control method of the present invention can be realized in various embodiments without departing from the gist of the present invention described above.
- for example, the viewing environment control device may be provided within the video display device, and it goes without saying that external lighting devices may be controlled based on various information included in the input video data.
- in the above embodiments, a lighting device has been described as an example of a peripheral device arranged in the virtual viewing environment space; however, the present invention is not limited to lighting devices and can be applied to any peripheral device that affects the viewing environment, such as an air conditioner, blower, vibration device, or scent generator. In that case, an arrangement pattern including the output position and direction of an effect such as wind or fragrance may be defined by the identification information.
- video data and / or audio data is not limited to content related to a television program transmitted by television broadcasting, but is a work stored in a media medium such as Blu-ray Disc or DVD. It may be the content concerning. That is, the input video data is not limited to that obtained by receiving a television broadcast, and the present invention can also be applied to the case where video data reproduced from an external reproduction device is input.
- DESCRIPTION OF SYMBOLS 10 ... Data transmitter, 11 ... Data multiplexing part, 12 ... Transmitter, 20 ... Data receiver, 21, 51 ... Receiver, 22, 52 ... Data separator, 23a, 23b, 53a, 53b ... Delay generator, 24, 56 ... illumination dimming data generation unit, 25, 55 ... illumination arrangement information storage unit, 30, 60 ... video display device, 31, 61 ... audio reproduction device, 32, 62 ... illumination device, 40 ... external server device, 41: receiving unit, 42: lighting control data storage unit, 43: transmitting unit, 54: receiving unit, 57: transmitting unit.
Description
FIG. 1 is a block diagram showing a schematic configuration example of a data transmission device according to one embodiment of the present invention.
The data transmission device 10 in this embodiment includes a data multiplexing unit 11 and a transmission unit 12.
The input video data is compression-encoded and output to the data multiplexing unit 11. Various compression schemes such as ISO/IEC 13818-2 (MPEG-2 Video), ISO/IEC 14496-2 (MPEG-4 Visual), and ISO/IEC 14496-10 (MPEG-4 AVC) can be used for video encoding.
Similarly, the input audio data is compression-encoded and output to the data multiplexing unit 11. Various compression schemes such as ISO/IEC 13818-7 (MPEG-2 AAC) and ISO/IEC 14496-3 (MPEG-4 Audio) can be used for audio encoding.
The encoded video data, audio data, identification information, and illumination control data are multiplexed by the data multiplexing unit 11 and transmitted or stored via the transmission unit 12. As the multiplexing scheme, for example, MPEG-2 transport stream packets (TSP) as defined in ISO/IEC 13818-1 (MPEG-2 Systems), IP packets, RTP packets, or the like can be used.
FIG. 3 shows the arrangement pattern of the lighting devices 32 (Ch1 and Ch2) relative to the display 30 defined when the channel type ID is "1": Ch1 is placed below the display 30 and Ch2 on the ceiling. FIG. 4 shows the viewing environment space corresponding to the channel type ID "1", in which Ch1 illuminates the back (periphery) of the display 30 and Ch2 illuminates the entire space from the ceiling.
FIG. 5 shows the arrangement pattern of the lighting devices 32 (Ch1 and Ch2) relative to the display 30 defined when the channel type ID is "2": Ch1 is placed on the left side of the display 30 and Ch2 on the right side. FIG. 6 shows the viewing environment space corresponding to the channel type ID "2", in which Ch1 illuminates the left rear of the display 30 and Ch2 illuminates the right rear of the display 30.
For example, although not shown, if the lighting devices are assumed to be installed one on each side of the display as in FIG. 5 but to illuminate directly toward the viewer, a channel type ID "3" may define, for example, "number of channels: 2, Ch1: left, direct; Ch2: right, direct".
Therefore, the arrangement pattern of the lighting devices indicated by the identification information can be said to represent the viewing environment assumed when the illumination control data is generated.
Alternatively, only the identification information and the illumination control data may be recorded on a separate recording medium and distributed. For example, the video/audio data may be distributed on a large-capacity recording medium such as a Blu-ray Disc or DVD, while the identification information and the illumination control data are distributed on a small semiconductor recording medium or the like. In this case as well, when a plurality of pieces of content are recorded and distributed, specific information is required that clarifies the correspondence between the video/audio data on the one hand and the identification information and illumination control data on the other.
Although the identification information and the illumination control data are handled as separate data in this embodiment, it goes without saying that they may be described in a single data format containing the contents of both.
The data receiving device 20 includes a receiving unit 21, a data separation unit 22, delay generation units 23a and 23b, an illumination dimming data generation unit 24 as drive control data generation means, and an illumination arrangement information storage unit 25 as means for storing device arrangement information.
The data receiving device 20 receives, in the receiving unit 21, broadcast data in which video data, audio data, identification information, and illumination control data are multiplexed, and separates the video data, audio data, identification information, and illumination control data from the broadcast data in the data separation unit 22.
Here, for example, when two lighting devices 32 are installed around the video display device 30 as shown in FIG. 3, with the lighting device Ch1 being a floor-standing type providing indirect illumination and the lighting device Ch2 being a ceiling-mounted type providing direct illumination, the illumination arrangement information storage unit 25 needs to store the number of lighting devices and information on their positions relative to the video display device 30 and their illumination methods, so that each lighting device 32 can be individually controlled according to its installation position.
For this purpose, for example, an identifier may be assigned to each lighting device arranged in the user's viewing space, and the illumination arrangement information storage unit 25 may hold, for each identifier, the information on the position relative to the video display device 30 and the illumination method in table form.
FIG. 9 shows an example of the description contents of the illumination control data: a channel type ID (identification information) representing the arrangement pattern of one or more lighting devices in the virtual viewing environment space; priority information (shown in FIG. 10) representing the priority for lighting a plurality of lighting devices; mode information (shown in FIG. 11) representing how the brightness and color of the plurality of lighting devices are described; a reference Ch (channel) indicating the reference illumination referred to when obtaining the brightness and color of the lighting devices; illumination brightness information; and illumination color temperature information. Although not shown, the illumination method can also be described as necessary. For the brightness, candela (cd), lumen (lm), or the like may be used instead of lux (lx). For the color, an XYZ color system, an RGB color system, a YCbCr color system, or the like may be used instead of the color temperature.
All of these pieces of information are useful for producing the atmosphere and presence of the scene of the video data with illumination light.
The discrimination visual field 101 is the range in which high-density information such as graphic identification can be accurately received, and the effective visual field 102 is the range in which information can be received naturally by eye movement alone, although the discrimination ability is lower than in the discrimination visual field 101. The guidance visual field 103 has only enough recognition ability to detect the presence of a presented stimulus and perform simple identification, but it influences the judgment of overall external information. The auxiliary visual field 104 is the range in which only the presence of a stimulus can be discerned.
Here, an example of information representing the priority for lighting a plurality of lighting devices is shown. For example, the priority can be set in five levels (low, slightly low, normal, slightly high, high), making it possible to light only the high-priority illumination. As a result, even when the number and placement of lighting devices on the receiving side are limited and the lighting arrangement pattern in the virtual viewing environment space differs from that in the actual viewing environment space, the receiving side can realize the illumination situation that the transmitting side wants to light preferentially by referring to the illumination value of the highest-priority illumination.
Here, an example of mode information representing how the brightness and color of a plurality of lighting devices are described is shown; for example, the mode specifies how the illumination values of the other lighting devices are described relative to the reference illumination RefID. When the mode is "Abs", absolute values of the brightness and color of the illumination are described for each lighting device. When the mode is "Rel", a difference value or ratio value of the brightness and color relative to the reference illumination RefID is described for each lighting device. When the mode is "Equ", "Equ" is described on the assumption that the lighting device has the same values as the reference illumination RefID.
Since the channel type ID is described as "2" in FIG. 12, it can be seen that this illumination control data corresponds to the lighting device arrangement of the channel type ID "2" in the identification information of FIG. 2, that is, it describes control data for the lighting devices in the arrangement shown in FIG. 5.
The channel IDs identifying the illumination control data of the two lighting devices are "a" and "b", respectively, and the Ch numbers of the corresponding lighting devices are "1" and "2". That is, the position information "1" indicates an attribute of the lighting device Ch1 shown in FIG. 5, and the position information "2" indicates an attribute of the lighting device Ch2.
Reference channels for mode information such as Rel and Equ can be referenced by adding a channel ID to each ControlData; however, the reference destination is not limited to the channel ID of each ControlData, and, for example, position information may be referenced instead.
First, the illumination dimming data generation unit 24 compares the virtual lighting device arrangement pattern indicated by the identification information with the arrangement information of the lighting devices 32 in the actual viewing environment space acquired from the illumination arrangement information storage unit 25, and if the number, placement, and illumination method of the lighting devices match, converts the illumination control data created assuming the virtual viewing environment space into data for driving and controlling the actual lighting devices without correction.
However, comparing the positions and sizes of the individual illuminated locations, the lighting device Ch1 in the virtual viewing environment space illuminates the periphery of the back of the display, and in the actual viewing environment space this location is illuminated by the lighting devices Ch1 and Ch2 (see FIGS. 4 and 6); therefore, among the illumination control data, the illumination control data of the lighting device Ch1 can be applied to the lighting devices Ch1 and Ch2 in the actual viewing environment space.
A second embodiment of the present invention will be described with reference to the drawings. Since the schematic configurations of the data transmission device and the data receiving device in this embodiment are the same as in FIGS. 1 and 7, detailed description is omitted.
First, an example with lighting devices will be described. FIG. 14 shows the configuration of the tables referred to when describing the arrangement pattern of a plurality of lighting devices and the position of each lighting device. The reference tables consist of a first table (T16) indicating the arrangement pattern of the lighting devices in the viewing environment space and second tables (T16a, T16b, T16c, T16d) indicating the position of each lighting device.
By defining typical, frequently used lighting arrangement patterns in advance as the first and second tables, the viewer can configure the lighting environment (setting up lighting devices corresponding to the illumination control data, etc.) based on the arrangement patterns. Content creators can also design illumination control values based on the predefined arrangement patterns, which reduces the burden of creating illumination control data.
Note that adding the Position is not mandatory; it can be omitted by agreeing in advance that the control data for all positions defined by the ChannelTypeID are described in a predetermined order. When the Position is added, only the minimum necessary control data needs to be described (for example, in table T16c, when transmitting control data for the right illumination only, only the control data with Position=2 is described), so the amount of data can be reduced.
Next, the operation of the data receiving device 20 will be described. Upon receiving the ChannelTypeID indicating the lighting arrangement pattern and the Position indicating the position of each light, the data receiving device 20 refers to the first and second tables and determines the lighting position associated with the control data.
It goes without saying that the arrangement pattern and position description method in this embodiment are not limited to lighting and blowers, and are similarly applicable to peripheral devices such as scent generators and sound effect generators.
A third embodiment of the present invention will be described with reference to the drawings. FIG. 20 shows another configuration example of the tables referred to when describing the position of each peripheral device. The reference tables consist of a first table defining positions (Position) such as "left", "right", and "front", and a second table defining a list of available positions and the arrangement details of each available position. The definition of Position in the first table of FIG. 20 is an example; surfaces such as "left surface", "right surface", and "front surface" may be defined by grouping the positions included in a specific wall surface of the viewing environment space. For example, the left front, left, and left rear in the first table may be grouped and defined as the left surface, and the right front, right, and right rear may be grouped and defined as the right surface.
Also, when it is desired to add a left-right arrangement such as that shown in FIG. 15B, the tables can easily be extended without affecting the existing definitions by additionally defining in the second table, as ChannelTypeID=7, "3: position at 30° with respect to the display surface, 7: position at 30° with respect to the display surface" or the like.
As in Embodiment 2, the above table configuration is not limited to lighting devices, and it goes without saying that it is similarly applicable to peripheral devices such as blowers, scent generators, and sound effect generators.
In Embodiments 1, 2, and 3 of the present invention described above, the case where the identification information and the illumination control data are added to the broadcast data as additional data and transmitted has been described. However, even when no additional data is added to the broadcast data, an optimal viewing environment for reproducing the video data and/or audio data can be realized by transmitting and receiving the identification information and illumination control data corresponding to the video data and/or audio data to be viewed from an external server device or the like.
The external server device 40 in this embodiment corresponds to the data transmission device of the present invention and includes a receiving unit 41 that receives, from the data receiving device side, a transmission request for the identification information and illumination control data regarding specific video data and/or audio data (content); an illumination control data storage unit 42 that stores identification information and illumination control data for each piece of video data and/or audio data (content); and a transmission unit 43 that transmits the requested identification information and illumination control data to the requesting data receiving device.
FIG. 22 is a block diagram showing a configuration example of the main part of a viewing environment control system according to Embodiment 4 of the present invention. In the figure, 50 denotes a data receiving device, 60 a video display device, 61 an audio reproduction device, and 62 a lighting device.
The data receiving device 50 includes a receiving unit 51 that receives and demodulates the broadcast data input from the transmission path and performs error correction, and a data separation unit 52 that separates and extracts, from the output data of the receiving unit 51, the video data to be output to the video display device 60 and the audio data to be output to the audio reproduction device 61.
Claims (23)
- A data transmission device for transmitting video data and/or audio data, comprising
transmission means for adding, to the video data and/or audio data, identification information indicating an arrangement pattern of peripheral devices in a virtual viewing environment space and viewing environment control data for the peripheral devices in the virtual viewing environment space, and transmitting them. - The data transmission device according to claim 1, wherein the arrangement pattern of peripheral devices indicated by the identification information includes an arrangement pattern in which a peripheral device is installed on the ceiling of the virtual viewing environment space.
- The data transmission device according to claim 1, wherein the arrangement pattern of peripheral devices indicated by the identification information includes an arrangement pattern in which a peripheral device is installed on the left side around a video display device for displaying the video data in the virtual viewing environment space.
- The data transmission device according to claim 1, wherein the arrangement pattern of peripheral devices indicated by the identification information includes an arrangement pattern in which a peripheral device is installed on the right side around a video display device for displaying the video data in the virtual viewing environment space.
- The data transmission device according to claim 1, wherein the arrangement pattern of peripheral devices indicated by the identification information includes an arrangement pattern in which a peripheral device is installed at the rear around a video display device for displaying the video data in the virtual viewing environment space.
- 前記視聴環境制御データは、前記配置パターンを構成する周辺機器の設置位置を示す位置情報を含むことを特徴とする請求項1乃至5のいずれか1項に記載のデータ送信装置。
- 前記視聴環境制御データは、前記配置パターンを構成する周辺機器の設置方向を示す位置情報を含むことを特徴とする請求項1乃至6のいずれか1項に記載のデータ送信装置。
- 前記視聴環境制御データは、前記周辺機器に対する駆動優先順位を示す情報を含むことを特徴とする請求項1乃至7のいずれか1項に記載のデータ送信装置。
- 前記視聴環境制御データは、前記周辺機器に対する視聴環境制御データの記述方法を表すモード情報を含むことを特徴とする請求項1乃至8のいずれか1項に記載のデータ送信装置。
- 前記モード情報は、前記周辺機器に対する駆動制御値が絶対値によって記述されていることを示す情報を含むことを特徴とする請求項9に記載のデータ送信装置。
- 前記モード情報は、前記周辺機器に対する駆動制御値が他の指定された周辺機器に対する駆動制御値との差分値によって記述されていることを示す情報を含むことを特徴とする請求項9に記載のデータ送信装置。
- 前記モード情報は、前記周辺機器に対する駆動制御値が他の指定された周辺機器に対する駆動制御値との比率値によって記述されていることを示す情報を含むことを特徴とする請求項9に記載のデータ送信装置。
- 前記モード情報は、前記周辺機器に対する駆動制御値が他の指定された周辺機器に対する駆動制御値と同一であることを示す情報を含むことを特徴とする前記請求項9に記載のデータ送信装置。
- 映像データ及び/又は音声データに関連付けて、仮想的な視聴環境空間における周辺機器の配置パターンを示す識別情報と、前記仮想的な視聴環境空間における周辺機器に対する視聴環境制御データとを格納する格納手段と、
外部装置からの送信要求を受けて、所定の映像データ及び/又は音声データに関する識別情報と視聴環境制御データとを、前記送信要求元の外部装置へ送信する送信手段とを備えたことを特徴とするデータ送信装置。 - 前記仮想的な視聴環境空間における周辺機器が、照明装置であることを特徴とする請求項1から14のいずれか1項に記載のデータ送信装置。
- 前記仮想的な視聴環境空間における周辺機器が、送風装置であることを特徴とする請求項1から14のいずれか1項に記載のデータ送信装置。
- 映像データ及び/又は音声データを受信するとともに、仮想的な視聴環境空間における周辺機器の配置パターンを示す識別情報と、前記仮想的な視聴環境空間における周辺機器に対する視聴環境制御データとを受信する受信手段と、
実際の視聴環境空間における周辺機器の配置パターンを表す機器配置情報を記憶する記憶手段と、
前記受信手段で受信した識別情報と前記記憶手段に記憶された機器配置情報とを用いて、前記視聴環境制御データを、前記実際の視聴環境空間における周辺機器を駆動制御するための駆動制御データに変換する駆動制御データ生成手段とを備えたことを特徴とする視聴環境制御装置。 - 前記視聴環境制御データは、前記配置パターンを構成する周辺機器の設置位置を示す位置情報を含むことを特徴とする請求項17に記載の視聴環境制御装置。
- 前記視聴環境制御データは、前記配置パターンを構成する周辺機器の設置方向を示す位置情報を含むことを特徴とする請求項17または18に記載のデータ送信装置。
- 前記請求項17に記載の視聴環境制御装置と、前記映像データ及び/又は音声データを再生するための映像/音声再生装置と、該映像/音声再生装置の周辺に設置された周辺機器とを備えたことを特徴とする視聴環境制御システム。
- 映像データ及び/又は音声データを送信するデータ送信方法であって、
仮想的な視聴環境空間における周辺機器の配置パターンを示す識別情報と、前記仮想的な視聴環境空間における周辺機器に対する視聴環境制御データとを、前記映像データ及び/又は音声データに付加して送信することを特徴とするデータ送信方法。 - 映像データ及び/又は音声データに関連付けて、仮想的な視聴環境空間における周辺機器の配置パターンを示す識別情報と、前記仮想的な視聴環境空間における周辺機器に対する視聴環境制御データとを格納しており、
外部装置からの送信要求を受けて、所定の映像データ及び/又は音声データに関する識別情報と視聴環境制御データとを、前記送信要求元の外部装置へ送信することを特徴とするデータ送信方法。 - 映像データ及び/又は音声データを受信するステップ、
仮想的な視聴環境空間における周辺機器の配置パターンを示す識別情報と、前記仮想的な視聴環境空間における周辺機器に対する視聴環境制御データとを受信するステップ、
実際の視聴環境空間における周辺機器の配置パターンを表す機器配置情報を記憶するステップ、
および、前記受信した識別情報と前記記憶された機器配置情報とを用いて、前記視聴環境制御データを、前記実際の視聴環境空間における周辺機器を駆動制御するための駆動制御データに変換するステップを有することを特徴とする視聴環境制御方法。
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN2009801273893A CN102090057A (zh) | 2008-07-15 | 2009-07-14 | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, audio-visual environment controlling system, and audio-visual environment controlling method |
| JP2010520869A JP5092015B2 (ja) | 2008-07-15 | 2009-07-14 | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, audio-visual environment controlling system, and audio-visual environment controlling method |
| US13/054,177 US20110190911A1 (en) | 2008-07-15 | 2009-07-14 | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, audio-visual environment controlling system, and audio-visual environment controlling method |
| BRPI0916465A BRPI0916465A2 (pt) | 2008-07-15 | 2009-07-14 | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, audio-visual environment controlling system, and audio-visual environment controlling method |
| EP09797913A EP2315442A1 (en) | 2008-07-15 | 2009-07-14 | Data transmission device, method for transmitting data, audio-visual environment control device, audio-visual environment control system, and method for controlling audio-visual environment |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2008-183815 | 2008-07-15 | ||
| JP2008183815 | 2008-07-15 | ||
| JP2009-015373 | 2009-01-27 | ||
| JP2009015373 | 2009-01-27 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010007988A1 true WO2010007988A1 (ja) | 2010-01-21 |
Family
ID=41550390
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2009/062737 Ceased WO2010007988A1 (ja) | 2009-07-14 | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, audio-visual environment controlling system, and audio-visual environment controlling method |
Country Status (7)
| Country | Link |
|---|---|
| US (1) | US20110190911A1 (ja) |
| EP (1) | EP2315442A1 (ja) |
| JP (1) | JP5092015B2 (ja) |
| KR (1) | KR20110030656A (ja) |
| CN (1) | CN102090057A (ja) |
| BR (1) | BRPI0916465A2 (ja) |
| WO (1) | WO2010007988A1 (ja) |
Cited By (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2018147606A (ja) * | 2017-03-01 | 2018-09-20 | Nintendo Co., Ltd. | Lighting device, lighting fixture, and electronic apparatus |
| JP2020529054A (ja) * | 2017-01-23 | 2020-10-01 | The Lab In The Bag | Immersive device comprising a screen and at least two multisensory boxes |
| JP2021525991A (ja) * | 2018-06-07 | 2021-09-27 | Signify Holding B.V. | Selection of one or more light effects in dependence on a variation of delay |
| JP2022003392A (ja) * | 2014-05-09 | 2022-01-11 | View, Inc. | Control method for tintable windows |
| JP2022134638A (ja) * | 2021-03-03 | 2022-09-15 | Yamaha Corporation | Image display system, display control method, light-emission control method, and program |
| JP2023529073A (ja) * | 2020-05-25 | 2023-07-07 | Signify Holding B.V. | Determining an image analysis region for entertainment lighting based on a distance metric |
| US11899331B2 (en) | 2013-02-21 | 2024-02-13 | View, Inc. | Control method for tintable windows |
| US11940705B2 (en) | 2013-02-21 | 2024-03-26 | View, Inc. | Control method for tintable windows |
| US11950340B2 (en) | 2012-03-13 | 2024-04-02 | View, Inc. | Adjusting interior lighting based on dynamic glass tinting |
| US11960190B2 (en) | 2013-02-21 | 2024-04-16 | View, Inc. | Control methods and systems using external 3D modeling and schedule-based computing |
| US11966142B2 (en) | 2013-02-21 | 2024-04-23 | View, Inc. | Control methods and systems using outside temperature as a driver for changing window tint states |
| US12298644B2 (en) | 2011-03-16 | 2025-05-13 | View Operating Corporation | Controlling transitions in optically switchable devices |
| US12429742B2 (en) | 2012-03-13 | 2025-09-30 | View Operating Corporation | Methods of controlling multi-zone tintable windows |
Families Citing this family (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011073877A1 (en) * | 2009-12-17 | 2011-06-23 | Koninklijke Philips Electronics N.V. | Ambience cinema lighting system |
| US20130147395A1 (en) * | 2011-12-07 | 2013-06-13 | Comcast Cable Communications, Llc | Dynamic Ambient Lighting |
| KR101305249B1 (ko) * | 2012-07-12 | 2013-09-06 | CJ CGV Co., Ltd. | Multi-plane screening system |
| US8928811B2 (en) * | 2012-10-17 | 2015-01-06 | Sony Corporation | Methods and systems for generating ambient light effects based on video content |
| US8928812B2 (en) | 2012-10-17 | 2015-01-06 | Sony Corporation | Ambient light effects based on video via home automation |
| US20140104497A1 (en) * | 2012-10-17 | 2014-04-17 | Adam Li | Video files including ambient light effects |
| WO2014111826A2 (en) * | 2013-01-17 | 2014-07-24 | Koninklijke Philips N.V. | A controllable stimulus system and a method of controlling an audible stimulus and a visual stimulus |
| TWM459428U (zh) * | 2013-03-04 | 2013-08-11 | Gunitech Corp | Environment control device and video/audio playback device |
| US9380443B2 (en) | 2013-03-12 | 2016-06-28 | Comcast Cable Communications, Llc | Immersive positioning and paring |
| US20150312648A1 (en) * | 2014-04-23 | 2015-10-29 | Verizon Patent And Licensing Inc. | Mobile device controlled dynamic room environment using a cast device |
| US10595095B2 (en) * | 2014-11-19 | 2020-03-17 | Lg Electronics Inc. | Method and apparatus for transceiving broadcast signal for viewing environment adjustment |
| EP3062519A1 (en) * | 2015-02-27 | 2016-08-31 | Novabase Digital TV Technologies GmbH | Ambient surround information system for a media presentation |
| CN104869342A (zh) * | 2015-06-09 | 2015-08-26 | 柳州桂通科技股份有限公司 | Method for synchronous reproduction of multiple multimedia information and application thereof |
| DE102015115050B4 (de) * | 2015-09-08 | 2017-07-27 | Jörg Köhler | Method for lighting design |
| CN106422374B (zh) * | 2016-11-22 | 2018-03-13 | 深圳市环球数码科技有限公司 | Dynamic visual effect enhancement system and control method for digital cinema |
| IT201700099120A1 (it) * | 2017-09-05 | 2019-03-05 | Salvatore Lamanna | Lighting system for a screen of any type |
| US10932344B2 (en) | 2018-10-09 | 2021-02-23 | Rovi Guides, Inc. | Systems and methods for emulating an environment created by the outputs of a plurality of devices |
| US11750745B2 (en) | 2020-11-18 | 2023-09-05 | Kelly Properties, Llc | Processing and distribution of audio signals in a multi-party conferencing environment |
| JP7786120B2 (ja) * | 2021-10-12 | 2025-12-16 | Yamaha Corporation | Video signal processing method and video signal processing device |
| US12295081B2 (en) | 2022-01-06 | 2025-05-06 | Comcast Cable Communications, Llc | Video display environmental lighting |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001078117A (ja) * | 1999-09-06 | 2001-03-23 | Matsushita Electric Ind Co Ltd | Receiving device for digital broadcasting |
| JP2001343900A (ja) * | 2000-05-31 | 2001-12-14 | Matsushita Electric Ind Co Ltd | Illumination system and illumination control data creation method |
| JP2007006461A (ja) * | 2005-05-23 | 2007-01-11 | Sharp Corp | Video presentation system |
| JP2008005297A (ja) * | 2006-06-23 | 2008-01-10 | Fujifilm Corp | Image capturing and reproduction system |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5883621A (en) * | 1996-06-21 | 1999-03-16 | Sony Corporation | Device control with topology map in a digital network |
| US6548967B1 (en) * | 1997-08-26 | 2003-04-15 | Color Kinetics, Inc. | Universal lighting network methods and systems |
| US6976267B1 (en) * | 1999-04-09 | 2005-12-13 | Sony Corporation | Method and apparatus for controlling connections between devices |
| US7540012B1 (en) * | 1999-06-08 | 2009-05-26 | International Business Machines Corporation | Video on demand configuring, controlling and maintaining |
| US20050275626A1 (en) * | 2000-06-21 | 2005-12-15 | Color Kinetics Incorporated | Entertainment lighting system |
| AU2002236807A1 (en) * | 2001-01-18 | 2002-07-30 | Madstone Films | A method and system providing a digital cinema distribution network having backchannel feedback |
| DE602005021685D1 (de) | 2004-11-30 | 2010-07-15 | Koninkl Philips Electronics Nv | Display system |
| JP2007006281A (ja) * | 2005-06-24 | 2007-01-11 | Sony Corp | Audio display device |
| EP2018062A4 (en) * | 2006-04-21 | 2010-08-04 | Sharp Kk | DATA TRANSMISSION METHOD AND DEVICE, AND AUDIOVISUAL ENVIRONMENT MANAGEMENT DEVICE, SYSTEM AND METHOD |
| US20110316426A1 (en) * | 2006-12-28 | 2011-12-29 | Sharp Kabushiki Kaisha | Audio-visual environment control device, audio-visual environment control system and audio-visual environment control method |
- 2009-07-14 WO PCT/JP2009/062737 patent/WO2010007988A1/ja not_active Ceased
- 2009-07-14 US US13/054,177 patent/US20110190911A1/en not_active Abandoned
- 2009-07-14 JP JP2010520869A patent/JP5092015B2/ja not_active Expired - Fee Related
- 2009-07-14 CN CN2009801273893A patent/CN102090057A/zh active Pending
- 2009-07-14 EP EP09797913A patent/EP2315442A1/en not_active Withdrawn
- 2009-07-14 KR KR1020117002439A patent/KR20110030656A/ko not_active Withdrawn
- 2009-07-14 BR BRPI0916465A patent/BRPI0916465A2/pt not_active IP Right Cessation
Also Published As
| Publication number | Publication date |
|---|---|
| US20110190911A1 (en) | 2011-08-04 |
| BRPI0916465A2 (pt) | 2018-02-06 |
| CN102090057A (zh) | 2011-06-08 |
| JP5092015B2 (ja) | 2012-12-05 |
| JPWO2010007988A1 (ja) | 2012-01-05 |
| KR20110030656A (ko) | 2011-03-23 |
| EP2315442A1 (en) | 2011-04-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5092015B2 (ja) | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, audio-visual environment controlling system, and audio-visual environment controlling method | |
| JP5442643B2 (ja) | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, audio-visual environment controlling method, and audio-visual environment controlling system | |
| JP7496890B2 (ja) | Remote-site presentation system and remote-site presentation method | |
| KR101667416B1 (ko) | Method and apparatus for representing sensory effects, and computer-readable recording medium on which sensory device capability metadata is recorded | |
| US20110188832A1 (en) | Method and device for realising sensory effects | |
| JP4948548B2 (ja) | Transmitting apparatus, audio-visual environment controlling apparatus, and audio-visual environment controlling system | |
| JP6566271B2 (ja) | Transmission method and reproduction device | |
| KR101078641B1 (ko) | Multimedia application system and method using metadata for sensory reproduction devices | |
| WO2010007987A1 (ja) | Data transmitting apparatus, data receiving apparatus, data transmitting method, data receiving method, and audio-visual environment controlling method | |
| US20100268745A1 (en) | Method and apparatus for representing sensory effects using sensory device capability metadata | |
| US20110125790A1 (en) | Method and apparatus for representing sensory effects and computer readable recording medium storing sensory effect metadata | |
| CN110419225B (zh) | Distributed synchronization control system for environmental signals in multimedia playback | |
| US20100274817A1 (en) | Method and apparatus for representing sensory effects using user's sensory effect preference metadata | |
| JP2011259354A (ja) | Audio-visual environment control system, transmitting apparatus, and receiving apparatus | |
| US11184581B2 (en) | Method and apparatus for creating, distributing and dynamically reproducing room illumination effects | |
| JP5074864B2 (ja) | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, audio-visual environment controlling system, and audio-visual environment controlling method | |
| JP2009060541A (ja) | Data transmitting apparatus, data transmitting method, audio-visual environment controlling apparatus, and audio-visual environment controlling method | |
| JP2019149838A (ja) | Reproduction method and reproduction device |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200980127389.3; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09797913; Country of ref document: EP; Kind code of ref document: A1 |
| | DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
| | WWE | Wipo information: entry into national phase | Ref document number: 2010520869; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 393/CHENP/2011; Country of ref document: IN |
| | ENP | Entry into the national phase | Ref document number: 20117002439; Country of ref document: KR; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 2009797913; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 13054177; Country of ref document: US |
| | ENP | Entry into the national phase | Ref document number: PI0916465; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20110114 |