
WO2024256262A1 - A method of controlling a plurality of lighting devices and an augmented reality device - Google Patents


Info

Publication number
WO2024256262A1
WO2024256262A1 (PCT/EP2024/065597; EP2024065597W)
Authority
WO
WIPO (PCT)
Prior art keywords
lighting devices
virtual
location
physical environment
subarea
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/065597
Other languages
French (fr)
Inventor
Peter Deixler
Dzmitry Viktorovich Aliakseyeu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Signify Holding BV
Original Assignee
Signify Holding BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Signify Holding BV filed Critical Signify Holding BV
Publication of WO2024256262A1


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/175Controlling the light source by remote control
    • H05B47/19Controlling the light source by remote control via wireless transmission

Definitions

  • the invention relates to a method of controlling a plurality of lighting devices and an augmented reality device.
  • the invention further relates to a computer program product for executing the method.
  • the invention further relates to a control system for controlling a plurality of lighting devices and an augmented reality device.
  • In augmented reality (AR), virtual objects may be displayed as an overlay on top of the physical environment, for example on a smartphone or on AR-glasses, thereby creating a so-called mixed reality environment.
  • This technology enables many different types of applications, for example interaction with avatars of other users who may be virtually present in the same physical environment, or interaction with virtual characters or other objects that are rendered as an overlay on the physical environment. Users may, for example, chat or play games with virtually present users (or with artificially created characters) in such a mixed reality environment.
  • EP 3484249 A1 discloses a control system configured for controlling at least one controllable device.
  • the device has been assigned a corresponding identifier and is configured for transmitting an identification signal comprising the identifier of the device.
  • the control system comprises a display for displaying a control item configured for controlling the controllable device.
  • the control system also comprises a receiver configured for wirelessly receiving the identification signal comprising the identifier.
  • the control system is configured for assigning a position of the control item on the display to the device identified by means of said received identifier.
  • the inventors have realized that when a virtual environment is rendered as an overlay onto a physical environment to create a mixed reality environment, the size of the physical environment may not always match to the virtual environment. Consequently, the user operating the augmented reality device may perceive the size of the virtual environment incorrectly. It is therefore an object to provide an augmented reality system that improves the perception of size of the virtual environment for a user operating an augmented reality device.
  • the object is achieved by a method of controlling a plurality of lighting devices and an augmented reality device, wherein the plurality of lighting devices and the augmented reality device are located in a physical environment, the method comprising: rendering, on a display of the augmented reality device, a virtual environment as an overlay on the physical environment, determining a size and a location of a virtual area in the virtual environment with respect to the physical environment, identifying a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtaining location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, selecting, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, selecting, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, controlling the one or more first lighting devices according to one or more first light settings, and controlling the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.
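The core of the method above is a partition of the lighting devices into those inside and those outside the physical subarea. As a minimal sketch (assuming 2-D device coordinates and an axis-aligned rectangular subarea, neither of which is mandated by the claim), the selection step might look like:

```python
from dataclasses import dataclass

@dataclass
class LightingDevice:
    device_id: str
    x: float  # position in the physical environment (metres)
    y: float

def partition_devices(devices, subarea):
    """Split devices into those inside and outside a rectangular subarea.

    `subarea` is (x_min, y_min, x_max, y_max) in physical coordinates.
    Devices inside receive the first light settings, devices outside
    receive the second light settings.
    """
    x_min, y_min, x_max, y_max = subarea
    first, second = [], []
    for d in devices:
        if x_min <= d.x <= x_max and y_min <= d.y <= y_max:
            first.append(d)   # inside the subarea -> first light settings
        else:
            second.append(d)  # outside the subarea -> second light settings
    return first, second
```

With devices at (1, 1), (2, 2) and (5, 5) and a subarea spanning (0, 0) to (3, 3), the first two devices would be selected as first lighting devices and the third as a second lighting device, mirroring lighting devices 130, 132 versus 134 in the figures.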
  • the one or more first light settings may have a higher brightness than the one or more second light settings.
  • the one or more second light settings may, for example, be off-light settings and the one or more second lighting devices may thus be switched off.
  • the one or more first light settings may be selected such that the one or more first light settings have a higher brightness than the one or more second light settings. Controlling the one or more second lighting devices to provide less illumination compared to the one or more first lighting devices may decrease the perceived size of the area outside the subarea and/or may increase the perceived size of the subarea, and therewith the size of the virtual area.
  • the one or more second light settings may have a higher brightness than the one or more first light settings.
  • the one or more first light settings may, for example, be off-light settings and the one or more first lighting devices may thus be switched off. This may be beneficial under certain circumstances, for instance when the virtual area is a dark virtual area. It may then be beneficial to control the one or more second lighting devices to provide more illumination than the one or more first lighting devices to increase the contrast between the subarea and an area outside the subarea.
  • the method may further comprise: rendering a virtual object in the virtual environment, determining a virtual location of the virtual object in the virtual environment, determining a physical location in the physical environment that corresponds to the virtual location and/or an orientation of the augmented reality device (and/or the user operating the augmented reality device) with respect to the physical location, wherein the size and the location of the virtual area with respect to the physical environment are determined based on the physical location and/or the orientation.
  • This is beneficial, because the perceived size of the subarea (and the virtual area) is adjusted based on the location of the virtual object with respect to the user operating the augmented reality device.
  • the method may further comprise: determining a distance between the physical location and the augmented reality device.
  • the size and the location of the virtual area with respect to the physical environment are determined based on the distance.
  • lighting devices located between the location of the augmented reality device and the physical location are controlled differently from lighting devices beyond that distance.
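One way to realize this distance-based behaviour is to define the virtual area as a circle spanning the gap between the augmented reality device and the physical location of the virtual object, so that devices between the two fall inside it. The following sketch assumes 2-D coordinates; the `margin` widening the area is an illustrative parameter, not something the method prescribes:

```python
import math

def virtual_area_from_distance(ar_pos, object_pos, margin=0.5):
    """Define a circular virtual area centred midway between the AR
    device and the physical location corresponding to the virtual
    object; its radius grows with the distance between the two."""
    ax, ay = ar_pos
    ox, oy = object_pos
    distance = math.hypot(ox - ax, oy - ay)
    centre = ((ax + ox) / 2, (ay + oy) / 2)
    radius = distance / 2 + margin
    return centre, radius

def is_between(device_pos, ar_pos, object_pos, margin=0.5):
    """True if a lighting device lies inside the area between the AR
    device and the virtual object's physical location."""
    centre, radius = virtual_area_from_distance(ar_pos, object_pos, margin)
    return math.hypot(device_pos[0] - centre[0],
                      device_pos[1] - centre[1]) <= radius
```

A device midway between the AR device and the object would then be controlled according to the first light settings, while a device beyond the object would receive the second light settings.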
  • the virtual object may be an avatar of a user or a virtual character.
  • the virtual object may move from a first virtual location to a second virtual location in the virtual environment, thereby dynamically adjusting the size of the virtual area and therewith the size of the physical area and the control of the lighting devices.
  • the augmented reality device may be located inside the subarea of the physical environment. This would result in an adjustment of the contrast between the (sub)area wherein the augmented reality device is located and an area outside the subarea.
  • the size and the location of the virtual area may be determined based on one or more distances between the augmented reality device and one or more respective edges of the virtual area.
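Deriving the virtual area from edge distances can be sketched as follows, assuming an axis-aligned rectangle around the augmented reality device (the function name and the four-distance parameterization are illustrative, not from the source):

```python
def area_around_device(device_pos, d_left, d_right, d_front, d_back):
    """Build a rectangular virtual area from the distances between the
    AR device and each edge of the area. Returns the rectangle as
    (x_min, y_min, x_max, y_max) in physical coordinates."""
    x, y = device_pos
    return (x - d_left, y - d_back, x + d_right, y + d_front)
```

For an AR device at (2, 2) with edge distances of 1 m to the left and right and 2 m to the front and back, the resulting subarea spans (1, 0) to (3, 4), with the device inside it as described above.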
  • the augmented reality device may be located outside the subarea of the physical environment. This would result in an adjustment of the contrast between a (sub)area remote from the augmented reality device and the area outside that subarea.
  • the size and the location of the virtual area may be the same as the size and the location of the virtual environment.
  • the subarea may be the same size as the (full) virtual environment.
  • the method may further comprise: determining the one or more first light settings and the one or more second light settings such that a perceived contrast between the subarea and an area outside the subarea is increased or decreased.
  • the one or more first and/or second light settings may be selected, for instance to provide a target contrast between the subarea and the area outside the subarea.
  • the target contrast may, for example, be a color contrast and/or a brightness contrast.
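For a brightness contrast, one simple interpretation of a "target contrast" is a ratio between the brightness of the first and second light settings. A hedged sketch, assuming a 0-254 dimming range (a common range in lighting systems, but an assumption here) and treating the contrast as the ratio of the two brightness levels:

```python
def settings_for_target_contrast(first_brightness, target_ratio):
    """Derive the second light setting's brightness from the first so
    that first/second meets a target brightness contrast ratio.
    Brightness values are clamped to a 0-254 dimming range."""
    if target_ratio <= 0:
        raise ValueError("target_ratio must be positive")
    second = first_brightness / target_ratio
    return max(0, min(254, round(second)))
```

A first brightness of 200 with a target ratio of 4 yields a second brightness of 50; a very large target ratio drives the second lighting devices toward the off-light setting mentioned above.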
  • the object is achieved by a computer program product for a computing device, the computer program product comprising computer program code to perform any of the above-mentioned methods when the computer program product is run on a processing unit of the computing device.
  • the object is achieved by a control system for controlling a plurality of lighting devices and an augmented reality device, wherein the plurality of lighting devices and the augmented reality device are located in a physical environment, the augmented reality device comprising a display configured to render a virtual environment as an overlay on the physical environment, the control system comprising one or more processors configured to: determine a size and a location of a virtual area in the virtual environment with respect to the physical environment, identify a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtain location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, select, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, select, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, control the one or more first lighting devices according to one or more first light settings, and control the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.
  • the object is achieved by an augmented reality device for use in the control system.
  • the augmented reality device comprises a display configured to render a virtual environment as an overlay on the physical environment.
  • the augmented reality device may comprise a processor configured to determine a size and a location of a virtual area in the virtual environment with respect to the physical environment and identify a subarea of the physical environment which corresponds to the virtual area based on the location and the size.
  • the processor of the augmented reality device may be configured to obtain location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, select, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, and select, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information.
  • the processor of the augmented reality device may be configured to request a lighting system controller configured to control the lighting devices to perform these steps.
  • the processor of the augmented reality device may be configured to control the one or more first lighting devices according to one or more first light settings and the one or more second lighting devices according to one or more second light settings. Alternatively, this control may be performed by the lighting system controller.
  • Fig. 1 shows schematically an example of a head-mounted augmented reality device for rendering a virtual environment on a display as an overlay on a physical environment;
  • Fig. 2 shows schematically an example of an augmented reality device for rendering a virtual environment on a display as an overlay on a physical environment;
  • Figs. 3a-3d show examples of virtual areas in a virtual environment overlaid on a physical environment;
  • Fig. 4 shows schematically a method of controlling a plurality of lighting devices and an augmented reality device.
  • All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
  • Figs. 1 and 2 show schematically examples of augmented reality devices 120.
  • the augmented reality device 120 is depicted as a head-mounted augmented reality device (e.g. augmented reality glasses), and in Fig. 2 the augmented reality device 120 is depicted as a hand-held augmented reality device (e.g. a smartphone or a tablet pc).
  • a plurality of lighting devices 130, 132, 134 and the augmented reality device 120 are located in a physical environment 140.
  • the augmented reality device 120 comprises a display 122 configured to render a virtual environment 150 as an overlay on the physical environment 140.
  • Fig. 2 further depicts a control system 102 (not shown in Fig. 1).
  • the control system 102 comprises one or more processors 106 (e.g. circuitry, microcontrollers, microchips).
  • the control system 102 may further comprise one or more communication units 104 for communicating with the lighting devices 130, 132, 134 and/or the augmented reality device 120.
  • the one or more processors 106 are configured to: determine a size and a location of a virtual area 154 in the virtual environment 150 with respect to the physical environment 140, identify a subarea of the physical environment 140 which corresponds to the virtual area based on the location and the size and obtain location information indicative of locations of the plurality of lighting devices 130, 132, 134 in the physical environment 140.
  • the one or more processors 106 are further configured to select, from the plurality of lighting devices 130, 132, 134, one or more first lighting devices 130, 132 which are located inside the subarea of the physical environment 140 based on the location information, and select, from the plurality of lighting devices 130, 132, 134, one or more second lighting devices 134 which are located outside the subarea of the physical environment 140 based on the location information.
  • the one or more processors 106 are further configured to control the one or more first lighting devices 130, 132 according to one or more first light settings, and control the one or more second lighting devices 134 according to one or more second light settings different from the one or more first light settings.
  • the control system 102 may comprise a single processor 106 for performing these steps.
  • the processor 106 may, for example, be comprised in a (central) lighting control system (e.g. a bridge, a hub, a smartphone, etc.), in the augmented reality device 120, in a remote (cloud) server, etc.
  • the control system 102 may comprise multiple processors 106 for performing these steps.
  • the processors 106 may be located in different parts of a system 100, which system may comprise one or more lighting devices 130, 132, 134, the control system 102 and/or the augmented reality device 120.
  • the locations of the processors 106 and the steps performed by the respective processors 106 may depend on the system architecture of the control system 102 and/or the system architecture of the system 100. Examples thereof are explained below.
  • the augmented reality device 120 comprises a display 122 for rendering a virtual environment 150 comprising one or more virtual objects 152 as an overlay on a view of the physical environment 140.
  • An example of the physical environment 140 is depicted in Fig. 2.
  • the depicted physical environment 140 comprises a couch and three lighting devices 130, 132, 134.
  • the display 122 may be a (semi-) transparent see-through display, wherein the user can see the physical environment 140 through the display 122, and wherein the display 122 is configured to render the virtual environment 150 comprising one or more virtual objects 152 as an overlay on the physical environment 140.
  • the virtual environment may be rendered directly on the (semi-) transparent see-through display, or it may be projected onto the display 122.
  • augmented reality device 120 may comprise a camera 124 configured to continuously capture images of the physical environment 140 and render the images on the display 122, while rendering the virtual environment comprising one or more virtual objects 152 as an overlay on the images. It should be understood that such augmented reality devices are known in the art, and will therefore not be discussed in further detail.
  • the augmented reality device 120 may comprise a processor 106 configured to render, on the display 122, the virtual environment as an overlay on the physical environment 140.
  • the lighting devices 130, 132, 134 and the augmented reality device 120 are located in the same physical environment 140 (e.g. a room such as a living room, an office, etc.). The user operating the augmented reality device 120 is also located in the physical environment 140.
  • the augmented reality device 120 may comprise a processor configured to determine how to render the virtual environment as an overlay on the physical environment.
  • the processor may, for example, determine to render the virtual environment as an overlay on the physical environment 140 based on a location and/or an orientation of the augmented reality device 120.
  • the processor 106 may, for example, map the virtual environment onto the physical environment.
  • the processor 106 may, for example, use image analysis to analyze an image of the physical environment 140, and select one or more anchor points in the physical environment 140 based on the image analysis.
  • the processor 106 may then anchor virtual objects (e.g. the one or more virtual objects 152 of the virtual environment 150) to the selected anchor points.
  • the processor may be configured to obtain a predefined mapping of the virtual environment onto the physical environment 140.
  • the mapping may, for example, be based on the location and the orientation of the augmented reality device 120 relative to the physical environment 140.
  • Such techniques for mapping a virtual environment onto a physical environment are known in the art and will therefore not be discussed in further detail.
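Although such mappings are known in the art, the basic idea can be illustrated with a planar rigid transform: rotate a virtual coordinate by the yaw of the virtual frame relative to the physical frame, then translate it to an anchor point. Real AR frameworks work with full 6-DoF poses; this 2-D version is a simplification for illustration only:

```python
import math

def virtual_to_physical(point_v, anchor_physical, yaw_rad):
    """Map a 2-D virtual coordinate into physical coordinates via a
    rigid transform: rotation by yaw_rad followed by translation to
    the anchor point in the physical environment."""
    xv, yv = point_v
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    xp = anchor_physical[0] + c * xv - s * yv
    yp = anchor_physical[1] + s * xv + c * yv
    return (xp, yp)
```

Applying this transform to the corner coordinates of the virtual area 154 yields the corresponding physical coordinates, i.e. the subarea of the physical environment 140.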
  • the one or more processors 106 (which may for example be comprised in the augmented reality device 120 or in the cloud) are configured to determine a size and a location of a virtual area 154 in the virtual environment 150 with respect to the physical environment.
  • the virtual area is an area in the virtual environment 150, and may be defined in various ways, as explained below with reference to Figs. 3a-3d.
  • the virtual area 154 covers at least a part of the virtual environment 150.
  • the one or more processors 106 may obtain information indicative of the location and the size of the virtual area 154 from the augmented reality device 120, from a memory (which may be comprised in the augmented reality device 120 or remotely, for instance in a (cloud) server) or from an augmented reality application (which may be running on the augmented reality device 120 or remotely, for instance on a (cloud) server).
  • the one or more processors 106 are further configured to identify a subarea of the physical environment 140 which corresponds to the virtual area 154 based on the location and the size of the virtual area 154. For example, the one or more processors 106 may determine which physical area in the physical environment 140 corresponds to the virtual area 154 in the virtual environment 150 based on a mapping of the virtual environment 150 onto the physical environment 140.
  • the virtual area may be defined as a set of coordinates in the virtual environment 150, which correspond to a set of coordinates in the physical environment 140.
  • the one or more processors 106 may then identify the subarea in the physical environment 140 that corresponds to the virtual area 154.
  • the one or more processors 106 are further configured to obtain location information indicative of locations of the plurality of lighting devices 130, 132, 134 relative to the augmented reality device 120 and/or relative to the physical environment 140. Additionally or alternatively, the one or more processors 106 may be configured to determine the location and/or the orientation of the augmented reality device 120 relative to the physical environment 140. The one or more processors 106 may be configured to receive location information indicative of the relative location/orientation of the augmented reality device 120 and/or the one or more lighting devices 130, 132, 134 relative to the physical environment 140.
  • the location information may, for example, be received from an (indoor) positioning system (such as an RF-based indoor positioning system or a visible light communication (VLC) based positioning system), it may be based on the signal strength of signals communicated between one or more lighting devices 130, 132, 134 and the augmented reality device 120, based on light signals communicated between them, etc.
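For the signal-strength option, a common way to turn a received signal strength into a distance estimate is the log-distance path-loss model. In the sketch below, `tx_power_dbm` (the expected RSSI at 1 m) and the path-loss exponent are typical indoor values assumed for illustration, not calibrated figures from the source:

```python
def distance_from_rssi(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Estimate the distance in metres between a lighting device and
    the AR device from a received signal strength, using the
    log-distance path-loss model:
        distance = 10 ** ((tx_power - rssi) / (10 * n))
    where n is the path-loss exponent of the environment."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))
```

Distance estimates from several lighting devices could then be combined (e.g. by trilateration) to obtain the relative locations mentioned above.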
  • the locations may be indicative of coordinates of the one or more lighting devices 130, 132, 134 and the augmented reality device 120 relative to the physical environment 140.
  • the one or more processors 106 may be configured to obtain the location and/or the orientation of the augmented reality device 120 relative to the physical environment 140 by analyzing the field of view (e.g. field of view 160).
  • the one or more processors 106 may for example be configured to analyze one or more images captured by a camera 124 of the augmented reality device to determine the locations of the lighting devices relative to the physical environment 140 and relative to the augmented reality device 120.
  • the one or more processors 106 may, for example, use image analysis techniques for identifying objects (e.g. lighting devices) in the physical environment 140 to determine their locations.
  • one or more depth cameras and/or depth sensors may be used to detect the dimensions of the physical environment 140. Techniques of obtaining location information of devices in an environment are known in the art and will therefore not be discussed in further detail.
  • the one or more processors 106 are further configured to select, from the plurality of lighting devices 130, 132, 134, one or more first lighting devices 130, 132 which are located inside the subarea of the physical environment.
  • the one or more processors 106 are further configured to select, from the plurality of lighting devices 130, 132, 134, one or more second lighting devices 134 which are located outside the subarea of the physical environment 140.
  • the one or more processors 106 may, for example, compare the locations of the lighting devices 130, 132, 134 (e.g. coordinates of the lighting devices 130, 132, 134) to the location of the subarea (e.g. coordinates of the subarea) to determine which lighting devices are located inside the subarea and which are located outside it.
  • the one or more processors 106 may analyze the field of view of the user (e.g. field of view 160), for example by analyzing one or more images captured by a camera 124 of the augmented reality device to determine the locations of the lighting devices relative to the subarea in the physical environment 140, and select the one or more first and the one or more second lighting devices based thereon.
  • the one or more processors 106 are further configured to control the one or more first lighting devices 130, 132 according to one or more first light settings, and control the one or more second lighting devices 134 according to one or more second light settings different from the one or more first light settings.
  • the one or more processors 106 may, for example, communicate lighting control commands to the lighting devices 130, 132, 134 to control them according to the respective light settings.
  • the control system 102 may comprise a communication unit configured to communicate the lighting control commands to the lighting devices 130, 132, 134.
  • a lighting control command may comprise lighting control instructions for controlling the light output, such as the color, intensity, saturation, beam size, beam shape, etc. of one or more light sources of a lighting device. Referring to the example of Figs. 1 and 2:
  • the one or more processors 106 may determine that lighting devices 130 and 132 are located in the subarea (corresponding to the virtual area), and the one or more processors 106 may control the lighting devices 130 and 132 according to the one or more first light settings.
  • the one or more processors 106 may determine that lighting device 134 is located outside the subarea (corresponding to the virtual area), and the one or more processors 106 may control lighting device 134 according to the one or more second light settings.
  • Figs. 3a-3d show a top view of the examples of virtual areas 154 in a virtual environment 150 mapped onto a physical environment 140. In the example of Fig. 3a, the size and the location of the virtual area 154 are the same as the size and the location of the virtual environment 150.
  • the one or more processors 106 may identify a subarea of the physical environment 140 which corresponds to the virtual area 154, for instance based on a mapping of the virtual environment 150 on the physical environment 140, or based on an analysis of one or more images captured by a camera 124 of the augmented reality device 120. The one or more processors 106 may then select lighting devices 130 and 132 because they are located in the subarea that corresponds to the virtual area 154 and control lighting devices 130 and 132 according to one or more first light settings. The one or more processors 106 may further select lighting device 134 because it is located outside the subarea that corresponds to the virtual area 154 and control the lighting device 134 according to one or more second light settings.
  • the virtual area 154 is a virtual subarea of the virtual environment 150.
  • the one or more processors 106 may identify a subarea of the physical environment 140 which corresponds to the virtual area 154, for instance based on a mapping of the virtual environment 150 on the physical environment 140, or based on an analysis of one or more images captured by a camera 124 of the augmented reality device 120.
  • the one or more processors 106 may then select lighting devices 130 and 132 because they are located in the subarea that corresponds to the virtual area 154 and control lighting devices 130 and 132 according to one or more first light settings.
  • the one or more processors 106 may further select lighting device 134 because it is located outside the subarea that corresponds to the virtual area 154 and control the lighting device 134 according to one or more second light settings.
  • the virtual area 154 is a virtual subarea of the virtual environment 150.
  • the location and size of the virtual area 154 are based on the location of a virtual object 152 (e.g. a virtual character, a stationary virtual object, etc.).
  • the one or more processors 106 may be configured to render the virtual object 152 in the virtual environment 150, determine a virtual location of the virtual object in the virtual environment 150, determine a physical location (e.g. coordinates) in the physical environment 140 that corresponds to the virtual location of the virtual object 152, and determine the size and the location of the virtual area 154 with respect to the physical environment 140 based on the physical location.
  • the size and the location of the virtual area with respect to the physical environment may be determined based on a distance between the augmented reality device 120 and the physical location, and/or based on an orientation of the augmented reality device or the user operating the augmented reality device with respect to the physical location.
  • the location and size of virtual area 154 are determined based on the locations of the augmented reality device 120 and the physical location of the virtual object 152.
  • the one or more processors 106 may identify a subarea of the physical environment 140 which corresponds to the virtual area 154, for instance based on a mapping of the virtual environment 150 on the physical environment 140, or based on an analysis of one or more images captured by a camera 124 of the augmented reality device 120.
  • the one or more processors 106 may then select lighting device 130 because it is located in the subarea that corresponds to the virtual area 154 and control lighting device 130 according to one or more first light settings.
  • the one or more processors 106 may further select lighting devices 132 and 134 because they are located outside the subarea that corresponds to the virtual area 154 and control the lighting devices 132 and 134 according to one or more second light settings.
  • the virtual area 154 of Fig. 3d is a virtual subarea of the virtual environment 150.
  • the location and size of the virtual area 154 are based on the location of a virtual object 152 (e.g. a virtual character, a stationary virtual object, etc.).
  • the size of the virtual area 154 may, for example, be predefined. Different virtual objects may be associated with different sizes.
  • the one or more processors 106 may, for example, access a memory (e.g. comprised in the augmented reality device or in a remote (cloud) server), and retrieve a size of the virtual area 154 associated with the virtual object 152. Similar to the example of Fig. 3c:
  • the one or more processors 106 may be configured to render the virtual object 152 in the virtual environment 150, determine a virtual location of the object in the virtual environment 154, a physical location (e.g. a coordinates) in the physical environment 140 that corresponds to the virtual location of the virtual object 152 and determine the size and the location of the virtual area 154 with respect to the physical environment 140 based on the physical location.
  • the difference between the examples of Fig. 3d and Fig. 3c is that in Fig. 3d the location of the virtual object 152 defines the virtual area 154, whereas in Fig. 3c the virtual area 154 is further defined by the location of the augmented reality device 120.
  • the one or more processors 106 may then select lighting devices 130 and 132 because they are located in the subarea that corresponds to the virtual area 154 and control lighting devices 130 and 132 according to one or more first light settings.
  • the one or more processors 106 may further select lighting device 134 because it is located outside the subarea that corresponds to the virtual area 154 and control lighting device 134 according to one or more second light settings.
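The device selection described for Figs. 3c and 3d can be sketched as follows. This is a minimal illustration only: the `LightingDevice` class, the coordinate values and the circular model of the virtual area 154 are assumptions for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class LightingDevice:
    device_id: int
    x: float  # physical coordinates in the environment (units assumed)
    y: float

def partition_devices(devices, center, radius):
    """Split devices into those inside and outside a circular subarea.

    `center` is the physical location corresponding to the virtual
    object 152, and `radius` models the (predefined) size of the
    virtual area 154, a simplifying assumption for illustration.
    """
    inside, outside = [], []
    for d in devices:
        dist = hypot(d.x - center[0], d.y - center[1])
        (inside if dist <= radius else outside).append(d)
    return inside, outside

# Three devices as in Figs. 1 and 2; positions are invented.
devices = [LightingDevice(130, 1.0, 1.0),
           LightingDevice(132, 2.0, 1.5),
           LightingDevice(134, 6.0, 4.0)]
first, second = partition_devices(devices, center=(1.5, 1.0), radius=2.0)
```

With the invented positions above, devices 130 and 132 fall inside the subarea (the first lighting devices) and device 134 falls outside it (the second lighting device).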
  • the one or more first light settings may have a higher brightness than the one or more second light settings.
  • the one or more second light settings may, for example, be off-light settings and the one or more second lighting devices may thus be switched off.
  • the one or more processors 106 may be configured to determine the one or more first light settings such that the one or more first light settings have a higher brightness than the one or more second light settings. Referring to Figs. 1 and 2, the one or more processors 106 may thus control the one or more second lighting devices 134 to provide less illumination compared to the one or more first lighting devices 130, 132 to decrease the perceived size of the area outside the subarea and/or to increase the perceived size of the subarea.
  • the one or more second light settings may have a higher brightness than the one or more first light settings.
  • the one or more first light settings may, for example, be off-light settings and the one or more first lighting devices may thus be switched off.
  • the one or more processors 106 may be configured to determine the one or more first light settings such that the one or more first light settings have a lower brightness than the one or more second light settings. Referring to Figs. 1 and 2, the one or more processors 106 may thus control the one or more second lighting devices 134 to provide more illumination compared to the one or more first lighting devices 130, 132 to increase the contrast between the subarea and an area outside the subarea.
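The two brightness alternatives above can be expressed as a small helper. The 8-bit brightness scale, the dimming factor and the function name are illustrative assumptions, not taken from the disclosure.

```python
def contrast_settings(base_brightness, emphasize_subarea=True, factor=0.3):
    """Return (first, second) brightness settings.

    If `emphasize_subarea` is True, the second lighting devices (outside
    the subarea) are dimmed, which may increase the perceived size of
    the subarea. Otherwise the first lighting devices are dimmed, e.g.
    for a dark virtual area, to increase the contrast with the area
    outside the subarea.
    """
    dimmed = int(base_brightness * factor)
    if emphasize_subarea:
        return base_brightness, dimmed   # first brighter than second
    return dimmed, base_brightness       # second brighter than first
```

A `factor` of 0 corresponds to the off-light settings mentioned above, switching the dimmed group off entirely.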
  • the one or more processors 106 may be further configured to obtain type information indicative of the types of the plurality of lighting devices 130, 132, 134 located inside and/or outside the subarea.
  • the one or more processors 106 may be configured to receive the type information from the lighting devices, from a memory (e.g. a local or a remote memory), from a lighting system controller, etc.
  • the one or more processors 106 may be further configured to select one or more first lighting devices located inside the subarea based on the types of lighting devices located inside the subarea. Additionally or alternatively, the one or more processors 106 may be configured to select one or more second lighting devices located outside the subarea based on the types of lighting devices located outside the subarea. Different types of lighting devices may have different light emission properties.
  • the beam size and/or shape of the lighting devices may differ between lighting devices, or the spectral range of illumination (e.g. colored light vs. white light only) may differ between lighting devices.
  • a wall washer provides a completely different light effect compared to a spotlight.
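Type-based selection might look like the following sketch, where the type table and its `colored` attribute are hypothetical stand-ins for the type information described above:

```python
# Hypothetical type metadata; in the described system the type
# information would come from the lighting devices, a memory or a
# lighting system controller.
DEVICE_TYPES = {
    "spotlight":   {"beam": "narrow", "colored": True},
    "wall_washer": {"beam": "wide",   "colored": True},
    "white_bulb":  {"beam": "wide",   "colored": False},
}

def select_color_capable(devices):
    """From (device_id, type) pairs, keep devices whose type can render
    colored light, e.g. when the target contrast is a color contrast."""
    return [dev for dev, t in devices if DEVICE_TYPES[t]["colored"]]
```

For example, given devices 130 (spotlight), 132 (white bulb) and 134 (wall washer), only 130 and 134 would be selected for a color contrast.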
  • the one or more processors 106 may select the one or more first and/or second lighting devices to provide a target contrast between the subarea and an area outside the subarea.
  • the target contrast may, for example, be a color contrast and/or a brightness contrast.
  • the target contrast may be associated with the virtual object 152 or the virtual area 154, and the one or more processors 106 may be configured to receive the target contrast from the augmented reality device 120, from a memory (which may be comprised in the augmented reality device 120 or remotely, for instance in a (cloud) server) or from an augmented reality application (which may be running on the augmented reality device 120 or remotely, for instance on a (cloud) server).
  • the one or more processors 106 may be configured to determine the one or more first light settings and the one or more second light settings such that a perceived contrast between the subarea and an area outside the subarea is increased or decreased.
  • the one or more processors 106 may select the one or more first and/or second light settings to provide a target contrast between the subarea and an area outside the subarea.
  • the target contrast may, for example, be a color contrast and/or a brightness contrast.
  • the target contrast may be associated with the virtual object 152 or the virtual area 154, and the one or more processors 106 may be configured to receive the target contrast from the augmented reality device 120, from a memory (which may be comprised in the augmented reality device 120 or remotely, for instance in a (cloud) server) or from an augmented reality application (which may be running on the augmented reality device 120 or remotely, for instance on a (cloud) server).
  • the one or more processors 106 may be further configured to receive an input indicative of the virtual area 154.
  • the input may be a user input received via a user interface, and the user input may be indicative of the size and the location of the virtual area.
  • the user interface may be a user interface of the augmented reality device 120.
  • Fig. 4 shows schematically a method of controlling a plurality of lighting devices and an augmented reality device.
  • the plurality of lighting devices and the augmented reality device are located in a physical environment.
  • the method 400 comprises: rendering 402, on a display of the augmented reality device, a virtual environment as an overlay on the physical environment, determining 404 a size and a location of a virtual area in the virtual environment with respect to the physical environment, identifying 406 a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtaining 408 location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, selecting 410, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, selecting 412, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, controlling 414 the one or more first lighting devices according to one or more first light settings, and controlling the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.
  • the method 400 may be executed by computer program code of a computer program product when the computer program product is run on a processing unit of a computing device, such as the one or more processors 106.
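Taken together, the selection and control steps of method 400 can be sketched as follows, assuming the rendering, determining and identifying steps have already produced a rectangular subarea in physical coordinates; the names, the rectangle model and the brightness values are illustrative assumptions:

```python
FIRST_SETTINGS = {"brightness": 254}   # assumed 8-bit brightness scale
SECOND_SETTINGS = {"brightness": 50}

def run_method(device_locations, subarea):
    """Assign light settings per device.

    `device_locations` maps device id to an (x, y) physical location
    (the obtained location information); `subarea` is an axis-aligned
    rectangle ((x0, y0), (x1, y1)) identified from the virtual area.
    Returns the settings each device would be controlled with.
    """
    (x0, y0), (x1, y1) = subarea
    settings = {}
    for dev, (x, y) in device_locations.items():
        inside = x0 <= x <= x1 and y0 <= y <= y1   # select first/second
        settings[dev] = FIRST_SETTINGS if inside else SECOND_SETTINGS
    return settings

result = run_method({130: (1.0, 1.0), 132: (2.0, 2.0), 134: (5.0, 5.0)},
                    subarea=((0.0, 0.0), (3.0, 3.0)))
```

Here devices 130 and 132 receive the first light settings and device 134 the second light settings, mirroring the example of Figs. 1 and 2.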
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • Use of the verb "comprise” and its conjugations does not exclude the presence of elements or steps other than those stated in a claim.
  • the article “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer or processing unit. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
  • aspects of the invention may be implemented in a computer program product, which may be a collection of computer program instructions stored on a computer readable storage device which may be executed by a computer.
  • the instructions of the present invention may be in any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs) or Java classes.
  • the instructions can be provided as complete executable programs, partial executable programs, as modifications to existing programs (e.g. updates) or extensions for existing programs (e.g. plugins).
  • parts of the processing of the present invention may be distributed over multiple computers or processors or even the ‘cloud’.
  • Storage media suitable for storing computer program instructions include all forms of nonvolatile memory, including but not limited to EPROM, EEPROM and flash memory devices, magnetic disks such as internal and external hard disk drives, removable disks and CD-ROM disks.
  • the computer program product may be distributed on such a storage medium, or may be offered for download through HTTP, FTP, email or through a server connected to a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of controlling a plurality of lighting devices and an augmented reality device is disclosed. The plurality of lighting devices and the augmented reality device are located in a physical environment. The method comprises: rendering, on a display of the augmented reality device, a virtual environment as an overlay on the physical environment, determining a size and a location of a virtual area in the virtual environment with respect to the physical environment, identifying a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtaining location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, selecting, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, selecting, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, controlling the one or more first lighting devices according to one or more first light settings, and controlling the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.

Description

A METHOD OF CONTROLLING A PLURALITY OF LIGHTING DEVICES AND AN
AUGMENTED REALITY DEVICE
FIELD OF THE INVENTION
The invention relates to a method of controlling a plurality of lighting devices and an augmented reality device. The invention further relates to a computer program product for executing the method. The invention further relates to a control system for controlling a plurality of lighting devices and an augmented reality device.
BACKGROUND
Recent developments in augmented reality (AR) enable users to interact with virtual objects. These virtual objects may be displayed as an overlay on top of the physical environment, for example on a smartphone or on AR-glasses, thereby creating a so-called mixed reality environment. This technology enables many different types of applications, for example interaction with avatars of other users who may be virtually present in the same physical environment, or interaction with virtual characters or other objects that are rendered as an overlay on the physical environment. Users may, for example, chat or play games with virtually present users (or with artificially created characters) in such a mixed reality environment.
EP 3484249 Al discloses a control system configured for controlling at least one controllable device. The device has been assigned a corresponding identifier and is configured for transmitting an identification signal comprising the identifier of the device. The control system comprises a display for displaying a control item configured for controlling the controllable device. The control system also comprises a receiver configured for wirelessly receiving the identification signal comprising the identifier. The control system is configured for assigning a position of the control item on the display to the device identified by means of said received identifier.
SUMMARY OF THE INVENTION
The inventors have realized that when a virtual environment is rendered as an overlay onto a physical environment to create a mixed reality environment, the size of the physical environment may not always match the virtual environment. Consequently, the user operating the augmented reality device may perceive the size of the virtual environment incorrectly. It is therefore an object to provide an augmented reality system that improves the perception of size of the virtual environment for a user operating an augmented reality device.
According to a first aspect, the object is achieved by a method of controlling a plurality of lighting devices and an augmented reality device, wherein the plurality of lighting devices and the augmented reality device are located in a physical environment, the method comprising: rendering, on a display of the augmented reality device, a virtual environment as an overlay on the physical environment, determining a size and a location of a virtual area in the virtual environment with respect to the physical environment, identifying a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtaining location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, selecting, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, selecting, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, controlling the one or more first lighting devices according to one or more first light settings, and controlling the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.
By determining which lighting devices are located in the subarea of the physical environment (which corresponds to the virtual area) and by controlling these lighting devices differently from lighting devices outside the subarea, the contrast between the subarea (and therewith the virtual area) and an area outside the subarea is adjusted. By adjusting this contrast, an augmented reality system that improves the perception of size of the virtual environment is provided. The one or more first light settings may have a higher brightness than the one or more second light settings. The one or more second light settings may, for example, be off-light settings and the one or more second lighting devices may thus be switched off. The one or more first light settings may be selected such that the one or more first light settings have a higher brightness than the one or more second light settings. Controlling the one or more second lighting devices to provide less illumination compared to the one or more first lighting devices may decrease the perceived size of the area outside the subarea and/or may increase the perceived size of the subarea, and therewith the size of the virtual area.
Alternatively, the one or more second light settings may have a higher brightness than the one or more first light settings. The one or more first light settings may, for example, be off-light settings and the one or more first lighting devices may thus be switched off. This may be beneficial under certain circumstances, for instance when the virtual area is a dark virtual area. It may then be beneficial to control the one or more second lighting devices to provide more illumination than the one or more first lighting devices to increase the contrast between the subarea and an area outside the subarea.
The method may further comprise: rendering a virtual object in the virtual environment, determining a virtual location of the virtual object in the virtual environment, determining a physical location in the physical environment that corresponds to the virtual location and/or an orientation of the augmented reality device (and/or the user operating the augmented reality device) with respect to the physical location, wherein the size and the location of the virtual area with respect to the physical environment are determined based on the physical location and/or the orientation. This is beneficial, because the perceived size of the subarea (and the virtual area) is adjusted based on the location of the virtual object with respect to the user operating the augmented reality device.
The method may further comprise: determining a distance between the physical location and the augmented reality device. The size and the location of the virtual area with respect to the physical environment are determined based on the distance. Thus, lighting devices located between the location of the augmented reality device and the physical location (corresponding to the virtual location of the virtual object) are controlled differently from lighting devices beyond that distance.
The virtual object may be an avatar of a user or a virtual character. The virtual object may move from a first virtual location to a second virtual location in the virtual environment, thereby dynamically adjusting the size of the virtual area and therewith the size of the physical area and the control of the lighting devices. The augmented reality device may be located inside the subarea of the physical environment. This would result in an adjustment of the contrast between the (sub)area wherein the augmented reality device is located and an area outside the subarea. The size and the location of the virtual area may be determined based on one or more distances between the augmented reality device and one or more respective edges of the virtual area. Alternatively, the augmented reality device may be located outside the subarea of the physical environment. This would result in an adjustment of the contrast between a (sub)area remote from the augmented reality device and the area outside that subarea.
The size and the location of the virtual area may be the same as the size and the location of the virtual environment. In other words, the subarea may be the same size as the (full) virtual environment.
The method may further comprise: obtaining type information indicative of the types of the plurality of lighting devices located inside and/or outside the subarea. The selection of the one or more first lighting devices located inside the subarea may be further based on the types of lighting devices located inside the subarea, and/or wherein the selection of the one or more second lighting devices located outside the subarea may be further based on the types of lighting devices located outside the subarea. Different types of lighting devices may have different light emission properties. For instance, the beam size and/or shape of the lighting devices may differ between lighting devices, or the spectral range of illumination (e.g. colored light vs. white light only) may differ between lighting devices. The one or more first and/or second lighting devices may be selected to provide a target contrast between the subarea and an area outside subarea. The target contrast may, for example, be a color contrast and/or a brightness contrast.
The method may further comprise: determining the one or more first light settings and the one or more second light settings such that a perceived contrast between the subarea and an area outside the subarea is increased or decreased. The one or more first and/or second light settings may be selected, for instance to provide a target contrast between the subarea and the area outside subarea. The target contrast may, for example, be a color contrast and/or a brightness contrast.
The method may further comprise: receiving an input indicative of the virtual area. The input may be a user input received via a user interface, and the user input may be indicative of the size and the location of the virtual area. This is beneficial, because the user can indicate a desired target size (and location) of the virtual area, whereupon the contrast between the subarea (and therewith the virtual area) and the area outside the subarea is adjusted.
According to a second aspect, the object is achieved by a computer program product for a computing device, the computer program product comprising computer program code to perform any of the above-mentioned methods when the computer program product is run on a processing unit of the computing device.
According to a third aspect, the object is achieved by a control system for controlling a plurality of lighting devices and an augmented reality device, wherein the plurality of lighting devices and the augmented reality device are located in a physical environment, the augmented reality device comprising a display configured to render a virtual environment as an overlay on the physical environment, the control system comprising one or more processors configured to: determine a size and a location of a virtual area in the virtual environment with respect to the physical environment, identify a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtain location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, select, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, select, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, control the one or more first lighting devices according to one or more first light settings, and control the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.
According to a fourth aspect, the object is achieved by an augmented reality device for use in the control system. The augmented reality device comprises a display configured to render a virtual environment as an overlay on the physical environment. The augmented reality device may comprise a processor configured to determine a size and a location of a virtual area in the virtual environment with respect to the physical environment and identify a subarea of the physical environment which corresponds to the virtual area based on the location and the size.
Optionally, the processor of the augmented reality device may be configured to obtain location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, select, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, and select, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information. Alternatively, the processor of the augmented reality device may be configured to request a lighting system controller configured to control the lighting devices to perform these steps.
Additionally, the processor of the augmented reality device may be configured to control the one or more first lighting devices according to one or more first light settings and the one or more second lighting devices according to one or more second light settings. Alternatively, this control may be performed by the lighting system controller.
It should be understood that the computer program product, control system and the augmented reality device may have similar and/or identical embodiments and advantages as the above-mentioned methods.
BRIEF DESCRIPTION OF THE DRAWINGS
The above, as well as additional objects, features and advantages of the disclosed systems, devices and methods will be better understood through the following illustrative and non-limiting detailed description of embodiments of devices and methods, with reference to the appended drawings, in which:
Fig. 1 shows schematically an example of a head-mounted augmented reality device for rendering a virtual environment on a display as an overlay on a physical environment;
Fig. 2 shows schematically an example of an augmented reality device for rendering a virtual environment on a display as an overlay on a physical environment;
Figs. 3a-3d show examples of virtual areas in a virtual environment overlaid on a physical environment; and
Fig. 4 shows schematically a method of controlling a plurality of lighting devices and an augmented reality device. All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
DETAILED DESCRIPTION
Figs. 1 and 2 show schematically examples of augmented reality devices 120. In Fig. 1, the augmented reality device 120 is depicted as a head-mounted augmented reality device (e.g. augmented reality glasses), and in Fig. 2 the augmented reality device 120 is depicted as a hand-held augmented reality device (e.g. a smartphone or a tablet pc). A plurality of lighting devices 130, 132, 134 and the augmented reality device 120 are located in a physical environment 140. The augmented reality device 120 comprises a display 122 configured to render a virtual environment 150 as an overlay on the physical environment 140. Fig. 2 further depicts a control system 102 (not shown in Fig. 1) for controlling the lighting devices 130, 132, 134 and the augmented reality device 120. The control system 102 comprises one or more processors 106 (e.g. circuitry, microcontrollers, microchips). The control system 102 may further comprise one or more communication units 104 for communicating with the lighting devices 130, 132, 134 and/or the augmented reality device 120. The one or more processors 106 are configured to: determine a size and a location of a virtual area 154 in the virtual environment 150 with respect to the physical environment 140, identify a subarea of the physical environment 140 which corresponds to the virtual area based on the location and the size and obtain location information indicative of locations of the plurality of lighting devices 130, 132, 134 in the physical environment 140. The one or more processors 106 are further configured to select, from the plurality of lighting devices 130, 132, 134, one or more first lighting devices 130, 132 which are located inside the subarea of the physical environment 140 based on the location information, and select, from the plurality of lighting devices 130, 132, 134, one or more second lighting devices 134 which are located outside the subarea of the physical environment 140 based on the location information. 
The one or more processors 106 are further configured to control the one or more first lighting devices 130, 132 according to one or more first light settings, and control the one or more second lighting devices 134 according to one or more second light settings different from the one or more first light settings.
The control system 102 may comprise a single processor 106 for performing these steps. The processor 106 may, for example, be comprised in a (central) lighting control system (e.g. a bridge, a hub, a smartphone, etc.), in the augmented reality device 120, in a remote (cloud) server, etc. Alternatively, the control system 102 may comprise multiple processors 106 for performing these steps. The processors 106 may be located in different parts of a system 100, which system may comprise one or more lighting devices 130, 132, 134, the control system 102 and/or the augmented reality device 120. The locations of the processors 106 and the steps performed by the respective processors 106 may depend on the system architecture of the control system 102 and/or the system architecture of the system 100. Examples thereof are explained below.
Figs. 1 and 2 further depict three lighting devices 130, 132, 134 located in the physical environment 140. The lighting devices 130, 132, 134 may be any type of lighting device comprising one or more (LED) light sources, and a processing unit for controlling the light output (e.g. hue, saturation and/or brightness) of the one or more light sources based on received control signals. The lighting devices 130, 132, 134 may be arranged for providing general lighting, such as task lighting, ambient lighting, atmosphere lighting, accent lighting, indoor lighting, outdoor lighting, etc. The lighting devices 130, 132, 134 may further comprise a communication unit (not shown) configured to receive lighting control commands (and, optionally, orientation control commands). The communication unit may comprise hardware for communicating via one or more wireless communication protocols, for example Bluetooth, Wi-Fi, Li-Fi, 3G, 4G, 5G or ZigBee. A specific communication technology may be selected based on the system architecture of the lighting system.
The augmented reality device 120 comprises a display 122 for rendering a virtual environment 150 comprising one or more virtual objects 152 as an overlay on a view of the physical environment 140. An example of the physical environment 140 is depicted in Fig. 2. The depicted physical environment 140 comprises a couch and three lighting devices 130, 132, 134. The display 122 may be a (semi-) transparent see-through display, wherein the user can see the physical environment 140 through the display 122, and wherein the display 122 is configured to render the virtual environment 150 comprising one or more virtual objects 152 as an overlay on the physical environment 140. The display 122 may be integrated in the (semi-) transparent see-through display, or the virtual environment may be projected on the display 122. Alternatively, the augmented reality device 120 may comprise a camera 124 configured to continuously capture images of the physical environment 140 and render the images on the display 122, while rendering the virtual environment comprising one or more virtual objects 152 as an overlay on the images. It should be understood that such augmented reality devices are known in the art, and will therefore not be discussed in further detail. The augmented reality device 120 may comprise a processor 106 configured to render, on the display 122, the virtual environment as an overlay on the physical environment 140. The lighting devices 130, 132, 134 and the augmented reality device 120 are located in the same physical environment 140 (e.g. a room such as a living room, an office, etc.). The user operating the augmented reality device 120 is also located in the physical environment 140. By rendering the virtual environment as an overlay on the physical environment 140, a so-called mixed reality environment is created, wherein the user can see both physical (real-life) objects and virtual objects.
The augmented reality device 120 may comprise a processor configured to determine how to render the virtual environment as an overlay on the physical environment. The processor may, for example, determine how to render the virtual environment as an overlay on the physical environment 140 based on a location and/or an orientation of the augmented reality device 120. The processor 106 may, for example, map the virtual environment onto the physical environment. The processor 106 may, for example, use image analysis to analyze an image of the physical environment 140, and select one or more anchor points in the physical environment 140 based on the image analysis. The processor 106 may then anchor virtual objects (e.g. virtual characters, virtual furniture, a virtual space, etc.) of the virtual environment to the physical environment 140 based on the anchor points. Alternatively, the processor may be configured to obtain a predefined mapping of the virtual environment onto the physical environment 140. The mapping may, for example, be based on the location and the orientation of the augmented reality device 120 relative to the physical environment 140. Such techniques for mapping a virtual environment onto a physical environment are known in the art and will therefore not be discussed in further detail.
The one or more processors 106 (which may for example be comprised in the augmented reality device 120 or in the cloud) are configured to determine a size and a location of a virtual area 154 in the virtual environment 150 with respect to the physical environment. The virtual area is an area in the virtual environment 150, and may be defined in various ways, as explained below with reference to Figs. 3a-3d. In general, the virtual area 154 covers at least a part of the virtual environment 150. The one or more processors 106 may obtain information indicative of the location and the size of the virtual area 154 from the augmented reality device 120, from a memory (which may be comprised in the augmented reality device 120 or remotely, for instance in a (cloud) server) or from an augmented reality application (which may be running on the augmented reality device 120 or remotely, for instance on a (cloud) server). The one or more processors 106 are further configured to identify a subarea of the physical environment 140 which corresponds to the virtual area 154 based on the location and the size of the virtual area 154. For example, the one or more processors 106 may determine which physical area in the physical environment 140 corresponds to the virtual area 154 in the virtual environment 150 based on a mapping of the virtual environment 150 onto the physical environment 140. For example, the virtual area may be defined as a set of coordinates in the virtual environment 150, which correspond to a set of coordinates in the physical environment 140. The one or more processors 106 may then identify the subarea in the physical environment 140 that corresponds to the virtual area 154.
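The coordinate-based correspondence described above can be illustrated as follows. This is a minimal sketch only: the translate-and-scale mapping, the rectangular area model, and all function and variable names are assumptions for illustration, as the text does not prescribe a concrete implementation.

```python
# Illustrative sketch only: the mapping of a virtual area onto a physical
# subarea is assumed here to be a simple 2D translate-and-scale transform,
# with areas represented as axis-aligned rectangles ((x0, y0), (x1, y1)).

def virtual_to_physical(point, origin, scale):
    """Map a 2D virtual-environment coordinate to a physical coordinate."""
    vx, vy = point
    ox, oy = origin
    return (ox + vx * scale, oy + vy * scale)

def identify_subarea(virtual_area, origin, scale):
    """Return the physical subarea (rectangle) corresponding to the
    given virtual area, by mapping both of its corner coordinates."""
    (vx0, vy0), (vx1, vy1) = virtual_area
    corner0 = virtual_to_physical((vx0, vy0), origin, scale)
    corner1 = virtual_to_physical((vx1, vy1), origin, scale)
    return (corner0, corner1)
```

With a predefined mapping (origin and scale), the same function identifies the physical subarea whether the virtual area spans the whole virtual environment or only a part of it.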
The one or more processors 106 are further configured to obtain location information indicative of locations of the plurality of lighting devices 130, 132, 134 relative to the augmented reality device 120 and/or relative to the physical environment 140. Additionally or alternatively, the one or more processors 106 may be configured to determine the location and/or the orientation of the augmented reality device 120 relative to the physical environment 140. The one or more processors 106 may be configured to receive location information indicative of the relative location/orientation of the augmented reality device 120 and/or the one or more lighting devices 130, 132, 134 relative to the physical environment 140. The location information may, for example, be received from an (indoor) positioning system (such as an RF-based indoor positioning system or a visible light communication (VLC) based positioning system), it may be based on the signal strength of signals communicated between one or more lighting devices 130, 132, 134 and the augmented reality device 120, based on light signals communicated between them, etc. The locations may be indicative of coordinates of the one or more lighting devices 130, 132, 134 and the augmented reality device 120 relative to the physical environment 140. Alternatively, the one or more processors 106 may be configured to obtain the location and/or the orientation of the augmented reality device 120 relative to the physical environment 140 by analyzing the field of view (e.g. field of view 160). The one or more processors 106 may for example be configured to analyze one or more images captured by a camera 124 of the augmented reality device to determine the locations of the lighting devices relative to the physical environment 140 and relative to the augmented reality device 120. The one or more processors 106 may, for example, use image analysis techniques for identifying objects (e.g. 
lighting devices) in the physical environment 140 to determine their locations. Additionally, one or more depth cameras and/or depth sensors may be used to detect the dimensions of the physical environment 140. Techniques of obtaining location information of devices in an environment are known in the art and will therefore not be discussed in further detail.
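As one concrete example of the signal-strength-based location information mentioned above, the distance between a lighting device and the augmented reality device can be estimated from a received signal strength (RSSI) measurement using the well-known log-distance path-loss model. The parameter values below (reference power at 1 m, path-loss exponent) are illustrative assumptions and are device- and environment-specific in practice.

```python
# Illustrative sketch: log-distance path-loss model for estimating the
# distance (in meters) between two radios from a measured RSSI value.
# tx_power_dbm is the (assumed) RSSI measured at 1 m from the transmitter.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate distance in meters from a measured RSSI in dBm."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))
```

Distances estimated this way from several lighting devices can be combined (e.g. by trilateration) to obtain relative coordinates.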
The one or more processors 106 are further configured to select, from the plurality of lighting devices 130, 132, 134, one or more first lighting devices 130, 132 which are located inside the subarea of the physical environment. The one or more processors 106 are further configured to select, from the plurality of lighting devices 130, 132, 134, one or more second lighting devices 134 which are located outside the subarea of the physical environment 140. The one or more processors 106 may, for example, compare the locations of the lighting devices 130, 132, 134 (e.g. coordinates of the lighting devices 130, 132, 134) to the location of the subarea (e.g. a set of coordinates defining the subarea in the physical environment), and select the one or more first and the one or more second lighting devices based thereon. Alternatively, the one or more processors 106 may analyze the field of view of the user (e.g. field of view 160), for example by analyzing one or more images captured by a camera 124 of the augmented reality device to determine the locations of the lighting devices relative to the subarea in the physical environment 140, and select the one or more first and the one or more second lighting devices based thereon.
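The selection described above amounts to a per-device point-in-region test against the subarea. A minimal sketch, assuming a rectangular subarea and 2D device coordinates (the data model and names are assumptions):

```python
# Illustrative sketch: partition lighting devices into those inside the
# subarea (the "first" devices) and those outside it (the "second" devices).

def partition_lighting_devices(device_locations, subarea):
    """device_locations: {device_id: (x, y)} physical coordinates.
    subarea: ((x0, y0), (x1, y1)) rectangle in physical coordinates.
    Returns (first_devices, second_devices)."""
    (x0, y0), (x1, y1) = subarea
    first, second = [], []
    for device_id, (x, y) in device_locations.items():
        inside = x0 <= x <= x1 and y0 <= y <= y1
        (first if inside else second).append(device_id)
    return first, second
```

For the layout of Figs. 1 and 2, devices 130 and 132 would land in the first list and device 134 in the second.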
The one or more processors 106 are further configured to control the one or more first lighting devices 130, 132 according to one or more first light settings, and control the one or more second lighting devices 134 according to one or more second light settings different from the one or more first light settings. The one or more processors 106 may, for example, communicate lighting control commands to the lighting devices 130, 132, 134 to control them according to the respective light settings. The control system 102 may comprise a communication unit configured to communicate the lighting control commands to the lighting devices 130, 132, 134. A lighting control command may comprise lighting control instructions for controlling the light output, such as the color, intensity, saturation, beam size, beam shape, etc. of one or more light sources of a lighting device. Referring to the example of Figs. 1 and 2, the one or more processors 106 may determine that lighting devices 130 and 132 are located in the subarea (corresponding to the virtual area), and the one or more processors 106 may control the lighting devices 130 and 132 according to the one or more first light settings. The one or more processors 106 may determine that lighting device 134 is located outside the subarea (corresponding to the virtual area), and the one or more processors 106 may control lighting device 134 according to the one or more second light settings. Figs. 3a-3d show top views of examples of virtual areas 154 in a virtual environment 150 mapped onto a physical environment 140. In the example of Fig. 3a, the size and the location of the virtual area 154 are the same as the size and the location of the virtual environment 150.
In this example, the one or more processors 106 may identify a subarea of the physical environment 140 which corresponds to the virtual area 154, for instance based on a mapping of the virtual environment 150 on the physical environment 140, or based on an analysis of one or more images captured by a camera 124 of the augmented reality device 120. The one or more processors 106 may then select lighting devices 130 and 132 because they are located in the subarea that corresponds to the virtual area 154 and control lighting devices 130 and 132 according to one or more first light settings. The one or more processors 106 may further select lighting device 134 because it is located outside the subarea that corresponds to the virtual area 154 and control the lighting device 134 according to one or more second light settings.
In the example of Fig. 3b, the virtual area 154 is a virtual subarea of the virtual environment 150. In this example, the one or more processors 106 may identify a subarea of the physical environment 140 which corresponds to the virtual area 154, for instance based on a mapping of the virtual environment 150 on the physical environment 140, or based on an analysis of one or more images captured by a camera 124 of the augmented reality device 120. The one or more processors 106 may then select lighting devices 130 and 132 because they are located in the subarea that corresponds to the virtual area 154 and control lighting devices 130 and 132 according to one or more first light settings. The one or more processors 106 may further select lighting device 134 because it is located outside the subarea that corresponds to the virtual area 154 and control the lighting device 134 according to one or more second light settings.
In the example of Fig. 3c, the virtual area 154 is a virtual subarea of the virtual environment 150. In this example, the location and size of the virtual area 154 are based on the location of a virtual object 152 (e.g. a virtual character, a stationary virtual object, etc.). The one or more processors 106 may be configured to render the virtual object 152 in the virtual environment 150, determine a virtual location of the virtual object in the virtual environment 150, determine a physical location (e.g. coordinates) in the physical environment 140 that corresponds to the virtual location of the virtual object 152, and determine the size and the location of the virtual area 154 with respect to the physical environment 140 based on the physical location. Optionally, the size and the location of the virtual area with respect to the physical environment may be determined based on a distance between the augmented reality device 120 and the physical location, and/or based on an orientation of the augmented reality device or the user operating the augmented reality device with respect to the physical location. In other words, the location and size of virtual area 154 are determined based on the locations of the augmented reality device 120 and the physical location of the virtual object 152. In the example of Fig. 3c, the one or more processors 106 may identify a subarea of the physical environment 140 which corresponds to the virtual area 154, for instance based on a mapping of the virtual environment 150 on the physical environment 140, or based on an analysis of one or more images captured by a camera 124 of the augmented reality device 120. The one or more processors 106 may then select lighting device 130 because it is located in the subarea that corresponds to the virtual area 154 and control lighting device 130 according to one or more first light settings.
The one or more processors 106 may further select lighting devices 132 and 134 because they are located outside the subarea that corresponds to the virtual area 154 and control the lighting devices 132 and 134 according to one or more second light settings.
Similar to the example of Fig. 3c, the virtual area 154 of Fig. 3d is a virtual subarea of the virtual environment 150. The location and size of the virtual area 154 are based on the location of a virtual object 152 (e.g. a virtual character, a stationary virtual object, etc.). The size of the virtual area 154 may, for example, be predefined. Different virtual objects may be associated with different sizes. The one or more processors 106 may, for example, access a memory (e.g. comprised in the augmented reality device or in a remote (cloud) server), and retrieve a size of the virtual area 154 associated with the virtual object 152. Similar to the example of Fig. 3c, the one or more processors 106 may be configured to render the virtual object 152 in the virtual environment 150, determine a virtual location of the virtual object in the virtual environment 150, determine a physical location (e.g. coordinates) in the physical environment 140 that corresponds to the virtual location of the virtual object 152, and determine the size and the location of the virtual area 154 with respect to the physical environment 140 based on the physical location. The difference between the examples of Fig. 3d and Fig. 3c is that in Fig. 3d the location of the virtual object 152 defines the virtual area 154, whereas in Fig. 3c the virtual area 154 is further defined by the location of the augmented reality device 120. In the example of Fig. 3d, the one or more processors 106 may then select lighting devices 130 and 132 because they are located in the subarea that corresponds to the virtual area 154 and control lighting devices 130 and 132 according to one or more first light settings. The one or more processors 106 may further select lighting device 134 because it is located outside the subarea that corresponds to the virtual area 154 and control lighting device 134 according to one or more second light settings.
The one or more first light settings may have a higher brightness than the one or more second light settings. The one or more second light settings may, for example, be off-light settings and the one or more second lighting devices may thus be switched off. The one or more processors 106 may be configured to determine the one or more first light settings such that the one or more first light settings have a higher brightness than the one or more second light settings. Referring to Figs. 1 and 2, the one or more processors 106 may thus control the one or more second lighting devices 134 to provide less illumination compared to the one or more first lighting devices 130, 132 to decrease the perceived size of the area outside the subarea and/or to increase the perceived size of the subarea.
Alternatively, the one or more second light settings may have a higher brightness than the one or more first light settings. The one or more first light settings may, for example, be off-light settings and the one or more first lighting devices may thus be switched off. The one or more processors 106 may be configured to determine the one or more first light settings such that the one or more first light settings have a lower brightness than the one or more second light settings. Referring to Figs. 1 and 2, the one or more processors 106 may thus control the one or more second lighting devices 134 to provide more illumination compared to the one or more first lighting devices 130, 132 to increase the contrast between the subarea and an area outside the subarea.
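The two complementary configurations described above (a brighter subarea to increase its perceived size, or brighter surroundings to increase contrast with the subarea) can be summarized in a small helper. The 0-254 brightness scale is an assumption (it matches common connected-lighting APIs), and 0 corresponds to an off-setting:

```python
# Illustrative sketch: derive (first, second) brightness settings on an
# assumed 0-254 scale, where 0 represents an off-setting.

def light_settings(emphasize_subarea, high=254, low=0):
    """Return (first_brightness, second_brightness).
    emphasize_subarea=True  -> first devices bright, second devices dim/off.
    emphasize_subarea=False -> the inverse configuration."""
    return (high, low) if emphasize_subarea else (low, high)
```

Either tuple is then applied device by device via the lighting control commands described earlier.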
The one or more processors 106 may be further configured to obtain type information indicative of the types of the plurality of lighting devices 130, 132, 134 located inside and/or outside the subarea. The one or more processors 106 may be configured to receive the type information from the lighting devices, from a memory (e.g. a local or a remote memory), from a lighting system controller, etc. The one or more processors 106 may be further configured to select the one or more first lighting devices located inside the subarea based on the types of lighting devices located inside the subarea. Additionally or alternatively, the one or more processors 106 may be configured to select the one or more second lighting devices located outside the subarea based on the types of lighting devices located outside the subarea. Different types of lighting devices may have different light emission properties. For instance, the beam size and/or shape may differ between lighting devices, or the spectral range of illumination (e.g. colored light vs. white light only) may differ between lighting devices. For instance, a wall washer provides a completely different light effect compared to a spotlight. The one or more processors 106 may select the one or more first and/or second lighting devices to provide a target contrast between the subarea and an area outside the subarea. The target contrast may, for example, be a color contrast and/or a brightness contrast. The target contrast may be associated with the virtual object 152 or the virtual area 154, and the one or more processors 106 may be configured to receive the target contrast from the augmented reality device 120, from a memory (which may be comprised in the augmented reality device 120 or remotely, for instance in a (cloud) server) or from an augmented reality application (which may be running on the augmented reality device 120 or remotely, for instance on a (cloud) server).
Additionally or alternatively, the one or more processors 106 may be configured to determine the one or more first light settings and the one or more second light settings such that a perceived contrast between the subarea and an area outside the subarea is increased or decreased. The one or more processors 106 may select the one or more first and/or second light settings to provide a target contrast between the subarea and an area outside the subarea. The target contrast may, for example, be a color contrast and/or a brightness contrast. The target contrast may be associated with the virtual object 152 or the virtual area 154, and the one or more processors 106 may be configured to receive the target contrast from the augmented reality device 120, from a memory (which may be comprised in the augmented reality device 120 or remotely, for instance in a (cloud) server) or from an augmented reality application (which may be running on the augmented reality device 120 or remotely, for instance on a (cloud) server).
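One possible way to realize a target brightness contrast is to derive the second light setting from the first light setting and a target ratio. The ratio-based rule, the 0-254 range and the clamping below are illustrative assumptions, not a prescribed implementation:

```python
# Illustrative sketch: derive the second brightness setting from the first
# setting and a target brightness-contrast ratio (first : second), clamped
# to an assumed 0-254 brightness range.

def settings_for_target_contrast(first_brightness, target_ratio, max_level=254):
    """Return (first_brightness, second_brightness) such that the
    brightness ratio approximates target_ratio."""
    second = first_brightness / target_ratio
    return first_brightness, max(0, min(max_level, round(second)))
```

For example, a target ratio of 4:1 with the first devices at brightness 200 would dim the second devices to 50.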
The one or more processors 106 may be further configured to receive an input indicative of the virtual area 154. The input may be a user input received via a user interface, and the user input may be indicative of the size and the location of the virtual area. The user interface may be a user interface of the augmented reality device 120.
Fig. 4 shows schematically a method of controlling a plurality of lighting devices and an augmented reality device. The plurality of lighting devices and the augmented reality device are located in a physical environment. The method 400 comprises: rendering 402, on a display of the augmented reality device, a virtual environment as an overlay on the physical environment, determining 404 a size and a location of a virtual area in the virtual environment with respect to the physical environment, identifying 406 a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtaining 408 location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, selecting 410, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, selecting 412, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, controlling 414 the one or more first lighting devices according to one or more first light settings, and controlling 416 the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.
The method 400 may be executed by computer program code of a computer program product when the computer program product is run on a processing unit of a computing device, such as the one or more processors 106.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer or processing unit. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Aspects of the invention may be implemented in a computer program product, which may be a collection of computer program instructions stored on a computer readable storage device which may be executed by a computer. The instructions of the present invention may be in any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs) or Java classes. The instructions can be provided as complete executable programs, partial executable programs, as modifications to existing programs (e.g. updates) or extensions for existing programs (e.g. plugins). Moreover, parts of the processing of the present invention may be distributed over multiple computers or processors or even the ‘cloud’. Storage media suitable for storing computer program instructions include all forms of nonvolatile memory, including but not limited to EPROM, EEPROM and flash memory devices, magnetic disks such as the internal and external hard disk drives, removable disks and CD-ROM disks. The computer program product may be distributed on such a storage medium, or may be offered for download through HTTP, FTP, email or through a server connected to a network such as the Internet.

CLAIMS:
1. A method (400) of controlling a plurality of lighting devices and an augmented reality device, wherein the plurality of lighting devices and the augmented reality device are located in a physical environment, the method (400) comprising: rendering (402), on a display of the augmented reality device, a virtual environment as an overlay on the physical environment, determining (404) a size and a location of a virtual area in the virtual environment with respect to the physical environment, identifying (406) a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtaining (408) location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, selecting (410), from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, selecting (412), from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, controlling (414) the one or more first lighting devices according to one or more first light settings, and controlling (416) the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.
2. The method (400) of claim 1, wherein the one or more first light settings have a higher brightness than the one or more second light settings.
3. The method (400) of claim 2, wherein the one or more second light settings are off-settings.
4. The method (400) of any preceding claim, wherein the method (400) further comprises: rendering a virtual object in the virtual environment, determining a virtual location of the virtual object in the virtual environment, determining a physical location in the physical environment that corresponds to the virtual location, and wherein the size and the location of the virtual area with respect to the physical environment are determined based on the physical location.
5. The method (400) of claim 4, wherein the method (400) further comprises: determining a distance between the physical location and the augmented reality device and/or an orientation of the augmented reality device with respect to the physical location, and wherein the size and the location of the virtual area with respect to the physical environment are determined based on the distance and/or the orientation.
6. The method (400) of claim 4 or 5, wherein the virtual object is an avatar of a user or a virtual character.
7. The method (400) of claim 1, 2 or 3, wherein the augmented reality device is located inside the subarea of the physical environment.
8. The method (400) of claim 7, wherein the virtual area comprises one or more edges, and wherein the size and the location of the virtual area are determined based on one or more distances between the augmented reality device and the one or more respective edges of the virtual area.
9. The method (400) of claim 7 or 8, wherein the size and the location of the virtual area are the same as the size and the location of the virtual environment.
10. The method (400) of any preceding claim, wherein the method (400) further comprises: obtaining type information indicative of the types of the plurality of lighting devices located inside and/or outside the subarea, and wherein the selection of the one or more first lighting devices located inside the subarea is further based on the types of lighting devices located inside the subarea, and/or wherein the selection of the one or more second lighting devices located outside the subarea is further based on the types of lighting devices located outside the subarea.
11. The method (400) of any preceding claim, wherein the method (400) comprises: determining the one or more first light settings and the one or more second light settings such that a perceived contrast between the subarea and an area outside the subarea is increased or decreased.
12. The method (400) of any preceding claim, further comprising: receiving an input indicative of the virtual area.
13. The method (400) of claim 12, wherein the input is a user input received via a user interface, and wherein the user input is indicative of the size and the location of the virtual area.
14. A computer program product for a computing device, the computer program product comprising computer program code to perform the method (400) of any preceding claim when the computer program product is run on a processing unit of the computing device.
15. A control system (102) for controlling a plurality of lighting devices (130, 132, 134) and an augmented reality device (120), wherein the plurality of lighting devices and the augmented reality device are located in a physical environment (140), the augmented reality device comprising a display (122) configured to render a virtual environment (150) as an overlay on the physical environment, the control system comprising one or more processors (106) configured to: determine a size and a location of a virtual area in the virtual environment with respect to the physical environment, identify a subarea of the physical environment which corresponds to the virtual area based on the location and the size, obtain location information indicative of locations of the plurality of lighting devices relative to the augmented reality device and/or relative to the physical environment, select, from the plurality of lighting devices, one or more first lighting devices which are located inside the subarea of the physical environment based on the location information, select, from the plurality of lighting devices, one or more second lighting devices which are located outside the subarea of the physical environment based on the location information, control the one or more first lighting devices according to one or more first light settings, and control the one or more second lighting devices according to one or more second light settings different from the one or more first light settings.
PCT/EP2024/065597 2023-06-13 2024-06-06 A method of controlling a plurality of lighting devices and an augmented reality device Pending WO2024256262A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202363472650P 2023-06-13 2023-06-13
US63/472,650 2023-06-13
EP23182610.8 2023-06-30
EP23182610 2023-06-30

Publications (1)

Publication Number Publication Date
WO2024256262A1 true WO2024256262A1 (en) 2024-12-19

Family

ID=91334541

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/065597 Pending WO2024256262A1 (en) 2023-06-13 2024-06-06 A method of controlling a plurality of lighting devices and an augmented reality device

Country Status (1)

Country Link
WO (1) WO2024256262A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190053355A1 (en) * 2017-08-10 2019-02-14 Panasonic Intellectual Property Management Co., Ltd. Lighting system, operating device, and mapping method for use in lighting system
EP3484249A1 (en) 2009-01-06 2019-05-15 Signify Holding B.V. Control system for controlling one or more controllable devices sources and method for enabling such control
EP3583827B1 (en) * 2017-02-16 2020-07-15 Signify Holding B.V. A controller for indicating a presence of a virtual object via a lighting device and a method thereof



Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 24730052; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)