
WO2024231019A1 - Picture recording arrangement and illumination adapting method - Google Patents

Picture recording arrangement and illumination adapting method

Info

Publication number
WO2024231019A1
Authority
WO
WIPO (PCT)
Prior art keywords
directions
sensor
picture recording
image
recording arrangement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/059820
Other languages
French (fr)
Inventor
Raoul Mallart
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ams Osram AG
Original Assignee
Ams Osram AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ams Osram AG filed Critical Ams Osram AG
Priority to DE112024000399.0T (DE112024000399T5)
Publication of WO2024231019A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/03Combinations of cameras with lighting apparatus; Flash units
    • G03B15/05Combinations of cameras with electronic flash apparatus; Electronic flash units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means

Definitions

  • A picture recording arrangement is provided.
  • A method for adapting illumination using such a picture recording arrangement is also provided.
  • A problem to be solved is to provide a picture recording arrangement that enables easy and reliable use.
  • The picture recording arrangement comprises at least one image-taking device having an image-taking field of view.
  • The at least one image-taking device is a CMOS image sensor or a CCD sensor.
  • There can be a plurality of the image-taking devices, for example, to enable stereo recording, or there are image-taking devices having different optics.
  • Taking pictures and/or videos may be enabled by means of the at least one image-taking device.
  • The picture recording arrangement comprises a light source.
  • The light source includes a plurality of light-emitting units, for example, N light-emitting units, wherein N is a natural number. For example, 6 ≤ N ≤ 30.
  • The light source can be a single component, that is, one individual device, or the light source is alternatively composed of several components. Hence, all the light-emitting units may be integrated in the single component, or each one or groups of the light-emitting units are configured as individual components.
  • The light-emitting units may be multi-color units, that is, may be configured to independently emit red, green and blue light and, thus, may be RGB units.
  • Otherwise, the light-emitting units may each have only one color channel, for example, to emit specific white light. It is possible that different kinds of light-emitting units, for example, with different spatial and/or spectral emission characteristics, are combined within the light source.
  • For example, the light-emitting units are composed of light-emitting diodes, LEDs for short.
  • The light-emitting units are configured to emit light along a plurality of emission directions. It is possible that there is a one-to-one assignment between the light-emitting units and the emission directions. Otherwise, one or some or all of the emission directions may be provided with a plurality of the light-emitting units. That is, the number N of the light-emitting units can be larger than a number M of the emission directions, wherein M is a natural number, too.
  • The light source may thus be configured to emit visible light, like white light or red, green and/or blue light in any combination. It is optionally possible that the light source may emit infrared radiation, for example, near-infrared radiation referring to the spectral range from 750 nm to 1.2 µm. That is, along each emission direction visible light and/or infrared radiation can be emitted. For example, there are M emission directions, wherein M is a natural number. For example, 2 ≤ M ≤ 40 or 6 ≤ M ≤ 30 or 10 ≤ M ≤ 20.
  • The emission directions point outside the image-taking field of view.
  • For example, the optical axes of the light-emitting units point outside the image-taking field of view.
  • Alternatively or additionally, at most 20% or at most 10% or at most 5% or at most 1% of a luminous flux emitted by the light-emitting units, measured in lumen, may be emitted into the image-taking field of view. This may apply for each one of the light-emitting units or collectively for all the light-emitting units.
  • The picture recording arrangement comprises a depth sensor.
  • The depth sensor comprises a plurality of depth sensor units, for example, K depth sensor units, wherein K is a natural number.
  • The depth sensor units are configured to determine at least one distance along a plurality of sensor directions between the picture recording arrangement and at least one reflective surface.
  • For example, the depth sensor units and the sensor directions are assigned in a one-to-one manner, so that there are K sensor directions as well as K depth sensor units. Otherwise, there can be more depth sensor units than sensor directions.
  • For example, along each one of the sensor directions a distance between the picture recording arrangement and a surface is determined, up to a maximum measurable distance of, for example, at least 5 m or at least 10 m.
  • The sensor directions point outside the image-taking field of view.
  • Thus, the at least one reflective surface is located outside the image-taking field of view.
  • Hence, by illuminating the determined at least one reflective surface with the corresponding at least one light-emitting unit, a target within the image-taking field of view can be illuminated in an indirect manner, that is, with bouncing light.
  • In at least one embodiment, the picture recording arrangement comprises: an image-taking device having an image-taking field of view,
  • a light source comprising a plurality of light-emitting units configured to emit light along a plurality of emission directions, the emission directions pointing outside the image-taking field of view, and
  • a depth sensor comprising a plurality of depth sensor units configured to determine at least one distance along a plurality of sensor directions between the picture recording arrangement and at least one reflective surface, the sensor directions pointing outside the image-taking field of view.
  • For example, the light source is a multi-LED flash used in photo applications in low-light conditions in various environments, wherein the influence of the environment on the LED illumination is detected.
  • The picture recording arrangement described herein intends to solve the problem of providing better artificial lighting when taking pictures in a low-light environment with a mobile phone, for example.
  • Cameras in mobile phones are typically very small, so that they cannot receive much light and therefore perform poorly in low-light environments, producing images with a lot of noise.
  • To get a good exposure, one possibility is to add artificial lighting to the scene by turning on artificial light sources during image capture.
  • The nature of this additional light can have a huge impact on the quality of the final picture, and the picture recording arrangement described herein proposes a solution to improve the way flash LEDs can bring light into a low-light scene.
  • The picture recording arrangement described herein focuses on improving the quality of artificial flash, particularly for indoor environments.
  • Some approaches to capturing information about the environment of the target to be pictured are outlined below.
  • In mobile phone based augmented reality, AR, applications, artificial content may be superimposed on the video that is captured in real time via the phone camera.
  • In order to anchor this artificial content to the real scene, the AR application has to perform an analysis of the scene to determine the pose of the camera and map the screen two-dimensional, 2D, coordinates to the real-world 3D coordinates.
  • Several solutions may be used for that.
  • For example, Simultaneous Localization And Mapping, SLAM, may be used.
  • SLAM makes it possible to build a 3D representation of the environment as the phone camera is moved.
  • Stereo cameras, light imaging, detection and ranging sensors, LIDARs for short, and 3D time-of-flight sensors, 3D ToF for short, are three examples of 3D sensors.
  • Mobile phones can be fitted with such a 3D sensor pointing in the same direction as the camera and having a field of view close to that of the camera.
  • Capturing the 3D information with such a sensor yields a 3D map of the scene in the field of view of the camera.
  • The field of view of the 3D sensor can in principle be broadened to get a wider representation of the environment in 3D.
  • However, capturing the 3D scene in the field of view of the camera does not give any information on the surfaces where the light from the flash is to bounce.
  • A user would need to scan the phone in order to capture information that is outside of the field of view of the camera. This is cumbersome and not easy to control.
  • In the picture recording arrangement described herein, a special multi-zone depth sensor is used.
  • This sensor has several individual depth sensor units, each of them pointing in a specific direction.
  • The depth sensor can be based on depth sensing technologies like ToF, LIDAR or a self-mixing interferometer, SMI for short.
  • The individual depth sensor units may point in the same directions as the light-emitting units of the adaptive flash device; in this case, the number of depth sensor units is equal to the number of light-emitting units of the adaptive flash, or the number of emission directions is equal to the number of sensor directions. As an alternative, the number of individual depth sensor units is lower than the number of light-emitting units in the adaptive flash, so that the number of emission directions may be higher than the number of sensor directions.
  • The depth sensor units make it possible to estimate the distance to the surface on which the light from each light-emitting unit of the flash will bounce. From this information, it is possible to select which light-emitting units will be used to illuminate the scene. For example, the light-emitting units that point to the closest bouncing surface are selected to shine light on the scene. It is alternatively or additionally possible to determine the intensity of the illumination to use; for example, the larger the distance, the higher the intensity to be used. When there are fewer depth sensor units and/or sensor directions than light-emitting units and/or emission directions, an interpolation may be performed.
  • For example, the depth sensor is organized as three sensor triplets, thus comprising a total of nine depth sensor units.
  • Each triplet points to one of the "top", "left" and "right" directions, relative to the image-taking device.
  • The three sensors in each triplet can point in slightly different directions so as to determine three depth points in the "top", "left" and "right" directions.
  • Each of the triplets thereby makes it possible to estimate the distance and orientation of a bouncing surface; this assumes, for example, that the surfaces are planar.
  • The image-taking device is configured to take a picture of a target located within the image-taking field of view.
  • For example, the target comprises at least one object and/or at least one person.
  • The light source is configured to provide indirect illumination of the target by means of illuminating the at least one reflective surface located outside the image-taking field of view. It is possible that exactly one or also a plurality of reflective surfaces is simultaneously used to illuminate the target with bouncing light.
  • The depth sensor is configured to interpolate between measurement values along the sensor directions to provide a distance value for each one of the emission directions, even if at least one specific emission direction does not run along one of the sensor directions.
  • For example, the depth sensor comprises or consists of three triplets of the depth sensor units.
  • Other than triplets, alternatively pairs or quadruplets or quintuplets or any combinations thereof may be used.
  • The depth sensor is configured to determine relative distances along the sensor directions between the at least one reflective surface and the picture recording arrangement.
  • Thus, it is not necessary to determine absolute values of the distances, for example, measured in meters; it may be sufficient to derive along which one of the sensor directions there is the closest reflective surface, along which one there is the second-closest reflective surface, and so on, and along which one of the sensor directions there is no reflective surface in range.
  • Otherwise, the distances may be determined as absolute values, that is, measured in meters.
  • However, relatively large measurement tolerances may apply, so that the measured distances may be determined, for example, with a relative accuracy of 10% or better, of 20% or better, or of 30% or better.
  • The depth sensor is configured to determine which one of the sensor directions corresponds to a reflective surface that is closer to the picture recording arrangement than the target.
  • The reflective surfaces determined to be closer to the picture recording arrangement than the target may be used for indirectly illuminating the target.
  • The depth sensor is configured to determine the at least one reflective surface which is located closest to the picture recording arrangement, seen along the sensor directions. For example, the depth sensor determines only the closest reflective surface, and only this reflective surface is used for indirect illumination of the target.
  • One or some or all of the depth sensor units are selected from the following group: time-of-flight sensor, LIDAR sensor, stereo sensor or stereo camera, self-mixing interferometer.
  • In particular, the depth sensor units are SMIs.
  • One or some or all of the depth sensor units are configured to determine intensities of a radiation emitted along the sensor directions and reflected back to the depth sensor at the at least one reflective surface.
  • Thus, a relative distance between the picture recording arrangement and the reflective surfaces may be determined by means of the fraction of radiation reflected back from the respective reflective surface to the picture recording arrangement.
  • In this case, the depth sensor units may use the light-emitting units as sources for the radiation to be reflected and detected afterwards.
  • The depth sensor units are configured to emit near-infrared radiation.
  • Alternatively or additionally, the depth sensor units are configured to emit visible light and/or near-ultraviolet radiation.
  • Near-ultraviolet radiation may refer to a wavelength range between 350 nm and 410 nm.
  • The light-emitting units are configured to emit visible light, like white light. It is possible that a hue, a color saturation and/or a correlated color temperature, CCT, of the light-emitting units can be adjusted.
  • An emission angle between an optical axis of the image-taking device and at least some of the emission directions and/or of the sensor directions is at least 20° or at least 30°.
  • Alternatively or additionally, said angle is at most 75°, at most 60° or at most 50°.
  • An emission angle width per emission direction and/or per sensor direction is at least 2°, at least 5°, at least 10° or at least 15°.
  • Alternatively or additionally, said angle width is at most 55°, at most 45° or at most 30°.
  • It is possible that the emission angles of the emission directions and of the sensor directions are the same.
  • Alternatively, the emission directions on the one hand and the sensor directions on the other hand have different emission angles.
  • For example, the emission angles of the sensor directions are smaller than the emission angles of the emission directions, especially if the depth sensor units are based on laser technology.
  • The depth sensor units and the light-emitting units are arranged in a circular manner around the image-taking device, seen in top view of the image-taking device.
  • For example, the depth sensor units and the light-emitting units can be arranged on the same circle around the image-taking device.
  • Otherwise, the depth sensor units may be arranged along an inner circle and the light-emitting units along an outer circle around the image-taking device.
  • Alternatively, the light-emitting units and/or the depth sensor units are arranged in at least one matrix.
  • For example, the depth sensor units are arranged in a matrix, and the light-emitting units are arranged in a circular arrangement, possibly around the image-taking device only, or around the image-taking device as well as around the depth sensor units.
  • The picture recording arrangement is a mobile device.
  • For example, the picture recording arrangement is a smartphone or a tablet computer.
  • A method for operating the picture recording arrangement is additionally provided.
  • By means of the method, a picture recording arrangement as indicated in connection with at least one of the above-stated embodiments is operated.
  • Features of the picture recording arrangement are therefore also disclosed for the method, and vice versa.
  • The method is for adapting illumination and comprises the following steps, for example, in the stated order:
  • A) Providing a picture recording arrangement comprising an image-taking device having an image-taking field of view, a light source comprising a plurality of light-emitting units having emission directions pointing outside the image-taking field of view, and a depth sensor comprising a plurality of depth sensor units.
  • Such a picture recording arrangement contains a set of, for example, M independently controlled light-emitting units, all close to the camera but each pointing in a different direction.
  • A process or an algorithm may be used to optimize the intensity applied to each light-emitting unit during the flash to provide the desired indirect illumination.
  • Figure 1 is a schematic side view of an exemplary embodiment of a method using a picture recording arrangement described herein,
  • Figure 2 is a schematic front view of the method of Figure 1,
  • Figure 3 is a schematic perspective view of an exemplary embodiment of a method using a picture recording arrangement described herein,
  • Figure 4 is a schematic top view of an exemplary embodiment of a picture recording arrangement described herein,
  • Figure 5 is a schematic side view of the picture recording arrangement of Figure 4,
  • Figure 6 is a schematic top view of an exemplary embodiment of a picture recording arrangement described herein,
  • Figure 7 is a schematic side view of the picture recording arrangement of Figure 6,
  • Figures 8 to 11 are schematic top views of exemplary embodiments of picture recording arrangements described herein, and
  • Figure 12 is a schematic representation of the emission characteristics of a light-emitting unit and of a depth sensor unit for exemplary embodiments of picture recording arrangements described herein.
  • Figures 1 and 2 illustrate an exemplary embodiment of a method using a picture recording arrangement 1.
  • The picture recording arrangement 1 is a mobile device 10 and comprises an image-taking device 2, like a CMOS image sensor or a CCD sensor, configured to take photos and/or videos. Further, the picture recording arrangement 1 comprises a light source 3 and a depth sensor 5. A user of the picture recording arrangement 1 is not shown in Figures 1 and 2.
  • The picture recording arrangement 1 is used indoors to take, for example, an image of a target 4 in a scene 11.
  • For example, the target 4 is a person to be photographed.
  • A distance L between the target 4 and the picture recording arrangement 1 is at least 0.5 m and/or at most 5 m. It is possible that a size H of the target 4 is at least 0.2 m and/or at most 2 m.
  • The target 4 can be located in front of a wall 12 or any other item that provides a bouncing surface on the sides of the target 4, so that indirect lighting can be provided.
  • The target 4 can be directly at the wall 12 or can have some distance to the wall 12.
  • The light source 3 is configured to emit an image-taking radiation RE, like visible light, along a plurality of emission directions E1..EM.
  • For example, M is between 10 and 20 inclusive.
  • By means of the light source 3, for each one of the emission directions E1..EM one illuminated area 13 is present next to the target 4, outside a field of view of the image-taking device 2.
  • Thus, the light source 3 provides indirect lighting.
  • The emission of radiation RE along the emission directions E1..EM can be adjusted by means of a processing unit of the picture recording arrangement 1, for example.
  • The depth sensor 5 comprises K depth sensor units 51..5K to determine a distance between the respective illuminated areas 13 and the mobile device 10 along sensor directions D1..DK.
  • In a room where the picture recording arrangement 1 and the target 4 are located, there is also a luminaire 8 that provides weak lighting.
  • A mood provided by the luminaire 8 may be reproduced by the picture recording arrangement 1.
  • The light source 3 addresses, for example, in particular the illumination areas 13 having about the same orientation relative to the target 4 as the luminaire 8. In Figure 2, this would be, for example, the illumination areas 13 in the upper left area next to the luminaire 8.
  • The mood can be kept while good illumination conditions are present when taking the picture, by having the light source 3 as an adapted photo flash.
  • The selection of suitable illumination areas 13 and, thus, the adaptation of the light source 3 to the desired mood can be highly improved by using the depth sensor 5.
  • Adaptation can also be to a pre-selected mood derived, for example, from a template or another picture or video; by means of the depth sensor 5, adaptation can be simplified because the most suitable illumination areas 13 to achieve said mood can effectively be selected.
  • In Figure 3, the target 4 is a person standing near a corner of a room.
  • By using the depth sensor 5, it is determined that the nearest reflective surface 14 and, thus, the most suitable illumination area 13 is located on a left side of the mobile device 10, that is, on a right side of the person 4.
  • Hence, the light source 3 can illuminate said illumination area 13 in a targeted manner to provide indirect illumination on the person 4.
  • The depth sensor 5 may be integrated in a combined camera and flash 2, 3.
  • The depth sensor units 51..5K of the depth sensor 5 are configured to determine a distance to a next surface along the K sensor directions D1..DK.
  • The sensor directions D1..DK are distributed in a rotationally symmetric and/or equidistant manner, so that between adjacent sensor directions D1..DK there is the same angle. All the sensor directions D1..DK point away from the combined camera and flash 2, 3 and out of a field of view of the image-taking device 2, that is, of the camera.
  • The number K of the sensor directions D1..DK may be the same as the number M of emission directions E1..EM, or K is smaller than M. The latter is preferred.
  • The picture recording arrangement 1 again may be a smartphone 10.
  • The sensor directions D1..DK are arranged in three triplets.
  • The sensor directions D1..D3 of one of the triplets point to the left, another three sensor directions D4..D6 point to the right, and the remaining three sensor directions D7..D9 point to the top, seen relative to the image-taking device 2, which is, for example, the combined camera and flash 2, 3.
  • Within each triplet, the sensor directions D1..D9 run along slightly different directions.
  • The sensor directions D1..D3, D4..D6 and D7..D9 may each be located in a common plane.
  • For example, an angle between the sensor directions D1..D3, D4..D6 and D7..D9 within one of the triplets is at least 10° or at least 15° and/or at most 30° or at most 20°.
  • In Figures 8 to 11, some exemplary embodiments of the picture recording arrangement 1 are shown.
  • For example, the picture recording arrangement 1 is a mobile device 10, like a smartphone or a tablet computer.
  • The aspects described below may also apply analogously to other kinds of picture recording arrangements 1.
  • The light source 3 comprises the plurality of the light-emitting units 31..3N.
  • The light-emitting units 31..3N can be light-emitting diodes, LEDs for short. It is possible that the light-emitting units 31..3N are arranged in a circular manner, that is, on a circle or on an ellipse or on a regular polygon, seen in top view of the image-taking device 2.
  • The respective emission directions E1..EM associated with the light-emitting units 31..3N can point inwards, that is, can cross a center of the circle or ellipse or polygon.
  • The same may apply to the sensor directions D1..DK.
  • The picture recording arrangement 1 includes the at least one image-taking device 2.
  • The picture recording arrangement 1 can include at least one additional light-emitting unit 6.
  • The at least one additional light-emitting unit 6 can also be configured as a photo flash.
  • The at least one additional light-emitting unit 6 may be configured for direct illumination of the target 4, so that an optical axis of the at least one additional light-emitting unit 6 can point into the field of view 22 of the image-taking device 2.
  • All the components 2, 3, 5, 6 of the picture recording arrangement 1 can be integrated into a housing 7.
  • Both the at least one image-taking device 2 and the optional at least one additional light-emitting unit 6 are arranged within said circle or ellipse or polygon, for example, in a center area of said circle or ellipse or polygon.
  • The light-emitting units 31..3N are arranged on an outer circle or ellipse or regular polygon, whereas the depth sensor units 51..5K are arranged on an inner circle or ellipse or regular polygon around the at least one image-taking device 2 and optionally around the at least one additional light-emitting unit 6, if the latter is present.
  • It is possible that K ≤ N.
  • The inner and the outer circles or ellipses or polygons may be arranged directly next to one another or, other than shown in Figure 10, there may be a distance between the inner and the outer circles or ellipses or polygons.
  • The light-emitting units 31..3N are again arranged on a circle or ellipse or regular polygon.
  • The depth sensor units 51..5K may be arranged in at least one matrix or row, for example, within the circle or ellipse or regular polygon.
  • The image-taking device 2 and the additional light-emitting unit 6, if present, are integrated in a single component. The same may be true in all other embodiments, too.
  • An angle 23 between an optical axis 20 of the image-taking device 2 and the emission directions E1..EM and/or the sensor directions D1..DK is about 60°.
  • An emission angle width 33 of the emission directions E1..EM and/or of the sensor directions D1..DK may be about 30° in each case; however, the emission angle width 33 of the emission directions E1..EM may differ from that of the sensor directions D1..DK.
  • Hence, no or virtually no radiation RE and/or RD is emitted into the field of view 22 of the image-taking device 2.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

In at least one embodiment, the picture recording arrangement (1) comprises: an image-taking device (2) having an image-taking field of view (22); a light source (3) comprising a plurality of light-emitting units (31..3N) configured to emit light along a plurality of emission directions (E1..EM), the emission directions (E1..EM) pointing outside the image-taking field of view (22); and a depth sensor (5) comprising a plurality of depth sensor units (51..5K) configured to determine at least one distance along a plurality of sensor directions (D1..DK) between the picture recording arrangement (1) and at least one reflective surface (14), the sensor directions (D1..DK) pointing outside the image-taking field of view (22).

Description

PICTURE RECORDING ARRANGEMENT AND ILLUMINATION ADAPTING METHOD
A picture recording arrangement is provided. A method for adapting illumination using such a picture recording arrangement is also provided.
Document WO 2022/212607 A1 refers to a self-mixing interferometer.
A problem to be solved is to provide a picture recording arrangement that enables easy and reliable use.
This object is achieved, inter alia, by a picture recording arrangement and by a method as defined in the independent patent claims. Exemplary further developments constitute the subject-matter of the dependent claims.
According to at least one embodiment, the picture recording arrangement comprises at least one image-taking device having an image-taking field of view. For example, the at least one image-taking device is a CMOS image sensor or a CCD sensor. There can be a plurality of the image-taking devices, for example, to enable stereo recording, or there are image-taking devices having different optics. By means of the at least one image-taking device, taking pictures and/or videos may be enabled.
According to at least one embodiment, the picture recording arrangement comprises a light source. The light source includes a plurality of light-emitting units, for example, N light-emitting units, wherein N is a natural number. For example, 6 ≤ N ≤ 30. The light source can be a single component, that is, one individual device, or the light source is alternatively composed of several components. Hence, all the light-emitting units may be integrated in the single component, or each one or groups of the light-emitting units are configured as individual components. The light-emitting units may be multi-color units, that is, may be configured to independently emit red, green and blue light and, thus, may be RGB units. Otherwise, the light-emitting units may each have only one color channel, for example, to emit specific white light. It is possible that different kinds of light-emitting units, for example, with different spatial and/or spectral emission characteristics, are combined within the light source. For example, the light-emitting units are composed of light-emitting diodes, LEDs for short.
According to at least one embodiment, the light-emitting units are configured to emit light along a plurality of emission directions. It is possible that there is a one-to-one assignment between the light-emitting units and the emission directions. Otherwise, one or some or all of the emission directions may be provided with a plurality of the light-emitting units. That is, the number N of the light-emitting units can be larger than a number M of the emission directions, wherein M is a natural number, too.
The light source may thus be configured to emit visible light, like white light or red, green and/or blue light in any combination. It is optionally possible that the light source may emit infrared radiation, for example, near-infrared radiation referring to the spectral range from 750 nm to 1.2 µm. That is, along each emission direction visible light and/or infrared radiation can be emitted. For example, there are M emission directions, wherein M is a natural number. For example, 2 ≤ M ≤ 40 or 6 ≤ M ≤ 30 or 10 ≤ M ≤ 20.
According to at least one embodiment, the emission directions point outside the image-taking field of view. For example, the optical axes of the light-emitting units point outside the image-taking field of view. Alternatively or additionally, at most 20% or at most 10% or at most 5% or at most 1% of a luminous flux emitted by the light-emitting units, measured in lumen, may be emitted into the image-taking field of view. This may apply for each one of the light-emitting units or collectively for all the light-emitting units.
According to at least one embodiment, the picture recording arrangement comprises a depth sensor. The depth sensor comprises a plurality of depth sensor units, for example, K depth sensor units, wherein K is a natural number. The depth sensor units are configured to determine at least one distance along a plurality of sensor directions between the picture recording arrangement and at least one reflective surface. For example, the depth sensor units and the sensor directions are assigned in a one-to-one manner, so that there are K sensor directions as well as K depth sensor units. Otherwise, there can be more depth sensor units than sensor directions.
For example, along each one of the sensor directions a distance between the picture recording arrangement and a surface is determined, up to a maximum measurable distance of, for example, at least 5 m or at least 10 m. Hence, along each one of the sensor directions it is measured whether there is a reflective surface along the respective sensor direction.
According to at least one embodiment, the sensor directions point outside the image-taking field of view. Thus, the at least one reflective surface is located outside the image-taking field of view. Hence, by illuminating the determined at least one reflective surface with the corresponding at least one light-emitting unit, a target within the image-taking field of view can be illuminated in an indirect manner, that is, with bouncing light.
In at least one embodiment, the picture recording arrangement comprises:
- an image-taking device having an image-taking field of view,
- a light source comprising a plurality of light-emitting units configured to emit light along a plurality of emission directions, the emission directions pointing outside the image-taking field of view, and
- a depth sensor comprising a plurality of depth sensor units configured to determine at least one distance along a plurality of sensor directions between the picture recording arrangement and at least one reflective surface, the sensor directions pointing outside the image-taking field of view.
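For illustration only, the components listed above can be modeled as plain data. The following minimal Python sketch is not part of the disclosure; all names, the vector representation of the directions and the measurement interface are assumptions made here.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    Vec3 = Tuple[float, float, float]  # unit vector in device coordinates (assumed convention)

    @dataclass
    class LightEmittingUnit:
        emission_direction: Vec3  # points outside the image-taking field of view
        intensity: float = 0.0    # current drive level, 0.0 = off

    @dataclass
    class DepthSensorUnit:
        sensor_direction: Vec3    # points outside the image-taking field of view

        def measure_distance(self) -> Optional[float]:
            """Distance in meters to the first reflective surface along
            sensor_direction, or None if nothing is within the maximum
            measurable distance (e.g. 5 m or 10 m)."""
            raise NotImplementedError  # hardware-specific: ToF, LIDAR or SMI

    @dataclass
    class PictureRecordingArrangement:
        camera_half_fov_deg: float            # image-taking field of view (half angle)
        light_units: List[LightEmittingUnit]  # N units along M emission directions
        depth_units: List[DepthSensorUnit]    # K units; K may be smaller than M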
By means of the picture recording arrangement, a method to control a group of light-emitting units to match a target light distribution while illuminating a scene, based on an analysis of the environment, is enabled. For example, the light source is a multi-LED flash used in photo applications in low-light conditions in various environments, wherein the influence of the environment on the LED illumination is detected.
The picture recording arrangement described herein intends to solve the problem of providing better artificial lighting when taking pictures in a low-light environment with a mobile phone, for example. Cameras in mobile phones are typically very small, so that they cannot receive much light and therefore perform poorly in low-light environments, producing images with a lot of noise. To get a good exposure, one possibility is to add artificial lighting to the scene by turning on artificial light sources during image capture. The nature of this additional light can have a huge impact on the quality of the final picture, and the picture recording arrangement described herein proposes a solution to improve the way flash LEDs can bring light into a low-light scene. The picture recording arrangement described herein focuses on improving the quality of artificial flash, particularly for indoor environments.
Some approaches to capturing information about the environment of the target to be pictured are, for example:
- to estimate a three-dimensional, 3D, representation of the scene in the field of view of the camera with a 3D sensor or by estimating the 3D environment from a monocular image;
- to estimate a 3D representation of the scene including areas outside of the field of view of the camera via the camera of the phone and/or a 3D sensor of the phone while the user moves the phone in front of the subject, wherein the 3D sensor points into the field of view; this approach may be used in augmented reality, AR, applications;
- to estimate the spatial distribution of light in the scene, like intensity and color, using a segmented ambient light sensor pointing to the scene.
In mobile phone based AR applications, artificial content may be superimposed on the video that is captured in real time via the phone camera. In order to anchor this artificial content to the real scene, the AR application has to perform an analysis of the scene to determine the pose of the camera and map the screen two-dimensional, 2D, coordinates to the real-world 3D coordinates. Several solutions may be used for that. For example, Simultaneous Localization And Mapping, SLAM, may be used. SLAM makes it possible to build a 3D representation of the environment as the phone camera is moved.
Stereo cameras, light imaging, detection and ranging sensors, LIDARs for short, and 3D time-of-flight sensors, 3D ToF for short, are three examples of 3D sensors. Mobile phones can be fitted with such a 3D sensor pointing in the same direction as the camera and having a field of view close to that of the camera. At a given phone pose, capturing the 3D information with such a sensor yields a 3D map of the scene in the field of view of the camera.
By scanning the phone, the field of view of the 3D sensor can in principle be broadened to get a wider representation of the environment in 3D. Given the setup of the adaptive flash, where the LED array illuminates outside of the field of view of the camera, capturing the 3D scene in the field of view of the camera does not give any information on the surfaces where the light from the flash is to bounce. To get this information, a user would need to scan the phone in order to capture information that is outside of the field of view of the camera. This is cumbersome and not easy to control.
In the picture recording arrangement described herein, a special multi-zone depth sensor is used. This sensor has several individual depth sensor units, each of them pointing in a specific direction. The depth sensor can be based on depth sensing technologies like ToF, LIDAR or a self-mixing interferometer, SMI for short.
The individual depth sensor units may point in the same directions as the light-emitting units of the adaptive flash device; in this case, the number of depth sensor units is equal to the number of light-emitting units of the adaptive flash, or the number of emission directions is equal to the number of sensor directions. As an alternative, the number of individual depth sensor units is lower than the number of light-emitting units in the adaptive flash, so that the number of emission directions may be higher than the number of sensor directions.
In both cases, the depth sensor units make it possible to estimate the distance to the surface on which the light from each light-emitting unit of the flash will bounce. From this information, it is possible to select which light-emitting units will be used to illuminate the scene. Namely, for example, the light-emitting units that point to the closest bouncing surface are selected to shine light on the scene. It is alternatively or additionally also possible to determine the intensity of the illumination to use; for example, the larger the distance, the higher the intensity to be used. When there are fewer depth sensor units and/or sensor directions than light-emitting units and/or emission directions, an interpolation may be performed. For example, the depth sensor is organized as three sensor triplets, thus comprising a total of nine depth sensor units. Each triplet points to one of the "top", "left" and "right" directions, relative to the image-taking device. The three sensors in each triplet can point in slightly different directions so as to determine three depth points in the "top", "left" and "right" directions. Each of the triplets thereby makes it possible to estimate the distance and orientation of a bouncing surface; this assumes, for example, that the surfaces are planar.
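The triplet geometry can be made concrete with a short computation: the three depth points measured along three slightly diverging directions span a plane whose orientation and distance follow from a cross product. The following Python sketch rests on the planarity assumption stated above; the direction vectors in the example are illustrative values, not values from the disclosure.

    import numpy as np

    def estimate_plane_from_triplet(directions, distances):
        """Estimate a bouncing surface from one sensor triplet.

        directions: (3, 3) array of unit vectors, one per depth sensor unit.
        distances: three measured distances along those vectors.
        Returns (unit normal of the plane, its perpendicular distance from
        the device), assuming the three hit points lie on one planar surface.
        """
        points = directions * np.asarray(distances)[:, None]  # 3D hit points
        normal = np.cross(points[1] - points[0], points[2] - points[0])
        normal /= np.linalg.norm(normal)
        return normal, abs(np.dot(normal, points[0]))

    # Example: a "left" triplet whose directions are spread by roughly 15 degrees
    dirs = np.array([[-0.77, 0.00, 0.64], [-0.71, 0.13, 0.69], [-0.71, -0.13, 0.69]])
    dirs /= np.linalg.norm(dirs, axis=1)[:, None]
    print(estimate_plane_from_triplet(dirs, [1.8, 1.9, 1.9]))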
Based on the estimation of the bouncing surfaces, it is possible to select which light-emitting units are to be used to optimize the bouncing of light onto the scene.
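One possible reading of this selection rule, sketched in Python: drive the units whose estimated bouncing surfaces are nearest, and increase the drive level with distance. The quadratic scaling below is an assumption motivated by the round trip of the bounced light; the text above only states that a larger distance calls for a higher intensity.

    def select_and_drive(distance_per_emission, max_units=3, ref_intensity=1.0):
        """distance_per_emission: dict mapping an emission-direction index to
        the estimated distance (in meters) of its bouncing surface, or None
        if no surface is in range. Returns a dict of drive intensities."""
        reachable = {i: d for i, d in distance_per_emission.items() if d is not None}
        if not reachable:
            return {}  # no bouncing surface in range
        # Use the units pointing to the closest bouncing surfaces.
        chosen = sorted(reachable, key=reachable.get)[:max_units]
        d0 = reachable[chosen[0]]  # closest surface sets the reference level
        # Larger distance -> higher intensity (quadratic fall-off assumed).
        return {i: ref_intensity * (reachable[i] / d0) ** 2 for i in chosen}

    print(select_and_drive({0: 1.2, 1: None, 2: 2.5, 3: 1.4}))
    # -> {0: 1.0, 3: 1.36..., 2: 4.34...}: farther surfaces are driven harder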
Some advantages of the solution presented herein may be as follows:
- Fast operation is enabled, as the environment can be analyzed in a fraction of a second.
- Ease of operation is enabled, as the user does not need to scan the environment.
- Optimal estimation of the bouncing surfaces is possible, thereby optimizing the illumination and saving power by only driving the light-emitting units that contribute to the illumination of the scene.
According to at least one embodiment, the image-taking device is configured to take a picture of a target located within the image-taking field of view. For example, the target comprises at least one object and/or at least one person.
According to at least one embodiment, the light source is configured to provide indirect illumination of the target by means of illuminating the at least one reflective surface located outside the image-taking field of view. It is possible that exactly one or also a plurality of reflective surfaces is simultaneously used to illuminate the target with bouncing light.
According to at least one embodiment, the depth sensor is configured to interpolate between measurement values along the sensor directions to provide a distance value for each one of the emission directions, even if at least one specific emission direction does not run along one of the sensor directions.
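One way to realize such an interpolation, given here as an assumed example rather than the claimed implementation, is to weight the measured distances by the angular proximity of each emission direction to the sensor directions:

    import numpy as np

    def interpolate_distances(emission_dirs, sensor_dirs, sensor_dists, eps=1e-6):
        """Inverse-angle weighted interpolation of depth measurements.

        emission_dirs: (M, 3) unit vectors; sensor_dirs: (K, 3) unit vectors;
        sensor_dists: K measured distances. Returns M distances, one per
        emission direction, even where no sensor direction coincides.
        """
        out = []
        for e in emission_dirs:
            angles = np.arccos(np.clip(sensor_dirs @ e, -1.0, 1.0))
            weights = 1.0 / (angles + eps)  # nearby sensor directions dominate
            out.append(float(weights @ sensor_dists) / float(weights.sum()))
        return np.array(out)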
According to at least one embodiment, the depth sensor comprises or consists of three triplets of the depth sensor units. Correspondingly, there can be three triplets of the assigned sensor directions which point to the top, to the left and to the right, respectively, relative to the image-taking device. Other than triplets, alternatively pairs or quadruplets or quintuplets or any combinations thereof may be used.
According to at least one embodiment, the depth sensor is configured to determine relative distances along the sensor directions between the at least one reflective surface and the picture recording arrangement. Thus, it is not necessary to determine absolute values of the distances, for example, measured in meters; it may be sufficient to derive along which one of the sensor directions there is the closest reflective surface, along which one there is the second-closest reflective surface, and so on, and along which one of the sensor directions there is no reflective surface in range. Otherwise, the distances may be determined as absolute values, that is, measured in meters. However, relatively large measurement tolerances may apply, so that the measured distances may be determined, for example, with a relative accuracy of 10% or better, of 20% or better, or of 30% or better.
According to at least one embodiment, the depth sensor is configured to determine which one of the sensor directions corresponds to a reflective surface that is closer to the picture recording arrangement than the target. Thus, for example, one or some or all of the reflective surfaces determined to be closer to the picture recording arrangement than the target may be used for indirectly illuminating the target.
According to at least one embodiment, the depth sensor is configured to determine the at least one reflective surface which is located closest to the picture recording arrangement, seen along the sensor directions. For example, the depth sensor determines only the closest reflective surface, and only this reflective surface is used for indirect illumination of the target.
According to at least one embodiment, one or some or all of the depth sensor units are selected from the following group: time-of-flight sensor, LIDAR sensor, stereo sensor or stereo camera, self-mixing interferometer. In particular, the depth sensor units are SMIs.
According to at least one embodiment, one or some or all of the depth sensor units are configured to determine intensities of a radiation emitted along the sensor directions and reflected back to the depth sensor at the at least one reflective surface. Thus, a relative distance between the picture recording arrangement and the reflective surfaces may be determined by means of the fraction of radiation reflected back from the respective reflective surface to the picture recording arrangement. In this case, the depth sensor units may use the light-emitting units as sources for the radiation to be reflected and detected afterwards.
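As an illustration of this intensity-based estimate, and under the simplifying assumption that the returned fraction falls off with the square of the distance (surface reflectivity differences are ignored here):

    def rank_by_reflected_fraction(reflected_fraction):
        """reflected_fraction: dict {sensor direction index: fraction of the
        emitted radiation that was reflected back}. A larger fraction is read
        as a closer surface; returns the indices from closest to farthest."""
        return sorted(reflected_fraction, key=reflected_fraction.get, reverse=True)

    def relative_distance(fraction):
        """Relative distance up to an unknown constant, assuming that the
        returned fraction scales with 1 / distance**2."""
        return fraction ** -0.5

    print(rank_by_reflected_fraction({0: 0.08, 1: 0.01, 2: 0.20}))  # -> [2, 0, 1]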
According to at least one embodiment, the depth sensor units are configured to emit near-infrared radiation. Alternatively or additionally, the depth sensor units are configured to emit visible light and/or near-ultraviolet radiation. Near-ultraviolet radiation may refer to a wavelength range between 350 nm and 410 nm.
According to at least one embodiment, the light-emitting units are configured to emit visible light, like white light. It is possible that a hue, a color saturation and/or a correlated color temperature, CCT, of the light-emitting units can be adjusted.
According to at least one embodiment, an emission angle between an optical axis of the image-taking device and at least some of the emission directions and/or of the sensor directions is at least 20° or at least 30°. Alternatively or additionally, said angle is at most 75°, at most 60° or at most 50°.
According to at least one embodiment, for one or for some or for all of the emission directions and/or the sensor directions, an emission angle width per emission direction and/or per sensor direction is at least 2°, at least 5°, at least 10° or at least 15°. Alternatively or additionally, said angle width is at most 55°, at most 45° or at most 30°.
It is possible that the emission angles of the emission directions and of the sensor directions are the same. Alternatively, the emission directions on the one hand and the sensor directions on the other hand have different emission angles. For example, the emission angles of the sensor directions are smaller than the emission angles of the emission directions, especially if the depth sensor units are based on laser technology.
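These angular bounds can be checked numerically. The sketch below tests whether an emission cone stays clear of the image-taking field of view; treating both as ideal cones with sharp edges is a simplification made here, not a statement from the disclosure.

    def cone_outside_fov(emission_angle_deg, emission_width_deg, camera_half_fov_deg):
        """True if a cone centered emission_angle_deg away from the optical axis,
        with full angular width emission_width_deg, does not overlap a camera
        field of view with half angle camera_half_fov_deg."""
        return emission_angle_deg - emission_width_deg / 2.0 > camera_half_fov_deg

    # E.g. 60 degrees off-axis and 30 degrees wide against a 40 degree half FOV:
    print(cone_outside_fov(60.0, 30.0, 40.0))  # -> True: no direct light into the image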
According to at least one embodiment, the depth sensor units and the light-emitting units are arranged in a circular manner around the image-taking device, seen in top view of the image-taking device. For example, the depth sensor units and the light-emitting units can be arranged on the same circle around the image-taking device. Otherwise, the depth sensor units may be arranged along an inner circle and the light-emitting units along an outer circle around the image-taking device.
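A circular layout of this kind can be generated programmatically. The sketch below places M units at equal azimuthal spacing, all tilted by a fixed angle away from the optical axis; taking the optical axis as the z-axis is an assumed convention.

    import math

    def circular_emission_directions(m, tilt_deg=60.0):
        """M unit vectors, equally spaced in azimuth around the optical axis (+z)
        and tilted tilt_deg away from it, i.e. pointing outside any camera field
        of view whose half angle is smaller than tilt_deg."""
        tilt = math.radians(tilt_deg)
        return [(math.sin(tilt) * math.cos(2.0 * math.pi * i / m),
                 math.sin(tilt) * math.sin(2.0 * math.pi * i / m),
                 math.cos(tilt)) for i in range(m)]

    for d in circular_emission_directions(12):
        print("%+.2f %+.2f %+.2f" % d)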
Otherwise, it is possible that the light-emitting units and/or the depth sensor units are arranged in at least one matrix. For example, the depth sensor units are arranged in a matrix, and the light-emitting units are arranged in a circular arrangement, possibly around the image-taking device only or around the image-taking device as well as around the depth sensor units.
According to at least one embodiment, the picture recording arrangement is a mobile device. For example, the picture recording arrangement is a smartphone or a tablet computer.
A method for operating the picture recording arrangement is additionally provided. For example, by means of the method, a picture recording arrangement as indicated in connection with at least one of the above-stated embodiments is operated. Features of the picture recording arrangement are therefore also disclosed for the method, and vice versa.
In at least one embodiment, the method is for adapting illumination and comprises the following steps, for example, in the stated order:
A) Providing a picture recording arrangement comprising an image-taking device having an image-taking field of view, a light source comprising a plurality of light-emitting units having emission directions pointing outside the image-taking field of view, and a depth sensor comprising a plurality of depth sensor units,
B) Emitting radiation by the depth sensor units along a plurality of sensor directions and determining at least one distance along the sensor directions between the picture recording arrangement and at least one reflective surface outside the image-taking field of view, the sensor directions pointing outside the image-taking field of view, and
C) Taking at least one image of a target located in the image-taking field of view, wherein light is emitted by at least one of the light-emitting units along at least one of the emission directions so that the target is indirectly illuminated by means of illuminating the at least one reflective surface located outside the image-taking field of view.
In this method, for example, a picture recording arrangement is used that contains a set of, for example, M independently controlled light-emitting units, all close to the camera but each pointing in a different direction. A process or an algorithm may be used to optimize the intensity applied to each light-emitting unit during the flash to provide the desired indirect illumination.
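The "process or algorithm" mentioned here is not specified further; one conceivable realization, sketched purely as an assumption, is a projected gradient fit of non-negative per-unit intensities to a target light distribution:

    import numpy as np

    def optimize_intensities(bounce_matrix, target, iterations=200):
        """Fit non-negative drive intensities so that the summed indirect light
        approximates a target distribution.

        bounce_matrix: (S, M) array; entry (s, m) is the light reaching scene
        patch s when unit m is driven at unit intensity (estimated from the
        measured bounce geometry). target: S desired illumination values.
        Minimizes ||A x - b||^2 subject to x >= 0 by projected gradient descent.
        """
        a, b = np.asarray(bounce_matrix, float), np.asarray(target, float)
        step = 1.0 / (np.linalg.norm(a, 2) ** 2 + 1e-9)  # safe step size
        x = np.zeros(a.shape[1])
        for _ in range(iterations):
            x = np.clip(x - step * (a.T @ (a @ x - b)), 0.0, None)
        return x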
A picture recording arrangement and a method described herein are explained in greater detail below by way of exemplary embodiments with reference to the drawings. Elements which are the same in the individual figures are indicated with the same reference numerals. The relationships between the elements are not shown to scale; rather, individual elements may be shown exaggeratedly large to assist in understanding.
In the figures:
Figure 1 is a schematic side view of an exemplary embodiment of a method using a picture recording arrangement described herein,
Figure 2 is a schematic front view of the method of Figure 1,
Figure 3 is a schematic perspective view of an exemplary embodiment of a method using a picture recording arrangement described herein,
Figure 4 is a schematic top view of an exemplary embodiment of a picture recording arrangement described herein,
Figure 5 is a schematic side view of the picture recording arrangement of Figure 4,
Figure 6 is a schematic top view of an exemplary embodiment of a picture recording arrangement described herein,
Figure 7 is a schematic side view of the picture recording arrangement of Figure 6,
Figures 8 to 11 are schematic top views of exemplary embodiments of picture recording arrangements described herein, and
Figure 12 is a schematic representation of the emission characteristics of a light-emitting unit and of a depth sensor unit for exemplary embodiments of picture recording arrangements described herein.
Figures 1 and 2 illustrate an exemplary embodiment of a method using a picture recording arrangement 1. The picture recording arrangement 1 is a mobile device 10 and comprises an image-taking device 2, like a CMOS image sensor or a CCD sensor, configured to take photos and/or videos. Further, the picture recording arrangement 1 comprises a light source 3 and a depth sensor 5. A user of the picture recording arrangement 1 is not shown in Figures 1 and 2.
In the intended use , the picture recording arrangement 1 is used indoors to take , for example , an image of a target 4 in a scene 11 . For example , the target 4 is a person to be photographed . For example , a distance L between the target 4 and the picture recording arrangement 1 is a least 0.5 m and/or is at most 5 m. It is possible that a size H of the target 4 is at least 0.2 m and/or is at most 2 m. The target 4 can be located in front of a wall 12 or any other item, for example, in front of the target 4 that provides a bouncing surface on the sides of the target 4 so that indirect lighting can be provided. The target 4 can be directly at the wall or can have some distance to the wall 12.
The light source 3 is configured to emit an image-taking radiation RE, like visible light, along a plurality of emission directions El.. EM. Thus, there are M emission directions. For example, M is between ten and 20 inclusive. By means of the light source 3, for example, for each one of the emission directions El.. EM one illuminated area 13 is present next to the target 4 out of a f ield-of-view of the image-taking device 2. Thus, the light source 3 provides indirect lighting. The emission of radiation RE along the emission directions El.. EM can be adjusted by means of a processing unit of the picture recording arrangement 1, for example .
In order to determine suitable illuminated areas 13, there is the depth sensor 5 comprising K depth sensor units 51..5K to determine a distance between the respective illuminated areas 13 and the mobile device 10 along sensor directions D1..DK. For example, for each sensor direction D1..DK there is one of the depth sensor units 51..5K. The depth sensor units 51..5K are, for example, SMIs, that is, self-mixing interferometers. Concerning SMIs, reference is made to document WO 2022/212607 A1, the disclosure content of which is hereby incorporated by reference. According to Figures 1 and 2, K = M, but it is also possible, in particular, that K < M.
By having the depth sensor 5 with the sensor directions D1..DK pointing outside the field-of-view 22 of the image-taking device 2, for example, only such reflective surfaces 14 may be chosen which are located in front of the target 4 so that reflective surfaces behind the target 4, from the perspective of the mobile device 10, may not be illuminated in order to avoid a glaring effect.
In another application scenario of the method, for example, a luminaire 8 that provides weak lighting is also present in the room in which the picture recording arrangement 1 and the target 4 are located. A mood provided by the luminaire 8 may be reproduced by the picture recording arrangement 1. In order to do so while realizing a high picture quality, the light source 3 addresses, for example, in particular the illumination areas 13 having about the same orientation relative to the target 4 as the luminaire 8. In Figure 2, these would be, for example, the illumination areas 13 in the upper left area next to the luminaire 8. In this example, the mood can be kept while good illumination conditions can be present when taking the picture by having the light source 3 as an adapted photo flash. The selection of suitable illumination areas 13 and, thus, the adaptation of the light source 3 to the desired mood can be considerably improved by using the depth sensor 5.
The adaptation can also be to a pre-selected mood derived, for example, from a template or from another picture or video. By means of the depth sensor 5, the adaptation can be simplified because the most suitable illumination areas 13 to achieve said mood can effectively be selected.
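One conceivable way to select such mood-matching illumination areas is sketched below, assuming the luminaire orientation is already known as a direction vector; the cosine scoring and all identifiers are illustrative assumptions, not part of the disclosure.

```python
import math

def pick_mood_matching_directions(luminaire_dir, sensor_dirs, top_n=2):
    """Indices of the sensor directions closest in orientation to the
    luminaire, ranked by the angle between the unit vectors."""
    def unit(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)
    lu = unit(luminaire_dir)
    scored = []
    for i, d in enumerate(sensor_dirs):
        cos_angle = sum(a * b for a, b in zip(lu, unit(d)))
        scored.append((cos_angle, i))
    scored.sort(reverse=True)              # largest cosine = smallest angle
    return [i for _, i in scored[:top_n]]

# Example: a luminaire toward the upper left is best matched by the
# leftward sensor direction (index 0).
dirs = [(-1, 0, 0.5), (1, 0, 0.5), (0, 1, 0.5), (0, -1, 0.5)]
print(pick_mood_matching_directions((-0.8, 0.6, 0.0), dirs, top_n=1))  # [0]
```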
In Figure 3, another example is illustrated. In this example, the target 4 is a person standing near a corner of a room. By using the depth sensor 5, it is determined that the nearest reflective surface 14 and, thus, the most suitable illumination area 13 is located on the left side of the mobile device 10, that is, on the right side of the person 4. Hence, after determining the illumination area 13 by means of the depth sensor 5, the light source 3 can illuminate said illumination area 13 in a targeted manner to provide indirect illumination on the person 4.
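A minimal sketch of this nearest-surface selection, assuming the K distance readings are already available as a list (with None where a direction returned no echo), could look as follows; the function name is a hypothetical choice for illustration.

```python
def nearest_surface_index(distances_m):
    """Index of the sensor direction whose reflective surface is closest,
    or None if no direction returned a valid distance."""
    valid = [(d, i) for i, d in enumerate(distances_m) if d is not None]
    if not valid:
        return None
    return min(valid)[1]   # smallest distance wins; ties go to the lower index

print(nearest_surface_index([2.0, None, 1.1, 3.4]))  # -> 2
```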
Otherwise, the same as to Figures 1 and 2 may also apply to Figure 3, and vice versa.
In Figures 4 and 5, an example of the picture recording arrangement 1 is provided. The picture recording arrangement 1 may be a smart phone 10. The depth sensor 5 may be integrated in a combined camera and flash 2, 3. The depth sensor units 51..5K of the depth sensor 5 are configured to determine a distance towards a next surface along the K sensor directions D1..DK. For example, seen in top view of the combined camera and flash 2, 3, the sensor directions D1..DK are distributed in a rotationally symmetric and/or equidistant manner so that between adjacent sensor directions D1..DK there is the same angle. All the sensor directions D1..DK point away from the combined camera and flash 2, 3 so that the sensor directions D1..DK point out of a field-of-view of the image-taking device 2, that is, of the camera 2. The number K of the sensor directions D1..DK may be the same as the number M of emission directions E1..EM, or K is smaller than M. The latter is preferred.
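Such a rotationally symmetric, equidistant layout can be written down directly. The following sketch computes K unit vectors at equal azimuth spacing around the optical axis, taken here as the +z axis; the roughly 60° tilt later discussed with Figure 12 is used as an assumed default, and all names are illustrative.

```python
import math

def ring_directions(k, tilt_deg=60.0):
    """K unit vectors, equally spaced in azimuth and all tilted by
    tilt_deg away from the optical axis (+z), so that adjacent
    directions enclose the same azimuthal angle."""
    tilt = math.radians(tilt_deg)
    return [(math.sin(tilt) * math.cos(2.0 * math.pi * i / k),
             math.sin(tilt) * math.sin(2.0 * math.pi * i / k),
             math.cos(tilt))
            for i in range(k)]

# Example: K = 12 sensor directions, i.e. 30° of azimuth between neighbors.
for d in ring_directions(12)[:3]:
    print(tuple(round(c, 3) for c in d))
```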
Otherwise, the same as to Figures 1 to 3 may also apply to Figures 4 and 5, and vice versa.
In the example of the picture recording arrangement 1 as shown in Figures 6 and 7, the picture recording arrangement 1 again may be a smart phone 10. There are nine of the sensor directions D1..DK arranged in three triplets. The sensor directions D1..D3 of one of the triplets point to the left, another three sensor directions D4..D6 point to the right, and the remaining three sensor directions D7..D9 point to the top, seen relative to the image-taking device 2, which is, for example, the combined camera and flash 2, 3.
Within each triplet, the sensor directions D1..D9 run along slightly different directions. Optionally, within each triplet the sensor directions D1..D3, D4..D6 and D7..D9 may each be located in a common plane. For example, an angle between the sensor directions D1..D3, D4..D6 and D7..D9 within one of the triplets is at least 10° or at least 15° and/or is at most 30° or at most 20°.
Thus, indirect lighting from a left side, a top side and/or a right side of the target 4 is enabled.
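A sketch of one such triplet layout is given below. Placing the three directions of a triplet in one plane through the optical axis, and spreading them by varying the tilt, keeps the angle between neighboring directions exactly at the chosen spread; the 15° spread and the 60° center tilt are assumptions consistent with the ranges given above and with Figure 12.

```python
import math

def triplet_directions(spread_deg=15.0, center_tilt_deg=60.0):
    """Nine unit vectors in three triplets pointing left, right and up
    relative to the optical axis (+z); within a triplet the directions
    share one plane and differ by spread_deg in tilt."""
    dirs = []
    for base_az_deg in (180.0, 0.0, 90.0):           # left, right, top
        az = math.radians(base_az_deg)
        for tilt_deg in (center_tilt_deg - spread_deg,
                         center_tilt_deg,
                         center_tilt_deg + spread_deg):
            t = math.radians(tilt_deg)
            dirs.append((math.sin(t) * math.cos(az),
                         math.sin(t) * math.sin(az),
                         math.cos(t)))
    return dirs

print(len(triplet_directions()))  # -> 9, matching D1..D9
```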
Otherwise, the same as to Figures 4 and 5 may also apply to Figures 6 and 7, and vice versa.
In Figures 8 to 11, some exemplary embodiments of the picture recording arrangement 1 are shown. In all these examples, the picture recording arrangement 1 is a mobile device 10, like a smartphone or a tablet computer. However, the aspects described below may also apply analogously to other kinds of picture recording arrangements 1.
The light source 3 comprises the plurality of the light-emitting units 31..3M. The light-emitting units 31..3M can be light-emitting diodes, LEDs for short. It is possible that the light-emitting units 31..3M are arranged in a circular manner, that is, on a circle or on an ellipse or on a regular polygon, seen in top view of the image-taking device 2.
However, other arrangements of the light-emitting units 31..3M around the image-taking device 2 are likewise possible. This is because a distance between the light-emitting units 31..3M is very small compared with a distance between the illuminated areas 13, compare Figure 2, so that it is not necessary that an arrangement order of the light-emitting units 31..3M corresponds to an arrangement order of the illuminated areas 13. Hence, it is alternatively also possible for the light-emitting units 31..3M to be arranged in a matrix, for example. The same is possible for the sensor directions D1..DK, too.
If the light-emitting units 31..3M are arranged on a circle or on an ellipse or on a regular polygon, it is possible that the respective emission directions E1..EM associated with the light-emitting units 31..3M point inwards, that is, cross a center of the circle or ellipse or polygon. The same is possible for the sensor directions D1..DK, too.
Moreover, the picture recording arrangement 1 includes the at least one image-taking device 2. Optionally, the picture recording arrangement 1 can include at least one additional light-emitting unit 6. The at least one additional light-emitting unit 6 can also be configured as a photo flash. In particular, the at least one additional light-emitting unit 6 may be configured for direct illumination of the target 4 so that an optical axis of the at least one additional light-emitting unit 6 can point into the field-of-view 22 of the image-taking device 2.
All the components 2, 3, 5, 6 of the picture recording arrangement 1 can be integrated into a housing 7.
According to Figure 8, there are two of the image-taking devices 2, for example, for different fields-of-view 22 or for stereo pictures or videos. Further, there are K sensor directions D1..DK and M emission directions E1..EM wherein K = M. The N light-emitting units 31..3N and the K depth sensor units 51..5K may be arranged in an alternating manner on the same circle or ellipse or regular polygon so that N = K may apply. Both the at least one image-taking device 2 and the optional at least one additional light-emitting unit 6 are arranged within said circle or ellipse or polygon, for example, in a center area of said circle or ellipse or polygon.
Otherwise, the same as to Figures 1 to 7 may also apply to Figure 8, and vice versa.
In Figure 9 it is illustrated that K < N wherein A × K = N may apply, for example, wherein A is a natural number. For example, 2 ≤ A ≤ 10 or 2 ≤ A ≤ 4. According to Figure 9, A = 2. Further, in Figure 9 it is shown that there is a plurality of the additional light-emitting units 6. Otherwise, the same as to Figure 8 may also apply to Figure 9, and vice versa.
According to Figure 10, the light-emitting units 31..3N are arranged on an outer circle or ellipse or regular polygon whereas the depth sensor units 51..5K are arranged on an inner circle or ellipse or regular polygon around the at least one image-taking device 2 and optionally around the at least one additional light-emitting unit 6, if the latter is present. Hence, it is possible that K < N.
The inner and the outer circles or ellipses or polygons may be arranged directly next to one another or, other than shown in Figure 10, there may be a distance between the inner and the outer circles or ellipses or polygons.
Otherwise, the same as to Figures 8 and 9 may also apply to Figure 10, and vice versa.
In Figure 11 it is illustrated that the light-emitting units 31..3N are again arranged on a circle or ellipse or regular polygon. However, the depth sensor units 51..5K may be arranged in at least one matrix or row, for example, within the circle or ellipse or regular polygon.
According to Figure 11, there are three rows of the depth sensor units 51..5K, one on the left, one on the top and one on the right of the image-taking device 2. With such an arrangement of the depth sensor units 51..5K, the triplets of Figures 6 and 7 may be implemented.
As an option, the image-taking device 2 and the additional light-emitting unit 6, if present, are integrated in a single component. The same may be true in all other embodiments, too .
Otherwise, the same as to Figures 8 to 10 may also apply to Figure 11, and vice versa.
In Figure 12, exemplary parameters of the emission directions E1..EM and/or of the sensor directions D1..DK are illustrated. For example, an angle 23 between an optical axis 20 of the image-taking device 2 and the emission directions E1..EM and/or the sensor directions D1..DK is about 60°. An emission angle width 33 of the emission directions E1..EM and/or of the sensor directions D1..DK may be about 30° in each case; however, the emission angle width 33 of the emission directions E1..EM may differ from that of the sensor directions D1..DK. Thus, no or virtually no radiation RE and/or RD is emitted into the field-of-view 22 of the image-taking device 2.
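Whether this holds can be verified with a one-line geometric test, sketched below; the 40° field-of-view half angle is an assumed camera parameter, not a value taken from the figures.

```python
def stray_light_into_fov(axis_angle_deg=60.0, width_deg=30.0, fov_half_deg=40.0):
    """True if the emission cone reaches into the camera field of view.

    The cone spans [axis_angle - width/2, axis_angle + width/2] measured
    from the optical axis; it stays outside the field of view as long as
    its inner edge exceeds the field-of-view half angle."""
    return axis_angle_deg - width_deg / 2.0 < fov_half_deg

print(stray_light_into_fov())  # -> False: 60° - 15° = 45° > 40°
```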
The components shown in the figures follow, unless indicated otherwise, exemplarily in the specified sequence directly one on top of the other. Components which are not in contact in the figures are exemplarily spaced apart from one another. If lines are drawn parallel to one another, the corresponding surfaces may be oriented in parallel with one another. Likewise, unless indicated otherwise, the positions of the drawn components relative to one another are correctly reproduced in the figures.
The invention described here is not restricted by the description on the basis of the exemplary embodiments. Rather, the invention encompasses any new feature and also any combination of features, which includes in particular any combination of features in the patent claims, even if this feature or this combination itself is not explicitly specified in the patent claims or exemplary embodiments. This patent application claims the priority of German patent application 10 2023 112 280.0, the disclosure content of which is hereby incorporated by reference.
List of Reference Signs
1 picture recording arrangement
10 mobile device
11 scene
12 wall
13 illuminated area
14 reflective surface
2 image-taking device
20 optical axis
22 field-of-view
23 emission angle
3 light source
3.. light-emitting unit
33 emission angle width
4 target
5 depth sensor
5.. depth sensor unit
6 additional light-emitting unit
7 housing
8 luminaire
D.. sensor direction
E.. emission direction
H size
L distance
RD distance-measuring radiation
RE image-taking radiation

Claims

Patent Claims
1. A picture recording arrangement (1) comprising:
- an image-taking device (2) having an image-taking field of view (22),

- a light source (3) comprising a plurality of light-emitting units (31..3N) configured to emit light along a plurality of emission directions (E1..EM), the emission directions (E1..EM) point outside the image-taking field of view (22), and

- a depth sensor (5) comprising a plurality of depth sensor units (51..5K) configured to determine at least one distance along a plurality of sensor directions (D1..DK) between the picture recording arrangement (1) and at least one reflective surface (14), the sensor directions (D1..DK) point outside the image-taking field of view (22), wherein the emission directions (E1..EM) and the sensor directions (D1..DK) are assigned to one another in a one-to-one manner.
2. The picture recording arrangement (1) according to the preceding claim, wherein the image-taking device (2) is configured to take a picture of a target (4) located within the image-taking field of view (22), and the light source (3) is configured to provide indirect illumination of the target (4) by means of illuminating the at least one reflective surface (14) located outside the image-taking field of view (22).
3. The picture recording arrangement (1) according to any one of the preceding claims, wherein the depth sensor (5) comprises three triplets of the depth sensor units (51..5K), and three corresponding triplets of the assigned sensor directions (D1..DK) point to the top, to the left and to the right, respectively, relative to the image-taking device (2).
4. The picture recording arrangement (1) according to any one of the preceding claims, wherein the depth sensor (5) is configured to determine relative distances along the sensor directions (D1..DK) between the at least one reflective surface (14) and the picture recording arrangement (1).

5. The picture recording arrangement (1) according to any one of the preceding claims, wherein the depth sensor (5) is configured to determine the at least one reflective surface (14) which is located closest to the picture recording arrangement (1), seen along the sensor directions (D1..DK).
6. The picture recording arrangement (1) according to any one of the preceding claims, wherein at least some of the depth sensor units (51..5K) are selected from the following group: time-of-flight sensor, LIDAR sensor, stereo camera, self-mixing interferometer.
7. The picture recording arrangement (1) according to any one of the preceding claims, wherein at least some of the depth sensor units (51..5K) are configured to determine intensities of a radiation emitted along the sensor directions (D1..DK) and reflected back to the depth sensor (5) at the at least one reflective surface (14).
8. The picture recording arrangement (1) according to any one of the preceding claims, wherein the depth sensor units (51..5K) are configured to emit near-infrared radiation, and the light-emitting units (31..3N) are configured to emit visible light.
9. The picture recording arrangement (1) according to any one of the preceding claims, wherein an emission angle (23) between an optical axis (20) of the image-taking device (2) and at least some of the emission directions (E1..EM) and/or of the sensor directions (D1..DK) is between 30° and 75° inclusive, and wherein for at least some of the emission directions (E1..EM) and/or for at least some of the sensor directions (D1..DK) an emission angle width (33) per emission direction (E1..EM) and/or per sensor direction (D1..DK) is between 5° and 45° inclusive.
10. The picture recording arrangement (1) according to any one of the preceding claims, wherein there are at least six and at most 30 of the emission directions (E1..EM).
11. The picture recording arrangement (1) according to any one of the preceding claims, wherein the depth sensor units (51..5K) and the light-emitting units (31..3N) are arranged in a circular manner around the image-taking device (2), seen in top view of the image-taking device (2).
12. The picture recording arrangement (1) according to any one of the preceding claims, wherein the picture recording arrangement (1) is a mobile device (10).

13. The picture recording arrangement (1) according to the preceding claim, wherein the picture recording arrangement (1) is a smart phone.
14. The picture recording arrangement (1) according to any one of the preceding claims, wherein the light source (3) is configured as a photo flash.
15. A method for adapting illumination comprising:
A) Providing a picture recording arrangement (1) comprising an image-taking device (2) having an image-taking field of view (22), a light source (3) comprising a plurality of light-emitting units (31..3N) having emission directions (E1..EM) pointing outside the image-taking field of view (22), and a depth sensor (5) comprising a plurality of depth sensor units (51..5K),
B) Emitting radiation by the depth sensor units (51..5K) along a plurality of sensor directions (D1..DK) and determining at least one distance along the sensor directions (D1..DK) between the picture recording arrangement (1) and at least one reflective surface (14) outside the image-taking field of view (22), the sensor directions (D1..DK) point outside the image-taking field of view (22),
C) Taking at least one image of a target (4) located in the image-taking field of view (22), wherein light is emitted by at least one of the light-emitting units (31..3N) along at least one of the emission directions (E1..EM) so that the target (4) is indirectly illuminated by means of illuminating the at least one reflective surface (14) located outside the image-taking field of view (22).
PCT/EP2024/059820 2023-05-10 2024-04-11 Picture recording arrangement and illumination adapting method Pending WO2024231019A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112024000399.0T DE112024000399T5 (en) 2023-05-10 2024-04-11 IMAGE CAPTURE ARRANGEMENT AND METHOD FOR ILLUMINATION ADJUSTMENT

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102023112280 2023-05-10
DE102023112280.0 2023-05-10

Publications (1)

Publication Number Publication Date
WO2024231019A1 2024-11-14

Family

ID=90810094

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/059820 Pending WO2024231019A1 (en) 2023-05-10 2024-04-11 Picture recording arrangement and illumination adapting method

Country Status (2)

Country Link
DE (1) DE112024000399T5 (en)
WO (1) WO2024231019A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160328854A1 (en) * 2015-05-10 2016-11-10 Magik Eye Inc. Distance sensor
WO2022212607A1 (en) 2021-03-31 2022-10-06 Ams International Ag Displacement detector, array of displacement detectors and method of manufacturing a displacement detector
US20230095000A1 (en) * 2021-09-24 2023-03-30 Apple Inc. Adaptive-Flash Photography, Videography, and/or Flashlight Using Camera, Scene, or User Input Parameters


Also Published As

Publication number Publication date
DE112024000399T5 (en) 2025-10-23


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 24720040; Country of ref document: EP; Kind code of ref document: A1)
WWE WIPO information: entry into national phase (Ref document number: 112024000399; Country of ref document: DE)
WWP WIPO information: published in national office (Ref document number: 112024000399; Country of ref document: DE)