US20230292012A1 - Intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings - Google Patents
- Publication number
- US20230292012A1 (U.S. application Ser. No. 18/319,640)
- Authority
- US
- United States
- Prior art keywords
- user device
- display
- lighting
- user
- display configuration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Definitions
- the present disclosure relates generally to video teleconferencing, and more particularly to techniques and mechanisms for providing intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings.
- in a virtual meeting (e.g. a virtual web-based or online meeting), the lighting conditions for one or more of the participants may be inadequate.
- at times, the lighting for participants in a virtual meeting may be relatively dark. One may then be unable to clearly see or identify other participants on the screen, especially where the virtual meeting has a large number of participants.
- to compensate, external lighting setups (e.g. video lighting setups and other lighting kits) may have to be purchased for the virtual meetings. These types of lighting setups are not environmentally friendly and come at a cost to the user.
- FIG. 1 is an illustrative representation of a basic network architecture within which a virtual meeting amongst a plurality of user devices may be facilitated according to some implementations of the present disclosure
- FIG. 2 is an illustrative representation of a user device and a cloud server for use in the virtual meeting according to some implementations of the present disclosure
- FIG. 3 is a high-level diagram of a system for a cloud-assisted lighting adjustment for a virtual meeting facilitated by a virtual meeting application according to some implementations of the present disclosure
- FIG. 4 is a flowchart for describing a method for use in a cloud-assisted adjustment of lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application of the user device with use of a selected display configuration;
- FIGS. 5 A- 5 B form a flowchart for describing a method for use in a cloud-assisted adjustment of lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application of the user device with use of a selected display configuration;
- FIG. 6 is a block diagram of a process which utilizes a model for generating or selecting baseline lighting setting parameters for displays and/or display configurations, and a machine learning process for training the model;
- FIG. 7 A is an example of a user display prompt for display at a user device, where the user display prompt indicates a plurality of display configurations, at least one of which may be selected for use at the user device;
- FIGS. 7 B- 7 E are examples of the plurality of display configurations associated with different lighting arrangements which may be used at the user device;
- FIGS. 8 A- 8 B are examples of different display configurations that are associated with different lighting techniques which may be utilized in a display of the user device;
- FIG. 9 illustrates a hardware block diagram of a computing device that may perform functions associated with operations discussed herein.
- a user device is operative to adjust and optimize lighting of a video of a participant in a virtual meeting with use of a selected display configuration at the user device, with the assistance of a cloud server.
- the selected display configuration may be a selected one of a plurality of display configurations (e.g. a user display of the device, a plurality of displays connected at the user device, the user display and an alternate display of a laptop or a tablet, etc.).
- the user device may receive, from the cloud server, baseline lighting setting parameters associated with the selected display configuration.
- the user device may apply the baseline lighting setting parameters to one or more displays of the selected display configuration at the user device. Using the baseline lighting setting parameters as a baseline, the user device may automatically adjust a brightness and/or color pixels of the one or more displays of the selected display configuration, for optimizing the lighting of the video of the participant.
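The two steps above (apply the cloud-supplied baseline, then adjust brightness and/or color to optimize the participant's lighting) can be pictured with a small sketch. The dataclass, the 300-lux reference, and the gain and color-shift constants are illustrative assumptions; the application does not specify concrete parameters or formulas:

```python
from dataclasses import dataclass

@dataclass
class LightingSettings:
    brightness: float   # 0.0-1.0 backlight level
    color_temp_k: int   # color temperature in kelvin

def adjust_from_baseline(baseline: LightingSettings,
                         ambient_lux: float) -> LightingSettings:
    """Start from the cloud-provided baseline, then raise brightness and
    warm the color as the room gets darker (illustrative constants)."""
    REFERENCE_LUX = 300.0   # assumed "well-lit room" reference
    GAIN = 0.5              # assumed brightness gain per unit of deficit
    deficit = max(0.0, (REFERENCE_LUX - ambient_lux) / REFERENCE_LUX)
    brightness = min(1.0, baseline.brightness + GAIN * deficit)
    color_temp_k = baseline.color_temp_k - int(500 * deficit)
    return LightingSettings(brightness, color_temp_k)
```

Under these assumptions, a darker room raises brightness above the baseline and warms the color temperature, while a sufficiently bright room leaves the baseline untouched.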
- cloud-assisted techniques and mechanisms that allow for adjusting and optimizing lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application of the user device.
- FIG. 1 a diagram of a network environment 100 in which the techniques of the present disclosure may be carried out is shown.
- a plurality of user devices 110 ( 1 ), 110 ( 2 ), to 110 (N) associated with a plurality of users (e.g. User 1 , User 2 , through User N) or participants of a virtual meeting are shown.
- User devices 110 ( 1 ), 110 ( 2 ), to 110 (N) may take on a variety of forms, including a smartphone, a tablet, a laptop computer, a desktop computer, etc.
- User devices 110 ( 1 ), 110 ( 2 ), to 110 (N) may communicate with various network-based entities shown in FIG. 1 via one or more networks 190 .
- Networked conferencing systems typically employ a client-server architecture, whereby each participant's client software (e.g. running on the participant's computer or work-station) connects to a web conference server 170 .
- web conference server 170 may include conference control services and access control services that govern access to conference functionality and/or resources.
- the participant's identification may determine permissions granted.
- the host may be given the broadest access rights to control the virtual meeting.
- Web conference server 170 may control all communications with the various clients according to a set of permissions granted to the conference participant logged in on that client.
- a media orchestrator 160 may ensure that all or select participants get connected to a meeting supported by a media provider 180 or, in the case of multiple media providers, to the appropriate one or more media providers.
- the functions of media orchestrator 160 and/or media provider(s) 180 may be performed by separate entities as shown, or may be integrated (either on-premises, in the cloud, or a hybrid of on-premises and cloud).
- a cloud server 130 may include a lighting adjustment service for user devices 110 ( 1 ), 110 ( 2 ), to 110 (N) for virtual meetings. More particularly, this cloud-assisted service may assist in the adjusting and optimizing of lighting of video of participants in the virtual meetings, with use of selected display configurations of user devices 110 ( 1 ), 110 ( 2 ), to 110 (N).
- cloud server 130 along with media orchestrator 160 , web conference server 170 , and media provider 180 , may reside off-premises in a cloud or data center computing environment. In some implementations, cloud server 130 may reside on-premises.
- FIG. 2 is a block diagram 200 of a user device 110 (one of user devices 110 ( 1 ), 110 ( 2 ), to 110 (N) of FIG. 1 ) and cloud server 130 which are configured to connect to one or more networks 190 for network-based communication.
- user device 110 may include one or more processors 220 (e.g., a microprocessor or microcontroller), a network interface unit 222 that enables wired and/or wireless network communication, one or more user interface components 224 (e.g., keyboard, mouse, touchscreen, etc.), and at least one display 226 (e.g., a display screen of a monitor, or touch screen, etc.).
- User device 110 may also include a memory 214 for storing software instructions of a lighting adjustment module 216 , a meeting client application 217 (e.g. a virtual meeting application), and one or more join links 218 .
- FIG. 2 also shows an operating system 219 on which the lighting adjustment module 216 and the meeting client application 217 may run.
- Cloud server 130 of FIG. 2 may include one or more processors 232 , a network interface unit 234 and a memory 236 .
- Memory 236 may store instructions of cloud server software 238 for the lighting adjustment service.
- Memory 214 and memory 236 may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, electrical, optical, or other physical/tangible memory storage devices.
- memory 236 shown in FIG. 2 may include one or more tangible (non-transitory) computer readable storage media encoded with software comprising computer executable instructions that, when executed by the one or more processors 232 , causes the operations as described herein to be performed.
- lighting adjustment module 216 on user device 110 , which may be referred to as an Intelligent Digital Lighting Adjustment (IDLA) module.
- lighting adjustment module 216 may be operative to adjust light automatically and intelligently on existing monitors and laptop screens of the participant's device during a video teleconference meeting or “virtual meeting” in response to ambient light conditions.
- lighting adjustment module 216 may be designed to be an additional part of meeting client application 217 (e.g. Cisco WebEx Meetings) installed on a participant's computer, for example, to control the settings of a selected display configuration.
- the selected display configuration may include display 226 (or e.g. touchscreen) of user device 110 , an external monitor, a laptop screen, a multi-display/screen configuration, etc.
- a variety of different types of screen/monitor configurations may be selected from and utilized for video lighting at user device 110 .
- lighting adjustment module 216 may interface and communicate with the cloud server 130 in association with user display profiles (e.g. display and/or configuration profiles) associated with the participant (e.g. even for a plurality of participants in the video teleconference).
- FIG. 3 is a high-level diagram of a system 300 for a cloud-assisted lighting adjustment for a virtual meeting facilitated by a virtual meeting application according to some implementations of the present disclosure.
- lighting adjustment module 216 may utilize a selected display configuration 310 associated with user device 110 to provide lighting and associated adjustments for the participant of the virtual meeting.
- Lighting adjustment module 216 may receive lighting adjustment assistance from cloud server 130 which is provided in a cloud 302 .
- Cloud server 130 has access to a database 250 which stores user display profiles having lighting setting parameters for different display configurations associated with the participants.
- Lighting adjustment may be performed such that an unoptimized, poorly-lit video presentation 320 of a virtual meeting may be converted into an optimized, well-lit video presentation 322 .
- the selected display configuration 310 will utilize the lighting provided via (at least) display 226 (or e.g. touchscreen) of user device 110 of FIG. 2 .
- lighting adjustment module 216 may be operative to utilize an ambient light sensor associated with user device 110 in order to assess the existing ambient conditions or lighting environment associated with the participant.
- the ambient light sensor may be a built-in light sensor of user device 110 (e.g. in MAC OS X and WINDOWS) for measuring the brightness of the light in a room, in order to adjust the brightness and other parameters in the selected display configuration 310 associated with user device 110 .
- lighting adjustment module 216 may be operative to utilize the same brightness control that can be manually controlled with a laptop's physical buttons or on-screen controls, using a Display Data Channel (DDC). Using DDC allows for more advanced features (e.g. setting brightness to 30%) and other adjustments that directly improve color rendering, power consumption, and the backlight bleeding effect.
- lighting adjustment module 216 provides an intelligent adjustment of the brightness and other lighting parameters, so that a user may be illuminated in a consistent and pleasing way throughout a virtual meeting.
- lighting adjustment module 216 may utilize existing monitors or laptop screens as a light source to provide lighting during a video teleconference meeting for a warm, well-lit video look for a user's face.
- lighting adjustment module 216 may utilize video-based sensing technologies, such as deep learning algorithms and machine vision techniques, to enhance the quality of a virtual meeting, by intelligently adjusting digital light from screens or monitors to provide a consistent and balanced illumination throughout the duration of the virtual meeting. This may be achieved using existing monitor or laptop screens whose settings can be dynamically modified so they can emit the appropriate amount of light when needed based on sensor data collected within the room.
- cloud server 130 may maintain and store user display profiles associated with each one of a plurality of users or participants.
- baseline lighting setting parameters associated with selected display configuration 310 for each user device may be stored.
- lighting adjustment module 216 may receive and apply the baseline lighting setting parameters to one or more displays of the selected display configuration.
- lighting adjustment module 216 may adjust the brightness and/or color pixels of the one or more displays of selected display configuration 310 , for optimizing the lighting of the video of the participant.
- cloud-based control may provide an operation to normalize the lighting for all participants when multiple individuals are presenting in the virtual meeting.
- lighting adjustment module 216 may utilize DDC.
- DDC is a technology that is supported by most computer monitors for allowing software control of the brightness in the same way it is manually controlled by a user (via buttons integrated into the display or laptop keyboard).
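As a rough sketch of how an ambient-light reading might be turned into a value that DDC/CI can accept (the MCCS luminance control, VCP code 0x10, takes a 0–100 percentage), consider the mapping below. The lux thresholds and the linear interpolation are illustrative assumptions, not values from the application:

```python
def ddc_luminance_percent(ambient_lux: float,
                          min_pct: int = 20, max_pct: int = 100) -> int:
    """Map measured ambient light to a DDC/CI luminance percentage.
    Darker rooms get a brighter screen so the display can act as a key
    light; the 50/500-lux thresholds are illustrative assumptions."""
    DARK_LUX, BRIGHT_LUX = 50.0, 500.0
    if ambient_lux <= DARK_LUX:
        return max_pct
    if ambient_lux >= BRIGHT_LUX:
        return min_pct
    frac = (BRIGHT_LUX - ambient_lux) / (BRIGHT_LUX - DARK_LUX)
    return round(min_pct + frac * (max_pct - min_pct))
```

The resulting percentage could then be written to the monitor with any DDC/CI-capable tool (in Python, for example, the third-party monitorcontrol package exposes a set_luminance call).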
- lighting adjustment module 216 may be operative to employ one or more of a plurality of different strategies as follows:
- the background hue or brightness of the application's background may be varied in order to create the desired lighting characteristics.
- existing background matting or replacement technologies (e.g. virtual backgrounds) may be used to change the background of participants, thereby increasing the proportion of pixels that can be controlled for lighting purposes.
- This strategy is shown and described later in relation to FIG. 8 A .
- the screen size where the desktop content is displayed may be reduced to thereby create a border (or one or more border areas) around the display.
- the pixel values within the border may then be adjusted to control available lighting.
- This strategy is shown and described later in relation to FIG. 8 B .
- a display theme may be switched from “dark” to “light” during the virtual meeting, automatically or manually enabled by the user.
- where additional devices (e.g. a tablet or mobile device) are available, companion software may be made available on those devices so that they, too, may be used as adjustable lighting sources.
- This strategy is shown and described later in relation to FIG. 7 E .
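The reduced-desktop strategy of FIG. 8 B can be sketched as simple geometry: shrink the desktop rectangle by a border fraction on each edge and count the pixels that become available as a controllable light source. The function and its parameter names are illustrative, not from the application:

```python
def bordered_layout(screen_w: int, screen_h: int, border_frac: float):
    """Shrink the desktop area by border_frac on each edge and return
    (desktop_rect, lighting_pixels). Pixels outside desktop_rect are
    free to be driven as a light source (the FIG. 8B strategy)."""
    bw = int(screen_w * border_frac)
    bh = int(screen_h * border_frac)
    desktop = (bw, bh, screen_w - 2 * bw, screen_h - 2 * bh)  # x, y, w, h
    lighting_pixels = screen_w * screen_h - desktop[2] * desktop[3]
    return desktop, lighting_pixels
```

On a 1920x1080 display with a 10% border, roughly a third of the screen's pixels end up in the border and can be driven for lighting.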
- a flowchart 400 is shown for describing a method for a cloud-assisted adjustment of lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application according to some implementations.
- the method of FIG. 4 may be performed by a user device having the virtual meeting application and interacting with a server (e.g. a cloud server). More particularly, the method may be performed at least in part by a lighting adjustment module of the user device which interacts with the cloud server.
- the method may be embodied as a computer program product including a non-transitory computer readable medium (e.g. one or more memory elements) and instructions stored in the computer readable medium, where the instructions are executable on one or more processors for performing the steps of the method.
- the user device may receive, from the cloud server, baseline lighting setting parameters associated with a selected display configuration at the user device (step 404 of FIG. 4 ).
- the selected display configuration may be a selected one of a plurality of display configurations.
- the selected display configuration may be a user display of the user device; a plurality of displays connected at the user device; the user display of the user device and an alternate display of a laptop or a tablet, the user display of the user device and a mobile display of a mobile device, etc.
- the baseline lighting setting parameters may be received in response to the user device sending, to the cloud server, a message which indicates a request for the baseline lighting setting parameters.
- the user device may apply the baseline lighting setting parameters to one or more displays of the selected display configuration at the user device (step 406 of FIG. 4 ).
- the user device may adjust the brightness and/or color pixels of the one or more displays of the selected display configuration at the user device, for optimizing the lighting of the video of the participant (step 408 of FIG. 4 ).
- the user device may obtain lighting environment parameters based on sensing a lighting environment of the user device (e.g. with use of one or more sensors of the user device), and may adjust the brightness and/or the color pixels of the one or more displays of the selected display configuration according to the lighting environment parameters.
- the user device may send, to the cloud server, display configuration information associated with the selected display configuration at the user device.
- the display configuration information may include a display configuration setting value for (properly or uniquely) identifying the selected display configuration.
- the display configuration information may (further) include one or more of a number of displays, an arrangement of displays, and display make and model information.
- the user device may cause a user display prompt to be displayed, where the user display prompt indicates the plurality of display configurations for user selection. The user device may receive a user selection of the selected display configuration, and then send to the cloud server a message which indicates the selected display configuration for storage in a user display profile.
- the cloud server may receive from the user device a message which indicates lighting setting parameters associated with the optimizing of the lighting of the video of the participant, and store, in the user profile, the lighting setting parameters in association with the selected display configuration.
- the lighting setting parameters may be for subsequent use by the user device as the baseline lighting setting parameters for the selected display configuration.
- the message which indicates lighting setting parameters may further indicate lighting environment parameters associated with a lighting environment of the user device.
- the cloud server may store a plurality of baseline lighting setting parameters respectively associated with the plurality of display configurations, based on lighting setting parameters received from a plurality of user devices respectively associated with a plurality of participants.
- the cloud server may utilize a machine learning process to generate a plurality of (more optimal) baseline lighting setting parameters respectively associated with the plurality of display configurations, based on the lighting setting parameters received from the plurality of user devices.
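One hedged reading of how the cloud server might derive shared baselines from the lighting setting parameters uploaded by many user devices is a simple per-configuration aggregate. The application says only that a machine learning process may generate more optimal baselines, so the mean below is a deliberately simple stand-in:

```python
from collections import defaultdict
from statistics import mean

def aggregate_baselines(reports):
    """reports: iterable of (config_id, brightness) pairs uploaded by user
    devices after a successful local optimization. Returns a baseline
    brightness per display configuration as a plain mean -- a stand-in
    for the machine learning process the application mentions."""
    by_config = defaultdict(list)
    for config_id, brightness in reports:
        by_config[config_id].append(brightness)
    return {cid: mean(vals) for cid, vals in by_config.items()}
```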
- FIGS. 5 A- 5 B form a flowchart 500 for describing a method for a cloud-assisted adjustment of lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application according to some implementations.
- the method of FIG. 5 A- 5 B may be performed by a user device having the virtual meeting application and interacting with a server (e.g. a cloud server). More particularly, the method may be performed at least in part by a lighting adjustment module of the user device which interacts with the cloud server.
- the method may be embodied as a computer program product including a non-transitory computer readable medium (e.g. one or more memory elements) and instructions stored in the computer readable medium, where the instructions are executable on one or more processors for performing the steps of the method.
- a virtual meeting is initiated (step 504 of FIG. 5 A ).
- a user of the user device having the virtual meeting application is one of the participants in the virtual meeting.
- the user device may monitor the lighting of a video of the participant of the virtual meeting (e.g. with an ambient light sensor, for assessing the participant's lighting environment) (step 506 of FIG. 5 A ). If the lighting is identified as satisfactory or optimal (as tested at step 508 of FIG. 5 A ), the user device may continue to monitor the lighting at step 506 . On the other hand, if the lighting is identified as unsatisfactory or not optimal, the user device may proceed to perform a cloud-assisted interaction with the lighting adjustment module of the user device (step 510 of FIG. 5 A ). The method may continue in FIG. 5 B .
- the user device may send a message to the cloud server (step 512 of FIG. 5 B ).
- the message may indicate (implicitly or explicitly) a request for baseline lighting setting parameters for its display configuration.
- the message may include one or more of an identity associated with the user device, display configuration information of the user device, and lighting environment parameters which indicate the lighting environment of the user device.
- the display configuration information may include a display configuration setting value for (properly or uniquely) identifying the selected display configuration.
- the display configuration information may (further) include one or more of a number of displays, an arrangement of displays, and display make and model information.
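The request message of step 512 might be assembled as follows. The application does not specify a wire format, so the JSON field names here are purely illustrative assumptions:

```python
import json

def build_baseline_request(user_id, config_id, num_displays,
                           arrangement, models, ambient_lux):
    """Assemble the step-512 message: an identity, display configuration
    information (setting value, number/arrangement of displays, make and
    model), and lighting environment parameters. Field names are
    illustrative; the application does not define a wire format."""
    return json.dumps({
        "user_id": user_id,
        "display_configuration": {
            "configuration_id": config_id,
            "num_displays": num_displays,
            "arrangement": arrangement,
            "models": models,
        },
        "lighting_environment": {"ambient_lux": ambient_lux},
    })
```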
- the cloud server may receive the message from the user device (step 532 of FIG. 5 B ). In response to receipt of the message, the cloud server may search a database for identifying a user display profile associated with the user device or user thereof (step 534 of FIG. 5 B ). If the user display profile is not found in the database (as tested in step 536 of FIG. 5 B ), then the cloud server may send a message to the user device (step 538 of FIG. 5 B ). In some implementations, the message may indicate a request for user selection of a display configuration.
- the user device may receive the message from the cloud server (step 514 of FIG. 5 B ), cause a user display prompt to be displayed, where the user display prompt indicates a plurality of display configurations for user selection, and receive a user selection of one of the plurality of display configurations in the user display prompt (step 516 of FIG. 5 B ).
- the user device may send a message to the cloud server (step 518 of FIG. 5 B ).
- the message may indicate the selected display configuration for storage in a user display profile.
- the message may indicate the selected display configuration and lighting environment parameters which indicate the lighting environment of the user device.
- the cloud server may receive the message from the user device and create a user display profile based on the selected display configuration (step 540 of FIG. 5 B ).
- the cloud server may also obtain lighting setting parameters (step 542 of FIG. 5 B ), which may be generated or selected based on the lighting environment parameters received from the user device.
- this process may utilize a model or model function at the cloud server, one example of which is shown and described later in relation to FIG. 6 .
- the lighting setting parameters are baseline lighting setting parameters for use as a baseline at the user device.
- the baseline lighting setting parameters may be obtained from the user display profile if and when found in step 536 of FIG. 5 B .
- the cloud server may then send a message to the user device (step 544 of FIG. 5 B ).
- the message may include the baseline lighting setting parameters.
- the user device may receive the message which includes the baseline lighting setting parameters (step 520 of FIG. 5 B ).
- the user device may apply the baseline lighting setting parameters to one or more displays of the selected display configuration at the user device (step 522 of FIG. 5 B ).
- the user device may adjust a brightness and/or color pixels of the one or more displays of the selected display configuration for optimizing the lighting of the video of the participant (step 524 of FIG. 5 B ).
- FIG. 6 is a block diagram of a process 600 which utilizes a model 602 (or model function) for generating or selecting lighting setting parameters for displays and/or display configurations, and a machine learning process 620 which may be used for training the model 602 .
- the process 600 shown in FIG. 6 may be utilized at the cloud server and performed in response to communications with user devices.
- Model 602 may receive, as inputs, display configuration information 610 of a selected display configuration (e.g. a single or multiple display configuration) at a user device and lighting environment parameters 612 of the lighting environment at the user device.
- Model 602 may generate or select, as an output, lighting setting parameters 614 for one or more displays of the selected display configuration at the user device.
- the lighting setting parameters may be baseline lighting setting parameters for use as a baseline for the selected display configuration and the associated lighting environment.
- the lighting setting parameters may be optimal lighting setting parameters for optimized lighting for the selected display configuration and the associated lighting environment.
- Machine learning process 620 may be used for training the model 602 according to input information 630 associated with different users, where the input information 630 may include display configuration information and lighting setting parameters.
- input information 630 may include display configuration information 1 for user 1 and associated lighting setting parameters 1 ; display configuration information 2 for user 2 and associated lighting setting parameters 2 ; and display configuration information 3 for user 3 and associated lighting setting parameters 3 , etc.
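Machine learning process 620 is not pinned to any particular model, so as a minimal stand-in the sketch below "trains" a nearest-neighbor lookup from (display count, ambient lux) inputs to a brightness setting, using the per-user tuples described above. The distance weighting and all names are assumptions:

```python
def train_lookup(samples):
    """samples: list of ((num_displays, ambient_lux), brightness) pairs,
    one per user's reported configuration and settings. Returns a predict
    function that answers with the brightness of the closest training
    sample -- a deliberately simple stand-in for process 620."""
    def predict(num_displays, ambient_lux):
        def dist(sample):
            (n, lux), _ = sample
            # Scale lux so that 100 lux counts about as much as one display.
            return (n - num_displays) ** 2 + ((lux - ambient_lux) / 100.0) ** 2
        return min(samples, key=dist)[1]
    return predict
```

A query near a known user's configuration and lighting environment simply reuses that user's settings; a real training process would instead generalize across users.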
- FIG. 7 A is an example of a user display prompt 702 for display at a user device, where the user display prompt 702 indicates a plurality of display configurations 700 A. In some implementations, at least one of the plurality of display configurations 700 A shown may be selected at the user device for use in lighting adjustment.
- user display prompt 702 may include a text instruction 704 (e.g. “PLEASE SELECT YOUR DISPLAY CONFIGURATION”) as well as visual indications.
- the plurality of display configurations 700 A shown in FIG. 7 A include (e.g. visual indications of) a display configuration 700 B (e.g. for a single user display of the user device), a display configuration 700 C (e.g. for a plurality of displays connected at the user device), a display configuration 700 D (e.g. for the user display of the user device and an alternate display of a laptop or a tablet), and a display configuration 700 E (e.g. for the user display of the user device and a mobile display of a mobile device).
- the plurality of display configurations 700 A may further include (e.g. visual indications of) a display configuration 800 A for a user display (e.g. for the use of unclaimed desktop space as shown and described herein) and a display configuration 800 B for a user display (e.g. for use of area outside the border of a reduced-sized desktop as shown and described herein).
- each display in a multiple display configuration may be used exclusively for lighting or, in the alternative, only partially in accordance with one of display configurations 800 A and 800 B of FIGS. 8 A and 8 B , respectively.
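The display configurations and the "Display Layout" prompt described above might be enumerated as follows. The identifiers mirror the figure labels, but the enum names and prompt format are assumptions made for this sketch.

```python
# Illustrative enumeration of the display configurations of FIGS. 7A-7E
# and 8A-8B; the identifiers here are assumptions for this sketch.
from enum import Enum

class DisplayConfiguration(Enum):
    SINGLE_DISPLAY = "700B"          # single user display
    MULTI_DISPLAY = "700C"           # several cable-connected displays
    WITH_LAPTOP_OR_TABLET = "700D"   # user display + laptop/tablet screen
    WITH_MOBILE = "700E"             # user display + mobile display
    UNUSED_DESKTOP_SPACE = "800A"    # light from unclaimed desktop space
    REDUCED_DESKTOP_BORDER = "800B"  # light from border of shrunken desktop

def build_prompt(configs):
    """Return the text of the "Display Layout" prompt of FIG. 7A."""
    lines = ["PLEASE SELECT YOUR DISPLAY CONFIGURATION"]
    lines += [f"  [{i}] {c.name} ({c.value})" for i, c in enumerate(configs, 1)]
    return "\n".join(lines)

prompt = build_prompt(list(DisplayConfiguration))
```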
- FIGS. 7 B- 7 E are examples of the plurality of display configurations which may be utilized at a user device, as indicated for user selection in the user display prompt of FIG. 7 A .
- display configuration 700 B shows a user 730 positioned in relation to a (single) user display 740 which may be utilized for both presenting the virtual meeting and for lighting.
- lighting in display configuration 700 B may be provided as described in relation to display configuration 800 A of FIG. 8 A or display configuration 800 B of FIG. 8 B (with lighting adjustments if and as needed).
- display configuration 700 C shows user 730 positioned in relation to a plurality of displays 750 , 752 , and 754 (e.g. cable-connected), where display 752 may be utilized for presenting the virtual meeting and (surrounding) displays 750 and 754 may be utilized for lighting (with lighting adjustments if and as needed).
- display 752 is utilized exclusively for the virtual meeting and displays 750 and 754 are utilized exclusively for lighting (with lighting adjustments if and as needed).
- each one of displays 750 , 752 , and 754 may be only partially used for lighting in accordance with one of display configuration 800 A of FIG. 8 A or display configuration 800 B of FIG. 8 B (with lighting adjustments if and as needed).
- display configuration 700 D shows user 730 positioned in relation to a user display 760 of the user device for presenting the virtual meeting and an alternate display 762 of a laptop or a tablet computer for lighting.
- user display 760 is utilized exclusively for the virtual meeting and alternate display 762 is utilized exclusively for lighting (with lighting adjustments if and as needed).
- each one of user display 760 and alternate display 762 may be only partially used for lighting in accordance with one of display configuration 800 A of FIG. 8 A or display configuration 800 B of FIG. 8 B (with lighting adjustments if and as needed).
- display configuration 700 E shows user 730 positioned in relation to a user display 770 of the user device presenting the virtual meeting and a mobile display 772 of a mobile device of user 730 for lighting.
- user display 770 is utilized exclusively for the virtual meeting and mobile display 772 is utilized exclusively for lighting (with lighting adjustments if and as needed).
- mobile display 772 is utilized exclusively for the virtual meeting and user display 770 is utilized exclusively for lighting (with lighting adjustments if and as needed).
- each one of user display 770 and mobile display 772 may be only partially used for lighting in accordance with one of display configuration 800 A of FIG. 8 A or display configuration 800 B of FIG. 8 B (with lighting adjustments if and as needed).
- FIGS. 8 A- 8 B are examples of different display configurations 800 A- 800 B, respectively, associated with different lighting techniques which may be utilized in a display 802 of the user device.
- a used desktop space 810 may display the presentation for the virtual meeting and unused desktop spaces 812 and 814 may be utilized for lighting.
- a used desktop space 830 may display the presentation for the virtual meeting and areas 832 and 834 outside the border of an adjusted, reduced-sized desktop may be utilized for lighting.
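The reduced-desktop technique of display configuration 800B can be made concrete with a small geometry helper: shrink the desktop, centre it, and count the freed border pixels available for lighting. The function and its parameters are assumptions for illustration, not the patent's method.

```python
# Geometric sketch of display configuration 800B: the desktop is scaled
# down and the ring of pixels outside its border is driven as a light
# source. This helper is one assumed way to compute that region.
def lighting_border(screen_w, screen_h, scale):
    """Shrink the desktop by `scale` (0 < scale <= 1), centred on the
    screen; return (desktop_rect, border_area_in_pixels)."""
    dw, dh = int(screen_w * scale), int(screen_h * scale)
    x, y = (screen_w - dw) // 2, (screen_h - dh) // 2
    desktop = (x, y, dw, dh)                       # (x, y, width, height)
    border_pixels = screen_w * screen_h - dw * dh  # freed for lighting
    return desktop, border_pixels

desktop, border = lighting_border(1920, 1080, 0.9)
# a 10% linear shrink frees roughly 19% of the panel's pixels for lighting
```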
- lighting adjustment functionality of the lighting adjustment module associated with the video teleconference application may be triggered.
- the user device may be associated with a display(s) that is connected to or built into the user device.
- the lighting adjustment module may retrieve previously-saved baseline lighting settings from a user display profile associated with the user.
- the profile may be a cloud profile which is stored in the cloud and retrieved via a cloud server.
- setting parameters such as display brightness, range, offset, temperature, and/or backlight (e.g. for monitors that support backlight) may be downloaded as a part of the profile.
- the profile may include the monitor's calibration profile settings and the user's previously-saved environment settings.
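The cloud user display profile described above might be modelled as a small serialisable record of the listed settings. The field names and JSON format are assumptions; the patent does not specify a wire format.

```python
# Hedged sketch of the cloud display profile; field names and the JSON
# (de)serialisation are assumptions made for illustration.
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class DisplayProfile:
    brightness: float           # display brightness, percent
    range: float                # luminance range
    offset: float               # black-level offset
    temperature_k: int          # colour temperature in Kelvin
    backlight: Optional[float]  # None for monitors without backlight support

def save_profile(profile):
    """Serialise the profile for upload to the cloud profile store."""
    return json.dumps(asdict(profile))

def load_profile(payload):
    """Restore previously-saved baseline settings retrieved from the cloud."""
    return DisplayProfile(**json.loads(payload))

baseline = DisplayProfile(65.0, 1.0, 0.0, 6500, 80.0)
restored = load_profile(save_profile(baseline))
```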
- the lighting adjustment module may cause a “Display Layout” prompt to be displayed (see e.g. FIG. 7 A ) for user selection based on a current set-up of the user.
- the lighting adjustment module may perform a lookup, in a cloud-connected database, for settings/parameters associated with the selected display configuration. This lookup may determine whether there have been any previous instances of the display settings/parameters (e.g. from other users) that may serve as a baseline or initial settings value.
- the cloud-connected database may store all monitor display profiles and optimal configurations for rendering well-lit video.
- a machine learning model may repeatedly update these baseline settings for training the model.
- data may be regularly captured and analyzed from a plurality of virtual meetings, and/or from a plurality of display configurations of a plurality of different users, in order to train the model.
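The lookup-with-repeated-updates behaviour above can be sketched as a per-configuration store that folds in observations from many users and falls back to a default when a configuration has never been seen. The store, key format, and default value are assumptions for this sketch.

```python
# Sketch of the cloud-side baseline lookup and update described above;
# the key format and default value are illustrative assumptions.
class BaselineStore:
    """Cloud-connected store of per-configuration baseline settings."""
    def __init__(self, default_brightness=50.0):
        self._default = default_brightness
        self._observed = {}  # config key -> list of observed brightness values

    def update(self, config_key, brightness):
        """Fold in a new observation, mirroring the repeated baseline
        updates used to train the model."""
        self._observed.setdefault(config_key, []).append(brightness)

    def lookup(self, config_key):
        """Baseline for a configuration: mean of previous instances from
        other users, or a conservative default if none exist."""
        seen = self._observed.get(config_key)
        return sum(seen) / len(seen) if seen else self._default

store = BaselineStore()
store.update("dual-monitor", 70.0)
store.update("dual-monitor", 80.0)
baseline = store.lookup("dual-monitor")   # averaged from prior users
missing = store.lookup("single-monitor")  # falls back to the default
```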
- the lighting adjustment module may use the baseline lighting settings as a baseline for making adjustments. As the lighting adjustment module runs in the background, it may analyze and evaluate the user's current (lighting) environment using the data captured from an ambient light sensor (e.g. the built-in sensor on many new laptops and monitors), as well as the screen pixels (to account for the applications being shared), in order to optimize the display brightness and temperature.
- the machine learning model may utilize the data captured for dynamically adjusting the display settings to ensure that the video output is well-lit throughout the entire virtual meeting (e.g. even if the user ambient lighting or on-screen application changes during this interval).
- the lighting adjustment module may first attempt to utilize the unused screen space around the monitor (see e.g. FIG. 8 A ) to illuminate the person with a consistent and pleasing glow. If the unused screen space is not feasible, or if the virtual meeting does not have focus, the screen size where desktop content is displayed may be decreased, creating a border or edge around the display whose pixel values may be changed in order to control the level of illumination (see e.g. FIG. 8 B ).
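The background adjustment just described — more fill light for darker rooms and darker shared content, starting from the baseline — can be sketched with a simple heuristic. The specific mapping below is invented for illustration; the actual module would use the trained model rather than fixed coefficients.

```python
# Sketch of the background brightness adjustment; the mapping from lux
# and content luma to brightness is an invented heuristic, not the
# patent's algorithm.
def adjust_brightness(ambient_lux, content_luma, baseline=50.0):
    """Darker rooms and darker shared content call for more fill light
    from the display; clamp to the panel's 0-100% range.

    ambient_lux:  reading from the built-in ambient light sensor
    content_luma: mean luma of the current screen pixels, 0.0-1.0
    """
    ambient_term = max(0.0, 30.0 - ambient_lux / 10.0)  # dim room -> boost
    content_term = (1.0 - content_luma) * 20.0          # dark app -> boost
    return min(100.0, max(0.0, baseline + ambient_term + content_term))

bright_room = adjust_brightness(ambient_lux=400.0, content_luma=0.8)
dark_room = adjust_brightness(ambient_lux=20.0, content_luma=0.2)
```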
- the lighting adjustment module may check for the presence of a secondary display (such as a secondary monitor or laptop's screen).
- the user may be prompted with the “Display Layout” options (see e.g. FIG. 7 A ) to select a different display configuration, which may include one or more secondary displays or screens (see e.g. FIGS. 7 C and 7 D , and optionally FIG. 7 E ).
- if a secondary monitor or laptop screen is detected, and/or the user selects a (e.g. new) display configuration, the lighting adjustment module will automatically and adaptively adjust brightness and temperature throughout the meeting for the selected display configuration with respect to its multiple screens.
- the lighting adjustment module may prompt the user to use a mobile device (e.g. cellular telephone, smartphone, etc.) (e.g. FIG. 7 E ).
- the lighting adjustment module may be installed along with the virtual meetings application on the mobile device.
- the lighting adjustment module may intelligently adjust the intensity of a camera flash light from the mobile device during the virtual meeting to provide a secondary lighting source for a well-lit video throughout the meeting.
- the techniques and mechanisms of the present disclosure may further provide for flicker reduction.
- flicker may come from multiple sources: for example, low-quality lighting power supply electronics that do not adequately filter line noise from household power (such line noise may be associated with the standard 60 Hertz signal of the power grid), or ceiling fans, which may periodically occlude light sources or cast shadows.
- the techniques and mechanisms of the present disclosure may be utilized in flicker reduction to improve the temporal consistency of the lighting, which may also increase the overall amount and temperature of light available.
- the lighting adjustment module may utilize machine learning algorithms to detect the variations in the incoming video stream and determine their periodicity. Based on the analysis, the amount of light supplied to illuminate the subject may be varied over time to compensate for the flicker.
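The detect-periodicity-then-compensate loop above can be sketched with classical signal processing: estimate the flicker period from the per-frame mean luma via autocorrelation, then drive the display in anti-phase so the combined light stays level. This stands in for the machine-learning detection the disclosure describes; the function names and frame representation are assumptions.

```python
# Sketch of flicker compensation: estimate the period of the brightness
# variation across frames, then emit anti-phase light levels. A simple
# autocorrelation stand-in for the ML detection described above.
import math

def estimate_period(luma, max_lag=30):
    """Return the lag (in frames) with the strongest autocorrelation."""
    n = len(luma)
    mean = sum(luma) / n
    centred = [v - mean for v in luma]
    def acf(lag):
        return sum(centred[i] * centred[i + lag] for i in range(n - lag))
    return max(range(2, max_lag + 1), key=acf)

def compensation(luma, period):
    """Anti-phase light levels: add light where the flicker cycle dips."""
    cycle = luma[-period:]
    peak = max(cycle)
    return [peak - v for v in cycle]

# synthetic flicker: mean luma oscillating with a 6-frame period
frames = [0.5 + 0.1 * math.sin(2 * math.pi * i / 6) for i in range(60)]
period = estimate_period(frames)
levels = compensation(frames, period)
```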
- the lighting adjustment module may adaptively adjust the digital light emitting from a user's existing display configuration intelligently, according to the user's ambient lighting conditions and the currently displayed content on the screen during screen-sharing.
- the lighting adjustment module may be installed on the computer for controlling the configuration of an external monitor or laptop screen, and be an additional part of the video teleconference meeting application.
- the solution may be utilized to provide consistent and well-lit lighting and illumination to participants during video teleconference meetings.
- the lighting adjustment module may enhance the video-based virtual meetings for users by proposing to utilize a variety of existing monitor/screen configurations, to provide adaptive brightness and temperature throughout the meeting. This may be achieved while saving costs for the user (e.g. those associated with the purchasing of external lighting setups), being environment friendly, and reducing overall eye strain of the users.
- FIG. 9 illustrates a hardware block diagram of a computing device 900 that may perform functions associated with operations discussed herein in connection with the techniques described in relation to the above figures, especially in relation to FIGS. 2 - 4 , 5 A- 5 B, 6 , 7 A- 7 E , and 8 A- 8 B.
- a computing device such as computing device 900 or any combination of computing devices 900 , may be configured as any entity/entities as discussed for the techniques depicted in connection with the figures in order to perform operations of the various techniques discussed herein.
- the computing device 900 may include one or more processor(s) 902 , one or more memory element(s) 904 , storage 906 , a bus 908 , one or more network processor unit(s) 910 interconnected with one or more network input/output (I/O) interface(s) 912 , one or more I/O interface(s) 914 , and control logic 920 .
- instructions associated with logic for computing device 900 can overlap in any manner and are not limited to the specific allocation of instructions and/or operations described herein.
- processor(s) 902 is/are at least one hardware processor configured to execute various tasks, operations and/or functions for computing device 900 as described herein according to software and/or instructions configured for computing device 900 .
- processor(s) 902 can execute any type of instructions associated with data to achieve the operations detailed herein.
- processor(s) 902 can transform an element or an article (e.g., data, information) from one state or thing to another state or thing.
- Any of potential processing elements, microprocessors, digital signal processor, baseband signal processor, modem, PHY, controllers, systems, managers, logic, and/or machines described herein can be construed as being encompassed within the broad term ‘processor’.
- memory element(s) 904 and/or storage 906 is/are configured to store data, information, software, and/or instructions associated with computing device 900 , and/or logic configured for memory element(s) 904 and/or storage 906 .
- control logic 920 can, in various embodiments, be stored for computing device 900 using any combination of memory element(s) 904 and/or storage 906 .
- storage 906 can be consolidated with memory element(s) 904 (or vice versa), or can overlap/exist in any other suitable manner.
- bus 908 can be configured as an interface that enables one or more elements of computing device 900 to communicate in order to exchange information and/or data.
- Bus 908 can be implemented with any architecture designed for passing control, data and/or information between processors, memory elements/storage, peripheral devices, and/or any other hardware and/or software components that may be configured for computing device 900 .
- bus 908 may be implemented as a fast kernel-hosted interconnect, potentially using shared memory between processes (e.g., logic), which can enable efficient communication paths between the processes.
- network processor unit(s) 910 may enable communication between computing device 900 and other systems, entities, etc., via network I/O interface(s) 912 to facilitate operations discussed for various embodiments described herein.
- network processor unit(s) 910 can be configured as a combination of hardware and/or software, such as one or more Ethernet driver(s) and/or controller(s) or interface cards, Fibre Channel (e.g., optical) driver(s) and/or controller(s), and/or other similar network interface driver(s) and/or controller(s) now known or hereafter developed to enable communications between computing device 900 and other systems, entities, etc. to facilitate operations for various embodiments described herein.
- network I/O interface(s) 912 can be configured as one or more Ethernet port(s), Fibre Channel ports, and/or any other I/O port(s) now known or hereafter developed.
- the network processor unit(s) 910 and/or network I/O interface(s) 912 may include suitable interfaces for receiving, transmitting, and/or otherwise communicating data and/or information in a network environment.
- I/O interface(s) 914 allow for input and output of data and/or information with other entities that may be connected to computing device 900 .
- I/O interface(s) 914 may provide a connection to external devices such as a keyboard, keypad, a touch screen, and/or any other suitable input and/or output device now known or hereafter developed.
- external devices can also include portable computer readable (non-transitory) storage media such as database systems, thumb drives, portable optical or magnetic disks, and memory cards.
- external devices can be a mechanism to display data to a user, such as, for example, a computer monitor, a display screen, or the like.
- control logic 920 can include instructions that, when executed, cause processor(s) 902 to perform operations, which can include, but not be limited to, providing overall control operations of computing device 900 ; interacting with other entities, systems, etc. described herein; maintaining and/or interacting with stored data, information, parameters, etc. (e.g., memory element(s), storage, data structures, databases, tables, etc.); combinations thereof; and/or the like to facilitate various operations for embodiments described herein.
- control logic 920 may be identified based upon application(s) for which they are implemented in a specific embodiment. However, it should be appreciated that any particular program nomenclature herein is used merely for convenience; thus, embodiments herein should not be limited to use(s) solely described in any specific application(s) identified and/or implied by such nomenclature.
- entities as described herein may store data/information in any suitable volatile and/or non-volatile memory item (e.g., magnetic hard disk drive, solid state hard drive, semiconductor storage device, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM), application specific integrated circuit (ASIC), etc.), software, logic (fixed logic, hardware logic, programmable logic, analog logic, digital logic), hardware, and/or in any other suitable component, device, element, and/or object as may be appropriate.
- Any of the memory items discussed herein should be construed as being encompassed within the broad term ‘memory element’.
- Data/information being tracked and/or sent to one or more entities as discussed herein could be provided in any database, table, register, list, cache, storage, and/or storage structure: all of which can be referenced at any suitable timeframe. Any such storage options may also be included within the broad term ‘memory element’ as used herein.
- operations as set forth herein may be implemented by logic encoded in one or more tangible media that is capable of storing instructions and/or digital information and may be inclusive of non-transitory tangible media and/or non-transitory computer readable storage media (e.g., embedded logic provided in: an ASIC, digital signal processing (DSP) instructions, software [potentially inclusive of object code and source code], etc.) for execution by one or more processor(s), and/or other similar machine, etc.
- memory element(s) 904 and/or storage 906 can store data, software, code, instructions (e.g., processor instructions), logic, parameters, combinations thereof, and/or the like used for operations described herein.
- software of the present embodiments may be available via a non-transitory computer useable medium (e.g., magnetic or optical mediums, magneto-optic mediums, CD-ROM, DVD, memory devices, etc.) of a stationary or portable program product apparatus, downloadable file(s), file wrapper(s), object(s), package(s), container(s), and/or the like.
- non-transitory computer readable storage media may also be removable.
- a removable hard drive may be used for memory/storage in some implementations.
- Other examples may include optical and magnetic disks, thumb drives, and smart cards that can be inserted and/or otherwise connected to a computing device for transfer onto another computer readable storage medium.
- Embodiments described herein may include one or more networks, which can represent a series of points and/or network elements of interconnected communication paths for receiving and/or transmitting messages (e.g., packets of information) that propagate through the one or more networks. These network elements offer communicative interfaces that facilitate communications between the network elements.
- a network can include any number of hardware and/or software elements coupled to (and in communication with) each other through a communication medium.
- Such networks can include, but are not limited to, any local area network (LAN), virtual LAN (VLAN), wide area network (WAN) (e.g., the Internet), software defined WAN (SD-WAN), wireless local area (WLA) access network, wireless wide area (WWA) access network, metropolitan area network (MAN), Intranet, Extranet, virtual private network (VPN), Low Power Network (LPN), Low Power Wide Area Network (LPWAN), Machine to Machine (M2M) network, Internet of Things (IoT) network, Ethernet network/switching system, any other appropriate architecture and/or system that facilitates communications in a network environment, and/or any suitable combination thereof.
- Networks through which communications propagate can use any suitable technologies for communications, including wireless communications (e.g., 4G/5G/nG, IEEE 802.11 (e.g., Wi-Fi®/Wi-Fi6®), IEEE 802.16 (e.g., Worldwide Interoperability for Microwave Access (WiMAX)), Radio-Frequency Identification (RFID), Near Field Communication (NFC), Bluetooth™, mmWave, Ultra-Wideband (UWB), etc.), and/or wired communications (e.g., T1 lines, T3 lines, digital subscriber lines (DSL), Ethernet, Fibre Channel, etc.).
- any suitable means of communications may be used such as electric, sound, light, infrared, and/or radio to facilitate communications through one or more networks in accordance with embodiments herein.
- Communications, interactions, operations, etc. as discussed for various embodiments described herein may be performed among entities that may be directly or indirectly connected utilizing any algorithms, communication protocols, interfaces, etc. (proprietary and/or non-proprietary) that allow for the exchange of data and/or information.
- entities for various embodiments described herein can encompass network elements (which can include virtualized network elements, functions, etc.) such as, for example, network appliances, forwarders, routers, servers, switches, gateways, bridges, loadbalancers, firewalls, processors, modules, radio receivers/transmitters, or any other suitable device, component, element, or object operable to exchange information that facilitates or otherwise helps to facilitate various operations in a network environment as described for various embodiments herein.
- Communications in a network environment can be referred to herein as ‘messages’, ‘messaging’, ‘signaling’, ‘data’, ‘content’, ‘objects’, ‘requests’, ‘queries’, ‘responses’, ‘replies’, etc. which may be inclusive of packets.
- packet may be used in a generic sense to include packets, frames, segments, datagrams, and/or any other generic units that may be used to transmit communications in a network environment.
- a packet is a formatted unit of data that can contain control or routing information (e.g., source and destination address, source and destination port, etc.) and data, which is also sometimes referred to as a ‘payload’, ‘data payload’, and variations thereof.
- control or routing information, management information, or the like can be included in packet fields, such as within header(s) and/or trailer(s) of packets.
- IP addresses discussed herein and in the claims can include any IP version 4 (IPv4) and/or IP version 6 (IPv6) addresses.
- to the extent that embodiments presented herein relate to the storage of data, the embodiments may employ any number of any conventional or other databases, data stores, or storage structures (e.g., files, databases, data structures, data or other repositories, etc.) to store information.
- references to various features (e.g., elements, structures, nodes, modules, components, engines, logic, steps, operations, functions, characteristics, etc.) included in ‘one embodiment’, ‘example embodiment’, ‘an embodiment’, ‘another embodiment’, ‘certain embodiments’, ‘some embodiments’, ‘various embodiments’, ‘other embodiments’, ‘alternative embodiment’, and the like are intended to mean that any such features are included in one or more embodiments of the present disclosure, but may or may not necessarily be combined in the same embodiments.
- a module, engine, client, controller, function, logic or the like as used herein in this Specification can be inclusive of an executable file comprising instructions that can be understood and processed on a server, computer, processor, machine, compute node, combinations thereof, or the like and may further include library modules loaded during execution, object files, system files, hardware logic, software logic, or any other executable modules.
- each of the expressions ‘at least one of X, Y and Z’, ‘at least one of X, Y or Z’, ‘one or more of X, Y and Z’, ‘one or more of X, Y or Z’ and ‘X, Y and/or Z’ can mean any of the following: 1) X, but not Y and not Z; 2) Y, but not X and not Z; 3) Z, but not X and not Y; 4) X and Y, but not Z; 5) X and Z, but not Y; 6) Y and Z, but not X; or 7) X, Y, and Z.
- ‘first’, ‘second’, ‘third’, etc. are intended to distinguish the particular nouns they modify (e.g., element, condition, node, module, activity, operation, etc.). Unless expressly stated to the contrary, the use of these terms is not intended to indicate any type of order, rank, importance, temporal sequence, or hierarchy of the modified noun.
- ‘first X’ and ‘second X’ are intended to designate two ‘X’ elements that are not necessarily limited by any order, rank, importance, temporal sequence, or hierarchy of the two elements.
- ‘at least one of’ and ‘one or more of’ can be represented using the ‘(s)’ nomenclature (e.g., one or more element(s)).
Description
- The present application is a continuation of U.S. application Ser. No. 17/504,956, filed on Oct. 19, 2021, the contents of which are incorporated by reference herein in their entirety.
- The present disclosure relates generally to video teleconferencing, and more particularly to techniques and mechanisms for providing intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings.
- There are many amongst us who have participated in a virtual meeting (e.g. a virtual web-based or online meeting) where the lighting conditions for one or more of the participants were inadequate. In many instances, the lighting for the participants in a virtual meeting may be relatively dark. At these times, one may not be able to clearly see or identify other participants on the screen, especially in the case where the virtual meeting has a large number of participants.
- In some cases, external lighting setups, such as video lighting setups and other lighting kit setups, may have to be purchased for the virtual meetings. These types of lighting setups are not environmentally friendly and may come at a cost to the user. Also, in today's post-pandemic era with everyone concerned about their health and wellness, it is desirable to provide lighting that is soothing and that will not cause eye strain, especially over relatively long virtual meetings.
- FIG. 1 is an illustrative representation of a basic network architecture within which a virtual meeting amongst a plurality of user devices may be facilitated according to some implementations of the present disclosure;
- FIG. 2 is an illustrative representation of a user device and a cloud server for use in the virtual meeting according to some implementations of the present disclosure;
- FIG. 3 is a high-level diagram of a system for a cloud-assisted lighting adjustment for a virtual meeting facilitated by a virtual meeting application according to some implementations of the present disclosure;
- FIG. 4 is a flowchart for describing a method for use in a cloud-assisted adjustment of lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application of the user device with use of a selected display configuration;
- FIGS. 5A-5B form a flowchart for describing a method for use in a cloud-assisted adjustment of a video of a participant in a virtual meeting facilitated by a virtual meeting application of the user device with use of a selected display configuration;
- FIG. 6 is a block diagram of a process which utilizes a model for generating or selecting baseline lighting setting parameters for displays and/or display configurations, and a machine learning process for training the model;
- FIG. 7A is an example of a user display prompt for display at a user device, where the user display prompt indicates a plurality of display configurations, at least one of which may be selected for use at the user device;
- FIGS. 7B-7E are examples of the plurality of display configurations associated with different lighting arrangements which may be used at the user device;
- FIGS. 8A-8B are examples of different display configurations that are associated with different lighting techniques which may be utilized in a display of the user device;
- FIG. 9 illustrates a hardware block diagram of a computing device that may perform functions associated with operations discussed herein.
- Techniques and mechanisms for providing intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings are described herein.
- In one illustrative example, a user device is operative to adjust and optimize lighting of a video of a participant in a virtual meeting with use of a selected display configuration at the user device, with the assistance of a cloud server. The selected display configuration may be selected from one of a plurality of display configurations (e.g. a user display of the device, a plurality of displays connected at the user device, the user display and an alternate display of a laptop or a tablet, etc.). The user device may receive, from the cloud server, baseline lighting setting parameters associated with the selected display configuration. The user device may apply the baseline lighting setting parameters to one or more displays of the selected display configuration at the user device. Using the baseline lighting setting parameters as a baseline, the user device may automatically adjust a brightness and/or color pixels of the one or more displays of the selected display configuration, for optimizing the lighting of the video of the participant.
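The illustrative example above — fetch a baseline from the cloud for the selected configuration, apply it to each display, then adjust from that baseline — can be sketched end-to-end as follows. All names, the lookup callable, and the adjustment rule are illustrative assumptions.

```python
# Hedged end-to-end sketch of the flow in the illustrative example:
# cloud baseline -> apply to displays -> adjust from the baseline.
def run_lighting_adjustment(cloud_lookup, displays, config_key, ambient_lux):
    """cloud_lookup: callable mapping a configuration key to a baseline
    brightness, standing in for the cloud server's response."""
    baseline = cloud_lookup(config_key)
    for d in displays:                 # apply the baseline to each display
        d["brightness"] = baseline
    # use the baseline as a starting point for automatic adjustment
    boost = max(0.0, (100.0 - ambient_lux) / 10.0)  # invented heuristic
    for d in displays:
        d["brightness"] = min(100.0, d["brightness"] + boost)
    return displays

displays = [{"name": "primary"}, {"name": "secondary"}]
out = run_lighting_adjustment(lambda key: 60.0, displays, "dual",
                              ambient_lux=40.0)
```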
- More detailed and alternative techniques and implementations are provided herein as described below.
- Presented herein are cloud-assisted techniques and mechanisms that allow for adjusting and optimizing lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application of the user device.
- Referring first to
FIG. 1, a diagram of a network environment 100 in which the techniques of the present disclosure may be carried out is shown. In FIG. 1, a plurality of user devices 110(1), 110(2), to 110(N) associated with a plurality of users (e.g. User 1, User 2, through User N) or participants of a virtual meeting are shown. User devices 110(1), 110(2), to 110(N) may take on a variety of forms, including a smartphone, a tablet, a laptop computer, a desktop computer, etc. User devices 110(1), 110(2), to 110(N) may communicate with various network-based entities shown in FIG. 1 via one or more networks 190. - Networked conferencing systems typically employ a client-server architecture, whereby each participant's client software (e.g. running on the participant's computer or workstation) connects to a
web conference server 170. Amongst other functionality, web conference server 170 may include conference control services and access control services that govern access to conference functionality and/or resources. When a participant "logs in" to a virtual meeting, the participant's identification may determine the permissions granted. When a host or moderator of the virtual meeting logs in, the host may be given the broadest access rights to control the virtual meeting. Web conference server 170 may control all communications with the various clients according to a set of permissions granted to the conference participant logged in on that client. - A
media orchestrator 160 may ensure that all or select participants get connected to a meeting supported by a media provider 180 or, in the case of multiple media providers, to the appropriate one or more media providers. The functions of media orchestrator 160 and/or media provider(s) 180 may be performed by separate entities as shown, or may be integrated (either on-premises, in the cloud, or a hybrid of on-premises and cloud). - A
cloud server 130 may include a lighting adjustment service for user devices 110(1), 110(2), to 110(N) for virtual meetings. More particularly, this cloud-assisted service may assist in the adjusting and optimizing of lighting of video of participants in the virtual meetings, with use of selected display configurations of user devices 110(1), 110(2), to 110(N). In FIG. 1, it is shown that cloud server 130, along with media orchestrator 160, web conference server 170, and media provider 180, may reside off-premises in a cloud or data center computing environment. In some implementations, cloud server 130 may reside on-premises. -
FIG. 2 is a block diagram 200 of a user device 110 (one of user devices 110(1), 110(2), to 110(N) of FIG. 1) and cloud server 130 which are configured to connect to one or more networks 190 for network-based communication. As shown, user device 110 may include one or more processors 220 (e.g., a microprocessor or microcontroller), a network interface unit 222 that enables wired and/or wireless network communication, one or more user interface components 224 (e.g., keyboard, mouse, touchscreen, etc.), and at least one display 226 (e.g., a display screen of a monitor, or touch screen, etc.). User device 110 may also include a memory 214 for storing software instructions of a lighting adjustment module 216, a meeting client application 217 (e.g. a virtual meeting application), and one or more join links 218. For the sake of completeness, FIG. 2 also shows an operating system 219 on which the lighting adjustment module 216 and the meeting client application 217 may run. -
Cloud server 130 of FIG. 2 may include one or more processors 232, a network interface unit 234 and a memory 236. Memory 236 may store instructions of cloud server software 238 for the lighting adjustment service. Memory 214 and memory 236 may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, electrical, optical, or other physical/tangible memory storage devices. In general, memory 236 shown in FIG. 2 may include one or more tangible (non-transitory) computer readable storage media encoded with software comprising computer executable instructions that, when executed by the one or more processors 232, cause the operations as described herein to be performed. - As described herein, the techniques and mechanisms of the present disclosure may involve use of
lighting adjustment module 216 on user device 110, which may be referred to as an Intelligent Digital Lighting Adjustment (IDLA) module. With the assistance of cloud server 130, lighting adjustment module 216 may be operative to adjust light automatically and intelligently on existing monitors and laptop screens of the participant's device during a video teleconference meeting or "virtual meeting" in response to ambient light conditions. - In some implementations,
lighting adjustment module 216 may be designed to be an additional part of meeting client application 217 (e.g. Cisco WebEx Meetings) installed on a participant's computer, for example, to control the settings of a selected display configuration. The selected display configuration may include display 226 (or e.g. touchscreen) of user device 110, an external monitor, a laptop screen, a multi-display/screen configuration, etc. Notably, a variety of different types of screen/monitor configurations may be selected from and utilized for video lighting at user device 110. For improved lighting, lighting adjustment module 216 may interface and communicate with cloud server 130 in association with user display profiles (e.g. display and/or configuration profiles) associated with the participant (e.g. even for a plurality of participants in the video teleconference). -
FIG. 3 is a high-level diagram of a system 300 for a cloud-assisted lighting adjustment for a virtual meeting facilitated by a virtual meeting application according to some implementations of the present disclosure. In FIG. 3, lighting adjustment module 216 may utilize a selected display configuration 310 associated with user device 110 to provide lighting and associated adjustments for the participant of the virtual meeting. Lighting adjustment module 216 may receive lighting adjustment assistance from cloud server 130 which is provided in a cloud 302. Cloud server 130 has access to a database 250 which stores user display profiles having lighting setting parameters for different display configurations associated with the participants. Lighting adjustment may be performed such that an unoptimized, poorly-lit video presentation 320 of a virtual meeting may be converted into an optimized, well-lit video presentation 322. At least in some cases, which will depend on the user's setup as well as the user's selection, the selected display configuration 310 will utilize the lighting provided via (at least) display 226 (or e.g. touchscreen) of user device 110 of FIG. 2. - In some implementations,
lighting adjustment module 216 may be operative to utilize an ambient light sensor associated with user device 110 in order to assess the existing ambient conditions or lighting environment associated with the participant. In preferred implementations, the ambient light sensor may be a built-in light sensor of user device 110 (e.g. in MAC OS X and WINDOWS) for measuring the brightness of the light in a room, in order to adjust the brightness and other parameters in the selected display configuration 310 associated with user device 110. In some implementations, lighting adjustment module 216 may be operative to utilize the same brightness control that can be manually controlled with a laptop's physical buttons or screen, using a Display Data Channel (DDC). Using DDC allows for more advanced features (e.g. setting brightness to 30%), as well as direct improvements with respect to color rendering, power consumption, and the backlight bleeding effect. - To illustrate by example, studies have shown that people tend to look their best when illuminated by light that measures warm (e.g. around 2700 kelvins). Most people prefer a warm, yellowish glow, as such coloring appears as natural lighting and makes people feel more "at home" on their computers. In some implementations, it is desired to replicate such effects with use of
lighting adjustment module 216 for an intelligent adjustment of the brightness and other lighting parameters, so that a user may be illuminated in a consistent and pleasing way throughout a virtual meeting. - With selected
display configuration 310, one of a variety of different types of screen or monitor configurations of user device 110 may be selected for video lighting (e.g. the use of a multi-screen display configuration for video lighting). Such options are considered to be very desirable or important, especially considering that in many instances the lighting provided from a single display is insufficient. Thus, lighting adjustment module 216 may utilize existing monitors or laptop screens as a light source to provide lighting during a video teleconference meeting for a warm, well-lit video look for a user's face. - In some implementations,
lighting adjustment module 216 may utilize video-based sensing technologies, such as deep learning algorithms and machine vision techniques, to enhance the quality of a virtual meeting, by intelligently adjusting digital light from screens or monitors to provide a consistent and balanced illumination throughout the duration of the virtual meeting. This may be achieved using existing monitor or laptop screens whose settings can be dynamically modified so they can emit the appropriate amount of light when needed based on sensor data collected within the room. - As an alternative to local control (e.g. a desktop application), cloud-based or cloud-assisted control may be utilized as needed and described herein. In some implementations,
cloud server 130 may maintain and store user display profiles associated with each one of a plurality of users or participants. In a user display profile, baseline lighting setting parameters associated with selected display configuration 310 for each user device may be stored. At the outset of a virtual meeting, lighting adjustment module 216 may receive and apply the baseline lighting setting parameters to one or more displays of the selected display configuration. Using the baseline lighting setting parameters as a baseline, lighting adjustment module 216 may adjust the brightness and/or color pixels of the one or more displays of selected display configuration 310, for optimizing the lighting of the video of the participant. In some further implementations, cloud-based control may provide an operation to normalize the lighting for all participants when multiple individuals are presenting in the virtual meeting. - A more detailed description of the techniques and mechanisms of the present disclosure is now provided. In particular, it has been recognized that the supply of light that emanates from a computer's display screen(s) depends on two different factors: (1) the overall brightness (e.g. often the backlight brightness) of the screen; and (2) the color and brightness of the individual pixels corresponding to the currently displayed content on the screen.
- For controlling the overall brightness per (1) above,
lighting adjustment module 216 may utilize DDC. DDC is a technology that is supported by most computer monitors for allowing software control of the brightness in the same way it is manually controlled by a user (via buttons integrated into the display or laptop keyboard). For controlling the pixel values per (2) above, lighting adjustment module 216 may be operative to employ one or more of a plurality of different strategies as follows: - (a) When the video teleconference application has focus, the hue or brightness of the application's background may be varied in order to create the desired lighting characteristics. In addition, existing background matting or replacement technologies (e.g. virtual background) may be used to change the background of participants, thereby increasing the proportion of pixels that can be controlled for lighting purposes. One example of this strategy is shown and described later in relation to
FIG. 8A . - (b) When the video teleconference meeting does not have focus, the screen size where the desktop content is displayed may be reduced to thereby create a border (or one or more border areas) around the display. The pixel values within the border may then be adjusted to control available lighting. One example of this strategy is shown and described later in relation to
FIG. 8B . - (c) A display theme may be switched from “dark” to “light” during the virtual meeting, automatically or manually enabled by the user.
- (d) If multiple display screens are available, the secondary (or tertiary, etc.) displays or screens may be utilized, overridden completely or, in the alternative, partially as described in (a) and (b) above, in order to provide controllable lighting pixels. One example of this strategy is shown and described later in relation to
FIGS. 7C and/or 7D . - (e) If mobile devices (e.g. cellular phones, smartphones, or tablet computers) are available, companion software may be made available on those devices to use them as adjustable lighting sources. One example of this strategy is shown and described later in relation to
FIG. 7E . - With reference now to
FIG. 4, a flowchart 400 is shown for describing a method for a cloud-assisted adjustment of lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application according to some implementations. The method of FIG. 4 may be performed by a user device having the virtual meeting application and interacting with a server (e.g. a cloud server). More particularly, the method may be performed at least in part by a lighting adjustment module of the user device which interacts with the cloud server. The method may be embodied as a computer program product including a non-transitory computer readable medium (e.g. one or more memory elements) and instructions stored in the computer readable medium, where the instructions are executable on one or more processors for performing the steps of the method. - Beginning at a
start block 402, the user device may receive, from the cloud server, baseline lighting setting parameters associated with a selected display configuration at the user device (step 404 of FIG. 4). The selected display configuration may be a selected one of a plurality of display configurations. For example, the selected display configuration may be a user display of the user device; a plurality of displays connected at the user device; the user display of the user device and an alternate display of a laptop or a tablet; the user display of the user device and a mobile display of a mobile device; etc. The baseline lighting setting parameters may be received in response to the user device sending to the cloud server a message which indicates a request for the baseline lighting setting parameters. Upon receipt, the user device may apply the baseline lighting setting parameters to one or more displays of the selected display configuration at the user device (step 406 of FIG. 4). Using the baseline lighting setting parameters as a baseline, the user device may adjust the brightness and/or color pixels of the one or more displays of the selected display configuration at the user device, for optimizing the lighting of the video of the participant (step 408 of FIG. 4). In some implementations, the user device may obtain lighting environment parameters based on sensing a lighting environment of the user device (e.g. with use of one or more sensors of the user device), and may adjust the brightness and/or the color pixels of the one or more displays of the selected display configuration according to the lighting environment parameters. - In some implementations, the user device may send, to the cloud server, display configuration information associated with the selected display configuration at the user device.
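The steps above can be sketched end to end. In this minimal illustration, the exchange with the cloud server is stood in for by a lookup callable, and the step 408 adjustment is a simple proportional nudge driven by a sensed ambient-light reading; the `target_lux` and `gain` values are illustrative assumptions, not values specified in this disclosure:

```python
def run_lighting_adjustment(fetch_baseline, displays, sense_ambient_lux,
                            target_lux=300.0, gain=0.05):
    """Sketch of the user-device method of FIG. 4.

    fetch_baseline    -- callable standing in for the request/response
                         exchange with the cloud server (step 404)
    displays          -- mutable dicts, one per display in the selected
                         display configuration
    sense_ambient_lux -- callable returning an ambient-light reading
    """
    baseline = fetch_baseline()                  # step 404: receive baseline
    for d in displays:                           # step 406: apply baseline
        d["brightness"] = baseline["brightness"]
        d["temperature_k"] = baseline["temperature_k"]
    # Step 408: using the baseline as a starting point, nudge brightness
    # according to the sensed lighting environment (darker room -> brighter).
    error = target_lux - sense_ambient_lux()
    for d in displays:
        d["brightness"] = max(0, min(100, d["brightness"] + gain * error))
    return displays
```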
In some implementations, the display configuration information may include a display configuration setting value for (properly or uniquely) identifying the selected display configuration. In some implementations, the display configuration information may (further) include one or more of a number of displays, an arrangement of displays, and display make and model information. In some implementations, the user device may cause a user display prompt to be displayed, where the user display prompt indicates the plurality of display configurations for user selection. The user device may receive a user selection of the selected display configuration, and then send to the cloud server a message which indicates the selected display configuration for storage in a user display profile.
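One plausible shape for the display configuration information message is sketched below. The field names and the way the display configuration setting value is derived are illustrative assumptions; the disclosure specifies only the kinds of information carried (a configuration setting value, the number and arrangement of displays, and make/model information):

```python
import json

def build_config_message(user_id, displays):
    """Assemble a display-configuration message for the cloud server.
    `displays` is a list of dicts with (assumed) keys 'make', 'model',
    and 'position'."""
    payload = {
        "user_id": user_id,
        # Setting value identifying this configuration (assumed scheme).
        "display_configuration_setting": "|".join(
            f'{d["make"]}:{d["model"]}' for d in displays),
        "num_displays": len(displays),
        "arrangement": [d["position"] for d in displays],
        "displays": [{"make": d["make"], "model": d["model"]}
                     for d in displays],
    }
    return json.dumps(payload)
```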
- Further, in some implementations, the cloud server may receive from the user device a message which indicates lighting setting parameters associated with the optimizing of the lighting of the video of the participant, and store, in the user profile, the lighting setting parameters in association with the selected display configuration. The lighting setting parameters may be for subsequent use by the user device as the baseline lighting setting parameters for the selected display configuration. The message which indicates lighting setting parameters may further indicate lighting environment parameters associated with a lighting environment of the user device.
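The store-then-reuse cycle described above can be sketched with a simple in-memory profile store. The nested-dict layout and function names are illustrative assumptions; in practice this state would live in the cloud server's database:

```python
def save_optimized_parameters(profiles, user_id, config_id, lighting_params,
                              environment_params=None):
    """Persist the lighting setting parameters reported by a user device
    (optionally with the sensed lighting environment) under the user's
    profile, keyed by display configuration, so they can be returned
    later as baseline parameters."""
    entry = {"lighting": lighting_params}
    if environment_params is not None:
        entry["environment"] = environment_params
    profiles.setdefault(user_id, {})[config_id] = entry
    return profiles

def baseline_for(profiles, user_id, config_id):
    """Retrieve previously saved parameters for reuse as a baseline,
    or None if no profile entry exists."""
    return profiles.get(user_id, {}).get(config_id, {}).get("lighting")
```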
- In some implementations, the cloud server may store a plurality of baseline lighting setting parameters respectively associated with the plurality of display configurations, based on lighting setting parameters received from a plurality of user devices respectively associated with a plurality of participants. In some further implementations, the cloud server may utilize a machine learning process to generate a plurality of (more optimal) baseline lighting setting parameters respectively associated with the plurality of display configurations, based on the lighting setting parameters received from the plurality of user devices.
-
FIGS. 5A-5B form a flowchart 500 for describing a method for a cloud-assisted adjustment of lighting of a video of a participant in a virtual meeting facilitated by a virtual meeting application according to some implementations. The method of FIGS. 5A-5B may be performed by a user device having the virtual meeting application and interacting with a server (e.g. a cloud server). More particularly, the method may be performed at least in part by a lighting adjustment module of the user device which interacts with the cloud server. The method may be embodied as a computer program product including a non-transitory computer readable medium (e.g. one or more memory elements) and instructions stored in the computer readable medium, where the instructions are executable on one or more processors for performing the steps of the method. - Beginning at a
start block 502 of FIG. 5A, a virtual meeting is initiated (step 504 of FIG. 5A). A user of the user device having the virtual meeting application is one of the participants in the virtual meeting. The user device may monitor the lighting of a video of the participant of the virtual meeting (e.g. with use of an ambient light sensor, for assessing the participant's lighting environment) (step 506 of FIG. 5A). If the lighting is identified to be satisfactory or optimal (as tested at step 508 of FIG. 5A), the user device may continue to monitor the lighting at step 506. On the other hand, if the lighting is identified to be unsatisfactory or not optimal (as tested at step 508 of FIG. 5A), the user device may proceed to perform a cloud-assisted interaction with the lighting adjustment module of the user device (step 510 of FIG. 5A). The method is continued in FIG. 5B. - Continuing with the
flowchart 500 of FIG. 5B, in response to lighting that is unsatisfactory or less than optimal, the user device may send a message to the cloud server (step 512 of FIG. 5B). In some implementations, the message may indicate (implicitly or explicitly) a request for baseline lighting setting parameters for its display configuration. In some implementations, the message may include one or more of an identity associated with the user device, display configuration information of the user device, and lighting environment parameters which indicate the lighting environment of the user device. In some implementations, the display configuration information may include a display configuration setting value for (properly or uniquely) identifying the selected display configuration. In some implementations, the display configuration information may (further) include one or more of a number of displays, an arrangement of displays, and display make and model information. - The cloud server may receive the message from the user device (step 532 of
FIG. 5B). In response to receipt of the message, the cloud server may search a database for identifying a user display profile associated with the user device or user thereof (step 534 of FIG. 5B). If the user display profile is not found in the database (as tested in step 536 of FIG. 5B), then the cloud server may send a message to the user device (step 538 of FIG. 5B). In some implementations, the message may indicate a request for user selection of a display configuration. - The user device may receive the message from the cloud server (step 514 of
FIG. 5B), cause a user display prompt to be displayed, where the user display prompt indicates a plurality of display configurations for user selection, and receive a user selection of one of the plurality of display configurations in the user display prompt (step 516 of FIG. 5B). In response to the user selection, the user device may send a message to the cloud server (step 518 of FIG. 5B). In some implementations, the message may indicate the selected display configuration for storage in a user display profile. In some implementations, the message may indicate the selected display configuration and lighting environment parameters which indicate the lighting environment of the user device. - The cloud server may receive the message from the user device and create a user display profile based on the selected display configuration (step 540 of
FIG. 5B). The cloud server may also obtain lighting setting parameters (step 542 of FIG. 5B), which may be generated or selected based on the lighting environment parameters received from the user device. In some implementations, this process may utilize a model or model function at the cloud server, one example of which is shown and described later in relation to FIG. 6. In some implementations, the lighting setting parameters are baseline lighting setting parameters for use as a baseline at the user device. Alternatively, the baseline lighting setting parameters may be obtained from the user display profile if and when found in step 536 of FIG. 5B. - The cloud server may then send a message to the user device (step 544 of
FIG. 5B). In some implementations, the message may include the baseline lighting setting parameters. The user device may receive the message which includes the baseline lighting setting parameters (step 520 of FIG. 5B). The user device may apply the baseline lighting setting parameters to one or more displays of the selected display configuration at the user device (step 522 of FIG. 5B). Using the baseline lighting setting parameters as a baseline, the user device may adjust a brightness and/or color pixels of the one or more displays of the selected display configuration for optimizing the lighting of the video of the participant (step 524 of FIG. 5B). -
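Adjusting the color pixels toward the warm illumination discussed earlier (around 2700 kelvins) requires converting a color temperature to an RGB value. A common published curve-fit approximation (Tanner Helland's) is sketched below; this conversion is one possible approach, not one prescribed by this disclosure, and its accuracy is approximate:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate the sRGB color of a black-body light source at the
    given color temperature, valid roughly for 1000 K - 40000 K."""
    t = kelvin / 100.0
    # Red channel
    if t <= 66:
        r = 255.0
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)
    # Green channel
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)
    # Blue channel
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda v: int(max(0, min(255, round(v))))
    return clamp(r), clamp(g), clamp(b)
```

At 2700 K this yields a warm orange-white (red at full strength, blue well below green), matching the warm glow described earlier.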
FIG. 6 is a block diagram of a process 600 which utilizes a model 602 (or model function) for generating or selecting lighting setting parameters for displays and/or display configurations, and a machine learning process 620 which may be used for training the model 602. The process 600 shown in FIG. 6 may be utilized at the cloud server and performed in response to communications with user devices. Model 602 may receive, as inputs, display configuration information 610 of a selected display configuration (e.g. a single or multiple display configuration) at a user device and lighting environment parameters 612 of the lighting environment at the user device. Model 602 may generate or select, as an output, lighting setting parameters 614 for one or more displays of the selected display configuration at the user device. In some implementations, the lighting setting parameters may be baseline lighting setting parameters for use as a baseline for the selected display configuration and the associated lighting environment. In some implementations, the lighting setting parameters may be optimal lighting setting parameters for optimized lighting for the selected display configuration and the associated lighting environment. -
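One minimal realization of such a model function is a table lookup keyed by display configuration and a bucketed ambient-light level, with a default when no entry exists. The bucket width, table layout, and parameter names are illustrative assumptions:

```python
def make_model(table, default):
    """Wrap a parameter table into a model function mapping (display
    configuration information, lighting environment parameters) to
    lighting setting parameters.  Ambient lux readings are bucketed
    into 100-lux bands so that nearby environments share an entry."""
    def model(config_info, environment_params):
        lux_bucket = int(environment_params.get("ambient_lux", 0) // 100)
        return table.get((config_info, lux_bucket), default)
    return model
```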
Machine learning process 620 may be used for training the model 602 according to input information 630 associated with different users, where the input information 630 may include display configuration information and lighting setting parameters. For example, as shown in FIG. 6, input information 630 may include display configuration information 1 for user 1 and associated lighting setting parameters 1; display configuration information 2 for user 2 and associated lighting setting parameters 2; and display configuration information 3 for user 3 and associated lighting setting parameters 3, etc. -
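A deliberately simple stand-in for the training process is sketched below: per-configuration averaging of the lighting setting parameters collected across users. The disclosure does not fix a particular learning algorithm, so the averaging here is an illustrative assumption:

```python
def train_baselines(input_information):
    """Build baseline lighting setting parameters per display
    configuration from (configuration, parameters) pairs collected
    across users, by averaging each parameter per configuration."""
    sums, counts = {}, {}
    for config, params in input_information:
        counts[config] = counts.get(config, 0) + 1
        bucket = sums.setdefault(config, {})
        for key, value in params.items():
            bucket[key] = bucket.get(key, 0.0) + value
    return {config: {k: v / counts[config] for k, v in bucket.items()}
            for config, bucket in sums.items()}
```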
FIG. 7A is an example of a user display prompt 702 for display at a user device, where the user display prompt 702 indicates a plurality of display configurations 700A. In some implementations, at least one of the plurality of display configurations 700A shown may be selected at the user device for use in lighting adjustment. - In the example of
FIG. 7A, user display prompt 702 may include a text instruction 704 (e.g. "PLEASE SELECT YOUR DISPLAY CONFIGURATION") as well as visual indications. The plurality of display configurations 700A shown in FIG. 7A include (e.g. visual indications of) a display configuration 700B (e.g. for a single user display of the user device), a display configuration 700C (e.g. for a plurality of displays connected at the user device), a display configuration 700D (e.g. for the user display of the user device and an alternate display of a laptop or a tablet), and a display configuration 700E (e.g. for the user display of the user device and a mobile display of a mobile device). - In
FIG. 7A, the plurality of display configurations 700A may further include (e.g. visual indications of) a display configuration 800A for a user display (e.g. for the use of unclaimed desktop space as shown and described herein) and a display configuration 800B for a user display (e.g. for use of an area outside the border of a reduced-sized desktop as shown and described herein). Note that each display in a multiple display configuration may be used exclusively for lighting or, in the alternative, only partially in accordance with one of display configurations 800A and 800B of FIGS. 8A and 8B, respectively. -
FIGS. 7B-7E are examples of the plurality of display configurations which may be utilized at a user device, as indicated for user selection in the user display prompt of FIG. 7A. - In
FIG. 7B, display configuration 700B shows a user 730 positioned in relation to a (single) user display 740 which may be utilized for both presenting the virtual meeting and for lighting. Here, lighting in display configuration 700B may be provided as described in relation to display configuration 800A of FIG. 8A or display configuration 800B of FIG. 8B (with lighting adjustments if and as needed). - In
FIG. 7C, display configuration 700C shows user 730 positioned in relation to a plurality of displays 750, 752, and 754 (e.g. cable-connected), where display 752 may be utilized for presenting the virtual meeting and (surrounding) displays 750 and 754 may be utilized for lighting (with lighting adjustments if and as needed). In some implementations, display 752 is utilized exclusively for the virtual meeting and displays 750 and 754 are utilized exclusively for lighting (with lighting adjustments if and as needed). In other implementations, each one of displays 750, 752, and 754 may be only partially used for lighting in accordance with one of display configuration 800A of FIG. 8A or display configuration 800B of FIG. 8B (with lighting adjustments if and as needed). - In
FIG. 7D, display configuration 700D shows user 730 positioned in relation to a user display 760 of the user device for presenting the virtual meeting and an alternate display 762 of a laptop or a tablet computer for lighting. In some implementations, user display 760 is utilized exclusively for the virtual meeting and alternate display 762 is utilized exclusively for lighting (with lighting adjustments if and as needed). In other implementations, each one of user display 760 and alternate display 762 may be only partially used for lighting in accordance with one of display configuration 800A of FIG. 8A or display configuration 800B of FIG. 8B (with lighting adjustments if and as needed). - In
FIG. 7E, display configuration 700E shows user 730 positioned in relation to a user display 770 of the user device presenting the virtual meeting and a mobile display 772 of a mobile device of user 730 for lighting. In some implementations, user display 770 is utilized exclusively for the virtual meeting and mobile display 772 is utilized exclusively for lighting (with lighting adjustments if and as needed). In other implementations, mobile display 772 is utilized exclusively for the virtual meeting and user display 770 is utilized exclusively for lighting (with lighting adjustments if and as needed). In yet other implementations, each one of user display 770 and mobile display 772 may be only partially used for lighting in accordance with one of display configuration 800A of FIG. 8A or display configuration 800B of FIG. 8B (with lighting adjustments if and as needed). -
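The exclusive and partial uses of displays in the multi-display configurations above can be sketched as a simple role assignment. The assumption that the first listed display presents the meeting, and the role names themselves, are illustrative:

```python
def assign_display_roles(displays, exclusive=True):
    """Assign each display in a selected configuration a role: the
    primary display presents the virtual meeting and the remaining
    displays provide lighting.  With exclusive=False, every display
    is instead marked for partial use (meeting or desktop content
    plus controllable lighting areas)."""
    roles = {}
    for i, name in enumerate(displays):
        if not exclusive:
            roles[name] = "partial-lighting"
        elif i == 0:
            roles[name] = "meeting"
        else:
            roles[name] = "lighting"
    return roles
```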
FIGS. 8A-8B are examples of different display configurations 800A-800B, respectively, associated with different lighting techniques which may be utilized in a display 802 of the user device. In a lighting technique of display configuration 800A of FIG. 8A, a used desktop space 810 may display the presentation for the virtual meeting and unused desktop spaces 812 and 814 may be utilized for lighting. In a lighting technique associated with display configuration 800B of FIG. 8B, a used desktop space 830 may display the presentation for the virtual meeting and areas 832 and 834 outside the border of an adjusted, reduced-sized desktop may be utilized for lighting.
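The border technique of the reduced-desktop configuration can be sketched geometrically: shrink the desktop rectangle and treat the four surrounding strips as controllable lighting pixels. The 0.85 shrink factor is an illustrative assumption:

```python
def border_lighting_regions(screen_w, screen_h, shrink=0.85):
    """Compute the reduced, centered desktop rectangle and the four
    border strips (as x, y, width, height tuples) whose pixels become
    available for lighting when the desktop is shrunk."""
    desk_w, desk_h = int(screen_w * shrink), int(screen_h * shrink)
    x0, y0 = (screen_w - desk_w) // 2, (screen_h - desk_h) // 2
    desktop = (x0, y0, desk_w, desk_h)
    borders = [
        (0, 0, screen_w, y0),                               # top strip
        (0, y0 + desk_h, screen_w, screen_h - y0 - desk_h), # bottom strip
        (0, y0, x0, desk_h),                                # left strip
        (x0 + desk_w, y0, screen_w - x0 - desk_w, desk_h),  # right strip
    ]
    # The strips exactly cover every pixel outside the desktop rectangle.
    lighting_px = screen_w * screen_h - desk_w * desk_h
    assert sum(w * h for (_, _, w, h) in borders) == lighting_px
    return desktop, borders
```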
- Especially for a first-time user or first-time use, the lighting adjustment module may cause a "Display Layout" prompt to be displayed (see e.g.
FIG. 7A) for user selection based on a current set-up of the user. The lighting adjustment module may perform a lookup, in a cloud-connected database, for settings/parameters associated with the selected display configuration. This may be done in order to determine whether there have been any previous instances of the display settings/parameters, for example, from other users, to serve as a baseline or initial settings value. Here, the cloud-connected database may store all monitor display profiles and optimal configurations for rendering well-lit video. In some implementations, a machine learning model may repeatedly update these baseline settings as the model is trained. Here, data may be regularly captured and analyzed from a plurality of virtual meetings, and/or from a plurality of display configurations of a plurality of different users, in order to train the model. - The lighting adjustment module may use the baseline lighting settings as a starting point for making adjustments. As the lighting adjustment module runs in the background, it may analyze and evaluate the user's current lighting environment using the data captured from an ambient light sensor (e.g. the built-in sensor on many new laptops and monitors), as well as the screen pixels, to account for the applications being shared, in order to optimize the display brightness and temperature. The machine learning model may utilize the captured data to dynamically adjust the display settings to ensure that the video output is well-lit throughout the entire virtual meeting (e.g. even if the user's ambient lighting or on-screen application changes during this interval).
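One plausible realization of the ambient-light-driven adjustment described above is a simple proportional rule. The disclosure leaves the exact mapping to a machine learning model, so the `target_lux` threshold and the scaling below are purely illustrative assumptions:

```python
def adjust_for_ambient(baseline_brightness: int, ambient_lux: float,
                       target_lux: float = 300.0) -> int:
    """Nudge display brightness upward to compensate for low ambient light.

    A proportional rule standing in for the learned model in the disclosure;
    target_lux (a typical well-lit office level) is an assumed constant.
    """
    if ambient_lux >= target_lux:
        return baseline_brightness  # room already well lit; keep the baseline
    # Scale the lux shortfall into extra brightness, clamped to the 0-100 range.
    shortfall = (target_lux - ambient_lux) / target_lux
    boost = int(shortfall * (100 - baseline_brightness))
    return min(100, baseline_brightness + boost)
```

For example, with a baseline of 50 and a completely dark room, the rule drives the display to full brightness; at or above the target lux it leaves the baseline untouched.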
- According to the present disclosure, in one method, the lighting adjustment module may first attempt to utilize the unused screen space around the monitor (see e.g.
FIG. 8A) to illuminate the person with a consistent and pleasing glow. If using the unused screen space is not feasible, or if the virtual meeting does not have focus, the screen size where desktop content is displayed may be decreased, creating a border or edge around the display whose pixel values may be changed in order to control the level of illumination (see e.g. FIG. 8B). - If, however, the lighting adjustment module determines that the brightness controls for the existing monitor are at a maximum value (e.g. saturated) and cannot be altered further, the lighting adjustment module may check for the presence of a secondary display (such as a secondary monitor or a laptop's screen). Here, the user may be prompted with the "Display Layout" options (see e.g.
FIG. 7A) to select a different display configuration, which may include one or more secondary displays or screens (see e.g. FIGS. 7C and 7D, and optionally FIG. 7E). If a secondary monitor or laptop screen is detected, and/or the user selects a (e.g. new) display configuration, the lighting adjustment module will automatically and adaptively adjust brightness and temperature throughout the meeting for the selected display configuration with respect to its multiple screens. - If a secondary monitor was not discovered, however, the lighting adjustment module may prompt the user to use a mobile device (e.g. cellular telephone, smartphone, etc.) (e.g.
FIG. 7E). The lighting adjustment module may be installed along with the virtual meetings application on the mobile device. In some implementations, with the use of machine learning algorithms, the lighting adjustment module may intelligently adjust the intensity of a camera flash light from the mobile device during the virtual meeting to provide a secondary lighting source for a well-lit video throughout the meeting. - In some implementations, the techniques and mechanisms of the present disclosure may further provide for flicker reduction. "Flicker" may come from multiple sources: for example, low-quality lighting power supply electronics that do not adequately filter line noise from household power, where such line noise may be associated with the standard 60 Hertz signal of the power grid; or ceiling fans, which may periodically occlude light sources or cast shadows. The techniques and mechanisms of the present disclosure may be utilized to improve the temporal consistency of the lighting, and may increase the overall amount and temperature of light available, in flicker reduction. In some implementations, the lighting adjustment module may utilize machine learning algorithms to detect the variations in the incoming video stream and determine their periodicity. Based on the analysis, the amount of light supplied to illuminate the subject may be varied over time to compensate for the flicker.
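The periodicity analysis described above could, under one interpretation, be realized with a Fourier transform over per-frame brightness. The sketch below is an assumption for illustration, not the disclosed machine learning method:

```python
import numpy as np

def dominant_flicker_hz(frame_brightness, fps: float) -> float:
    """Estimate the dominant flicker frequency from per-frame mean brightness.

    frame_brightness: sequence of mean brightness values, one per video frame.
    fps: capture frame rate, used as the sampling rate for the FFT.
    """
    samples = np.asarray(frame_brightness, dtype=float)
    samples = samples - samples.mean()              # remove the DC component
    spectrum = np.abs(np.fft.rfft(samples))         # magnitude spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fps)
    return float(freqs[int(np.argmax(spectrum))])   # strongest periodic component
```

Once the flicker frequency is known, the compensating light (e.g. the lit screen regions) could be modulated in antiphase over time, which is the compensation step the paragraph describes.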
- Thus, according to the present disclosure, the lighting adjustment module may adaptively and intelligently adjust the digital light emitted from a user's existing display configuration, according to the user's ambient lighting conditions and the currently displayed content on the screen during screen-sharing. The lighting adjustment module may be installed on the computer for controlling the configuration of an external monitor or laptop screen, and may be an additional part of the video teleconference meeting application. The solution may be utilized to provide consistent, well-lit illumination to participants during video teleconference meetings. The lighting adjustment module may enhance video-based virtual meetings for users by utilizing a variety of existing monitor/screen configurations to provide adaptive brightness and temperature throughout the meeting. This may be achieved while saving costs for the user (e.g. those associated with the purchase of external lighting setups), being environmentally friendly, and reducing overall eye strain of the users.
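The reduced-desktop border technique of display configuration 800B, described earlier, can be illustrated with a small geometry helper. The function and its rectangle representation are hypothetical, showing only how the four lit border strips around a shrunken desktop might be computed:

```python
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height) in pixels

def lit_border_region(screen_w: int, screen_h: int, border_px: int) -> List[Rect]:
    """Return the four border rectangles left over when the desktop is shrunk
    by border_px on every side; their pixels can be driven bright/white for
    illumination while desktop content occupies the inner region."""
    return [
        (0, 0, screen_w, border_px),                              # top strip
        (0, screen_h - border_px, screen_w, border_px),           # bottom strip
        (0, border_px, border_px, screen_h - 2 * border_px),      # left strip
        (screen_w - border_px, border_px,
         border_px, screen_h - 2 * border_px),                    # right strip
    ]
```

The strips tile exactly the area outside the reduced desktop, so widening `border_px` trades desktop space for more illuminating pixels, which is the trade-off the technique controls.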
-
FIG. 9 illustrates a hardware block diagram of a computing device 900 that may perform functions associated with operations discussed herein in connection with the techniques described in relation to the above figures, especially in relation to FIGS. 2-4, 5A-5B, 6, 7A-7E, and 8A-8B. In various embodiments, a computing device, such as computing device 900 or any combination of computing devices 900, may be configured as any entity/entities as discussed for the techniques depicted in connection with the figures in order to perform operations of the various techniques discussed herein. - In at least one embodiment, the
computing device 900 may include one or more processor(s) 902, one or more memory element(s) 904, storage 906, a bus 908, one or more network processor unit(s) 910 interconnected with one or more network input/output (I/O) interface(s) 912, one or more I/O interface(s) 914, and control logic 920. In various embodiments, instructions associated with logic for computing device 900 can overlap in any manner and are not limited to the specific allocation of instructions and/or operations described herein. - In at least one embodiment, processor(s) 902 is/are at least one hardware processor configured to execute various tasks, operations and/or functions for
computing device 900 as described herein according to software and/or instructions configured for computing device 900. Processor(s) 902 (e.g., a hardware processor) can execute any type of instructions associated with data to achieve the operations detailed herein. In one example, processor(s) 902 can transform an element or an article (e.g., data, information) from one state or thing to another state or thing. Any of potential processing elements, microprocessors, digital signal processor, baseband signal processor, modem, PHY, controllers, systems, managers, logic, and/or machines described herein can be construed as being encompassed within the broad term 'processor'. - In at least one embodiment, memory element(s) 904 and/or
storage 906 is/are configured to store data, information, software, and/or instructions associated with computing device 900, and/or logic configured for memory element(s) 904 and/or storage 906. For example, any logic described herein (e.g., control logic 920) can, in various embodiments, be stored for computing device 900 using any combination of memory element(s) 904 and/or storage 906. Note that in some embodiments, storage 906 can be consolidated with memory element(s) 904 (or vice versa), or can overlap/exist in any other suitable manner. - In at least one embodiment,
bus 908 can be configured as an interface that enables one or more elements of computing device 900 to communicate in order to exchange information and/or data. Bus 908 can be implemented with any architecture designed for passing control, data and/or information between processors, memory elements/storage, peripheral devices, and/or any other hardware and/or software components that may be configured for computing device 900. In at least one embodiment, bus 908 may be implemented as a fast kernel-hosted interconnect, potentially using shared memory between processes (e.g., logic), which can enable efficient communication paths between the processes. - In various embodiments, network processor unit(s) 910 may enable communication between
computing device 900 and other systems, entities, etc., via network I/O interface(s) 912 to facilitate operations discussed for various embodiments described herein. In various embodiments, network processor unit(s) 910 can be configured as a combination of hardware and/or software, such as one or more Ethernet driver(s) and/or controller(s) or interface cards, Fibre Channel (e.g., optical) driver(s) and/or controller(s), and/or other similar network interface driver(s) and/or controller(s) now known or hereafter developed to enable communications between computing device 900 and other systems, entities, etc. to facilitate operations for various embodiments described herein. In various embodiments, network I/O interface(s) 912 can be configured as one or more Ethernet port(s), Fibre Channel ports, and/or any other I/O port(s) now known or hereafter developed. Thus, the network processor unit(s) 910 and/or network I/O interface(s) 912 may include suitable interfaces for receiving, transmitting, and/or otherwise communicating data and/or information in a network environment. - I/O interface(s) 914 allow for input and output of data and/or information with other entities that may be connected to
computing device 900. For example, I/O interface(s) 914 may provide a connection to external devices such as a keyboard, keypad, a touch screen, and/or any other suitable input and/or output device now known or hereafter developed. In some instances, external devices can also include portable computer readable (non-transitory) storage media such as database systems, thumb drives, portable optical or magnetic disks, and memory cards. In still some instances, external devices can be a mechanism to display data to a user, such as, for example, a computer monitor, a display screen, or the like. - In various embodiments,
control logic 920 can include instructions that, when executed, cause processor(s) 902 to perform operations, which can include, but not be limited to, providing overall control operations of computing device; interacting with other entities, systems, etc. described herein; maintaining and/or interacting with stored data, information, parameters, etc. (e.g., memory element(s), storage, data structures, databases, tables, etc.); combinations thereof; and/or the like to facilitate various operations for embodiments described herein. - The programs described herein (e.g., control logic 920) may be identified based upon application(s) for which they are implemented in a specific embodiment. However, it should be appreciated that any particular program nomenclature herein is used merely for convenience; thus, embodiments herein should not be limited to use(s) solely described in any specific application(s) identified and/or implied by such nomenclature.
- In various embodiments, entities as described herein may store data/information in any suitable volatile and/or non-volatile memory item (e.g., magnetic hard disk drive, solid state hard drive, semiconductor storage device, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM), application specific integrated circuit (ASIC), etc.), software, logic (fixed logic, hardware logic, programmable logic, analog logic, digital logic), hardware, and/or in any other suitable component, device, element, and/or object as may be appropriate. Any of the memory items discussed herein should be construed as being encompassed within the broad term ‘memory element’. Data/information being tracked and/or sent to one or more entities as discussed herein could be provided in any database, table, register, list, cache, storage, and/or storage structure: all of which can be referenced at any suitable timeframe. Any such storage options may also be included within the broad term ‘memory element’ as used herein.
- Note that in certain example implementations, operations as set forth herein may be implemented by logic encoded in one or more tangible media that is capable of storing instructions and/or digital information and may be inclusive of non-transitory tangible media and/or non-transitory computer readable storage media (e.g., embedded logic provided in: an ASIC, digital signal processing (DSP) instructions, software [potentially inclusive of object code and source code], etc.) for execution by one or more processor(s), and/or other similar machine, etc. Generally, memory element(s) 904 and/or
storage 906 can store data, software, code, instructions (e.g., processor instructions), logic, parameters, combinations thereof, and/or the like used for operations described herein. This includes memory element(s) 904 and/or storage 906 being able to store data, software, code, instructions (e.g., processor instructions), logic, parameters, combinations thereof, or the like that are executed to carry out operations in accordance with teachings of the present disclosure. - In some instances, software of the present embodiments may be available via a non-transitory computer useable medium (e.g., magnetic or optical mediums, magneto-optic mediums, CD-ROM, DVD, memory devices, etc.) of a stationary or portable program product apparatus, downloadable file(s), file wrapper(s), object(s), package(s), container(s), and/or the like. In some instances, non-transitory computer readable storage media may also be removable. For example, a removable hard drive may be used for memory/storage in some implementations. Other examples may include optical and magnetic disks, thumb drives, and smart cards that can be inserted and/or otherwise connected to a computing device for transfer onto another computer readable storage medium.
- Embodiments described herein may include one or more networks, which can represent a series of points and/or network elements of interconnected communication paths for receiving and/or transmitting messages (e.g., packets of information) that propagate through the one or more networks. These network elements offer communicative interfaces that facilitate communications between the network elements. A network can include any number of hardware and/or software elements coupled to (and in communication with) each other through a communication medium. Such networks can include, but are not limited to, any local area network (LAN), virtual LAN (VLAN), wide area network (WAN) (e.g., the Internet), software defined WAN (SD-WAN), wireless local area (WLA) access network, wireless wide area (WWA) access network, metropolitan area network (MAN), Intranet, Extranet, virtual private network (VPN), Low Power Network (LPN), Low Power Wide Area Network (LPWAN), Machine to Machine (M2M) network, Internet of Things (IoT) network, Ethernet network/switching system, any other appropriate architecture and/or system that facilitates communications in a network environment, and/or any suitable combination thereof.
- Networks through which communications propagate can use any suitable technologies for communications including wireless communications (e.g., 4G/5G/nG, IEEE 802.11 (e.g., Wi-Fi®/Wi-Fi6®), IEEE 802.16 (e.g., Worldwide Interoperability for Microwave Access (WiMAX)), Radio-Frequency Identification (RFID), Near Field Communication (NFC), Bluetooth™, mm.wave, Ultra-Wideband (UWB), etc.), and/or wired communications (e.g., T1 lines, T3 lines, digital subscriber lines (DSL), Ethernet, Fibre Channel, etc.). Generally, any suitable means of communications may be used such as electric, sound, light, infrared, and/or radio to facilitate communications through one or more networks in accordance with embodiments herein. Communications, interactions, operations, etc. as discussed for various embodiments described herein may be performed among entities that may be directly or indirectly connected utilizing any algorithms, communication protocols, interfaces, etc. (proprietary and/or non-proprietary) that allow for the exchange of data and/or information.
- In various example implementations, entities for various embodiments described herein can encompass network elements (which can include virtualized network elements, functions, etc.) such as, for example, network appliances, forwarders, routers, servers, switches, gateways, bridges, loadbalancers, firewalls, processors, modules, radio receivers/transmitters, or any other suitable device, component, element, or object operable to exchange information that facilitates or otherwise helps to facilitate various operations in a network environment as described for various embodiments herein. Note that with the examples provided herein, interaction may be described in terms of one, two, three, or four entities. However, this has been done for purposes of clarity, simplicity and example only. The examples provided should not limit the scope or inhibit the broad teachings of systems, networks, etc. described herein as potentially applied to a myriad of other architectures.
- Communications in a network environment can be referred to herein as ‘messages’, ‘messaging’, ‘signaling’, ‘data’, ‘content’, ‘objects’, ‘requests’, ‘queries’, ‘responses’, ‘replies’, etc. which may be inclusive of packets. As referred to herein and in the claims, the term ‘packet’ may be used in a generic sense to include packets, frames, segments, datagrams, and/or any other generic units that may be used to transmit communications in a network environment. Generally, a packet is a formatted unit of data that can contain control or routing information (e.g., source and destination address, source and destination port, etc.) and data, which is also sometimes referred to as a ‘payload’, ‘data payload’, and variations thereof. In some embodiments, control or routing information, management information, or the like can be included in packet fields, such as within header(s) and/or trailer(s) of packets. IP addresses discussed herein and in the claims can include any IP version 4 (IPv4) and/or IP version 6 (IPv6) addresses.
- To the extent that embodiments presented herein relate to the storage of data, the embodiments may employ any number of any conventional or other databases, data stores or storage structures (e.g., files, databases, data structures, data or other repositories, etc.) to store information.
- Note that in this Specification, references to various features (e.g., elements, structures, nodes, modules, components, engines, logic, steps, operations, functions, characteristics, etc.) included in ‘one embodiment’, ‘example embodiment’, ‘an embodiment’, ‘another embodiment’, ‘certain embodiments’, ‘some embodiments’, ‘various embodiments’, ‘other embodiments’, ‘alternative embodiment’, and the like are intended to mean that any such features are included in one or more embodiments of the present disclosure, but may or may not necessarily be combined in the same embodiments. Note also that a module, engine, client, controller, function, logic or the like as used herein in this Specification, can be inclusive of an executable file comprising instructions that can be understood and processed on a server, computer, processor, machine, compute node, combinations thereof, or the like and may further include library modules loaded during execution, object files, system files, hardware logic, software logic, or any other executable modules.
- It is also noted that the operations and steps described with reference to the preceding figures illustrate only some of the possible scenarios that may be executed by one or more entities discussed herein. Some of these operations may be deleted or removed where appropriate, or these steps may be modified or changed considerably without departing from the scope of the presented concepts. In addition, the timing and sequence of these operations may be altered considerably and still achieve the results taught in this disclosure. The preceding operational flows have been offered for purposes of example and discussion. Substantial flexibility is provided by the embodiments in that any suitable arrangements, chronologies, configurations, and timing mechanisms may be provided without departing from the teachings of the discussed concepts.
- As used herein, unless expressly stated to the contrary, use of the phrase ‘at least one of’, ‘one or more of’, ‘and/or’, variations thereof, or the like are open-ended expressions that are both conjunctive and disjunctive in operation for any and all possible combination of the associated listed items. For example, each of the expressions ‘at least one of X, Y and Z’, ‘at least one of X, Y or Z’, ‘one or more of X, Y and Z’, ‘one or more of X, Y or Z’ and ‘X, Y and/or Z’ can mean any of the following: 1) X, but not Y and not Z; 2) Y, but not X and not Z; 3) Z, but not X and not Y; 4) X and Y, but not Z; 5) X and Z, but not Y; 6) Y and Z, but not X; or 7) X, Y, and Z.
- Additionally, unless expressly stated to the contrary, the terms ‘first’, ‘second’, ‘third’, etc., are intended to distinguish the particular nouns they modify (e.g., element, condition, node, module, activity, operation, etc.). Unless expressly stated to the contrary, the use of these terms is not intended to indicate any type of order, rank, importance, temporal sequence, or hierarchy of the modified noun. For example, ‘first X’ and ‘second X’ are intended to designate two ‘X’ elements that are not necessarily limited by any order, rank, importance, temporal sequence, or hierarchy of the two elements. Further as referred to herein, ‘at least one of’ and ‘one or more of’ can be represented using the ‘(s)’ nomenclature (e.g., one or more element(s)).
- One or more advantages described herein are not meant to suggest that any one of the embodiments described herein necessarily provides all of the described advantages or that all the embodiments of the present disclosure necessarily provide any one of the described advantages. Numerous other changes, substitutions, variations, alterations, and/or modifications may be ascertained to one skilled in the art and it is intended that the present disclosure encompass all such changes, substitutions, variations, alterations, and/or modifications as falling within the scope of the appended claims.
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/319,640 US20230292012A1 (en) | 2021-10-19 | 2023-05-18 | Intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/504,956 US11722780B2 (en) | 2021-10-19 | 2021-10-19 | Intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings |
| US18/319,640 US20230292012A1 (en) | 2021-10-19 | 2023-05-18 | Intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/504,956 Continuation US11722780B2 (en) | 2021-10-19 | 2021-10-19 | Intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20230292012A1 true US20230292012A1 (en) | 2023-09-14 |
Family
ID=85981942
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/504,956 Active US11722780B2 (en) | 2021-10-19 | 2021-10-19 | Intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings |
| US18/319,640 Abandoned US20230292012A1 (en) | 2021-10-19 | 2023-05-18 | Intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/504,956 Active US11722780B2 (en) | 2021-10-19 | 2021-10-19 | Intelligent cloud-assisted video lighting adjustments for cloud-based virtual meetings |
Country Status (1)
| Country | Link |
|---|---|
| US (2) | US11722780B2 (en) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160277242A1 (en) * | 2015-03-18 | 2016-09-22 | Citrix Systems, Inc. | Conducting online meetings using user behavior models based on predictive analytics |
| US20170339376A1 (en) * | 2016-05-23 | 2017-11-23 | Ningbo Yamao Optoelectronics Co., Ltd. | Lighting system with monitoring and alarm function |
| US20180167583A1 (en) * | 2016-12-12 | 2018-06-14 | International Business Machines Corporation | Dynamic video image management |
| US20210118404A1 (en) * | 2020-12-24 | 2021-04-22 | Intel Corporation | Display with integrated illuminator |
| US11089134B1 (en) * | 2011-12-19 | 2021-08-10 | Majen Tech, LLC | System, method, and computer program product for coordination among multiple devices |
| US20220070409A1 (en) * | 2020-09-01 | 2022-03-03 | Under Silver Lining Industries LLC | Lighting system for video conference participants |
| US11348557B1 (en) * | 2021-02-09 | 2022-05-31 | Inventec (Pudong) Technology Corp. | Light compensating method and computer system thereof |
| US20220262326A1 (en) * | 2021-02-12 | 2022-08-18 | Microsoft Technology Licensing, Llc | Optimized facial illumination from adaptive screen content |
Family Cites Families (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6980697B1 (en) | 2001-02-01 | 2005-12-27 | At&T Corp. | Digitally-generated lighting for video conferencing applications |
| NO328169B1 (en) | 2005-11-01 | 2009-12-21 | Tandberg Telecom As | An illumination device |
| US7965859B2 (en) | 2006-05-04 | 2011-06-21 | Sony Computer Entertainment Inc. | Lighting control of a user environment via a display device |
| US8922672B2 (en) | 2008-01-03 | 2014-12-30 | Apple Inc. | Illumination systems and methods for imagers |
| US8384754B2 (en) | 2009-06-17 | 2013-02-26 | Verizon Patent And Licensing Inc. | Method and system of providing lighting for videoconferencing |
| US9494844B2 (en) | 2013-03-25 | 2016-11-15 | Applied Minds Llc | Light source for video communication device |
| US10257240B2 (en) | 2014-11-18 | 2019-04-09 | Cisco Technology, Inc. | Online meeting computer with improved noise management logic |
| US10200423B2 (en) | 2015-05-01 | 2019-02-05 | Cisco Technology, Inc. | Presenting methods for joining a virtual meeting |
| US9609230B1 (en) * | 2015-12-30 | 2017-03-28 | Google Inc. | Using a display as a light source |
| US10277829B1 (en) | 2016-08-12 | 2019-04-30 | Apple Inc. | Video capture in low-light conditions |
| US10255885B2 (en) | 2016-09-07 | 2019-04-09 | Cisco Technology, Inc. | Participant selection bias for a video conferencing display layout based on gaze tracking |
| US11032482B2 (en) | 2019-08-05 | 2021-06-08 | Cisco Technology, Inc. | Automatic screen brightness and camera exposure adjustment for remote multimedia collaboration sessions |
- 2021-10-19: US 17/504,956 filed (US); published as US 11722780B2, status Active
- 2023-05-18: US 18/319,640 filed (US); published as US 20230292012A1, status Abandoned
Also Published As
| Publication number | Publication date |
|---|---|
| US11722780B2 (en) | 2023-08-08 |
| US20230120029A1 (en) | 2023-04-20 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: CISCO TECHNOLOGY, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: VED, RITU KIRIT; KALE, NIKHIL SAINATH; HESS, JOHN HERMAN, III; signing dates from 20211015 to 20211018; Reel/Frame: 063683/0900 |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| | STCB | Information on status: application discontinuation | ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |