FI20215100A1 - Video controlled multiservice mobile device - Google Patents

Video controlled multiservice mobile device Download PDF

Info

Publication number
FI20215100A1
Authority
FI
Finland
Prior art keywords
service
video
multiservice
user
mobile
Prior art date
Application number
FI20215100A
Other languages
Finnish (fi)
Swedish (sv)
Inventor
Esa Wahlroos
Hannu Meriläinen
Original Assignee
Elisa Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elisa Oyj filed Critical Elisa Oyj
Priority to FI20215100A priority Critical patent/FI20215100A1/en
Priority to PCT/FI2022/050037 priority patent/WO2022162273A1/en
Publication of FI20215100A1 publication Critical patent/FI20215100A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213 Monitoring of end-user related data
    • H04N21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Selective Calling Equipment (AREA)
  • Telephone Function (AREA)

Abstract

A mobile video controlled multiservice device and a method and computer program for controlling same are disclosed, including: obtaining service settings; monitoring video signals; detecting a video trigger indicative of readiness of a user to use the device; and responsively performing according to the service settings: initiating an interactive service; and presenting service information to the user; and using motion as a video trigger in the video signals, optionally with adjustable sensitivity. Fig. 1

Description

VIDEO CONTROLLED MULTISERVICE MOBILE DEVICE
TECHNICAL FIELD
The present disclosure generally relates to a video controlled multiservice mobile device.
The disclosure relates particularly, though not exclusively, to operating a multiservice mobile device based on changes of video footage captured by the multiservice mobile device for different services according to a pre-set configuration.
BACKGROUND
This section illustrates useful background information without admission that any technique described herein is representative of the state of the art.
Presently, various information services are provided using human-to-human interfacing or various human-machine or human-machine-human interfacing arrangements.
In common, information services generally require providing some information to a user and receiving some information from the user.
Taking customer service as an example, a traditional model for face-to-face service is to queue at a customer service premises for a free member of staff to interface with and, when necessary, use separate back-end services. More recently, the user may chat with or video call a customer service and so virtually face a customer service person. It is sometimes even possible to provide a machine acting as the customer service person.
In common to these scenarios, the user should physically face a customer service person or use a computer to engage with the customer service.
While customer service was used as an example here, it shall be appreciated that any other services are likewise of interest. Basically, an object of the present invention is to allow accessing services with mobile multiservice devices based on video triggering so that a service session can be begun without necessitating touching of the devices. Another object of the present invention is to allow video triggering of a service session to begin a service session without need or capability of moving any body parts or fingers. Yet another object of the present invention is to at least provide a new technical alternative.
SUMMARY
The appended claims define the scope of protection. Any examples and technical descriptions of apparatuses, products and / or methods in the description and / or drawings not covered by the claims are presented not as embodiments of the invention but as background art or examples useful for understanding the invention.
According to a first example aspect there is provided a method in a mobile video controlled multiservice device, comprising: obtaining service settings; monitoring video signals; detecting a video trigger indicative of readiness of a user to use the device; and responsively performing according to the service settings: initiating an interactive service; and presenting service information to the user.
Advantageously, the obtaining of service settings and accordingly initiating the interactive service and presenting service information to the user on detecting the video trigger may enable automatically and without touch beginning any pre-set interactive service by the mobile multiservice device.
Advantageously, by performing the method in a mobile multiservice device, the interactive service may be deployed to a new location without need to perform settings on site and without necessarily binding resources to permanent locations. The use of a mobile device may provide for power supply break protection using a battery and sleeping mode of the mobile device as a backup. Power supply break protection may also conveniently allow moving power supply of the mobile device from one mains socket to another without disrupting operation of the mobile device.
The video trigger may be motion in the video signals, optionally with adjustable sensitivity. Advantageously, motion triggering may enable intuitive touch-free initiation of the interactive service by beginning the interactive service simply on arrival of a user to proximity of the multiservice device.
The motion may be detected by comparing subsequent image frames. The motion may be detected using motion vectors of a video encoder operated by the multiservice device.
The video trigger may comprise detection of a person in the video signals. Advantageously, initiating the interactive service on detecting the person in the video signals may enable intuitively initiating the service while avoiding unnecessarily doing so. This may be particularly advantageous when the interactive service requires manual responding.
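By way of a non-limiting illustration only, the following minimal sketch shows how such motion triggering by comparing subsequent image frames could be realised with OpenCV in Python. The pixel threshold and the changed-pixel fraction stand in for the adjustable sensitivity; all names and values are illustrative assumptions rather than part of the disclosed device.

```python
import cv2

def detect_motion(prev_gray, frame, pixel_threshold=25, sensitivity=0.01):
    """Compare the current frame with the previous one; return (motion detected?, new reference frame).

    sensitivity is the fraction of pixels that must change before the video
    trigger fires, i.e. a smaller value makes the trigger more sensitive.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)          # suppress sensor noise
    if prev_gray is None:
        return False, gray                              # first frame: nothing to compare against
    diff = cv2.absdiff(prev_gray, gray)                 # per-pixel difference of subsequent frames
    _, mask = cv2.threshold(diff, pixel_threshold, 255, cv2.THRESH_BINARY)
    changed_fraction = cv2.countNonZero(mask) / mask.size
    return changed_fraction > sensitivity, gray
```

A comparable effect could be obtained without extra image processing by reusing the motion vectors of a video encoder already running on the device, as noted above; the frame-differencing variant is merely the simplest to sketch.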
The video trigger may comprise detection of eyes of a person in the video signal. The video trigger may comprise detection of gaze of a person at the device in the video signals.
Advantageously, detecting the eyes of a person to initiate the interactive service may further improve accuracy in initiating the service only when needed and avoiding unnecessary initiating of the interactive service. Advantageously, by detecting the gaze of the person it may be possible to improve access to the interactive service for persons unable to move or move their limbs.
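Purely as an illustrative sketch, detection of a person, eyes, or approximate gaze of this kind could for example be approximated with the stock Haar cascades shipped with OpenCV. Treating a frontal face with visible eyes as gaze toward the device is a simplifying assumption of this example, not a limitation of the embodiments.

```python
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def gaze_trigger(frame):
    """Return True when a frontal face with at least one visible eye is found in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
        if len(eyes) > 0:
            return True      # frontal face + visible eyes taken as the user looking at the device
    return False
```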
The obtaining the service settings may comprise inputting the service settings from a local user interface. The obtaining the service settings may comprise inputting the service settings from a remote source over a communication connection. The inputting of the service settings may comprise using a group provisioning reception so that the same or similar settings can be provisioned to a plurality of mobile devices with joint commanding.
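As one hedged, non-limiting example of what obtaining such service settings might look like, the sketch below assumes a hypothetical HTTP provisioning endpoint and a group identifier for joint provisioning of several devices; the field names, the address, and the URL are invented for illustration only.

```python
import requests

# Hypothetical default settings used when no remote provisioning source is configured.
DEFAULT_SETTINGS = {
    "service": "video_call",
    "callee": "service-desk@example.org",                  # illustrative address only
    "trigger": {"type": "motion", "sensitivity": 0.01},
    "info_text": "Look at the camera to reach the service desk.",
}

def obtain_service_settings(provisioning_url=None, group_id=None):
    """Local defaults, optionally overridden by a remote group-provisioning source."""
    settings = dict(DEFAULT_SETTINGS)
    if provisioning_url:
        # One group identifier lets the same settings be pushed to many devices at once.
        response = requests.get(provisioning_url, params={"group": group_id}, timeout=10)
        response.raise_for_status()
        settings.update(response.json())
    return settings
```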
The interactive service may comprise a video call. The interactive service may comprise an audio call. The video call may convey image and sound. Advantageously, the user may be enabled to virtually face a person in case the interactive service comprises video communication. Further advantageously, a client may be provided with service desk or help desk access; and / or a patient may be allowed to ask help or otherwise initiate discussion with medical staff using the mobile device.
The presenting of service information may comprise displaying information to the user as defined by the service settings. Advantageously, the user may be informed using text, graphics, and / or video, about the availability of the interactive service and / or provided information as part of the interactive service or introduction thereof to allow using the interactive service without necessarily receiving prior guiding or training.
The presenting of service information may comprise outputting audible information to the user as defined by the service settings. The audible information may comprise speech synthesised information. Advantageously, the audible information may facilitate perceiving the availability of the interactive service and / or facilitate use of the interactive service for sight impaired users.
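For instance, and only as a minimal sketch assuming the pyttsx3 text-to-speech package, speech synthesised service information could be produced roughly as follows; in practice the spoken text would come from the service settings.

```python
import pyttsx3

def speak_service_info(info_text):
    """Announce the availability of the interactive service audibly."""
    engine = pyttsx3.init()          # uses the platform's default speech synthesiser
    engine.say(info_text)
    engine.runAndWait()

speak_service_info("Look at the camera to reach the service desk.")
```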
The presenting of service information may last until at least one termination criterion is met. The at least one termination criterion may comprise lapse of a termination interval. The termination interval may begin from the initiating of the interactive service. The termination interval may begin from a video signal-based event. The video signal-based event may comprise motion detected in the video signal. The video signal-based event may comprise detection of disappearance of a person. The video signal-based event may comprise no longer detecting eyes of a person. The video signal-based event may comprise no longer detecting gaze of the person.
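A minimal sketch of such a termination interval is given below, assuming the interval is simply restarted by whichever video signal-based event (motion, person, eyes, or gaze) the service settings select; the class and method names are illustrative only.

```python
import time

class PresentationTimer:
    """Keeps service information presented until a termination interval lapses."""

    def __init__(self, termination_interval_s=60.0):
        self.termination_interval_s = termination_interval_s
        self.deadline = None

    def start(self):
        """Begin the interval, e.g. from the initiating of the interactive service."""
        self.deadline = time.monotonic() + self.termination_interval_s

    def video_event(self):
        """Restart the interval from a video signal-based event such as detected motion."""
        self.start()

    def expired(self):
        return self.deadline is not None and time.monotonic() >= self.deadline
```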
The method may comprise operating the multiservice device in at least three different modes comprising: a standby mode; an interactive mode; and switched off mode. The monitoring of the video signals and initiating of the interactive service may be performed in the standby mode. The monitoring of the video signals and initiating of the interactive service may be ceased in the switched off mode. The presenting of service information may be performed in the interactive mode.
The mode of the multiservice device may be periodically changed from the standby mode to the interactive mode for indicating availability of the multiservice device.
The mode of the multiservice device may be changed from the standby mode to the switched off mode for saving power. The switched off mode may be adopted based on a schedule. The schedule may be defined by the service settings.
The method may further comprise operating the multiservice device in a restricted mode.
The restricted mode may be configured to allow restricted use. The restricted mode may be configured to allow staff to use internal services.
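The following sketch, again purely illustrative and assuming a simple clock-based schedule carried in the service settings, shows one way the standby, interactive, switched off, and restricted modes could be represented and the scheduled switch-off selected; the setting names are assumptions for the example.

```python
from datetime import datetime
from enum import Enum, auto

class Mode(Enum):
    STANDBY = auto()        # monitoring video signals, ready to trigger
    INTERACTIVE = auto()    # presenting service information / running the interactive service
    SWITCHED_OFF = auto()   # monitoring and triggering ceased to save power
    RESTRICTED = auto()     # restricted use, e.g. staff access to internal services

def scheduled_mode(now: datetime, settings: dict) -> Mode:
    """Adopt the switched off mode outside the service hours defined by the settings."""
    start_hour = settings.get("service_start_hour", 8)     # illustrative defaults
    end_hour = settings.get("service_end_hour", 20)
    return Mode.STANDBY if start_hour <= now.hour < end_hour else Mode.SWITCHED_OFF
```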
The method may further comprise monitoring audio signals and detecting an audio trigger indicative of readiness of the user to use the device. The method may comprise responsively performing according to the service settings the initiating an interactive service; and the presenting the service information to the user.
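As a minimal, hedged sketch of such an audio trigger, the function below assumes the device already captures mono audio blocks as floating-point samples in the range [-1, 1] (for example with a sound capture library of choice) and fires when the loudness of a block exceeds a configurable threshold.

```python
import numpy as np

def audio_trigger(samples: np.ndarray, rms_threshold: float = 0.05) -> bool:
    """Detect readiness of the user from an audio block by its RMS loudness."""
    rms = float(np.sqrt(np.mean(np.square(samples))))
    return rms > rms_threshold
```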
According to a second example aspect there is provided a mobile multiservice device, comprising:
a camera;
at least one processor; and
a memory comprising instructions for the at least one processor, which instructions, when performed by the processor, cause the mobile multiservice device to at least perform the method of the first example aspect.
The mobile multiservice device may further comprise a display. The display may be integrated to the mobile multiservice device.
The mobile multiservice device may further comprise a microphone. The microphone may be integrated to the camera. The camera may be integrated to the multiservice device.
The mobile multiservice device may further comprise a sound output. The sound output may comprise a loudspeaker. The sound output may comprise a sound output connector.
The sound output may be integrated to the camera. The sound output may be integrated to the multiservice device.
The mobile multiservice device may be a smart phone. The smart phone may be a mobile phone capable of loading and running applications.
The mobile multiservice device may be a tablet computer.
The mobile multiservice device may be a television with an additional battery (e.g., a 12V device) and a camera (e.g., a USB camera), with an adequate operating system running the application.
According to a third example aspect there is provided a computer program comprising computer executable program code which when executed by at least one processor causes an apparatus at least to perform the method of the first example aspect.
According to a fourth example aspect there is provided a computer program product comprising a non-transitory computer readable medium having the computer program of the third example aspect stored thereon.
Any foregoing memory medium may comprise a digital data storage such as a data disc or diskette; optical storage; magnetic storage; holographic storage; opto-magnetic storage; phase-change memory; resistive random-access memory; magnetic random-access memory; solid-electrolyte memory; ferroelectric random-access memory; organic memory; or polymer memory. The memory medium may be formed into a device without other substantial functions than storing memory or it may be formed as part of a device with other functions, including but not limited to a memory of a computer; a chip set; and a sub-assembly of an electronic device.
According to a fifth example aspect there is provided a system comprising a cradle and the multiservice device of the second example aspect mounted to the cradle.
The cradle may be configured attachable to a bed frame.
The system may further comprise a transparent separator window or wall configured to prevent physical access to the multiservice device. The system may comprise a booth. The booth may comprise the transparent separator window or wall. The transparent separator window or wall may be made of transparent material. Alternatively, or additionally, the transparent separator window or wall may be made using a net or screen or bars.
Different non-binding example aspects and embodiments have been illustrated in the foregoing. The embodiments in the foregoing are used merely to explain selected aspects or steps that may be utilized in different implementations. Some embodiments may be presented only with reference to certain example aspects. It should be appreciated that corresponding embodiments may apply to other example aspects as well.
BRIEF DESCRIPTION OF THE FIGURES
Some example embodiments will be described with reference to the accompanying figures, in which:
Fig. 1 schematically shows a system according to an example embodiment;
Fig. 2 shows a block diagram of an apparatus according to an example embodiment; and
Fig. 3 shows a flow chart according to an example embodiment.
DETAILED DESCRIPTION
In the following description, like reference signs denote like elements or steps.
Fig. 1 schematically shows a system 100 according to an example embodiment. The system 100 comprises a mobile video controlled multiservice device or multiservice device 110.
The system 100 further comprises a cradle 112 for holding the multiservice device 110. Fig. 1 further illustrates a user 120 and a remote service provider 130 such as a service desk.
The multiservice device 110 is in an embodiment a tablet computer or a smart phone.
Fig. 2 shows a block diagram of the multiservice device 110 according to an example embodiment. The multiservice device 110 comprises a communication interface 210; a processor 220; a user interface 230; and a memory 240.
The communication interface 210 comprises in an embodiment a wired and / or wireless communication circuitry, such as Ethernet; Wireless LAN; Bluetooth; NFC; GSM; CDMA; WCDMA; LTE; and / or 5G circuitry. The communication interface can be integrated in the multiservice device 110 or provided as a part of an adapter, card, or the like, that is attachable to the multiservice device 110. The communication interface 210 may support one or more different communication technologies. The multiservice device 110 may also or alternatively comprise more than one of the communication interfaces 210.
In this document, a processor may refer to a central processing unit (CPU); a microprocessor; a digital signal processor (DSP); a graphics processing unit; an application specific integrated circuit (ASIC); a field programmable gate array; a microcontroller; or a combination of such elements. The multiservice device 110 may comprise a System On Chip (SoC). The processor may be integrated to the SoC.
The user interface 230 comprises a circuitry for receiving input from a user of the multiservice device 110, e.g., via a keyboard; a camera for receiving video signals; a graphical user interface shown on the display of the multiservice device 110; a microphone; a speech recognition circuitry; or an accessory device, such as a headset. The user interface 230 further comprises a circuitry for providing output to the user via, e.g., a display or a loudspeaker.
The camera may be integrated to the multiservice device 110. Alternatively, or additionally, the circuitry for receiving input of a camera may comprise a connector or wireless channel for receiving video signals from an external camera.
The microphone may be integrated to the multiservice device 110. Alternatively, or additionally, the circuitry for receiving input of a microphone may comprise a connector or wireless channel for receiving audio signals from an external microphone. The external microphone may be integrated to the external camera.
The loudspeaker may be integrated to the multiservice device 110. Alternatively, or additionally, the circuitry for providing output to the user via the loudspeaker may comprise a connector or wireless channel outputting audio signals via an external loudspeaker. The external loudspeaker may be integrated to the external camera. The connector or wireless channel may be bidirectional for outputting audio signals via a loudspeaker and for inputting audio signals from a microphone.
The speech recognition circuitry may be a dedicated circuitry. In an example embodiment, the speech recognition circuitry is formed of the processor and an input audio signal circuitry. The speech recognition circuitry may enable control of operation of the multiservice device 110. The speech recognition circuitry may enable entering text to the multiservice device 110, e.g., for making settings and/or for writing a note or message.
In this document, video signals refer to image signals capable of indicating changed image of a camera without requirement of smooth motion. Hence, the frame rate may be, but need not be, greater than 10 frames per second. In other alternative embodiments, the frame rate can be as low as 1 frame a second or 1 frame per 5 or 10 seconds.
The memory 240 comprises a work memory 242 and a persistent memory 244 configured to store computer program code 246 and data 248. The memory 240 may comprise any one or more of: a read-only memory (ROM); a programmable read-only memory (PROM);
an erasable programmable read-only memory (EPROM); a random-access memory (RAM); a flash memory; a data disk; an optical storage; a magnetic storage; a smart card; a solid- state drive (SSD); or the like. The multiservice device 110 may comprise a plurality of the memories 240. The memory 240 may be constructed as a part of the multiservice device 110 or as an attachment to be inserted into a slot; port; or the like of the multiservice device 110 by a user or by another person or by a robot. The memory 240 may serve the sole purpose of storing data or be constructed as a part of a multiservice device 110 serving other purposes, such as processing data.
A skilled person appreciates that in addition to the elements shown in Figure 2, the multiservice device 110 may comprise other elements, such as microphones; displays; as well as additional circuitry such as input / output (I / O) circuitry; memory chips; application-specific integrated circuits (ASIC); processing circuitry for specific purposes such as source coding / decoding circuitry; channel coding / decoding circuitry; ciphering / deciphering circuitry; and the like. Additionally, the multiservice device 110 may comprise a disposable or rechargeable battery (not shown) for powering the multiservice device 110 if external power supply is not available.
Fig. 3 shows a flow chart according to an example embodiment. Fig. 3 illustrates a process in a mobile video controlled multiservice device, comprising various possible steps including some optional steps, while also further steps can be included and / or some of the steps can be performed more than once:
301. obtaining service settings;
302. monitoring video signals;
303. detecting a video trigger indicative of readiness of a user to use the device; and responsively performing according to the service settings: initiating an interactive service; and presenting service information to the user;
304. using motion as a video trigger in the video signals, optionally with adjustable sensitivity;
305. detecting the motion by comparing subsequent image frames, optionally using motion vectors of a video encoder operated by the multiservice device;
306. using as the video trigger detection of a person in the video signals;
307. using as the video trigger detection of eyes of a person in the video signal;
308. using as the video trigger detection of gaze of a person at the device in the video signals;
309. in the obtaining the service settings, inputting the service settings from a local user interface;
310. in the obtaining the service settings, inputting the service settings from a remote source over a communication connection;
311. in the inputting of the service settings, using a group provisioning reception so that the same or similar settings can be provisioned to a plurality of mobile devices with joint commanding;
312. in the interactive service, providing a video call;
313. in the interactive service, providing an audio call;
314. in the presenting of service information, displaying information to the user as defined by the service settings;
315. in the presenting of service information, outputting audible information to the user as defined by the service settings;
316. containing in the audible information speech synthesised information;
317. maintaining the presenting of service information until at least one termination criterion is met;
318. defining the at least one termination criterion as lapsing of a termination interval;
319. beginning the termination interval from the initiating of the interactive service;
320. beginning the termination interval from a video signal-based event;
321. including as the video signal-based event motion detected in the video signal;
322. including as the video signal-based event detection of disappearance of a person;
323. including as the video signal-based event no longer detecting eyes of a person;
324. including as the video signal-based event no longer detecting gaze of the person;
325. operating the multiservice device in at least three different modes comprising: a standby mode; an interactive mode; and switched off mode;
326. performing the monitoring of the video signals and initiating of the interactive service at least in the standby mode;
327. ceasing the monitoring of the video signals and initiating of the interactive service in the switched off mode;
328. performing the presenting of service information in the interactive mode;
329. periodically changing the mode of the multiservice device from the standby mode to the interactive mode for indicating availability of the multiservice device;
330. changing the mode of the multiservice device from the standby mode to the switched off mode for saving power;
331. adopting the switched off mode based on a schedule that is optionally defined by the service settings;
332. further operating the multiservice device in a restricted mode that allows restricted use such as allowing staff to use internal services; and / or
333. further monitoring audio signals and detecting an audio trigger indicative of readiness of the user to use the device and optionally responsively performing according to the service settings the initiating an interactive service; and the presenting the service information to the user.
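Tying steps 301 to 303 together, and only as a hedged, non-limiting sketch, the monitoring loop could look roughly like the following; the callables video_trigger, initiate_service, and present_info are assumed stand-ins for whichever trigger, interactive service, and presentation the service settings select.

```python
import cv2

def run_multiservice_device(settings, video_trigger, initiate_service, present_info):
    """Steps 302-303 of the flow chart, with the settings obtained in step 301 passed in."""
    capture = cv2.VideoCapture(0)                  # step 302: monitor the camera video signals
    prev_gray = None
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                continue
            triggered, prev_gray = video_trigger(prev_gray, frame)   # step 303: detect the video trigger
            if triggered:
                initiate_service(settings)         # e.g. place a video or audio call (steps 312-313)
                present_info(settings)             # display or speak service information (steps 314-316)
    finally:
        capture.release()
```

Here video_trigger could, for instance, be the detect_motion sketch given earlier, so that arrival of a user in front of the camera starts the pre-set interactive service without any touch.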
Any of the afore described methods, method steps, or combinations thereof, may be controlled or performed using hardware; software; firmware; or any combination thereof.
The software and / or hardware may be local; distributed; centralised; virtualised; or any combination thereof. Moreover, any form of computing, including computational intelligence, may be used for controlling or performing any of the afore described methods, method steps, or combinations thereof. Computational intelligence may refer to, for example, any of artificial intelligence; neural networks; fuzzy logics; machine learning; genetic algorithms; evolutionary computation; or any combination thereof.
Various embodiments have been presented. It should be appreciated that in this document, the words "comprise", "include", and "contain" are each used as open-ended expressions with no intended exclusivity.
The foregoing description has provided by way of non-limiting examples of particular implementations and embodiments a full and informative description of the best mode presently contemplated by the inventors for carrying out the invention. It is however clear to a person skilled in the art that the invention is not restricted to details of the embodiments presented in the foregoing, but that it can be implemented in other embodiments using equivalent means or in different combinations of embodiments without deviating from the characteristics of the invention.
Furthermore, some of the features of the afore-disclosed example embodiments may be used to advantage without the corresponding use of other features. As such, the foregoing description shall be considered as merely illustrative of the principles of the present invention, and not in limitation thereof. Hence, the scope of the invention is only restricted by the appended patent claims.

Claims (15)

1. A method in a mobile video controlled multiservice device, comprising: obtaining service settings; monitoring video signals; detecting a video trigger indicative of readiness of a user to use the device; and responsively performing according to the service settings: initiating an interactive service; and presenting service information to the user; and using motion as a video trigger in the video signals.
2. The method of claim 1, further comprising detecting the motion by comparing subsequent image frames.
3. The method of claim 1, further comprising using as the video trigger detection of a person in the video signals.
4. The method of claim 1 or 2, further comprising using as the video trigger detection of eyes of a person in the video signal.
5. The method of any one of preceding claims, further comprising in the obtaining the service settings, inputting the service settings from a remote source over a communication connection.
6. The method of any one of preceding claims, wherein the interactive service comprises providing a video call.
7. The method of any one of preceding claims, further comprising in the presenting of service information, displaying information to the user as defined by the service settings.
8. The method of any one of preceding claims, further comprising maintaining the presenting of service information until at least one termination criterion is met.
9. The method of claim 8, further comprising: defining as one termination criterion lapsing of a termination interval; and beginning the termination interval from a video signal-based event.
10. The method of claim 9, further comprising including as the video signal-based event any one or more of the following: motion detected in the video signal; detection of disappearance of a person; no longer detecting eyes of a person; and / or no longer detecting gaze of the person.
11. The method of any one of preceding claims, further comprising: operating the multiservice device in at least three different modes comprising: a standby mode; an interactive mode; and switched off mode; and performing the monitoring of the video signals and initiating of the interactive service at least in the standby mode.
12. The method of any one of preceding claims, further comprising: monitoring audio signals; detecting an audio trigger indicative of readiness of the user to use the device; and responsively performing according to the service settings: the initiating an interactive service; and the presenting the service information to the user.
13. A mobile multiservice device, comprising: a camera; at least one processor; and a memory comprising instructions for the at least one processor, which instructions, when performed by the processor, cause the mobile multiservice device to at least perform the method of any one of preceding claims.
14. The mobile multiservice device of claim 13, wherein the mobile multiservice device is a tablet computer; a battery equipped television; or a mobile phone.
15. A computer program comprising computer executable program code which when executed by at least one processor causes an apparatus at least to perform the method of any one of claims 1 to 12.
FI20215100A 2021-01-29 2021-01-29 Video controlled multiservice mobile device FI20215100A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
FI20215100A FI20215100A1 (en) 2021-01-29 2021-01-29 Video controlled multiservice mobile device
PCT/FI2022/050037 WO2022162273A1 (en) 2021-01-29 2022-01-21 Video controlled multiservice mobile device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
FI20215100A FI20215100A1 (en) 2021-01-29 2021-01-29 Video controlled multiservice mobile device

Publications (1)

Publication Number Publication Date
FI20215100A1 true FI20215100A1 (en) 2022-07-30

Family

ID=80777892

Family Applications (1)

Application Number Title Priority Date Filing Date
FI20215100A FI20215100A1 (en) 2021-01-29 2021-01-29 Video controlled multiservice mobile device

Country Status (2)

Country Link
FI (1) FI20215100A1 (en)
WO (1) WO2022162273A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8218829B2 (en) * 2001-08-20 2012-07-10 Polycom, Inc. System and method for using biometrics technology in conferencing
JP2010004118A (en) * 2008-06-18 2010-01-07 Olympus Corp Digital photograph frame, information processing system, control method, program, and information storage medium
US9191619B2 (en) * 2012-08-01 2015-11-17 Google Inc. Using an avatar in a videoconferencing system
US9796093B2 (en) * 2014-10-24 2017-10-24 Fellow, Inc. Customer service robot and related systems and methods

Also Published As

Publication number Publication date
WO2022162273A1 (en) 2022-08-04

Similar Documents

Publication Publication Date Title
US10705487B2 (en) Methods and devices for mode switching
CN106572299B (en) Camera opening method and device
EP3093569B1 (en) Method and device for turning on air conditioner
US10063760B2 (en) Photographing control methods and devices
JP6314286B2 (en) Audio signal optimization method and apparatus, program, and recording medium
EP3136793A1 (en) Method and apparatus for awakening electronic device
EP2938054B1 (en) Method, device and system for handling busy line
EP3316232A1 (en) Method, apparatus and storage medium for controlling target device
CN108052079A (en) Apparatus control method, device, plant control unit and storage medium
WO2017092246A1 (en) Brightness adjustment method and apparatus
CN105812574A (en) Volume adjusting method and device
EP3024211B1 (en) Method and device for announcing voice call
JP2017509308A (en) CHARGE MANAGEMENT METHOD, CHARGE MANAGEMENT DEVICE, PROGRAM, AND RECORDING MEDIUM
CN106527682B (en) Method and device for switching environment pictures
CN105653041A (en) Display state adjusting method and device
EP3125512A1 (en) Silent ring indication while listening music over a headset
JP2016527577A (en) User command execution method, execution device, program, and storage medium
EP3328062A1 (en) Photo synthesizing method and device
US20170048451A1 (en) Method and apparatus for controlling video image
CN108042230A (en) Toothbrush control method, device and readable storage medium storing program for executing
CN105959545A (en) Camera and camera control method and device
CN111610912A (en) Application display method, application display device and storage medium
CN111698600A (en) Processing execution method and device and readable medium
CN108319363A (en) Product introduction method, apparatus based on VR and electronic equipment
CN111415421B (en) Virtual object control method, device, storage medium and augmented reality device