WO2019087014A1 - User interaction via streaming of augmented reality data
- Publication number
- WO2019087014A1 (PCT/IB2018/058294)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual object
- user device
- screen
- user
- displayed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
Definitions
- The invention relates to the field of communication technologies and, in particular, to a method for real-time streaming of data about augmented reality virtual objects.
- Real-time streaming is characterized by the recipient's ability to continuously receive data from the sender, the initiator of the stream.
- Communication systems have been implemented that can stream not only data about real-world objects but also data about virtual objects that complement real-world objects.
- The idea of the present invention is that the sender streams, in real time, data relating to at least one primary virtual object that is displayed on the screen of the sender's user device as an augmented reality object.
- The sender binds this object to a location displayed by the image capture device of the sender's user device (that is, to objects of the sender's real world).
- The streamed data contains only data related to the objects displayed as augmented reality objects; it does not contain data about the sender's real world as displayed by the image capture device of the sender's user device.
- In other words, the streamed data contains only data about the virtual object.
- The streamed data is displayed on the screen of the recipient's user device as the recipient's augmented reality.
- This data is likewise displayed with reference to a specified location displayed by the image capture device of the recipient's user device (that is, with reference to the displayed objects of the recipient's real world).
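To make the data flow concrete, the sketch below (TypeScript, written for this summary rather than taken from the patent) models the kind of payload such a stream could carry: only virtual-object state (identity, pose relative to the chosen anchor location, animation, attachments), with no camera frames or other real-world imagery. All type and field names are illustrative assumptions, not the patent's actual wire format.

```typescript
// Hypothetical payload for an AR stream: only virtual-object state is carried,
// never the sender's camera frames. Field names are illustrative assumptions.
interface Vector3 { x: number; y: number; z: number; }
interface Quaternion { x: number; y: number; z: number; w: number; }

// Pose of a virtual object expressed relative to the anchor location the user chose,
// so each side can resolve it against its own real-world anchor.
interface RelativePose { position: Vector3; rotation: Quaternion; scale: number; }

interface VirtualObjectState {
  objectId: string;          // identifies the primary or an additional virtual object
  assetId: string;           // which 2D/3D asset to render (e.g. "dragon_3d_animated")
  pose: RelativePose;        // relative to the anchor, not to any global frame
  animation?: string;        // current animation clip, if the object is animated
  attachedTo?: string;       // objectId of the primary object this one is attached to
}

interface StreamFrame {
  streamId: string;
  timestampMs: number;
  objects: VirtualObjectState[]; // everything needed to re-render the AR layer
}

// Example frame: a single animated primary object sitting on the chosen anchor.
const frame: StreamFrame = {
  streamId: "stream-42",
  timestampMs: Date.now(),
  objects: [{
    objectId: "primary-208",
    assetId: "dragon_3d_animated",
    pose: { position: { x: 0, y: 0, z: 0 }, rotation: { x: 0, y: 0, z: 0, w: 1 }, scale: 1 },
    animation: "idle",
  }],
};
console.log(JSON.stringify(frame, null, 2));
```

One consequence, under this assumption, is that the stream's size depends on the number and activity of virtual objects rather than on video resolution, since no camera imagery is carried.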
- FIG. 1 is a block diagram of an exemplary embodiment of a method for user interaction in a communication system according to one of the preferred embodiments of the present invention
- FIG. 2a is a schematic diagram illustrating the screen of the sender's user device at the stage of selecting the primary virtual object
- FIG. 2b is a schematic diagram illustrating the screen of the sender's user device at the stage of setting the location of the selected primary virtual object
- FIG. 2c is a schematic diagram illustrating the screen of the sender's user device on which a 3D animated virtual object is displayed as an augmented reality object at the specified location
- FIG. 2d is a schematic diagram illustrating the screen of the sender's user device at the stage of initiating the streaming of data
- FIG. 2e is a schematic diagram illustrating the screen of the recipient's user device at the stage of initiating receipt of the streamed data
- FIG. 2f is a schematic diagram illustrating the screen of the recipient's user device at the stage of setting a location for displaying the streamed data
- FIG. 2g is a schematic diagram illustrating the screen of the recipient's user device on which the streamed data is displayed as augmented reality at the specified location
- FIG. 3 is a schematic representation of an embodiment of a user device according to one of the preferred embodiments of the present invention
- FIG. 4 is a schematic representation of an embodiment of a communication system in accordance with one of the preferred embodiments of the present invention.
- the present invention relates to a method of user interaction in a communication system, a user device for organizing user interaction in a communication system, a communication system allowing this method to be implemented, and a computer-readable medium on which program instructions are stored that initiate execution of aspects of the method of user interaction in accordance with the present invention.
- A method for user interaction in a communication system includes real-time streaming of data related to at least one primary virtual object displayed on the screen of the sender's user device, with reference to a specified location displayed by the image capture device of the sender's user device, as an augmented reality object, and display of the streamed data on the screen of the user device of at least one recipient, with reference to a specified location displayed by the image capture device of the recipient's user device, as augmented reality.
- A user device for organizing user interaction in a communication system includes an image capture device, at least one processor, and a computer-readable medium connected to the at least one processor and containing program instructions for user interaction in the communication system which, when executed by the at least one processor, enable real-time streaming of data related to at least one primary virtual object displayed on the screen of the user device, with reference to a specified location displayed by the image capture device of the user device, as an augmented reality object, and enable display of streamed data on the screen of the user device, with reference to the specified location displayed by the image capture device of the user device, as augmented reality.
- A communication system that allows user devices and a server to communicate with each other comprises program instructions located on a computer-readable medium of a user device which, when executed by at least one processor of the user device, enable real-time streaming of data related to at least one primary virtual object displayed on the screen of the user device, with reference to a specified location displayed by the image capture device of the user device, as an augmented reality object, and display of the streamed data on the screen of the user device of at least one recipient, with reference to a specified location displayed by the image capture device of the recipient's user device, as augmented reality.
- A computer-readable medium contains program instructions for user interaction in a communication system which, when executed by at least one processor of a user device, enable real-time streaming of data related to at least one primary virtual object displayed on the screen of the user device, with reference to a specified location displayed by the image capture device of the user device, as an augmented reality object, and enable display of the streamed data on the screen of the user device, with reference to the specified location displayed by the image capture device of the user device, as augmented reality.
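As a rough illustration of how these claimed responsibilities could map onto code, the hedged sketch below splits them into a sender-side and a recipient-side interface. The interface and method names are assumptions made for this summary; the patent does not prescribe any particular API.

```typescript
// A minimal sketch of how the claimed sender and recipient responsibilities could be
// split into interfaces. Names are assumptions made for illustration only.
interface AnchorLocation { latitude?: number; longitude?: number; surfaceId?: string; }

interface SenderDevice {
  // place the primary virtual object at a location shown by the image capture device
  placePrimaryObject(assetId: string, location: AnchorLocation): string; // returns an objectId
  // begin real-time streaming of virtual-object data only
  startStreaming(): string;                                              // returns a streamId
}

interface RecipientDevice {
  // bind the incoming stream to a location shown by the recipient's image capture device
  placeStream(streamId: string, location: AnchorLocation): void;
  // render the streamed virtual objects as the recipient's augmented reality
  renderFrame(frame: { objects: unknown[] }): void;
}
```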
- The sender can stream in real time only data that relates to the virtual objects displayed as augmented reality objects bound to the sender's real-world objects displayed by the image capture device of the sender's user device; data about those real-world objects themselves is not transmitted.
- On the recipient's side, the streamed data is displayed as the recipient's augmented reality.
- This data is likewise displayed bound to a specified location displayed by the image capture device of the recipient's user device (that is, linked to the displayed objects of the recipient's real world).
- User devices and a server in the communication system communicate via a network through which connections between the server and the user devices are established to enable user interaction according to the described method; the network includes, without limitation, the Internet, wireless communication networks, and networks using standard communication technologies and/or protocols.
- the described communication system can function on any suitable user devices, regardless of the operating systems installed on them.
- Users access the communication system via the network using a corresponding application installed on the user device.
- An application is a program installed on the user's device and intended for user interaction in the communication system.
- A user device is, for example, a smartphone, a tablet computer, augmented reality glasses, or any other device that contains:
- an image capture device capable of displaying the world around the user (for example, a camera);
- a display component that allows the user to see the surrounding world displayed by the image capture device (for example, the screen of the user device);
- a network component that allows communication with at least one other user device.
- Such devices should have computing capacity and components sufficient for launching and running applications that use their current location, as well as for streaming data.
- The interacting users are understood to be the sender and the recipient.
- The sender is understood to be the user of the communication system who streams data, and the recipient the user who receives the streamed data.
- The recipient may receive a notification on his or her user device that the sender is streaming data and that this data can be received.
- Such a notification may also take the form of a publication in the message stream of the interacting users, which the recipient can open in order to carry out the further steps of the method according to the present invention.
- The primary virtual object is selected from the virtual objects available for selection via the user interface on the screen of the sender's user device, and the selected primary virtual object is placed on the screen of the sender's user device at a specified location displayed by the image capture device of the sender's user device, as an augmented reality object.
- The sender thus controls the primary virtual object, and each subsequent primary virtual object with which the displayed virtual object is replaced, combining them with additional virtual objects and changing their location, while data about these virtual objects is streamed in real time.
- Such an implementation is most preferable when a 3D animated virtual object is used as the primary virtual object, since its movement can be displayed most harmoniously.
- At least one of a 2D static or animated virtual object and a 3D static or animated virtual object is used as the primary virtual object. As an additional virtual object, it is preferable to use at least one of a 3D static (for example, a 3D drawing) or animated virtual object, a 2D static (for example, a picture or photo) or animated virtual object, text, audio (for example, music, an audio effect, or a recording of the user's voice), or video. Combining the primary virtual object with an additional one can be implemented as the sender's ability to attach the additional virtual object to the primary virtual object.
- An additional virtual object can be attached to the primary virtual object at any time: at the initially specified location of the primary virtual object on the screen of the sender's user device, while the primary virtual object is being moved, or at a new specified location of the primary virtual object.
- Additional virtual objects are also displayed on the screens of user devices of the sender and receiver as objects of augmented reality.
- The additional virtual objects are displayed in association with the primary virtual objects.
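The object categories and the attachment mechanism described above could be modeled roughly as follows; this is a hypothetical sketch, and all type names, kinds, and the attach helper are assumptions rather than anything specified in the patent.

```typescript
// Sketch of a possible object model for primary and additional virtual objects,
// following the categories named above. All names are illustrative assumptions.
type PrimaryKind = "2d-static" | "2d-animated" | "3d-static" | "3d-animated";
type AdditionalKind = PrimaryKind | "text" | "audio" | "video";

interface AdditionalVirtualObject {
  id: string;
  kind: AdditionalKind;
  // payload depends on the kind: asset reference, text content, or media reference
  content: string;
}

interface PrimaryVirtualObject {
  id: string;
  kind: PrimaryKind;
  assetId: string;
  attachments: AdditionalVirtualObject[]; // additional objects combined with this one
}

// Attaching an additional object can happen at any time: at the initial location,
// while the primary object is moving, or after it has been moved to a new location.
function attach(primary: PrimaryVirtualObject, extra: AdditionalVirtualObject): void {
  primary.attachments.push(extra);
}

const dragon: PrimaryVirtualObject = {
  id: "primary-208", kind: "3d-animated", assetId: "dragon_3d_animated", attachments: [],
};
attach(dragon, { id: "note-1", kind: "text", content: "Happy birthday!" });
console.log(dragon.attachments.length); // 1
```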
- Specifying a location is implemented by binding the primary virtual object to a geographical location displayed by the image capture device of the user device, for example by binding the object to a surface.
- Movement of a 3D animated virtual object is carried out from the specified predetermined location.
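A minimal sketch of what binding an object to a displayed location might look like, assuming a hit test against a detected surface is available; the hit-test and anchor types below are hypothetical stand-ins, and real AR frameworks such as ARCore, ARKit, or WebXR expose their own, differently named equivalents.

```typescript
// Sketch of binding a virtual object to a location, e.g. a detected surface.
// All types here are hypothetical stand-ins for an AR framework's own anchors.
interface Vec3 { x: number; y: number; z: number; }

interface SurfaceHit {
  surfaceId: string;   // which detected plane/surface was hit
  point: Vec3;         // hit point in the device's world coordinates
}

interface Anchor {
  id: string;
  surfaceId: string;
  worldPosition: Vec3; // the fixed real-world point the object stays bound to
}

let nextAnchor = 0;
function createAnchor(hit: SurfaceHit): Anchor {
  return { id: `anchor-${nextAnchor++}`, surfaceId: hit.surfaceId, worldPosition: hit.point };
}

// Once anchored, the object's pose is expressed relative to the anchor, so it stays
// "attached" to the real-world location even as the device moves around it.
const anchor = createAnchor({ surfaceId: "floor-plane-1", point: { x: 0.4, y: 0, z: -1.2 } });
console.log(anchor.id, anchor.worldPosition);
```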
- At least one of the interacting users is provided with the ability to perform at least one of the following operations: sending a text message, sending a voice message, sending a video message, sending a multimedia message, or making a video call.
- The primary virtual object can be displayed on the screens of the sender's and recipient's user devices from different angles and from different viewpoints. This means that, after the primary virtual object is displayed on the screen of the sender's or recipient's user device, the user can view it with the image capture device of the user device from different viewpoints, for example from above, from the side, or by walking around it. For example, if, while the streamed data is being displayed, the image capture device of the recipient's user device is turned away from the specified location of the primary virtual object, the screen of the recipient's user device will show only the physical, real world, without the primary virtual object. The same applies to additional virtual objects.
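The behavior described above (the object disappearing from the screen when the image capture device is turned away from its location) follows from rendering the anchored object only when it falls within the camera's field of view. The sketch below is a deliberately simplified angle check, not a full frustum test, and every name in it is an assumption.

```typescript
// Simplified sketch of why the object disappears when the camera turns away:
// the anchored object is only rendered if it falls inside the camera's field of view.
interface V3 { x: number; y: number; z: number; }

function normalize(v: V3): V3 {
  const len = Math.hypot(v.x, v.y, v.z) || 1;
  return { x: v.x / len, y: v.y / len, z: v.z / len };
}

function dot(a: V3, b: V3): number { return a.x * b.x + a.y * b.y + a.z * b.z; }

// True if the anchor lies within `fovDeg` degrees of the camera's forward direction.
function isInView(cameraPos: V3, cameraForward: V3, anchorPos: V3, fovDeg: number): boolean {
  const toAnchor = normalize({
    x: anchorPos.x - cameraPos.x, y: anchorPos.y - cameraPos.y, z: anchorPos.z - cameraPos.z,
  });
  const angle = (Math.acos(dot(normalize(cameraForward), toAnchor)) * 180) / Math.PI;
  return angle <= fovDeg / 2;
}

const anchorPos = { x: 0, y: 0, z: -2 };
console.log(isInView({ x: 0, y: 0, z: 0 }, { x: 0, y: 0, z: -1 }, anchorPos, 60)); // true
console.log(isInView({ x: 0, y: 0, z: 0 }, { x: 0, y: 0, z: 1 }, anchorPos, 60));  // false: camera turned away
```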
- FIG. 1 illustrates a block diagram of an exemplary embodiment of a method for user interaction in a communication system according to one of the preferred embodiments of the present invention. The steps of the method illustrated in the flowchart are described in more detail below with reference to FIG. 2a to 2g.
- The screen 200 of the sender's user device 202 displays the real-world objects 204 captured by the image capture device of the sender's user device 202, as well as a panel 206 with selectable primary virtual objects, as illustrated in FIG. 2a.
- The sender selects the primary virtual object from the available virtual objects.
- The sender selects the primary virtual object 208. This step is described in block 100 of the flowchart shown in FIG. 1.
- The sender then places the selected primary virtual object 208 on the screen 200 of the sender's user device 202 (this step is described in block 102 of the flowchart shown in FIG. 1).
- The sender sets the location 210 of the primary virtual object 208, as illustrated in FIG. 2b.
- The specified location 210 is displayed by the image capture device of the sender's user device 202 on the screen 200 of the sender's user device 202.
- The sender can specify the location of the object, for example, by touching a specific location on the touch screen 200 or by dragging the selected primary virtual object 208 on the touch screen 200 to a specific location, after which the object 208 is attached to the specified geographical location displayed by the image capture device of the sender's user device 202 and is displayed as an augmented reality object, as illustrated in FIG. 2c. That is, the screen 200 of the sender's user device 202 shows both the primary virtual object 208 and the real-world objects 204 surrounding the sender toward which the image capture device of the sender's user device 202 is pointed.
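A small sketch of the touch-to-place interaction described above: a screen touch is converted into a world point by a hit test and the object is bound there. The hit-test function is assumed to be supplied by the underlying AR framework and is stubbed out here; class and field names are hypothetical.

```typescript
// Sketch of the touch-to-place interaction: screen touch -> hit test -> bound object.
interface P3 { x: number; y: number; z: number; }
type HitTest = (screenX: number, screenY: number) => P3 | null; // screen point -> world point

interface PlacedObject { objectId: string; worldPosition: P3; }

class PlacementController {
  private placed: PlacedObject | null = null;
  constructor(private hitTest: HitTest) {}

  // Touching the screen (or dropping a dragged object) binds it to the hit location.
  onTouch(objectId: string, screenX: number, screenY: number): PlacedObject | null {
    const worldPoint = this.hitTest(screenX, screenY);
    if (!worldPoint) return null;              // no surface under the finger
    this.placed = { objectId, worldPosition: worldPoint };
    return this.placed;
  }
}

// Stubbed hit test: pretend every touch lands on a floor plane 1.5 m in front of the camera.
const controller = new PlacementController(() => ({ x: 0, y: 0, z: -1.5 }));
console.log(controller.onTouch("primary-208", 540, 1200));
```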
- The sender can inspect the displayed primary virtual object 208 from different sides through the screen 200 of the sender's user device 202, changing the viewing angle by means of the image capture device of the sender's user device 202.
- In this example, the primary virtual object 208 is a 3D animated virtual object. This stage of placing the primary virtual object 208 is described in block 102 of the flowchart shown in FIG. 1.
- The user interface on the screen 200 of the sender's user device 202 allows the sender to start real-time streaming of data related to the displayed primary virtual object 208.
- The screen 200 of the sender's user device 202 may, for example, display an icon 212 with the caption "Start streaming data", as illustrated in FIG. 2c; when the icon is touched, as shown in FIG. 2d, data streaming begins. This step is described in block 104 of the flowchart shown in FIG. 1.
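What touching the "Start streaming data" icon 212 could trigger on the sender side is sketched below: a stream-start message followed by the initial virtual-object state, handed to an injected send function. The transport, message types, and names are assumptions for illustration only.

```typescript
// Sketch of starting the stream: a start message plus the initial object state.
// Only virtual-object data is sent; no camera frames leave the device.
type Send = (message: object) => void;

interface ObjectState {
  objectId: string;
  assetId: string;
  position: { x: number; y: number; z: number };
}

function startStreaming(send: Send, streamId: string, initial: ObjectState[]): void {
  send({ type: "stream-start", streamId, timestampMs: Date.now() });
  send({ type: "object-state", streamId, objects: initial });
}

// Example transport: log outgoing messages instead of sending them over a network.
startStreaming((m) => console.log(JSON.stringify(m)), "stream-42", [
  { objectId: "primary-208", assetId: "dragon_3d_animated", position: { x: 0, y: 0, z: -1.5 } },
]);
```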
- While streaming, the sender can, in any sequence, for example: combine the displayed primary virtual object with an additional virtual object (for example, text, a 2D or 3D virtual object, audio, or video); set a new location on the screen 200 of the sender's user device 202, as described above, and then move the displayed primary virtual object to the new predetermined location; or replace the displayed primary virtual object 208 with another primary virtual object from the available virtual objects via the user interface on the screen 200 of the sender's user device 202.
- Combining the primary virtual object 208 with additional virtual objects can be implemented, for example, by selecting the appropriate additional virtual object from files stored on the sender's user device 202 (for example, from photos), or by selecting from the available examples that "drop down" after touching the corresponding window labeled for adding an additional virtual object.
- During streaming, the sender can perform any available action on the displayed primary virtual object (the primary virtual object 208 or a new primary virtual object with which the primary virtual object 208 has been replaced), i.e. the sender controls the primary virtual object.
- Only data associated with the virtual objects displayed on the screen 200 of the sender's user device 202 as augmented reality objects is streamed in real time, i.e., for example, data associated with the primary virtual object 208 being located at the specified location, with additional virtual objects being displayed in combination with the primary virtual object 208, and with the primary virtual object 208 being moved.
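The incremental nature of the stream can be illustrated with a small set of hypothetical event types covering the actions listed above (moving the object, attaching additional objects, replacing the primary object); the event names and shapes below are assumptions, not the patent's format.

```typescript
// Sketch of incremental events streamed while the sender controls the primary object.
// No real-world imagery is ever included in these events.
type StreamEvent =
  | { type: "move"; objectId: string; position: { x: number; y: number; z: number } }
  | { type: "attach"; objectId: string; additional: { id: string; kind: "text" | "audio" | "video" | "2d" | "3d"; content: string } }
  | { type: "replace"; objectId: string; newAssetId: string };

function describe(event: StreamEvent): string {
  switch (event.type) {
    case "move": return `object ${event.objectId} moved to a new anchored position`;
    case "attach": return `additional ${event.additional.kind} attached to ${event.objectId}`;
    case "replace": return `object ${event.objectId} replaced with asset ${event.newAssetId}`;
  }
}

const events: StreamEvent[] = [
  { type: "attach", objectId: "primary-208", additional: { id: "note-1", kind: "text", content: "Hi!" } },
  { type: "move", objectId: "primary-208", position: { x: 0.5, y: 0, z: -2 } },
  { type: "replace", objectId: "primary-208", newAssetId: "balloon_3d_animated" },
];
events.forEach((e) => console.log(describe(e)));
```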
- The recipient is given the ability to receive the streamed data via a corresponding icon 214, as illustrated in FIG. 2e; touching it initiates the display of the streamed data on the screen 216 of the recipient's user device 218.
- After touching the icon 214, the recipient places the streamed data on the screen 216 of the recipient's user device 218. To do this, the recipient sets the location 220 of the primary virtual object 208, as illustrated in FIG. 2f.
- The specified location 220 is displayed by the image capture device of the recipient's user device 218 on the screen 216 of the recipient's user device 218.
- The recipient can specify the location of the object, for example, by touching a specific location on the touch screen 216, after which the object 208 is attached to the specified geographical location displayed by the image capture device of the recipient's user device 218 and is displayed as an augmented reality object, as illustrated in FIG. 2g.
- That is, the screen 216 of the recipient's user device 218 displays both the primary virtual object 208, whose data is transmitted in the stream broadcast by the sender, and the real-world objects 222 surrounding the recipient toward which the image capture device of the recipient's user device 218 is pointed.
- The recipient can inspect the displayed primary virtual object 208 from different sides through the screen 216 of the recipient's user device 218, changing the viewing angle by means of the image capture device of the recipient's user device 218. This step of displaying the streamed data on the screen 216 of the recipient's user device 218 is described in block 106 of the flowchart shown in FIG. 1.
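Because the sender streams object poses relative to the sender's chosen location, the recipient has to re-base them onto the location 220 it chose itself. The sketch below shows that re-basing in its simplest form (translation only, rotation omitted); all names are hypothetical.

```typescript
// Sketch of resolving streamed relative poses against the recipient's own anchor.
interface Pt { x: number; y: number; z: number; }

// Pose relative to whichever anchor the local user picked.
interface RelState { objectId: string; relativePosition: Pt; }

// Convert a streamed relative pose into the recipient's world coordinates by adding
// the recipient's own anchor position (rotation is omitted for brevity).
function resolveForRecipient(recipientAnchor: Pt, state: RelState): Pt {
  return {
    x: recipientAnchor.x + state.relativePosition.x,
    y: recipientAnchor.y + state.relativePosition.y,
    z: recipientAnchor.z + state.relativePosition.z,
  };
}

const recipientAnchor = { x: 2.0, y: 0, z: -3.0 };             // the location 220 chosen by the recipient
const streamed: RelState = { objectId: "primary-208", relativePosition: { x: 0.5, y: 0, z: 0 } };
console.log(resolveForRecipient(recipientAnchor, streamed));   // where to render it in the recipient's AR view
```

The patent itself only requires that the streamed data be displayed with reference to the recipient's chosen location; the additive re-basing here is one simple way to realize that.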
- FIG. 3 is a schematic representation of an embodiment of a user device according to one embodiment of the present invention.
- The user device can be either the sender's user device or the recipient's user device and contains a processor 300 and, connected to it, a screen 302, a computer-readable medium 304, a network component 306, and an image capture device 308.
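For orientation, the component layout of FIG. 3 could be expressed as a set of interfaces, one per numbered component; the method names below are assumptions, since the patent describes hardware components rather than an API.

```typescript
// Sketch of the device composition of FIG. 3 as interfaces; numerals match the figure.
interface Screen302 { draw(frame: unknown): void; }
interface ComputerReadableMedium304 { load(path: string): Uint8Array | null; }
interface NetworkComponent306 {
  send(bytes: Uint8Array): void;
  receive(handler: (bytes: Uint8Array) => void): void;
}
interface ImageCaptureDevice308 { currentFrame(): unknown; }

interface UserDevice {
  processor: { run(task: () => void): void }; // processor 300
  screen: Screen302;
  media: ComputerReadableMedium304;
  network: NetworkComponent306;
  camera: ImageCaptureDevice308;
}
```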
- FIG. 4 shows a schematic representation of an example implementation of a communication system in accordance with one embodiment of the present invention.
- The communication system contains a server 400 and, connected to it, the sender's and recipient's user devices 402 and 404, respectively.
- The streaming of data from the sender's user device is carried out via the server 400.
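The server's relay role can be sketched as a minimal in-memory publish/subscribe model: it forwards the sender's stream messages to every subscribed recipient device without adding any camera imagery. A real deployment would use a network transport; the class and method names below are assumptions.

```typescript
// Minimal in-memory sketch of the relay role of server 400: forward the sender's
// stream messages to every subscribed recipient. This is a model, not a real server.
type Handler = (message: object) => void;

class RelayServer {
  private subscribers = new Map<string, Handler[]>(); // streamId -> recipient handlers

  subscribe(streamId: string, onMessage: Handler): void {
    const list = this.subscribers.get(streamId) ?? [];
    list.push(onMessage);
    this.subscribers.set(streamId, list);
  }

  // Called for every message the sender's device streams.
  publish(streamId: string, message: object): void {
    for (const handler of this.subscribers.get(streamId) ?? []) handler(message);
  }
}

const server = new RelayServer();
server.subscribe("stream-42", (m) => console.log("recipient device 404 received:", JSON.stringify(m)));
server.publish("stream-42", { type: "object-state", objects: [{ objectId: "primary-208" }] });
```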
- the user interaction method in the communication system is not limited to the specific features or steps described above.
- The specific features and steps described above are disclosed as examples of implementing the present invention, and other equivalent features and steps may fall within the scope of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Graphics (AREA)
- Human Computer Interaction (AREA)
- Information Transfer Between Computers (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention relates to the field of communication technologies and, in particular, to a method for real-time streaming of data relating to data about augmented reality virtual objects. The method of user interaction in a communication system consists of the following: real-time streaming of data corresponding to data about the primary virtual object displayed on the screen of the sender's user device, bound to the specified location displayed by the image capture device of the sender's user device, as an augmented reality object; and display of the streamed data on the screen of the user device of at least one recipient, bound to the specified location displayed by the image capture device of the recipient's user device, as augmented reality objects.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762578543P | 2017-10-30 | 2017-10-30 | |
| US62/578,543 | 2017-10-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019087014A1 (fr) | 2019-05-09 |
Family
ID=66332962
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2018/058294 Ceased WO2019087014A1 (fr) | 2017-10-30 | 2018-10-24 | Interaction des utilisateurs via une diffusion en continu de données de réalité augmentée |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2019087014A1 (fr) |
- 2018-10-24: WO PCT/IB2018/058294 patent/WO2019087014A1/fr not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130293584A1 (en) * | 2011-12-20 | 2013-11-07 | Glen J. Anderson | User-to-user communication enhancement with augmented reality |
| US20140002442A1 (en) * | 2012-06-29 | 2014-01-02 | Mathew J. Lamb | Mechanism to give holographic objects saliency in multiple spaces |
| US20140282162A1 (en) * | 2013-03-15 | 2014-09-18 | Elwha Llc | Cross-reality select, drag, and drop for augmented reality systems |
| US20160307374A1 (en) * | 2013-12-19 | 2016-10-20 | Metaio Gmbh | Method and system for providing information associated with a view of a real environment superimposed with a virtual object |
| US20170178272A1 (en) * | 2015-12-16 | 2017-06-22 | WorldViz LLC | Multi-user virtual reality processing |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111935491A (zh) * | 2020-06-28 | 2020-11-13 | 百度在线网络技术(北京)有限公司 | 直播的特效处理方法、装置以及服务器 |
| US11722727B2 (en) | 2020-06-28 | 2023-08-08 | Baidu Online Network Technology (Beijing) Co., Ltd. | Special effect processing method and apparatus for live broadcasting, and server |
| CN113286162A (zh) * | 2021-05-20 | 2021-08-20 | 成都威爱新经济技术研究院有限公司 | 一种基于混合现实的多机位画面直播方法及系统 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11979244B2 (en) | Configuring 360-degree video within a virtual conferencing system | |
| US11792241B2 (en) | Method, system, and non-transitory computer-readable record medium for displaying reaction during VoIP-based call | |
| US9615058B2 (en) | Apparatus and method for sharing content items among a plurality of mobile devices | |
| CN103051865B (zh) | 画面控制的方法及终端、视频会议装置 | |
| WO2019092590A1 (fr) | Interaction d'utilisateurs dans un système de communication utilisant une notification par flux multiples de données de réalité augmentée | |
| US12086378B2 (en) | Moving a digital representation of a video conference participant to a new location in a virtual environment | |
| CN114793285B (zh) | 信息显示方法、装置、设备及介质 | |
| CN105554430B (zh) | 一种视频通话方法、系统及装置 | |
| WO2019082050A1 (fr) | Interaction d'utilisateurs dans un système de communications utilisant de l'historique et messages de réalité augmentée | |
| CN114168018A (zh) | 数据交互方法、装置、电子设备、存储介质和程序产品 | |
| CN115830224A (zh) | 多媒体数据的编辑方法、装置、电子设备及存储介质 | |
| CN103873453A (zh) | 沉浸通信客户端、服务器及获取内容视图的方法 | |
| JP2024518472A (ja) | 画像融合方法、装置、電子機器および記憶媒体 | |
| CN111698574A (zh) | 一种视频水印处理方法、装置、电子设备和存储介质 | |
| CN112055164B (zh) | 信息互动方法、装置、终端及存储介质 | |
| CN109947528B (zh) | 信息处理方法和装置 | |
| CN112817671B (zh) | 图像处理方法、装置、设备以及计算机可读存储介质 | |
| WO2019087014A1 (fr) | Interaction des utilisateurs via une diffusion en continu de données de réalité augmentée | |
| CN112218144A (zh) | 投屏控制方法、装置、电子设备以及计算机可读介质 | |
| CN116112617B (zh) | 演播画面的处理方法、装置、电子设备及存储介质 | |
| EP3389049B1 (fr) | Techniques permettant à des tiers d'ajouter des effets à une application | |
| WO2019097364A1 (fr) | Création d'un contenu multimédia contenant des objet virtuels de la réalité augmentée | |
| CN113891135B (zh) | 一种多媒体数据播放方法、装置、电子设备及存储介质 | |
| KR20240030921A (ko) | 사용자 중심의 멀티뷰 제공 시스템 및 그 방법 | |
| CN111367598B (zh) | 动作指令的处理方法、装置、电子设备及计算机可读存储介质 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18874257; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18874257; Country of ref document: EP; Kind code of ref document: A1 |