
WO2018070604A1 - Method for processing a virtual user interface object for communication between users, and system implementing same - Google Patents


Info

Publication number
WO2018070604A1
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
connection
interface object
virtual
communication
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2016/013980
Other languages
English (en)
Korean (ko)
Inventor
심혁훈
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Akn Korea Inc
Original Assignee
Akn Korea Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Akn Korea Inc filed Critical Akn Korea Inc
Publication of WO2018070604A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/131 Protocols for games, networked simulations or virtual reality
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/75 Indicating network or usage conditions on the user display

Definitions

  • the present invention relates to a method and system for providing communication between users in augmented reality, virtual reality, or a mixed reality environment combining the same.
  • Conventionally, communication between users is performed through a user terminal such as a smartphone; in the case of a video call, video captured by the camera module mounted on the terminal is transmitted to the other party and displayed on the display device of the other party's terminal. In this setting, the user initiates a connection, or controls the functions necessary for communication, through a button displayed on the terminal's touch screen or a button physically arranged on the terminal.
  • an object of the present invention is to implement real-time communication in a virtual reality environment.
  • a method of processing a virtual user interface object for communication between users comprising: receiving physical space information of a user; Creating a virtual user interface object using the physical space information to receive a command for communication between users in a virtual space; And receiving an access request command of an access requestor or an access approval command of an access receiver through the virtual user interface object.
  • the virtual user interface object has a display format including a size or a location based on the physical object included in the physical space information.
  • the virtual user interface object preferably includes an interface element for selection of at least one communication connection means of the connection requester.
  • the virtual user interface object includes an element for inputting context information defining an access request purpose of the access requester or an element for displaying the context information input by the access requester to the access receiver.
  • the generating may further include generating a connection established between the connection requester and the connection receiver as a connection object using the physical space information.
  • The connection object may be generated as a holographic object using the physical object information of the physical space information and setting information of the connection counterpart.
  • A connection object may be generated for each of a plurality of established connections, with different depths, using the physical space information.
  • The connection request command or the connection approval command is preferably received through recognition of the user's behavior.
  • the generating may include moving the virtual user interface object by reflecting changed depth information of the physical object in the physical space information.
  • the command for communication is at least one of a connection request command, a connection grant / reject command, and an audio / video information input command.
  • A system for processing a virtual user interface object for inter-user communication preferably comprises: a processor and a memory; a display unit; a sensor unit for obtaining physical space information of a user; and a user interface processing program executed by the processor using part of the memory, wherein the program generates, using the physical space information, a virtual user interface object that receives a command for communication between users in a virtual space.
  • the virtual user interface object has a display format including a size or a location based on the physical object included in the physical space information.
  • the virtual user interface object preferably includes an interface element for selection of at least one communication connection means of the connection requester.
  • the virtual user interface object includes an element for inputting context information defining a connection request purpose of the connection requester or an element for displaying the context information input by the connection requester to the connection receiver.
  • the user interface processing program may further include generating a plurality of connections established between the connection requester and the connection receiver as connection objects for each connection using the physical space information.
  • the user interface processing program may generate a plurality of connection objects with different depths using the physical space information.
  • the user interface processing program receives the connection request command or the connection approval command through the user's behavior recognition.
  • Preferably, the user interface processing program moves the virtual user interface object to reflect a change of the physical object in the physical space information.
  • According to the configuration of the present invention, a real-time communication function can be provided in mixed reality. The user therefore does not need to exit to the real environment in order to communicate with others there while in a virtual reality environment, and can communicate through applications or web browsers on various virtual reality platforms, so the scope of communication between users is not limited.
  • In addition, additional services provided in the real environment may be provided as they are.
  • Accordingly, the user can efficiently control real-time communication in mixed reality with fewer temporal/spatial constraints than in the real environment.
  • FIG. 1 is a diagram illustrating an example of communication between users implemented by a method according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a virtual user interface object for inter-user communication according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating selection of communication connection means through a virtual user interface object according to an embodiment of the present invention.
  • FIGS. 4 to 6 illustrate virtual user interface objects for inter-user communication according to an embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a method of processing a virtual user interface object according to an embodiment of the present invention.
  • FIG. 8 is a conceptual diagram illustrating a method of processing a virtual user interface object for inter-user communication and a system for performing the same according to an embodiment of the present invention.
  • FIG. 1 is a diagram showing an example of communication between users 10 implemented by the method according to an embodiment of the present invention.
  • the user 10 communicates with other users 20 in a virtual reality situation.
  • Virtual reality refers to an environment created through computer graphics that places the user 10 in a specific situation and allows the user to interact with objects created for that situation.
  • In this specification, virtual reality includes not only the purely virtual situation described above but also Augmented Reality, which displays virtual objects over the real world seen by the user 10, and mixed reality (MR), in which the display device of a wearable computer combines, in real time, the real world and a virtual world carrying additional information into a single image.
  • That is, communication between users 10 in this embodiment includes both the case in which the communication is performed entirely in the virtual space and the case in which it is performed in a situation where the real space and the virtual space, delivered to the user 10 through the HMD device 15, overlap.
  • According to the present embodiment, the user 10 does not need to take off the HMD device 15 and make a phone call through a smartphone in order to communicate with another user 20; from within the virtual reality situation, the user 10 can call another user 20 in the real environment or a user in another virtual reality situation.
  • The HMD device 15 may be composed of a pair of wearable glasses including a display device, and a mixed reality environment may be provided to the user 10 by selecting and combining transparent, translucent, or opaque display devices so as to properly combine the real environment and the virtual environment.
  • the transparent display device displays the real object through at least some transparent pixels, and displays the virtual object together through other pixels.
  • the transparent display may be composed of a lens including a transparent image generating element, such as a see-through organic light-emitting diode (OLED).
  • OLED organic light-emitting diode
  • In addition, the transparent display device may include a filter to appropriately adjust the brightness, resolution, and the like of the image of objects in the real environment before it reaches the eyes of the user 10, so that the user 10 can recognize the real objects together with the virtual objects.
  • In addition, the HMD device 15 may include a gaze tracking sensor to recognize the gaze of the user 10, provide information about an object being stared at, or set the size or position of a virtual object in consideration of the field of view.
  • the HMD device 15 may recognize an action or movement of an object outside the field of view through an optical sensor.
  • Through an optical sensor as an additional external sensor, such as a depth camera, a visible-light camera, an infrared camera, or a behavior recognition camera, the state, position, depth, and illumination of objects in the real world may be recognized.
  • Through a position sensor, which may include a global positioning system (GPS) receiver, an acceleration sensor, a gyro sensor, and the like, information related to the user's position and motion in space may be recognized.
  • an apparatus for collecting external sounds such as a microphone may be used to transmit and receive voices between users, and to recognize control commands through voices.
  • Based on the sensed information, the HMD device 15 recognizes the physical space information of the physical space to which the user belongs, and uses it to display virtual objects on the physical space more realistically or to combine real objects with the displayed virtual objects.
  • the mixed reality will be mainly described, but the technical application range is not limited to the mixed reality, and may include various forms of virtual reality.
  • an HMD device will be described as an example.
  • However, the present invention may be applied, regardless of the device's name, to wearable computers that provide virtual reality to a user, head-up display (HUD) devices for automobiles, and various other interworking user terminal devices.
  • FIG. 2 is a diagram illustrating a virtual user interface object 100 for communication between users 10 according to an embodiment of the present invention.
  • the virtual user interface object 100 may be displayed on the physical space 200 composed of physical objects of the real world.
  • the physical object is an object that actually exists in the real world, and the object includes not only a person, an animal, and an object but also a point in the real space.
  • Physical space information is information that includes these physical objects.
  • Physical space information includes the user's current location information collected by the sensors, image information about the current location, the user's movement information at that location, and environmental information such as the temperature, humidity, and illumination of the current location.
  • That is, a physical object includes any object that is a target of sensing for physical space information.
  • The user interface object 100 may be generated using the physical space information; in one embodiment, the format in which it is generated and displayed may be determined using a physical object included in the physical space 200.
  • For example, the size or position displayed on the display device may be determined based on the depth information of the physical object.
  • Here, the depth information of the physical space 200 is information on the focus of the image information of the physical space 200 as recognized by the eyes of the user 10. The depth at which the virtual user interface object 100 is displayed within the physical space information may be determined in consideration of this depth of focus.
  • For example, the virtual user interface object 100 may be displayed at a relatively deep position together with a deep-lying physical object that the user 10 gazes at, which allows the user to manipulate the interface without shifting focus.
  • the virtual user interface object 100 may determine a display format including a size or a location based on depth information of the physical object.
  • the display format may be determined by using characteristics of surrounding physical objects corresponding to the corresponding depth based on the depth information.
  • For example, the color of a wall as a physical object may be referred to in order to determine the color displayed to the user 10: if the user interface object 100 were displayed in a color similar to that of the wall, poor visibility would make it difficult for the user 10 to manipulate it.
  • On the other hand, the virtual user interface object 100 may be divided into active and inactive states.
  • In the inactive state, it may be displayed in a color similar to that of the physical object so as not to interfere with or distract the user's field of view, and in the active state it may be displayed in a different color to facilitate operation by the user 10.
  • In addition, anticipating that the user 10 will operate a TV, the virtual user interface object 100 may be displayed at a location that minimizes disturbance of the user 10's behavior, such as turning the TV on and off.
  • the position of the virtual user interface object 100 may be changed by reflecting the changed depth information of the physical object in the physical space information.
  • That is, the user 10's gaze direction, focus, and the like are recognized immediately, and the position of the virtual user interface object 100 is changed accordingly, improving the convenience of using the interface.
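  • The depth- and color-aware placement described above can be sketched as pure logic. The following is an illustrative sketch only, not code from the patent; every name (GazeSample, placeUiObject, etc.), the luminance weights, and the contrast threshold are assumptions.

```typescript
// Hypothetical sketch: place the virtual UI object at the depth of the
// physical object the user is gazing at, and pick a display color that
// contrasts with that object so the active interface stays visible.

interface GazeSample {
  depth: number;                          // focal depth of the gazed-at object, in meters
  surfaceColor: [number, number, number]; // RGB of the physical object (e.g. a wall)
}

interface UiPlacement {
  depth: number;
  color: [number, number, number];
  active: boolean;
}

// Perceived luminance of an RGB color (ITU-R BT.601 weights).
function luminance([r, g, b]: [number, number, number]): number {
  return 0.299 * r + 0.587 * g + 0.114 * b;
}

function placeUiObject(gaze: GazeSample, active: boolean): UiPlacement {
  // Inactive UI blends with the surface; active UI flips to a contrasting
  // color (dark-on-light or light-on-dark) so it is easy to find and operate.
  const contrast: [number, number, number] =
    luminance(gaze.surfaceColor) > 128 ? [20, 20, 20] : [235, 235, 235];
  return {
    depth: gaze.depth, // same focal plane: no refocusing needed to operate it
    color: active ? contrast : gaze.surfaceColor,
    active,
  };
}

// When the user's gaze moves, simply recompute the placement.
function onGazeChanged(gaze: GazeSample, active: boolean): UiPlacement {
  return placeUiObject(gaze, active);
}
```

Recomputing the placement on every gaze update is the simplest realization of the "changed depth information" repositioning described above.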
  • the user interface object according to the above embodiment may be implemented as an application executable in a virtual environment.
  • The application provides basic video and audio transmission and reception functions to support ordinary voice or video calls over a communication line or Voice over Internet Protocol (VoIP) on an IP network, and additionally, according to the present embodiment, provides the virtual user interface object 100 before or during the call.
  • the virtual user interface object 100 may be implemented based on real-time communication (RTC) in this embodiment.
  • For example, the virtual user interface object 100 may be implemented using WebRTC, enabling media communication between users 10 using only a web browser, without installing a separate communication application.
  • the user 10 may use the user interface object 100 generated according to the present embodiment only by executing a web browser in a virtual reality.
  • For example, the user interface object 100 may be created and executed simply by accessing a web page using Microsoft's Edge browser, Google's Chrome, or the like.
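  • As an illustrative sketch only (not patent text): a browser-based interface object of this kind would exchange signaling messages before media flows. The message shape below is invented; actual media setup would use the browser's RTCPeerConnection and getUserMedia APIs, which are omitted here.

```typescript
// Hypothetical signaling payload a WebRTC-based interface object might send
// when the requester selects a counterpart from the receiver list.

type ConnectionCommand = "request" | "grant" | "reject";

interface SignalMessage {
  command: ConnectionCommand;
  from: string;     // requester account, e.g. "me@example.io" (illustrative)
  to: string;       // receiver account, e.g. "akn@akn.io"
  purpose?: string; // optional context info shown to the receiver
  sdpOffer?: string; // SDP offer from RTCPeerConnection (stubbed out here)
}

function buildConnectionRequest(
  from: string,
  to: string,
  purpose?: string,
  sdpOffer?: string
): SignalMessage {
  return { command: "request", from, to, purpose, sdpOffer };
}

// The receiver side answers with grant or reject, swapping the endpoints.
function answer(msg: SignalMessage, grant: boolean): SignalMessage {
  return { command: grant ? "grant" : "reject", from: msg.to, to: msg.from };
}
```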
  • The virtual user interface object 100 may allow a connection requester, who requests communication between users 10, to request a connection by selecting a counterpart from a list of connection receivers.
  • The connection requester may input a connection request command for communication by using a finger to point at or press a part of the virtual user interface object 100, which is generated in the virtual space and displayed together with the physical space 200.
  • Here, the command for communication may be at least one of a connection request command, a connection grant/reject command, and an audio/video information input command. Interfaces for the specific input of these commands are described below.
  • For example, the user 10 may request a connection, using a finger, to the user of an account named akn@akn.io on a virtual user interface object 100 created on an empty wall serving as the physical object.
  • the user interface object 100 may display a list of connection means, and thus may select a connection means for connection with another user 20.
  • Connection means may be distinguished based on the format displayed on the virtual user interface object 100 of the connection requester 10, the method of receiving a command, the communication method between the connection requester 10 and the connection receiver 20, and the like.
  • FIG. 3 is a diagram illustrating communication connection means through a virtual user interface object 100 according to an embodiment of the present invention.
  • the virtual user interface object 100 may include an interface element for selecting at least one communication connection means of the access requester.
  • The interface element may be, for example, a link set by the connection receiver 20, such as a custom link that launches an application used for connection requests or a web link that launches a web browser, exposed in text form.
  • When the connection requester 10 selects the link through an operation such as gazing, clicking, or touching by hand, the command associated with the corresponding link is performed.
  • Alternatively, Quick Response (QR)-coded information may be displayed on the terminal of the connection requester without exposing the specific link address, and a code recognition object for recognizing the code may be executed in the virtual reality environment (this program can also be generated based on WebRTC).
  • When the object recognizes the code, it is possible to request a connection to the connection receiver 20 through the recognized information.
  • For example, the code recognition object may be automatically executed in the background and request a connection to another user as a result of the QR code recognition.
  • To confirm the connection request intention, it is also possible to receive an additional user command in addition to recognition of the QR code.
  • For example, after the QR code is recognized, the user may enter a command by voice or by an action to request the connection, so that a call is attempted.
  • connection means may be implemented as a list of buttons imaged as shown in FIG. 2.
  • Indirect selection of the connection means by the user 10 may also be implemented through the various sensors of the HMD device 15.
  • For example, behavior recognition may be used to receive the selection of a connection means.
  • It is also possible to select a connection means through image recognition of the external environment.
  • For example, external image information about the physical space 200 may be collected through the camera of the HMD device 15 to detect fire or intrusion; in case of fire, selection of the connection means connected to the fire station may be recognized automatically, and in case of intrusion, selection of the connection means connected to the security company or the police station may be recognized.
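  • The environment-triggered selection just described reduces to a small routing decision. The following is an illustrative sketch only; the event labels and contact names are assumptions, not from the patent.

```typescript
// Hypothetical mapping from a recognized environment event to an
// automatically selected connection means: fire routes to the fire station,
// intrusion to the security company (or police), and nothing otherwise.

type EnvironmentEvent = "fire" | "intrusion" | "normal";

function selectConnectionMeans(event: EnvironmentEvent): string | null {
  switch (event) {
    case "fire":
      return "fire-station";
    case "intrusion":
      return "security-company"; // could equally route to the police station
    default:
      return null; // nothing detected: no automatic connection
  }
}
```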
  • Meanwhile, on the side of the connection receiver 20, the virtual user interface object 100 ′ may provide connection information about the connection requester 10.
  • the access information includes all information about the access requester 10, such as personal information of the access requester 10, connection request purpose information, and the like.
  • That is, the virtual user interface object 100 on the connection requester 10 side includes an element for inputting context information defining the purpose of the connection request, and the virtual user interface object 100 ′ on the connection receiver 20 side may include an element for displaying the context information input by the connection requester 10 to the connection receiver.
  • In addition, the virtual user interface object 100 ′ on the connection receiver 20 side may cause a specific ringtone to sound on the terminal of the connection receiver 20.
  • the ringtone may be classified based on the access requester 10, the purpose of the access request, and the like.
  • In addition, the interface may itself refer to schedule information and act accordingly.
  • For example, the virtual user interface may be executed to check the connection request time of the connection requester 10 and, referring to the schedule information, notify the connection receiver 20 of the request or provide an automatic response when the receiver is out of office.
  • Alternatively, if the connection means selected by the connection requester 10 has been preset as a means for lease inquiries about real estate, and the lease of that real estate has already been completed, the connection process may provide an automatic-response message that the lease is completed and terminate the connection.
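  • The two auto-response cases above can be sketched as a single decision function. This is an illustrative sketch only; the schedule shape, the message strings, and all names are assumptions, not from the patent.

```typescript
// Hypothetical context-aware auto-response: the interface checks the
// receiver's schedule at the request time, and a preset per-means state
// (e.g. a lease already completed) can also answer and end the connection
// without ringing the receiver at all.

interface ScheduleEntry {
  startHour: number;
  endHour: number;
  status: "out-of-office" | "busy";
}

interface AutoReply {
  handled: boolean; // true: answered automatically, do not ring the receiver
  message?: string;
}

function handleIncomingRequest(
  requestHour: number,
  schedule: ScheduleEntry[],
  leaseCompleted: boolean
): AutoReply {
  if (leaseCompleted) {
    // Means preset for lease inquiries: answer automatically and terminate.
    return { handled: true, message: "The lease has already been completed." };
  }
  const hit = schedule.find(
    e => requestHour >= e.startHour && requestHour < e.endHour
  );
  if (hit && hit.status === "out-of-office") {
    return { handled: true, message: "The receiver is out of office." };
  }
  return { handled: false }; // ring the receiver normally
}
```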
  • Thus, the provision of additional information through context awareness may be expressed as an interface element. That is, text information on the situation set by the connection requester 10 or the connection receiver 20 may be referred to and displayed on the terminal 300 on the connection receiver side.
  • Furthermore, rather than simply displaying the text information as entered, additional information may be provided by referring to and learning from previously input text information, the current text information, or user 10 information stored in the terminal of the connection requester 10 or the connection receiver 20.
  • In addition, the user 10 may shape his or her hand like a telephone receiver and input an interface command by moving that hand.
  • For example, the user can input a command approving the request of the connection requester by raising the receiver-shaped hand, and can input a command rejecting the connection or terminating the call by lowering it.
  • In addition, commands may be input through various user 10 actions agreed upon in advance.
  • For example, when the number 1 or 2 is shown by hand while the virtual user interface is activated, it may be recognized as a connection request command for the counterpart assigned to speed dial 1 or 2.
  • It is also possible to recognize a connection request command not only from the simple behavior of the user 10 but also from an interaction between the user 10 and a physical object included in the physical space.
  • For example, through an interaction with a door, the user may request a call to the counterpart or company associated with that space.
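  • The gesture-to-command behavior described above can be sketched as a lookup. This is an illustrative sketch only; the gesture labels, command shapes, and speed-dial table are assumptions, not from the patent.

```typescript
// Hypothetical mapping of recognized user behavior to interface commands:
// a "receiver raised" gesture grants an incoming request, lowering it
// rejects or ends the call, and a shown number dials a speed-dial slot.

type Gesture =
  | { kind: "receiver-up" }
  | { kind: "receiver-down" }
  | { kind: "number"; value: number };

type Command =
  | { kind: "grant" }
  | { kind: "reject-or-end" }
  | { kind: "request"; target: string }
  | { kind: "none" };

// Illustrative speed-dial assignments (invented accounts).
const speedDial: Record<number, string> = { 1: "akn@akn.io", 2: "user2@example.io" };

function gestureToCommand(g: Gesture): Command {
  switch (g.kind) {
    case "receiver-up":
      return { kind: "grant" };
    case "receiver-down":
      return { kind: "reject-or-end" };
    case "number": {
      const target = speedDial[g.value];
      return target ? { kind: "request", target } : { kind: "none" };
    }
  }
}
```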
  • the virtual user interface object 100 may create a connection established between the connection requester and the connection receiver as the connection object using the physical space information.
  • connection object may be generated as a holographic object displayed in three dimensions by using the physical object information of the physical space information and setting information of the connection counterpart.
  • For example, the floor of the space where the user 10 currently exists may be recognized from the physical object information, and the connection counterparts may be displayed as holographic objects 110 and 110 ′ as if they were standing on that floor.
  • the holographic object may also be created by referring to the face information of the user 10 or the account setting information of the user 10 to identify the connection counterpart only by the holographic object.
  • In addition, the gaze position of the user 10 may be recognized to determine which counterpart the user 10 is speaking to, and the information may be transmitted accordingly.
  • That is, the input conversation may be transmitted only to the counterpart of the account called akn.
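  • As an illustrative sketch only (not patent text), gaze-directed delivery in a multi-party call can be modeled as selecting recipients within a gaze cone; the direction encoding and the angular tolerance are invented assumptions.

```typescript
// Hypothetical gaze-directed routing: each counterpart is shown as a
// holographic connection object at some direction around the user, and an
// utterance is delivered only to counterparts whose object lies within the
// current gaze cone (with well-spaced objects, exactly one recipient).

interface ConnectionObject {
  account: string;
  direction: number; // bearing of the hologram relative to the user, degrees
}

function routeUtterance(
  gazeDirection: number,
  objects: ConnectionObject[],
  toleranceDeg = 15
): string[] {
  return objects
    .filter(o => Math.abs(o.direction - gazeDirection) <= toleranceDeg)
    .map(o => o.account);
}
```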
  • the virtual user interface object 100 may recognize the language of the user 10 and display it as text. It is also possible to translate the text and provide it to the user 10. In this case, the language for translation may be determined by referring to the setting information of the user 10.
  • connection objects may be implemented as respective image layers 110, 110 ′ and 110 ′′, and may be generated with different depths using physical space information.
  • For example, when there are a large number of connection objects, it may be difficult to display them all as holographic objects due to spatial constraints.
  • In that case, each connection object may be overlaid and displayed through a layer structure, like books inserted into a bookshelf, and the user can choose one of the connection objects to have a conversation, just like choosing a book from the shelf.
  • By providing a user interface through which the user 10 can arbitrarily change the order and size of the connection objects, it is possible, for example, to move the connection object of the person with whom the user 10 converses most to the front, or to increase its size.
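  • The "bookshelf" layer structure above can be sketched as simple list operations. This is an illustrative sketch only; the layer fields and function names are assumptions, not from the patent.

```typescript
// Hypothetical layered connection objects: each established connection is a
// layer with its own depth; the user can pull one layer to the front (like
// pulling a book from a shelf) or resize it, and render order follows depth.

interface ConnectionLayer {
  account: string;
  depth: number; // 0 = frontmost
  scale: number; // display size multiplier
}

// Bring the chosen counterpart to the front and push the others back a step.
function bringToFront(layers: ConnectionLayer[], account: string): ConnectionLayer[] {
  return layers
    .map(l =>
      l.account === account ? { ...l, depth: 0 } : { ...l, depth: l.depth + 1 }
    )
    .sort((a, b) => a.depth - b.depth);
}

function resize(layers: ConnectionLayer[], account: string, scale: number): ConnectionLayer[] {
  return layers.map(l => (l.account === account ? { ...l, scale } : l));
}
```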
  • the image itself (mixed reality image) of the physical space including the virtual user interface object 100 according to the present embodiment may be relayed to another device.
  • the mixed reality image may be relayed to the terminal of the call counterpart.
  • That is, the mixed reality recognized as the view of the virtual reality user may be relayed as-is to the call counterpart, so that the mutual call can be carried out more easily.
  • FIG. 7 is a flowchart illustrating a method of processing the virtual user interface object 100 according to the present embodiment.
  • physical space information including depth information of at least one physical object is received (S100).
  • the camera module of the HMD device 15 receives the physical space information including the physical object of the real environment to which the user 10 currently belongs.
  • the received physical space information may have focus information according to the field of view of the user 10, and in the present embodiment, the depth information may correspond to the focus information of the user 10.
  • a virtual user interface object 100 that receives a command for communication between users 10 in a virtual space is generated based on the depth information using the physical space information (S200).
  • the virtual user interface object 100 is generated using the physical space information.
  • the depth at which the interface object 100 is displayed in the virtual space may be determined.
  • the depth at which the interface object 100 is generated and displayed may be determined by appropriately adjusting the convenience of the operation of the interface object 100 and the degree of restriction of the field of view of the user 10.
  • For example, when the interface object 100 is created at the same depth as the physical object corresponding to the focal point of the user 10 for convenience of operation, the user 10 can input a command for communication without shifting focus.
  • Alternatively, in the generating step (S200), the interface object 100 may be generated at the same depth as the focus but outside the physical object corresponding to the focus, in order to reduce the restriction of the field of view.
  • the display format including the size, color, etc. of the interface object 100 may also be determined by referring to the attribute information of the physical object.
  • the attribute information of the physical object may be recognized through a camera sensor of the HMD device 15.
  • the interface object 100 comprises interface elements for communication connection means.
  • The interface element may include an element for selecting a connection request means toward another user 20 and an element for inputting and displaying the purpose of the connection request.
  • In addition, the interface object 100 may include an element for efficiently displaying connection objects on the physical space 200 when the user 10 is connected with a plurality of counterparts simultaneously.
  • For example, a 3D object may be generated directly as a hologram, so that a conversation can take place as if in one space even when the counterpart is far away.
  • Alternatively, a user interface may be provided for creating and selecting connection objects having different depths in a layer structure, as shown in FIG. 6.
  • Next, a connection request command of the connection requester or a connection approval command of the connection receiver is received through the virtual user interface object 100 described above (S300).
  • In the receiving step, attribute information about the physical space 200, user 10 behavior recognition information, and the like are received from the sensors of the HMD device 15, and through these, interface commands or other commands for the interface object 100 are input.
  • the input of the command through the interface may be recognized by pressing a button as shown in FIG. 2 or by the shape of a finger as shown in FIG. 4.
  • other commands for the interface object 100, such as moving or expanding a connection object as shown in FIG. 6, may be recognized through behavior recognition of the user 10.
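A minimal sketch of how recognized inputs (button press, finger shape, gestures) might be translated into interface commands. The event and command names here are hypothetical, not taken from the patent.

```python
# Hypothetical mapping from recognized (modality, action) pairs to commands.
COMMAND_MAP = {
    ("button", "press"): "request_connection",        # FIG. 2: button press
    ("finger", "point"): "approve_connection",        # FIG. 4: finger shape
    ("gesture", "drag"): "move_connection_object",    # FIG. 6: move object
    ("gesture", "pinch_out"): "expand_connection_object",
}

def interpret_input(modality, action):
    """Translate a recognized input event into an interface command,
    ignoring events that have no mapping."""
    return COMMAND_MAP.get((modality, action), "ignore")
```

A table-driven mapping like this keeps the behavior-recognition layer decoupled from the command handling of the interface object.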
  • FIG. 8 is a conceptual diagram illustrating a processing method of a virtual user interface object 100 for communication between users 10 and a system for performing the same according to an embodiment of the present invention.
  • the system includes an HMD device 15 comprising a mixed reality processor 310 for generating mixed reality and an interface object generator 320 for generating the interface object 100, and may further comprise various servers 500 and other users 20 connected through the network 400.
  • the mixed reality generator 312 of the mixed reality processor 310 generates virtual reality suited to the physical space 200 using the image information, distance information, and other environmental information recognized by the sensor unit 314, and combines them to create mixed reality.
  • the sensor unit 314 may include a gaze tracking sensor to recognize the line of sight of the user 10, and may recognize the action or movement of an object outside the field of view through an optical sensor.
  • Depth cameras, visible light cameras, infrared cameras, and the like can recognize the depth and illuminance information of objects in the real world.
  • other sensors may be included to increase immersion in the virtual reality.
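One way the gaze tracking sensor and depth camera outputs could be combined to locate the user's focal point in the physical space, sketched under assumed names (`depth_at` stands in for a lookup into the depth camera's depth map):

```python
def focal_point_from_sensors(gaze_origin, gaze_dir, depth_at):
    """Cast the gaze ray into the scene: the focal point is the surface the
    user is looking at, at the distance reported by the depth sensor along
    the gaze direction."""
    distance = depth_at(gaze_dir)  # scene depth along the gaze direction
    return tuple(o + distance * d for o, d in zip(gaze_origin, gaze_dir))
```

The resulting 3D point is what the generating step can use to anchor the interface object at the user's focal depth.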
  • the interface object generator 320 generates a virtual user interface object 100 that receives a command for communication between the users 10 in the virtual space.
  • the interface object generating unit 320 includes a display unit 322, a processor 324, a memory 326, and a storage unit 328.
  • the storage unit 328 stores the user interface processing program 330 to be processed by the processor 324.
  • the interface object generation unit 320 loads the instructions of the user interface processing program 330 stored in the storage unit 328, together with the data needed for their execution, into the memory 326, and the processor 324 executes them to create the interface object 100.
  • the generated interface object 100 may be displayed through the display unit 322.
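The load-then-execute flow of the generation unit 320 (program and data loaded from storage into memory, then run by the processor to produce the object) can be illustrated roughly as below. The class and key names are invented for illustration, not taken from the patent.

```python
class InterfaceObjectGenerator:
    """Rough illustration of unit 320: storage (328) holds the UI processing
    program and its data; load() copies them into working memory (326);
    execute() has the processor (324) run the program to create the object."""

    def __init__(self, storage):
        self.storage = storage  # persistent store, like unit 328
        self.memory = {}        # working memory, like unit 326

    def load(self):
        self.memory["program"] = self.storage["ui_program"]
        self.memory["data"] = self.storage["ui_data"]

    def execute(self):
        program = self.memory["program"]
        return program(self.memory["data"])  # the created interface object

# Example: a trivial "program" that wraps its data into an object description.
storage = {
    "ui_program": lambda data: {"type": "interface_object", **data},
    "ui_data": {"depth": 1.2},
}
generator = InterfaceObjectGenerator(storage)
generator.load()
obj = generator.execute()  # {'type': 'interface_object', 'depth': 1.2}
```

The created object would then be handed to the display unit 322 for rendering.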
  • the memory 326 or the storage unit 328 may include at least one type of storage medium among: flash memory, hard disk, multimedia card micro, card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the system further includes a communication module, and may connect the device with another user 20 or the server 500 by accessing the network 400 through a wired/wireless communication system.
  • wireless Internet technologies may include wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.
  • wired Internet technologies may include digital subscriber line (xDSL), fiber to the home (FTTH), power line communication (PLC), and the like.
  • the communication module may include a short range communication module to transmit and receive data to and from an electronic device including the short range communication module.
  • as short-range communication technologies, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.
  • the various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
  • the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the described embodiments may be implemented by the controller itself.
  • embodiments such as the procedures and functions described herein may be implemented as separate software modules.
  • Each of the software modules may perform one or more functions and operations described herein.
  • software code may be implemented as a software application written in a suitable programming language. The software code may be stored in a memory serving as a recording medium and executed by the controller.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method and system for providing communication between users in an augmented reality, virtual reality, or mixed reality environment. According to the present invention, a method of processing a virtual user interface object for communication between users comprises the steps of: receiving physical space information including at least one physical object; generating, using the physical space information, a virtual user interface object based on the physical object for receiving a command for communication between users in the virtual space; and receiving a connection request command of a connection requester or a connection approval command of a connection receiver through the virtual user interface object. With such a configuration of the present invention, a real-time communication function can be provided in mixed reality. The user therefore does not need to leave the virtual reality environment for the real environment in order to communicate with other users there, and can communicate through a web browser on the platforms of various virtual reality environments, so that the mode of communication is not limited.
PCT/KR2016/013980 2016-10-13 2016-11-30 Procédé de traitement d'objet d'interface utilisateur virtuelle pour une communication entre des utilisateurs et système le mettant en oeuvre Ceased WO2018070604A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2016-0133027 2016-10-13
KR1020160133027A KR101896982B1 (ko) 2016-10-13 2016-10-13 사용자간 통신을 위한 가상의 사용자 인터페이스 객체의 처리 방법 및 이를 수행하는 시스템

Publications (1)

Publication Number Publication Date
WO2018070604A1 true WO2018070604A1 (fr) 2018-04-19

Family

ID=61905690

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/013980 Ceased WO2018070604A1 (fr) 2016-10-13 2016-11-30 Procédé de traitement d'objet d'interface utilisateur virtuelle pour une communication entre des utilisateurs et système le mettant en oeuvre

Country Status (2)

Country Link
KR (1) KR101896982B1 (fr)
WO (1) WO2018070604A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210083016A (ko) * 2019-12-26 2021-07-06 삼성전자주식회사 전자 장치 및 그의 제어 방법
KR20220101783A (ko) 2021-01-12 2022-07-19 삼성전자주식회사 콘텐츠 생성 기능을 제공하기 위한 방법 및 이를 지원하는 전자 장치
US12141364B2 (en) 2021-11-15 2024-11-12 Samsung Electronics Co., Ltd. Wearable device for communicating with at least one counterpart device according to trigger event and control method therefor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090061514A (ko) * 2007-12-11 2009-06-16 한국전자통신연구원 혼합현실용 콘텐츠 재생 시스템 및 방법
US20130321390A1 (en) * 2012-05-31 2013-12-05 Stephen G. Latta Augmented books in a mixed reality environment
US20130339864A1 (en) * 2012-06-15 2013-12-19 Nokia Corporation Method and apparatus for providing mixed-reality connectivity assistance
US20140055493A1 (en) * 2005-08-29 2014-02-27 Nant Holdings Ip, Llc Interactivity With A Mixed Reality
US20160027218A1 (en) * 2014-07-25 2016-01-28 Tom Salter Multi-user gaze projection using head mounted display devices

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101306288B1 (ko) * 2010-09-30 2013-09-09 주식회사 팬택 가상 객체를 이용한 증강 현실 제공 장치 및 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140055493A1 (en) * 2005-08-29 2014-02-27 Nant Holdings Ip, Llc Interactivity With A Mixed Reality
KR20090061514A (ko) * 2007-12-11 2009-06-16 한국전자통신연구원 혼합현실용 콘텐츠 재생 시스템 및 방법
US20130321390A1 (en) * 2012-05-31 2013-12-05 Stephen G. Latta Augmented books in a mixed reality environment
US20130339864A1 (en) * 2012-06-15 2013-12-19 Nokia Corporation Method and apparatus for providing mixed-reality connectivity assistance
US20160027218A1 (en) * 2014-07-25 2016-01-28 Tom Salter Multi-user gaze projection using head mounted display devices

Also Published As

Publication number Publication date
KR101896982B1 (ko) 2018-09-10
KR20180041000A (ko) 2018-04-23

Similar Documents

Publication Publication Date Title
WO2014123270A1 (fr) Procédé conçu pour fournir un service de messagerie instantanée, support d'enregistrement contenant un programme à cet effet et terminal
WO2015167160A1 (fr) Procédé d'affichage d'instruction et dispositif d'affichage d'instruction
WO2017135797A2 (fr) Procédé et dispositif électronique pour gérer le fonctionnement d'applications
WO2014137074A1 (fr) Terminal mobile et procédé de commande du terminal mobile
WO2016036132A1 (fr) Procédé de traitement de contenu et dispositif électronique associé
WO2014069891A1 (fr) Procédé et dispositif pour fournir des informations concernant un objet
WO2016175602A1 (fr) Dispositif électronique pour fournir une interface utilisateur de raccourci et procédé correspondant
WO2017104941A1 (fr) Terminal mobile et son procédé de commande
WO2018008978A1 (fr) Procédé de reconnaissance d'iris sur la base d'une intention d'un utilisateur et dispositif électronique associé
WO2020105752A1 (fr) Procédé de personnalisation de produits par l'intermédiaire d'un terminal
WO2017209409A1 (fr) Procédé d'édition de contenu sphérique et dispositif électronique prenant en charge ce dernier
WO2019226001A1 (fr) Procédé et appareil de gestion de contenu dans un système de réalité augmentée
WO2014119862A1 (fr) Procédé d'activation de fonction de sécurité pour région de cyberbavardage et dispositif s'y rapportant
WO2016208992A1 (fr) Dispositif électronique et procédé de commande d'affichage d'image panoramique
WO2017217592A1 (fr) Procédé de fourniture de notifications
WO2015111926A1 (fr) Dispositif électronique et procédé d'affichage d'interface utilisateur pour ledit dispositif
EP3069220A1 (fr) Procédé et appareil de fourniture d'informations d'application
WO2019107799A1 (fr) Procédé et appareil de déplacement d'un champ d'entrée
WO2016186325A1 (fr) Système et procédé de service de réseau social par image
WO2018070604A1 (fr) Procédé de traitement d'objet d'interface utilisateur virtuelle pour une communication entre des utilisateurs et système le mettant en oeuvre
WO2014171613A1 (fr) Procédé de prestation de service de messagerie, support d'enregistrement enregistré avec un programme afférent et terminal correspondant
WO2012015092A1 (fr) Terminal mobile et procédé pour avertir l'expéditeur de communication
WO2020050432A1 (fr) Terminal mobile
WO2020171574A1 (fr) Système et procédé pour interface d'utilisateur avec bouton d'obturateur renforcée par ia
WO2016064149A1 (fr) Dispositif électronique et procédé de commande de contenus dans un dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16918532

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.07.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16918532

Country of ref document: EP

Kind code of ref document: A1