WO2019231090A1 - Electronic device and method for displaying an object related to an external electronic device based on the position and movement of the external electronic device
- Publication number: WO2019231090A1 (PCT/KR2019/003336)
- Authority: WO (WIPO PCT)
- Prior art keywords: electronic device, external electronic, information, processor, display
- Legal status: Ceased (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/98—Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/105—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
- A63F2300/1093—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8082—Virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
Definitions
- Various embodiments relate to an electronic device, and a method thereof, for displaying an object related to an external electronic device using information about the external electronic device.
- Various electronic devices may include an electronic device for displaying multimedia content related to virtual reality, and an external electronic device for providing user input within that multimedia content.
- A user who receives a virtual reality (VR) service through an electronic device may have a limited field of view. Because of this limitation, a separate external electronic device may be provided for representing the user's input in the VR service.
- The electronic device therefore requires a method for naturally displaying the external electronic device.
- An electronic device according to various embodiments includes one or more cameras having a designated field of view, a display, a communication circuit, and a processor. The processor is configured to identify an external electronic device among one or more external objects included in the designated field of view using the camera, and to display a graphic object corresponding to the external electronic device on the display based on first location information of the external electronic device identified based at least on image information acquired through the camera. When the external electronic device leaves the designated field of view, the processor is configured to display the graphic object on the display based on second location information of the external electronic device identified through the camera before it left the designated field of view, and on information related to the movement of the external electronic device received from the external electronic device through the communication circuit after it left the designated field of view.
- A method according to various embodiments includes identifying an external electronic device among one or more external objects included in a designated field of view using one or more cameras having the designated field of view; displaying a graphic object corresponding to the external electronic device on the display based on first location information of the external electronic device identified based at least on image information acquired through the camera; and, when the external electronic device is outside the designated field of view, displaying the graphic object on the display based on second location information of the external electronic device identified through the camera before it left the designated field of view and on information related to the movement of the external electronic device received from the external electronic device through the communication circuit after it left the designated field of view.
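- Taken together, the device and method claims describe a camera-first tracking loop with a sensor-based fallback once the controller leaves the camera's field of view. The following minimal Python sketch illustrates that logic; the types and names (Vec3, update_position) are hypothetical, since the publication contains no code:

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def __add__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

def update_position(camera_fix: "Vec3 | None",
                    last_seen: "Vec3 | None",
                    motion_delta: Vec3) -> "Vec3 | None":
    """Choose where to draw the graphic object for the current frame.

    camera_fix   -- first location information: position identified from the
                    image acquired through the camera (None when the external
                    electronic device is outside the designated field of view)
    last_seen    -- second location information: last camera-based position
                    identified before the device left the field of view
    motion_delta -- displacement derived from movement information received
                    from the device through the communication circuit
    """
    if camera_fix is not None:      # in view: trust the camera
        return camera_fix
    if last_seen is not None:       # out of view: extrapolate from sensors
        return last_seen + motion_delta
    return None                     # never seen: nothing to draw

# Out of view: dead-reckon from the last camera fix plus reported movement.
print(update_position(None, Vec3(0.5, 0.0, 1.0), Vec3(0.25, 0.0, 0.0)))
# Vec3(x=0.75, y=0.0, z=1.0)
```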
- An electronic device and method according to various embodiments can represent the external electronic device using at least one sensor included in the electronic device and/or at least one sensor of an external electronic device interworking with it, and can provide an enhanced user experience (UX) by displaying the corresponding graphic object together with the multimedia content for providing VR.
- FIG. 1 is a block diagram of an electronic device in a network environment displaying an object related to an external electronic device based on a location and a movement of the external electronic device according to various embodiments.
- FIG. 2 illustrates an example of a functional configuration of an electronic device and an external electronic device according to various embodiments.
- FIG. 3 illustrates an example of a functional configuration of an electronic device, a first external electronic device, and a second external electronic device according to various embodiments.
- FIG. 4A illustrates an example in which an external electronic device leaves the camera field of view of an electronic device according to various embodiments.
- FIG. 4B illustrates an example comparing the field of view of a camera and the field of view of a user according to various embodiments.
- FIG. 5A illustrates an example of an operation of an electronic device according to various embodiments.
- FIG. 5B illustrates an example of an operation of an electronic device according to various embodiments.
- FIG. 6 illustrates an example of an external electronic device and a graphic object displayed to correspond to the external electronic device according to various embodiments of the present disclosure.
- FIG. 7 illustrates a temporal flow of what is displayed according to an operation of an electronic device according to various embodiments of the present disclosure.
- FIG. 8 illustrates an example comparing the effect of correction using sensor data according to various embodiments.
- FIG. 9A illustrates an example of an operation of an electronic device according to various embodiments.
- FIG. 9B illustrates an example of an operation of an electronic device according to various embodiments.
- FIG. 10 illustrates an example of a graphic object for guiding the location of an external electronic device displayed on a display of an electronic device according to various embodiments of the present disclosure.
- FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments.
- The electronic device 101 may communicate with the electronic device 102 through a first network 198 (eg, a short-range wireless communication network), or communicate with the electronic device 104 or the server 108 through a second network 199 (eg, a long-range wireless communication network).
- the electronic device 101 may communicate with the electronic device 104 through the server 108.
- The electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
- at least one of the components may be omitted or one or more other components may be added to the electronic device 101.
- Some of these components may be implemented in one integrated circuit. For example, the sensor module 176 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented to be embedded in the display device 160 (eg, a display).
- The processor 120 may execute, for example, software (eg, the program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to one embodiment, as at least part of the data processing or operations, the processor 120 may load instructions or data received from another component (eg, the sensor module 176 or the communication module 190) into the volatile memory 132, process the instructions or data stored in the volatile memory 132, and store the resulting data in the nonvolatile memory 134.
- The processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) and a coprocessor 123 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that may operate independently of or together with the main processor. Additionally or alternatively, the coprocessor 123 may be configured to use lower power than the main processor 121 or to be specialized for a designated function. The coprocessor 123 may be implemented separately from, or as part of, the main processor 121.
- The coprocessor 123 may, for example, control at least some of the functions or states associated with at least one of the components of the electronic device 101 (eg, the display device 160, the sensor module 176, or the communication module 190), in place of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application execution) state. According to one embodiment, the coprocessor 123 (eg, an image signal processor or a communication processor) may be implemented as part of another functionally related component (eg, the camera module 180 or the communication module 190).
- the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101.
- the data may include, for example, software (eg, the program 140) and input data or output data for a command related thereto.
- the memory 130 may include a volatile memory 132 or a nonvolatile memory 134.
- the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
- the input device 150 may receive a command or data to be used for a component (for example, the processor 120) of the electronic device 101 from the outside (for example, a user) of the electronic device 101.
- the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
- the sound output device 155 may output a sound signal to the outside of the electronic device 101.
- the sound output device 155 may include, for example, a speaker or a receiver.
- the speaker may be used for general purposes such as multimedia playback or recording playback, and the receiver may be used to receive an incoming call.
- the receiver may be implemented separately from or as part of a speaker.
- the display device 160 may visually provide information to the outside (eg, a user) of the electronic device 101.
- the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
- The display device 160 may include touch circuitry configured to sense a touch, or a sensor circuit (eg, a pressure sensor) configured to measure the strength of a force generated by the touch.
- The audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or may output sound through the sound output device 155 or through an external electronic device (eg, the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
- The sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and generate an electrical signal or data value corresponding to the detected state.
- The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- the interface 177 may support one or more designated protocols that may be used for the electronic device 101 to be directly or wirelessly connected to an external electronic device (for example, the electronic device 102).
- the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
- connection terminal 178 may include a connector through which the electronic device 101 may be physically connected to an external electronic device (eg, the electronic device 102).
- the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
- The haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through the tactile or kinesthetic sense.
- the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
- the camera module 180 may capture still images and videos. According to one embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
- the power management module 188 may manage power supplied to the electronic device 101.
- The power management module 188 may be implemented, for example, as at least part of a power management integrated circuit (PMIC).
- the battery 189 may supply power to at least one component of the electronic device 101.
- the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell or a fuel cell.
- The communication module 190 may support establishing a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and performing communication over the established communication channel.
- the communication module 190 may operate independently of the processor 120 (eg, an application processor) and include one or more communication processors supporting direct (eg, wired) or wireless communication.
- The communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module).
- The corresponding communication module among these communication modules may communicate with external electronic devices through the first network 198 (eg, a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 199 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network (eg, a LAN or WAN)).
- The wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (eg, an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196.
- the antenna module 197 may transmit or receive a signal or power to an external device (for example, an external electronic device).
- the antenna module 197 may include one antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
- The antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190.
- the signal or power may be transmitted or received between the communication module 190 and the external electronic device through the selected at least one antenna.
- According to one embodiment, a component other than the radiator (eg, an RFIC) may additionally be formed as part of the antenna module 197.
- At least some of the above components may be connected to each other through a communication scheme between peripheral devices (eg, a bus, general purpose input/output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and exchange signals (eg, commands or data) with each other.
- the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
- Each of the electronic devices 102 and 104 may be the same or different type of device as the electronic device 101.
- all or some of the operations performed by the electronic device 101 may be performed by one or more external electronic devices among the external electronic devices 102, 104, or 108.
- When the electronic device 101 needs to perform a function or service automatically or in response to a request from a user or another device, the electronic device 101 may, instead of executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or service.
- the one or more external electronic devices that receive the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101.
- the electronic device 101 may process the result as it is or additionally and provide the result as at least a part of the response to the request.
- cloud computing, distributed computing, or client-server computing technology may be used.
- FIG. 2 illustrates an example of a functional configuration of an electronic device and an external electronic device according to various embodiments. At least a part of the functional configuration of the electronic device 201 and the external electronic device 202 of FIG. 2 may be included in the functional configuration of the electronic device 101 and the electronic device 102 shown in FIG. 1.
- the environment 200 may include an electronic device 201 and an external electronic device 202.
- the electronic device 201 may be linked with the external electronic device 202 and may be configured to provide content related to virtual reality (VR) to a user.
- the electronic device 201 may include a memory 210, a communication module 220, a display 230, a camera 240, or a processor 250.
- The memory 210 may include the memory 130 shown in FIG. 1, the communication module 220 may include the communication module 190 shown in FIG. 1, the display 230 may include the display device 160 shown in FIG. 1, the camera 240 may include the camera module 180 shown in FIG. 1, and the processor 250 may include the processor 120 shown in FIG. 1.
- The processor 250 (eg, the processor 120) may be operatively coupled to at least one of the memory 210, the communication module 220, and the display 230.
- the processor 250 may obtain information about the location of the external electronic device 202 linked with the electronic device 201 using the camera 240.
- the processor 250 may acquire an image of the external environment of the electronic device 201 using the camera 240.
- the image of the external environment may include an image related to a plurality of external objects including the external electronic device 202.
- The electronic device 201 may include a housing for mounting the components of the electronic device 201 (eg, the memory 210, the communication module 220, the display 230, or the processor 250), and the camera 240 may be visually exposed through at least a portion of a first face of the housing.
- the housing may include a second side at an opposite side of the first side, and the display 230 may be visually exposed through at least a portion of the second side.
- camera 240 may have a designated field of view.
- The field of view may be referred to by various terms such as field of view (FOV), angle of view, horizontal FOV, vertical FOV, and the like.
- the camera 240 may acquire an image corresponding to an area of a predetermined size.
- the region of the predetermined size may include a region formed by a predetermined angle with respect to the top, bottom, left and right.
- the region of the predetermined size may correspond to the region included in the image 610.
- the user field of view 401 may correspond to a field of view (FOV) that can be displayed on the display.
- the camera field of view 403 may be determined based on the capabilities of the lenses included in the camera 240.
- The camera 240 may include a wide-angle lens, and the camera field of view 403 may have a correspondingly large angle.
- the camera field of view 403 may correspond to areas 412, 414, 416 indicated by dashed lines.
- The camera 240 may acquire an image of an area of a predefined size. The size of the area may differ depending on the field of view of the camera 240; for example, if the camera 240 is a wide-angle camera, the area may be larger.
- The camera field of view may correspond to the user field of view 401, or may correspond to a field of view 403 larger than the image area 410 output to the display.
- the image 410 displayed through the display 230 may include images corresponding to the field of view of the user.
- the field of view of the camera 240 may be the same as the field of view of the display area of the image 410 displayed through the display 230.
- the field of view of the camera 240 is not limited thereto, and may be smaller or larger than the field of view of the user, as shown in FIG. 4B.
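- As a rough illustration of this geometry (a sketch with assumed calibration values, not a method taken from the publication), a pixel column in the camera image can be converted to an angular offset and tested against either field of view; an object visible near the edge of a wide camera FOV can fall outside a narrower display FOV, as in FIG. 4B:

```python
import math

def pixel_to_angle(px: float, width_px: int, fov_deg: float) -> float:
    """Horizontal angular offset of pixel column `px` from the optical axis,
    assuming a simple rectilinear projection with horizontal FOV `fov_deg`."""
    half_w = width_px / 2
    focal_px = half_w / math.tan(math.radians(fov_deg / 2))  # focal length in pixels
    return math.degrees(math.atan2(px - half_w, focal_px))

def in_fov(offset_deg: float, fov_deg: float) -> bool:
    """True if an offset from the optical axis lies inside a symmetric FOV."""
    return abs(offset_deg) <= fov_deg / 2

# An LED detected near the right edge of a 1280-px-wide, 100-degree camera
# image is inside the camera FOV but outside a 90-degree user/display FOV.
angle = pixel_to_angle(px=1200, width_px=1280, fov_deg=100)
print(in_fov(angle, fov_deg=100), in_fov(angle, fov_deg=90))  # True False
```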
- the processor 250 may be linked or interlocked with the external electronic device 202.
- The interworking may include a connection established in a wired or wireless manner.
- the interworking may be performed by the communication module 220 included in the electronic device 201.
- the processor 250 may detect the external electronic device 202 by detecting a signal output from the external electronic device 202.
- the external electronic device 202 may further include a light emitter.
- the light emitting unit may include at least one light emitting diode (LED).
- the external electronic device 202 may output light corresponding to illuminance above a threshold range (eg, a threshold value) through the LED.
- the processor 250 of the electronic device 201 may identify the external electronic device 202 by receiving light corresponding to the illuminance of the threshold value or more.
- When the processor 250 detects a signal output from the LED of the external electronic device in the image acquired by the camera 240, the processor 250 may determine that the external electronic device 202 is within the field of view of the camera 240.
- Otherwise, the processor 250 may determine that the external electronic device 202 has left the field of view of the camera 240.
- the LED included in the external electronic device 202 may be turned on or off at predetermined time intervals.
- The electronic device 201 may detect the external electronic device 202 by recognizing that the LED of the external electronic device 202 turns on or off at predetermined time intervals and by identifying information on that predetermined time interval. For example, the external electronic device 202 may control the LED to turn on or off repeatedly every 0.1 seconds. When the external electronic device 202 controls the LED to turn on or off at predetermined time intervals, the external electronic device 202 may transmit period information indicating the predetermined time interval to the electronic device 201.
- the electronic device 201 may receive the period information and obtain information on the predetermined time interval for turning on or off the LED included in the external electronic device 202.
- Based on the period information, the electronic device 201 may detect the external electronic device 202 by identifying an LED that turns on or off in accordance with the predetermined time interval.
- The predetermined time interval may be determined randomly; for example, it may be determined based on a unique ID of the external electronic device 202. Since the on/off interval is determined based on the unique ID of the external electronic device 202 linked to the electronic device 201, the electronic device 201 can correctly identify the LED output of the external electronic device 202 linked with it even if the camera 240 also captures LED output from another external electronic device (for example, another user's VR controller).
- the external electronic device 202 may measure brightness of an external environment through an illuminance sensor (not shown) included in the external electronic device 202.
- the external electronic device 202 may change the LED output intensity when the brightness of the external environment is greater than or equal to a predetermined value. For example, the external electronic device 202 may increase the LED output intensity in response to detecting that the measured brightness of the external environment is greater than a predetermined threshold value.
- the external electronic device 202 may be more easily identified by the electronic device 201 by increasing the LED output intensity.
- the external electronic device 202 may turn the LED on or off according to the predetermined time interval based on the increased LED output intensity.
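- A minimal sketch of the blink-period matching described above (hypothetical helper names; the 0.02 s tolerance is an assumption): the headset compares the interval it actually observed against the period information the linked controller reported, so a controller blinking at a different interval is rejected:

```python
def estimate_period(toggle_times: list) -> float:
    """Average time between observed LED on/off transitions, in seconds."""
    gaps = [b - a for a, b in zip(toggle_times, toggle_times[1:])]
    return sum(gaps) / len(gaps)

def is_linked_controller(toggle_times: list, reported_period: float,
                         tolerance: float = 0.02) -> bool:
    """True when the observed blink period matches the reported period info."""
    return abs(estimate_period(toggle_times) - reported_period) <= tolerance

# The linked controller reported a 0.1 s on/off interval; a light source
# toggling every 0.5 s is treated as some other device.
print(is_linked_controller([0.0, 0.1, 0.2, 0.3], reported_period=0.1))  # True
print(is_linked_controller([0.0, 0.5, 1.0], reported_period=0.1))       # False
```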
- the electronic device 201 may obtain information about the location of the external electronic device 202.
- The location information may include information determined based on an image of the external electronic device 202 obtained by the camera 240.
- the information about the location may include information about coordinates in which the external electronic device 202 is positioned in the acquired image.
- the information about the position may include an x coordinate value, a y coordinate value, or a z coordinate value.
- the information about the coordinates may be obtained by the location tracking module 252 included in the processor 250.
- the processor 250 may include a location tracking module 252.
- The location tracking module 252 may obtain information about the location of the external electronic device 202 included in the image obtained by the camera 240. For example, the location tracking module 252 may divide the obtained image along x and y axes at equal intervals and acquire the x coordinate and y coordinate corresponding to the area in which the external electronic device 202 is detected. However, in various embodiments, the x-axis and y-axis coordinates are not limited to being obtained through equal-interval division, and may be obtained through other techniques for obtaining position information. As another example, the location tracking module 252 may obtain depth information based on the obtained image.
- the depth information may be obtained based on an image operation for calculating the depth information from a 2D image.
- the location tracking module 252 may obtain depth information based on the difference value for the two images.
- For example, the electronic device 201 may include two cameras, and the two cameras may each acquire an image of the external electronic device 202. The two cameras may be arranged side by side, separated by a fixed distance.
- the location tracking module 252 may obtain the depth information based on the difference between the images respectively obtained from the two cameras.
- the location tracking module 252 may obtain a z coordinate value according to the obtained depth information.
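- The depth computation sketched above corresponds to the standard stereo relation z = f·B/d, where d is the horizontal disparity of the LED between the two images; the focal length and baseline below are assumed calibration values, not figures from the publication:

```python
def depth_from_disparity(x_left_px: float, x_right_px: float,
                         focal_px: float, baseline_m: float) -> float:
    """Pinhole stereo: depth z = focal_length * baseline / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("object must appear shifted between the two views")
    return focal_px * baseline_m / disparity

# An LED at x=660 in the left image and x=620 in the right image, with an
# assumed 540-px focal length and a 6 cm camera baseline:
z = depth_from_disparity(660, 620, focal_px=540, baseline_m=0.06)
print(f"z = {z:.3f} m")   # z = 0.810 m
```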
- the processor 250 may receive information about tilting of the external electronic device 202 from the external electronic device 202.
- the processor 250 may receive the information about the inclination via the communication module 220.
- the information on the tilt may include information obtained by at least one sensor included in the external electronic device 202.
- the at least one sensor may include a gyro sensor.
- the gyro sensor may acquire sensor data (hereinafter, referred to as first sensor data) about the tilt of the external electronic device 202.
- the first sensor data may include sensor data related to at least one of a pitch, a roll, or a yaw.
- the tilt information may be transmitted from the external electronic device 202 to the electronic device 201.
- the inclination information may be transmitted to the electronic device 201 based on a change in state of the external electronic device 202.
- For example, the external electronic device 202 may transmit the tilt information to the electronic device 201 in response to a state change of the external electronic device 202, such as being powered on or activated.
- the tilt information may be periodically transmitted from the external electronic device 202 to the electronic device 201.
- the external electronic device 202 may further include a grip sensor (not shown). The grip sensor may be configured to detect that the external electronic device 202 is gripped by a user.
- The processor 295 may receive, from the grip sensor, a signal indicating that the external electronic device 202 has been gripped by the user, and may activate the gyro sensor in response to receiving that signal.
- the processor 295 may obtain information about the inclination via the activated gyro sensor and transmit the information to the electronic device 201.
- the external electronic device 202 may transmit information about the inclination to the electronic device 201.
- The electronic device 201 may detect that interworking with the external electronic device 202 is complete and, in response to the detection, transmit to the external electronic device 202 a control signal requesting transmission of the tilt information.
- The control signal may include a signal instructing activation of the gyro sensor and/or the acceleration sensor.
- the external electronic device 202 may activate the gyro sensor and acquire the first sensor data in response to the control signal.
- the external electronic device 202 may activate the acceleration sensor and acquire the second sensor data in response to the control signal.
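- A hypothetical controller-side sketch of this gating: the grip sensor (or a control signal from the electronic device 201) activates the gyro and acceleration sensors, and tilt and speed-change data are streamed only while active. The sensor and radio interfaces are stand-ins, since the publication names no API:

```python
import time

class ControllerFirmware:
    """Sketch of sensor gating on the external electronic device 202."""

    def __init__(self, gyro, accel, radio):
        self.gyro, self.accel, self.radio = gyro, accel, radio
        self.sensors_active = False

    def on_grip_changed(self, gripped: bool) -> None:
        """Called by the grip sensor when the user grips or releases."""
        self.sensors_active = gripped

    def on_control_signal(self, signal: dict) -> None:
        """The headset may also request activation once interworking completes."""
        if signal.get("activate_sensors"):
            self.sensors_active = True

    def tick(self) -> None:
        """Periodic loop: transmit sensor data only while active."""
        if self.sensors_active:
            self.radio.send({
                "tilt": self.gyro.read(),    # first sensor data: pitch/roll/yaw
                "accel": self.accel.read(),  # second sensor data: speed change
                "t": time.time(),
            })
```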
- the processor 250 may provide a graphic object for presenting the external electronic device 202 along with multimedia content related to the VR.
- the graphic object may refer to an object displayed on the display 230 to indicate user input in multimedia content related to virtual reality.
- the user input may include information about the movement of the external electronic device 202.
- the graphic object may be referred to in various terms such as a visual object, a graphical object, a controller object, and the like.
- the graphic object may include images of various shapes.
- the graphic object may include an image having the same or similar shape as that of the external electronic device 202.
- The graphic object may include an image determined based on the type of the multimedia content.
- the image determined based on the type of content may include images for various objects, including tennis rackets, steering wheels, knives or swords, and the like.
- the processor 250 may display the multimedia content associated with the VR.
- the multimedia content may include at least one of three-dimensional or omni-directional images for providing a virtual reality service to a user.
- the processor 250 may display the graphic object for showing the external electronic device 202 on the display 230.
- the graphic object may be displayed to be superimposed with the multimedia content related to the VR.
- the processor 250 may render the multimedia content related to the VR and render the graphic object at a first point in time.
- the graphic object may be displayed by being superimposed on the multimedia content.
- The graphic object may be identified based on the information about the position of the external electronic device 202 or the information about the tilt of the external electronic device 202.
- the processor 250 may obtain information about the location of the external electronic device 202 and identify a location to display the graphic object.
- The processor 250 may determine to display the graphic object three-dimensionally at the identified display position based on the tilt information. Since the processor 250 identifies only the position of the external electronic device 202 from the illuminance values in the image acquired by the camera 240, it may additionally need the tilt information in order to render the graphic object in three dimensions.
- the processor 250 may obtain information about the inclination based on a pattern or a marker of the external electronic device 202 included in the image acquired through the camera 240.
- the external electronic device 202 may include at least one marker. Since the at least one marker is located in a fixed area of the external electronic device 202, when the external electronic device 202 is tilted, the at least one marker may also be tilted equally.
- the processor 250 may acquire an image of the at least one inclined marker in the image acquired through the camera 240.
- the external electronic device 202 may output light of a predetermined pattern through the LED.
- For example, when the external electronic device 202 is inclined perpendicular to the ground, the external electronic device 202 may turn the LED on or off at intervals of 0.1 seconds. As another example, when the external electronic device 202 is inclined parallel to the ground, the external electronic device 202 may turn the LED on or off at intervals of 0.5 seconds.
- the external electronic device 202 may indirectly provide information about the inclination of the external electronic device 202 by transmitting the information about the predetermined pattern to the electronic device 201.
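- A minimal sketch of decoding such a predetermined pattern on the receiving side, using the 0.1 s and 0.5 s examples above (the lookup table and tolerance are illustrative assumptions):

```python
# Blink interval (seconds) -> orientation it encodes, per the examples above.
PATTERN_TABLE = {
    0.1: "perpendicular to the ground",
    0.5: "parallel to the ground",
}

def tilt_from_pattern(measured_interval: float, tolerance: float = 0.02):
    """Return the orientation whose advertised interval matches the measured
    blink interval, or None when no known pattern matches."""
    for interval, orientation in PATTERN_TABLE.items():
        if abs(measured_interval - interval) <= tolerance:
            return orientation
    return None

print(tilt_from_pattern(0.11))  # perpendicular to the ground
print(tilt_from_pattern(0.49))  # parallel to the ground
```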
- the graphic object may be determined based on multimedia content associated with the VR.
- the multimedia content related to the VR may include a tennis game, a war game, a car game, and the like.
- When the multimedia content corresponds to a tennis game, the graphic object may correspond to an image including the shape of a tennis racket.
- When the multimedia content corresponds to a war game, the graphic object may correspond to an image including the shape of a knife or a sword.
- When the multimedia content corresponds to a car game, the graphic object may correspond to an image including the shape of a steering wheel.
- The processor 250 may refer to the x-coordinate and/or y-coordinate values in the position information and determine the corresponding point on the display 230.
- The processor 250 may determine the display size at the determined point by referring to the z-coordinate value in the position information. For example, when the z-coordinate value is large (eg, the external electronic device 202 is located far from the user), the point may be drawn small for a perspective effect. As another example, when the z-coordinate value is small (eg, the external electronic device 202 is located close to the user), the point may be drawn large for a perspective effect.
- The processor 250 may determine a three-dimensional display of the graphic object based on the tilt information.
- The processor 250 may determine the tilt direction or angle by referring to gyro sensor data indicating at least one of pitch, roll, or yaw in the tilt information.
- The processor 250 may display the graphic object in three dimensions, inclined according to the determined tilt direction or angle, at the position determined according to the x, y, and z coordinates.
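- A minimal sketch of this mapping (the inverse scaling and base size are illustrative assumptions): x and y select the display point, z scales the object for the perspective effect, and pitch/roll/yaw determine its three-dimensional inclination:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # position information from the camera image
    y: float
    z: float        # depth: larger means farther from the user
    pitch: float    # tilt information from the gyro sensor
    roll: float
    yaw: float

def to_screen(pose: Pose, base_size: float = 100.0) -> dict:
    """Screen-space placement of the graphic object for one frame."""
    size = base_size / max(pose.z, 0.1)   # farther (large z) -> drawn smaller
    return {
        "point": (pose.x, pose.y),                       # where on the display
        "size": size,                                    # perspective effect
        "rotation": (pose.pitch, pose.roll, pose.yaw),   # 3D inclination
    }

print(to_screen(Pose(320, 240, 2.0, 10.0, 0.0, -5.0)))
# {'point': (320, 240), 'size': 50.0, 'rotation': (10.0, 0.0, -5.0)}
```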
- the processor 250 may display the display image 620 through the display 230.
- the display image 620 may include a background image related to the multimedia content or the graphic object 602.
- The processor 250 may display the graphic object 602 on an area corresponding to the location of the external electronic device 202 identified based on the camera image 610.
- the processor 250 may display the graphic object 602 to have the same inclination, based on the information on the inclination of the external electronic device 202 received from the external electronic device 202.
- the image 620 displayed through the display 230 may include images corresponding to the user's field of view.
- the field of view of the camera 240 may be the same as the field of view of the display area of the image 620 displayed through the display 230.
- The processor 250 may identify that the external electronic device 202 is out of the field of view of the camera 240. For example, the processor 250 may identify whether the external electronic device 202 has left, based on the image acquired by the camera 240. The processor 250 may determine that the external electronic device 202 has left the field of view of the camera 240 when no area corresponding to an illuminance above the threshold value is detected in the acquired image. As another example, when an area corresponding to an illuminance above the threshold value is detected in the image acquired by the camera 240, the processor 250 may determine that the external electronic device 202 is located within the field of view of the camera 240. As another example, referring to FIG. 4A:
- a user may grip respective external electronic devices 202-1 and 202-2 with both hands while wearing the electronic device 201.
- Each of the external electronic devices 202-1 and 202-2 may correspond to the external electronic device 202 illustrated in FIG. 2.
- Since the external electronic devices 202-1 and 202-2 are included in the viewing angle 401 of the electronic device 201, the electronic device 201 may acquire an image including the external electronic devices 202-1 and 202-2.
- The processor 250 may detect the external electronic devices 202-1 and 202-2 by detecting areas corresponding to illuminance above the threshold value emitted from them.
- When the user of the electronic device 201 points an arm laterally (for example, as with the arm indicated by the dotted line), the external electronic devices 202-1 and 202-2 deviate from the viewing angle 401 of the electronic device 201, so the electronic device 201 may not acquire an image including them. The electronic device 201 may then fail to detect an area corresponding to illuminance above the threshold value emitted from the external electronic devices 202-1 and 202-2, and may thereby identify that they are out of the field of view of the camera 240.
- The processor 250 may store information about the location of the external electronic device 202 corresponding to the departure time. For example, the processor 250 may identify the location of the external electronic device 202 in response to identifying the departure. The identified position may lie on one of the boundaries of the image acquired by the camera 240. The processor 250 may store the information about the identified location as second location information. The second location information may be obtained from the most recent image that includes the external electronic device 202 among the plurality of images acquired by the camera 240. When the image obtained by the camera 240 is divided into an x axis along the horizontal direction and a y axis along the vertical direction, the second location information may include information on a point identified by only one coordinate value, either an x coordinate value or a y coordinate value.
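- A minimal sketch of this departure handling over synthetic frames (the threshold value and frame format are assumptions): the last above-threshold detection before the LED disappears is retained as the second location information:

```python
THRESHOLD = 200  # assumed 8-bit illuminance threshold

def find_led(frame):
    """Return (x, y) of the brightest above-threshold pixel, else None."""
    best, best_val = None, THRESHOLD
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= best_val:
                best, best_val = (x, y), value
    return best

# Two synthetic 4x4 grayscale frames: the LED is visible at the right-hand
# image boundary in the first frame and has left the view in the second.
frames = [
    [[0, 0, 0, 0], [0, 0, 0, 255], [0, 0, 0, 0], [0, 0, 0, 0]],
    [[0, 0, 0, 0], [0, 0, 0, 0],   [0, 0, 0, 0], [0, 0, 0, 0]],
]

second_location = None
for frame in frames:
    position = find_led(frame)
    if position is not None:
        second_location = position   # still in view: keep updating
    else:
        break                        # departed: keep the last in-view point
print(second_location)               # (3, 1) -- on the image boundary
```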
- the processor 250 may request the external electronic device 202 to transmit information about a change in the location of the external electronic device 202.
- when the external electronic device 202 deviates from the field of view of the camera 240, the camera 240 may not acquire an image including the external electronic device 202, and thus the processor 250 cannot obtain information about the location of the external electronic device 202 from the image. Therefore, when the processor 250 detects the departure of the external electronic device 202, the processor 250 may request information on the change of the position from the external electronic device 202.
- the information about the change of the location may include information obtained by at least one sensor of the external electronic device 202.
- the at least one sensor included in the external electronic device 202 may include a first sensor 292 or a second sensor 294.
- the first sensor 292 may correspond to a gyro sensor
- the second sensor 294 may correspond to an acceleration sensor.
- the first sensor 292 may acquire information about the inclination of the external electronic device 202
- the second sensor 294 may obtain information about a speed change of the external electronic device 202.
- the external electronic device 202 may transmit information obtained from the first sensor 292 or the second sensor 294 to the electronic device 201 in response to a request for information on the change of the location.
- in response to the request, the external electronic device 202, which had been transmitting only the information about the tilt obtained by the first sensor 292 to the electronic device 201, may transmit the information about the tilt obtained by the first sensor 292 together with the information about the speed change obtained by the second sensor 294.
- the transmitted tilt information or speed change information may be used by the processor 250 of the electronic device 201 to predict the location of the external electronic device 202 for times after the departure of the external electronic device 202.
- the information about the change in the location may include data indicating the amount of change in the location of the external electronic device 202.
- the data indicating the position change amount may include vector information.
- Data indicating the position change amount may be obtained by the external electronic device 202.
- the external electronic device 202 may obtain the vector information based on the information obtained by the first sensor 292 or the second sensor 294.
- the external electronic device 202 may transmit the vector information obtained based on the sensor data, instead of transmitting the sensor data acquired by the first sensor 292 or the second sensor 294.
- the processor 295 may predict the moving direction of the external electronic device 202 from the information on the tilt. For example, the processor 295 may predict that the external electronic device 202 moves in a direction corresponding to its rotation direction.
- for example, when the processor 295 acquires data indicating that the external electronic device 202 has rotated in the x+ direction, the processor 295 may predict that the external electronic device 202 has moved in the x+ direction.
- the processor 295 may store information about changes in location corresponding to repetitive operations.
- the processor 295 may compare the data obtained from the first sensor 292 or the second sensor 294 with the stored information. When the acquired data and the stored information match, the processor 295 may determine that the user who holds the external electronic device 202 is performing a repetitive operation. The processor 250 may display the graphic object on the display 230 based on the determined repetitive operation.
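- A minimal sketch of such a repetition check, assuming fixed-length sensor windows and a hand-picked tolerance (neither is specified in this disclosure):

```python
# Hypothetical sketch: compare a newly received window of sensor samples with a
# stored motion pattern; a close match suggests a repetitive operation.
import numpy as np

def is_repetitive(recent: np.ndarray, stored: np.ndarray, tol: float = 0.1) -> bool:
    if recent.shape != stored.shape:
        return False
    return bool(np.mean(np.abs(recent - stored)) < tol)

rng = np.random.default_rng(0)
stored_swing = np.sin(np.linspace(0, 2 * np.pi, 50))   # stored motion pattern
new_window = stored_swing + rng.normal(0, 0.02, 50)    # noisy repeat of the swing
print(is_repetitive(new_window, stored_swing))         # True -> repetitive motion
```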
- the signal requesting to transmit the information about the change in the location may include a control signal instructing to activate the second sensor 294 of the external electronic device 202.
- the external electronic device 202 may activate the second sensor 294 and obtain information on the speed change.
- the signal may include a control signal instructing to activate the first sensor 292 (e.g., a gyro sensor).
- the processor 250 may receive information about a direction and / or an angle at which the external electronic device 202 is inclined from the first sensor 292.
- the processor 250 may render the graphic object corresponding to the external electronic device 202, according to the tilted direction and/or angle, on the position of the external electronic device 202 in the image acquired by the camera 240.
- when the processor 250 becomes unable to identify the external electronic device 202 based on the image acquired by the camera 240 (e.g., upon detecting departure from the camera field of view 403), the processor 250 may transmit, to the external electronic device 202, a control signal instructing to activate the first sensor 292 (e.g., the gyro sensor) and/or the second sensor 294 (e.g., the acceleration sensor).
- the processor 250 may receive information about a speed change of the external electronic device 202 that is outside the camera field of view 403 from the second sensor 294.
- the processor 250 may obtain a location at which the external electronic device 202 is predicted to move based on the information on the speed change.
- the processor 250 may receive information about an inclination direction and / or an inclination angle of the external electronic device 202 from the gyro sensor.
- the processor 250 may render a graphic object corresponding to the external electronic device 202 on a location where the external electronic device 202 is expected to move.
- the processor 250 may display the graphic object based on the inclined direction and / or the inclined angle.
- the processor 250 may use the information on the location of the external electronic device 202 corresponding to the departure time, obtained using the camera 240, together with the information received from the external electronic device 202.
- the display of the graphic object may be changed based at least on the information about the tilt of the external electronic device 202 or the information about the change of the position of the external electronic device 202.
- the processor 250 may predict the location of the external electronic device 202 corresponding to the current time. For example, the processor 250 may generate information about the change in the position of the external electronic device 202 based on the tilt information or the speed change information received from the external electronic device 202. The generated information on the change of position may include vector information. The vector information may include information about the displacement of the external electronic device 202. As another example, the processor 250 may receive the information about the change of position from the external electronic device 202. In that case, the received information may correspond to information obtained by the processor 295 of the external electronic device 202 based on the information on the tilt or the information on the speed change.
- the processor 250 may predict the current location of the external electronic device 202 based on the information on the change of position and/or the information on the location of the external electronic device 202 corresponding to the departure time. For example, the processor 250 may determine the area from which the external electronic device 202 departed as a starting point and, based on the starting point and/or the vector information, identify the change in position according to the movement of the external electronic device 202 after it leaves the camera field of view.
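- For illustration, a dead-reckoning sketch of this prediction (the names and the 2D simplification are assumptions, not the disclosed algorithm):

```python
# Hypothetical sketch: starting from the departure point on the image boundary,
# accumulate the displacement vectors reported by the controller to estimate
# its current, off-camera position.
def predict_position(departure_point, displacement_vectors):
    x, y = departure_point
    for dx, dy in displacement_vectors:
        x += dx
        y += dy
    return (x, y)

start = (1.0, 0.4)                      # boundary point stored at departure
moves = [(0.05, 0.00), (0.07, 0.02)]    # vectors derived from sensor reports
print(predict_position(start, moves))   # approx. (1.12, 0.42), outside the frame
```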
- the processor 250 may display a visual effect and/or a separate graphic object (e.g., an arrow) indicating the direction in which the external electronic device is located, based on the predicted position of the external electronic device 202.
- the starting point and / or the vector information may be changed based on the movement of the electronic device 201.
- the processor 250 may obtain information about the movement of the electronic device 201 after the external electronic device 202 leaves the camera field of view 403.
- Information about the movement of the electronic device 201 may be obtained by a sensor module (eg, the sensor module 176 of FIG. 1) included in the electronic device 201.
- the electronic device 201 rotates to the left (eg, wears a VR device and rotates the head to the left) after the external electronic device 202 leaves the camera field of view 403
- the starting point may be changed based on the movement of the electronic device 201.
- the starting point may reflect the leftward rotation of the electronic device 201 and may move in a leftward direction from an area at which the external electronic device 202 is separated.
- the vector information may be changed based on the movement of the electronic device 201.
- the vector information may further include information for indicating an additional position change in the right direction by reflecting the leftward rotation of the electronic device 201.
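- A hedged sketch of this compensation, assuming a simple linear mapping from head yaw to a horizontal offset in view coordinates (the actual mapping is not specified here):

```python
# Hypothetical sketch: a leftward rotation of the HMD shifts the stored starting
# point (and, analogously, the displacement vectors) rightward in view space.
def compensate_for_head_yaw(point, yaw_left_deg: float, offset_per_deg: float = 0.01):
    x, y = point
    return (x + yaw_left_deg * offset_per_deg, y)

start = (1.0, 0.4)                            # departure point at exit
print(compensate_for_head_yaw(start, 10.0))   # (1.1, 0.4): shifted rightward
```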
- the processor 250 may display a part of the first graphic object.
- the first graphic object may be a graphic object determined based on the multimedia content. For example, when the first graphic object corresponds to a sword, the processor 250 may display a portion of the first graphic object with reference to the length of the sword. That is, when the external electronic device 202 is out of the field of view of the camera 240 but has not moved beyond the length of the sword, the processor 250 may display a part of the first graphic object.
- the processor 250 may estimate the current position from the departure position and the vector information about the change in position of the external electronic device 202, and when the distance between the predicted current position and the camera 240 is smaller than the length of the sword, the processor 250 may display a part of the sword (for example, the blade).
- the processor 250 may omit the display of the first graphic object and display a second graphic object distinguished from the first graphic object.
- the second graphic object may correspond to an object for instructing the user that the external electronic device 202 is located in an external area of the image displayed on the display 230.
- the second graphic object may be an object different from the first graphic object.
- the second graphic object may include an image having the same or similar shape as that of the external electronic device 202.
- the second graphic object may include an arrow-shaped image for indicating that the external electronic device 202 is located outside the field of view of the camera 240.
- the processor 250 may further display a phrase guiding that the external electronic device 202 is located outside the field of view of the camera 240 together with the second graphic object.
- the processor 250 may refrain from displaying the first graphic object when the distance between the predicted position of the external electronic device 202 and the camera 240 exceeds the length of the first graphic object. Since the first graphic object is not displayed, the processor 250 may display a second graphic object indicating the current position of the external electronic device 202.
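- The choice between the first and second graphic objects can be summarized by the following sketch (the function and label names are illustrative only):

```python
# Hypothetical sketch: compare the predicted distance beyond the view boundary
# with the rendered object's length to pick what to display.
def choose_display(predicted_distance: float, object_length: float) -> str:
    if predicted_distance <= 0.0:
        return "full first graphic object"        # device back inside the view
    if predicted_distance < object_length:
        return "partial first graphic object"     # e.g., only the blade visible
    return "second graphic object (e.g., arrow)"  # too far; show an indicator

print(choose_display(0.3, 1.0))   # partial first graphic object
print(choose_display(1.4, 1.0))   # second graphic object (e.g., arrow)
```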
- the external electronic device 202 may be configured to interwork with the electronic device 201 and provide a user input in content related to virtual reality.
- the external electronic device 202 may include a memory 260, a communication module 270, a user input module 280, a sensor module 290, or a processor 295.
- the memory 260 may include the memory 130 shown in FIG. 1, the communication module 270 may include the communication module 190 shown in FIG. 1, the user input module 280 may include the input device 150 shown in FIG. 1, and the sensor module 290 may include the sensor module 176 shown in FIG. 1.
- the user input module 280 may receive a user input for controlling the electronic device 201 linked to the external electronic device 202.
- the user input module 280 may be visually exposed through a portion of the housing (not shown) of the external electronic device 202.
- the user input module 280 may include a touch pad for receiving a user's touch input, and/or a physical button or physical key capable of receiving physical pressure.
- the processor 295 may be operatively (or operably) coupled with at least one of the memory 260, the communication module 270, the user input module 280, or the sensor module 290.
- the sensor module 290 may include a first sensor 292 or a second sensor 294.
- the first sensor 292 may correspond to a gyro sensor
- the second sensor 294 may correspond to an acceleration sensor.
- the processor 295 may transmit information about the inclination of the external electronic device 202 to the electronic device 201.
- the processor 295 may obtain information about the inclination via at least one sensor.
- the at least one sensor may include a gyro sensor.
- the inclination information may include information for indicating a three-dimensional angle change of the external electronic device 202.
- the information on the inclination may include data about a change amount of a pitch, a roll, and a yaw.
- the processor 295 may transmit information about a change in the position of the external electronic device 202 to the electronic device 201.
- the information about the change in the position may include information about the inclination or information about the speed change obtained by the first sensor 292 or the second sensor 294 of the external electronic device 202. have.
- the information on the change of the position may include information indicating the amount of change of position obtained based on the information on the tilt or the information on the speed change.
- the information indicating the position change amount may correspond to vector information.
- the processor 295 may adaptively transmit information about the change of the location to the electronic device 201.
- the processor 295 may transmit information about the change of the position to the electronic device 201 at predefined time intervals.
- the predefined time interval may be changed based at least on the remaining amount of battery of the external electronic device 202 and the quality of wireless communication between the external electronic device 202 and the electronic device 201.
- the predefined time interval may be reduced to reflect the movement of the external electronic device 202 in real time.
- the predefined time interval may be increased to increase the driving time of the external electronic device 202 and reduce power consumption.
- the processor 295 may reduce the predefined time interval to ensure the reliability of data transmission.
- the processor 295 may increase the predefined time interval if the quality of the wireless communication is good.
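- A minimal sketch of such an adaptive interval, with an assumed baseline, thresholds, and scaling (none of which are specified in this disclosure):

```python
# Hypothetical sketch: lengthen the reporting interval when the battery is low,
# shorten it when the wireless link quality is poor.
def report_interval_ms(battery_pct: float, link_quality: float) -> int:
    interval = 20                 # assumed baseline: one report every 20 ms
    if battery_pct < 20.0:
        interval *= 4             # save power on a low battery
    if link_quality < 0.5:        # quality normalized to [0, 1]
        interval //= 2            # report more often over an unreliable link
    return max(interval, 5)

print(report_interval_ms(80.0, 0.9))   # 20
print(report_interval_ms(15.0, 0.9))   # 80
print(report_interval_ms(80.0, 0.3))   # 10
```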
- the processor 295 may transmit information about the change of location along with different information transmitted to the electronic device 201.
- the processor 295 may receive a user's physical key input and transmit information about the change in location along with the received input.
- the external electronic device 202 may further include a light emitting unit.
- the external electronic device 202 may activate the light emitting unit in response to detecting an on state of the external electronic device 202.
- the external electronic device 202 may generate light by receiving a signal instructing to emit light from the electronic device 201.
- the electronic device 201 may be configured to acquire information about the position of the external electronic device 202 by capturing light emitted from the light emitting unit through the camera 240.
- the light emitting unit may include a light emitting diode (LED) light source.
- FIG. 3 illustrates an example of a functional configuration of an electronic device, a first external electronic device, and a second external electronic device according to various embodiments.
- the functional configuration of the second external electronic device 303 illustrated in FIG. 3 may be included in the functional configuration of the external electronic device 202 illustrated in FIG. 2.
- the functional configuration of the electronic device 301 illustrated in FIG. 3 may be included in the functional configuration of the electronic device 201 illustrated in FIG. 2.
- the first external electronic device 302 may include a memory 321, a communication module 322, a display 323, a camera 324, or a processor 325.
- the memory 321 may include the memory 210 shown in FIG. 2, the communication module 322 may include the communication module 220 shown in FIG. 2, the display 323 may include the display 230 shown in FIG. 2, the camera 324 may include the camera 240 shown in FIG. 2, and the processor 325 may include the processor 250 shown in FIG. 2.
- the processor 325 may be operatively coupled with at least one of the memory 321, the communication module 322, the display 323, and the camera 324.
- the processor 325 may obtain information about the location of the second external electronic device 303. For example, when the second external electronic device 303 is located inside the field of view of the camera 324, the processor 325 may acquire an image including the second external electronic device 303 via the camera 324. According to one embodiment, the processor 325 may further include a location tracking module. The location tracking module may acquire information about the location of the second external electronic device 303 in the obtained image. The location tracking module may correspond to the location tracking module 252 of the processor 250 illustrated in FIG. 2. In other embodiments, the information about the location of the second external electronic device 303 may be obtained by the electronic device 301.
- the processor 325 may acquire an image including the second external electronic device 303 through the camera 324, and transmit the obtained image to the electronic device 301.
- the electronic device 301 may obtain information about the position of the second external electronic device 303 within the image, based on the image including the second external electronic device 303 received from the first external electronic device 302.
- the processor 325 may transmit information about a location corresponding to the departure time of the second external electronic device 303 to the electronic device 301. For example, the processor 325 may determine whether the second external electronic device 303 is out of the field of view of the camera 324 based on the image acquired by the camera 324. When detecting an area of illuminance above the threshold in the acquired image, the processor 325 may determine that the second external electronic device 303 is located in the detected area. In another example, when no area of illuminance above the threshold value is detected in the acquired image, the processor 325 may determine that the second external electronic device 303 has left the field of view of the camera 324.
- the processor 325 may temporarily store, in a buffer or the like, images from the time immediately before the area of illuminance above the threshold value ceases to be detected. Therefore, when the departure is identified, the processor 325 may identify an image including the second external electronic device 303 among the plurality of images stored in the buffer. The processor 325 may acquire location information on the departure time of the second external electronic device 303 from the identified image using the location tracking module.
- the processor 325 may transmit a signal indicating the departure to the electronic device 301. For example, in response to identifying the departure, the processor 325 may transmit a signal indicating the departure to the electronic device 301. Since the electronic device 301 does not have a camera, the departure of the second external electronic device 303 cannot be identified. Therefore, the processor 325 may transmit a signal indicating the departure to the electronic device 301. In various embodiments, the processor 325 may transmit information about the location of the second external electronic device 303 corresponding to the departure time, together with the signal to the electronic device 301.
- the processor 325 may display an image related to the multimedia content or a graphic object corresponding to the second external electronic device 303 through the display 323.
- the processor 325 may receive data for the display from the electronic device 301.
- the processor 325 may receive location information that the second external electronic device 303 is expected to be located from the electronic device 301, and display the graphic object based on the predicted location information.
- the predicted location information may be obtained by the electronic device 301.
- the predicted location information may be obtained based on information about a change in position transmitted from the second external electronic device 303 to the electronic device 301, or based on information about the location of the second external electronic device 303 at the departure time transmitted from the first external electronic device 302 to the electronic device 301.
- the electronic device 301 may generate a graphic object to be output to the display 323 of the first external electronic device 302.
- the graphic object may include multimedia content related to the VR (eg, operation of VR) and / or a graphic object corresponding to the external electronic device 303.
- the processor 313 of the electronic device 301 may receive information about the location of the second external electronic device 303 obtained from the first external electronic device 302 and/or information about the location obtained from the second external electronic device 303 itself, and may generate a graphic object based on the received location information of the second external electronic device 303.
- the graphic object may be delivered to the first external electronic device 302 and output on the display 323.
- FIG. 4A illustrates an example in which an external electronic device is detached from a camera field of view of an electronic device according to various embodiments.
- the electronic device 201 illustrated in FIG. 4A may correspond to the electronic device 201 illustrated in FIG. 2.
- the external electronic devices 202-1 and 202-2 shown in FIG. 4A may correspond to the external electronic devices 202 shown in FIG. 2.
- the electronic device 201 may correspond to a state of being worn on a user's body.
- the electronic device 201 may include a display 230 on one surface of a housing (not shown).
- the display 230 may be disposed to face the user's eye in the wearing state of the electronic device 201.
- the electronic device 201 may include a camera 240 on a surface opposite to one surface of the housing.
- the camera 240 may be disposed on that surface of the housing so as to obtain an image of the external environment in a direction corresponding to the user's gaze.
- the external electronic device 202 may correspond to a state held by a user's hand.
- the external electronic device 202 may include two devices 202-1 and 202-2 (eg, a controller) so as to be gripped by the user's left and right hands, respectively.
- the camera 240 included in the electronic device 201 may acquire an image of a field of view corresponding to a predetermined angle 401 (eg, the viewing angle of the camera 240).
- the camera 240 may obtain an image of the external electronic device 202 when the external electronic device 202 is positioned within the angle 401 (e.g., when the user extends an arm forward).
- the image of the external electronic device 202 may include an image of the exterior or the housing of the external electronic device 202.
- the image of the external electronic device 202 may include an LED, a pattern, or a marker.
- the external electronic device 202 may further include an LED, separate from its housing, in order to be easily recognized by the processor 250. Instead of recognizing the external electronic device 202 by processing an image of its housing, the processor 250 may identify the external electronic device 202 within the acquired image by detecting the LED emitting brightness above a threshold illuminance value.
- the external electronic device 202 may include at least one marker. The at least one marker may include markers of a distinctive pattern, respectively. The processor 250 may identify the at least one marker in the image acquired by the camera 240, and identify the external electronic device 202 in the acquired image.
- when the external electronic device 202 deviates from the angle 401 (for example, when the user spreads the arms to both sides), the external electronic device 202 departs from the field of view of the camera 240, and the camera 240 may not acquire an image 413 of the external electronic device 202.
- the size of the area displayed through the display 230 and the area of the image acquired through the camera 240 may be different.
- the field of view of the camera 240 may include an area larger than the field of view of the user (eg, an area displayed through the display 230).
- the region 410 indicated by the solid line may correspond to the region displayed through the display 230.
- the area 412 indicated by the dotted line indicates the size of the area that the camera 240 can obtain.
- the processor 250 may acquire an image including the external electronic device 202 when it is located inside the field of view of the camera 240.
- the processor 250 may obtain information about the location of the external electronic device 202 based on the image including the external electronic device 202. Therefore, the processor 250 may display the graphic object 411 corresponding to the external electronic device 202 based on the obtained location information.
- the field of view of the camera 240 may be the same as that of the user.
- the size of the area 410 indicated by the solid line may be the same as or similar to the size of the area 414 indicated by the dotted line. Therefore, when the external electronic device 202 is out of the field of view of the user, the processor 250 may not acquire an image of the external electronic device 202 because it is out of the field of view of the camera 240.
- when the external electronic device 202 is located inside the field of view of the camera 240, the processor 250 may obtain an image of the external electronic device 202. Therefore, the processor 250 may obtain information about the location of the external electronic device 202 based on the image, and display the graphic object 411 corresponding to the external electronic device 202.
- the field of view of the camera 240 may include an area smaller than the field of view of the user.
- the size of the area 416 indicated by the dotted line may be smaller than the area 410 indicated by the solid line. Therefore, when the external electronic device 202 is located outside the field of view of the camera 240, the processor 250 cannot acquire an image including the external electronic device 202 even though the external electronic device 202 is located inside the field of view of the user.
- the processor 250 may request the external electronic device 202 to transmit information about a change in the position of the external electronic device 202.
- the processor 250 may acquire the location of the external electronic device 202 based on the information about the change of position, and display the graphic object 411 corresponding to the external electronic device 202 on the obtained location.
- FIG. 5A illustrates an operation of an electronic device according to various embodiments. Such an operation may be performed by the electronic device 201 shown in FIG. 2 or by the processor 250 included in the electronic device 201.
- the processor 250 may identify an external electronic device among one or more external objects included in a designated field of view using a camera.
- the processor 250 may acquire an image of the surrounding environment of the electronic device 201 through the camera.
- the image may include a plurality of objects located around the electronic device 201.
- the processor 250 may identify the external electronic device 202 of the plurality of objects.
- the external electronic device 202 may further include a light emitting unit.
- the processor 250 may identify the external electronic device 202 by identifying a light source exceeding a predetermined illuminance (lux) in the image acquired through the camera.
- the external electronic device 202 may adjust the brightness of the LED based on the ambient brightness.
- the external electronic device 202 may further include an illuminance sensor (not shown), and may obtain a value for the illuminance (lux) of the ambient brightness of the external electronic device 202 by using the illuminance sensor (not shown).
- the external electronic device 202 may adjust the brightness of the light source based on the value of the illuminance of the ambient brightness. For example, the external electronic device 202 may output light of a first brightness corresponding to a first section in response to identifying that the value of the illuminance of the ambient brightness falls within the first section, and may output light of a second brightness corresponding to a second section in response to identifying that the value falls within the second section.
- the first brightness may correspond to an illuminance value required to be separately identified from the ambient brightness of the first section.
- the second brightness may correspond to an illuminance value required to be separately identified from the ambient brightness of the second section.
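- The section-based brightness control can be sketched as follows; the section boundaries and duty values are assumptions for illustration:

```python
# Hypothetical sketch: map the measured ambient illuminance (lux) to the LED
# brightness needed to stand out against that section's background.
def led_brightness(ambient_lux: float) -> float:
    sections = [
        (100.0, 0.3),           # first section (dim)     -> first brightness
        (1000.0, 0.6),          # second section (bright) -> second brightness
        (float("inf"), 1.0),    # very bright             -> maximum brightness
    ]
    for upper_bound, duty in sections:
        if ambient_lux < upper_bound:
            return duty
    return 1.0

print(led_brightness(50.0))    # 0.3
print(led_brightness(400.0))   # 0.6
```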
- the processor 250 may display a graphic object corresponding to the external electronic device on the display, based on the first location information of the external electronic device 202.
- the processor 250 may display the graphic object on the identified location of the external electronic device 202.
- the graphic object may be determined based on multimedia content.
- the graphic object may include images of various shapes, such as a knife, a tennis racket, or an automobile wheel.
- after the external electronic device 202 moves out of the designated field of view, the processor 250 may display the graphic object on the display based on the second position information checked through the camera at the departure and/or on information related to the movement of the external electronic device received from the external electronic device through the communication circuit.
- the processor 250 may store second location information when the external electronic device 202 moves out of the designated field of view. The second location information may correspond to a point on a boundary of the image displayed through the display 230.
- the processor 250 may receive information related to the movement of the external electronic device 202 from the external electronic device 202 from a time point outside the designated view.
- the information related to the movement may include data obtained by a gyro sensor and / or data obtained by an acceleration sensor.
- the processor 250 may generate a vector value for the movement of the external electronic device based on the information related to the movement.
- the vector value may be obtained by using the region where the external electronic device 202 is separated as a starting point and determining the size and / or direction of the vector according to the information related to the movement.
- FIG. 5B illustrates an operation of an electronic device according to various embodiments. Such an operation may be performed by the electronic device 201 shown in FIG. 2 or by the processor 250 included in the electronic device 201.
- the processor 250 may obtain information about a location of the external electronic device 202 linked with the electronic device 201.
- the information about the location of the external electronic device 202 may refer to information about a coordinate value in which the external electronic device 202 is identified in the image acquired by the camera 240.
- the processor 250 may acquire an image via the camera 240.
- the processor 250 may obtain an x coordinate value and / or a y coordinate value based on the image.
- the processor 250 may identify the coordinate value corresponding to the area where the external electronic device 202 is located in the image by analyzing the image.
- the processor 250 may obtain depth information based on analysis of the acquired image.
- the processor 250 may obtain a z coordinate value based on the obtained depth information and / or the obtained x coordinate value and y coordinate value.
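- As an illustration of combining the image coordinates with depth information (the brightest-pixel shortcut and the depth map are assumptions standing in for the disclosure's image analysis):

```python
# Hypothetical sketch: take the (x, y) pixel position of the detected light
# source and read its depth to form an (x, y, z) coordinate.
import numpy as np

def locate_controller(gray_frame: np.ndarray, depth_map: np.ndarray):
    y, x = np.unravel_index(np.argmax(gray_frame), gray_frame.shape)
    z = float(depth_map[y, x])
    return (int(x), int(y), z)

frame = np.zeros((480, 640))
frame[120, 500] = 255.0                  # detected LED position
depth = np.full((480, 640), 1.5)         # assumed depth values in meters
print(locate_controller(frame, depth))   # (500, 120, 1.5)
```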
- the processor 250 may receive information about the tilt of the external electronic device 202.
- the inclination information may include sensor data acquired by the first sensor 292 included in the external electronic device 202.
- the information on the inclination may include information about a change amount of at least one of pitch, roll, or yaw.
- the information about the tilt may be set to be automatically transmitted from the external electronic device 202 to the electronic device 201 in response to an on state or an activation state of the external electronic device 202.
- the information about the tilt may be set to be transmitted to the electronic device 201 in response to a request of the electronic device 201.
- the processor 250 may display a graphic object corresponding to the external electronic device 202 together with the multimedia content.
- the processor 250 may display the graphic object using the obtained information about the tilt of the external electronic device 202 and/or the information about the position of the external electronic device 202 obtained by the camera 240.
- for example, the processor 250 may obtain the x, y, and z coordinate values from the camera 240 and receive the information about the tilt from the external electronic device 202, thereby determining the shape (pose) of the external electronic device 202 in the three-dimensional space.
- the processor 250 may display the graphic object so as to correspond to the determined shape of the external electronic device 202.
- the graphic object may be displayed to have the same inclination on the same coordinate as the external electronic device 202.
- the graphic object may be determined based on multimedia content.
- the graphic object may include a shape of a tennis racket.
- the graphic object may include a sword or a sword.
- the processor 250 may identify that the external electronic device 202 is out of view of the camera.
- the processor 250 may determine, based on the image acquired by the camera 240, that the external electronic device 202 is not present in the image. For example, when the image acquired by the camera 240 does not include light emitted from a light emitting unit (not shown) of the external electronic device 202, the departure of the external electronic device 202 may be identified. For another example, if no area brighter than the predetermined illuminance is detected in the acquired image, the processor 250 may determine that the external electronic device 202 has departed.
- the processor 250 may store information about the location of the external electronic device 202 with respect to the departure time. For example, in response to detecting the departure, the processor 250 may store information regarding a location where the external electronic device 202 leaves.
- the processor 250 may request to transmit information about a change in the position of the external electronic device 202.
- the processor 250 may not acquire information about the location of the external electronic device 202.
- the processor 250 may transmit a signal requesting the external electronic device 202 to transmit information about a change in the position of the external electronic device 202.
- the request signal may include a control signal instructing to activate the second sensor 294 of the external electronic device 202.
- the external electronic device 202 may activate only the first sensor 292 and transmit the information about the tilt obtained from the first sensor 292 to the electronic device 201.
- when the external electronic device 202 is out of the field of view of the camera 240, the external electronic device 202 may receive the control signal from the electronic device 201 and activate the second sensor 294.
- the external electronic device 202 may transmit, to the electronic device 201, the information about the tilt obtained by the first sensor 292 and/or the information about the speed change of the external electronic device 202 obtained by the second sensor 294.
- the external electronic device 202 may transmit the information about the speed change obtained by the second sensor 294 to the electronic device 201 regardless of whether a control signal instructing activation of the second sensor 294 is received from the electronic device 201.
- the processor 295 of the external electronic device 202 may check (e.g., detect) information related to the driving of the external electronic device 202 (e.g., power-on) and, in response to the check, activate the second sensor 294.
- the external electronic device 202 may further include a grip sensor (not shown), and the grip sensor (not shown) may detect that the external electronic device 202 is held by the user.
- the processor 295 may receive a sensor value from the grip sensor (not shown), and activate the second sensor 294 in response to the reception.
- when the second sensor 294 is activated in response to a power-on of the external electronic device 202 or to receiving a sensor value from the grip sensor (not shown), the information about the tilt of the external electronic device 202 obtained by the first sensor 292 and/or the information about the speed change of the external electronic device 202 obtained by the second sensor 294 may be transmitted together to the electronic device 201.
- the processor 250 may obtain a vector value for a change in the position of the external electronic device 202.
- the processor 250 may receive the information about the change of position from the external electronic device 202, and the received information may include data obtained by the first sensor 292 and/or data obtained by the second sensor 294.
- the processor 250 may obtain a vector value based on the data on the tilt of the external electronic device 202 obtained by the first sensor 292 and/or the data on the speed change of the external electronic device 202 obtained by the second sensor 294.
- the vector value may include information indicating how far and in which direction the external electronic device 202 has moved from the position information corresponding to the time of its departure from the field of view of the camera 240.
- the processor 250 may determine the information about the change in the position of the external electronic device 202 based on the information about the change in the position of the electronic device 201.
- the electronic device 201 may include a sensor module (not shown).
- the sensor module (not shown) may include an acceleration sensor and / or a gyro sensor.
- the gyro sensor may obtain information about the inclination of the electronic device 201
- the acceleration sensor may obtain information about a speed change of the electronic device 201.
- when the electronic device 201 does not receive, for a predetermined time, information about the tilt of the external electronic device 202 obtained by the first sensor 292 or information about the speed change obtained by the second sensor 294 from the external electronic device 202, the electronic device 201 may obtain information about the change of its own position.
- when the processor 250 receives information indicating null from the external electronic device 202, or does not receive the information for the predetermined time, the processor 250 may determine that the external electronic device 202 has not moved and that it deviated from the field of view of the camera 240 due to the movement of the electronic device 201. Accordingly, the electronic device 201 may obtain information about the change of its own position and generate information about the change of the position of the external electronic device 202 based on it. For example, when the electronic device 201 moves the field of view of the camera 240 upward, it may be determined that the external electronic device 202 has moved in the opposite direction (for example, the lower left direction). This is because a change in the position of the external electronic device 202 relative to the user's field of view (e.g., the display 230) appears in the direction opposite to the moving direction of the electronic device 201.
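- A minimal sketch of this opposite-direction inference (the 2D view-space deltas are an assumed simplification):

```python
# Hypothetical sketch: the controller's apparent motion in the user's view is
# its own motion minus the HMD's motion; with a null report from the controller,
# the apparent motion is simply the opposite of the HMD's movement.
def apparent_controller_motion(controller_delta, hmd_delta):
    (cx, cy), (hx, hy) = controller_delta, hmd_delta
    return (cx - hx, cy - hy)

# Controller still (null report); HMD view moved up and to the right.
print(apparent_controller_motion((0.0, 0.0), (0.2, 0.1)))   # (-0.2, -0.1): lower left
```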
- the processor 250 may change the display of the graphic object.
- the processor 250 may determine a location where the external electronic device 202 is expected to move based on the received information about the change in the location of the external electronic device 202.
- the processor 250 may change the display of the graphic object to correspond to the predicted position.
- the processor 250 may change a display of the graphic object to display a part of the graphic object and not display the other part. For example, referring to the image 720 of FIG. 7, only a part of the graphic object including the shape of a knife may be displayed.
- the processor 250 may not display the graphic object. For example, referring to the image 740 of FIG. 7, the processor 250 may display only the image related to the multimedia content, without the graphic object including the shape of the knife.
- the processor 250 may stop displaying the graphic object when the position of the external electronic device 202, predicted based on the information on the tilt and/or the information on the speed change, deviates beyond a predefined distance. For example, when the graphic object has a long shape, the processor 250 may set the predefined distance to a large value. As another example, when the graphic object has a short shape, the processor 250 may set the predefined distance to a smaller value.
- FIG. 6 illustrates an example of an external electronic device and a graphic object displayed to correspond to the external electronic device according to various embodiments of the present disclosure.
- the processor 250 may acquire a camera image 610 using the camera 240.
- the camera image 610 may include an external electronic device 202 and a user's hand for holding the external electronic device 202 and an image of an external environment in which the user is located.
- the processor 250 may display the display image 620 via the display 230.
- the display image 620 may include a graphic object 602 corresponding to the multimedia content and / or the external electronic device 202.
- the graphic object 602 may include an image corresponding to the shape of a sword or a sword.
- the processor 250 may superimpose and display the graphic object 602 and the multimedia content.
- the processor 250 may render or display a background image associated with the multimedia content.
- the background image may correspond to an image related to a tennis court or a tennis stadium.
- the processor 250 may display the graphic object by displaying a tennis racket on a position corresponding to the external electronic device 202, overlapping the background image associated with the tennis court or the tennis stadium.
- the electronic device 201 may provide an immersive service related to VR by displaying the multimedia content and / or the graphic object 602 on the display 230 so as to overlap the multimedia content.
- the images 710 to 760 displayed through the display 230 may include images corresponding to the user's field of view.
- the field of view of the camera 240 may be the same as the field of view of the display area of the images 710-760 displayed through the display.
- the field of view of the camera 240 is not limited thereto, and may be smaller or larger than the field of view of the user, as shown in FIG. 4B.
- the processor 250 may display an image 710.
- the processor 250 may obtain information about the location of the external electronic device 202 based on the image acquired by the camera 240.
- the processor 250 may display the image 710 based on the information about the inclination of the external electronic device 202 and / or the location information received from the external electronic device 202.
- the processor 250 may identify the position of the external electronic device 202 located inside the field of view of the camera 240, and display a graphic object (e.g., a sword) on the identified position based on the tilt information.
- the graphical object may be displayed as an image that includes the shape of a knife.
- the image including the shape of the knife may be displayed based on the information on the position of the external electronic device identified through the camera 240 and the tilt information received from the external electronic device 202; as shown in the image 710, it may be displayed tilted in the same manner as the external electronic device 202 in the three-dimensional space.
- the processor 250 may identify the external electronic device 202 within the image acquired by the camera 240 and display the handle of the knife on a location corresponding to the identified external electronic device 202.
- the processor 250 may receive the information about the tilt of the external electronic device 202 from the external electronic device 202 and, based on the received tilt information, indicate the tilt of the handle of the knife and the tilt of the blade.
- for example, the processor 250 may identify the area brightly indicated by the LEDs in the image acquired by the camera 240 and, based on the information about the tilt of the external electronic device 202, display the handle portion and the blade portion of the knife tilted 45 degrees to the left, with the handle portion displayed on the identified area.
- the processor 250 may display an image 720.
- the electronic device 201 may store location information of the time when the external electronic device 202 is separated.
- the processor 250 may detect the deviation when it fails to detect brightness above the threshold illuminance value in the image acquired by the camera 240.
- based on image processing of the image acquired by the camera 240, the processor 250 may attempt to identify, within the acquired image, an image of the exterior of the external electronic device 202 or of a marker attached to the external electronic device 202. When no such image is identified,
- the processor 250 may determine that the external electronic device 202 is out of the field of view of the camera 240.
- the processor 250 may request the external electronic device 202 to transmit information about a change in the position of the external electronic device 202. For example, referring to image 720, the user's hand may be out of view of the camera. According to an embodiment, the processor 250 may display a part of the graphic object. For example, the processor 250 may display only a part of the graphic object when the external electronic device 202 does not move away by a predefined distance after the departure time. Referring to the image 720, the external electronic device 202 may not move a distance exceeding the length of the knife after being separated from the field of view of the camera 240. Therefore, the processor 250 may not display the handle portion of the knife corresponding to the external electronic device 202 and may display a portion of the graphic object corresponding to the blade portion of the knife.
- the processor 250 may display an image 730.
- the processor 250 may receive information about a change in location from the external electronic device 202.
- the processor 250 may predict that the external electronic device 202 has moved a predetermined distance in the upper right direction based on the received information on the speed change and/or the tilt information of the external electronic device 202.
- the processor 250 may change the display of the inclination angle of the graphic object by reflecting the inclination information, as shown in the image 730.
- the processor 250 may not display the handle portion of the knife corresponding to the external electronic device 202, may display the graphic object corresponding to the blade portion of the knife, and may continue to receive information about the change of position.
- the processor 250 may receive information about the tilt indicating that the external electronic device 202 is further tilted to the left after leaving the field of view of the camera 240. Although the external electronic device 202 is not included in the image acquired by the camera 240, the processor 250 may change the angle of the displayed blade portion based on the information on the tilt. The processor 250 may display the graphic object corresponding to the blade so that it is tilted to the left based on the received tilt information.
- the processor 250 may display an image 740.
- the processor 250 may receive information about a change in the position of the external electronic device 202 and may not display a graphic object corresponding to the external electronic device 202 on the display 230.
- the processor 250 may obtain the location to which the external electronic device 202 is predicted to have moved, based on the data on the speed change and/or the data on the tilt of the external electronic device 202.
- the processor 250 may not display a graphic object including the shape of the knife.
- the predetermined distance may be, for example, a distance equal to a length of a graphic object including the shape of the knife.
- processor 250 may display image 750 and / or image 760.
- the processor 250 may detect that the external electronic device 202 has moved into the field of view of the camera 240. Therefore, the processor 250 may obtain information about the location of the external electronic device 202 through the camera 240 and display the graphic object.
- the processor 250 may display an image 750 corresponding to a point in time when the external electronic device 202 enters the field of view of the camera 240.
- the processor 250 may display the graphic object at the time of re-entry, based on the information about the tilt of the external electronic device 202. For example, the processor 250 may identify the entry position of the external electronic device 202 based on the image obtained at the entry point.
- the processor 250 may determine the inclination degree of the graphic object to be displayed on the entry position based on the information about the inclination of the external electronic device 202 among the information about the change in the received position.
- the processor 250 may display the graphic object based on the determined degree of inclination and / or the entry position.
- FIG. 8 illustrates an example for comparing effects of correction based on information on a change in position of an external electronic device according to various embodiments of the present disclosure.
- the images 810 and 820 displayed through the display 230 may include images corresponding to the user's field of view.
- the field of view of the camera 240 may be the same as the field of view of the display area of the images 810 and 820 displayed through the display 230.
- the field of view of the camera 240 is not limited thereto, and may be smaller or larger than the field of view of the user, as shown in FIG. 4B.
- the image 810 may correspond to an image of which the display of the graphic object is not corrected by using the information about the change of the position.
- the processor 250 may store information corresponding to the departure time of the external electronic device 202.
- the processor 250 may store a location (hereinafter, referred to as first location information) 815 and / or a tilt (first sensor information) of the external electronic device with respect to the departure time. Thereafter, when the external electronic device 202 moves inside the field of view of the camera 240, the processor 250 may detect the external electronic device 202.
- the processor 250 may display the graphic object 813 regarding the entry time of the external electronic device 202.
- the processor 250 may not receive the information about the tilt from the external electronic device 202 after the departure time.
- the processor 250 may display the graphic object 813 according to the location (hereinafter, referred to as second location information) 817 of the external electronic device 202 at the entry time. Since the processor 250 has not yet received the information about the tilt of the external electronic device 202 at the entry time, the processor 250 may display the graphic object 813 based on the first sensor information and/or the second location information. Accordingly, while the actual external electronic device 202 is tilted toward the upper right, the processor 250 may display the graphic object 813 with the tilt corresponding to the departure time of the external electronic device 202.
- the image 820 may correspond to an image in which the display of the graphic object is corrected by using information about the change of the position.
- the processor 250 may receive information about a change in the position of the external electronic device 202 from the external electronic device 202 during the time range from the departure time to the entry time.
- the processor 250 may identify the information about the tilt of the external electronic device 202 corresponding to the entry time, based on the received information about the change of position. Therefore, the processor 250 may display the graphic object 823 based on the second position information 827 corresponding to the entry point of the external electronic device 202 and/or the information about the tilt of the external electronic device 202 corresponding to the entry time.
- the graphic objects 811 and 821 corresponding to the departure time of the external electronic device 202 may be displayed tilted toward the upper left at the departure time. Subsequently, referring to FIG. 8B, the graphic object 823 displayed at the entry point of the external electronic device 202 does not face the upper left but may be displayed tilted toward the upper right, matching the tilt of the external electronic device 202.
- the processor 250 may receive information about the change of position from the external electronic device 202 for the time from when the external electronic device 202 leaves the field of view of the camera 240 until it re-enters, and may thereby display the graphic object naturally at the time of re-entry.
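- The correction compared in FIG. 8 can be sketched as accumulating the tilt changes reported while the device is off-camera (angles in degrees; the data are illustrative):

```python
# Hypothetical sketch: apply the accumulated off-camera tilt changes so the
# object is drawn with the current tilt at re-entry, not the stale departure tilt.
def tilt_at_reentry(departure_tilt_deg: float, off_camera_deltas) -> float:
    return departure_tilt_deg + sum(off_camera_deltas)

departure_tilt = -45.0        # tilted toward the upper left at departure
deltas = [30.0, 40.0, 20.0]   # tilt changes reported while out of view
print(tilt_at_reentry(departure_tilt, deltas))   # 45.0: now toward the upper right
```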
- FIG. 9A illustrates an example of an operation of an electronic device according to various embodiments. Such an operation may be performed by the electronic device 201 shown in FIG. 2.
- the processor 250 may identify the external electronic device 202 using the camera 240.
- the processor 250 may acquire an image of an external environment through the camera 240.
- the image for the external environment may include a plurality of external objects.
- the processor 250 may identify the external electronic device 202 among the plurality of external objects.
- the external electronic device 202 may further include a light emitting unit for generating light outside the housing.
- the processor 250 may identify the external electronic device 202 by identifying an area brighter than a predetermined illuminance in the image acquired through the camera.
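Since the light emitting unit makes the controller the brightest region in the frame, the identification step can be sketched as a brightness threshold plus a blob centroid, for example with OpenCV. The threshold value and the largest-blob heuristic are illustrative assumptions.

```python
import cv2

def locate_controller_led(frame_bgr, brightness_threshold=240):
    """Find the brightest blob in a camera frame, assumed here to be the
    controller's light emitting unit."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, brightness_threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # controller not visible in this frame
    blob = max(contours, key=cv2.contourArea)  # largest bright region
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # blob centroid (x, y)
```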
- the processor 250 may receive, from the external electronic device 202, the one or more pieces of location information obtained during the designated time range before the external electronic device 202 enters the low power mode.
- the low power mode may correspond to a mode for reducing power consumption of the external electronic device 202.
- the external electronic device 202 may determine to enter the low power mode and control the amount of power provided to the plurality of components. For example, to reduce power consumption, the external electronic device 202 may lengthen the operation cycle of the sensor module 290 and reduce the number of sensing operations of the sensor module 290.
- the low power mode may be referred to by various terms, including a sleep mode, an inactive state, and a deactivated state.
- the external electronic device 202 may perform a low power mode in which the data acquisition operation by the sensor module 290 is not performed during the specified time range.
- when the external electronic device 202 acquires at least one of the information about the tilt obtained by the first sensor 292 or the information about the speed change obtained by the second sensor 294, it may not enter the low power mode.
- the external electronic device 202 may monitor whether the at least one piece of information is acquired during the specified time range starting from the time corresponding to the acquisition of the at least one piece of information.
- the external electronic device 202 may obtain one or more location information during the specified time range.
- the sensor module 290 may remain activated during the designated time range, that is, from the last time a user input is received until the low power mode is entered.
- the sensor module 290 may acquire information about the tilt of the external electronic device 202 corresponding to the time range from when the user input is received to when the low power mode is entered.
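One way to read these operations is as a duty-cycled sensor loop that keeps a rolling buffer of poses and flushes it to the headset when the idle timeout expires. The sketch below is illustrative only; the class name, rates, timeout, and the sensor/radio interfaces are all assumptions, not the disclosed implementation.

```python
import time
from collections import deque

class ControllerSensorLoop:
    """Sample while the user is active, keep the recent poses, and flush
    them to the HMD just before entering low power mode."""

    ACTIVE_PERIOD_S = 0.01  # 100 Hz sampling while active (assumed)
    IDLE_TIMEOUT_S = 5.0    # quiet time before low power mode (assumed)

    def __init__(self, sensor, radio):
        self.sensor = sensor              # assumed: read_pose() -> (position, tilt)
        self.radio = radio                # assumed: send(payload)
        self.history = deque(maxlen=500)  # poses from the designated time range
        self.last_input_t = time.monotonic()

    def on_user_input(self):
        self.last_input_t = time.monotonic()  # any user input resets the idle timer

    def step(self):
        now = time.monotonic()
        self.history.append((now, self.sensor.read_pose()))
        if now - self.last_input_t > self.IDLE_TIMEOUT_S:
            # Flush the buffered poses to the HMD, then stop sampling:
            # these are the one or more pieces of location information
            # obtained during the designated time range.
            self.radio.send(list(self.history))
            return "low_power"
        time.sleep(self.ACTIVE_PERIOD_S)
        return "active"
```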
- the processor 250 may receive a request for displaying a graphic object associated with the external electronic device 202.
- the display request may be based on a change in an operation mode of the external electronic device 202.
- the processor 250 may detect that the operation mode is changed from the inactive mode to the active mode, and in response to the detection, may receive a signal requesting display of the graphic object.
- the request signal may correspond to a signal received from another electronic device (e.g., the second external electronic device 202).
- the request signal may include a control signal received from elements of the external electronic device 202.
- the display request of the graphic object may be generated when it is detected that the data value obtained by the first sensor 292 of the external electronic device 202 exceeds a threshold value.
- the display request of the graphic object may be transmitted to the electronic device 201.
- the external electronic device 202 may not generate the display request when the amount of change in the detected movement does not exceed a predetermined threshold value (e.g., an inadvertent tremor of the user).
- the external electronic device 202 may further include a grip sensor (not shown). The external electronic device 202 may generate the display request when detecting the grip of the user by the grip sensor.
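The trigger logic described above reduces to a threshold test combined with the grip signal. A hedged one-liner follows; the unit (g) and the threshold value are illustrative assumptions.

```python
def should_request_display(accel_delta_g, grip_detected, motion_threshold_g=0.5):
    """Generate a display request only for deliberate motion or a grip.

    accel_delta_g: change in acceleration magnitude since the last sample,
    in g. The 0.5 g threshold is an assumed value meant to filter out
    inadvertent tremor.
    """
    return grip_detected or abs(accel_delta_g) > motion_threshold_g
```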
- the processor 250 may determine a location to display the graphic object based on the one or more location information.
- the one or more pieces of location information may include information about a location of the external electronic device 202 outside the field of view of the camera 240, first sensor information corresponding to the time when the low power mode is entered, and second sensor information corresponding to the time when the low power mode is released.
- FIG. 9B illustrates an example of an operation of an electronic device according to various embodiments. Such an operation may be performed by the electronic device 201 shown in FIG. 2.
- the processor 250 may obtain information about a location of the external electronic device 202 linked with the electronic device 201. In operation 913, the processor 250 may receive information about the tilt of the external electronic device 202. In operation 915, the processor 250 may provide, along with the multimedia content, a visual object representing the external electronic device 202.
- the operations 911 to 915 may correspond to the operations 511 to 515 illustrated in FIG. 5B, respectively.
- the processor 250 may identify that the external electronic device 202 has entered a sleep mode.
- the sleep mode may correspond to a mode for reducing power consumption of the external electronic device 202.
- the processor 250 may receive location information and/or tilt information of the external electronic device 202 corresponding to the time of entering the sleep mode. For example, when no user input is received for a predefined length of time, the processor 250 may determine that the low power mode has been entered and may receive the information related to the external electronic device 202 corresponding to the entry time.
- the processor 250 may identify that the sleep mode is released.
- the sleep mode may be released by receiving a wake-up signal from the external electronic device 202.
- the wake-up signal may be referred to in various terms such as a paging signal and an activation signal.
- the wake-up signal may be generated when a motion of the external electronic device 202 exceeding a predefined threshold value is detected.
- the external electronic device 202 may further include a grip sensor (not shown). The external electronic device 202 may generate the wake-up signal by obtaining data indicating that the user has gripped the external electronic device 202 from the grip sensor.
- the processor 250 may estimate the location of the external electronic device 202 based on the information about the tilt of the external electronic device 202 corresponding to the time of releasing the sleep mode, the information about the tilt of the external electronic device 202 corresponding to the time of entering the sleep mode, and/or the location information of the external electronic device 202 corresponding to the time of entering the sleep mode.
- the processor 250 may predict a change in the position of the external electronic device 202 by comparing, among the information about the tilt of the external electronic device 202, the information corresponding to the entry time of the sleep mode with the information corresponding to the release time.
- the processor 250 may obtain the location at which the external electronic device 202 is expected to be located at the time of releasing the sleep mode, by applying the predicted change in location to the location information corresponding to the entry time of the sleep mode.
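The disclosure does not specify how a tilt difference is mapped to a positional change. One plausible mapping, used here purely for illustration, is an arm-pivot model in which the controller swings about an assumed pivot, so angular deltas translate to an arc displacement.

```python
import numpy as np

def predict_release_location(entry_pos, entry_tilt, release_tilt, arm_length=0.6):
    """Map a tilt difference to a predicted positional change.

    entry_tilt / release_tilt: (pitch, yaw) in radians at sleep entry and
    release. The arm-pivot model and the 0.6 m pivot radius are assumptions
    used only to make the mapping concrete.
    """
    pitch_d = release_tilt[0] - entry_tilt[0]
    yaw_d = release_tilt[1] - entry_tilt[1]
    # Small-angle arc displacement around the assumed pivot
    dx = arm_length * yaw_d    # yaw swings the controller sideways
    dy = arm_length * pitch_d  # pitch swings it up or down
    return np.asarray(entry_pos, dtype=float) + np.array([dx, dy, 0.0])
```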
- the processor 250 may predict the location based on the information on the tilt of the electronic device 201.
- the electronic device 201 may move between the entry point of the sleep mode and the release point.
- the processor 250 may reflect the movement of the electronic device 201 in the prediction. The processor 250 may obtain information about the movement of the electronic device 201 by comparing the information about the tilt of the electronic device 201 at the time when the external electronic device 202 entered the sleep mode with the information about the tilt of the electronic device 201 at the time when the sleep mode is released.
- the processor 250 may determine whether the predicted location of the external electronic device 202 is included in the field of view of the camera 240. When the predicted location is within the field of view of the camera 240, the processor 250 may display the graphic object at the predicted location of the external electronic device 202 in operation 929. When the predicted location is outside the field of view of the camera 240, the processor 250 may display a graphic object for guiding the predicted location in operation 927.
- the guiding graphic object may not be a graphic object corresponding to the external electronic device 202.
- the graphic object may include an arrow-shaped image.
- the graphic object may include an image that is the same as or similar to the shape of the external electronic device 202.
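The decision between operations 929 and 927 reduces to projecting the predicted location into the camera frame and testing it against the field of view. A sketch follows; the FOV angles and the axis convention (x right, y up, z forward in the camera frame) are assumptions.

```python
import numpy as np

def render_for_prediction(predicted_pos_cam, h_fov_deg=90.0, v_fov_deg=90.0):
    """Choose between drawing the controller object at its predicted
    location (operation 929) or a guide toward it (operation 927).

    predicted_pos_cam: predicted controller position already expressed
    in the camera frame.
    """
    x, y, z = predicted_pos_cam
    if z > 0:  # in front of the camera
        in_h = abs(np.degrees(np.arctan2(x, z))) < h_fov_deg / 2
        in_v = abs(np.degrees(np.arctan2(y, z))) < v_fov_deg / 2
        if in_h and in_v:
            return "controller_object"  # operation 929: draw at the location
    return "guide_arrow"                # operation 927: guide toward it
```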
- the images 1010 and 1020 displayed through the display 230 may include images corresponding to the field of view of the user.
- the field of view of camera 240 may be the same as the field of view of the display area of images 1010 and 1020 displayed through display 230.
- the field of view of the camera 240 is not limited thereto, and may be smaller or larger than the field of view of the user, as shown in FIG. 4B.
- the processor 250 may display an image 1010 related to multimedia content on the display 230.
- the image 1010 related to the multimedia content may include at least one of an omnidirectional image and a 3D image.
- the processor 250 may display a graphic object for guiding the location of the external electronic device 202.
- the processor 250 may display an arrow-shaped graphic object 1012.
- the arrow-shaped graphic object 1012 may be displayed to overlap an image associated with the displayed multimedia content.
- the processor 250 may change the display of the arrow-shaped graphic object 1012. For example, when the external electronic device 202 moves away from the electronic device 201, the processor 250 may reduce the thickness of the arrow-shaped graphic object 1012.
- when the external electronic device 202 moves closer to the electronic device 201, the processor 250 may increase the thickness of the arrow-shaped graphic object 1012.
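The thickness change can be sketched as a simple distance-to-stroke-width mapping; all constants below are illustrative assumptions.

```python
def arrow_thickness(distance_m, near=0.3, far=3.0, t_max=12.0, t_min=2.0):
    """Map controller distance to arrow stroke width in pixels: thicker
    when the controller is near, thinner when it is far."""
    distance_m = max(near, min(far, distance_m))
    ratio = (far - distance_m) / (far - near)  # 1.0 at near, 0.0 at far
    return t_min + ratio * (t_max - t_min)
```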
- the processor 250 may change the color of the graphic object.
- the processor 250 may change the color of the graphic object based on the color of the multimedia content displayed on the display 230. For example, the processor 250 may determine the display area of the arrow-shaped graphic object 1012 and identify the partial area of the image related to the multimedia content corresponding to the determined area. The processor 250 may identify the color of the identified area and display the arrow-shaped graphic object 1012 in a color complementary to the identified color, thereby improving the visibility of the graphic object 1012.
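Taking the complement of the average color of the underlying content region is one straightforward reading of this step; averaging over the region is an assumption.

```python
import numpy as np

def complementary_color(region_pixels_rgb):
    """Average the content pixels under the arrow's display area and
    return the RGB complement, so the arrow contrasts with the content."""
    avg = np.asarray(region_pixels_rgb, dtype=float).reshape(-1, 3).mean(axis=0)
    return tuple(int(255 - c) for c in avg)
```

For a mostly dark-blue region this yields a light yellow arrow, which is the intended visibility improvement.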
- the processor 250 may display a visual effect 1022.
- the processor 250 may superimpose and display the visual effect 1022 on the image 1020 related to the multimedia content.
- the processor 250 may display the visual effect 1022 on the edge, among the four edges constituting the boundary of the display 230, that lies in the direction in which the external electronic device 202 is predicted to be located.
- the visual effect 1022 may include an effect of highlighting that edge.
- the visual effect 1022 may include a blur effect for blurring that edge.
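Selecting which of the four edges to treat can be sketched as a dominant-axis test on the predicted direction; the sign convention (+x right, +y up in the display plane) is an assumption.

```python
def edge_to_highlight(dx, dy):
    """Pick the display edge facing the predicted controller direction
    (dx, dy) in the display plane: the dominant axis decides the edge."""
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "top" if dy > 0 else "bottom"
```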
- the graphic object 1012 has been described as including an arrow-shaped image, but is not limited thereto.
- the graphic object 1012 may include an image having the same or similar shape as the external electronic device 202.
- Electronic devices may be various types of devices.
- the electronic device may include, for example, a portable communication device (eg, a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
- terms such as "first" and "second" may be used simply to distinguish a component from other corresponding components, and do not limit the components in other aspects (e.g., order or importance).
- a certain (e.g., first) component may be referred to as "coupled" or "connected" to another (e.g., second) component, with or without the term "functionally" or "communicatively".
- this means that the component may be connected to the other component directly (e.g., by wire), wirelessly, or via a third component.
- module may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
- the module may be an integral part or a minimum unit or part of the component, which performs one or more functions.
- the module may be implemented in the form of an application-specific integrated circuit (ASIC).
- various embodiments of this document may be implemented as software (e.g., the program 140) containing one or more instructions stored in a storage medium (e.g., the internal memory 136 or the external memory 138) readable by a machine (e.g., the electronic device 101).
- for example, a processor (e.g., the processor 120) of the machine (e.g., the electronic device 101) may invoke and execute at least one of the one or more instructions stored in the storage medium. This enables the machine to be operated to perform at least one function according to the at least one invoked instruction.
- the one or more instructions may include code generated by a compiler or code executable by an interpreter.
- the device-readable storage medium may be provided in the form of a non-transitory storage medium.
- 'non-transitory' means only that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); the term does not distinguish between data stored semi-permanently on the storage medium and data stored temporarily.
- a method according to various embodiments disclosed in the present disclosure may be included and provided in a computer program product.
- the computer program product may be traded between a seller and a buyer as a product.
- the computer program product may be distributed in the form of a device-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
- in the case of online distribution, at least part of the computer program product may be temporarily stored, or temporarily created, in a device-readable storage medium such as a server of a manufacturer, a server of an application store, or a relay server.
- each component (e.g., a module or a program) of the above-described components may include a singular or plural entity.
- one or more of the aforementioned components or operations may be omitted, or one or more other components or operations may be added.
- a plurality of components (e.g., modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components in the same or a similar manner as they were performed by the corresponding component of the plurality of components before the integration.
- operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Optics & Photonics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (15)
- An electronic device comprising: one or more cameras having a designated field of view; a display; communication circuitry; and a processor, wherein the processor is configured to: identify, using the camera, an external electronic device among one or more external objects included in the designated field of view; display, through the display, a graphical object corresponding to the external electronic device based on first location information of the external electronic device identified based at least on image information acquired through the camera; and, when the external electronic device moves out of the designated field of view, display the graphical object through the display based on second location information of the external electronic device identified through the camera before it left the designated field of view and information related to movement of the external electronic device received from the external electronic device through the communication circuitry after it left the designated field of view.
- The electronic device of claim 1, wherein the external electronic device further comprises: a first sensor including a gyro sensor; and a second sensor including an acceleration sensor, and wherein the information related to the movement of the external electronic device includes at least one of data related to tilting of the external electronic device obtained by the first sensor, or data related to a change in speed of the external electronic device obtained by the second sensor.
- The electronic device of claim 1, wherein the external electronic device further comprises a light emitting unit including one or more LEDs, and wherein the processor is configured to: acquire, using the camera, an image including at least part of the light emitting unit; estimate a location related to the light emitting unit based at least on an image operation on the image; and determine the first location information related to the external electronic device based at least on the estimated location.
- The electronic device of claim 2, wherein the processor is configured to: when the external electronic device is located within the designated field of view, identify a display position of the graphical object based at least on the first location information and the data related to the tilting of the external electronic device; when the external electronic device moves out of the designated field of view, identify the display position of the graphical object based at least on the second location information and third location information, the third location information including vector information obtained based on the data related to the tilting of the external electronic device and the data related to the change in speed of the external electronic device; and display the graphical object based on the identified display position.
- The electronic device of claim 1, wherein the processor is configured to display the graphical object based on multimedia content related to virtual reality provided by the electronic device, the first location information, and the second location information.
- The electronic device of claim 1, wherein the processor is configured to: when the external electronic device is not identified within the designated field of view, receive from the external electronic device one or more pieces of third location information obtained during a designated time range before the external electronic device enters a low power mode; and, when a request for displaying the graphical object is identified, determine a display position of the graphical object to be displayed using the display based at least on the one or more pieces of third location information.
- The electronic device of claim 1, wherein the processor is configured to: identify that the external electronic device has entered a sleep mode; receive third location information of the external electronic device corresponding to the time of entry and information related to tilting of the external electronic device; identify that the sleep mode has been released; receive information related to the tilting of the external electronic device corresponding to the time of release; and display a second graphical object for indicating the location of the external electronic device based at least on the third location information of the external electronic device corresponding to the time of entry, the information related to the tilting of the external electronic device, and the information related to the tilting of the external electronic device corresponding to the time of release.
- The electronic device of claim 7, wherein the second graphical object includes an image having a shape identical or similar to that of the external electronic device, or an arrow-shaped image.
- A method comprising: identifying, using one or more cameras having a designated field of view, an external electronic device among one or more external objects included in the designated field of view; displaying, through a display, a graphical object corresponding to the external electronic device based on first location information of the external electronic device identified based at least on image information acquired through the camera; and, when the external electronic device moves out of the designated field of view, displaying the graphical object through the display based on second location information of the external electronic device identified through the camera before it left the designated field of view and information related to movement of the external electronic device received from the external electronic device through communication circuitry after it left the designated field of view.
- The method of claim 9, wherein the information related to the movement of the external electronic device includes at least one of data related to tilting of the external electronic device obtained by a first sensor included in the external electronic device, or data related to a change in speed of the external electronic device obtained by a second sensor included in the external electronic device.
- The method of claim 10, further comprising: when the external electronic device is located within the designated field of view, identifying a display position of the graphical object based at least on the first location information and the data related to the tilting of the external electronic device; when the external electronic device moves out of the designated field of view, identifying the display position of the graphical object based at least on the second location information and third location information; and displaying the graphical object based on the identified display position, wherein the third location information includes vector information obtained based on the data related to the tilting of the external electronic device and the data related to the change in speed of the external electronic device.
- The method of claim 9, further comprising: acquiring, using the camera, an image including at least part of a light emitting unit included in the external electronic device; estimating a location related to the light emitting unit based at least on an image operation on the image; and determining the first location information related to the external electronic device based at least on the estimated location.
- The method of claim 9, wherein displaying the graphical object through the display comprises displaying the graphical object based on multimedia content related to virtual reality, the first location information, and the second location information.
- The method of claim 9, further comprising: when the external electronic device is not identified within the designated field of view, receiving from the external electronic device one or more pieces of third location information obtained during a designated time range before the external electronic device enters a low power mode; and, when a request for displaying the graphical object is identified, determining a display position of the graphical object to be displayed using the display based at least on the one or more pieces of third location information.
- The method of claim 9, further comprising: identifying that the external electronic device has entered a sleep mode; receiving third location information of the external electronic device corresponding to the time of entry and information related to tilting of the external electronic device; identifying that the sleep mode has been released; receiving information related to the tilting of the external electronic device corresponding to the time of release; and displaying a second graphical object for indicating the location of the external electronic device based at least on the third location information of the external electronic device corresponding to the time of entry, the information related to the tilting of the external electronic device, and the information related to the tilting of the external electronic device corresponding to the time of release.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/059,374 US11442274B2 (en) | 2018-05-29 | 2019-03-22 | Electronic device and method for displaying object associated with external electronic device on basis of position and movement of external electronic device |
| CN201980035566.9A CN112204503B (zh) | 2018-05-29 | 2019-03-22 | 用于基于外部电子装置的位置和移动来显示与外部电子装置相关联的对象的电子装置和方法 |
| EP19810932.4A EP3796131B1 (en) | 2018-05-29 | 2019-03-22 | Electronic device and method for displaying object associated with external electronic device on basis of position and movement of external electronic device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020180061384A KR102551686B1 (ko) | 2018-05-29 | 2018-05-29 | 외부 전자 장치의 위치 및 움직임에 기반하여 외부 전자 장치와 관련된 객체를 표시하는 전자 장치 및 방법 |
| KR10-2018-0061384 | 2018-05-29 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019231090A1 true WO2019231090A1 (ko) | 2019-12-05 |
Family
ID=68698234
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2019/003336 Ceased WO2019231090A1 (ko) | 2018-05-29 | 2019-03-22 | 외부 전자 장치의 위치 및 움직임에 기반하여 외부 전자 장치와 관련된 객체를 표시하는 전자 장치 및 방법 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11442274B2 (ko) |
| EP (1) | EP3796131B1 (ko) |
| KR (1) | KR102551686B1 (ko) |
| CN (1) | CN112204503B (ko) |
| WO (1) | WO2019231090A1 (ko) |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102861533B1 (ko) * | 2020-08-26 | 2025-09-18 | 삼성전자 주식회사 | 외부 장치의 위치를 보정하기 위한 전자 장치 및 그의 동작 방법 |
| KR20230020845A (ko) * | 2021-08-04 | 2023-02-13 | 현대자동차주식회사 | 전자장치 및 그의 객체 추적 방법 |
| US11836301B2 (en) * | 2021-08-10 | 2023-12-05 | Qualcomm Incorporated | Electronic device for tracking objects |
| US11748965B2 (en) * | 2021-09-28 | 2023-09-05 | Htc Corporation | Virtual image display system and virtual image display method |
| CN114092370B (zh) * | 2021-11-19 | 2024-09-06 | 抖音视界有限公司 | 一种图像展示方法、装置、计算机设备及存储介质 |
| WO2023153597A1 (ko) * | 2022-02-08 | 2023-08-17 | 삼성전자주식회사 | 인식 공간을 기반으로 운동 컨텐츠를 제공하는 전자 장치 및 그 동작 방법 |
| EP4435707A4 (en) | 2022-02-08 | 2025-03-12 | Samsung Electronics Co., Ltd. | Electronic device for providing exercise content on basis of recognition space and operation method thereof |
| CN115430135A (zh) * | 2022-03-08 | 2022-12-06 | 北京罗克维尔斯科技有限公司 | 画面显示方法、装置、系统、电子设备、车辆和存储介质 |
| CN115291392A (zh) * | 2022-08-04 | 2022-11-04 | 南昌黑鲨科技有限公司 | 一种vr系统及vr虚拟现实设备 |
| WO2024043546A1 (ko) * | 2022-08-26 | 2024-02-29 | 삼성전자주식회사 | 사용자의 움직임을 트래킹 하기 위한 전자 장치 및 방법 |
| JP2024044162A (ja) * | 2022-09-20 | 2024-04-02 | キヤノン株式会社 | 通信装置、通信方法、および通信システム |
| WO2024063330A1 (ko) * | 2022-09-23 | 2024-03-28 | 삼성전자 주식회사 | 웨어러블 전자 장치 및 상기 웨어러블 전자 장치를 이용하여 컨트롤러를 식별하는 방법 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20170023491A (ko) * | 2015-08-24 | 2017-03-06 | 엘지전자 주식회사 | 카메라 및 이를 포함하는 가상 현실 시스템 |
| KR20170090276A (ko) * | 2016-01-28 | 2017-08-07 | 엠더블유엔테크 주식회사 | 가상현실 소방체험 시스템 |
| JP2018029907A (ja) * | 2016-08-26 | 2018-03-01 | 株式会社コロプラ | 情報処理方法、当該情報処理方法をコンピュータに実行させるためのプログラム及びコンピュータ |
| JP2018036720A (ja) * | 2016-08-29 | 2018-03-08 | 株式会社タカラトミー | 仮想空間観察システム、方法及びプログラム |
| KR20180043132A (ko) * | 2016-10-19 | 2018-04-27 | 주식회사 글로벌이노베이션 | 사격 게임 시스템 |
Family Cites Families (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH05143270A (ja) | 1991-11-20 | 1993-06-11 | Matsushita Electric Ind Co Ltd | ウインドウシステム上のカーソル管理装置およびカーソル管理方法 |
| CN105359063B (zh) | 2013-06-09 | 2018-08-17 | 索尼电脑娱乐公司 | 利用追踪的头戴式显示器 |
| US9804257B2 (en) * | 2014-11-13 | 2017-10-31 | WorldViz LLC | Methods and systems for an immersive virtual reality system using multiple active markers |
| KR20160063834A (ko) | 2014-11-27 | 2016-06-07 | 삼성전자주식회사 | 포인팅 장치, 인터페이스 장치 및 디스플레이 장치 |
| US9746921B2 (en) * | 2014-12-31 | 2017-08-29 | Sony Interactive Entertainment Inc. | Signal generation and detector systems and methods for determining positions of fingers of a user |
| US20160378176A1 (en) * | 2015-06-24 | 2016-12-29 | Mediatek Inc. | Hand And Body Tracking With Mobile Device-Based Virtual Reality Head-Mounted Display |
| WO2017039308A1 (en) * | 2015-08-31 | 2017-03-09 | Samsung Electronics Co., Ltd. | Virtual reality display apparatus and display method thereof |
| US10518172B2 (en) * | 2016-03-07 | 2019-12-31 | Htc Corporation | Accessory management of virtual reality system |
| KR20170126295A (ko) | 2016-05-09 | 2017-11-17 | 엘지전자 주식회사 | 헤드 마운티드 디스플레이 장치 및 그것의 제어방법 |
| KR20170129509A (ko) | 2016-05-17 | 2017-11-27 | 엘지전자 주식회사 | 헤드 마운티드 디스플레이 장치 및 그것의 제어방법 |
| US10249090B2 (en) | 2016-06-09 | 2019-04-02 | Microsoft Technology Licensing, Llc | Robust optical disambiguation and tracking of two or more hand-held controllers with passive optical and inertial tracking |
| US10078377B2 (en) | 2016-06-09 | 2018-09-18 | Microsoft Technology Licensing, Llc | Six DOF mixed reality input by fusing inertial handheld controller with hand tracking |
| KR102649197B1 (ko) * | 2016-07-29 | 2024-03-20 | 삼성전자주식회사 | 그래픽 객체를 표시하기 위한 전자 장치 및 컴퓨터 판독 가능한 기록 매체 |
| US10255658B2 (en) | 2016-08-09 | 2019-04-09 | Colopl, Inc. | Information processing method and program for executing the information processing method on computer |
| KR102719606B1 (ko) * | 2016-09-09 | 2024-10-21 | 삼성전자주식회사 | 이미지 표시 방법, 저장 매체 및 전자 장치 |
| US10705598B2 (en) | 2017-05-09 | 2020-07-07 | Microsoft Technology Licensing, Llc | Tracking wearable device and handheld object poses |
2018
- 2018-05-29 KR KR1020180061384A patent/KR102551686B1/ko active Active

2019
- 2019-03-22 CN CN201980035566.9A patent/CN112204503B/zh active Active
- 2019-03-22 US US17/059,374 patent/US11442274B2/en active Active
- 2019-03-22 EP EP19810932.4A patent/EP3796131B1/en active Active
- 2019-03-22 WO PCT/KR2019/003336 patent/WO2019231090A1/ko not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20170023491A (ko) * | 2015-08-24 | 2017-03-06 | 엘지전자 주식회사 | 카메라 및 이를 포함하는 가상 현실 시스템 |
| KR20170090276A (ko) * | 2016-01-28 | 2017-08-07 | 엠더블유엔테크 주식회사 | 가상현실 소방체험 시스템 |
| JP2018029907A (ja) * | 2016-08-26 | 2018-03-01 | 株式会社コロプラ | 情報処理方法、当該情報処理方法をコンピュータに実行させるためのプログラム及びコンピュータ |
| JP2018036720A (ja) * | 2016-08-29 | 2018-03-08 | 株式会社タカラトミー | 仮想空間観察システム、方法及びプログラム |
| KR20180043132A (ko) * | 2016-10-19 | 2018-04-27 | 주식회사 글로벌이노베이션 | 사격 게임 시스템 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3796131A1 (en) | 2021-03-24 |
| EP3796131B1 (en) | 2025-08-20 |
| KR102551686B1 (ko) | 2023-07-05 |
| CN112204503B (zh) | 2025-01-24 |
| CN112204503A (zh) | 2021-01-08 |
| US11442274B2 (en) | 2022-09-13 |
| KR20190135870A (ko) | 2019-12-09 |
| EP3796131A4 (en) | 2021-07-14 |
| US20210239983A1 (en) | 2021-08-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019231090A1 (ko) | 외부 전자 장치의 위치 및 움직임에 기반하여 외부 전자 장치와 관련된 객체를 표시하는 전자 장치 및 방법 | |
| WO2020185029A1 (ko) | 전자 장치 및 증강 현실 기반의 공유 정보 표시 방법 | |
| WO2020050636A1 (ko) | 사용자 의도 기반 제스처 인식 방법 및 장치 | |
| WO2021020940A1 (en) | Electronic device and method for generating augmented reality object | |
| WO2019164092A1 (ko) | 디스플레이를 통해 표시된 제 1 콘텐트에 대해 제 2 콘텐트를 외부 객체의 움직임에 따라 제공하기 위한 전자 장치 및 그의 동작 방법 | |
| KR20190102715A (ko) | 컨트롤러와 접촉된 신체 부위에 따라 그래픽 객체를 다르게 표시하는 방법 및 전자 장치 | |
| WO2022173164A1 (ko) | Ar 객체를 표시하는 방법 및 전자 장치 | |
| KR20200101205A (ko) | 전자 장치 및 전자 장치에서 디스플레이 동작 제어 방법 | |
| WO2021230568A1 (ko) | 증강 현실 서비스를 제공하기 위한 전자 장치 및 그의 동작 방법 | |
| WO2020197245A1 (ko) | 입력 장치 및 입력 장치와 상호 작용하는 전자 장치 | |
| WO2021261829A1 (ko) | 밝기 조절 방법 및 hmd 장치 | |
| EP3744087A1 (en) | Electronic device for adjusting position of content displayed on display based on ambient illuminance and method for operating same | |
| WO2019103350A1 (ko) | 사용자 인터페이스를 적응적으로 구성하기 위한 장치 및 방법 | |
| WO2019209075A1 (ko) | 외부 전자 장치를 제어하는 전자 장치 및 방법 | |
| WO2021149938A1 (en) | Electronic device and method for controlling robot | |
| WO2021225333A1 (ko) | 증강 현실 서비스를 제공하기 위한 전자 장치 및 그의 동작 방법 | |
| WO2020130579A1 (ko) | 이미지 처리 방법 및 그 전자 장치 | |
| WO2020171359A1 (ko) | 전자 장치 및 그 촬영 관련 정보 안내 방법 | |
| WO2021162353A1 (ko) | 카메라를 포함하는 전자 장치 및 그의 동작 방법 | |
| WO2021162366A1 (ko) | Ar 객체를 배치하는 방법 및 전자 장치 | |
| WO2023003156A1 (ko) | Udc를 사용하여 사용자 인증 기능을 수행하는 방법 및 전자 장치 | |
| WO2019182340A1 (ko) | 이미지 데이터 처리 방법 및 이를 위한 장치 | |
| WO2020004758A1 (ko) | 사용자의 위치에 기반하여 발광 소자를 이용한 시각적 효과를 제공하는 전자 장치 및 그에 관한 방법 | |
| WO2025009720A1 (ko) | 엘이디를 제어하기 위한 데이터 세트들의 순서를 조절하기 위한 전자 장치, 방법, 및 컴퓨터 판독 가능 저장 매체 | |
| WO2023158171A1 (ko) | 전자 장치 및 전자 장치의 제어 방법 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19810932; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2019810932; Country of ref document: EP; Effective date: 20201216 |
| | WWG | Wipo information: grant in national office | Ref document number: 201980035566.9; Country of ref document: CN |
| | WWG | Wipo information: grant in national office | Ref document number: 2019810932; Country of ref document: EP |