
US20230014421A1 - 6DoF INSIDE-OUT TRACKING GAME CONTROLLER INITIAL REGISTRATION - Google Patents


Info

Publication number
US20230014421A1
US20230014421A1 (application US17/935,897)
Authority
US
United States
Prior art keywords
6dof
map
pose
coordinate system
hmd
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/935,897
Inventor
Datta Ramadasan
Qiong Lin
Hao YE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xvisio Technology Corp
Original Assignee
Xvisio Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xvisio Technology Corp filed Critical Xvisio Technology Corp
Priority to US17/935,897 priority Critical patent/US20230014421A1/en
Publication of US20230014421A1 publication Critical patent/US20230014421A1/en
Abandoned legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Definitions

  • the present invention relates generally to VR/AR systems, and more particularly to a 6DoF inside-out tracking game controller, head mount device inside-out tracking and multi-agent interaction, and robot position and posture tracking, route planning, and collision avoidance.
  • VR virtual reality
  • AR augmented reality
  • the SLAM (simultaneous localization and mapping) algorithm is widely adopted to improve the system.
  • three problems limit this approach: the scale factor, the drift problem (even with a stereo camera), and the long processing time.
  • the state of the art solutions for the drift are on-line loop-closure and on-line re-localization (used in ORB-SLAM). Both are based on a bag-of-words approach (to store every patch), but updating this bag of words is very time/CPU consuming.
  • 6DoF six degree of freedom
  • a multi-processor structure is used for VI-SLAM.
  • the apparatus obtains overlapping image frames and sensor inputs of an apparatus, wherein the sensor inputs comprise gyrometer data, accelerometer data and magnetometer data, splits computation work onto a plurality of vector processors to obtain six degree of freedom (6DoF) outputs of the apparatus based on a splitting algorithm, and performs a localization process to generate 6DoF estimations, and a mapping process to generate a cloud of three-dimensional points associated to the descriptors of the map.
  • 6DoF six degree of freedom
  • the splitting algorithm involves: dividing a current frame into N equal parts; and each of a set of selected vector processors processes a portion of the current frame based on a split-by-corner rule, wherein the split-by-corner rule determines whether each pixel is a corner and classifies each pixel determined to be a corner into a compressed descriptor by converting the sub-image centered on the pixel to a 16-float descriptor using a base matrix.
  • the localization process and mapping process are configured to run sequentially, wherein the localization process is split over all of the vector processors and the mapping process is split over all the vector processors.
  • the localization process and mapping process are configured to run in parallel, wherein the localization process is split over a first subset of the vector processors and the mapping process is split over the remaining vector processors.
  • the 6DoF outputs are in one format selected from an output format group comprising: six floating point values, with three for the translation in 3D space and three for the rotation space; twelve floating point values, with three for the translation and nine for the rotation; six fixed point values, with three for the translation and three for the rotation; and twelve fixed point values, with three for the translation and nine for the rotation.
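The first two floating-point layouts above can be sketched as packed buffers. This is a minimal sketch for illustration only: the patent fixes the value counts, but the field names, ordering, and little-endian byte order here are assumptions.

```python
# Illustrative sketch of two of the four 6DoF output layouts.
# Field names, ordering, and byte order are assumptions, not from the patent.
import struct

def pack_6dof_euler(tx, ty, tz, roll, pitch, yaw):
    """Six floating point values: 3 for the translation + 3 for the rotation."""
    return struct.pack("<6f", tx, ty, tz, roll, pitch, yaw)

def pack_6dof_matrix(translation, rotation_3x3):
    """Twelve floating point values: 3 for the translation + 9 rotation-matrix entries."""
    flat = [v for row in rotation_3x3 for v in row]
    return struct.pack("<12f", *translation, *flat)
```

The fixed-point variants would pack the same counts of values with an integer format instead of `f`.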
  • a map of the background environment is generated in advance.
  • This reference map is a batch of visual features with pre-estimated 3D position and visual feature description.
  • the map is used for real-time localization. During the localization process, the 3D position of the features is not updated, so the map is static. Because the map is known, there is no need to map the environment constantly. And because the map is static, the localization will not drift. The potential issue of this approach is a failure of the localization when the device moves too far from the reference map. We solve this problem using a light SLAM algorithm.
  • client server topology is used in deploying the mapping and localization technology, which makes the client lighter in computing and less power hungry.
  • tracking and localization are based on a known map. This allows a fast processing speed to be achieved, which is useful for VR/AR applications.
  • a calibrated stereo camera is provided in this approach to fix the scale factor problem.
  • FIG. 1 illustrates an exemplary block diagram of a 6DoF inside-out tracking game controller in accordance with embodiments of the current invention.
  • FIG. 2 illustrates exemplary multi-camera configurations in accordance with embodiments of the current invention.
  • FIG. 3 illustrates an exemplary data flow for the real time stereo tracking for the game controller in accordance with embodiments of the current invention.
  • FIG. 4 illustrates an exemplary location system data flow for the real time stereo tracking for the game controller in accordance with embodiments of the current invention.
  • FIG. 5 A illustrates an exemplary flow diagram of splitting the VI-SLAM onto N vector processors for the localization process based on a splitting algorithm in accordance with embodiments of the current invention.
  • FIG. 5 B illustrates an exemplary flow diagram of splitting the VI-SLAM onto N vector processors for the mapping process based on a splitting algorithm in accordance with embodiments of the current invention.
  • FIG. 6 illustrates an exemplary parallel and an exemplary sequential process in the multi-processor configuration in accordance with embodiments of the current invention.
  • FIG. 7 illustrates exemplary diagram of a calibration process in accordance with embodiments of the current invention.
  • FIG. 8 illustrates an exemplary flow chart for the 6DoF inside-out tracking process in accordance with embodiments of the current invention.
  • FIG. 9 illustrates exemplary diagrams for registering initial 6DoF pose in an HMD tracking system in accordance with embodiments of the current invention.
  • FIG. 10 illustrates exemplary diagrams of registering initial 6DoF pose using a physical button on a game controller in an HMD tracking system in accordance with embodiments of the current invention.
  • FIG. 11 A illustrates exemplary diagrams of registering initial 6DoF pose using a two-dimensional (2D) 6DoF tag in an HMD tracking system in accordance with embodiments of the current invention.
  • FIG. 11 B illustrates exemplary diagrams of registering initial 6DoF pose by attaching a 2D 6DoF tag on the HMD in accordance with embodiments of the current invention.
  • FIG. 11 C illustrates exemplary diagrams of registering initial 6DoF pose by attaching a 2D 6DoF tag on the game controller in accordance with embodiments of the current invention.
  • FIG. 12 illustrates an exemplary flow chart of registering initial 6DoF pose in an HMD system in accordance with embodiments of the current invention.
  • FIG. 1 illustrates an exemplary block diagram of a 6DoF inside-out tracking game controller 100 in accordance with embodiments of the current invention.
  • Game controller 100 is not limited to function as a game controller only. It can be an apparatus used in other scenarios and in combination with other apparatus.
  • the 6DoF is produced in real time by game controller 100 .
  • game controller 100 includes a plurality of sensors, such as sensor 121 , 122 , and 125 .
  • the sensors are cameras. A plurality of cameras for game controller 100 generate overlapping images.
  • Game controller 100 also includes an inertial measurement unit (IMU) 131 , an optional external memory card (SD Card) 132
  • IMU inertial measurement unit
  • SD Card optional external memory card
  • An interface module 111 communicates with and controls the sensors, IMU 131 , SD 132 , and the wireless interfaces, such as WiFi 133 and Bluetooth 134 .
  • a hardware accelerator and image signal processing unit 112 helps image processing of the sensor inputs.
  • IMU 131 detects movements, rotations, and the magnetic heading of game controller 100 .
  • IMU 131 is an integrated 9-axis sensor for the detection of movements and rotations and magnetic heading.
  • IMU 131 senses orientation, angular velocity, and linear acceleration of game controller 100 .
  • game controller 100 processes IMU data at a frame rate of at least 500 Hz.
  • a plurality of cameras are mounted on the outer case of the game controller to generate overlapping views for the game controller.
  • Using multiple cameras with overlapping views has many advantages compared to a monocular solution: the scale factor of the 3D motion does not drift; the 3D points seen in the overlapping area can be triangulated without a motion of the device; the matching in the overlapping area is faster and more accurate using the epipolar geometry; and the global field of view is wider, which increases the accuracy and reduces the jittering.
  • FIG. 2 illustrates exemplary multi-camera configurations in accordance with embodiments of the current invention.
  • the multi-camera configurations are not limited by the examples shown.
  • a plurality of cameras can be used to capture stereo images such that overlapping images are obtained.
  • a game controller 210 has a straight outer case. Two cameras 211 and 212 are mounted close to the two ends of the outer case, and the lenses of the cameras are parallel with the surface of the outer case.
  • a game controller 220 is similarly configured as game controller 210 , with two cameras mounted close to the edges of the straight outer case. In other scenarios, four cameras are used.
  • a game controller 230 has a straight outer case. Four cameras 231 , 232 , 233 , and 234 are mounted close to the edges of the outer case.
  • game controller 240 also has a straight outer case, with four cameras 241 , 242 , 243 , and 244 mounted close to the edges of the outer case.
  • the cameras that are on the same side but opposite to each other are inline.
  • a game controller 250 has a square outer case with four cameras 251 , 252 , 253 , and 254 . Each of the cameras is mounted on one side of the outer case.
  • a game controller 260 is a “V” shaped outer case.
  • Four cameras 261 , 262 , 263 , and 264 are mounted on the outer case of game controller 260 , with two cameras on each leg of the outer case.
  • a game controller 270 has an outer case in the shape of a half hexagon. Game controller 270 has four cameras, with two mounted on each of the two outer legs of the outer case. Other configurations and/or different numbers of cameras can be configured.
  • FIG. 3 illustrates an exemplary data flow for the real time stereo tracking for the game controller in accordance with embodiments of the current invention.
  • Two processes are used, one localization process 312 for the real-time tracking, and one mapping process 311 for the map update.
  • One or more stereo cameras 341 are used.
  • the obtained images are gathered into one frame 331 , which is passed through interface A to the localization process 312 .
  • IMU unit 342 provides accelerometer, gyrometer, and magnetometer data 332 , which are passed through interface B to localization process 312 .
  • the localization procedure generates 6DoF pose, IMU, and feature points 322 and passes them through interface G to mapping and loop closure process 311 .
  • Mapping and loop closure 311 receives 6DoF pose and IMU and feature points through interface G. Mapping and loop closure 311 generates descriptors and 3D points 321 and sends it to localization 312 via its own interface F and the localization interface C. In one embodiment, mapping and loop closure 311 also generates 6DoF pose for the game controller through its interface H.
  • FIG. 4 illustrates an exemplary location system data flow for the real time stereo tracking for the game controller in accordance with embodiments of the current invention. The interfaces A, B, C, D, and E each correspond to the same interfaces as marked in FIG. 3 .
  • a feature detection procedure 411 receives stereo images from interface A and descriptors base 451 .
  • Features detection 411 detects features 452 and passes them to the match-with-local-map procedure 412 .
  • Procedure 412 also receives descriptors and 3D points 462 from local map 461 .
  • Procedure 461 generates data 462 based on input from interface C, which are the outputs from the mapping procedure.
  • Procedure 412 generates 2D and 3D points 453 of the matching result and passes to a compute pose procedure 413 .
  • the localization procedure also receives IMU inputs from interface B and passes them to a pose prediction procedure 414 .
  • Procedure 414 further receives 6DoF pose 465 .
  • 6DoF pose 465 is generated by current pose 464 based on inputs from interface E.
  • Compute pose procedure 413 , based on inputs 453 and 454 , compares the result with a threshold at step 421 .
  • If step 421 determines that the number of inliers is greater than the threshold, current pose 463 is passed to current pose procedure 464 . If step 421 determines that the number of inliers is not greater than the threshold, the 6DoF pose, IMU, and feature points are passed to the mapping process through interface D.
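The decision at step 421 can be sketched in a few lines. This is an illustrative sketch only; the function and parameter names are assumptions.

```python
# Minimal sketch of the inlier check at step 421: a pose with enough inliers
# is accepted as the current pose; otherwise the data is handed to the
# mapping process over interface D. All names are illustrative assumptions.
def accept_pose(pose, n_inliers, threshold, send_to_mapping):
    if n_inliers > threshold:
        return pose              # accepted: becomes the current pose (464)
    send_to_mapping(pose)        # interface D to the mapping process
    return None
```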
  • the VI-SLAM algorithm is split to run on a plurality of processors based on a splitting algorithm and the sensor inputs.
  • FIG. 5 A illustrates exemplary flow diagram of splitting the VI-SLAM onto N vector processors for the localization process based on a splitting algorithm in accordance with embodiments of the current invention.
  • the localization process is a pipeline implemented as a state machine including a feature extraction procedure 510 , a matching procedure 520 , and a 6DoF estimation procedure 530 .
  • the feature detection and extraction procedure 510 is split to be run on N vector processors following the splitting rule.
  • Step 511 divides the current frame to be processed into N equal parts.
  • Step 512 assigns each frame part to a corresponding vector processor.
  • Each processor processes one part of the frame following a predefined algorithm.
  • a corner is determined as follows. For each pixel pi, described by a 2D coordinate in the image, and an adjustable threshold t, pi is determined to be a corner if there exists a set of K contiguous pixels in the neighbor circle which are all brighter than (pi + t) or all darker than (pi − t).
  • threshold t is in the range of 5 ≤ t ≤ 200.
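The corner rule above can be sketched as follows. This is a minimal illustration, not the patented implementation: the 16-pixel circle of radius 3 is a common choice (as in the FAST detector) and is an assumption here, along with all function names.

```python
# Sketch of the corner test described above. The radius-3, 16-pixel circle
# and all names are illustrative assumptions.
def circle_neighbors(img, x, y):
    """Return the 16 pixels on a radius-3 circle around (x, y)."""
    offsets = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2),
               (1, 3), (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1),
               (-2, -2), (-1, -3)]
    return [img[y + dy][x + dx] for dx, dy in offsets]

def is_corner(img, x, y, t=20, k=9):
    """Pixel (x, y) is a corner if K contiguous circle pixels are all
    brighter than (p + t) or all darker than (p - t), with 5 <= t <= 200."""
    p = img[y][x]
    ring = circle_neighbors(img, x, y)
    doubled = ring + ring  # duplicate the ring so wrapped runs are contiguous
    for flags in ([v > p + t for v in doubled], [v < p - t for v in doubled]):
        run = 0
        for f in flags[:len(ring) + k]:
            run = run + 1 if f else 0
            if run >= k:
                return True
    return False
```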
  • each corner pixel pi is classified, using an n × n sub-image centered on pi, into a compressed descriptor. This is done using a base matrix that converts each sub-image to a 16-float descriptor. The base matrix is computed with a singular value decomposition on a large set of selected features.
  • the n × n sub-image is 11 × 11.
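The descriptor compression step can be sketched with an SVD-derived projection. This is a hedged illustration: the base matrix below is fit on random patches, whereas the patent fits it on a large set of real selected features, and all names are assumptions.

```python
# Sketch of the descriptor compression: flatten an 11 x 11 sub-image (121
# values) and project it to 16 floats with a base matrix obtained by SVD.
# Training on random patches here is an illustrative stand-in.
import numpy as np

def build_base_matrix(training_patches, dim=16):
    """Stack flattened patches and keep the top `dim` right singular vectors."""
    a = np.array([p.ravel() for p in training_patches], dtype=np.float64)
    _, _, vt = np.linalg.svd(a, full_matrices=False)
    return vt[:dim]                      # shape (16, n*n)

def compress_descriptor(patch, base):
    """Project one n x n sub-image centered on a corner to a 16-float descriptor."""
    return (base @ patch.ravel()).astype(np.float32)

rng = np.random.default_rng(0)
train = [rng.random((11, 11)) for _ in range(500)]
base = build_base_matrix(train)          # (16, 121) projection matrix
desc = compress_descriptor(rng.random((11, 11)), base)
```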
  • P = (p1, . . . , pn) is the list of feature points (2D coordinates in the image) detected from the current frame.
  • D = (d1, . . . , dn) is the list of descriptors, pairing each feature point with its associated descriptor.
  • the matching procedure 520 is split onto N vector processors.
  • Step 521 splits the descriptor list into N parts.
  • the descriptor list is equally split into N parts.
  • Step 522 performs descriptor matching for each descriptor range Di by matching Di with a subset of the map descriptors. The descriptors are split into N equal ranges.
  • a matching algorithm applies for Di.
  • the processor i (0 < i < N+1) runs the matching algorithm on the range Di.
  • the descriptors Di are matched with a subset of the descriptors of the map, LocalMap, using the cross-matching method: each match is a pair of descriptors (da, db) such that da is the best candidate for db among the descriptors Di of the current frame and db is the best candidate for da among the descriptors of the map LocalMap.
  • Some of the descriptors of the map are associated with 3D points geo-referenced in the world (this 3D estimation is performed by the mapping algorithm). So the matching associates each descriptor di of D with a 3D point p3d of the LocalMap.
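The cross-matching (mutual best candidate) rule can be sketched as a mutual nearest-neighbor test. The Euclidean distance metric and all names are assumptions for illustration.

```python
# Sketch of the cross-matching method: (da, db) is a match only if db is
# da's best candidate in the LocalMap AND da is db's best candidate in the
# current frame. Distance metric and names are illustrative assumptions.
import numpy as np

def cross_match(frame_desc, map_desc):
    """Return index pairs (i, j) of mutual nearest neighbors."""
    # Pairwise squared Euclidean distances, shape (len(frame), len(map)).
    d = ((frame_desc[:, None, :] - map_desc[None, :, :]) ** 2).sum(-1)
    best_in_map = d.argmin(axis=1)     # best map candidate per frame descriptor
    best_in_frame = d.argmin(axis=0)   # best frame candidate per map descriptor
    return [(i, int(j)) for i, j in enumerate(best_in_map)
            if best_in_frame[j] == i]
```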
  • estimation 6DoF procedure 530 is split onto N processors.
  • the input of this step is the N lists Mi (from the matching).
  • the 6DoF estimation minimizes, for each pair (pi, p3di) in M, the 2D difference between the projection of p3di in the current frame and pi.
  • This minimization is performed with the non-linear least square algorithm Levenberg-Marquardt combined with the M-Estimator (robust method) of Geman-McClure.
  • the robust method of Levenberg-Marquardt is used on N processors.
  • each processor i computes the reprojection errors of all the elements of Mi: Ei, and computes the Jacobians of the error function of all elements of Mi: Ji. Subsequently, the N partial results Ei are merged into E and the N partial results Ji are merged into J by concatenation. The median absolute deviation of E (MAD) is computed.
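The split error computation and the robust weighting can be sketched as follows. This is a hedged illustration: the toy pinhole projection, the 1.4826 MAD-to-sigma factor, and the Geman-McClure weight formula are common choices assumed here, not taken from the patent text.

```python
# Sketch of the split computation: each of N "processors" handles one slice
# of the match list, the partial errors are concatenated, and the MAD drives
# Geman-McClure robust weights inside a Levenberg-Marquardt-style loop.
# The projection model and constants are illustrative assumptions.
import numpy as np

def reprojection_errors(p2d, p3d, pose):
    """Toy pinhole model: rotate/translate the 3D points, divide by depth."""
    r, t = pose
    cam = p3d @ r.T + t
    proj = cam[:, :2] / cam[:, 2:3]
    return np.linalg.norm(proj - p2d, axis=1)

def split_robust_errors(p2d, p3d, pose, n_proc=4):
    idx = np.array_split(np.arange(len(p2d)), n_proc)
    chunks = [reprojection_errors(p2d[s], p3d[s], pose) for s in idx]
    e = np.concatenate(chunks)                 # merge the N partial results Ei
    mad = np.median(np.abs(e - np.median(e)))  # median absolute deviation
    sigma = 1.4826 * mad + 1e-12               # robust scale (avoid div by 0)
    w = 1.0 / (1.0 + (e / sigma) ** 2) ** 2    # Geman-McClure IRLS weights
    return e, w
```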
  • FIG. 5 B illustrates exemplary flow diagram of splitting the VI-SLAM onto N vector processors for the mapping process based on a splitting algorithm in accordance with embodiments of the current invention.
  • the mapping process takes inputs of features points, descriptors and current pose and generates a cloud of 3D points associated to the descriptors.
  • the mapping process includes a matching procedure 560 and an optimization procedure 570 .
  • Matching procedure 560 is split onto multiple processors.
  • the feature points coming from the localization are matched with a subset of the feature points of the map using the cross-matching method.
  • the splitting algorithm involves: splitting the range of features to match into M equal ranges, with each range processed by a different processor.
  • Optimization procedure 570 optimizes the pose and 3D points of the map to minimize the reprojection error with the Levenberg-Marquardt algorithm.
  • the optimization may further involve: 3D point culling, wherein the 3D points seen in only 2 images are removed because they are not relevant; key frame culling, which removes the key frames (poses of the map) that contain many 3D points in common because they are redundant; and a loop-closure procedure that detects multiple instances of the same 3D points in the map and merges them, in order to save memory and to reduce the drift of the 3D reconstruction.
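The two culling rules can be sketched as simple filters. This is an illustrative sketch: the data layouts and the 0.9 redundancy ratio are assumptions, not from the patent.

```python
# Sketch of the culling rules above. Data layouts and the overlap ratio are
# illustrative assumptions.
def cull_3d_points(observations):
    """Keep only 3D points seen in more than 2 images.
    observations: dict point_id -> set of key-frame ids observing it."""
    return {p: f for p, f in observations.items() if len(f) > 2}

def cull_key_frames(frames, overlap_ratio=0.9):
    """Drop key frames whose 3D points are mostly already covered by the
    frames kept so far (redundancy rule; the ratio is an assumed knob).
    frames: dict frame_id -> set of 3D point ids it observes."""
    kept, seen = [], set()
    for frame_id, points in frames.items():
        if not points or len(points & seen) / len(points) < overlap_ratio:
            kept.append(frame_id)
            seen |= points
    return kept
```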
  • FIG. 6 illustrates an exemplary parallel and an exemplary sequential process in the multi-processor configuration in accordance with embodiments of the current invention.
  • Multiple processors are shown: Processor 601 , Processor 602 , Processor 603 , through Processor 612 .
  • the localization processes, such as localization 611 and localization 612 , are configured to run in serial on all the vector processors. Each localization process is split over all the vector processors.
  • the mapping processes, such as mapping 612 and mapping 619 , are configured to run on all the vector processors. Each mapping process is split over all the vector processors.
  • the localization process and the mapping process are run in parallel.
  • a localization process 621 is configured to occupy processor 601 to processor 608 all the time. Each localization process uses a subset of the vector processors.
  • the mapping process uses the rest of the vector processors, such as processor 609 to processor 612 .
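The two scheduling configurations in FIG. 6 can be sketched as a processor assignment. The 12-processor count and the 8/4 split point match the example above; the function and key names are illustrative assumptions.

```python
# Sketch of the two configurations: sequential (localization and mapping
# alternate, each split over all N vector processors) vs parallel
# (localization pinned to one subset, mapping on the rest).
def assign_processors(n_proc=12, parallel=True, n_localization=8):
    if parallel:
        return {"localization": list(range(n_localization)),
                "mapping": list(range(n_localization, n_proc))}
    # sequential: the two stages take turns, each using every processor
    every = list(range(n_proc))
    return {"localization": every, "mapping": every}
```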
  • FIG. 7 illustrates exemplary diagram of a calibration process in accordance with embodiments of the current invention.
  • the calibration is divided into three consecutive steps.
  • Step 701 is the intrinsic calibration.
  • the intrinsic calibration estimates the intrinsic parameters using a known pattern, such as a planar checkerboard pattern or a 3D pattern composed of multiple checkerboards with different position and orientation.
  • the intrinsic calibration does not require overlapping area. This can be done before the cameras are mounted.
  • Step 702 is the extrinsic calibration.
  • the extrinsic calibration estimates the rotation and translation between each camera; this requires the intrinsic calibration and a known pattern visible in the overlapping area. This step must be done once all the cameras are mounted.
  • the triangulation of the checkerboard seen at the same time by the cameras gives the baseline between the cameras.
  • Step 703 is the intrinsic and extrinsic refinement using a motion. The device is moved in 6DoF following a known and pre-determined motion (on all the translation and orientation axes) while the frames are recorded. A structure-from-motion algorithm is then used to estimate the motion and the 3D points from the recorded frames. During this step, the intrinsic and extrinsic parameters are optimized to refine their values using a motion constraint based on the known and pre-determined motion: the optimization algorithm estimates the set of intrinsic and extrinsic parameters which are required to explain the known and pre-determined motion.
  • FIG. 8 illustrates an exemplary flow chart for the 6DoF inside-out tracking process in accordance with embodiments of the current invention.
  • the apparatus obtains overlapping image frames and sensor inputs of an apparatus, wherein the sensor inputs comprise gyrometer data, accelerometer data and magnetometer data.
  • the apparatus splits computation work onto a plurality of vector processors to obtain six degree of freedom (6DoF) outputs of the apparatus based on a splitting algorithm.
  • the apparatus performs a localization process to generate 6DoF estimations, and a mapping process to generate a cloud of three-dimensional points associated to the descriptors of the map.
  • the 6DoF tracking method provides an efficient way to track the 6DoF pose for an AR/VR system.
  • One configuration of the AR/VR system involves tracking one device in the AR/VR environment.
  • Another configuration includes a head mount device (HMD) and one or more handheld/hand-control game controllers.
  • HMD head mount device
  • a world coordinate system is used to coordinate the multiple devices.
  • the world coordinate system is a common coordinate system shared by multiple devices in a system. With multiple devices sharing the same coordinate system, an origin/starting point is needed.
  • FIG. 9 illustrates exemplary diagrams for registering initial 6DoF pose in an HMD tracking system in accordance with embodiments of the current invention.
  • the HMD tracking system is an AR/VR system with multiple 6DoF tracking devices, including an HMD 901 and a game controller 902 . In other embodiments, more devices can be included.
  • HMD 901 includes a sensor unit and one or more processors 917 and a memory 918 coupled with the one or more processors 917 .
  • the sensor unit includes one or more sensors and/or cameras 911 , 912 , and 915 .
  • the sensor unit also includes an IMU 916 that detects movements, rotations, and magnetic headings of HMD 901 .
  • Game controller 902 includes a sensor unit and one or more processors 927 and a memory 928 coupled with the one or more processors 927 .
  • the sensor unit includes one or more sensors and/or cameras 921 , 922 , and 925 .
  • the sensor unit also includes an IMU 926 that detects movements, rotations, and magnetic headings of game controller 902 .
  • the exemplary HMD 901 and game controller 902 may include other modules or be configured with other blocks as shown in FIG. 1 .
  • the computation and process split may be shared or separately performed for each device.
  • Each tracking device tracks 6DoF poses in a coordinate system 930 .
  • Each device tracks 6DoF outputs, which include six dimensions of the apparatus: three dimensions (3D) of an orientation in a rotation space 931 and a three-dimensional translation in a 3D space 932 .
  • Each AR/VR tracking device such as HMD 901 and game controller 902 , obtains and maintains a map.
  • the map is a collection of 2D and 3D points in the space relative to a coordinate system.
  • both real objects and virtual objects are presented in the coordinate system. For example, real objects, such as a table 935 and a plate 936 , and virtual objects, such as cheese 937 , are both presented in the AR system in reference to the coordinate system 930 .
  • Each tracking device collects local information through its sensor unit. Each map is generated based on the local information collected.
  • each device tracks its own local information relative to its own coordinate system, which may or may not be the same as the coordinate system 930 .
  • the purpose of the registration is to synchronize the coordinate system of each individual device to a world coordinate system.
  • a world coordinate system is a common coordinate system shared by multiple devices in a system. In an AR/VR environment, its origin is typically the starting position of the HMD device.
  • one base map is shared by multiple devices.
  • One or more devices receive a base map and generate their 6DoF relative to the base map received.
  • the base map may be generated by the HMD and shared with the game controller. In another embodiment, the base map may be generated by the game controller and shared with the HMD.
  • the base map may also be generated remotely by other devices in the AR/VR system.
  • a single base map is shared by multiple devices in the AR/VR system.
  • the base map is in reference to the world coordinate system for the HMD tracking system.
  • Each 6DoF generated from the base map by each tracking device, therefore, is in reference to the world coordinate system.
  • each device generates its map in its own coordinate system or in reference to the same coordinate system.
  • the initialization process synchronizes the maps such that the 6DoF generated by each device are synchronized.
  • In the second scenario, when different maps are generated in reference to different coordinate systems, the different coordinate systems of each device are synchronized to the world coordinate system during the initialization process.
  • the combination of the shared map and individual map/coordinate system may be used for different devices.
  • the synchronization methods presented below apply to the combination scenarios.
  • a world coordinate system is initialized/synchronized to be shared by multiple devices.
  • the virtual view of the world coordinate system 930 is streamed to one or more viewing devices 960 , such as a mobile device 961 and/or a desktop/laptop 962 .
  • the virtual view of 930 is streamed in real-time.
  • the virtual view is broadcasted to multiple remote viewing devices.
  • FIG. 10 illustrates exemplary diagrams of registering initial 6DoF pose using a physical button on a game controller in an HMD tracking system in accordance with embodiments of the current invention.
  • a physical button is used to register the initial 6DoF pose in the world coordinate system.
  • An exemplary game controller 1001 has a sensor unit including multiple cameras, such as cameras 1101 , 1102 and 1103 .
  • the sensor unit of game controller 1001 also includes an IMU that detects movements, rotations, and magnetic headings of the apparatus.
  • the sensor unit tracks a series of motions of the apparatus and collects local information of the apparatus relative to a map of its own coordinate system.
  • An initialization button 1108 , when pressed, registers an initial 6DoF pose of the apparatus, wherein the initial pose comprises a 3D coordinate set to a predefined coordinate on the map and the pose of the apparatus at the time the button is pressed.
  • Game controller 1001 includes one or more processors configured to communicate with the HMD system to generate a series of six degree of freedom (6DoF) outputs of the apparatus based on the series of motions and local information, wherein each 6DoF output comprises six dimensions of the apparatus: three dimensions (3D) of an orientation in a rotation space and a three-dimensional translation in a 3D space.
  • the sensor unit of game controller 1001 tracks motions of the apparatus and generates pose changes relative to the predefined coordinate on the map.
  • the 6DoF includes the 3D coordinate and the pose of the game controller.
  • the initial 6DoF pose is registered with the world coordinate system.
  • An initial 3D coordinate is predefined.
  • when the button is pressed with the game controller at pose 1051, the pose of the game controller is registered, and the 3D coordinate of the initial 6DoF is registered at a predefined position 1080.
  • the initial 6DoF of game controller 1001 is registered at the initial 6DoF 1052 .
  • the predefined 3D coordinate 1080 is at the center of the plate.
  • when the game controller moves and/or rotates to 1002, with a 6DoF 1061, and the initialization button 1008 is pressed, the 3D coordinate of the game controller is registered at the same predefined 3D coordinate 1080, the center of the plate.
  • the game controller's pose is also registered.
  • the initial 6DoF of game controller 1002 is registered at the initial 6DoF 1062 .
  • the game controller also includes a map module that obtains map data to generate the map and to register the initial pose to the map.
  • a 2D 6DoF tag is used to register the initial 6DoF in the HMD system.
  • FIG. 11 A illustrates exemplary diagrams of registering initial 6DoF pose using a two-dimensional (2D) 6DoF tag in an HMD tracking system in accordance with embodiments of the current invention.
  • the HMD tracking system in an AR/VR environment has multiple tracking devices.
  • a 2D 6DoF tag 1101 is attached to a first device.
  • the second device derives a 6DoF 1103 relative to the map of the world coordinate system.
  • the 6DoF 1103 comprises three dimensions of an orientation in a rotation space and three dimensions of translation in a 3D space.
  • the second device derives ⁇ x, y, z, roll, pitch, yaw ⁇ parameters when 2D 6DoF tag 1101 is in view.
  • the 2D 6DoF tag is an AprilTag.
  • FIG. 11 B illustrates exemplary diagrams of registering initial 6DoF pose by attaching a 2D 6DoF tag on the HMD in accordance with embodiments of the current invention.
  • the 2D 6DoF tag is attached to the HMD in the AR/VR system.
  • HMD 1111 has a 2D 6DoF tag 1112 attached to its surface.
  • a game controller 1113 with a camera, at step 1115, captures the image of the 2D 6DoF tag 1112 on the HMD.
  • Game controller 1113 determines an initial pose relative to the HMD when the 6DoF tag is in its field of view, wherein 6DoF information can be generated from the capture of the tag.
  • the initial 6DoF 1118 is registered in the world coordinate system 1116 .
  • FIG. 11 C illustrates exemplary diagrams of registering initial 6DoF pose by attaching a 2D 6DoF tag on the game controller in accordance with embodiments of the current invention.
  • the 2D 6DoF tag is attached to a game controller in the AR/VR system.
  • Game controller 1123 has a 2D 6DoF tag 1122 attached to its surface.
  • An HMD 1121 with a camera, at step 1125, captures the image of the 2D 6DoF tag 1122 on the game controller.
  • HMD 1121 determines an initial pose relative to the game controller when the 6DoF tag is in its field of view, wherein 6DoF information can be generated from the capture of the tag.
  • the initial 6DoF 1128 is registered in the world coordinate system 1126 .
  • FIG. 12 illustrates an exemplary flow chart of registering initial 6DoF pose in an HMD system in accordance with embodiments of the current invention.
  • the AR/VR system collects locale information of an apparatus, wherein a series of six degree of freedom (6DoF) outputs of the apparatus is generated based on the locale information, and wherein each 6DoF output comprises six dimensions of the apparatus including three dimensions of an orientation in a rotation space and three dimensions of translation in a 3D space.
  • the AR/VR system obtains a map in reference to a world coordinate system of the AR/VR system, wherein the map in reference to the world coordinate system is generated based on locale information collected by the apparatus.
  • the AR/VR system initializes an initial 6DoF position of the apparatus relative to the map in reference to the world coordinate system by an initializing process selected from pressing a physical button on the apparatus or capturing a two-dimensional (2D) 6DoF tag.
  • the AR/VR system performs a localization process to generate localization information of the apparatus based on the apparatus initial position.
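The button and tag registration steps above can be sketched as follows. This is a minimal illustration under assumptions, not the claimed implementation: the function name `register_initial_6dof` and the dictionary pose representation are invented for the example. On a button press, the translation of the initial 6DoF is snapped to the predefined map coordinate (e.g., position 1080, the center of the plate) while the measured orientation is kept; with a 2D 6DoF tag in view, the full {x, y, z, roll, pitch, yaw} comes from the tag detection.

```python
import numpy as np

def register_initial_6dof(trigger, predefined_coord, measured_pose):
    """Register the initial 6DoF pose of a device in the world coordinate system.

    trigger          -- "button" (physical button press) or "tag" (2D 6DoF tag in view)
    predefined_coord -- (x, y, z) map coordinate the button press is anchored to
    measured_pose    -- {"translation": (x, y, z), "rotation": (roll, pitch, yaw)}
    """
    rotation = np.asarray(measured_pose["rotation"], dtype=float)
    if trigger == "button":
        # Button press: the 3D coordinate is the predefined position on the map;
        # the orientation is whatever the device measures at the instant of the press.
        return {"translation": np.asarray(predefined_coord, dtype=float),
                "rotation": rotation}
    if trigger == "tag":
        # Tag capture: the full pose is derived from the tag detection.
        return {"translation": np.asarray(measured_pose["translation"], dtype=float),
                "rotation": rotation}
    raise ValueError("unknown trigger: " + trigger)
```

Subsequent 6DoF outputs of the device are then pose changes relative to this registered initial pose.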

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

Methods and apparatus are provided for 6DoF inside-out tracking game control. In one novel aspect, a multi-processor architecture is used for VI-SLAM. In one embodiment, the apparatus obtains overlapping image frames and sensor inputs of an apparatus, wherein the sensor inputs comprise gyrometer data, accelerometer data and magnetometer data, splits computation work onto a plurality of vector processors to obtain six degree of freedom (6DoF) outputs of the apparatus based on a splitting algorithm, and performs a localization process to generate 6DoF estimations, and a mapping process to generate a cloud of three-dimensional points associated to the descriptors of the map. In one embodiment, the localization process and mapping process are configured to run sequentially. In another embodiment, the localization process and mapping process are configured to run in parallel.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation-in-part, and claims priority under 35 U.S.C. § 120 from, nonprovisional U.S. patent application Ser. No. 17/075,853, entitled “6DOF INSIDE-OUT TRACKING GAME CONTROLLER”, filed on Oct. 21, 2020, the subject matter of which is incorporated herein by reference. Application Ser. No. 17/075,853, in turn, claims priority under 35 U.S.C. § 120 from nonprovisional U.S. patent application Ser. No. 15/874,842, entitled “6DOF INSIDE-OUT TRACKING GAME CONTROLLER”, filed on Jan. 18, 2018, the subject matter of which is incorporated herein by reference. Application Ser. No. 15/874,842, in turn, claims priority under 35 U.S.C. § 119 from U.S. Provisional Application No. 62/447,867 entitled “A MULTI AGENT STEROSCOPIC CAMERA BASED POSITION AND POSTURE TRACKING SYSTEM FOR PORTABLE DEVICE” filed on Jan. 18, 2017, the subject matter of which is incorporated herein by reference.
  • TECHNICAL FIELD
  • The present invention relates generally to VR/AR systems, and more particularly to 6DoF inside-out tracking game controllers; head mount device inside-out tracking and multi-agent interaction; and robot position and posture tracking, route planning, and collision avoidance.
  • BACKGROUND
  • Virtual reality (VR) and augmented reality (AR) are expected to continue to grow rapidly. The development of new technology in both hardware and software could help the AR/VR market grow even faster. With more applications using the technology, the system is required to run faster, be more accurate, and localize without any drift.
  • The SLAM (simultaneous localization and mapping) algorithm is widely adopted to improve the system. However, there are three issues with using the SLAM algorithm: the scale factor, the drift problem (even with a stereo camera), and the long processing time. The state-of-the-art solutions for the drift are on-line loop-closure and on-line re-localization (used in ORB-SLAM). Both are based on a bag-of-words approach (to store every patch). But the update of this bag of words is very time/CPU consuming.
  • Further, six degree of freedom (6DoF) data of a game controller are needed for the AR/VR system. However, the game controllers of today are not efficient and fast enough to produce the 6DoF data in real time. The three dimensions of the translation in the 3D space are not obtained by the game controllers on the market.
  • Enhancement and improvement are required for the tracking game controller.
  • SUMMARY
  • Methods and apparatus are provided for 6DoF inside-out tracking game control. In one novel aspect, a multi-processor structure is used for VI-SLAM. In one embodiment, the apparatus obtains overlapping image frames and sensor inputs of an apparatus, wherein the sensor inputs comprise gyrometer data, accelerometer data and magnetometer data, splits computation work onto a plurality of vector processors to obtain six degree of freedom (6DoF) outputs of the apparatus based on a splitting algorithm, and performs a localization process to generate 6DoF estimations, and a mapping process to generate a cloud of three-dimensional points associated to the descriptors of the map. In one embodiment, the splitting algorithm involves dividing a current frame into N equal parts, with each of a set of selected vector processors processing a portion of the current frame based on a split-by-corner rule, wherein the split-by-corner rule determines whether each pixel is a corner and classifies each pixel determined to be a corner into a compressed descriptor by converting a sub-image centered on the pixel to a 16-float descriptor using a base matrix. In one embodiment, the localization process and mapping process are configured to run sequentially, wherein the localization process is split over all of the vector processors and the mapping process is split over all of the vector processors. In another embodiment, the localization process and mapping process are configured to run in parallel, wherein the localization process is split over a first subset of the vector processors and the mapping process is split over the remaining vector processors.
In one embodiment, the 6DoF outputs are in one format selected from an output format group comprising: six floating point values with three for the translated 3D space and three for the rotation space, twelve floating point values with three for the translated 3D space and nine for the rotation space, six fixed point values with three for the translated 3D space and three for the rotation space, and twelve fixed point values with three for the translated 3D space and nine for the rotation space.
  • In one novel aspect, a map of the background environment is generated in advance. This reference map is a batch of visual features with pre-estimated 3D position and visual feature description. The map is used for real-time localization. During the localization process, the 3D position of the features is not updated, so the map is static. Because the map is known, there is no need to map the environment constantly. And because the map is static, the localization will not drift. The potential issue of this approach is a failure of the localization when we move too far from the reference map. We solve this problem using a light SLAM algorithm.
  • In one embodiment, a client-server topology is used in deploying the mapping and localization technology, which makes the client lighter in computing and less power hungry. There can be one or more clients working on the server network, or a client can work on its own without a server at the cost of power consumption.
  • In another embodiment, tracking and localization are based on a known map. This allows the system to achieve fast processing speed, which is useful for VR/AR applications. A calibrated stereo camera is provided in this approach to fix the scale-factor problem.
  • Other embodiments and advantages are described in the detailed description below. This summary does not purport to define the invention. The invention is defined by the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, where like numerals indicate like components, illustrate embodiments of the invention.
  • FIG. 1 illustrates an exemplary block diagram of a 6DoF inside-out tracking game controller in accordance with embodiments of the current invention.
  • FIG. 2 illustrates exemplary multi-camera configurations in accordance with embodiments of the current invention.
  • FIG. 3 illustrates an exemplary data flow for the real time stereo tracking for the game controller in accordance with embodiments of the current invention.
  • FIG. 4 illustrates an exemplary location system data flow for the real time stereo tracking for the game controller in accordance with embodiments of the current invention.
  • FIG. 5A illustrates an exemplary flow diagram of splitting the VI-SLAM onto N vector processors for the localization process based on a splitting algorithm in accordance with embodiments of the current invention.
  • FIG. 5B illustrates an exemplary flow diagram of splitting the VI-SLAM onto N vector processors for the mapping process based on a splitting algorithm in accordance with embodiments of the current invention.
  • FIG. 6 illustrates an exemplary parallel and an exemplary sequential process in the multi-processor configuration in accordance with embodiments of the current invention.
  • FIG. 7 illustrates exemplary diagram of a calibration process in accordance with embodiments of the current invention.
  • FIG. 8 illustrates an exemplary flow chart for the 6DoF inside-out tracking process in accordance with embodiments of the current invention.
  • FIG. 9 illustrates exemplary diagrams for registering initial 6DoF pose in an HMD tracking system in accordance with embodiments of the current invention.
  • FIG. 10 illustrates exemplary diagrams of registering initial 6DoF pose using a physical button on a game controller in an HMD tracking system in accordance with embodiments of the current invention.
  • FIG. 11A illustrates exemplary diagrams of registering initial 6DoF pose using a two-dimensional (2D) 6DoF tag in an HMD tracking system in accordance with embodiments of the current invention.
  • FIG. 11B illustrates exemplary diagrams of registering initial 6DoF pose by attaching a 2D 6DoF tag on the HMD in accordance with embodiments of the current invention.
  • FIG. 11C illustrates exemplary diagrams of registering initial 6DoF pose by attaching a 2D 6DoF tag on the game controller in accordance with embodiments of the current invention.
  • FIG. 12 illustrates an exemplary flow chart of registering initial 6DoF pose in an HMD system in accordance with embodiments of the current invention.
  • DETAILED DESCRIPTION
  • Reference will now be made in detail to some embodiments of the invention, examples of which are illustrated in the accompanying drawings.
  • FIG. 1 illustrates an exemplary block diagram of a 6DoF inside-out tracking game controller 100 in accordance with embodiments of the current invention. Game controller 100 is not limited to functioning as a game controller only. It can be an apparatus used in other scenarios and in combination with other apparatus. In one novel aspect, the 6DoF is produced in real time by game controller 100. In one embodiment, game controller 100 includes a plurality of sensors, such as sensors 121, 122, and 125. In one embodiment, the sensors are cameras. The plurality of cameras of game controller 100 generate overlapping images.
Game controller 100 also includes an inertial measurement unit (IMU) 131, an optional external memory card (SD Card) 132, and one or more wireless interfaces 133, such as a WiFi interface and a Bluetooth interface. An interface module 111 communicates with and controls the sensors, IMU 131, SD 132, and the wireless interfaces, such as WiFi 133 and Bluetooth 134. A hardware accelerator and image signal processing unit 112 helps image processing of the sensor inputs. IMU 131 detects movements, rotations, and the magnetic heading of game controller 100. In one embodiment, IMU 131 is an integrated 9-axis sensor for the detection of movements, rotations, and magnetic heading. It comprises a triaxial, low-g acceleration sensor, a triaxial angular rate sensor, and a triaxial geomagnetic sensor. IMU 131 senses orientation, angular velocity, and linear acceleration of game controller 100. In one embodiment, game controller 100 processes IMU data at a frame rate of at least 500 Hz.
  • In one embodiment, a plurality of cameras are mounted on the outer case of the game controller to generate overlapping views for the game controller. Using multiple cameras with overlapping views has many advantages compared to a monocular solution: the scale factor of the 3D motion does not drift, the 3D points seen in the overlapping area can be triangulated without a motion of the device, the matching in the overlapping area is faster and more accurate using the epipolar geometry, and the global field of view is wider, which increases the accuracy and reduces the jittering.
  • FIG. 2 illustrates exemplary multi-camera configurations in accordance with embodiments of the current invention. The multi-camera configurations are not limited by the examples shown. In general, a plurality of cameras can be used to capture stereo images such that overlapping images are obtained. A game controller 210 has a straight outer case. Two cameras 211 and 212 are mounted close to the two ends of the outer case, and the lenses of the cameras are parallel with the surface of the outer case. A game controller 220 is configured similarly to game controller 210, with two cameras mounted close to the edges of the straight outer case. In other scenarios, four cameras are used. A game controller 230 has a straight outer case. Four cameras 231, 232, 233, and 234 are mounted close to the edges of the outer case. The cameras that are on the same side but opposite to each other are not inline. Similarly to game controller 230, game controller 240 also has a straight outer case, with four cameras 241, 242, 243, and 244 mounted close to the edges of the outer case. The cameras that are on the same side but opposite to each other are inline. A game controller 250 has a square outer case with four cameras 251, 252, 253, and 254. Each of the cameras is mounted on one side of the outer case. A game controller 260 has a “V”-shaped outer case. Four cameras 261, 262, 263, and 264 are mounted on the outer case of game controller 260, with two cameras on each leg of the outer case. In another example, a game controller 270 has an outer case in the shape of a half hexagon. Game controller 270 has four cameras, with two mounted on each of the two outer legs of the outer case. Other configurations and/or different numbers of cameras can be configured.
  • FIG. 3 illustrates an exemplary data flow for the real-time stereo tracking for the game controller in accordance with embodiments of the current invention. Two processes are used: one localization process 312 for the real-time tracking, and one mapping process 311 for the map update. One or more stereo cameras 341 are used. The obtained images are gathered into one frame 331, which through interface A is passed to the localization process 312. Similarly, IMU unit 342 talks to the accelerometer, gyrometer, and magnetometer 332, whose data through interface B are passed to localization process 312. The localization procedure generates the 6DoF pose, IMU, and feature points 322 and passes them through interface G of mapping and loop closure process 311. Mapping and loop closure 311 receives the 6DoF pose, IMU, and feature points through interface G. Mapping and loop closure 311 generates descriptors and 3D points 321 and sends them to localization 312 via its own interface F and the localization interface C. In one embodiment, mapping and loop closure 311 also generates the 6DoF pose for the game controller through its interface H. FIG. 4 illustrates an exemplary location system data flow for the real-time stereo tracking for the game controller in accordance with embodiments of the current invention. The interfaces A, B, C, D, and E each correspond to the same interfaces as marked in FIG. 3. A feature detection procedure 411 receives stereo images from interface A and descriptors base 451. Feature detection 411 detects features 452 and passes them to the match-with-local-map procedure 412. Procedure 412 also receives descriptors and 3D points 462 from local map 461. Local map 461 generates data 462 based on input from interface C, which is the output from the mapping procedure. Procedure 412 generates 2D and 3D points 453 of the matching result and passes them to a compute pose procedure 413.
The localization procedure also receives IMU inputs from interface B and passes them to a pose prediction procedure 414. Procedure 414 further receives 6DoF pose 465. 6DoF pose 465 is generated by current pose 464 based on inputs from interface E. Compute pose procedure 413, based on inputs 453 and 454, compares the result with a threshold at step 421. If step 421 determines that the number of inliers is greater than the threshold, current pose 463 is passed to current pose procedure 464. If step 421 determines that the number of inliers is not greater than the threshold, the 6DoF pose, IMU, and feature points are passed to the mapping process through interface D.
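The inlier test at step 421 can be illustrated with a pinhole reprojection check. This is a sketch under assumptions, not the patented implementation: the names `count_inliers` and `tracking_ok`, the 3×3 intrinsic matrix K, and the pixel tolerance are invented for the example; `matches` are the (2D feature, 3D map point) pairs produced by procedure 412.

```python
import numpy as np

def count_inliers(R, t, K, matches, pixel_tol=2.0):
    """Count matches whose map 3D point reprojects within pixel_tol
    of its 2D feature under the candidate pose (R, t)."""
    n = 0
    for p2d, p3d in matches:
        pc = R @ np.asarray(p3d, dtype=float) + t     # point in camera frame
        h = K @ pc                                    # pinhole projection
        uv = h[:2] / h[2]
        if np.linalg.norm(uv - np.asarray(p2d, dtype=float)) < pixel_tol:
            n += 1
    return n

def tracking_ok(R, t, K, matches, inlier_threshold=30):
    """Step 421 decision: keep the pose when the inlier count exceeds the
    threshold; otherwise the frame data are handed to the mapping process
    (interface D)."""
    return count_inliers(R, t, K, matches) > inlier_threshold
```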
  • In one novel aspect, the VI-SLAM algorithm is split to run on a plurality of processors based on a splitting algorithm and the sensor inputs.
  • FIG. 5A illustrates exemplary flow diagram of splitting the VI-SLAM onto N vector processors for the localization process based on a splitting algorithm in accordance with embodiments of the current invention. In one embodiment, the localization process is a pipeline implemented as a state machine including a feature extraction procedure 510, a matching procedure 520, and a 6DoF estimation procedure 530.
  • In one embodiment, the feature detection and extraction procedure 510 is split to run on N vector processors following the splitting rule. Step 511 divides the current frame to be processed into N equal parts. Step 512 assigns each frame part to a corresponding vector processor. Each processor processes one part of the frame following a predefined algorithm. First, corners are determined. For each pixel pi, described by a 2D coordinate in the image, and an adjustable threshold t, pi is determined to be a corner if there exists a set of K contiguous pixels in the neighbor circle which are all brighter than (pi+t) or all darker than (pi−t). In some embodiments, threshold t is in the range of 5&lt;t&lt;200. In another embodiment, K is in the range of 5&lt;K&lt;13. In yet another embodiment, the neighbor circle has a radius of three pixels. Subsequently, at the second step, each corner pixel pi is classified, using an n×n sub-image centered on pi, into a compressed descriptor. This is done using a base matrix to convert each sub-image to a 16-float descriptor. The base matrix is computed with a singular value decomposition on a large set of selected features. In one embodiment, the n×n sub-image is 11×11. Let P=(p1, . . . , pn) be the list of feature points (2D coordinates in the image) detected from the current frame, and let D=(d1, . . . , dn) be the list of descriptors, pairing each feature point with its associated descriptor.
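The corner test and descriptor compression described above can be sketched as follows. This is a minimal illustration: the 16-pixel radius-3 circle is the standard FAST layout, the base matrix is assumed precomputed (e.g., from an SVD over a large feature set), and the function names are invented for the example.

```python
import numpy as np

# Offsets of the 16-pixel Bresenham circle of radius 3 around a candidate pixel.
CIRCLE = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
          (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_corner(img, x, y, t=20, K=9):
    """(x, y) is a corner if K contiguous circle pixels are all brighter
    than img[y, x] + t or all darker than img[y, x] - t."""
    p = int(img[y, x])
    ring = [int(img[y + dy, x + dx]) for dx, dy in CIRCLE]
    for sign in (1, -1):                     # brighter run, then darker run
        flags = [sign * (v - p) > t for v in ring]
        flags = flags + flags                # duplicate to handle wrap-around
        run = 0
        for f in flags[:16 + K]:
            run = run + 1 if f else 0
            if run >= K:
                return True
    return False

def compress_descriptor(patch, base):
    """Project a flattened n*n sub-image onto the base matrix to obtain
    the compact 16-float descriptor."""
    return base @ patch.reshape(-1).astype(float)
```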
  • In another embodiment, the matching procedure 520 is split onto N vector processors. Step 521 splits the descriptor list into N parts. In one embodiment, the descriptor list is equally split into N parts. Step 522 performs descriptor matching for each descriptor range Di by matching Di with a subset of the map descriptors. The descriptors are split into N equal ranges, and each vector processor i (0&lt;i&lt;N+1) runs the matching algorithm on the range Di. The descriptors Di are matched with a subset of the descriptors of the map LocalMap, using the cross-matching method: each match is a pair of descriptors (da,db) such that da is the best candidate for db among the descriptors Di of the current frame and db is the best candidate for da among the descriptors of the map LocalMap. Some of the descriptors of the map are associated with 3D points geo-referenced in the world (this 3D estimation is performed by the mapping algorithm). So the matching associates each descriptor di of D with a 3D point p3d of the LocalMap. The output of the matching is a list of descriptor pairs associating the feature points P to the 3D points of the map: Mi=((p1,p3d1), . . . , (pn,p3dn)).
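The cross-matching (mutual best candidate) rule can be sketched as follows; a minimal example under assumptions, where descriptors are rows of float arrays, Euclidean distance stands in for the actual descriptor metric, and the function name is invented. Each processor would run this on its own range Di of the frame descriptors.

```python
import numpy as np

def cross_match(frame_desc, map_desc):
    """Mutual-best-match: keep a pair (a, b) only if map descriptor b is the
    nearest neighbour of frame descriptor a AND a is the nearest neighbour
    of b among the frame descriptors."""
    # Pairwise Euclidean distances between the two descriptor sets.
    d = np.linalg.norm(frame_desc[:, None, :] - map_desc[None, :, :], axis=2)
    best_for_frame = d.argmin(axis=1)   # best map candidate for each frame desc
    best_for_map = d.argmin(axis=0)     # best frame candidate for each map desc
    return [(a, int(b)) for a, b in enumerate(best_for_frame)
            if best_for_map[b] == a]
```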
  • In yet another embodiment, the 6DoF estimation procedure 530 is split onto N processors. The input of this step is the N lists Mi (from the matching). The 6DoF estimation minimizes, for each pair (pi,p3di) in M, the 2D difference between the projection of p3di in the current frame and pi. This minimization is performed with the non-linear least squares algorithm Levenberg-Marquardt combined with the M-estimator (robust method) of Geman-McClure. The robust Levenberg-Marquardt method is run on N processors. Once split, each processor i computes the reprojection error of all the elements of Mi: Ei, and computes the Jacobian of the error function of all elements of Mi: Ji. Subsequently, the N Ei are merged into E and the N Ji are merged into J by concatenation. The median of the absolute deviation of E (MAD) is computed. The estimation of the 6DoF is obtained by solving the linear system (JTJ)X=JTE·MAD, where X is the update of the 6DoF.
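The merge-and-solve step above can be sketched in a few lines. This is an illustration under assumptions: the Geman-McClure weights and the Levenberg-Marquardt damping term are omitted for brevity, MAD is taken as the median absolute deviation of the residuals, and the function name is invented.

```python
import numpy as np

def merge_and_solve(E_parts, J_parts):
    """Merge the per-processor residuals Ei and Jacobians Ji by
    concatenation, then solve (J^T J) X = J^T E * MAD for the 6DoF
    update X."""
    E = np.concatenate(E_parts)                  # all reprojection errors
    J = np.concatenate(J_parts)                  # stacked (m, 6) Jacobian
    mad = np.median(np.abs(E - np.median(E)))    # median absolute deviation of E
    return np.linalg.solve(J.T @ J, (J.T @ E) * mad)
```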
  • FIG. 5B illustrates exemplary flow diagram of splitting the VI-SLAM onto N vector processors for the mapping process based on a splitting algorithm in accordance with embodiments of the current invention. The mapping process takes inputs of features points, descriptors and current pose and generates a cloud of 3D points associated to the descriptors. The mapping process includes a matching procedure 560 and an optimization procedure 570. Matching procedure 560 is split onto multiple processors. The features points coming from the localization are matched with a subset of features points of the map using the cross matching method. The splitting algorithm involves: splitting the range of features to match in M equal ranges. Each range is processed using a different processor. Each match performed with a descriptor which is not already associated to a 3D point of the map allows the triangulate on of a new 3D point. Optimization procedure 570 optimizes the pose and 3D points of the map to minimize the reprojection error with the Levenberg-Marquardt algorithm. The optimization may further involve: 3D point culling wherein the 3D points seen only in 2 images are removed because they are not relevant; a key frame culling that removes the key frames (poses of the map) which contain a lot of 3D points in common because they are redundant; the loop-closure procedure detects the multiple instances of the same 3D points in the map and merge them in order to save some memory and to reduce the drift of the 3D reconstruction.
  • In one novel aspect, using the multi-processor architecture, the efficiencies of the localization process and the mapping process are greatly improved.
  • FIG. 6 illustrates an exemplary parallel and an exemplary sequential process in the multi-processor configuration in accordance with embodiments of the current invention. Multiple processors are used, such as processor 601, processor 602, processor 603, through processor 612. In one embodiment, the localization processes, such as localization 611, and the mapping processes, such as mapping 612 and mapping 619, are configured to run sequentially; each localization process is split over all the vector processors, and each mapping process is split over all the vector processors. In another embodiment, the localization process and the mapping process are run in parallel. A localization process 621 is configured to occupy processors 601 to 608 all the time, so each localization process uses a subset of the vector processors, and the mapping process uses the rest of the vector processors, such as processor 609 to processor 612.
  • FIG. 7 illustrates an exemplary diagram of a calibration process in accordance with embodiments of the current invention. The calibration is divided into three consecutive steps. Step 701 is the intrinsic calibration. The intrinsic calibration estimates the intrinsic parameters using a known pattern, such as a planar checkerboard pattern or a 3D pattern composed of multiple checkerboards with different positions and orientations. The intrinsic calibration does not require an overlapping area and can be done before the cameras are mounted. Step 702 is the extrinsic calibration. The extrinsic calibration estimates the rotation and translation between each camera; this requires the intrinsic calibration and a known pattern seen in the overlapping area. This step must be done once all the cameras are mounted. The triangulation of the checkerboard seen at the same time by the cameras gives the baseline between the cameras. Step 703 is the intrinsic and extrinsic refinement using a motion. The device is moved in 6DoF following a known and pre-determined motion (on all the translation and orientation axes) while recording the frames. Then a structure-from-motion algorithm is used to estimate the motion and the 3D points from the recorded frames. During this step, the intrinsic and extrinsic parameters are optimized to refine their values using a motion constraint based on the known and pre-determined motion: the optimization algorithm estimates the set of intrinsic and extrinsic parameters which is required to explain the known and pre-determined motion.
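The core of the extrinsic step (702), estimating the rotation and translation between two cameras, can be sketched with a Kabsch/Procrustes alignment. This is an illustration under assumptions: it presumes the checkerboard corners have already been expressed as 3D points in each camera's own frame (e.g., after triangulation), and the function name is invented; the full calibration further refines these values in step 703.

```python
import numpy as np

def extrinsics_from_points(pts_a, pts_b):
    """Estimate R, t such that pts_b ≈ R @ pts_a + t (Kabsch alignment).

    pts_a, pts_b -- (n, 3) checkerboard corners in camera A's and camera B's
    coordinate frames, observed at the same time in the overlapping area.
    """
    ca, cb = pts_a.mean(axis=0), pts_b.mean(axis=0)
    H = (pts_a - ca).T @ (pts_b - cb)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```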
  • FIG. 8 illustrates an exemplary flow chart for the 6DoF inside-out tracking process in accordance with embodiments of the current invention. At step 801, the apparatus obtains overlapping image frames and sensor inputs of an apparatus, wherein the sensor inputs comprise gyrometer data, accelerometer data and magnetometer data. At step 802, the apparatus splits computation work onto a plurality of vector processors to obtain six degree of freedom (6DoF) outputs of the apparatus based on a splitting algorithm. At step 803, the apparatus performs a localization process to generate 6DoF estimations, and a mapping process to generate a cloud of three-dimensional points associated to the descriptors of the map.
  • Register Initial 6DoF Pose
  • The 6DoF tracking method provides an efficient way to track the 6DoF pose for an AR/VR system. One configuration of the AR/VR system involves tracking one device in the AR/VR environment. Another configuration includes a head mount device (HMD) and one or more handheld/hand-control game controllers. In an HMD tracking system, more than one 6DoF tracking device is present. A world coordinate system is used to coordinate the multiple devices. The world coordinate system is a common coordinate system shared by multiple devices in a system. With multiple devices sharing the same coordinate system, an origin/starting point is needed.
  • FIG. 9 illustrates exemplary diagrams for registering initial 6DoF pose in an HMD tracking system in accordance with embodiments of the current invention. The HMD tracking system is an AR/VR system with multiple 6DoF tracking devices, including an HMD 901 and a game controller 902. In other embodiments, more devices can be included. HMD 901 includes a sensor unit and one or more processors 917 and a memory 918 coupled with the one or more processors 917. The sensor unit includes one or more sensors and/or cameras 911, 912, and 915. The sensor unit also includes an IMU 916 that detects movements, rotations, and magnetic headings of HMD 901. Game controller 902 includes a sensor unit and one or more processors 927 and a memory 928 coupled with the one or more processors 927. The sensor unit includes one or more sensors and/or cameras 921, 922, and 925. The sensor unit also includes an IMU 926 that detects movements, rotations, and magnetic headings of game controller 902. The exemplary HMD 901 and game controller 902 may include other modules or be configured with other blocks as shown in FIG. 1 . The computation and process split may be shared or separately performed for each device.
  • Each tracking device tracks 6DoF poses in a coordinate system 930. Each device tracks 6DoF outputs, which include six dimensions of the apparatus: three dimensions (3D) of an orientation in a rotation space 931 and three dimensions of translation in a 3D space 932. Each AR/VR tracking device, such as HMD 901 and game controller 902, obtains and maintains a map. The map is a collection of 2D and 3D points in the space relative to a coordinate system. In an AR system, both real objects and virtual objects are presented in the coordinate system. For example, real objects, such as a table 935 and a plate 936, and virtual objects, such as cheese 937, are both presented in the AR system in reference to the coordinate system 930. Each tracking device collects local information through its sensor unit. Each map is generated based on the local information collected.
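A 6DoF output in this sense (three rotational plus three translational dimensions) can be sketched as a rotation matrix plus a translation vector. The class below is an illustrative representation, not the patent's data structure:

```python
# Illustrative 6DoF pose: a 3x3 rotation matrix for the orientation (rotation
# space 931) plus a 3-vector translation for the position (3D space 932).
import numpy as np

class Pose6DoF:
    def __init__(self, rotation, translation):
        self.R = np.asarray(rotation, dtype=float)     # 3D orientation
        self.t = np.asarray(translation, dtype=float)  # 3D translation

    def transform(self, point):
        """Map a 3D point from this pose's local frame into its parent frame."""
        return self.R @ np.asarray(point, dtype=float) + self.t

    def compose(self, other):
        """Chain poses: apply `other` first, then this pose."""
        return Pose6DoF(self.R @ other.R, self.R @ other.t + self.t)
```

Map points expressed in a device's local frame can then be re-expressed in another frame by applying `transform`, and pose chains (device-in-world, point-in-device) are built with `compose`.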
  • With multiple tracking devices in the HMD tracking system, each device tracks its own locale information relative to its own coordinate system, which may or may not be the same as the coordinate system 930. The purpose of the registration is to synchronize the coordinate system of each individual device to a world coordinate system. A world coordinate system is a common coordinate system shared by multiple devices in a system. In an AR/VR environment, its origin is typically the starting position of the HMD device. In one embodiment, one base map is shared by multiple devices. One or more devices receive the base map and generate their 6DoF relative to the base map received. The base map may be generated by the HMD and shared with the game controller. In another embodiment, the base map may be generated by the game controller and shared with the HMD. The base map may also be generated remotely by other devices in the AR/VR system. A single base map is shared by multiple devices in the AR/VR system. The base map is in reference to the world coordinate system of the HMD tracking system. Each 6DoF generated from the base map by each tracking device, therefore, is in reference to the world coordinate system. In another embodiment, each device generates its map in its own coordinate system or in reference to the same coordinate system. In the first scenario, when different maps are generated in reference to the same coordinate system, the initialization process synchronizes the maps such that the 6DoF outputs generated by each device are synchronized. In the second scenario, when different maps are generated in reference to different coordinate systems, the different coordinate systems of the devices are synchronized to the world coordinate system during the initialization process. In other embodiments, with multiple tracking devices, a combination of a shared map and individual maps/coordinate systems may be used for different devices. The synchronization methods presented below apply to the combination scenarios. In each case, a world coordinate system is initialized/synchronized to be shared by the multiple devices.
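The second scenario above (per-device coordinate systems aligned to the world frame at initialization) can be sketched with 4x4 homogeneous transforms. The names below are hypothetical: `T_world_from_device` stands for the alignment estimated during registration, after which every locally tracked pose is re-expressed in the shared world frame.

```python
# Hedged sketch of coordinate-system synchronization using homogeneous
# transforms; the alignment transform is assumed to come from registration.
import numpy as np

def to_homogeneous(R, t):
    # Pack a rotation matrix and a translation vector into a 4x4 transform.
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, dtype=float)
    T[:3, 3] = np.asarray(t, dtype=float)
    return T

def register_device(T_world_from_device, T_device_pose):
    # A locally tracked 6DoF pose, re-expressed in the shared world frame.
    return T_world_from_device @ T_device_pose
```

Once `T_world_from_device` is fixed at initialization, every subsequent local pose from that device maps into the world frame by a single matrix product.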
  • In one novel aspect, the virtual view of the world coordinate system 930 is streamed to one or more viewing devices 960, such as a mobile device 961 and/or a desktop/laptop 962. In one embodiment, the virtual view of 930 is streamed in real-time. In another embodiment, the virtual view is broadcast to multiple remote viewing devices.
  • FIG. 10 illustrates exemplary diagrams of registering the initial 6DoF pose using a physical button on a game controller in an HMD tracking system in accordance with embodiments of the current invention. In one novel aspect, a physical button is used to register the initial 6DoF pose in the world coordinate system. An exemplary game controller 1001 has a sensor unit including multiple cameras, such as cameras 1101, 1102 and 1103. The sensor unit of game controller 1001 also includes an IMU that detects movements, rotations, and magnetic headings of the apparatus. The sensor unit tracks a series of motions of the apparatus and collects locale information of the apparatus relative to a map of its own coordinate system. An initialization button 1108, when pressed, registers an initial 6DoF pose of the apparatus, wherein the initial pose comprises a predefined 3D coordinate on the map and the pose of the apparatus at the time of the press. Game controller 1001 includes one or more processors configured to communicate with the HMD system and to generate a series of six degree of freedom (6DoF) outputs of the apparatus based on the series of motions and the locale information, wherein each 6DoF output comprises six dimensions of the apparatus including three dimensions (3D) of an orientation in a rotation space and three dimensions of translation in a 3D space. The sensor unit of game controller 1001 tracks motions of the apparatus and generates pose changes relative to the predefined coordinate on the map. The 6DoF includes the 3D coordinate and the pose of the game controller. When button 1108 is pressed, the initial 6DoF pose is registered with the world coordinate system. An initial 3D coordinate is predefined. When the button is pressed while the game controller is at pose 1051, the pose of the game controller is registered and the 3D coordinate of the initial 6DoF is registered at a predefined position 1080. The initial 6DoF of game controller 1001 is registered as the initial 6DoF 1052. As an example, the predefined 3D coordinate 1080 is at the center of the plate. If the game controller moves and/or rotates to 1002, with a 6DoF 1061, and the initialization button 1108 is pressed, the 3D coordinate of the game controller is again registered at the same predefined 3D coordinate 1080, the center of the plate, and the game controller's pose is also registered. The initial 6DoF of game controller 1002 is registered as the initial 6DoF 1062. In one embodiment, the game controller also includes a map module that obtains map data to generate the map and to register the initial pose to the map.
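The button-press registration above can be sketched as follows. The function names `register_on_press` and `to_world` are hypothetical, a minimal sketch assuming 4x4 homogeneous transforms: at the press, the controller's current locally tracked pose is pinned to the predefined world coordinate, and the resulting offset corrects all later poses.

```python
# Hedged sketch of button-press registration (FIG. 10), assumed representation.
import numpy as np

def register_on_press(T_local_at_press, predefined_world_xyz):
    # Pin the press-time pose to the predefined 3D coordinate (e.g. the plate
    # centre, item 1080) and return the correcting offset transform.
    T_target = np.eye(4)
    T_target[:3, 3] = np.asarray(predefined_world_xyz, dtype=float)
    return T_target @ np.linalg.inv(T_local_at_press)

def to_world(T_offset, T_local):
    # Apply the stored offset to any subsequently tracked local pose.
    return T_offset @ T_local
```

Immediately after the press, `to_world` maps the press-time pose exactly onto the predefined coordinate; as the controller keeps moving, the same offset carries its pose changes into the world frame.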
  • In another novel aspect, a 2D 6DoF tag is used to register the initial 6DoF in the HMD system.
  • FIG. 11A illustrates exemplary diagrams of registering the initial 6DoF pose using a two-dimensional (2D) 6DoF tag in an HMD tracking system in accordance with embodiments of the current invention. The HMD tracking system in an AR/VR environment has multiple tracking devices. A 2D 6DoF tag 1101 is attached to a first device. When a sensor, such as a camera 1102 of a second device, captures the 2D 6DoF tag 1101, the second device derives a 6DoF 1103 relative to the map of the world coordinate system. The 6DoF 1103 comprises three dimensions of an orientation in a rotation space and three dimensions of translation in a 3D space. The second device derives the {x, y, z, roll, pitch, yaw} parameters when 2D 6DoF tag 1101 is in view. In one embodiment, the 2D 6DoF tag is an AprilTag.
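Once a tag detector (e.g. an AprilTag pipeline) has produced a 4x4 camera-from-tag transform, the {x, y, z, roll, pitch, yaw} parameters can be extracted as sketched below. The ZYX Euler convention used here is an assumption for illustration; the patent does not specify one, and the detection step itself is assumed to have run upstream.

```python
# Hedged sketch: unpack a 4x4 camera-from-tag transform into the six 6DoF
# parameters {x, y, z, roll, pitch, yaw}.
import numpy as np

def pose_to_6dof(T):
    x, y, z = T[:3, 3]
    R = T[:3, :3]
    # ZYX (yaw-pitch-roll) Euler extraction, valid away from pitch = +/-90 deg.
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2]))
    roll = np.arctan2(R[2, 1], R[2, 2])
    return x, y, z, roll, pitch, yaw
```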
  • FIG. 11B illustrates exemplary diagrams of registering the initial 6DoF pose by attaching a 2D 6DoF tag on the HMD in accordance with embodiments of the current invention. In one embodiment, the 2D 6DoF tag is attached to the HMD in the AR/VR system. HMD 1111 has a 2D 6DoF tag 1112 attached to its surface. A game controller 1113 with a camera, at step 1115, captures the image of the 2D 6DoF tag 1112 on the HMD. Game controller 1113 determines an initial pose relative to HMD 1111 when the 6DoF tag is in its field of view, wherein 6DoF information is generated from the capture of the tag. The initial 6DoF 1118 is registered in the world coordinate system 1116.
  • FIG. 11C illustrates exemplary diagrams of registering the initial 6DoF pose by attaching a 2D 6DoF tag on the game controller in accordance with embodiments of the current invention. In one embodiment, the 2D 6DoF tag is attached to a game controller in the AR/VR system. Game controller 1123 has a 2D 6DoF tag 1122 attached to its surface. An HMD 1121 with a camera, at step 1125, captures the image of the 2D 6DoF tag 1122 on the game controller. HMD 1121 determines an initial pose relative to game controller 1123 when the 6DoF tag is in its field of view, wherein 6DoF information is generated from the capture of the tag. The initial 6DoF 1128 is registered in the world coordinate system 1126.
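The two placements in FIG. 11B and FIG. 11C differ only in which direction the camera-to-tag measurement points. A minimal sketch, ignoring camera-to-device extrinsics (an assumption for brevity): with the tag on the HMD, the controller's camera measures controller-from-HMD, which must be inverted; with the tag on the controller, the HMD's camera measures HMD-from-controller directly.

```python
# Hedged sketch of deriving the controller's initial world 6DoF from the HMD's
# world pose plus the tag observation (4x4 homogeneous transforms assumed).
import numpy as np

def controller_world_pose(T_world_hmd, T_cam_from_tag, tag_on_hmd):
    if tag_on_hmd:
        # FIG. 11B: controller camera sees the tag on the HMD, so the
        # measurement is controller-from-HMD and must be inverted.
        return T_world_hmd @ np.linalg.inv(T_cam_from_tag)
    # FIG. 11C: HMD camera sees the tag on the controller, so the
    # measurement is already HMD-from-controller.
    return T_world_hmd @ T_cam_from_tag
```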
  • FIG. 12 illustrates an exemplary flow chart of registering the initial 6DoF pose in an HMD system in accordance with embodiments of the current invention. At step 1201, the AR/VR system collects locale information of an apparatus, wherein a series of six degree of freedom (6DoF) outputs of the apparatus is generated based on the locale information, and wherein each 6DoF output comprises six dimensions of the apparatus including three dimensions of an orientation in a rotation space and three dimensions of translation in a 3D space. At step 1202, the AR/VR system obtains a map in reference to a world coordinate system of the AR/VR system, wherein the map is generated based on locale information collected by the apparatus. At step 1203, the AR/VR system initializes an initial 6DoF position of the apparatus relative to the map by an initializing process selected from pressing a physical button on the apparatus or capturing a two-dimensional (2D) 6DoF tag. At step 1204, the AR/VR system performs a localization process to generate localization information of the apparatus based on the initial position of the apparatus.
  • Although the present invention has been described in connection with certain specific embodiments for instructional purposes, the present invention is not limited thereto. Accordingly, various modifications, adaptations, and combinations of various features of the described embodiments can be practiced without departing from the scope of the invention as set forth in the claims.

Claims (20)

What is claimed:
1. An apparatus operating in an augmented reality/virtual reality (AR/VR) system comprising:
a sensor unit that tracks a series of motions of the apparatus and collects locale information of the apparatus relative to a world coordinate system shared by a plurality of devices;
an initialization button that registers an initial six degree of freedom (6DoF) pose of the apparatus when pressed, wherein the initial 6DoF pose comprises a predefined three-dimensional (3D) coordinate and an initial pose of the apparatus relative to the world coordinate system; and
one or more processors configured to communicate with the AR/VR system and to generate a series of 6DoF outputs of the apparatus based on the series of motions and the locale information, wherein each 6DoF output comprises six dimensions of the apparatus including three dimensions of an orientation in a rotation space and three dimensions of translation in a 3D space.
2. The apparatus of claim 1, wherein the sensor unit comprises one or more cameras and an inertial measurement unit (IMU), and wherein the IMU detects movements, rotations, and magnetic headings of the apparatus.
3. The apparatus of claim 1, wherein the sensor unit tracks motions of the apparatus and generates pose changes relative to the predefined coordinate in the world coordinate system.
4. The apparatus of claim 1, further comprising a map module that obtains map data to generate a base map in reference to the world coordinate system and register the initial pose to the base map in reference to the world coordinate system.
5. The apparatus of claim 4, wherein the apparatus determines real-time 6DoF outputs based on the map data and locale information relative to the initial pose.
6. A system comprising:
a head mount device (HMD) with an HMD sensor unit and one or more processors configured to communicate with the system and to generate a series of six degree of freedom (6DoF) outputs of the HMD relative to a world coordinate system shared by a plurality of devices;
a game controller with a controller sensor unit and one or more processors configured to communicate with the system and to generate a series of 6DoF outputs of the controller relative to the world coordinate system; and
a two-dimensional (2D) 6DoF tag attached to a first device selected from the HMD and the game controller, wherein a second device determines an initial relative pose to the first device when the 6DoF tag is in the field of view of the second device.
7. The system of claim 6, wherein the HMD sensor unit comprises one or more HMD cameras and an HMD inertial measurement unit (IMU) and the controller sensor unit comprises one or more controller cameras and a controller IMU.
8. The system of claim 6, wherein the 6DoF tag is an AprilTag.
9. The system of claim 6, wherein the controller sensor unit tracks motions of the game controller and generates relative pose changes relative to the initial relative pose.
10. The system of claim 6, wherein the one or more processors of the HMD are further configured to split computation work onto a plurality of vector processors to generate 6DoF outputs based on a splitting algorithm.
11. A method for an augmented reality/virtual reality (AR/VR) system, comprising:
collecting locale information of an apparatus, wherein a series of six degree of freedom (6DoF) outputs of the apparatus is generated based on the locale information, and wherein each 6DoF output comprises six dimensions of the apparatus including three dimensions of an orientation in a rotation space and three dimensions of translation in a 3D space;
obtaining a map in reference to a world coordinate system of the AR/VR system, wherein the map in reference to the world coordinate system is generated based on locale information collected by the apparatus;
initializing an initial 6DoF position of the apparatus relative to the map in reference to the world coordinate system by an initializing process selected from pressing a physical button on the apparatus or capturing a two-dimensional (2D) 6DoF tag; and
performing a localization process to generate localization information of the apparatus based on the apparatus initial position.
12. The method of claim 11, wherein the initializing is performed by pressing the physical button on the apparatus, and wherein the initializing registers the apparatus to a predefined coordinate on the map in reference to the world coordinate system.
13. The method of claim 11, wherein the initializing is performed by capturing a 2D 6DoF tag, and wherein the 2D 6DoF tag is attached to a second apparatus.
14. The method of claim 13, wherein the apparatus determines an initial relative pose to the second apparatus when the 2D 6DoF tag is in the field of view of the apparatus.
15. The method of claim 11, further comprising: receiving real-time map data generated by the AR/VR system.
16. The method of claim 15, wherein the apparatus determines 6DoF outputs based on the received real-time map data.
17. The method of claim 11, further comprising: splitting computation work onto a plurality of vector processors to obtain six degree of freedom (6DoF) outputs of the apparatus based on a splitting algorithm.
18. The method of claim 17, wherein the splitting algorithm involves: dividing a current frame into N equal parts, wherein each of a set of selected vector processors processes a portion of the current frame based on a split-by-corner rule, and wherein the split-by-corner rule comprises determining whether each pixel is a corner and classifying each pixel determined to be a corner into a compressed descriptor by converting a sub-image centered on the pixel to a 16-float descriptor using a base matrix.
19. The method of claim 11, wherein the localization process and mapping process are configured to run sequentially, wherein the localization process is split over all of the vector processors and the mapping process is split over all the vector processors.
20. The method of claim 11, wherein the localization process and mapping process are configured to run in parallel, wherein the localization process is split over a first subset of the vector processors and the mapping process is split over a remaining subset of the vector processors.
US17/935,897 2017-01-18 2022-09-27 6DoF INSIDE-OUT TRACKING GAME CONTROLLER INITIAL REGISTRATION Abandoned US20230014421A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/935,897 US20230014421A1 (en) 2017-01-18 2022-09-27 6DoF INSIDE-OUT TRACKING GAME CONTROLLER INITIAL REGISTRATION

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201762447867P 2017-01-18 2017-01-18
US15/874,842 US10843068B2 (en) 2017-01-18 2018-01-18 6DoF inside-out tracking game controller
US17/075,853 US11504608B2 (en) 2017-01-18 2020-10-21 6DoF inside-out tracking game controller
US17/935,897 US20230014421A1 (en) 2017-01-18 2022-09-27 6DoF INSIDE-OUT TRACKING GAME CONTROLLER INITIAL REGISTRATION

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/075,853 Continuation US11504608B2 (en) 2017-01-18 2020-10-21 6DoF inside-out tracking game controller

Publications (1)

Publication Number Publication Date
US20230014421A1 true US20230014421A1 (en) 2023-01-19

Family

ID=62838531

Family Applications (3)

Application Number Title Priority Date Filing Date
US15/874,842 Expired - Fee Related US10843068B2 (en) 2017-01-18 2018-01-18 6DoF inside-out tracking game controller
US17/075,853 Active 2038-02-27 US11504608B2 (en) 2017-01-18 2020-10-21 6DoF inside-out tracking game controller
US17/935,897 Abandoned US20230014421A1 (en) 2017-01-18 2022-09-27 6DoF INSIDE-OUT TRACKING GAME CONTROLLER INITIAL REGISTRATION

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US15/874,842 Expired - Fee Related US10843068B2 (en) 2017-01-18 2018-01-18 6DoF inside-out tracking game controller
US17/075,853 Active 2038-02-27 US11504608B2 (en) 2017-01-18 2020-10-21 6DoF inside-out tracking game controller

Country Status (2)

Country Link
US (3) US10843068B2 (en)
CN (1) CN110052020B8 (en)


Also Published As

Publication number Publication date
US20210031097A1 (en) 2021-02-04
CN110052020A (en) 2019-07-26
CN110052020B (en) 2022-07-22
US11504608B2 (en) 2022-11-22
CN110052020B8 (en) 2023-06-06
US10843068B2 (en) 2020-11-24
US20180200614A1 (en) 2018-07-19


Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Free format text: NON FINAL ACTION MAILED
STPP Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Free format text: NON FINAL ACTION MAILED
STPP Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
ZAAB Notice of allowance mailed. Free format text: ORIGINAL CODE: MN/=.
STPP Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID
STPP Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
STPP Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED
STPP Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED
STCB Information on status: application discontinuation. Free format text: ABANDONMENT FOR FAILURE TO CORRECT DRAWINGS/OATH/NONPUB REQUEST