CN119563150A - Computer system with handheld controller - Google Patents
- Publication number
- CN119563150A (application number CN202380052831.0A)
- Authority
- CN
- China
- Prior art keywords
- input
- controller
- user
- housing
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0325—Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/033—Indexing scheme relating to G06F3/033
- G06F2203/0339—Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
Abstract
A system may include an electronic device, such as a head-mounted device, and a handheld controller for controlling the electronic device. The hand-held controller may have a housing with an elongate shaft extending between a first end portion and a second end portion. The handheld controller may include a touch sensor for gathering touch input, buttons, a rotating scroll wheel or other input device for gathering other user input, a force sensor for gathering force input, an inertial measurement unit for gathering motion data, a camera for capturing images of the environment, and/or one or more haptic output devices, such as actuators, for providing haptic output. The control circuitry may send control signals to the head-mounted device based on sensor data collected with the handheld controller and user input (e.g., force input, touch input, motion input, voice input, etc.).
Description
The present application claims priority from U.S. patent application Ser. No. 18/348,943, filed July 7, 2023, and U.S. provisional patent application Ser. No. 63/388,196, filed July 11, 2022, which are hereby incorporated by reference in their entirety.
Technical Field
The present invention relates generally to computer systems, and more particularly to input devices for computer systems.
Background
Electronic devices, such as computers, may be controlled using a computer mouse and other input accessories. In a virtual reality system, a force feedback glove may be used to control a virtual object. The mobile phone may have a touch screen display and a vibrator for creating haptic feedback in response to a touch input.
Devices such as these may be inconvenient for the user, may be cumbersome or uncomfortable, or may provide inadequate feedback.
Disclosure of Invention
A system may include an electronic device, such as a head-mounted device, and a handheld controller for controlling the electronic device. The head mounted device or other device may have a display configured to display virtual content overlaid onto real world content.
The hand-held controller may have a housing with an elongate shaft extending between a first end portion and a second end portion. The housing may have a flat surface and a curved surface. The housing may have first and second housing portions operable in a first mode in which the first and second housing portions cooperate with one another and a second mode in which the first and second housing portions are separated from one another to form a stand-alone hand-held controller device.
A touch sensor may be located on the curved surface and may be configured to gather touch inputs, such as swipe inputs, tap inputs, multi-touch inputs, and other touch inputs. Buttons or other input devices may be located on the flat surface. A rotating scroll wheel may be mounted in the housing and may be configured to rotate about an axis perpendicular or parallel to the longitudinal axis of the elongated housing. One or more sensors may be located in one or both of the end portions. For example, one of the end portions may have a force sensor and may be force sensitive. An end portion may have a camera for helping track the position of the handheld controller in space, on a surface, or relative to an electronic device controlled by the handheld controller. One or more haptic output devices, such as actuators, may be mounted in the housing.
The control circuitry in the handheld controller may use the haptic output device to provide haptic output in response to user input such as force input, touch input, motion input, voice input, and/or any other suitable user input. The control circuit may adjust the operational settings of the handheld controller based on the sensor data and the user input. The control circuitry may also send control signals to an external electronic device, such as a head-mounted device, to manipulate display content on the external electronic device, adjust operational settings of the head-mounted device, and/or take other actions in response to sensor data and user input collected with the controller.
The hand-held controller may include an inertial measurement unit, such as one or more accelerometers, gyroscopes, and/or compasses. Visual markers such as passive visual markers and/or active visual markers (e.g., infrared light emitting diodes) may be formed at one or more locations of the housing of the handheld controller. The head-mounted device may include a camera for capturing images of the handheld controller and the visual marker. Control circuitry in the head-mounted device may be configured to track the position of the hand-held controller using the captured images and motion data from the inertial measurement unit.
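As a rough illustration of how camera-based marker fixes and inertial measurement unit data might be combined for tracking, the Python sketch below blends the two with a simple complementary filter. The class name, blend weight, and update rates are assumptions for illustration; a practical tracker would more likely use a Kalman-style filter.

```python
import numpy as np

class ComplementaryTracker:
    """Toy position tracker: integrates IMU acceleration between camera fixes,
    then nudges the estimate toward each camera-based marker fix."""

    def __init__(self, blend=0.2):
        self.blend = blend            # weight given to each camera fix (assumed value)
        self.position = np.zeros(3)   # meters
        self.velocity = np.zeros(3)   # meters per second

    def update_imu(self, accel, dt):
        """Dead-reckon from accelerometer data (gravity assumed already removed)."""
        self.velocity += np.asarray(accel, dtype=float) * dt
        self.position += self.velocity * dt

    def update_camera(self, marker_position):
        """Correct drift using a position fix derived from the visual markers."""
        marker_position = np.asarray(marker_position, dtype=float)
        self.position = (1 - self.blend) * self.position + self.blend * marker_position

# Example: a 100 Hz IMU sample followed by a camera-based correction.
tracker = ComplementaryTracker()
tracker.update_imu(accel=[0.0, 0.1, 0.0], dt=0.01)
tracker.update_camera(marker_position=[0.0, 0.001, 0.0])
print(tracker.position)
```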
Drawings
FIG. 1 is a schematic diagram of an exemplary system with a handheld controller according to an embodiment.
Fig. 2 is a perspective view of an exemplary hand-held controller according to an embodiment.
Fig. 3 is a cross-sectional side view of an exemplary hand-held controller with a removable end portion according to an embodiment.
FIG. 4 is a perspective view of an exemplary handheld controller having a knob that rotates about a longitudinal axis of the handheld controller, according to an embodiment.
Fig. 5 is a perspective view of an exemplary handheld controller having knobs accessible on two opposite sides of the handheld controller and rotatable about an axis perpendicular to a longitudinal axis of the handheld controller, in accordance with an embodiment.
Fig. 6 is a perspective view of an exemplary hand-held controller having a knob accessible on one side of the hand-held controller and rotatable about an axis perpendicular to the longitudinal axis of the hand-held controller, according to an embodiment.
Fig. 7 is a perspective view of an exemplary hand-held controller having a first mating housing portion and a second mating housing portion, according to an embodiment.
Fig. 8 is a perspective view of the exemplary hand-held controller of fig. 7 in a configuration in which the first and second mating housing portions have been separated from one another, according to an embodiment.
FIG. 9 is a diagram of an exemplary system including a handheld controller and an electronic device configured to track a position of the handheld controller, according to an embodiment.
Fig. 10 is a side view of an exemplary system having a computer with a computer housing in which a display or other equipment with a display is installed and having a gaze tracker, according to an embodiment.
Fig. 11 is a top view of an exemplary head-mounted device having a support structure configured to support a display and sensors (such as a gaze tracker and a front camera), according to an embodiment.
FIG. 12 is a side view of an exemplary handheld controller with computer generated display content overlaid thereon, according to an embodiment.
Fig. 13 is a side view of an exemplary hand-held controller having an end portion with a computer-generated paint brush overlaid thereon, according to an embodiment.
FIG. 14 is a side view of an exemplary hand-held controller having an end portion with a computer-generated tool head overlaid thereon, according to an embodiment.
FIG. 15 is a graph illustrating exemplary haptic output signals that may be generated to provide different textures to a user when interacting with a handheld controller, according to an embodiment.
FIG. 16 is a side view of an exemplary haptic output device, such as a linear resonant actuator, according to an embodiment.
Fig. 17 is a graph of an exemplary drive signal for a linear resonant actuator according to an embodiment.
Detailed Description
An electronic device configured to be held in a user's hand may be used to gather user input and provide output to the user. For example, an electronic device configured to control one or more other electronic devices (sometimes referred to as a controller, handheld controller, or handheld input device) may be used to gather user input and to supply output. A handheld controller may, for example, include an inertial measurement unit with an accelerometer for gathering information about controller motion (such as swiping motions, writing motions, drawing motions, shaking motions, rotations, etc.), wireless communication circuitry for communicating with external equipment (such as a head-mounted device), tracking features (such as active or passive visual markers that can be tracked with optical sensors in the external electronic device), input devices (such as touch sensors, force sensors, buttons, knobs, and wheels), and sensors for gathering information about interactions between the handheld controller, the user's hands holding the controller, and the surrounding environment. The handheld controller may include a haptic output device that provides haptic output to the user's hand, and may include other output components such as one or more speakers.
One or more handheld controllers may gather user input from a user. A user may use a handheld controller to control a virtual reality or mixed reality device (e.g., a headset such as glasses, goggles, a helmet, or other device with a display). During operation, the handheld controller may gather user input, such as information about interactions between the handheld controller and the surrounding environment, interactions between the user's fingers or hand and the surrounding environment, and interactions associated with virtual content displayed for the user. The user input may be used to control visual output on a display (e.g., a head-mounted display, a computer display, etc.). Corresponding haptic output may be provided to the user's fingers using the handheld controller. The haptic output may be used, for example, to provide a desired feel (e.g., texture, weight, torque, push, pull, etc.) to the user's fingers when the user interacts with a real or virtual object using the handheld controller. The haptic output may also be used to create detents, to provide local or global haptic feedback in response to user input supplied to the handheld controller, and/or to provide other haptic effects.
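As a concrete illustration of the texture effect mentioned above, the following Python sketch modulates a vibration drive level as the controller tip moves across a virtual ridged surface. The spatial period, speed cap, and function name are illustrative assumptions rather than values taken from this application.

```python
import math

def texture_drive(distance_moved_m, speed_m_s,
                  spatial_period_m=0.002, max_amplitude=1.0):
    """Return a normalized actuator drive level for a ridged virtual texture.

    distance_moved_m: total distance the controller tip has traveled on the surface.
    speed_m_s: current tip speed; faster strokes feel stronger up to a cap.
    spatial_period_m: assumed ridge spacing of the virtual texture.
    """
    phase = 2 * math.pi * distance_moved_m / spatial_period_m
    speed_gain = min(1.0, speed_m_s / 0.1)      # saturate at 0.1 m/s (assumed)
    return max_amplitude * speed_gain * 0.5 * (1 + math.sin(phase))

# Sample the drive level along a short stroke at 50 mm/s.
for step in range(5):
    distance = step * 0.002
    print(round(texture_drive(distance, 0.05), 3))
```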
The handheld controller may be held in one or both of the user's hands. A user may interact with any suitable electronic equipment using the handheld controller. For example, a user may use one or more handheld controllers to interact with a virtual reality or mixed reality system (e.g., a head-mounted device with a display), to supply input to a desktop computer, tablet computer, cellular telephone, watch, ear bud, or other accessory, to control household items such as lighting, televisions, thermostats, and appliances, or to interact with other electronic equipment.
FIG. 1 is a schematic diagram of an exemplary system of the type that may include one or more handheld controllers. As shown in FIG. 1, system 8 may include electronic devices such as handheld controller 10 and other electronic devices 24. Each handheld controller 10 may be held in a user's hand. Additional electronic devices in system 8, such as device 24, may include devices such as a laptop computer, a computer monitor containing an embedded computer, a tablet computer, a desktop computer (e.g., a display on a stand with an integrated computer processor and other computer circuitry), a cellular telephone, a media player, or other handheld or portable electronic device, a smaller device such as a watch device, a pendant device, a headphone or earpiece device, a head-mounted device such as glasses, goggles, a helmet, or other equipment worn on the user's head, or other wearable or miniature device, a television, a computer display that does not contain an embedded computer, a gaming device, a remote control, a navigation device, an embedded system such as a system in which equipment is mounted in a kiosk or in an automobile, airplane, or other vehicle, a removable external housing for electronic equipment, a strap, a wristband or headband, a removable cover for a device, a case or bag with straps or other structures to hold and carry electronic equipment and other items, a necklace or armband into which electronic equipment or other items may be inserted, a purse, sleeve, pocket, or other structure, part of a chair, sofa, or other seat (e.g., a cushion or other seating structure), part of an item of clothing or other wearable item (e.g., a hat, a belt, a wristband, a headband, socks, gloves, a shirt, pants, etc.), or equipment that performs the functions of two or more of these devices.
In one exemplary configuration (which may sometimes be described herein as an example), device 10 is a handheld controller having a housing in the shape of an elongated marker configured to be held within a user's fingers, or having a housing with another shape configured to rest in a user's hand, and device 24 is a head-mounted device, a cellular telephone, a tablet computer, a laptop computer, a watch device, a device with speakers, or other electronic device (e.g., a device with a display, audio components, and/or other output components). A handheld controller having a marker-shaped housing may have an elongated housing that spans the width of the user's hand and may be held like a pen, pencil, marker, grip, or tool.
The devices 10 and 24 may include control circuitry 12 and control circuitry 26. Control circuitry 12 and control circuitry 26 may include storage and processing circuitry for supporting the operation of system 8. The storage and processing circuitry may include memory such as non-volatile memory (e.g., flash memory or other electrically programmable read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random access memory), and so on. Processing circuitry in control circuits 12 and 26 may be used to gather input from sensors and other input devices and may be used to control output devices. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors and other wireless communication circuits, power management units, audio chips, application specific integrated circuits, and the like.
To support communication between device 10 and device 24 and/or to support communication between equipment in system 8 and external electronic equipment, control circuit 12 may communicate using communication circuit 14 and/or control circuit 26 may communicate using communication circuit 28. Circuitry 14 and/or circuitry 28 may include an antenna, radio frequency transceiver circuitry, and other wireless and/or wired communication circuitry. For example, circuitry 14 and/or circuitry 28 (which may sometimes be referred to as control circuitry and/or control and communication circuitry) may support two-way wireless communication between device 10 and device 24 via wireless link 38 (e.g., a wireless local area network link, a near field communication link, or other suitable wired or wireless communication link such as a Bluetooth® link, a Wi-Fi® link, a 60 GHz link or other millimeter wave link, etc.). The devices 10 and 24 may also include power circuitry for transmitting and/or receiving wired and/or wireless power, and may include a battery. In configurations supporting wireless power transfer between device 10 and device 24, inductive power transfer coils (as an example) may be used to support in-band wireless communications.
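For concreteness, sensor readings sent over wireless link 38 could be framed into small binary packets such as the hypothetical layout sketched below (a type byte, a sequence number, and three floating-point values). This format is invented for illustration and is not described in the application.

```python
import struct

PACKET_FORMAT = "<BHfff"   # little-endian: type, sequence number, x, y, z

PACKET_TYPE_MOTION = 0x01  # hypothetical type codes
PACKET_TYPE_TOUCH = 0x02

def encode_packet(packet_type, sequence, x, y, z):
    """Pack one sensor sample into a fixed-size binary frame."""
    return struct.pack(PACKET_FORMAT, packet_type, sequence, x, y, z)

def decode_packet(frame):
    """Unpack a frame back into its fields."""
    packet_type, sequence, x, y, z = struct.unpack(PACKET_FORMAT, frame)
    return {"type": packet_type, "seq": sequence, "value": (x, y, z)}

frame = encode_packet(PACKET_TYPE_MOTION, 42, 0.01, -0.02, 9.81)
print(len(frame), decode_packet(frame))
```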
Devices 10 and 24 may include input-output devices such as devices 16 and 30. Input-output devices 16 and/or 30 may be used to gather user input, to gather information about the user's surroundings, and/or to provide output to the user. Device 16 may include sensors 18 and device 30 may include sensors 32. Sensors 18 and/or 32 may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors (such as microphones), touch and/or proximity sensors (such as capacitive sensors), optical sensors (such as optical sensors that emit and detect light), ultrasonic sensors (e.g., ultrasonic sensors for tracking device orientation and position and/or for detecting user input such as finger input), and/or other touch and/or proximity sensors, monochrome and color ambient light sensors, image sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), muscle activity sensors (EMG) for detecting finger motion, radio frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereoscopic imaging devices), optical sensors such as self-mixing interferometric sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, optical sensors such as visual odometry sensors that determine position and/or orientation information using images gathered with digital image sensors in cameras, gaze tracking sensors, visible and/or infrared cameras with digital image sensors, humidity sensors, moisture sensors, and/or other sensors. In some arrangements, device 10 and/or device 24 may use sensors 18 and/or 32 and/or other input-output devices 16 and/or 30 to gather user input (e.g., buttons may be used to gather button press input, touch sensors overlapping a display may be used to gather touch screen input, a touch pad may be used to gather touch input, a microphone may be used to gather audio input, an accelerometer may be used to monitor when a finger contacts an input surface and may therefore be used to gather finger press input, etc.). If desired, device 10 and/or device 24 may include a rotating button (e.g., a crown mechanism on a watch or other suitable rotating button that rotates and that optionally can be pressed to select an item of interest). Alphanumeric keys and/or other buttons may be included in devices 16 and/or 30. In some configurations, sensors 18 may include a joystick, a trackball, an optical sensor (e.g., a laser that emits light and an image sensor that tracks motion by monitoring and analyzing changes in the speckle patterns and other information associated with surfaces illuminated with the emitted light as device 10 moves relative to those surfaces), a fingerprint sensor, and/or other sensing circuitry. A radio frequency tracking device may be included in sensors 18 to detect position, orientation, and/or range. Beacons (e.g., radio frequency beacons) may be used to transmit radio frequency signals at different locations in a user's environment (e.g., one or more registered locations in the user's home or office). Devices 10 and/or 24 may analyze the radio frequency beacon signals to help determine the location and position of devices 10 and/or 24 relative to the beacons.
Devices 10 and/or 24 may include beacons, if desired. Signal strength (received signal strength information), beacon orientation, time-of-flight information, and/or other radio frequency information may be used to determine orientation and positioning information. At some frequencies (e.g., lower frequencies, such as frequencies below 10 GHz), signal strength information may be used, while at other frequencies (e.g., higher frequencies, such as frequencies above 10 GHz), an indoor radar scheme may be used. If desired, light-based beacons, ultrasound beacons, and/or other beacon devices may be used in system 8 in addition to or in lieu of radio frequency beacons and/or radio frequency radar technology.
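As an example of how received signal strength from a radio frequency beacon might be converted into a rough range estimate at lower frequencies, the sketch below applies the common log-distance path-loss model. The reference power and path-loss exponent are assumed values that would require calibration in practice.

```python
def estimate_distance_m(rssi_dbm, rssi_at_1m_dbm=-45.0, path_loss_exponent=2.0):
    """Log-distance path-loss model: rssi = rssi_at_1m - 10 * n * log10(d)."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))

# With these assumed constants, a beacon heard at -65 dBm is roughly 10 m away.
print(round(estimate_distance_m(-65.0), 1))
```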
Device 16 and/or device 30 may include a haptic output device 20 and a haptic output device 34. Haptic output device 20 and/or haptic output device 34 may generate motion sensed by a user (e.g., motion sensed by a user's fingertip). Haptic output device 20 and/or haptic output device 34 may include actuators such as electromagnetic actuators, motors, piezoelectric actuators, electroactive polymer actuators, vibrators, linear actuators (e.g., linear resonant actuators), rotary actuators, actuators that bend members in a flexible manner, actuator devices that generate and/or control repulsive and/or attractive forces between devices 10 and/or 24 (e.g., components for generating electrostatic repulsive and/or attractive forces such as electrodes, components for generating ultrasonic output such as ultrasonic transducers, components for generating magnetic interactions such as electromagnets, permanent magnets, magnetic materials such as iron or ferrite for generating direct and/or alternating magnetic fields, and/or other circuits for generating repulsive and/or attractive forces between devices 10 and/or 24). In some cases, actuators for generating forces in the device 10 may be used to exert a sensation (e.g., a sensation of weight, texture, pulling, pushing, torque, etc.) on and/or otherwise interact directly with a user's finger. In other cases, these components may be used to interact with each other (e.g., by using electromagnets to create dynamically adjustable electromagnetic repulsive and/or attractive forces between a pair of devices 10 and/or devices 10 and 24).
If desired, the input-output devices 16 and/or the input-output devices 30 may include other devices 22 and/or devices 36, such as a display (e.g., displaying user images in the device 24), status indicators (e.g., light emitting diodes in the device 10 and/or the device 24 that serve as power indicators, and other light-based output devices), speakers and other audio output devices, electromagnets, permanent magnets, structures formed of magnetic material (e.g., iron bars or other ferromagnetic members attracted by magnets, such as electromagnets and/or permanent magnets), batteries, and the like. Device 10 and/or device 24 may also include power transmission and/or reception circuitry configured to transmit and/or receive wired and/or wireless power signals.
Fig. 2 is a perspective view of a user's hand (hand 40) and an exemplary hand-held controller 10. As shown in fig. 2, the controller 10 may be an electronic device in the shape of an elongated marker that fits within a user's hand 40. The elongated shape of the controller 10 allows the hand 40 to hold the controller as if the controller 10 were a pen, pencil, marker or other writing instrument. In other configurations, the controller 10 may be held in the hand 40 as a grip or stick. Generally, the controller 10 may be held in the hand 40 in any suitable manner (e.g., at the end, in the middle, between two, three, four, or all five fingers, with both hands, etc.).
A user may hold one or more devices 10 simultaneously. For example, a user may hold a single device 10 in the user's left or right hand. As another example, a user may hold the first device 10 in the left hand of the user and hold the second device 10 in the right hand of the user. An arrangement in which multiple devices 10 are held in one hand may also be used. The configuration in which the device 10 has a body held within a user's hand is sometimes described herein as an example.
The control circuitry 12 (and, if desired, the communication circuitry 14 and/or the input-output device 16) may be entirely contained within the device 10 (e.g., in the housing 54) and/or may include circuitry located in external structures (e.g., in external electronic devices such as the device 24, a console, a storage box, etc.).
In general, electronic components such as the control circuitry 12, the communication circuitry 14, and/or the input-output device 16 (e.g., the sensor 18, the haptic output device 20, and/or the other device 22) may be mounted in any suitable location within the controller housing 54 and/or on a surface of the controller housing.
As shown in FIG. 2, the housing 54 may have an elongated marker shape, an elongated tube shape, an elongated cylindrical shape, and/or any other elongated shape. The outer shell 54, which may sometimes be referred to as a housing, body, or case, may be formed of plastic, glass, ceramic, fiber composite, metal (e.g., stainless steel, aluminum, etc.), fabric, other suitable material, or a combination of any two or more of these materials. The housing 54 may be formed using a unitary configuration in which a portion or all of the housing 54 is machined or molded into a single structure, or may be formed using multiple structures (e.g., an internal frame structure, one or more structures forming an external housing surface, etc.). The housing 54 may form a housing wall, an end portion, and/or an internal support structure for the device 10. The housing 54 may have a length L of between 140mm and 150mm, between 130mm and 160mm, between 100mm and 200mm, between 120mm and 160mm, greater than 180mm, less than 180mm, or any other suitable length. The diameter D of the housing 54 may be between 12mm and 14mm, between 10mm and 15mm, between 11mm and 16mm, between 15mm and 20mm, between 18mm and 25mm, greater than 25mm, less than 25mm, or any other suitable diameter.
The housing 54 may have one or more curved surfaces and one or more planar surfaces. In the illustrative example of fig. 2, the device 10 has a curved surface C wrapped around a first portion of the device 10 and a flat surface F extending along a second portion of the device 10. If desired, the flat surface F may be located on a first side of the device 10 and the curved surface C may be located on an opposite second side of the device 10. The curved surface C and the flat surface F wrap around the device 10 to form an elongated tube shape that surrounds an elongated interior space for housing interior components such as the control circuit 12, the communication circuit 14, and the input-output device 16. The housing 54 may have an elongated shaft portion, such as shaft B, extending between a first end portion, such as end portion T1 at a first end of the device 10, and a second end portion T2 at an opposite second end of the device 10. One or both of the housing end portions T1 and T2 are removable from the main elongate shaft B between the end portions T1 and T2.
Ultrasonic sensors, optical sensors, inertial measurement units, touch sensors such as capacitive touch sensor electrodes, strain gauges and other force sensors, radio frequency sensors and/or other sensors may be used to acquire sensor measurements indicative of the activity of the device 10 and/or the hand 40 holding the device 10.
In some configurations, controller position, movement, and orientation may be monitored using sensors mounted in external electronic equipment (e.g., in a computer or other desktop device, in a head-mounted device or other wearable device, and/or in another electronic device 24 separate from device 10). For example, optical sensors (such as image sensors separate from device 10) may be used to monitor device 10 to determine its position, movement, and/or orientation. If desired, device 10 may include passive optical alignment features and/or active optical alignment features to aid the image sensors in device 24 in tracking the position, orientation, and/or movement of device 10. For example, device 10 may include light emitting devices. The light emitting devices may include light emitting diodes, lasers (e.g., laser diodes, vertical cavity surface emitting lasers, etc.), or other light sources, and may operate at visible, ultraviolet, and/or infrared wavelengths. The light emitting devices may be arranged in an asymmetric pattern on housing 54 and may emit light that is detected by an image sensor, a depth sensor, and/or other light-based tracking sensor circuitry in device 24 (e.g., a head-mounted device, a desktop computer, a stand-alone camera-based monitoring system, and/or other electronic equipment with an image sensor or other tracking sensor circuitry). By processing the received patterned emitted light, device 24 can determine the position, orientation, and/or movement of device 10. The light emitting devices may be removable and/or customizable (e.g., the user may customize the location and type of the light emitting devices), if desired.
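If the light emitting devices have a known three-dimensional layout on the housing, device 24 could in principle recover the controller's pose from their detected image positions with a standard perspective-n-point solver. The sketch below assumes OpenCV's solvePnP together with invented marker coordinates and camera intrinsics; none of these specifics come from the application.

```python
import numpy as np
import cv2

# Assumed 3D positions of four LEDs on the controller housing, in meters,
# expressed in the controller's own coordinate frame (asymmetric on purpose).
marker_points_3d = np.array([
    [0.000, 0.000, 0.000],
    [0.060, 0.000, 0.000],
    [0.000, 0.012, 0.000],
    [0.045, 0.010, 0.000],
], dtype=np.float64)

# Pixel coordinates where those LEDs were detected in the headset camera image
# (made-up values for illustration).
marker_points_2d = np.array([
    [320.0, 240.0],
    [380.0, 242.0],
    [321.0, 228.0],
    [366.0, 231.0],
], dtype=np.float64)

# Assumed pinhole camera intrinsics (600 px focal length, 320x240 principal point).
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(marker_points_3d, marker_points_2d,
                              camera_matrix, dist_coeffs)
if ok:
    print("controller position in camera frame (m):", tvec.ravel())
```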
Tracking may also involve extrapolating from a known body part orientation (e.g., finger orientation) to produce orientation information for other body parts (e.g., wrist and/or arm orientations estimated using inverse kinematics). Visual odometry sensors may be included in device 10 if desired. These sensors may include image sensors that capture frames of image data of the environment of device 10 and may be used to measure position, orientation, and/or motion from the frames of image data. If desired, lidar, ultrasonic sensors oriented in multiple directions, radio frequency tracking sensors, and/or other controller tracking arrangements may be used. In some arrangements, the user input for controlling system 8 may include both user input to controller 10 and other user input (e.g., user eye gaze input, user voice input, etc.). For example, gaze tracking information (such as the user's gaze point measured with a gaze tracker) may be fused with controller input from controller 10 when controlling device 10 and/or device 24 in system 8. The user may, for example, gaze at an object of interest while device 10 uses one or more of sensors 18 (e.g., an accelerometer, force sensor, touch sensor, etc.) to gather information such as tap input (tap input in which the user taps on device 10 with one or more fingers, tap input in which device 10 taps a table top or other external surface or object, and/or any other tap input that results in measurable force and/or accelerometer output from device 10), double-tap input, force input, and controller gestures (tapping, swiping, twirling, shaking, writing, drawing, painting, sculpting, gaming, and/or other gestures performed using device 10, gestures performed on an external surface with device 10, gestures performed on an external object with device 10, air gestures interacting with a virtual object, drag-and-drop operations associated with an object selected using a lingering gaze or other gaze point input, etc.). The controller input from controller 10 to system 8 may include information about the orientation, position, and/or motion of a finger relative to controller 10, may include information about how forcefully the finger is pressing against a surface of controller 10 (e.g., force information), may include information about how forcefully controller 10 is pressed against an object or external surface (e.g., how forcefully an end portion such as end portion T1 presses against an external surface), may include controller pointing input (e.g., the direction in which controller 10 is pointing, which may be gathered using a radio frequency sensor among sensors 18 and/or other sensors in device 10), and/or may include other controller input.
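To make the gaze-plus-tap interaction concrete, the sketch below selects whichever virtual object lies closest to the user's gaze ray at the moment a tap is reported. The function name, scene contents, and angular threshold are illustrative assumptions.

```python
import numpy as np

def select_object_on_tap(gaze_origin, gaze_direction, objects, max_angle_deg=5.0):
    """Return the name of the object closest to the gaze ray, or None.

    objects: dict mapping object name -> 3D center position.
    """
    gaze_direction = np.asarray(gaze_direction, dtype=float)
    gaze_direction /= np.linalg.norm(gaze_direction)
    best_name, best_angle = None, np.radians(max_angle_deg)
    for name, center in objects.items():
        to_object = np.asarray(center, dtype=float) - np.asarray(gaze_origin, dtype=float)
        to_object /= np.linalg.norm(to_object)
        angle = np.arccos(np.clip(np.dot(gaze_direction, to_object), -1.0, 1.0))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name

# A tap arrives while the user is looking roughly toward the "cube".
scene = {"cube": (0.05, 0.0, 2.0), "sphere": (1.0, 0.3, 2.0)}
print(select_object_on_tap((0, 0, 0), (0, 0, 1), scene))
```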
By correlating user input from a first one of the devices 10 with user input from a second one of the devices 10 and/or by otherwise analyzing controller sensor input, multi-device input may be detected and used to manipulate virtual objects or take other actions in the system 8. For example, consider using a flick gesture with device 10 to select a virtual object associated with a user's current gaze point. Once the virtual object has been selected based on the direction of the user's gaze point (or the controller pointing at the directional input) and based on the tap gesture input or other user input, the virtual object may be rotated and/or otherwise manipulated using additional user input collected with one or more devices 10. For example, information about controller movement (e.g., rotational movement) may be acquired using an internal measurement unit or other sensor 18 in the device 10, and the rotational input may be used to rotate the selected object. In some cases, an object may be selected based on a gaze point (e.g., when a gaze point of a user is detected to be pointing at the object), and after selection, object properties (e.g., virtual object properties such as virtual object appearance and/or real world object properties such as operational settings of a real world device) may be adjusted (e.g., rotating the virtual object) using strain gauge input, touch sensor input, controller orientation input, etc.
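Once an object has been selected, rotational motion reported by the controller's inertial measurement unit could be applied to it directly, as in the minimal quaternion sketch below. The helper functions are self-contained illustrations; a real system would typically rely on an existing math library.

```python
import math

def quat_multiply(q1, q2):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_from_gyro(angular_rate_rad_s, dt):
    """Rotation quaternion corresponding to one gyroscope sample over dt seconds."""
    wx, wy, wz = angular_rate_rad_s
    angle = math.sqrt(wx*wx + wy*wy + wz*wz) * dt
    if angle == 0.0:
        return (1.0, 0.0, 0.0, 0.0)
    ux, uy, uz = (wx * dt / angle, wy * dt / angle, wz * dt / angle)  # unit axis
    s = math.sin(angle / 2)
    return (math.cos(angle / 2), ux * s, uy * s, uz * s)

# Apply 90 degrees/second about the z axis for 0.1 s to an unrotated object.
orientation = (1.0, 0.0, 0.0, 0.0)
delta = quat_from_gyro((0.0, 0.0, math.radians(90)), 0.1)
orientation = quat_multiply(delta, orientation)
print(tuple(round(c, 3) for c in orientation))
```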
If desired, gestures with the device 10, such as a pop gesture (three-dimensional gesture), may involve additional input. For example, a user may control system 8 using hybrid gestures that involve movement of device 10 through air (e.g., a flying gesture component), and also involve contact between device 10 and one or more fingers of hand 40. For example, an inertial measurement unit in device 10 and/or a camera in device 24 may detect user movement of device 10 through air (e.g., to trace a path), while a sensor 18 in device 10 (such as a two-dimensional touch sensor, force sensor, or other sensor 18) detects force input, touch input, or other input associated with contact with device 10.
The sensors in the device 10 may, for example, measure the force of the user relative to the surface moving the device 10 (e.g., in a direction perpendicular to the surface) and/or the force of the user moving the device 10 along the surface (e.g., shear force in a direction parallel to the surface). The direction of movement of the device 10 may also be measured by force sensors and/or other sensors 18 in the device 10.
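The distinction between pressing into a surface and dragging along it can be expressed by splitting a measured force vector into components normal and tangential to the surface, as sketched below with assumed force values and surface normal.

```python
import numpy as np

def split_force(force_vector, surface_normal):
    """Split a force into its normal (press) and shear (drag) components."""
    n = np.asarray(surface_normal, dtype=float)
    n /= np.linalg.norm(n)
    f = np.asarray(force_vector, dtype=float)
    normal_component = np.dot(f, n) * n
    shear_component = f - normal_component
    return normal_component, shear_component

# 2 N pressing down onto a horizontal surface while dragging sideways with 0.5 N.
press, drag = split_force([0.5, 0.0, -2.0], [0.0, 0.0, 1.0])
print(press, drag)   # [0, 0, -2] and [0.5, 0, 0]
```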
Information collected using the sensor 18, such as force sensor input collected with a force sensor, motion data collected with a motion sensor (e.g., pointing input, rotation, etc.), position information indicating the position of the controller 10, touch input collected with a touch sensor, and other user inputs may be used to control external equipment such as the device 24. For example, control circuitry 12 may send control signals to device 24 including instructions to select user interface elements, scroll display content, select a different input function for controller 10 (e.g., switch from using controller 10 as a drawing or writing tool to using controller 10 as a pointing device or game piece), draw lines or type words on a display in device 24, adjust operational settings of device 24, manipulate display content on device 24, and/or take any other suitable action with device 24. These control signals may be sent in addition to or instead of providing feedback to sensor inputs from the device 10 (e.g., tactile output, audio output, adjusting operational settings of the device 10, etc.).
In the exemplary configuration of fig. 2, the device 10 includes a touch sensor 42. Touch sensor 42 may be formed from an array of capacitive touch sensor electrodes, such as electrodes 46, that overlap one or more surfaces of housing 54, such as curved surface C, flat surface F, and/or surfaces on end portions T1 and T2. Touch sensor 42 may be configured to detect swipes, taps, multi-touch inputs, squeeze inputs, and/or other touch inputs. In some arrangements, the touch sensor 42 is formed from a one-or two-dimensional array of capacitive electrodes 46. In some arrangements, the touch sensor 42 may be a strain gauge that detects a squeeze input to the housing 54 (e.g., when a user squeezes or pinches the device 10 between the user's fingers). The touch sensor 42 may be used to gather touch input, such as input from direct contact and/or close proximity with a different finger of the user or other external object. In the example of fig. 2, the touch sensor 42 overlaps a touch input area 44 on the curved surface C of the device 10. Additional touch inputs may be acquired in adjacent areas (such as the flat surface F of the housing 54) if desired. The touch sensor 42 may include other types of touch sensing technologies, such as optical touch sensors, acoustic-based touch sensors, and the like, if desired. Touch sensor 42 may span length L of device 10, may span only partially along length L of device 10, may cover some or all of curved surface C, may cover some or all of planar surface F, and/or may cover some or all of end portions T1 and T2. If desired, the touch sensor 42 may be illuminated, may overlap the display (e.g., to form a touch sensitive display area on the device 10), may overlap the pointer or textured surface, and/or may otherwise be visually or tactilely distinct from the surrounding non-touch sensitive portions of the housing 54 (if desired).
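A strip of capacitive electrodes of the kind described above could distinguish taps from swipes by tracking the centroid of the touch across successive capacitance frames, roughly as in the sketch below. The electrode counts, frame values, and threshold are invented for illustration.

```python
def touch_centroid(electrode_values):
    """Weighted centroid (in electrode units) of one capacitance frame."""
    total = sum(electrode_values)
    if total == 0:
        return None
    return sum(i * v for i, v in enumerate(electrode_values)) / total

def classify_touch(frames, swipe_threshold=1.5):
    """Classify a short touch as 'tap', 'swipe_forward', or 'swipe_backward'.

    frames: list of per-frame electrode readings, first to last.
    swipe_threshold: minimum centroid travel (in electrode pitches) for a swipe.
    """
    centroids = [c for c in (touch_centroid(f) for f in frames) if c is not None]
    if len(centroids) < 2:
        return "tap"
    travel = centroids[-1] - centroids[0]
    if travel > swipe_threshold:
        return "swipe_forward"
    if travel < -swipe_threshold:
        return "swipe_backward"
    return "tap"

# A finger sliding from electrode 1 toward electrode 4 reads as a forward swipe.
frames = [[0, 5, 1, 0, 0], [0, 2, 5, 1, 0], [0, 0, 2, 5, 1]]
print(classify_touch(frames))
```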
In addition to or in lieu of touch sensor 42, device 10 may include one or more other user input devices, such as user input device 48. The user input device 48 may be a mechanical input device such as a depressible button, a rotary knob, a rotary wheel, a rocker switch, a slider, or other mechanical input device, a force sensor such as a strain gauge or other force sensor, an optical sensor such as a proximity sensor, a touch sensor such as a capacitive, acoustic, or optical touch sensor, and/or any other suitable input device for receiving input from the user's hand 40. One of the haptic output devices 20, such as an actuator, may be used to provide haptic feedback in response to user input to device 48, if desired. For example, the input device 48 may be a touch-sensitive button that does not physically move relative to the housing 54, but the user may feel a localized button click sensation from the tactile output provided by the actuator 20 that overlaps the device 48.
The haptic output device 20 may be located in any suitable location within the housing 54. In one exemplary arrangement, one or more haptic output devices 20 may be located at each end of the shaft B, such as at one or both of the end portions T1 and T2. Haptic output device 20 may be configured to provide local haptic feedback and/or global haptic feedback. The localized haptic feedback may be more prominent in certain locations of the housing 54 relative to other portions of the housing 54 (e.g., the localized haptic feedback may be more prominent at one or both of the end portions T1 and T2). The localized haptic effect may be achieved by arranging the axis of the linear resonant actuator within the housing 54 perpendicular to the longitudinal axis of the device 10 (e.g., perpendicular to the length L). Haptic output device 20 may additionally or alternatively provide global haptic feedback that is more prominent across most or all of the length of device 10. To achieve a global haptic effect, the axis of the linear resonant actuator may be arranged in the housing 54 to be parallel to the longitudinal axis of the device 10 (e.g., parallel to the length L).
Device 10 may include one or more sensors at end portions T1 and T2 in addition to or in lieu of touch sensor 42 and input device 48. For example, the end portion T1 and/or the end portion T2 may be force sensitive. As shown in fig. 2, the device 10 may include a sensor 52. The sensor 52 may be located at one or both of the end portions T1 and T2 and/or may be located elsewhere in the device 10, such as at a location along the axis B of the device 10. The shaft B, which may sometimes be referred to as a cylindrical housing, may form an elongated body portion of the housing 54 of the device 10 that extends between the ends T1 and T2. One or both of the end portions T1 and T2 may be removable and may sometimes be referred to as a cap, writing end, or the like. The sensors located at the end portions T1 and T2, such as sensor 52, may include device positioning sensors (e.g., optical flow sensors having a light source illuminating a portion of the surface contacted by the device 10 and having an image sensor configured to determine the position of the device 10 on the surface and/or measure movement of the electronic device relative to the surface based on captured images of the illuminated portion, mechanical positioning sensors such as encoder wheels tracking movement of the device 10 on the surface, or other device positioning sensors), force sensors (e.g., one or more strain gauges, piezoelectric sensors, capacitive sensors, and/or any other suitable force sensors), optical proximity sensors (such as light emitting diodes and photodetectors), cameras (e.g., single pixel cameras or image sensors having a two-dimensional array of pixels), and/or other sensors.
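An optical-flow style positioning sensor of the kind described above essentially estimates how far the tip has moved between successive images of the illuminated surface. The minimal phase-correlation sketch below illustrates the idea with synthetic data; a production sensor would use dedicated optical-flow hardware.

```python
import numpy as np

def estimate_shift(image_a, image_b):
    """Estimate how far image_a is translated relative to image_b (rows, cols)
    using phase correlation on two same-sized grayscale frames."""
    fa = np.fft.fft2(image_a)
    fb = np.fft.fft2(image_b)
    cross_power = fa * np.conj(fb)
    cross_power /= np.abs(cross_power) + 1e-12
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, correlation.shape))

# A synthetic speckle image shifted by (3, -2) pixels is recovered correctly.
rng = np.random.default_rng(0)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, shift=(3, -2), axis=(0, 1))
print(estimate_shift(frame2, frame1))
```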
If desired, power may be transferred wirelessly between device 10 and an external electronic device such as device 24 (e.g., a headset, a wireless charging pad, a storage case, a battery pack, a wireless charging puck, or other electronic device). For example, contacts (e.g., metal pads) may be capacitively coupled (without forming ohmic contacts) to allow power to be transferred, and/or a wireless power transmitter having a coil in device 24 may be used to transfer power to a wireless power receiver having a coil in device 10. Inductive power transfer techniques may be used (e.g., one or more wireless power transfer coils in device 24 may be used to transfer wireless power, and a power receiving coil such as coil 50 may be used to receive the transferred wireless power signal in a power receiving circuit in device 10). The rectifier in device 10 may be used to convert the received ac wireless power signal from device 24 to dc power for charging a battery in device 10 and/or powering circuitry in device 10. In configurations where the power receiving circuitry of the device 10 receives power via a wired connection (e.g., using terminals), the power receiving circuitry in the device 10 may provide the received power to the battery and/or other circuitry in the device 10.
To help align the wireless charging coil 50 in the device 10 with the wireless charging coil in the device 24 and/or otherwise retain the device 10 to a power source or other device (e.g., the device 24 of fig. 1), the device 10 and the device 24 may be provided with mating alignment features (e.g., mating protrusions and recesses and/or other interlocking alignment structures (e.g., key and keyhole structures that allow the device 10 and/or the device 24 to interlock when engaged by a twisting or other locking motion), magnets (or ferromagnetic elements such as iron bars), and/or other alignment structures).
In configurations where the device 10 includes magnetic attachment structures (e.g., magnets, magnetic material attracted to magnets, or other magnetic attachment structures), the magnetic attachment structures may be used to hold the device 10 against the interior and/or exterior of the device 24. For example, the device 24 may be a battery compartment having a recess or other depression that receives the device 10. Magnetic attachment structures in the device 24 (e.g., proximate the recess) and in the device 10 may cooperate (magnetically attach) to help secure the device 10 within the interior of the case (e.g., not allow the device 10 to excessively rattle within the case). As another example, the device 24 may be a head-mounted device (e.g., goggles and/or glasses) or a strap or other wearable device. In this type of arrangement, the magnetic attachment structure may hold the device 10 against an outer surface of the device 24 (e.g., against a portion of the housing of a pair of goggles or glasses, such as along the frame of a pair of glasses, to the front, top, or side surfaces of a pair of goggles, etc.) or within a recess in the housing of the device 24. The magnets and other alignment features may be located near the coil 50 or may be located in other portions of the housing 54.
Fig. 3 is a cross-sectional side view of an exemplary portion of device 10 showing how one or more end portions of device 10 may be removed. As shown in fig. 3, the device 10 may include a housing 54 having a shaft B that extends parallel to a longitudinal axis 62 of the device 10. The end portion T1 may be removably attached to the body portion B using an attachment structure, such as attachment structure 58. Attachment structure 58 may be one or more screws, a friction fit, magnetic attachment structures (e.g., magnets, magnetic material attracted to magnets, or other magnetic attachment structures), clasps, clips, interlocking engagement features, or any other suitable attachment structure. In a configuration in which the attachment structure 58 is a screw, the screw may be mounted to an internal support structure, such as support structure 56 in body portion B, and the screw 58 may extend from support structure 56 into end portion T1. The end portion T1 may have threads that mate with threads on the attachment structure 58. The user can remove the end portion T1 by twisting the end portion T1 relative to the body portion B. The end portion T1 may be mounted back onto the body portion B by twisting the end portion T1 relative to the body portion B in direction 60. In configurations where the tip portion T1 includes a sensor 52 (such as a force sensor forming a touch-sensitive tip on the device 10), the user may be able to replace the force-sensitive tip in the event of a failure by removing the tip portion T1 and replacing it with a new tip portion.
Fig. 4,5 and 6 show exemplary shapes for the device 10 and different forms of input devices included in the device 10. In the example of fig. 4, the housing 54 has a pill-shaped cross-section with opposed first and second planar surfaces F joined by upper and lower curved surfaces. If desired, one or more openings such as opening 70 may be formed on one or both ends of housing 54 to form a connector port (for transmitting and/or receiving data signals, power signals, ground signals, etc.), an audio port (e.g., one or more openings through which sound passes to a microphone in device 10, one or more openings through which sound emitted from a speaker in device 10 passes, and/or an opening forming an audio jack for receiving an audio cable), a window or other opening for a sensor in device 10, such as an optical sensor.
In the example of fig. 4, the device 10 includes an input device, such as a rotating scroll wheel 64. The wheel 64 may be mounted in an opening (such as opening 72) of the housing 54. Wheel 64 may be configured to rotate in direction 66 about an axis 68 (e.g., an axis parallel to the longitudinal axis of device 10). The control circuit 12 may be configured to detect rotation of the wheel 64 and may take appropriate action based on the rotation of the wheel 64. This may include, for example, adjusting one or more settings or modes of operation of device 10, selecting a virtual tip for device 10 (e.g., scrolling different types of tools that device 10 is using in a virtual setting, such as paint brushes, pencils, pens or other writing tools, styling tools, etc.), sending control signals to external electronic devices such as device 24 (e.g., to scroll content on a display of device 24, select options being displayed on a display of device 24, etc.), and/or other actions.
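As a simple model of how rotation of wheel 64 might be interpreted, the sketch below accumulates encoder counts into detents and either cycles through virtual tool tips or emits scroll commands depending on the current mode. The counts-per-detent value, mode names, and tool list are assumptions for illustration.

```python
class ScrollWheelHandler:
    """Turn raw encoder counts from a scroll wheel into discrete actions."""

    TOOLS = ["paint brush", "pencil", "pen", "sculpting tool"]  # example virtual tips

    def __init__(self, counts_per_detent=4, mode="select_tool"):
        self.counts_per_detent = counts_per_detent
        self.mode = mode          # "select_tool" or "scroll"
        self.accumulated = 0
        self.tool_index = 0

    def on_rotation(self, delta_counts):
        """Process a batch of encoder counts; return the resulting actions."""
        self.accumulated += delta_counts
        actions = []
        while abs(self.accumulated) >= self.counts_per_detent:
            step = 1 if self.accumulated > 0 else -1
            self.accumulated -= step * self.counts_per_detent
            if self.mode == "select_tool":
                self.tool_index = (self.tool_index + step) % len(self.TOOLS)
                actions.append(("set_virtual_tip", self.TOOLS[self.tool_index]))
            else:
                actions.append(("scroll", step))
        return actions

wheel = ScrollWheelHandler()
print(wheel.on_rotation(9))   # two detents forward -> advances the virtual tool twice
```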
In the example of fig. 5, the device 10 has a shape similar to that of fig. 4. The housing 54 may have a pill-shaped cross-section with opposing first and second planar surfaces F joined by upper and lower curved surfaces. The device 10 includes an input device such as a rotating scroll wheel 74. The wheel 74 is accessible through the first opening and the second opening. For example, the housing 54 may have a first opening on a first side of the device 10 (e.g., in one of the curved surfaces of the housing 54) and a second opening on a second side of the device 10 (e.g., in the opposite curved surface of the housing 54). The wheel 74 may be configured to rotate in a direction 76 about an axis 78 (e.g., an axis perpendicular to the longitudinal axis of the apparatus 10). The control circuit 12 may be configured to detect rotation of the wheel 74 and may take appropriate action based on the rotation of the wheel 74 (e.g., by adjusting one or more settings or modes of operation of the device 10, by selecting a virtual end for the device 10, by sending control signals to an external electronic device such as the device 24 to scroll through content on the display of the device 24, selecting an option being displayed on the display of the device 24, and/or other actions).
In the example of fig. 6, device 10 includes an input device such as a rotating scroll wheel 80. Wheel 80 is accessible through an opening in housing 54 and may be configured to rotate in direction 82 about an axis 84 (e.g., an axis perpendicular to the longitudinal axis of device 10). Control circuitry 12 may be configured to detect rotation of wheel 80 and may take appropriate action based on rotation of wheel 80 (e.g., adjusting one or more settings or modes of operation of device 10, selecting a virtual tip for device 10, sending control signals to an external electronic device such as device 24 to scroll content on a display of device 24, selecting options being displayed on a display of device 24, and/or other actions).
Figs. 7 and 8 show an exemplary arrangement for device 10 in which device 10 includes first and second separable portions that form first and second controllers. As shown in fig. 7, device 10 may include a first controller 10A and a second controller 10B. In the configuration of fig. 7, controllers 10A and 10B are connected and mated to each other. In the configuration of fig. 8, controllers 10A and 10B are separated from each other and may be used independently to control external equipment such as device 24.
The controllers 10A and 10B may be coupled together using interlocking engagement features, screws, press fit connections, magnets, magnetic materials attracted to magnets, other magnetic attachment structures, and/or other attachment structures. As shown in fig. 8, for example, the controller 10A and the controller 10B have mating engagement features such as protrusions 88 and openings 90. The protrusions 88 on the controller 10A may be configured to mate with corresponding openings 90 on the controller 10B, and the protrusions 88 on the controller 10B may be configured to mate with corresponding openings 90 on the controller 10A.
If desired, each separable portion of device 10, such as controller 10A and controller 10B, may have its own input device. For example, controller 10A may have an input device 86A for receiving user input, and controller 10B may have an input device 86B for receiving user input. Input devices 86A and 86B may include one or more buttons, scroll wheels, touch sensors, switches, and the like. This may allow more than one user to use device 10 (e.g., a first user may use controller 10A to control external equipment while a second user uses controller 10B to control external equipment). This may also allow a single user to use a controller in each hand (e.g., controller 10A may be held in the user's left hand while controller 10B is held in the user's right hand).
If desired, each separable portion of device 10, such as controllers 10A and 10B, may have separate circuitry, such as separate control circuitry 12, communications circuitry 14, and input-output devices 16. In other arrangements, some of the circuitry of device 10 may be located in one controller, such as controller 10A, and not in controller 10B (or vice versa). The input-output capabilities of one controller may be shared with the other controller, if desired. For example, wireless control circuitry in one of the controllers, such as controller 10A, may gather user input from the other controller, such as controller 10B, and may send corresponding control signals to external equipment. Button press input, touch input, force input, or pointing input received by controller 10A may be used to control the operational settings of controller 10B (as examples).
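As a rough illustration of this kind of input sharing, the minimal sketch below shows one controller relaying both its own events and a peer controller's events over a single wireless link; the message fields and method names are assumptions for illustration, not details taken from the disclosure.

```python
class ControllerLink:
    """Illustrative relay: one controller forwards a peer controller's input.

    `radio_send` stands in for whatever wireless link carries control signals
    to external equipment; its message format here is an assumption.
    """

    def __init__(self, radio_send):
        self.radio_send = radio_send

    def on_local_input(self, event):
        # Input gathered by this controller's own sensors.
        self.radio_send({"source": "controller_A", **event})

    def on_peer_input(self, event):
        # Input gathered by the peer controller but transmitted by this one.
        self.radio_send({"source": "controller_B", **event})


link = ControllerLink(radio_send=print)
link.on_local_input({"type": "button", "state": "down"})
link.on_peer_input({"type": "touch", "x": 0.4, "y": 0.7})
```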
As shown in fig. 9, external equipment in system 8, such as electronic device 24, may include sensors such as one or more cameras 92 (e.g., visible-light cameras, infrared cameras, etc.). For example, electronic device 24 may be a head-mounted device such as augmented reality (mixed reality) or virtual reality goggles (or glasses, a helmet, or other head-mounted support structures). Visual markers such as marker 96 may be placed on device 10. Marker 96 may be, for example, a passive visual marker such as a bar code, cross mark, reflector, or other visually identifiable pattern, and may be applied to any suitable location on device 10. If desired, markers 96 may include active visual markers formed from light-emitting components (e.g., visible and/or infrared light-emitting diodes modulated using identifiable modulation codes) that are detected using cameras such as camera 92. Markers 96 may help inform system 8 of the location of controller 10 as a user interacts with a computer or other equipment in system 8. If desired, device 10 may include an optical sensor such as sensor 94 in end portion T1. Optical sensor 94 may be a single-pixel camera or a camera with a two-dimensional array of pixels forming an image sensor. Sensor 94 may be configured to capture images of the environment that may be used by control circuitry 12 in device 10 and/or control circuitry 26 in device 24 to help track the position of device 10. This may be useful, for example, in scenarios where hand 40 blocks visual markers 96 on device 10.
Visual markers 96 and/or an inertial measurement unit such as inertial measurement unit 98 (e.g., an accelerometer, compass, and/or gyroscope) in device 10 may be used to track the position of device 10 relative to device 24 and/or relative to an external object such as surface 100. Meanwhile, system 8 may display associated visual content for the user (e.g., using a display in device 24). The user may interact with the displayed visual content by supplying force input (e.g., to force sensor 52 in force-sensitive end portion T1 of fig. 2), motion input (e.g., a swipe gesture, a pointing gesture, a rotation, etc.), taps, shear force input, touch input (e.g., to touch sensor 42 of fig. 2), and other input to device 10.
For example, during operation of system 8, information on the position of device 10 relative to device 24 and/or surface 100 may be gathered by control circuitry 12 in device 10 or by control circuitry 26 of device 24 (e.g., a head-mounted device, computer, cellular telephone, or other electronic device) while device 10 is monitored for force input, gesture input (e.g., taps, three-dimensional air gestures, pointing input, writing or drawing input, etc.), touch input, and/or any other user input that indicates that the user has selected (e.g., highlighted), moved, or otherwise manipulated a displayed visual element and/or supplied a command to system 8. As an example, the user may make a swiping gesture with device 10, such as waving device 10 to the left to move visual content to the left. System 8 may detect the leftward swipe gesture using the inertial measurement unit in device 10 and, in response, may move visual elements presented to the user with the display in device 24. As another example, the user may select a visual element in the user's field of view by tapping on that element with device 10 and/or by pointing device 10 at that element. The user may draw, paint, or otherwise move device 10 along surface 100 to form a corresponding drawing, painting, or other visual output on the display of device 24.
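A swipe detector of the kind implied here could, for example, threshold lateral acceleration reported by the inertial measurement unit. The following sketch is a simplified illustration only; the threshold, sample rate, and duration values are made-up assumptions.

```python
def detect_horizontal_swipe(accel_x_samples, sample_rate_hz=200.0,
                            accel_threshold=6.0, min_duration_s=0.05):
    """Return 'left', 'right', or None from lateral accelerometer samples.

    accel_x_samples: acceleration along the device's lateral axis in m/s^2,
    with gravity already removed.  All numeric parameters are illustrative
    assumptions, not values from the disclosure.
    """
    min_samples = int(min_duration_s * sample_rate_hz)
    run_sign, run_length = 0, 0
    for a in accel_x_samples:
        sign = 0
        if a > accel_threshold:
            sign = 1
        elif a < -accel_threshold:
            sign = -1
        if sign != 0 and sign == run_sign:
            run_length += 1
        else:
            run_sign, run_length = sign, 1 if sign != 0 else 0
        if run_length >= min_samples:
            return "right" if run_sign > 0 else "left"
    return None
```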
In this way, control circuitry 12 in device 10 and/or control circuitry 26 in device 24 may allow a user to manipulate visual elements being viewed by the user (e.g., virtual reality content or other visual content presented with a head-mounted device such as augmented reality goggles or other device 24 with a display). If desired, a camera such as camera 92 may face the user's eyes (e.g., camera 92 or other visual tracking equipment may form part of a gaze tracking system). The camera and/or other circuitry of the gaze tracking system may monitor the direction in which the user is viewing real-world objects and visual content. As an example, a camera may be used to monitor the gaze point (gaze direction) of the user's eyes while the user is interacting with virtual content presented by device 24 and while the user is holding controller 10 in hand 40. Control circuitry 12 in device 10 and/or control circuitry 26 in device 24 may measure the amount of time that the user's gaze dwells at a particular location and may use this gaze-point information in determining when to select virtual objects. A virtual object may also be selected when it is determined (e.g., by analyzing gaze-point information) that the user is viewing a particular object and that the user has made a voice command, finger input, button press input, or other user input to select the particular object being viewed. Gaze-point information may also be used during drag-and-drop operations (e.g., to move a virtual object in accordance with movement of the gaze point from one location in a scene to another).
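One simple way to express the dwell-time rule described above is sketched below; the dwell threshold and gaze-jitter radius are illustrative assumptions rather than values from the disclosure.

```python
import math
import time


class DwellSelector:
    """Selects a target once gaze stays within a small radius long enough."""

    def __init__(self, dwell_time_s=0.8, radius_px=40.0):
        # Both parameters are assumed values used only for illustration.
        self.dwell_time_s = dwell_time_s
        self.radius_px = radius_px
        self._anchor = None
        self._anchor_time = None

    def update(self, gaze_x, gaze_y, now=None):
        """Feed one gaze sample; returns the dwell point when selection fires."""
        now = time.monotonic() if now is None else now
        moved_away = (self._anchor is None or
                      math.dist(self._anchor, (gaze_x, gaze_y)) > self.radius_px)
        if moved_away:
            # Restart the dwell timer at the new gaze location.
            self._anchor = (gaze_x, gaze_y)
            self._anchor_time = now
            return None
        if now - self._anchor_time >= self.dwell_time_s:
            selected, self._anchor, self._anchor_time = self._anchor, None, None
            return selected
        return None
```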
As an example, consider a scenario of the type shown in fig. 10. In this example, device 24 has a housing in which gaze tracker 106 is mounted for monitoring the user's eyes 110. Device 24 may include a display such as display 108 that is configured to display images for the user. The images may include one or more objects (e.g., visual items) such as object 102. Control circuitry in device 24 may use gaze tracker 106 to determine the direction 112 in which the user is viewing display 108 or other objects. Using direction 112 and/or other information from gaze tracker 106 and/or other sensors (e.g., a depth sensor and/or other sensors that determine the distance of the user from device 24), device 24 may determine the location of the user's gaze point 104 on display 108. For example, device 24 may determine whether the user is currently viewing a virtual object such as object 102 on display 108.
Another exemplary system with gaze tracking is shown in fig. 11. In the example of fig. 11, device 24 is a head-mounted device having a head-mounted support structure 114 (sometimes referred to as a housing) configured to be worn on the head of a user. Rear-facing gaze tracking system 106 may monitor the user's eyes 110 to determine the direction 112 of the user's gaze. Additional sensors such as sensor 116 (e.g., a depth sensor and/or camera) may be used to determine the location and/or other attributes of objects in the user's field of view, such as object 102 of fig. 11. Object 102 may be a real-world object (e.g., a table surface, an inanimate object with circuitry such as one of devices 24, or a non-electronic inanimate object such as a pencil, ball, bottle, cup, table, wall, etc.) or a computer-generated (virtual) object presented to the user's eyes 110 by a display in device 24 (e.g., a see-through display system, or a display system in which virtual content is overlaid on real-world images that have been captured with camera 116 and shown on the display). Using information on the direction 112 of the user's gaze and information on the relative positioning between the user and object 102 (e.g., information from a depth sensor in device 24 and/or information on virtual objects being presented to the user), device 24 may determine when the user's gaze point 104 coincides with object 102.
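The geometric test implied here, namely whether the gaze ray from the eye passes close enough to the object, can be written compactly. The sketch below assumes that eye position, gaze direction, and object position are already expressed in a common coordinate frame, and the angular tolerance is an illustrative placeholder.

```python
import numpy as np


def gaze_hits_object(eye_pos, gaze_dir, object_pos, angular_tolerance_deg=2.0):
    """Return True if the gaze ray points at the object within a tolerance.

    eye_pos, object_pos: 3-vectors in the same frame (e.g., meters, headset
    frame); gaze_dir: unit 3-vector.  The 2-degree tolerance is an assumed
    value chosen only for illustration.
    """
    to_object = np.asarray(object_pos, dtype=float) - np.asarray(eye_pos, dtype=float)
    distance = np.linalg.norm(to_object)
    if distance == 0.0:
        return True  # degenerate case: object at the eye position
    cos_angle = float(np.dot(to_object / distance, gaze_dir))
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle_deg <= angular_tolerance_deg
```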
Arrangements of the type shown in figs. 10 and 11 allow a user to interact with real-world content and computer-generated (virtual) content. For example, the user may select an object of interest by directing gaze point 104 at the object (e.g., for more than a predetermined dwell time and/or until associated user input such as finger input is received to confirm the selection). Using device 10 and/or other equipment in system 8, the user may then perform operations on the selected object. For example, objects selected by gaze-point dwell or other selection actions may be manipulated using two-dimensional touch input gathered with touch sensor 42, force input gathered with sensor 52, tap input gathered by accelerometers in device 10, motion input gathered by motion sensor 98 and/or position tracking circuitry (such as camera 92, camera 94, and/or visual markers 96), or other input gathered with other sensors 18. Examples of virtual object manipulations that may be performed based on two-dimensional touch input and/or other sensor input include panning, rotating, and resizing objects and modifying other visual characteristics such as color, texture, brightness level and/or contrast settings, etc.
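To illustrate how two-dimensional touch input might drive these manipulations, the sketch below maps touch-sensor deltas onto pan, rotate, and resize steps applied to a simple object record; the mode names, gains, and data layout are assumptions made for the example.

```python
def apply_touch_manipulation(obj, dx, dy, mode="pan", gain=1.0):
    """Apply a pan, rotate, or resize step to a virtual object.

    `obj` is a dict with 'x', 'y', 'angle_deg', and 'scale' keys; dx and dy
    are touch-sensor deltas in millimeters.  The mapping and gain values are
    illustrative assumptions, not part of the disclosure.
    """
    if mode == "pan":
        obj["x"] += gain * dx
        obj["y"] += gain * dy
    elif mode == "rotate":
        obj["angle_deg"] = (obj["angle_deg"] + gain * dx * 2.0) % 360.0
    elif mode == "resize":
        obj["scale"] = max(0.1, obj["scale"] * (1.0 + gain * dy * 0.01))
    return obj


cube = {"x": 0.0, "y": 0.0, "angle_deg": 0.0, "scale": 1.0}
apply_touch_manipulation(cube, dx=5.0, dy=-2.0, mode="pan")
```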
Real-world objects may also be manipulated. Such objects may include, for example, electronic systems in a home or office such as computers, home automation systems, lighting, heating and ventilation systems, shutters, door locks, security cameras, thermostats, audio systems, and audiovisual equipment (e.g., televisions, set-top boxes, and voice assistant speakers), as well as portable electronic devices and/or other electronic equipment (e.g., devices that include components such as the circuitry of device 24). Examples of real-world object manipulations that may be performed on a selected object include adjusting the brightness of a light bulb (e.g., part of a wireless lighting system), adjusting the temperature setting of a thermostat, adjusting the operation of a computer, adjusting a television (e.g., changing channels, adjusting volume, changing video and/or audio sources, selecting tracks and video clips to play, etc.), adjusting speaker volume, skipping audio tracks, etc.
If desired, an object may be selected by detecting when device 10 is pointed at the object of interest (e.g., by tracking the position of the object and/or device 10 using a camera in device 24 or device 10, by determining the orientation and pointing direction of device 10 using an inertial measurement unit or other orientation sensor in device 10, and/or by tracking the position and orientation of device 10 using a radio-frequency sensor and/or using a camera to follow optical tracking elements on device 10). Relative positioning determination and object selection may also be performed using radio-frequency sensors (e.g., IEEE ultra-wideband sensors) that detect the orientation and position of device 10 and determine the range to objects, and/or using other sensors 18.
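Pointing-based selection of this kind reduces to a ray test defined by the controller's position and orientation. The sketch below is a simplified illustration that assumes the controller's pointing axis is its local +x axis and that candidate object positions are known in the same frame; the cone angle is a placeholder.

```python
import numpy as np


def quaternion_to_forward(q):
    """Rotate the device's local +x axis (assumed pointing axis) by q = (w, x, y, z)."""
    w, x, y, z = q
    # First column of the rotation matrix corresponding to quaternion q.
    return np.array([
        1.0 - 2.0 * (y * y + z * z),
        2.0 * (x * y + w * z),
        2.0 * (x * z - w * y),
    ])


def pick_pointed_object(device_pos, device_quat, objects, max_angle_deg=5.0):
    """Return the name of the object closest to the pointing ray, or None.

    `objects` maps names to 3-D positions in the same frame as device_pos.
    The pointing-axis convention and the 5-degree cone are assumptions.
    """
    forward = quaternion_to_forward(device_quat)
    forward = forward / np.linalg.norm(forward)
    best_name, best_angle = None, max_angle_deg
    for name, pos in objects.items():
        to_obj = np.asarray(pos, dtype=float) - np.asarray(device_pos, dtype=float)
        norm = np.linalg.norm(to_obj)
        if norm == 0.0:
            continue
        cos_a = float(np.dot(to_obj / norm, forward))
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        if angle <= best_angle:
            best_name, best_angle = name, angle
    return best_name
```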
As a first example, consider a scenario in which object 102 is a computer-generated icon. In this situation, after aligning gaze point 104 so that it overlaps the computer-generated icon and thereby selecting the icon for further action, the user may use controller 10 and/or other input components in system 8 to supply commands that direct system 8 to begin an associated operation. As an example, if the icon is an email icon, system 8 may launch an email program on device 24 upon receiving user input with device 10.
In a second example, object 102 is a real-world object such as a non-electronic inanimate object (e.g., an object viewed by the user of device 24 of fig. 11 while device 24 is being worn on the user's head). In response to detecting that the user's gaze point 104 is directed toward object 102 and in response to receiving input with controller 10, device 24 may generate a virtual object that overlaps some or all of object 102 in the user's field of view. Other operations may include zooming in on a portion of object 102, changing the color or texture of object 102, adding an outline around object 102, adding graphical elements that are aligned with object 102, and/or taking other suitable actions.
In a third example, object 102 is a real-world object that contains circuitry. Object 102 may be, for example, a wireless speaker or other electronic device 24. In response to detecting that the user's gaze point 104 is directed toward object 102 and in response to receiving user input with device 10, device 24 may adjust the output volume of the speaker. If the object coinciding with gaze point 104 is a device such as a television, the channel of the television may be changed in response to user input to device 10. In this way, the user may interact with electronic devices around the user's home or office or in other environments simply by gazing at an object and supplementing this gaze-point selection with additional input supplied with device 10. Gaze-point dwell time, blinks, and other eye movements may also be used as user input.
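The three examples above share a common pattern: resolve which object the gaze point is on, then route the controller input to an action appropriate for that object. A schematic dispatcher illustrating that pattern is shown below; the object kinds, field names, and actions are hypothetical stand-ins for the icon, inanimate-object, and speaker/television examples.

```python
def handle_gaze_confirmed_input(gazed_object, controller_event):
    """Route controller input to an action for the object under the gaze point.

    `gazed_object` is a dict with a 'kind' field; the kinds and actions here
    are illustrative assumptions rather than behavior from the disclosure.
    """
    kind = gazed_object.get("kind")
    if kind == "email_icon":
        return {"action": "launch_app", "app": "mail"}
    if kind == "inanimate_object":
        return {"action": "overlay_virtual_content", "target": gazed_object.get("id")}
    if kind == "speaker":
        return {"action": "adjust_volume", "delta": controller_event.get("scroll", 0)}
    if kind == "television":
        return {"action": "change_channel", "delta": controller_event.get("scroll", 0)}
    return {"action": "none"}
```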
If desired, device 24 may be used to overlay computer-generated content onto or near device 10. As shown in fig. 12, for example, a user may wear head-mounted device 24 while holding device 10. The head-mounted device may overlay computer-generated display content such as display content 118 onto device 10. Display content 118 may include, for example, an array of virtual keyboard keys. The user may select a key by providing touch input, force input, motion input, or other user input to device 10 at a position that overlaps the virtual keyboard key being selected. For example, as shown in fig. 12, some of the display content, such as display content 118M, may overlap device 10. The user may then select keyboard keys from display content 118M by tapping, hovering, gazing, or providing other user input at the desired virtual key within the row of display content 118M. If desired, the user may apply touch or force input to touch sensor 42 to select or manipulate display content 118 (e.g., to act as a shift key). To scroll to different rows so that different sets of keys overlap device 10, the user may rotate device 10 in directions 122 and 124, which may cause display content 118 to move based on the direction of rotation of device 10. If desired, sensors in device 10 and/or cameras in device 24 may track the position of the user's fingers relative to device 10, and display content 118 may be manipulated accordingly. For example, if finger 40B hovers at location 120C, a key in the top row may be selected, or the top row of keys in display content 118 may be moved downward to overlap device 10 so that the user may select one of the keys from that row by touching or tapping device 10. If finger 40A hovers at location 120B, a key in the bottom row may be selected, or the bottom row of virtual keys in display content 118 may be moved upward to overlap device 10 so that the user may select one of the keys from that row by touching or tapping device 10. Camera 92 of device 24 and/or sensors 18 in device 10 may be used to monitor finger positions (e.g., locations 120A, 120B, 120C, 120D, etc.) relative to device 10 so that appropriate action may be taken with respect to display content 118.
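A hover-position-to-row mapping of the kind described can be sketched as follows; the row pitch, row count, and coordinate convention are assumptions chosen only for illustration.

```python
def row_for_hover_height(hover_height_mm, row_pitch_mm=18.0, num_rows=4):
    """Map the finger's hover height above the controller to a virtual key row.

    Height is measured perpendicular to the controller surface, with 0 at the
    row of keys currently overlapping the controller; the 18 mm pitch and four
    rows are assumed values.
    """
    if hover_height_mm < 0:
        return 0
    row = int(hover_height_mm // row_pitch_mm)
    return min(row, num_rows - 1)


def scroll_keyboard_to_row(display_content, target_row):
    """Shift the virtual key rows so the target row overlaps the controller."""
    display_content["row_offset"] = target_row
    return display_content


keyboard = {"row_offset": 0}
scroll_keyboard_to_row(keyboard, row_for_hover_height(40.0))  # hover ~40 mm -> row 2
```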
If desired, display content may be overlaid onto device 10 to indicate what type of input functionality device 10 is providing to external equipment such as device 24. For example, if device 10 is being used to provide virtual painting input while the display in device 24 displays what the user is painting, the display of device 24 may overlay a painting head 126 onto the end of device 10, as shown in fig. 13. If device 10 is being used to provide virtual building input while the display in device 24 displays what the user is building, the display of device 24 may overlay a building tool 128 onto the end of device 10, as shown in fig. 14. These examples are merely illustrative. Other computer-generated display content, such as other writing instruments, steering wheels, buttons, knobs, levers, and/or any other suitable virtual content, may be overlaid onto device 10 while device 10 is being used to provide input to external devices such as device 24.
If desired, device 10 may provide haptic output while device 10 is being used to control external equipment. For example, when a user is "writing" with device 10 by moving the end of device 10 across a surface, haptic output may be provided based on the texture of the surface being written on and/or based on the writing instrument currently selected for device 10. Different surfaces and/or different input modes for device 10 may produce different haptic outputs if desired. Fig. 15 is a graph of an exemplary drive signal D plotted as a function of time t. As an example, when device 10 is being used in a pen input mode (e.g., when device 10 is used to create pen writing on an external display), control circuitry in device 10 may supply a first drive signal D having a relatively high amplitude and frequency during time 130. When device 10 switches into a different mode such as a gaming mode (e.g., when device 10 is being used as a tool in a game), a different texture or other haptic output may be supplied to the user's hand during time 132 using haptic output components 20 in device 10. As an example, a lower-frequency, lower-amplitude drive signal D may be used to control the haptic output components in device 10 during time 132. In this way, the user feels a first texture or other feedback when using device 10 for a first input function and a second texture or other feedback when using device 10 for a second input function. Other haptic effects, such as haptic output that conveys a shape and/or a rounded edge, effects associated with compliant structures, braking, force feedback that simulates resistance to movement, and simulated detent clicks and/or button presses and releases with a physical click feel, may be provided if desired. Corresponding visual effects may be provided on the display of device 24.
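A mode-to-drive-parameter lookup of the kind suggested by fig. 15 could be expressed as follows; the specific amplitudes and frequencies are illustrative placeholders, not values from the disclosure.

```python
# Illustrative drive parameters per input mode; the numbers are placeholders.
HAPTIC_PROFILES = {
    "pen":  {"amplitude": 0.9, "frequency_hz": 230.0},  # crisper, higher-pitched texture
    "game": {"amplitude": 0.5, "frequency_hz": 120.0},  # softer, lower-pitched texture
}


def drive_parameters(mode, surface_roughness=1.0):
    """Return amplitude/frequency settings for the haptic driver in a given mode.

    `surface_roughness` is a hypothetical scale factor derived from the surface
    the tip is moving across (1.0 = nominal surface).
    """
    profile = HAPTIC_PROFILES.get(mode, HAPTIC_PROFILES["game"])
    return {
        "amplitude": min(1.0, profile["amplitude"] * surface_roughness),
        "frequency_hz": profile["frequency_hz"],
    }


print(drive_parameters("pen", surface_roughness=1.2))
```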
Fig. 16 is a diagram of an exemplary haptic output device (such as a linear resonant actuator) that may be included in haptic output devices 20 of controller 10. In the example of fig. 16, linear resonant actuator 232 has magnets 228. Magnets 228 may include permanent magnets and/or electromagnets. Support structure 226 may be formed of metal, polymer, and/or other materials. A first one of magnets 228, such as magnet MAG2 (e.g., a permanent magnet), may be attached to support structure 226. A second one of magnets 228, such as magnet MAG1 (e.g., an electromagnet), may be coupled to support structure 226 by a mechanical system (see, e.g., spring 230) that allows magnet MAG1 to move relative to magnet MAG2 and exhibit resonance. Linear resonant actuator 232 may be driven by applying an alternating-current drive signal to the electromagnet in magnets 228 (e.g., magnet MAG1). When driven with a waveform having a frequency at or near the resonant frequency of actuator 232, actuator 232 may exhibit enhanced vibration and therefore enhanced tactile output for the user of device 10. Linear resonant actuator 232 may be mounted in any suitable location in device 10 (e.g., within body portion B, end portion T1, end portion T2, etc.).
A haptic output device such as linear resonant actuator 232 may be driven using any suitable drive signal. Exemplary drive signals that may be used are shown in fig. 17, in which drive signal amplitude AM is plotted against time t for two exemplary drive signals. The exemplary drive signals of fig. 17 may be repeated for a plurality of periods TP (e.g., at a repetition frequency of 75 Hz, greater than 75 Hz, or less than 75 Hz). The total duration of the drive signal applied to linear resonant actuator 232 may be equal to the total duration of the desired haptic output (e.g., at least 0.2 s, at least 0.5 s, at least 1 s, less than 500 s, less than 50 s, less than 5 s, less than 0.8 s, etc.). As shown by exemplary drive waveform 234, linear resonant actuator 232 may be driven by a periodic oscillating waveform that decays over several oscillations during each period TP. The frequency of oscillation in this waveform may be selected to coincide with the resonant frequency of the linear resonant actuator.
The asymmetry of the oscillations within each period TP may impart directionality to the haptic effect experienced by the user. For example, waveform 234 may impart a sensation of force to finger 40 in a given direction, while waveform 236, which is complementary to waveform 234, may impart a sensation of force to finger 40 in a direction opposite the given direction (as an example). A haptic output having frequency components in the range of 50Hz to 300Hz may be well suited for detection by a human being (e.g., finger 40), but other frequency components may be present in the drive signal applied to linear resonant actuator 232 if desired. In some arrangements, there may be a tradeoff between directional feel and vibratory feel (i.e., a drive signal that is asymmetric or otherwise configured to enhance a user's perception of directional forces on the finger 40 may tend to reduce the user's perception that the finger 40 is vibrating, and vice versa). Drive signals such as drive signals 234 and 236 of fig. 17 may be configured to provide a desired mix of directional and vibratory sensations (e.g., by changing the shape of the damping envelope of the drive signal, by changing the frequency of oscillation of the drive signal within each period TP, by changing the leading and/or trailing edge slope of oscillations in the drive signal, by changing the degree of asymmetry within each oscillation and/or within each period TP, etc.).
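The decaying, possibly asymmetric oscillation described for waveforms 234 and 236 can be approximated numerically as shown below; the resonant frequency, repetition rate, decay constant, and asymmetry factor are illustrative assumptions rather than values from the figures.

```python
import numpy as np


def lra_drive_waveform(duration_s=0.5, sample_rate_hz=8000.0,
                       resonant_hz=170.0, repeat_hz=75.0,
                       decay_per_period=5.0, asymmetry=0.3, invert=False):
    """Generate a periodic, decaying drive signal for a linear resonant actuator.

    Each repetition period contains an oscillation at the (assumed) resonant
    frequency whose envelope decays over a few cycles; weighting positive
    half-cycles more heavily than negative ones biases the net force in one
    direction, and invert=True produces the complementary waveform (compare
    waveforms 234 and 236).  All numeric values are placeholders.
    """
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    t_in_period = np.mod(t, 1.0 / repeat_hz)          # time within each period TP
    envelope = np.exp(-decay_per_period * repeat_hz * t_in_period)
    carrier = np.sin(2.0 * np.pi * resonant_hz * t_in_period)
    # Boost positive half-cycles relative to negative ones for a directional bias.
    bias = np.where(carrier >= 0.0, 1.0 + asymmetry, 1.0 - asymmetry)
    signal = envelope * carrier * bias
    return -signal if invert else signal


waveform_234 = lra_drive_waveform()
waveform_236 = lra_drive_waveform(invert=True)
```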
As described above, one aspect of the present technology is the gathering and use of information such as sensor information. The present disclosure contemplates that, in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, eyeglass prescription, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information in the present technology may be used to benefit users. For example, the personal information data may be used to deliver targeted content of greater interest to the user. Thus, the use of such personal information data enables a user to have programmatic control over the delivered content. In addition, the present disclosure contemplates other uses for personal information data that are beneficial to the user. For example, the health and fitness data may be used to provide insight into the general health of the user, or may be used as positive feedback to individuals who use the technology to pursue health goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy and security of personal information data. Such policies should be easily accessible by users and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that other entities with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted to the particular types of personal information data being collected and/or accessed, and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence, different privacy practices should be maintained for different personal data types in each country.
Regardless of the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to "opt in" or "opt out" of participation in the collection of personal information data during registration for services or at any time thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing "opt in" and "opt out" options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application ("app") that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Furthermore, it is the intent of the present disclosure that personal information data should be managed and handled in a way that minimizes risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health-related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, where appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments can be implemented without the need to access such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
Physical environment-a physical environment refers to the physical world that people can sense and/or interact with without the aid of electronic systems. Physical environments, such as a physical park, include physical objects, such as physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through sight, touch, hearing, taste, and smell.
Computer-generated reality-in contrast, a computer-generated reality (CGR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic system. In CGR, a subset of a person's physical motions, or representations thereof, is tracked, and in response, one or more characteristics of one or more virtual objects simulated in the CGR environment are adjusted in a manner that comports with at least one law of physics. For example, a CGR system may detect a person's head turning and, in response, adjust the graphical content and acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), adjustments to characteristics of virtual objects in a CGR environment may be made in response to representations of physical motions (e.g., voice commands). A person may sense and/or interact with a CGR object using any of their senses, including sight, hearing, touch, taste, and smell. For example, a person may sense and/or interact with audio objects that create a 3D or spatial audio environment providing the perception of point audio sources in 3D space. As another example, audio objects may enable audio transparency, which selectively incorporates ambient sounds from the physical environment with or without computer-generated audio. In some CGR environments, a person may sense and/or interact only with audio objects. Examples of CGR include virtual reality and mixed reality.
Virtual reality-a virtual reality (VR) environment refers to a simulated environment that is designed to be based entirely on computer-generated sensory inputs for one or more senses. A VR environment comprises a plurality of virtual objects with which a person may sense and/or interact. For example, computer-generated images of trees, buildings, and avatars representing people are examples of virtual objects. A person may sense and/or interact with virtual objects in the VR environment through a simulation of the person's presence within the computer-generated environment and/or through a simulation of a subset of the person's physical movements within the computer-generated environment.
Mixed reality-in contrast to a VR environment, which is designed to be based entirely on computer-generated sensory inputs, a mixed reality (MR) environment refers to a simulated environment that is designed to incorporate sensory inputs from the physical environment, or representations thereof, in addition to computer-generated sensory inputs (e.g., virtual objects). On a virtuality continuum, a mixed reality environment is anywhere between, but not including, a wholly physical environment at one end and a virtual reality environment at the other end. In some MR environments, computer-generated sensory inputs may respond to changes in sensory inputs from the physical environment. Also, some electronic systems for presenting an MR environment may track location and/or orientation with respect to the physical environment to enable virtual objects to interact with real objects (that is, physical objects or representations thereof from the physical environment). For example, a system may account for movements so that a virtual tree appears stationary with respect to the physical ground. Examples of mixed reality include augmented reality and augmented virtuality.
Augmented reality-an augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment, or a representation thereof. For example, an electronic system for presenting an AR environment may have a transparent or translucent display through which a person may directly view the physical environment. The system may be configured to present virtual objects on the transparent or translucent display, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. Alternatively, a system may have an opaque display and one or more imaging sensors that capture images or video of the physical environment, which are representations of the physical environment. The system composites the images or video with virtual objects and presents the composition on the opaque display. A person, using the system, indirectly views the physical environment by way of the images or video of the physical environment and perceives the virtual objects superimposed over the physical environment. As used herein, video of the physical environment shown on an opaque display is called "pass-through video," meaning that the system captures images of the physical environment using one or more image sensors and uses those images in presenting the AR environment on the opaque display. As a further alternative, a system may have a projection system that projects virtual objects into the physical environment, for example, as a hologram or onto a physical surface, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. An augmented reality environment also refers to a simulated environment in which a representation of a physical environment is transformed by computer-generated sensory information. For example, in providing pass-through video, a system may transform one or more sensor images to impose a selected perspective (e.g., viewpoint) different from the perspective captured by the imaging sensors. As another example, a representation of a physical environment may be transformed by graphically modifying (e.g., enlarging) portions thereof, so that the modified portions are representative but not photorealistic versions of the originally captured images.
As yet another example, a representation of a physical environment may be transformed by graphically eliminating or obscuring portions thereof. Augmented virtuality-an augmented virtuality (AV) environment refers to a simulated environment in which a virtual or computer-generated environment incorporates one or more sensory inputs from the physical environment. The sensory inputs may be representations of one or more characteristics of the physical environment. For example, an AV park may have virtual trees and virtual buildings, but the faces of people are realistically reproduced from images taken of physical people. As another example, a virtual object may adopt the shape or color of a physical object imaged by one or more imaging sensors. As a further example, a virtual object may adopt shadows consistent with the position of the sun in the physical environment.
Hardware-there are many different types of electronic systems that enable a person to sense and/or interact with various CGR environments. Examples include head-mounted systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablet computers, and desktop/laptop computers. A head-mounted system may have one or more speakers and an integrated opaque display. Alternatively, a head-mounted system may be configured to accept an external opaque display (e.g., a smartphone). The head-mounted system may incorporate one or more imaging sensors to capture images or video of the physical environment and/or one or more microphones to capture audio of the physical environment. Rather than an opaque display, a head-mounted system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes. The display may utilize digital light projection, OLEDs, LEDs, μLEDs, liquid crystal on silicon, laser scanning light sources, or any combination of these technologies. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In one embodiment, the transparent or translucent display may be configured to become selectively opaque. Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or onto a physical surface.
According to an embodiment, a handheld controller for controlling an electronic device is provided that includes a housing having an elongated shaft extending between a first end portion and a second end portion, a force sensor located in the first end portion and configured to collect force input, a motion sensor configured to collect motion data, an actuator configured to provide a haptic output in response to the force input and the motion data, and control circuitry located in the housing and configured to send a control signal to the electronic device based on the force input and the motion data.
According to another embodiment, the first end portion is removable from the elongate shaft.
According to another embodiment, the elongate shaft has a flat surface and a curved surface.
According to another embodiment, the handheld controller includes a button located on the flat surface of the elongate shaft.
According to another embodiment, the handheld controller includes a touch sensor located on a curved surface of the elongate shaft.
According to another embodiment, a handheld controller includes a power receiving coil configured to receive wireless power.
According to another embodiment, the actuator comprises a linear resonant actuator configured to impart directivity to the haptic output.
According to another embodiment, the hand-held controller includes a roller that rotates relative to the elongated housing.
According to another embodiment, the elongated housing comprises a first housing portion and a second housing portion operable in a first mode in which the first housing portion and the second housing portion cooperate with each other and a second mode in which the first housing portion and the second housing portion are separate from each other and operate independently.
According to another embodiment, the control signal includes instructions for selecting a user interface element on a display in the electronic device.
According to another embodiment, the control signal includes instructions for drawing lines on a display in the electronic device.
According to another embodiment, the control signal includes instructions for typing words on a display in the electronic device.
According to an embodiment, a handheld controller for controlling an electronic device is provided that includes a housing having an elongated shaft extending between a first end portion and a second end portion, a motion sensor configured to collect motion data, a camera located in the first end portion and configured to capture an image, a touch sensor located on the elongated shaft and configured to collect touch input, and control circuitry configured to send control signals to the electronic device based on the motion data, the image, and the touch input.
According to another embodiment, the housing has a length between 130 mm and 160 mm and a diameter between 10 mm and 15 mm.
According to another embodiment, the camera comprises a single pixel camera.
According to another embodiment, the control signal includes instructions for scrolling content on a display in the electronic device.
According to an embodiment, a system is provided that includes a handheld controller including an elongated housing having a force-sensitive tip, a visual marker located on the elongated housing, and an inertial measurement unit configured to collect motion data, and a head-mounted device including a camera configured to capture an image of the visual marker, and control circuitry configured to track a position of the handheld controller relative to the head-mounted device based on the image and the motion data.
According to another embodiment, a head mounted device includes a display configured to overlay computer-generated display content onto a handheld controller.
According to another embodiment, the handheld controller includes additional control circuitry configured to send control signals to the head mounted device to manipulate the computer generated display content in response to the motion data.
According to another embodiment, the computer-generated display content is selected from the group consisting of virtual keyboard keys and virtual writing tools.
The foregoing is merely illustrative and various modifications may be made to the embodiments. The foregoing embodiments may be implemented alone or in any combination.
Claims (20)
1. A hand-held controller for controlling an electronic device, comprising:
a housing having an elongate shaft extending between a first end portion and a second end portion;
a force sensor located in the first end portion and configured to collect a force input;
a motion sensor configured to collect motion data;
an actuator configured to provide a haptic output in response to the force input and the motion data; and
control circuitry located in the housing and configured to send control signals to the electronic device based on the force input and the motion data.
2. The hand-held controller of claim 1, wherein the first end portion is removable from the elongate shaft.
3. The hand-held controller of claim 1, wherein the elongate shaft has a flat surface and a curved surface.
4. The hand-held controller of claim 3, further comprising a button located on the flat surface of the elongate shaft.
5. The hand-held controller of claim 3, further comprising a touch sensor located on the curved surface of the elongate shaft.
6. The handheld controller of claim 1, further comprising a power receiving coil configured to receive wireless power.
7. The handheld controller of claim 1, wherein the actuator comprises a linear resonant actuator configured to impart directivity to the haptic output.
8. The hand-held controller of claim 1, further comprising a roller that rotates relative to the elongated housing.
9. The hand-held controller of claim 1, wherein the elongated housing comprises first and second housing portions operable in a first mode in which the first and second housing portions mate with one another and a second mode in which the first and second housing portions are separate from one another and operate independently.
10. The handheld controller of claim 1, wherein the control signal comprises instructions for selecting a user interface element on a display in the electronic device.
11. The handheld controller of claim 1, wherein the control signal comprises instructions for drawing a line on a display in the electronic device.
12. The handheld controller of claim 1, wherein the control signal comprises instructions for typing words on a display in the electronic device.
13. A hand-held controller for controlling an electronic device, comprising:
a housing having an elongate shaft extending between a first end portion and a second end portion;
a motion sensor configured to collect motion data;
a camera located in the first end portion and configured to capture an image;
a touch sensor positioned on the elongate shaft and configured to collect touch input; and
control circuitry configured to send control signals to the electronic device based on the motion data, the image, and the touch input.
14. The hand-held controller of claim 13, wherein the housing has a length between 130 mm and 160 mm and a diameter between 10 mm and 15 mm.
15. The hand-held controller of claim 13, wherein the camera comprises a single-pixel camera.
16. The handheld controller of claim 13, wherein the control signal includes instructions for scrolling content on a display in the electronic device.
17. A system, comprising:
a handheld controller, the handheld controller comprising:
an elongated housing having a force-sensitive end;
a visual marker located on the elongated housing; and
an inertial measurement unit configured to collect motion data; and
a head-mounted device, the head-mounted device comprising:
a camera configured to capture an image of the visual marker; and
control circuitry configured to track a position of the handheld controller relative to the head-mounted device based on the image and the motion data.
18. The system of claim 17, wherein the head mounted device comprises a display configured to overlay computer-generated display content onto the handheld controller.
19. The system of claim 18, wherein the handheld controller comprises additional control circuitry configured to send control signals to the head mounted device to manipulate the computer-generated display content in response to the motion data.
20. The system of claim 19, wherein the computer-generated display content is selected from the group consisting of virtual keyboard keys and virtual writing tools.
Applications Claiming Priority (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202263388196P | 2022-07-11 | 2022-07-11 | |
| US63/388,196 | 2022-07-11 | ||
| US18/348,943 | 2023-07-07 | ||
| US18/348,943 US20240012496A1 (en) | 2022-07-11 | 2023-07-07 | Computer Systems with Handheld Controllers |
| PCT/US2023/069909 WO2024015750A1 (en) | 2022-07-11 | 2023-07-10 | Computer systems with handheld controllers |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN119563150A true CN119563150A (en) | 2025-03-04 |
Family
ID=87553680
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202380052831.0A Pending CN119563150A (en) | 2022-07-11 | 2023-07-10 | Computer system with handheld controller |
Country Status (4)
| Country | Link |
|---|---|
| KR (1) | KR20250020641A (en) |
| CN (1) | CN119563150A (en) |
| DE (1) | DE112023003025T5 (en) |
| WO (1) | WO2024015750A1 (en) |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190369752A1 (en) * | 2018-05-30 | 2019-12-05 | Oculus Vr, Llc | Styluses, head-mounted display systems, and related methods |
| TWI697814B (en) * | 2018-07-16 | 2020-07-01 | 禾瑞亞科技股份有限公司 | Functional modular touch pen |
- 2023
- 2023-07-10 DE DE112023003025.1T patent/DE112023003025T5/en active Pending
- 2023-07-10 KR KR1020257000517A patent/KR20250020641A/en active Pending
- 2023-07-10 CN CN202380052831.0A patent/CN119563150A/en active Pending
- 2023-07-10 WO PCT/US2023/069909 patent/WO2024015750A1/en not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| DE112023003025T5 (en) | 2025-10-02 |
| WO2024015750A1 (en) | 2024-01-18 |
| KR20250020641A (en) | 2025-02-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11360558B2 (en) | Computer systems with finger devices | |
| US11347312B1 (en) | Ultrasonic haptic output devices | |
| US12229341B2 (en) | Finger-mounted input devices | |
| CN111240465B (en) | Computer system with finger device for sampling object properties | |
| US11287886B1 (en) | Systems for calibrating finger devices | |
| US12026325B2 (en) | Handheld input devices with sleeves | |
| US11360587B1 (en) | Deployment systems for computer system finger devices | |
| US12130972B2 (en) | Tracking devices for handheld controllers | |
| US20240362312A1 (en) | Electronic Device System With Ring Devices | |
| US20240012496A1 (en) | Computer Systems with Handheld Controllers | |
| US12259560B2 (en) | Handheld controllers with charging and storage systems | |
| CN119440281A (en) | Handheld Input Devices | |
| US12399561B2 (en) | Ring device | |
| CN119563150A (en) | Computer system with handheld controller | |
| US20250093968A1 (en) | Handheld Controllers with Surface Marking Capabilities | |
| US11714495B2 (en) | Finger devices with adjustable housing structures | |
| WO2025064174A1 (en) | Handheld controllers with surface marking capabilities | |
| US12287927B2 (en) | Multi-input for rotating and translating crown modules | |
| US20250103140A1 (en) | Audio-haptic cursor for assisting with virtual or real-world object selection in extended-reality (xr) environments, and systems and methods of use thereof | |
| CN118502580A (en) | Ring apparatus |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||