CN118556217A - Debug access for goggles with multiple SOCs
- Publication number
- CN118556217A (application CN202280087068.0A)
- Authority
- CN
- China
- Prior art keywords
- SoC
- USB
- USB port
- low power
- automation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/3296—Power saving characterised by the action undertaken by lowering the supply or operating voltage
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
- G06F13/38—Information transfer, e.g. on bus
- G06F13/382—Information transfer, e.g. on bus using universal interface adapter
- G06F13/385—Information transfer, e.g. on bus using universal interface adapter for adaptation of a particular data processing system to different peripheral devices
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
- G06F13/38—Information transfer, e.g. on bus
- G06F13/42—Bus transfer protocol, e.g. handshake; Synchronisation
- G06F13/4282—Bus transfer protocol, e.g. handshake; Synchronisation on a serial bus, e.g. I2C bus, SPI bus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2213/00—Indexing scheme relating to interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
- G06F2213/0042—Universal serial bus [USB]
Abstract
Goggles include multiple systems on a chip (SoCs) that share a processing workload and a USB port configured for performing low-power debugging. The goggles include a USB hub configured to allow the USB port to communicate with multiple SoCs simultaneously. The USB hub may be turned off to disable the USB hub, and all SoCs may then enter their low-power modes without being kept awake by a persistent USB connection. The goggles include switches and control logic that enable the USB port to perform low-power debugging of, or to control, each of the SoCs.
Description
Cross Reference to Related Applications
The present application claims priority to U.S. Application Serial No. 17/566,951, filed on December 31, 2021, the contents of which are incorporated herein by reference in their entirety.
Technical Field
Examples set forth in this disclosure relate to the field of electronic devices, and more particularly to eyewear devices.
Background
Many types of computers and electronic devices available today, such as mobile devices (e.g., smartphones, tablets, and notebooks), handheld devices, and wearable devices (e.g., smart glasses, digital glasses, headwear, headgear, and head-mounted displays) include various cameras, sensors, wireless transceivers, input systems (e.g., touch-sensitive surfaces, pointers), peripheral devices, displays, and Graphical User Interfaces (GUIs) through which a user can interact with displayed content.
Augmented Reality (AR) combines real objects with virtual objects in a physical environment and displays the combination to a user. The combined display gives the impression that the virtual object is actually present in the environment, especially when the virtual object looks and behaves like a real object.
Drawings
Features of the different examples described will be readily understood from the following detailed description with reference to the accompanying drawings. A reference numeral is used for each element in the description and throughout the several views of the drawings. When there are multiple similar elements, a single reference numeral may be assigned to the group of similar elements, with a letter appended to refer to a particular element. When referring to more than one of the elements, or to a non-specific one of the elements, the letter may be dropped.
The various elements shown in the drawings are not drawn to scale unless otherwise indicated. The dimensions of the various elements may be exaggerated or reduced for clarity. The several figures depict one or more implementations and are presented by way of example only and should not be construed as limiting. The drawings include the following figures:
FIG. 1A is a side view (right) of an example hardware configuration of an eyewear device suitable for use in an eyewear system;
FIG. 1B is a perspective partial cross-sectional view of the right temple portion of the eyewear device of FIG. 1A, depicting a right visible light camera and a circuit board;
FIG. 1C is a side view (left) of an example hardware configuration of the eyewear device of FIG. 1A, showing a left visible light camera;
FIG. 1D is a perspective partial cross-sectional view of the left temple portion of the eyewear device of FIG. 1C, depicting a left visible light camera and a circuit board;
Fig. 2A and 2B are rear views of an example hardware configuration of an eyewear device for use in the eyewear system;
FIG. 2C illustrates detecting an eye gaze direction;
FIG. 2D illustrates detecting eye position;
FIG. 3 is a pictorial representation of a three dimensional scene, a left raw image captured by a left visible camera, and a right raw image captured by a right visible camera;
FIG. 4 is a functional block diagram of an example eyewear system including eyewear devices connected to a mobile device and a server system via various networks;
FIG. 5 is a schematic representation of an example hardware configuration of a mobile device of the eyewear system of FIG. 4;
FIG. 6 is a partial block diagram of an eyewear device having a first system-on-chip adjacent one temple and a second system-on-chip adjacent the other temple;
FIG. 7 is a flowchart of example steps for performing operations on goggles with a first system on a chip (SoC) and a second SoC;
FIG. 8 is a block diagram of goggles including a Universal Serial Bus (USB) port and USB hub for providing debug access to multiple SoCs; and
FIG. 9 is a flowchart of example steps for performing a debug operation on goggles using a USB port and a USB hub.
Detailed Description
A goggle device includes a plurality of SoCs that share a processing workload and a USB port configured to perform low-power debugging and automation of the plurality of SoCs, for example, using a Universal Asynchronous Receiver-Transmitter (UART) or Serial Wire Debug (SWD). The goggles include a USB hub configured so that the USB port may communicate with multiple SoCs simultaneously. The USB hub may be turned off, and all SoCs may then enter their low-power modes without being kept awake by a persistent USB connection. The goggles include a first switch and control logic, where the control logic controls the first switch to enable the USB port to perform low-power debugging and automation of the plurality of SoCs. The goggles also include a second switch, where the control logic controls the second switch to enable the USB port either to perform low-power debugging and automation of the plurality of SoCs via the processor or to control each of the plurality of SoCs.
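As a rough, non-authoritative sketch of how such control logic might route the shared USB port (the patent does not provide firmware; the function and switch names below, such as route_usb_port and set_switch, are hypothetical):

```c
/* Hypothetical sketch of USB debug-routing control logic for a
 * multi-SoC eyewear device.  Names and switch positions are
 * illustrative assumptions, not the patent's implementation. */
#include <stdbool.h>

typedef enum { TARGET_SOC_A = 0, TARGET_SOC_B = 1 } usb_target_t;
typedef enum { MODE_NORMAL_USB, MODE_LOW_POWER_DEBUG } usb_mode_t;

/* Placeholder hardware accessors; a real device would write GPIOs or
 * power-management registers here. */
static void set_switch(int sw, int position) { (void)sw; (void)position; }
static void hub_power(bool on)               { (void)on; }

/* Route the single external USB port according to the requested mode. */
void route_usb_port(usb_mode_t mode, usb_target_t target)
{
    if (mode == MODE_LOW_POWER_DEBUG) {
        hub_power(false);       /* hub off: SoCs may stay in low power     */
        set_switch(1, 1);       /* first switch: select UART/SWD debug path */
        set_switch(2, target);  /* second switch: pick which SoC to debug   */
    } else {
        set_switch(1, 0);       /* first switch: normal USB data path       */
        hub_power(true);        /* hub on: port reaches all SoCs at once    */
    }
}
```

In this sketch, powering down the hub in debug mode reflects the idea that the SoCs can remain in their low-power modes while the UART/SWD path stays available.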
The following detailed description includes systems, methods, techniques, sequences of instructions, and computer program products that illustrate examples set forth in this disclosure. Numerous details and examples are included to provide a thorough understanding of the disclosed subject matter and its related teachings. However, one skilled in the relevant art will understand how to apply the relevant teachings without these details. Aspects of the disclosed subject matter are not limited to the specific devices, systems, and methods described, as the related teachings may be applied or practiced in numerous ways. The terms and designations used herein are used for the purpose of describing particular aspects only and are not intended to be limiting. Generally, well-known instruction instances, protocols, structures, and techniques have not necessarily been shown in detail.
The term "system-on-a-chip" or "SoC" is used herein to refer to an integrated circuit (also referred to as a "chip") that integrates components of an electronic system on a single substrate or microchip. These components include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a memory controller, a video decoder, and a system bus interface for connecting to another SoC. As non-limiting examples, components of the SoC may additionally include one or more of the following: an interface to an inertial measurement unit (IMU; e.g., I2C, SPI, I3C, etc.), a video encoder, a transceiver (TX/RX; e.g., wi-Fi, bluetooth technology, or a combination thereof), and digital, analog, mixed signal, and radio frequency signal processing functions.
The term "couple" or "connect" as used herein refers to any logical, optical, physical, or electrical connection, including links, etc., by which electrical or magnetic signals generated or provided by one system element are imparted to another coupled or connected system element. Unless otherwise described, elements or devices coupled or connected are not necessarily directly connected to each other and may be separated by intermediate components, elements or communication media one or more of which may modify, manipulate or carry electrical signals. The term "on … …" means supported directly by an element or indirectly by another element through integration into or support by the element.
The term "proximal" is used to describe an article or portion of an article that is positioned adjacent, or next to an object or person; or closer relative to other portions of the article, which may be described as "distal". For example, the end of the article closest to the object may be referred to as the proximal end, while the generally opposite end may be referred to as the distal end.
The orientations of the goggle device, other mobile devices, associated components, and any other device incorporating a camera, an inertial measurement unit, or both a camera and an inertial measurement unit (such as shown in any of these figures) are given by way of example only for purposes of illustration and discussion. In operation, the eyewear device may be oriented in any other direction suitable for the particular application of the eyewear device; for example, upward, downward, sideways, or any other orientation. Moreover, within the scope of the use herein, any directional terms (such as front, back, inward, outward, toward, left, right, lateral, longitudinal, upward, downward, upper, lower, top, bottom, side, horizontal, vertical, and diagonal) are used by way of example only and are not limited to any camera or inertial measurement unit direction or orientation as configured or otherwise described herein.
Additional objects, advantages, and novel features of the examples will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following figures or may be learned by production or operation of the examples. The objects and advantages of the subject matter may be realized and attained by means of the instrumentalities and combinations particularly pointed out in the appended claims.
Reference will now be made in detail to examples illustrated in the accompanying drawings and discussed below.
Fig. 1A is a side view (right) of an example hardware configuration of the eyewear device 100 that includes a touch-sensitive input device or touch pad 181. As shown, the touch pad 181 may have subtle and not easily visible boundaries; alternatively, the boundary may be clearly visible or include a raised or otherwise tactile edge that provides feedback to the user regarding the position and boundary of the touch pad 181. In other implementations, the eyewear device 100 may include a touch pad on the left side.
The surface of the touch pad 181 is configured to detect finger touches, taps, and gestures (e.g., moving touches) for use with a GUI displayed by the eyewear device on an image display, allowing the user to navigate through and select menu options in an intuitive manner, which enhances and simplifies the user experience.
Detection of finger inputs on the touch pad 181 can enable several functions. For example, touching anywhere on the touch pad 181 may cause the GUI to display or highlight an item on the image display, which may be projected onto at least one of the optical assemblies 180A, 180B. A double tap on the touch pad 181 may select an item or icon. Sliding or swiping a finger in a particular direction (e.g., front-to-back, back-to-front, top-to-bottom, or bottom-to-top) may cause items or icons to slide or scroll in a particular direction; for example, to move to a next item, icon, video, image, page, or slide. Sliding the finger in the other direction may slide or scroll in the opposite direction; for example, to move to a previous item, icon, video, image, page, or slide. The touch pad 181 can be located virtually anywhere on the eyewear device 100.
In one example, when the identified finger gesture is a single tap on the touch pad 181, it initiates selection or pressing of a graphical user interface element in the image presented on the image display of the optical assemblies 180A, 180B. An adjustment to the image presented on the image display of the optical assemblies 180A, 180B based on the identified finger gesture may be a primary action that selects or submits the graphical user interface element on the image display of the optical assemblies 180A, 180B for further display or execution.
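The gesture handling described above can be pictured, purely as an illustrative sketch (the enumerations and handler below are assumptions, not the device's API), as a mapping from recognized gestures to GUI actions:

```c
/* Illustrative mapping of touch pad 181 gestures to GUI actions. */
typedef enum { GESTURE_TAP, GESTURE_DOUBLE_TAP,
               GESTURE_SWIPE_FORWARD, GESTURE_SWIPE_BACKWARD } gesture_t;
typedef enum { ACTION_SELECT_ITEM, ACTION_HIGHLIGHT_ITEM,
               ACTION_NEXT_ITEM, ACTION_PREVIOUS_ITEM } gui_action_t;

gui_action_t action_for_gesture(gesture_t g)
{
    switch (g) {
    case GESTURE_DOUBLE_TAP:     return ACTION_SELECT_ITEM;    /* select item/icon  */
    case GESTURE_SWIPE_FORWARD:  return ACTION_NEXT_ITEM;      /* scroll forward    */
    case GESTURE_SWIPE_BACKWARD: return ACTION_PREVIOUS_ITEM;  /* scroll backward   */
    case GESTURE_TAP:
    default:                     return ACTION_HIGHLIGHT_ITEM; /* highlight item    */
    }
}
```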
As shown in fig. 1A, the eyewear device 100 includes a right visible light camera 114B. As further described herein, the two cameras 114A, 114B capture image information of a scene from two separate viewpoints. The two captured images may be used to project a three-dimensional display onto an image display for viewing on 3D glasses or with 3D glasses.
The eyewear device 100 includes a right optical assembly 180B having an image display for presenting an image, such as a depth image. As shown in fig. 1A and 1B, the eyewear device 100 includes a right visible light camera 114B. The eyewear device 100 may include a plurality of visible light cameras 114A, 114B that form a passive type of three-dimensional camera, such as a stereoscopic camera, with the right visible light camera 114B located on the right temple portion 110B. As shown in fig. 1C-1D, the eyewear device 100 also includes a left visible light camera 114A located on the left temple portion 110A.
The left and right visible light cameras 114A, 114B are sensitive to wavelengths in the visible-light range. Each of the visible light cameras 114A, 114B has a different forward-facing field of view; the fields of view overlap to enable the generation of three-dimensional depth images. The right visible light camera 114B captures a right field of view 111B and the left visible light camera 114A captures a left field of view 111A. Generally, a "field of view" is the portion of a scene that is visible to a camera at a particular position and orientation in space. The fields of view 111A, 111B have an overlapping field of view 304 (fig. 3). When a visible light camera captures an image, objects or object features outside the fields of view 111A, 111B are not recorded in the raw image (e.g., a photograph or picture). The field of view describes the angular range over which the image sensor of a visible light camera 114A, 114B picks up electromagnetic radiation of a given scene in a captured image of that scene. The field of view may be expressed as the angular size of the view cone (i.e., the viewing angle). The viewing angle may be measured horizontally, vertically, or diagonally.
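For reference, the viewing angle mentioned above follows the standard pinhole-camera relation below (a textbook relation offered for illustration; it is not stated in the patent), where d is the sensor dimension measured horizontally, vertically, or diagonally and f is the lens focal length:

```latex
% Standard pinhole-camera relation (background assumption, not taken
% from the patent): viewing angle \theta for sensor dimension d and
% focal length f.
\theta = 2 \arctan\!\left(\frac{d}{2f}\right)
```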
In an example, the visible light cameras 114A, 114B have a field of view with a viewing angle of 15° to 30° (e.g., 24°) and have a resolution of 480 x 480 pixels or greater. In another example, the field of view may be much wider, such as 100° or greater. The "coverage angle" describes the angular range over which the lenses of the visible light cameras 114A, 114B or the infrared camera 410 (see fig. 2A) can effectively image. Typically, a camera lens produces an image circle that is large enough to cover the film or sensor of the camera completely, possibly including some vignetting (e.g., the image darkens toward the edges when compared to the center). If the coverage angle of the camera lens does not fill the sensor, the image circle will be visible, typically with strong vignetting toward the edge, and the effective viewing angle will be limited to the coverage angle.
Examples of such visible light cameras 114A, 114B include a high-resolution complementary metal-oxide-semiconductor (CMOS) image sensor and a digital VGA (video graphics array) camera capable of resolutions of 640p (e.g., 640 x 480 pixels, for a total of 300,000 pixels), 720p, or 1080p. In other examples, the visible light cameras 114A, 114B may capture high-definition (HD) still images and store them at a resolution of 1642 by 1642 pixels (or greater), or record high-definition video at a high frame rate (e.g., thirty to sixty frames per second or greater) and store the recording at a resolution of 1216 by 1216 pixels (or greater).
The eyewear device 100 may capture image sensor data from the visible light cameras 114A, 114B, along with geolocation data, digitized by an image processor for storage in memory. The visible light cameras 114A, 114B capture respective left and right raw images in the two-dimensional spatial domain, each comprising a matrix of pixels on a two-dimensional coordinate system that includes an X-axis for horizontal position and a Y-axis for vertical position. Each pixel includes a color attribute value (e.g., a red pixel light value, a green pixel light value, or a blue pixel light value) and a position attribute (e.g., an X-axis coordinate and a Y-axis coordinate).
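Purely for illustration (the patent does not define a memory layout), a raw-image pixel with the attributes described above might be represented as follows; the struct and field names are assumptions:

```c
#include <stdint.h>

/* Hypothetical raw-image pixel: a color attribute plus a position
 * attribute on the two-dimensional X/Y coordinate system. */
typedef struct {
    uint16_t x;        /* X-axis coordinate (horizontal position) */
    uint16_t y;        /* Y-axis coordinate (vertical position)   */
    uint8_t  r, g, b;  /* red, green, and blue pixel light values */
} RawPixel;
```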
For capturing stereoscopic images for later display as three-dimensional projections, an image processor 412 (shown in fig. 4) may be coupled to the visible light cameras 114A, 114B to receive and store the visual image information. The image processor 412 or another processor controls the operation of the visible light cameras 114A, 114B to act as a stereoscopic camera simulating human binocular vision, and may add a time stamp to each image. The time stamps on each pair of images allow the images to be displayed together as part of a three-dimensional projection. Three-dimensional projection produces an immersive, lifelike experience desirable in a variety of contexts, including virtual reality (VR) and video gaming.
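As a minimal sketch of the time-stamp pairing described above (the helper name and tolerance parameter are assumptions), a left and a right capture could be treated as a stereo pair when their time stamps agree within a small tolerance:

```c
#include <stdbool.h>
#include <stdint.h>

/* Pair a left and a right capture for display as part of a
 * three-dimensional projection if their time stamps are close enough;
 * purely illustrative. */
bool frames_are_paired(uint64_t left_ts_us, uint64_t right_ts_us,
                       uint64_t tolerance_us)
{
    uint64_t delta = left_ts_us > right_ts_us ? left_ts_us - right_ts_us
                                              : right_ts_us - left_ts_us;
    return delta <= tolerance_us;
}
```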
Fig. 1B is a perspective cross-sectional view of the right temple portion 110B of the eyewear device 100 of fig. 1A, depicting the right visible-light camera 114B and a circuit board of the camera system. Fig. 1C is a side view (left) of an example hardware configuration of the eyewear device 100 of fig. 1A, showing a left visible light camera 114A of the camera system. Fig. 1D is a perspective cross-sectional view of the left temple portion 110A of the eyewear device of fig. 1C, depicting the left visible-light camera 114A of the three-dimensional camera and a circuit board. The configuration and arrangement of the left visible light camera 114A is substantially similar to the configuration and arrangement of the right visible light camera 114B, except that the connections and couplings are located on the left side 170A.
As shown in the example of fig. 1B, the eyewear device 100 includes a right visible light camera 114B and a circuit board 140B, which may be a flexible printed circuit board (PCB). A right hinge 126B connects the right temple portion 110B to a right temple 125B of the eyewear device 100. In some examples, the right visible light camera 114B, the flexible PCB 140B, or other components of electrical connectors or contacts can be located on the right temple 125B, the right hinge 126B, the right temple portion 110B, the frame 105, or a combination thereof. These components (or a subset thereof) may be incorporated in the SoC.
As shown in the example of fig. 1D, the eyewear device 100 includes a left visible light camera 114A and a circuit board 140A, which may be a flexible Printed Circuit Board (PCB). The left hinge 126A connects the left temple portion 110A to the left temple 125A of the eyewear device 100. In some examples, the left visible light camera 114A, the flexible PCB 140A, or other components of electrical connectors or contacts can be located on the left temple 125A, the left hinge 126A, the left temple portion 110A, the frame 105, or a combination thereof. These components (or a subset thereof) may be incorporated in the SoC.
The left and right temple portions 110A and 110B include a temple portion body 190 and a temple portion cover; the cover is omitted in the cross-sections of fig. 1B and 1D. Disposed within the left and right temple portions 110A, 110B are various interconnected circuit boards, such as PCBs or flexible PCBs, which include controller circuitry for the respective left and right visible light cameras 114A, 114B, the microphone(s) 130, the speaker 132, low-power wireless circuitry (e.g., for wireless short-range network communication via Bluetooth™), and high-speed wireless circuitry (e.g., for wireless local area network communication via Wi-Fi). The components and circuitry (or a subset thereof) in each temple portion 110 may be incorporated in an SoC.
The right visible light camera 114B is coupled to or disposed on the flexible PCB 140B and is covered by a right visible light camera cover lens that is aimed through opening(s) formed in the frame 105. For example, as shown in fig. 2A, the right rim 107B of the frame 105 is connected to the right temple portion 110B and includes the opening(s) for the visible light camera cover lens. The frame 105 includes a front face configured to face outward and away from the eyes of the user. The opening for the visible light camera cover lens is formed on and through the front or outward-facing side of the frame 105. In an example, the right visible light camera 114B has an outward-facing field of view 111B (shown in fig. 3) having a line of sight or viewing angle associated with the right eye of the user of the eyewear device 100. The visible light camera cover lens may also be adhered to the front or outward-facing surface of the right temple portion 110B, in which the opening is formed with an outward-facing coverage angle, but in a different outward direction. The coupling may also be indirect via intervening components. Although shown as being formed on the circuit board of the right temple portion 110B, the right visible light camera 114B may be formed on the circuit board of the right temple 125B or the frame 105.
The left visible light camera 114A is coupled to the flexible PCB 140A or disposed on the flexible PCB 140A and is covered by a visible light camera cover lens that is aimed through an opening(s) formed in the frame 105. For example, as shown in fig. 2A, the left rim 107A of the frame 105 is connected to the left temple portion 110A and includes opening(s) for a visible light camera cover lens. The frame 105 includes a front face configured to face outwardly and away from the eyes of the user. An opening for a visible light camera cover lens is formed on the front or outward facing side of the frame 105 and passes through the front or outward facing side of the frame 105. In an example, the left visible light camera 114A has an outward facing field of view 111A (shown in fig. 3) having a line of sight or viewing angle associated with the left eye of the user of the eyewear device 100. The visible light camera cover lens may also be adhered to the front or outward facing surface of the left temple portion 110A, wherein the opening is formed with an outward facing coverage angle, but in a different outward direction. The coupling may also be indirect via intervening components. Although shown as being formed on the circuit board of the left temple portion 110A, the left visible light camera 114A may be formed on the circuit board of the left temple portion 125A or the frame 105.
Fig. 2A and 2B are perspective views, from the rear, of example hardware configurations of the eyewear device 100 that include two different types of image displays. The eyewear device 100 is sized and shaped in a form configured to be worn by a user; the form of eyeglasses is shown in the example. The eyewear device 100 may take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet.
In the eyeglass example, the eyewear device 100 includes a frame 105 including a left bezel 107A connected to a right bezel 107B via a bridge 106 that is adapted to be supported by a user's nose. The left bezel 107A and the right bezel 107B include respective apertures 175A, 175B that retain respective optical elements 180A, 180B, such as lenses and display devices. As used herein, the term "lens" is intended to include transparent or translucent pieces of glass or plastic having curved or flat surfaces that cause light to converge/diverge or cause little or no convergence or divergence.
Although shown with two optical elements 180A, 180B, the eyewear device 100 may include other arrangements, such as a single optical element (or it may not include any optical elements 180A, 180B), depending on the application or intended user of the eyewear device 100. As further shown, the eyewear device 100 includes a left temple portion 110A adjacent a left side 170A of the frame 105 and a right temple portion 110B adjacent a right side 170B of the frame 105. The temple portions 110A, 110B can be integrated into the frame 105 on the respective sides 170A, 170B (as shown) or implemented as separate components attached to the frame 105 on the respective sides 170A, 170B. Alternatively, the temple portions 110A, 110B can be integrated into a temple (not shown) that is attached to the frame 105.
In one example, the image display of the optical assemblies 180A, 180B includes an integrated image display 177. As shown in fig. 2A, each optical assembly 180A, 180B includes a suitable display matrix 177, such as a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, or any other such display. Each optical assembly 180A, 180B also includes one or more optical layers 176, which may include lenses, optical coatings, prisms, mirrors, waveguides, optical ribbons, and other optical components in any combination. The optical layers 176A, 176B, … N (shown as 176A-N in fig. 2A and herein) may include prisms of suitable size and configuration and including a first surface for receiving light from the display matrix and a second surface for emitting light to the eyes of the user. The prisms of the optical layers 176A-N extend over all or at least a portion of the respective apertures 175A, 175B formed in the left and right rims 107A, 107B to allow a user to see the second surface of the prisms when the user's eyes are looking through the corresponding left and right rims 107A, 107B. The first surfaces of the prisms of optical layers 176A-N face upward from the frame 105 and the display matrix 177 covers the prisms such that photons and light emitted by the display matrix 177 strike the first surfaces. The prisms are sized and shaped such that light is refracted within the prisms and directed toward the user's eyes by the second surfaces of the prisms of optical layers 176A-N. In this aspect, the second surface of the prisms of optical layers 176A-N may be convex to direct light toward the center of the eye. The prism may optionally be sized and shaped to magnify the image projected by the display matrix 177, and light passes through the prism such that the image viewed from the second surface is larger in one or more dimensions than the image emitted from the display matrix 177.
In one example, the optical layers 176A-N may include transparent (keep the lenses on) LCD layers unless and until a voltage is applied that makes the layers opaque (turns off or blocks the lenses). The image processor 412 on the eyewear device 100 may perform programming to apply voltages to the LCD layers in order to create an active shutter system to make the eyewear device 100 suitable for viewing visual content when displayed as a three-dimensional projection. Techniques other than LCD may be used for active shutter mode, including other types of reactive layers that respond to voltage or another type of input.
In another example, the image display device of the optical assemblies 180A, 180B includes a projection image display as shown in fig. 2B. Each optical assembly 180A, 180B includes a laser projector 150, which is a three-color laser projector using a scanning mirror or galvanometer. During operation, a light source such as the laser projector 150 is disposed in or on one of the temples 125A, 125B of the eyewear device 100. The optical assemblies 180A, 180B in this example include one or more optical zones 155A, 155B, … 155N (shown as 155A-N in fig. 2B) that are spaced apart and either span the width of the lens of each optical assembly 180A, 180B or span the depth of the lens between the front and rear surfaces of the lens.
As the photons projected by the laser projector 150 traverse the lens of each optical assembly 180A, 180B, the photons encounter the optical zones 155A-N. When a particular photon encounters a particular optical zone, the photon is either redirected toward the user's eye or passed on to the next optical zone. A combination of modulation of the laser projector 150 and modulation of the optical zones may control particular photons or beams of light. In an example, a processor controls the optical zones 155A-N by initiating mechanical, acoustic, or electromagnetic signals. Although shown with two optical assemblies 180A, 180B, the eyewear device 100 may include other arrangements, such as a single or three optical assemblies, or each optical assembly 180A, 180B may have a different arrangement depending on the application or intended user of the eyewear device 100.
In another example, the eyewear device 100 shown in fig. 2B may include two projectors: a left projector (not shown) and a right projector (shown as projector 150). Left optical assembly 180A may include a left display matrix (not shown) or a set of left optical strips (not shown) configured to interact with light from a left projector. In this example, the eyewear device 100 includes a left display and a right display.
As further shown in fig. 2A and 2B, the eyewear device 100 includes a left temple portion 110A adjacent a left side 170A of the frame 105 and a right temple portion 110B adjacent a right side 170B of the frame 105. The temple portions 110A, 110B can be integrated into the frame 105 on the respective sides 170A, 170B (as shown) or implemented as separate components attached to the frame 105 on the respective sides 170A, 170B. Alternatively, the temple portions 110A, 110B can be integrated into the temple portions 125A, 125B attached to the frame 105.
Referring to fig. 2A, the frame 105, or one or more of the left and right temple portions 110A and 110B, includes an infrared emitter 215 and an infrared camera 220. The infrared emitter 215 and the infrared camera 220 can be connected to the flexible PCB 140B by, for example, soldering. Other arrangements of the infrared emitter 215 and infrared camera 220 can be implemented, including arrangements in which the infrared emitter 215 and infrared camera 220 are both on the right rim 107B or in different locations on the frame 105; for example, with the infrared emitter 215 on the left rim 107A and the infrared camera 220 on the right rim 107B. In another example, the infrared emitter 215 is on the frame 105 and the infrared camera 220 is on one of the temple portions 110A-B, or vice versa. The infrared emitter 215 can be attached essentially anywhere on the frame 105, the left temple portion 110A, or the right temple portion 110B to emit a pattern of infrared light. Similarly, the infrared camera 220 can be attached essentially anywhere on the frame 105, the left temple portion 110A, or the right temple portion 110B to capture at least one reflection variation in the emitted pattern of infrared light.
Infrared emitter 215 and infrared camera 220 are arranged to face inwardly toward the user's eyes with some or all of the field of view of the eyes in order to identify the respective eye position and gaze direction. For example, infrared emitter 215 and infrared camera 220 are positioned directly in front of the eyes, in an upper portion of frame 105, or in temples 110A-B at either end of frame 105.
In an example, the processor 432 utilizes the eye tracker 213 to determine the eye gaze direction 230 of the wearer's eye 234 as shown in fig. 2C, and the eye position 236 of the wearer's eye 234 within the eye box (eyebox) as shown in fig. 2D. In one example, the eye tracker 213 is a scanner that captures an image of the reflected change of infrared light from the eye 234 using infrared light illumination (e.g., near infrared, short wavelength infrared, mid wavelength infrared, long wavelength infrared, or far infrared) to determine a gaze direction 230 of a pupil 232 of the eye 234, and an eye position 236 relative to the display 180D.
Fig. 3 is a diagrammatic depiction of a three-dimensional scene 306, a left raw image 302A captured by the left visible light camera 114A, and a right raw image 302B captured by the right visible light camera 114B. The left field of view 111A may overlap with the right field of view 111B, as shown. The overlapping field of view 304 represents the portion of the image captured by both cameras 114A, 114B. When referring to fields of view, the term "overlapping" means the pixel matrices in the generated raw images overlap by thirty percent (30%) or more. "Substantially overlapping" means the pixel matrices in the generated raw images, or in an infrared image of the scene, overlap by fifty percent (50%) or more. As described herein, the two raw images 302A, 302B may be processed to include a time stamp that allows the images to be displayed together as part of a three-dimensional projection.
For capture of stereoscopic images, as shown in fig. 3, a pair of raw red, green, and blue (RGB) images of the real scene 306-a left raw image 302A captured by the left camera 114A and a right raw image 302B captured by the right camera 114B-are captured at a given moment. When a pair of original images 302A, 302B is processed (e.g., by the image processor 412), a depth image is generated. The generated depth image may be viewed on the optical components 180A, 180B of the eyewear device, on another display (e.g., the image display 580 on the mobile device 401), or on a screen.
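Although the patent does not spell out how depth is computed from the pair, stereo systems commonly use the standard triangulation relation below (stated here as background, not as the patent's method), where f is the focal length, B is the baseline between the cameras (related here to the interocular distance), and d is the disparity between matching left and right pixels:

```latex
% Standard stereo triangulation (background assumption, not the
% patent's stated method): depth Z from focal length f, baseline B,
% and disparity d between matching left/right pixels.
Z = \frac{f \, B}{d}
```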
The generated depth image is in the three-dimensional spatial domain and may include a vertex matrix on a three-dimensional position coordinate system that includes an X-axis for a horizontal position (e.g., length), a Y-axis for a vertical position (e.g., height), and a Z-axis for depth (e.g., distance). Each vertex may include a color attribute (e.g., a red pixel light value, a green pixel light value, or a blue pixel light value); location attributes (e.g., X-location coordinates, Y-location coordinates, and Z-location coordinates); texture attributes; reflectivity properties; or a combination thereof. Texture attributes quantify the perceived texture of a depth image, such as the spatial arrangement of colors or intensities in the vertex region of the depth image.
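As an illustrative sketch only (the patent does not prescribe a data layout), a depth-image vertex carrying the attributes listed above might look like this; the struct and field names are assumptions:

```c
#include <stdint.h>

/* Hypothetical depth-image vertex illustrating the attributes the
 * description lists; not a layout prescribed by the patent. */
typedef struct {
    float   x, y, z;      /* position: horizontal, vertical, depth  */
    uint8_t r, g, b;      /* color attribute                        */
    float   texture;      /* perceived-texture attribute            */
    float   reflectance;  /* reflectivity attribute                 */
} DepthVertex;
```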
In one example, the eyewear system 400 (fig. 4) includes the eyewear device 100, which includes the frame 105, a left temple 125A extending from a left side 170A of the frame 105, and a right temple 125B extending from a right side 170B of the frame 105. The eyewear device 100 may further include at least two visible light cameras 114A, 114B having overlapping fields of view. In one example, the eyewear device 100 includes the left visible light camera 114A having a left field of view 111A, as shown in fig. 3. The left camera 114A is connected to the frame 105, the left temple 125A, or the left temple portion 110A to capture a left raw image 302A from the left side of the scene 306. The eyewear device 100 also includes the right visible light camera 114B having a right field of view 111B. The right camera 114B is connected to the frame 105, the right temple 125B, or the right temple portion 110B to capture a right raw image 302B from the right side of the scene 306.
Fig. 4 is a functional block diagram of an example eyewear system 400 that includes a wearable device (e.g., the eyewear device 100), a mobile device 401, and a server system 498 connected via various networks 495, such as the Internet. The eyewear system 400 includes a low-power wireless connection 425 and a high-speed wireless connection 437 between the eyewear device 100 and the mobile device 401.
As shown in fig. 4, the eyewear device 100 includes one or more visible light cameras 114A, 114B that capture still images, video images, or both still and video images, as described herein. The cameras 114A, 114B may have direct memory access (DMA) to the high-speed circuitry 430 and function as a stereo camera. The cameras 114A, 114B may be used to capture initial depth images that may be rendered into three-dimensional (3D) models that are texture-mapped images of a red, green, and blue (RGB) imaged scene.
The eyewear device 100 also includes two optical assemblies 180A, 180B (one associated with the left side 170A and one associated with the right side 170B). The eyewear device 100 also includes an image display driver 442, an image processor 412, low power circuitry 420, and high speed circuitry 430 (all of which may be duplicated and incorporated into a pair of socs). The image display 177 of each optical assembly 180A, 180B is for presenting images including still images, video images, or both still and video images. An image display driver 442 is coupled to the image display of each optical assembly 180A, 180B to control the display of images.
The eyewear device 100 additionally includes one or more microphones 130 and speakers 132 (e.g., one associated with the left side of the eyewear device and the other associated with the right side of the eyewear device). The microphone 130 and speaker 132 may be incorporated into the frame 105, temple 125, or temple portion 110 of the eyewear device 100. One or more speakers 132 are driven by an audio processor 443 (which may be duplicated and incorporated into a pair of socs) under the control of low power circuitry 420, high speed circuitry 430, or both. Speaker 132 is used to present audio signals including, for example, beat tracks. An audio processor 443 is coupled to the speaker 132 to control the presentation of sound.
The components for the eyewear device 100 shown in fig. 4 are located on one or more circuit boards, such as a Printed Circuit Board (PCB) or a Flexible Printed Circuit (FPC), which are located in a bezel or temple. Alternatively or additionally, the depicted components may be located in the temple portion, frame, hinge, or bridge of the eyewear device 100. The left and right visible light cameras 114A, 114B may include digital camera elements such as Complementary Metal Oxide Semiconductor (CMOS) image sensors, charge-coupled devices, lenses, or any other corresponding visible or light capturing element that may be used to capture data, including still images or video of a scene with unknown objects.
As shown in fig. 4, the high-speed circuit 430 includes a high-speed processor 432, a memory 434, and a high-speed wireless circuit 436. In this example, an image display driver 442 is coupled to the high speed circuit 430 and operated by the high speed processor 432 to drive the left and right image displays of each optical assembly 180A, 180B. The high-speed processor 432 may be any processor capable of managing the high-speed communication and operation of any general purpose computing system required by the eyewear device 100. The high speed processor 432 includes processing resources required to manage high speed data transmission over a high speed wireless connection 437 to a Wireless Local Area Network (WLAN) using a high speed wireless circuit 436.
In some examples, the high speed processor 432 executes an operating system, such as a LINUX operating system or other such operating system of the eyewear device 100, and the operating system is stored in the memory 434 for execution. The high speed processor 432 executes, among other things, a software architecture for the eyewear device 100 for managing data transmissions utilizing the high speed wireless circuit 436. In some examples, the high-speed wireless circuit 436 is configured to implement an Institute of Electrical and Electronics Engineers (IEEE) 802.11 communication standard, also referred to herein as Wi-Fi. In other examples, other high-speed communication standards may be implemented by the high-speed wireless circuit 436.
The low-power circuitry 420 includes a low-power processor 422 and low-power wireless circuitry 424. The low-power wireless circuitry 424 and the high-speed wireless circuitry 436 of the eyewear device 100 may include a short-range transceiver (Bluetooth™ or Bluetooth Low Energy (BLE)) and a wireless wide, local, or wide area network transceiver (e.g., cellular or Wi-Fi). The mobile device 401, including the transceivers that communicate via the low-power wireless connection 425 and the high-speed wireless connection 437, may be implemented using details of the architecture of the eyewear device 100, as may other elements of the network 495.
Memory 434 includes any storage device capable of storing different data and applications including, among other things, camera data generated by left and right visible light cameras 114A, 114B, infrared camera(s) 220, image processor 412, and images generated by image display driver 442 for display 177 on the image display of each optical assembly 180A, 180B. Although the memory 434 is shown as being integrated with the high-speed circuit 430, in other examples, the memory 434 may be a separate, stand-alone element of the eyewear device 100. In some such examples, the electrical routing lines may provide a connection from the image processor 412 or the low power processor 422 to the memory 434 through a chip that includes the high speed processor 432. In other examples, high-speed processor 432 may manage addressing of memory 434 such that low-power processor 422 will boot high-speed processor 432 at any time that a read or write operation involving memory 434 is required.
As shown in fig. 4, the high speed processor 432 of the eyewear device 100 may be coupled to a camera system (visible light cameras 114A, 114B), an image display driver 442, a user input device 491, and a memory 434. As shown in fig. 5, the CPU 530 of the mobile device 401 may be coupled to the camera system 570, the inertial measurement system 572, the mobile display driver 582, the user input layer 591, and the memory 540A.
The server system 498 may be one or more computing devices that are part of a service or network computing system, for example, including a processor, memory, and a network communication interface for communicating with one or more eyewear devices 100 and mobile devices 401 over a network 495.
The output components of the eyewear device 100 include visual elements such as left and right image displays (e.g., displays such as Liquid Crystal Displays (LCDs), plasma Display Panels (PDPs), light Emitting Diode (LED) displays, projectors, or waveguides) associated with each lens or optical assembly 180A, 180B as described in fig. 2A and 2B. The eyewear device 100 may include a user-facing indicator (e.g., an LED, speaker, or vibration actuator), or an outward-facing signal (e.g., an LED, speaker). The image display of each optical assembly 180A, 180B is driven by an image display driver 442. In some example configurations, the output component of the eyewear device 100 also includes additional indicators, such as audible elements (e.g., speakers), tactile components (e.g., actuators such as vibration motors, etc. to generate tactile feedback), and other signal generators. For example, the apparatus 100 may include a set of indicators facing the user and a set of signals facing outward. A set of user-oriented indicators are configured to be seen or otherwise sensed by a user of the device 100. For example, the device 100 may include an LED display positioned so that it is viewable by a user, one or more speakers positioned to produce sound audible to the user, or an actuator providing tactile feedback that is perceptible to the user. An outward facing set of signals is configured to be seen or otherwise sensed by an observer in the vicinity of the device 100. Similarly, the device 100 may include LEDs, speakers, or actuators configured and positioned to be sensed by a viewer.
The input components of the eyewear device 100 may include input components (e.g., a touch screen or touch pad 181 configured to receive alphanumeric input, an optoelectronic keyboard or other alphanumeric configured element), pointer-based input components (e.g., a mouse, touch pad, trackball, joystick, motion sensor or other pointing instrument), tactile input components (e.g., a button switch, a touch screen or touch pad that senses touch or touch gesture position, force, or position and force, or other tactile configured element), and audio input components (e.g., a microphone), etc. Mobile device 401 and server system 498 may include alphanumeric, pointer-based, tactile, audio, and other input means.
In some examples, the eyewear device 100 includes a set of motion-sensing components referred to as an inertial measurement unit 472 (which may be duplicated and incorporated into a pair of SoCs). The motion-sensing components may be microelectromechanical systems (MEMS) with microscopic moving parts, often small enough to be part of a microchip. In some example configurations, the inertial measurement unit (IMU) 472 includes an accelerometer, a gyroscope, and a magnetometer. The accelerometer senses the linear acceleration of the device 100 (including acceleration due to gravity) relative to three orthogonal axes (x, y, z). The gyroscope senses the angular velocity of the device 100 about three axes of rotation (pitch, roll, yaw). Together, the accelerometer and gyroscope may provide position, orientation, and motion data about the device relative to six axes (x, y, z, pitch, roll, yaw). The magnetometer, if present, senses the heading of the device 100 relative to magnetic north. The location of the device 100 may be determined by location sensors, such as a GPS unit 473, one or more transceivers for generating relative location coordinates, an altitude sensor or barometer, and other orientation sensors (which may be duplicated and incorporated into a pair of SoCs). Positioning system coordinates may also be received from the mobile device 401 via the low-power wireless circuitry 424 or the high-speed wireless circuitry 436 over the wireless connections 425, 437.
The IMU 472 may include or cooperate with a digital motion processor or program that collects raw data from the components and calculates a number of useful values for the position, orientation, and motion of the device 100. For example, acceleration data collected from the accelerometers may be integrated to obtain velocity with respect to each axis (x, y, z); and again integrated to obtain the position of the device 100 (in linear coordinates x, y, z). Angular velocity data from the gyroscope may be integrated to obtain the position of the device 100 (in spherical coordinates). The programming for calculating these useful values may be stored in the memory 434 and executed by the high speed processor 432 of the eyewear device 100.
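A minimal sketch of the double integration described above (illustrative only; a real IMU pipeline would also remove gravity, filter noise, and fuse gyroscope and magnetometer data; the names are hypothetical):

```c
/* Illustrative dead-reckoning step for one axis. */
typedef struct { float pos, vel; } AxisState;

void integrate_axis(AxisState *s, float accel, float dt)
{
    s->vel += accel * dt;   /* first integration: acceleration -> velocity */
    s->pos += s->vel * dt;  /* second integration: velocity -> position    */
}
```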
The eyewear device 100 may optionally include additional peripheral sensors, such as biometric sensors, dedicated sensors, or display elements integrated with the eyewear device 100. For example, the peripheral elements may include any I/O components, including output components, motion components, position components, or any other such elements described herein. For example, the biometric sensor may include components for detecting an expression (e.g., a hand expression, a facial expression, a sound expression, a body posture, or eye tracking), for measuring a biometric signal (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), or for identifying a person (e.g., identification based on voice, retina, facial characteristics, fingerprint, or an electrical biometric signal such as electroencephalogram data), and the like.
The mobile device 401 may be a smart phone, tablet, notebook, access point, or any other such device capable of connecting with the eyewear device 100 using both a low-power wireless connection 425 and a high-speed wireless connection 437. Mobile device 401 connects to server system 498 and network 495. Network 495 may include any combination of wired and wireless connections.
As shown in fig. 4, the eyewear system 400 includes a computing device, such as a mobile device 401, coupled to the eyewear device 100 via a network 495. The eyewear system 400 includes a memory for storing instructions and a processor for executing the instructions. Execution of the instructions of the eyewear system 400 by the processor 432 configures the eyewear device 100 for cooperation with the mobile device 401 and also with another eyewear device 100 over the network 495. The eyewear system 400 may utilize the memory 434 of the eyewear device 100 or the memory elements 540A, 540B, 540C of the mobile device 401 (fig. 5).
As described herein, any of the functions described herein with respect to the eyewear device 100, the mobile device 401, and the server system 498 may be embodied in one or more computer software applications or sets of programming instructions. According to some examples, a "function," "functions," "application," "applications," "instruction," "instructions," or "programming" are program(s) that execute functions defined in the program. Various programming languages may be employed to develop one or more of the applications, structured in a variety of ways, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language). In a specific example, a third-party application (e.g., an application developed using an ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be included in a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system. In this example, the third-party application can invoke API calls provided by the operating system to facilitate the functionality described herein.
Thus, a machine-readable medium may take many forms of tangible storage media. Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer device or the like, such as may be used to implement the client devices, media gateways, transcoders, and the like shown in the drawings. Volatile storage media include dynamic memory, such as the main memory of such a computer platform. Tangible transmission media include coaxial cables, copper wire, and fiber optics, including the wires that comprise a bus within a computer system. Carrier-wave transmission media may take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media therefore include, for example: a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer can read programming code or data. Many of these forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
Fig. 5 is a high-level functional block diagram of an example mobile device 401. Mobile device 401 includes flash memory 540A that stores programming to be executed by CPU530 to perform all or a subset of the functions described herein.
The mobile device 401 may include a camera 570 comprising at least two visible light cameras (first and second visible light cameras having overlapping fields of view) or at least one visible light camera and a depth sensor having substantially overlapping fields of view. Flash memory 540A may also include a plurality of images or videos generated via camera 570.
As shown, the mobile device 401 includes an image display 580, a mobile display driver 582 for driving the image display 580, and a display controller 584 for controlling the image display 580. In the example of fig. 5, image display 580 includes a user input layer 591 (e.g., a touch screen), with user input layer 591 layered on top of or otherwise integrated into the screen used by image display 580.
Examples of touch screen type mobile devices that may be used include, but are not limited to, smart phones, personal Digital Assistants (PDAs), tablet computers, notebook computers, or other portable devices. However, the structure and operation of the touch screen type device is provided by way of example; the subject technology as described herein is not intended to be limited thereto. For discussion purposes, fig. 5 thus provides a block diagram illustration of an exemplary mobile device 401 having a user interface that includes a touch screen input layer 591 for receiving input (by touch, multi-touch, or gesture, etc., by hand, stylus, or other tool) and an image display 580 for displaying content.
As shown in fig. 5, the mobile device 401 includes at least one digital transceiver (XCVR) 510, shown as a WWAN XCVR, for digital wireless communications via a wide-area wireless mobile communication network. The mobile device 401 also includes additional digital or analog transceivers, such as short-range transceivers (XCVRs) 520 for short-range network communication via NFC, VLC, DECT, ZigBee, Bluetooth™, or Wi-Fi. For example, the short-range XCVRs 520 may take the form of any available two-way wireless local area network (WLAN) transceiver of a type compatible with one or more standard protocols of communication implemented in wireless local area networks, such as one of the Wi-Fi standards under IEEE 802.11.
To generate location coordinates for locating mobile device 401, mobile device 401 may include a Global Positioning System (GPS) receiver. Alternatively or additionally, the mobile device 401 may utilize one or both of the short-range XCVR 520 and the WWAN XCVR 510 to generate position coordinates for positioning. For example, a cellular network, Wi-Fi, or Bluetooth™ based positioning system may generate very accurate location coordinates, especially when used in combination. Such location coordinates may be transmitted to the eyewear device via the XCVR 510, 520 over one or more network connections.
The transceivers 510, 520 (i.e., network communication interfaces) conform to one or more of the different digital wireless communication standards used by modern mobile networks. Examples of WWAN transceivers 510 include, but are not limited to, transceivers configured to operate in accordance with Code Division Multiple Access (CDMA) and 3rd Generation Partnership Project (3GPP) network technologies including, for example, but not limited to, 3GPP type 2 (or 3GPP2) and LTE, sometimes referred to as "4G". For example, transceivers 510, 520 provide two-way wireless communication of information including digitized audio signals, still image and video signals, web page information for display and network-related inputs, and different types of mobile messaging to/from mobile device 401.
Mobile device 401 also includes a microprocessor that functions as a Central Processing Unit (CPU) 530. A processor is a circuit having elements constructed and arranged to perform one or more processing functions, typically different data processing functions. Although discrete logic components may be used, examples utilize components that form a programmable CPU. A microprocessor, for example, includes one or more Integrated Circuit (IC) chips that incorporate electronic components to perform the functions of a CPU. For example, the CPU 530 may be based on any known or available microprocessor architecture, such as Reduced Instruction Set Computing (RISC) using the ARM architecture, as is commonly used today in mobile devices and other portable electronic devices. Of course, other arrangements of processor circuits may be used to form the CPU 530 or processor hardware in smartphones, notebook computers, and tablet computers.
The CPU 530 serves as a programmable host controller for the mobile device 401 by configuring the mobile device 401 to perform various operations, e.g., in accordance with instructions or programming executable by the CPU 530. For example, such operations may include various general operations of the mobile device, as well as operations related to programming of applications on the mobile device. While the processor may be configured using hardwired logic, a typical processor in a mobile device is a general processing circuit configured by execution of programming.
Mobile device 401 includes a memory or storage system for storing programming and data. In this example, the memory system may include flash memory 540A, Random Access Memory (RAM) 540B, and other memory components 540C as desired. RAM 540B serves as short-term storage of instructions and data processed by CPU 530, for example, as working data processing memory. Flash memory 540A typically provides long-term storage.
Thus, in the example of mobile device 401, flash memory 540A is used to store programs or instructions for execution by CPU 530. Depending on the type of device, mobile device 401 stores and runs a mobile operating system through which particular applications are executed. Examples of mobile operating systems include Google Android, Apple iOS (for iPhone or iPad devices), Windows Mobile, Amazon Fire OS, RIM BlackBerry OS, and the like.
The processor 432 within the eyewear device 100 may construct a map of the environment surrounding the eyewear device 100, determine the location of the eyewear device within the mapped environment, and determine the relative locations of the eyewear device and one or more objects in the mapped environment. Processor 432 may construct the map and determine location and position information using a simultaneous localization and mapping (SLAM) algorithm applied to data received from one or more sensors. In the context of augmented reality, SLAM algorithms are used to construct and update a map of an environment while tracking and updating the location of a device (or user) within the mapped environment. Various statistical methods (such as particle filters, Kalman filters, extended Kalman filters, and covariance intersection) may be used to approximate the mathematical solution.
The sensor data includes images received from one or both of the cameras 114A, 114B, distances received from a laser rangefinder, location information received from the GPS unit 473, a combination of two or more of such sensor data, or data from other sensors useful in determining location information.
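By way of a non-limiting illustration, the following minimal sketch shows how a simple one-dimensional Kalman filter of the kind mentioned above might fuse a predicted position with a noisy range measurement; the function name, noise values, and sample readings are assumptions for illustration only and are not taken from this disclosure.

```python
# Minimal 1-D Kalman filter sketch (illustrative only; noise values are assumed).
def kalman_update(x_est, p_est, z_meas, q_process=0.01, r_meas=0.25):
    """Fuse a predicted position x_est (variance p_est) with a measurement z_meas."""
    # Predict: position assumed constant, uncertainty grows by the process noise.
    p_pred = p_est + q_process
    # Update: the Kalman gain weights the measurement by relative uncertainty.
    k_gain = p_pred / (p_pred + r_meas)
    x_new = x_est + k_gain * (z_meas - x_est)
    p_new = (1.0 - k_gain) * p_pred
    return x_new, p_new

# Example: fuse successive rangefinder readings into a position estimate.
x, p = 0.0, 1.0
for z in [0.9, 1.1, 1.0, 0.95]:
    x, p = kalman_update(x, p, z)
print(round(x, 3))  # converges toward ~1.0 as measurements accumulate
```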
Fig. 6 is a partial block diagram of the eyewear device 100 incorporating a first SoC 602A and a second SoC 602B, according to an example. The first SoC 602A is located within the left temple portion 110A along with a memory 604A (e.g., flash), a battery 606A, an IMU, a camera 114A, and a display component 608A. The second SoC 602B is located within the right temple portion 110B along with a memory 604B (e.g., flash), a battery 606B, an IMU, a camera 114B, and a display component 608B. The first SoC 602A is coupled to the second SoC 602B for communication therebetween.
Although shown in the left temple portion 110A, one or more of the first SoC 602A, memory 604A, battery 606A, and display component 608A can be positioned in the frame 105 adjacent the left temple portion 110A (i.e., on the left side 170A) or in the temple 125A. In addition, although shown in the right temple portion 110B, one or more of the second SoC 602B, memory 604B, battery 606B, and display component 608B can be positioned in the frame 105 adjacent the right temple portion 110B (i.e., on the right side 170B) or in the temple 125B. Further, although two memories 604A, 604B, batteries 606A, 606B, and display components 608A, 608B are illustrated, fewer or more memories, batteries, and display components may be incorporated. For example, a single battery 606 may power both SoCs 602A, 602B, and the SoCs 602A, 602B may access three or more memories 604 for performing different operations.
In one example, the two SoCs 602 contain identical or substantially similar components and component layouts. Thus, their total processing resources are equivalent. According to this example, the first SoC 602A is at least substantially identical to the second SoC (i.e., they are identical or have 95% or greater overlap of components or processing resources). By using dual SoCs 602 (one positioned on one side of the eyewear device 100 and the other positioned on the other side of the eyewear device), cooling is effectively distributed throughout the eyewear device 100, with one side of the eyewear device providing passive cooling for one SoC 602 and the other side of the eyewear device providing passive cooling for the other SoC 602.
In one example, the eyewear device 100 has a thermal passive cooling capacity of about 3 watts per temple. The display 608 (e.g., a projection LED display) on each side utilizes approximately 1-2 watts. Each SoC 602 is designed to operate at less than about 1.5 watts (e.g., 800-1000 mW; unlike the approximately 5 watts of SoCs commonly used in mobile phones), which enables proper cooling of the electronics on each side of the eyewear device 100 with passive cooling through the frame 105, temple portion 110A, temple 125A, or a combination thereof. By combining two SoCs 602 (positioned on opposite sides of the eyewear device 100 to take advantage of the unique passive cooling capacity exhibited by the eyewear device 100), it is possible to achieve a computational capacity that meets or exceeds that available in conventional mobile devices (SoCs that operate with a 5 watt power loss).
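As a rough check on the figures stated above (about 3 watts of passive cooling per temple, about 1-2 watts per display, and roughly 800-1000 mW per SoC), the per-side thermal budget can be tallied as in the following sketch; the specific values chosen are illustrative assumptions.

```python
# Illustrative per-temple thermal budget using the figures stated above (assumed values).
cooling_capacity_w = 3.0      # approximate passive cooling capacity per temple
display_power_w = 1.5         # projection display, ~1-2 W (midpoint assumed)
soc_power_w = 1.0             # SoC designed for roughly 800-1000 mW

side_load_w = display_power_w + soc_power_w
print(side_load_w <= cooling_capacity_w)  # True: each side stays within its passive budget

# Combined SoC power across both temples, dissipated passively on opposite sides.
print(2 * soc_power_w)  # 2.0 W total SoC power, versus a single ~5 W mobile SoC
```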
Using the same or similar components and component layouts in each SoC enables flexibility in distributing processing workload between the two SoCs 602. In one example, the processing workload is allocated based on neighboring components. According to this example, each SoC may drive a respective camera and display, which may be desirable from an electrical standpoint.
In another example, processing workload is allocated based on functionality. According to this example, one SoC 602 may function as a sensor hub (e.g., performing all computer vision (CV) and machine learning (ML) processing and video encoding), and the other SoC 602 may run application logic, audio and video rendering functions, and communications (e.g., Wi-Fi, 4G/5G, etc.). From a privacy perspective, it may be desirable to distribute processing workloads based on functionality. For example, processing sensor information with one SoC and Wi-Fi with another SoC may prevent private data (such as camera images) from leaving the eyewear device unnoticed, by not allowing such sensor information to be sent from the SoC that is performing the sensor processing to the SoC that manages the communications. In another example, as described in further detail below, processing workload may be transferred between SoCs based on load (e.g., as determined by SoC temperature or instructions per second).
Fig. 7 is a flow chart 700 for implementing dual SoCs in an eyewear device. Although these steps are described with reference to eyewear device 100, one skilled in the art will understand from the description herein that other suitable eyewear devices may implement one or more of the steps of flowchart 700. Furthermore, it is contemplated that one or more of the steps shown in fig. 7 and described herein may be omitted, performed concurrently or sequentially, performed in a different order than shown and described, or performed in conjunction with additional steps.
Fig. 7 is a flowchart 700 of example steps for performing operations on the eyewear using a first system-on-chip and a second system-on-chip. At block 702, a first SoC (e.g., SoC 602A) performs a first set of operations. This includes operating the OS, the first color camera 114A, the second color camera 114B, the first display 608A, and the second display 608B.
At block 704, a second SoC (e.g., SoC 602B) performs a second set of operations. This includes running the CV algorithms, visual-inertial odometry (VIO), tracking the user's gestures, and providing depth from stereo.
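For illustration, depth from stereo as mentioned above can be derived from the disparity between the two camera images; the sketch below applies the standard relation depth = focal length × baseline / disparity, with assumed camera parameters that are not taken from this disclosure.

```python
# Illustrative depth-from-stereo calculation (parameter values are assumed).
def depth_from_disparity(disparity_px, focal_length_px=600.0, baseline_m=0.064):
    """Depth = f * B / d for a rectified stereo pair; baseline ~ interocular distance."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity corresponds to a point at infinity
    return focal_length_px * baseline_m / disparity_px

print(round(depth_from_disparity(32.0), 3))  # ~1.2 m for the assumed parameters
```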
At block 706, the eyewear device 100 monitors the temperatures of the first and second SoCs. In one example, each SoC includes an integrated thermistor for measuring temperature. Each SoC may monitor its own temperature via its respective integrated thermistor, and may monitor the temperature of the other SoC by periodically requesting a temperature reading from that SoC.
At blocks 708 and 710, the eyewear device 100 connects the CV cameras to selected SoCs and transfers processing workload between the first set of operations and the second set of operations performed on the respective SoCs to balance the temperatures (which effectively distributes the processing workload). In examples including a master SoC and a replicated SoC, the master SoC manages the allocation of workloads to itself and to the replicated SoC to maintain a relatively uniform distribution among the SoCs. In one example, when one of the SoCs has a temperature that is 10% higher than the temperature of the other SoC, the master SoC redistributes processing workload from the SoC with the higher temperature to the SoC with the lower temperature until the temperature difference is less than 5%. Processing instructions executed by each of the SoCs may be assigned an allocatability value from 1 to 10, where 1 is never allocatable and 10 is always allocatable. When transferring processing workload, the master SoC initially transfers instructions with an allocatability value of 10, then instructions with an allocatability value of 9, then 8, and so on. The steps of blocks 706-710 are repeated continuously to maintain a uniform heat distribution.
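A minimal sketch of the rebalancing policy described above (a 10% temperature-difference trigger, a 5% stopping condition, and transfer of the most allocatable instructions first) might look like the following; the data structures, the crude heat model, and the function name are assumptions for illustration, not the actual firmware.

```python
# Illustrative sketch of temperature-driven workload transfer between two SoCs.
def rebalance(tasks_hot, tasks_cool, temp_hot, temp_cool):
    """Move the most transferable tasks (allocatability 10 down to 2) from the
    hotter SoC to the cooler SoC while the temperature gap exceeds 10%."""
    if temp_hot <= temp_cool * 1.10:
        return tasks_hot, tasks_cool            # within tolerance; nothing to do
    # Transfer in order of decreasing allocatability (value 1 is never moved).
    for task in sorted(tasks_hot, key=lambda t: t["alloc"], reverse=True):
        if task["alloc"] <= 1:
            break
        tasks_hot.remove(task)
        tasks_cool.append(task)
        temp_hot -= task["est_heat"]            # crude model: heat follows the task
        temp_cool += task["est_heat"]
        if temp_hot <= temp_cool * 1.05:        # stop once within 5%
            break
    return tasks_hot, tasks_cool

hot = [{"name": "cv", "alloc": 8, "est_heat": 2.0},
       {"name": "vio", "alloc": 3, "est_heat": 1.0}]
cool = []
print(rebalance(hot, cool, temp_hot=50.0, temp_cool=42.0))
```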
Debug access provides access to low-level interfaces that are not typically required by end users. When the eyewear device 100 is worn by an end user, only wireless protocols (such as Bluetooth™ and Wi-Fi) are required to interface with other devices; wired interfaces such as USB, Universal Asynchronous Receiver Transmitter (UART), and Serial Wire Debug (SWD) are disconnected or disabled.
During development and manufacturing, engineering and factory teams need debug access to the eyewear device 100. Debug access provides the ability to reflash the device with a development or debug build of firmware, the ability to recover (unbrick) a device that no longer boots, the ability to monitor serial logs during early boot to debug problems, and the ability to run reliable automated tests on form factor devices.
On a single SoC eyewear device, most debug access is provided through a single USB data connection. In the case of the eyewear device 100 having a plurality of SoCs 602, debug access becomes challenging. A safe and reliable method of accessing all SoC debug interfaces requires circuitry that is protected from electrostatic discharge (ESD), overvoltage, and water ingress; is invisible to the end user; is USB compatible; can be disabled at the end of the factory line so that the end user does not have debug access; and functions even when all SoC software is in an unknown state.
Adding a separate USB port for each SoC 602 negatively impacts product design because it requires an additional USB port that end users never use, and it reduces reliability by adding additional entry points for moisture, ESD, and overvoltage. Using a USB switch to select a SoC allows access to only one SoC at a time, and the switch requires a hardware mechanism for selecting which SoC to address. Using additional USB-C data paths requires engineers and the factory to use custom hardware to access the debug circuitry, and the additional paths require additional overvoltage and ESD protection. An always-on USB hub increases power consumption, which affects thermal performance.
According to the present disclosure, as shown in FIG. 8, a network of analog switches SW1 and SW2 is controlled to determine where a single USB-C port 802 having data lines 804 is connected. By default, the USB-C port 802 is connected to the main application SoC 602A via switches SW1 and SW2, as shown.
A low power debug microcontroller unit (MCU) 806 acts as a USB interface bridge that enables the USB-C port 802 to selectively access low-speed non-USB interfaces 808 of the SoCs, such as UART and SWD, and to communicate with all of the SoCs 602A, 602B, and 602C. If desired, the USB-C port 802 may access more than three SoCs using the debug MCU 806.
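One way to picture the bridging role of the debug MCU 806 is the sketch below, in which a command arriving over USB is dispatched to the UART or SWD interface of a selected SoC; the command format, class name, and interface handlers are assumptions for illustration, not the actual bridge firmware.

```python
# Illustrative sketch of a USB-to-UART/SWD debug bridge (all names are assumed).
class DebugBridge:
    def __init__(self, interfaces):
        # interfaces: {("soc_a", "uart"): callable, ("soc_b", "swd"): callable, ...}
        self.interfaces = interfaces

    def handle_usb_command(self, command):
        """Parse 'target:bus:payload' received over USB and forward the payload."""
        target, bus, payload = command.split(":", 2)
        handler = self.interfaces.get((target, bus))
        if handler is None:
            return b"ERR unknown target/bus"
        return handler(payload.encode())

# Example wiring: loopback handlers stand in for real UART/SWD transports.
bridge = DebugBridge({
    ("soc_a", "uart"): lambda data: b"soc_a uart<<" + data,
    ("soc_b", "swd"):  lambda data: b"soc_b swd<<" + data,
})
print(bridge.handle_usb_command("soc_a:uart:dmesg"))
```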
Debug mode may be enabled when the switch and hub control logic 810 toggles switch SW1 and enables the USB hub 812. This debug mode provides simultaneous access by the USB-C port 802 to the USB, UART, and SWD buses 808 of all SoCs. This allows the USB-C port 802 to provide simultaneous low power debugging and automation for all SoCs 602A, 602B, and 602C. Debug mode may be enabled in software, such as by using a debug enable signal from the SoC (shown as SoC 602A), or by the USB-C controller 814 providing Device Test System (DTS) detection to the switch and hub control logic 810 when a dedicated cable is connected. The DTS detection allows full recovery using the USB-C port 802 even when the SoC software is in a bad state.
The second switch SW2 is also controlled by the switch and hub control logic 810 to selectively connect the USB-C port 802 directly to the debug MCU 806 while disabling the remaining USB hub 812 ports. This allows the USB-C port 802 to provide low power debugging and automation for all SoCs 602A, 602B, and 602C. The USB hub 812 is turned off by the switch and hub control logic 810 to disable the USB hub 812 ports, and all of the SoCs 602A, 602B, and 602C may enter their low power modes without remaining awake through a persistent USB connection. Both the software debug enable signal and the USB-C controller 814 DTS detection may be disabled by a non-volatile memory (NVM) write at the end of the factory line to permanently disable debug mode for the end user.
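The three connection states described above (the default connection to the application SoC, the hub-based debug mode, and the direct MCU connection with the hub powered down) can be summarized by a small state model such as the following sketch; the state fields and signal names are assumptions for illustration only.

```python
# Illustrative state model for the switch and hub control logic (names assumed).
from dataclasses import dataclass

@dataclass
class PortState:
    sw1_to_debug: bool   # SW1 routes USB-C data away from the app SoC to the debug path
    sw2_to_mcu: bool     # SW2 routes USB-C data directly to the debug MCU
    hub_enabled: bool    # USB hub powered; keeps the SoCs' USB links active

def configure(debug_en: bool, dts_detected: bool,
              low_power_debug: bool, nvm_debug_disabled: bool) -> PortState:
    if nvm_debug_disabled or not (debug_en or dts_detected):
        return PortState(False, False, False)   # end-user default: port -> app SoC
    if low_power_debug:
        return PortState(True, True, False)     # port -> debug MCU; SoCs may sleep
    return PortState(True, False, True)         # full debug: hub reaches all SoCs

print(configure(debug_en=True, dts_detected=False,
                low_power_debug=False, nvm_debug_disabled=False))
print(configure(debug_en=False, dts_detected=False,
                low_power_debug=False, nvm_debug_disabled=True))
```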
Referring to fig. 9, a flowchart 900 of example steps for performing a debug operation on the eyewear device 100 having a plurality of SoCs is shown.
At block 902, the switch and hub control logic 810 toggles switch SW1 to enable the USB-C port 802 to enter a debug mode. This debug mode provides simultaneous access by the USB-C port 802 to the USB, UART, and SWD buses 808 of all SoCs. This allows the USB-C port 802 to provide simultaneous low power debugging and automation for all SoCs 602A, 602B, and 602C. Debug mode may be enabled by software, such as by using a debug enable signal from the SoC (shown as SoC 602A), or by the USB-C controller 814 providing DTS detection to the switch and hub control logic 810 when a dedicated cable is connected.
At block 904, the USB-C port 802 accesses the USB, UART, and SWD buses 808 of all SoCs simultaneously.
At block 906, the switch and hub control logic 810 toggles the second switch SW2 so that the USB-C port 802 is directly connected to the debug MCU 806, while the remaining USB hub 812 ports are disabled. This allows low power debugging and automation of the SoCs. The USB hub 812 is off, and all SoCs 602 enter their low power modes without remaining awake through a persistent USB connection.
The use of an interface bridge is discussed above with respect to SoCs in eyewear; however, the bridge may be used between other circuits in other devices if desired.
Nothing that has been stated or illustrated is intended or should be construed as causing any element, step, feature, object, benefit, advantage, or equivalent to be dedicated to the public regardless of whether it is recited in the claims.
It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "includes," "including," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises or comprises a list of elements or steps does not include only those elements or steps, but may include other elements or steps not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "a" or "an" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
Unless otherwise indicated, any and all measurements, values, ratings, locations, magnitudes, and other descriptions set forth in this specification (including in the appended claims) are approximate, not exact. Such amounts are intended to have a reasonable range of values consistent with the functions to which they relate and with the practices of the art to which they pertain. For example, unless explicitly stated otherwise, parameter values and the like may differ from the recited amounts by ±10%.
Furthermore, in the above detailed description, it may be seen that different features are combined together in different examples for the purpose of simplifying the present disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed examples require more features than are expressly recited in each claim. Rather, as the following claims reflect, the claimed subject matter lies in less than all features of any single disclosed example. Thus, the following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separately claimed subject matter.
While the foregoing has described what are considered to be the best mode and other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that they may be applied in numerous applications, only some of which are described herein. It is intended that the appended claims claim any and all modifications and variations as fall within the true scope of the present concepts.
Claims (20)
1. A goggle, comprising:
a frame having a first side and a second side;
a first system-on-a-chip (SoC) adjacent to the first side of the frame, the first SoC coupled to a first electronic component;
a second SoC adjacent to the second side of the frame, the second SoC coupled to a second electronic component;
a Universal Serial Bus (USB) port; and
a USB hub coupled to the USB port, wherein the USB hub is configured to enable the USB port to control each of the first SoC and the second SoC.
2. The goggle of claim 1, further comprising:
a processor coupled to the USB port and each of the first SoC and the second SoC, the processor configured to enable the USB port to perform low power debugging and automation of the first SoC and the second SoC.
3. The goggle of claim 2, wherein the USB hub is disabled when the USB port is configured to perform low power debugging and automation of the first SoC and the second SoC.
4. The goggle of claim 2, wherein the USB port is configured to perform low power debugging and automation using a Universal Asynchronous Receiver Transmitter (UART) or Serial Wire Debug (SWD).
5. The goggle of claim 2, further comprising a first switch and control logic, wherein the control logic is configured to control the first switch and enable the USB port to perform low power debugging and automation of the first SoC and the second SoC via the processor.
6. The goggle of claim 5, further comprising a second switch, wherein the control logic is configured to control the second switch to enable the USB port to perform low power debugging and automation of the first SoC and the second SoC via the processor or to enable the USB port to control each of the first SoC and the second SoC.
7. The goggle of claim 5, wherein the control logic is configured to disable the USB hub such that the first SoC and the second SoC enter a low power mode without remaining awake through a persistent USB connection.
8. The goggle of claim 5, further comprising a USB controller configured to provide Device Test System (DTS) detection to the control logic to allow full recovery using the USB port.
9. The goggle of claim 5, wherein the first SoC is configured to enable the control logic to control the first switch and to enable the USB port to perform low power debugging and automation of the first SoC and the second SoC via the processor.
10. A method of using goggles, the goggles comprising: a frame having a first side and a second side; a first system-on-a-chip (SoC) adjacent to the first side of the frame, the first SoC coupled to a first electronic component; a second SoC adjacent to the second side of the frame, the second SoC coupled to a second electronic component; a Universal Serial Bus (USB) port; and a USB hub coupled to the USB port, the method comprising:
configuring the USB port to control each of the first SoC and the second SoC.
11. The method of claim 10, wherein the goggles further comprise a processor coupled to the USB port and each of the first SoC and the second SoC, the method comprising:
enabling, by the processor, the USB port to perform low power debugging and automation of the first SoC and the second SoC.
12. The method of claim 11, further comprising disabling the USB hub such that the USB port performs low power debugging and automation of the first SoC and the second SoC.
13. The method of claim 11, wherein the USB port performs low power debugging and automation using a Universal Asynchronous Receiver Transmitter (UART) or Serial Wire Debug (SWD).
14. The method of claim 11, wherein the goggles further comprise a first switch and control logic, wherein the control logic controls the first switch and enables the USB port to perform low power debugging and automation of the first SoC and the second SoC via the processor.
15. The method of claim 14, wherein the goggles further comprise a second switch, wherein the control logic controls the second switch to enable the USB port to perform low power debugging and automation of the first SoC and the second SoC via the processor or to enable the USB port to control each of the first SoC and the second SoC.
16. The method of claim 14, wherein the control logic disables the USB hub such that the first SoC and the second SoC enter a low power mode without remaining awake through a persistent USB connection.
17. The method of claim 14, wherein the goggles further comprise a USB controller configured to provide Device Test System (DTS) detection to the control logic to allow full recovery using the USB port.
18. The method of claim 14, wherein the first SoC enables the control logic to control the first switch and enables the USB port to perform low power debugging and automation of the first SoC and the second SoC via the processor.
19. A non-transitory computer readable medium storing program code which, when executed by a processor of goggles having: a frame having a first side and a second side; a first system-on-a-chip (SoC) adjacent to the first side of the frame, the first SoC coupled to a first electronic component; a second SoC adjacent to the second side of the frame, the second SoC coupled to a second electronic component; a Universal Serial Bus (USB) port; and a USB hub coupled to the USB port, is operable to cause a computing device to:
configure the USB port to control each of the first SoC and the second SoC; and
enable the USB port to perform low power debugging and automation of the first SoC and the second SoC.
20. The non-transitory computer-readable medium of claim 19, wherein the program code is further operable to control a first switch and enable the USB port to perform low power debugging and automation of the first SoC and the second SoC, and to control a second switch to enable the USB port to perform low power debugging and automation of the first SoC and the second SoC, or to enable the USB port to control each of the first SoC and the second SoC.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/566,951 US11829312B2 (en) | 2021-12-31 | 2021-12-31 | Debug access of eyewear having multiple socs |
US17/566,951 | 2021-12-31 | ||
PCT/US2022/050141 WO2023129294A1 (en) | 2021-12-31 | 2022-11-16 | Debug access of eyewear having multiple socs |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118556217A true CN118556217A (en) | 2024-08-27 |
Family
ID=84943137
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202280087068.0A Pending CN118556217A (en) | 2021-12-31 | 2022-11-16 | Debug access for goggles with multiple SOCs |
Country Status (5)
Country | Link |
---|---|
US (2) | US11829312B2 (en) |
EP (1) | EP4457580A1 (en) |
KR (1) | KR20240130699A (en) |
CN (1) | CN118556217A (en) |
WO (1) | WO2023129294A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230124748A1 (en) * | 2021-10-14 | 2023-04-20 | Jason Heger | Dual system on a chip eyewear |
US11829312B2 (en) * | 2021-12-31 | 2023-11-28 | Snap Inc. | Debug access of eyewear having multiple socs |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7475182B2 (en) * | 2005-12-06 | 2009-01-06 | International Business Machines Corporation | System-on-a-chip mixed bus architecture |
US7480753B2 (en) * | 2006-04-27 | 2009-01-20 | Standard Microsystems Corporation | Switching upstream and downstream logic between ports in a universal serial bus hub |
US9432288B2 (en) * | 2014-02-28 | 2016-08-30 | Cavium, Inc. | System on chip link layer protocol |
US9753836B2 (en) | 2014-09-12 | 2017-09-05 | Intel Corporation | Low power debug architecture for system-on-chips (SoCs) and systems |
GB2541216B (en) | 2015-08-12 | 2021-03-17 | Ultrasoc Technologies Ltd | Reconfiguring debug circuitry |
EP3701316A4 (en) | 2017-12-20 | 2021-08-04 | Vuzix Corporation | DISPLAY SYSTEM FOR EXTENDED REALITY |
WO2021030403A1 (en) | 2019-08-15 | 2021-02-18 | Snap Inc. | Eyewear tether |
US11855124B2 (en) * | 2019-11-15 | 2023-12-26 | Qualcomm Incorporated | Vertically integrated device stack including system on chip and power management integrated circuit |
US11829312B2 (en) * | 2021-12-31 | 2023-11-28 | Snap Inc. | Debug access of eyewear having multiple socs |
Also Published As
Publication number | Publication date |
---|---|
US20240061798A1 (en) | 2024-02-22 |
EP4457580A1 (en) | 2024-11-06 |
US20230214343A1 (en) | 2023-07-06 |
US11829312B2 (en) | 2023-11-28 |
KR20240130699A (en) | 2024-08-29 |
WO2023129294A1 (en) | 2023-07-06 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |