
US20140078137A1 - Augmented reality system indexed in three dimensions - Google Patents


Info

Publication number
US20140078137A1
Authority
US
United States
Prior art keywords
instruction
dimensional
token
graphic
program
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/026,870
Inventor
Nagabhushanam Peddi
Stephen Phillip Alvey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US14/026,870
Publication of US20140078137A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G09B19/0038 Sports
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras

Definitions

  • the present disclosure relates generally to an augmented or virtual reality system indexed in three dimensions.
  • examples of the present disclosure are related to use of an augmented reality system indexed in three dimensions presenting a virtual instructor.
  • Augmented reality includes methods wherein computerized images are displayed to augment a view or experience in the real world.
  • a computer generated image is presented upon or superimposed over a series of images captured by a camera associated with the view screen.
  • Virtual reality includes images in an entirely separate environment, with all elements upon the display being computer generated. Examples of augmented reality displays include heads-up displays (HUDs), lines projected on a television image of a football field showing a first down mark on the field, and a glow projected on a television image of a hockey puck.
  • Smart-phones, tablet computers, and other similar portable computerized devices utilize camera devices to interact with their environment.
  • a smart-phone can capture an image of a quick response code (QR code) and an instruction can be provided to the phone based upon the image.
  • the phone can be instructed to access a particular webpage over a communications network based upon the information provided by the QR code.
  • a two-dimensional barcode can be used by a portable computerized device to provide an input to the device.
  • a handheld unit in a store can permit a user to enter items onto a gift registry operated by the store based upon a scanned input.
  • a portable computerized device is configured to display a three dimensional instruction graphic.
  • the device includes a camera device capturing an image, wherein the image includes location data related to a token.
  • the portable computerized device is configured to display the three dimensional instruction graphic based upon the location data.
  • FIG. 1 is an illustration of a portable computerized device including a camera feature reading information from a token placed upon the ground, in accordance with the present disclosure
  • FIG. 2 is an illustration of a plurality of positions from which portable computerized devices can be located and view a token, in accordance with the present disclosure
  • FIG. 3 is an illustration of an exemplary yoga mat utilized as a token, with two exemplary portable computerized devices illustrating a virtual yoga instructor with an orientation based upon the location of the portable computerized device with respect to the token, in accordance with the present disclosure
  • FIG. 4 illustrates operation of an exemplary three dimensional model instruction program operating a first aid program, in accordance with the present disclosure
  • FIG. 5 illustrates operation of an exemplary three dimensional model instruction program operating a martial arts program, in accordance with the present disclosure
  • FIG. 6 illustrates an exemplary three dimensional model instruction program illustrating instructions to install a cable to a computer, in accordance with the present disclosure
  • FIG. 7 is a schematic illustrating an exemplary portable computerized device in communication with an exemplary three dimensional model instruction server, in accordance with the present disclosure
  • FIG. 8 is a schematic illustrating an exemplary three dimensional model instruction server, in accordance with the present disclosure.
  • FIG. 9 is a schematic illustrating an exemplary portable computerized device configured to implement processes disclosed herein, in accordance with the present disclosure.
  • Computerized devices and servers available over the Internet operate three dimensional computer models. Such models are known in the art and will not be described in detail herein. Such models can be manipulated to generate a changing display, e.g. changing a perspective upon the three dimensionally modeled object, by an input that is used to determine a point of view for the generated display.
  • a program can be created showing a three dimensional model of a car, and a user can manipulate a point of view by providing an input to a slider graphic.
  • the graphic representation of the motor vehicle can be rotated in a horizontal plane through 360 degrees based upon the user's input to the mouse device.
  • the vehicle can be rotated to view the top or the bottom of the car through a second slider or, for example, by monitoring motion of the mouse device in both X and Y axes of input.
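As a sketch of how slider or mouse input might drive the point of view, the hypothetical helper below maps a yaw angle (horizontal rotation through 360 degrees) and a pitch angle (top/bottom view) onto a point of the modeled object; the function name and signature are illustrative, not taken from the disclosure:

```python
import math

def rotate_point(point, yaw_deg, pitch_deg=0.0):
    """Rotate a 3D point (x, y, z) about the vertical (y) axis by yaw_deg,
    then about the horizontal (x) axis by pitch_deg, mirroring how a slider
    input (or mouse X/Y motion) could set a viewing angle on a 3D model."""
    x, y, z = point
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    # yaw: rotate in the horizontal (x-z) plane
    x, z = x * math.cos(yaw) + z * math.sin(yaw), -x * math.sin(yaw) + z * math.cos(yaw)
    # pitch: rotate in the vertical (y-z) plane
    y, z = y * math.cos(pitch) - z * math.sin(pitch), y * math.sin(pitch) + z * math.cos(pitch)
    return (x, y, z)
```

Applying this rotation to every vertex of the model, with yaw taken from the first slider and pitch from the second, yields the rotated view described above.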
  • Augmented reality and virtual reality programs can monitor a captured image and use information from the captured image to input information.
  • Image recognition software and related programming is known in the art and will not be described in detail herein.
  • a spatial relationship of a known object can be determined based upon how the object appears in a captured image.
  • a known object can be defined as a token for use by an augmented or virtual reality program. For example, if a one dollar bill, a well known pattern that can be programmed for identification within a computerized program, is laid upon a table and a program is utilized to analyze the dimensions of the image of the dollar bill, a determination can be made of a distance and an orientation of the dollar bill to the device capturing the image. This distance and/or orientation of the known object or token in a captured image can be used by an augmented or virtual reality program as an input for manipulating a three dimensional model.
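The distance-and-orientation determination described above can be approximated with a pinhole-camera model: the token's known physical size versus its apparent pixel size gives distance, and the foreshortening of its short axis gives tilt. The helper names, the focal-length value, and the dollar-bill width used below are illustrative assumptions, not values from the disclosure:

```python
import math

def estimate_distance(known_width_m, pixel_width, focal_length_px):
    """Pinhole-camera estimate: a token of known physical width spanning
    pixel_width pixels in the image lies at roughly this distance (metres)."""
    return known_width_m * focal_length_px / pixel_width

def estimate_tilt(apparent_height_px, expected_height_px):
    """Foreshortening of the token's short axis approximates its tilt away
    from the camera: apparent/expected is roughly cos(tilt)."""
    ratio = min(1.0, apparent_height_px / expected_height_px)
    return math.degrees(math.acos(ratio))
```

For example, taking a dollar bill as roughly 0.156 m wide and assuming a focal length of 800 pixels, a bill spanning 80 pixels would be estimated at about 1.56 m from the camera.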
  • Augmented or virtual reality can be used to help a user improve their experience or acquire more detailed information about a topic.
  • a program can be operated on a computerized device to assist a user in learning about a skill or activity.
  • a process for displaying a three dimensional computer generated image of an instructor upon a portable computerized device is disclosed.
  • portable computerized device can refer to a number of computerized devices, such as smart-phones, laptop computers, tablet computers, and eye glasses configured with a processor and capable of displaying graphics within a view of the user.
  • a user may wish to utilize a portable computerized device to watch a demonstration of or instruction on an activity the user wishes to learn. Such a demonstration would be enhanced by the ability of the user to utilize a portable computerized device to view the demonstration or instruction as it would appear in the user's actual environment through an augmented reality system.
  • a three dimensional model of an instructor can be displayed upon a portable computerized device, permitting a user to view the instruction from a number of different points of view.
  • a wide variety of instruction topics can be programmed for display.
  • a yoga instruction program can be operated, whereby a yoga position or a series of yoga exercises can be displayed to the user. The user can change a point of view for the program, enabling the user to see from a number of different angles what movements the instructor is exemplifying.
  • a progression of yoga exercises can be made available, with varying levels of difficulty ranging from novice to expert, such that the program can be used and marketed as a comprehensive training program.
  • a virtual reality program can be provided, whereby the user can manipulate a point of view for the instructor through a slider input displayed upon a screen of a smart-phone device.
  • a yoga mat with a token imprinted thereupon can be provided, and an augmented reality program can utilize an image captured of the yoga mat as a token to manipulate a point of view of the displayed yoga instructor.
  • an image of a token captured by a camera device can be utilized as a first input to determine a point of view for a three dimensional program, and a user input to the device can be utilized as a second input to determine the point of view.
  • a user viewing a token-oriented model from a front facing point of view can tap an option displayed upon the screen of an exemplary tablet device, and the displayed model can be changed to a left facing point of view.
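Combining the two inputs might look like the sketch below: the token in the camera image supplies the base viewing angle, and a tapped on-screen option adds a discrete offset. The preset names and offsets are assumptions for illustration:

```python
# Hypothetical on-screen options mapping a tap to a discrete yaw offset.
VIEW_PRESETS = {"front": 0.0, "left": 90.0, "back": 180.0, "right": 270.0}

def combined_yaw(token_yaw_deg, preset="front"):
    """First input: yaw derived from the token in the captured image.
    Second input: a tapped preset adds a discrete offset to the view."""
    return (token_yaw_deg + VIEW_PRESETS[preset]) % 360.0
```

So a user viewing the model front-on (token yaw 0) who taps "left" would see the model re-rendered at a 90-degree yaw, without moving the device.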
  • a first aid instruction program can be operated.
  • an instructor can be displayed going through sequential instructions on how to perform cardiopulmonary resuscitation (CPR).
  • an instructor and a virtual patient can be displayed for view by the user, with an audio message describing at each step of the procedure important aspects of the displayed procedure.
  • An exemplary user can pause the displayed sequence during chest compressions and slowly repeat the display through a chest compression, while changing point of view as necessary, to visualize proper hand placement and compression depth.
  • Other options can be changed by the user as desired, for example, changing an age, gender, or medical condition of the virtual patient.
  • a CPR mannequin for use by the user can be utilized as a token for an augmented reality program, such that the user can examine the instructor in place over the mannequin performing CPR just prior to the user actually practicing on the mannequin.
  • a sensor or sensors monitoring movement of the user can be monitored, and a subsequent display can be used to compare monitored movement to the movement of the modeled instructor.
  • sensors used to monitor movement of a person are known in the art and will not be described in detail herein.
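One simple way to realize the comparison of monitored user movement against the modeled instructor is a mean per-joint distance score; the joint-list representation and scoring scheme below are illustrative assumptions, not the disclosure's method:

```python
def pose_error(user_joints, instructor_joints):
    """Mean Euclidean distance between corresponding (x, y, z) joint
    positions of the captured user pose and the instructor model's pose."""
    if len(user_joints) != len(instructor_joints):
        raise ValueError("poses must have the same number of joints")
    total = 0.0
    for (ux, uy, uz), (ix, iy, iz) in zip(user_joints, instructor_joints):
        total += ((ux - ix) ** 2 + (uy - iy) ** 2 + (uz - iz) ** 2) ** 0.5
    return total / len(user_joints)
```

A subsequent display could then highlight the joints contributing most to the score, e.g., flagging hand placement during chest compressions.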
  • Other first aid procedures that can be displayed include proper use of a defibrillator, proper application of a splint to a broken limb, and addressing a cut.
  • a martial arts instruction program can be operated.
  • an actual instructor using a first portable computerized device, can control parameters of a program displayed in an augmented reality program superimposed upon a workout mat, wherein a token is imprinted or otherwise placed upon the mat, and a plurality of students can view operation of the program under the control of the instructor from various points of view controlled according to inputs disclosed herein.
  • a pair of virtual participants can be displayed interacting with each other.
  • a number of other exemplary instruction programs can be operated.
  • a program providing instruction on how to assemble components of a computer can be operated, enabling a user to quickly unpack a new computer, make the required cable connections, and take advantage of various features of the computer.
  • a program can be operated to show a tennis player a correct form or various popular forms for hitting a back-hand shot.
  • a program can be operated to show how to chip a golf ball onto a green.
  • a program can be operated to show a number of different sport techniques, including but not limited to a soccer kick; a football throw or kick; a baseball swinging, throwing, or catching technique; a jump shot or dribble in basketball; a swimming technique; a water rescue technique; a hockey technique; a figure skating routine; proper technique for lifting weights; an aerobics workout routine; downhill or cross country skiing; snowboarding or snow blades; roller blading; water skiing; parachuting technique; boxing; table tennis; water polo; lacrosse; wrestling; archery; target shooting; and rock climbing.
  • a program can be operated to show automotive repair techniques, e.g., how to change oil for a particular model and year of car.
  • a program can be operated to show a popular dance technique.
  • a program can be operated to show to a class of students a technique used for a medical, dental, or surgical procedure.
  • a program can be operated to show a student proper technique in playing a guitar.
  • a virtual guitar player and guitar can be displayed.
  • a virtual violin, saxophone, or piano can be displayed.
  • only a guitar with a pair of virtual hands positioned upon the guitar can be displayed.
  • an animated cartoon character can be displayed.
  • a token can be a two dimensional image printed upon a flat object.
  • a token can be a three dimensional object or images printed upon a three dimensional object.
  • a token can be a simple design, for example, printable upon a single sheet of paper.
  • a token can be imprinted upon a decorative object.
  • an exemplary yoga mat can include a token for operating a three dimensional model instruction program as disclosed herein.
  • Such an exemplary mat can further include exemplary images showing a user a number of yoga positions.
  • An object printed with a token can be a sellable object, for example, wherein information on the object can enable a user to download and/or initiate a corresponding instruction program.
  • a token recognized by a user's device can act similarly to a QR code, automatically instructing the device to go to a particular webpage whereat an executable program can be executed or downloaded.
  • End user license information for a program can be contained upon a printed object also acting as a token.
  • techniques are disclosed for presenting instructions or demonstrations of activities utilizing an augmented or virtual reality system indexed in three dimensions which allows a user to view an instruction or demonstration as it would appear in the user's actual environment.
  • Instructions, characters, avatars, cartoon graphics, and other graphic displays are embodiments of a three dimensional instruction graphic as disclosed herein.
  • FIG. 1 illustrates a portable computerized device including a camera feature reading information from a token placed upon a surface.
  • Portable computerized device 10 includes view-screen 15 .
  • An exemplary portable computerized device 10 further includes a processor, RAM memory, and storage memory in the form of a hard drive, flash memory, or other similar storage devices known in the art.
  • Portable computerized device 10 can further be connected to a wireless communication network through cellular connection, wireless LAN connection, or other communication methods known in the art.
  • Portable computerized device 10 further includes software, such as an operating system and applications, configured to monitor inputs (for example, touch inputs to a touch screen device, inputs to the camera device, and audio inputs) and to control outputs (for example, graphics, sounds, and communication signals over the wireless connection).
  • View-screen 15 can include a touch screen input.
  • Other exemplary devices use button inputs, trackball and button inputs, eye focus location sensors, or other methods known in the art.
  • the camera device of portable computerized device 10 can include a lens and optical sensor according to digital cameras or smart-phones known in the art capable of capturing a visual image and translating it into a stored digital representation of the image. View 30 of the camera is illustrated. Any number of portable computerized devices can be utilized according to the methods disclosed herein, and the disclosure is not intended to be limited to the particular examples provided.
  • Token 20 is a graphic design, symbol, word, or pattern 25 that can be identified by a portable computerized device 10 through its camera device input.
  • An application upon the portable computerized device 10 interprets images captured by the camera device and identifies the token within the image. Based upon the size and orientation of the token and the graphics or symbols thereupon in the image, a spatial relationship of the token to the camera device can be determined.
  • Graphical images displayed upon view-screen 15 can be based upon a three-dimensional model.
  • Such models are well known in the art and can include simple geometric shapes or more complex models, for example, modeling the human form.
  • the application can include programming to create a graphical image upon view-screen 15 based upon manipulation of a model associated with token 20 .
  • Token 20 provides a reference point at which to anchor the graphical image and the orientation of the model upon which the image is based.
  • the graphic based upon the model can be displayed upon the view-screen 15 .
  • the image or a stream of images from the camera device can be displayed upon the view-screen 15 , and the graphic based upon the model can, with the orientation based upon the sensed token, be superimposed over the images from the camera.
  • the graphic based upon the model can be located to partially or fully cover the token in the image.
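A minimal sketch of the anchoring step: given the token's bounding box in the camera frame, compute the rectangle in which the rendered graphic is composited so that it partially or fully covers the token. The helper and its scale parameter are assumptions for illustration:

```python
def anchor_rect(token_bbox, scale=1.0):
    """Return the (x, y, w, h) rectangle, centered on the token's bounding
    box, in which to composite the rendered model over the camera image.
    scale > 1.0 makes the graphic fully cover (and overhang) the token."""
    x, y, w, h = token_bbox
    cx, cy = x + w / 2.0, y + h / 2.0
    nw, nh = w * scale, h * scale
    return (cx - nw / 2.0, cy - nh / 2.0, nw, nh)
```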
  • FIG. 2 illustrates a plurality of positions from which portable computerized devices can be located and view a token.
  • Three portable computerized devices 110 A, 110 B, and 110 C, are illustrated located at different locations with respect to token 100 , and the camera devices of the portable computerized devices include respective views 115 A, 115 B, and 115 C.
  • applications running upon each of portable computerized devices 110 A, 110 B, 110 C can interpret a location of the token with respect to the portable computerized device and manipulate a point of view of a programmed model associated with the token to represent a virtual object or character oriented based upon how the portable computerized device is located with respect to the token.
  • Graphic 105 is illustrated as an arrow showing a graphic that can clearly indicate a direction in which the token is oriented. Further, based upon perspective of the graphic to the viewer or the portable computerized device viewing the graphic, an inclination of the token with respect to the portable computerized device can be determined. Further, based upon a size of the graphic within the image, a distance of the token from the portable computerized device viewing the graphic can be determined.
  • the graphic is illustrated as an arrow for clarity of example, but any graphic with distinguishable orientation can be utilized upon token 100 . Vivid colors and bright contrast in the graphic can be used to aid identification of the token and the graphic thereupon by the portable computerized device.
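Assuming the arrow has already been located in the image, say as tail and tip pixel coordinates, its heading follows from basic trigonometry; this is a sketch of that one step, not of the full image-recognition pipeline:

```python
import math

def arrow_heading(tail_px, tip_px):
    """Heading of the token's oriented graphic, in degrees counter-clockwise
    from the image x-axis, from its tail and tip pixel coordinates."""
    dx = tip_px[0] - tail_px[0]
    dy = tip_px[1] - tail_px[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```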
  • FIG. 3 illustrates an exemplary yoga mat utilized as a token, with two exemplary portable computerized devices illustrating a virtual yoga instructor with an orientation based upon the location of the portable computerized device with respect to the token.
  • Yoga mat 200 includes graphics 205 which indicate an orientation of the token 200 .
  • Illustrative graphics 206 can also be included upon the mat, in addition to the graphics needed to provide orientation and identification of the token or as part of the graphics providing orientation and identification of the token.
  • each of the illustrative graphics can include a border with a particular shape, such as a pentagon, which aids a portable computerized device in quickly and robustly identifying and tracking the graphic upon the token.
  • Portable computerized devices 210 A and 210 B are illustrated with different locations and orientations with respect to token 200 .
  • Portable computerized device 210 A illustrates a virtual yoga instructor 220 A located upon an image of token 200 represented upon the view-screen as image 215 A.
  • Portable computerized device 210 B illustrates a virtual yoga instructor 220 B located upon an image of token 200 represented upon the view-screen as image 215 B.
  • the yoga instructor 220 A on portable computerized device 210 A is projected facing to the left, while the yoga instructor 220 B on portable computerized device 210 B is illustrated facing the viewer.
  • Each of the portable computerized devices can be moved about the token 200 as indicated by arrow 225 , and the orientation of the virtual instructor upon the view-screen will change with the changed location of the portable computerized device.
  • Portable computerized devices 210 A and 210 B are illustrated displaying the same yoga instructor based upon a same model with a same anchored orientation with respect to the token 200 .
  • the user of a particular portable computerized device can change any of a number of parameters.
  • a gender of the instructor illustrated can be changed, a size of the instructor graphic can be changed, a baseline rotation of the instructor can be changed based upon the preferences of a particular viewer.
  • a number of quick select buttons can be presented along an edge of the view-screen of the portable computerized device for easy manipulation of the projected graphics.
  • an orientation of the illustrated character can be toggled 180 degrees, so that the person watching the instructor can quickly change whether the front or the back of the character is being viewed.
  • FIG. 3 illustrates a yoga instructor that can be viewed in three dimensions based upon an orientation of a token to the device of the viewer.
  • Other embodiments are envisioned, such as martial art instruction or first aid instruction, and the disclosure is not intended to be limited to the particular examples provided herein.
  • FIG. 4 illustrates operation of an exemplary three dimensional model instruction program operating a first aid program.
  • Device 310 includes display 320 and a camera device capturing view 315 including a token as disclosed herein. Based upon input received from view 315 , device 310 displays a first virtual character 330 providing first aid to second virtual character 340 .
  • character 330 is applying a splint 335 to the arm of character 340 .
  • a number of first aid instructions are envisioned, and the disclosure is not intended to be limited to the exemplary embodiments disclosed herein.
  • FIG. 5 illustrates operation of an exemplary three dimensional model instruction program operating a martial arts program.
  • Device 410 includes display 420 .
  • Character 430 representing an instructor is illustrated.
  • Character 440 is illustrated showing user motions captured through a motion capture sensor known in the art.
  • Device 410 can show the instructor and the user's motions in slow motion or paused, permitting the user to compare the graphics and learn from the comparison.
  • Input graphics permitting a user to interact with a touch-screen device are illustrated, including button 450 prompting the user to play the graphic motions, button 452 prompting the user to pause the graphic motions, button 454 prompting the user to request to see the instructor go through the instruction again, and button 456 prompting the user to record another attempt at the instructed motion.
  • a virtual character executing a block of the illustrated motion could be displayed.
  • a number of teaching methods and interactive controls are envisioned for use with the instructions disclosed herein, and the disclosure is not intended to be limited to the examples provided herein.
  • FIG. 6 illustrates an exemplary three dimensional model instruction program illustrating instructions to install a cable to a computer.
  • Device 510 is illustrated including display 520 and a camera device capturing view 525 .
  • Laptop computer 530 is illustrated proximate to and within the view of device 510 .
  • the model of computer 530 can be entered by the user, and an image of computer 530 can be referenced in a database, such that computer 530 can act as a token for a program operated on device 510 .
  • Either a graphic representing computer 530 in a virtual reality program or an image of computer 530 can be displayed as graphic 540 .
  • Virtual hand 550 is illustrated inserting a printer cable in a particular port located upon the computer.
  • a number of instructional programs showing a user how to accomplish physical tasks are envisioned, and the disclosure is not intended to be limited to the examples provided herein.
  • FIG. 7 is a schematic illustrating an exemplary portable computerized device in communication with an exemplary three dimensional model instruction server.
  • Portable computerized device 610 is illustrated, including message 630 displayed upon a graphical user interface 620 of device 610 .
  • Device 610 can include a camera device, and an image or a series of images creating a video feed can be displayed including an object displaying a token image.
  • Device 610 is an exemplary portable computerized device including input devices configured to gather information and a processor configured to make determinations regarding data from the input devices.
  • Server 650 is illustrated including a remote computerized system with modules operating to process information gathered from device 610 and enable operation of a three dimensional model. Server 650 and device 610 are in communication through exemplary wireless communications network 640 .
  • Message 630 illustrates an embodiment whereby an image capturing a token initiates a sequence for downloading the instruction program to the device.
  • FIG. 8 is a schematic illustrating an exemplary three dimensional model instruction server.
  • the server 650 may include a processing device 720 , a communication device 710 , and a memory device 730 .
  • the processing device 720 can include memory, e.g., read only memory (ROM) and random access memory (RAM), storing processor-executable instructions and one or more processors that execute the processor-executable instructions. In embodiments where the processing device 720 includes two or more processors, the processors can operate in a parallel or distributed manner. In the illustrative embodiment, the processing device 720 executes one or more of a video input module 740 , a 3D model rendering module 750 , and an instruction module 760 .
  • the communication device 710 is a device that allows the server 650 to communicate with another device, e.g., a portable computerized device 610 through a wireless communication network connection.
  • the communication device 710 can include one or more wireless transceivers for performing wireless communication and/or one or more communication ports for performing wired communication.
  • the memory device 730 is a device that stores data generated or received by the server 650 .
  • the memory device 730 can include, but is not limited to a hard disc drive, an optical disc drive, and/or a flash memory drive. Further, the memory device 730 may be distributed and located at multiple locations.
  • the memory device 730 is accessible to the processing device 720 .
  • the memory device 730 includes a graphics database 780 and an instruction database 790 .
  • Graphics database 780 can include files, libraries, and other tools facilitating operation of a 3D model.
  • Instruction database 790 can include information enabling operation of an instruction program, for example, including data enabling operation of twenty yoga exercises.
  • the video input module 740 can monitor information provided by device 610 over network 640 , for example, including a series of images showing a token within a view of the device. Module 740 can include programming to process the images, recognize the token within the images, determine a distance to and orientation of the token, and process the information as an input value or values to a 3D model.
  • Instruction module 760 includes programming to execute an instruction program, including operation of rules, routines, lesson plans, and other relevant information required to display an instruction program. Module 760 can access instruction database 790 to enable use of information stored on the database.
  • 3D model rendering module 750 receives data from modules 740 and 760 and database 780 , and module 750 provides graphics, images, or instructions enabling display of a three dimensional model instruction display upon device 610 .
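The data flow among modules 740, 750, and 760 might be sketched as below; the class names, the pre-extracted frame fields, and the render-request format are all assumptions for illustration, not the server's actual interfaces:

```python
class VideoInputModule:
    """Stand-in for module 740: turns token measurements extracted from a
    frame into view parameters (yaw and distance)."""
    def process(self, frame):
        # frame is assumed to already carry token measurements
        return {"yaw": frame["token_yaw"], "distance": frame["token_distance"]}

class InstructionModule:
    """Stand-in for module 760: steps through a stored lesson plan,
    e.g. a sequence of yoga exercises from instruction database 790."""
    def __init__(self, lesson):
        self.lesson = lesson
        self.step = 0
    def current_pose(self):
        return self.lesson[self.step]
    def advance(self):
        self.step = min(self.step + 1, len(self.lesson) - 1)

class RenderingModule:
    """Stand-in for module 750: combines view parameters and the current
    lesson step into a render request for display upon device 610."""
    def render(self, view, pose):
        return {"pose": pose, "yaw": view["yaw"], "scale": 1.0 / view["distance"]}
```

A server loop would feed each incoming frame through VideoInputModule, ask InstructionModule for the current lesson step, and hand both results to RenderingModule.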
  • FIG. 9 is a schematic illustrating an exemplary portable computerized device configured to implement processes disclosed herein.
  • Device 610 includes a processing device 810 , a user interface 820 , a communication device 860 , a camera 830 , and a memory device 840 .
  • the processing device 810 can include memory, e.g., read only memory (ROM) and random access memory (RAM), storing processor-executable instructions and one or more processors that execute the processor-executable instructions. In embodiments where the processing device 810 includes two or more processors, the processors can operate in a parallel or distributed manner. In the illustrative embodiment, the processing device 810 can execute the operating system of the portable computerized device. In the illustrative embodiment, the processing device 810 also executes a video input module 850 , a user input module 870 , and graphics module 880 , which are described in greater detail below.
  • the user interface 820 is a device that allows a user to interact with the portable computerized device. While one user interface 820 is shown, the term “user interface” can include, but is not limited to, a touch screen, a physical keyboard, a mouse, a microphone, and/or a speaker.
  • the communication device 860 is a device that allows the portable computerized device to communicate with another device, e.g., server 650 .
  • the communication device 860 can include one or more wireless transceivers for performing wireless communication and/or one or more communication ports for performing wired communication.
  • the memory device 840 is a device that stores data generated or received by the portable computerized device.
  • the memory device 840 can include, but is not limited to, a hard disc drive, an optical disc drive, and/or a flash memory drive.
  • the camera 830 is a digital camera that captures a digital photograph.
  • the camera 830 receives an instruction to capture an image and captures an image of a view proximate to the camera.
  • the digital photograph can be a bitmap, a JPEG, a GIF, or any other suitably formatted image file.
  • the camera 830 can receive the instruction to capture the image from the processing device 810 and can output the digital photograph to the processing device 810 .
  • Video input module 850 monitors data from camera device 830 , which can include an image of a token.
  • User input module 870 can monitor input from the user related to manipulation of the three dimensional model being operated.
  • Graphics module 880 can receive data from server 650 and provide a display upon device 610 related to operation of the model and the related instruction program.
  • Embodiments in accordance with the present disclosure may be embodied as a device, process, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “module” or “system.” Furthermore, the present disclosure may take the form of a computer program product embodied in any tangible medium of expression having computer-usable program code embodied in the medium.
  • a computer-readable medium may include one or more of a portable computer diskette, a hard disk, a random access memory (RAM) device, a read-only memory (ROM) device, an erasable programmable read-only memory (EPROM or Flash memory) device, a portable compact disc read-only memory (CDROM), an optical storage device, and a magnetic storage device.
  • Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages.
  • Embodiments may also be implemented in cloud computing environments.
  • cloud computing may be defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction, and then scaled accordingly.
  • a cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service (“SaaS”), Platform as a Service (“PaaS”), Infrastructure as a Service (“IaaS”)), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.).
  • FIGS. 7-9 illustrate an exemplary embodiment whereby a process to monitor location data related to a token is used to generate a three dimensional model instruction graphic.
  • Location data can include a distance, an orientation, and other information that can be determined based upon an image of a token.
  • Tasks are split between the exemplary device and server according to one embodiment of the disclosure. However, other embodiments are disclosed.
  • a portable computerized device can perform all of the necessary programming to operate processes disclosed herein. The 3D model can be operated on either the server or the device.
  • a computerized system to display a three dimensional yoga instruction graphic to a plurality of users employing processes disclosed herein is disclosed.
  • the system includes a token and, for each of the plurality of users, a portable computerized device.
  • each portable computerized device includes a camera device capturing an image, the image comprising location data related to a token, and is configured to display a three dimensional yoga instruction graphic.
  • the portable computerized devices each operate a three dimensional yoga instruction model based upon the location data, and display the three dimensional yoga instruction graphic based upon the three dimensional yoga instruction model.
  • a remote server could perform some tasks for each of the devices, such as operating the three dimensional model, and the devices can each display individual outputs based upon communication with the server.
  • a camera device is disclosed as a device for localizing a portable computerized device to a token.
  • Other devices and processes for providing location data to the device are envisioned, for example, utilizing radio frequency identification (RFID) chips, a 3D map device, or inertial sensors within the device as spatial inputs to a model. Operation of these devices is known in the art and will not be disclosed in detail herein. Any of these alternative devices can be used in isolation or in cooperation with each other or with a camera device to provide or improve upon a spatial input to a three dimensional model.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Computer Graphics (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A portable computerized device is configured to display a three dimensional instruction graphic. The device includes a camera device capturing an image, wherein the image includes location data related to a token. The portable computerized device is configured to display the three dimensional instruction graphic based upon the location data.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This disclosure claims the benefit of U.S. Provisional Application No. 61/701,039 filed on Sep. 14, 2012 and U.S. Provisional Application No. 61/752,010 filed on Jan. 14, 2013, which are hereby incorporated by reference.
  • FIELD OF THE DISCLOSURE
  • The present disclosure relates generally to an augmented or virtual reality system indexed in three dimensions. In particular, examples of the present disclosure are related to use of an augmented reality system indexed in three dimensions presenting a virtual instructor.
  • BACKGROUND
  • The statements in this section merely provide background information related to the present disclosure. Accordingly, such statements are not intended to constitute an admission of prior art.
  • Augmented reality includes methods wherein computerized images are displayed to augment a view or experience in the real world. In one example, a computer generated image is presented upon or superimposed over a series of images captured by a camera associated with the view screen. Virtual reality, on the other hand, presents an entirely separate environment, with all elements upon the display being computer generated. Examples of augmented reality displays include heads-up displays (HUDs), lines projected on a television image of a football field showing a first down mark on the field, and a glow projected on a television image of a hockey puck.
  • Smart-phones, tablet computers, and other similar portable computerized devices utilize camera devices to interact with their environment. In one exemplary embodiment, a smart-phone can capture an image of a quick response code (QR code) and an instruction can be provided to the phone based upon the image. In one example, the phone can be instructed to access a particular webpage over a communications network based upon the information provided by the QR code. Similarly, a two-dimensional barcode can be used by a portable computerized device to provide an input to the device. A handheld unit in a store can permit a user to enter items onto a gift registry operated by the store based upon a scanned input.
  • SUMMARY
  • A portable computerized device is configured to display a three dimensional instruction graphic. The device includes a camera device capturing an image, wherein the image includes location data related to a token. The portable computerized device is configured to display the three dimensional instruction graphic based upon the location data.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Non-limiting and non-exhaustive embodiments of the present disclosure are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
  • FIG. 1 is an illustration of a portable computerized device including a camera feature reading information from a token placed upon the ground, in accordance with the present disclosure;
  • FIG. 2 is an illustration of a plurality of positions from which portable computerized devices can be located and view a token, in accordance with the present disclosure;
  • FIG. 3 is an illustration of an exemplary yoga mat utilized as a token, with two exemplary portable computerized devices illustrating a virtual yoga instructor with an orientation based upon the location of portable computerized device with respect to the token, in accordance with the present disclosure;
  • FIG. 4 illustrates operation of an exemplary three dimensional model instruction program operating a first aid program, in accordance with the present disclosure;
  • FIG. 5 illustrates operation of an exemplary three dimensional model instruction program operating a martial arts program, in accordance with the present disclosure;
  • FIG. 6 illustrates an exemplary three dimensional model instruction program illustrating instructions to install a cable to a computer, in accordance with the present disclosure;
  • FIG. 7 is a schematic illustrating an exemplary portable computerized device in communication with an exemplary three dimensional model instruction server, in accordance with the present disclosure;
  • FIG. 8 is a schematic illustrating an exemplary three dimensional model instruction server, in accordance with the present disclosure; and
  • FIG. 9 is a schematic illustrating an exemplary portable computerized device configured to implement processes disclosed herein, in accordance with the present disclosure.
  • Corresponding reference characters indicate corresponding components throughout the several views of the drawings. Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present disclosure. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one having ordinary skill in the art that the specific detail need not be employed to practice the present disclosure. In other instances, well-known materials or methods have not been described in detail in order to avoid obscuring the present disclosure.
  • Reference throughout this specification to “one embodiment”, “an embodiment”, “one example” or “an example” means that a particular feature, structure or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in an embodiment”, “one example” or “an example” in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures or characteristics may be combined in any suitable combinations and/or sub-combinations in one or more embodiments or examples. In addition, it is appreciated that the figures provided herewith are for explanation purposes to persons ordinarily skilled in the art and that the drawings are not necessarily drawn to scale.
  • Computerized devices and servers available over the Internet operate three dimensional computer models. Such models are known in the art and will not be described in detail herein. Such models can be manipulated to generate a changing display, e.g., changing a perspective upon the three dimensionally modeled object, by an input that is used to determine a point of view for the generated display. For example, a program can be created showing a three dimensional model of a car, and a user can manipulate a point of view by providing an input to a slider graphic. By moving a button with an exemplary mouse device, the graphic representation of the motor vehicle can be rotated in a horizontal plane through 360 degrees based upon the user's input to the mouse device. Similarly, the vehicle can be rotated to view a top or a bottom of the car through a second slider or, for example, by monitoring motion of the mouse device in both X and Y axes of input.
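The slider-driven rotation described above amounts to multiplying the model's points by a rotation matrix whose angle is derived from the slider position. The sketch below is a minimal illustration, assuming a simple point-based model and a 0–100 slider; the function names are illustrative, not part of the disclosed embodiments.

```python
import math

def slider_to_angle(slider_value, slider_max=100):
    """Map a horizontal slider position (0..slider_max) to 0..360 degrees."""
    return 360.0 * slider_value / slider_max

def rotate_y(point, angle_deg):
    """Rotate a 3D point about the vertical (Y) axis by angle_deg degrees."""
    x, y, z = point
    a = math.radians(angle_deg)
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))

# A point on the front of the modeled car, one unit along the +X axis.
front_of_car = (1.0, 0.0, 0.0)
# A slider at one quarter of its travel maps to a 90 degree rotation,
# which carries the point from the +X axis onto the -Z axis.
angle = slider_to_angle(25)
rotated = rotate_y(front_of_car, angle)
```

A second slider feeding a rotation about the X axis would provide the top/bottom view in the same way.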
  • Augmented reality and virtual reality programs can monitor a captured image and use information from the captured image as an input. Image recognition software and related programming are known in the art and will not be described in detail herein. According to one embodiment, a spatial relationship of a known object can be determined based upon how the object appears in a captured image. Such a known object can be defined as a token for use by an augmented or virtual reality program. For example, if a one dollar bill, a well known pattern that can be programmed for identification within a computerized program, is laid upon a table and a program is utilized to analyze the dimensions of the image of the dollar bill, a determination can be made of a distance and an orientation of the dollar bill to the device capturing the image. This distance and/or orientation of the known object or token in a captured image can be used by an augmented or virtual reality program as an input for manipulating a three dimensional model.
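The distance determination described in this passage follows from the pinhole camera model: an object of known real-world width W that spans w pixels in an image taken with focal length f (expressed in pixels) lies at distance d = f·W / w. The focal length used below is an assumed illustrative value, not a parameter from the disclosure.

```python
def distance_to_token(real_width_cm, pixel_width, focal_length_px):
    """Estimate camera-to-token distance via the pinhole model: d = f * W / w."""
    return focal_length_px * real_width_cm / pixel_width

# A U.S. dollar bill is about 15.6 cm wide. Assuming a focal length of
# 800 pixels, a bill spanning 200 pixels in the image is roughly 62.4 cm
# from the camera.
d = distance_to_token(15.6, 200, 800)
```

In practice the focal length would come from a one-time camera calibration for the particular device model.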
  • Augmented or virtual reality can be used to help a user improve their experience or acquire more detailed information about a topic. For example, a program can be operated on a computerized device to assist a user in learning about a skill or activity. A process for displaying a three dimensional computer generated image of an instructor upon a portable computerized device is disclosed.
  • Some users of portable computerized devices utilize such devices to research and gather information on activities that they are doing. As used herein, the term “portable computerized device” can refer to a number of computerized devices, such as smart-phones, laptop computers, tablet computers, and eye glasses configured with a processor and capable of displaying graphics within a view of the user. A user may wish to utilize a portable computerized device to watch a demonstration of or instruction on an activity the user wishes to learn. Such a demonstration would be enhanced by the ability of the user to utilize a portable computerized device to view the demonstration or instruction as it would appear in the user's actual environment through an augmented reality system.
  • A three dimensional model of an instructor can be displayed upon a portable computerized device, permitting a user to view the instruction from a number of different points of view. A wide variety of instruction topics can be programmed for display. In one exemplary embodiment, a yoga instruction program can be operated, whereby a yoga position or a series of yoga exercises can be displayed to the user. The user can change a point of view for the program, enabling the user to see from a number of different angles what movements the instructor is exemplifying. A progression of yoga exercises can be made available, with varying levels of difficulty ranging from novice to expert, such that the program can be used and marketed as a comprehensive training program. In one embodiment, a virtual reality program can be provided, whereby the user can manipulate a point of view for the instructor through a slider input displayed upon a screen of a smart-phone device. In another embodiment, a yoga mat with a token imprinted thereupon can be provided, and an augmented reality program can utilize an image captured of the yoga mat as a token to manipulate a point of view of the displayed yoga instructor. In one embodiment, an image of a token captured by a camera device can be utilized as a first input to determine a point of view for a three dimensional program, and a user input to the device can be utilized as a second input to determine the point of view. For example, a user viewing a token-oriented model from a front facing point of view can tap an option displayed upon the screen of an exemplary tablet device, and the displayed model can be changed to a left facing point of view.
  • In another exemplary embodiment, a first aid instruction program can be operated. For example, an instructor can be displayed going through sequential instructions on how to perform cardiopulmonary resuscitation (CPR). In such an instance, an instructor and a virtual patient can be displayed for view by the user, with an audio message describing at each step of the procedure important aspects of the displayed procedure. An exemplary user can pause the displayed sequence during chest compressions and slowly repeat the display through a chest compression, while changing point of view as necessary, to visualize proper hand placement and compression depth. Other options can be changed by the user as desired, for example, changing an age, gender, or medical condition of the virtual patient. In one example, a CPR mannequin for use by the user can be utilized as a token for an augmented reality program, such that the user can examine the instructor in place over the mannequin performing CPR just prior to the user actually practicing on the mannequin. In one embodiment, a sensor or sensors monitoring movement of the user can be monitored, and a subsequent display can be used to compare monitored movement to the movement of the modeled instructor. Such sensors used to monitor movement of a person are known in the art and will not be described in detail herein. Other first aid procedures that can be displayed include proper use of a defibrillator, proper application of a splint to a broken limb, and addressing a cut.
  • In another exemplary embodiment, a martial arts instruction program can be operated. In one exemplary program, an actual instructor, using a first portable computerized device, can control parameters of a program displayed in an augmented reality program superimposed upon a workout mat, wherein a token is imprinted or otherwise placed upon the mat, and a plurality of students can view operation of the program under the control of the instructor from various points of view controlled according to inputs disclosed herein. A pair of virtual participants can be displayed interacting with each other.
  • A number of other exemplary instruction programs can be operated. For example, a program providing instruction on how to assemble components of a computer can be operated, enabling a user to quickly unpack a new computer, make the required cable connections, and take advantage of various features of the computer. A program can be operated to show a tennis player a correct form or various popular forms for hitting a back-hand shot. A program can be operated to show how to chip a golf ball onto a green. A program can be operated to show a number of different sport techniques, including but not limited to a soccer kick; a football throw or kick; a baseball swinging, throwing, or catching technique; a jump shot or dribble in basketball; a swimming technique; a water rescue technique; a hockey technique; a figure skating routine; proper technique for lifting weights; an aerobics workout routine; downhill or cross country skiing; snowboarding or snow blades; roller blading; water skiing; parachuting technique; boxing; table tennis; water polo; lacrosse; wrestling; archery; target shooting; and rock climbing. A program can be operated to show automotive repair techniques, e.g., how to change oil for a particular model and year of car. A program can be operated to show a popular dance technique. A program can be operated to show to a class of students a technique used for a medical, dental, or surgical procedure. A program can be operated to show a student proper technique in playing a guitar. In such an embodiment, a virtual guitar player and guitar can be displayed. In another example, a virtual violin, saxophone, or piano can be displayed. In another embodiment, only a guitar with a pair of virtual hands located to the guitar can be displayed. In one embodiment intended to instruct children, an animated cartoon character can be displayed. 
A number of exemplary uses and instruction programs are envisioned, and the disclosure is not intended to be limited to the exemplary embodiments disclosed herein.
  • A token can be a two dimensional image printed upon a flat object. A token can be a three dimensional object or images printed upon a three dimensional object. A token can be a simple design, for example, printable upon a single sheet of paper. A token can be imprinted upon a decorative object. In an exemplary embodiment including a yoga mat, the mat can include a token for operating a three dimensional model instruction program as disclosed herein. Such an exemplary mat can further include exemplary images showing a user a number of yoga positions. An object printed with a token can be a sellable object, for example, wherein information on the object can enable a user to download and/or initiate a corresponding instruction program. In one exemplary embodiment, a token recognized by a user's device can act similarly to a QR code, automatically instructing the device to go to a particular webpage whereat an executable program can be executed or downloaded. End user license information for a program can be contained upon a printed object also acting as a token.
  • In accordance with various embodiments of the present disclosure, techniques are disclosed for presenting instructions or demonstrations of activities utilizing an augmented or virtual reality system indexed in three dimensions which allows a user to view an instruction or demonstration as it would appear in the user's actual environment. Instructions, characters, avatars, cartoon graphics, and other graphic displays are embodiments of a three dimensional instruction graphic as disclosed herein.
  • Referring now to the drawings, wherein the showings are for the purpose of illustrating certain exemplary embodiments only and not for the purpose of limiting the same, FIG. 1 illustrates a portable computerized device including a camera feature reading information from a token placed upon a surface. Portable computerized device 10 includes view-screen 15. An exemplary portable computerized device 10 further includes a processor, RAM memory, and storage memory in the form of a hard drive, flash memory, or other similar storage devices known in the art. Portable computerized device 10 can further be connected to a wireless communication network through cellular connection, wireless LAN connection, or other communication methods known in the art. Portable computerized device 10 further includes software, such as an operating system and applications, configured to monitor inputs, for example, in the form of touch inputs to a touch screen device, inputs to the camera device, and audio inputs, and to provide control outputs, for example, in the form of graphics, sounds, and communication signals over the wireless connection. View-screen 15 can include a touch screen input. Other exemplary devices use button inputs, trackball and button inputs, eye focus location sensors, or other methods known in the art. The camera device of portable computerized device 10 can include a lens and optical sensor according to digital cameras or smart-phones known in the art capable of capturing a visual image and translating it into a stored digital representation of the image. View 30 of the camera is illustrated. Any number of portable computerized devices can be utilized according to the methods disclosed herein, and the disclosure is not intended to be limited to the particular examples provided.
  • Token 20 includes a graphic design, symbol, word, or pattern 25 that can be identified by portable computerized device 10 through its camera device input. An application upon the portable computerized device 10 interprets images captured by the camera device and identifies the token within the image. Based upon the size and orientation of the token and the graphics or symbols thereupon in the image, a spatial relationship of the token to the device capturing the image can be determined.
  • Graphical images displayed upon view-screen 15 can be based upon a three-dimensional model. Such models are well known in the art and can include simple geometric shapes or more complex models, for example, modeling the human form.
  • The application can include programming to create a graphical image upon view-screen 15 based upon manipulation of a model associated with token 20. Token 20 provides a reference point at which to anchor the graphical image and the orientation of the model upon which the image is based. According to one embodiment, the graphic based upon the model can be displayed upon the view-screen 15. In another embodiment, the image or a stream of images from the camera device can be displayed upon the view-screen 15, and the graphic based upon the model can, with the orientation based upon the sensed token, be superimposed over the images from the camera. In one embodiment, the graphic based upon the model can be located to partially or fully cover the token in the image.
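The superimposition step described above, anchoring the model's graphic at the token's position so it covers the token in the camera image, can be sketched as a simple compositing routine. The frame and sprite here are toy pixel grids used purely for illustration; a real implementation would render the 3D model and blend it over each camera frame per pixel.

```python
def superimpose(frame, sprite, anchor):
    """Paste a rendered sprite over a camera frame, centered on the token.

    frame and sprite are 2D lists of pixel values; None in the sprite marks
    transparent pixels, so the camera image shows through at those points.
    """
    ax, ay = anchor
    h, w = len(sprite), len(sprite[0])
    top, left = ay - h // 2, ax - w // 2
    for r in range(h):
        for c in range(w):
            px = sprite[r][c]
            fr, fc = top + r, left + c
            # Skip transparent pixels and anything outside the frame.
            if px is not None and 0 <= fr < len(frame) and 0 <= fc < len(frame[0]):
                frame[fr][fc] = px
    return frame

# A 4x4 dark frame with a 2x2 white sprite anchored at the token's
# detected center (2, 2); the None pixel leaves the camera image visible.
frame = [[0] * 4 for _ in range(4)]
sprite = [[255, 255], [None, 255]]
superimpose(frame, sprite, (2, 2))
```

The anchor would be supplied by the token-recognition step, so the graphic tracks the token as the device moves.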
  • FIG. 2 illustrates a plurality of positions from which portable computerized devices can be located and view a token. Three portable computerized devices 110A, 110B, and 110C, are illustrated located at different locations with respect to token 100, and the camera devices of the portable computerized devices include respective views 115A, 115B, and 115C. Based upon identifying the token 100 and the graphic 105 thereupon, applications running upon each of portable computerized devices 110A, 110B, 110C can interpret a location of the token with respect to the portable computerized device and manipulate a point of view of a programmed model associated with the token to represent a virtual object or character oriented based upon how the portable computerized device is located with respect to the token. Graphic 105 is illustrated as an arrow showing a graphic that can clearly indicate a direction in which the token is oriented. Further, based upon perspective of the graphic to the viewer or the portable computerized device viewing the graphic, an inclination of the token with respect to the portable computerized device can be determined. Further, based upon a size of the graphic within the image, a distance of the token from the portable computerized device viewing the graphic can be determined. The graphic is illustrated as an arrow for clarity of example, but any graphic with distinguishable orientation can be utilized upon token 100. Vivid colors and bright contrast in the graphic can be used to aid identification of the token and the graphic thereupon by the portable computerized device.
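The orientation and inclination cues described above can be sketched with elementary geometry: the in-plane heading follows from the detected endpoints of the arrow graphic, and the tilt of a square token follows from how much its apparent height is foreshortened relative to its width. Both functions below are illustrative assumptions, not the disclosed implementation.

```python
import math

def token_yaw_deg(tail, tip):
    """Heading of the arrow graphic in image coordinates, in degrees.

    tail and tip are (x, y) pixel positions of the arrow's endpoints.
    """
    dx = tip[0] - tail[0]
    dy = tip[1] - tail[1]
    return math.degrees(math.atan2(dy, dx))

def token_inclination_deg(apparent_height, apparent_width):
    """Tilt of a square token away from the camera.

    A square token viewed head-on appears as tall as it is wide; tilting it
    foreshortens one axis, so height/width approximates cos(inclination).
    """
    ratio = min(apparent_height / apparent_width, 1.0)
    return math.degrees(math.acos(ratio))

yaw = token_yaw_deg((100, 100), (200, 100))   # arrow pointing along +x
tilt = token_inclination_deg(50.0, 100.0)     # foreshortened to half height
```

A production system would instead recover the full pose from all four token corners, but the same image measurements drive it.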
  • FIG. 3 illustrates an exemplary yoga mat utilized as a token, with two exemplary portable computerized devices illustrating a virtual yoga instructor with an orientation based upon the location of the portable computerized device with respect to the token. Yoga mat 200 includes graphics 205 which indicate an orientation of the token 200. Illustrative graphics 206 can also be included upon the mat, in addition to the graphics needed to provide orientation and identification of the token or as part of the graphics providing orientation and identification of the token. According to one embodiment, each of the illustrative graphics can include a border with a particular shape, such as a pentagon, which aids the portable computerized device in quickly and robustly identifying and tracking the graphic upon the token. Portable computerized devices 210A and 210B are illustrated with different locations and orientations with respect to token 200. Portable computerized device 210A illustrates a virtual yoga instructor 220A located upon an image of token 200 represented upon the view screen as image 215A. Portable computerized device 210B illustrates a virtual yoga instructor 220B located upon an image of token 200 represented upon the view screen as image 215B. Based upon the different locations of the portable computerized devices and a common model operating upon both devices, the yoga instructor 220A on portable computerized device 210A is projected facing to the left, while the yoga instructor 220B on portable computerized device 210B is illustrated facing the viewer. Each of the portable computerized devices can be moved about the token 200 as indicated by arrow 225, and the orientation of the virtual instructor upon the view-screen will change with the changed location of the portable computerized device.
  • Portable computerized devices 210A and 210B are illustrated displaying the same yoga instructor based upon a same model with a same anchored orientation with respect to the token 200. However, it will be appreciated that the user of a particular portable computerized device can change any of a number of parameters. For example, a gender of the instructor illustrated can be changed, a size of the instructor graphic can be changed, or a baseline rotation of the instructor can be changed, based upon the preferences of a particular viewer. According to one embodiment, a number of quick select buttons can be presented along an edge of the view-screen of the portable computerized device for easy manipulation of the projected graphics. In one example, an orientation of the illustrated character can be toggled 180 degrees, so that the person watching the instructor can quickly change whether the front or the back of the character is being viewed. FIG. 3 illustrates a yoga instructor that can be viewed in three dimensions based upon an orientation of a token to the device of the viewer. Other embodiments are envisioned, such as martial art instruction or first aid instruction, and the disclosure is not intended to be limited to the particular examples provided herein.
  • FIG. 4 illustrates operation of an exemplary three dimensional model instruction program operating a first aid program. Device 310 includes display 320 and a camera device capturing view 315 including a token as disclosed herein. Based upon input received from view 315, device 310 displays a first virtual character 330 providing first aid to second virtual character 340. In the exemplary embodiment of FIG. 4, character 330 is applying a splint 335 to the arm of character 340. A number of first aid instructions are envisioned, and the disclosure is not intended to be limited to the exemplary embodiments disclosed herein.
  • FIG. 5 illustrates operation of an exemplary three dimensional model instruction program operating a martial arts program. Device 410 includes display 420. Character 430 representing an instructor is illustrated. Character 440 is illustrated showing user motions captured through a motion capture sensor known in the art. Device 410 can show the instructor's and the user's motions in slow motion or paused, permitting the user to compare the graphics and learn from the comparison. Input graphics permitting a user to interact with a touch-screen device are illustrated, including button 450 prompting the user to play the graphic motions, button 452 prompting the user to pause the graphic motions, button 454 prompting the user to request to see the instructor go through the instruction again, and button 456 prompting the user to record another attempt at the instructed motion. In another embodiment, a virtual character executing a block of the illustrated motion could be displayed. A number of teaching methods and interactive controls are envisioned for use with the instructions disclosed herein, and the disclosure is not intended to be limited to the examples provided herein.
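Comparing captured user motion against the instructor's reference motion, as described for FIG. 5, amounts to measuring per-joint differences frame by frame. A simplified sketch, assuming motions are stored as per-joint angle sequences (that representation is an assumption for illustration, not specified by the disclosure):

```python
def motion_error(instructor_frames, user_frames):
    """Mean absolute per-joint angle difference across paired frames.

    Each argument is a list of frames; each frame is a dict mapping a joint
    name to an angle in degrees. Frames are compared pairwise up to the
    length of the shorter sequence.
    """
    total, count = 0.0, 0
    for ref, usr in zip(instructor_frames, user_frames):
        for joint, angle in ref.items():
            if joint in usr:
                total += abs(angle - usr[joint])
                count += 1
    return total / count if count else 0.0


ref = [{"elbow": 90.0, "knee": 120.0}]
usr = [{"elbow": 80.0, "knee": 130.0}]
# average deviation of 10 degrees across the two joints
print(motion_error(ref, usr))
```

A score like this could drive the side-by-side slow-motion review the figure describes, highlighting which part of the motion diverges most.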
  • FIG. 6 illustrates an exemplary three dimensional model instruction program illustrating instructions to install a cable to a computer. Device 510 is illustrated including display 520 and a camera device capturing view 525. Laptop computer 530 is illustrated proximate to and within the view of device 510. In one exemplary embodiment, the model of computer 530 can be entered, and an image of computer 530 can be referenced in a database, such that computer 530 can act as a token for a program operated on device 510. Either a graphic representing computer 530 in a virtual reality program or an image of computer 530 can be displayed as graphic 540. Virtual hand 550 is illustrated inserting a printer cable in a particular port located upon the computer. A number of instructional programs showing a user how to accomplish physical tasks are envisioned, and the disclosure is not intended to be limited to the examples provided herein.
  • FIG. 7 is a schematic illustrating an exemplary portable computerized device in communication with an exemplary three dimensional model instruction server. Portable computerized device 610 is illustrated, including message 630 displayed upon a graphical user interface 620 of device 610. Device 610 can include a camera device, and an image or a series of images forming a video feed can be displayed, including an object displaying a token image. Device 610 is an exemplary portable computerized device including input devices configured to gather information and a processor configured to make determinations regarding data from the input devices. Server 650 is illustrated including a remote computerized system with modules operating to process information gathered from device 610 and enable operation of a three dimensional model. Server 650 and device 610 are in communication through exemplary wireless communications network 640. Message 630 illustrates an embodiment whereby capturing an image of a token initiates a sequence for downloading the instruction program to the device.
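The token-triggered download sequence that message 630 represents can be sketched as a simple decision made on the device when a token is recognized (the token and program identifiers here are hypothetical):

```python
def on_token_recognized(token_id, installed_programs):
    """When the camera recognizes a token, either launch the matching
    instruction program or prompt the user to download it from the server."""
    if token_id in installed_programs:
        return "launch"
    return "prompt_download"


# A device that already holds the matching program launches it; an
# unrecognized token prompts a download from the server instead.
print(on_token_recognized("yoga_mat_01", {"yoga_mat_01"}))
print(on_token_recognized("first_aid_01", {"yoga_mat_01"}))
```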
  • FIG. 8 is a schematic illustrating an exemplary three dimensional model instruction server. In the illustrated embodiment, the server 650 may include a processing device 720, a communication device 710, and a memory device 730.
  • The processing device 720 can include memory, e.g., read only memory (ROM) and random access memory (RAM), storing processor-executable instructions and one or more processors that execute the processor-executable instructions. In embodiments where the processing device 720 includes two or more processors, the processors can operate in a parallel or distributed manner. In the illustrative embodiment, the processing device 720 executes one or more of a video input module 740, a 3D model rendering module 750, and an instruction module 760.
  • The communication device 710 is a device that allows the server 650 to communicate with another device, e.g., a portable computerized device 610 through a wireless communication network connection. The communication device 710 can include one or more wireless transceivers for performing wireless communication and/or one or more communication ports for performing wired communication.
  • The memory device 730 is a device that stores data generated or received by the server 650. The memory device 730 can include, but is not limited to, a hard disc drive, an optical disc drive, and/or a flash memory drive. Further, the memory device 730 may be distributed and located at multiple locations. The memory device 730 is accessible to the processing device 720. In some embodiments, the memory device 730 includes a graphics database 780 and an instruction database 790.
  • Graphics database 780 can include files, libraries, and other tools facilitating operation of a 3D model. Instruction database 790 can include information enabling operation of an instruction program, for example, including data enabling operation of twenty yoga exercises.
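The instruction database's contents — e.g., the twenty yoga exercises mentioned — could be organized as records pairing exercise metadata with model keyframes. The layout and field names below are an assumption for illustration, not the disclosure's schema:

```python
# Hypothetical instruction-database records.
instruction_db = {
    "yoga_basic": {
        "title": "Basic Yoga Sequence",
        "exercises": [
            {"name": "mountain pose", "duration_s": 30, "keyframes": ["stand_neutral"]},
            {"name": "downward dog", "duration_s": 45, "keyframes": ["fold", "invert_v"]},
        ],
    },
}


def lesson_plan(program_id):
    """Return the ordered exercise names for a stored instruction program."""
    program = instruction_db[program_id]
    return [ex["name"] for ex in program["exercises"]]


print(lesson_plan("yoga_basic"))
```

An instruction module could walk such records in order, handing each exercise's keyframes to the 3D model rendering module.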
  • The video input module 740 can monitor information provided by device 610 over network 640, for example, including a series of images showing a token within a view of the device. Module 740 can include programming to process the images, recognize the token within the images, determine a distance to and orientation of the token, and process the information as an input value or values to a 3D model.
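One common way a module like 740 can estimate the distance to a token of known physical size is the pinhole-camera relation between real size and apparent pixel size. The disclosure does not detail its recognition pipeline, so this is a generic sketch of that standard technique:

```python
def token_distance(real_width_m, pixel_width, focal_length_px):
    """Estimate camera-to-token distance from the token's apparent size.

    Pinhole model: pixel_width / focal_length_px = real_width_m / distance,
    so distance = focal_length_px * real_width_m / pixel_width.
    """
    return focal_length_px * real_width_m / pixel_width


# A 0.6 m wide mat graphic spanning 300 px, with an assumed 800 px focal
# length, appears about 1.6 m from the camera.
print(token_distance(0.6, 300, 800))
```

Orientation can be recovered similarly from the perspective distortion of the token's border graphics; distance and orientation together form the spatial input to the 3D model.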
  • Instruction module 760 includes programming to execute an instruction program, including operation of rules, routines, lesson plans, and other relevant information required to display an instruction program. Module 760 can access instruction database 790 to enable use of information stored on the database.
  • 3D model rendering module 750 receives data from modules 740 and 760 and database 780, and module 750 provides graphics, images, or instructions enabling display of a three dimensional model instruction display upon device 610.
  • FIG. 9 is a schematic illustrating an exemplary portable computerized device configured to implement processes disclosed herein. Device 610 includes a processing device 810, a user interface 820, a communication device 860, a camera 830, and a memory device 840.
  • The processing device 810 can include memory, e.g., read only memory (ROM) and random access memory (RAM), storing processor-executable instructions and one or more processors that execute the processor-executable instructions. In embodiments where the processing device 810 includes two or more processors, the processors can operate in a parallel or distributed manner. In the illustrative embodiment, the processing device 810 can execute the operating system of the portable computerized device. In the illustrative embodiment, the processing device 810 also executes a video input module 850, a user input module 870, and a graphics module 880, which are described in greater detail below.
  • The user interface 820 is a device that allows a user to interact with the portable computerized device. While one user interface 820 is shown, the term “user interface” can include, but is not limited to, a touch screen, a physical keyboard, a mouse, a microphone, and/or a speaker. The communication device 860 is a device that allows the portable computerized device to communicate with another device, e.g., server 650. The communication device 860 can include one or more wireless transceivers for performing wireless communication and/or one or more communication ports for performing wired communication. The memory device 840 is a device that stores data generated or received by the portable computerized device. The memory device 840 can include, but is not limited to, a hard disc drive, an optical disc drive, and/or a flash memory drive.
  • The camera 830 is a digital camera that captures a digital photograph. The camera 830 receives an instruction to capture an image and captures an image of a view proximate to the camera. The digital photograph can be stored as an image file, e.g., a bitmap, a JPEG, a GIF, or any other suitably formatted file. The camera 830 can receive the instruction to capture the image from the processing device 810 and can output the digital photograph to the processing device 810.
  • Video input module 850 monitors data from camera device 830, which can include an image of a token. User input module 870 can monitor input from the user related to manipulation of the three dimensional model being operated. Graphics module 880 can receive data from server 650 and provide a display upon device 610 related to operation of the model and the related instruction program.
  • Embodiments in accordance with the present disclosure may be embodied as a device, process, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “module” or “system.” Furthermore, the present disclosure may take the form of a computer program product embodied in any tangible medium of expression having computer-usable program code embodied in the medium.
  • Any combination of one or more computer-usable or computer-readable media may be utilized. For example, a computer-readable medium may include one or more of a portable computer diskette, a hard disk, a random access memory (RAM) device, a read-only memory (ROM) device, an erasable programmable read-only memory (EPROM or Flash memory) device, a portable compact disc read-only memory (CDROM), an optical storage device, and a magnetic storage device. Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages.
  • Embodiments may also be implemented in cloud computing environments. In this description and the following claims, “cloud computing” may be defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction, and then scaled accordingly. A cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service (“SaaS”), Platform as a Service (“PaaS”), Infrastructure as a Service (“IaaS”)), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.).
  • FIGS. 7-9 illustrate an exemplary embodiment whereby a process to monitor location data related to a token is used to generate a three dimensional model instruction graphic. Location data can include a distance, an orientation, and other information that can be determined based upon an image of a token. Tasks are split between the exemplary device and server according to one embodiment of the disclosure. However, other embodiments are envisioned. For example, a portable computerized device can perform all of the programming necessary to operate the processes disclosed herein. The 3D model can be operated on either the server or the device.
  • A computerized system to display a three dimensional yoga instruction graphic to a plurality of users employing processes disclosed herein is disclosed. The system includes a token and, for each of the plurality of users, a portable computerized device. The portable computerized device includes a camera device capturing an image, the image comprising location data related to the token. The portable computerized devices each operate a three dimensional yoga instruction model based upon the location data, and display the three dimensional yoga instruction graphic based upon the three dimensional yoga instruction model. In an alternate embodiment, a remote server could perform some tasks for each of the devices, such as operating the three dimensional model, and the devices can each display individual outputs based upon communication with the server.
  • Throughout the disclosure, a camera device is disclosed as a device for localizing a portable computerized device to a token. Other devices and processes for providing location data to the device are envisioned, for example, utilizing radio frequency identification (RFID) chips, a 3D map device, or inertial sensors within the device as spatial inputs to a model. Operation of these devices is known in the art and will not be disclosed in detail herein. Any of these alternative devices can be used in isolation or in cooperation with each other or with a camera device to provide or improve upon a spatial input to a three dimensional model.
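Combining a camera-derived heading with inertial sensor data, as the paragraph above suggests, is often done with a complementary filter: the gyro tracks fast motion but drifts, while the camera is slower but absolute. A minimal sketch of that standard fusion technique (the gain and names are illustrative, not from the disclosure):

```python
def complementary_filter(camera_yaw_deg, gyro_rate_dps, prev_yaw_deg, dt_s,
                         alpha=0.98):
    """Fuse a slow-but-absolute camera heading with a fast-but-drifting gyro.

    The gyro rate integrates the previous estimate forward over dt_s; the
    camera reading corrects accumulated drift. An alpha close to 1 trusts
    the gyro over short intervals.
    """
    gyro_yaw = prev_yaw_deg + gyro_rate_dps * dt_s
    return alpha * gyro_yaw + (1.0 - alpha) * camera_yaw_deg


yaw = 0.0
# 10 deg/s rotation for 0.1 s, with the camera confirming about 1 degree:
yaw = complementary_filter(camera_yaw_deg=1.0, gyro_rate_dps=10.0,
                           prev_yaw_deg=yaw, dt_s=0.1)
```

A fused estimate like this can keep the virtual character anchored even during frames where the token is momentarily occluded from the camera.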
  • The above description of illustrated examples of the present disclosure, including what is described in the Abstract, is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. While specific embodiments of, and examples for, the disclosure are described herein for illustrative purposes, various equivalent modifications are possible without departing from the broader spirit and scope of the present disclosure. Indeed, it is appreciated that the specific example values, times, etc., are provided for explanation purposes and that other values may also be employed in other embodiments and examples in accordance with the teachings of the present disclosure.

Claims (20)

1. A portable computerized device configured to display a three dimensional instruction graphic, the device comprising:
a camera device capturing an image, the image comprising location data related to a token; and
wherein the portable computerized device is configured to display the three dimensional instruction graphic based upon the location data.
2. The device of claim 1, wherein the three dimensional instruction graphic comprises a yoga instruction program.
3. The device of claim 2, wherein the token comprises a yoga mat.
4. The device of claim 3, wherein the yoga mat comprises illustrations of yoga exercises.
5. The device of claim 1, wherein the three dimensional instruction graphic comprises a first aid instruction program.
6. The device of claim 5, wherein the first aid instruction program comprises a cardiopulmonary resuscitation instruction program.
7. The device of claim 1, wherein the three dimensional instruction graphic comprises a martial arts instruction program.
8. The device of claim 1, wherein the three dimensional instruction graphic comprises an assembly instruction program, illustrating to a user a process to install a newly purchased device.
9. The device of claim 1, wherein the three dimensional instruction graphic comprises a sports technique instruction program.
10. The device of claim 9, wherein the sports technique instruction program comprises a tennis instruction program.
11. The device of claim 9, wherein the sports technique instruction program comprises a golf instruction program.
12. The device of claim 1, wherein the three dimensional instruction graphic comprises an automotive repair instruction program.
13. The device of claim 1, wherein the three dimensional instruction graphic comprises a dance instruction program.
14. The device of claim 1, wherein the three dimensional instruction graphic comprises a music instruction program.
15. The device of claim 1, wherein the three dimensional instruction graphic comprises a medical procedure instruction program.
16. A computerized system to display a three dimensional yoga instruction graphic to a plurality of users, the system comprising:
a token; and
for each of the plurality of users, a portable computerized device:
comprising a device providing spatial inputs based upon a location of the token;
operating a three dimensional yoga instruction model based upon the spatial inputs; and
displaying the three dimensional yoga instruction graphic based upon the three dimensional yoga instruction model.
17. The system of claim 16, wherein the device providing spatial inputs comprises a camera device capturing an image, the image comprising location data related to the token.
18. The system of claim 16, wherein the device providing spatial inputs comprises a location device selected from a radio frequency chip device, a three dimensional map device, and an inertial sensor device.
19. A computerized process for displaying a three dimensional instruction graphic, the process comprising:
within a computerized processor:
operating a three dimensional instruction model;
monitoring an image providing location data for a token; and
utilizing the location data as spatial inputs to the three dimensional instruction model; and
upon a computerized display device, displaying the three dimensional instruction graphic based upon the three dimensional instruction model.
20. The computerized process of claim 19, further comprising capturing the image with the computerized display device comprising a portable computerized device; and
wherein the three dimensional instruction graphic changes as the portable computerized device is moved relative to the token.
US14/026,870 2012-09-14 2013-09-13 Augmented reality system indexed in three dimensions Abandoned US20140078137A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/026,870 US20140078137A1 (en) 2012-09-14 2013-09-13 Augmented reality system indexed in three dimensions

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201261701039P 2012-09-14 2012-09-14
US201361752010P 2013-01-14 2013-01-14
US14/026,870 US20140078137A1 (en) 2012-09-14 2013-09-13 Augmented reality system indexed in three dimensions

Publications (1)

Publication Number Publication Date
US20140078137A1 true US20140078137A1 (en) 2014-03-20

Family

ID=50273989

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/026,909 Expired - Fee Related US9224231B2 (en) 2012-09-14 2013-09-13 Augmented reality system indexed in three dimensions
US14/026,870 Abandoned US20140078137A1 (en) 2012-09-14 2013-09-13 Augmented reality system indexed in three dimensions

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/026,909 Expired - Fee Related US9224231B2 (en) 2012-09-14 2013-09-13 Augmented reality system indexed in three dimensions

Country Status (1)

Country Link
US (2) US9224231B2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140166739A1 (en) * 2012-12-14 2014-06-19 Michael F. Savage Process and apparatus utilizing qr codes with exercise mats
US20150170546A1 (en) * 2013-12-12 2015-06-18 Koninklijke Philips N.V. Software application for a portable device for cpr guidance using augmented reality
US20170150127A1 (en) * 2015-11-23 2017-05-25 Wal-Mart Stores, Inc. Virtual Training System
US20170315359A1 (en) * 2014-10-21 2017-11-02 Koninklijke Philips N.V. Augmented reality patient interface device fitting appratus
WO2020096376A1 (en) * 2018-11-07 2020-05-14 Samsung Electronics Co., Ltd. System and method for coded pattern communication
EP4053825A4 (en) * 2019-10-30 2022-12-21 Newbase Inc. Method and apparatus for providing treatment training for emergency patient

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9951549B2 (en) * 2013-11-01 2018-04-24 Flextronics AP, LLC Vehicle power systems activation based on structured light detection
US10058775B2 (en) * 2014-04-07 2018-08-28 Edo Segal System and method for interactive mobile gaming
US9734634B1 (en) 2014-09-26 2017-08-15 A9.Com, Inc. Augmented reality product preview
WO2016165771A1 (en) * 2015-04-16 2016-10-20 Hewlett-Packard Development Company, L.P. Print apparatus having first and second printing devices, computer readable medium and computer implemented method
US9524589B1 (en) * 2015-08-31 2016-12-20 Welspun India Limited Interactive textile article and augmented reality system
DE102016103054A1 (en) * 2016-02-22 2017-08-24 Krauss-Maffei Wegmann Gmbh & Co. Kg Military sandbox system and method of operating a military sandbox
RU2636676C2 (en) * 2016-05-12 2017-11-27 Общество с ограниченной ответственностью "Торговый дом "Технолайн" ООО "Торговый дом "Технолайн" Method of creating augmented reality elements and graphic media for its implementation
WO2018102355A1 (en) * 2016-11-29 2018-06-07 Wal-Mart Stores, Inc. Augmented reality-assisted modular set-up and product stocking systems and methods
WO2018237040A2 (en) * 2017-06-20 2018-12-27 Walmart Apollo, Llc SYSTEMS AND METHODS FOR MANAGING INVENTORY AUDITS
US11610183B2 (en) 2017-06-29 2023-03-21 Walmart Apollo, Llc Systems and methods for performing and tracking asset inspections
LU100390B1 (en) * 2017-09-05 2019-03-19 Luxembourg Inst Science & Tech List Stretchable interactive tangibles
US11232636B2 (en) * 2018-02-08 2022-01-25 Edx Technologies, Inc. Methods, devices, and systems for producing augmented reality
US11126845B1 (en) * 2018-12-07 2021-09-21 A9.Com, Inc. Comparative information visualization in augmented reality
CN110388919B (en) * 2019-07-30 2023-05-23 上海云扩信息科技有限公司 3D Model Localization Method Based on Feature Map and Inertial Measurement in Augmented Reality
US12059632B2 (en) 2019-11-17 2024-08-13 Nickolay Lamm Augmented reality system for enhancing the experience of playing with toys

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060057551A1 (en) * 2000-05-09 2006-03-16 Knowlagent, Inc. Versatile resource computer-based training system
US20080187896A1 (en) * 2004-11-30 2008-08-07 Regents Of The University Of California, The Multimodal Medical Procedure Training System
US20080191864A1 (en) * 2005-03-31 2008-08-14 Ronen Wolfson Interactive Surface and Display System
US20090114079A1 (en) * 2007-11-02 2009-05-07 Mark Patrick Egan Virtual Reality Composer Platform System
US20120075285A1 (en) * 2010-09-28 2012-03-29 Nintendo Co., Ltd. Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
US20120075343A1 (en) * 2010-09-25 2012-03-29 Teledyne Scientific & Imaging, Llc Augmented reality (ar) system and method for tracking parts and visually cueing a user to identify and locate parts in a scene

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6424410B1 (en) 1999-08-27 2002-07-23 Maui Innovative Peripherals, Inc. 3D navigation system using complementary head-mounted and stationary infrared beam detection units
US7159041B2 (en) 2000-03-07 2007-01-02 Microsoft Corporation Method and system for defining and controlling algorithmic elements in a graphics display system
US8040328B2 (en) 2000-10-11 2011-10-18 Peter Smith Books, papers, and downloaded information to facilitate human interaction with computers
US7329193B2 (en) 2002-07-23 2008-02-12 Plank Jr Richard G Electronic golf swing analyzing system
JP2004199496A (en) * 2002-12-19 2004-07-15 Sony Corp Information processing apparatus and method, and program
US7647289B2 (en) 2006-06-02 2010-01-12 Microsoft Corporation Learning belief distributions for game moves
US8217995B2 (en) 2008-01-18 2012-07-10 Lockheed Martin Corporation Providing a collaborative immersive environment using a spherical camera and motion capture
US8231465B2 (en) * 2008-02-21 2012-07-31 Palo Alto Research Center Incorporated Location-aware mixed-reality gaming platform
US20090244097A1 (en) * 2008-03-25 2009-10-01 Leonardo William Estevez System and Method for Providing Augmented Reality
US9604142B2 (en) 2010-08-26 2017-03-28 Blast Motion Inc. Portable wireless mobile device motion capture data mining system and method
GB2491819A (en) 2011-06-08 2012-12-19 Cubicspace Ltd Server for remote viewing and interaction with a virtual 3-D scene

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Fitzgerald, Diarmaid, et al. "Usability evaluation of e-motion: a virtual rehabilitation system designed to demonstrate, instruct and monitor a therapeutic exercise programme." Virtual Rehabilitation, 2008. IEEE, 2008. *
G2, [online][ January 25, 2010][Retrieved from: http://fatfightertv.com/2010/01/g2-yoga-mat-giveaway/][on: 7/28/2015 11:09:40 AM]. *
Huang, Chun-Hong, et al. "A web-based e-learning platform for physical education." Journal of Networks 6.5 (2011): 721-727. *
Patel, Kayur, et al. "The effects of fully immersive virtual reality on the learning of physical tasks." Proceedings of the 9th Annual International Workshop on Presence, Ohio, USA. 2006. *
Youtube, [online][published Apr 12, 2012][retrieved from: https://www.youtube.com/watch?v=UnhOmEGoTRA] [on: Aug 22, 2014]. *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140166739A1 (en) * 2012-12-14 2014-06-19 Michael F. Savage Process and apparatus utilizing qr codes with exercise mats
US10134307B2 (en) * 2013-12-12 2018-11-20 Koninklijke Philips N.V. Software application for a portable device for CPR guidance using augmented reality
US20150170546A1 (en) * 2013-12-12 2015-06-18 Koninklijke Philips N.V. Software application for a portable device for cpr guidance using augmented reality
US10459232B2 (en) * 2014-10-21 2019-10-29 Koninklijke Philips N.V. Augmented reality patient interface device fitting apparatus
US20170315359A1 (en) * 2014-10-21 2017-11-02 Koninklijke Philips N.V. Augmented reality patient interface device fitting appratus
GB2546589A (en) * 2015-11-23 2017-07-26 Wal Mart Stores Inc Virtual training system
US20170150127A1 (en) * 2015-11-23 2017-05-25 Wal-Mart Stores, Inc. Virtual Training System
GB2546589B (en) * 2015-11-23 2020-02-05 Walmart Apollo Llc Virtual training system
US10582190B2 (en) * 2015-11-23 2020-03-03 Walmart Apollo, Llc Virtual training system
WO2020096376A1 (en) * 2018-11-07 2020-05-14 Samsung Electronics Co., Ltd. System and method for coded pattern communication
US10691767B2 (en) 2018-11-07 2020-06-23 Samsung Electronics Co., Ltd. System and method for coded pattern communication
EP4053825A4 (en) * 2019-10-30 2022-12-21 Newbase Inc. Method and apparatus for providing treatment training for emergency patient
US11615712B2 (en) 2019-10-30 2023-03-28 Newbase Inc. Method and apparatus for providing training for treating emergency patients
US11915613B2 (en) 2019-10-30 2024-02-27 Newbase Inc. Method and apparatus for providing training for treating emergency patients

Also Published As

Publication number Publication date
US20140080605A1 (en) 2014-03-20
US9224231B2 (en) 2015-12-29

Similar Documents

Publication Publication Date Title
US20140078137A1 (en) Augmented reality system indexed in three dimensions
US11826628B2 (en) Virtual reality sports training systems and methods
US10821347B2 (en) Virtual reality sports training systems and methods
US11532172B2 (en) Enhanced training of machine learning systems based on automatically generated realistic gameplay information
US11132533B2 (en) Systems and methods for creating target motion, capturing motion, analyzing motion, and improving motion
Miles et al. A review of virtual environments for training in ball sports
US20160049089A1 (en) Method and apparatus for teaching repetitive kinesthetic motion
US20180374383A1 (en) Coaching feedback system and method
WO2020150327A1 (en) Augmented cognition methods and apparatus for contemporaneous feedback in psychomotor learning
US20160314620A1 (en) Virtual reality sports training systems and methods
US20140188009A1 (en) Customizable activity training and rehabilitation system
JP7700787B2 (en) Information processing device, information processing method, and program
WO2015112611A1 (en) Method and system for portraying a portal with user-selectable icons on a large format display system
KR102095647B1 (en) Comparison of operation using smart devices Comparison device and operation Comparison method through dance comparison method
CN116271757A (en) Auxiliary system and method for basketball practice based on AI technology
US11331551B2 (en) Augmented extended realm system
Dabnichki Computers in sport
Covarrubias et al. Enhancing inclusive sport participation for students with special needs through VR and rowing simulator
Tai et al. Badminton self-training system based on virtual reality
US20250001294A1 (en) System to determine a real-time user-engagement state during immersive electronic experiences
KR20190101587A (en) Realistic biathlon simulator system
WO2001009861A2 (en) Method and system for interactive motion training
GR1010685B (en) Sport competition digital simulation method used for the cognitive education and training of athletes
Miles An Advanced Virtual Environment for Rugby Skills Training
Katz et al. Sport Technology Research Laboratory, University of Calgary

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION