
WO2009054619A2 - Augmented reality computer device - Google Patents

Augmented reality computer device

Info

Publication number
WO2009054619A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
rectangle
mark
camera
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2008/005842
Other languages
French (fr)
Other versions
WO2009054619A3 (en)
Inventor
Moon Key Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority claimed from KR1020080097458A external-priority patent/KR20090040839A/en
Publication of WO2009054619A2 publication Critical patent/WO2009054619A2/en
Publication of WO2009054619A3 publication Critical patent/WO2009054619A3/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An augmented reality computer device comprising a camera portion capturing an image of a mark of a mobile computer, an image processing portion outputting the distance and direction between the camera and the mark, an image synthesis portion synthesizing an image by transforming the screen-shot image of the mobile computer and overlaying it onto the video image captured by the camera portion, where the transformation and the overlay position are determined by the distance and direction between the camera and the mark, and a head mount display portion outputting the synthesized image. The object of the present invention is to provide an enlarged display through the head mount display, by augmented reality, for a small (handheld, mobile) computer such as a mobile phone, navigator, mobile game machine, or UMPC (ultra mobile PC) whose physical display is small. Augmented reality is the technology of overlaying a computer graphic image onto a real video image. The overlaid computer graphic image is usually attached to a physical mark in the video, so that the overlaid image looks like a real physical thing.

Description

AUGMENTED REALITY COMPUTER DEVICE
Technical Field
[1] The present invention relates to an augmented reality computer device. The object of the present invention is to provide an enlarged display through a head mount display, by augmented reality, for a small (handheld, mobile) computer such as a mobile phone, navigator, mobile game machine, or UMPC (ultra mobile PC) whose physical display is small. Augmented reality is the technology of overlaying a computer graphic image onto a real video image. The overlaid computer graphic image is usually attached to a physical mark in the video image, so that the overlaid image looks like a real physical thing. For example, a person wears a mark on the head, a camera captures the person, the image processing portion recognizes the mark in the video and overlays a 3-dimensional monster face image onto the mark, and the final output video shows the person with a monster head. The mark usually contains a black rectangle on a white background. The image processing portion recognizes the rectangle in the video, produces the 3-dimensional distance and direction between the camera and the mark by analyzing the shape (distortion) and size of the rectangle in the captured video, and overlays the transformed 3-dimensional monster face image onto the video. The transformation includes translation, scaling, and rotation, which are well-known techniques in 3-dimensional game programming.
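For illustration only (this sketch is not part of the original disclosure), the mark-location step described above can be approximated with standard OpenCV calls; the function name, the thresholds, and the assumption of a dark rectangle on a light background are illustrative assumptions, not part of the specification:

```python
# Illustrative sketch: locate a dark rectangular mark on a light background
# in a camera frame and return its four corner points (the feature points
# later used for pose estimation).
import cv2
import numpy as np

def find_rectangle_mark(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # invert + Otsu so the dark mark becomes the foreground blob
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for contour in contours:
        approx = cv2.approxPolyDP(contour,
                                  0.02 * cv2.arcLength(contour, True), True)
        # keep convex quadrilaterals; prefer the largest one
        if len(approx) == 4 and cv2.isContourConvex(approx):
            if best is None or cv2.contourArea(approx) > cv2.contourArea(best):
                best = approx
    if best is None:
        return None
    return best.reshape(4, 2).astype(np.float32)  # 4 corner points (pixels)
```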
[2]
Disclosure of Invention
Technical Problem
[3] The computing power of mobile, handheld computers (small computers) is rapidly increasing, and the programs and games that run on desktop computers (big computers) are ready to be ported to mobile computers, but a mobile computer has the problem that its display (screen) is too small to show the complicated, detailed contents (screen image) of a desktop computer.
[4]
Technical Solution
[5] To solve this problem, it is an object of the present invention to provide a big, enlarged virtual screen by augmented reality for a mobile computer whose physical screen is too small to display the image of a desktop computer's display. More specifically, the present invention captures video in the direction of view of its user's eye with a camera attached to the head mount display (or to the head of the user) and overlays the enlarged screen image of the mobile computer onto that video.
[6]
Advantageous Effects
[7] By using the present invention, the user can use a mobile computer, which is small and light (not heavy), with a large virtual screen the same size as a desktop computer's, so the user does not have to carry a large, heavy physical display (screen). This means that a desktop program, web browser, or game can be executed on the mobile computer with the present invention at any place.
[8] By displaying the image of the program contents only on the head mount display screen and not on the display (screen) of the mobile computer, a confidential computing job can be carried out safely (nobody else can watch the secret virtual screen overlaid in the head mount display).
[9] By overlaying a stereo image, the user can enjoy a 3-dimensional image, which is useful for a 3-dimensional game or a 3-dimensional modeling program such as CAD.
[10] The head mount display of the present invention is safer (easier to escape from an accident) than an ordinary head mount display: an ordinary head mount display covers the whole field of view of its user, so the user must take off the head mount display in order to watch the real world, but the user of the present invention can watch the real world and the virtual overlaid screen at the same time without taking off the head mount display.
[11]
Brief Description of the Drawings
[12] FIG.1 shows the composition of present invention.
[13] FIG.2 shows the mark on display of mobile computer.
[14] FIG.3 shows the modified mark on display of mobile computer.
[15]
[16] <symbols in Figures>
[17] amo : enlarged and synthesized image of the display of the mobile computer
[18] ca1, ca2 : cameras; ds : head mount display
[19] con : main portion; mo : display of the mobile computer
[20] md : display information of the mobile computer
[21] cd : image information captured by the camera
[22] ar : synthesized image information
[23] mk1, mk2, mk3, mk4 : marks displayed on the display (screen) of the mobile computer
[24]
Best Mode for Carrying Out the Invention
[25] Embodiment 1
[26] The augmented reality computer device of the present invention comprises
[27] a camera portion capturing the image of the mark of the mobile computer,
[28] an image processing portion outputting the distance and direction between the camera and the mark,
[29] an image synthesis portion synthesizing the image by transforming the screen-shot image of the mobile computer and overlaying it onto the video image captured by the camera portion, where the transformation and the overlay position are determined by the distance and direction between the camera and the mark,
[30] and a head mount display portion outputting the synthesized image.
[31] It is recommended to attach the camera to the head mount display in order to capture the video image in the direction of view of the user's eye, and the feature points of the mobile computer can be used as the mark. For example, the rectangle of the display (mo) of the mobile computer or the rectangle of the case (housing) of the mobile computer can be used as the mark.
[32] FIG.1 shows the composition of the present invention. In FIG.1, stereo cameras (ca1, ca2) are attached to the head mount display (ds) in order to capture video in the direction of view of the eye. The display (mo) of the mobile computer is captured by the said stereo cameras (ca1, ca2), and the captured video (cd) is transferred to the main portion (con). (It is also possible to use only one camera instead of a stereo camera.) An image processing program in the main portion (con) analyzes the received video (cd), extracts the feature points of the mark (for example, the rectangle of the display (mo) of the mobile computer, the rectangle of the case of the mobile computer, an LED attached to the mobile computer, or an image on the display (mo) of the mobile computer), and outputs the direction and distance (3-dimensional position or 2-dimensional position) between the camera and the mark by recognizing the size and shape (distortion) of the mark. There is well-known technology (the perspective-n-point problem) to calculate the 3-dimensional distance and direction between a camera and a mark whose size and shape are known. For example, by analyzing the size and shape of a rectangular mark (the vertices of the rectangle are 4 feature points) in the captured image, the 3-dimensional distance and direction can be obtained by solving the perspective-4-point problem. Detailed information can be found at http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/MARBLE/high/pia/solving.htm
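As an illustrative sketch only (not taken from the specification), the perspective-4-point computation referenced above can be carried out with OpenCV's solvePnP when the physical size of the rectangular mark is known; the helper name and the choice of units are assumptions:

```python
# Illustrative sketch: 3-dimensional distance and direction between the
# camera and a rectangular mark of known physical size, from its four
# image corners, using OpenCV's perspective-n-point solver.
import cv2
import numpy as np

def mark_pose(corners_px, mark_w, mark_h, camera_matrix, dist_coeffs):
    """corners_px: 4x2 corners of the rectangle in the image, ordered
    top-left, top-right, bottom-right, bottom-left.
    mark_w, mark_h: physical width/height of the mark (e.g. in metres)."""
    object_pts = np.array([[0.0,    0.0,    0.0],
                           [mark_w, 0.0,    0.0],
                           [mark_w, mark_h, 0.0],
                           [0.0,    mark_h, 0.0]], dtype=np.float32)
    image_pts = np.asarray(corners_px, dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    # tvec gives the direction of the mark in camera coordinates;
    # its norm is the 3-dimensional camera-to-mark distance.
    return rvec, tvec, float(np.linalg.norm(tvec))
```

Here camera_matrix and dist_coeffs stand for the intrinsic calibration of the head-mounted camera (ca1 or ca2), obtained separately.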
[33] The 3-dimensional direction and distance between the mark and the camera can also be obtained by analyzing the stereo video images (the left and right images). The main portion (con) receives the screen-shot video image (md) signal from the mobile computer (like a beam projector receives a video signal from a computer). The main portion (con) may be embedded into the mobile computer. The image synthesis portion (microcontroller, computer, or digital signal processor) in the main portion synthesizes the output video by transforming (scaling, translating, rotating, and distorting (affine transformation)) the received video image (md) of the mobile computer and overlaying it onto the video image (cd) captured by the camera, where the overlay position and the transformation are determined by the information on the 3-dimensional distance and direction between the cameras (ca1, ca2) and the mark of the mobile computer. The synthesized video is output through the head mount display (ds), and the wearer of the head mount display can see the enlarged virtual display (amo) of the mobile computer as if it were attached to the mobile computer. If the wearer of the head mount display (ds) holds the mobile computer and shakes it, the wearer sees the virtual display (amo) move through the head mount display as if the virtual display (amo) were a large physical display of the mobile computer. FIG.1 shows the large virtual display (amo) as if it is attached to the mobile computer (mobile phone).
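A minimal sketch of the synthesis step described above (the helper names are assumed, not from the specification): the received screen image (md) is warped onto the quadrilateral where the enlarged virtual display should appear and composited over the camera video (cd):

```python
# Illustrative sketch: transform the screen image (md) of the mobile
# computer and overlay it onto the camera frame (cd).
import cv2
import numpy as np

def overlay_virtual_display(cd, md, target_quad):
    """target_quad: 4x2 pixel coordinates of the corners of the enlarged
    virtual display in the camera frame (for example obtained by projecting
    a larger virtual rectangle with cv2.projectPoints using the mark pose)."""
    h_md, w_md = md.shape[:2]
    src = np.float32([[0, 0], [w_md, 0], [w_md, h_md], [0, h_md]])
    dst = np.float32(target_quad)
    # scaling, translation, rotation and perspective distortion in one matrix
    H = cv2.getPerspectiveTransform(src, dst)
    size = (cd.shape[1], cd.shape[0])
    warped = cv2.warpPerspective(md, H, size)
    mask = cv2.warpPerspective(np.full((h_md, w_md), 255, np.uint8), H, size)
    out = cd.copy()
    out[mask > 0] = warped[mask > 0]   # overlay only inside the quadrilateral
    return out
```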
[34] If there is an emergency such as a fire or accident, the wearer of the head mount display of the present invention can watch his environment quickly and find the way out by simply laying down the mobile computer (laying down the mobile computer removes the large virtual display (amo) from the wearer's view). But the wearer of a conventional head mount display must take off the head mount display to watch his environment, because the video image of the mobile computer covers the wearer's entire view. Therefore it is safer to use the head mount display of the present invention than the conventional one.
[35] A physical mark (such as an LED) may be attached to the mobile computer, and a virtual image of a keyboard, mouse, or stylus pen may be overlaid onto the final synthesized image. In such a case, the wearer of the head mount display of the present invention can use the virtual keyboard or mouse to control the mobile computer, by adding to the image processing portion a function that recognizes the gestures of the wearer's hand.
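For illustration only, a sketch of the hit-test half of such a virtual keyboard; the fingertip detection (the hand-gesture recognition mentioned above) is assumed to be provided by the image processing portion, and the helper names are not from the specification:

```python
# Illustrative sketch: decide which overlaid virtual key the detected
# fingertip is touching, given the projected key quadrilaterals in the
# camera image. Fingertip detection itself is assumed elsewhere.
import cv2
import numpy as np

def hit_virtual_key(fingertip_xy, key_quads):
    """fingertip_xy: (x, y) pixel position of the fingertip.
    key_quads: dict mapping key labels to 4x2 projected corner arrays."""
    point = (float(fingertip_xy[0]), float(fingertip_xy[1]))
    for label, quad in key_quads.items():
        contour = np.asarray(quad, dtype=np.float32).reshape(-1, 1, 2)
        # >= 0 means the point is inside the key region or on its edge
        if cv2.pointPolygonTest(contour, point, False) >= 0:
            return label
    return None
```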
[36] Embodiment 2
[37] A desktop computer can have a dual-monitor system that connects two monitors to one computer. Similarly, a dual-monitor system can be implemented in the mobile computer. In such a case, the first monitor (mo) outputs a mark image like FIG.2, and the video signal containing the ordinary program image (game, web browser, or word processor) for the second monitor corresponds to the signal (md) in FIG.1. FIG.2 shows an example of the mark displayed on the display (mo) of the mobile computer. The mark contains a small rectangle (mk2), a big torus-like rectangle (mk1) with a common center, and a direction-representing rectangle (mk3). Such a mark can be obtained by outputting a graphic image onto the display of the mobile computer. The image processing portion can detect the rectangles (mk1, mk2) by recognizing the edges of the rectangles with the Hough transformation (a well-known line detection technology) and analyzing the relative positions of the vertices, which are the crossing points of the edge lines; the direction of the mark can be determined by recognizing the other rectangle (mk3). By inserting the positions of the vertices of a rectangle (mk1 or mk2) into the formula of the perspective-4-point problem, the 3-dimensional direction and distance between the camera and the rectangle can be obtained. The rectangle is only one example of a mark, and there is no limitation on the modification of the mark in the present invention.
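As an illustrative sketch only, the Hough-based detection mentioned above can be approximated as follows; the parameters and the simple perpendicularity filter are assumptions, and grouping the resulting intersections into the rectangles mk1/mk2 is omitted:

```python
# Illustrative sketch: candidate vertices of the displayed rectangle marks,
# found by detecting edge lines with the Hough transform and intersecting
# roughly perpendicular line pairs.
import cv2
import numpy as np

def _intersect(line1, line2):
    """Intersect two Hough lines given as (rho, theta)."""
    (r1, t1), (r2, t2) = line1, line2
    A = np.array([[np.cos(t1), np.sin(t1)],
                  [np.cos(t2), np.sin(t2)]])
    if abs(np.linalg.det(A)) < 1e-6:      # nearly parallel: no useful crossing
        return None
    x, y = np.linalg.solve(A, np.array([r1, r2]))
    return float(x), float(y)

def rectangle_vertex_candidates(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    hough = cv2.HoughLines(edges, 1, np.pi / 180, threshold=120)
    if hough is None:
        return []
    lines = [tuple(l[0]) for l in hough]
    vertices = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            dt = abs(lines[i][1] - lines[j][1])
            if np.pi / 4 < dt < 3 * np.pi / 4:   # roughly perpendicular pair
                p = _intersect(lines[i], lines[j])
                if p is not None:
                    vertices.append(p)
    return vertices
```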
[38] Embodiment 3
[39] FIG.3 shows a mark modified from the mark of FIG.2.
[40] Mark 1 of FIG.3 contains a direction-representing inner vertex mark (mk4), which corresponds to the mark (mk3) of FIG.2.
[41] Embodiment 4
[42] The boundary rectangle (mo) of the display of the mobile computer can be used as a mark.
The image processing portion can recognize the shape of the mobile computer and extract the mark by comparing the 3-dimensional model data of the housing of the mobile computer stored in memory with the captured video image. If a stereo camera (ca1, ca2) is implemented, then the mark can be extracted from the stereo video by comparing the left and right video images.
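For illustration only, the stereo alternative mentioned above reduces to a disparity computation once the same mark point has been matched in the left and right images; a rectified stereo pair with known focal length and baseline is assumed:

```python
# Illustrative sketch: distance of a matched mark point from a rectified
# stereo pair (left camera ca1, right camera ca2).
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """x_left, x_right: horizontal pixel coordinate of the same mark point
    in the left and right images. focal_px: focal length in pixels.
    baseline_m: distance between the two cameras in metres."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline_m / disparity   # depth Z = f * B / d
```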
[43]

Claims

[1] A computer device comprising:
a mobile computer including a mark;
a camera portion capturing the said mark;
an image processing portion analyzing the image captured by the said camera portion and producing the relative position between the camera and the mark;
an image synthesis portion transforming the image from the mobile computer and overlaying the transformed image onto the image captured by the said camera portion, wherein the transformation and the overlaying position are determined by the said relative position between the camera and the mark;
and a head mount display portion displaying the said synthesized image.
[2] A computer device of claim 1, wherein
the camera of the camera portion is attached on the head mount display portion to capture the image in the direction of view of the eye of its user.
[3] A computer device according to any one of claims 1 to 2, wherein the said mark includes a rectangle; the image processing portion recognizes the said rectangle and produces the relative distance between the camera and the rectangle by analyzing the size and shape of the rectangle in the captured image; and the image synthesis portion transforms the image from the mobile computer and overlays the transformed image by using the said relative position information so that the overlaid image looks like a real, big physical image attached on the mobile computer.
[4] A computer device of claim 3, wherein
the rectangle contains a rectangle displayed on the display device of the mobile computer.
[5] A computer device of claim 3, wherein
the rectangle includes
a 1st rectangle which is big and torus-like;
a 2nd rectangle which is smaller than the said 1st rectangle and whose center is the same as the center of the said 1st rectangle;
and a feature point which represents the direction of the said 1st rectangle.
[6] A computer device of claim 3, wherein
the rectangle includes the rectangle of the boundary edge of the display of the mobile computer.
PCT/KR2008/005842 2007-10-22 2008-10-05 Augmented reality computer device Ceased WO2009054619A2 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR20070106350 2007-10-22
KR10-2007-0106350 2007-10-22
KR20070108593 2007-10-28
KR10-2007-0108593 2007-10-28
KR10-2008-0097458 2008-10-05
KR1020080097458A KR20090040839A (en) 2007-10-22 2008-10-05 Augmented reality computer device

Publications (2)

Publication Number Publication Date
WO2009054619A2 true WO2009054619A2 (en) 2009-04-30
WO2009054619A3 WO2009054619A3 (en) 2009-06-04

Family

ID=40580210

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2008/005842 Ceased WO2009054619A2 (en) 2007-10-22 2008-10-05 Augmented reality computer device

Country Status (1)

Country Link
WO (1) WO2009054619A2 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011160114A1 (en) * 2010-06-18 2011-12-22 Minx, Inc. Augmented reality
ES2391112A1 (en) * 2010-07-16 2012-11-21 Universidad Politécnica de Madrid SYSTEM OF SPACE PROJECTION OF INCREASED REALITY FIXED TO THE HEAD
US20130050194A1 (en) * 2011-08-31 2013-02-28 Nintendo Co., Ltd. Information processing program, information processing system, information processing apparatus, and information processing method, utilizing augmented reality technique
EP2741480A1 (en) * 2012-12-07 2014-06-11 BlackBerry Limited A mobile device, system and method for controlling a heads-up display
CN103890836A (en) * 2010-09-20 2014-06-25 寇平公司 Bluetooth or other wireless interface with power management for head mounted display
EP2826007A2 (en) * 2012-03-15 2015-01-21 Crown Packaging Technology, Inc Device, system and method for facilitating interaction between a wireless communication device and a package
US9152378B2 (en) 2010-09-20 2015-10-06 Kopin Corporation Bluetooth or other wireless interface with power management for head mounted display
US9323057B2 (en) 2012-12-07 2016-04-26 Blackberry Limited Mobile device, system and method for controlling a heads-up display
US9582482B1 (en) 2014-07-11 2017-02-28 Google Inc. Providing an annotation linking related entities in onscreen content
US9703541B2 (en) 2015-04-28 2017-07-11 Google Inc. Entity action suggestion on a mobile device
US9710967B2 (en) 2011-08-31 2017-07-18 Nintendo Co., Ltd. Information processing program, information processing system, information processing apparatus, and information processing method, utilizing augmented reality technique
US9965559B2 (en) 2014-08-21 2018-05-08 Google Llc Providing automatic actions for mobile onscreen content
US10055390B2 (en) 2015-11-18 2018-08-21 Google Llc Simulated hyperlinks on a mobile device based on user intent and a centered selection of text
CN108762482A (en) * 2018-04-16 2018-11-06 北京大学 Data interactive method and system between a kind of large screen and augmented reality glasses
CN108885856A (en) * 2016-03-29 2018-11-23 索尼公司 Information processing device, information processing method and program
US10178527B2 (en) 2015-10-22 2019-01-08 Google Llc Personalized entity repository
US10535005B1 (en) 2016-10-26 2020-01-14 Google Llc Providing contextual actions for mobile onscreen content
US10970646B2 (en) 2015-10-01 2021-04-06 Google Llc Action suggestions for user-selected content
US11237696B2 (en) 2016-12-19 2022-02-01 Google Llc Smart assist for repeated actions
US11348320B2 (en) 2020-04-02 2022-05-31 Samsung Electronics Company, Ltd. Object identification utilizing paired electronic devices

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040104935A1 (en) * 2001-01-26 2004-06-03 Todd Williamson Virtual reality immersion system
KR100586818B1 (en) * 2004-02-18 2006-06-08 한국과학기술원 Head mounted display device using augmented reality technology
US20050285878A1 (en) * 2004-05-28 2005-12-29 Siddharth Singh Mobile platform

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011160114A1 (en) * 2010-06-18 2011-12-22 Minx, Inc. Augmented reality
ES2391112A1 (en) * 2010-07-16 2012-11-21 Universidad Politécnica de Madrid SYSTEM OF SPACE PROJECTION OF INCREASED REALITY FIXED TO THE HEAD
CN103890836A (en) * 2010-09-20 2014-06-25 寇平公司 Bluetooth or other wireless interface with power management for head mounted display
EP2617202A4 (en) * 2010-09-20 2015-01-21 Kopin Corp Bluetooth or other wireless interface with power management for head mounted display
US9152378B2 (en) 2010-09-20 2015-10-06 Kopin Corporation Bluetooth or other wireless interface with power management for head mounted display
CN103890836B (en) * 2010-09-20 2017-03-15 寇平公司 The bluetooth with power management or other wave points for head mounted display
US9710967B2 (en) 2011-08-31 2017-07-18 Nintendo Co., Ltd. Information processing program, information processing system, information processing apparatus, and information processing method, utilizing augmented reality technique
US20130050194A1 (en) * 2011-08-31 2013-02-28 Nintendo Co., Ltd. Information processing program, information processing system, information processing apparatus, and information processing method, utilizing augmented reality technique
US8922588B2 (en) * 2011-08-31 2014-12-30 Nintendo Co., Ltd. Information processing program, information processing system, information processing apparatus, and information processing method, utilizing augmented reality technique
EP2826007A2 (en) * 2012-03-15 2015-01-21 Crown Packaging Technology, Inc Device, system and method for facilitating interaction between a wireless communication device and a package
EP2741480A1 (en) * 2012-12-07 2014-06-11 BlackBerry Limited A mobile device, system and method for controlling a heads-up display
US9323057B2 (en) 2012-12-07 2016-04-26 Blackberry Limited Mobile device, system and method for controlling a heads-up display
US10592261B1 (en) 2014-07-11 2020-03-17 Google Llc Automating user input from onscreen content
US11573810B1 (en) 2014-07-11 2023-02-07 Google Llc Sharing screen content in a mobile environment
US9762651B1 (en) 2014-07-11 2017-09-12 Google Inc. Redaction suggestion for sharing screen content
US9788179B1 (en) 2014-07-11 2017-10-10 Google Inc. Detection and ranking of entities from mobile onscreen content
US9811352B1 (en) 2014-07-11 2017-11-07 Google Inc. Replaying user input actions using screen capture images
US9824079B1 (en) 2014-07-11 2017-11-21 Google Llc Providing actions for mobile onscreen content
US9886461B1 (en) 2014-07-11 2018-02-06 Google Llc Indexing mobile onscreen content
US9916328B1 (en) 2014-07-11 2018-03-13 Google Llc Providing user assistance from interaction understanding
US11704136B1 (en) 2014-07-11 2023-07-18 Google Llc Automatic reminders in a mobile environment
US10652706B1 (en) 2014-07-11 2020-05-12 Google Llc Entity disambiguation in a mobile environment
US10080114B1 (en) 2014-07-11 2018-09-18 Google Llc Detection and ranking of entities from mobile onscreen content
US11347385B1 (en) 2014-07-11 2022-05-31 Google Llc Sharing screen content in a mobile environment
US9582482B1 (en) 2014-07-11 2017-02-28 Google Inc. Providing an annotation linking related entities in onscreen content
US10963630B1 (en) 2014-07-11 2021-03-30 Google Llc Sharing screen content in a mobile environment
US10244369B1 (en) 2014-07-11 2019-03-26 Google Llc Screen capture image repository for a user
US10248440B1 (en) 2014-07-11 2019-04-02 Google Llc Providing a set of user input actions to a mobile device to cause performance of the set of user input actions
US10491660B1 (en) 2014-07-11 2019-11-26 Google Llc Sharing screen content in a mobile environment
US12147652B1 (en) 2014-07-11 2024-11-19 Google Llc Annotating screen content in a mobile environment
US11907739B1 (en) 2014-07-11 2024-02-20 Google Llc Annotating screen content in a mobile environment
US9965559B2 (en) 2014-08-21 2018-05-08 Google Llc Providing automatic actions for mobile onscreen content
US9703541B2 (en) 2015-04-28 2017-07-11 Google Inc. Entity action suggestion on a mobile device
US10970646B2 (en) 2015-10-01 2021-04-06 Google Llc Action suggestions for user-selected content
US12026593B2 (en) 2015-10-01 2024-07-02 Google Llc Action suggestions for user-selected content
US12505384B2 (en) 2015-10-01 2025-12-23 Google Llc Action suggestions for user-selected content
US10178527B2 (en) 2015-10-22 2019-01-08 Google Llc Personalized entity repository
US11089457B2 (en) 2015-10-22 2021-08-10 Google Llc Personalized entity repository
US12108314B2 (en) 2015-10-22 2024-10-01 Google Llc Personalized entity repository
US11716600B2 (en) 2015-10-22 2023-08-01 Google Llc Personalized entity repository
US10733360B2 (en) 2015-11-18 2020-08-04 Google Llc Simulated hyperlinks on a mobile device
US10055390B2 (en) 2015-11-18 2018-08-21 Google Llc Simulated hyperlinks on a mobile device based on user intent and a centered selection of text
CN108885856A (en) * 2016-03-29 2018-11-23 索尼公司 Information processing device, information processing method and program
US11734581B1 (en) 2016-10-26 2023-08-22 Google Llc Providing contextual actions for mobile onscreen content
US12141709B1 (en) 2016-10-26 2024-11-12 Google Llc Providing contextual actions for mobile onscreen content
US10535005B1 (en) 2016-10-26 2020-01-14 Google Llc Providing contextual actions for mobile onscreen content
US11860668B2 (en) 2016-12-19 2024-01-02 Google Llc Smart assist for repeated actions
US11237696B2 (en) 2016-12-19 2022-02-01 Google Llc Smart assist for repeated actions
US12379821B2 (en) 2016-12-19 2025-08-05 Google Llc Smart assist for repeated actions
CN108762482A (en) * 2018-04-16 2018-11-06 北京大学 Data interactive method and system between a kind of large screen and augmented reality glasses
US11348320B2 (en) 2020-04-02 2022-05-31 Samsung Electronics Company, Ltd. Object identification utilizing paired electronic devices

Also Published As

Publication number Publication date
WO2009054619A3 (en) 2009-06-04

Similar Documents

Publication Publication Date Title
WO2009054619A2 (en) Augmented reality computer device
US12154238B2 (en) Wearable augmented reality devices with object detection and tracking
US11262835B2 (en) Human-body-gesture-based region and volume selection for HMD
JP6323040B2 (en) Image processing apparatus, image processing method, and program
KR101171660B1 (en) Pointing device of augmented reality
US10089794B2 (en) System and method for defining an augmented reality view in a specific location
CN107851299B (en) Information processing apparatus, information processing method, and program
Du et al. Opportunistic interfaces for augmented reality: Transforming everyday objects into tangible 6dof interfaces using ad hoc ui
CN112581571B (en) Control method and device for virtual image model, electronic equipment and storage medium
KR20090040839A (en) Augmented reality computer device
JP2014029656A (en) Image processor and image processing method
JP2014029566A (en) Image processing apparatus, image processing method, and image processing program
Sobota et al. Mixed Reality: A Known
KR101582225B1 (en) System and method for providing interactive augmented reality service
CN117897682A (en) Displaying digital media content on physical surfaces
CN114972466B (en) Image processing method, device, electronic device and readable storage medium
CN117496097A (en) Digital sand table display system based on AR reality enhancement
CN115797815A (en) AR translation processing method and electronic device
US20240257440A1 (en) Information processing apparatus, information processing method, and program
JP2005301479A (en) Instruction input device based on projected action of presenter
ZHANG et al. Opportunistic Interfaces for Augmented Reality: Transforming Everyday Objects into Tangible 6DoF Interfaces Using Ad hoc UI
CN115731495A (en) Method for controlling three-dimensional virtual scene
Acharya IMAGE TAGGED INTUITIVE AUGMENTED REALITY

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08840914

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08840914

Country of ref document: EP

Kind code of ref document: A2