WO2006085580A1 - Pointer light tracking method, program, and recording medium therefor - Google Patents
Pointer light tracking method, program, and recording medium therefor
- Publication number
- WO2006085580A1 (PCT/JP2006/302249)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pointer light
- display
- image data
- pointer
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/48—Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus
- G03B17/54—Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus with projector
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0386—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Definitions
- the present invention relates to a presentation system such as a projector system that projects computer images from a projector onto a projector screen, which is used in various presentations.
- Patent Document 1 Japanese Patent Laid-Open No. 11-85395
- Patent Document 2 Japanese Patent Laid-Open No. 11-39095
- Patent Document 3 Japanese Patent Laid-Open No. 2004-265235
- Non-Patent Document 1: R. Sukthankar, R. Stockton, M. Mullin, "Smarter Presentations: Exploiting Homography in Camera-Projector Systems", Proceedings of International Conference on Computer Vision, 2001
- Non-Patent Document 2: R. Sukthankar, R. Stockton, M. Mullin, "Self-Calibrating Camera-Assisted Presentation Interface", Proceedings of International Conference on Control, Automation, Robotics and Vision, 2000
- various presentations may utilize a large liquid crystal display, a plasma display, or the like.
- in such a large display system, high-precision tracking of the laser pointer light is preferred.
- another object is to use an LED pointer, which can be manufactured at lower cost than a laser pointer, and to realize similar drawing by tracking the LED pointer light.
- the laser light from the laser pointer and the LED light from the LED pointer are referred to as pointer light.
- the invention of the present application enables high-precision tracking of the pointer light projected on various displays, such as a projector screen, even while an image is displayed on the display. This makes it possible for anyone to draw lines and figures on the display easily and neatly, and to realize applications that improve usability beyond this drawing.
- An object of the present invention is to provide a pointer light tracking method, a pointer light tracking program, and a recording medium thereof that can realize various applications.
- the first invention of the present application is a pointer light tracking method in which the pointer light projected on a display is photographed by a camera, and the pointer light is tracked on the display by a computer based on the obtained image data.
- a black solid image with white square images located at its four corners is projected on the display, and the display showing the black solid image and the white square images is photographed by the camera.
- the areas corresponding to the white square images are extracted from the obtained image data, and the center coordinates (x, y) of each extracted area are calculated.
- the method is characterized in that, from the calculated center coordinates (x, y) and the center coordinates (X, Y) of the white square images, it calculates the parameters required for performing distortion correction using projective transformation on the coordinates representing the position of the pointer light on the display.
- the second invention of the present application is a tracking method in which pointer light projected on a display is photographed by a camera, and the pointer light is tracked on the display by a computer based on the obtained image data.
- this tracking method is characterized in that image data in which only the pointer light is strongly projected is obtained by adjusting one or more of the shutter speed, exposure, and gamma value of the camera that photographs the display.
- the third invention of the present application is a tracking method in which pointer light projected on a display is photographed by a camera, and the pointer light is tracked on the display by a computer based on the obtained image data.
- this tracking method is characterized in that the image data is subjected to a blurring process so as to enlarge the light spot area of the pointer light included in the image data.
- the fourth invention of the present application is a pointer light tracking program for tracking, on the display, the pointer light projected on the display, based on image data obtained by photographing the pointer light with a camera.
- the program causes the computer to function as: means for extracting the areas corresponding to the white square images from the image data obtained by photographing the display showing the black solid image and the white square images located at its four corners; means for calculating the center coordinates (x, y) of the extracted areas; and means for calculating, from those center coordinates (x, y) and the center coordinates (X, Y) of the white square images, the parameters necessary for performing distortion correction using projective transformation on the coordinates representing the position of the pointer light on the display.
- the fifth invention of the present application is a pointer light tracking program for tracking the pointer light on the display based on image data obtained by photographing the pointer light projected on the display with a camera.
- the program causes the computer to function as means for adjusting one or more of the shutter speed, exposure, and gamma value of the camera that photographs the display, so that only the pointer light is captured in the image.
- the sixth invention of the present application is a pointer light tracking program for tracking the pointer light on the display based on image data obtained by photographing the pointer light projected on the display with a camera.
- a computer is caused to function as means for performing a blurring process on the image data so as to expand a light spot region of the pointer light included in the image data.
- the present invention is a computer-readable recording medium on which the pointer light tracking program is recorded.
- according to the present invention, high-precision tracking of the pointer light projected on various displays used for presentations, such as projector screens and large liquid crystal displays, is possible, and this high-precision tracking can be realized even when an image is shown on the display.
- in the first invention, by taking into account the image distortion that normally occurs in projector projection and using the parameters obtained as described above, extremely effective distortion correction can be performed, so that the two-dimensional coordinates of the pointer light spot included in the camera image data can be calculated more accurately.
- this image distortion may occur in various display images other than the projection by the projector.
- distortion correction can be made favorable and high-precision tracking can be realized.
- without such adjustment, the pointer light tends to be buried in the projector light, making it difficult to extract by computer image processing.
- in the second invention, by optimally adjusting one or more of the shutter speed, exposure, and gamma value of the camera, the projector light can be blocked and camera image data in which only the pointer light is strongly projected can be obtained, so that the pointer light spot can be accurately extracted.
- this optimization adjustment is also effective when using various displays other than a projector screen, blocking the extraneous light so that the pointer light spot can be extracted with high accuracy.
- the light spot of the pointer light is very small relative to the entire display, and even when image data in which the pointer light spot is emphasized is obtained by adjusting the shutter speed, the size of the light spot itself remains small. Therefore, by performing the blurring process described above to enlarge the light spot, the pointer light spot can be extracted more accurately without being buried in the surrounding image.
- by combining this blurring process with the above-described correction coordinate calculation and optimization adjustment, even higher-precision pointer light tracking can be realized.
- the fourth to sixth inventions provide computer programs capable of obtaining the same effects as the first to third inventions, and a recording medium, such as a flexible disk, CD, or DVD, on which the computer program is recorded.
- Fig. 1 shows an embodiment of the present invention having the above-described features.
- This section mainly describes the projector system configuration that is often used for presentations.
- the projector system includes a projector screen 1, a projector 2, a computer 3, a laser pointer 4, and a camera 5.
- the projector screen 1 is capable of projecting the image projected from the projector 2.
- the projector screen 1 need not be a dedicated screen installed on a wall or the like; a wall surface or similar may be used as long as the projected image can be shown on it.
- the projector 2 is a device capable of projecting the image sent from the computer 3 onto the projector screen 1.
- the projector 2 includes a light projecting unit that projects the image and a communication unit for transmitting and receiving data signals and control signals to and from the computer 3.
- the computer 3 sends the image to be projected to the projector 2 and executes various application processes, such as the laser beam tracking process according to the present invention and drawing based on the tracking. These processes are executed by the processing unit (CPU) 31 according to the laser beam tracking program and application programs stored in the storage unit (memory) 32.
- the processing unit 31 is connected by a bus 38 to a data file unit 33 that stores various data, a display unit 34 such as a display that shows the program execution screen and various data, an input unit 35 such as a keyboard and mouse, a projector communication unit 36 that transmits and receives data signals and control signals to and from the projector 2, and a camera communication unit 37 that transmits and receives data signals and control signals to and from the camera 5.
- the image sent from the computer 3 to the projector 2 is usually a desktop screen displayed on the display.
- when a presentation image created using presentation software is displayed on the desktop screen (usually full-screen), the desktop screen is sent to the projector 2 and projected onto the projector screen 1, so that the presentation image appearing on the desktop screen is projected onto the projector screen 1.
- the laser pointer 4 is a device capable of projecting laser light, and includes a light projecting unit that projects the laser light, a part held by the user, and the like. Since it can be used to draw on the image by indicating a desired part of the projected image with laser light, it can also be called a laser pen.
- the camera 5 is a device that can photograph the projector screen 1, on which an image from the projector 2 and the laser light from the laser pointer 4 are projected, and input the image data to the computer 3.
- the camera 5 includes an image sensor such as a CCD (Charge Coupled Device) and a communication unit that transmits and receives data signals and control signals to and from the computer 3.
- the overall configuration of the laser beam tracking program in this embodiment is as shown in FIG. 2, and the distortion correction parameter acquisition program shown in FIG. 3, the camera control program shown in FIG. 6, the light spot detection program, and the distortion correction program shown in FIG. 11 are incorporated in relation to each other.
- the camera image data acquisition program shown in FIG. 7 and the camera image data input program shown in FIG. 8 are also incorporated, but are not shown in FIG.
- the application programs execute various application functions, such as drawing, that improve usability, based on the processing result of the laser beam tracking program described in detail later, that is, the coordinate data of the tracked laser light.
- the distortion correction parameter is a parameter necessary for matrix calculation of projective transformation performed in the distortion correction process described later.
- This process is an initial setting process that is performed only once when the laser beam tracking program is started.
- the process is incorporated in the laser beam tracking program and executed by the distortion correction parameter acquisition program (see FIG. 2).
- if an event occurs during the tracking process that changes the distortion correction parameter values unacceptably, such as the projector 2 or the camera 5 being displaced, the process must be executed again.
- a solid black image is created and stored in advance by the computer 3, sent from the computer 3 to the projector 2, and projected from the projector 2 onto the projector screen 1.
- in practice, if the created black solid image is displayed on the desktop screen of the display, it is projected onto the projector screen 1 via the projector 2.
- similarly, the computer 3 creates and stores in advance four small white square images located at the four corners of the black solid image, sends them from the computer 3 to the projector 2, and projects them onto the projector screen 1. In this case as well, if the four white square images are superimposed on the black solid image at the four corners of the desktop screen, they are projected onto the projector screen 1 via the projector 2.
- the center coordinates of each of these four white square images are stored as (X1, Y1), (X2, Y2), (X3, Y3), (X4, Y4) (see Fig. 4).
- these center coordinates are calculated in the X-axis/Y-axis coordinate system set in advance on the black solid image.
- as this coordinate system, the screen coordinate system normally set for the display or desktop screen can be used, and the center coordinates of the white square images at the four corners are obtained in that screen coordinate system.
- the black solid image and the white square image projected on the projector screen 1 are captured by the camera 5.
- the camera 5 starts shooting the projector screen 1 in accordance with a control signal from the computer 3 that instructs to start shooting.
- needless to say, in a system configuration in which the computer 3 and the camera 5 are not linked in this way, shooting by the camera 5 may be started manually.
- areas corresponding to the above four white square images are extracted from the captured image data. More specifically, since the black solid image and the white square images appear in the camera image data captured by the camera 5 and taken into the computer 3, extracting only the white areas from the camera image data by image processing yields four areas corresponding to the four white square images.
- the center coordinates of each of these four areas are calculated and stored as (x1, y1), (x2, y2), (x3, y3), (x4, y4).
- here, the computer 3 calculates and stores the center coordinates of each region in the x-axis/y-axis coordinate system of the camera image.
- this distortion correction is performed by a two-dimensional projective transformation that maps the camera image data, obtained by capturing the projected image with the camera 5, onto the original image data before projection. The eight parameters necessary for the matrix calculation of this transformation are calculated to obtain the projective transformation formula.
- specifically, the eight parameters are calculated using the center coordinates (X1, Y1), (X2, Y2), (X3, Y3), (X4, Y4) of the white squares in the original image data and the center coordinates (x1, y1), (x2, y2), (x3, y3), (x4, y4) in the camera image data.
- the formula in Fig. 5 is an example of a two-dimensional projective transformation formula; since the transformation is defined only up to a scalar factor, the number of independent parameters is eight.
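The eight-parameter calculation from four corner correspondences can be sketched in pure Python. This is a minimal sketch, not the exact formula of Fig. 5: it assumes the common form x' = (a·x + b·y + c)/(g·x + h·y + 1), y' = (d·x + e·y + f)/(g·x + h·y + 1), and the corner coordinates in the usage below are illustrative.

```python
def solve_homography(src, dst):
    """Solve for the 8 projective-transform parameters [a..h] mapping
    src points (camera image) to dst points (original image), assuming
    x' = (a*x + b*y + c) / (g*x + h*y + 1),
    y' = (d*x + e*y + f) / (g*x + h*y + 1)."""
    A, b = [], []
    for (x, y), (X, Y) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -x * X, -y * X]); b.append(X)
        A.append([0, 0, 0, x, y, 1, -x * Y, -y * Y]); b.append(Y)
    # Gaussian elimination with partial pivoting on the 8x8 system.
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    p = [0.0] * n
    for r in range(n - 1, -1, -1):
        p[r] = (M[r][n] - sum(M[r][c] * p[c] for c in range(r + 1, n))) / M[r][r]
    return p  # [a, b, c, d, e, f, g, h]

def apply_homography(p, x, y):
    """Apply the 8-parameter projective transform to one point."""
    a, b, c, d, e, f, g, h = p
    w = g * x + h * y + 1
    return (a * x + b * y + c) / w, (d * x + e * y + f) / w
```

Once the parameters are computed at startup, each tracked light-spot coordinate can be corrected with a single `apply_homography` call per frame.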
- a presentation image is then actually projected from the projector 2 onto the projector screen 1, and the user starts the presentation, using the laser pointer 4 as appropriate.
- to realize even more suitable tracking, processing is performed to adjust the shutter speed, exposure, and gamma value of the camera 5 so that the light from the projector 2 is blocked and image data in which only the laser light is strongly projected can be captured (see Fig. 6).
- tracking of the laser light projected on the projector screen 1 is performed by photographing the laser light on the projector screen 1 with the camera 5 as described in detail below.
- the area corresponding to the light spot is extracted from the image and the coordinates of the area are calculated, this process being repeated at the shooting rate of the camera 5; the obtained coordinate values are used for the drawing process and the like.
- in order to improve the accuracy of the light spot region extraction at this time, the camera 5 must capture the laser light accurately. It is therefore a very important process to adjust the shutter speed of the camera 5 to block the projector light as much as possible so that only the laser light is photographed. This process is executed by the camera control program incorporated in the laser beam tracking program (see Fig. 2).
- an arbitrary shutter speed value, exposure, and gamma value are sent from the computer 3 to the camera 5. More specifically, when the initial value of the shutter speed is input to the computer 3 by the user, it is sent from the computer 3 to the camera 5. Alternatively, a preset initial value stored in advance is sent from the computer 3 to the camera 5.
- the camera 5 releases the shutter using the shutter speed, exposure, and gamma values sent to it.
- the camera image data obtained by shooting (hereinafter simply "image data") is sent to the computer 3, and the computer 3 judges whether the laser light is projected sufficiently strongly in the image data.
- this judgment uses color information data, such as RGB values or luminance values representing the laser light.
- if the judgment is negative, the initial values are adjusted and new shutter speed, exposure, and gamma values are set.
- the judgment is affirmative when one or more light spot images exist and each is of a size appropriate for the laser light.
- the determination is made based on whether the predetermined RGB value or the like exists continuously for a predetermined number of pixels.
- other determination methods can be applied.
- the light from the projector 2 can be blocked by hardware of the camera 5, and only the light from the laser pointer 4 can be selectively captured.
- since the projector light is blocked, only the laser light appears bright and strongly shining against a dark background.
- although it depends on the performance of the camera 5 and the surrounding environment, a shutter speed of 1/150 to 1/300 sec, for example, is one preferable range.
- the shutter speed, exposure, and gamma value need not all be optimized; any one of them, or a combination of two or more, may be adjusted. Although it depends on the usage environment, adjusting the shutter speed is considered the most effective; therefore, a combination that includes the shutter speed and adds exposure or gamma value to it increases the stability of good laser light extraction and is more preferable.
- a combination is selected automatically, and an image is shot with the initial values of the combined parameters (S2-2-2). Each parameter value in the combination is then adjusted, and shooting is repeated until the light spot can be identified (S2-2-3 No, S2-2-4). When the light spot is identified, each parameter is set to its optimum value (S2-2-3 Yes, S2-2-5).
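The adjustment loop above can be sketched as follows. This is an illustrative sketch only: `capture(shutter)` stands in for a hypothetical camera interface, and the threshold, spot-size limit, and step factor are assumed values, not figures from the patent.

```python
def spot_only(frame, threshold=200, max_bright=50):
    """True if the frame contains a bright region of laser-spot size:
    some pixels exceed the threshold (the 'predetermined RGB value'),
    but not so many that the projector image is still leaking through.
    `frame` is a list of rows of (R, G, B) tuples."""
    bright = sum(1 for row in frame for (r, g, b) in row if r >= threshold)
    return 0 < bright <= max_bright

def tune_shutter(capture, initial=1 / 150, fastest=1 / 1000, step=0.9):
    """Shorten the shutter speed until only the laser spot survives,
    mirroring steps S2-2-2 .. S2-2-5. `capture(shutter)` is a
    hypothetical camera interface returning one frame."""
    shutter = initial
    while shutter > fastest:
        if spot_only(capture(shutter)):
            return shutter          # optimum found (S2-2-3 Yes, S2-2-5)
        shutter *= step             # adjust and retry (S2-2-3 No, S2-2-4)
    return None                     # no setting isolated the spot
```

The same loop shape applies when exposure or gamma is added to the combination; only the adjusted parameter changes.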
- the image data in which the laser beam is enhanced as described above is acquired by the camera 5.
- although the camera 5 shoots the projector screen 1, on which both the presentation image and the laser light are projected, the shutter speed of the camera 5 has been optimized as described above, so only the laser light appears emphasized in the captured image. Therefore, in the case of a digital camera, when the image is captured by an image sensor such as a CCD, color information of an image in which the laser light is emphasized can be acquired.
- the image size is, for example, 320 × 240 pixels.
- the image data sent from the camera 5 is received by the camera communication unit 37 provided in the computer 3. Transmission and reception between the camera 5 and the computer 3 may be wired or wireless.
- the received image data is stored in the storage unit 32 included in the computer 3 or stored in the data file 33.
- the image data is taken into the computer 3.
- the entire image data is subjected to a blurring process so as to enlarge the light spot area in the image data (see FIGS. 10A and 10B).
- generally known image-processing methods such as averaging (box) blur or Gaussian blur can be used.
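An averaging (box) blur over a grayscale frame can be sketched as below; the radius and image representation (a list of rows of grayscale values) are illustrative assumptions.

```python
def box_blur(img, radius=1):
    """Average each pixel with its (2*radius+1)^2 neighbourhood,
    enlarging small bright spots into larger detectable regions.
    `img` is a list of rows of grayscale values."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = count = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:  # clip at borders
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out
```

After blurring, a one-pixel spot spreads over a 3×3 (or larger) region, which is far easier to pick up with the threshold test that follows.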
- a region extraction process is performed on the image data subjected to the blurring process.
- to detect the region corresponding to the enlarged light spot in the image data, a region whose per-pixel color information value is equal to or larger than a predetermined threshold is extracted (see Fig. 10(C)).
- the threshold value is set in advance to the color information value (RGB value or luminance value) corresponding to the light spot.
- the coordinates (inputX, inputY) of the center of gravity of the extracted area are calculated (see FIG. 10C).
- calculation methods generally known in the field of image processing can be used.
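The threshold extraction and centre-of-gravity calculation together can be sketched as one pass over the image; the threshold value here is an illustrative stand-in for the preset color-information value.

```python
def centroid_above(img, threshold):
    """Return the centre of gravity (inputX, inputY) of all pixels
    whose grayscale value meets the threshold, or None if no pixel
    qualifies. `img` is a list of rows of values."""
    sx = sy = n = 0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if v >= threshold:
                sx += x; sy += y; n += 1
    if n == 0:
        return None  # no light spot in this frame
    return sx / n, sy / n
```

The returned pair corresponds to the (inputX, inputY) fed into the distortion correction step.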
- centroid coordinates (inputX, inputY) are projectively transformed by a projective transformation formula using the eight parameters obtained by the distortion correction parameter acquisition process.
- the corrected center-of-gravity coordinates (AGPointX, AGPointY) may be stored in the computer 3.
- the laser light from the laser pointer 4 projected on the projector screen 1 is accurately detected by the computer 3 and the camera 5, and the movement is tracked. (See Fig. 12 (A)).
- the projector system according to this embodiment, which realizes high-precision tracking of the laser light on the projector screen 1, can realize not only drawing with the laser pointer 4 but also various other functions that further improve user-friendliness. Specifically, while tracking the laser light on the projector screen 1 by the processes 2.1 to 2.6 described above, each application program is executed as necessary based on the coordinate data (AGPointX, AGPointY).
- the overall configuration of the application programs in this embodiment is as shown in FIG. 13; the history recording program shown in FIG. 14, the history reproduction program shown in FIG. 15, the function switching program, the pen drawing program shown in Fig. 17, the mouse emulation program shown in Fig. 19, the presentation software control program shown in Fig. 21, and the translucent display program are incorporated in relation to each other.
- these programs, together with the laser beam tracking program, can be combined into one piece of software and installed in the computer 3.
- FIG. 22 shows an example of a menu screen displayed on the desktop screen when the combined software is installed in the computer 3 and activated. Icons indicating the various functions executed by each program are displayed. This menu screen is projected from the projector 2 onto the projector screen 1, and icons can be operated easily just by projecting the laser light from the laser pointer 4 onto them, greatly improving usability.
- the coordinate data (AGPointX, AGPointY) is transferred from the distortion correction program to the history recording program.
- the time t is, for example, the time when the delivery is performed or the time when the coordinate data is calculated, and is set to 0 when recording starts.
- this coordinate history data represents the laser light projection history on the projector screen 1; it can also be said to represent the history of icon operations and drawing operations performed with the laser light on the menu screen of the projector screen 1 and the like.
- in history reproduction, the coordinate history data is output in order according to the recorded time t.
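Recording corrected coordinates together with a time offset t (zero at the start of recording, as described above) can be sketched as follows; the class name and interface are illustrative assumptions, not names from the patent.

```python
import time

class History:
    """Records corrected light-spot coordinates (AGPointX, AGPointY)
    with a time offset t in seconds since recording started, and
    replays them in recorded order."""
    def __init__(self):
        self.records = []
        self.start = None
    def record(self, x, y, now=None):
        now = time.monotonic() if now is None else now
        if self.start is None:
            self.start = now  # t is 0 when recording starts
        self.records.append((x, y, now - self.start))
    def replay(self):
        for x, y, t in self.records:
            yield x, y, t
```

A reproduction program would consume `replay()` and pace the output by the stored t values to recreate the original pointer movement.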
- a selection region for each icon is determined in advance and the region coordinate data is stored.
- the coordinate history data (AGPointX, AGPointY, t) is received from the history recording program.
- this function is a drawing function on the projector screen 1 using the laser pointer 4; since the laser pointer 4 functions like a pen, it is called the pen drawing function.
- the coordinate history data (AGPointX, AGPointY, t) described above is received in turn from the history recording program, and bitmap data is generated that connects the position on the desktop screen corresponding to one coordinate data item to the position corresponding to the next.
- each coordinate data is connected in the order of the time t, and the drawing is performed (see FIG. 18 (A)).
- interpolation processing with a Bézier curve is performed on the discrete data so that a more natural and smooth curve can be drawn.
- a new curve is created by offsetting the control point in the direction perpendicular to the curve.
- the offset amount is determined to be inversely proportional to the moving speed of the light spot. If this offset curve is increased to multiple lines (for example, about 30), a texture as if writing with a brush can be drawn (see Fig. 18(C)).
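The Bézier interpolation and speed-dependent offset described above can be sketched as follows; the use of a quadratic curve, the chord-based perpendicular direction, and the constant `k` are illustrative assumptions.

```python
import math

def quad_bezier(p0, p1, p2, steps=16):
    """Sample a quadratic Bezier curve defined by control points
    p0, p1, p2, smoothing the discrete tracked coordinates."""
    pts = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        pts.append((x, y))
    return pts

def offset_control(p0, p1, p2, speed, k=100.0):
    """Shift the middle control point perpendicular to the chord
    p0 -> p2 by an amount inversely proportional to the light spot's
    moving speed, so the stroke widens where the pointer moves slowly."""
    dx, dy = p2[0] - p0[0], p2[1] - p0[1]
    norm = math.hypot(dx, dy) or 1.0
    off = k / max(speed, 1e-6)  # inverse proportionality to speed
    return (p1[0] - dy / norm * off, p1[1] + dx / norm * off)
```

Drawing the curves for some thirty offset control points, as the text suggests, stacks near-parallel strokes into a brush-like texture.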
- the mouse emulation function is selected by switching the function with the laser beam as described above.
- more specifically, the mouse emulation function is selected by letting the laser light stay in the area of the mouse icon representing the mouse emulation function on the menu screen for a certain period of time.
- a “move mouse” icon is displayed as shown in FIG.
- the laser light is allowed to stay in the icon area representing the desired mouse function for a certain period of time, so that the selected mouse function can be executed.
- this function sends instructions such as "page forward", "page back", "cue", and "list display" to various software, such as presentation software, by means of the laser light on the projector screen 1, thereby realizing software control.
- the software control function is selected by switching the function with the laser beam as described above. More specifically, the software control function is selected by allowing laser light to stay in the area of the icon representing the software control function on the menu screen for a certain period of time.
- icons indicating various software functions, such as "page back", "cue", and "list display", are displayed.
- the laser beam is allowed to stay in the area of the icon representing the desired software function for a certain period of time, so that an instruction signal for selecting the icon and executing the assigned software function is sent to the software in the computer 3. Can be sent to.
- for example, the presentation can be advanced to the next page by selecting the page-forward icon.
- This function is an option of the drawing function: the bitmap data created as described above is placed in the foreground of the desktop screen and displayed translucently.
- the laser pointer 5 is capable of emitting two or more laser beams (for example, red and green), and the light spot area extraction by the light spot detection program is performed for each color.
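Per-color light-spot extraction can be sketched with a simple per-channel threshold and centroid; the threshold criterion and names here are illustrative assumptions, not the patent's exact detection method:

```python
def extract_spots(image, thresh=200):
    """Extract red and green light-spot centroids from an RGB image.

    image is a list of rows of (r, g, b) tuples. A pixel belongs to the red
    spot when its red channel dominates and exceeds thresh (symmetrically
    for green); the centroid of each pixel set approximates the spot position.
    """
    sums = {"red": [0, 0, 0], "green": [0, 0, 0]}  # [sum_x, sum_y, count]
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if r >= thresh and r > g and r > b:
                color = "red"
            elif g >= thresh and g > r and g > b:
                color = "green"
            else:
                continue
            sums[color][0] += x
            sums[color][1] += y
            sums[color][2] += 1
    return {c: (sx / n, sy / n) for c, (sx, sy, n) in sums.items() if n}
```

Running the extraction once per color lets two pointers (for example, red and green) be tracked independently in the same camera frame.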
- the present invention can also be applied to tracking LED light from an LED pointer.
- the LED pointer is, for example, a pen-type or wand-type pointer with an LED at the tip, and emits red or green LED light in the same way as the laser pointer. Of course, pointers that can emit light of two or more colors can also be used.
- whether the pointer light is laser light from the laser pointer 5 or LED light from an LED pointer, high-precision tracking can be achieved with the same system configuration, tracking processing, and application processing, providing high user friendliness.
- the projector screen 1 can take various forms capable of displaying the image from the projector 2, such as a whiteboard or a white wall, and is not limited to a general screen.
- the present invention can also be applied directly to various types of displays capable of displaying images, such as liquid crystal displays, plasma displays, and projection televisions.
- FIG. 1 is a diagram for explaining an embodiment of the present invention.
- FIG. 2 is a program configuration diagram for explaining an embodiment of the present invention.
- FIG. 3 is a flowchart for explaining a distortion correction parameter acquisition process.
- FIG. 4 is a diagram for explaining a distortion correction parameter acquisition process.
- FIG. 5 is another diagram for explaining a distortion correction parameter acquisition process.
- FIG. 7 is a flowchart for explaining camera image acquisition processing.
- FIG. 8 is a flowchart for explaining camera image input processing.
- FIG. 9 is a flowchart for explaining light spot detection processing.
- FIG. 10 (A), (B), and (C) are diagrams for explaining light spot detection processing.
- FIG. 11 is a flowchart for explaining a distortion correction process.
- FIGS. 12A and 12B are diagrams for explaining distortion correction processing, respectively.
- FIG. 14 is a flowchart for explaining history recording processing.
- FIG. 15 is a flowchart for explaining history reproduction processing.
- FIG. 16 is a flowchart for explaining function switching processing.
- FIG. 17 is a flowchart for explaining pen drawing processing.
- FIG. 18 (A), (B), and (C) are diagrams for explaining pen drawing processing.
- FIG. 19 is a flowchart for explaining mouse emulation processing.
- FIG. 20 is a diagram for explaining mouse emulation processing.
- FIG. 21 is a flowchart for explaining software control processing.
- FIG. 22 shows an example of a menu screen.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Controls And Circuits For Display Device (AREA)
- Projection Apparatus (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/816,033 US8049721B2 (en) | 2005-02-10 | 2006-02-09 | Pointer light tracking method, program, and recording medium thereof |
| EP06713392A EP1855184A4 (en) | 2005-02-10 | 2006-02-09 | PROGRAM AND METHOD FOR TRACKING POINTER LIGHT AND RECORDING MEDIUM FOR SAID PROGRAM |
| JP2007502636A JPWO2006085580A1 (ja) | 2005-02-10 | 2006-02-09 | ポインタ光トラッキング方法、プログラムおよびその記録媒体 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2005035284 | 2005-02-10 | ||
| JP2005-035284 | 2005-02-10 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2006085580A1 true WO2006085580A1 (ja) | 2006-08-17 |
Family
ID=36793151
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2006/302249 Ceased WO2006085580A1 (ja) | 2005-02-10 | 2006-02-09 | ポインタ光トラッキング方法、プログラムおよびその記録媒体 |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US8049721B2 (ja) |
| EP (1) | EP1855184A4 (ja) |
| JP (1) | JPWO2006085580A1 (ja) |
| KR (1) | KR20070105322A (ja) |
| CN (1) | CN101116049A (ja) |
| WO (1) | WO2006085580A1 (ja) |
Cited By (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008225554A (ja) * | 2007-03-08 | 2008-09-25 | Takram Design Engineering:Kk | プロジェクタシステム |
| JP2008293289A (ja) * | 2007-05-24 | 2008-12-04 | Sharp Corp | プロジェクタ |
| WO2009061619A3 (en) * | 2007-11-07 | 2009-08-06 | Omnivision Tech Inc | Apparatus and method for tracking a light pointer |
| CN101430482B (zh) * | 2007-11-05 | 2010-06-09 | 鸿富锦精密工业(深圳)有限公司 | 投影画面操作系统及操作方法 |
| JP2010151997A (ja) * | 2008-12-24 | 2010-07-08 | Brother Ind Ltd | プレゼンテーションシステム及びそのプログラム |
| US7862179B2 (en) | 2007-11-07 | 2011-01-04 | Omnivision Technologies, Inc. | Dual-mode projection apparatus and method for locating a light spot in a projected image |
| WO2011090176A1 (ja) * | 2010-01-22 | 2011-07-28 | 京セラ株式会社 | 投影制御装置及び投影制御方法 |
| US8188973B2 (en) | 2007-11-07 | 2012-05-29 | Omnivision Technologies, Inc. | Apparatus and method for tracking a light pointer |
| JP2012234382A (ja) * | 2011-05-02 | 2012-11-29 | Ricoh Co Ltd | 画像表示システムおよび画像表示方法 |
| JP2013164489A (ja) * | 2012-02-10 | 2013-08-22 | Seiko Epson Corp | 画像表示装置、画像表示システム、および画像表示装置の制御方法 |
| JP2014120023A (ja) * | 2012-12-18 | 2014-06-30 | Seiko Epson Corp | 表示装置、位置検出装置、及び、表示装置の制御方法 |
| WO2015012409A1 (en) * | 2013-07-24 | 2015-01-29 | Ricoh Company, Limited | Information processing device, image projecting system, and computer program |
| JP2015060087A (ja) * | 2013-09-19 | 2015-03-30 | セイコーエプソン株式会社 | プロジェクターシステム、およびプロジェクターシステムの制御方法 |
| JP2015114975A (ja) * | 2013-12-13 | 2015-06-22 | 株式会社東芝 | 電子機器および表示方法 |
| CN107885315A (zh) * | 2016-09-29 | 2018-04-06 | 南京仁光电子科技有限公司 | 一种提高触控系统跟踪精确度的方法 |
| CN111788826A (zh) * | 2018-03-06 | 2020-10-16 | 索尼公司 | 信息处理装置、信息处理方法和程序 |
| EP4250756A1 (en) | 2022-03-24 | 2023-09-27 | FUJIFILM Corporation | Image generation device, image generation method, and image generation program |
Families Citing this family (44)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4404924B2 (ja) * | 2007-09-13 | 2010-01-27 | シャープ株式会社 | 表示システム |
| US20090244492A1 (en) * | 2008-03-28 | 2009-10-01 | Christie Digital Systems Usa, Inc. | Automated geometry correction for rear projection |
| US8733952B2 (en) | 2008-06-17 | 2014-05-27 | The Invention Science Fund I, Llc | Methods and systems for coordinated use of two or more user responsive projectors |
| US20090309826A1 (en) | 2008-06-17 | 2009-12-17 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Systems and devices |
| US8602564B2 (en) | 2008-06-17 | 2013-12-10 | The Invention Science Fund I, Llc | Methods and systems for projecting in response to position |
| US8820939B2 (en) | 2008-06-17 | 2014-09-02 | The Invention Science Fund I, Llc | Projection associated methods and systems |
| US8267526B2 (en) | 2008-06-17 | 2012-09-18 | The Invention Science Fund I, Llc | Methods associated with receiving and transmitting information related to projection |
| US8403501B2 (en) | 2008-06-17 | 2013-03-26 | The Invention Science Fund, I, LLC | Motion responsive devices and systems |
| US8936367B2 (en) | 2008-06-17 | 2015-01-20 | The Invention Science Fund I, Llc | Systems and methods associated with projecting in response to conformation |
| US8308304B2 (en) | 2008-06-17 | 2012-11-13 | The Invention Science Fund I, Llc | Systems associated with receiving and transmitting information related to projection |
| US8641203B2 (en) | 2008-06-17 | 2014-02-04 | The Invention Science Fund I, Llc | Methods and systems for receiving and transmitting signals between server and projector apparatuses |
| US8944608B2 (en) | 2008-06-17 | 2015-02-03 | The Invention Science Fund I, Llc | Systems and methods associated with projecting in response to conformation |
| US8608321B2 (en) | 2008-06-17 | 2013-12-17 | The Invention Science Fund I, Llc | Systems and methods for projecting in response to conformation |
| US8540381B2 (en) | 2008-06-17 | 2013-09-24 | The Invention Science Fund I, Llc | Systems and methods for receiving information associated with projecting |
| US8384005B2 (en) * | 2008-06-17 | 2013-02-26 | The Invention Science Fund I, Llc | Systems and methods for selectively projecting information in response to at least one specified motion associated with pressure applied to at least one projection surface |
| US8723787B2 (en) | 2008-06-17 | 2014-05-13 | The Invention Science Fund I, Llc | Methods and systems related to an image capture projection surface |
| FR2933511A1 (fr) * | 2008-07-04 | 2010-01-08 | Optinnova | Dispositif et procede de visualisation interactif utilisant une camera de detection et un pointeur optique |
| US8248372B2 (en) * | 2009-06-26 | 2012-08-21 | Nokia Corporation | Method and apparatus for activating one or more remote features |
| US20100328214A1 (en) * | 2009-06-27 | 2010-12-30 | Hui-Hu Liang | Cursor Control System and Method |
| KR20110058438A (ko) * | 2009-11-26 | 2011-06-01 | 삼성전자주식회사 | 프리젠테이션 녹화 장치 및 방법 |
| KR20110069958A (ko) * | 2009-12-18 | 2011-06-24 | 삼성전자주식회사 | 프로젝터 기능의 휴대 단말기의 데이터 생성 방법 및 장치 |
| US9100681B2 (en) * | 2010-07-08 | 2015-08-04 | Nokia Technologies Oy | Visual data distribution |
| KR101054895B1 (ko) * | 2010-09-28 | 2011-08-05 | 하이브모션 주식회사 | 원격 포인팅 시스템 |
| JP2012145646A (ja) * | 2011-01-07 | 2012-08-02 | Sanyo Electric Co Ltd | 投写型映像表示装置 |
| CN102622120B (zh) * | 2011-01-31 | 2015-07-08 | 宸鸿光电科技股份有限公司 | 多点触控面板的触碰轨迹追踪方法 |
| CN102841767B (zh) * | 2011-06-22 | 2015-05-27 | 华为终端有限公司 | 多投影拼接几何校正方法及校正装置 |
| JP5927845B2 (ja) * | 2011-11-01 | 2016-06-01 | セイコーエプソン株式会社 | 表示装置、表示装置の制御方法、及び、プログラム |
| CN102611822B (zh) * | 2012-03-14 | 2015-07-01 | 海信集团有限公司 | 投影仪及其投影图像校正方法 |
| CN104918028A (zh) * | 2012-03-21 | 2015-09-16 | 海信集团有限公司 | 一种投影屏幕上的激光点位置去抖动方法 |
| DE102012210065A1 (de) * | 2012-06-14 | 2013-12-19 | Robert Bosch Gmbh | Vorrichtung und Verfahren zum Identifizieren eines Laserzeigers |
| US8922486B2 (en) | 2012-07-24 | 2014-12-30 | Christie Digital Systems Usa, Inc. | Method, system and apparatus for determining locations in a projected image |
| JP2014059678A (ja) * | 2012-09-14 | 2014-04-03 | Ricoh Co Ltd | 画像出力装置、画像出力システム、画像出力プログラム |
| KR101258910B1 (ko) * | 2013-02-07 | 2013-04-29 | (주)유한프리젠 | 프리젠테이션 시스템에서 레이저 포인터를 이용한 영상 판서 방법 |
| JP5830055B2 (ja) * | 2013-05-31 | 2015-12-09 | 京セラドキュメントソリューションズ株式会社 | 画像処理装置および画像処理システム |
| JP2015094768A (ja) * | 2013-11-08 | 2015-05-18 | セイコーエプソン株式会社 | 表示装置、表示システムおよび制御方法 |
| JP6488653B2 (ja) * | 2014-11-07 | 2019-03-27 | セイコーエプソン株式会社 | 表示装置、表示制御方法および表示システム |
| CN105653025B (zh) * | 2015-12-22 | 2019-12-24 | 联想(北京)有限公司 | 一种信息处理方法和电子设备 |
| CN106896746A (zh) * | 2017-01-09 | 2017-06-27 | 深圳前海勇艺达机器人有限公司 | 具有主持会议功能的机器人 |
| TWI629617B (zh) * | 2017-04-19 | 2018-07-11 | 中原大學 | 投影幕雷射筆偵測定位系統與方法 |
| CN111131800B (zh) * | 2018-10-31 | 2022-02-18 | 中强光电股份有限公司 | 影像拼接融合方法与投影系统 |
| CN111131799B (zh) | 2018-10-31 | 2021-11-19 | 中强光电股份有限公司 | 影像拼接融合方法、投影系统与其处理装置 |
| US11224798B2 (en) | 2018-12-27 | 2022-01-18 | Mattel, Inc. | Skills game |
| KR102433603B1 (ko) * | 2022-03-08 | 2022-08-18 | (주)에이블소프트 | 전자칠판용 적외선 터치펜의 인식 좌표 검출 및 인식 좌표 보정 시스템 |
| CN115103169B (zh) * | 2022-06-10 | 2024-02-09 | 深圳市火乐科技发展有限公司 | 投影画面校正方法、装置、存储介质以及投影设备 |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1139095A (ja) | 1997-07-18 | 1999-02-12 | Canon Inc | プレゼンテーション装置、その方法、及びペン入力装置 |
| JPH1185395A (ja) | 1997-09-08 | 1999-03-30 | Sharp Corp | ポインティング機能付き液晶プロジェクタ装置 |
| JP2000276297A (ja) * | 1999-03-25 | 2000-10-06 | Seiko Epson Corp | ポインティング位置検出装置、ポインティング位置検出方法、プレーゼンテーションシステムおよび情報記憶媒体 |
| WO2001003106A1 (en) | 1999-07-06 | 2001-01-11 | Hansen Karl C | Computer presentation system and method with optical tracking of wireless pointer |
| WO2001047285A1 (en) | 1999-12-23 | 2001-06-28 | Justsystem Corporation | Method and apparatus for calibrating projector-camera system |
| WO2003056505A1 (en) | 2001-12-21 | 2003-07-10 | British Telecommunications Public Limited Company | Device and method for calculating a location on a display |
| JP2004265235A (ja) | 2003-03-03 | 2004-09-24 | Matsushita Electric Ind Co Ltd | プロジェクタシステム、プロジェクタ装置、画像投射方法 |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6292171B1 (en) * | 1999-03-31 | 2001-09-18 | Seiko Epson Corporation | Method and apparatus for calibrating a computer-generated projected image |
| US20010010514A1 (en) * | 1999-09-07 | 2001-08-02 | Yukinobu Ishino | Position detector and attitude detector |
| US6704000B2 (en) * | 2000-11-15 | 2004-03-09 | Blue Iris Technologies | Method for remote computer operation via a wireless optical device |
-
2006
- 2006-02-09 WO PCT/JP2006/302249 patent/WO2006085580A1/ja not_active Ceased
- 2006-02-09 US US11/816,033 patent/US8049721B2/en not_active Expired - Fee Related
- 2006-02-09 CN CNA200680004415XA patent/CN101116049A/zh active Pending
- 2006-02-09 EP EP06713392A patent/EP1855184A4/en not_active Withdrawn
- 2006-02-09 JP JP2007502636A patent/JPWO2006085580A1/ja active Pending
- 2006-02-09 KR KR1020077018106A patent/KR20070105322A/ko not_active Ceased
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1139095A (ja) | 1997-07-18 | 1999-02-12 | Canon Inc | プレゼンテーション装置、その方法、及びペン入力装置 |
| JPH1185395A (ja) | 1997-09-08 | 1999-03-30 | Sharp Corp | ポインティング機能付き液晶プロジェクタ装置 |
| JP2000276297A (ja) * | 1999-03-25 | 2000-10-06 | Seiko Epson Corp | ポインティング位置検出装置、ポインティング位置検出方法、プレーゼンテーションシステムおよび情報記憶媒体 |
| WO2001003106A1 (en) | 1999-07-06 | 2001-01-11 | Hansen Karl C | Computer presentation system and method with optical tracking of wireless pointer |
| JP2003504705A (ja) * | 1999-07-06 | 2003-02-04 | ハンセン、カール、シー. | 無線ポインタの光学トラッキングを備えるコンピュータ・プレゼンテーション・システムおよび方法 |
| WO2001047285A1 (en) | 1999-12-23 | 2001-06-28 | Justsystem Corporation | Method and apparatus for calibrating projector-camera system |
| WO2003056505A1 (en) | 2001-12-21 | 2003-07-10 | British Telecommunications Public Limited Company | Device and method for calculating a location on a display |
| JP2004265235A (ja) | 2003-03-03 | 2004-09-24 | Matsushita Electric Ind Co Ltd | プロジェクタシステム、プロジェクタ装置、画像投射方法 |
Non-Patent Citations (3)
| Title |
|---|
| R. SUKTHANKAR; R. STOCKTON; M. MULLIN.: "Self-Calibrating Camera-Assisted Presentation Interface", PROCEEDINGS OF INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION, 2000 |
| R. SUKTHANKAR; R. STOCKTON; M. MULLIN.: "Smarter Presentation: Exploiting Homography in Camera-Projector Systems", PROCEEDINGS OF INTERNATIONAL CONFERENCE ON COMPUTER VISION, 2001 |
| See also references of EP1855184A4 |
Cited By (23)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008225554A (ja) * | 2007-03-08 | 2008-09-25 | Takram Design Engineering:Kk | プロジェクタシステム |
| JP2008293289A (ja) * | 2007-05-24 | 2008-12-04 | Sharp Corp | プロジェクタ |
| CN101430482B (zh) * | 2007-11-05 | 2010-06-09 | 鸿富锦精密工业(深圳)有限公司 | 投影画面操作系统及操作方法 |
| WO2009061619A3 (en) * | 2007-11-07 | 2009-08-06 | Omnivision Tech Inc | Apparatus and method for tracking a light pointer |
| US7862179B2 (en) | 2007-11-07 | 2011-01-04 | Omnivision Technologies, Inc. | Dual-mode projection apparatus and method for locating a light spot in a projected image |
| US8188973B2 (en) | 2007-11-07 | 2012-05-29 | Omnivision Technologies, Inc. | Apparatus and method for tracking a light pointer |
| JP2010151997A (ja) * | 2008-12-24 | 2010-07-08 | Brother Ind Ltd | プレゼンテーションシステム及びそのプログラム |
| US8890811B2 (en) | 2010-01-22 | 2014-11-18 | Kyocera Corporation | Projection controlling apparatus and projection controlling method |
| WO2011090176A1 (ja) * | 2010-01-22 | 2011-07-28 | 京セラ株式会社 | 投影制御装置及び投影制御方法 |
| JP2011150609A (ja) * | 2010-01-22 | 2011-08-04 | Kyocera Corp | 投影制御装置及び投影方法、並びに投影制御用コンピュータプログラム |
| JP2012234382A (ja) * | 2011-05-02 | 2012-11-29 | Ricoh Co Ltd | 画像表示システムおよび画像表示方法 |
| JP2013164489A (ja) * | 2012-02-10 | 2013-08-22 | Seiko Epson Corp | 画像表示装置、画像表示システム、および画像表示装置の制御方法 |
| JP2014120023A (ja) * | 2012-12-18 | 2014-06-30 | Seiko Epson Corp | 表示装置、位置検出装置、及び、表示装置の制御方法 |
| WO2015012409A1 (en) * | 2013-07-24 | 2015-01-29 | Ricoh Company, Limited | Information processing device, image projecting system, and computer program |
| JP2015043556A (ja) * | 2013-07-24 | 2015-03-05 | 株式会社リコー | 情報処理装置、画像投影システム、及びプログラム |
| JP2015060087A (ja) * | 2013-09-19 | 2015-03-30 | セイコーエプソン株式会社 | プロジェクターシステム、およびプロジェクターシステムの制御方法 |
| JP2015114975A (ja) * | 2013-12-13 | 2015-06-22 | 株式会社東芝 | 電子機器および表示方法 |
| CN107885315A (zh) * | 2016-09-29 | 2018-04-06 | 南京仁光电子科技有限公司 | 一种提高触控系统跟踪精确度的方法 |
| CN107885315B (zh) * | 2016-09-29 | 2018-11-27 | 南京仁光电子科技有限公司 | 一种提高触控系统跟踪精确度的方法 |
| CN111788826A (zh) * | 2018-03-06 | 2020-10-16 | 索尼公司 | 信息处理装置、信息处理方法和程序 |
| EP4250756A1 (en) | 2022-03-24 | 2023-09-27 | FUJIFILM Corporation | Image generation device, image generation method, and image generation program |
| JP2023142262A (ja) * | 2022-03-24 | 2023-10-05 | 富士フイルム株式会社 | 画像生成装置、画像生成方法、及び画像生成プログラム |
| US12483796B2 (en) | 2022-03-24 | 2025-11-25 | Fujifilm Corporation | Image generation device, image generation method, and image generation program |
Also Published As
| Publication number | Publication date |
|---|---|
| EP1855184A4 (en) | 2008-12-10 |
| US20090021480A1 (en) | 2009-01-22 |
| JPWO2006085580A1 (ja) | 2008-06-26 |
| US8049721B2 (en) | 2011-11-01 |
| KR20070105322A (ko) | 2007-10-30 |
| EP1855184A1 (en) | 2007-11-14 |
| CN101116049A (zh) | 2008-01-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2006085580A1 (ja) | ポインタ光トラッキング方法、プログラムおよびその記録媒体 | |
| JP5122641B2 (ja) | カメラとマーク出力とによるポインティング装置 | |
| JP3640156B2 (ja) | 指示位置検出システムおよび方法、プレゼンテーションシステム並びに情報記憶媒体 | |
| CN106664465B (zh) | 用于创建和再现增强现实内容的系统以及使用其的方法 | |
| JP5870639B2 (ja) | 画像処理システム、画像処理装置、及び画像処理プログラム | |
| JP5560721B2 (ja) | 画像処理装置、画像表示システム、及び画像処理方法 | |
| CN101562703A (zh) | 用于在成像设备内执行基于触摸的调整的方法和装置 | |
| EP2880508A2 (en) | Improved identification of a gesture | |
| CN107659769A (zh) | 一种拍摄方法、第一终端及第二终端 | |
| US11277567B2 (en) | Electronic apparatus, control method of electronic apparatus and non-transitory computer readable medium | |
| JP6028589B2 (ja) | 入力プログラム、入力装置および入力方法 | |
| US9875565B2 (en) | Information processing device, information processing system, and information processing method for sharing image and drawing information to an external terminal device | |
| JP7198043B2 (ja) | 画像処理装置、画像処理方法 | |
| KR101613438B1 (ko) | 증강현실 컨텐츠의 생성 및 재생 시스템과, 이를 이용한 방법 | |
| JP2000276297A (ja) | ポインティング位置検出装置、ポインティング位置検出方法、プレーゼンテーションシステムおよび情報記憶媒体 | |
| US11048400B2 (en) | Electronic apparatus, control method of electronic apparatus, and non-transitory computer readable medium | |
| JP4500036B2 (ja) | 画像投影表示装置、画像投影表示方法および画像投影表示プログラム | |
| KR101518696B1 (ko) | 증강현실 컨텐츠의 생성 및 재생 시스템과, 이를 이용한 방법 | |
| JP6269227B2 (ja) | 表示装置、プロジェクター、および表示制御方法 | |
| JP2010068222A (ja) | 顔検出制御装置、顔検出制御プログラムおよび顔検出制御方法 | |
| GB2583813A (en) | Electronic apparatus for controlling display of image, control method of electronic apparatus and program | |
| US12348702B2 (en) | Electronic device and method for controlling electronic device | |
| US12445700B2 (en) | Electronic device, control method of an electronic device, and non-transitory computer readable medium | |
| JP2008076445A (ja) | 投影型表示装置、投影型表示方法、該方法を実行させるプログラム及び該プログラムを格納したコンピュータ読み取り可能な記録媒体 | |
| HK1110131A (en) | Pointer light tracking method, program, and recording medium thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 2007502636 Country of ref document: JP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 1020077018106 Country of ref document: KR |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 200680004415.X Country of ref document: CN |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 11816033 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2006713392 Country of ref document: EP |
|
| WWP | Wipo information: published in national office |
Ref document number: 2006713392 Country of ref document: EP |