
HK1035790B - Interactive display presentation system - Google Patents


Info

Publication number
HK1035790B
HK1035790B HK01106376.0A HK01106376A HK1035790B HK 1035790 B HK1035790 B HK 1035790B HK 01106376 A HK01106376 A HK 01106376A HK 1035790 B HK1035790 B HK 1035790B
Authority
HK
Hong Kong
Prior art keywords
image
display
control system
laser
laser spot
Prior art date
Application number
HK01106376.0A
Other languages
Chinese (zh)
Other versions
HK1035790A1 (en)
Inventor
林尚宏
Original Assignee
Seiko Epson Corporation (精工爱普生株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/399,933 external-priority patent/US6346933B1/en
Application filed by Seiko Epson Corporation (精工爱普生株式会社)
Publication of HK1035790A1 publication Critical patent/HK1035790A1/en
Publication of HK1035790B publication Critical patent/HK1035790B/en

Description

Interactive display system
Technical Field
The present invention relates to a display system employing a display computer, a computer controlled image projector and a projection screen, and in particular to an interactive display system in which control is achieved by moving the beam of a laser pointer across the projection screen in a predetermined spatial pattern.
Background
Charts, text, and various patterns are typically displayed to viewers in meetings and classrooms by optically projecting these images onto a projection screen or wall. LCD (liquid crystal display) projectors are commonly used as image sources, where graphics, text, and images are electronically produced by a display computer, such as a personal computer (PC) or laptop computer, which typically runs display generation software such as Microsoft PowerPoint. In such display systems, the PC provides a video output, such as standard VGA, Super VGA, or XGA. These LCD/PC projection display systems are becoming more prevalent than traditional overhead projectors and slides.
The operator, whether standing at a podium or walking in front of the audience, cannot directly control the image displayed on the projection screen when using a conventional LCD/PC projection display system. Conventional systems require the operator to return to the display computer, or require an assistant seated at the computer, to control the display. At the display computer, the instructor or assistant controls the displayed image by keyboard input or by "mouse commands" made with the cursor on an appropriate area of the computer monitor screen. Walking to the display computer or communicating with an assistant disrupts the natural progress and flow of the presentation. Ideally, the operator would remain close to the audience while interactively controlling the display or changing the image appearing on the projection screen, without repeatedly diverting his attention from the audience to an assistant or to the display computer.
For example, with the control system disclosed in U.S. Patent No. 5,782,548, issued to Miyashita, an operator uses a remote control device to wirelessly transmit control signals to sensors on the projector. This system requires the operator, when he wishes to change the displayed image, to turn away from his audience and the projection screen and direct his attention to the projector via the remote control device. In addition, although the operator gains some flexibility through the remote control device, this flexibility is limited because he must stay within range of the projector sensor.
It is known in the art to use a laser pointer to project a laser spot onto the area where an image is displayed. Such systems typically require multiple steps or actions to exercise control over the display, and this limits the natural progress and flow of the presentation. For example, U.S. Patent No. 5,682,181 to Nguyen et al. discloses a system in which an operator must invoke a drop-down menu before being able to select a particular function, such as draw mode, page up, and so on. U.S. Patent No. 5,502,459 to Marshall et al. discloses a method of emulating a double mouse click in which the user must activate, deactivate, reactivate, and again deactivate the laser pointer while holding the projected laser spot within a rectangular area of the display image. Other systems require components additional to the conventional LCD/PC projection display system. For example, U.S. Patent No. 5,504,501 provides remote control by a laser pointer, but requires a filter to separate the laser spot from the surrounding display image.
The prior art does not provide an interactive display system in which an operator can control the display, for example by providing mouse commands, and maintain a high degree of interaction with the audience.
Object of the Invention
It is therefore an object of the present invention to provide a display system which overcomes the drawbacks of conventional display systems.
It is another object of the present invention to provide such a system in which mouse commands and other display functions, such as proceeding to the next image, underlining, zooming in, or highlighting, may be performed by a simple swipe of the laser pointer. Other objects of the invention will in part become more apparent when the following detailed description is read.
According to the present invention there is provided a display control system for controlling a projection display device comprising a display computer for producing initial and subsequent electronic images, a computer-controlled image projector for receiving the initial electronic image and projecting it onto a projection screen as the initial display image, a projection screen, and a laser pointer operable in pulsed and continuous radiation modes, the laser pointer providing control when its laser spot is projected onto the projection screen in a spatial pattern over the initial display image, the control system comprising: an electronic camera directed at the initial display image and producing a series of acquired images, each of said acquired images comprising image elements; and a control module responsive to the acquired images, wherein the control module comprises: means for locating the laser spot in said acquired images; means for assigning a position coordinate to each of said located laser spots so as to produce a series of position coordinates corresponding to sequential positions of the laser spot in the initial display image; means for analyzing the series of position coordinates in order to capture a gesture spatial pattern formed by the series of position coordinates; means for matching the captured gesture spatial pattern to one of a set of predetermined gesture spatial patterns; means for selecting a display command, the selected display command being associated with the matching predetermined gesture spatial pattern; and means for generating a subsequent electronic image in response to the selected display command, the subsequent electronic image being transmitted to the image projector for projection onto the projection screen as a subsequent display image.
Alternatively, the control module comprises: a memory portion including a set of predetermined gesture spatial patterns and a set of display commands, each of said display commands corresponding to one of said predetermined gesture spatial patterns; positioning circuitry for locating the laser spot in said acquired images; image processing circuitry responsive to the locations of the laser spot in said acquired images, said image processing circuitry for assigning a position coordinate to each of said located laser spots, for generating a series of position coordinates corresponding to sequential positions of the laser spot in the initial display image, and for analyzing said series of position coordinates so as to capture a gesture spatial pattern formed by said series of position coordinates; and command circuitry responsive to said captured gesture spatial pattern for matching said captured gesture spatial pattern with one of said predetermined gesture spatial patterns, for selecting the one of said display commands corresponding to said matched predetermined gesture spatial pattern, and for generating mouse commands in response to said selected display commands, the mouse commands being sent to the display computer so that subsequent display images are projected onto the projection screen.
According to the present invention there is also provided a method of controlling a display apparatus comprising a computer-controlled image projector for projecting a sequence of display images onto a projection screen, control being effected by directing a laser pointer at the projection screen so that its laser spot moves across the display images in a predetermined gesture spatial pattern, said method comprising the steps of: acquiring a series of images corresponding to the projected display image and the laser spot, each acquired image comprising a plurality of pixels; assigning a series of position coordinates to the positions of the laser spot in the acquired images; analyzing the series of position coordinates to identify a gesture spatial pattern produced by the laser spot; and generating a display command in response to recognition of the gesture spatial pattern.
The present invention provides an interactive display control system in which an instructor controls the display by tracing a predetermined spatial pattern on the projection screen with the laser spot of a laser pointer. The predetermined spatial pattern is acquired and recognized by the control system, and a corresponding display command is then issued. During a presentation, the display computer generates a bitmap corresponding to the image on its local display. The bitmap is sent to the LCD projector and then projected onto the screen. The display is controlled by monitoring the laser spot projected onto the screen. Laser spot control is achieved by a control system comprising a digital camera, which acquires successive images of the laser spot, and a control module, which includes a processing section that analyzes the positions of the laser spot on the display image. When the processing section matches the successive positions of the laser spot to a predetermined spatial pattern, the corresponding display command is issued. The display command may be sent to the display computer, which responds with an action such as proceeding to the next electronic image, or the display command may perform a function in the control module, such as highlighting text or zooming in on a portion of the displayed image.
Drawings
The invention is described below with reference to the accompanying drawings, in which:
FIG. 1 is a schematic diagram of a first embodiment of an interactive display system including a computer-controlled image projector, a projection screen, a laser pointer, and a control system according to the present invention;
FIG. 2 is a plan view of a conventional display system showing an operator interacting primarily with a display computer while a listener is viewing a display image on a projection screen;
FIG. 3 is a plan view showing the interactive display system of FIG. 1 with the instructor interacting primarily with the audience and with the display image displayed on the projection screen;
FIG. 4 is a schematic view of a display image on the projection screen of the system of FIG. 1 showing a laser point generated by a laser pointer in the display image;
FIG. 5 illustrates how the position coordinates of the laser points of FIG. 4 are determined;
FIG. 6 is a schematic diagram of the Freeman chain code scheme used in the present invention to describe the motion of the laser spot on the display image of FIG. 4;
FIG. 7 is a graphical representation of a predetermined spatial pattern projected by an instructor onto a projection screen, the movement of the laser spot being described by a series of Freeman chain codes and identified as a right-hand arrow; and
fig. 8 is a schematic diagram of a second embodiment of an interactive display system according to the present invention.
Detailed Description
Fig. 1 shows a first embodiment of an interactive display system 10 comprising, in accordance with the invention, a display computer 21, a computer-controlled image projector 23, a projection screen 12, a laser pointer 11, and a display control system 30. The image projector 23 generally includes a light source 25 and an image panel 27 disposed between the light source 25 and the projection screen 12. The display control system 30 includes a digital camera 31, or similar imaging device, and a control module 33. During operation of display system 10, display computer 21 generates an initial electronic image 22a from a presentation run by display generation software resident in display computer 21. The presentation comprises a set of electronic images 22a, 22b, ..., 22k for projection during display. The initial electronic image 22a is sent by the display computer 21, preferably as a bitmap image, to the control module 33 for subsequent transmission to the image panel 27 as the projected image 29. The projected image 29 is then projected onto projection screen 12 as the corresponding initial display image 13.
The instructor 37 interactively controls the display system 10 by projecting the laser beam 15 from the laser pointer 11 onto the display image 13 as a laser spot 16. The laser pointer 11 is preferably a commercially available device capable of emitting continuous and pulsed radiation beams with a wavelength between 630 and 670 nm. The instructor 37 moves the laser spot 16 across the projection screen 12 to trace a predetermined gesture spatial pattern and thereby control the display system 10. The gesture spatial pattern, which may comprise a horizontal underline, an arrow, a box enclosing a portion of the image, or some other predetermined geometric figure, is then recognized by the control module 33 and the corresponding predetermined display command is selected. The display command may perform a function in the control module 33 or may be sent to the display computer 21, as described in more detail below. For example, in one mode of operation, display computer 21 receives a display command and invokes the subsequent slide 22b for transmission to the image projector 23 via the control module 33.
The advantages of the system of the present invention can be described with reference to fig. 2 and 3. In a conventional display, as shown in FIG. 2, the operator 17 primarily interacts with the display computer 21, or an assistant seated at the display computer 21, to control the display on the projection screen 12, as indicated by the solid arrow 41. Thus, operator 17 has limited interaction with the display on screen 12, as indicated by dashed arrow 46. When viewing the display, members of the audience 19 direct their attention primarily to the display, as indicated by the solid arrow 43, and less to the operator 17, as indicated by the dashed arrow 45. In contrast, with the interactive display system 10 disclosed herein, the instructor 37 requires little or no interaction with the display computer 21 or with the display control system 30, and is therefore able to interact with the audience 19, as indicated by the solid arrow 41, and with the display on the projection screen 12, via the laser pointer 11, as indicated by the solid arrow 48.
During operation of the display control system 30 of fig. 1, the digital camera 31 is directed at the projection screen 12 to continuously acquire the currently appearing display image 13, preferably at a sampling rate of about 10 frames per second. The control module 33 receives these as respective successive acquired images 63a, 63b, .... It will be apparent to those skilled in the art that, while the display image 13 remains unchanged on the projection screen 12, the acquired images 63a, 63b, ..., 63n are substantially identical except for the shift in the position of the laser spot 16 between successive acquired images. Successive positions of the laser spot 16 are extracted, retained in memory 34, and analyzed to identify the gesture spatial pattern projected onto the projection screen 12 by the instructor 37.
The control module 33 comprises a processing section 35 which detects the laser spot 16 in the successive acquired images 63a, 63b, .... Since the laser spot 16 comprises pixels of greater brightness than any of the pixels making up the display image 13, the processing section 35 can locate the laser spot 16 by detecting pixels whose brightness exceeds a predetermined threshold, where the threshold is greater than the maximum brightness level present in the display image. The processing section 35 searches the display image 13 for a contiguous set of such bright pixels, where the set has a height or width no greater than the size of the laser spot 16. Preferably, the detection of the laser spot 16 is first performed in a low-resolution mode (e.g., where the acquired images 63a, 63b, ..., 63n are treated as images of 80 × 60 pixels) in order to find the approximate location of the laser spot 16 in the display image 13. For example, fig. 4 shows the laser spot 16 positioned on a flow chart forming the display image 13 on the projection screen 12. Once the approximate position of the laser spot 16 is determined, the display image 13 is analyzed at a higher resolution of 160 × 120 pixels, as shown in fig. 5, and the position of the laser spot 16 is correspondingly determined more accurately, here at the position coordinates (128, 114).
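A minimal sketch of this coarse-to-fine search, assuming grayscale frames and the 80 × 60 and 160 × 120 grids mentioned above (stored row-major as 60 × 80 and 120 × 160 arrays). The threshold value and function names are illustrative assumptions, not part of the patent:

```python
import numpy as np

def locate_laser_spot(frame, threshold=240, coarse=(60, 80), fine=(120, 160)):
    """Two-stage laser-spot search over a grayscale frame (2-D uint8 array).

    A coarse pass over a downsampled image checks whether any block exceeds
    the brightness threshold; a second pass at higher resolution refines the
    location.  Returns (x, y) in the fine grid, or None when no spot is found.
    """
    def downsample(img, shape):
        # Block-reduce to (rows, cols), keeping each block's maximum so a
        # small bright spot is not averaged away.
        rows, cols = shape
        bh, bw = img.shape[0] // rows, img.shape[1] // cols
        return img[:rows * bh, :cols * bw].reshape(rows, bh, cols, bw).max(axis=(1, 3))

    coarse_img = downsample(frame, coarse)
    if coarse_img.max() <= threshold:
        return None                       # no pixel brighter than the display
    fine_img = downsample(frame, fine)
    fy, fx = np.unravel_index(np.argmax(fine_img), fine_img.shape)
    return (int(fx), int(fy))             # column (x), row (y) at fine resolution
```

With a 480 × 640 camera frame, a bright patch near the lower right maps to fine-grid coordinates on the order of the (128, 114) example given above.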
In fig. 1, the processing section 35 assigns successive position coordinates to the position of the laser spot 16 for each of the acquired images 63a, 63b, ..., 63n in which the laser spot 16 is detected. The successive position coordinates are stored in the memory 34 and the emerging spatial pattern is analyzed. The control module 33 also includes a reference portion 36 in which are stored a set of predetermined gesture spatial patterns and a set of pre-assigned display commands, each display command being assigned to (or corresponding to) one of the predetermined gesture spatial patterns. When a spatial pattern is recognized, the processing section 35 compares the recognized spatial pattern with the set of predetermined gesture spatial patterns. If a match is found, the processing section 35 selects the display command assigned to the matching predetermined gesture spatial pattern and executes the selected display command.
When sent to the display computer 21, the selected display command is received through the input port 65. In a preferred embodiment, the input port 65 comprises a serial port and the display commands are sent over the serial or USB cable 39. It will be appreciated by those skilled in the art that the processing portion 35 may reside in the control module 33 as shown, or may also reside in the digital camera 31 or display computer 21, may be retrieved from the storage medium 67 as computer readable instructions, or may be distributed among the digital camera 31, the control module 33 and the display computer 21.
When the laser spot 16 is detected in each of the acquired images 63a, 63b, ..., 63n, the processing section 35 assigns position coordinates (e.g., (xi, yi)) to the successive positions of the laser spot 16 to determine whether the laser spot 16 is moving and, if so, whether the movement is random or corresponds to one of a plurality of predetermined gesture spatial patterns. Motion and direction are determined by analyzing the offset vectors between successive positions. Preferably, a spatial pattern recognition tool known in the art as the Freeman chain code is used to determine the motion of the laser spot 16 and to detect the formation of arrows, boxes, or other geometric figures. The Freeman chain code assigns a number to each translation vector of the laser spot 16, that is, to the motion vector describing the movement from one acquired image 63i to the next acquired image 63i+1. As shown in fig. 6, the vertical motion represented by direction vector 71 is assigned "code 1", the diagonal motion into the third quadrant represented by direction vector 73 is assigned "code 6", and so on. The processing section 35 distinguishes random movement of the laser spot 16 from an intended gesture made by the instructor 37 by analyzing the resulting sequence of Freeman codes.
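The chain-code assignment just described can be sketched as follows, using the common Freeman convention (code 0 pointing right, codes increasing counter-clockwise in 45° steps, with +y pointing up). The patent's figure may number the directions differently, so the specific code values here are an assumption:

```python
import math

def freeman_code(dx, dy, n_dirs=8):
    """Quantize a displacement (dx, dy) into one of 8 Freeman chain codes.

    Common convention (an assumption): code 0 = +x (right), codes increase
    counter-clockwise in 45-degree steps.
    """
    angle = math.atan2(dy, dx)               # angle in the range -pi..pi
    return round(angle / (math.pi / 4)) % n_dirs

def chain_code(points):
    """Convert successive (x, y) laser-spot coordinates to a chain-code list."""
    return [freeman_code(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(points, points[1:])]
```

For n acquired spot positions this yields n − 1 codes, one per pair of successive frames.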
For example, fig. 7 shows how a Freeman chain code is used to describe an acquired gesture spatial pattern. The control module 33 captures the position coordinates of the laser spot 16 in five initial successive images, where the laser spot movement across the four successive image pairs is described by a series of four "code 4"s represented by direction vectors 75a, 75b, 75c, and 75d. The movement of the laser spot 16 in the five subsequent images (i.e. the acquired images 63f to 63j, not shown) is described by a series of five "code 6"s represented by direction vectors 77a, 77b, 77c, 77d, and 77e respectively.
The processing portion 35 compares the captured gesture spatial sequence of four consecutive "code 4"s followed by five consecutive "code 6"s with the predetermined gesture spatial pattern entries provided to the processing portion 35 from the reference portion 36, which may be random access memory, firmware, or a look-up table. A match indicates that the captured gesture constitutes the right-hand arrow 51. For captured gesture spatial patterns constituting random sequences of differing Freeman chain codes, the processing section 35 determines that random laser motion is occurring and issues no display command.
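The run-length matching of a code sequence against stored gesture entries might look like this sketch. The gesture table and command names are hypothetical, and the code numbering follows the common Freeman convention rather than the patent's figure:

```python
def compress(codes):
    """Run-length encode a chain-code sequence: [4,4,4,6,6] -> [(4, 3), (6, 2)]."""
    runs = []
    for c in codes:
        if runs and runs[-1][0] == c:
            runs[-1] = (c, runs[-1][1] + 1)
        else:
            runs.append((c, 1))
    return runs

# Hypothetical gesture table: the key is the sequence of dominant directions
# (runs at least `min_run` codes long) mapped to a display command.
GESTURES = {
    (0, 7): "next_slide",      # rightward stroke, then down-right: right arrow
    (4, 5): "previous_slide",  # leftward stroke, then down-left: left arrow
}

def match_gesture(codes, min_run=3):
    """Return the display command for a chain-code sequence, or None for random motion."""
    key = tuple(c for c, n in compress(codes) if n >= min_run)
    return GESTURES.get(key)
```

Filtering out short runs is one simple way to reject the random jitter the paragraph above distinguishes from intended gestures.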
Display system 10 may be used in a "mouse mode" to project a series of PowerPoint slides as the display images 13. For example, if the gesture spatial pattern drawn by the instructor 37 constitutes the right-hand arrow 51, the display computer 21 may respond by invoking the next slide in the series to be sent to the image projector 23. That is, a "proceed to the next slide" command would correspond to, or be assigned to, the right-hand arrow 51. In the same way, if the gesture spatial pattern constitutes a left-hand arrow, the previous slide is recalled and provided as the projected image 29. In addition, other arrow gestures may be assigned predetermined display commands to change the sound volume or other features of the displayed slide.
The instructor 37 may also position the laser spot 16 in a particular area of the display image 13, for example on an icon or on a menu item. The processing section 35 interprets this as selecting the icon or menu item, equivalent to a first mouse click. After this selection action, a second mouse click is effected by pulsing the laser pointer 11 at a predetermined time interval (preferably three seconds) while the laser spot 16 remains on the icon or menu item. In addition, positioning the laser spot 16 at the boundary of the display image 13 invokes a drop-down menu for subsequent selection and action. A subsequent single pulse of the laser spot 16 on the selected area is then interpreted by the processing section 35 as a second mouse click.
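The click behaviour described above can be sketched as a small state machine over per-frame spot positions (None when the pointer is momentarily off). The 0.1 s frame period, three-second dwell, and pixel tolerance echo the values suggested in the text; the function shape is otherwise an illustrative assumption:

```python
def interpret_clicks(samples, frame_period=0.1, dwell=3.0, tol=5):
    """Interpret per-frame laser-spot positions as mouse clicks (a sketch).

    `samples` is a list of (x, y) tuples, or None for frames where the
    pointer is off.  A spot held near one position for `dwell` seconds
    yields a first click; a subsequent off/on pulse at the same place
    yields a second click.
    """
    dwell_frames = round(dwell / frame_period)
    clicks = []
    anchor, held, was_off = None, 0, False
    for pos in samples:
        if pos is None:                             # pointer pulsed off
            was_off = anchor is not None and held >= dwell_frames
            continue
        x, y = pos
        if anchor is not None and abs(x - anchor[0]) <= tol and abs(y - anchor[1]) <= tol:
            held += 1
            if held == dwell_frames and not clicks:
                clicks.append(("click", anchor))    # first click: steady dwell
            elif was_off and len(clicks) == 1:
                clicks.append(("click", anchor))    # second click: pulse at same spot
                was_off = False
        else:                                       # spot moved: restart the dwell
            anchor, held, was_off = (x, y), 0, False
    return clicks
```

Counting frames rather than accumulating floating-point seconds keeps the dwell comparison exact at the 10 frames-per-second sampling rate mentioned earlier.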
In this way, the laser beam 15 can be projected onto the display image 13 from substantially any location in the presentation room for controlling the display, e.g. opening a new document, cutting or pasting. Thus, the laser pointer 11 is used for the display controller 30 in the same manner as a mouse, keyboard or other peripheral device used to control the projected image 29 by direct input to the display computer 21. This control method is particularly useful in presentations where the instructor 37 needs to access and browse the internet while talking and walking around.
In addition, a "drawing mode" may be activated by pulsing the laser pointer 11 to generate three consecutive mouse clicks. In the drawing mode, the laser spot 16 may be used to draw a straight line or a curve on the display image 13. To exit the drawing mode, the laser pointer 11 is pulsed to generate four mouse clicks. Other functions may also be performed by drawing during the presentation. In a preferred embodiment, for example, when the instructor 37 draws a horizontal line across a text or image portion of the display image 13, the processing section 35 generates the subsequent electronic image 24 by combining the initial electronic image 22a with an overlay that provides an accent or underline corresponding to the horizontal line traced by the laser spot 16.
The resulting electronic image 24 is then sent to the projector 23 to be projected as the display image 13 with the corresponding text or image portion emphasized by underlining or highlighting. In addition, when the instructor 37 draws a rectangle enclosing a portion of the display image 13, the corresponding portion of the initial slide 22a is enlarged and transmitted as the resulting electronic image 24, thereby realizing a zoom-in operation.
In another embodiment, as shown in FIG. 8, the image projector 23 is directly connected to the display computer 21. This configuration may be used when the control system 30 provides only mouse commands to the display computer 21. Operation is similar to the embodiment described above, with the laser spot 16 projected onto the display image 13 to provide control. The digital camera 31 captures the display image 13, and the acquired images 63a, 63b, ..., 63n are analyzed by the processing portion 35 in the control module 33. When the recognized gesture spatial pattern is found to match a predetermined gesture spatial pattern from the reference portion 36, a corresponding mouse command is sent to the display computer 21 through the serial cable 39. As in the previous embodiment, the instructions of the processing section 35 and the data of the reference section 36 may be retrieved from the storage medium 67.
Prior to operation of display system 10, control system 30 may utilize the processing section 35 to perform a calibration that associates the pixel and position coordinates of the display image 13 with the corresponding elements and coordinates of the initial electronic image 22a residing in the display computer 21. This can be achieved, for example, by: i) generating an embedded predetermined pattern (not shown) in the projected image 29; ii) acquiring the display image 13, including the embedded pattern, with the electronic camera 31; iii) detecting the predetermined pattern and determining the relative position coordinates in the pattern acquired by the electronic camera 31; and iv) analyzing the pattern coordinates to mathematically relate the coordinates of the display image 13 to the coordinates of the initial electronic image 22a. Such a calibration process is described in the commonly assigned co-pending patent application entitled "method and apparatus for correcting computer-generated projection patterns" (filed March 31, 1999, serial No. 09/283,093), the entire contents of which are incorporated herein by reference.
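As a rough sketch of step iv), the camera-to-display coordinate relation can be fitted from the matched calibration points by least squares. An affine fit is an illustrative assumption here (the referenced application may use a different, e.g. projective, model), and all names are hypothetical:

```python
import numpy as np

def fit_camera_to_display(cam_pts, disp_pts):
    """Fit an affine map from camera coordinates to display coordinates,
    given matched calibration points detected in the embedded pattern.
    Returns a 3x2 matrix M such that [x, y, 1] @ M gives display (x, y).
    """
    cam = np.asarray(cam_pts, dtype=float)
    disp = np.asarray(disp_pts, dtype=float)
    A = np.hstack([cam, np.ones((len(cam), 1))])    # rows of [x, y, 1]
    M, *_ = np.linalg.lstsq(A, disp, rcond=None)    # least-squares affine fit
    return M

def to_display(M, x, y):
    """Map a camera-frame coordinate to display-image coordinates."""
    return tuple(np.array([x, y, 1.0]) @ M)
```

A projective homography (8 parameters) would additionally handle keystone distortion between the camera axis and the screen; the affine version above suffices when the camera views the screen nearly head-on.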
The processing section 35 also serves to detect the ambient light level sensed by the electronic camera 31 in the display area in order to provide optimal brightness and contrast settings for the display image 13. The color balance of the display image 13 is optimized by having the image projector 23 project a standard color calibration chart (not shown) onto the projection screen 12 and comparing the projected color values acquired by the electronic camera 31 with the known values of the chart colors. From the comparison, the processing section 35 determines the color correction to be made and adjusts the display image 13 accordingly.
Although the present invention has been described with reference to particular embodiments, it is to be understood that the invention is not limited to the particular constructions and methods disclosed herein and/or shown in the drawings, but also includes any modifications or equivalents within the scope of the present invention.

Claims (37)

1. A display control system for controlling a projection display device including a display computer producing initial and subsequent electronic images, a computer-controlled image projector for receiving the initial electronic image and projecting it onto a projection screen as an initial display image, a projection screen, and a laser pointer operable in pulsed and continuous radiation modes, the laser pointer providing control when its laser spot is projected onto the projection screen in a spatial pattern over the initial display image, the control system comprising:
an electronic camera directed at the initial display image and producing a series of acquired images, each of said acquired images comprising image elements; and
a control module responsive to the acquired images, the control module comprising:
means for locating a laser spot in said acquired images;
means for assigning a position coordinate to each of said located laser spots so as to produce a series of position coordinates corresponding to sequential positions of the laser spot in the initial display image;
means for analyzing the sequence of location coordinates in order to capture a gesture spatial pattern formed by the sequence of location coordinates;
means for matching the captured gesture spatial pattern to one of a set of predetermined gesture spatial patterns;
means for selecting a display command, the selected display command associated with the matching predetermined gesture spatial pattern; and
means for generating a subsequent electronic image in response to the selected display command, the subsequent electronic image being sent to the image projector for projection onto the projection screen as a subsequent display image.
2. The control system of claim 1, wherein the means for locating the laser spot comprises: means for identifying bright pixels, namely acquired-image pixels having a brightness exceeding a predetermined threshold.
3. The control system of claim 2, wherein the means for positioning the laser spot further comprises: means for identifying a set of the bright pixels, the set comprising a set of neighboring pixels having a linear scale exceeding a predetermined value.
4. The control system of claim 1, wherein the means for assigning location coordinates comprises: means for analyzing the obtained image in a reduced resolution mode.
5. The control system of claim 1, wherein the means for analyzing the sequence of position coordinates comprises: means for deriving a series of offset vectors, each of said offset vectors corresponding to the difference between successive position coordinates.
6. The control system of claim 5, wherein the means for analyzing the sequence of position coordinates further comprises: means for describing the sequence of offset vectors by a sequence of Freeman chain codes.
7. The control system of claim 1, further comprising means for detecting laser spot pulses generated by operating the laser pointer in a pulsed mode for a predetermined time interval.
8. The control system of claim 7, wherein the predetermined time interval is 3 seconds.
9. The control system of claim 7, wherein the display command comprises a first mouse click sent to the display computer when the means for analyzing the sequence of position coordinates detects a stationary laser spot within a predetermined region of the initial display image.
10. The control system of claim 9, wherein the predetermined range includes an icon.
11. The control system of claim 9, wherein the first mouse click provides a pull-down menu in the subsequent display image.
12. The control system of claim 9, wherein the display command further comprises a second mouse click when the laser spot pulse is detected, the second mouse click being sent to a display computer.
13. The control system of claim 12, wherein the second mouse click selects, in the subsequent display image, the predetermined range of the initial display image.
14. The control system of claim 7, wherein the selected display command comprises: an overlay command associated with three laser spot pulses, the overlay command producing a subsequent electronic image comprising an overlay image superimposed on the initial electronic image.
15. The control system of claim 14, wherein the overlay image includes one or more curves corresponding to translational movement of the laser spot on the initial display image.
16. The control system of claim 14, wherein the overlay image includes a highlight region corresponding to a translational movement of the laser spot on the initial display image.
17. The control system of claim 15, wherein when the one or more curves determine a surrounding portion of an initial display image, the selected display command comprises a zoom-in command that generates a subsequent electronic image comprising an enlarged portion of the initial electronic image, the enlarged portion corresponding to the surrounding portion of the initial display image.
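Claim 17's zoom-in command enlarges the portion of the initial electronic image surrounded by the drawn curve. A simplified sketch that approximates the surrounded portion by the curve's bounding box (an illustrative choice; the patent does not specify how the enclosed region is delimited):

```python
def zoom_region(curve_points, image):
    """Sketch of claim 17's zoom-in command: return the part of the
    initial electronic image enclosed by the drawn curve, approximated
    here by the curve's bounding box.

    `image` is a row-major grid of pixels (list of rows); the
    bounding-box approximation is an assumption, not the patent's
    method.
    """
    xs = [x for x, _ in curve_points]
    ys = [y for _, y in curve_points]
    x0, x1 = min(xs), max(xs) + 1
    y0, y1 = min(ys), max(ys) + 1
    # Crop the surrounded portion; a real system would then rescale it
    # to fill the subsequent display image.
    return [row[x0:x1] for row in image[y0:y1]]
```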
18. The control system of claim 1, wherein the means for generating a subsequent electronic image comprises: means for sending mouse commands to a display computer, the subsequent electronic image being sent by the display computer to an image projector.
19. The control system of claim 18, wherein the mouse command comprises a command to advance to a next electronic image when the matching predetermined gesture spatial pattern comprises a right-hand arrow.
20. The control system of claim 1, further comprising a storage device for storing the location coordinates.
21. The control system of claim 20, wherein the storage device includes a reference portion for storing the display commands and the pre-established images.
22. A display control system for controlling a projection display device comprising a display computer producing initial and subsequent electronic images, a projection screen, a computer-controlled image projector for receiving the initial electronic image and projecting it onto the projection screen as an initial display image, and a laser pointer operable in pulsed and continuous radiation modes, the laser pointer providing control by emitting a laser spot onto the projection screen so as to trace a spatial pattern across the initial display image, the control system comprising:
an electronic camera directed at the initial display image and producing a series of acquired images, each of said acquired images comprising a plurality of pixels; and
a control module responsive to said acquired images, the control module comprising:
a memory portion including a set of predetermined gesture spatial patterns and a set of display commands, each of said display commands corresponding to one of said predetermined gesture spatial patterns;
positioning circuitry for locating the laser spot in each of said acquired images;
image processing circuitry responsive to the locations of the laser spot in said acquired images, said image processing circuitry assigning a position coordinate to each located laser spot, generating a sequence of position coordinates corresponding to successive positions of the laser spot in the initial display image, and analyzing said sequence of position coordinates so as to capture a gesture spatial pattern formed by said sequence of position coordinates;
command circuitry responsive to said captured gesture spatial pattern for matching said captured gesture spatial pattern with one of said predetermined gesture spatial patterns for selecting one of said display commands corresponding to said matched predetermined gesture spatial pattern; and for generating mouse commands in response to said selected display commands;
said mouse commands being sent to the display computer so that a subsequent display image is projected onto the projection screen.
23. The control system of claim 22, wherein the positioning circuit comprises: means for identifying bright pixels comprising obtained pixels having a brightness exceeding a predetermined threshold.
24. The control system of claim 23, wherein the positioning circuit further comprises: means for identifying a set of the bright pixels, the set comprising a set of neighboring pixels having a linear scale exceeding a predetermined value.
25. The control system of claim 23, wherein the image processing circuitry comprises: circuitry for deriving a series of offset vectors, each of said offset vectors corresponding to a difference between successive position coordinates.
26. The control system of claim 25, wherein the image processing circuitry further comprises: conversion circuitry for converting said sequence of offset vectors into a sequence of Freeman chain codes.
27. A method for controlling a display device including a computer-controlled image projector for projecting a sequence of display images onto a projection screen, control being effected by projecting a laser spot onto the projection screen with a laser pointer, the laser spot moving across the display image in a predetermined gesture spatial pattern, said method comprising the steps of:
obtaining a series of images corresponding to the projected display image and the laser spot, each acquired image comprising a plurality of pixels;
assigning a position coordinate to the laser spot in each acquired image, so as to produce a series of position coordinates;
analyzing the series of position coordinates to identify a gesture spatial pattern produced by the laser spot; and
generating a display command based on the recognition of the gesture spatial pattern.
28. The method of claim 27, further comprising the step of matching the recognized gesture spatial pattern to a predetermined gesture spatial pattern.
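Claims 27-28 recognize a gesture spatial pattern from the position-coordinate sequence and match it against predetermined patterns to produce a display command. A minimal matching sketch, in which the gesture table and command names are hypothetical (three eastward chain-code steps stand in for the right-hand-arrow gesture of claim 19):

```python
def match_gesture(chain_code, gesture_table):
    """Match a recognized chain-code sequence against the set of
    predetermined gesture spatial patterns and return the associated
    display command, or None if nothing matches (claims 27-28).
    """
    return gesture_table.get(tuple(chain_code))

# Hypothetical gesture table: three eastward steps (Freeman code 0)
# standing in for a right-hand-arrow gesture that advances the image.
GESTURES = {(0, 0, 0): "advance_to_next_image"}
```

In the patent's architecture the returned command would be translated into mouse events sent to the display computer; here it is just a string.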
29. The method of claim 27, wherein said step of analyzing said location coordinates comprises the step of detecting obtained bright pixels, each of said obtained bright pixels having a brightness exceeding a predetermined threshold.
30. The method of claim 29, further comprising the step of identifying as the laser spot a set of obtained bright pixels comprising a set of consecutive obtained pixels having a linear dimension exceeding a predetermined threshold.
31. The method of claim 29, further comprising the step of detecting a fixed laser spot.
32. The method of claim 31, further comprising the step of selecting a portion of the initial display image in response to detection of said fixed laser spot.
33. The method of claim 31, further comprising the step of detecting a pulsed laser spot.
34. The method of claim 33, further comprising the step of plotting a curve in a subsequent display image in response to detection of said pulsed laser spot, said plotted curve corresponding to motion of said laser spot.
35. The method of claim 34, further comprising the step of highlighting a portion of said subsequently displayed image, said highlighted portion corresponding to said plotted curve.
36. The method of claim 27, wherein the analyzing step comprises the step of describing the recognized gesture spatial pattern by a Freeman chain code.
37. The method of claim 36, wherein said generating step comprises the step of generating a series of display images when said recognized gesture spatial pattern comprises an arrow shape.
HK01106376.0A 1999-09-21 2001-09-10 Interactive display presentation system HK1035790B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/399,933 US6346933B1 (en) 1999-09-21 1999-09-21 Interactive display presentation system
US09/399933 1999-09-21

Publications (2)

Publication Number Publication Date
HK1035790A1 HK1035790A1 (en) 2001-12-07
HK1035790B true HK1035790B (en) 2005-01-21

Similar Documents

Publication Publication Date Title
EP1087327B1 (en) Interactive display presentation system
KR100452413B1 (en) Method and apparatus for calibrating a computer-generated projected image
EP0718748B1 (en) Pointed-position detecting apparatus and method
JP3640156B2 (en) Pointed position detection system and method, presentation system, and information storage medium
US20030210229A1 (en) Presentation system, material presenting device, and photographing device for presentation
US7830362B2 (en) Laser and digital camera computer pointer device system
JP2004185007A (en) Method of controlling display device
JP2001056675A (en) Screen superimposing display type information input/ output device
US6731330B2 (en) Method for robust determination of visible points of a controllable display within a camera view
US20020136455A1 (en) System and method for robust foreground and background image data separation for location of objects in front of a controllable display within a camera view
EP1356423B1 (en) System and method for extracting a point of interest of an object in front of a computer controllable display captured by an imaging device
US20130162518A1 (en) Interactive Video System
US5187467A (en) Universal light pen system
US20080297624A1 (en) Image processing apparatus, image processing system, computer readable medium, and image processing method
US7119788B2 (en) Image processing apparatus, image processing method, providing medium and presentation system
US20180089805A1 (en) Display apparatus, information processing apparatus, and information processing method
HK1035790B (en) Interactive display presentation system
US11330236B2 (en) Projector controlling method and projector
WO2009108123A1 (en) Laser pointer based interactive display system and method thereof
KR20080041049A (en) Interface Method and Device of Exhibition System Considering Visitor's Hand Information
JP2001350585A (en) Image display device with coordinate input function
JP2019191431A (en) Image projection device and method for controlling image projection device
CA2762977A1 (en) Interactive video system
JP2015158644A (en) Projection device, projection method, and program
JP2006040110A (en) Pointing device, and method for displaying point image