WO2017017812A1 - Video Display System - Google Patents
Video Display System
- Publication number
- WO2017017812A1, PCT/JP2015/071532
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- video display
- display system
- pattern
- screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03542—Light pens for emitting or receiving light
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
Definitions
- The present invention relates to projection-type video display devices, and more particularly to a technology that is effective when applied to a video display system including a projection-type video display device having an interactive function and a pen.
- Projection-type image display devices that project images onto a screen or the like (hereinafter sometimes referred to as "projectors") equipped with an interactive function, which allows users to make presentations efficiently and effectively, have been put into practical use. With such a device, the user can operate a dedicated pen (more precisely a "pen-shaped pointing device", hereinafter simply "pen") or a finger on the display screen projected on the screen, for example by drawing, so as to draw characters and figures on the display screen and to switch the displayed image.
- Various methods have been proposed for detecting the position of the pen or the like relative to the display screen. For example, there is a method of identifying the position by irradiating the screen with laser light or the like and detecting the light reflected by the pen or a light-shielded state, and a method of detecting the position on the screen illuminated by laser light emitted from the pen itself.
- When using the interactive function, the user may want to switch attributes such as the color or thickness of a drawn line or figure at any time.
- Conventionally, this has been done by projecting a palette, implemented in software, onto the screen and selecting with the pen the area indicating the desired attribute, such as a color or line thickness.
- In addition, a method has been proposed in which a plurality of pens are prepared and different attributes, such as colors and line thicknesses, are assigned to the different pens.
- Patent Document 1: Japanese Patent Laid-Open No. 2012-221115
- In the technique of Patent Document 1, each of a plurality of pointing devices stores unique identification information for distinguishing it from the others, and generates a transmission signal, containing switch information indicating the states of a plurality of switch means together with the identification information, at a repetition cycle uniquely set for that identification information.
- The coordinate input device receives the transmission signal from the pointing tool, specifies the repetition cycle of the transmission signal based on the identification information detected from the received signal, and operates in synchronization with it.
- An object of the present invention is to enable attributes such as the color and thickness of drawn lines and figures to be switched more flexibly and at lower cost in an image display system that includes a projection-type image display apparatus having an interactive function and a pen.
- A video display system according to the present invention includes a projection-type video display device that projects a video onto a screen and a pen-shaped pointing device operated by a user.
- The pointing device includes a light-emitting unit that emits invisible light of a predetermined wavelength.
- The projection-type video display device includes a sensor that captures the reflected light, on the screen, of the invisible light containing the predetermined wavelength,
- and an interactive function unit that recognizes, from the captured video data, the position and content of the light emission pattern, that is, the shape of the invisible light of the predetermined wavelength irradiated onto the screen by the pointing device, and executes different processing depending on the recognized position and content (a rough sketch of this dispatch is given below).
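- As a rough, non-authoritative illustration of how such position- and content-dependent dispatch could be organized, the following Python sketch assumes the recognition step (described in the embodiments below) already yields a position and a pattern identifier; the handler names, icon-area coordinates, and pattern identifiers are invented for illustration only.

```python
from typing import Callable, Dict, Tuple

Position = Tuple[int, int]  # (x, y) in display-screen coordinates
Handler = Callable[[Position], None]

def make_dispatcher(pattern_handlers: Dict[str, Handler],
                    icon_area: Tuple[int, int, int, int]) -> Callable[[Position, str], None]:
    """Build a dispatcher that executes different processing depending on
    where the pen event occurred and which emission pattern was recognized."""
    x0, y0, x1, y1 = icon_area

    def dispatch(position: Position, pattern_id: str) -> None:
        x, y = position
        if x0 <= x <= x1 and y0 <= y <= y1:
            print(f"operation icon selected at {position}")   # icon-area handling
        elif pattern_id in pattern_handlers:
            pattern_handlers[pattern_id](position)             # pattern-specific drawing
    return dispatch

# Usage: a circular pattern draws black, a two-pronged pattern draws red.
dispatch = make_dispatcher(
    {"circle": lambda p: print(f"black stroke point {p}"),
     "two_prong": lambda p: print(f"red stroke point {p}")},
    icon_area=(0, 0, 100, 600),
)
dispatch((320, 240), "circle")
dispatch((50, 300), "two_prong")
```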
- Thereby, attributes such as the color and thickness of the lines and figures to be drawn can be switched more flexibly and at lower cost.
- Panels (a) and (b) of the drawings are diagrams showing outlines of examples according to the embodiments of the present invention.
- FIG. 1 is a diagram showing an outline of a configuration example of a video display system according to Embodiment 1 of the present invention.
- the video display system includes a projection video display device (projector) 100 and one or more light emitting pens 30 (two in the figure, 30a and 30b).
- a video output device 200 and an information processing device 300 are connected to the projection video display device 100 by wire or wirelessly.
- The projection-type image display device 100 projects and displays an image onto the screen 10, which is the projection surface.
- the display screen 20 projected on the screen 10 has an operation icon area 22 for displaying an operation icon group for performing an interactive function, in addition to a video area 21 for displaying an image.
- The operation icons include, for example, drawing functions such as "mouse operation", "line drawing", and "eraser", and an icon for calling up the operation menu of the projection video display apparatus 100 itself.
- it may have a function of switching the video display contents of the projection video display apparatus 100, a function of adjusting other operations (such as audio output) of the projection video display apparatus 100, and the like.
- the light-emitting pens 30a and 30b emit light with different light-emitting patterns 35 (35a and 35b in the drawing), respectively, and drawing is performed according to the light-emitting pattern 35 as described later. Attributes such as line color and thickness can be switched.
- the video output device 200 outputs video data to be projected and displayed on the video area 21 to the projection video display device 100.
- various information processing devices such as a personal computer (PC), a tablet terminal, and a mobile terminal such as a smartphone, and various video devices such as a DVD player can be used.
- the video output device 200 may be a small video storage device such as a USB memory directly connected to the projection video display device 100.
- the video output device 200 may be a streaming video receiving device that is directly connected to the projection video display device 100, receives streaming data from a server on the Internet, and outputs the video to the projection video display device 100.
- the information processing apparatus 300 inputs and outputs various control signals including those related to the interactive function with the projection display apparatus 100. Thereby, the display operation of the projection display apparatus 100 can be controlled on the information processing apparatus 300 side.
- As the information processing apparatus 300, for example, a PC, a tablet terminal, or a mobile terminal such as a smartphone can be used.
- the projection display apparatus 100 and the information processing apparatus 300 are illustrated as separate apparatuses, but may be configured to be integrated in the projection display apparatus 100 as the same apparatus.
- the information processing device 300 may have the function of the video output device 200.
- the projection display apparatus 100 may have the function of the image output apparatus 200. Therefore, the functions of the video output device 200 and the information processing device 300 may be integrated into the projection video display device 100 and configured integrally.
- connection between the projection display apparatus 100, the image output apparatus 200, and the information processing apparatus 300 may be wired or wireless.
- When the image output apparatus 200 or the information processing apparatus 300 is a tablet terminal or a mobile terminal such as a smartphone, it is particularly preferable to connect it by wireless communication.
- FIG. 2 is a diagram showing an overview of a configuration example of the projection display apparatus 100 according to the present embodiment.
- the projection type image display apparatus 100 includes, for example, a projection optical system 101, a display element 102, a display element driving unit 103, an illumination optical system 104, a light source 105, a power source 106, a cooling unit 107, an audio output unit 108, a nonvolatile memory 109, Each unit includes a memory 110, a control unit 111, a sensor 112, an interactive function unit 113, a video input unit 114, an audio input unit 115, a communication unit 116, and an operation signal input unit 117.
- the projection optical system 101 is an optical system that projects an image onto the screen 10 and includes a lens and / or a mirror.
- the display element 102 is an element that generates an image to be projected.
- a transmissive liquid crystal panel, a reflective liquid crystal panel, a DMD (Digital Micromirror Device) (registered trademark) panel, or the like can be used.
- the display element driving unit 103 sends a drive signal corresponding to the video signal to the display element 102 to generate an image to be projected.
- the illumination optical system 104 is an optical system that collects illumination light generated by a light source 105 (to be described later), makes it uniform, and irradiates the display element 102.
- the light source 105 generates illumination light for projection.
- For example, a high-pressure mercury lamp, a xenon lamp, an LED (Light Emitting Diode) light source, a laser light source, or the like can be used.
- the power supply 106 is supplied with power from an external power supply and supplies power for operation to each unit including the light source 105.
- the cooling unit 107 is composed of a fan or the like for releasing heat generated from the light source 105 to the outside and suppressing a rise in temperature.
- The audio output unit 108 includes a speaker, an external output terminal, and the like, and outputs audio information related to the displayed video, as well as notification sounds, warning sounds, and audio information related to operations and errors of the projection video display device 100.
- The non-volatile memory 109 is configured by a non-volatile memory such as a flash memory, and stores data for various operations of the interactive function, display icons, calibration data, and the like.
- The memory 110 is configured by a volatile memory such as a DRAM (Dynamic Random Access Memory), buffers video data to be projected that is input via the video input unit 114 or the like described later, and expands and holds various control data related to the projection-type video display device 100.
- the control unit 111 controls the operation of each unit of the projection display apparatus 100.
- an interactive function is realized by controlling a sensor 112 and an interactive function unit 113 described later.
- The sensor 112 is a camera that captures the front surface of the screen 10, and can detect the light emitted from the light-emitting pen 30 and reflected by the screen 10 by detecting its infrared (invisible) light component.
- By setting the cutoff wavelength of the optical filter within the visible light wavelength region (for example, in the middle of the red visible light region), it is also possible to capture some visible light components other than infrared light (that is, the projected display screen 20) together with the infrared light component.
- the sensor 112 may be configured to be removable from the main body of the projection display apparatus 100 as necessary.
- The interactive function unit 113 realizes functions for interactive operations such as writing characters and figures in the video area 21 when the user operates the light-emitting pen 30.
- As a basic function, for example, it analyzes the infrared image acquired from the sensor 112 and recognizes the position of the reflected light where the light-emitting pen 30 irradiates the screen 10, thereby calculating the position of the light-emitting pen 30 on the screen 10, that is, the position at which the user is operating the light-emitting pen 30.
- In the present embodiment, the shape and pattern of the reflected light when the light-emitting pen 30 irradiates the screen 10 (that is, the content of the light emission pattern 35 shown in FIG. 1) is also recognized, and attributes such as the color and thickness of the line to be drawn can be switched accordingly.
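- The position calculation above can be pictured roughly as locating the brightest region in the infrared frame. The following Python sketch is only an illustration of that idea under simplifying assumptions (the luminance threshold, frame size, and centroid approximation are not taken from the patent):

```python
import numpy as np

def find_pen_position(ir_frame: np.ndarray, min_luminance: int = 180):
    """Return the (row, col) of the pen contact point in sensor coordinates.

    The contact point is approximated as the centroid of the brightest pixels,
    since the pen tip touching the screen is normally the most luminous area.
    Returns None when no pixel exceeds the threshold (pen not emitting).
    """
    mask = ir_frame >= min_luminance
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Example with a synthetic 480x640 IR frame containing one bright blob.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:205, 300:305] = 250
print(find_pen_position(frame))  # -> approximately (202.0, 302.0)
```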
- the interactive function unit 113 may have a function of executing an application or the like that can be operated with the light emitting pen 30.
- the video area 21 and the operation icon area 22 may be combined and output.
- an application for performing a drawing process or the like based on an operation of the light emitting pen 30 by the user, an application for performing an operation on a video input from the video output apparatus 200, or the like may be executed.
- The interactive function unit 113 may also have a function of performing calibration, such as position correction and coordinate conversion, between the region of the image (display screen 20) projected onto the screen 10 by the projection optical system 101 and the imaging range of the sensor 112. All or some of the functions of the interactive function unit 113 may be executed on the information processing apparatus 300 side.
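- If this calibration is modeled as a perspective (homography) mapping between sensor coordinates and display-screen coordinates, it could look roughly like the sketch below; this is an assumption for illustration, and the four corner correspondences and resolutions are made up:

```python
import numpy as np

def fit_homography(sensor_pts, screen_pts):
    """Solve for a 3x3 homography H mapping sensor coordinates to screen coordinates.

    Both inputs are lists of four (x, y) corner correspondences obtained during
    calibration (e.g. by projecting markers and detecting them with the sensor).
    """
    a = []
    for (x, y), (u, v) in zip(sensor_pts, screen_pts):
        a.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        a.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (last right singular vector).
    _, _, vt = np.linalg.svd(np.asarray(a, dtype=float))
    return vt[-1].reshape(3, 3)

def sensor_to_screen(h, point):
    """Map one sensor-space point into display-screen coordinates."""
    x, y = point
    u, v, w = h @ np.array([x, y, 1.0])
    return u / w, v / w

# Made-up correspondences: sensor corners of the projected area -> screen pixels.
H = fit_homography([(102, 75), (538, 80), (545, 410), (95, 405)],
                   [(0, 0), (1280, 0), (1280, 800), (0, 800)])
print(sensor_to_screen(H, (320, 240)))  # pen position converted to screen coordinates
```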
- the video input unit 114 and the audio input unit 115 input video data and audio data to be projected or output from the video output device 200 or the like connected via an external interface. Input video data and audio data may be buffered in the memory 110.
- the communication unit 116 has a function of communicating with an external apparatus such as the information processing apparatus 300 and inputting / outputting various control signals.
- The operation signal input unit 117 is an input interface for operating the projection display apparatus 100, and includes, for example, operation buttons provided on the main body of the projection display apparatus 100 and a light-receiving section for a remote controller, through which operation signals from the user are input.
- FIG. 3 is a diagram showing an outline of a configuration example of the light emitting pen 30 in the present embodiment.
- FIG. 3A schematically shows an example of the structure of the light-emitting pen 30.
- the light-emitting pen 30 is a pen-shaped indicating instrument as illustrated, and has a configuration in which a light-emitting pen tip 32 having a three-dimensional shape is attached to the light-emitting pen main body 31.
- a light source 38 is provided at the tip of the light-emitting pen body 31, and the light-emitting pen tip 32 is mounted so as to cover the light source 38.
- the light source 38 receives power from a battery (not shown) mounted inside the light-emitting pen body 31 and emits invisible light such as infrared light.
- the light emitting pen 30 is used as the pen-shaped pointing device.
- The shape of the pointing device is not limited to a pen shape, and any appropriate shape that the user can operate may be used.
- FIG. 3B schematically shows an example of the structure of the light emitting pen tip 32.
- a top view and a perspective view of the light emitting pen tip 32 as viewed from the tip side are shown.
- the light emitting pen tip 32 has a substantially dome shape.
- The light-emitting pen tip 32 is formed of, for example, a resin such as plastic that transmits infrared light, and a part of it is given a filtering treatment that blocks infrared light, whereby the light-transmitting portion 33 and the light-impermeable portion 34 illustrated in the figure are formed.
- As a result, the light emission pattern 35 irradiated onto the screen 10 when infrared light is emitted can be given a specific shape.
- the method of emitting light is not particularly limited.
- For example, the light-emitting pen tip 32 can be configured to move in the axial direction of the light-emitting pen 30, with a gap between it and the light-emitting pen body 31 in the normal state.
- When the user pushes the light-emitting pen 30 toward the pen tip while the light-emitting pen tip 32 is in contact with the screen 10,
- a switch (not shown) is pushed into a conductive state (ON state), causing light to be emitted.
- When the user moves the light-emitting pen tip 32 away from the screen 10, the tip is returned to the state shown on the left side of FIG. 3C by an elastic member such as a spring (not shown), the switch turns off, and light emission stops.
- the method of turning on / off the switch is not limited to this, and for example, the user may switch on / off the light emission by operating a switch provided in the light emitting pen body 31 portion.
- the light emitting pen tip 32 that covers the light source 38 has a three-dimensional dome shape, and therefore the light shielding pattern (filter unit) by the light-impermeable portion 34 also has a three-dimensional shape.
- a flat light-shielding pattern may be formed according to the degree of diffusion of the light source 38 (for example, the light-emitting pen tip 32 is configured by a disk-shaped or cylindrical member).
- The three-dimensional shape is not limited to the dome shape described above, and may be, for example, a polygonal pyramid corresponding to the light-shielding pattern of the light-impermeable portion 34 (in which case it is desirable to smooth the apex).
- the method of forming the light impermeable portion 34 is not particularly limited, and for example, a method such as attaching a film that does not transmit infrared light or applying a paint can be appropriately employed. It is also possible to form the light opaque portion 34 by liquid crystal display using a transmissive liquid crystal panel or the like.
- FIG. 4 is a diagram showing an outline of an example of the difference in the light emission pattern 35 by the light emitting pen tip 32 in the present embodiment.
- FIG. 4 shows the light emission patterns 35 (35a, 35b, and 35c) produced on the screen 10 when three types of light-emitting pens 30 (30a, 30b, and 30c), each having a light-emitting pen tip 32 (32a, 32b, and 32c) with a different light-shielding pattern formed by the light-impermeable portion 34, irradiate the screen 10 with infrared light.
- the light emitting pen tip 32a of the light emitting pen 30a does not have the light opaque portion 34 and the entire surface is the light transmitting portion 33.
- A light-emitting pen tip without any light-shielding pattern, such as the light-emitting pen tip 32a, can also be used.
- the light emission pattern 35a is substantially circular.
- With the light-emitting pen tip 32b, the irradiation area extends in roughly two directions, as in the light emission pattern 35b.
- With the light-emitting pen tip 32c, the irradiation area extends in four directions in a roughly star-like shape, as in the light emission pattern 35c.
- the interactive function unit 113 of the projection display apparatus 100 recognizes and identifies the position and shape of each light emission pattern 35 as shown in FIG. 4 from the infrared image captured by the sensor 112 by a method described later.
- By identifying the shape of the light emission pattern 35, it is possible to draw lines, figures, and the like according to the attributes (color, thickness, etc.) associated with that shape.
- As a result, the corresponding attribute can be switched more quickly, and at lower cost, than with the method of switching attributes in software using a palette displayed on the screen.
- the light-emitting pen tip 32 attached to the light-emitting pen 30 has the light-shielding pattern formed by the light transmitting portion 33 and the light non-transmitting portion 34.
- the light emitting pattern 35 irradiated on the screen 10 can be made different for each light emitting pen 30 having the light emitting pen tip 32 having a different light shielding pattern.
- Attributes such as the color and thickness of the line used when drawing lines and figures with the interactive function can thus be switched quickly.
- In Embodiment 2 of the present invention, an example of a technique for switching the light-shielding pattern of the light-emitting pen tip 32 (that is, the light emission pattern 35) of the light-emitting pen 30 to another pattern will be described.
- In Embodiment 1, as shown in FIG. 1 and the other figures, a plurality of light-emitting pens 30 having light-emitting pen tips 32 with different light-shielding patterns (light emission patterns 35) are prepared.
- FIG. 5 is a diagram showing an outline of an example of a method for switching the light shielding pattern of the light emitting pen tip 32 in the present embodiment.
- FIG. 5A schematically shows an example of switching the light shielding pattern by removing the light emitting pen tip 32a from the light emitting pen body 31 and replacing it with a light emitting pen tip 32b having another light shielding pattern.
- the light-emitting pen tip 32 of the light-emitting pen 30 is formed of a material having elasticity such as resin, and is configured so that it can be easily attached to and detached from the light-emitting pen body 31 by a fitting structure or the like.
- This allows the pen tip to be easily exchanged for a light-emitting pen tip 32 having another light-shielding pattern.
- FIG. 5B schematically shows an example in which the light-shielding pattern is switched using a light-emitting pen tip 32f, which has a structure in which a light-emitting pen tip 32d is overlapped over a smaller light-emitting pen tip 32e so as to cover it.
- the outer light emitting pen tip 32d is configured to be rotatable in the circumferential direction relative to the light emitting pen tip 32e.
- the light-shielding pattern of the light-emitting pen tip 32f as a whole can be switched depending on the overlapping state of the light-shielding patterns of the light-emitting pen tip 32d and the light-emitting pen tip 32e.
- It is desirable to provide a stop mechanism (not shown) that stops the rotation of the light-emitting pen tip 32d at positions where the overall light-shielding pattern of the light-emitting pen tip 32f takes an appropriate shape (for example, stopping by fitting a protrusion into a groove).
- FIG. 5C schematically shows an example in which the light-blocking pattern of the light-emitting pen tip 32 is switched when the user knocks the knock portion 36 provided in the light-emitting pen main body 31.
- Here, the light-emitting pen body 31 is fitted with a light-emitting pen tip 32 that has no light-impermeable portion 34, and a light-shielding portion 37, a cylindrical member with a light-impermeable part corresponding to a light-shielding pattern, is shown stored inside the light-emitting pen body 31.
- The light-emitting pen body 31 is provided with a knock portion 36 similar to that of a common knock-type ballpoint pen; when the user performs a knocking operation, the knock portion 36 mechanically pushes the light-shielding portion 37 out of the light-emitting pen body 31 so that it moves to the inside (or outside) of the light-emitting pen tip 32, and the light-impermeable portion 34 is thereby formed in the light-emitting pen tip 32.
- The method of forming the light-impermeable portion 34 by moving the light-shielding portion 37, which is independent of the light-emitting pen tip 32, is not limited to the knocking operation of the knock portion 36 described above; various methods can be adopted as appropriate. For example, there may be a mechanism in which the light-shielding portion 37 is moved by power obtained mechanically, electrically, or magnetically when the user operates an operation unit such as a rotating part configured so that part of the light-emitting pen body 31 can rotate about its axis, or a movable part such as a switch.
- the light shielding units 37 having a plurality of different patterns may be sequentially switched and applied each time the user's operation is repeated.
- The light-emitting pen tip 32 in this case is not limited to one consisting only of the light-transmitting portion 33 with no light-impermeable portion 34, as shown in the figure; it may itself have a light-impermeable portion 34 that forms a specific light-shielding pattern when overlapped with the light-shielding pattern of the light-shielding portion 37.
- The methods for switching the light-shielding pattern of the light-emitting pen tip 32 shown in FIGS. 5A to 5C are examples; the present invention is not limited to them, and other methods can be adopted as appropriate.
- For example, the light-shielding pattern may be formed and switched by a transmissive liquid crystal panel or the like provided on the light-emitting pen tip 32.
- not only one method but also a plurality of methods can be used in appropriate combination.
- According to the video display system of the present embodiment, the overall light-shielding pattern of the light-emitting pen tip 32 can be changed dynamically while the same light-emitting pen 30 continues to be used,
- for example by replacing a detachable light-emitting pen tip 32 with one having another light-shielding pattern, or by changing the light-shielding pattern through operation of a movable structure provided on the light-emitting pen tip 32.
- In Embodiment 3 of the present invention, an example of a technique for recognizing and identifying differences in the shape and other characteristics of the light emission pattern 35 irradiated onto the screen 10 by the light-emitting pen 30 will be described.
- The interactive function unit 113 of the projection display apparatus 100 recognizes and identifies differences in the content of the light emission pattern 35 so that, when a line or figure is drawn, the corresponding attributes (line color, thickness, and the like) are applied automatically.
- FIG. 6 is a diagram showing an outline of an example of a technique for identifying the light emission pattern 35 in the present embodiment.
- FIG. 6A shows an example in which the differences are recognized and identified by calculating the number of protrusions in the shape of the light emission pattern 35 (that is, the number of directions in which the irradiation region extends, indicated by dotted lines in the light emission patterns 35e and 35f).
- The light emission pattern 35d of the light-emitting pen 30d is a circle, that is, the case where the light-emitting pen tip 32 has no light-impermeable portion 34 (light-shielding pattern). In this case the number of protrusions can be treated as zero.
- the light emitting pattern 35e of the light emitting pen 30e is a case where the number of protrusions is 4, and the light emitting pattern 35f of the light emitting pen 30f is a case where the number of protrusions is 6.
- The number of protrusions can be calculated, for example, by locating the point of highest infrared luminance in the region irradiated with the light emission pattern 35 (usually the point where the light-emitting pen tip 32 of the light-emitting pen 30 touches the screen 10), scanning the circumference of a circle of predetermined radius centered on that point to obtain the luminance distribution along the circumference, and counting the number of contiguous regions in which the luminance exceeds a predetermined threshold.
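- A minimal Python sketch of this circle-scan idea follows; the radius, luminance threshold, and number of angular samples are illustrative assumptions, since the patent does not specify them:

```python
import numpy as np

def count_protrusions(ir_frame: np.ndarray, radius: int = 20,
                      threshold: int = 120, samples: int = 360) -> int:
    """Count protrusions of the emission pattern around its brightest point.

    Scans a circle of the given radius centered on the brightest pixel and
    counts contiguous angular runs whose luminance exceeds the threshold.
    """
    cy, cx = np.unravel_index(np.argmax(ir_frame), ir_frame.shape)
    angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, ir_frame.shape[0] - 1)
    xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, ir_frame.shape[1] - 1)
    bright = ir_frame[ys, xs] >= threshold

    # Count rising edges along the circle, treating it as circular (wrap-around).
    # An unbroken bright (or fully dark) circle has no rising edges, so the plain
    # circular pattern is counted as zero protrusions, matching the text above.
    return int(np.count_nonzero(bright & ~np.roll(bright, 1)))
```

With the star-shaped pattern 35f, for example, six bright arcs would alternate with six dark gaps along the scanned circle, so six rising edges are counted.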
- FIG. 6B shows an example in which the difference in the light emission pattern 35 is recognized and identified by the difference in the area of the light emission pattern 35.
- The sizes of the areas are, for example, in the order light emission pattern 35d > 35e > 35f.
- The light emission patterns 35 may be distinguished by which of a plurality of predefined ranges the calculated absolute value of the area falls into, or two patterns may be judged to be different light emission patterns 35 when the difference between their areas is at least a threshold value.
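- A corresponding sketch for area-based identification is given below; the pixel-count measure and the bin boundaries are placeholder assumptions, since suitable values would depend on the sensor resolution and the calibrated projection distance mentioned next:

```python
import numpy as np

def emission_area(ir_frame: np.ndarray, threshold: int = 120) -> int:
    """Area of the emission pattern, measured as the number of pixels above threshold."""
    return int(np.count_nonzero(ir_frame >= threshold))

def classify_by_area(area: int, bins=(1500, 3000)) -> str:
    """Assign the pattern to one of several predefined area ranges (placeholder bounds)."""
    if area >= bins[1]:
        return "pattern_d"   # largest, e.g. the plain circular pattern 35d
    if area >= bins[0]:
        return "pattern_e"
    return "pattern_f"       # smallest area

def differ_by_threshold(area_1: int, area_2: int, min_difference: int = 400) -> bool:
    """Alternative rule: treat two patterns as different when their areas differ enough."""
    return abs(area_1 - area_2) >= min_difference
```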
- Each of the above recognition methods distinguishes between light emission patterns 35 irradiated onto the same screen 10, and assumes that the distance between the screen 10 and the projection-type video display device 100 (projector) remains constant while the user is using the interactive function. The arrangement relationship between the screen 10 and the projection display apparatus 100, including the distance and inclination, can be measured when, for example, the calibration process is executed, stored in the memory 110, and referred to as appropriate.
- When the user draws by bringing the light-emitting pen 30 into contact with the screen 10, the infrared light on the screen 10 captured by the sensor 112 varies somewhat depending on the position and height at which the pen touches the screen, how the light-emitting pen 30 is held, and so on.
- Even so, a plurality of light emission patterns 35 can be recognized and identified based on the number of protrusions and the area (or area ratio) of the shape of each light emission pattern 35. This makes it possible to automatically set, for each light emission pattern 35, the attributes used when drawing a line or figure.
- In Embodiment 4 of the present invention, an example of a method for setting and assigning drawing attributes for lines and figures to each identified light emission pattern 35 will be described.
- FIG. 7 is a diagram showing an outline of an example of setting an attribute when drawing a line or a figure for the light emission pattern 35 in the present embodiment.
- FIG. 7A shows an example in which a line color at the time of drawing is fixedly assigned to each light emitting pattern 35 in advance.
- In the illustrated example, "black" is assigned in advance to the circular light emission pattern 35g,
- and "red" is assigned to the light emission pattern 35h, whose irradiation region extends in two directions.
- When the user traces the screen with the pen producing the light emission pattern 35g, the interactive function unit 113 draws the traced locus as a "black" line,
- and a locus traced with the pen producing the light emission pattern 35h is drawn as a "red" line.
- FIG. 7B shows an example in which drawing attributes are dynamically assigned to each light emission pattern 35 using a palette provided in software.
- Here, two types of light emission patterns 35g and 35h are used, and the attributes assigned to each light emission pattern 35 for drawing are not fixed in advance (they are shown as "attribute A" and "attribute B" in the figure; default attributes may be set).
- Instead, the user dynamically assigns attributes to each light emission pattern 35 at the time of use, using a palette displayed in software in the operation icon area 22 or the like.
- In the illustrated example, the light emission pattern 35g is assigned the attribute "blue, thick line",
- so the interactive function unit 113 draws the locus traced with that pen as a thick blue line,
- while the locus traced by the light-emitting pen 30h, which produces the light emission pattern 35h, is drawn as a thin red line (a sketch of this kind of attribute table follows below).
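- One way to hold the assignments described for FIGS. 7A and 7B is a simple table keyed by the identified pattern, with fixed defaults that a palette operation can overwrite at run time. The sketch below is only an illustration of that bookkeeping; the attribute names, widths, and pattern identifiers are assumptions.

```python
from dataclasses import dataclass

@dataclass
class DrawAttributes:
    color: str = "black"
    width: int = 2
    mode: str = "draw"       # "draw" or "eraser"

class AttributeTable:
    """Maps an identified emission pattern to its current drawing attributes."""

    def __init__(self):
        # Fixed defaults in the spirit of FIG. 7A: 35g -> black, 35h -> red.
        self._table = {
            "pattern_35g": DrawAttributes(color="black", width=2),
            "pattern_35h": DrawAttributes(color="red", width=2),
        }

    def attributes_for(self, pattern_id: str) -> DrawAttributes:
        return self._table.setdefault(pattern_id, DrawAttributes())

    def assign(self, pattern_id: str, **changes) -> None:
        """Dynamic reassignment, e.g. when the user picks a palette entry (FIG. 7B)."""
        current = self.attributes_for(pattern_id)
        self._table[pattern_id] = DrawAttributes(**{**current.__dict__, **changes})

# Usage: palette selection switches 35g to a thick blue line, 35h to a thin red line.
table = AttributeTable()
table.assign("pattern_35g", color="blue", width=6)
table.assign("pattern_35h", color="red", width=1)
print(table.attributes_for("pattern_35g"))  # DrawAttributes(color='blue', width=6, mode='draw')
```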
- In Embodiment 5 of the present invention, the light-emitting pen 30 has light-emitting pen tips 32 with different light-shielding patterns at its two ends. This allows different attributes to be used selectively with a single light-emitting pen 30.
- FIG. 8 is a diagram showing an outline of the configuration of the light-emitting pen 30 and examples of drawing lines and figures in the present embodiment.
- FIG. 8A shows an example of the structure of the light emitting pen 30 having the light emitting pen tips 32a and 32b at both ends.
- As the method for mounting each light-emitting pen tip 32, any of those shown in the above embodiments can be employed as appropriate; different mounting methods may be used at the two ends.
- Here, a normal line/figure drawing attribute is assigned to the light emission pattern 35 produced by the light-emitting pen tip 32b, which has the light-impermeable portion 34 (light-shielding pattern), while the "eraser" mode is assigned to the light emission pattern 35 produced by the light-emitting pen tip 32a at the other end, which has no light-shielding pattern.
- FIG. 8B shows an example in which the two light-emitting pen tips 32 of the light-emitting pen 30 are used for drawing.
- A normal line is drawn with the light-emitting pen tip 32b, while, as shown on the right side of the screen 10, when the light-emitting pen 30 is turned upside down and drawing is performed with the light-emitting pen tip 32a at the other end, which has no light-shielding pattern, it functions as an "eraser" that erases content already drawn.
- In this way, by providing light-emitting pen tips 32 with different light-shielding patterns at both ends of the light-emitting pen 30, an "eraser" mode can, for example, be assigned to the light emission pattern 35 produced by one pen tip 32a.
- The drawing attributes assigned to the light emission patterns 35 of the pen tips 32 at the two ends are not limited to the case where one of them is the "eraser" mode as described above.
- the line color and thickness may be switched at both ends, line types such as a solid line and a dotted line may be switched, or a combination of these may be used.
- So that the user can easily grasp which drawing attribute is assigned to the light emission pattern 35 of each light-emitting pen tip 32,
- the light-emitting pen body 31 or the like may have a function of displaying this information, fixedly or dynamically, by color, figure, text, and so on.
- For example, a color corresponding to the light-emitting pen tip 32 may be applied to it, or the assigned line thickness may be indicated.
- A display unit (not shown) provided on the light-emitting pen body 31 may light up in the corresponding color or display text, including the case where attributes are switched dynamically in software using a palette or the like.
- In that case, the light-emitting pen 30 needs to have means for communicating with the projection-type image display device 100 or the information processing device 300 by short-range wireless communication or the like and acquiring information on the assigned attributes or the content to be displayed.
- Needless to say, the present invention is not limited to the embodiments described above, and various modifications can be made without departing from the gist of the invention.
- The above embodiments have been described in detail to facilitate understanding of the present invention, and the invention is not necessarily limited to configurations having all of the described elements.
- A part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
- DESCRIPTION OF SYMBOLS: 100…Projection-type video display apparatus, 101…Projection optical system, 102…Display element, 103…Display element driving unit, 104…Illumination optical system, 105…Light source, 106…Power supply, 107…Cooling unit, 108…Audio output unit, 109…Non-volatile memory, 110…Memory, 111…Control unit, 112…Sensor, 113…Interactive function unit, 114…Video input unit, 115…Audio input unit, 116…Communication unit, 117…Operation signal input unit, 200…Video output device, 300…Information processing apparatus
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
- Projection Apparatus (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
The foregoing and other objects and novel features of the present invention will become apparent from the description in this specification and the accompanying drawings.
(Embodiment 1)
The video output device 200 may also be a small video storage device, such as a USB memory, connected directly to the projection-type video display device 100. The video output device 200 may also be a streaming-video receiving device that is connected directly to the projection-type video display device 100, receives streaming data from a server on the Internet or the like, and outputs the video to the projection-type video display device 100.
In the example of FIG. 1, the projection-type video display device 100 and the information processing device 300 are shown as separate devices, but they may be consolidated into the projection-type video display device 100 as a single device. The information processing device 300 may also have the functions of the video output device 200. The projection-type video display device 100 may also have the functions of the video output device 200. Accordingly, the functions of the video output device 200 and the information processing device 300 may be consolidated and integrated into the projection-type video display device 100.
The memory 110 is configured by a volatile memory such as a DRAM (Dynamic Random Access Memory), and buffers video data to be projected that is input via the video input unit 114 or the like described later, and expands and holds various control data related to the projection-type video display device 100.
(Embodiment 2)
(Embodiment 3)
(Embodiment 4)
In Embodiment 4 of the present invention, an example of a method for setting and assigning drawing attributes for lines and figures to each identified light emission pattern 35 will be described.
(Embodiment 5)
20…display screen, 21…video area, 22…operation icon area, 23a, 23b…drawn figures,
200…video output device,
300…information processing device
Claims (14)
1. A video display system comprising a projection-type video display device that projects video onto a screen and a pen-shaped pointing device operated by a user, wherein the pointing device has a light-emitting unit that emits invisible light of a predetermined wavelength, and the projection-type video display device has an interactive function unit that recognizes, from video data captured by a sensor that captures reflected light of invisible light including the predetermined wavelength on the screen, the position and content of a light emission pattern that is the shape of the invisible light of the predetermined wavelength irradiated onto the screen by the pointing device, and executes different processing depending on the recognized position and content of the light emission pattern.
2. The video display system according to claim 1, wherein the interactive function unit of the projection-type video display device draws a line or figure on the screen, based on the locus of the position of the light emission pattern, with the drawing attributes associated with the content of the light emission pattern.
3. The video display system according to claim 1, wherein the pointing device has a main body with the light-emitting unit at its tip and a pen tip portion that is a member with a three-dimensional shape attached to the tip of the main body so as to cover the light-emitting unit, and the pen tip portion has a light-shielding pattern corresponding to the light emission pattern, formed by a light-transmitting portion that passes the invisible light of the predetermined wavelength and a light-impermeable portion that blocks the invisible light of the predetermined wavelength.
4. The video display system according to claim 3, wherein the pen tip portion of the pointing device is detachable from the pointing device, and a first pen tip portion having a first light-shielding pattern can be replaced with a second pen tip portion having another, second light-shielding pattern.
5. The video display system according to claim 3, wherein the pen tip portion of the pointing device has a movable portion and, from a state of having a first light-shielding pattern, forms another, second light-shielding pattern through the movement of the movable portion.
6. The video display system according to claim 5, wherein the movable portion of the pen tip portion has a mechanism that stops its movement at a predetermined position at which the pen tip portion forms the second light-shielding pattern.
7. The video display system according to claim 3, wherein the main body of the pointing device has a movable operation unit and, from a state in which the pen tip portion has a first light-shielding pattern, the pen tip portion forms another, second light-shielding pattern when the user operates the operation unit.
8. The video display system according to claim 7, wherein the pointing device has a light-shielding portion independent of the pen tip portion, and the pen tip portion forms the second light-shielding pattern by the light-shielding portion moving in response to the user's operation of the operation unit.
9. The video display system according to claim 3, wherein the pointing device has, in addition to a first light-emitting unit at the tip of the main body and a first pen tip portion having a first light-shielding pattern, a second light-emitting unit and a second pen tip portion having a second light-shielding pattern at the other end of the main body.
10. The video display system according to claim 9, wherein the interactive function unit of the projection-type video display device erases a line or figure already drawn on the screen, based on the locus of the position of the light emission pattern irradiated onto the screen by the pointing device through the second light-shielding pattern.
11. The video display system according to claim 1, wherein the interactive function unit of the projection-type video display device executes different processing depending on the number of protrusions in the recognized shape of the light emission pattern.
12. The video display system according to claim 1, wherein the interactive function unit of the projection-type video display device executes different processing depending on the area of the recognized light emission pattern.
13. The video display system according to claim 1, wherein, when the user performs an operation with the pointing device within a predetermined region on the screen, the interactive function unit of the projection-type video display device executes different processing depending on the position within the predetermined region at which the operation is performed.
14. A video display system comprising a projection-type video display device that projects video onto a screen, a pen-shaped pointing device operated by a user, and an information processing device, wherein the pointing device has a light-emitting unit that emits invisible light of a predetermined wavelength, the projection-type video display device transmits, to the information processing device, video data captured by a sensor that captures reflected light of invisible light including the predetermined wavelength on the screen, and the information processing device has an interactive function unit that, based on the received video data, recognizes the position and content of a light emission pattern that is the shape of the invisible light of the predetermined wavelength irradiated onto the screen by the pointing device, and causes the projection-type video display device to execute different processing depending on the recognized position and content of the light emission pattern.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201580082009.4A CN107850968B (zh) | 2015-07-29 | 2015-07-29 | 影像显示系统 |
| JP2017530542A JP6437654B2 (ja) | 2015-07-29 | 2015-07-29 | 映像表示システム |
| PCT/JP2015/071532 WO2017017812A1 (ja) | 2015-07-29 | 2015-07-29 | 映像表示システム |
| US15/747,772 US10268284B2 (en) | 2015-07-29 | 2015-07-29 | Image display system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2015/071532 WO2017017812A1 (ja) | 2015-07-29 | 2015-07-29 | 映像表示システム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017017812A1 true WO2017017812A1 (ja) | 2017-02-02 |
Family
ID=57884296
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/071532 Ceased WO2017017812A1 (ja) | 2015-07-29 | 2015-07-29 | 映像表示システム |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US10268284B2 (ja) |
| JP (1) | JP6437654B2 (ja) |
| CN (1) | CN107850968B (ja) |
| WO (1) | WO2017017812A1 (ja) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111083455A (zh) * | 2018-10-22 | 2020-04-28 | 精工爱普生株式会社 | 位置检测装置、显示装置、显示系统以及位置检测方法 |
| JP2020140685A (ja) * | 2019-02-22 | 2020-09-03 | シャープ株式会社 | 入力装置及び入力システム |
| WO2020183519A1 (ja) * | 2019-03-08 | 2020-09-17 | Necディスプレイソリューションズ株式会社 | 情報処理装置、情報処理方法、プログラム、表示システム、表示方法及び電子筆記具 |
| JPWO2020250410A1 (ja) * | 2019-06-14 | 2021-12-09 | シャープNecディスプレイソリューションズ株式会社 | 情報処理装置、情報処理方法、プログラム、表示システム、表示方法及び電子筆記具 |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2018180709A (ja) * | 2017-04-06 | 2018-11-15 | 富士ゼロックス株式会社 | 受付装置および検出装置 |
| JP2019159748A (ja) * | 2018-03-13 | 2019-09-19 | セイコーエプソン株式会社 | 画像投写システム、指示体及び画像投写システムの制御方法 |
| JP7174397B2 (ja) * | 2018-06-18 | 2022-11-17 | チームラボ株式会社 | 映像表示システム,映像表示方法,及びコンピュータプログラム |
| JP7251094B2 (ja) | 2018-10-22 | 2023-04-04 | セイコーエプソン株式会社 | 位置検出装置、表示システム、及び、位置検出方法 |
| US11435856B2 (en) | 2018-11-01 | 2022-09-06 | Sony Group Corporation | Information processing device, information processing method, and program |
| KR102625830B1 (ko) * | 2018-11-27 | 2024-01-16 | 삼성전자주식회사 | 디스플레이장치, 그 제어방법 및 기록매체 |
| US10921928B2 (en) * | 2019-02-22 | 2021-02-16 | Sharp Kabushiki Kaisha | Input apparatus and input system |
| KR20210030680A (ko) * | 2019-09-10 | 2021-03-18 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
| US11836300B2 (en) * | 2020-01-09 | 2023-12-05 | Sony Group Corporation | Information processing apparatus and information processing method |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH113170A (ja) * | 1997-06-13 | 1999-01-06 | Wacom Co Ltd | 光デジタイザ |
| JP2012221115A (ja) * | 2011-04-06 | 2012-11-12 | Canon Inc | 座標入力装置及びその制御方法、プログラム |
| JP2013235416A (ja) * | 2012-05-09 | 2013-11-21 | Seiko Epson Corp | 画像表示システム |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS59107297U (ja) * | 1983-01-08 | 1984-07-19 | 富士通株式会社 | 電子黒板用消去器 |
| CN2367469Y (zh) * | 1999-01-26 | 2000-03-08 | 光奇科技股份有限公司 | 全像镭射笔 |
| CN1952851A (zh) * | 2006-10-13 | 2007-04-25 | 广东威创日新电子有限公司 | 一种实现交互显示的电子装置和方法 |
| US8493340B2 (en) * | 2009-01-16 | 2013-07-23 | Corel Corporation | Virtual hard media imaging |
| US9864440B2 (en) * | 2010-06-11 | 2018-01-09 | Microsoft Technology Licensing, Llc | Object orientation detection with a digitizer |
| KR20120116076A (ko) * | 2011-04-12 | 2012-10-22 | 삼성전자주식회사 | 디스플레이장치 및 그 제어방법 |
| CN102509068A (zh) * | 2011-10-09 | 2012-06-20 | 海信集团有限公司 | 投影方法和装置 |
| JP2014203305A (ja) * | 2013-04-05 | 2014-10-27 | 株式会社東芝 | 電子機器、電子機器の制御方法、電子機器の制御プログラム |
| JP6201519B2 (ja) * | 2013-08-21 | 2017-09-27 | 株式会社リコー | 座標検知装置、及び座標検知方法、及び電子情報ボードシステム |
-
2015
- 2015-07-29 WO PCT/JP2015/071532 patent/WO2017017812A1/ja not_active Ceased
- 2015-07-29 US US15/747,772 patent/US10268284B2/en active Active
- 2015-07-29 JP JP2017530542A patent/JP6437654B2/ja active Active
- 2015-07-29 CN CN201580082009.4A patent/CN107850968B/zh active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH113170A (ja) * | 1997-06-13 | 1999-01-06 | Wacom Co Ltd | 光デジタイザ |
| JP2012221115A (ja) * | 2011-04-06 | 2012-11-12 | Canon Inc | 座標入力装置及びその制御方法、プログラム |
| JP2013235416A (ja) * | 2012-05-09 | 2013-11-21 | Seiko Epson Corp | 画像表示システム |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111083455A (zh) * | 2018-10-22 | 2020-04-28 | 精工爱普生株式会社 | 位置检测装置、显示装置、显示系统以及位置检测方法 |
| CN111083455B (zh) * | 2018-10-22 | 2023-08-11 | 精工爱普生株式会社 | 位置检测装置、显示装置、显示系统以及位置检测方法 |
| JP2020140685A (ja) * | 2019-02-22 | 2020-09-03 | シャープ株式会社 | 入力装置及び入力システム |
| JP7312615B2 (ja) | 2019-02-22 | 2023-07-21 | シャープ株式会社 | 入力装置及び入力システム |
| WO2020183519A1 (ja) * | 2019-03-08 | 2020-09-17 | Necディスプレイソリューションズ株式会社 | 情報処理装置、情報処理方法、プログラム、表示システム、表示方法及び電子筆記具 |
| JPWO2020250410A1 (ja) * | 2019-06-14 | 2021-12-09 | シャープNecディスプレイソリューションズ株式会社 | 情報処理装置、情報処理方法、プログラム、表示システム、表示方法及び電子筆記具 |
| US11868544B2 (en) | 2019-06-14 | 2024-01-09 | Sharp Nec Display Solutions, Ltd. | Information processing device, information processing method, program, display system, display method, and electronic writing tool |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2017017812A1 (ja) | 2018-05-24 |
| US10268284B2 (en) | 2019-04-23 |
| CN107850968B (zh) | 2021-06-04 |
| JP6437654B2 (ja) | 2018-12-12 |
| US20180217683A1 (en) | 2018-08-02 |
| CN107850968A (zh) | 2018-03-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6437654B2 (ja) | 映像表示システム | |
| CN110067972B (zh) | 照明装置 | |
| US10133366B2 (en) | Interactive projector and interactive projection system | |
| US9182832B2 (en) | Display device, control method of display device and program | |
| JP6623812B2 (ja) | 位置検出装置、及び、そのコントラスト調整方法 | |
| JP6375672B2 (ja) | 位置検出装置、及び位置検出方法 | |
| CN101963846B (zh) | 光学笔 | |
| US10831288B2 (en) | Projector, projection system, and detection light radiator | |
| JP2011203830A (ja) | 投写システム及びその制御方法 | |
| US9733728B2 (en) | Position detecting device and position detecting method | |
| CN101963847A (zh) | 具有触发式开关的光学输入笔装置 | |
| KR20170129948A (ko) | 인터랙티브 프로젝터, 인터랙티브 프로젝션 시스템 및, 인터랙티브 프로젝터의 제어 방법 | |
| JP4434381B2 (ja) | 座標入力装置 | |
| JP6569259B2 (ja) | 位置検出装置、表示装置、位置検出方法、及び、表示方法 | |
| US9544561B2 (en) | Interactive projector and interactive projection system | |
| JP2011204059A (ja) | 情報入力システムおよび情報入力装置 | |
| JP6690271B2 (ja) | 位置検出システム、位置検出装置、および位置検出方法 | |
| JP6690272B2 (ja) | 位置検出システム、自発光指示体、および固有情報取得方法 | |
| JP2020134922A (ja) | プロジェクタシステム | |
| KR20160107684A (ko) | 전자 칠판 시스템 | |
| JP5935930B2 (ja) | 表示装置、表示装置の制御方法およびプログラム | |
| WO2022034745A1 (ja) | 書き込み画面画像を重畳する情報処理装置 | |
| JP5803427B2 (ja) | 表示装置、表示装置の制御方法およびプログラム | |
| CN115525201A (zh) | 图像处理方法和图像处理装置 | |
| JP2020134596A (ja) | プロジェクタシステム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15899645; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2017530542; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 15747772; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15899645; Country of ref document: EP; Kind code of ref document: A1 |