US20190121227A1 - Projector - Google Patents
- Publication number
- US20190121227A1 (application Ser. No. 16/227,657)
- Authority
- US
- United States
- Prior art keywords
- projected
- hand
- image
- projection
- icons
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/26—Projecting separately subsidiary matter simultaneously with main image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G06F3/0426—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/28—Reflectors in projection beam
Definitions
- The present invention relates to a projector.
- A known projection device projects operation icons onto a projection surface (for example, see Patent Literature 1). This projection device decides the projection position or projection direction of an operation icon in accordance with the direction from which a finger approaches the projected image.
- Patent Literature 1: Japanese Unexamined Patent Application Publication No. 2009-064109
- In such a projection device, however, the operation icon is sometimes shaded by the hand during operation, which may make operation difficult.
- The projector of the present invention includes: a projection unit that projects onto a projection surface a projected image that is based on image data; a detection unit that detects the direction of approach of an indication member, or of the shadow of the indication member, that has approached a detection region that includes the projected image; an icon memory unit that stores the image data of a plurality of icons; a decision unit that decides at least one of the display sizes of the icons, the display gaps between the icons, and the display positions of the icons, on the basis of the projection direction of the projected image and the direction of approach detected by the detection unit; and a projection control unit that projects the icons onto the display positions decided by the decision unit, at the display sizes and display gaps decided by the decision unit.
- The projector of the present invention is also characterized by being provided with: a projection unit that projects onto a projection surface a projected image that is based on image data; an icon memory unit that stores the image data of a plurality of icons; a position memory unit that stores the display positions of the icons; a decision unit that decides at least one of the display sizes of the icons or the display gaps between the icons displayed along the projection direction of the projected image, in accordance with the distance from the projection unit that is based on the display positions of the icons; and a projection control unit that projects each of the icons at the display sizes and display gaps decided by the decision unit.
- The projector of the present invention is also characterized by being provided with: a projection unit that projects onto a projection surface a projected image that is based on image data; a detection unit that detects the direction of approach of an indication member and the direction of approach of the shadow of the indication member that has approached a detection region that includes the projected image; an icon memory unit that stores the image data of a plurality of icons; a decision unit that decides at least one of the display sizes of the icons, the display gaps between the icons, and the display positions of the icons, on the basis of the projection direction of the projected image, the direction of approach of the indication member, and the direction of approach of the shadow of the indication member detected by the detection unit; and a projection control unit that projects the icons at the display positions decided by the decision unit, at the display sizes and display gaps decided by the decision unit.
- According to the projector of the present invention, the operation icons are easy to see and favorable operation is possible.
- FIG. 1 is a perspective view depicting a projected state and a photographed state of a projector according to a first embodiment
- FIG. 2 is a block diagram depicting the configuration of the projector according to the first embodiment
- FIG. 3 is a flow chart depicting a process in the projector according to the first embodiment
- FIG. 4 is a front view and a side view depicting the projected state and the photographed state of the projector according to the first embodiment
- FIG. 5 is a diagram depicting a direction on a projected image of the projector according to the first embodiment
- FIG. 6 is a diagram depicting operation icons superimposed and projected on the projected image by the projector according to the first embodiment, as well as a direction of approach of a hand
- FIG. 7 is a diagram depicting the operation icons superimposed and projected on the projected image by the projector according to the first embodiment, as well as the direction of approach of a hand
- FIG. 8 is a diagram depicting the operation icons superimposed and projected on the projected image by the projector according to the first embodiment
- FIG. 9 is a diagram depicting the operation icons superimposed and projected on the projected image by the projector according to the first embodiment
- FIG. 10 is a diagram depicting the operation icons superimposed and projected on the projected image by the projector according to the first embodiment, as well as the position of a fingertip
- FIG. 11 is a diagram depicting a photography region in the projector according to the first embodiment
- FIG. 12 is a diagram depicting the photography region in the projector according to the first embodiment
- FIG. 13 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment, as well as the direction of approach of a hand
- FIG. 14 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment
- FIG. 15 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment
- FIG. 16 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment
- FIG. 17 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment
- FIG. 18 is a perspective view depicting a projected state and a photographed state of a projector according to a second embodiment
- FIG. 19 is a block diagram depicting the configuration of the projector according to the second embodiment
- FIG. 20 is a flow chart depicting a process in the projector according to the second embodiment
- FIG. 21 is a front view and a side view depicting the projected state and the photographed state of the projector according to the second embodiment
- FIG. 22 is a diagram depicting operation icons superimposed and projected onto a projected image by the projector according to the second embodiment, as well as a direction of approach of the shadow of a hand
- FIG. 23 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand
- FIG. 24 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand
- FIG. 25 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment
- FIG. 26 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand
- FIG. 27 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment
- FIG. 28 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand
- FIG. 29 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand
- FIG. 30 is a diagram depicting the projected image projected by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand
- The following is a description of the projector according to the first embodiment of the present invention, with reference to the drawings. FIG. 1 is a perspective view depicting a projected state and a photographed state of a projector 2 according to the first embodiment.
- The projector 2 is provided with a casing 4 made of metal or plastic, the casing 4 being mounted onto a mounting surface G, which is the top surface of a desk 6 or the like.
- The front surface of the casing 4 is provided with a projection window 10, through which a projected image 8 and operation icons 9 are projected onto the mounting surface G, and with a photography window 14, through which an indication member such as a hand 12, which approaches a photography region 11 and indicates an operation icon 9, is photographed.
- FIG. 2 is a block diagram depicting the system configuration of the projector according to the first embodiment.
- The projector 2 is provided with a CPU 20, the CPU 20 being connected to: an operation unit 22 provided with a power switch and the like (not shown); a camera 24 having an imaging sensor, constituted of a CCD or the like, that photographs a subject; an image memory unit 26 that stores image data of images photographed by the camera 24; a program memory unit 30 that stores a program for setting and controlling photography, projection, and the like; a memory card 32 that stores the image data of images to be projected; a projection unit 34 that projects an image that is based on image data stored in the image memory unit 26 or the memory card 32; a hand recognition unit 36 that determines whether or not a photographed image contains the shape of a hand 12; and a direction detection unit 38 that detects the direction in which the hand 12 approaches a photography region 11.
- The projection unit 34 is provided with a power control unit 48 that turns an LED light source 46 on and off, and with a projection control unit 52 that controls the display of an LCOS 50 on which an image to be projected is displayed.
- The following is a description of a process in the projector according to the first embodiment, with reference to the flow chart depicted in FIG. 3. First, the casing 4 is mounted onto the mounting surface G, and when the power is switched on, the CPU 20 instructs the projection unit 34 to begin projecting and reads out image data from the memory card 32, so that the projection control unit 52 displays on the LCOS 50 an image that is based on the image data.
- In response to the instruction to begin projecting, the power control unit 48 also switches on the LED light source 46, and, as depicted in FIGS. 4(a) and 4(b), projection light is emitted in a downward-sloping direction from the projection window 10 so as to project the projected image 8 onto the mounting surface G (step S1).
- The CPU 20 also uses the camera 24 to begin photographing the photography region 11, which includes the projected image 8 (step S2).
- The camera 24 photographs using video photography, or still image photography at fixed time intervals, and the image data of the images photographed by the camera 24 is stored in the image memory unit 26.
- Next, the CPU 20 reads out image data from the image memory unit 26 and uses the hand recognition unit 36 to determine whether or not the image that is based on the image data contains the shape of the hand 12 (step S3). This determination is made by detecting the region of the hand 12 and the positions of the fingertips from the image that is based on the image data, using pattern matching or the like.
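- As a rough illustration of such a determination, the sketch below uses OpenCV template matching to decide whether a camera frame contains a hand-like shape. The template file, score threshold, and function name are illustrative assumptions; the patent only specifies "pattern matching or the like."

```python
# Hedged sketch of step S3: decide whether the photographed frame contains the
# shape of a hand by template matching. The template image and threshold are
# assumed placeholders, not details taken from the patent.
import cv2

HAND_TEMPLATE = cv2.imread("hand_template.png", cv2.IMREAD_GRAYSCALE)  # assumed asset
MATCH_THRESHOLD = 0.7  # assumed similarity score above which a hand is "present"

def contains_hand_shape(frame_gray):
    """Return (found, location) for the best hand-template match in the frame."""
    scores = cv2.matchTemplate(frame_gray, HAND_TEMPLATE, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(scores)
    return max_score >= MATCH_THRESHOLD, max_loc
```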
- FIG. 5 is a diagram looking down from above onto the projector 2 mounted on the mounting surface G and onto the photography region 11, which contains the projected image 8 projected onto the mounting surface G. As depicted in FIG. 5, along the projection direction, the rear is taken to be the side of the projected image 8 closer to the casing 4 of the projector 2, and the front is taken to be the side farther away from the casing 4. In the direction orthogonal to the projection direction, the right side and the left side are taken to be the right side and the left side of the projected image 8, respectively.
- When the image that is based on the image data contains the shape of the hand 12 (step S3: Yes), the CPU 20 uses the direction detection unit 38 to detect from which of the directions depicted in FIG. 5 the hand 12 has approached the photography region 11 (step S4). On the other hand, when the image that is based on the image data does not contain the shape of the hand 12 (step S3: No), the CPU 20 returns to the process of step S1 without detecting the direction of approach of the hand 12.
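- A minimal sketch of how the direction detection unit 38 might classify the approach direction follows. It assumes the hand region is already available as a bounding box in photography-region coordinates, with the projector (rear) side at the top of the image; these coordinate conventions are assumptions for illustration, not details from the patent.

```python
# Hedged sketch of step S4: classify the approach direction from which edge of
# the photography region the detected hand region touches. The convention that
# the rear (projector) side is at y = 0 is an assumption.
def approach_direction(hand_bbox, region_width, region_height):
    """Classify a hand bounding box (x, y, w, h) as 'left', 'right', 'rear', or 'front'."""
    x, y, w, h = hand_bbox
    if x <= 0:
        return "left"
    if x + w >= region_width:
        return "right"
    if y <= 0:
        return "rear"    # assumed: top of the frame is the projector side
    if y + h >= region_height:
        return "front"
    return None  # hand fully inside the region; no entry edge detected
```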
- Next, the CPU 20 determines whether the direction of approach of the hand 12 belongs to the front-rear direction or the left-right direction depicted in FIG. 5, and decides the sizes and display gaps of the operation icons 9 on the basis of the result (step S5). For example, when the hand 12 approaches from either the right side or the left side of the photography region 11, the size of the operation icons 9 is decided to be 2 cm × 2 cm, and the gap between the operation icons 9 is decided to be 2 cm. When the hand 12 approaches from either the front or the rear of the photography region 11, the size of the operation icons 9 is decided to be 4 cm × 4 cm, and the gap between the operation icons 9 is decided to be 0.5 cm, for example.
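- The decision in step S5 amounts to a small lookup from approach direction to icon layout. The sketch below encodes the numeric examples given above; treating them as fixed constants is an assumption.

```python
# Hedged sketch of step S5: map the approach direction to an icon size and a
# display gap, using the example values from the text (2 cm icons with 2 cm
# gaps for left/right approach; 4 cm icons with 0.5 cm gaps for front/rear).
def decide_icon_layout(direction):
    """Return (icon_size_cm, display_gap_cm) for a detected approach direction."""
    if direction in ("left", "right"):
        return 2.0, 2.0
    if direction in ("front", "rear"):
        return 4.0, 0.5
    raise ValueError(f"unexpected approach direction: {direction!r}")
```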
- Next, the CPU 20 reads out the image data of the operation icons 9 from the program memory unit 30 and instructs the projection unit 34 to project the operation icons 9, at the sizes and display gaps decided in step S5, along the edge part of the projected image 8 on the side from which the hand 12 approaches (step S6).
- For example, as depicted in FIG. 6, when the hand 12 approaches the photography region 11 from the right side, the operation icons 9 are projected along the edge part of the right side of the projected image 8 at the sizes and display gaps decided in step S5.
- Also, as depicted in FIG. 7, when the hand 12 approaches the photography region 11 from the front, the operation icons 9 are projected along the edge part of the front of the projected image 8 at the sizes and display gaps decided in step S5.
- Next, the CPU 20 determines whether or not the distance from the operation icons 9 to the fingertip is a given distance or less (step S7). For example, as depicted in FIG. 8(a), with the hand 12 approaching the photography region 11 from the front (see FIG. 5), the fingertip is taken to be located over the operation icon 9 "HOME." In this case, the CPU 20 measures, on the basis of the image data, the distance X in the normal direction from the mounting surface G to the position of the fingertip at the position of the operation icon 9 "HOME," depicted in FIG. 8(b), and determines whether or not the distance X is the given distance or less.
- The operator is able to set the given distance as desired.
- When the distance X in the normal direction is the given distance or less (step S7: Yes), the CPU 20 alters the size of the operation icon 9 located directly under the fingertip, for example "HOME," to a larger size, and uses the projection unit 34 to project the enlarged operation icon "HOME" as depicted in FIG. 8(a) (step S8).
- On the other hand, when the distance X in the normal direction is greater than the given distance (step S7: No), the CPU 20 returns to the process of step S1 without altering the size of the operation icon 9.
- Because the camera 24 photographs using video photography or the like and the distance X in the normal direction is measured sequentially, when the fingertip is slid over the operation icons 9 while the distance X is held at the given distance or less, the enlarged operation icon 9 changes along with the position of the fingertip, as depicted in FIG. 9.
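- Steps S7 and S8 reduce to a height test followed by an enlargement of the icon under the fingertip, as in the sketch below. The fingertip height X and the icon bounds are assumed to be available from earlier processing; how X is recovered from the camera image is not specified in the text, and the threshold and enlargement ratio are assumptions.

```python
# Hedged sketch of steps S7-S8: when the fingertip's normal-direction distance
# X from the mounting surface is at or below the operator-set threshold,
# enlarge the icon directly under the fingertip. Data shapes are assumptions.
GIVEN_DISTANCE_CM = 1.5   # assumed operator-set threshold (the "given distance")
ENLARGE_FACTOR = 1.5      # assumed enlargement ratio

def update_icon_sizes(icons, fingertip_xy, fingertip_height_cm):
    """Enlarge the icon under the fingertip while X is the given distance or less."""
    for icon in icons:
        x, y, w, h = icon["bounds"]
        under_fingertip = (x <= fingertip_xy[0] <= x + w
                           and y <= fingertip_xy[1] <= y + h)
        near_surface = fingertip_height_cm <= GIVEN_DISTANCE_CM
        icon["scale"] = ENLARGE_FACTOR if (under_fingertip and near_surface) else 1.0
```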
- Next, the CPU 20 determines from the image data whether or not the fingertip has come into contact with the mounting surface G (step S9). For example, as depicted in FIGS. 10(a) and 10(b), when a fingertip located over the operation icon 9 "SETTINGS" comes into contact with the mounting surface G (step S9: Yes), the CPU 20 instructs the projection unit 34 to project an image "SETTINGS SCREEN" for setting, for example, the brightness of the projected image 8, corresponding to the content of the operation icon 9 "SETTINGS" (step S10). On the other hand, when the fingertip does not come into contact with the mounting surface G (step S9: No), the CPU 20 returns to the process of step S1 without projecting an image corresponding to the content of the operation icon 9.
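- The contact handling in steps S9 and S10 can be viewed as a dispatch from the touched icon's label to the image to be projected. Only the "SETTINGS" entry is named in the text; the other entry and the helper function are hypothetical.

```python
# Hedged sketch of steps S9-S10: when the fingertip touches the surface over
# an icon, project the screen associated with that icon's content.
def project_screen(name):
    """Hypothetical stand-in for handing a screen image to the projection unit."""
    print(f"projecting: {name}")

ICON_SCREENS = {
    "SETTINGS": "SETTINGS SCREEN",  # named in the text
    "HOME": "HOME SCREEN",          # assumed entry for illustration
}

def on_fingertip_contact(icon_label):
    screen = ICON_SCREENS.get(icon_label)
    if screen is not None:
        project_screen(screen)
```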
- According to the first embodiment, the operation icons 9 are easy to see even in the region shaded by the hand 12, and favorable operation is possible.
- When the hand 12 approaches from the right side or the left side, the operation icons 9 are projected along the edge part of the right side or the left side of the projected image 8 (see FIG. 6). In this case, the operation icons 9 located farther to the front than the operation icon 9 indicated by the fingertip (see FIG. 5) are not shaded by the hand 12, and so remain easy to see.
- The width from the edge part of the projected image 8 to the edge part of the photography region 11 may be made to vary depending on the direction of the edge part. For example, as depicted in FIG. 11, the width of A may be made narrower than the width of B. Also, as depicted in FIG. 12, the width from the edge part of the projected image 8 to the edge part of the photography region 11 may be made to become narrower as the distance from the casing 4 increases.
- Because the operation icons 9 are thereby projected before the projected image 8 is shaded by the hand 12 when the hand 12 approaches from the right, the left, or the rear of the photography region 11 (see FIG. 5), the operator can recognize the positions of the operation icons 9 earlier.
- Also, because the operation icons 9 located at the front of the projected image 8, which is less prone to being shaded by the hand 12 (see FIG. 5), are not projected until immediately before the hand 12 approaches the projected image 8, mistaken operations, such as the operation icons 9 being projected when the hand 12 mistakenly comes close to the mounting surface G, can be prevented.
- Alternatively, the photography region 11 may remain in its original state (see FIG. 10(a) and the like), and only the region in which the hand recognition unit 36 determines whether or not the image that is based on the image data contains the shape of the hand 12 may be altered, like the photography region 11 depicted in FIGS. 11 and 12.
- In the above description, each of the operation icons 9 projected onto the projected image 8 has a uniform size and the display gaps between the operation icons 9 are equal, but the sizes and display gaps of the operation icons 9 may be made to differ.
- For example, the sizes and display gaps of the operation icons 9 may be decided on the basis of the distance from the projection window 10, such that, as depicted in FIG. 13, large operation icons 9 are projected onto the front of the projected image 8 (see FIG. 5) and small operation icons 9 are projected onto the rear, with the display gaps becoming narrower towards the front.
- The operation icons 9 located in the region shaded by the hand 12, as depicted in FIG. 14, can thereby be made even easier to see.
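- One way to realize this variation is to interpolate icon size and gap with distance from the projection window, as in the sketch below. The linear law and the endpoint values are assumptions; the text only states that the icons grow, and the gaps narrow, towards the front.

```python
# Hedged sketch of the distance-based layout of FIG. 13: icons farther from
# the projection window are larger and more closely spaced. The linear
# interpolation and endpoint constants are illustrative assumptions.
def layout_by_distance(num_icons, size_rear_cm=2.0, size_front_cm=4.0,
                       gap_rear_cm=2.0, gap_front_cm=0.5):
    """Return a list of (size_cm, gap_cm), from the rear icon to the front icon."""
    layout = []
    for i in range(num_icons):
        t = i / max(num_icons - 1, 1)  # 0.0 at the rear, 1.0 at the front
        size = size_rear_cm + t * (size_front_cm - size_rear_cm)
        gap = gap_rear_cm + t * (gap_front_cm - gap_rear_cm)
        layout.append((size, gap))
    return layout
```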
- Also, when the operation icons 9 are projected along the projection direction as depicted in FIG. 6, the distance from the edge part of the projected image 8 on the side from which the hand 12 approaches to the projection positions of the operation icons 9 is uniform; however, as depicted in FIG. 15, the operation icons 9 may be projected at positions increasingly removed from that edge part as the operation icons 9 become farther away from the projection window 10.
- The operation icons 9 at the front of the projected image 8 (see FIG. 5) can thereby be prevented from being shaded by the hand 12 when an operation icon 9 located at the rear of the projected image 8 is indicated by the fingertip. In this case as well, the display gaps between the operation icons 9 may be made narrower towards the front (see FIG. 5).
- In the first embodiment, the region of the hand 12 and the positions of the fingertips are detected from the image data so that the hand recognition unit 36 determines whether or not the image that is based on the image data contains the shape of the hand 12; however, the region of an indication rod or the like and the position of its tip may instead be detected, so as to determine whether or not the image contains the shape of the indication rod or the like.
- The operation icons 9 can thereby be projected onto the projected image 8 even when an indication member other than the hand 12 approaches the photography region 11.
- Because an indication rod or the like 18 is less likely to shade the projected image 8 than the hand 12, the operation icons 9 will not become difficult to see in its shadow. Therefore, when the image that is based on the image data contains the shape of the indication rod or the like, the operation icons 9 may be projected around the position of the tip of the indication rod or the like 18, as depicted in FIG. 16, for superior operability. The operation icons 9 may also be altered to a smaller size and projected at narrower display gaps. The operation icons 9 can thereby be indicated without major movement of the indication rod or the like 18 when the projector 2 is operated with an indication rod or the like 18, which is less likely to shade the projected image 8.
- In the first embodiment, the operation icon 9 positioned directly under the fingertip is altered to a larger size, but, as depicted in FIG. 17, all the operation icons 9 projected onto the projected image 8 may be altered to a larger size.
- Also, both the sizes and the display gaps of the operation icons 9 are decided in step S5, but only one of the sizes or the display gaps of the operation icons 9 may be decided instead.
- The first embodiment has been described taking the example of a case in which the operation icons 9 are superimposed and projected onto the projected image 8, but the projector 2 may be provided with an auxiliary projection unit that projects the operation icons 9, in addition to the projection unit 34, such that the auxiliary projection unit projects the operation icons 9 onto a region different from the projected image 8, for example, an adjacent region.
- In this case, because the operation icons 9 will not be shaded by the hand 12, the operation icons 9 are even easier to see, and more favorable operation is possible.
- The operation icons 9 may be projected onto the region adjacent to the projected image 8 at sizes and display gaps that vary in accordance with the direction in which the hand 12 approaches the photography region 11 (see FIG. 5). Also, when the operation icons 9 are projected along the projection direction (see FIG. 6), large operation icons 9 may be projected at narrow gaps at the front or the rear of the region adjacent to the projected image 8 (see FIG. 5), and small operation icons 9 may be projected at wide gaps on the left or the right side. Further, when the operation icons 9 are projected along the projection direction (see FIG. 6), the operation icons 9 may be projected at positions increasingly removed from the edge part of the region adjacent to the projected image 8 as the operation icons 9 become farther away from the projection window 10.
- Small operation icons 9 may also be projected at narrow gaps around the position corresponding to the tip of the indication rod or the like 18 in the region adjacent to the projected image 8.
- The sizes of the operation icons 9 may also be altered to a larger size when the fingertip comes close to the mounting surface G.
- The projected image 8 and the operation icons 9 may also be projected side by side in a single region; for example, a single region may be partitioned into two, with the projected image 8 projected on one side and the operation icons 9 on the other.
- In the first embodiment, the projected image is projected onto the mounting surface G of the desk 6, but it may also be projected onto another level surface such as a wall or floor, onto a curved body such as a ball, or onto a moving object or the like.
- Next, a projector 200 according to a second embodiment is described. As depicted in FIG. 18, the projector according to the second embodiment is provided with a photography window 14 that photographs the shadow 62 of a hand.
- In addition to the configuration of the projector 2 according to the first embodiment, a hand shadow recognition unit 60 that determines whether or not a photographed image contains the shape of the shadow 62 of a hand is provided, and the direction detection unit 38 is additionally provided with a function of detecting the direction in which the shadow 62 of the hand approaches the photography region 11.
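- As a rough illustration, the hand shadow recognition unit 60 could first look for a large, dark region in the camera frame as a shadow candidate, as in the sketch below. The brightness threshold and minimum area are assumptions; the patent again only specifies pattern matching or the like, which would be applied to verify the candidate's shape.

```python
# Hedged sketch of the shadow detection used in step S13: find a large dark
# region in the frame as a candidate for the shadow 62 of a hand. Thresholds
# are assumed; shape verification (pattern matching) would follow.
import cv2

SHADOW_MAX_BRIGHTNESS = 60  # assumed: shadow pixels are darker than this
MIN_SHADOW_AREA_PX = 2000   # assumed: ignore small dark specks

def find_shadow_candidate(frame_gray):
    """Return the largest dark contour, or None if no plausible shadow exists."""
    _, dark = cv2.threshold(frame_gray, SHADOW_MAX_BRIGHTNESS, 255,
                            cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    big = [c for c in contours if cv2.contourArea(c) >= MIN_SHADOW_AREA_PX]
    return max(big, key=cv2.contourArea) if big else None
```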
- The following is a description of a process in the projector according to the second embodiment, with reference to the flow chart depicted in FIG. 20. First, the casing 4 is mounted onto the mounting surface G, and when the power is switched on, the CPU 20 emits projection light in a downward-sloping direction from the projection window 10 to project the projected image 8 onto the mounting surface G, as depicted in FIGS. 21(a) and 21(b) (step S11).
- The CPU 20 then uses the camera 24 to begin photographing the photography region 11, which contains the projected image 8 (step S12).
- The camera 24 photographs using video photography, or still image photography at fixed time intervals, and the image data of the images photographed by the camera 24 is stored in the image memory unit 26.
- Next, the CPU 20 reads out the image data from the image memory unit 26 and uses the hand shadow recognition unit 60 to determine whether or not the image that is based on the image data contains the shape of the shadow 62 of the hand (step S13). This determination is made by detecting the region of the shadow 62 of the hand, and the position corresponding to the fingertip within that region, from the image that is based on the image data, using pattern matching or the like.
- When the image that is based on the image data contains the shape of the shadow 62 of the hand (step S13: Yes), the CPU 20 uses the direction detection unit 38 to detect from which of the directions depicted in FIG. 5 the shadow 62 of the hand approached the photography region 11 (step S14). On the other hand, when the image that is based on the image data does not contain the shape of the shadow 62 of the hand (step S13: No), the CPU 20 returns to the process of step S11 without detecting the direction in which the shadow 62 of the hand approached.
- Next, the CPU 20 determines whether the direction of approach of the shadow 62 of the hand belongs to the front-rear direction or the left-right direction depicted in FIG. 5, and decides the sizes and display gaps of the operation icons 9 on the basis of the result (step S15).
- Next, the CPU 20 reads out the image data of the operation icons 9 from the program memory unit 30 and instructs the projection unit 34 to project the operation icons 9, at the sizes and display gaps decided in step S15, along the edge part of the projected image 8 on the side from which the shadow 62 of the hand approached (step S16).
- For example, as depicted in FIG. 22, when the shadow 62 of the hand approaches the photography region 11 from the right side, the operation icons 9 are projected along the edge part of the right side of the projected image 8 at the sizes and display gaps decided in step S15.
- Also, as depicted in FIG. 23, when the shadow 62 of the hand approaches the photography region 11 from the front, the operation icons 9 are projected along the edge part of the front of the projected image 8 at the sizes and display gaps decided in step S15.
- Next, the CPU 20 uses the hand recognition unit 36 to determine whether or not the image that is based on the image data contains the shape of the hand 12 (step S17).
- When the image contains the shape of the hand 12 (step S17: Yes), the CPU 20 determines whether or not the distance from the operation icons 9 to the fingertip is a given distance or less (step S18).
- When the distance from the operation icons 9 to the fingertip is the given distance or less (step S18: Yes), the CPU 20 alters the size of the operation icon 9 located directly under the fingertip to a larger size, and uses the projection unit 34 to project the enlarged operation icon 9 (step S19). On the other hand, when the distance from the operation icons 9 to the fingertip is greater than the given distance (step S18: No), the CPU 20 returns to the process of step S11 without altering the size of the operation icons 9.
- Next, the CPU 20 determines from the image data whether or not the fingertip has come into contact with the mounting surface G (step S20). For example, when the fingertip located over the operation icon 9 "SETTINGS" comes into contact with the mounting surface G (see FIGS. 10(a) and 10(b); step S20: Yes), the CPU 20 instructs the projection unit 34 to project an image "SETTINGS SCREEN" for setting, for example, the brightness of the projected image 8, corresponding to the content of the operation icon 9 "SETTINGS" (step S21). On the other hand, when the fingertip does not come into contact with the mounting surface G (step S20: No), the CPU 20 returns to the process of step S11 without projecting an image corresponding to the content of the operation icon 9.
- According to the second embodiment, the operation icons 9 can be projected on the basis of the direction of approach of the shadow 62 of the hand even when, for example, the position (height) of the hand 12 is separated from the mounting surface G and the hand 12 itself does not approach the photography region 11. Since the operation icons 9 are therefore projected earlier, the operator can select an operation icon 9 well in advance. Further, because the operation icons 9 are projected at different sizes and display gaps in accordance with the direction of approach of the shadow 62 of the hand, the operation icons 9 are easy to see, and favorable operation is possible, even in the region shaded by the hand 12, similar to the first embodiment.
- As in the first embodiment, the width from the edge part of the projected image 8 to the edge part of the photography region 11 may be made to vary depending on the direction of the edge part; for example, as depicted in FIG. 11, the width of A may be made narrower than the width of B, or, as depicted in FIG. 12, the width may be made to become narrower as the distance from the casing 4 increases.
- Because the operation icons 9 located at the front of the projected image 8, which is less prone to being shaded by the hand 12 (see FIG. 5), are not projected until immediately before the shadow 62 of the hand approaches the projected image 8, mistaken operations, such as the operation icons 9 being projected when the hand 12 mistakenly comes close to the mounting surface G, can be prevented.
- Alternatively, the photography region 11 may remain in its original state (see FIG. 10(a) and the like), and only the region in which the hand shadow recognition unit 60 determines whether or not the image that is based on the image data contains the shape of the shadow 62 of the hand may be altered, like the photography region 11 depicted in FIGS. 11 and 12.
- Also in the second embodiment, each of the operation icons 9 projected onto the projected image 8 has a uniform size and the display gaps between the operation icons 9 are equal, but the sizes and display gaps of the operation icons 9 may be made to differ.
- For example, the sizes and display gaps of the operation icons 9 may be decided on the basis of the distance from the projection window 10, such that, as depicted in FIG. 24, large operation icons 9 are projected onto the front of the projected image 8 (see FIG. 5) and small operation icons 9 are projected onto the rear, with the display gaps becoming narrower towards the front.
- The operation icons 9 located in the region shaded by the hand 12, as depicted in FIG. 25, can thereby be made even easier to see.
- Also, the operation icons 9 may be projected at positions increasingly removed from the edge part of the projected image 8 on the side from which the shadow 62 of the hand approaches as the operation icons 9 become farther away from the projection window 10.
- The operation icons 9 at the front of the projected image 8 can thereby be prevented from being shaded by the hand 12 when an operation icon 9 located at the rear of the projected image 8 is indicated by the fingertip. In this case as well, the display gaps between the operation icons 9 may be made narrower towards the front (see FIG. 5).
- As in the first embodiment, the hand shadow recognition unit 60 may determine whether or not the image that is based on the image data contains the shadow of an indication rod or the like.
- The operation icons 9 can thereby be projected onto the projected image 8 even when the shadow of an indication member other than the hand 12 approaches the photography region 11.
- Because an indication rod or the like is less likely to shade the projected image 8, the operation icons 9 will not become difficult to see in its shadow. Therefore, when the image that is based on the image data contains the shadow of the indication rod or the like, the operation icons 9 may be projected around the position of the tip of the shadow 19 of the indication rod or the like, as depicted in FIG. 27, for superior operability.
- The operation icons 9 may also be altered to a smaller size and projected at narrower display gaps. The operation icons 9 can thereby be indicated without major movement of the indication rod or the like 18 when the projector 200 is operated with an indication rod or the like 18, which is less likely to shade the projected image 8.
- The second embodiment has been described taking the example of a case in which the operation icons 9 are superimposed and projected onto the projected image 8, but the projector 200 may be provided with an auxiliary projection unit that projects the operation icons 9, in addition to the projection unit 34, such that the auxiliary projection unit projects the operation icons 9 onto a region different from the projected image 8, for example, an adjacent region.
- In this case, because the operation icons 9 will not be shaded by the hand 12, the operation icons 9 are even easier to see, and more favorable operation is possible.
- The projected image 8 and the operation icons 9 may also be projected side by side in a single region; for example, a single region may be partitioned into two, with the projected image 8 projected on one side and the operation icons 9 on the other.
- Further, the hand recognition unit 36 may be made to determine whether or not the image data contains image data corresponding to the shape of the hand 12. In that case, the CPU 20 uses the direction detection unit 38 to detect from which of the directions depicted in FIG. 5 the hand 12 has approached the photography region 11, and then determines whether the direction of approach belongs to the front-rear direction or the left-right direction depicted in FIG. 5, while the hand shadow recognition unit 60 is used to determine whether or not the image data contains image data corresponding to the shape of the shadow 62 of the hand (step S13).
- The CPU 20 is thereby able to project the operation icons 9 when, as depicted in FIG. 25, the hand 12 approaches the photography region 11 and the shadow 62 of the hand also approaches the photography region 11. Note that, in this case, the operation icons 9 are not projected when the hand 12 does not approach the photography region 11, even though the shadow 62 of the hand may have approached it; it is therefore possible to prevent mistaken operations, such as the projection of the operation icons 9 when the operator has mistakenly brought the hand 12 close to the projector 200.
- Further, the operation icons 9 may be projected only when the position of the shadow 62 of the hand that has approached the photography region 11 is located on an extension of the position of the hand 12 in the projection direction.
- For example, when the hand 12 and the shadow 62 of the hand have both approached the photography region 11 and, as depicted in FIG. 29(b), the position of the shadow 62 of the hand is located on an extension of the position of the hand 12 in the projection direction, the operation icons 9 are projected.
- On the other hand, when the position of the shadow 62 of the hand is not located on an extension of the position of the hand 12 in the projection direction, the operation icons 9 are not projected. It is thereby possible to prevent mistaken operations, such as the projection of the operation icons 9 in a case in which someone other than the operator has mistakenly brought a hand 12 close to the projector 200 and the shadow 62 of the hand, cast by illumination light other than the projection light, approaches the photography region 11 (see FIG. 30), thus allowing the projector 200 to have enhanced operability.
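- This consistency check can be sketched as a test that the hand and its shadow lie along the same ray from the projection window, with the shadow farther out than the hand. The 2-D simplification, the angular tolerance, and the coordinate assumptions are all illustrative.

```python
# Hedged sketch of the FIG. 29 check: project icons only when the shadow lies
# on the extension of the hand's position in the projection direction, i.e.
# window -> hand -> shadow are (nearly) collinear. Tolerance is assumed.
import math

def shadow_on_extension(window_xy, hand_xy, shadow_xy, tolerance_deg=10.0):
    """True when the shadow sits on the ray from the window through the hand."""
    angle_hand = math.atan2(hand_xy[1] - window_xy[1], hand_xy[0] - window_xy[0])
    angle_shadow = math.atan2(shadow_xy[1] - window_xy[1], shadow_xy[0] - window_xy[0])
    diff = abs(angle_hand - angle_shadow)
    diff = min(diff, 2 * math.pi - diff)  # wrap to [0, pi]
    same_direction = math.degrees(diff) <= tolerance_deg
    farther_out = math.dist(window_xy, shadow_xy) >= math.dist(window_xy, hand_xy)
    return same_direction and farther_out
```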
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Projection Apparatus (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Position Input By Displaying (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
A projector is provided with a projection unit that projects an image on a projection surface; a detection unit that detects the direction of approach of an indication member, or of the shadow of the indication member, that has approached a detection region including the projection surface; and a controller that controls the display gaps of a plurality of icons projected on the projection surface, based on the projection direction in which the image is projected by the projection unit and the direction of approach detected by the detection unit, wherein the controller makes the display gaps wider when the plurality of icons are projected along the projection direction than when the plurality of icons are projected along a direction different from the projection direction of the projection unit.
Description
- The present application is a continuation application of U.S. application Ser. No. 13/251,732, filed on Oct. 3, 2011, which claims the benefit of priority from Japanese Patent Application No. 2010-231075 filed on Oct. 14, 2010 and Japanese Patent Application No. 2011-121266 filed on May 31, 2011; the entire contents of all of which are incorporated herein by reference.
- It is an object of the present invention to provide a projector in which the operation icons are easy to see and which can be favorably operated.
- The projector of the present invention includes: a projection unit that projects onto a projection surface a projected image that is based on image data; a detection unit that detects the direction of approach of an indication member or of the shadow of the indication member that has approached a detection region that includes the projected image; an icon memory unit that stores the image data of a plurality of icons; a decision unit that decides at least one of the display size of the icons, the display gaps between the icons, and the display positions of the icons, on the basis of the projection direction of the projected image and the direction of approach detected by the detection unit; and a projection control unit that projects the icons onto the display positions decided by the decision unit at the display sizes and display gaps decided by the decision unit.
- The projector of the present invention is also characterized by being provided with a projection unit that projects onto a projection surface a projected image that is based on image data; an icon memory unit that stores the image data of a plurality of icons; a position memory unit that stores the display positions of the icons; a decision unit that decides at least one of the display size of the icons or the display gaps between the icons displayed along the projection direction of the projected image by a distance from the projection unit that is based on the display positions of the icons; and a projection control unit that projects each of the icons at the display sizes and display gaps decided by the decision unit.
- The projector of the present invention is also characterized by being provided with a projection unit that projects onto a projection surface a projected image that is based on image data; a detection unit that detects the direction of approach of an indication member and the direction of approach of the shadow of the indication member that has approached a detection region that includes the projected image; an icon memory unit that stores image data of a plurality of icons; a decision unit that decides at least one of the display sizes of the icons, the display gaps between the icons, and the display positions of the icons, on the basis of the projection direction of the projected image and the direction of approach of the indication member and direction of approach of the shadow of the indication member detected by the detection unit; and a projection control unit that projects the icons at the display positions decided by the decision unit at the display sizes and display gaps decided by the decision unit.
- According to the projector of the present invention, the operation icons are easy to be seen and favorable operation is possible.
-
FIG. 1 is a perspective view depicting a projected state and a photographed state of a projector according to a first embodiment; -
FIG. 2 is a block diagram depicting the configuration of the projector according to the first embodiment; -
FIG. 3 is a flow chart depicting a process in the projector according to the first embodiment; -
FIG. 4 is a front view and a side view depicting the projected state and the photographed state of the projector according to the first embodiment; -
FIG. 5 is a diagram depicting a direction on a projected image of the projector according to the first embodiment; -
FIG. 6 is a diagram depicting operation icons superimposed and projected on the projected image by the projector according to the first embodiment, as well as a direction of approach of a hand; -
FIG. 7 is a diagram depicting the operation icons superimposed and projected on the projected image by the projector according to the first embodiment, as well as the direction of approach of a hand; -
FIG. 8 is a diagram depicting the operation icons superimposed and projected on the projected image by the projector according to the first embodiment; -
FIG. 9 is a diagram depicting the operation icons superimposed and projected on the projected image by the projector according to the first embodiment; -
FIG. 10 is a diagram depicting the operation icons superimposed and projected on the projected image by the projector according to the first embodiment, as well as the position of a fingertip; -
FIG. 11 is a diagram depicting a photography region in the projector according to the first embodiment; -
FIG. 12 is a diagram depicting the photography region in the projector according to the first embodiment; -
FIG. 13 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment, as well as the direction of approach of a hand; -
FIG. 14 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment; -
FIG. 15 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment; -
FIG. 16 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment; -
FIG. 17 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the first embodiment; -
FIG. 18 is a perspective view depicting a projected state and a photographed state of a projector according to a second embodiment; -
FIG. 19 is a block diagram depicting the configuration of the projector according to the second embodiment; -
FIG. 20 is a flow chart depicting a process in the projector according to the second embodiment; -
FIG. 21 is a front view and a side view depicting the projected state and the photographed state of the projector according to the second embodiment; -
FIG. 22 is a diagram depicting operation icons superimposed and projected onto a projected image by the projector according to the second embodiment, as well as a direction of approach of the shadow of a hand; -
FIG. 23 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand; -
FIG. 24 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand; -
FIG. 25 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment; -
FIG. 26 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand; -
FIG. 27 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment; -
FIG. 28 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand; -
FIG. 29 is a diagram depicting the operation icons superimposed and projected onto the projected image by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand; and -
FIG. 30 is a diagram depicting the projected image projected by the projector according to the second embodiment, as well as the direction of approach of the shadow of a hand. - The following is a description of a projector according to a first embodiment of the present invention, with reference to the drawings.
FIG. 1 is a perspective view depicting a projected state and a photographed state of aprojector 2 according to the first embodiment. Theprojector 2 is provided with acasing 4 made of metal or plastic, thecasing 4 being mounted onto a mounting surface G, which is the top surface of adesk 6 or the like. The front surface of thecasing 4 is also provided with aprojection window 10 that projects a projectedimage 8 and an operation icon 9 onto the mounting surface G, and with aphotography window 14 that photographs an indication member such as ahand 12, which approaches aphotography region 11 and indicates the operation icon 9. -
FIG. 2 is a block diagram depicting the system configuration of the projector according to the first embodiment. Theprojector 2 is provided with aCPU 20, theCPU 20 being connected to anoperation unit 22 provided with a power switch and the like (not shown); acamera 24 having an imaging sensor constituted of a CCD or the like, that photographs a subject; animage memory unit 26 that stores image data of an image photographed by thecamera 24; aprogram memory unit 30 that houses a program for setting and controlling photography, projection, and the like; amemory card 32 that stores the image data of an image to be projected; a projection unit 34 that projects an image that is based on image data stored in theimage memory unit 26 or thememory card 32; ahand recognition unit 36 that determines whether or not a photographed image contains the shape of ahand 12; and adirection detection unit 38 that detects the direction in which thehand 12 approaches aphotography region 11. Herein, the projection unit 34 is provided with apower control unit 48 that turns anLED light source 46 on and off, and with aprojection control unit 52 that controls the display of an LCOS 50 that displays an image to be projected. - The following is a description of a process in the projector according to the first embodiment, with reference to the flowchart depicted in
The following is a description of a process in the projector according to the first embodiment, with reference to the flowchart depicted in FIG. 3. First, as depicted in FIG. 4(a), the casing 4 is mounted onto the mounting surface G, and when the power is switched on, the CPU 20 instructs the projection unit 34 to begin projecting and reads out image data from the memory card 32 so as to use the projection control unit 52 to display, on the LCOS 50, an image that is based on the image data. In response to the instruction to begin projecting, the power control unit 48 also switches on the LED light source 46, and, as depicted in FIGS. 4(a) and 4(b), projection light is emitted in a downward-sloping direction from the projection window 10 so as to project the projected image 8 onto the mounting surface G (step S1).
The CPU 20 also uses the camera 24 to begin photographing a photography region 11 that includes the projected image 8 (step S2). Herein, the camera 24 photographs using video photography or still-image photography at fixed time intervals, and image data of the image photographed by the camera 24 is stored in the image memory unit 26.
Next, the CPU 20 reads out image data from the image memory unit 26 and uses the hand recognition unit 36 to determine whether or not the image that is based on the image data contains the shape of the hand 12 (step S3). Note that whether or not the image that is based on the image data contains the shape of the hand 12 is determined by detecting the region of the hand 12 and the position of the fingertips from the image that is based on the image data, using pattern matching or the like.
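For illustration only (this code is not part of the original disclosure), the determination of step S3 can be sketched as template matching, one common instance of the pattern matching named above. The template file, match threshold, and function name are editorial assumptions; a real hand recognition unit 36 would be considerably more robust.

```python
import cv2

# Hypothetical hand-silhouette template and match threshold; both are
# illustrative assumptions, not values from the disclosure.
HAND_TEMPLATE = cv2.imread("hand_template.png", cv2.IMREAD_GRAYSCALE)
MATCH_THRESHOLD = 0.7

def find_hand_region(frame_gray):
    """Step S3 sketch: return (x, y, w, h) of a region matching the hand
    template, or None when the image does not contain the shape of a hand."""
    result = cv2.matchTemplate(frame_gray, HAND_TEMPLATE, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < MATCH_THRESHOLD:
        return None
    h, w = HAND_TEMPLATE.shape
    return (max_loc[0], max_loc[1], w, h)
```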
Herein, FIG. 5 is a diagram looking down from above onto the projector 2 mounted on the mounting surface G and onto the photography region 11, which contains the projected image 8 projected onto the mounting surface G. As depicted in FIG. 5, on the projected image 8, along the projection direction, the rear is taken to be the side closer to the casing 4 of the projector 2, and the front is taken to be the side farther away from the casing 4. Also, in the direction orthogonal to the projection direction, the right side is taken to be the right side on the projected image 8, and the left side the left side thereof.
When the image that is based on the image data contains the shape of the hand 12 (step S3: Yes), the CPU 20 uses the direction detection unit 38 to detect from which of the directions depicted in FIG. 5 the hand 12 approaches the photography region 11 (step S4). On the other hand, when the image that is based on the image data does not contain the shape of the hand 12 (step S3: No), the CPU 20 returns to the process of step S1 without detecting the direction in which the hand 12 approaches.
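One plausible reading of the direction detection in step S4, again as an editorial sketch rather than the disclosed implementation, is to classify the approach by the edge of the photography region 11 nearest the detected hand region. The coordinate convention (rear edge, on the projector side, at y = 0) is an assumption.

```python
def approach_direction(hand_box, region_w, region_h):
    """Step S4 sketch: classify which edge of the photography region 11
    the hand region enters from: 'front', 'rear', 'left', or 'right'.
    hand_box is (x, y, w, h) in region coordinates; y = 0 is assumed to
    be the rear (projector-side) edge."""
    x, y, w, h = hand_box
    # Distance from the hand region to each edge of the photography region.
    distances = {
        "rear": y,
        "front": region_h - (y + h),
        "left": x,
        "right": region_w - (x + w),
    }
    # The hand is taken to approach from the edge its region is closest to.
    return min(distances, key=distances.get)
```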
Next, the CPU 20 determines whether the direction of approach of the hand 12 belongs to the front-rear direction or the left-right direction depicted in FIG. 5, and decides the sizes and display gaps of the operation icons 9 on the basis of the result (step S5). For example, when the hand 12 approaches from either the right side or the left side of the photography region 11, the size of the operation icons 9 is decided to be 2 cm × 2 cm, and the gap between the operation icons 9 is decided to be 2 cm. Also, when the hand 12 approaches from either the front or the rear of the photography region 11, the size of the operation icons 9 is decided to be 4 cm × 4 cm, and the gap between the operation icons 9 is decided to be 0.5 cm, for example.
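Step S5 therefore reduces to a mapping from approach direction to a size and a display gap. The dimensions below are the example values from the paragraph above; packaging them as a function is an editorial illustration, not the disclosed implementation.

```python
def decide_icon_layout(direction):
    """Step S5 sketch: return (icon_size_cm, gap_cm) for the detected
    approach direction, using the embodiment's example values."""
    if direction in ("left", "right"):
        # Icons in front of the indicated one are shaded by the hand,
        # so small icons at wide gaps favor visibility.
        return 2.0, 2.0    # 2 cm x 2 cm icons, 2 cm gaps
    # 'front' or 'rear': adjacent icons are not shaded, so large icons
    # at narrow gaps favor operability.
    return 4.0, 0.5        # 4 cm x 4 cm icons, 0.5 cm gaps
```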
Next, the CPU 20 reads out the image data of the operation icons 9 from the program memory unit 30 and instructs the projection unit 34 to project the operation icons 9, at the size and display gaps decided in step S5, along the edge part of the side of the projected image 8 from which the hand 12 approaches (step S6). For example, as depicted in FIG. 6, when the hand 12 approaches the photography region 11 from the right side, the operation icons 9 are projected along the edge part of the right side of the projected image 8 at the size and display gaps decided in step S5. Also, as depicted in FIG. 7, when the hand 12 approaches the photography region 11 from the front, the operation icons 9 are projected along the edge part of the front of the projected image 8 at the size and display gaps decided in step S5.
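Concrete projection positions for step S6 could be computed as in the following sketch. The coordinate convention (centimeters from the rear-left corner of the projected image 8) and the edge margin are assumptions made for illustration.

```python
def layout_icons_along_edge(direction, n_icons, size_cm, gap_cm,
                            image_w_cm, image_h_cm, margin_cm=0.5):
    """Step S6 sketch: (x, y) centers for icons lined up along the edge
    part of the projected image 8 on the side the hand approaches from."""
    pitch = size_cm + gap_cm
    span = n_icons * size_cm + (n_icons - 1) * gap_cm
    if direction in ("left", "right"):
        # A column along the left or right edge part.
        x = (margin_cm + size_cm / 2 if direction == "left"
             else image_w_cm - margin_cm - size_cm / 2)
        start = (image_h_cm - span) / 2 + size_cm / 2
        return [(x, start + i * pitch) for i in range(n_icons)]
    # A row along the front or rear edge part.
    y = (margin_cm + size_cm / 2 if direction == "rear"
         else image_h_cm - margin_cm - size_cm / 2)
    start = (image_w_cm - span) / 2 + size_cm / 2
    return [(start + i * pitch, y) for i in range(n_icons)]
```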
Next, the CPU 20 determines whether or not the distance from the operation icons 9 to the fingertip is a given distance or less (step S7). For example, as depicted in FIG. 8(a), with the hand 12 approaching the photography region 11 from the front (see FIG. 5), the fingertip is taken to be located on the operation icon 9 "HOME." In this case, the CPU 20 measures, on the basis of the image data, a distance X in the normal direction from the mounting surface G to the position of the fingertip at the position of the operation icon 9 "HOME," as depicted in FIG. 8(b), and determines whether or not the distance X in the normal direction is the given distance or less. Herein, the operator is able to set the given distance as desired.
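Steps S7 and S8 (described next) together amount to a threshold test followed by a selective enlargement. A minimal sketch, assuming the fingertip position and the distance X have already been measured; the 1 cm threshold and the 1.5x enlargement factor are assumed values, not ones given in the disclosure.

```python
def update_icon_sizes(icons, fingertip_xy, distance_x_cm,
                      threshold_cm=1.0, enlarge=1.5):
    """Steps S7-S8 sketch: when the normal-direction distance X is the
    given distance or less, enlarge the icon directly under the fingertip.
    icons: dicts with 'center' (x, y), 'base_size', and 'size' keys."""
    for icon in icons:
        icon["size"] = icon["base_size"]   # step S7: No -> sizes unaltered
    if distance_x_cm > threshold_cm:
        return
    for icon in icons:
        cx, cy = icon["center"]
        half = icon["base_size"] / 2
        if (abs(fingertip_xy[0] - cx) <= half and
                abs(fingertip_xy[1] - cy) <= half):
            icon["size"] = icon["base_size"] * enlarge   # step S8
```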
When the distance X in the normal direction is the given distance or less (step S7: Yes), the CPU 20 alters the size of the operation icon 9 located directly under the fingertip, for example "HOME," to a larger size, and uses the projection unit 34 to project the large-sized operation icon "HOME" as depicted in FIG. 8(a) (step S8). On the other hand, when the distance X in the normal direction is greater than the given distance (step S7: No), the CPU 20 returns to the process of step S1 without altering the size of the operation icon 9.
Herein, because the camera 24 photographs using video photography or the like and the distance X in the normal direction is measured sequentially, when the fingertip is slid onto another operation icon 9 while the distance X in the normal direction is held at the given distance or less, as depicted in FIG. 9, the enlargement of the operation icons 9 follows the position of the fingertip.
Next, the CPU 20 determines from the image data whether or not the fingertip has come into contact with the mounting surface G (step S9). For example, as depicted in FIGS. 10(a) and 10(b), when a fingertip located on the operation icon 9 "SETTINGS" comes into contact with the mounting surface G (step S9: Yes), the CPU 20 instructs the projection unit 34 to project an image "SETTINGS SCREEN" for setting, for example, the brightness of the projected image 8, corresponding to the content of the operation icon 9 "SETTINGS" (step S10). On the other hand, when the fingertip does not come into contact with the mounting surface G (step S9: No), the CPU 20 returns to the process of step S1 without projecting an image corresponding to the content of the operation icon 9.
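The contact handling of steps S9 and S10 is essentially a dispatch from the touched icon to the screen it triggers. In this sketch, only the "SETTINGS" to "SETTINGS SCREEN" pairing comes from the text; the "HOME" entry and the callback parameter are hypothetical.

```python
# Only the "SETTINGS" entry appears in the disclosure; "HOME" is assumed
# here purely to show the shape of the mapping.
ICON_ACTIONS = {
    "SETTINGS": "SETTINGS SCREEN",
    "HOME": "HOME SCREEN",
}

def on_fingertip_contact(icon_label, project):
    """Steps S9-S10 sketch: when the fingertip contacts the mounting
    surface G on an icon, hand the corresponding screen to the
    projection unit (represented by the project callback)."""
    screen = ICON_ACTIONS.get(icon_label)
    if screen is not None:
        project(screen)
```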
According to the projector 2 based on this first embodiment, because the operation icons 9 are projected at different sizes and display gaps in accordance with the direction of approach of the hand 12, the operation icons 9 are easy to see even in the region shaded by the hand 12, and favorable operation is possible. For example, when the hand 12 approaches from the right side or the left side of the photography region 11 (see FIG. 5), the operation icons 9 are projected along the edge part of the right side or the left side of the projected image 8 (see FIG. 6). In this case, because the operation icons 9 located farther to the front (see FIG. 5) than the operation icon 9 indicated by the fingertip are shaded by the hand 12, small-sized operation icons 9 are projected along the edge part of the right side or the left side of the projected image 8 at wide gaps, for superior visibility. On the other hand, when the hand 12 approaches from the front or the rear of the photography region 11 (see FIG. 5), the operation icons 9 are projected along the edge part of the front or the rear of the projected image 8 (see FIG. 7). In such a case, because the operation icons 9 adjacent to the indicated operation icon 9 are not shaded by the hand 12, large-sized operation icons 9 are projected along the edge part of the front or the rear of the projected image 8 at narrow gaps, for superior operability.
Note that in the first embodiment, the width from the edge part of the projected image 8 to the edge part of the photography region 11 may be made to vary depending on the direction of each edge part. For example, as depicted in FIG. 11, the width of A may be made narrower than the width of B. Also, as depicted in FIG. 12, the width from the edge part of the projected image 8 to the edge part of the photography region 11 may be made to become narrower as the distance from the casing 4 increases. Because the operation icons 9 are thereby projected before the projected image 8 is shaded by the hand 12 when the hand 12 approaches from the right, the left, or the rear of the photography region 11 (see FIG. 5), the operator can recognize the position of the operation icons 9 earlier. Also, because the operation icons 9 located at the front of the projected image 8, which is less prone to being shaded by the hand 12 (see FIG. 5), are not projected until immediately before the hand 12 approaches the projected image 8, mistaken operations, such as the operation icons 9 being projected when the hand 12 mistakenly comes close to the mounting surface G, can be prevented. Note that even when the photography region 11 itself remains in its original state (see FIG. 10(a) and the like), the region within which the hand recognition unit 36 determines whether or not the image that is based on the image data contains the shape of the hand 12 may be altered like the photography region 11 depicted in FIGS. 11 and 12.
Also, in the first embodiment, when the operation icons 9 are projected along the projection direction as depicted in FIG. 6, each of the operation icons 9 projected onto the projected image 8 has a uniform size and the display gaps between the operation icons 9 are equal, but the sizes and display gaps of the operation icons 9 may be made to differ. For example, in step S5, the sizes and display gaps of the operation icons 9 may be decided on the basis of the distance from the projection window 10, such that, as depicted in FIG. 13, large-sized operation icons 9 are projected onto the front of the projected image 8 (see FIG. 5) and small-sized operation icons 9 are projected onto the rear, with the display gaps becoming narrower towards the front. The operation icons 9 located in the region shaded by the hand 12, as depicted in FIG. 14, can thereby be made even easier to see.
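The FIG. 13 variation grades sizes and gaps by distance from the projection window 10. A sketch, assuming linear grading and reusing the step S5 example dimensions as endpoints; neither assumption is stated in the disclosure.

```python
def graded_icon_layout(n_icons, rear_size_cm=2.0, front_size_cm=4.0,
                       rear_gap_cm=2.0, front_gap_cm=0.5):
    """FIG. 13 sketch: per-icon sizes grow, and display gaps shrink, from
    the rear (near the projection window 10) towards the front."""
    sizes, gaps = [], []
    for i in range(n_icons):
        t = i / max(n_icons - 1, 1)      # 0 at the rear, 1 at the front
        sizes.append(rear_size_cm + t * (front_size_cm - rear_size_cm))
        gaps.append(rear_gap_cm + t * (front_gap_cm - rear_gap_cm))
    return sizes, gaps
```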
Also, in the first embodiment, when the operation icons 9 are projected along the projection direction as depicted in FIG. 6, the distance from the edge part of the projected image 8 on the side from which the hand 12 approaches to the projection position of the operation icons 9 is uniform, but, as depicted in FIG. 15, the operation icons 9 may be projected at positions increasingly removed from that edge part as the positions of the operation icons 9 become farther and farther away from the projection window 10. The operation icons 9 at the front of the projected image 8 (see FIG. 5) can thereby be prevented from being shaded by the hand 12 when an operation icon 9 located at the rear of the projected image 8 is indicated by the fingertip. Also, even in this case, the display gaps between the operation icons 9 may be made narrower towards the front (see FIG. 5).
Also, in the first embodiment, the region of the hand 12 and the position of the fingertips are detected from the image data so that the hand recognition unit 36 determines whether or not the image that is based on the image data contains the shape of the hand 12, but the region of an indication rod or the like and the position of its tip may instead be detected, so as to determine whether or not the image that is based on the image data contains the shape of the indication rod or the like. The operation icons 9 can thereby be projected onto the projected image 8 even when an indication member other than the hand 12 approaches the photography region 11.
Also, when a thin member such as an indication rod is used to indicate an operation icon 9, the operation icons 9 in the shadow of the indication rod or the like will not become difficult to see. Therefore, when the image that is based on the image data contains the shape of the indication rod or the like, as depicted in FIG. 16, the operation icons 9 may be projected around the position of the tip of the indication rod 18 or the like, for superior operability. The size of the operation icons 9 may also be altered to a smaller size, and the operation icons 9 may be projected at narrower display gaps. The operation icons 9 can thereby be indicated without major movement of the indication rod 18 or the like when the projector 2 is operated with an indication rod 18 or the like, which is less likely to shade the projected image 8.
Further, in the first embodiment, when the distance X in the normal direction (see FIG. 8(b)) is the given distance or less, the operation icon 9 positioned directly under the fingertip is altered to a larger size, but as depicted in FIG. 17, all the operation icons 9 projected onto the projected image 8 may be altered to a larger size.

Moreover, in the first embodiment, both the sizes and the display gaps of the operation icons 9 are decided in step S5, but only one of the sizes and the display gaps of the operation icons 9 may be decided instead.
Further, the first embodiment has been described taking the example of a case in which the operation icons 9 are superimposed and projected onto the projected
image 8, but the projector 2 may be provided with an auxiliary projection unit that projects the operation icons 9, in addition to the projection unit 34, such that the auxiliary projection unit projects the operation icons 9 onto a region different from the projected image 8, for example an adjacent region. In this case, because the operation icons 9 will not be shaded by the hand 12, the operation icons 9 are even easier to see, and even more favorable operation is possible.
Moreover, in this case, the operation icons 9 may be projected onto the region adjacent to the projected image 8 at sizes and display gaps that vary in accordance with the direction in which the hand 12 approaches the photography region 11 (see FIG. 5). Also, when the operation icons 9 are projected along the projection direction (see FIG. 6), large-sized operation icons 9 may be projected at narrow gaps at the front or the rear of the region adjacent to the projected image 8 (see FIG. 5), and small-sized operation icons 9 may be projected at wide gaps on the left or the right side. Further, when the operation icons 9 are projected along the projection direction (see FIG. 6), the operation icons 9 may be projected at positions increasingly removed from the edge part of the region adjacent to the projected image 8 as the positions of the operation icons 9 become farther and farther away from the projection window 10.
Also, when the image data contains the shape of the indication rod or the like, small-sized operation icons 9 may be projected at narrow gaps around the position corresponding to the tip of the indication rod 18 or the like in the region adjacent to the projected image 8. The sizes of the operation icons 9 may also be altered to a larger size when the fingertip comes close to the mounting surface G.
The projected image 8 and the operation icons 9 may also be projected side by side in a single region. For example, a single region may be partitioned in two so as to project the projected image 8 on one side and the operation icons 9 on the other.
Further, in the first embodiment, the projected image is projected onto the mounting surface G of the desk 6, but the projected image may also be projected onto another level surface, such as a wall or floor. Projection may also be done onto a curved body such as a ball, or onto a moving object or the like.
The following is a description of a projector according to a second embodiment. Note that under the projection window 10 on the front surface of the casing 4 of the projector 200, the projector according to the second embodiment is provided with a photography window 14 that photographs the shadow 62 of a hand, as depicted in FIG. 18. Further, as depicted in the block diagram of FIG. 19, in addition to the configuration of the projector 2 according to the first embodiment, the projector 200 is provided with a hand shadow recognition unit 60 that determines whether or not a photographed image contains the shape of the shadow 62 of a hand, and the direction detection unit 38 is further provided with a function of detecting the direction in which the shadow 62 of the hand approaches the photography region 11. Accordingly, a detailed description of the parts of the configuration that are identical to the first embodiment is omitted, and a description is provided only for the points of difference. The same reference numerals are used for the parts of the configuration that are the same as in the first embodiment.
The following is a description of the process in the projector 200 according to the second embodiment, with reference to the flowchart depicted in FIG. 20. First, as depicted in FIG. 21(a), the casing 4 is mounted onto the mounting surface G, and when the power is switched on, the CPU 20 emits projection light in a downward-sloping direction from the projection window 10 to project the projected image 8 onto the mounting surface G, as depicted in FIGS. 21(a) and 21(b) (step S11).
Next, the CPU 20 uses the camera 24 to begin photographing the photography region 11, which contains the projected image 8 (step S12). Herein, the camera 24 photographs using video photography or still-image photography at fixed time intervals, and image data of the image photographed by the camera 24 is stored in the image memory unit 26.
Next, the CPU 20 reads out the image data from the image memory unit 26 and uses the hand shadow recognition unit 60 to determine whether or not the image that is based on the image data contains the shape of the shadow 62 of the hand (step S13). Herein, whether or not the image that is based on the image data contains the shape of the shadow 62 of the hand is determined by detecting, from the image that is based on the image data, the region of the shadow 62 of the hand and the position corresponding to the fingertip within that region, using pattern matching or the like.
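Because a shadow is simply a dark region of the photographed image, the detection of step S13 could be approximated by intensity thresholding before the shape is matched. A rough editorial sketch; the gray-level threshold is an assumed value, and matching the extracted contour against a hand outline (the disclosure's pattern matching) is omitted.

```python
import cv2

SHADOW_GRAY_LEVEL = 60   # assumed: pixels darker than this count as shadow

def find_shadow_region(frame_gray):
    """Step S13 sketch: segment dark regions of the photographed image and
    return the largest contour as the candidate shadow 62 of the hand,
    or None when no dark region is found."""
    _, dark = cv2.threshold(frame_gray, SHADOW_GRAY_LEVEL, 255,
                            cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)
```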
When the image that is based on the image data contains the shape of the shadow 62 of the hand (step S13: Yes), the CPU 20 uses the direction detection unit 38 to detect from which of the directions depicted in FIG. 5 the shadow 62 of the hand approached the photography region 11 (step S14). On the other hand, when the image that is based on the image data does not contain the shape of the shadow 62 of the hand (step S13: No), the CPU 20 returns to the process of step S11 without detecting the direction in which the shadow 62 of the hand approached.
Next, the CPU 20 determines whether the direction of approach of the shadow 62 of the hand belongs to the front-rear direction or the left-right direction depicted in FIG. 5, and then decides the sizes and display gaps of the operation icons 9 on the basis of the result (step S15).
Next, the CPU 20 reads out the image data of the operation icons 9 from the program memory unit 30 and instructs the projection unit 34 to project the operation icons 9, at the sizes and display gaps decided in step S15, along the edge part of the side of the projected image 8 from which the shadow 62 of the hand approached (step S16). For example, as depicted in FIG. 22, when the shadow 62 of the hand approaches the photography region 11 from the right side, the operation icons 9 are projected along the edge part of the right side of the projected image 8 at the sizes and display gaps decided in step S15. Further, as depicted in FIG. 23, when the shadow 62 of the hand approaches the photography region 11 from the front, the operation icons 9 are projected along the edge part of the front of the projected image 8 at the sizes and display gaps decided in step S15.
Next, the CPU 20 uses the hand recognition unit 36 to determine whether or not the image that is based on the image data contains the shape of the hand 12 (step S17). When the image that is based on the image data does not contain the shape of the hand 12 (step S17: No), the process returns to step S11. On the other hand, when the image that is based on the image data does contain the shape of the hand 12 (step S17: Yes), the CPU 20 determines whether or not the distance from the operation icons 9 to the fingertip is a given distance or less (step S18).
When the distance from the operation icons 9 to the fingertip is the given distance or less (step S18: Yes), the CPU 20 alters the size of the operation icon 9 located directly under the fingertip to a larger size and uses the projection unit 34 to project the large-sized operation icon 9 (step S19). On the other hand, when the distance from the operation icons 9 to the fingertip is greater than the given distance (step S18: No), the CPU 20 returns to the process of step S11 without altering the size of the operation icons 9.
Next, the CPU 20 determines from the image data whether or not the fingertip has come into contact with the mounting surface G (step S20). For example, when the fingertip located above the operation icon 9 "SETTINGS" comes into contact with the mounting surface G (see FIGS. 10(a) and 10(b); step S20: Yes), the CPU 20 instructs the projection unit 34 to project an image "SETTINGS SCREEN" for setting, for example, the brightness of the projected image 8, corresponding to the content of the operation icon 9 "SETTINGS" (step S21). On the other hand, when the fingertip does not come into contact with the mounting surface G (step S20: No), the CPU 20 returns to the process of step S11 without projecting an image corresponding to the content of the operation icon 9.
According to the projector 200 based on this second embodiment, because the direction detection unit 38 detects the direction in which the shadow 62 of the hand approaches the photography region 11, the operation icons 9 can be projected on the basis of the direction of approach of the shadow 62 of the hand even when, for example, the position (height) of the hand 12 is separated from the mounting surface G and the hand 12 does not approach the photography region 11. Since the operation icons 9 are therefore projected earlier, the operator can select an operation icon 9 well in advance. Further, because the operation icons 9 are projected at different sizes and display gaps in accordance with the direction of approach of the shadow 62 of the hand, the operation icons 9 are easy to see even in the region shaded by the hand 12 and favorable operation is possible, as in the first embodiment.
Note that in the second embodiment, the width from the edge part of the projected image 8 to the edge part of the photography region 11 may be made to vary depending on the direction of each edge part. For example, as depicted in FIG. 11, the width of A may be made narrower than the width of B. Also, as depicted in FIG. 12, the width from the edge part of the projected image 8 to the edge part of the photography region 11 may be made to become narrower as the distance from the casing 4 increases. Because the operation icons 9 are thereby projected before the projected image 8 is shaded by the hand 12 when the shadow 62 of the hand approaches from the right, the left, or the rear of the photography region 11 (see FIG. 5), the operator can recognize the position of the operation icons 9 earlier. Also, because the operation icons 9 located at the front of the projected image 8, which is less prone to being shaded by the hand 12 (see FIG. 5), are not projected until immediately before the shadow 62 of the hand approaches the projected image 8, mistaken operations, such as the operation icons 9 being projected when the hand 12 mistakenly comes close to the mounting surface G, can be prevented. Note that even when the photography region 11 itself remains in its original state (see FIG. 10(a) and the like), the region within which the hand shadow recognition unit 60 determines whether or not the image that is based on the image data contains the shape of the shadow 62 of the hand may be altered like the photography region 11 depicted in FIGS. 11 and 12.
Also, in the second embodiment, when the operation icons 9 are projected along the projection direction as depicted in FIG. 22, each of the operation icons 9 projected onto the projected image 8 has a uniform size and the display gaps between the operation icons 9 are equal, but the sizes and display gaps of the operation icons 9 may be made to differ. For example, in step S15, the sizes and display gaps of the operation icons 9 may be decided on the basis of the distance from the projection window 10, such that, as depicted in FIG. 24, large-sized operation icons 9 are projected onto the front of the projected image 8 (see FIG. 5) and small-sized operation icons 9 are projected onto the rear thereof, with the display gaps becoming narrower towards the front. The operation icons 9 located in the region shaded by the hand 12, as depicted in FIG. 25, can thereby be made even easier to see.
Also, in the second embodiment, as depicted in FIG. 26, the operation icons 9 may be projected at positions increasingly removed from the edge part of the projected image 8 on the side from which the shadow 62 of the hand approaches as the positions of the operation icons 9 become farther and farther away from the projection window 10. The operation icons 9 at the front of the projected image 8 (see FIG. 5) can thereby be prevented from being shaded by the hand 12 when an operation icon 9 located at the rear of the projected image 8 is indicated by the fingertip. Also, even in this case, the display gaps between the operation icons 9 may be made narrower towards the front (see FIG. 5).
Also, in the second embodiment, the hand shadow recognition unit 60 may determine whether or not the image that is based on the image data contains the shadow of an indication rod or the like. The operation icons 9 can thereby be projected onto the projected image 8 even when the shadow of an indication member other than the hand 12 approaches the photography region 11.
Also, when a thin member such as an indication rod is used to indicate an operation icon 9, the operation icons 9 in the shadow of the indication rod or the like will not become difficult to see. Therefore, when the image that is based on the image data contains the shadow of the indication rod or the like, as depicted in FIG. 27, the operation icons 9 may be projected around the position of the tip of the shadow 19 of the indication rod or the like, for superior operability. The size of the operation icons 9 may also be altered to a smaller size, and the operation icons 9 may be projected at narrower display gaps. The operation icons 9 can thereby be indicated without major movement of the indication rod 18 or the like when the projector 200 is operated with an indication rod 18 or the like, which is less likely to shade the projected image 8.
Further, the second embodiment has been described taking the example of a case in which the operation icons 9 are superimposed and projected onto the projected image 8, but the projector 200 may be provided with an auxiliary projection unit that projects the operation icons 9, in addition to the projection unit 34, such that the auxiliary projection unit projects the operation icons 9 onto a region different from the projected image 8, for example an adjacent region. In this case, because the operation icons 9 will not be shaded by the hand 12, the operation icons 9 are even easier to see, and even more favorable operation is possible.
The projected image 8 and the operation icons 9 may also be projected side by side in a single region. For example, a single region may be partitioned in two so as to project the projected image 8 on one side and the operation icons 9 on the other.
Further, in the second embodiment, after the photography of the photography region 11 has begun (step S12), the hand recognition unit 36 may first be made to determine whether or not the image data contains image data corresponding to the shape of the hand 12. Herein, when the image data does contain image data corresponding to the shape of the hand 12, the CPU 20 uses the direction detection unit 38 to detect from which of the directions depicted in FIG. 5 the hand 12 has approached the photography region 11, and then determines whether the direction of approach belongs to the front-rear direction or the left-right direction depicted in FIG. 5. Then, after the direction of approach of the hand 12 has been determined, the hand shadow recognition unit 60 may be used to determine whether or not the image data contains image data corresponding to the shape of the shadow 62 of the hand (step S13). The CPU 20 is thereby able to project the operation icons 9 when, as depicted in FIG. 25, the hand 12 approaches the photography region 11 and the shadow 62 of the hand also approaches the photography region 11. Note that in this case, as depicted in FIG. 28, the operation icons 9 are not projected when the hand 12 does not approach the photography region 11, even though the shadow 62 of the hand may have approached it; it is therefore possible to prevent mistaken operations, such as the projection of the operation icons 9 when the operator has mistakenly brought the hand 12 close to the projector 200.
In this case, the operation icons 9 may be made to be projected only when the position of the shadow 62 of the hand, having approached the photography region 11, is located on an extension of the position of the hand 12 in the projection direction. For example, as depicted in FIG. 29(a), the hand 12 and the shadow 62 of the hand have approached the photography region 11. In this case, as depicted in FIG. 29(b), the operation icons 9 are projected when the position of the shadow 62 of the hand is located on an extension of the position of the hand 12 in the projection direction. On the other hand, as depicted in FIG. 30, in a case such as when the hand 12 and the shadow 62 of the hand approach the photography region 11 from different directions, the position of the shadow 62 of the hand is not located on an extension of the position of the hand 12 in the projection direction, and therefore the operation icons 9 are not projected. It is thereby possible to prevent mistaken operations, such as the projection of the operation icons 9 when someone other than the operator has mistakenly brought a hand 12 close to the projector 200 and the shadow 62 of that hand, cast by illumination light other than the projection light, approaches the photography region 11 (see FIG. 30), thus giving the projector 200 enhanced operability.
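Under one reading, the FIG. 29 condition, that the shadow 62 of the hand lie on an extension of the position of the hand 12 in the projection direction, reduces to a collinearity-and-ordering test in the plane of the mounting surface G. A sketch under that assumption; the tolerance value and the planar simplification are editorial choices, not part of the disclosure.

```python
import math

def shadow_on_projection_extension(projector_xy, hand_xy, shadow_xy,
                                   tol_cm=2.0):
    """FIG. 29 sketch: True when the shadow lies approximately on the ray
    from the projection window 10 through the hand, beyond the hand.
    All points are taken in the plane of the mounting surface G."""
    vx, vy = hand_xy[0] - projector_xy[0], hand_xy[1] - projector_xy[1]
    wx, wy = shadow_xy[0] - projector_xy[0], shadow_xy[1] - projector_xy[1]
    norm_v = math.hypot(vx, vy)
    if norm_v == 0.0:
        return False
    # Perpendicular distance of the shadow from the projection ray.
    off_ray = abs(vx * wy - vy * wx) / norm_v
    # Signed distance of the shadow along the ray.
    along = (vx * wx + vy * wy) / norm_v
    return off_ray <= tol_cm and along >= norm_v
```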
The above-described embodiments have been recited in order to facilitate understanding of the present invention, and are not recited in order to limit the present invention. Accordingly, each element disclosed in the above-described embodiments is intended to include all design changes and equivalents falling within the technical scope of the present invention.

Claims (5)
1. A projector, comprising:
a projection unit that projects an image on a projection surface;
a detection unit that detects the direction of approach of an indication member or of the shadow of the indication member that has approached a detection region including the projection surface; and
a controller that controls display gaps of a plurality of icons which are projected on the projection surface, based on the projection direction in which the image is projected by the projection unit and the direction of approach which is detected by the detection unit,
wherein the controller makes the display gaps used when the plurality of icons are projected along the projection direction wider than the display gaps used when the plurality of icons are projected along a direction different from the projection direction of the projection unit.
2. The projector according to claim 1, wherein the detection unit has an imaging sensor which photographs the projection surface.
3. The projector according to claim 1, comprising an icon memory unit that stores image data of the plurality of icons.
4. The projector according to claim 1, wherein the projection unit projects the plurality of icons in a region different from the region in which the image is projected on the projection surface.
5. The projector according to claim 1, wherein the controller controls display sizes of the plurality of icons based on the projection direction in which the image is projected by the projection unit and the direction of approach which is detected by the detection unit.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/227,657 US20190121227A1 (en) | 2010-10-14 | 2018-12-20 | Projector |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010231075 | 2010-10-14 | ||
| JP2010-231075 | 2010-10-14 | ||
| JP2011121266A JP5304848B2 (en) | 2010-10-14 | 2011-05-31 | projector |
| JP2011-121266 | 2011-05-31 | ||
| US13/251,732 US10203594B2 (en) | 2010-10-14 | 2011-10-03 | Projector |
| US16/227,657 US20190121227A1 (en) | 2010-10-14 | 2018-12-20 | Projector |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/251,732 Continuation US10203594B2 (en) | 2010-10-14 | 2011-10-03 | Projector |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20190121227A1 true US20190121227A1 (en) | 2019-04-25 |
Family
ID=45972649
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/251,732 Active US10203594B2 (en) | 2010-10-14 | 2011-10-03 | Projector |
| US16/227,657 Abandoned US20190121227A1 (en) | 2010-10-14 | 2018-12-20 | Projector |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/251,732 Active US10203594B2 (en) | 2010-10-14 | 2011-10-03 | Projector |
Country Status (3)
| Country | Link |
|---|---|
| US (2) | US10203594B2 (en) |
| JP (1) | JP5304848B2 (en) |
| CN (1) | CN102457730B (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240161664A1 (en) * | 2021-05-06 | 2024-05-16 | Beijing Youzhuju Network Technology Co., Ltd. | Projection display method and apparatus, terminal, and non-transitory storage medium |
Families Citing this family (35)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2013061680A (en) * | 2010-10-14 | 2013-04-04 | Nikon Corp | Display device |
| JP6000553B2 (en) * | 2012-01-24 | 2016-09-28 | キヤノン株式会社 | Information processing apparatus and control method thereof |
| JP6135239B2 (en) * | 2012-05-18 | 2017-05-31 | 株式会社リコー | Image processing apparatus, image processing program, and image processing method |
| JP5972692B2 (en) * | 2012-07-11 | 2016-08-17 | 株式会社Nttドコモ | User interface device, user interface method and program |
| JP6167529B2 (en) * | 2013-01-16 | 2017-07-26 | 株式会社リコー | Image projection apparatus, image projection system, control method, and program |
| EP2992682B1 (en) * | 2013-05-01 | 2020-01-22 | Lumo Interactive Inc. | Content generation for interactive video projection systems |
| JP5882270B2 (en) * | 2013-08-29 | 2016-03-09 | 東芝テック株式会社 | Information processing apparatus and program |
| JP6245938B2 (en) * | 2013-10-25 | 2017-12-13 | キヤノン株式会社 | Information processing apparatus and control method thereof, computer program, and storage medium |
| JP5973087B2 (en) * | 2013-11-19 | 2016-08-23 | 日立マクセル株式会社 | Projection-type image display device |
| US10194124B2 (en) | 2013-12-19 | 2019-01-29 | Maxell, Ltd. | Projection type video display device and projection type video display method |
| DE102014012516A1 (en) * | 2014-01-20 | 2015-07-23 | Beijing Lenovo Software Ltd. | Information processing method and electrical device |
| US10133355B2 (en) * | 2014-03-21 | 2018-11-20 | Dell Products L.P. | Interactive projected information handling system support input and output devices |
| US9304599B2 (en) | 2014-03-21 | 2016-04-05 | Dell Products L.P. | Gesture controlled adaptive projected information handling system input and output devices |
| US9965038B2 (en) | 2014-03-21 | 2018-05-08 | Dell Products L.P. | Context adaptable projected information handling system input environment |
| JP6079695B2 (en) * | 2014-05-09 | 2017-02-15 | コニカミノルタ株式会社 | Image display photographing system, photographing device, display device, image display and photographing method, and computer program |
| US9993733B2 (en) | 2014-07-09 | 2018-06-12 | Lumo Interactive Inc. | Infrared reflective device interactive projection effect system |
| US20170214862A1 (en) * | 2014-08-07 | 2017-07-27 | Hitachi Maxell, Ltd. | Projection video display device and control method thereof |
| JP6381361B2 (en) * | 2014-08-20 | 2018-08-29 | キヤノン株式会社 | DATA PROCESSING DEVICE, DATA PROCESSING SYSTEM, DATA PROCESSING DEVICE CONTROL METHOD, AND PROGRAM |
| JP6047763B2 (en) * | 2014-09-03 | 2016-12-21 | パナソニックIpマネジメント株式会社 | User interface device and projector device |
| US10620748B2 (en) * | 2014-10-22 | 2020-04-14 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and device for providing a touch-based user interface |
| US10452206B2 (en) | 2014-12-08 | 2019-10-22 | Maxell, Ltd. | Projection video display device and video display method |
| JP6585628B2 (en) * | 2014-12-26 | 2019-10-02 | マクセル株式会社 | Lighting device |
| JP6604528B2 (en) * | 2015-01-09 | 2019-11-13 | 凸版印刷株式会社 | Image processing apparatus and image processing system |
| CN106556963A (en) * | 2015-09-24 | 2017-04-05 | 北京京东尚科信息技术有限公司 | Projection arrangement and projecting method |
| JP2017146927A (en) | 2016-02-19 | 2017-08-24 | ソニーモバイルコミュニケーションズ株式会社 | Control device, control method, and program |
| JP6607121B2 (en) | 2016-03-30 | 2019-11-20 | セイコーエプソン株式会社 | Image recognition apparatus, image recognition method, and image recognition unit |
| JP6285980B2 (en) * | 2016-04-14 | 2018-02-28 | 株式会社ソニー・インタラクティブエンタテインメント | Processing apparatus and projection image generation method |
| JP6844159B2 (en) * | 2016-09-12 | 2021-03-17 | セイコーエプソン株式会社 | Display device and control method of display device |
| JP6822209B2 (en) * | 2017-02-24 | 2021-01-27 | セイコーエプソン株式会社 | Projector and projector control method |
| JP6996114B2 (en) | 2017-05-29 | 2022-01-17 | セイコーエプソン株式会社 | Projector and projector control method |
| JP6456551B1 (en) * | 2017-08-31 | 2019-01-23 | 三菱電機株式会社 | OPTICAL DEVICE CONTROL DEVICE, OPTICAL DEVICE CONTROL METHOD, AND OPTICAL DEVICE CONTROL PROGRAM |
| CN108509090B (en) * | 2018-03-26 | 2020-08-25 | 联想(北京)有限公司 | Projection control method and electronic system |
| JP7227020B2 (en) * | 2019-01-31 | 2023-02-21 | 住友重機械工業株式会社 | Injection molding machine |
| JP7162813B1 (en) | 2022-03-04 | 2022-10-31 | チームラボ株式会社 | Display control system for drawing |
| TWI892140B (en) * | 2023-05-30 | 2025-08-01 | 精元電腦股份有限公司 | Non-contact detection and identification system |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20100093399A1 (en) * | 2008-10-15 | 2010-04-15 | Lg Electronics Inc. | Image projection in a mobile communication terminal |
| US20100238141A1 (en) * | 2009-03-19 | 2010-09-23 | Sanyo Electric Co., Ltd. | Projection display apparatus, writing/drawing board, and projection system |
| US20110169746A1 (en) * | 2007-09-04 | 2011-07-14 | Canon Kabushiki Kaisha | Projection apparatus and control method for same |
Family Cites Families (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9513744B2 (en) * | 1994-08-15 | 2016-12-06 | Apple Inc. | Control systems employing novel physical controls and touch screens |
| JPH10341487A (en) * | 1997-04-09 | 1998-12-22 | Sony Corp | Information terminal device, information processing method, information providing device and method, information network system, and providing medium |
| TWI227462B (en) * | 2003-12-16 | 2005-02-01 | Ind Tech Res Inst | Projector device with virtual input element image |
| JP4351599B2 (en) * | 2004-09-03 | 2009-10-28 | パナソニック株式会社 | Input device |
| JP4479962B2 (en) | 2005-02-25 | 2010-06-09 | ソニー エリクソン モバイル コミュニケーションズ, エービー | Input processing program, portable terminal device, and input processing method |
| JP2007011459A (en) * | 2005-06-28 | 2007-01-18 | Konica Minolta Business Technologies Inc | Image formation device |
| US7286062B2 (en) * | 2005-06-29 | 2007-10-23 | Honeywell International, Inc. | Perspective view conformal traffic targets display |
| US7730422B2 (en) * | 2006-01-25 | 2010-06-01 | Microsoft Corporation | Smart icon placement across desktop size changes |
| US8913003B2 (en) * | 2006-07-17 | 2014-12-16 | Thinkoptics, Inc. | Free-space multi-dimensional absolute pointer using a projection marker system |
| JP2008197934A (en) * | 2007-02-14 | 2008-08-28 | Calsonic Kansei Corp | Operator determining method |
| JP4829855B2 (en) * | 2007-09-04 | 2011-12-07 | キヤノン株式会社 | Image projection apparatus and control method thereof |
| JP2009284468A (en) * | 2008-04-23 | 2009-12-03 | Sharp Corp | Personal digital assistant, computer readable program and recording medium |
| JP5258399B2 (en) * | 2008-06-06 | 2013-08-07 | キヤノン株式会社 | Image projection apparatus and control method thereof |
| KR101537596B1 (en) * | 2008-10-15 | 2015-07-20 | 엘지전자 주식회사 | Mobile terminal and method for recognizing touch thereof |
| CN104298398A (en) | 2008-12-04 | 2015-01-21 | 三菱电机株式会社 | Display input device |
| US8707195B2 (en) * | 2010-06-07 | 2014-04-22 | Apple Inc. | Devices, methods, and graphical user interfaces for accessibility via a touch-sensitive surface |
2011
- 2011-05-31 JP JP2011121266A patent/JP5304848B2/en active Active
- 2011-10-03 US US13/251,732 patent/US10203594B2/en active Active
- 2011-10-14 CN CN201110312252.4A patent/CN102457730B/en active Active

2018
- 2018-12-20 US US16/227,657 patent/US20190121227A1/en not_active Abandoned
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110169746A1 (en) * | 2007-09-04 | 2011-07-14 | Canon Kabushiki Kaisha | Projection apparatus and control method for same |
| US20100093399A1 (en) * | 2008-10-15 | 2010-04-15 | Lg Electronics Inc. | Image projection in a mobile communication terminal |
| US20100238141A1 (en) * | 2009-03-19 | 2010-09-23 | Sanyo Electric Co., Ltd. | Projection display apparatus, writing/drawing board, and projection system |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240161664A1 (en) * | 2021-05-06 | 2024-05-16 | Beijing Youzhuju Network Technology Co., Ltd. | Projection display method and apparatus, terminal, and non-transitory storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2012104096A (en) | 2012-05-31 |
| CN102457730B (en) | 2017-04-19 |
| CN102457730A (en) | 2012-05-16 |
| US10203594B2 (en) | 2019-02-12 |
| US20120098865A1 (en) | 2012-04-26 |
| JP5304848B2 (en) | 2013-10-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20190121227A1 (en) | Projector | |
| JP6075122B2 (en) | System, image projection apparatus, information processing apparatus, information processing method, and program | |
| US9400562B2 (en) | Image projection device, image projection system, and control method | |
| JP5434997B2 (en) | Image display device | |
| JP5973087B2 (en) | Projection-type image display device | |
| TWI501121B (en) | Gesture recognition method and touch system incorporating the same | |
| JP5258399B2 (en) | Image projection apparatus and control method thereof | |
| CN102196220B (en) | Messaging device and information processing method | |
| US20080267607A1 (en) | Image pickup apparatus and electronic device | |
| CN101681217A (en) | Image projection apparatus and control method for same | |
| JP6079695B2 (en) | Image display photographing system, photographing device, display device, image display and photographing method, and computer program | |
| WO2012120958A1 (en) | Projection device | |
| JP5817149B2 (en) | Projection device | |
| JP2012185630A (en) | Projection device | |
| JP6350175B2 (en) | POSITION DETECTION DEVICE, PROJECTOR, AND POSITION DETECTION METHOD | |
| JP2004164559A (en) | Portable terminal device | |
| US20220116543A1 (en) | Following shoot method, gimbal control method, photographing apparatus, handheld gimbal and photographing system | |
| WO2015104919A1 (en) | Gesture recognition device, operation input device, and gesture recognition method | |
| JPWO2018150569A1 (en) | Gesture recognition device, gesture recognition method, projector including gesture recognition device, and video signal supply device | |
| JP2011095985A (en) | Image display apparatus | |
| JP2015171116A (en) | Display device of camera | |
| JP2018055685A (en) | Information processing apparatus, control method therefor, program, and storage medium | |
| JP6057407B2 (en) | Touch position input device and touch position input method | |
| JP2013061680A (en) | Display device | |
| JP7652156B2 (en) | Detection method, detection device, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |