US20160316081A1 - Augmented reality operation system, and non-transitory computer-readable recording medium storing augmented reality operation program
- Publication number: US20160316081A1 (application US 15/135,853)
- Authority: US (United States)
- Prior art keywords: image, augmented reality, section, hand, synthetic
- Prior art date: 2015-04-25
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N1/00413 — Display of information to the user, e.g. menus, using menus, i.e. presenting the user with a plurality of selectable options
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304 — Detection arrangements using opto-electronic means
- G06F3/0482 — Interaction with lists of selectable items, e.g. menus
- G06F3/12 — Digital output to print unit, e.g. line printer, chain printer
- G06F3/1205 — Improving or facilitating administration, e.g. print management, resulting in increased flexibility in print job configuration, e.g. job settings, print requirements, job tickets
- G06F3/1253 — Configuration of print job parameters, e.g. using UI at the client
- H04N1/00493 — Particular location of the interface or console
- G06F2203/04801 — Cursor retrieval aid, i.e. visual aspect modification, blinking, colour changes, enlargement or other visual cues, for helping the user to find the cursor in graphical user interfaces
Description
- This application claims priority to Japanese Patent Application No. 2015-089901 filed on Apr. 25, 2015, the entire contents of which are incorporated by reference herein.
- This disclosure relates to an augmented reality operation system and a non-transitory computer-readable recording medium storing an augmented reality operation program, and to a technology of displaying a synthetic image obtained by synthesizing an operation image for operating an electronic device with an image taken, and receiving operation on an operation section in the operation image.
- A typical image forming system is known which displays, on a touch panel, a synthetic image obtained by synthesizing an operation image for operating an image forming apparatus with an image taken, and then receives, on the touch panel, operation on an operation section in the operation image. However, this typical image forming system needs a touch panel operated through direct touch by a user.
- According to an aspect of this disclosure, an augmented reality operation system includes: an image-taking device, a display device, an electronic device, an augmented reality processing section, and an operation reception section.
- The augmented reality processing section displays, on the display device, a synthetic image obtained by synthesizing an operation image, which illustrates an operation section for operating the electronic device, with a taken image taken by the image-taking device, in a manner such that the operation image appears to be located at a predefined position in a virtual space.
- The operation reception section receives operation on the operation section illustrated by the operation image. The operation reception section, based on a position (on the synthetic image) of an image illustrating a hand included in the taken image taken by the image-taking device and a position of the operation section illustrated by the operation image on the synthetic image, receives the operation for the electronic device performed on the operation section illustrated by the operation image.
- According to another aspect of this disclosure, a non-transitory computer-readable recording medium stores an augmented reality operation program, and this augmented reality operation program causes a computer to function as: an augmented reality processing section displaying, on a display device, a synthetic image obtained by synthesizing an operation image illustrating an operation section for operating an electronic device with an image taken by an image-taking device; and an operation reception section receiving operation on the operation section illustrated by the operation image.
- The augmented reality operation program further causes the computer to function in a manner such that, based on a position on the synthetic image of an image illustrating a hand included in the image taken by the image-taking device and on a position of the operation section illustrated by the operation image on the synthetic image, the operation reception section receives the operation for the electronic device performed on the operation section illustrated by the operation image.
- FIG. 1 is a schematic diagram illustrating a configuration of an augmented reality operation system according to one embodiment of this disclosure;
- FIG. 2 is a schematic diagram of a block configuration of an MFP in FIG. 1;
- FIG. 3 is a schematic diagram of a block configuration of an augmented reality server in FIG. 1;
- FIG. 4 is a flowchart of operation performed by the augmented reality server in FIG. 3 when operating the MFP;
- FIG. 5 is a schematic diagram illustrating one example of a synthetic image displayed on a display device in FIG. 1;
- FIG. 6 is a schematic diagram illustrating positional relationship between the MFP, an operation image, an image-taking device, and the display device when viewed from a direction orthogonal to a direction in which image-taking is performed by the image-taking device in a case where the display device displays the synthetic image in FIG. 5;
- FIG. 7 is a schematic diagram illustrating one example of a synthetic image displayed on the display device in FIG. 1 in a case where a user's hand is located closer to a display device side than the operation image;
- FIG. 8 is a schematic diagram illustrating positional relationship between the MFP, the virtually arranged operation image and hand, the image-taking device, and the display device when viewed from the direction orthogonal to the direction in which the image-taking is performed by the image-taking device in a case where the display device displays the synthetic image in FIG. 7;
- FIG. 9 is a schematic diagram illustrating one example of a synthetic image displayed on the display device in FIG. 1 in a case where the user's hand is located closer to an MFP side than the operation image;
- FIG. 10 is a schematic diagram illustrating positional relationship between the MFP, the virtually arranged operation image and hand, the image-taking device, and the display device when viewed from the direction orthogonal to the direction in which the image-taking is performed by the image-taking device in a case where the display device displays the synthetic image in FIG. 9;
- FIG. 11 is a schematic diagram illustrating one example of a synthetic image displayed on the display device in FIG. 1 in a case where an image of a virtual hand is arranged with respect to the operation image;
- FIG. 12 is a schematic diagram illustrating positional relationship between the MFP, the virtually arranged operation image and hand, the image-taking device, and the display device when viewed from the direction orthogonal to the direction in which the image-taking is performed by the image-taking device in a case where the display device displays the synthetic image illustrated in FIG. 11;
- FIG. 13 is a perspective diagram illustrating the MFP, the virtually arranged operation image and hand, and the display device in FIG. 12;
- FIG. 14 is a schematic diagram illustrating one example of a synthetic image displayed on the display device illustrated in FIG. 1 in a case where the user's hand is located closer to the MFP side than the operation image and an image of a virtual hand is arranged with respect to the operation image; and
- FIG. 15 is a schematic diagram illustrating one example of a synthetic image displayed on the display device in FIG. 1, an example different from the examples in FIGS. 11 and 14, in a case where an image of a virtual hand is arranged with respect to the operation image.
- Hereinafter, as one aspect of this disclosure, an augmented reality operation system and a computer-readable non-transitory recording medium storing an augmented reality operation program will be described with reference to the drawings.
- First, the configuration of the augmented reality operation system according to this embodiment will be described. FIG. 1 is a schematic diagram illustrating a configuration of the augmented reality operation system 10 according to this embodiment.
- As illustrated in FIG. 1, the augmented reality operation system 10 includes: a multifunction peripheral (MFP) 20 as an electronic device of this disclosure; an augmented reality (AR) server 30, such as a personal computer (PC), which realizes the augmented reality; an image-taking device 41 such as a camera; a display device 42 such as a liquid crystal display (LCD); and an audio output device 43 such as a speaker. The MFP 20, the AR server 30, the image-taking device 41, the display device 42, and the audio output device 43 are connected via a network 11, such as a local area network (LAN) or the Internet, in a manner such as to be capable of communicating with each other.
- FIG. 2 is a schematic diagram of a block configuration of the MFP 20. As illustrated in FIG. 2, the MFP 20 includes: an operation section 21 as an input device, such as buttons, into which various kinds of operation by a user are inputted; a display section 22 as a display device, such as an LCD, which displays various pieces of information; a scanner 23 as a reading device which reads an image from a document; a printer 24 as a printing device which executes printing on a recording medium such as paper; a fax communication section 25 as a fax device which performs fax communication with an external facsimile device, not illustrated, via a communication line such as a public phone line; a communication section 26 as a communication device which performs communication with an external device via the network 11 (see FIG. 1); a storage section 27 as a nonvolatile storage device such as an electrically erasable programmable read only memory (EEPROM, registered trademark) or a hard disk drive (HDD); and a control section 28 which controls the entire MFP 20.
- The control section 28 includes, for example: a central processing unit (CPU); a read only memory (ROM) which stores programs and various pieces of data; and a random access memory (RAM) used as a working area of the CPU. The CPU can execute programs stored in the ROM or the storage section 27.
- FIG. 3 is a schematic diagram of a block configuration of the AR server 30. As illustrated in FIG. 3, the AR server 30 includes: an operation section 31 as an input device, such as a mouse or a keyboard, through which various kinds of operation by the user are inputted; a display section 32 as a display device, such as an LCD, which displays various pieces of information; a communication section 33 as a communication device which performs communication with the external device via the network 11 (see FIG. 1); a storage section 34 as a nonvolatile storage device, such as an HDD, which stores various pieces of data; and a control section 35 which controls the entire AR server 30. Note that the display section 32 is part of the operation section 31.
- The storage section 34 stores an AR operation program 34a which is executed by the control section 35. The AR operation program 34a may be installed in the AR server 30 at the stage at which the AR server 30 is manufactured, may be additionally installed in the AR server 30 from a computer-readable non-transitory recording medium, for example, an external storage medium such as a compact disk (CD), a digital versatile disk (DVD), or a universal serial bus (USB) memory, or may be additionally installed in the AR server 30 from the network 11.
- The control section 35 includes, for example: a CPU; a ROM which stores programs and various pieces of data; and a RAM used as a working area of the CPU. The CPU executes programs stored in the ROM or the storage section 34.
- The control section 35, by executing the AR operation program 34a stored in the storage section 34, functions as: an AR processing section 35a which displays, on the display device 42, a synthetic image obtained by synthesizing an operation image for operating the MFP 20 with an image taken by the image-taking device 41; an operation reception section 35b which receives operation performed on the operation section in the operation image; and a stereophonic section 35c which makes the audio output device 43 output operation sound for the operation section through a stereophonic technology, in a manner such that the operation sound appears to be emitted at the position in the real space corresponding to the position of the operation section in the synthetic image.
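- This three-way division of the control section 35 maps naturally onto a small program structure. The skeleton below is purely illustrative: the patent specifies no implementation, and every class and method name here is hypothetical.

```python
# Hypothetical skeleton of the three functional sections realized by
# executing the AR operation program (names are illustrative only).

class ARProcessingSection:
    """Builds the synthetic image: camera frame plus overlaid operation image."""
    def compose(self, camera_frame, operation_image):
        raise NotImplementedError

class OperationReceptionSection:
    """Maps the detected hand position onto the setting reception images."""
    def receive(self, hand_position, widget_rects):
        raise NotImplementedError

class StereophonicSection:
    """Plays operation sound localized at the pressed widget's real-space position."""
    def play_at(self, real_space_position):
        raise NotImplementedError
```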
- Next, operation of the AR operation system 10 will be described. FIG. 4 is a flowchart of operation performed by the AR server 30 when operating the MFP 20.
- Upon the start of image-taking by the image-taking device 41, the AR processing section 35a of the AR server 30, through either location-based AR, which relies on position information, or vision-based AR, which relies on image processing, starts to display on the display device 42, as illustrated in FIG. 5, a synthetic image 70 obtained by synthesizing a prepared operation image 50 illustrating the operation section 21 of the MFP 20 with a taken image 60 taken by the image-taking device 41 (S101).
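- At its core, the synthesis in S101 is an alpha blend of the prepared operation image onto the camera frame. The sketch below shows that one step with OpenCV and NumPy; the overlay position and opacity are hard-coded assumptions, whereas a real system would derive the placement from the location-based or vision-based pose estimate, and the frame is assumed large enough to contain the overlay.

```python
import cv2
import numpy as np

def compose_synthetic(frame: np.ndarray, operation_img: np.ndarray,
                      top_left=(80, 120), alpha=0.6) -> np.ndarray:
    """Blend a half-translucent operation image into the camera frame."""
    out = frame.copy()
    y, x = top_left
    h, w = operation_img.shape[:2]
    roi = out[y:y + h, x:x + w]
    # Weighted sum: alpha * overlay + (1 - alpha) * background.
    out[y:y + h, x:x + w] = cv2.addWeighted(operation_img, alpha, roi, 1 - alpha, 0)
    return out
```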
- FIG. 5 is a schematic diagram illustrating one example of the synthetic image 70 displayed on the display device 42.
- The operation image 50 illustrated in FIG. 5 is an image illustrating the operation section 21 of the MFP 20, in particular, the display section 22 of the operation section 21. The operation image 50 is, for example, a screen for setting a copy function, that is, a function of printing an image, which has been read from a document by the scanner 23 of the MFP 20, on a recording medium by the printer 24 of the MFP 20. For example, the operation image 50 illustrated in FIG. 5 includes: a setting reception image 51 for setting the number of copies generated through copying; a setting reception image 52 for setting from which supply source a recording medium will be supplied in a case where the MFP 20 has a plurality of supply sources of recording media; a setting reception image 53 for setting the magnification of the copying; a setting reception image 54 for setting the density of the copying; a setting reception image 55 for setting whether or not to read both sides of the document and whether or not to print on both sides of the recording medium; a setting reception image 56 for setting how many document pages will be printed per recording medium; and a setting reception image 57 for setting whether or not to execute the printing on the recording media in order of the documents' page numbers and whether or not to output the printed recording media on an individual print basis. The operation image 50 may be half-translucent.
- Moreover, the taken image 60 illustrated in FIG. 5 includes an image 61 of the MFP 20, taken by the image-taking device 41.
- FIG. 6 is a schematic diagram illustrating the positional relationship between the MFP 20, the virtually illustrated operation image 50, the image-taking device 41, and the display device 42 when viewed from a direction orthogonal to the direction in which image-taking is performed by the image-taking device 41 in a case where the display device 42 displays the synthetic image 70 illustrated in FIG. 5.
- The AR processing section 35a generates the synthetic image 70 such that the operation image 50 appears, for the user viewing the display screen of the display device 42, to be arranged in a space between the MFP 20 and the display device 42. As a result, upon the display of the synthetic image 70 by the display device 42, as illustrated in FIG. 6, the operation image 50 appears, for the user viewing the synthetic image 70 displayed by the display device 42, to be arranged in the space between the MFP 20 and the display device 42. Note that the operation image 50 is drawn in FIG. 6 as if arranged in the real space for easier understanding, but it is not actually present in the real space.
- As illustrated in FIG. 4, the AR processing section 35a judges, based on the taken image 60 taken by the image-taking device 41, whether or not an image illustrating the user's hand is present (S102).
- For example, the user holds the display device 42 to view its display screen. The AR processing section 35a can detect a hand image 62, illustrating a hand 90 of the user, in the taken image 60 either upon generation of the synthetic image 70 viewed such that the hand image 62 lies closer to the display device 42 than the operation image 50 arranged in the aforementioned space, as illustrated in FIGS. 7 and 8, or upon generation of the synthetic image 70 viewed such that the hand image 62 lies closer to the MFP 20 than the operation image 50, as illustrated in FIGS. 9 and 10. FIGS. 7 and 9 show the operation image 50 and the synthetic image 70 for easier understanding of the overlapping relationship with the hand image 62, but the AR processing section 35a actually judges whether or not the hand image 62 is present by using the taken image 60 in a state in which the operation image 50 is not synthesized. Moreover, in FIGS. 8 and 10, the operation image 50 is drawn in the spaces, but this operation image 50 is only virtually illustrated and, as is the case with FIG. 6, is not actually present in the real spaces.
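- The patent does not prescribe how the hand image 62 is detected in S102. One common approach is skin-color segmentation, sketched below with OpenCV 4; the HSV thresholds and the area gate are illustrative values that would need tuning per camera and lighting.

```python
import cv2
import numpy as np

def detect_hand(taken_image: np.ndarray):
    """Return the largest skin-colored contour in the frame, or None."""
    hsv = cv2.cvtColor(taken_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
    # Remove speckle noise before looking for contours.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    return hand if cv2.contourArea(hand) > 2000 else None  # illustrative area gate
```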
- As illustrated in FIG. 4, the AR processing section 35a, upon judgment that the hand image 62 is present in the taken image 60 taken by the image-taking device 41, that is, upon detection of the hand image 62 (YES in S102), deletes the image 62 of the actual hand 90 included in the taken image 60 from the synthetic image 70 through image processing (S103). As a result, the synthetic images 70 illustrated in FIGS. 7 and 9 turn into one like the synthetic image 70 illustrated in FIG. 5. Possible methods of deleting the image 62 through the image processing of the AR processing section 35a include, for example, a method of interpolating from the area around the image 62 and a method of storing the taken image 60 for a specific period and returning the area of the image 62 to its state before the appearance of the image 62.
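- Both deletion methods named for S103 have direct OpenCV counterparts, sketched below under the assumption that hand_mask is a single-channel uint8 mask of the detected hand image 62:

```python
import cv2
import numpy as np

def erase_hand_by_inpainting(frame: np.ndarray, hand_mask: np.ndarray) -> np.ndarray:
    """Method 1: interpolate the masked region from the area around it."""
    return cv2.inpaint(frame, hand_mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)

def erase_hand_from_background(frame: np.ndarray, hand_mask: np.ndarray,
                               stored_background: np.ndarray) -> np.ndarray:
    """Method 2: return the masked area to its state before the hand appeared,
    using a frame stored from before the hand entered the view."""
    out = frame.copy()
    out[hand_mask > 0] = stored_background[hand_mask > 0]
    return out
```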
- The AR processing section 35a, after the processing of S103, arranges a prepared image 80 of a virtual hand, instead of the image 62 illustrating the hand, on the synthetic image 70 at the position of the image 62 in the taken image 60, and thereby generates a synthetic image 70 as illustrated in the example of FIG. 11 (S104).
- Here, the AR processing section 35a uses, as the image 80 of the virtual hand, at least part of the image 62 of the real hand included in the taken image 60. This image 80 is, for example, an image obtained by processing the image 62 to sharpen its contrast with respect to the surrounding images. The AR processing section 35a determines the position at which the image 80 is arranged such that the operation image 50 and the fingertip of the hand 91 illustrated by the image 80 appear, for the user viewing the synthetic image 70 displayed on the display device 42, to overlap in the direction in which image-taking is performed by the image-taking device 41, as illustrated in FIG. 12, and generates the synthetic image 70 accordingly. Note that the operation image 50 and the hand 91 are drawn in the real space in FIG. 12 for easier understanding, but are not actually present in the real space.
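- As one stand-in for the contrast processing described for the image 80, the sketch below boosts the gain of the hand region and cuts it out against an empty canvas; the gain and bias values are assumptions, since the patent leaves the exact processing open.

```python
import cv2
import numpy as np

def make_virtual_hand(frame: np.ndarray, hand_mask: np.ndarray,
                      gain: float = 1.4, bias: float = -30) -> np.ndarray:
    """Cut out the hand region with boosted contrast (illustrative processing)."""
    boosted = cv2.convertScaleAbs(frame, alpha=gain, beta=bias)
    virtual = np.zeros_like(frame)
    virtual[hand_mask > 0] = boosted[hand_mask > 0]
    return virtual  # later pasted at the fingertip-aligned position
```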
- As illustrated in FIG. 4, the AR processing section 35a, after the processing of S104, displays, on the display device 42, the synthetic image 70 generated in S104 (S105).
- The operation reception section 35b of the AR server 30, after the processing of S105, judges whether or not any of the setting reception images 51 to 57 in the operation image 50 on the synthetic image 70 has been pressed (S106). Specifically, the operation reception section 35b receives operation on the setting reception images 51 to 57 based on the position (on the synthetic image 70) of the hand image 62 of the hand 90 taken by the image-taking device 41 and the positions of the setting reception images 51 to 57 on the synthetic image 70. Here, upon a predefined change in the image 62 illustrating the hand 90 at a position displayed on the display device 42 as overlapping the operation image 50, the operation reception section 35b receives the operation corresponding to the position of the image 62 in the operation image 50. For example, the operation reception section 35b judges that the setting reception image 56 has been pressed when the position of a fingertip of the image 62 of the hand 90 in the taken image 60 overlaps the position of the setting reception image 56 on the synthetic image 70, as illustrated in FIGS. 7 and 9, and the size of this fingertip decreases while its position on the synthetic image 70 does not change. The operation reception section 35b then receives the predefined operation on the setting reception image 56. Note that, in S106, the operation reception section 35b may instead receive operation on the setting reception images 51 to 57 based on the position of the image 80 on the synthetic image 70 and the positions of the setting reception images 51 to 57 on the synthetic image 70.
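- The press judgment thus combines two cues: the fingertip overlaps a setting reception image, and its apparent size shrinks while its position stays put (the hand moving away from the camera toward the MFP). A minimal stateful sketch of that heuristic follows; the Fingertip input and the threshold values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Fingertip:
    x: float
    y: float
    area: float  # apparent fingertip size in pixels

class PressDetector:
    def __init__(self, shrink_ratio: float = 0.85, move_tol: float = 10.0):
        self.prev = None
        self.shrink_ratio = shrink_ratio  # "size decreases"
        self.move_tol = move_tol          # "position does not change"

    def update(self, tip: Fingertip, widget_rects: dict):
        """Return the key of the pressed widget (e.g., 51..57) or None."""
        pressed = None
        if self.prev is not None:
            still = (abs(tip.x - self.prev.x) <= self.move_tol and
                     abs(tip.y - self.prev.y) <= self.move_tol)
            shrunk = tip.area < self.prev.area * self.shrink_ratio
            if still and shrunk:
                for key, (x0, y0, x1, y1) in widget_rects.items():
                    if x0 <= tip.x <= x1 and y0 <= tip.y <= y1:
                        pressed = key
                        break
        self.prev = tip
        return pressed
```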
- The stereophonic section 35c of the AR server 30, upon judgment in S106 that any of the setting reception images 51 to 57 on the synthetic image 70 has been pressed, makes the audio output device 43 output operation sound for the pressed setting reception image through the stereophonic technology, in a manner such that the operation sound appears to be emitted at the position in the real space corresponding to the position of that setting reception image in the virtual space (its position within the synthetic image 70 displayed on the display device 42) (S107). For example, in a case where the display device 42 displays the synthetic image 70 illustrated in FIG. 11, the operation image 50 appears, for the user viewing the synthetic image 70 displayed by the display device 42, to be arranged in a space between the MFP 20 and the display device 42, as illustrated in FIG. 13; the stereophonic section 35c then makes the audio output device 43 output operation sound for the setting reception image 56 pressed on the synthetic image 70 such that the sound appears to be emitted at a position 92 in the real space corresponding to the aforementioned position of the setting reception image 56 in the virtual space.
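- The patent leaves the stereophonic technology itself unspecified; full 3-D audio (e.g., HRTF rendering) is beyond a short sketch. The approximation below uses constant-power stereo panning plus distance attenuation to place a short click at a lateral position and distance, returning a stereo buffer for whatever audio backend is in use.

```python
import numpy as np

def pan_click(position_x: float, distance: float,
              sample_rate: int = 44100, freq: float = 1200.0,
              dur: float = 0.05) -> np.ndarray:
    """position_x in [-1, 1] (left..right); distance in meters."""
    t = np.linspace(0.0, dur, int(sample_rate * dur), endpoint=False)
    click = np.sin(2 * np.pi * freq * t) * np.exp(-t * 60)  # decaying tone
    angle = (position_x + 1) * np.pi / 4                    # map to 0..pi/2
    left, right = np.cos(angle), np.sin(angle)              # constant power
    gain = 1.0 / max(distance, 0.1)                         # rough distance falloff
    return np.stack([click * left * gain, click * right * gain], axis=1)
```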
- The operation reception section 35b, after the processing of S107, notifies the MFP 20 of the operation received in S106 (S108). The control section 28 of the MFP 20 executes an action in accordance with this notified operation.
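- S108 says only that the received operation is notified to the MFP 20 over the network 11; the wire protocol is not specified. As one plausible sketch, assume the MFP exposes an HTTP endpoint (the URL and payload shape below are hypothetical):

```python
import json
import urllib.request

def notify_mfp(operation: dict, mfp_host: str = "192.168.1.20") -> bool:
    """POST the received operation, e.g. {'widget': 56, 'action': 'press'}."""
    req = urllib.request.Request(
        f"http://{mfp_host}/api/operation",  # hypothetical endpoint
        data=json.dumps(operation).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status == 200
```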
- The AR processing section 35a, upon judgment in S106 that none of the setting reception images 51 to 57 in the operation image 50 on the synthetic image 70 has been pressed, or upon the end of the processing of S108, returns to the processing of S102.
- As described above, the AR operation system 10 receives the operation corresponding to the setting reception images 51 to 57 based on the position (on the synthetic image 70) of the image 62 of the hand 90 taken by the image-taking device 41 and the positions of the setting reception images 51 to 57 on the synthetic image 70 (S106). Thus, even without an input device operated through direct touch by the user, operation on the setting reception images 51 to 57 in the operation image 50 synthesized with the taken image 60 can be received.
- Human beings holistically integrate information obtained through the five senses when judging operation feeling. For example, when a person hears operation sound from a slightly distant location while the section actually being operated cannot be seen, the person feels that the hand is located where the sound was heard. That is, a person feels no discomfort even if the location where the operation sound is heard differs from the section actually operated.
- The AR operation system 10 can suppress this discomfort for operation performed on the setting reception images 51 to 57, which are not present in the real space, by using the stereophonic technology to output operation sound that appears to be emitted at the position in the real space corresponding to the positions of the setting reception images 51 to 57 on the synthetic image 70, and can therefore improve the operation feeling for the setting reception images 51 to 57. As a result, the user can, for example, easily execute operation on the setting reception images 51 to 57 and is less likely to make a mistake in that operation.
- The AR operation system 10 can further suppress the discomfort of operating the setting reception images 51 to 57, which are not present in the real space, through the interaction between the image 80 of the virtual hand 91 arranged with respect to the operation image 50 on the synthetic image 70 and the stereophonically output operation sound that appears to be emitted at the corresponding position in the real space, and can therefore further improve the operation feeling for the setting reception images 51 to 57.
- The AR operation system 10 can make the image 80 of the virtual hand 91 stand out more by not including the image 62 of the real hand 90 in the synthetic image 70, which further suppresses the uncomfortable feeling toward operation by the virtual hand 91. Note that the AR operation system 10 may, in the processing performed by the AR processing section 35a, display both the image 80 of the virtual hand 91 and the image 62 of the real hand 90, keeping the image 62 of the real hand 90 in the synthetic image 70, as illustrated in FIG. 14.
- The AR operation system 10 uses at least a part of the image 62 of the user's own real hand 90 as the image 80 of the virtual hand 91, which further suppresses the uncomfortable feeling toward operation, by the virtual hand 91, on the setting reception images 51 to 57 not present in the real space. Note that the AR operation system 10 may, as illustrated in FIG. 15, use a predefined graphic image instead of the hand image 62 as the image 80 of the virtual hand 91.
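- As a rough illustration of how the virtual hand 91 could be composited, covering both the FIG. 14 variant (real hand kept visible) and the FIG. 15 variant (predefined graphic substituted), the sketch below assumes a binary mask of the hand region is already available, for example from skin-color segmentation; the mask extraction itself and the `offset` mapping are assumptions outside this sketch.

```python
from typing import Optional, Tuple
import numpy as np

def composite_virtual_hand(taken: np.ndarray, hand_mask: np.ndarray,
                           offset: Tuple[int, int],
                           keep_real_hand: bool = False,
                           graphic: Optional[np.ndarray] = None) -> np.ndarray:
    """Place the image 80 of the virtual hand 91 onto the taken image 60.

    taken          -- H x W x 3 taken image 60
    hand_mask      -- H x W boolean mask of the real hand 90 (image 62)
    offset         -- (dy, dx) from the real hand to the virtual hand position
    keep_real_hand -- True reproduces the FIG. 14 variant (both hands shown)
    graphic        -- optional H x W x 3 predefined graphic (FIG. 15 variant)
    """
    out = taken.copy()
    ys, xs = np.nonzero(hand_mask)
    if not keep_real_hand:
        # crude stand-in for inpainting: hide the real hand 90 by filling
        # its region with the mean colour of the taken image
        out[ys, xs] = taken.mean(axis=(0, 1)).astype(taken.dtype)
    dy, dx = offset
    ty = np.clip(ys + dy, 0, taken.shape[0] - 1)
    tx = np.clip(xs + dx, 0, taken.shape[1] - 1)
    source = taken if graphic is None else graphic
    out[ty, tx] = source[ys, xs]  # draw the virtual hand at its new position
    return out
```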
- The AR operation system 10 may have two or more of the MFP 20, the AR server 30, the image-taking device 41, the display device 42, and the audio output device 43 manufactured as a single device. For example, the AR operation system 10 may include a portable device, such as a smartphone or a head-mounted display, that incorporates two or more of the AR server 30, the image-taking device 41, the display device 42, and the audio output device 43.
- The AR operation system 10 may realize at least part of the functions of the AR server 30 by the MFP 20; in a case where all of the functions of the AR server 30 are realized by the MFP 20, the AR operation system 10 need not include the AR server 30.
- In this embodiment, the AR operation system 10 includes the MFP as the electronic device of this disclosure, but an image forming apparatus other than the MFP, such as a print-only device, or an electronic device other than an image forming apparatus, such as a PC, may serve as the electronic device of this disclosure.
- Various modifications and alterations of this disclosure will be apparent to those skilled in the art without departing from the scope and spirit of this disclosure, and it should be understood that this disclosure is not limited to the illustrative embodiments set forth herein.
Claims (7)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015089901A JP6354653B2 (en) | 2015-04-25 | 2015-04-25 | Augmented reality operation system and augmented reality operation program |
| JP2015-089901 | 2015-04-25 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20160316081A1 (en) | 2016-10-27 |
| US9628646B2 (en) | 2017-04-18 |
Family ID: 57147090
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US 15/135,853 US9628646B2 (en) Expired - Fee Related | Augmented reality operation system and augmented reality operation method | 2015-04-25 | 2016-04-22 |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US9628646B2 (en) |
| JP (1) | JP6354653B2 (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2018132824A (en) * | 2017-02-13 | 2018-08-23 | 株式会社Soken | Operation device |
| JP7363399B2 (en) * | 2019-11-15 | 2023-10-18 | 富士フイルムビジネスイノベーション株式会社 | Information processing device, information processing system, and information processing program |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0749744A (en) * | 1993-08-04 | 1995-02-21 | Pioneer Electron Corp | Head mounting type display input device |
| JP4676303B2 (en) * | 2005-10-18 | 2011-04-27 | 株式会社日立製作所 | Terminal device |
| JP4900741B2 (en) * | 2010-01-29 | 2012-03-21 | 島根県 | Image recognition apparatus, operation determination method, and program |
| JP2012029164A (en) | 2010-07-26 | 2012-02-09 | Konica Minolta Business Technologies Inc | Portable terminal and device managing method |
| JP6195893B2 (en) * | 2013-02-19 | 2017-09-13 | ミラマ サービス インク | Shape recognition device, shape recognition program, and shape recognition method |
- 2015-04-25: JP application JP2015089901A filed in Japan; granted as JP6354653B2 (status: not active, Expired - Fee Related)
- 2016-04-22: US application 15/135,853 filed in the United States; granted as US9628646B2 (status: not active, Expired - Fee Related)
Patent Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060293765A1 (en) * | 2005-06-27 | 2006-12-28 | Konica Minolta Business Technologies, Inc. | Apparatus, apparatus system, image forming apparatus, and control method and computer programs of apparatus |
| US20090135135A1 (en) * | 2007-11-22 | 2009-05-28 | Takehiko Tsurumi | Recording and reproducing apparatus |
| US20130041648A1 (en) * | 2008-10-27 | 2013-02-14 | Sony Computer Entertainment Inc. | Sound localization for user in motion |
| US9310883B2 (en) * | 2010-03-05 | 2016-04-12 | Sony Computer Entertainment America Llc | Maintaining multiple views on a shared stable virtual space |
| US20120327003A1 (en) * | 2010-03-23 | 2012-12-27 | Sharp Kabushiki Kaisha | Information display device and document data editing method |
| US20120026530A1 (en) * | 2010-07-27 | 2012-02-02 | Xerox Corporation | Augmented reality system and method for device management and service |
| US20120113223A1 (en) * | 2010-11-05 | 2012-05-10 | Microsoft Corporation | User Interaction in Augmented Reality |
| US20130142371A1 (en) * | 2011-12-01 | 2013-06-06 | Jason P. Martin | Detachable Audio Speakers for Portable Devices and Methods for Manufacturing such Speakers |
| US20130257848A1 (en) * | 2012-03-28 | 2013-10-03 | Microsoft Corporation | Augmented Reality Light Guide Display |
| US20140063542A1 (en) * | 2012-08-29 | 2014-03-06 | Ricoh Company, Ltd. | Mobile terminal device, image forming method, and image processing system |
| US20140232747A1 (en) * | 2013-02-15 | 2014-08-21 | Konica Minolta, Inc. | Operation display system and operation display method |
| US20160217617A1 (en) * | 2013-08-30 | 2016-07-28 | Hewlett-Packard Development Company, L.P. | Augmented reality device interfacing |
| US20160349926A1 (en) * | 2014-01-10 | 2016-12-01 | Nec Corporation | Interface device, portable device, control device and module |
| US20160232369A1 (en) * | 2015-02-11 | 2016-08-11 | Ricoh Company, Ltd. | Managing Access To Images Using Roles |
Cited By (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10635161B2 (en) * | 2015-08-04 | 2020-04-28 | Google Llc | Context sensitive hand collisions in virtual reality |
| US20170038830A1 (en) * | 2015-08-04 | 2017-02-09 | Google Inc. | Context sensitive hand collisions in virtual reality |
| US20190212962A1 (en) * | 2017-07-14 | 2019-07-11 | Kyocera Document Solutions Inc. | Display device and display system |
| US11181986B2 (en) * | 2017-08-10 | 2021-11-23 | Google Llc | Context-sensitive hand interaction |
| US10782793B2 (en) * | 2017-08-10 | 2020-09-22 | Google Llc | Context-sensitive hand interaction |
| US20190050062A1 (en) * | 2017-08-10 | 2019-02-14 | Google Llc | Context-sensitive hand interaction |
| US20210397829A1 (en) * | 2018-02-28 | 2021-12-23 | Fujifilm Business Innovation Corp. | Information processing apparatus and non-transitory computer readable medium to allow operation without contact |
| US20190266393A1 (en) * | 2018-02-28 | 2019-08-29 | Fuji Xerox Co.,Ltd. | Information processing apparatus and non-transitory computer readable medium |
| US10789459B2 (en) * | 2018-02-28 | 2020-09-29 | Fuji Xerox Co., Ltd. | Information processing apparatus and non-transitory computer readable medium to allow operation without user contact |
| US11144751B2 (en) * | 2018-02-28 | 2021-10-12 | Fujifilm Business Innovation Corp. | Information processing apparatus and non-transitory computer readable medium to allow operation without contact |
| CN110209309A (en) * | 2018-02-28 | 2019-09-06 | 富士施乐株式会社 | Information processing unit and the computer-readable medium for storing program |
| US11514705B2 (en) * | 2018-02-28 | 2022-11-29 | Fujifilm Business Innovation Corp. | Information processing apparatus and non-transitory computer readable medium to allow operation without contact |
| US10916065B2 (en) * | 2018-05-04 | 2021-02-09 | Facebook Technologies, Llc | Prevention of user interface occlusion in a virtual reality environment |
| US20190340833A1 (en) * | 2018-05-04 | 2019-11-07 | Oculus Vr, Llc | Prevention of User Interface Occlusion in a Virtual Reality Environment |
| CN110543233A (en) * | 2018-05-29 | 2019-12-06 | 富士施乐株式会社 | Information processing apparatus and non-transitory computer readable medium |
| US11223729B2 (en) * | 2018-05-29 | 2022-01-11 | Fujifilm Business Innovation Corp. | Information processing apparatus and non-transitory computer readable medium for instructing an object to perform a specific function |
| US11265428B2 (en) * | 2018-05-29 | 2022-03-01 | Fujifilm Business Innovation Corp. | Information processing apparatus and non-transitory computer readable medium for operating a target object in a real space through a virtual interface by detecting a motion of a user between a display surface displaying the virtual interface and the user |
| US20240403484A1 (en) * | 2023-06-04 | 2024-12-05 | Apple Inc. | Privacy-protecting mixed reality recording |
Also Published As
| Publication number | Publication date |
|---|---|
| US9628646B2 (en) | 2017-04-18 |
| JP6354653B2 (en) | 2018-07-11 |
| JP2016207048A (en) | 2016-12-08 |
Similar Documents
| Publication | Title |
|---|---|
| US9628646B2 (en) | Augmented reality operation system and augmented reality operation method |
| US8358280B2 (en) | Electronic device capable of showing page flip effect and method thereof |
| US9203983B2 (en) | Image forming apparatus and image data processing method |
| US8203722B2 (en) | Image processing apparatus, image forming apparatus, and output-format setting method |
| JP4375578B2 (en) | Image forming apparatus and setting method in image forming apparatus |
| US20090222756A1 (en) | Electronic device capable of showing page flip effect and method thereof |
| US9753548B2 (en) | Image display apparatus, control method of image display apparatus, and program |
| US9413908B2 (en) | Image forming apparatus remotely operated by external terminal, method for controlling image forming apparatus, recording medium, and image forming system including image forming apparatus |
| US20110141519A1 (en) | Data processing system, data processing apparatus, data processing method, information processing apparatus, information processing method, and storage medium |
| JP2024025795A (en) | Image processing device |
| US11494142B2 (en) | Image processing apparatus, non-transitory computer readable medium, and method for processing image |
| CN104869270B (en) | Document distribution server and document distribution method |
| JP2015014888A (en) | Operation device, image forming apparatus, control method of operation device, and program |
| US11379159B2 (en) | Information processing device and non-transitory computer readable medium |
| US20150138570A1 (en) | Display processing apparatus and display processing method |
| US10404872B2 (en) | Multi-function device with selective redaction |
| US11029829B2 (en) | Information processing apparatus and method for display control based on magnification |
| US11089179B2 (en) | Image processing apparatus, system, and computer program product capable of performing image processing on target image with respect to image data of the target image corresponding to target image state by acquired display image |
| JP2005246683A (en) | Image forming apparatus and printing control program |
| US9111374B2 (en) | Mobile terminal, method for controlling the same, and non-transitory storage medium storing program to be executed by mobile terminal |
| US20210168248A1 (en) | Information processing apparatus, home screen display method, and home screen display program |
| JP4968937B2 (en) | Image forming apparatus |
| JP6977384B2 (en) | Information processing equipment, programs and image output systems |
| JP2020064506A (en) | Display device, image processing device, and program |
| JP6541836B2 (en) | Information processing apparatus, control method thereof, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: KYOCERA DOCUMENT SOLUTIONS INC., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MATSUKI, YOSHITAKA;KAWASAKI, TOMOHIRO;SHIMAMOTO, KUNIHIKO;REEL/FRAME:038353/0091. Effective date: 20160413 |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20210418 |