US20140125661A1 - Image processing apparatus, image processing method, and program
- Publication number: US20140125661A1 (application US 14/122,361)
- Authority: US (United States)
- Prior art keywords: image, images, plane, plane images, processing apparatus
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
- G06T19/00—Manipulating 3D models or images for computer graphics
- H04N13/183—On-screen display [OSD] information, e.g. subtitles or menus
- H04N13/261—Image signal generators with monoscopic-to-stereoscopic image conversion
Description
- The present disclosure relates to an image processing apparatus, an image processing method, and a program, and particularly to a technology for generating a three-dimensional image.
- Image display apparatuses that display images perceived by a user as three-dimensional images (so-called 3D images) have been released and are beginning to spread (for example, Patent Literature 1).
- Apparatuses that can display three-dimensional images are not limited to televisions and other image display apparatuses; some personal computers can also display three-dimensional images.
- There are also applications that can generate content containing three-dimensional images. When content is generated by such an application and the user views it in a predetermined manner, the user can perceive the images included in the content as three-dimensional.
- An image processing apparatus of the present disclosure comprises an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively.
- An image processing apparatus of the present disclosure comprises a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit.
- An image processing apparatus of the present disclosure comprises a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit.
- An image processing apparatus of the present disclosure comprises an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively.
- An image processing apparatus of the present disclosure comprises an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
- An image processing method of the present disclosure comprises: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image where the positions in space of objects in each of the plurality of plane images are set, based on the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
- A program of the present disclosure causes a computer to execute an image process, the program causing the computer to execute: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image where the positions in space of objects in each of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
- According to the present disclosure, a plurality of plane images are appropriately displayed so that a user can generate and display a desired three-dimensional image through a simple operation.
- It is thus possible to provide a novel and improved image processing apparatus, image processing method, and computer program that enable content having three-dimensional images to be easily generated.
- FIG. 1 is a block diagram illustrating an example of the configuration of an image processing apparatus according to an embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating an example of an editing screen (example in which all layers are displayed) according to the embodiment of the present disclosure.
- FIG. 3 is an exemplary diagram illustrating an example of an image of each layer according to the embodiment of the present disclosure.
- FIG. 4 is an exemplary diagram illustrating an example of a depth display screen according to the embodiment of the present disclosure.
- FIG. 5 is an exemplary diagram illustrating a concept of converting images of a plurality of layers into three-dimensional images.
- FIG. 6 is an exemplary diagram illustrating a state in which images of a plurality of layers are converted into three-dimensional images.
- FIG. 7 is an exemplary diagram illustrating an example in which images of a plurality of layers are converted into three-dimensional left and right channel images.
- FIG. 8 is an exemplary diagram illustrating an example of three-dimensional images that are generated from images of a plurality of layers.
- FIG. 9 is an exemplary diagram illustrating a converted state of the case in which there is a layer in which the front side of a virtual display surface is set to a virtual position.
- FIG. 10 is an exemplary diagram illustrating an example of a three-dimensional image of the case where there is a layer in which the front side of a virtual display surface is set to a virtual position.
- FIG. 11 is a flowchart illustrating an example of the flow of an editing screen display process according to the embodiment of the present disclosure.
- FIG. 12 is an exemplary diagram illustrating an example of an editing screen (example in which a third layer is displayed) according to the embodiment of the present disclosure.
- FIG. 13 is an exemplary diagram illustrating an example of an editing screen (example in which a second layer is displayed) according to the embodiment of the present disclosure.
- FIG. 14 is an exemplary diagram illustrating an example of an editing screen (example in which a first layer is displayed) according to the embodiment of the present disclosure.
- FIG. 15 is a flowchart illustrating an example of the flow of a depth screen display process according to the embodiment of the present disclosure.
- FIG. 16 is an exemplary diagram illustrating an example (first example) of displaying a depth screen according to the embodiment of the present disclosure.
- FIG. 17 is an exemplary diagram illustrating an example (second example) of displaying a depth screen according to the embodiment of the present disclosure.
- FIG. 18 is an exemplary diagram illustrating an example (third example) of displaying a depth screen according to the embodiment of the present disclosure.
- FIG. 19 is an exemplary diagram illustrating an example (fourth example) of displaying a depth screen according to the embodiment of the present disclosure.
- FIG. 20 is a flowchart illustrating an example of the flow of a horizontal line setting process according to the embodiment of the present disclosure.
- FIG. 21 is an exemplary diagram illustrating an example of a horizontal line setting screen according to the embodiment of the present disclosure.
- FIG. 22 is an exemplary diagram illustrating an example (first example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
- FIG. 23 is an exemplary diagram illustrating an example (second example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
- FIG. 24 is an exemplary diagram illustrating an example (third example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
- FIG. 25 is an exemplary diagram illustrating an example (fourth example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
- FIG. 26 is an exemplary diagram illustrating an example of a display screen at the time of capturing a camera image according to the embodiment of the present disclosure.
- FIG. 27 is an exemplary diagram illustrating a display example of a three-dimensional image where camera images are synthesized according to the embodiment of the present disclosure.
- FIG. 28 is an exemplary diagram illustrating an example of a display screen at the time of importing an image file according to the embodiment of the present disclosure.
- FIG. 29 is an exemplary diagram illustrating an example of an importing range setting at the time of importing an image file according to the embodiment of the present disclosure.
- FIG. 30 is an exemplary diagram illustrating a display example of a three-dimensional image in which images imported from an image file are synthesized according to the embodiment of the present disclosure.
- FIG. 31 is an exemplary diagram illustrating an example of a list display screen of productions according to the embodiment of the present disclosure.
- FIG. 32 is a block diagram illustrating an example of detailed hardware configuration of an image processing apparatus according to the embodiment of the present disclosure.
- Display example of an editing screen (FIG. 2 and FIGS. 11 to 14)
- Display example of a depth adjustment screen (FIGS. 15 to 19)
- Display example of a ground surface setting screen (FIGS. 20 to 25)
- FIG. 1 is a diagram illustrating the configuration of an image processing apparatus 100 according to this example.
- The image processing apparatus 100 is configured to generate an image by a user operation and to display and store the generated image.
- the image processing apparatus 100 includes an image generating/processing unit 110 , an image storage unit 120 , an input unit 130 , and an image display unit 140 .
- the image generating/processing unit 110 provides the user with an image generation screen through the image display unit 140 or generates a three-dimensional image from the image generated by the user.
- the image generating/processing unit 110 that is included in the image processing apparatus 100 according to this example includes an image generating unit 112 , a three-dimensional image converting unit 114 , a three-dimensional image generating unit 116 , and an editing screen generating unit 118 .
- the editing screen generating unit 118 also functions as an image control unit that controls image generation in each unit.
- the image generating/processing unit 110 may include an image control unit that controls image generation in each unit, separately from the editing screen generating unit 118 .
- the image generating/processing unit 110 generates a plurality of plane images (for example, three plane images) on the basis of a user operation in a state in which an editing screen for image generation is displayed on the image display unit 140 and generates a three-dimensional image from the plurality of generated plane images (two-dimensional images).
- the editing screen for the image generation is the screen illustrated in FIG. 2 ; it displays an intermediate image in generation processing at the center and displays operation buttons and tabs around the image.
- the editing screen illustrated in FIG. 2 will be described below in detail.
- Information of the depth from the reference position (for example, virtual display surface) is set to each of the plurality of generated plane images and the three-dimensional image is generated on the basis of the depth.
- the image generating/processing unit 110 supplies image data of the generated three-dimensional image to the image display unit 140 , and the image display unit 140 displays the three-dimensional image.
- the user views the three-dimensional image using a predetermined method (for example, wearing time-division driven shutter glasses) and perceives the image displayed on the image display unit 140 as a three-dimensional image.
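- The per-layer data implied by this description can be modeled minimally as a plane image paired with a signed virtual depth. The following is an illustrative sketch, not the patent's implementation; the Layer class, the sign convention (positive values behind the virtual display surface), and the sample depth values are all assumptions:

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Layer:
    """One plane image plus its virtual depth (hypothetical model)."""
    image: np.ndarray  # RGBA pixels, shape (H, W, 4); alpha 0 = transparent
    depth: float       # cm from the virtual display surface; > 0 is behind it


# Three layers as in the long-/middle-/short-distance example described below.
scene = [
    Layer(np.zeros((480, 640, 4), np.uint8), depth=60.0),   # long-distance view
    Layer(np.zeros((480, 640, 4), np.uint8), depth=25.0),   # middle-distance view
    Layer(np.zeros((480, 640, 4), np.uint8), depth=-10.0),  # short-distance view
]
```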
- the image generating unit 112 displays the editing screen for the image generation on the image display unit 140 and generates an image by the user operation. If the user generates images including a plurality of layers using the image generation screen provided by the image generating unit 112 , the images that include the plurality of layers are converted into a three-dimensional image by the three-dimensional image converting unit 114 and the three-dimensional image generating unit 116 . The images including the plurality of layers that are generated by the image generating unit 112 are stored in the image storage unit 120 according to the user operation.
- the three-dimensional image converting unit 114 executes a conversion process for displaying the images including the plurality of layers transmitted from the image generating unit 112 as the three-dimensional image on the image display unit 140 .
- the image processing apparatus 100 assumes in advance the distance between the user's eyes and the distance between the user and the display surface, and executes a conversion process for displaying the image as the three-dimensional image on the image display unit 140 , on the basis of the virtual distance between the layers (the depth information of each layer of the images).
- the three-dimensional image converting unit 114 executes a coordinate conversion process with respect to the images including the plurality of layers.
- when the depth setting of a layer is changed, the three-dimensional image converting unit 114 executes the conversion process in real time according to the change. Thereby, the user can adjust the depth of each layer of the images and confirm the three-dimensional image after the adjustment through the display on the editing screen in real time.
- An example of a process for adjusting the depth of each layer of the images will be described in detail below.
- the image processing apparatus 100 executes preview display of the three-dimensional image.
- with the preview display of the three-dimensional image, the user can grasp in advance how the images will be seen in a three-dimensional manner before storing the generated image as the three-dimensional image.
- the three-dimensional image generating unit 116 generates the three-dimensional image from the images including the plurality of layers, on the basis of the conversion process executed by the three-dimensional image converting unit 114 .
- the three-dimensional image generated by the three-dimensional image generating unit 116 is displayed on the image display unit 140 and is stored in the image storage unit 120 according to the operation of the input unit 130 from the user.
- the editing screen generating unit 118 generates display data of the editing screen, on the basis of a reception state of an input operation in the input unit 130 .
- the editing screen generating unit 118 supplies the display data generated by the editing screen generating unit 118 to the image display unit 140 and displays the editing screen.
- the image storage unit 120 stores the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting the images including the plurality of layers.
- the images stored in the image storage unit 120 are read from the image storage unit 120 according to the operation of the input unit 130 by the user, are processed by the image generating/processing unit 110 , and are displayed by the image display unit 140 .
- the input unit 130 includes various input devices with which the user performs input operations on the image processing apparatus 100 , for example, a keyboard, a mouse, a graphic tablet, and a touch panel.
- the user can generate the images including the plurality of layers by operating the input unit 130 and adjust the depth of each layer of the images when the images are converted into the three-dimensional image.
- the image display unit 140 is a display that displays an image.
- the image display unit 140 displays the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting the images including the plurality of layers.
- the image display unit 140 displays a screen to allow the images to be generated by the user of the image processing apparatus 100 .
- An example of the display screen will be described below.
- a touch panel may be disposed on an image display surface of the image display unit 140 and the user may directly operate buttons in a displayed image.
- the touch panel that is included in the image display unit 140 functions as a part of the input unit 130 .
- the image display unit 140 may be a device that is separated from the image processing apparatus 100 .
- the image display unit 140 may be configured using a display device that can display the three-dimensional image.
- a method that displays the three-dimensional image is not limited to a specific display method.
- for example, a method that alternately displays an image for a right eye and an image for a left eye at a high speed is known.
- as methods for transmitting the three-dimensional image to the image display unit 140 , a frame sequential method, a side-by-side method, a top-and-bottom method, and the like are known.
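- As a rough illustration of two of these transmission formats (not taken from the patent; numpy frames and even frame dimensions are assumed), each eye's image can be squeezed into half of a single transmitted frame:

```python
import numpy as np


def pack_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Half-width squeeze: left-eye image in the left half, right-eye in the right."""
    h, w = left.shape[:2]
    frame = np.empty_like(left)
    frame[:, : w // 2] = left[:, ::2]   # keep every second column
    frame[:, w // 2 :] = right[:, ::2]
    return frame


def pack_top_and_bottom(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Half-height squeeze: left-eye image on top, right-eye on the bottom."""
    h = left.shape[0]
    frame = np.empty_like(left)
    frame[: h // 2] = left[::2]         # keep every second row
    frame[h // 2 :] = right[::2]
    return frame
```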
- the images that are generated by the image generating/processing unit 110 may be output to a television receiver or another display device that is connected to the image processing apparatus 100 and can display the three-dimensional image.
- here, it is assumed that the image processing apparatus 100 has plane images 251 , 252 , and 253 of three layers, a long-distance view, a middle-distance view, and a short-distance view, generated by the user.
- the image processing apparatus 100 synthesizes the plane images 251 , 252 , and 253 of the three layers, and the three-dimensional image converting unit 114 converts them into a three-dimensional image.
- as such, by converting the images including the three layers into the three-dimensional image, the user can generate the three-dimensional image without executing a complicated imaging process.
- objects a, b, c, and g, respectively corresponding to a mountain, the sun, a horizontal line, and a road, are drawn in the plane image 251 of the long-distance view,
- an object d of a tree is drawn in the plane image 252 of the middle-distance view, and
- objects e and f, respectively corresponding to a dog and an insect, are drawn in the plane image 253 of the short-distance view.
- the number of layers (three) is exemplary; the number of layers may be any plural number.
- FIG. 4 is a diagram illustrating a setting situation of the depth.
- the setting state illustrated in FIG. 4 is reduced and displayed on the 3D state thumbnail tab 202 of the editing screen to be described below.
- when the 3D state thumbnail tab 202 is selected, the setting state illustrated in FIG. 4 is enlarged and displayed on the editing screen.
- a depth axis 301 is set to a direction orthogonal to a display screen and the depth positions of the layer images 311 , 312 , and 313 are set on the depth axis 301 .
- the depth axis 301 is illustrated as an oblique axis that has a predetermined angle.
- An image frame that is illustrated by a broken line in FIG. 4 illustrates a virtual display surface 304 at the position of the image display surface of the display.
- the virtual display surface 304 exists at the depth position 301 a on the depth axis 301 .
- the position of the most front side of the depth axis 301 is illustrated as a front edge portion 302 .
- the first layer image 311 of the long-distance view exists at the innermost depth position 301 b on the depth axis 301 and the second layer image 312 of the middle-distance view exists at the depth position 301 c near the center of the depth axis 301 .
- the third layer image 313 of the short-distance view exists at the depth position 301 d of the front side of the virtual display surface 304 on the depth axis 301 .
- the layer image 313 of the short-distance view exists at the depth position of the front side of the virtual display surface 304 .
- the layer image 313 of the short-distance view may instead be set deeper than the virtual display surface 304 (closer to the long-distance view).
- the layer image may be automatically set to the predetermined depth position suitable for the long-distance view as the first layer image 311 , in an initial state.
- the layer image may be automatically set to the predetermined depth position suitable for the middle-distance view as the second layer image 312 and the layer image may be automatically set to the predetermined depth position suitable for the short-distance view as the third layer image 313 .
- the three-dimensional image is generated such that the objects a to g in the layer images 311 , 312 , and 313 are disposed at the depth positions of the layer images in a virtual three-dimensional space.
- an image portion of the lower side of the horizontal line position 321 of the layer image 311 of the long-distance view is disposed to be inclined on the three-dimensional space and protrude to the front side. That is, the image portion of the lower side of the horizontal line c becomes an inclined surface 303 that gradually protrudes to the front side, from the depth position 301 b of the layer image 311 to the front edge portion 302 , as the image portion progresses to the lower side.
- the object g of the road that is drawn on the lower side of the horizontal line position 321 in the layer image 311 is disposed on the inclined surface 303 .
- the inclined surface 303 protrudes to the front edge portion 302 of the front side of the third layer image 313 .
- the inclined surface 303 may protrude to the virtual display surface 304 .
- in this case, the layer of the most front side (third layer image 313 ) protrudes to the front side of the front terminating point of the inclined surface 303 .
- each object is automatically appropriately disposed on the inclined surface 303 .
- a process for erasing inappropriate portions of the objects is executed.
- a specific example of the setting of the horizontal line and the process associated with the horizontal line will be described below.
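- The inclined surface reduces to one interpolation: below the horizontal line, the ground's depth moves from the first layer's depth position toward the front edge portion as the row index grows. A minimal sketch, assuming linear interpolation (the description only says the surface gradually protrudes) and hypothetical names:

```python
def ground_depth(y: int, y_horizon: int, y_bottom: int,
                 far_depth: float, front_edge_depth: float) -> float:
    """Depth of the inclined ground surface at image row y.

    At the horizon row the ground lies at the first layer's depth
    (far_depth); toward the bottom row it slopes to the front edge.
    """
    if y <= y_horizon:
        return far_depth  # at or above the horizontal line: no inclination
    t = (y - y_horizon) / (y_bottom - y_horizon)
    return far_depth + t * (front_edge_depth - far_depth)


# Example: horizon at row 150 of a 480-row image, first layer 60 cm behind
# the display surface, front edge 15 cm in front of it (negative depth).
print(ground_depth(315, 150, 479, 60.0, -15.0))  # ~22.4, halfway down the slope
```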
- FIG. 5 is a diagram illustrating a concept of converting normal two-dimensional images including a plurality of layers into a three-dimensional image.
- FIG. 5 illustrates an aspect where two-dimensional images including a plurality of layers (plane images 251 , 252 , and 253 ) are converted into an image 250 R for a right eye which the user views using the right eye and an image 250 L for a left eye which the user views using the left eye.
- the process of the horizontal line described above is not illustrated.
- the three-dimensional image converting unit 114 calculates the drawing positions of the image for the right eye and the image for the left eye to generate the image 250 R for the right eye and the image 250 L for the left eye from the two-dimensional images.
- FIGS. 6 to 8 are exemplary diagrams illustrating an example where normal two-dimensional images including a plurality of layers are converted into a three-dimensional image.
- FIG. 6 illustrates a coordinate conversion when an image for a right eye and an image for a left eye are generated from two-dimensional images including three layers illustrated in FIG. 7 and illustrates the coordinate conversion when the three layers are deeper than the display surface, as illustrated in FIG. 8 .
- FIG. 6 schematically illustrates a state in which each layer and the display surface are viewed from the top side.
- the three-dimensional image converting unit 114 executes a projection coordinate conversion for the image for the right eye and a projection coordinate conversion for the image for the left eye, using the layer depths D 1 , D 2 , and D 3 between the virtual display surface 259 and the layers. With the projection coordinate conversion, the pixel positions of the objects of each layer image in the virtual display surface 259 are calculated.
- the three-dimensional image converting unit 114 executes the projection coordinate conversion with respect to the virtual display surface 259 and the image processing apparatus 100 can convert the normal two-dimensional images including the plurality of layers into the three-dimensional image.
- FIGS. 6 to 8 are examples where each layer is set to the depth deeper than the depth of the virtual display surface 259 . However, each layer may be set to the depth of the front side of the virtual display surface 259 .
- FIGS. 9 and 10 illustrate an example where two-dimensional images are converted into a three-dimensional image when an image of the layer of the short-distance view is disposed on the front side of a virtual display surface 259 ′.
- FIG. 9 schematically illustrates a coordinate conversion when an image for a right eye and an image for a left eye are generated from two-dimensional images including three layers illustrated in FIG. 10 , in an upwardly viewed state.
- for objects in the image of a layer on the front side of the virtual display surface 259 ′, the pixel positions of the image for the right eye and the image for the left eye projected on the virtual display surface 259 ′ are reversed at the left and right sides, compared with the pixel positions for a layer deeper than the virtual display surface 259 ′.
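- The description does not give the projection formula, but with the stated geometry (eyes separated by a fixed distance at an assumed viewing distance, the display plane at depth 0, and a signed layer depth) similar triangles yield the sketch below. A negative depth, i.e. a layer on the front side of the virtual display surface, flips the sign of the computed shifts, which is exactly the left/right reversal of FIGS. 9 and 10. The constants are assumptions:

```python
EYE_SEPARATION = 6.5     # cm, assumed distance between the user's eyes
VIEWER_DISTANCE = 200.0  # cm, assumed distance from the user to the display


def project_to_display(x_p: float, depth: float, eye_x: float) -> float:
    """Where the ray from one eye through a layer point crosses the display.

    The eye sits at (eye_x, -VIEWER_DISTANCE); the point at (x_p, depth),
    with depth > 0 behind the display surface and depth < 0 in front of it.
    """
    return eye_x + (x_p - eye_x) * VIEWER_DISTANCE / (VIEWER_DISTANCE + depth)


def left_right_positions(x_p: float, depth: float) -> tuple[float, float]:
    """Pixel positions of one layer point in the left- and right-eye images."""
    x_left = project_to_display(x_p, depth, -EYE_SEPARATION / 2)
    x_right = project_to_display(x_p, depth, +EYE_SEPARATION / 2)
    return x_left, x_right


# A point behind the display shifts outward (uncrossed disparity) ...
print(left_right_positions(0.0, 60.0))   # (-0.75, +0.75)
# ... while a point in front of it shifts the opposite way (crossed).
print(left_right_positions(0.0, -60.0))  # (+1.39, -1.39)
```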
- the editing screen is the screen illustrated in FIG. 2 that is displayed on the image display unit 140 , on the basis of the process in the editing screen generating unit 118 .
- a flowchart of FIG. 11 illustrates a flow of the process in the editing screen generating unit 118 .
- the description will be given with reference to FIG. 11 .
- in step S 13 , it is determined whether an operation by the input unit 130 to select any one of the displayed layer thumbnails is received.
- when the operation is not received, a waiting state is maintained and the image of the editing screen is not changed.
- when the operation is received, the image of the layer that corresponds to the selected layer thumbnail is displayed on the editing screen (step S 14 ).
- at this time, the images of the other layers are simultaneously displayed with decreased display brightness, so that the entire three-dimensional image can still be recognized.
- in step S 15 , it is determined whether an operation exists to change to another layer by selecting its layer thumbnail.
- when such an operation exists, a change process of the layer that is displayed on the editing screen is executed (step S 16 ), the process returns to step S 14 , and an image of the layer after the change is displayed.
- in step S 17 , it is determined whether an operation to change the display to the overlapped display of all of the layers exists. When it is determined that such an operation exists, the display is changed to the overlapped display of all of the layers and the process returns to the 3D editing screen display in step S 12 .
- when it is determined in step S 17 that the operation to change the display to the overlapped display of all of the layers does not exist, the screen display of the selected layer in step S 14 is continuously executed.
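- The loop of steps S 12 to S 17 amounts to a small state machine. The sketch below is purely illustrative; the ui object and its methods are invented for the example and are not part of the patent:

```python
def editing_screen_loop(ui, layers):
    """Hypothetical event loop for steps S 12 to S 17."""
    ui.show_overlapped(layers)                  # S 12: 3D editing screen
    while True:
        event = ui.wait_for_operation()         # S 13 / S 15 / S 17: wait
        if event.kind == "select_layer_tab":    # a layer thumbnail was chosen
            ui.show_layer(layers, event.index)  # S 14: show that layer,
                                                # the others dimmed
        elif event.kind == "show_all_layers":   # S 17: operation exists
            ui.show_overlapped(layers)          # back to the S 12 display
        # any other event keeps the current display (waiting state)
```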
- FIG. 2 illustrates an example of an editing screen in a state in which all layers of intermediate images in generation processing are overlapped and displayed.
- the intermediate image display 203 in editing is performed with a relatively large image at the center portion.
- an image where the images 251 to 253 of the three layers illustrated in FIG. 3 are overlapped is displayed.
- edge portions 203 a to 203 c of the three images are shown overlapped at the upper edge portion of the editing image display 203 .
- the edge portion (for example, edge portion 203 a ) of the image that corresponds to the selected tab is displayed on the front side and the edge portions of the other images are displayed on the inner side.
- plural tabs 201 , 202 , and 211 to 213 to select a display image are disposed on the upper side of the intermediate image display 203 in editing in the editing screen.
- when a user operation, such as a touch panel operation, selects the display place of one of the tabs 201 , 202 , and 211 to 213 , the image allocated to the selected tab is displayed.
- the layer thumbnail tabs 211 , 212 , and 213 are tabs to individually display the layers.
- the first layer thumbnail tab 211 is a tab that displays a first layer image
- the second layer thumbnail tab 212 is a tab that displays a second layer image
- the third layer thumbnail tab 213 is a tab that displays a third layer image.
- on the layer thumbnail tabs 211 , 212 , and 213 , images of the layers are reduced and displayed as the thumbnail images. Therefore, when the image of each layer is modified, the thumbnail image is also modified in the same way.
- the tab 201 that is displayed adjacent to the layer thumbnail tabs 211 , 212 , and 213 is a tab that adds the layer image. That is, when the tab 201 is selected, a layer image is newly added and an editing screen of the newly added layer image is displayed.
- the 3D state thumbnail tab 202 that is disposed toward the right of the upper side of the intermediate image display 203 in editing is a tab that displays a depth state of the image of each layer.
- when the 3D state thumbnail tab 202 is selected, the display screen of the depth state illustrated in FIG. 4 is enlarged and displayed in the place of the editing image display 203 .
- on the tab itself, the display screen of the depth state illustrated in FIG. 4 is reduced and displayed.
- when the depth state is adjusted, the display content changes according to the adjustment.
- buttons 221 to 225 are disposed on the left end of the editing image display 203 .
- a file importing button 221 , a camera capturing button 222 , a stamp button 223 , a character input button 224 , and a depth operation button 225 are prepared.
- with the stamp button 223 and the character input button 224 , a process for inputting prepared figures or characters starts by a user operation of each button.
- when the depth operation button 225 is operated, a depth adjustment screen is displayed on the editing image display 203 .
- a display process of the depth adjustment screen will be described below.
- a generation start button 241 is displayed on the right end of the editing image display 203 .
- the generation start button 241 is a button to instruct to generate a new three-dimensional image.
- the save button 242 is a button to instruct to store an intermediate image in generation processing in the image storage unit 120 .
- the three-dimensional display button 243 is a button to switch three-dimensional display and normal display (2D display) of the image displayed on the editing image display 203 .
- a pen tool 290 is displayed on the lower side of the right end of the editing image display 203 .
- the pen tool 290 of this example displays a first pen 291 , a second pen 292 , a third pen 293 , and an eraser 294 . If a user operation to select display places of the pens 291 , 292 , and 293 exists, a line can be drawn with a color or a line type allocated to each pen. If a user operation to select a display place of the eraser 294 exists, the drawn line is erased.
- the pen tool 290 may have a function of supporting other drawing or erasure.
- object display units 231 to 237 are provided on a lower end of the editing image display 203 .
- in the plurality of object display units 231 to 237 , for example, the seven newest objects that are used by the user are displayed.
- each object of the intermediate image in generation processing is illustrated.
- FIG. 2 illustrates an example where images of the individual layers are overlapped and displayed in the editing image display 203 . Even when the other images are displayed in the editing image display 203 , the same display is performed around the editing image display 203 .
- a tab that selects the overlapped display where the images of the layers illustrated in FIG. 2 are overlapped is not provided in particular. However, the tab that selects the overlapped display may be prepared separately from the tabs for the images of the individual layers.
- in that case, a thumbnail image where the overlapped image of the images of the layers is reduced is displayed on the tab. Therefore, when the image of each layer is modified, the thumbnail image where the overlapped image is reduced and displayed is modified according to the change.
- FIG. 12 illustrates an example where the image of the third layer (image 253 of the short-distance view of FIG. 3 ) is displayed in the editing image display 203 .
- the image display of the third layer is performed by selecting the third layer thumbnail tab 213 by a user operation.
- the objects e and f in the image of the third layer are displayed with set colors or brightness.
- the objects of the images of the other layers are overlapped and displayed after decreasing the display brightness. That is, only the image of the third layer is highlighted and displayed and the images of the other layers are grayed out and displayed.
- the image of the third layer is generated or edited.
- FIG. 13 illustrates an example where the image of the second layer (image 252 of the middle-distance view of FIG. 3 ) is displayed in the editing image display 203 .
- the image display of the second layer is performed by selecting the second layer thumbnail tab 212 by a user operation.
- the object d in the image of the second layer is displayed with set colors or brightness.
- the objects of the images of the other layers are overlapped and displayed after decreasing the display brightness.
- FIG. 14 illustrates an example where the image of the first layer (image 251 of the long-distance view of FIG. 3 ) is displayed in the editing image display 203 .
- the image display of the first layer is performed by selecting the first layer thumbnail tab 211 by a user operation.
- the objects a, c, and g in the image of the first layer are displayed with set colors or brightness.
- the objects of the images of the other layers are overlapped and displayed after decreasing the display brightness.
- the image of the first layer is generated or edited.
- the image of the selected layer is highlighted and displayed and the images of the other layers are grayed out and displayed. Meanwhile, as the editing image, only the objects of the image of the selected layer may be displayed, and the objects of the images of the other layers may not be displayed.
- a depth adjustment process is started by a user operation to select a depth operation button 225 of the editing screen illustrated in FIG. 2 , for example. That is, as illustrated in FIG. 15 , the editing screen generating unit 118 determines whether an operation to select the depth operation button 225 exists in a state in which the editing screen is displayed (step S 21 ). When the operation to select the depth operation button 225 does not exist, a waiting state is maintained until the operation exists.
- when it is determined in step S 21 that the operation to select the depth operation button 225 exists, overlapped display of the images of all of the layers is performed as the intermediate image display 203 in editing and a depth bar is displayed on the upper side of the intermediate image display 203 in editing (step S 22 ).
- the depth bar is a scale that illustrates the depth of each layer. As in this example, when the image is configured using the images of the three layers, the depth positions of the three layers are displayed by the depth bar. On the lower side of the editing image display 203 , a depth adjustment button is displayed.
- a specific display example will be described below.
- an adjustment button to move the depth of the image to the inner side and an adjustment button to move the depth of the image to the front side are prepared and displayed.
- the editing screen generating unit 118 determines whether a user operation of any adjustment button exists (step S 23 ). In this case, when it is determined that the operation of the adjustment button does not exist, the display of step S 22 is continuously executed. When it is determined that the operation of the adjustment button exists, the image of the layer that corresponds to the operated adjustment button is displayed as the intermediate image display 203 in editing (step S 24 ). The depth position that is set to the image of the corresponding layer is changed according to the operation situation of the adjustment button and the depth position of the depth bar display is changed according to the corresponding position (step S 25 ). After the setting or the display is changed in step S 25 , the display returns to the display of step S 22 . However, the intermediate image display 203 in editing may be the display of only the operated layer until a next operation exists.
- FIG. 16 is a diagram illustrating a display example of the depth bar.
- the overlapped display of the images of all of the layers is performed as the intermediate image display 203 in editing and the depth bar 401 is displayed on the upper side of the editing image display 203 .
- the depth positions of the images of the three layers are illustrated in one depth bar 401 . That is, in the depth bar 401 , the depth position 401 a of the image of the first layer, the depth position 401 b of the image of the second layer, and the depth position 401 c of the image of the third layer are illustrated by changing the display colors.
- buttons that adjust the depth position are displayed for the image of each layer.
- a depth adjustment button 411 that moves the depth to the front side and a depth adjustment button 412 that moves the depth to the inner side are displayed as adjustment buttons for the image of the first layer.
- a depth adjustment button 421 that moves the depth to the front side and a depth adjustment button 422 that moves the depth to the inner side are displayed as adjustment buttons for the image of the second layer.
- a depth adjustment button 431 that moves the depth to the front side and a depth adjustment button 432 that moves the depth to the inner side are displayed as adjustment buttons for the image of the third layer.
- by operating these buttons, the setting of the depth of the image of each layer is changed.
- the depth position of the image of the first layer is changed by the operations of the depth adjustment buttons 411 and 412 and the display of the depth position 401 a of the image of the first layer in the depth bar 401 is moved as illustrated by an arrow La.
- the depth position of the image of the second layer is changed by the operations of the depth adjustment buttons 421 and 422 and the display of the depth position 401 b of the image of the second layer in the depth bar 401 is moved as illustrated by an arrow Lb.
- the depth position of the image of the third layer is changed by the operations of the depth adjustment buttons 431 and 432 and the display of the depth position 401 c of the image of the third layer in the depth bar 401 is moved as illustrated by an arrow Lc.
- the movement ranges of the depth adjustment buttons 411 to 432 are limited by the depth positions of the images of the adjacent layers. For example, the range of the movement La of the image of the first layer is from the deepest position to the position adjacent to the depth position of the image of the adjacent second layer.
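- That limitation can be expressed as a clamp against the neighboring layers' depths. A minimal sketch under the sign convention assumed earlier (positive depth behind the display); the range constants and minimum gap are assumptions:

```python
DEEPEST = 100.0  # cm behind the display: assumed rear end of the depth range
NEAREST = -20.0  # cm in front of the display: assumed front end of the range


def clamp_layer_depth(depths: list[float], i: int, requested: float,
                      min_gap: float = 1.0) -> float:
    """Clamp layer i's requested depth between its adjacent layers.

    depths is ordered back to front (first layer deepest), so layer i may
    move neither behind layer i - 1 nor in front of layer i + 1.
    """
    back_limit = DEEPEST if i == 0 else depths[i - 1] - min_gap
    front_limit = NEAREST if i == len(depths) - 1 else depths[i + 1] + min_gap
    return max(min(requested, back_limit), front_limit)


# First layer at 60, second at 25, third at -10: the second layer's movement
# Lb is confined to the interval between its two neighbors.
print(clamp_layer_depth([60.0, 25.0, -10.0], 1, 80.0))  # -> 59.0
```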
- FIG. 17 illustrates another display example of the depth bar.
- the image of the selected layer (in this example, image of the third layer) is highlighted and displayed.
- in the depth bar 501 , only the depth position 502 of the highlighted layer is illustrated.
- in the depth bar 501 , the entire depth range settable by the image processing apparatus 100 is illustrated and a position display 503 of the virtual display surface is illustrated.
- FIG. 18 illustrates another display example of the depth bar.
- the image of the selected layer (in this example, image of the second layer) is highlighted and displayed in the editing image display 203 .
- in the depth bar 601 , only the range where the depth of the image of the selected layer can be adjusted is displayed with scales.
- a depth position 602 is displayed in the depth bar 601 . That is, for the depth of the image of the second layer, the inner side is limited to the depth position of the image of the first layer and the front side is limited to the depth position of the image of the third layer.
- this limited adjustable range becomes the scale display range of the depth bar 601 .
- the depth position 602 moves in the range of the displayed depth bar 601 , as illustrated by an arrow Le.
- FIG. 19 illustrates another display example of the depth bar.
- the overlapped display of the images of all of the layers is performed in the intermediate image display 203 in editing and a depth position 702 of the image of the layer where the depth is adjusted is illustrated in a depth bar 701 .
- a position 703 of the virtual display surface is indicated. Similar to the example of FIG. 16 , depth adjustment buttons 711 , 712 , 721 , 722 , 731 , and 732 are provided for each layer and the depth position 702 is changed by adjusting the depth by the button operation, as illustrated by an arrow Lf.
- the image of the layer subject to the adjustment of the depth is indicated by an image frame 704 at its four corners in the editing image display 203 .
- the ground surface setting process starts when the user operates a button (not illustrated in the drawings) to instruct setting of the ground surface, in the editing screen illustrated by FIG. 2 . That is, as illustrated in FIG. 20 , the editing screen generating unit 118 determines whether an operation to instruct setting of the ground surface exists, in a state in which the editing screen is displayed (step S 31 ). When the operation of the setting of the ground surface does not exist, a waiting state is maintained until the corresponding operation exists.
- when it is determined in step S 31 that the operation of the setting of the ground surface exists, the overlapped display of the images of all of the layers is performed as the intermediate image display 203 in editing and a slider bar for horizontal line adjustment is vertically displayed on one end of the intermediate image display 203 in editing (step S 32 ).
- a slider handle that indicates the position of the horizontal line is displayed on the slider bar for the horizontal line adjustment, and it is determined whether a drag operation of the slider handle exists (step S 33 ).
- when the drag operation exists, a change process of the position of the horizontal line is executed according to the operation (step S 34 ).
- the lower side of the horizontal line of the image of the innermost layer (first layer) is set to the inclined surface on the three-dimensional space, according to the change of the position of the horizontal line.
- the setting of the inclined surface is the process already described in FIG. 4 and the inclined surface corresponds to the ground surface.
- in step S 35 , it is determined whether a mode to erase the lower side of the ground surface in the images of the layers other than the first layer is set.
- when the mode to erase the lower side of the ground surface is set, the objects at the positions that become the lower side of the inclined surface (ground surface) in the images of the layers other than the first layer are erased (step S 36 ).
- next, it is determined whether an object set to be disposed on the ground surface exists among the objects of the images of the individual layers (step S 37 ). In this case, when it is determined that such an object exists, the position of the lower end of the corresponding object is adjusted to the position crossing the inclined surface in the image of the layer where the object exists (step S 38 ).
- when it is determined in step S 33 that a drag operation does not exist, after the processes of steps S 36 and S 38 are executed, or when it is determined in step S 37 that the object set to be disposed on the ground surface does not exist, the process returns to the horizontal line slider bar display process of step S 32 .
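- Steps S 36 and S 38 both reduce to finding the row where a layer's depth plane crosses the inclined ground surface. The sketch below reuses the linear interpolation assumed for the ground surface earlier; the RGBA representation and all helper names are hypothetical:

```python
import numpy as np


def ground_crossing_row(layer_depth: float, y_horizon: int, y_bottom: int,
                        far_depth: float, front_edge_depth: float) -> int:
    """Row where a layer's depth plane meets the inclined ground surface."""
    # Invert the linear interpolation used for the ground surface's depth.
    t = (layer_depth - far_depth) / (front_edge_depth - far_depth)
    return round(y_horizon + t * (y_bottom - y_horizon))


def erase_below_ground(layer_rgba: np.ndarray, crossing_row: int) -> None:
    """Step S 36: make everything below the ground crossing transparent."""
    layer_rgba[crossing_row:, :, 3] = 0  # zero out the alpha channel


def snap_to_ground(object_bottom_row: int, crossing_row: int) -> int:
    """Step S 38: vertical shift that puts an object's lower end on the ground."""
    return crossing_row - object_bottom_row
```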
- FIG. 21 illustrates an example where the horizontal line adjustment bar 261 , which is the horizontal line slider bar, is displayed in the intermediate image display 203 in editing in the editing screen.
- on the horizontal line adjustment bar 261 , the ground surface setting position display 262 is indicated.
- the ground surface setting position display 262 is matched with the ground line c drawn in the image of the first layer. The matching process is executed by the user operation.
- the ground surface of the lower side of the horizontal line of the image of the first layer is set as the inclined surface illustrated in FIG. 4 by setting the horizontal line.
- the position of the ground surface may be displayed in the images of the layers other than the first layer (images of the second layer and the third layer).
- in FIG. 22 , a ground surface position display 271 where the image 252 of the second layer and the ground surface (inclined surface) cross is indicated by a broken line in the intermediate image display 203 in editing in the editing screen.
- with this display, it can be determined from the ground surface position whether the arrangement position of the object in the image 252 of the layer (in this example, the object d of a tree) is appropriate. That is, when the lower end of the object d of the tree is almost matched with the ground surface position display 271 illustrated by the broken line, an appropriate three-dimensional image is obtained. Meanwhile, when the lower end of the object d of the tree is at the upper side of the ground surface position display 271 illustrated by the broken line, the tree appears to float.
- the ground surface position display 271 is performed to effectively prevent the image from becoming an unnatural three-dimensional image.
- FIG. 23 illustrates another display example of the position of the ground surface.
- in FIG. 23 , the image of the second layer is displayed and the lower side of the ground surface is displayed as a non-display portion 272 (black display portion).
- the image of the third layer of the front side of the second layer is displayed, for example, after decreasing the brightness.
- the objects e and f in the image of the third layer may not be displayed.
- FIG. 24 illustrates an example where the objects of the lower side of the ground surface (the inclined surface) in the layers other than the first layer (the long-distance view) are erased and the display is performed as the intermediate image display 203 in editing in the editing screen.
- a process of FIG. 24 corresponds to the process in step S 36 of the flowchart of FIG. 20 .
- in this example, a lower part of the object e of the dog of the third layer (the short-distance view) becomes the lower side of the ground surface.
- therefore, the object e is displayed in a state in which the lower part of the object e on the lower side of the ground surface is erased.
- the partial erasure process of the object illustrated in FIG. 24 may be executed in the three-dimensional image display and the corresponding object may be completely displayed when the image of each layer is individually displayed.
- FIG. 25 illustrates an example of an operation screen in the case where a lower end of the object in the image of each layer is matched with the ground surface (the inclined surface), with the display performed as the intermediate image display 203 in editing in the editing screen.
- a process of FIG. 25 corresponds to the process in step S 38 of the flowchart of FIG. 20 .
- in FIG. 25 , an operation screen with respect to the object e of the dog of the third layer (the short-distance view) is illustrated.
- position movement buttons 281 and 282 , a returning button 283 , an erasure button 284 , and a ground surface adjustment button 285 are displayed around the object e.
- the user performs the operation to select each button and the position of the object e is thereby modified.
- when the ground surface adjustment button 285 is selected, the editing screen generating unit 118 executes a process for automatically matching the position of the lower end of the object e with the line where the layer crosses the ground surface.
- that is, the corresponding object e is automatically disposed on the ground surface by selecting the ground surface adjustment button 285 and an appropriate three-dimensional image can be generated.
- FIGS. 26 and 27 illustrate an example where a camera image is taken in an intermediate image in generation processing.
- when the camera capturing button 222 is operated, a camera capturing operation screen 810 is displayed on the editing screen.
- a process for reading a camera image from an external camera device (or storage device where the camera image is stored) connected to the image processing apparatus 100 is executed by an operation using the camera capturing operation screen 810 .
- the read image is displayed as the camera capturing image 811 .
- An extraction image 812 where a background is removed from the camera capturing image 811 is obtained by an operation in the camera capturing operation screen 810 .
- by disposing the extraction image 812 on the image of any layer, the extraction image 812 can be disposed as one of the objects in the intermediate image in generation processing, as illustrated in FIG. 27 .
- the depth position of the layer where the extraction image is disposed is selected by the user using the camera capturing operation screen 810 .
- the camera capturing image may be automatically disposed on the layer of the most front side (short-distance view).
- FIGS. 28 to 30 illustrate an example where a file image is imported in an intermediate image in generation processing.
- when the file importing button 221 is operated, an image file importing operation screen 820 is displayed in the editing screen.
- a process for reading selected image data from the image file stored in the designated place is executed by an operation using the image file importing operation screen 820 .
- the read image is displayed as an imported image 821 .
- an extraction image 822 that is partially extracted from the imported image 821 is obtained by an operation in the image file importing operation screen 820 .
- by disposing the extraction image 822 on the image of any layer, as illustrated in FIG. 30 , the extraction image 822 can be disposed as one of the objects in the generated image. In this case, the depth position of the layer where the extraction image is disposed is selected by the user using the image file importing operation screen 820 .
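- Placing an extracted image onto a layer is ordinary alpha compositing. A minimal sketch (the numpy RGBA representation and the placement coordinates are assumptions, not the patent's method):

```python
import numpy as np


def place_on_layer(layer_rgba: np.ndarray, extract_rgba: np.ndarray,
                   top: int, left: int) -> None:
    """Alpha-composite an extracted image onto one layer's RGBA canvas."""
    h, w = extract_rgba.shape[:2]
    region = layer_rgba[top:top + h, left:left + w].astype(np.float32)
    src = extract_rgba.astype(np.float32)
    alpha = src[..., 3:4] / 255.0  # per-pixel opacity of the extracted image
    out = alpha * src + (1.0 - alpha) * region
    layer_rgba[top:top + h, left:left + w] = out.astype(layer_rgba.dtype)
```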
- the three-dimensional image that is generated using the editing screen in the process described above is stored in the image storage unit 120 of the image processing apparatus 100 .
- a list of data of the stored three-dimensional images can be displayed on one screen.
- FIG. 31 illustrates an example where a list of generated images is displayed.
- generated images 11 , 12 , 13 , . . . are reduced and displayed.
- a two-dimensional image or a three-dimensional image may be displayed.
- number-of-layers displays 11 a , 12 a , 13 a , . . . , in which the number of layers is indicated by a figure, are performed.
- a figure where three images are overlapped is displayed when the number of layers is three.
- the generated images can be easily selected.
- the selected images may be displayed in the editing screen illustrated in FIG. 2 and editing work may be performed.
- FIG. 32 illustrates an example where the image processing apparatus 100 is configured as an information processing device such as a computer device.
- the image processing apparatus 100 mainly includes a CPU 901 , a ROM 903 , a RAM 905 , a host bus 907 , a bridge 909 , an external bus 911 , an interface 913 , an input device 915 , an output device 917 , an image capturing device 918 , a storage device 919 , a drive 921 , a connection port 923 , and a communication device 925 .
- the CPU 901 functions as an operation processing device and a control device and controls all or a part of operations in the image processing apparatus 100 , according to various programs stored in the ROM 903 , the RAM 905 , the storage device 919 , and a removable recording medium 927 .
- the ROM 903 stores programs or operation parameters that are used by the CPU 901 .
- the RAM 905 primarily stores programs used in execution of the CPU 901 or parameters appropriately changed in the execution. These devices are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus.
- the host bus 907 is connected to an external bus 911 such as a peripheral component interconnect/interface (PCI) bus through the bridge 909 .
- the input device 915 is an operation unit such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever operated by the user.
- the input device 915 may be a remote control unit (so-called remote controller) using infrared rays and other radio waves or an external connection apparatus 929 such as a mobile phone or a PDA that is compatible with the operation of the image processing apparatus 100 .
- the input device 915 is configured using an input control circuit that generates an input signal on the basis of information input by the user using the operation unit and outputs the input signal to the CPU 901 .
- the user of the image processing apparatus 100 can operate the input device 915 to input various data to the image processing apparatus 100 or to instruct the image processing apparatus 100 to execute a processing operation.
- the output device 917 is configured using a display device such as a liquid crystal display device, a plasma display device, an EL display device, and a lamp, a sound output device such as a speaker and a headphone, or a device such as a printer device, a mobile phone, and a facsimile that can visually or audibly notify the user of acquired information.
- the output device 917 outputs the result that is obtained by various processes executed by the image processing apparatus 100 .
- the display device displays the result obtained by the various processes executed by the image processing apparatus 100 with a text or an image.
- the sound output device converts an audio signal configured using reproduced sound data or acoustic data into an analog signal and outputs the analog signal.
- the image capturing device 918 is provided on the display device and the image processing apparatus 100 can capture a still image or a moving image of the user with the image capturing device 918 .
- the image capturing device 918 includes a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor and converts light condensed by a lens into an electric signal and can capture a still image or a moving image.
- the storage device 919 is a data storage device that is configured as an example of a storage unit of the image processing apparatus 100 .
- the storage device 919 is configured using a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901 , various data, and acoustic signal data or image signal data acquired from the outside.
- the drive 921 is a reader/writer for a storage medium and is incorporated in the image processing apparatus 100 or is attached to the outside of the image processing apparatus 100 .
- the drive 921 reads information that is recorded in the mounted removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory and outputs the information to the RAM 905 .
- the drive 921 can also record information on the mounted removable recording medium 927 such as the magnetic disk, the optical disk, the magneto-optical disk, or the semiconductor memory.
- the removable recording medium 927 is a DVD medium, a Blu-ray medium, a compact flash (registered trademark) (CompactFlash: CF), a memory stick, a secure digital (SD) memory card, or the like.
- the removable recording medium 927 may be an integrated circuit (IC) card or an electronic apparatus where an IC chip of a non-contact type is mounted.
- the connection port 923 is a port to directly connect the apparatus to the image processing apparatus 100 , such as a universal serial bus (USB) port, an IEEE1394 port such as i.Link, a small computer system interface (SCSI) port, an RS-232C port, an optical audio terminal, or a high-definition multimedia interface (HDMI) port.
- the communication device 925 is a communication interface that is configured by a communication device for connection with a communication network 931 .
- the communication device 925 is, for example, a communication card for a wired or wireless local area network (LAN), Bluetooth, or wireless USB (WUSB), a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), or a modem for various kinds of communication.
- the communication device 925 can transmit and receive signals based on a predetermined protocol such as TCP/IP to and from the Internet or other communication apparatuses.
- the communication network 931 that is connected to the communication device 925 is configured by a network connected by wire or wireless.
- the communication network 931 may be the Internet, a home LAN, infrared communication, radio wave communication, or satellite communications.
- an example of the hardware configuration that can realize the functions of the image processing apparatus 100 according to this example has been described above.
- the various components may be configured using general-purpose members or may be configured by hardware specialized for the functions of the individual components. Therefore, the hardware configuration to be used may be changed as appropriate according to the technological level at the time this embodiment is carried out.
- a program (software) that executes each process step performed by the image processing apparatus 100 according to this example may be created, deployed on a general-purpose computer device, and used to execute the same processes.
- the program may be stored in various media or may be downloaded from the server side to the computer device through the Internet.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Computing Systems (AREA)
- Optics & Photonics (AREA)
- Processing Or Creating Images (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Ultra Sonic Daignosis Equipment (AREA)
Abstract
The present disclosure provides a newly improved image processing apparatus that can easily generate contents having three-dimensional images. The image processing apparatus generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively. The image processing apparatus converts the plurality of plane images into a three-dimensional image in which the spatial positions of the objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images, and obtains data of the three-dimensional image. In addition, the image processing apparatus displays an editing screen for generating the plurality of plane images. The editing screen displays the plurality of plane images individually or in an overlapped manner and displays the plane images by providing tabs to the plane images, respectively.
Description
- The present disclosure relates to an image processing apparatus, an image processing method, and a program and particularly, to a technology for generating a three-dimensional image.
- In recent years, an image display apparatus that displays an image (three-dimensional image: so-called 3D image) perceived by a user as a three-dimensional image is released and begins to spread (for example, Patent Literature 1). The apparatus that can display the three-dimensional image is not limited to a television and other image display apparatuses. Of personal computers, there are types that can display a three-dimensional image.
- Among applications running on personal computers, there are applications that can generate contents having three-dimensional images. If contents are generated by such an application and the user views them in a predetermined manner, the user can perceive the images included in the contents as three-dimensional images.
- PTL 1: JP 2010-210712A
- However, according to the related art, in generating contents having three-dimensional images, it is necessary to set a positional relation using dedicated software. Accordingly, it is difficult for an end user to generate such contents.
- In light of the foregoing, it is desirable to provide an image processing apparatus, an image processing method, and a computer program which are novel or improved ones and which enable contents having three-dimensional images to be generated easily.
- An image processing apparatus of the present disclosure comprises an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively.
- An image processing apparatus of the present disclosure comprises a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit.
- An image processing apparatus of the present disclosure comprises a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit.
- An image processing apparatus of the present disclosure comprises an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively.
- An image processing apparatus of the present disclosure comprises an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
- An image processing method of the present disclosure comprises: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image in which the spatial positions of objects in each of the plurality of plane images are set, based on the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
- A program of the present disclosure causes a computer to execute an image process, the program causing the computer to execute: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image in which the spatial positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
- According to the present disclosure, a plurality of plane images are appropriately displayed so that a user can generate and display a desired three-dimensional image through a simple operation.
- According to the present disclosure, there are provided an image processing apparatus, an image processing method, and a computer program which are novel or improved and which enable contents having three-dimensional images to be easily generated.
FIG. 1 is a block diagram illustrating an example of the configuration of an image processing apparatus according to an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating an example of an editing screen (example in which all layers are displayed) according to the embodiment of the present disclosure.
FIG. 3 is an exemplary diagram illustrating an example of an image of each layer according to the embodiment of the present disclosure.
FIG. 4 is an exemplary diagram illustrating an example of a depth display screen according to the embodiment of the present disclosure.
FIG. 5 is an exemplary diagram illustrating a concept of converting images of a plurality of layers into three-dimensional images.
FIG. 6 is an exemplary diagram illustrating a state in which images of a plurality of layers are converted into three-dimensional images.
FIG. 7 is an exemplary diagram illustrating an example in which images of a plurality of layers are converted into three-dimensional left and right channel images.
FIG. 8 is an exemplary diagram illustrating an example of three-dimensional images that are generated from images of a plurality of layers.
FIG. 9 is an exemplary diagram illustrating a converted state of the case in which there is a layer whose virtual position is set on the front side of a virtual display surface.
FIG. 10 is an exemplary diagram illustrating an example of a three-dimensional image of the case where there is a layer whose virtual position is set on the front side of a virtual display surface.
FIG. 11 is a flowchart illustrating an example of the flow of an editing screen display process according to the embodiment of the present disclosure.
FIG. 12 is an exemplary diagram illustrating an example of an editing screen (example in which a third layer is displayed) according to the embodiment of the present disclosure.
FIG. 13 is an exemplary diagram illustrating an example of an editing screen (example in which a second layer is displayed) according to the embodiment of the present disclosure.
FIG. 14 is an exemplary diagram illustrating an example of an editing screen (example in which a first layer is displayed) according to the embodiment of the present disclosure.
FIG. 15 is a flowchart illustrating an example of the flow of a depth screen display process according to the embodiment of the present disclosure.
FIG. 16 is an exemplary diagram illustrating an example (first example) of displaying a depth screen according to the embodiment of the present disclosure.
FIG. 17 is an exemplary diagram illustrating an example (second example) of displaying a depth screen according to the embodiment of the present disclosure.
FIG. 18 is an exemplary diagram illustrating an example (third example) of displaying a depth screen according to the embodiment of the present disclosure.
FIG. 19 is an exemplary diagram illustrating an example (fourth example) of displaying a depth screen according to the embodiment of the present disclosure.
FIG. 20 is a flowchart illustrating an example of the flow of a horizontal line setting process according to the embodiment of the present disclosure.
FIG. 21 is an exemplary diagram illustrating an example of a horizontal line setting screen according to the embodiment of the present disclosure.
FIG. 22 is an exemplary diagram illustrating an example (first example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
FIG. 23 is an exemplary diagram illustrating an example (second example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
FIG. 24 is an exemplary diagram illustrating an example (third example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
FIG. 25 is an exemplary diagram illustrating an example (fourth example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
FIG. 26 is an exemplary diagram illustrating an example of a display screen at the time of capturing a camera image according to the embodiment of the present disclosure.
FIG. 27 is an exemplary diagram illustrating a display example of a three-dimensional image where camera images are synthesized according to the embodiment of the present disclosure.
FIG. 28 is an exemplary diagram illustrating an example of a display screen at the time of importing an image file according to the embodiment of the present disclosure.
FIG. 29 is an exemplary diagram illustrating an example of an importing range setting at the time of importing an image file according to the embodiment of the present disclosure.
FIG. 30 is an exemplary diagram illustrating a display example of a three-dimensional image in which images imported from an image file are synthesized according to the embodiment of the present disclosure.
FIG. 31 is an exemplary diagram illustrating an example of a list display screen of productions according to the embodiment of the present disclosure.
FIG. 32 is a block diagram illustrating an example of the detailed hardware configuration of an image processing apparatus according to the embodiment of the present disclosure.
- Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
- Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings, in the following order.
- 1-1. Configuration of an image processing apparatus (FIG. 1)
- 1-2. Outline of a three-dimensional image (FIGS. 3 to 10)
- 1-3. Display example of an editing screen (FIG. 2 and FIGS. 11 to 14)
- 1-4. Display example of a depth adjustment screen (FIGS. 15 to 19)
- 1-5. Display example of a ground surface setting screen (FIGS. 20 to 25)
- 1-6. Example where a camera image is captured (FIGS. 26 and 27)
- 1-7. Example where an image file is imported (FIGS. 28 to 30)
- 1-8. Display example of a generated image list (FIG. 31)
- 1-9. Specific example of hardware configuration (FIG. 32)
- 1-1. The functional configuration of an image processing apparatus
- First, the configuration of an image processing apparatus according to an embodiment (hereinafter, called “this example”) of the present disclosure will be described.
FIG. 1 is a diagram illustrating the configuration of an image processing apparatus 100 according to this example.
- An image processing apparatus 100 according to this example illustrated in FIG. 1 is configured to generate an image by a user operation and to display and store the generated image. As illustrated in FIG. 1, the image processing apparatus 100 includes an image generating/processing unit 110, an image storage unit 120, an input unit 130, and an image display unit 140.
- The image generating/processing unit 110 provides the user with an image generation screen through the image display unit 140 or generates a three-dimensional image from the image generated by the user. As illustrated in FIG. 1, the image generating/processing unit 110 that is included in the image processing apparatus 100 according to this example includes an image generating unit 112, a three-dimensional image converting unit 114, a three-dimensional image generating unit 116, and an editing screen generating unit 118. The editing screen generating unit 118 also functions as an image control unit that controls image generation in each unit. Alternatively, the image generating/processing unit 110 may include an image control unit that controls image generation in each unit, separately from the editing screen generating unit 118.
- The image generating/processing unit 110 generates a plurality of plane images (for example, three plane images) on the basis of user operations in a state in which an editing screen for image generation is displayed on the image display unit 140, and generates a three-dimensional image from the plurality of generated plane images (two-dimensional images). The editing screen for image generation is the screen illustrated in FIG. 2; it displays an intermediate image in generation processing at the center and displays operation buttons and tabs around the image. The editing screen illustrated in FIG. 2 will be described below in detail. Depth information measured from a reference position (for example, a virtual display surface) is set to each of the plurality of generated plane images, and the three-dimensional image is generated on the basis of that depth.
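One way to picture the data this unit works with is a list of plane images, each carrying its own signed depth relative to the virtual display surface. The following is a minimal sketch of such a model; the field names are assumptions for illustration, not the apparatus's actual internal representation.

```python
# Speculative model of a multi-layer plane-image document with
# per-layer depth, as described above. Names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Layer:
    name: str             # e.g. "long-distance view"
    pixels: bytes         # raster data of this plane image (RGBA)
    size: Tuple[int, int]
    depth: float          # signed distance from the virtual display
                          # surface: negative = behind, positive = in front

@dataclass
class Document:
    layers: List[Layer] = field(default_factory=list)

    def set_depth(self, index: int, depth: float) -> None:
        # Depth information from the reference position is stored per
        # layer; the 3D conversion reads it when projecting each layer.
        self.layers[index].depth = depth
```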
- In addition, the image generating/processing unit 110 supplies image data of the generated three-dimensional image to the image display unit 140, and the image display unit 140 displays the three-dimensional image. The user views the three-dimensional image using a predetermined method (for example, wearing shutter glasses driven in a time-division manner) and perceives the image displayed on the image display unit 140 as a three-dimensional image.
- The image generating unit 112 displays the editing screen for image generation on the image display unit 140 and generates an image according to the user operation. When the image generating unit 112 generates images including a plurality of layers using the image generation screen that it provides, the images including the plurality of layers are converted into a three-dimensional image by the three-dimensional image converting unit 114 and the three-dimensional image generating unit 116. The images including the plurality of layers generated by the image generating unit 112 are stored in the image storage unit 120 according to the user operation.
- The three-dimensional image converting unit 114 executes a conversion process for displaying the images including the plurality of layers transmitted from the image generating unit 112 as a three-dimensional image on the image display unit 140. The image processing apparatus 100 according to this example assumes in advance the distance between the eyes of the user and the distance between the user and the display surface, and executes the conversion process on the basis of the virtual distance between the layers (the depth information of each layer of the images). Specifically, in order to generate the three-dimensional image by performing a coordinate conversion on the images including the plurality of layers, the three-dimensional image converting unit 114 executes a coordinate conversion process with respect to those images.
- If, during the conversion process in the three-dimensional image converting unit 114, the user adjusts the depth of a layer while the three-dimensional image is displayed on the image display unit 140 and thereby changes the depth state, the three-dimensional image converting unit 114 executes the conversion process again in real time according to the change. Thereby, the user can adjust the depth of each layer of the images and confirm the adjusted three-dimensional image through the display on the editing screen in real time. An example of a process for adjusting the depth of each layer of the images will be described in detail below.
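In implementation terms, this real-time behavior amounts to re-running the conversion whenever a depth value changes. The following is a speculative sketch of that update path; `convert_to_stereo` and `display.show` are hypothetical stand-ins for the converting unit and the image display unit, not disclosed interfaces.

```python
# Sketch of the real-time update: whenever a layer's depth changes,
# the stereo pair is regenerated and pushed to the display.
class DepthAdjustmentController:
    def __init__(self, document, convert_to_stereo, display):
        self.document = document
        self.convert_to_stereo = convert_to_stereo  # hypothetical
        self.display = display                      # hypothetical

    def on_depth_changed(self, layer_index: int, new_depth: float) -> None:
        self.document.set_depth(layer_index, new_depth)
        # Re-run the projection conversion immediately so the editing
        # screen reflects the adjusted depth without an explicit save.
        left, right = self.convert_to_stereo(self.document)
        self.display.show(left, right)
```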
- When the image processing apparatus 100 generates the three-dimensional image from the planar images including the plurality of layers generated by the user, the image processing apparatus 100 executes a preview display of the three-dimensional image. By the preview display, the user can grasp in advance how the images will be seen in a three-dimensional manner before storing the generated image as the three-dimensional image.
- The three-dimensional image generating unit 116 generates the three-dimensional image from the images including the plurality of layers, on the basis of the conversion process executed by the three-dimensional image converting unit 114.
- The three-dimensional image generated by the three-dimensional image generating unit 116 is displayed on the image display unit 140 and is stored in the image storage unit 120 according to the operation of the input unit 130 by the user.
- The editing screen generating unit 118 generates display data of the editing screen on the basis of the reception state of input operations in the input unit 130. The editing screen generating unit 118 supplies the generated display data to the image display unit 140, which displays the editing screen.
- The image storage unit 120 stores the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting those images. The images stored in the image storage unit 120 are read from the image storage unit 120 according to the operation of the input unit 130 by the user, are processed by the image generating/processing unit 110, and are displayed by the image display unit 140.
- The input unit 130 includes various input devices with which the user performs input operations on the image processing apparatus 100, for example, a keyboard, a mouse, a graphics tablet, and a touch panel. The user can generate the images including the plurality of layers by operating the input unit 130 and can adjust the depth of each layer of the images when the images are converted into the three-dimensional image.
- The image display unit 140 is a display that displays an image. For example, the image display unit 140 displays the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting those images. The image display unit 140 also displays a screen that allows the user of the image processing apparatus 100 to generate the images. An example of the display screen will be described below. A touch panel may be disposed on the image display surface of the image display unit 140 so that the user can directly operate buttons in a displayed image. The touch panel included in the image display unit 140 functions as a part of the input unit 130. The image display unit 140 may also be a device that is separate from the image processing apparatus 100.
- The image display unit 140 may be configured using a display device that can display the three-dimensional image. The method of displaying the three-dimensional image is not limited to a specific display method. For example, a method that switches between an image for a right eye and an image for a left eye at a high speed and displays them is known. As methods of transmitting the three-dimensional image to the image display unit 140, the frame sequential method, the side-by-side method, the top-and-bottom method, and the like are known.
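Of the transmission formats named above, side-by-side packing is easy to illustrate: each eye's view is squeezed to half width and the two halves share one frame. The sketch below assumes the Pillow library and is an illustration of the general format, not the apparatus's disclosed implementation.

```python
# Sketch of side-by-side packing of a stereo pair. Assumes Pillow.
from PIL import Image

def pack_side_by_side(left: Image.Image, right: Image.Image) -> Image.Image:
    w, h = left.size
    frame = Image.new("RGB", (w, h))
    # Each eye's view occupies one horizontal half of the frame; a
    # 3D-capable display splits and rescales the halves on playback.
    frame.paste(left.resize((w // 2, h)), (0, 0))
    frame.paste(right.resize((w // 2, h)), (w // 2, 0))
    return frame
```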
- The images generated by the image generating/processing unit 110 may also be output to a television receiver or another display device that is connected to the image processing apparatus 100 and can display the three-dimensional image.
- 1-2. Outline of generation of a three-dimensional image
- Next, an outline of generation of the three-dimensional image by the image processing apparatus 100 according to the embodiment of the present disclosure will be described with reference to FIGS. 3 to 10.
- First, as illustrated in FIG. 3, the image processing apparatus 100 according to this example has plane images 251, 252, and 253 including three layers of a long-distance view, a middle-distance view, and a short-distance view generated by the user. The image processing apparatus 100 synthesizes the plane images 251, 252, and 253 including the three layers of the long-distance view, the middle-distance view, and the short-distance view, and the plane images are converted into a three-dimensional image by the three-dimensional image converting unit 114. As such, by converting the images including the three layers into the three-dimensional image, the user can generate the three-dimensional image without executing a complicated imaging process.
- In the example of FIG. 3, objects a, b, c, and g respectively corresponding to a mountain, the sun, a road, and a horizontal line are drawn in the plane image 251 of the long-distance view, an object d of a tree is drawn in the plane image 252 of the middle-distance view, and objects e and f corresponding to a dog and an insect are drawn in the plane image 253 of the short-distance view. The number of layers, three, is exemplary; any plural number of layers may be used.
- The depth, which is expressed by the distance from the reference position, is set to the image of each layer.
FIG. 4 is a diagram illustrating a setting situation of the depth. The setting state illustrated in FIG. 4 is reduced and displayed in a 3D state thumbnail tab 202 of the editing screen to be described below. When the 3D state thumbnail tab 202 is selected, the setting state illustrated in FIG. 4 is enlarged and displayed on the editing screen.
- The setting state of the depth illustrated in FIG. 4 will be described. A depth axis 301 is set in a direction orthogonal to the display screen, and the depth positions of the layer images 311, 312, and 313 are set on the depth axis 301. In the example of FIG. 4, the depth axis 301 is illustrated as an oblique axis that has a predetermined angle.
- The image frame illustrated by a broken line in FIG. 4 illustrates a virtual display surface 304 at the position of the image display surface of the display. The virtual display surface 304 exists at the depth position 301 a on the depth axis 301. The position of the most front side of the depth axis 301 is illustrated as a front edge portion 302.
- In the example of FIG. 4, the first layer image 311 of the long-distance view exists at the innermost depth position 301 b on the depth axis 301, and the second layer image 312 of the middle-distance view exists at the depth position 301 c near the center of the depth axis 301. The third layer image 313 of the short-distance view exists at the depth position 301 d on the front side of the virtual display surface 304 on the depth axis 301. In this example, the layer image 313 of the short-distance view exists at a depth position on the front side of the virtual display surface 304; however, it may instead be deeper than the virtual display surface 304 (closer to the long-distance view).
- As such, when the three layer images are prepared, the first layer image 311 may be automatically set, in an initial state, to a predetermined depth position suitable for the long-distance view. Likewise, the second layer image 312 may be automatically set to a predetermined depth position suitable for the middle-distance view, and the third layer image 313 may be automatically set to a predetermined depth position suitable for the short-distance view.
- As illustrated in FIG. 4, the three-dimensional image is generated such that the objects a to g in the layer images 311, 312, and 313 are disposed at the depth positions of those layer images in a virtual three-dimensional space.
- However, when the position of the horizontal line c is set to the horizontal line position 321 with respect to the layer image 311 of the long-distance view, the image portion below the horizontal line position 321 of the layer image 311 of the long-distance view is disposed so as to be inclined in the three-dimensional space and to protrude toward the front side. That is, the image portion below the horizontal line c becomes an inclined surface 303 that gradually protrudes toward the front side, from the depth position 301 b of the layer image 311 to the front edge portion 302, as the image portion progresses downward. In the example of FIG. 4, the object g of the road that is drawn below the horizontal line position 321 in the layer image 311 is disposed on the inclined surface 303.
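A simple way to realize such an inclined surface is to interpolate the depth row by row between the layer's own depth at the horizon row and the front-edge depth at the bottom row. The linear interpolation in the sketch below is an assumption; the specification only states that the surface gradually protrudes to the front side.

```python
# Speculative sketch of the inclined-surface depth: below the
# horizontal line, the ground depth ramps from the layer's depth
# toward the front edge, so a road drawn there appears to recede.
def ground_depth(y: int, horizon_y: int, bottom_y: int,
                 layer_depth: float, front_edge_depth: float) -> float:
    """Return the depth used for pixel row y of the long-distance layer."""
    if y <= horizon_y:
        return layer_depth            # above the horizon: flat backdrop
    t = (y - horizon_y) / (bottom_y - horizon_y)
    # t is 0 at the horizon row and 1 at the bottom row of the image.
    return layer_depth + t * (front_edge_depth - layer_depth)
```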
- In the example of FIG. 4, the inclined surface 303 protrudes to the front edge portion 302 on the front side of the third layer image 313. However, the inclined surface 303 may instead protrude only to the virtual display surface 304. In this case, the layer of the most front side (the third layer image 313) protrudes to the front side of the terminating point of the front side of the inclined surface 303.
- As such, when the horizontal line is set with respect to the first layer image 311 of the long-distance view and settings matched with the horizontal line are performed with respect to the objects in the other layer images 312 and 313, each object is automatically and appropriately disposed on the inclined surface 303. Alternatively, a process for erasing inappropriate portions of the objects is executed. A specific example of the setting of the horizontal line and the process associated with the horizontal line will be described below.
- Next, an example of the process for converting the images into the three-dimensional image in the three-dimensional image converting unit 114 will be described.
FIG. 5 is a diagram illustrating the concept of converting normal two-dimensional images including a plurality of layers into a three-dimensional image. FIG. 5 illustrates an aspect where two-dimensional images including a plurality of layers (plane images 251, 252, and 253) are converted into an image 250R for a right eye which the user views with the right eye and an image 250L for a left eye which the user views with the left eye. In FIG. 5, however, the process for the horizontal line described above is not illustrated. The three-dimensional image converting unit 114 calculates the drawing positions of the image for the right eye and the image for the left eye to generate the image 250R for the right eye and the image 250L for the left eye from the two-dimensional images.
- Next, an example of a specific method to calculate the drawing positions of the image for the right eye and the image for the left eye will be described.
FIGS. 6 to 8 are exemplary diagrams illustrating an example where normal two-dimensional images including a plurality of layers are converted into a three-dimensional image. FIG. 6 illustrates the coordinate conversion performed when an image for a right eye and an image for a left eye are generated from the two-dimensional images including the three layers illustrated in FIG. 7, for the case in which the three layers are deeper than the display surface, as illustrated in FIG. 8. FIG. 6 schematically illustrates a state in which each layer and the display surface are viewed from the top side.
- As illustrated in FIG. 6, a distance E between the left eye and the right eye and a virtual viewing distance L are assumed in advance. The three-dimensional image converting unit 114 executes a projection coordinate conversion for the image for the right eye and a projection coordinate conversion for the image for the left eye, using the layer depths D1, D2, and D3 between the virtual display surface 259 and the layers. With the projection coordinate conversion, the pixel positions of the objects of each layer image on the virtual display surface 259 are calculated.
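Under the stated assumptions, the conversion reduces to ordinary perspective projection onto the screen plane. The sketch below works that out for one lateral coordinate; it is a derivation under these assumptions, not a formula taken verbatim from the specification.

```python
# Sketch of the projection: eyes at x = -E/2 and x = +E/2, the virtual
# display surface at viewing distance L, and a layer at depth D behind
# it (a negative D places the layer in front of the surface).
def project_to_screen(x0: float, depth: float, eye_x: float,
                      viewing_distance: float) -> float:
    """Project a point at lateral position x0 on a layer onto the screen.

    eye_x: -E/2 for the left eye, +E/2 for the right eye.
    """
    # The ray from the eye (eye_x, 0) to the object (x0, L + depth)
    # crosses the screen plane z = L at:
    return eye_x + (x0 - eye_x) * viewing_distance / (viewing_distance + depth)

# Example: E = 65 mm, L = 2000 mm; a point at the center of a layer
# 500 mm behind the screen projects to +6.5 for the right eye and
# -6.5 for the left eye. For a negative depth (layer in front of the
# screen) the signs flip, matching the left-right reversal of FIG. 9.
E, L = 65.0, 2000.0
x_right = project_to_screen(0.0, 500.0, +E / 2, L)   # = 6.5
x_left = project_to_screen(0.0, 500.0, -E / 2, L)    # = -6.5
```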
- As such, the three-dimensional image converting unit 114 executes the projection coordinate conversion onto the virtual display surface 259, and the image processing apparatus 100 can thereby convert the normal two-dimensional images including the plurality of layers into the three-dimensional image.
- The examples of FIGS. 6 to 8 are examples where each layer is set to a depth deeper than that of the virtual display surface 259. However, a layer may also be set to a depth on the front side of the virtual display surface 259.
FIGS. 9 and 10 illustrate an example where two-dimensional images are converted into a three-dimensional image when the image of the layer of the short-distance view is disposed on the front side of a virtual display surface 259′. FIG. 9 schematically illustrates, in a top-down view, the coordinate conversion performed when an image for a right eye and an image for a left eye are generated from the two-dimensional images including the three layers illustrated in FIG. 10.
- As illustrated in FIG. 9, for objects on the layer on the front side of the virtual display surface 259′, the pixel positions at which they are projected onto the virtual display surface 259′ for the right eye and for the left eye are reversed left and right compared with the pixel positions for a layer deeper than the virtual display surface 259′.
- In the case of this example, as illustrated in FIG. 10, a three-dimensional image is obtained in which the objects e and f disposed on the layer of the short-distance view appear to protrude to the front side of the display surface.
- 1-3. Display example of the editing screen
- Next, an image generation process using the editing screen that is needed to generate the three-dimensional image will be described.
- The editing screen is the screen illustrated in FIG. 2 that is displayed on the image display unit 140 on the basis of the process in the editing screen generating unit 118. The flowchart of FIG. 11 illustrates the flow of the process in the editing screen generating unit 118, and the description will be given with reference to FIG. 11. First, it is determined whether the input unit 130 receives a user operation to display the editing screen (step S11). When the user operation is not received, a waiting state is maintained; when the user operation to display the editing screen is received, the process for displaying the 3D editing screen illustrated in FIG. 2 is executed (step S12). In this state, in the 3D editing screen, all of the layers are overlapped and displayed as the displayed intermediate image in generation processing. In addition, multi-layer thumbnails that correspond to the number of layers are displayed.
- In addition, it is determined whether an operation to select any one of the displayed multi-layer thumbnails with the input unit 130 is received (step S13). When the operation is not received, a waiting state is maintained and the image of the editing screen is not changed.
- When the operation to select any one of the layer thumbnails is received, the image of the layer that corresponds to the selected layer thumbnail is displayed on the editing screen (step S14). In this case, in the image display of each layer, the display brightness of the images of the other layers is decreased while the images of all layers remain displayed, so that the entire three-dimensional image can still be recognized.
- In a state in which the image of the specific layer is displayed in step S14, it is determined whether an operation to change the layer to another layer by selecting the layer thumbnail exists (step S15). In this case, when it is determined that the operation to change the layer to another layer exists, a change process of the layer that is displayed on the editing screen is executed (step S16), the process returns to step S14, and an image of the layer after the change is displayed.
- When it is determined that the operation to change the layer to another layer does not exist in step S15, it is determined whether an operation to change the display to the overlapped display of all of the layers exists (step S17). In this case, when it is determined that the operation to change the display to the overlapped display of all of the layers exists, the display is changed to the overlapped display of all of the layers and the process returns to the 3D editing screen display in step S12.
- In step S17, when it is determined that the operation to change the display to the overlapped display of all of the layers does not exist, the screen display of the selected layer in step S14 is continuously executed.
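The determinations of steps S11 to S17 amount to a small display state loop. The following is a speculative sketch of that loop; the event objects and UI calls are hypothetical placeholders, not interfaces disclosed by the specification.

```python
# Sketch of the editing-screen display flow (steps S11 to S17).
def editing_screen_loop(ui):
    while not ui.open_requested():            # S11: wait for open operation
        pass
    ui.show_all_layers_overlapped()           # S12: 3D editing screen
    selected = None
    while True:
        event = ui.next_event()
        if selected is None:
            if event.is_thumbnail_select():            # S13
                selected = event.layer
                ui.show_layer_highlighted(selected)    # S14
        else:
            if event.is_thumbnail_select():            # S15
                selected = event.layer                 # S16
                ui.show_layer_highlighted(selected)    # back to S14
            elif event.is_show_all_layers():           # S17
                selected = None
                ui.show_all_layers_overlapped()        # back to S12
            # otherwise the selected-layer display continues unchanged
```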
- Next, a specific display example of the editing screen will be described with reference to FIGS. 2 and 12 to 14. FIG. 2 illustrates an example of the editing screen in a state in which all layers of the intermediate image in generation processing are overlapped and displayed. In the editing screen, an intermediate image display 203 in edition processing is performed with a relatively large image at a center portion. In the example of FIG. 2, as the editing image display 203, an image where the images 251 to 253 of the three layers illustrated in FIG. 3 are overlapped is displayed. In order to illustrate that the image includes the images 251, 252, and 253 of the three layers, three layer edge portions, a first layer edge portion 203 a, a second layer edge portion 203 b, and a third layer edge portion 203 c, are overlapped and illustrated with respect to the upper edge portion of the editing image display 203. In the edge portions 203 a to 203 c of the three images, the edge portion (for example, edge portion 203 a) of the image that corresponds to the selected tab is displayed on the front side and the edge portions of the other images are displayed on the inner side.
- In addition, plural tabs 201, 202, and 211 to 213 for selecting a display image are disposed on the upper side of the intermediate image display 203 in edition processing in the editing screen. The tabs 201, 202, and 211 to 213 display the images allocated to the selected tabs when a user operation, such as a touch panel operation, selects the corresponding tab display place.
- The image that is allocated to each tab will be described.
The layer thumbnail tabs 211, 212, and 213 are tabs to individually display the layers. Specifically, the first layer thumbnail tab 211 is a tab that displays the first layer image, the second layer thumbnail tab 212 is a tab that displays the second layer image, and the third layer thumbnail tab 213 is a tab that displays the third layer image. In the thumbnail tabs 211 to 213, the images of the layers are reduced and displayed as thumbnail images. Therefore, when the image of a layer is modified, the thumbnail image is modified in the same way.
- The tab 201 that is displayed adjacent to the layer thumbnail tabs 211, 212, and 213 is a tab that adds a layer image. That is, when the tab 201 is selected, a layer image is newly added and an editing screen of the newly added layer image is displayed.
- In addition, the 3D state thumbnail tab 202 that is disposed closer to the right side of the upper side of the intermediate image display 203 in edition processing is a tab that displays the depth state of the image of each layer. When the 3D state thumbnail tab 202 is selected, the display screen of the depth state illustrated in FIG. 4 is enlarged and displayed in the place of the editing image display 203. In the 3D state thumbnail tab 202, the display screen of the depth state illustrated in FIG. 4 is reduced and displayed. In the reduced display in the 3D state thumbnail tab 202, the display content changes according to adjustments of the depth state.
- A peripheral portion of the intermediate image display 203 in edition processing of the editing screen illustrated in FIG. 2 will be described. On the left end of the editing image display 203, a plurality of buttons 221 to 225 are disposed. In this example, a file importing button 221, a camera capturing button 222, a stamp button 223, a character input button 224, and a depth operation button 225 are prepared.
- If a user operation to select the file importing button 221 exists, a process for importing a file image that is prepared in the image storage unit 120 or an external memory starts.
- If a user operation to select the camera capturing button 222 exists, a process for capturing image data from a camera device connected to the apparatus starts.
- With the stamp button 223 and the character input button 224, a process for inputting prepared figures or characters starts by a user operation of each button.
- If a user operation to select the depth operation button 225 exists, a depth adjustment screen is displayed on the editing image display 203. The display process of the depth adjustment screen will be described below.
- As illustrated in FIG. 2, a generation start button 241, a save button 242, and a three-dimensional display button 243 are displayed on the right end of the editing image display 203. The generation start button 241 is a button for instructing generation of a new three-dimensional image. The save button 242 is a button for instructing storage of the intermediate image in generation processing in the image storage unit 120. The three-dimensional display button 243 is a button for switching between three-dimensional display and normal display (2D display) of the image displayed on the editing image display 203.
- On the lower side of the right end of the editing image display 203, a pen tool 290 is displayed. The pen tool 290 of this example displays a first pen 291, a second pen 292, a third pen 293, and an eraser 294. If a user operation to select the display place of one of the pens 291, 292, and 293 exists, a line can be drawn with the color or line type allocated to that pen. If a user operation to select the display place of the eraser 294 exists, the drawn line is erased. The pen tool 290 may have functions supporting other drawing or erasing operations.
- As illustrated in FIG. 2, object display units 231 to 237 are provided on the lower end of the editing image display 203. In the plurality of object display units 231 to 237, for example, the seven most recently used objects are displayed. In the example of FIG. 2, each object of the intermediate image in generation processing is illustrated.
FIG. 2 illustrates the example where the images of the individual layers are overlapped and displayed in the editing image display 203. Even when other images are displayed in the editing image display 203, the same display is performed around the editing image display 203. In the editing screen of FIG. 2, a tab that selects the overlapped display in which the images of the layers illustrated in FIG. 2 are overlapped is not provided in particular. However, such a tab may be prepared separately from the tabs for the images of the individual layers. When the tab to select the overlapped display is provided, a thumbnail image in which the overlapped image of the layers is reduced and displayed is displayed on that tab. Therefore, when the image of a layer is modified, the thumbnail image of the overlapped display is modified according to the change.
FIG. 12 illustrates an example where the image of the third layer (the image 253 of the short-distance view of FIG. 3) is displayed in the editing image display 203. The image display of the third layer is performed by selecting the third layer thumbnail tab 213 by a user operation.
- When the image 253 of the third layer is displayed, the objects e and f in the image of the third layer are displayed with their set colors and brightness. The objects of the images of the other layers are overlapped and displayed after decreasing their display brightness. That is, only the image of the third layer is highlighted and the images of the other layers are grayed out.
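This gray-out view can be produced by dimming every layer except the selected one before compositing. The sketch below assumes Pillow; the 0.3 dimming factor is an assumption, since the specification only says the brightness is decreased.

```python
# Sketch of the gray-out compositing: the selected layer keeps its
# set colors while the other layers are blended in at reduced
# brightness. Assumes Pillow; names are hypothetical.
from PIL import Image, ImageEnhance

def dim_layer(layer: Image.Image, factor: float) -> Image.Image:
    """Reduce a layer's brightness while keeping its transparency."""
    rgb = ImageEnhance.Brightness(layer.convert("RGB")).enhance(factor)
    dimmed = rgb.convert("RGBA")
    dimmed.putalpha(layer.getchannel("A"))
    return dimmed

def compose_editing_view(layers, selected_index, dim_factor=0.3):
    """Overlap all layers back to front, dimming all but the selected one."""
    canvas = Image.new("RGBA", layers[0].size, (255, 255, 255, 255))
    for i, layer in enumerate(layers):
        img = layer if i == selected_index else dim_layer(layer, dim_factor)
        canvas.alpha_composite(img)
    return canvas
```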
- By performing operations to add objects to the image in the intermediate image display 203 in edition processing, or by drawing, in the state illustrated in FIG. 12, the user generates or edits the image of the third layer.
FIG. 13 illustrates an example where the image of the second layer (the image 252 of the middle-distance view of FIG. 3) is displayed in the editing image display 203. The image display of the second layer is performed by selecting the second layer thumbnail tab 212 by a user operation.
- When the image 252 of the second layer is displayed, the object d in the image of the second layer is displayed with its set colors and brightness. The objects of the images of the other layers are overlapped and displayed after decreasing their display brightness. By performing operations to add objects to the image in the intermediate image display 203 in edition processing, or by drawing, in the state illustrated in FIG. 13, the user generates or edits the image of the second layer.
FIG. 14 illustrates an example where the image of the first layer (the image 251 of the long-distance view of FIG. 3) is displayed in the editing image display 203. The image display of the first layer is performed by selecting the first layer thumbnail tab 211 by a user operation.
- When the image 251 of the first layer is displayed, the objects a, c, and g in the image of the first layer are displayed with their set colors and brightness. At the time of the display, the objects of the images of the other layers are overlapped and displayed after decreasing their display brightness.
- By performing operations to add objects to the image in the intermediate image display 203 in edition processing, or by drawing, in the state illustrated in FIG. 14, the user generates or edits the image of the first layer.
- In the display examples of FIGS. 12 to 14, the image of the selected layer is highlighted and the images of the other layers are grayed out. Alternatively, as the editing image, only the objects of the image of the selected layer may be displayed, and the objects of the images of the other layers may be hidden.
- 1-4. Display example of a depth adjustment screen
- Next, the flow of the depth adjustment process for the image of each layer will be described with reference to the flowchart of FIG. 15.
- The depth adjustment process is started by a user operation to select the depth operation button 225 of the editing screen illustrated in FIG. 2, for example. That is, as illustrated in FIG. 15, the editing screen generating unit 118 determines whether an operation to select the depth operation button 225 exists in a state in which the editing screen is displayed (step S21). When the operation to select the depth operation button 225 does not exist, a waiting state is maintained until the operation exists.
- In step S21, when it is determined that the operation to select the depth operation button 225 exists, the overlapped display of the images of all of the layers is performed as the intermediate image display 203 in edition processing and a depth bar is displayed on the upper side of the intermediate image display 203 in edition processing (step S22). The depth bar is a scale that illustrates the depth of each layer. As in this example, when three layers of images are used, the depth positions of the three layers are displayed by the depth bar. On the lower side of the editing image display 203, depth adjustment buttons are displayed. A specific display example will be described below.
- The editing
screen generating unit 118 determines whether a user operation of any adjustment button exists (step S23). In this case, when it is determined that the operation of the adjustment button does not exist, the display of step S22 is continuously executed. When it is determined that the operation of the adjustment button exists, the image of the layer that corresponds to the operated operation button is displayed as theintermediate image display 203 in edition processing (step S24). The depth position that is set to the image of the corresponding layer is changed according to the operation situation of the adjustment button and the depth position of the depth bar display is changed according to the corresponding position (step S25). After the setting or the display is changed in step S25, the display returns to the display of step S22. However, theintermediate image display 203 in edition processing may be the display of only the operated layer until a next operation exists. -
FIG. 16 is a diagram illustrating a display example of the depth bar.
- When the operation of the depth operation button 225 of the editing screen exists, as illustrated in FIG. 16, the overlapped display of the images of all of the layers is performed as the intermediate image display 203 in edition processing, and the depth bar 401 is displayed on the upper side of the editing image display 203.
- In this example, the depth positions of the images of the three layers are illustrated in one depth bar 401. That is, in the depth bar 401, the depth position 401 a of the image of the first layer, the depth position 401 b of the image of the second layer, and the depth position 401 c of the image of the third layer are illustrated in different display colors.
- In the scale given to the depth bar 401, "0" indicates the depth position of the virtual display surface, the inner side is indicated by minus values, and the front side is indicated by plus values (the plus display is not illustrated).
- On the lower side of the editing image display 203, buttons that adjust the depth position are displayed for the image of each layer. Specifically, a depth adjustment button 411 that moves the depth to the front side and a depth adjustment button 412 that moves the depth to the inner side are displayed as adjustment buttons for the image of the first layer. A depth adjustment button 421 that moves the depth to the front side and a depth adjustment button 422 that moves the depth to the inner side are displayed as adjustment buttons for the image of the second layer. A depth adjustment button 431 that moves the depth to the front side and a depth adjustment button 432 that moves the depth to the inner side are displayed as adjustment buttons for the image of the third layer.
- If a user operation to select the display place of one of the depth adjustment buttons 411 to 432 exists, the depth setting of the image of the corresponding layer is changed. For example, the depth position of the image of the first layer is changed by the operations of the depth adjustment buttons 411 and 412, and the display of the depth position 401 a of the image of the first layer in the depth bar 401 is moved as illustrated by an arrow La.
- In addition, the depth position of the image of the second layer is changed by the operations of the depth adjustment buttons 421 and 422, and the display of the depth position 401 b of the image of the second layer in the depth bar 401 is moved as illustrated by an arrow Lb.
- In addition, the depth position of the image of the third layer is changed by the operations of the depth adjustment buttons 431 and 432, and the display of the depth position 401 c of the image of the third layer in the depth bar 401 is moved as illustrated by an arrow Lc.
- The movement by the depth adjustment buttons 411 to 432 is limited to positions adjacent to the depth positions of the images of the adjacent layers. For example, the range of the movement La of the image of the first layer extends from the deepest position to the position adjacent to the depth position of the image of the adjacent second layer.
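This limit amounts to clamping a requested depth between the depths of the neighboring layers. The sketch below illustrates that rule; the margin value is an assumption standing in for "adjacent to".

```python
# Speculative sketch of the movement limit: a layer's depth may not
# pass the depth of its neighboring layers.
def clamp_layer_depth(depths, index, requested, margin=1.0):
    """Clamp the requested depth of layer `index` between its neighbors.

    depths: current layer depths ordered from the deepest layer
            (long-distance view) to the nearest (short-distance view).
    """
    lower = depths[index - 1] + margin if index > 0 else float("-inf")
    upper = depths[index + 1] - margin if index < len(depths) - 1 else float("inf")
    return max(lower, min(requested, upper))

# Example: with depths [-50, -25, 5], the first layer (index 0) can
# move forward only up to just behind the second layer at -25.
```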
FIG. 17 illustrates another display example of the depth bar.
- In this example, in the editing image display 203, the image of the selected layer (in this example, the image of the third layer) is highlighted. In a depth bar 501, only the depth position 502 of the highlighted layer is illustrated. In the example of FIG. 17, the depth bar 501 shows the entire depth range set by the image processing apparatus 100, together with a position display 503 of the virtual display surface.
- In the example of FIG. 17, as for the depth adjustment buttons, only the depth adjustment buttons 511 and 512 for the image of the third layer, which is the selected layer, are displayed. The depth position of the image of the highlighted layer is adjusted by operating the depth adjustment buttons 511 and 512, and the display position of the depth position 502 in the depth bar 501 changes as illustrated by an arrow Ld.
FIG. 18 illustrates another display example of the depth bar.
- In the example of FIG. 18, similar to the example of FIG. 17, the image of the selected layer (in this example, the image of the second layer) is highlighted in the editing image display 203. In a depth bar 601, only the range over which the depth of the image of that layer can be adjusted is displayed with scales, and a depth position 602 is displayed in the depth bar 601. That is, for the depth of the image of the second layer, the inner side is limited by the depth position of the image of the first layer and the front side is limited by the depth position of the image of the third layer. This limited, adjustable range becomes the scale display range of the depth bar 601.
- Therefore, when the depth position is adjusted by operating the depth adjustment buttons 611 and 612, the depth position 602 moves within the range of the displayed depth bar 601, as illustrated by an arrow Le.
FIG. 19 illustrates another display example of the depth bar.
- In the example of FIG. 19, the images of all of the layers are displayed in an overlapped manner in the editing image display 203, and a depth position 702 of the image of the layer whose depth is being adjusted is illustrated in a depth bar 701. In addition, a position 703 of the virtual display surface is indicated. Similar to the example of FIG. 16, depth adjustment buttons 711, 712, 721, 722, 731, and 732 are provided for each layer, and the depth position 702 changes as the depth is adjusted by the button operations, as illustrated by an arrow Lf.
- When the depth of a specific layer is adjusted by operating the depth adjustment buttons 711, 712, 721, 722, 731, and 732, the position of the image of the layer subject to the depth adjustment is indicated by an image frame 704 at its four corners in the editing image display 203.
- At approximately the center of the image, a display 705 of a numerical value (in this example, "−25") indicating the set depth position is shown.
- In this way, a display from which the depth setting can be recognized may be provided in the displayed image.
- 1-5. Display example of a ground surface setting screen
- Next, a flow of a ground surface setting process of an image will be described with reference to a flowchart of
FIG. 20.
- The ground surface setting process starts when the user operates a button (not illustrated in the drawings) to instruct setting of the ground surface in the editing screen illustrated in FIG. 2. That is, as illustrated in FIG. 20, the editing screen generating unit 118 determines whether an operation to instruct setting of the ground surface exists, in a state in which the editing screen is displayed (step S31). While no such operation exists, a waiting state is maintained.
- When it is determined in step S31 that the operation to set the ground surface exists, the images of all of the layers are displayed in an overlapped manner as the editing image display 203, and a slider bar for horizontal line adjustment is displayed vertically on one end of the editing image display 203 (step S32). A slider handle that indicates the position of the horizontal line is displayed on the slider bar, and it is determined whether a drag operation of the slider handle exists (step S33).
- When it is determined that an operation of the slider handle exists, a process for changing the position of the horizontal line is executed according to the operation (step S34). According to the changed position of the horizontal line, the portion below the horizontal line of the image of the innermost layer (first layer) is set as the inclined surface in the three-dimensional space. The setting of the inclined surface is the process already described for FIG. 4, and the inclined surface corresponds to the ground surface.
- Next, it is determined whether a mode to erase the portions below the ground surface in the images of the layers other than the first layer is set (step S35). When this mode is set, the parts of objects at positions below the inclined surface (ground surface) in the images of the layers other than the first layer are erased (step S36).
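Treating the region below the horizontal line as a plane that runs from the first layer's depth at the horizon to the front edge at the bottom of the image, the depth of the ground surface at a given image row can be interpolated linearly. The following sketch rests on that linear-plane assumption and on invented parameter names; it is not the literal computation of the disclosure.

```python
# Sketch of the inclined ground surface: below the horizontal line, depth
# changes linearly from the innermost layer's depth to the front edge.

def ground_depth_at_row(y, horizon_y, image_height, innermost_depth, front_depth=0):
    """Depth of the inclined surface at image row y (rows grow downward)."""
    if y <= horizon_y:
        return innermost_depth  # at or above the horizon the layer stays flat
    t = (y - horizon_y) / (image_height - horizon_y)
    return innermost_depth + t * (front_depth - innermost_depth)

# Halfway between the horizon (row 200) and the bottom (row 400), with the
# first layer at depth -80, the ground surface lies at depth -40:
d = ground_depth_at_row(300, horizon_y=200, image_height=400, innermost_depth=-80)
```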
- When it is determined in step S35 that the mode to erase the portions below the ground surface is not set, it is determined whether an object set to be disposed on the ground surface exists among the objects of the images of the individual layers (step S37). When such an object exists, the position of the lower end of that object is adjusted to the position where the inclined surface crosses the image of the layer in which the object exists (step S38).
- When it is determined in step S33 that no drag operation exists, after the processes of steps S36 and S38 are executed, or when it is determined in step S37 that no object set to be disposed on the ground surface exists, the process returns to the horizontal line slider bar display process of step S32.
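The branch of steps S35 to S38 can be summarized as the following dispatch over the layers other than the first; the helper functions stand for the per-object operations sketched after FIG. 24 and FIG. 25 below, and all names are assumptions.

```python
# Sketch of the S35-S38 branch, assuming each layer carries a list of
# objects with a flag marking those to be disposed on the ground surface.

class LayerObject:
    def __init__(self, name, place_on_ground=False):
        self.name = name
        self.place_on_ground = place_on_ground

def apply_ground_surface(layers, erase_mode):
    """layers[0] is the first (innermost) layer; the rest are processed."""
    for objects in layers[1:]:
        if erase_mode:                     # step S35 -> S36
            for obj in objects:
                erase_below_ground_line(obj)
        else:                              # step S35 -> S37/S38
            for obj in objects:
                if obj.place_on_ground:
                    snap_to_ground_line(obj)

def erase_below_ground_line(obj):
    ...  # erase the part of obj below the ground line (see the FIG. 24 sketch)

def snap_to_ground_line(obj):
    ...  # align obj's lower end with the ground line (see the FIG. 25 sketch)
```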
- Next, a specific display example at the time of adjusting the horizontal line will be described.
-
FIG. 21 illustrates an example where the horizontal line adjustment bar 261, which is the horizontal line slider bar, is displayed in the editing image display 203 of the editing screen. In the horizontal line adjustment bar 261, the ground surface setting position display 262 is indicated. In this example, the ground surface setting position display 262 is matched with the ground line c drawn in the image of the first layer; this matching is performed by the user operation.
- By setting the horizontal line in this way, the ground surface below the horizontal line of the image of the first layer is set as the inclined surface illustrated in FIG. 4. The position of the ground surface may also be displayed in the images of the layers other than the first layer (the images of the second layer and the third layer).
- For example, as illustrated in FIG. 22, a ground surface position display 271, where the image 252 of the second layer and the ground surface (inclined surface) cross, is indicated by a broken line in the editing image display 203 of the editing screen. From this display, it can be determined whether the arrangement position of an object in the image 252 of the layer (in this example, an object d of a tree) is appropriate. That is, when the lower end of the object d of the tree almost matches the ground surface position display 271 illustrated by the broken line, an appropriate three-dimensional image is obtained. When the lower end of the object d is above the ground surface position display 271, the tree appears to float; when it is below, the tree appears to sink into the ground surface. An unnatural three-dimensional image results in both cases. The ground surface position display 271 illustrated in FIG. 22 effectively prevents such an unnatural three-dimensional image.
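The floating-or-sinking check suggested by FIG. 22 amounts to comparing an object's lower end with the row at which the inclined ground surface reaches the layer's depth. The following sketch inverts the linear ground model assumed above; the tolerance and all names are illustrative.

```python
# Sketch of the FIG. 22 check: find the row where the ground surface
# crosses a layer's depth plane, then compare the object's lower end to it.

def ground_row_at_depth(depth, horizon_y, image_height,
                        innermost_depth=-80, front_depth=0):
    """Image row where the inclined ground surface lies at the given depth."""
    t = (depth - innermost_depth) / (front_depth - innermost_depth)
    return horizon_y + t * (image_height - horizon_y)

def classify_object(bottom_y, layer_depth, horizon_y, image_height, tol=2):
    ground_y = ground_row_at_depth(layer_depth, horizon_y, image_height)
    if abs(bottom_y - ground_y) <= tol:
        return "on the ground"   # appropriate placement
    return "floating" if bottom_y < ground_y else "sunk"

# A second-layer object at depth -50 with its lower end on row 275 sits
# exactly on the ground line (horizon at row 200 of a 400-row image):
print(classify_object(bottom_y=275, layer_depth=-50, horizon_y=200, image_height=400))
```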
FIG. 23 illustrates another display example of the position of the ground surface. In the example of FIG. 23, only the portion above the position crossing the ground surface is displayed as the image of the second layer, and the portion below the ground surface is shown as a non-display portion 272 (black display portion). In FIG. 23, the image of the third layer, on the front side of the second layer, is displayed with decreased brightness, for example. Alternatively, the objects e and f in the image of the third layer may not be displayed at all.
FIG. 24 illustrates an example where, in the layers other than the first layer (the long-distance view), the portions of objects below the ground surface (the inclined surface) are erased, and the result is displayed in the editing image display 203 of the editing screen. The process of FIG. 24 corresponds to step S36 of the flowchart of FIG. 20.
- In the example of FIG. 24, a lower part of an object e of a dog in the third layer (the short-distance view) falls below the ground surface. At this time, the object e is displayed with the part below the ground surface erased.
- In this way, the portion of an object that falls below the ground surface is not displayed, which prevents the unnatural appearance of an object existing below the ground surface when the generated image is viewed three-dimensionally. The partial erasure illustrated in FIG. 24 may be applied only in the three-dimensional image display, while the corresponding object is displayed completely when the image of each layer is displayed individually.
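One plausible way to realize this partial erasure is an alpha mask that hides every pixel of the layer below the row where the layer crosses the ground surface; the RGBA array representation in the sketch is an assumption.

```python
# Sketch of the partial erasure of step S36 / FIG. 24 using an alpha mask.

import numpy as np

def erase_below_ground(layer_rgba, ground_y):
    """Return a copy with all rows below ground_y made fully transparent."""
    out = layer_rgba.copy()
    out[int(ground_y):, :, 3] = 0  # zero the alpha channel below the line
    return out

layer = np.full((400, 300, 4), 255, dtype=np.uint8)  # a dummy opaque layer
masked = erase_below_ground(layer, ground_y=275.0)
```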
FIG. 25 illustrates an example of an operation screen for matching the lower end of an object in the image of each layer with the ground surface (the inclined surface), displayed in the editing image display 203 of the editing screen. The process of FIG. 25 corresponds to step S38 of the flowchart of FIG. 20.
- In the example of FIG. 25, an operation screen for the object e of the dog in the third layer (the short-distance view) is illustrated. In this example, as illustrated in FIG. 25, position movement buttons 281 and 282, a returning button 283, an erasure button 284, and a ground surface adjustment button 285 are displayed around the object e.
- The user operates these buttons to modify the position of the object e. When the operation to select the ground surface adjustment button 285 exists, the editing screen generating unit 118 executes a process for automatically matching the position of the lower end of the object e with the position where its layer crosses the ground surface.
- Therefore, by selecting the ground surface adjustment button 285, the object e is automatically disposed on the ground surface and an appropriate three-dimensional image can be generated.
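The automatic alignment itself reduces to a vertical translation that places the object's lower end on the ground crossing row, as in the following short sketch; the top-down coordinate convention is assumed.

```python
# Sketch of the snap performed by the ground surface adjustment button 285.

def snap_bottom_to_ground(height, ground_y):
    """New top coordinate that puts the object's lower end on ground_y."""
    return ground_y - height

new_top = snap_bottom_to_ground(height=60, ground_y=275)  # -> 215
```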
- 1-6. Example where a camera image is captured
-
FIGS. 26 and 27 illustrate an example where a camera image is captured into an image being generated.
- For example, when an operation to select the camera capturing button 222 in the editing screen illustrated in FIG. 2 exists, a camera capturing operation screen 810 is displayed on the editing screen, as illustrated in FIG. 26. A process for reading a camera image from an external camera device (or a storage device where the camera image is stored) connected to the image processing apparatus 100 is executed by an operation using the camera capturing operation screen 810. As illustrated in FIG. 26, the camera capturing image 811 is displayed by this reading. An extraction image 812, in which the background is removed from the camera capturing image 811, is obtained by an operation in the camera capturing operation screen 810.
- By disposing the extraction image 812 on the image of any layer, the extraction image 812 can be placed as one of the objects in the image being generated, as illustrated in FIG. 27. The depth position of the layer where the extraction image is disposed is selected by the user using the camera capturing operation screen 810. Alternatively, the camera capturing image may be automatically disposed on the layer of the most front side (the short-distance view).
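Placing the extraction image on a layer is, in effect, an alpha composite of an RGBA cut-out onto the layer image; the following sketch shows one way to do this, with NumPy arrays assumed as the image representation.

```python
# Sketch of compositing an extracted RGBA cut-out (extraction image 812)
# onto the image of a chosen layer.

import numpy as np

def place_object(layer_rgba, obj_rgba, top, left):
    """Alpha-composite obj_rgba onto layer_rgba in place at (top, left)."""
    h, w = obj_rgba.shape[:2]
    region = layer_rgba[top:top + h, left:left + w]
    alpha = obj_rgba[..., 3:4].astype(np.float32) / 255.0
    region[..., :3] = (alpha * obj_rgba[..., :3]
                       + (1.0 - alpha) * region[..., :3]).astype(region.dtype)
    region[..., 3] = np.maximum(region[..., 3], obj_rgba[..., 3])

layer = np.zeros((400, 300, 4), dtype=np.uint8)
cutout = np.full((50, 40, 4), 200, dtype=np.uint8)
place_object(layer, cutout, top=300, left=100)
```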
- 1-7. Example where an image file is imported
-
FIGS. 28 to 30 illustrate an example where a file image is imported into an image being generated.
- For example, when an operation to select the file importing button 221 in the editing screen illustrated in FIG. 2 exists, an image file importing operation screen 820 is displayed in the editing screen, as illustrated in FIG. 28. A process for reading selected image data from the image file stored in the designated place is executed by an operation using the image file importing operation screen 820. As illustrated in FIG. 28, an imported image 821 is displayed by this reading. As illustrated in FIG. 29, an extraction image 822 that is partially extracted from the imported image 821 is obtained by an operation in the image file importing operation screen 820.
- By disposing the extraction image 822 on the image of any layer, the extraction image 822 can be placed as one of the objects in the generated image, as illustrated in FIG. 30. In this case, the depth position of the layer where the extraction image is disposed is selected by the user using the image file importing operation screen 820.
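The import flow of FIGS. 28 to 30 can be approximated as reading an image file and cropping the region selected on the importing screen; the Pillow-based sketch below is one possible realization, with the file name and crop box invented for illustration.

```python
# Sketch of the import-and-extract flow: read a file (imported image 821)
# and crop the user-selected region (extraction image 822).

from PIL import Image

def import_and_extract(path, box):
    """box = (left, upper, right, lower), as chosen on the importing screen."""
    imported = Image.open(path).convert("RGBA")  # imported image 821
    return imported.crop(box)                    # extraction image 822

# e.g. extraction = import_and_extract("drawing.png", (40, 30, 200, 180))
```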
- 1-8. Display example of a list of generated images
- The three-dimensional image that is generated using the editing screen in the process described above is stored in the image storage unit 120 of the image processing apparatus 100. A list of the stored three-dimensional images can be displayed on one screen.
FIG. 31 illustrates an example where a list of generated images is displayed. In this example, generated images 11, 12, 13, . . . are reduced and displayed. Each image may be displayed as a two-dimensional image or as a three-dimensional image.
- In the columns of the generated image display, number-of-layers displays 11 a, 12 a, 13 a, . . . indicate the number of layers of each image by a figure. For example, a figure in which three images are overlapped is displayed when the number of layers is three.
- By displaying the list of generated images, a generated image can be easily selected. The selected image may then be displayed in the editing screen illustrated in FIG. 2 and editing work may be performed.
- 1-9. Specific example of hardware configuration
- Next, a specific example of the hardware configuration of the image processing apparatus 100 according to this example will be described with reference to FIG. 32.
- FIG. 32 illustrates an example where the image processing apparatus 100 is configured as an information processing device such as a computer device.
- The image processing apparatus 100 mainly includes a CPU 901, a ROM 903, a RAM 905, a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, an image capturing device 918, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- The CPU 901 functions as an operation processing device and a control device and controls all or a part of the operations in the image processing apparatus 100 according to various programs stored in the ROM 903, the RAM 905, the storage device 919, and a removable recording medium 927. The ROM 903 stores programs and operation parameters used by the CPU 901. The RAM 905 temporarily stores programs used in the execution of the CPU 901 and parameters that change appropriately in that execution. These devices are connected to each other by the host bus 907, configured by an internal bus such as a CPU bus.
- The host bus 907 is connected to an external bus 911 such as a peripheral component interconnect/interface (PCI) bus through the bridge 909.
- The input device 915 is an operation unit such as a mouse, a keyboard, a touch panel, a button, a switch, or a lever operated by the user. The input device 915 may also be a remote control unit (a so-called remote controller) using infrared rays or other radio waves, or an external connection apparatus 929 such as a mobile phone or a PDA that is compatible with the operation of the image processing apparatus 100. The input device 915 is configured using an input control circuit that generates an input signal on the basis of information input by the user using the operation unit and outputs the input signal to the CPU 901. The user of the image processing apparatus 100 operates the input device 915 to input various data to the image processing apparatus 100 or to instruct the image processing apparatus 100 to execute a processing operation.
- The output device 917 is configured using a display device such as a liquid crystal display device, a plasma display device, an EL display device, or a lamp, a sound output device such as a speaker or a headphone, or a device such as a printer device, a mobile phone, or a facsimile that can visually or audibly notify the user of acquired information. The output device 917 outputs the results obtained by the various processes executed by the image processing apparatus 100. Specifically, the display device displays those results as text or images, while the sound output device converts an audio signal composed of reproduced sound data or acoustic data into an analog signal and outputs it.
- For example, the image capturing device 918 is provided on the display device, and the image processing apparatus 100 can capture a still image or a moving image of the user with the image capturing device 918. The image capturing device 918 includes a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor, converts light condensed by a lens into an electric signal, and can capture a still image or a moving image.
- The storage device 919 is a data storage device that is configured as an example of a storage unit of the image processing apparatus 100. For example, the storage device 919 is configured using a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 919 stores programs executed by the CPU 901, various data, and acoustic signal data or image signal data acquired from the outside.
- The drive 921 is a reader/writer for a storage medium and is incorporated in the image processing apparatus 100 or attached to its outside. The drive 921 reads information recorded in the mounted removable recording medium 927, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 905. The drive 921 can also record information on the mounted removable recording medium 927. The removable recording medium 927 is a DVD medium, a Blu-ray medium, a CompactFlash (registered trademark) (CF) card, a memory stick, a secure digital (SD) memory card, or the like. The removable recording medium 927 may also be an integrated circuit (IC) card or an electronic apparatus on which a non-contact IC chip is mounted.
- The connection port 923 is a port for directly connecting an apparatus to the image processing apparatus 100, such as a universal serial bus (USB) port, an IEEE 1394 port such as i.Link, a small computer system interface (SCSI) port, an RS-232C port, an optical audio terminal, or a high-definition multimedia interface (HDMI) port. By connecting the external connection apparatus 929 to the connection port 923, the image processing apparatus 100 acquires acoustic signal data or image signal data directly from the external connection apparatus 929, or provides such data to the external connection apparatus 929.
- The communication device 925 is a communication interface configured by a communication device for connection with a communication network 931. The communication device 925 is, for example, a communication card for a wired or wireless local area network (LAN), Bluetooth, or wireless USB (WUSB), a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), or a modem for various kinds of communication. The communication device 925 can transmit and receive signals based on a predetermined protocol such as TCP/IP to and from the Internet or other communication apparatuses. The communication network 931 connected to the communication device 925 is configured by a wired or wireless network; for example, the communication network 931 may be the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
- The example of the hardware configuration that can realize the functions of the image processing apparatus 100 according to this example has been described. The various components may be configured using general-purpose members or by hardware specialized for the functions of the individual components. The hardware used in the configuration may therefore be changed as appropriate according to the technological level at the time this embodiment is carried out.
- A program (software) that executes each process step performed by the image processing apparatus 100 according to this example may be created and deployed on a general-purpose computer device, which can then execute the same processes. The program may be stored in various media or downloaded from a server to the computer device through the Internet.
- Note that the following configurations are within the scope of the present disclosure. (1)
- An image processing apparatus comprising:
-
- an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively;
- a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit;
- a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit;
- an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively; and
- an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
- (2)
- The image processing apparatus according to (1),
-
- wherein the editing screen that is generated by the editing screen generating unit includes the tabs for the individual plane images that display thumbnail images where the plane images corresponding to the individual tabs are reduced, and the plane image corresponding to any tab is displayed on the editing screen by a selection operation of any tab in the input unit.
- (3)
- The image processing apparatus according to (1) or (2),
-
- wherein the editing screen that is generated by the editing screen generating unit includes tabs that display thumbnail images where images showing the virtual distances of the plurality of plane images are reduced.
- (4)
- The image processing apparatus according to any one of (1) to (3),
-
- wherein distance scales showing setting of the virtual distances of the plane images and operation positions to operate the virtual distances of the plane images by the input unit are displayed on the editing screen that is generated by the editing screen generating unit.
- (5)
- The image processing apparatus according to any one of (1) to (4),
-
- wherein, in the distance scales, settings of the distances of the plurality of plane images are distinguished and displayed, and the operation positions are prepared for every plane image.
- (6)
- The image processing apparatus according to any one of (1) to (5),
-
- wherein displaying to indicate the position of a virtual display surface is performed in the distance scales.
- (7)
- The image processing apparatus according to any one of (1) to (6),
-
- wherein the three-dimensional image converting unit converts the plurality of plane images into a three-dimensional image where an image portion of a lower side from the horizontal line position set to one specific plane image of the plural plane images becomes an inclined surface gradually changing from the virtual distance.
- (8)
- The image processing apparatus according to any one of (1) to (7),
-
- wherein horizontal line position scales showing setting of the horizontal line position are displayed on the editing screen generated by the editing screen generating unit and the horizontal line position shown by the horizontal line position scales is changed by receiving an operation in the input unit.
- (9)
- The image processing apparatus according to any one of (1) to (8),
-
- wherein the plane images other than the specific plane image display the position crossing the inclined surface in the three-dimensional image, on the editing screen generated by the editing screen generating unit.
- (10)
- The image processing apparatus according to any one of (1) to (9),
-
- wherein the three-dimensional image converting unit erases an object that becomes the lower side of the position crossing the inclined surface in the three-dimensional image, with respect to the plane images other than the specific plane image.
- (11)
- The image processing apparatus according to any one of (1) to (10),
-
- wherein the image generating unit sets a lower end of a designated object in the plane images other than the specific plane image to the position matched with the position crossing the inclined surface in the three-dimensional image.
- (12)
- An image processing apparatus comprising:
-
- an image control unit that controls an image displayed on a display unit;
- a three-dimensional image generating unit that generates a three-dimensional image where the space positions of objects of a plurality of plane images are set, from the plurality of plane images having the virtual distances in a depth direction, respectively; and
- an input unit that receives an operation from a user,
- wherein the image control unit displays thumbnail images in which the plurality of plane images and overlapped images, in which the plane images are displayed in an overlapped manner at a predetermined angle in a depth direction, are reduced, and the image control unit displays the plane image or the overlapped image corresponding to the selected thumbnail image in an editable state along with the thumbnail images of the plurality of plane images and the overlapped images, when the input unit receives a command selecting the thumbnail image.
- (13)
- The image processing apparatus according to (12),
-
- wherein the image control unit displays the plane images and the overlapped images in an overlapped manner and displays the thumbnail images as tabs of the plurality of plane images and the overlapped images, respectively.
- (14)
- The image processing apparatus according to (12) or (13),
-
- wherein the image control unit further displays a tab to receive an addition of the plane images.
- (15)
- The image processing apparatus according to any one of (12) to (14),
-
- wherein the image control unit displays the overlapped image on a front surface and displays only the tabs of the non-selected plane images, when the thumbnail image corresponding to the overlapped image is selected, and
- the input unit accepts an input to change the virtual distance of the selected plane image in a depth direction, among the plane images displayed in the overlapped image.
- (16)
- The image processing apparatus according to any one of (12) to (15),
-
- wherein the image control unit displays a screen where only an image of the plane image in the screen is highlighted on a front surface, when the thumbnail image corresponding to the plane image is selected.
- (17)
- The image processing apparatus according to any one of (12) to (16),
-
- wherein the image control unit displays a screen where images of the non-selected plane images in the screen are grayed out on a front surface, when the thumbnail image corresponding to the plane image is selected.
- (18)
- The image processing apparatus according to any one of (12) to (17),
-
- wherein the image control unit displays distance scales showing the virtual distances of the plane images in the depth direction and displays a depth change operation input unit to operate the virtual distances of the plane images displayed in an editable state in the depth direction by the input unit, and
- the input unit accepts an operation to generate or edit the images in the plane images and accepts an operation with respect to the depth change operation input unit, when the plane images are displayed in the editable state.
- (19)
- The image processing apparatus according to any one of (12) to (18),
-
- wherein the depth change operation input unit has a button to move the virtual distances of the plane images displayed in the editable state in the depth direction to the front side and a button to move the virtual distances to the inner side.
- (20)
- The image processing apparatus according to any one of (12) to (19),
-
- wherein the depth change operation input unit is an object that corresponds to the plane image displayed on the distance scales in the editable state.
- (21)
- The image processing apparatus according to any one of (12) to (20),
-
- wherein the image control unit displays a horizontal line change operation input unit to operate the horizontal line position set in the plane image displayed in an editable state by the input unit, and
- the input unit accepts an operation with respect to the horizontal line change operation input unit, when the plane image is displayed in an editable state.
- (22)
- The image processing apparatus according to any one of (12) to (21),
-
- wherein the image control unit does not display an image of the lower side of the horizontal line position set in the plane image displayed in the editable state.
- (23)
- The image processing apparatus according to any one of (12) to (22),
-
- wherein the three-dimensional image generating unit does not reflect an image of the lower side of the horizontal line position set in each plane image to a generated three-dimensional image.
- (24)
- The image processing apparatus according to any one of (12) to (23),
-
- wherein the image control unit performs a control operation to reflect the edit result on the thumbnail image of the overlapped image, when the plane image corresponding to the selected thumbnail image is edited.
- (25)
- The image processing apparatus according to any one of (12) to (24), further comprising:
-
- the display unit.
- (26)
- An image processing method comprising:
-
- generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively;
- converting the plurality of plane images into a three-dimensional image where space positions of objects in each of the plurality of plane images are set, based on the
- virtual distances set to the plurality of generated plane images;
- outputting data of the converted three-dimensional image;
- displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and
- accepting an operation to generate or edit images in the generated editing screen.
- (27)
- A program that causes a computer to execute an image process, the program causing the computer to execute:
-
- generating a plurality of plane images and setting the virtual distances of a depth direction to the plurality of generated plane images, respectively;
- converting the plurality of plane images into a three-dimensional image where the space positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images;
- outputting data of the converted three-dimensional image;
- displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and
- accepting an operation to generate or edit images in the generated editing screen.
- a to f display object
- 11, 12, 13 generated image
- 11 a, 12 a, 13 a number of layers display
- 100 image processing apparatus
- 110 image generating/processing unit
- 112 image generating unit
- 114 three-dimensional image converting unit
- 116 three-dimensional image generating unit
- 118 editing screen generating unit
- 120 image storage unit
- 130 input unit
- 140 image display unit
- 201 tab
- 202 3D state thumbnail tab
- 203 editing image display
- 203 a first layer edge portion
- 203 b second layer edge portion
- 203 c third layer edge portion
- 211 to 213 layer thumbnail tab
- 221 file importing button
- 222 camera capturing button
- 223 stamp button
- 224 character input button
- 225 depth operation button
- 231 to 237 object display unit
- 241 generation start button
- 242 save button
- 243 three-dimensional display button
- 250L image for left eye
- 250R image for right eye
- 251, 251′ first layer image
- 252, 252′ second layer image
- 253, 253′ third layer image
- 259, 259′ display surface
- 261 horizontal line adjustment bar
- 262 ground surface setting position display
- 271 ground surface position display in layer
- 272 non-display portion
- 281, 282 position movement button
- 283 returning button
- 284 erasure button
- 285 ground surface adjustment button
- 290 pen tool
- 291 first pen
- 292 second pen
- 293 third pen
- 294 eraser
- 301 depth axis
- 301 a to 301 d depth position
- 302 front edge portion
- 303 ground surface setting position
- 304 virtual display surface
- 311 first layer image
- 312 second layer image
- 313 third layer image
- 321 horizontal line position
- 401 depth bar display
- 401 a, 401 b, 401 c layer position
- 411, 412, 421, 422, 431, 432 depth adjustment button
- 501 depth bar display
- 502 virtual display surface position
- 503 layer position
- 511, 512 depth adjustment button
- 601 depth bar display
- 603 layer position
- 611, 612 depth adjustment button
- 701 depth bar display
- 702 virtual display surface position
- 703 layer position
- 704 depth frame
- 705 depth value display
- 711, 712, 721, 722, 731, 732 depth adjustment button
- 810 camera capturing operation screen
- 811 camera capturing image
- 812 extraction image
- 820 image file importing operation screen
- 821 imported image
- 822 extraction image
Claims (27)
1. An image processing apparatus comprising:
an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively;
a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit;
a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit;
an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively; and
an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
2. The image processing apparatus according to claim 1 ,
wherein the editing screen that is generated by the editing screen generating unit includes the tabs for the individual plane images that display thumbnail images where the plane images corresponding to the individual tabs are reduced, and the plane image corresponding to any tab is displayed on the editing screen by a selection operation of any tab in the input unit.
3. The image processing apparatus according to claim 2 ,
wherein the editing screen that is generated by the editing screen generating unit includes tabs that display thumbnail images where images showing the virtual distances of the plurality of plane images are reduced.
4. The image processing apparatus according to claim 1 ,
wherein distance scales showing setting of the virtual distances of the plane images and operation positions to operate the virtual distances of the plane images by the input unit are displayed on the editing screen that is generated by the editing screen generating unit.
5. The image processing apparatus according to claim 4 ,
wherein, in the distance scales, settings of the distances of the plurality of plane images are distinguished and displayed, and the operation positions are prepared for every plane image.
6. The image processing apparatus according to claim 4 ,
wherein displaying to indicate the position of a virtual display surface is performed in the distance scales.
7. The image processing apparatus according to claim 1 ,
wherein the three-dimensional image converting unit converts the plurality of plane images into a three-dimensional image where an image portion of a lower side from the horizontal line position set to one specific plane image of the plural plane images becomes an inclined surface gradually changing from the virtual distance.
8. The image processing apparatus according to claim 7 ,
wherein horizontal line position scales showing setting of the horizontal line position are displayed on the editing screen generated by the editing screen generating unit and the horizontal line position shown by the horizontal line position scales is changed by receiving an operation in the input unit.
9. The image processing apparatus according to claim 8 ,
wherein the plane images other than the specific plane image display the position crossing the inclined surface in the three-dimensional image, on the editing screen generated by the editing screen generating unit.
10. The image processing apparatus according to claim 8 ,
wherein the three-dimensional image converting unit erases an object that becomes the lower side of the position crossing the inclined surface in the three-dimensional image, with respect to the plane images other than the specific plane image.
11. The image processing apparatus according to claim 8 ,
wherein the image generating unit sets a lower end of a designated object in the plane images other than the specific plane image to the position matched with the position crossing the inclined surface in the three-dimensional image.
12. An image processing apparatus comprising:
an image control unit that controls an image displayed on a display unit;
a three-dimensional image generating unit that generates a three-dimensional image where the space positions of objects of a plurality of plane images are set, from the plurality of plane images having the virtual distances in a depth direction, respectively; and
an input unit that receives an operation from a user,
wherein the image control unit displays thumbnail images in which the plurality of plane images and overlapped images, in which the plane images are displayed in an overlapped manner at a predetermined angle in a depth direction, are reduced, and the image control unit displays the plane image or the overlapped image corresponding to the selected thumbnail image in an editable state along with the thumbnail images of the plurality of plane images and the overlapped images, when the input unit receives a command selecting the thumbnail image.
13. The image processing apparatus according to claim 12 ,
wherein the image control unit displays the plane images and the overlapped images in an overlapped manner and displays the thumbnail images as tabs of the plurality of plane images and the overlapped images, respectively.
14. The image processing apparatus according to claim 13 ,
wherein the image control unit further displays a tab to receive an addition of the plane images.
15. The image processing apparatus according to claim 12 ,
wherein the image control unit displays the overlapped image on a front surface and displays only the tabs of the non-selected plane images, when the thumbnail image corresponding to the overlapped image is selected, and the input unit accepts an input to change the virtual distance of the selected plane image in a depth direction, among the plane images displayed in the overlapped image.
16. The image processing apparatus according to claim 12 ,
wherein the image control unit displays a screen where only an image of the plane image in the screen is highlighted on a front surface, when the thumbnail image corresponding to the plane image is selected.
17. The image processing apparatus according to claim 12 ,
wherein the image control unit displays a screen where images of the non-selected plane images in the screen are grayed out on a front surface, when the thumbnail image corresponding to the plane image is selected.
18. The image processing apparatus according to claim 12 ,
wherein the image control unit displays distance scales showing the virtual distances of the plane images in the depth direction and displays a depth change operation input unit to operate the virtual distances of the plane images displayed in an editable state in the depth direction by the input unit, and
the input unit accepts an operation to generate or edit the images in the plane images and accepts an operation with respect to the depth change operation input unit, when the plane images are displayed in the editable state.
19. The image processing apparatus according to claim 18 ,
wherein the depth change operation input unit has a button to move the virtual distances of the plane images displayed in the editable state in the depth direction to the front side and a button to move the virtual distances to the inner side.
20. The image processing apparatus according to claim 18 ,
wherein the depth change operation input unit is an object that corresponds to the plane image displayed on the distance scales in the editable state.
21. The image processing apparatus according to claim 12 ,
wherein the image control unit displays a horizontal line change operation input unit to operate the horizontal line position set in the plane image displayed in an editable state by the input unit, and
the input unit accepts an operation with respect to the horizontal line change operation input unit, when the plane image is displayed in an editable state.
22. The image processing apparatus according to claim 21 ,
wherein the image control unit does not display an image of the lower side of the horizontal line position set in the plane image displayed in the editable state.
23. The image processing apparatus according to claim 21 ,
wherein the three-dimensional image generating unit does not reflect an image of the lower side of the horizontal line position set in each plane image to a generated three-dimensional image.
24. The image processing apparatus according to claim 13 ,
wherein the image control unit performs a control operation to reflect the edit result on the thumbnail image of the overlapped image, when the plane image corresponding to the selected thumbnail image is edited.
25. The image processing apparatus according to claim 12 , further comprising:
the display unit.
26. An image processing method comprising:
generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively;
converting the plurality of plane images into a three-dimensional image where space positions of objects in each of the plurality of plane images are set, based on the virtual distances set to the plurality of generated plane images;
outputting data of the converted three-dimensional image;
displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and
accepting an operation to generate or edit images in the generated editing screen.
27. A program that causes a computer to execute an image process, the program causing the computer to execute:
generating a plurality of plane images and setting the virtual distances of a depth direction to the plurality of generated plane images, respectively;
converting the plurality of plane images into a three-dimensional image where the space positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images;
outputting data of the converted three-dimensional image;
displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and
accepting an operation to generate or edit images in the generated editing screen.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010219867 | 2010-09-29 | ||
| JP2011-126792 | 2011-06-06 | ||
| JP2011126792A JP2012094111A (en) | 2010-09-29 | 2011-06-06 | Image processing device, image processing method and program |
| PCT/JP2012/001012 WO2012169097A1 (en) | 2011-06-06 | 2012-02-16 | Image processing apparatus, image processing method, and program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20140125661A1 true US20140125661A1 (en) | 2014-05-08 |
Family
ID=44785456
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/205,913 Expired - Fee Related US9741152B2 (en) | 2010-09-29 | 2011-08-09 | Image processing apparatus, image processing method, and computer program |
| US14/122,361 Abandoned US20140125661A1 (en) | 2010-09-29 | 2012-02-16 | Image processing apparatus, image processing method, and program |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/205,913 Expired - Fee Related US9741152B2 (en) | 2010-09-29 | 2011-08-09 | Image processing apparatus, image processing method, and computer program |
Country Status (6)
| Country | Link |
|---|---|
| US (2) | US9741152B2 (en) |
| EP (1) | EP2437503A3 (en) |
| JP (1) | JP2012094111A (en) |
| KR (1) | KR20120033246A (en) |
| CN (1) | CN102438164B (en) |
| TW (1) | TWI477141B (en) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130194594A1 (en) * | 2012-01-31 | 2013-08-01 | Seiko Epson Corporation | Printing device and method of producing printing material |
| US20150116465A1 (en) * | 2013-10-28 | 2015-04-30 | Ray Wang | Method and system for providing three-dimensional (3d) display of two-dimensional (2d) information |
| US20160198097A1 (en) * | 2015-01-05 | 2016-07-07 | GenMe, Inc. | System and method for inserting objects into an image or sequence of images |
| KR20200083130A (en) * | 2018-12-31 | 2020-07-08 | 한국전자통신연구원 | Apparatus and method for generating 3d geographical data |
| US20200293791A1 (en) * | 2016-10-28 | 2020-09-17 | Axon Enterprise, Inc. | Identifying and redacting captured data |
| US10810776B2 (en) | 2016-11-28 | 2020-10-20 | Sony Corporation | Image processing device and image processing method |
Families Citing this family (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2013005376A (en) * | 2011-06-21 | 2013-01-07 | Office Sansara Co Ltd | Image processing program, image processing apparatus, and image processing method |
| US20130265296A1 (en) * | 2012-04-05 | 2013-10-10 | Wing-Shun Chan | Motion Activated Three Dimensional Effect |
| US9610449B2 (en) * | 2013-05-16 | 2017-04-04 | Nuvectra Corporation | Method and apparatus for displaying a graphical impedance history for output channels of a lead |
| US20160180584A1 (en) * | 2013-06-26 | 2016-06-23 | Lucid Global, Llc. | Virtual model user interface pad |
| US20150033157A1 (en) * | 2013-07-25 | 2015-01-29 | Mediatek Inc. | 3d displaying apparatus and the method thereof |
| US20150130800A1 (en) | 2013-11-12 | 2015-05-14 | Fyusion, Inc. | Segmentation of surround view data |
| US20150215530A1 (en) * | 2014-01-27 | 2015-07-30 | Microsoft Corporation | Universal capture |
| US10334221B2 (en) * | 2014-09-15 | 2019-06-25 | Mantisvision Ltd. | Methods circuits devices systems and associated computer executable code for rendering a hybrid image frame |
| JP6525617B2 (en) * | 2015-02-03 | 2019-06-05 | キヤノン株式会社 | Image processing apparatus and control method thereof |
| US20170155886A1 (en) * | 2015-06-24 | 2017-06-01 | Derek John Hartling | Colour-Z: Low-D Loading to High-D Processing |
| JP6784115B2 (en) * | 2016-09-23 | 2020-11-11 | コニカミノルタ株式会社 | Ultrasound diagnostic equipment and programs |
| JP6789833B2 (en) * | 2017-01-26 | 2020-11-25 | キヤノン株式会社 | Image processing equipment, imaging equipment, image processing methods and programs |
| US10735707B2 (en) | 2017-08-15 | 2020-08-04 | International Business Machines Corporation | Generating three-dimensional imagery |
| DE102018130640A1 (en) * | 2017-12-18 | 2019-06-19 | Löwenstein Medical Technology S.A. Luxembourg | Virtual operation |
| JP6532094B1 (en) * | 2018-04-06 | 2019-06-19 | 株式会社アクセル | Display processing apparatus, display processing method, and program |
| US11989398B2 (en) * | 2021-10-22 | 2024-05-21 | Ebay Inc. | Digital content view control system |
| US12462441B2 (en) | 2023-03-20 | 2025-11-04 | Sony Interactive Entertainment Inc. | Iterative image generation from text |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6130676A (en) * | 1998-04-02 | 2000-10-10 | Avid Technology, Inc. | Image composition system and process using layers |
| US6404443B1 (en) * | 1999-08-25 | 2002-06-11 | Sharp Laboratories Of America | Three-dimensional graphical user interface for managing screen objects |
| US20020140736A1 (en) * | 2001-03-28 | 2002-10-03 | Ulead Systems, Inc. | Method for manipulating multiple multimedia objects |
| US20070060346A1 (en) * | 2005-06-28 | 2007-03-15 | Samsung Electronics Co., Ltd. | Tool for video gaming system and method |
| US7928981B2 (en) * | 2005-04-22 | 2011-04-19 | Tektronix International Sales Gmbh | Signal generator display interface for instinctive operation of editing waveform parameters |
| US20110158504A1 (en) * | 2009-12-31 | 2011-06-30 | Disney Enterprises, Inc. | Apparatus and method for indicating depth of one or more pixels of a stereoscopic 3-d image comprised from a plurality of 2-d layers |
Family Cites Families (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3182321B2 (en) * | 1994-12-21 | 2001-07-03 | 三洋電機株式会社 | Generation method of pseudo stereoscopic video |
| US6208348B1 (en) * | 1998-05-27 | 2001-03-27 | In-Three, Inc. | System and method for dimensionalization processing of images in consideration of a pedetermined image projection format |
| WO2002013143A1 (en) * | 2000-08-04 | 2002-02-14 | Dynamic Digital Depth Research Pty Ltd. | Image conversion and encoding technique |
| JP4729812B2 (en) * | 2001-06-27 | 2011-07-20 | ソニー株式会社 | Image processing apparatus and method, recording medium, and program |
| US7365875B2 (en) * | 2002-05-14 | 2008-04-29 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, program, and recording medium |
| CA2511040A1 (en) * | 2004-09-23 | 2006-03-23 | The Governors Of The University Of Alberta | Method and system for real time image rendering |
| US9395905B2 (en) * | 2006-04-05 | 2016-07-19 | Synaptics Incorporated | Graphical scroll wheel |
| US8972902B2 (en) * | 2008-08-22 | 2015-03-03 | Northrop Grumman Systems Corporation | Compound gesture recognition |
| KR101539935B1 (en) * | 2008-06-24 | 2015-07-28 | 삼성전자주식회사 | Method and apparatus for processing 3D video image |
| KR20100041006A (en) * | 2008-10-13 | 2010-04-22 | 엘지전자 주식회사 | A user interface controlling method using three dimension multi-touch |
| WO2010084724A1 (en) * | 2009-01-21 | 2010-07-29 | 株式会社ニコン | Image processing device, program, image processing method, recording method, and recording medium |
| JP2010210712A (en) | 2009-03-06 | 2010-09-24 | Sony Corp | Image display apparatus, image display observation system, and image display method |
| JP2010219867A (en) | 2009-03-17 | 2010-09-30 | Sharp Corp | Print job processing apparatus and image processing system |
| US20110107216A1 (en) * | 2009-11-03 | 2011-05-05 | Qualcomm Incorporated | Gesture-based user interface |
| US8520935B2 (en) * | 2010-02-04 | 2013-08-27 | Sony Corporation | 2D to 3D image conversion based on image content |
| TWM382675U (en) * | 2010-02-09 | 2010-06-11 | Chunghwa Telecom Co Ltd | Video camera control device based on gesture recognition |
-
2011
- 2011-06-06 JP JP2011126792A patent/JP2012094111A/en not_active Withdrawn
- 2011-08-09 US US13/205,913 patent/US9741152B2/en not_active Expired - Fee Related
- 2011-09-07 TW TW100132251A patent/TWI477141B/en not_active IP Right Cessation
- 2011-09-21 KR KR1020110095134A patent/KR20120033246A/en not_active Withdrawn
- 2011-09-22 EP EP11182290.4A patent/EP2437503A3/en not_active Withdrawn
- 2011-09-29 CN CN201110293323.0A patent/CN102438164B/en not_active Expired - Fee Related
-
2012
- 2012-02-16 US US14/122,361 patent/US20140125661A1/en not_active Abandoned
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6130676A (en) * | 1998-04-02 | 2000-10-10 | Avid Technology, Inc. | Image composition system and process using layers |
| US6404443B1 (en) * | 1999-08-25 | 2002-06-11 | Sharp Laboratories Of America | Three-dimensional graphical user interface for managing screen objects |
| US20020140736A1 (en) * | 2001-03-28 | 2002-10-03 | Ulead Systems, Inc. | Method for manipulating multiple multimedia objects |
| US7928981B2 (en) * | 2005-04-22 | 2011-04-19 | Tektronix International Sales Gmbh | Signal generator display interface for instinctive operation of editing waveform parameters |
| US20070060346A1 (en) * | 2005-06-28 | 2007-03-15 | Samsung Electronics Co., Ltd. | Tool for video gaming system and method |
| US20110158504A1 (en) * | 2009-12-31 | 2011-06-30 | Disney Enterprises, Inc. | Apparatus and method for indicating depth of one or more pixels of a stereoscopic 3-d image comprised from a plurality of 2-d layers |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130194594A1 (en) * | 2012-01-31 | 2013-08-01 | Seiko Epson Corporation | Printing device and method of producing printing material |
| US9007652B2 (en) * | 2012-01-31 | 2015-04-14 | Seiko Epson Corporation | Printing device and method of producing printing material |
| US20150116465A1 (en) * | 2013-10-28 | 2015-04-30 | Ray Wang | Method and system for providing three-dimensional (3d) display of two-dimensional (2d) information |
| US9667948B2 (en) * | 2013-10-28 | 2017-05-30 | Ray Wang | Method and system for providing three-dimensional (3D) display of two-dimensional (2D) information |
| US20160198097A1 (en) * | 2015-01-05 | 2016-07-07 | GenMe, Inc. | System and method for inserting objects into an image or sequence of images |
| US20200293791A1 (en) * | 2016-10-28 | 2020-09-17 | Axon Enterprise, Inc. | Identifying and redacting captured data |
| US12380235B2 (en) * | 2016-10-28 | 2025-08-05 | Axon Enterprise, Inc. | Identifying and redacting captured data |
| US10810776B2 (en) | 2016-11-28 | 2020-10-20 | Sony Corporation | Image processing device and image processing method |
| KR20200083130A (en) * | 2018-12-31 | 2020-07-08 | 한국전자통신연구원 | Apparatus and method for generating 3d geographical data |
| KR102454180B1 (en) * | 2018-12-31 | 2022-10-14 | 한국전자통신연구원 | Apparatus and method for generating 3d geographical data |
Also Published As
| Publication number | Publication date |
|---|---|
| US9741152B2 (en) | 2017-08-22 |
| US20120075290A1 (en) | 2012-03-29 |
| TWI477141B (en) | 2015-03-11 |
| TW201218746A (en) | 2012-05-01 |
| CN102438164A (en) | 2012-05-02 |
| EP2437503A3 (en) | 2014-02-26 |
| KR20120033246A (en) | 2012-04-06 |
| JP2012094111A (en) | 2012-05-17 |
| EP2437503A2 (en) | 2012-04-04 |
| CN102438164B (en) | 2015-10-28 |
Similar Documents
| Publication | Title |
|---|---|
| US20140125661A1 (en) | Image processing apparatus, image processing method, and program |
| EP3642802B1 (en) | Apparatus for editing image using depth map and method thereof |
| EP2333640A1 (en) | Method and system for adaptive viewport for a mobile device based on viewing angle |
| CN102193771B (en) | Conference system, information processing apparatus, and display method |
| EP3672227A1 (en) | Dynamic region of interest adaptation and image capture device providing same |
| GB2571431A (en) | Electronic apparatus and method for controlling the same |
| US20170347038A1 (en) | Control apparatus, imaging system, control method, and recording medium |
| JP2014197824A5 (en) | |
| CN104284064A (en) | Method and apparatus for previewing a dual-shot image |
| JPWO2013054462A1 (en) | User interface control device, user interface control method, computer program, and integrated circuit |
| JP2009053539A (en) | Information display device, information display method, and program |
| KR20130037998A (en) | Display apparatus and display method thereof |
| TW201301130A (en) | Image processing apparatus and method, and computer program product |
| RU2740119C1 (en) | Display control device, image forming device, control method and computer-readable medium |
| US20120092457A1 (en) | Stereoscopic image display apparatus |
| JP2012216095A (en) | Detection area magnifying device, display device, detection area magnifying method, program, and computer-readable recording medium |
| WO2012169097A1 (en) | Image processing apparatus, image processing method, and program |
| KR102160038B1 (en) | Mobile terminal and method for controlling the same |
| JP2015089021A (en) | Imaging device, imaging control method, and program |
| JP7719622B2 (en) | Electronic device and control method thereof |
| JP2022191143A (en) | Image processing device and image processing method |
| US20200257439A1 (en) | Electronic apparatus, control method of electronic apparatus, and non-transitory computer readable medium |
| US20190287489A1 (en) | Head-mounted display apparatus and method for controlling head-mounted display apparatus |
| US11750916B2 (en) | Image processing apparatus, image processing method, and non-transitory computer readable medium |
| KR102541173B1 (en) | Terminal and method for controlling the same |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: SONY CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUROSAKI, DAISUKE;ODA, YASUMASA;HIGUCHI, HANAE;SIGNING DATES FROM 20131015 TO 20131017;REEL/FRAME:031997/0144 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |