
WO2014002790A1 - Image processing device, image processing method, and program - Google Patents

Image processing device, image processing method, and program

Info

Publication number
WO2014002790A1
WO2014002790A1 (PCT/JP2013/066435, JP2013066435W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
processing unit
unit
viewpoint
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2013/066435
Other languages
French (fr)
Japanese (ja)
Inventor
一 若林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of WO2014002790A1
Anticipated expiration (legal status: Critical)
Current legal status: Ceased (Critical)

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183On-screen display [OSD] information, e.g. subtitles or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels

Definitions

  • The present disclosure relates to an image processing device, an image processing method, and a program, and more particularly to an image processing device, an image processing method, and a program capable of more quickly superimposing icons and the like on 3D images.
  • Conventionally, there is an OSD (On Screen Display) superimposition technique that superimposes an icon or the like on a 3D image using a single LSI (Large Scale Integration) (see, for example, Patent Documents 1 and 2).
  • Here, a 3D image is an image that the viewer perceives as a stereoscopic image, and is composed of, for example, a two-dimensional image for the right eye to be viewed by the viewer's right eye and a two-dimensional image for the left eye to be viewed by the viewer's left eye.
  • For this reason, the superimposition of an icon or the like on a 3D image is realized by superimposing the icon or the like on each of the right-eye two-dimensional image and the left-eye two-dimensional image constituting the 3D image.
  • According to the conventional OSD superimposition technique, however, the superimposition is performed by a single LSI and therefore cannot be performed particularly quickly.
  • The present disclosure has been made in view of such a situation, and aims to superimpose icons and the like on 3D images more quickly.
  • An image processing apparatus according to one aspect of the present disclosure includes: a main processing unit that superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; a sub processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
  • A first output unit that outputs the first viewpoint image to the main processing unit and a second output unit that outputs the second viewpoint image to the sub processing unit may be further provided.
  • The main processing unit may communicate a synchronization signal for performing processing in synchronization with the main processing unit to the sub processing unit via a dedicated communication path, and the sub processing unit may superimpose the first superimposition data on the second viewpoint image in synchronization with the synchronization signal communicated from the main processing unit via the dedicated communication path.
  • The main processing unit may include a first holding unit that holds the plurality of pieces of superimposition data in advance, a first changing unit that changes an internal state of the main processing unit based on a user input operation, and a first superimposing unit that superimposes, on the first viewpoint image, the first superimposition data indicating the internal state of the main processing unit changed by the first changing unit, among the plurality of pieces of superimposition data held in the first holding unit.
  • The sub processing unit may include a second holding unit that holds the plurality of pieces of superimposition data in advance, a second changing unit that changes the internal state of the sub processing unit to the same internal state as the main processing unit in synchronization with a change in the internal state of the main processing unit, and a second superimposing unit that superimposes, on the second viewpoint image, the first superimposition data indicating the internal state of the sub processing unit changed by the second changing unit, among the plurality of pieces of superimposition data held in the second holding unit.
  • The image processing apparatus may be provided with a plurality of the sub processing units, and the 3D image generation unit may generate a 3D image composed of the first viewpoint image after superimposition and the second viewpoint images after superimposition obtained from the respective sub processing units.
  • An image processing method according to one aspect of the present disclosure is an image processing method for an image processing apparatus that generates a 3D image, the image processing apparatus including a main processing unit, a sub processing unit, and a 3D image generation unit. The method includes steps in which: the main processing unit superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; the sub processing unit, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and the 3D image generation unit generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
  • A program according to one aspect of the present disclosure causes a computer to function as: a main processing unit that superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; a sub processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
  • According to one aspect of the present disclosure, the first superimposition data, among the plurality of pieces of superimposition data held in advance, is superimposed on the first viewpoint image obtained from the first viewpoint; in synchronization with the main processing unit, the sub processing unit superimposes the first superimposition data, among the plurality of pieces of superimposition data it holds in advance, on the second viewpoint image obtained from the second viewpoint different from the first viewpoint; and a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition is generated.
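As a rough, non-authoritative illustration of this arrangement, the Python sketch below models a main unit and a sub unit that each hold the same set of OSD bitmaps in advance, so that only a short state identifier, rather than any image data, has to be shared between them before each unit stamps its own viewpoint image. The class and function names, the bitmap format, and the side-by-side packing are assumptions made purely for this sketch.

```python
import numpy as np

class ProcessingUnit:
    """Holds every OSD bitmap in advance; a state key selects which one to superimpose."""
    def __init__(self, osd_bitmaps):
        self.osd_bitmaps = osd_bitmaps          # {state_name: uint8 RGB bitmap}
        self.state = "idle"

    def set_state(self, state):                 # changed in sync on both units
        self.state = state

    def superimpose(self, viewpoint_image):
        osd = self.osd_bitmaps[self.state]
        out = viewpoint_image.copy()
        out[:osd.shape[0], :osd.shape[1]] = osd  # simple paste; blending is sketched later
        return out

def generate_3d_image(right_eye, left_eye):
    """One possible 3D packing: place the two superimposed views side by side."""
    return np.concatenate([left_eye, right_eye], axis=1)

# Both units hold identical OSD data in advance, so no bitmap ever crosses between them.
osd_set = {"idle": np.zeros((48, 48, 3), np.uint8),
           "recording": np.full((48, 48, 3), 255, np.uint8)}
main_unit, sub_unit = ProcessingUnit(osd_set), ProcessingUnit(osd_set)

right = np.zeros((480, 640, 3), np.uint8)        # stand-ins for the two captured viewpoints
left = np.zeros((480, 640, 3), np.uint8)
main_unit.set_state("recording"); sub_unit.set_state("recording")
frame_3d = generate_3d_image(main_unit.superimpose(right), sub_unit.superimpose(left))
```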
  • FIG. 1 shows a configuration example of an image processing apparatus 1 according to the present embodiment.
  • The image processing apparatus 1 is, for example, a digital camera that captures 3D images, and includes lenses 21-1 and 21-2, imaging elements 22-1 and 22-2, a main processing unit 23-1, a sub processing unit 23-2, an operation unit 24, a data storage unit 25, relay units 26 and 27, and a generation display unit 28.
  • In the present embodiment, the image processing apparatus 1 is described as a digital camera that captures 3D images, but it is not limited to a digital camera; the present technology can be applied to any apparatus that performs processing for superimposing icons and the like on 3D images.
  • The image processing apparatus 1 suppresses the amount of data communicated between the main processing unit 23-1 and the sub processing unit 23-2, and thereby superimposes OSD data on 3D images quickly.
  • Here, a 3D image is an image recognized by the user (viewer) as a stereoscopic image, and is composed of, for example, a two-dimensional image for the right eye to be viewed by the user's right eye and a two-dimensional image for the left eye to be viewed by the user's left eye.
  • The 3D image may be composed not only of two two-dimensional images (the right-eye two-dimensional image and the left-eye two-dimensional image) but also of three or more two-dimensional images obtained from different viewpoints.
  • For simplicity of description, however, the 3D image is assumed here to be composed of two two-dimensional images, namely the right-eye two-dimensional image and the left-eye two-dimensional image.
  • That is, the main processing unit 23-1 and the sub processing unit 23-2 each hold a plurality of pieces of OSD data in advance, so that OSD data is not exchanged between the main processing unit 23-1 and the sub processing unit 23-2.
  • In addition, the imaging element 22-1 outputs the right-eye two-dimensional image to the main processing unit 23-1, and the imaging element 22-2 outputs the left-eye two-dimensional image to the sub processing unit 23-2, so that the right-eye two-dimensional image and the left-eye two-dimensional image are not exchanged between the main processing unit 23-1 and the sub processing unit 23-2 either.
  • In this way, the image processing apparatus 1 suppresses the amount of data communicated between the main processing unit 23-1 and the sub processing unit 23-2 and reduces the time required for that communication, so that the superimposition of OSD data can be performed quickly.
  • Light from the outside (for example, light reflected from a subject) is incident on the imaging element 22-1 via an optical system such as the lens 21-1. Similarly, external light is incident on the imaging element 22-2 via an optical system such as the lens 21-2.
  • The imaging element 22-1 photoelectrically converts the light from the lens 21-1 into an image signal and supplies the resulting right-eye two-dimensional image to the main processing unit 23-1.
  • The imaging element 22-2 photoelectrically converts the light from the lens 21-2 into an image signal and supplies the resulting left-eye two-dimensional image to the sub processing unit 23-2.
  • As the imaging elements 22-1 and 22-2, image sensors such as CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensors are employed, for example.
  • The lens 21-1 and the lens 21-2 are provided at positions separated by a predetermined distance (for example, a distance corresponding to the distance between the human right eye and left eye).
  • Therefore, the imaging element 22-1 outputs to the main processing unit 23-1 a right-eye two-dimensional image obtained by imaging the subject from a first viewpoint, and the imaging element 22-2 outputs to the sub processing unit 23-2 a left-eye two-dimensional image obtained by imaging the subject from a second viewpoint different from the first viewpoint.
  • The main processing unit 23-1 superimposes, for example, the OSD data representing the internal state of the main processing unit 23-1, among the plurality of pieces of OSD data held in advance, on the right-eye two-dimensional image from the imaging element 22-1, and outputs the right-eye OSD superimposed image obtained by the superimposition to the generation display unit 28.
  • As the internal state of the main processing unit 23-1, for example, an internal state in which the main processing unit 23-1 performs camera shake correction processing on the right-eye two-dimensional image, or an internal state in which the main processing unit 23-1 records the right-eye two-dimensional image in the data storage unit 25, can be considered. The same applies to the internal state of the sub processing unit 23-2.
  • The main processing unit 23-1 changes the internal state of the main processing unit 23-1 based on an operation signal from the operation unit 24.
  • The main processing unit 23-1 also controls the sub processing unit 23-2 so that the sub processing unit 23-2 performs the same processing as the main processing unit 23-1.
  • Note that the processing of the main processing unit 23-1 and the sub processing unit 23-2 is performed in synchronization with a synchronization signal generated by the main processing unit 23-1.
  • That is, the main processing unit 23-1 generates a synchronization signal for causing the sub processing unit 23-2 to operate in synchronization with the main processing unit 23-1, and transmits the generated synchronization signal to the sub processing unit 23-2 via the relay unit 26.
  • The main processing unit 23-1 also generates instruction data for instructing the sub processing unit 23-2 to change its internal state, to output a left-eye OSD superimposed image described later, and so on, and transmits the instruction data to the sub processing unit 23-2 via the relay unit 27.
  • Hereinafter, the instruction data that instructs a change of the internal state is referred to as first instruction data, and the instruction data that instructs the output of the left-eye OSD superimposed image is referred to as second instruction data. Note that the instruction data is not limited to the first instruction data and the second instruction data. (One possible representation of these messages is sketched below.)
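The patent does not specify a concrete message format for the instruction data; the following Python sketch merely illustrates one way the first and second instruction data could be represented as small messages, so that only a few bytes of state, rather than OSD bitmaps, travel over the relay unit 27. All names here are assumptions made for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class InstructionKind(Enum):
    CHANGE_INTERNAL_STATE = auto()   # corresponds to the "first instruction data"
    OUTPUT_LEFT_EYE_OSD = auto()     # corresponds to the "second instruction data"

@dataclass
class InstructionData:
    kind: InstructionKind
    new_state: Optional[str] = None  # used only with CHANGE_INTERNAL_STATE

# Example: tell the sub processing unit to switch to a "recording" state, then to
# output its left-eye OSD superimposed image on the next synchronization pulse.
first_instruction = InstructionData(InstructionKind.CHANGE_INTERNAL_STATE, "recording")
second_instruction = InstructionData(InstructionKind.OUTPUT_LEFT_EYE_OSD)
```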
  • The sub processing unit 23-2 receives the synchronization signal transmitted from the main processing unit 23-1 via the relay unit 26 and the instruction data transmitted from the main processing unit 23-1 via the relay unit 27.
  • The sub processing unit 23-2 then performs the same processing as the main processing unit 23-1 in synchronization with the synchronization signal from the relay unit 26, based on the instruction data from the relay unit 27.
  • That is, the sub processing unit 23-2 changes its internal state to the same internal state as the main processing unit 23-1, in synchronization with the synchronization signal from the relay unit 26, based on the first instruction data from the relay unit 27.
  • Note that the change of the internal state of the sub processing unit 23-2 is performed at (almost) the same timing as the change of the internal state of the main processing unit 23-1.
  • In addition, in synchronization with the synchronization signal, the sub processing unit 23-2 superimposes the OSD data representing the internal state of the sub processing unit 23-2, among the plurality of pieces of OSD data held in advance, on the left-eye two-dimensional image from the imaging element 22-2. It is assumed that the sub processing unit 23-2 is instructed to superimpose the OSD data by instruction data from the relay unit 27.
  • Note that the superimposition of the OSD data in the sub processing unit 23-2 is performed at the same timing as the superimposition of the OSD data in the main processing unit 23-1.
  • Further, in synchronization with the synchronization signal from the relay unit 26, the sub processing unit 23-2 outputs the left-eye OSD superimposed image obtained by the superimposition to the generation display unit 28, based on the second instruction data from the relay unit 27.
  • Note that the output of the left-eye OSD superimposed image by the sub processing unit 23-2 is performed at the same timing as the output of the right-eye OSD superimposed image from the main processing unit 23-1.
  • The operation unit 24 is composed of operation buttons and the like to be operated by the user, and, in response to a user operation, supplies an operation signal corresponding to that operation to the main processing unit 23-1.
  • By operating the operation unit 24, the user can change the internal state of the image processing apparatus 1, that is, the internal states of the main processing unit 23-1 and the sub processing unit 23-2.
  • The data storage unit 25 is, for example, a flash memory or the like, and stores the right-eye two-dimensional image from the main processing unit 23-1 and the left-eye two-dimensional image from the sub processing unit 23-2.
  • The data storage unit 25 also stores in advance programs executed by the main processing unit 23-1 and the sub processing unit 23-2.
  • The relay unit 26 relays, between the main processing unit 23-1 and the sub processing unit 23-2, the synchronization signal output from the main processing unit 23-1 and the response output from the sub processing unit 23-2 (a response to the data from the main processing unit 23-1).
  • The relay unit 27 relays, between the main processing unit 23-1 and the sub processing unit 23-2, the instruction data output from the main processing unit 23-1, the response to the instruction data output from the sub processing unit 23-2, and the like.
  • Note that the synchronization signal is communicated via a dedicated communication path (the communication path via the relay unit 26), which is different from the communication path used for communicating the instruction data and the like.
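To make the role of the two paths concrete, the sketch below (an assumption, not taken from the patent) models the relay units 26 and 27 as two independent queues: the synchronization pulses travel on their own channel, so they are never queued behind larger instruction messages.

```python
import queue

sync_path = queue.Queue()          # stands in for relay unit 26 (dedicated to the sync signal)
instruction_path = queue.Queue()   # stands in for relay unit 27 (instruction data, responses)

def main_unit_emit(frame_no, instruction=None):
    """Main unit side: send instruction data when needed, but always emit the sync pulse."""
    if instruction is not None:
        instruction_path.put(instruction)
    sync_path.put(frame_no)        # sync pulses are never delayed behind instruction traffic

def sub_unit_receive():
    """Sub unit side: wait for the sync pulse, then drain any pending instruction data."""
    frame_no = sync_path.get()
    instructions = []
    while not instruction_path.empty():
        instructions.append(instruction_path.get())
    return frame_no, instructions
```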
  • The generation display unit 28 generates, based on the right-eye OSD superimposed image from the main processing unit 23-1 and the left-eye OSD superimposed image from the sub processing unit 23-2, a 3D image composed of the right-eye OSD superimposed image and the left-eye OSD superimposed image, and displays it on a built-in display unit (not shown).
  • FIG. 2 shows a detailed configuration example of the main processing unit 23-1, the sub processing unit 23-2, and the generation display unit 28.
  • The main processing unit 23-1 includes an image processing unit 41-1, an AD (Analog to Digital) conversion unit 42-1, a CPU (Central Processing Unit) 43-1, a holding unit 44-1, a RAM (Random Access Memory) 45-1, a superimposing unit 46-1, and a communication unit 47-1.
  • A right-eye two-dimensional image is supplied from the imaging element 22-1 to the image processing unit 41-1.
  • The image processing unit 41-1 performs image processing such as gamma correction, white balance adjustment, and camera shake correction on the right-eye two-dimensional image from the imaging element 22-1, and supplies the right-eye two-dimensional image after the image processing to the AD conversion unit 42-1.
  • The AD conversion unit 42-1 performs AD conversion that converts the right-eye two-dimensional image supplied as an analog signal from the image processing unit 41-1 into a right-eye two-dimensional image as a digital signal, and supplies the right-eye two-dimensional image as a digital signal to the CPU 43-1.
  • The CPU 43-1 controls the image processing unit 41-1, the AD conversion unit 42-1, the superimposing unit 46-1, and the communication unit 47-1.
  • The CPU 43-1 also generates the synchronization signal and performs predetermined processing in synchronization with the generated synchronization signal.
  • That is, for example, the CPU 43-1 changes the internal state of the main processing unit 23-1 based on the operation signal from the operation unit 24.
  • State information indicating the internal state of the main processing unit 23-1 is held in, for example, a built-in memory (not shown) of the CPU 43-1.
  • Further, for example, the CPU 43-1 generates first instruction data for instructing the sub processing unit 23-2 to change its internal state to the same internal state as the main processing unit 23-1, second instruction data for instructing the sub processing unit 23-2 to output the left-eye OSD superimposed image, and the like. The CPU 43-1 then supplies the generated first instruction data, second instruction data, and the like, together with the synchronization signal, to the communication unit 47-1.
  • Furthermore, for example, the CPU 43-1 controls the superimposing unit 46-1 to perform processing for superimposing the OSD data representing the internal state of the main processing unit 23-1 on the right-eye two-dimensional image.
  • That is, the CPU 43-1 supplies the right-eye two-dimensional image from the AD conversion unit 42-1 to the RAM 45-1 and temporarily holds it there.
  • The CPU 43-1 also identifies the internal state of the main processing unit 23-1 based on the state information held in the built-in memory (not shown), reads, from the holding unit 44-1, the OSD data representing the identified internal state of the main processing unit 23-1, among the plurality of pieces of OSD data held in advance in the holding unit 44-1, and temporarily holds it in the RAM 45-1.
  • The CPU 43-1 then reads the held right-eye two-dimensional image and OSD data from the RAM 45-1 and supplies them to the superimposing unit 46-1, causing the superimposing unit 46-1 to superimpose the OSD data on the right-eye two-dimensional image.
  • The holding unit 44-1 holds in advance a plurality of pieces of OSD data, each representing an internal state of the main processing unit 23-1.
  • The RAM 45-1 temporarily holds data whose writing is instructed by the CPU 43-1, and supplies data whose reading is instructed by the CPU 43-1 to the CPU 43-1.
  • The superimposing unit 46-1 superimposes the OSD data from the CPU 43-1 on the right-eye two-dimensional image, likewise supplied from the CPU 43-1, and supplies the right-eye OSD superimposed image obtained by the superimposition to the generation display unit 28.
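The patent does not spell out how the superimposing unit combines the OSD data with the camera image; a common approach, shown below purely as an assumption, is per-pixel alpha blending of an RGBA OSD bitmap onto the viewpoint image at a fixed position.

```python
import numpy as np

def superimpose_osd(viewpoint_image, osd_rgba, top=16, left=16):
    """Alpha-blend an RGBA OSD bitmap onto a copy of the viewpoint image (uint8 arrays)."""
    out = viewpoint_image.copy()
    h, w = osd_rgba.shape[:2]
    region = out[top:top + h, left:left + w].astype(np.float32)
    alpha = osd_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = alpha * osd_rgba[..., :3].astype(np.float32) + (1.0 - alpha) * region
    out[top:top + h, left:left + w] = blended.astype(np.uint8)
    return out

# Example: a semi-transparent 48x48 white placeholder icon on a 480x640 frame.
frame = np.zeros((480, 640, 3), np.uint8)
icon = np.full((48, 48, 4), 255, np.uint8)
icon[..., 3] = 128                 # 50 % opacity
right_eye_osd_image = superimpose_osd(frame, icon)
```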
  • The communication unit 47-1 supplies the synchronization signal from the main processing unit 23-1 to the communication unit 47-2 via the relay unit 26, and supplies the first instruction data, the second instruction data, and the like from the main processing unit 23-1 to the communication unit 47-2 via the relay unit 27.
  • The communication unit 47-1 also supplies data such as responses returned from the sub processing unit 23-2 via the relay unit 26 or the relay unit 27 to the CPU 43-1.
  • The sub processing unit 23-2 includes an image processing unit 41-2, an AD conversion unit 42-2, a CPU 43-2, a holding unit 44-2, a RAM 45-2, a superimposing unit 46-2, and a communication unit 47-2.
  • A left-eye two-dimensional image is supplied from the imaging element 22-2 to the image processing unit 41-2.
  • The image processing unit 41-2 performs image processing such as gamma correction, white balance adjustment, and camera shake correction on the left-eye two-dimensional image from the imaging element 22-2, and supplies the left-eye two-dimensional image after the image processing to the AD conversion unit 42-2.
  • The AD conversion unit 42-2 performs AD conversion that converts the left-eye two-dimensional image supplied as an analog signal from the image processing unit 41-2 into a left-eye two-dimensional image as a digital signal, and supplies the left-eye two-dimensional image as a digital signal to the CPU 43-2.
  • The CPU 43-2 receives the synchronization signal transmitted from the CPU 43-1 of the main processing unit 23-1 via the communication unit 47-1, the relay unit 26, and the communication unit 47-2, and controls the image processing unit 41-2, the AD conversion unit 42-2, the superimposing unit 46-2, and the communication unit 47-2 in synchronization with the received synchronization signal.
  • The CPU 43-2 also receives the first instruction data, the second instruction data, and the like transmitted from the CPU 43-1 of the main processing unit 23-1 via the communication unit 47-1, the relay unit 27, and the communication unit 47-2, and, in synchronization with the received synchronization signal, performs the processing instructed by the received first instruction data and second instruction data.
  • That is, upon receiving the first instruction data, the CPU 43-2 changes the internal state of the sub processing unit 23-2 to the same internal state as the main processing unit 23-1, in synchronization with the synchronization signal, at the same timing as the change of the internal state of the main processing unit 23-1.
  • State information indicating the internal state of the sub processing unit 23-2 is held in, for example, a built-in memory (not shown) of the CPU 43-2.
  • Upon receiving the second instruction data instructing the output of the left-eye OSD superimposed image, the CPU 43-2 causes the superimposing unit 46-2 to output the left-eye OSD superimposed image, in synchronization with the synchronization signal, at the same timing as the right-eye OSD superimposed image is output from the main processing unit 23-1.
  • Further, for example, the CPU 43-2 controls the superimposing unit 46-2 to perform processing for superimposing the OSD data representing the internal state of the sub processing unit 23-2 on the left-eye two-dimensional image. It is assumed that the CPU 43-2 is instructed to superimpose the OSD data by instruction data output and supplied from the CPU 43-1.
  • That is, the CPU 43-2 supplies the left-eye two-dimensional image from the AD conversion unit 42-2 to the RAM 45-2 and temporarily holds it there.
  • The CPU 43-2 also identifies the internal state of the sub processing unit 23-2 based on the state information held in the built-in memory (not shown), reads, from the holding unit 44-2, the OSD data representing the identified internal state of the sub processing unit 23-2, among the plurality of pieces of OSD data held in advance in the holding unit 44-2, and temporarily holds it in the RAM 45-2.
  • The CPU 43-2 then reads the held left-eye two-dimensional image and OSD data from the RAM 45-2 and supplies them to the superimposing unit 46-2, causing the superimposing unit 46-2 to superimpose the OSD data on the left-eye two-dimensional image.
  • The holding unit 44-2 holds in advance a plurality of pieces of OSD data, each representing an internal state of the sub processing unit 23-2. Note that the holding unit 44-2 holds in advance the same OSD data as the plurality of pieces of OSD data held in the holding unit 44-1.
  • The RAM 45-2 temporarily holds data whose writing is instructed by the CPU 43-2, and supplies data whose reading is instructed by the CPU 43-2 to the CPU 43-2.
  • The superimposing unit 46-2 superimposes the OSD data from the CPU 43-2 on the left-eye two-dimensional image from the CPU 43-2, and supplies the left-eye OSD superimposed image obtained by the superimposition to the generation display unit 28.
  • The communication unit 47-2 supplies the synchronization signal from the relay unit 26 to the CPU 43-2, and supplies the first instruction data, the second instruction data, and the like from the relay unit 27 to the CPU 43-2.
  • The communication unit 47-2 also transmits responses to the data received via the relay unit 26 to the communication unit 47-1 via the relay unit 26, and transmits responses to the data received via the relay unit 27 to the communication unit 47-1 via the relay unit 27.
  • The generation display unit 28 includes a 3D image generation unit 61, a display control unit 62, and a display unit 63.
  • The 3D image generation unit 61 generates, based on the right-eye OSD superimposed image supplied from the superimposing unit 46-1 of the main processing unit 23-1 and the left-eye OSD superimposed image supplied from the superimposing unit 46-2 of the sub processing unit 23-2, a 3D image composed of the right-eye OSD superimposed image and the left-eye OSD superimposed image, and supplies it to the display control unit 62.
  • The display control unit 62 supplies the 3D image from the 3D image generation unit 61 to the display unit 63 for display.
  • The display unit 63 is, for example, an LCD (Liquid Crystal Display) or the like, and displays the 3D image under the control of the display control unit 62.
  • Note that the display unit 63 is provided with a parallax barrier, a lenticular lens, or the like that optically separates the right-eye two-dimensional image and the left-eye two-dimensional image included in the 3D image.
  • Using the parallax barrier, lenticular lens, or the like, the display unit 63 displays the left-eye OSD superimposed image included in the 3D image so that it is viewed by the user's left eye, and displays the right-eye OSD superimposed image included in the 3D image so that it is viewed by the user's right eye.
  • Alternatively, the display unit 63 may cause the user to perceive the 3D image by alternately displaying the right-eye two-dimensional image and the left-eye two-dimensional image. In this case, the user needs to wear 3D glasses or the like that alternately block the fields of view of the right eye and the left eye.
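As a loose illustration of that frame-sequential alternative (the names and timing here are assumptions, not part of the patent), the display simply alternates the two superimposed views at a fixed rate while shutter glasses block the opposite eye; the `show` call is a placeholder for whatever actually drives the panel.

```python
import time

def show(frame):
    """Placeholder for the call that drives the display panel with one frame."""
    pass

def frame_sequential_display(right_eye_frames, left_eye_frames, fps=120):
    """Alternate right- and left-eye frames; shutter glasses hide the opposite eye."""
    half_period = 1.0 / fps
    for right, left in zip(right_eye_frames, left_eye_frames):
        show(right)                  # right-eye shutter open
        time.sleep(half_period)
        show(left)                   # left-eye shutter open
        time.sleep(half_period)
```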
  • Next, the display process performed by the image processing apparatus 1 will be described. This display process is started, for example, when the image processing apparatus 1 is powered on.
  • In step S21, the processing branches depending on whether the subject of processing is the main processing unit 23-1 or the sub processing unit 23-2.
  • In step S21, when the subject of processing is the main processing unit 23-1, the main processing unit 23-1 advances the processing to step S23; when the subject of processing is the sub processing unit 23-2, the sub processing unit 23-2 advances the processing to step S22.
  • In step S22, the CPU 43-2 of the sub processing unit 23-2 determines whether or not first instruction data instructing a change of the internal state of the sub processing unit 23-2 has been received from the CPU 43-1 of the main processing unit 23-1 via the communication unit 47-1, the relay unit 27, and the communication unit 47-2.
  • The CPU 43-2 of the sub processing unit 23-2 waits until it determines that the first instruction data has been received, and then advances the processing to step S25.
  • In step S25, the CPU 43-2 of the sub processing unit 23-2 performs the same processing as the main processing unit 23-1, based on the received first instruction data, in synchronization with the synchronization signal output and supplied from the CPU 43-1 of the main processing unit 23-1.
  • The synchronization signal is supplied from the CPU 43-1 of the main processing unit 23-1 to the CPU 43-2 of the sub processing unit 23-2 via the communication unit 47-1, the relay unit 26, and the communication unit 47-2.
  • The processing performed by the sub processing unit 23-2 in steps S25 to S30 will be described together with the processing performed by the main processing unit 23-1.
  • In step S23, the CPU 43-1 of the main processing unit 23-1 receives an operation signal from the operation unit 24. If the CPU 43-1 of the main processing unit 23-1 has not received an operation signal from the operation unit 24, the processing proceeds to step S26 and the subsequent processing is performed.
  • In step S24, the CPU 43-1 generates first instruction data instructing a change of the internal state of the sub processing unit 23-2 based on the operation signal from the operation unit 24, and supplies it to the CPU 43-2 of the sub processing unit 23-2 via the communication unit 47-1, the relay unit 27, and the communication unit 47-2.
  • In step S25, the CPU 43-1 changes the internal state of the main processing unit 23-1 based on the operation signal from the operation unit 24.
  • For example, when the user performs a recording operation, the operation unit 24 outputs an operation signal corresponding to the recording operation to the CPU 43-1, and the CPU 43-1 changes the internal state of the main processing unit 23-1 to an internal state in which recording processing is performed.
  • Also in step S25, the CPU 43-2 of the sub processing unit 23-2 changes the internal state of the sub processing unit 23-2 to the same internal state as the main processing unit 23-1, based on the first instruction data received in step S22, in synchronization with the synchronization signal, at (almost) the same timing as the change of the internal state of the main processing unit 23-1.
  • In step S26, the imaging element 22-1 photoelectrically converts the light transmitted from the lens 21-1, and outputs the resulting right-eye two-dimensional image to the image processing unit 41-1 of the main processing unit 23-1.
  • Similarly, the imaging element 22-2 photoelectrically converts the light transmitted from the lens 21-2, and outputs the resulting left-eye two-dimensional image to the image processing unit 41-2 of the sub processing unit 23-2.
  • In step S27, the image processing unit 41-1 performs image processing on the right-eye two-dimensional image from the imaging element 22-1, and supplies the right-eye two-dimensional image after the image processing to the AD conversion unit 42-1.
  • Similarly, the image processing unit 41-2 performs image processing on the left-eye two-dimensional image from the imaging element 22-2, and supplies the left-eye two-dimensional image after the image processing to the AD conversion unit 42-2.
  • In step S28, the AD conversion unit 42-1 AD-converts the right-eye two-dimensional image from the image processing unit 41-1 and supplies it to the CPU 43-1. The CPU 43-1 then supplies the right-eye two-dimensional image from the AD conversion unit 42-1 to the RAM 45-1 and temporarily holds it there.
  • Similarly, the AD conversion unit 42-2 AD-converts the left-eye two-dimensional image from the image processing unit 41-2 and supplies it to the CPU 43-2. The CPU 43-2 then supplies the left-eye two-dimensional image from the AD conversion unit 42-2 to the RAM 45-2 and temporarily holds it there.
  • In step S29, the CPU 43-1 of the main processing unit 23-1 identifies the internal state of the main processing unit 23-1 based on the state information held in the built-in memory (not shown).
  • The CPU 43-1 then reads, from the holding unit 44-1, the OSD data representing the identified internal state of the main processing unit 23-1, among the plurality of pieces of OSD data held in advance in the holding unit 44-1, supplies it to the RAM 45-1, and temporarily holds it there.
  • Similarly, the CPU 43-2 of the sub processing unit 23-2 identifies the internal state of the sub processing unit 23-2 based on the state information held in its built-in memory (not shown), and reads the OSD data representing that internal state from the holding unit 44-2 into the RAM 45-2.
  • In step S30, the CPU 43-1 reads the right-eye two-dimensional image and the OSD data held in the RAM 45-1 and supplies them to the superimposing unit 46-1, which superimposes the OSD data on the right-eye two-dimensional image to generate the right-eye OSD superimposed image.
  • Similarly, the CPU 43-2 reads the left-eye two-dimensional image and the OSD data held in the RAM 45-2 and supplies them to the superimposing unit 46-2.
  • The superimposing unit 46-2 superimposes the OSD data supplied from the CPU 43-2 on the left-eye two-dimensional image, likewise supplied from the CPU 43-2, to generate the left-eye OSD superimposed image.
  • In step S31, when the subject of processing is the main processing unit 23-1, the main processing unit 23-1 advances the processing to step S33; when the subject of processing is the sub processing unit 23-2, the sub processing unit 23-2 advances the processing to step S32.
  • In step S32, the CPU 43-2 of the sub processing unit 23-2 determines whether or not second instruction data instructing the output of the left-eye OSD superimposed image has been received from the CPU 43-1 of the main processing unit 23-1 via the communication unit 47-1, the relay unit 27, and the communication unit 47-2.
  • The CPU 43-2 of the sub processing unit 23-2 waits until it determines that the second instruction data has been received, and then advances the processing to step S34.
  • In step S33, the CPU 43-1 of the main processing unit 23-1 generates second instruction data and transmits it to the CPU 43-2 of the sub processing unit 23-2 via the communication unit 47-1, the relay unit 27, and the communication unit 47-2.
  • In step S34, the superimposing unit 46-1 of the main processing unit 23-1 outputs the right-eye OSD superimposed image generated in the immediately preceding step S30 to the 3D image generation unit 61 of the generation display unit 28.
  • Based on the second instruction data received from the CPU 43-1 of the main processing unit 23-1, the superimposing unit 46-2 of the sub processing unit 23-2 outputs the left-eye OSD superimposed image already generated in the immediately preceding step S30 to the 3D image generation unit 61 of the generation display unit 28.
  • Note that the left-eye OSD superimposed image is output from the superimposing unit 46-2 at the same timing as the right-eye OSD superimposed image is output from the superimposing unit 46-1.
  • In step S35, the 3D image generation unit 61 of the generation display unit 28 generates, based on the right-eye OSD superimposed image from the superimposing unit 46-1 and the left-eye OSD superimposed image from the superimposing unit 46-2, a 3D image composed of the right-eye OSD superimposed image and the left-eye OSD superimposed image, and supplies it to the display control unit 62.
  • In step S36, the display control unit 62 supplies the 3D image from the 3D image generation unit 61 to the display unit 63 for display.
  • That is, the display unit 63 displays the left-eye OSD superimposed image included in the 3D image so that it is viewed by the user's left eye, and displays the right-eye OSD superimposed image included in the 3D image so that it is viewed by the user's right eye.
  • In step S37, for example, the CPU 43-1 of the main processing unit 23-1 determines, based on an operation signal from the operation unit 24, whether or not to continue the display process. If it determines to continue, the processing returns to step S21 and the same processing is repeated thereafter.
  • If, in step S37, the main processing unit 23-1 determines not to continue the display process, the display process ends.
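The following Python sketch condenses steps S21 through S37 into a single per-frame loop. It is a simplification for illustration only: the callables and the single-threaded structure are assumptions, and in the described device the two units run on separate processors kept in step by the synchronization signal and the instruction data rather than by ordinary function calls.

```python
def display_process(main_unit, sub_unit, capture_right, capture_left,
                    read_operation, generate_3d_image, display):
    """Rough single-threaded analogue of steps S21 to S37 of the display process."""
    while True:
        # S23: check for a user operation (None means no operation this frame).
        operation = read_operation()
        if operation == "stop":                  # S37: decide whether to continue
            break
        if operation is not None:
            # S24/S25: the first instruction data propagates the state change,
            # so both units switch state at (almost) the same timing.
            main_unit.set_state(operation)
            sub_unit.set_state(operation)

        # S26 to S28: capture and digitize one image per viewpoint.
        right = capture_right()
        left = capture_left()

        # S29/S30: each unit superimposes the OSD data it already holds locally.
        right_osd = main_unit.superimpose(right)
        left_osd = sub_unit.superimpose(left)

        # S31 to S34: both superimposed images are output at the same timing
        # (triggered by the second instruction data in the described device).
        # S35/S36: generate the 3D image and display it.
        display(generate_3d_image(right_osd, left_osd))
```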
  • As described above, the main processing unit 23-1 does not need to communicate the OSD data for the left-eye two-dimensional image to the sub processing unit 23-2, so the time required for such communication can be eliminated.
  • In addition, the imaging element 22-1 outputs the right-eye two-dimensional image obtained by photoelectrically converting the light from the lens 21-1 to the main processing unit 23-1, and the imaging element 22-2 outputs the left-eye two-dimensional image obtained by photoelectrically converting the light from the lens 21-2 to the sub processing unit 23-2, so the two-dimensional images themselves do not need to be exchanged between the main processing unit 23-1 and the sub processing unit 23-2 either.
  • Furthermore, since the relay unit 26 for communicating the synchronization signal and the relay unit 27 for communicating the instruction data and the like are provided separately, it is possible to prevent a situation in which delivery of the synchronization signal is delayed and synchronization cannot be established between the main processing unit 23-1 and the sub processing unit 23-2.
  • Further, since the main processing unit 23-1 and the sub processing unit 23-2 are configured in the same manner, they can be manufactured in the same manufacturing process, which makes it possible to reduce the manufacturing cost of the image processing apparatus 1.
  • Moreover, if the main processing unit 23-1 and the sub processing unit 23-2 are manufactured using parts common to a conventional imaging device that images a subject, it is not necessary to develop additional electronic circuits or software, so development costs can also be reduced.
  • In the present embodiment, the main processing unit 23-1 superimposes the OSD data on the right-eye two-dimensional image from the imaging element 22-1, and the sub processing unit 23-2 superimposes the OSD data on the left-eye two-dimensional image from the imaging element 22-2, thereby superimposing the OSD data on the 3D image.
  • However, the OSD data may instead be superimposed on a 3D image stored in the data storage unit 25.
  • In this case, the main processing unit 23-1 reads the right-eye two-dimensional image included in the 3D image from the data storage unit 25 and superimposes the OSD data on the read right-eye two-dimensional image. Similarly, the sub processing unit 23-2 reads the left-eye two-dimensional image included in the 3D image from the data storage unit 25 and superimposes the OSD data on the read left-eye two-dimensional image.
  • Although one data storage unit 25 is provided in the image processing apparatus 1 as shown in FIG. 1, the main processing unit 23-1 may instead include its own data storage unit 25-1 and the sub processing unit 23-2 its own data storage unit 25-2, with the same 3D image stored in the data storage unit 25-1 and the data storage unit 25-2.
  • In this case, the main processing unit 23-1 reads the right-eye two-dimensional image included in the 3D image from its built-in data storage unit 25-1 and superimposes the OSD data on the read right-eye two-dimensional image. Similarly, the sub processing unit 23-2 reads the left-eye two-dimensional image included in the 3D image from its built-in data storage unit 25-2 and superimposes the OSD data on the read left-eye two-dimensional image.
  • Further, although the image processing apparatus 1 has been described as including one sub processing unit 23-2, it can also be configured to include two sub processing units 23-2 and 23-3.
  • In this case, the image processing apparatus 1 also includes a lens 21-3 and an imaging element 22-3; the imaging element 22-3 photoelectrically converts the light from the lens 21-3, acquires a two-dimensional image of a viewpoint different from those obtained from the imaging elements 22-1 and 22-2, and outputs it to the sub processing unit 23-3.
  • The sub processing unit 23-3 superimposes, on the two-dimensional image from the imaging element 22-3, the OSD data indicating the internal state of the sub processing unit 23-3 changed under the control of the main processing unit 23-1, among the plurality of pieces of OSD data held in advance, and outputs the OSD superimposed image obtained by the superimposition to the 3D image generation unit 61.
  • The 3D image generation unit 61 generates a 3D image including the OSD superimposed image from the main processing unit 23-1, the OSD superimposed image from the sub processing unit 23-2, and the OSD superimposed image from the sub processing unit 23-3, and supplies it to the display control unit 62.
  • The display control unit 62 supplies the 3D image from the 3D image generation unit 61 to the display unit 63 for display.
  • More generally, N sub processing units 23-2 to 23-(N+1) (where N is a natural number of 3 or more) can be provided.
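Generalizing the earlier sketch to N sub processing units is straightforward under the same assumptions: every unit holds the same OSD set, receives the same state change, and stamps its own viewpoint, and the multi-view frame is assembled from all of the superimposed views. The function names are again illustrative only.

```python
import numpy as np

def generate_multiview_image(superimposed_views):
    """Pack an arbitrary number of superimposed viewpoint images side by side."""
    return np.concatenate(superimposed_views, axis=1)

def run_multiview(main_unit, sub_units, captured_views, new_state=None):
    """main_unit handles the first viewpoint; each sub unit handles one further viewpoint."""
    units = [main_unit] + list(sub_units)
    if new_state is not None:
        for unit in units:                       # one small state message per unit,
            unit.set_state(new_state)            # no OSD bitmaps on the wire
    views = [unit.superimpose(view) for unit, view in zip(units, captured_views)]
    return generate_multiview_image(views)
```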
  • An image processing apparatus including: a main processing unit that superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; a sub processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
  • The image processing apparatus described above, in which the main processing unit communicates a synchronization signal for performing processing in synchronization with the main processing unit to the sub processing unit via a dedicated communication path, and the sub processing unit superimposes the first superimposition data on the second viewpoint image in synchronization with the synchronization signal communicated from the main processing unit via the dedicated communication path.
  • The image processing apparatus according to (1) or (2), in which the main processing unit includes: a first holding unit that holds the plurality of pieces of superimposition data in advance; a first changing unit that changes an internal state of the main processing unit based on a user input operation; and a first superimposing unit that superimposes, on the first viewpoint image, the first superimposition data indicating the internal state of the main processing unit changed by the first changing unit, among the plurality of pieces of superimposition data held in the first holding unit.
  • The image processing apparatus according to (4), in which the sub processing unit includes: a second holding unit that holds the plurality of pieces of superimposition data in advance; a second changing unit that changes the internal state of the sub processing unit to the same internal state as the main processing unit in synchronization with a change in the internal state of the main processing unit; and a second superimposing unit that superimposes, on the second viewpoint image, the first superimposition data indicating the internal state of the sub processing unit changed by the second changing unit, among the plurality of pieces of superimposition data held in the second holding unit.
  • The image processing apparatus according to any one of (1) to (5), in which a plurality of the sub processing units are provided, and the 3D image generation unit generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint images after superimposition obtained from the respective sub processing units.
  • An image processing method for an image processing apparatus that generates a 3D image and includes a main processing unit, a sub processing unit, and a 3D image generation unit, the method including steps in which: the main processing unit superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; the sub processing unit, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and the 3D image generation unit generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
  • A program for causing a computer to function as: a main processing unit that superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; a sub processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
  • Incidentally, the series of processes described above can be executed by hardware or by software.
  • When the series of processes is executed by software, a program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware, or into a general-purpose computer or the like that can execute various functions by installing various programs.
  • FIG. 4 shows a configuration example of a computer that executes the above-described series of processes according to a program.
  • In the computer shown in FIG. 4, a CPU (Central Processing Unit) 81, a ROM (Read Only Memory) 82, and a RAM (Random Access Memory) are connected to one another by a bus 84.
  • An input/output interface 85 is further connected to the CPU 81 via the bus 84.
  • The input/output interface 85 is connected to an input unit 86 including a keyboard, a mouse, a microphone, and the like, and an output unit 87 including a display, a speaker, and the like.
  • The CPU 81 executes various processes in response to commands input from the input unit 86, and outputs the processing results to the output unit 87.
  • The storage unit 88 connected to the input/output interface 85 includes, for example, a hard disk, and stores programs executed by the CPU 81 and various data.
  • The communication unit 89 communicates with external devices via a network such as the Internet or a local area network.
  • A program may also be acquired via the communication unit 89 and stored in the storage unit 88.
  • A drive 90 connected to the input/output interface 85 drives a removable medium 91 such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory mounted thereon, and acquires the programs, data, and the like recorded therein. The acquired programs and data are transferred to and stored in the storage unit 88 as necessary.
  • A recording medium that records (stores) a program to be installed in the computer and made executable by the computer includes the removable medium 91, which is a package medium such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini Disc)), or a semiconductor memory, as well as the ROM 82 in which the program is stored temporarily or permanently, the hard disk constituting the storage unit 88, and the like.
  • The program is recorded on the recording medium, as necessary, via the communication unit 89, which is an interface such as a router or a modem, using a wired or wireless communication medium such as a local area network, the Internet, or digital satellite broadcasting.
  • Note that the steps describing the series of processes described above include not only processes performed in time series in the described order but also processes that are executed in parallel or individually and are not necessarily performed in time series.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention pertains to an image processing device, an image processing method, and a program that enable icons and the like intended for 3D images to be superimposed more quickly. A primary processing unit superimposes first superimposition data among a plurality of superimposition data retained in advance on a first viewpoint image acquired from a first viewpoint. A secondary processing unit superimposes the first superimposition data among the plurality of superimposition data retained in advance on a second viewpoint image obtained from a second viewpoint, which differs from the first viewpoint, in synchronization with the primary processing unit. A 3D image generating unit generates a 3D image composed of the superimposed first viewpoint image and the superimposed second viewpoint image. The present invention can be used, for example, in an image processing device that generates 3D images.

Description

画像処理装置、画像処理方法、及びプログラムImage processing apparatus, image processing method, and program

 本開示は、画像処理装置、画像処理方法、及びプログラムに関し、特に、例えば、3D画像を対象としたアイコン等の重畳を、より迅速に行えるようにした画像処理装置、画像処理方法、及びプログラムに関する。 The present disclosure relates to an image processing device, an image processing method, and a program, and more particularly, to an image processing device, an image processing method, and a program that can more quickly superimpose icons and the like for 3D images. .

 従来、例えば、1のLSI(Large Scale Integration)を用いて、3D画像にアイコン等を重畳するOSD(On Screen Display)重畳技術が存在する(例えば、特許文献1及び2参照)。 Conventionally, for example, there is an OSD (On Screen Display) superposition technique for superimposing an icon or the like on a 3D image using one LSI (Large Scale Integration) (see, for example, Patent Documents 1 and 2).

 ここで、3D画像とは、立体的な画像として視認者により認識される画像であり、例えば、視認者の右眼に視認させるための右眼用2次元画像と、視認者の左眼に視認させるための左眼用2次元画像から構成される。 Here, the 3D image is an image that is recognized by the viewer as a three-dimensional image, and is, for example, a two-dimensional image for the right eye that is visually recognized by the viewer's right eye, and is visually recognized by the viewer's left eye. It is comprised from the two-dimensional image for left eyes for doing.

 このため、3D画像に対するアイコン等の重畳は、3D画像を構成する右眼用2次元画像と左眼用2次元画像に、それぞれ、アイコン等を重畳することにより実現される。 For this reason, the superimposition of an icon or the like on the 3D image is realized by superimposing the icon or the like on the right-eye two-dimensional image and the left-eye two-dimensional image constituting the 3D image.

特開2011-55148号公報JP 2011-55148 A 特開2011-135252号公報JP 2011-135252 A

 ところで、従来のOSD重畳技術によれば、1のLSIにより、アイコン等の重畳が行われるため、アイコン等の重畳を、比較的、速やかに行うことができなかった。 By the way, according to the conventional OSD superimposing technology, since the superimposition of the icon or the like is performed by one LSI, the superimposition of the icon or the like cannot be performed relatively quickly.

 本開示は、このような状況に鑑みてなされたものであり、3D画像を対象としたアイコン等の重畳を、より迅速に行えるようにするものである。 The present disclosure has been made in view of such a situation, and is intended to more quickly superimpose icons and the like for 3D images.

 本開示の一側面の画像処理装置は、予め保持する複数の重畳用データのうち、第1の重畳用データを、第1の視点から得られる第1の視点画像に重畳する主処理部と、前記主処理部に同期して、予め保持する前記複数の重畳用データのうち、前記第1の重畳用データを、前記第1の視点とは異なる第2の視点から得られる第2の視点画像に重畳する従処理部と、重畳後の前記第1の視点画像と、重畳後の前記第2の視点画像により構成される3D画像を生成する3D画像生成部とを含む画像処理装置である。 An image processing apparatus according to an aspect of the present disclosure includes a main processing unit that superimposes first superimposition data on a first viewpoint image obtained from a first viewpoint among a plurality of superposition data held in advance. A second viewpoint image obtained from a second viewpoint different from the first viewpoint from among the plurality of superimposing data held in advance in synchronization with the main processing unit. The image processing apparatus includes a slave processing unit that superimposes the image, a first viewpoint image after superimposition, and a 3D image generation unit that generates a 3D image composed of the second viewpoint image after superimposition.

A first output unit that outputs the first viewpoint image to the main processing unit and a second output unit that outputs the second viewpoint image to the sub-processing unit may further be provided.

The main processing unit may communicate a synchronization signal for causing processing to be performed in synchronization with the main processing unit to the sub-processing unit via a dedicated communication path, and the sub-processing unit may superimpose the first superimposition data on the second viewpoint image in synchronization with the synchronization signal communicated from the main processing unit via the dedicated communication path.

The main processing unit may include: a first holding unit that holds the plurality of superimposition data in advance; a first changing unit that changes an internal state of the main processing unit based on a user input operation; and a first superimposing unit that superimposes, on the first viewpoint image, the first superimposition data indicating the internal state of the main processing unit changed by the first changing unit, selected from the plurality of superimposition data held in the first holding unit.

The sub-processing unit may include: a second holding unit that holds the plurality of superimposition data in advance; a second changing unit that, in synchronization with the change of the internal state of the main processing unit, changes an internal state of the sub-processing unit to the same internal state as that of the main processing unit; and a second superimposing unit that superimposes, on the second viewpoint image, the first superimposition data indicating the internal state of the sub-processing unit changed by the second changing unit, selected from the plurality of superimposition data held in the second holding unit.

The image processing device may be provided with a plurality of the sub-processing units, and the 3D image generation unit may generate a 3D image composed of the first viewpoint image after superimposition and the second viewpoint images after superimposition obtained from the respective sub-processing units.

An image processing method according to one aspect of the present disclosure is an image processing method of an image processing device that generates a 3D image, the image processing device including a main processing unit, a sub-processing unit, and a 3D image generation unit. The method includes steps in which: the main processing unit superimposes first superimposition data, selected from a plurality of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; the sub-processing unit, in synchronization with the main processing unit, superimposes the first superimposition data, selected from a plurality of superimposition data it holds in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and the 3D image generation unit generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.

A program according to one aspect of the present disclosure causes a computer to function as: a main processing unit that superimposes first superimposition data, selected from a plurality of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; a sub-processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, selected from a plurality of superimposition data it holds in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.

According to the present disclosure, first superimposition data, selected from a plurality of superimposition data held in advance by a main processing unit, is superimposed on a first viewpoint image obtained from a first viewpoint; in synchronization with the main processing unit, the first superimposition data, selected from a plurality of superimposition data held in advance by a sub-processing unit, is superimposed on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition is generated.

According to the present disclosure, icons and the like can be superimposed on 3D images more quickly.

Fig. 1 is a block diagram showing a configuration example of an image processing apparatus to which the present technology is applied.
Fig. 2 is a block diagram showing a detailed configuration example of a main processing unit, a sub-processing unit, and a generation display unit.
Fig. 3 is a flowchart for explaining display processing performed by the image processing apparatus.
Fig. 4 is a block diagram showing a configuration example of a computer.

Hereinafter, an embodiment of the present disclosure (hereinafter referred to as the present embodiment) will be described. The description is given in the following order.
1. Present embodiment (an example in which the main processing unit and the sub-processing unit each hold OSD data in advance)
2. Modifications

<1. Present Embodiment>
[Configuration Example of Image Processing Apparatus 1]
FIG. 1 shows a configuration example of an image processing apparatus 1 according to the present embodiment.

The image processing apparatus 1 is, for example, a digital camera that captures 3D images, and includes lenses 21_1 and 21_2, imaging elements 22_1 and 22_2, a main processing unit 23_1, a sub-processing unit 23_2, an operation unit 24, a data storage unit 25, relay units 26 and 27, and a generation display unit 28.

In the present embodiment, the image processing apparatus 1 is described as a digital camera that captures 3D images. However, the present technology is not limited to digital cameras and can be applied to any apparatus that performs processing for superimposing icons and the like on 3D images.

Note that the image processing apparatus 1 suppresses the amount of data communicated between the main processing unit 23_1 and the sub-processing unit 23_2, and thereby superimposes OSD data on a 3D image quickly.

Here, a 3D image is an image that is perceived by a user (viewer) as a stereoscopic image, and is composed of, for example, a right-eye two-dimensional image to be viewed by the user's right eye and a left-eye two-dimensional image to be viewed by the user's left eye.

Note that a 3D image may be composed not only of two two-dimensional images (a right-eye two-dimensional image and a left-eye two-dimensional image) but also of three or more two-dimensional images obtained from respectively different viewpoints.

In the present embodiment, however, to simplify the description, the 3D image is assumed to be composed of two two-dimensional images, that is, a right-eye two-dimensional image and a left-eye two-dimensional image.

That is, in the image processing apparatus 1, the main processing unit 23_1 and the sub-processing unit 23_2 each hold a plurality of OSD data in advance, so that OSD data does not have to be exchanged between the main processing unit 23_1 and the sub-processing unit 23_2.

In addition, the imaging element 22_1 outputs the right-eye two-dimensional image to the main processing unit 23_1, and the imaging element 22_2 outputs the left-eye two-dimensional image to the sub-processing unit 23_2, so that neither the right-eye two-dimensional image nor the left-eye two-dimensional image has to be exchanged between the main processing unit 23_1 and the sub-processing unit 23_2.

As a result, the image processing apparatus 1 suppresses the amount of data communicated between the main processing unit 23_1 and the sub-processing unit 23_2 and reduces the time required for that communication, so that the OSD data can be superimposed quickly.
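As a rough illustration of this division of work, the following minimal Python sketch models two processing units that each hold an identical OSD lookup table locally and exchange only a small state-change message, never image or OSD data. All names (ProcessingUnit, OSD_TABLE, the state labels) are hypothetical and are not taken from the patent.

    # Minimal sketch, assuming OSD data can be modeled as small overlay arrays
    # and viewpoint images as NumPy arrays. Names are illustrative only.
    import numpy as np

    OSD_TABLE = {                      # identical copy held by BOTH units in advance
        "recording": np.full((16, 16, 3), 255, dtype=np.uint8),
        "stabilizing": np.full((16, 16, 3), 128, dtype=np.uint8),
    }

    class ProcessingUnit:
        def __init__(self):
            self.state = "recording"   # internal state selects which OSD data to use

        def set_state(self, state):
            self.state = state

        def superimpose(self, image):
            # Overlay the OSD data for the current state onto the top-left corner.
            osd = OSD_TABLE[self.state]
            out = image.copy()
            out[:osd.shape[0], :osd.shape[1]] = osd
            return out

    # Only a state-change message crosses between the units; images and OSD
    # data stay local to each unit, which is the point of the architecture.
    main, sub = ProcessingUnit(), ProcessingUnit()
    main.set_state("recording")
    sub.set_state(main.state)          # stands in for the instruction data sent via the relay
    right = main.superimpose(np.zeros((480, 640, 3), dtype=np.uint8))
    left = sub.superimpose(np.zeros((480, 640, 3), dtype=np.uint8))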

In the image processing apparatus 1 of FIG. 1, light from the outside (for example, light reflected from a subject) enters the imaging element 22_1 via an optical system such as the lens 21_1. Similarly, light from the outside enters the imaging element 22_2 via an optical system such as the lens 21_2.

The imaging element 22_1 performs photoelectric conversion that converts the light from the lens 21_1 into an image signal, and supplies the resulting right-eye two-dimensional image, as an image signal, to the main processing unit 23_1.

Similarly, the imaging element 22_2 performs photoelectric conversion that converts the light from the lens 21_2 into an image signal, and supplies the resulting left-eye two-dimensional image, as an image signal, to the sub-processing unit 23_2.

Here, image sensors such as CCDs (Charge Coupled Devices) or CMOS (Complementary Metal Oxide Semiconductor) sensors are employed as the imaging elements 22_1 and 22_2.

Note that the lens 21_1 and the lens 21_2 are provided at positions separated by a predetermined distance (for example, a distance corresponding to the spacing between a human's right eye and left eye).

For this reason, the imaging element 22_1 outputs a right-eye two-dimensional image obtained by imaging the subject from a first viewpoint to the main processing unit 23_1.

Likewise, the imaging element 22_2 outputs a left-eye two-dimensional image obtained by imaging the subject from a second viewpoint different from the first viewpoint to the sub-processing unit 23_2.

The main processing unit 23_1, for example, superimposes OSD data representing the internal state of the main processing unit 23_1, selected from a plurality of OSD data held in advance, on the right-eye two-dimensional image from the imaging element 22_1, and outputs the right-eye OSD superimposed image obtained by the superimposition to the generation display unit 28.

Here, examples of the internal state of the main processing unit 23_1 include an internal state in which the main processing unit 23_1 performs camera-shake correction processing on the right-eye two-dimensional image, and an internal state in which the main processing unit 23_1 records the right-eye two-dimensional image in the data storage unit 25. The same applies to the internal state of the sub-processing unit 23_2.

Also, for example, the main processing unit 23_1 changes its own internal state based on an operation signal from the operation unit 24.

Furthermore, for example, the main processing unit 23_1 controls the sub-processing unit 23_2 to perform the same processing as the main processing unit 23_1. Note that the processing of the main processing unit 23_1 and the sub-processing unit 23_2 is performed in synchronization with a synchronization signal generated by the main processing unit 23_1.

That is, for example, the main processing unit 23_1 generates a synchronization signal for synchronizing with the main processing unit 23_1, and transmits the generated synchronization signal to the sub-processing unit 23_2 via the relay unit 26.

Also, for example, the main processing unit 23_1 generates instruction data for instructing the sub-processing unit 23_2 to change its internal state, to output a left-eye OSD superimposed image described later, and so on, and transmits the generated instruction data to the sub-processing unit 23_2 via the relay unit 27.

Here, in the following description, instruction data that instructs a change of the internal state is referred to as first instruction data, and instruction data that instructs the output of the left-eye OSD superimposed image is referred to as second instruction data. Note that the instruction data is not limited to the first instruction data and the second instruction data.
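The two kinds of instruction data could be represented, for example, as in the following sketch; the enum values and the payload field are assumptions introduced purely for illustration, not definitions taken from the patent.

    # Minimal sketch of how the first and second instruction data might be modeled.
    from dataclasses import dataclass
    from enum import Enum, auto

    class InstructionKind(Enum):
        CHANGE_INTERNAL_STATE = auto()   # "first instruction data"
        OUTPUT_OSD_IMAGE = auto()        # "second instruction data"

    @dataclass
    class Instruction:
        kind: InstructionKind
        payload: object = None           # e.g. the new internal state for a change request

    # Example: the main unit asking the sub unit to switch to a "recording" state.
    msg = Instruction(InstructionKind.CHANGE_INTERNAL_STATE, payload="recording")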

The sub-processing unit 23_2 receives the synchronization signal transmitted from the main processing unit 23_1 via the relay unit 26 and the instruction data transmitted from the main processing unit 23_1 via the relay unit 27.

Then, based on the instruction data from the relay unit 27, the sub-processing unit 23_2 performs the same processing as the main processing unit 23_1 in synchronization with the synchronization signal from the relay unit 26.

That is, for example, based on the first instruction data from the relay unit 27, the sub-processing unit 23_2 changes its internal state to the same internal state as the main processing unit 23_1 in synchronization with the synchronization signal from the relay unit 26.

In this case, the change of the internal state of the sub-processing unit 23_2 is performed at (almost) the same timing as the change of the internal state of the main processing unit 23_1.

Also, for example, in synchronization with the synchronization signal, the sub-processing unit 23_2 superimposes OSD data representing the internal state of the sub-processing unit 23_2, selected from a plurality of OSD data held in advance, on the left-eye two-dimensional image from the imaging element 22_2. It is assumed here that the sub-processing unit 23_2 has been instructed to superimpose the OSD data by the instruction data from the relay unit 27.

In this case, the superimposition of the OSD data in the sub-processing unit 23_2 is performed at the same timing as the superimposition of the OSD data in the main processing unit 23_1.

Then, based on the second instruction data from the relay unit 27, the sub-processing unit 23_2 outputs the left-eye OSD superimposed image obtained by the superimposition to the generation display unit 28 in synchronization with the synchronization signal from the relay unit 26.

In this case, the output of the left-eye OSD superimposed image by the sub-processing unit 23_2 is performed at the same timing as the output of the right-eye OSD superimposed image from the main processing unit 23_1.

The operation unit 24 is composed of operation buttons and the like operated by the user, and, in response to being operated by the user, supplies an operation signal corresponding to the user's operation to the main processing unit 23_1.

By operating the operation unit 24, the user can change the internal state of the image processing apparatus 1, that is, the internal states of the main processing unit 23_1 and the sub-processing unit 23_2.

The data storage unit 25 is, for example, a flash memory or the like, and stores the right-eye two-dimensional image from the main processing unit 23_1 and the left-eye two-dimensional image from the sub-processing unit 23_2. The data storage unit 25 also stores in advance programs executed by the main processing unit 23_1 and the sub-processing unit 23_2.

The relay unit 26 relays, between the main processing unit 23_1 and the sub-processing unit 23_2, the synchronization signal output from the main processing unit 23_1 and responses output from the sub-processing unit 23_2 (responses to data from the main processing unit 23_1).

The relay unit 27 relays, between the main processing unit 23_1 and the sub-processing unit 23_2, the instruction data output from the main processing unit 23_1 and responses to the instruction data output from the sub-processing unit 23_2.

In the image processing apparatus 1, the synchronization signal is communicated via a dedicated communication path (the communication path via the relay unit 26) that is separate from the communication path used for the instruction data and the like.

This prevents the synchronization signal supplied from the main processing unit 23_1 to the sub-processing unit 23_2 from being delayed, so that the main processing unit 23_1 and the sub-processing unit 23_2 can be synchronized accurately.
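One way to picture the benefit of the dedicated path is to model the two relay units as independent FIFO channels, as in the sketch below: a backlog of instruction traffic then cannot delay the synchronization tick. The queue-based modeling and all names are assumptions for illustration only.

    # Minimal sketch of the two-path idea: sync ticks travel on their own channel.
    import queue

    sync_path = queue.Queue()          # stands in for relay unit 26 (sync only)
    instruction_path = queue.Queue()   # stands in for relay unit 27 (instructions, responses)

    def main_unit_send(frame_number, instructions):
        for ins in instructions:
            instruction_path.put(ins)          # may queue up arbitrarily
        sync_path.put(frame_number)            # tick is never stuck behind instructions

    def sub_unit_receive():
        tick = sync_path.get()                 # wait only for the sync tick
        pending = []
        while not instruction_path.empty():    # drain whatever instructions have arrived
            pending.append(instruction_path.get())
        return tick, pending

    main_unit_send(frame_number=1, instructions=["change_state:recording"])
    print(sub_unit_receive())                  # -> (1, ['change_state:recording'])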

The generation display unit 28 generates a 3D image composed of the right-eye OSD superimposed image and the left-eye OSD superimposed image, based on the right-eye OSD superimposed image from the main processing unit 23_1 and the left-eye OSD superimposed image from the sub-processing unit 23_2, and displays it on a built-in display unit (not shown).

[Detailed Configuration Example of Main Processing Unit 23_1, Sub-Processing Unit 23_2, and Generation Display Unit 28]
Next, FIG. 2 shows a detailed configuration example of the main processing unit 23_1, the sub-processing unit 23_2, and the generation display unit 28.

The main processing unit 23_1 includes an image processing unit 41_1, an AD (analog to digital) conversion unit 42_1, a CPU (central processing unit) 43_1, a holding unit 44_1, a RAM (random access memory) 45_1, a superimposing unit 46_1, and a communication unit 47_1.

The right-eye two-dimensional image is supplied from the imaging element 22_1 to the image processing unit 41_1. The image processing unit 41_1 performs image processing such as gamma correction, white balance, and camera-shake correction on the right-eye two-dimensional image from the imaging element 22_1, and supplies the right-eye two-dimensional image after the image processing to the AD conversion unit 42_1.

The AD conversion unit 42_1 performs AD conversion that converts the right-eye two-dimensional image, supplied as an analog signal from the image processing unit 41_1, into a right-eye two-dimensional image as a digital signal, and supplies the right-eye two-dimensional image as a digital signal to the CPU 43_1.
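The patent does not specify the individual algorithms of this per-viewpoint processing chain, so the following sketch shows only one simple, commonly used form of gamma correction and gray-world white balance, followed by a stand-in for the quantization performed at AD conversion; it is illustrative rather than a description of the actual implementation.

    # Illustrative per-viewpoint processing chain (assumed simple algorithms).
    import numpy as np

    def gamma_correct(image, gamma=2.2):
        # image: float array in [0, 1]
        return np.clip(image, 0.0, 1.0) ** (1.0 / gamma)

    def white_balance(image):
        # Gray-world assumption: scale each channel so the channel means match.
        means = image.reshape(-1, 3).mean(axis=0)
        return np.clip(image * (means.mean() / means), 0.0, 1.0)

    raw = np.random.rand(480, 640, 3)              # stands in for one viewpoint image
    processed = gamma_correct(white_balance(raw))
    digital = (processed * 255).astype(np.uint8)   # stands in for the AD conversion step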

The CPU 43_1 controls, for example, the image processing unit 41_1, the AD conversion unit 42_1, the superimposing unit 46_1, and the communication unit 47_1.

Also, for example, the CPU 43_1 generates a synchronization signal and performs predetermined processing in synchronization with the generated synchronization signal.

That is, for example, the CPU 43_1 changes the internal state of the main processing unit 23_1 based on the operation signal from the operation unit 24. State information indicating the internal state of the main processing unit 23_1 is held, for example, in a built-in memory (not shown) of the CPU 43_1.

Also, for example, the CPU 43_1 generates first instruction data that instructs the sub-processing unit 23_2 to change its internal state to the same internal state as the main processing unit 23_1, second instruction data that instructs the sub-processing unit 23_2 to output the left-eye OSD superimposed image, and the like. The CPU 43_1 then supplies the generated first instruction data, second instruction data, and so on, together with the synchronization signal, to the communication unit 47_1.

Furthermore, for example, the CPU 43_1 controls the superimposing unit 46_1 to perform processing for superimposing OSD data representing the internal state of the main processing unit 23_1 on the right-eye two-dimensional image.

That is, for example, the CPU 43_1 supplies the right-eye two-dimensional image from the AD conversion unit 42_1 to the RAM 45_1 and temporarily holds it there.

Also, for example, the CPU 43_1 identifies the internal state of the main processing unit 23_1 based on the state information held in the built-in memory (not shown). Furthermore, for example, the CPU 43_1 reads, from the holding unit 44_1, the OSD data representing the identified internal state of the main processing unit 23_1, selected from the plurality of OSD data held in advance in the holding unit 44_1, and supplies it to the RAM 45_1 to be temporarily held.

The CPU 43_1 then reads the held right-eye two-dimensional image and OSD data from the RAM 45_1 and supplies them to the superimposing unit 46_1, which superimposes the OSD data on the right-eye two-dimensional image.
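The state-to-OSD lookup and the superimposition step might look like the sketch below, assuming the OSD data carries an alpha channel so it can be blended onto the viewpoint image; the dictionary keys, overlay sizes, and blending rule are illustrative assumptions rather than details from the patent.

    # Sketch of selecting OSD data by internal state and blending it onto an image.
    import numpy as np

    osd_by_state = {
        "recording": np.zeros((32, 32, 4), dtype=np.float32),    # RGBA overlays held in advance
        "stabilizing": np.zeros((32, 32, 4), dtype=np.float32),
    }
    osd_by_state["recording"][..., 0] = 1.0   # a red marker
    osd_by_state["recording"][..., 3] = 0.8   # 80% opaque

    def superimpose(viewpoint_image, state):
        osd = osd_by_state[state]
        h, w = osd.shape[:2]
        out = viewpoint_image.astype(np.float32).copy()
        alpha = osd[..., 3:4]
        out[:h, :w] = (1.0 - alpha) * out[:h, :w] + alpha * osd[..., :3] * 255.0
        return out.astype(np.uint8)

    right_eye = np.zeros((480, 640, 3), dtype=np.uint8)
    right_osd_image = superimpose(right_eye, "recording")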

The holding unit 44_1 holds in advance a plurality of OSD data, each representing an internal state of the main processing unit 23_1.

The RAM 45_1 temporarily holds data whose writing is instructed by the CPU 43_1, and supplies data whose reading is instructed by the CPU 43_1 to the CPU 43_1.

The superimposing unit 46_1 superimposes the OSD data from the CPU 43_1 on the right-eye two-dimensional image, also from the CPU 43_1, and supplies the right-eye OSD superimposed image obtained by the superimposition to the generation display unit 28.

The communication unit 47_1 supplies the synchronization signal from the main processing unit 23_1 to the communication unit 47_2 via the relay unit 26. The communication unit 47_1 also supplies the first instruction data, the second instruction data, and the like from the main processing unit 23_1 to the communication unit 47_2 via the relay unit 27.

In addition, for example, the communication unit 47_1 supplies data such as responses returned from the sub-processing unit 23_2 via the relay unit 26 or the relay unit 27 to the CPU 43_1.

The sub-processing unit 23_2 includes an image processing unit 41_2, an AD conversion unit 42_2, a CPU 43_2, a holding unit 44_2, a RAM 45_2, a superimposing unit 46_2, and a communication unit 47_2.

The left-eye two-dimensional image is supplied from the imaging element 22_2 to the image processing unit 41_2. The image processing unit 41_2 performs image processing such as gamma correction, white balance, and camera-shake correction on the left-eye two-dimensional image from the imaging element 22_2, and supplies the left-eye two-dimensional image after the image processing to the AD conversion unit 42_2.

The AD conversion unit 42_2 performs AD conversion that converts the left-eye two-dimensional image, supplied as an analog signal from the image processing unit 41_2, into a left-eye two-dimensional image as a digital signal, and supplies the left-eye two-dimensional image as a digital signal to the CPU 43_2.

The CPU 43_2 receives the synchronization signal transmitted from the CPU 43_1 of the main processing unit 23_1 via the communication unit 47_1, the relay unit 26, and the communication unit 47_2, and controls the image processing unit 41_2, the AD conversion unit 42_2, the superimposing unit 46_2, and the communication unit 47_2 in synchronization with the received synchronization signal.

Also, for example, the CPU 43_2 receives the first instruction data, the second instruction data, and the like transmitted from the CPU 43_1 of the main processing unit 23_1 via the communication unit 47_1, the relay unit 27, and the communication unit 47_2.

The CPU 43_2 then performs, in synchronization with the received synchronization signal, the processing instructed by the first instruction data and the second instruction data it has received.

That is, for example, when the CPU 43_2 receives the first instruction data instructing a change of the internal state, it changes the internal state of the sub-processing unit 23_2 in synchronization with the synchronization signal, at (almost) the same timing as the change of the internal state of the main processing unit 23_1.

Note that state information indicating the internal state of the sub-processing unit 23_2 is held, for example, in a built-in memory (not shown) of the CPU 43_2.

Also, for example, when the CPU 43_2 receives the second instruction data instructing the output of the left-eye OSD superimposed image, it causes the superimposing unit 46_2 to output the left-eye OSD superimposed image in synchronization with the synchronization signal, at the same timing as the right-eye OSD superimposed image is output from the superimposing unit 46_1 of the main processing unit 23_1.

Furthermore, for example, the CPU 43_2 controls the superimposing unit 46_2 to perform processing for superimposing OSD data representing the internal state of the sub-processing unit 23_2 on the left-eye two-dimensional image. It is assumed here that the CPU 43_2 has been instructed to superimpose the OSD data by the instruction data output and supplied from the CPU 43_1.

That is, for example, the CPU 43_2 supplies the left-eye two-dimensional image from the AD conversion unit 42_2 to the RAM 45_2 and temporarily holds it there.

Also, for example, the CPU 43_2 identifies the internal state of the sub-processing unit 23_2 based on the state information held in the built-in memory (not shown). Furthermore, for example, the CPU 43_2 reads, from the holding unit 44_2, the OSD data representing the identified internal state of the sub-processing unit 23_2, selected from the plurality of OSD data held in advance in the holding unit 44_2, and supplies it to the RAM 45_2 to be temporarily held.

The CPU 43_2 then reads the held left-eye two-dimensional image and OSD data from the RAM 45_2 and supplies them to the superimposing unit 46_2, which superimposes the OSD data on the left-eye two-dimensional image.

The holding unit 44_2 holds in advance a plurality of OSD data, each representing an internal state of the sub-processing unit 23_2. Note that the holding unit 44_2 holds in advance the same OSD data as the plurality of OSD data held by the holding unit 44_1.

The RAM 45_2 temporarily holds data whose writing is instructed by the CPU 43_2, and supplies data whose reading is instructed by the CPU 43_2 to the CPU 43_2.

The superimposing unit 46_2 superimposes the OSD data from the CPU 43_2 on the left-eye two-dimensional image, also from the CPU 43_2, and supplies the left-eye OSD superimposed image obtained by the superimposition to the generation display unit 28.

The communication unit 47_2 supplies the synchronization signal from the relay unit 26 to the CPU 43_2, and supplies the first instruction data, the second instruction data, and the like from the relay unit 27 to the CPU 43_2.

Also, for example, the communication unit 47_2 transmits responses and the like to the communication unit 47_1 via the relay unit 26. Furthermore, the communication unit 47_2 transmits responses and the like to the communication unit 47_1 via the relay unit 27.

The generation display unit 28 includes a 3D image generation unit 61, a display control unit 62, and a display unit 63.

The 3D image generation unit 61 generates, based on the right-eye OSD superimposed image supplied from the superimposing unit 46_1 of the main processing unit 23_1 and the left-eye OSD superimposed image supplied from the superimposing unit 46_2 of the sub-processing unit 23_2, a 3D image composed of the right-eye OSD superimposed image and the left-eye OSD superimposed image, and supplies it to the display control unit 62.
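The patent does not fix a particular packing format for the generated 3D image, so the following sketch assumes a simple side-by-side packing of the two OSD-superimposed viewpoint images purely for illustration.

    # Sketch of pairing the two OSD-superimposed viewpoint images into one 3D frame.
    import numpy as np

    def generate_3d_image(right_osd_image, left_osd_image):
        # Both inputs are HxWx3 uint8 arrays of the same size.
        assert right_osd_image.shape == left_osd_image.shape
        return np.concatenate([left_osd_image, right_osd_image], axis=1)  # H x 2W x 3

    left = np.zeros((480, 640, 3), dtype=np.uint8)
    right = np.zeros((480, 640, 3), dtype=np.uint8)
    frame_3d = generate_3d_image(right, left)   # 480 x 1280 x 3 side-by-side frame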

The display control unit 62 supplies the 3D image from the 3D image generation unit 61 to the display unit 63 and causes it to be displayed.

The display unit 63 is, for example, an LCD (liquid crystal display) or the like, and displays the 3D image under the control of the display control unit 62.

That is, for example, the display unit 63 is provided with a parallax barrier, a lenticular lens, or the like that optically separates the right-eye two-dimensional image and the left-eye two-dimensional image included in the 3D image.

For this reason, by means of the parallax barrier, lenticular lens, or the like, the display unit 63 displays the left-eye OSD superimposed image included in the 3D image so that it is viewed by the user's left eye, and displays the right-eye OSD superimposed image included in the 3D image so that it is viewed by the user's right eye.

Note that the display unit 63 may instead alternately display the right-eye two-dimensional image and the left-eye two-dimensional image so that the user perceives a 3D image. In this case, the user needs to wear 3D glasses or the like that alternately block the fields of view of the right eye and the left eye.
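As a rough illustration of these two display styles, the sketch below shows a column interleave, which is one common pixel arrangement for parallax-barrier or lenticular panels, and a frame-sequential alternation for shutter-glasses viewing; the exact arrangement depends on the panel and is assumed here.

    # Illustrative pixel arrangements for the two display styles described above.
    import numpy as np

    def column_interleave(left_image, right_image):
        out = left_image.copy()
        out[:, 1::2] = right_image[:, 1::2]   # odd columns show the right-eye view
        return out

    def frame_sequence(left_image, right_image):
        # Frame-sequential (active-shutter) display alternates whole frames.
        yield left_image    # shown while the glasses block the right eye
        yield right_image   # shown while the glasses block the left eye

    left = np.zeros((480, 640, 3), dtype=np.uint8)
    right = np.full((480, 640, 3), 255, dtype=np.uint8)
    panel_frame = column_interleave(left, right)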

[Description of Operation of Image Processing Apparatus 1]
Next, the display processing performed by the image processing apparatus 1 will be described with reference to the flowchart of FIG. 3.

This display processing is started, for example, when the image processing apparatus 1 is powered on.

In step S21, the processing branches depending on whether the subject of the processing is the main processing unit 23_1 or the sub-processing unit 23_2.

That is, in step S21, if the subject of the processing is the main processing unit 23_1, the main processing unit 23_1 advances the processing to step S23; if the subject of the processing is the sub-processing unit 23_2, the sub-processing unit 23_2 advances the processing to step S22.

In step S22, the CPU 43_2 of the sub-processing unit 23_2 determines whether or not it has received, from the CPU 43_1 of the main processing unit 23_1 via the communication unit 47_1, the relay unit 27, and the communication unit 47_2, the first instruction data instructing a change of the internal state of the sub-processing unit 23_2.

The CPU 43_2 of the sub-processing unit 23_2 waits until it determines that the first instruction data has been received, and then advances the processing to step S25.

In step S25, based on the received first instruction data, the CPU 43_2 of the sub-processing unit 23_2 performs the same processing as the main processing unit 23_1 in synchronization with the synchronization signal output and supplied from the CPU 43_1 of the main processing unit 23_1.

Note that the synchronization signal is supplied to the CPU 43_2 of the sub-processing unit 23_2 from the CPU 43_1 of the main processing unit 23_1 via the communication unit 47_1, the relay unit 26, and the communication unit 47_2.

The processing performed by the sub-processing unit 23_2 is described together with the processing performed by the main processing unit 23_1 in steps S25 to S30.

In step S23, the CPU 43_1 of the main processing unit 23_1 receives an operation signal from the operation unit 24. If the CPU 43_1 of the main processing unit 23_1 does not receive an operation signal from the operation unit 24, it advances the processing to step S26 and performs the subsequent processing.

In step S24, based on the operation signal from the operation unit 24, the CPU 43_1 generates the first instruction data instructing a change of the internal state of the sub-processing unit 23_2, and supplies it to the CPU 43_2 of the sub-processing unit 23_2 via the communication unit 47_1, the relay unit 27, and the communication unit 47_2.

In step S25, the CPU 43_1 changes the internal state of the main processing unit 23_1 based on the operation signal from the operation unit 24.

That is, for example, when the user performs a recording operation on the operation unit 24 to cause the image processing apparatus 1 to record, the operation unit 24 outputs an operation signal corresponding to the user's recording operation to the CPU 43_1.

In this case, based on the operation signal from the operation unit 24, the CPU 43_1 changes the internal state of the main processing unit 23_1 to an internal state in which recording processing is performed.

Also, based on the first instruction data received in step S22, the CPU 43_2 of the sub-processing unit 23_2 changes the internal state of the sub-processing unit 23_2 to the same internal state as the main processing unit 23_1, in synchronization with the synchronization signal, at (almost) the same timing as the change of the internal state of the main processing unit 23_1.

In step S26, the imaging element 22_1 photoelectrically converts the light transmitted through the lens 21_1, and outputs the resulting right-eye two-dimensional image to the image processing unit 41_1 of the main processing unit 23_1.

Similarly, the imaging element 22_2 photoelectrically converts the light transmitted through the lens 21_2, and outputs the resulting left-eye two-dimensional image to the image processing unit 41_2 of the sub-processing unit 23_2.

In step S27, the image processing unit 41_1 performs image processing on the right-eye two-dimensional image from the imaging element 22_1, and supplies the right-eye two-dimensional image after the image processing to the AD conversion unit 42_1.

Similarly, the image processing unit 41_2 performs image processing on the left-eye two-dimensional image from the imaging element 22_2, and supplies the left-eye two-dimensional image after the image processing to the AD conversion unit 42_2.

In step S28, the AD conversion unit 42_1 AD-converts the right-eye two-dimensional image from the image processing unit 41_1 and supplies it to the CPU 43_1. The CPU 43_1 then supplies the right-eye two-dimensional image from the AD conversion unit 42_1 to the RAM 45_1 and temporarily holds it there.

Similarly, the AD conversion unit 42_2 AD-converts the left-eye two-dimensional image from the image processing unit 41_2 and supplies it to the CPU 43_2. The CPU 43_2 then supplies the left-eye two-dimensional image from the AD conversion unit 42_2 to the RAM 45_2 and temporarily holds it there.

In step S29, the CPU 43_1 of the main processing unit 23_1 identifies the internal state of the main processing unit 23_1 based on the state information held in the built-in memory (not shown).

The CPU 43_1 then reads, from the holding unit 44_1, the OSD data representing the identified internal state of the main processing unit 23_1, selected from the plurality of OSD data held in advance in the holding unit 44_1, and supplies it to the RAM 45_1 to be temporarily held.

Similarly, the CPU 43_2 of the sub-processing unit 23_2 identifies the internal state of the sub-processing unit 23_2 based on the state information held in the built-in memory (not shown).

The CPU 43_2 then reads, from the holding unit 44_2, the OSD data representing the identified internal state of the sub-processing unit 23_2, selected from the plurality of OSD data held in advance in the holding unit 44_2, and supplies it to the RAM 45_2 to be temporarily held.

In step S30, the CPU 43_1 reads the right-eye two-dimensional image and the OSD data held in the RAM 45_1 from the RAM 45_1 and supplies them to the superimposing unit 46_1. The superimposing unit 46_1 then generates the right-eye OSD superimposed image by superimposing the OSD data supplied from the CPU 43_1 on the right-eye two-dimensional image, also supplied from the CPU 43_1.

Similarly, the CPU 43_2 reads the left-eye two-dimensional image and the OSD data held in the RAM 45_2 from the RAM 45_2 and supplies them to the superimposing unit 46_2. The superimposing unit 46_2 then generates the left-eye OSD superimposed image by superimposing the OSD data supplied from the CPU 43_2 on the left-eye two-dimensional image, also supplied from the CPU 43_2.

In step S31, if the subject of the processing is the main processing unit 23_1, the main processing unit 23_1 advances the processing to step S33; if the subject of the processing is the sub-processing unit 23_2, the sub-processing unit 23_2 advances the processing to step S32.

In step S32, the CPU 43_2 of the sub-processing unit 23_2 determines whether or not it has received, from the CPU 43_1 of the main processing unit 23_1 via the communication unit 47_1, the relay unit 27, and the communication unit 47_2, the second instruction data instructing the output of the left-eye OSD superimposed image.

The CPU 43_2 of the sub-processing unit 23_2 waits until it determines that the second instruction data has been received, and then advances the processing to step S34.

In step S33, the CPU 43_1 of the main processing unit 23_1 generates the second instruction data and transmits it to the CPU 43_2 of the sub-processing unit 23_2 via the communication unit 47_1, the relay unit 27, and the communication unit 47_2.

In step S34, the superimposing unit 46_1 of the main processing unit 23_1 outputs the right-eye OSD superimposed image generated in the immediately preceding step S30 to the 3D image generation unit 61 of the generation display unit 28.

Also, based on the second instruction data received from the CPU 43_1 of the main processing unit 23_1, the superimposing unit 46_2 of the sub-processing unit 23_2 outputs the left-eye OSD superimposed image generated in the immediately preceding step S30 to the 3D image generation unit 61 of the generation display unit 28. As a result, the left-eye OSD superimposed image is output from the superimposing unit 46_2 at the same timing as the right-eye OSD superimposed image is output from the superimposing unit 46_1.

In step S35, the 3D image generation unit 61 of the generation display unit 28 generates, based on the right-eye OSD superimposed image from the superimposing unit 46_1 and the left-eye OSD superimposed image from the superimposing unit 46_2, a 3D image composed of the right-eye OSD superimposed image and the left-eye OSD superimposed image, and supplies it to the display control unit 62.

In step S36, the display control unit 62 supplies the 3D image from the 3D image generation unit 61 to the display unit 63 and causes it to be displayed. The display unit 63 thereby displays the left-eye OSD superimposed image included in the 3D image so that it is viewed by the user's left eye, and displays the right-eye OSD superimposed image included in the 3D image so that it is viewed by the user's right eye.

In step S37, the CPU 43_1 of the main processing unit 23_1 determines, for example based on an operation signal from the operation unit 24, whether or not to continue the display processing. If it determines to continue the display processing, the processing returns to step S21, and the same processing is performed thereafter.

If the main processing unit 23_1 determines in step S37 not to continue the display processing, the display processing ends.
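For readability, the overall flow of steps S21 to S37 can be summarized as a single loop, as in the sketch below; it collapses the two units into sequential calls inside one loop, whereas the actual device runs them in parallel and keeps them in step via the synchronization signal. All helper names are hypothetical.

    # Simplified, single-threaded sketch of one pass through the display processing.
    def display_processing(main_unit, sub_unit, camera_r, camera_l, generator, display):
        while True:
            operation = main_unit.poll_operation()              # S23
            if operation is not None:
                sub_unit.apply_instruction(operation)           # S24 (first instruction data)
                main_unit.change_state(operation)               # S25
            right = camera_r.capture()                          # S26-S28 (right viewpoint)
            left = camera_l.capture()                           # S26-S28 (left viewpoint)
            right_osd = main_unit.superimpose(right)            # S29-S30
            left_osd = sub_unit.superimpose(left)               # S29-S30
            main_unit.request_output(sub_unit)                  # S33 (second instruction data)
            frame_3d = generator.generate(right_osd, left_osd)  # S34-S35
            display.show(frame_3d)                              # S36
            if not main_unit.should_continue():                 # S37
                break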

As described above, according to the display processing, the holding unit 44_1 of the main processing unit 23_1 and the holding unit 44_2 of the sub-processing unit 23_2 each hold a plurality of OSD data in advance.

For this reason, for example, the main processing unit 23_1 does not need to communicate OSD data for the left-eye two-dimensional image to the sub-processing unit 23_2, so the time required for such communication can be eliminated.

Also, for example, the imaging element 22_1 outputs the right-eye two-dimensional image obtained by photoelectrically converting the light from the lens 21_1 to the main processing unit 23_1, and the imaging element 22_2 outputs the left-eye two-dimensional image obtained by photoelectrically converting the light from the lens 21_2 to the sub-processing unit 23_2.

For this reason, compared with a configuration in which, for example, the main processing unit 23_1 acquires the left-eye two-dimensional image from the imaging element 22_2 and communicates it to the sub-processing unit 23_2, the time required for such communication can be eliminated.

Therefore, according to the display processing, OSD data can be superimposed quickly on a 3D image composed of the right-eye two-dimensional image and the left-eye two-dimensional image.

Furthermore, for example, since the relay unit 26 for communicating the synchronization signal and the relay unit 27 for communicating the instruction data and the like are provided separately, it is possible to prevent a situation in which delivery of the synchronization signal is delayed and synchronization between the main processing unit 23_1 and the sub-processing unit 23_2 is lost.

Also, for example, since the main processing unit 23_1 and the sub-processing unit 23_2 are configured in the same way, the main processing unit 23_1 and the sub-processing unit 23_2 can be manufactured by the same manufacturing process. The manufacturing cost of the image processing apparatus 1 can therefore be reduced.

Furthermore, for example, if the main processing unit 23_1 and the sub-processing unit 23_2 are manufactured using parts common to a conventional imaging apparatus that images a subject, there is no need to develop additional electronic circuits or software, and development costs can be reduced.

<2. Modifications>
In the present embodiment, the main processing unit 23_1 superimposes OSD data on the right-eye two-dimensional image from the imaging element 22_1 and the sub-processing unit 23_2 superimposes OSD data on the left-eye two-dimensional image from the imaging element 22_2, so that OSD data is superimposed on the 3D image.

However, when a 3D image is stored in the data storage unit 25, for example, OSD data may instead be superimposed on the 3D image stored in the data storage unit 25.

That is, the main processing unit 23_1 reads the right-eye two-dimensional image included in the 3D image from the data storage unit 25 and superimposes OSD data on the read right-eye two-dimensional image. The sub-processing unit 23_2 reads the left-eye two-dimensional image included in the 3D image from the data storage unit 25 and superimposes OSD data on the read left-eye two-dimensional image.

Also, for example, the image processing apparatus 1 is provided with a single data storage unit 25 as shown in FIG. 1. Alternatively, however, a data storage unit 25_1 may be built into the main processing unit 23_1 and a data storage unit 25_2 may be built into the sub-processing unit 23_2. In this case, the same 3D image is stored in the data storage unit 25_1 and the data storage unit 25_2.

In this case, the main processing unit 23_1 reads the right-eye two-dimensional image included in the 3D image from its built-in data storage unit 25_1 and superimposes OSD data on the read right-eye two-dimensional image. The sub-processing unit 23_2 reads the left-eye two-dimensional image included in the 3D image from its built-in data storage unit 25_2 and superimposes OSD data on the read left-eye two-dimensional image.

Furthermore, in the present embodiment the image processing apparatus 1 has been described as including one sub-processing unit 23_2, but it may be configured to include two sub-processing units 23_2 and 23_3.

In this case, for example, the image processing apparatus 1 also includes a lens 21_3 and an imaging element 22_3, and the imaging element 22_3 photoelectrically converts the light from the lens 21_3 to obtain a two-dimensional image of a viewpoint different from the images obtained from the imaging elements 22_1 and 22_2, and outputs it to the sub-processing unit 23_3.

 The sub processing unit 23 3 then superimposes, on the two-dimensional image from the image sensor 22 3, the OSD data that indicates the internal state of the sub processing unit 23 3 as changed under control of the main processing unit 23 1, selected from the plurality of OSD data held in advance, and outputs the resulting OSD-superimposed image to the 3D image generation unit 61.

 The 3D image generation unit 61 generates a 3D image composed of the OSD-superimposed image from the main processing unit 23 1, the OSD-superimposed image from the sub processing unit 23 2, and the OSD-superimposed image from the sub processing unit 23 3, and supplies it to the display control unit 62.

 The display control unit 62 supplies the 3D image from the 3D image generation unit 61 to the display unit 63 for display.

 Note that, in addition to the two sub processing units 23 2 and 23 3, the image processing apparatus 1 may be provided with N sub processing units 23 2 to 23 N+1 (where N is a natural number of 3 or more).
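 As a rough illustration of this multi-viewpoint arrangement, the sketch below uses hypothetical names (ProcessingUnit, generate_3d_image) that are not part of the specification; it only shows one main unit and N sub units each superimposing the OSD data selected by a shared internal state onto their own viewpoint images before the results are handed to the 3D image generation step.

```python
# Illustrative sketch only (no names from the specification): one main unit
# and N sub units superimpose the OSD data selected by a shared internal
# state onto their own viewpoint images.
from dataclasses import dataclass


@dataclass
class ProcessingUnit:
    osd_table: dict          # internal state -> OSD data held in advance
    state: str = "idle"      # internal state, kept in sync with the main unit

    def superimpose(self, viewpoint_image):
        # Placeholder for the actual pixel-level superimposition.
        return {"image": viewpoint_image, "osd": self.osd_table[self.state]}


def generate_3d_image(main_unit, sub_units, viewpoint_images):
    """viewpoint_images[0] goes to the main unit, the rest to the sub units."""
    superimposed = [main_unit.superimpose(viewpoint_images[0])]
    for unit, image in zip(sub_units, viewpoint_images[1:]):
        unit.state = main_unit.state        # synchronize internal states
        superimposed.append(unit.superimpose(image))
    return superimposed                     # N+1 OSD-superimposed viewpoint images
```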

 Incidentally, the present technology can also take the following configurations.
 (1) An image processing apparatus including: a main processing unit that superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; a sub processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
 (2) The image processing apparatus according to (1), further including: a first output unit that outputs the first viewpoint image to the main processing unit; and a second output unit that outputs the second viewpoint image to the sub processing unit.
 (3) The image processing apparatus according to (1) or (2), in which the main processing unit communicates a synchronization signal, for causing processing to be performed in synchronization with the main processing unit, to the sub processing unit via a dedicated communication path, and the sub processing unit superimposes the first superimposition data on the second viewpoint image in synchronization with the synchronization signal communicated from the main processing unit via the dedicated communication path.
 (4) The image processing apparatus according to (1) or (2), in which the main processing unit includes: a first holding unit that holds the plurality of pieces of superimposition data in advance; a first changing unit that changes an internal state of the main processing unit based on a user input operation; and a first superimposing unit that superimposes, on the first viewpoint image, the first superimposition data indicating the internal state of the main processing unit changed by the first changing unit, among the plurality of pieces of superimposition data held in the first holding unit.
 (5) The image processing apparatus according to (4), in which the sub processing unit includes: a second holding unit that holds the plurality of pieces of superimposition data in advance; a second changing unit that changes an internal state of the sub processing unit to the same internal state as the main processing unit in synchronization with the change of the internal state of the main processing unit; and a second superimposing unit that superimposes, on the second viewpoint image, the first superimposition data indicating the internal state of the sub processing unit changed by the second changing unit, among the plurality of pieces of superimposition data held in the second holding unit.
 (6) The image processing apparatus according to any one of (1) to (5), in which the image processing apparatus includes a plurality of the sub processing units, and the 3D image generation unit generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint images after superimposition obtained from the plurality of sub processing units, respectively.
 (7) An image processing method of an image processing apparatus that generates a 3D image, the image processing apparatus including a main processing unit, a sub processing unit, and a 3D image generation unit, the method including the steps of: the main processing unit superimposing first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; the sub processing unit, in synchronization with the main processing unit, superimposing the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and the 3D image generation unit generating a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
 (8) A program for causing a computer to function as: a main processing unit that superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint; a sub processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.

 The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware, or into, for example, a general-purpose computer capable of executing various functions when various programs are installed.

[Example configuration of the computer]
 FIG. 4 shows an example configuration of a computer that executes the series of processes described above by means of a program.

 A CPU (Central Processing Unit) 81 executes various processes in accordance with a program stored in a ROM (Read Only Memory) 82 or in a storage unit 88. A RAM (Random Access Memory) 83 stores, as appropriate, programs executed by the CPU 81, data, and the like. The CPU 81, the ROM 82, and the RAM 83 are connected to one another by a bus 84.

 An input/output interface 85 is also connected to the CPU 81 via the bus 84. Connected to the input/output interface 85 are an input unit 86 including a keyboard, a mouse, a microphone, and the like, and an output unit 87 including a display, a speaker, and the like. The CPU 81 executes various processes in response to commands input from the input unit 86, and outputs the processing results to the output unit 87.

 The storage unit 88 connected to the input/output interface 85 is made up of, for example, a hard disk, and stores programs executed by the CPU 81 and various data. The communication unit 89 communicates with external devices via a network such as the Internet or a local area network.

 The program may also be acquired via the communication unit 89 and stored in the storage unit 88.

 When a removable medium 91 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory is loaded, the drive 90 connected to the input/output interface 85 drives the medium and acquires the program, data, and the like recorded on it. The acquired program and data are transferred to and stored in the storage unit 88 as necessary.

 As shown in FIG. 4, the recording medium that records (stores) the program to be installed in the computer and placed in an executable state is made up of the removable medium 91, which is a package medium such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disc (including an MD (Mini-Disc)), or a semiconductor memory; the ROM 82 in which the program is stored temporarily or permanently; the hard disk constituting the storage unit 88; and the like. The program is recorded on the recording medium, as necessary, via the communication unit 89, which is an interface such as a router or a modem, using a wired or wireless communication medium such as a local area network, the Internet, or digital satellite broadcasting.

 Note that, in this specification, the steps describing the series of processes described above include not only processes performed in time series in the described order but also processes executed in parallel or individually without necessarily being processed in time series.

 The present disclosure is not limited to the embodiment described above, and various modifications can be made without departing from the gist of the present disclosure.

 DESCRIPTION OF REFERENCE NUMERALS: 1 image processing apparatus, 21 1, 21 2 lens, 22 1, 22 2 image sensor, 23 1 main processing unit, 23 2 sub processing unit, 24 operation unit, 25 data storage unit, 26, 27 relay unit, 28 generation/display unit, 41 1, 41 2 image processing unit, 42 1, 42 2 AD conversion unit, 43 1, 43 2 CPU, 44 1, 44 2 holding unit, 45 1, 45 2 RAM, 46 1, 46 2 superimposing unit, 47 1, 47 2 communication unit, 61 3D image generation unit, 62 display control unit, 63 display unit

Claims (8)

1. An image processing apparatus comprising:
 a main processing unit that superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint;
 a sub processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and
 a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
2. The image processing apparatus according to claim 1, further comprising:
 a first output unit that outputs the first viewpoint image to the main processing unit; and
 a second output unit that outputs the second viewpoint image to the sub processing unit.
3. The image processing apparatus according to claim 2, wherein
 the main processing unit communicates a synchronization signal, for causing processing to be performed in synchronization with the main processing unit, to the sub processing unit via a dedicated communication path, and
 the sub processing unit superimposes the first superimposition data on the second viewpoint image in synchronization with the synchronization signal communicated from the main processing unit via the dedicated communication path.
4. The image processing apparatus according to claim 2, wherein the main processing unit includes:
 a first holding unit that holds the plurality of pieces of superimposition data in advance;
 a first changing unit that changes an internal state of the main processing unit based on a user input operation; and
 a first superimposing unit that superimposes, on the first viewpoint image, the first superimposition data indicating the internal state of the main processing unit changed by the first changing unit, among the plurality of pieces of superimposition data held in the first holding unit.
5. The image processing apparatus according to claim 4, wherein the sub processing unit includes:
 a second holding unit that holds the plurality of pieces of superimposition data in advance;
 a second changing unit that changes an internal state of the sub processing unit to the same internal state as the main processing unit in synchronization with the change of the internal state of the main processing unit; and
 a second superimposing unit that superimposes, on the second viewpoint image, the first superimposition data indicating the internal state of the sub processing unit changed by the second changing unit, among the plurality of pieces of superimposition data held in the second holding unit.
6. The image processing apparatus according to claim 1, wherein
 the image processing apparatus includes a plurality of the sub processing units, and
 the 3D image generation unit generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint images after superimposition obtained from the plurality of sub processing units, respectively.
7. An image processing method of an image processing apparatus that generates a 3D image, the image processing apparatus including a main processing unit, a sub processing unit, and a 3D image generation unit, the method comprising the steps of:
 the main processing unit superimposing first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint;
 the sub processing unit, in synchronization with the main processing unit, superimposing the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and
 the 3D image generation unit generating a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
8. A program for causing a computer to function as:
 a main processing unit that superimposes first superimposition data, among a plurality of pieces of superimposition data held in advance, on a first viewpoint image obtained from a first viewpoint;
 a sub processing unit that, in synchronization with the main processing unit, superimposes the first superimposition data, among the plurality of pieces of superimposition data held in advance, on a second viewpoint image obtained from a second viewpoint different from the first viewpoint; and
 a 3D image generation unit that generates a 3D image composed of the first viewpoint image after superimposition and the second viewpoint image after superimposition.
PCT/JP2013/066435 2012-06-25 2013-06-14 Image processing device, image processing method, and program Ceased WO2014002790A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-142023 2012-06-25
JP2012142023 2012-06-25

Publications (1)

Publication Number Publication Date
WO2014002790A1 true WO2014002790A1 (en) 2014-01-03

Family

ID=49782952

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/066435 Ceased WO2014002790A1 (en) 2012-06-25 2013-06-14 Image processing device, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2014002790A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011024021A (en) * 2009-07-16 2011-02-03 Fujifilm Corp 3d image display apparatus, and 3d image display method
JP2011114606A (en) * 2009-11-27 2011-06-09 Panasonic Corp Video signal processing apparatus and video signal processing method
JP2011244218A (en) * 2010-05-18 2011-12-01 Sony Corp Data transmission system
JP2012113078A (en) * 2010-11-24 2012-06-14 Seiko Epson Corp Display device, control method for display device and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13810067; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 13810067; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)