WO2019082415A1 - Image processing device, imaging apparatus, method for controlling image processing device, image processing program, and recording medium - Google Patents

Image processing device, imaging apparatus, method for controlling image processing device, image processing program, and recording medium

Info

Publication number
WO2019082415A1
WO2019082415A1 (PCT/JP2018/014401)
Authority
WO
WIPO (PCT)
Prior art keywords
image
area
unit
pixel
image processing
Prior art date
Application number
PCT/JP2018/014401
Other languages
French (fr)
Japanese (ja)
Inventor
奈保 徳井
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社 (Sharp Corporation)
Priority to JP2019549820A priority Critical patent/JPWO2019082415A1/en
Publication of WO2019082415A1 publication Critical patent/WO2019082415A1/en

Links

Images

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/122Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to an image processing apparatus, an imaging apparatus, a control method of the image processing apparatus, an image processing program, and a recording medium.
  • Patent Document 1 discloses a stereo camera for capturing a stereoscopic image.
  • Patent Document 2 describes a technique for generating an image in which an area of an image shielded by another lens is complemented in an imaging device having three or more lenses.
  • Japanese Patent Publication: Japanese Patent Application Laid-Open No. 2005-70077 (published March 17, 2005)
  • Japanese Patent Publication: Japanese Patent Application Laid-Open No. 2015-133691 (published July 23, 2015)
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide an image processing apparatus capable of generating a suitable image while reducing the load of presetting, and related techniques.
  • The image processing device according to one aspect of the present invention includes: a determination unit that determines a first complemented area of a first image captured by a first imaging unit, based on a first still area in a moving image captured by the first imaging unit; and a complementing unit that complements each pixel in the first complemented area with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit.
  • The control method of an image processing apparatus according to one aspect of the present invention includes: a determination step in which the image processing apparatus determines a first complemented area of a first image captured by a first imaging unit, based on a first still area in a moving image captured by the first imaging unit; and a complementing step in which the image processing apparatus complements each pixel in the first complemented area with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit.
  • According to one aspect of the present invention, it is possible to provide an image processing apparatus, and related techniques, capable of generating a suitable image while reducing the load of presetting.
  • FIG. 1 is a functional block diagram showing a configuration of an imaging device 102 according to an embodiment of the present invention.
  • the imaging device 102 includes an image processing device 101, a stereo imaging unit 103, a storage unit 104, and a display unit 105.
  • As the imaging device 102, an imaging device provided with a plurality of imaging units for imaging the same subject from different positions can be used.
  • the imaging device 102 is a stereo camera, and may include, for example, a twin-lens digital camera and a video camera.
  • the stereo imaging unit 103 includes a first imaging unit 107 and a second imaging unit 108.
  • the stereo imaging unit 103 outputs the first image and the first moving image generated by the first imaging unit 107, and the second image and the second moving image generated by the second imaging unit 108 to the image processing apparatus 101.
  • the first image may be included in the first moving image.
  • the second image may be included in the second moving image.
  • the first imaging unit 107 is installed at a first viewpoint position corresponding to the left eye, and generates a first image or a first moving image including the subject by imaging the subject.
  • the second imaging unit 108 is installed at a second viewpoint position corresponding to the right eye, and generates a second image or a second moving image including the subject by imaging the subject.
  • Examples of the first imaging unit 107 and the second imaging unit 108 include units having an imaging lens and an imaging element such as a CCD (Charge Coupled Device).
  • the number of pixels of the images generated by the first imaging unit 107 and the second imaging unit 108 may be the same or may be different from each other.
  • The storage unit 104 stores, on a recording medium, image information such as the first image, the second image, the shielded area (first complemented area) in the first image, the shielded area (second complemented area) in the second image, and the corrected image corrected by the image correction unit 106. Further, the storage unit 104 reads the above-mentioned images stored in the recording medium and outputs them to the image processing apparatus 101.
  • a general-purpose storage unit can be used as long as it has the above-described function.
  • The shielded area means an area, in at least one of the first image and the second image, in which a part of the other imaging unit (for example, a lens or a camera body) other than the imaging unit that captured the image appears.
  • the shielded area in each image is also referred to as a complemented area because it is complemented using another image.
  • "Complement" means correcting the image so as to replace each pixel of the complemented area of the original image with the corresponding pixel of another image. In one aspect, correcting to replace may mean determining the pixel value of each pixel of the complemented area of the original image based on the pixel value of the corresponding pixel of the other image.
  • the storage unit 104 may be, for example, a member that reads the above-described image from a predetermined recording medium and reproduces the read image.
  • recording media include storage devices in commercially available cameras such as digital cameras and video cameras, and removable storage devices (for example, electronic media such as magneto-optical disks and semiconductor memories).
  • the display unit 105 displays, for example, a corrected image including text information and main subject information.
  • Examples of the display unit 105 include display screens such as an LCD (Liquid Crystal Display) and an organic EL (Electro Luminescence) display.
  • the image processing apparatus 101 generates a corrected image obtained by correcting the shielding area from the first image and the second image acquired from the stereo imaging unit 103, and outputs the corrected image to at least one of the storage unit 104 or the display unit 105.
  • the image processing apparatus 101 includes, for example, a central processing unit (CPU) or a graphic processing unit (GPU).
  • the image processing apparatus 101 further includes an image correction unit (complement unit) 106 and a shielding area setting unit (determination unit) 801.
  • The image correction unit 106 acquires the first image and the second image from the stereo imaging unit 103 or the storage unit 104, and generates a corrected image by correcting at least one of the first image and the second image. Specifically, if the first image contains a shielded area shielded by the second imaging unit 108, the image correction unit 106 generates a corrected image by correcting the first image so that the shielded area is replaced using the second image. Likewise, if the second image contains a shielded area shielded by the first imaging unit 107, the image correction unit 106 generates a corrected image by correcting the second image so that the shielded area is replaced using the first image.
  • (a) and (b) of FIG. 3 are diagrams explaining an example of the shielded area (first shielded area) 302 in the first image 301 and the shielded area (second shielded area) 304 in the second image 303, both captured by the stereo imaging unit 103.
  • As described above, the shielded area is an area, in at least one of the first image and the second image, in which a part of the other imaging unit (lens, camera body, and so on) other than the imaging unit that captured that image appears.
  • Hereinafter, the shielded area in the case where the first imaging unit 107 of the stereo imaging unit 103 is the left-eye camera and the second imaging unit 108 is the right-eye camera will be described with reference to FIG. 3.
  • As shown in (a) of FIG. 3, the shielded area 302 in the first image 301 captured by the first imaging unit 107 is located at the right end of the angle of view. Further, as shown in (b) of FIG. 3, the shielded area 304 in the second image 303 captured by the second imaging unit 108 is located at the left end of the angle of view.
  • The shielded area can be represented as a per-pixel indication of whether or not each pixel in the image belongs to the shielded area.
  • For example, a shielding area value M(u, v) is set for the pixel coordinates I(u, v) of the photographed image.
  • The shielding area setting unit 801 extracts, from the plurality of input frame images, an area that does not move across the plurality of frame images, and sets the extracted area as the shielding area. Thereby, even when the baseline length of the stereo imaging unit 103 or the shape of the lens has changed from the state in which the shielding area was set in the storage unit 104 in advance, the shielding area can be set appropriately.
  • FIG. 2 is a flowchart showing the flow of image processing performed by the imaging device 102.
  • Step S201 The shielding area setting unit 801 acquires a first image and a second image which are a plurality of moving images (frame images) captured by the first imaging unit 107 and the second imaging unit 108 in synchronization with each other.
  • Step S202 The shielding area setting unit 801 extracts a stationary area in a plurality of frame images captured by one imaging unit, and sets (determines) the stationary area as a shielding area of one image.
  • the shielding area setting unit 801 outputs the set shielding area to the image correction unit 106 (determination step).
  • the shielded area can be suitably set even when the baseline length of the stereo imaging unit 103 or the shape of the lens changes from the state where the shielded area is set in the storage unit 104 in advance.
  • Step S203: Based on the shielded area, the image correction unit 106 complements each pixel of the shielded area in one image with the pixel corresponding to it (first complementing pixel or second complementing pixel) in the other image captured by the other imaging unit, thereby generating a corrected image (complementing step).
  • Step S204 The display unit 105 displays the corrected image, and ends the image processing.
  • The imaging device 102 corrects at least one of the first image and the second image by performing the image processing of steps S201 to S204 on one or both of them, as sketched below.
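A minimal sketch of steps S201 to S204, assuming NumPy arrays for the frame images; `detect_still_region` and `complement_shielded_area` are hypothetical helper names (sketched further below), not functions named in the application:

```python
import numpy as np

def process_stereo_pair(frames_1, frames_2, f, assumed_D, baseline_B):
    """Sketch of steps S201-S204 for synchronized frame lists from the
    first and second imaging units (each frame an H x W [x 3] array)."""
    # S201: acquire synchronized frame images from both imaging units
    first_image, second_image = frames_1[-1], frames_2[-1]

    # S202: determine the shielded (still) area of each image from its own frames
    mask_1 = detect_still_region(frames_1)
    mask_2 = detect_still_region(frames_2)

    # S203: complement each shielded pixel with the corresponding pixel of the
    # other image (the geometric mapping is sketched later in this document)
    corrected_1 = complement_shielded_area(first_image, second_image, mask_1,
                                           f, assumed_D, baseline_B)
    corrected_2 = complement_shielded_area(second_image, first_image, mask_2,
                                           f, assumed_D, baseline_B)

    # S204: hand the corrected image(s) to the display unit
    return corrected_1, corrected_2
```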
  • the other imaging unit appears in an image captured by one imaging unit.
  • the second imaging unit 108 for capturing a second image appears in the first image that is an image for the left eye.
  • the first imaging unit 107 for capturing the left-eye image is captured in the second image that is the right-eye image.
  • In the imaging device 102, when a stereoscopic image containing the adjacent imaging unit is displayed on the display unit 105, different subjects appear in the first image, which is the left-eye image, and the second image, which is the right-eye image. Therefore, corresponding points do not fuse, and it may be difficult for the user to view the image stereoscopically.
  • In contrast, according to the imaging device 102, by correcting the shielded area in one image using the other image, the same subject is photographed in the first image and the second image, and a suitable image can be generated.
  • In this way, the shielded area setting unit 801 extracts, in one image, the area shielded by the imaging unit other than the one that captured the image (step S202). That is, the shielded area (first complemented area) of the first image captured by the first imaging unit 107 is determined based on the first still area in the first moving image captured by the first imaging unit 107. Further, the shielded area (second complemented area) of the second image captured by the second imaging unit 108 may be determined based on the second still area in the second moving image captured by the second imaging unit 108.
  • When the stereo imaging unit 103 captures a moving image composed of a plurality of frame images, the shielded area is captured at the same position over the plurality of frames. This will be described specifically below with reference to FIG. 4.
  • For convenience, FIG. 4 illustrates the case where the plurality of frame images 901 to 903 are the first moving image captured by the first imaging unit 107; the shielding area can be set in the same way for a plurality of frame images captured by the second imaging unit 108.
  • the shielding regions 904 to 906 are photographed at the same position in the plurality of frame images 901 to 903.
  • On the other hand, a moving subject such as a person is photographed at a different position in each frame.
  • Therefore, the shielding area setting unit 801 can extract a stationary (non-moving) area across the plurality of frame images and set that area as the shielding area.
  • For example, the shielding area setting unit 801 can set an area where the difference in pixel value between frame images captured at different times is smaller than a threshold as a stationary area, and set the stationary area as the shielding area.
  • In general, the other imaging unit viewed from one imaging unit is located at the angle-of-view end (the image end of the captured image). Likewise, an object that is fixed to the same pedestal as the imaging unit and acts as a shielding object is also located at the angle-of-view end (the image end of the captured image).
  • Therefore, when a plurality of stationary areas are extracted, the shielding area setting unit 801 may set, among the plurality of areas, only the areas in contact with the angle-of-view end as the shielding area.
  • Thereby, the shielded area setting unit 801 can extract only the shielded areas 904 to 906 caused by the first imaging unit 107 or the second imaging unit 108.
  • An example of a stationary subject that can appear away from the image end is a subject such as a cable.
  • the shielding area setting unit 801 outputs the shielding area set as described above to at least one of the image correction unit 106 and the storage unit 104.
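A minimal sketch of the still-region extraction described above, assuming NumPy frame arrays; the difference threshold and the use of scipy's connected-component labeling to keep only regions touching the image border are illustrative choices, not values or steps prescribed by the application:

```python
import numpy as np
from scipy import ndimage  # connected-component labeling

def detect_still_region(frames, diff_threshold=3.0):
    """Mark pixels whose value barely changes across the frames and that belong
    to a connected still region touching the image border (candidate shielding area)."""
    stack = np.stack([np.asarray(f, dtype=np.float32) for f in frames])  # (N, H, W[, C])
    max_diff = stack.max(axis=0) - stack.min(axis=0)
    if max_diff.ndim == 3:                     # color frames: largest per-channel difference
        max_diff = max_diff.max(axis=-1)
    still = max_diff < diff_threshold          # stationary pixels

    # Keep only still regions in contact with the angle-of-view end (image border),
    # where the other imaging unit or an object fixed to the same pedestal appears.
    labels, n = ndimage.label(still)
    mask = np.zeros_like(still)
    for lab in range(1, n + 1):
        region = labels == lab
        if (region[0, :].any() or region[-1, :].any() or
                region[:, 0].any() or region[:, -1].any()):
            mask |= region
    return mask  # boolean M(u, v): True where the pixel is treated as shielded
```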
  • FIG. 5 is a diagram showing the relationship between the first imaging unit 107 and the second imaging unit 108 using a wide-angle lens and the subject 505 to be photographed in stereo.
  • the wide-angle lens means a lens having a maximum angle of view of 180 degrees.
  • Here, the angle of view means the angle between the optical axis of the lens and the subject 505.
  • That is, the angle of view θ1 501 of the first imaging unit 107 means the angle formed between the optical axis 506 and the subject 505, and the angle of view θ2 502 of the second imaging unit 108 means the angle formed between the optical axis 507 and the subject 505.
  • As shown in FIG. 5, the second imaging unit 108 is within the angle of view θ1 501 of the first imaging unit 107, and the second imaging unit 108 shields the subject 505 located farther away than the second imaging unit 108.
  • Here, the imaging distance to the subject 505 assumed to be captured in the shielding area of the first imaging unit 107 (the subject 505 corresponding to the shielding area of the first imaging unit 107) is defined as the assumed shooting distance 503.
  • Here, the relationship of the following Equation (1) holds between the angle of view θ1 501 of the first imaging unit 107 and the angle of view θ2 502 of the second imaging unit 108.
  • Based on the relationship of Equation (1), the image correction unit 106 can correct the shielded area within the angle of view θ1 501 using the unshielded area within the angle of view θ2 502.
  • In Equation (1), the angle of view θ1 501 and the angle of view θ2 502 are assumed to be expressed in radians.
  • the parameter D represents the assumed shooting distance 503.
  • the parameter B represents the distance 504 between the imaging units, that is, the baseline length.
  • For the baseline length, a value obtained by measuring the distance between the imaging units is used.
  • Examples of the method of measuring the distance 504 between the imaging units include measuring it visually using a scale or the like, and obtaining it by camera calibration.
  • the image correction unit 106 sets the assumed shooting distance 503 to, for example, 5 m in advance.
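Equation (1) itself appears only as an image in the published application and is not reproduced here. Purely as an illustration, assuming parallel optical axes, a baseline B between the imaging units, and a subject at the assumed shooting distance D from the first imaging unit, one plausible relationship between the two angles of view is:

```latex
% Illustrative geometry only -- not the literal Equation (1) of the application.
% A point seen by the first imaging unit at angle \theta_1 and distance D lies at
% (D\sin\theta_1,\; D\cos\theta_1); viewed from the second imaging unit offset by B:
\theta_2 = \arctan\!\left(\frac{D\sin\theta_1 - B}{D\cos\theta_1}\right)
```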
  • FIGS. 6A and 6B are diagrams for explaining the deviations 602 and 603 of the corrected image when the assumed imaging distance 503 and the actual imaging distance 601 are different.
  • When the assumed imaging distance 503 and the actual imaging distance 601 differ, a shift 602 occurs in the corrected image.
  • The shift 602 becomes smaller as the angle of view approaches the maximum angle of view, and becomes minute around the 180-degree angle of view (around 90 degrees from the image center, taking the image center as 0 degrees).
  • That is, as shown in (b) of FIG. 6, the shift 603 at the angle of view θ1 604 near the maximum angle of view is smaller than the shift 602 at the angle of view θ1 501 of the first imaging unit 107.
  • Therefore, the image correction unit 106 can generate a corrected image that does not look unnatural even when the second image captured by the second imaging unit 108 is combined with the first image captured by the first imaging unit 107.
  • Next, the case where the image correction unit 106 corrects the shielded area in the first image photographed by the first imaging unit 107 using the second image photographed by the second imaging unit 108, that is, the case where the image correction unit 106 corrects the pixel at the pixel position I1(u1, v1) on the first image using the pixel at the pixel position I2(u2, v2) on the second image, will be described in detail.
  • First, the image correction unit 106 converts the pixel position I1(u1, v1) on the first image into the angle of view (θ1, φ1), and calculates the angle of view (θ2, φ2) on the second image corresponding to the angle of view (θ1, φ1) on the first image.
  • Then, the image correction unit 106 converts the angle of view (θ2, φ2) into the pixel position I2(u2, v2) on the second image, thereby associating the pixels on the first image with the pixels on the second image.
  • the image correction unit 106 corrects the shielded area in the first image using the pixels on the associated second image. More specific description will be made below using formulas (2) to (12).
  • The image correction unit 106 converts the pixel position I1(u1, v1) on the first image into the angle of view (θ1, φ1) using the following equations (2) and (3).
  • The angle θ1 represents the horizontal angle of view [radian], and the angle φ1 represents the vertical angle of view [radian].
  • r is the distance [pixel] from the center of the image, and is expressed by the following equation (4).
  • w represents the number of horizontal pixels [pixel] of the image, and h represents the number of vertical pixels [pixel] of the image.
  • the focal length f may use a value determined in advance as lens specifications, or a value obtained by camera calibration.
  • Next, the image correction unit 106 converts the angle of view (θ1, φ1) in the first image into the angle of view (θ2, φ2) in the second image.
  • The horizontal angle of view θ is converted using equation (1).
  • The vertical angle of view φ is converted using equation (5).
  • Next, the image correction unit 106 converts the angle of view (θ2, φ2) on the second image into the pixel position I2(u2, v2) on the second image using the following equations (6) and (7).
  • In equations (6) and (7), f(v, R) is represented by equation (8), and the parameters α, β, and v are represented by equations (9), (10), and (11).
  • In this way, the image correction unit 106 calculates the correspondence between pixel positions in the first image and pixel positions in the second image, and based on that correspondence, corrects the pixels of the shielded area in the first image using pixels of the second image. For example, the image correction unit 106 may perform the correction using nearest neighbor interpolation, which uses the pixel value at the closest position as in the following equation (12), or bilinear interpolation, which linearly interpolates the pixel value from the four surrounding pixels.
  • That is, the image correction unit 106 obtains the pixel value at the pixel position I2(u2, v2) on the second image using nearest neighbor interpolation or bilinear interpolation, and assigns it to the pixel position I1(u1, v1) on the first image. When the number of pixels of the second image is larger than that of the first image, for example, the pixel values of more pixels on the second image may be used to complement a pixel on the first image. A sketch of this mapping and sampling step is given below.
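A minimal sketch of this correspondence and sampling, assuming a simplified per-axis equidistant projection (angle proportional to the offset from the image center) for the pixel-to-angle conversion and the illustrative angle remapping shown earlier; equations (2) to (12) of the application are published as images, so every formula below is an assumption rather than the patented form:

```python
import numpy as np

def pixel_to_angle(u, v, w, h, f):
    """Pixel position -> (horizontal, vertical) angle [radian], assuming a
    simplified equidistant model: angle = offset from the image center / focal
    length f [pixel]."""
    return (u - w / 2.0) / f, (v - h / 2.0) / f

def angle_to_pixel(theta, phi, w, h, f):
    """Inverse of pixel_to_angle under the same assumed projection."""
    return w / 2.0 + f * theta, h / 2.0 + f * phi

def remap_horizontal_angle(theta1, D, B):
    """Illustrative stand-in for Equation (1): the angle at which the second
    imaging unit, offset by baseline B, sees a point that the first imaging
    unit sees at angle theta1 and assumed distance D."""
    return np.arctan2(D * np.sin(theta1) - B, D * np.cos(theta1))

def sample_bilinear(img, u, v):
    """Bilinear interpolation of img at the fractional position (u, v)."""
    u = float(np.clip(u, 0.0, img.shape[1] - 1.001))
    v = float(np.clip(v, 0.0, img.shape[0] - 1.001))
    u0, v0 = int(u), int(v)
    du, dv = u - u0, v - v0
    top = (1 - du) * img[v0, u0] + du * img[v0, u0 + 1]
    bottom = (1 - du) * img[v0 + 1, u0] + du * img[v0 + 1, u0 + 1]
    return (1 - dv) * top + dv * bottom

def complement_shielded_area(img1, img2, mask, f, D, B):
    """Replace each shielded pixel of img1 with the corresponding pixel of img2."""
    h, w = img1.shape[:2]
    out = img1.copy()
    for v1, u1 in zip(*np.nonzero(mask)):
        theta1, phi1 = pixel_to_angle(u1, v1, w, h, f)
        theta2 = remap_horizontal_angle(theta1, D, B)    # horizontal angle in image 2
        u2, v2 = angle_to_pixel(theta2, phi1, w, h, f)   # vertical angle kept unchanged
        out[v1, u1] = sample_bilinear(img2, u2, v2)
    return out
```

Nearest neighbor interpolation, the other option mentioned above, would simply round (u2, v2) to the closest integer pixel instead of calling `sample_bilinear`.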
  • the image correction unit 106 obtains the correspondence between the pixel position I 1 (u 1 , v 1 ) in the first image and the pixel position I 2 (u 2 , v 2 ) in the second image.
  • the image correction unit 106 corrects all pixels in the shielded area in the first image using the second image based on the correspondence relationship.
  • In this way, the image correction unit 106 complements each pixel of the shielded area (first complemented area) of the first image with the pixel in the second image corresponding to that pixel (the first complementing pixel).
  • Similarly, the image correction unit 106 may complement each pixel of the shielded area (second complemented area) of the second image with the pixel in the first image corresponding to that pixel (the second complementing pixel).
  • At this time, the image correction unit 106 may perform the various calculations by treating the shooting distance to the subject 505 corresponding to the first complemented area as the predetermined assumed shooting distance D. As described above, even when the predetermined assumed shooting distance D is used, a corrected image that causes little sense of incongruity can be generated.
  • In the above description, the image correction unit 106 corrects both the first image and the second image; however, in the present embodiment, the image correction unit 106 may correct only one image. For example, when the display unit 105 previews the captured image, the display unit 105 may display only the corrected first image as the corrected image. As a result, even if the display unit 105 is a simple display that cannot display a stereoscopic image, the user can suitably check the quality of the corrected image.
  • In the above description, the image correction unit 106 corrects the first image and the second image as they are input to the image correction unit 106; however, the first image and the second image input to the image correction unit 106 may be converted to a different projection scheme and then corrected.
  • For example, an image captured by a fisheye lens is represented by an equidistant projection method or an equisolid angle projection method, and the image correction unit 106 may convert the first image and the second image represented by these projection methods into images represented by perspective projection and then perform the correction. After the correction, the image correction unit 106 converts the corrected image back into an image of the equidistant or equisolid angle projection method.
  • In this case, the image correction unit 106 corrects one image, in which the image end is greatly stretched by the conversion to the perspective projection representation, using the other image, whose image end is likewise greatly stretched.
  • Because the number of pixels at the greatly stretched image end increases, the image correction unit 106 can correct the pixels in the shielded area at the end of one image with high accuracy using the pixels at the end of the other image.
  • In the above description, the image correction unit 106 corrects only the shielded area; however, it may also correct an adjacent area adjacent to the shielded area. For example, when the image correction unit 106 corrects the shielded area of the first image using the second image, it may correct each pixel in the adjacent area adjacent to the shielded area based on both that pixel (the pixel of the adjacent area) and the pixel corresponding to it in the second image.
  • The image correction unit 106 can calculate the pixel corresponding to each pixel of the adjacent area in the second image by the same method as the pixel corresponding to each pixel of the shielded area in the second image.
  • That is, the image correction unit 106 may correct each pixel of the adjacent area adjacent to the shielded area using a pixel obtained by mixing the pixel of the first image and the corresponding pixel of the second image. By mixing and correcting the pixels of the first image and the pixels of the second image in this way, the image correction unit 106 can make the boundary between the corrected area and the uncorrected area in the image inconspicuous.
  • For example, the image correction unit 106 may set the mixing ratio of the first image and the second image so that the proportion of the first-image pixel increases from around the maximum angle of view toward the image center, as sketched below.
  • This makes it possible to smooth the boundary between the corrected area and the uncorrected area of the image. Further, even when the image correction unit 106 corrects the shielded area of the second image using the first image, the boundary between the corrected area and the uncorrected area in the image can be smoothed using the same method.
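A minimal sketch of such a mix, assuming the weight is driven by the pixel's horizontal distance from the nearest image edge as a stand-in for its angle of view; the width of the transition band is an illustrative choice:

```python
import numpy as np

def second_image_weight(u, w, band=32):
    """Weight of the second-image pixel: 1 at the image edge (around the maximum
    angle of view), falling to 0 over `band` pixels toward the image center."""
    dist_from_edge = min(u, w - 1 - u)
    return float(np.clip(1.0 - dist_from_edge / band, 0.0, 1.0))

def blend_adjacent_pixel(p1, p2, u, w):
    """Mix the first-image pixel p1 with the corresponding second-image pixel p2
    so that the boundary between corrected and uncorrected areas stays smooth."""
    a = second_image_weight(u, w)
    return (1.0 - a) * p1 + a * p2
```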
  • In the above description, the image correction unit 106 outputs, as it is, the image obtained by correcting each pixel of the shielded area of the first image using the pixels of the second image; however, it may further apply image processing to the corrected shielded area before output.
  • For example, the image correction unit 106 may apply blurring processing or contrast-suppressing processing to the corrected shielded area. This makes the corrected shielded area less noticeable.
  • Further, the image correction unit 106 may apply stronger blurring or stronger contrast suppression as the distance from the center of the image increases, that is, toward the image end.
  • As a result, the boundary between the corrected area and the uncorrected area can be made even less conspicuous.
  • The image correction unit 106 may further apply the above-described image processing not only to the corrected shielded area but also to the periphery of the shielded area. As a result, the boundary between the corrected area and the uncorrected area can be made even less conspicuous and smoother. A sketch of such edge-dependent softening follows.
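A minimal sketch of such post-processing, assuming OpenCV's GaussianBlur; blending in more of the blurred image and pulling pixel values toward the mean as the pixel approaches the image end approximates the "stronger blur and lower contrast toward the edge" behavior, with the kernel size and factors as illustrative choices:

```python
import cv2
import numpy as np

def soften_corrected_area(img, mask, kernel=15, max_contrast_drop=0.5):
    """Blur and reduce contrast inside the corrected (shielded) area,
    more strongly the closer a pixel lies to the image edge."""
    h, w = img.shape[:2]
    blurred = cv2.GaussianBlur(img, (kernel, kernel), 0)
    mean = float(img.mean())
    out = img.astype(np.float32)
    for v, u in zip(*np.nonzero(mask)):
        edge_ratio = 1.0 - min(u, w - 1 - u) / (w / 2.0)   # 0 at center, 1 at edge
        soft = (1 - edge_ratio) * out[v, u] + edge_ratio * blurred[v, u]
        out[v, u] = (1 - max_contrast_drop * edge_ratio) * soft \
                    + max_contrast_drop * edge_ratio * mean
    return out.astype(img.dtype)
```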
  • In the above description, the image correction unit 106 obtains the correspondence between the pixels of the first image and the pixels of the second image from the predetermined assumed shooting distance D; however, the assumed shooting distance D may be changed according to the angle of view in order to calculate the correspondence between the pixels of the first image and the pixels of the second image.
  • an example of a method of changing the subject imaging distance D assumed to be captured in the shielded area in accordance with the angle of view will be described with reference to FIG.
  • FIG. 7 is a diagram for explaining an example of a method in which the image correction unit 106 changes the assumed shooting distance according to the angle of view.
  • For a pixel at one angle of view in the shielded area of the first image, the image correction unit 106 changes the assumed shooting distance to the assumed shooting distance 702.
  • For a pixel at a larger angle of view, the image correction unit 106 changes the assumed shooting distance to the assumed shooting distance 704.
  • When calculating the corresponding pixel (first complementing pixel) on the second image for a pixel whose angle of view is 90 degrees (the image end) in the shielded area of the first image, the image correction unit 106 changes the assumed shooting distance to the assumed shooting distance 705.
  • By setting the assumed shooting distance to be longer as the angle of view becomes larger in this way, the image correction unit 106 can generate an image in which the projection plane 706 is inclined, as shown in (a) of FIG. 7. This makes it possible to generate a corrected image that takes the perspective of the subject into consideration. Further, as shown in (b) of FIG. 7, the image correction unit 106 may change the relationship between the assumed shooting distance and the angle of view so that the projection surface 707 becomes a curved surface. This makes it possible to generate a corrected image that takes the distortion of the wide-angle lens into consideration. A sketch of such angle-dependent distances follows.
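A minimal sketch of two ways the assumed shooting distance could grow with the angle of view, corresponding loosely to the inclined projection plane 706 and the curved projection surface 707; the functions and numeric values are illustrative assumptions, not formulas from the application:

```python
import numpy as np

def assumed_distance_tilted(theta, d_near=3.0, d_far=7.0, theta_max=np.pi / 2):
    """Inclined projection plane: assumed distance grows linearly with the angle of view."""
    return d_near + (d_far - d_near) * min(abs(theta) / theta_max, 1.0)

def assumed_distance_curved(theta, d_center=5.0, d_edge=8.0, theta_max=np.pi / 2):
    """Curved projection surface: assumed distance grows smoothly toward the image end."""
    t = min(abs(theta) / theta_max, 1.0)
    return d_center + (d_edge - d_center) * (1.0 - np.cos(t * np.pi / 2.0))
```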
  • In the above description, the image correction unit 106 calculates the corresponding pixel of the other image for each pixel of one image; however, the correspondence between the pixels of the first image and the second image may be calculated in advance. In this case, the image correction unit 106 stores the correspondence in a lookup table or the like. As a result, the image correction unit 106 does not have to calculate the correspondence between the first image and the second image every time the correction processing is performed, so the processing time from the photographing of the first image and the second image to the display of the corrected image can be shortened. A sketch of such a lookup table follows.
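A minimal sketch of precomputing that correspondence once, reusing the hypothetical helpers sketched earlier; storing the fractional (u2, v2) for every pixel of the first image lets each correction pass skip the trigonometric calculations:

```python
import numpy as np

def build_correspondence_lut(w, h, f, D, B):
    """Precompute, for every pixel of the first image, the corresponding
    fractional pixel position in the second image (see the helpers above)."""
    lut = np.zeros((h, w, 2), dtype=np.float32)
    for v1 in range(h):
        for u1 in range(w):
            theta1, phi1 = pixel_to_angle(u1, v1, w, h, f)
            theta2 = remap_horizontal_angle(theta1, D, B)
            lut[v1, u1] = angle_to_pixel(theta2, phi1, w, h, f)
    return lut  # at correction time: u2, v2 = lut[v1, u1]
```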
  • In the above description, the image correction unit 106 outputs only the corrected image; however, it may output both the corrected image and the shielded area. By outputting the shielded area as well as the corrected image, the user can easily understand which part of the image has been corrected.
  • In the above description, the first image and the second image input to the imaging device 102 have a maximum angle of view of 180 degrees; however, a wide-angle image with an even wider angle of view, such as an image with a horizontal angle of view of 360 degrees captured using an omnidirectional camera using a mirror, may be input.
  • When the image correction unit 106 corrects a wide-angle image with a wider angle of view, the range of the shielded area shielded by the adjacent imaging unit or the like is larger than when an image with a maximum angle of view of 180 degrees is photographed. Therefore, a larger correction effect can be obtained when the image correction unit 106 corrects a wide-angle image with a wider angle of view.
  • In the above description, the image correction unit (output unit) 106 outputs the entire corrected image; however, it may output an image obtained by cutting out a part of the corrected image.
  • For example, the image correction unit 106 may change the range cut out from the corrected image according to the user's viewing angle, and display the cut-out image on the display unit 105.
  • In the above description, the imaging device 102 generates the first image and the second image using the stereo imaging unit 103; however, the present embodiment is not limited to this.
  • The imaging device 102 according to the present embodiment can use any imaging units with which a shielding area may be formed in the first image and the second image.
  • However, the image correction unit 106 can generate a corrected image with a particularly large correction effect when the stereo imaging unit 103 generates the first image and the second image, which is preferable.
  • In the above description, the shielding area setting unit 801 extracts and sets the shielding area every time the stereo imaging unit 103 performs shooting; however, the shielding area may be set when a signal for setting the shielding area is input from an input unit (not shown).
  • For example, the imaging device may include a shielding area setting button, and the shielding area setting unit 801 may extract and set the shielding area when the user presses the button.
  • Alternatively, the shielding area setting unit 801 may detect rotation of the imaging device 102a from a gyro sensor and extract the shielding area while the imaging device 102a is rotating. In particular, while the imaging device 102a is rotating, subjects other than the stereo imaging unit 103 move significantly between frames, so the shielding area setting unit 801 can easily extract the stereo imaging unit 103 as a still area, which is preferable.
  • In the above description, the image correction unit 106 corrects the first image and the second image based on the shielding area set by the shielding area setting unit 801. However, under some conditions, the first image and the second image may be corrected based on the shielded area read from the storage unit 104. For example, in the state immediately after the imaging device 102a is powered on and imaging is started, the image correction unit 106 may correct the first image and the second image based on the shielded area read out from the storage unit 104.
  • Thereafter, once the shielding area setting unit 801 has set a shielding area, the image correction unit 106 may correct the first image and the second image based on the shielded area set by the shielding area setting unit 801.
  • Further, when the shielding area setting unit 801 cannot extract a still area from the photographed images, the first image and the second image can be corrected based on the shielding area stored in the storage unit 104 in advance.
  • In the above description, the shielding area setting unit 801 sets the shielding area of the first image using the first image; however, the shielding area of the first image may be set using both the first image and the second image.
  • For example, the shielding area setting unit 801 may set the shielding area of each image by referring to both the result of extracting the still area for each image and the result of estimating corresponding points between the first image and the second image using template matching or the like.
  • Thereby, the shielding area setting unit 801 can distinguish between a still area common to the first image and the second image and a still area existing only in one of the images.
  • The shielding area setting unit 801 excludes the common still area from the shielding area, because a still area common to the first image and the second image can be regarded not as a shielding area but as a distant subject such as the background. On the other hand, since a still area existing only in one of the first image and the second image can be regarded as the imaging unit other than the one that captured that image, the shielding area setting unit 801 sets such a still area as the shielding area. In this way, by setting, based on the first image and the second image, the still area photographed only in the first image and the still area photographed only in the second image as the shielding areas of the respective images, the shielding area can be set suitably; a sketch of this distinction follows.
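A minimal sketch of that distinction, assuming boolean still-area masks from both images; comparing the two masks at identical coordinates is a crude stand-in for the corresponding-point estimation (e.g. template matching) mentioned above:

```python
import numpy as np

def split_still_areas(still_1, still_2):
    """Exclude still areas common to both images (background-like) and keep the
    still areas present in only one image as that image's shielding area."""
    common = still_1 & still_2
    shield_1 = still_1 & ~common   # still only in the first image
    shield_2 = still_2 & ~common   # still only in the second image
    return shield_1, shield_2
```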
  • The control blocks of the imaging devices 102 and 102a may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software.
  • the imaging devices 102 and 102a each include a computer that executes an instruction of an image processing program that is software that implements each function.
  • the computer includes, for example, at least one processor (control device), and at least one computer readable recording medium storing the image processing program.
  • the computer also includes hardware such as an operating system (OS) and peripherals.
  • the processor reads the image processing program from the recording medium and executes the program to achieve the object of the present invention.
  • The processor may be, for example, a CPU (Central Processing Unit).
  • As the recording medium, a "non-transitory tangible medium" can be used, for example, a ROM (Read Only Memory), a CD-ROM, a tape, a disk such as a flexible disk or a magneto-optical disk, a card, a semiconductor memory, a programmable logic circuit, or a hard disk incorporated in a computer system.
  • The recording medium also includes media that dynamically hold the image processing program for a short time, as in the case of transmitting the program via a network such as the Internet or a communication line such as a telephone line, and media that hold the program for a certain period of time, such as volatile memory inside a computer system serving as a server or a client.
  • The imaging devices 102 and 102a may each further include a RAM (Random Access Memory) or the like into which the image processing program is loaded.
  • the image processing program may be supplied to the computer via any transmission medium (communication network, broadcast wave, etc.) capable of transmitting the image processing program.
  • one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the image processing program is embodied by electronic transmission.
  • part or all of the imaging devices 102 and 102a may be realized as an LSI, which is typically an integrated circuit.
  • the control blocks of the imaging devices 102 and 102a may be individually chipped, or some or all of the control blocks may be integrated and chipped.
  • the method of circuit integration is not limited to LSI's, and implementation using dedicated circuitry or general purpose processors is also possible. In the case where an integrated circuit technology comes out to replace LSI's as a result of the advancement of semiconductor technology, it is also possible to use an integrated circuit according to such technology.
  • The control lines and information lines shown are those considered necessary for the description; not all control lines and information lines of a product are necessarily shown, and in practice almost all configurations may be regarded as mutually connected.
  • The image processing device according to aspect 1 of the present invention includes: a determination unit (shielded area setting unit 801) that determines a first complemented area (shielded areas 904 to 906) of a first image captured by a first imaging unit (107), based on a first still area in a moving image (frame images 901 to 903) captured by the first imaging unit; and a complementing unit (image correction unit 106) that complements each pixel in the first complemented area with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit (108).
  • the first imaging unit and the second imaging unit may constitute a stereo camera.
  • a suitable stereoscopic image can be generated.
  • In the above aspect, the determination unit may determine an area close to the image end of the moving image as the first complemented area.
  • In the above aspects, the complementing unit may further correct each pixel in an adjacent area adjacent to the first complemented area in the first image based on both that pixel and the pixel corresponding to it in the second image.
  • the boundary between the corrected area and the uncorrected area in the image can be made inconspicuous.
  • In the above-described aspects 1 to 4, the image processing device may further include an output unit configured to cut out and output a part of the first image complemented by the complementing unit.
  • In the above aspects, the determination unit may further determine a second complemented area of the second image based on a second still area in a moving image captured by the second imaging unit, and the complementing unit may further complement each pixel in the second complemented area with a second complementing pixel corresponding to that pixel in the first image.
  • both the first image and the second image can be suitably corrected.
  • In the above aspect, an area common to the first still area and the second still area may be excluded from the first complemented area and the second complemented area.
  • According to the above configuration, the first complemented area and the second complemented area can be suitably determined.
  • An imaging device according to an aspect of the present invention includes the image processing device according to any one of aspects 1 to 7, the first imaging unit, and the second imaging unit.
  • A control method of an image processing apparatus according to an aspect of the present invention includes: a determination step of determining a first complemented area of a first image captured by a first imaging unit, based on a first still area in a moving image captured by the first imaging unit; and a complementing step of complementing each pixel in the first complemented area with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit.
  • The image processing apparatus according to each aspect of the present invention may be realized by a computer.
  • In that case, an image processing program that realizes the image processing apparatus on the computer by operating the computer as each unit (software element) included in the image processing apparatus, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)

Abstract

An imaging apparatus (102) is provided with: a shielding region setting unit (801) that determines shielding regions (904-906) of a first image on the basis of a first static region of frame images (901-903) imaged by a first imaging unit (107); and an image correction unit (106) that complements pixels in the shielding regions (904-906) by using first complementing pixels corresponding to the pixels in a second image.

Description

Image processing device, imaging apparatus, method for controlling image processing device, image processing program, and recording medium
The present invention relates to an image processing apparatus, an imaging apparatus, a control method of the image processing apparatus, an image processing program, and a recording medium.
There is known an imaging device that captures a stereoscopic image and has a plurality of imaging units. This imaging device displays images on a display unit so that the left-eye image is seen by the user's left eye and the right-eye image by the user's right eye, enabling the user to view the scene stereoscopically. For example, Patent Document 1 below discloses a stereo camera for capturing a stereoscopic image. Patent Document 2 below describes a technique for generating, in an imaging device having three or more lenses, an image in which an area shielded by another lens is complemented.
Japanese Patent Publication: Japanese Patent Application Laid-Open No. 2005-70077 (published March 17, 2005); Japanese Patent Publication: Japanese Patent Application Laid-Open No. 2015-133691 (published July 23, 2015)
However, with the technique described in Patent Document 1, a part of the image may be shielded by another lens and a suitable image may not be generated. Moreover, the technique described in Patent Document 2 requires the area of the image shielded by another lens to be set in advance.
The present invention has been made in view of the above problems, and an object of the present invention is to provide an image processing apparatus capable of generating a suitable image while reducing the load of presetting, and related techniques.
In order to solve the above problem, an image processing device according to one aspect of the present invention includes: a determination unit that determines a first complemented area of a first image captured by a first imaging unit, based on a first still area in a moving image captured by the first imaging unit; and a complementing unit that complements each pixel in the first complemented area with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit.
A control method of an image processing apparatus according to one aspect of the present invention includes: a determination step in which the image processing apparatus determines a first complemented area of a first image captured by a first imaging unit, based on a first still area in a moving image captured by the first imaging unit; and a complementing step in which the image processing apparatus complements each pixel in the first complemented area with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit.
According to one aspect of the present invention, it is possible to provide an image processing apparatus, and related techniques, capable of generating a suitable image while reducing the load of presetting.
FIG. 1 is a functional block diagram showing the configuration of an imaging device according to an embodiment of the present invention. FIG. 2 is a flowchart showing the flow of the control processing of the image processing apparatus in an embodiment of the present invention. FIG. 3 is a diagram for explaining the shielded area in an embodiment of the present invention. FIG. 4 is a diagram for explaining a method of determining the shielded area in an embodiment of the present invention. FIG. 5 is a diagram showing the positional relationship between each imaging unit and the subject in an embodiment of the present invention. FIG. 6 is a diagram for explaining the deviation of the corrected image when the assumed shooting distance and the actual shooting distance differ in an embodiment of the present invention. FIG. 7 is a diagram for explaining an example of a method of changing the assumed shooting distance according to the angle of view in an embodiment of the present invention.
An imaging device 102 according to an embodiment of the present invention will be described in detail below with reference to the attached drawings.
[Imaging Device 102]
FIG. 1 is a functional block diagram showing a configuration of an imaging device 102 according to an embodiment of the present invention. As shown in FIG. 1, the imaging device 102 includes an image processing device 101, a stereo imaging unit 103, a storage unit 104, and a display unit 105. As the imaging device 102, an imaging device provided with a plurality of imaging units for imaging the same subject from different positions can be used. In one aspect, the imaging device 102 is a stereo camera, and may include, for example, a twin-lens digital camera and a video camera.
[Stereo imaging unit 103]
The stereo imaging unit 103 includes a first imaging unit 107 and a second imaging unit 108. The stereo imaging unit 103 outputs the first image and the first moving image generated by the first imaging unit 107, and the second image and the second moving image generated by the second imaging unit 108, to the image processing apparatus 101. The first image may be included in the first moving image. In addition, the second image may be included in the second moving image.
(First imaging unit 107, second imaging unit 108)
The first imaging unit 107 is installed at a first viewpoint position corresponding to the left eye, and generates a first image or a first moving image including the subject by imaging the subject. The second imaging unit 108 is installed at a second viewpoint position corresponding to the right eye, and generates a second image or a second moving image including the subject by imaging the subject. Examples of the first imaging unit 107 and the second imaging unit 108 include units having an imaging lens and an imaging element such as a CCD (Charge Coupled Device). The number of pixels of the images generated by the first imaging unit 107 and the second imaging unit 108 may be the same or may be different from each other.
[Storage unit 104]
The storage unit 104 stores, on a recording medium, image information such as the first image, the second image, the shielded area (first complemented area) in the first image, the shielded area (second complemented area) in the second image, and the corrected image corrected by the image correction unit 106. Further, the storage unit 104 reads the above-mentioned images stored in the recording medium and outputs them to the image processing apparatus 101. As the storage unit 104, a general-purpose storage unit can be used as long as it has the above-described functions.
In the present specification, the shielded area means an area, in at least one of the first image and the second image, in which a part of the other imaging unit (for example, a lens or a camera body) other than the imaging unit that captured the image appears. In the present invention, the shielded area in each image is also referred to as the complemented area because it is complemented using the other image. In the present specification, "complement" means correcting the image so as to replace each pixel of the complemented area of the original image with the corresponding pixel of the other image. In one aspect, correcting to replace may mean determining the pixel value of each pixel of the complemented area of the original image based on the pixel value of the corresponding pixel of the other image.
The storage unit 104 may be, for example, a member that reads the above-described images from a predetermined recording medium and reproduces the read images. Examples of the recording medium include storage devices in commercially available cameras such as digital cameras and video cameras, and removable storage devices (for example, electronic media such as magneto-optical disks and semiconductor memories).
[Display unit 105]
The display unit 105 displays, for example, a corrected image including text information and main subject information. Examples of the display unit 105 include display screens such as an LCD (Liquid Crystal Display) and an organic EL (Electro Luminescence) display.
[Image processing apparatus 101]
The image processing apparatus 101 generates a corrected image obtained by correcting the shielding area from the first image and the second image acquired from the stereo imaging unit 103, and outputs the corrected image to at least one of the storage unit 104 or the display unit 105. The image processing apparatus 101 includes, for example, a central processing unit (CPU) or a graphic processing unit (GPU).
The image processing apparatus 101 further includes an image correction unit (complementing unit) 106 and a shielding area setting unit (determination unit) 801.
(Image correction unit 106)
The image correction unit 106 acquires the first image and the second image from the stereo imaging unit 103 or the storage unit 104, and generates a corrected image by correcting at least one of the first image and the second image. Specifically, if the first image contains a shielded area shielded by the second imaging unit 108, the image correction unit 106 generates a corrected image by correcting the first image so that the shielded area is replaced using the second image. Likewise, if the second image contains a shielded area shielded by the first imaging unit 107, the image correction unit 106 generates a corrected image by correcting the second image so that the shielded area is replaced using the first image.
 FIGS. 3(a) and 3(b) are diagrams illustrating an example of a shielded area (first shielded area) 302 in a first image 301 and a shielded area (second shielded area) 304 in a second image 303 captured by the stereo imaging unit 103.
 As described above, a shielded area is a region, in at least one of the first image and the second image, in which part of the other imaging unit (lens, camera body, or the like), i.e., the imaging unit other than the one that captured that image, appears. The shielded areas in the case where the first imaging unit 107 of the stereo imaging unit 103 is the left-eye camera and the second imaging unit 108 is the right-eye camera will be described below with reference to FIG. 3. As shown in FIG. 3(a), the shielded area 302 in the first image 301 captured by the first imaging unit 107 is located at the right end of the angle of view. As shown in FIG. 3(b), the shielded area 304 in the second image 303 captured by the second imaging unit 108 is located at the left end of the angle of view. The shielded area can be represented as a per-pixel flag indicating whether or not each pixel in the image belongs to the shielded area. For example, a shielded-area mask M(u, v) is defined for the pixel coordinates I(u, v) of the captured image.
 (Shielded area setting unit 801)
 The shielded area setting unit 801 extracts, from a plurality of input frame images, a region that does not move across the plurality of frame images, and sets the extracted region as the shielded area. As a result, the shielded area can be set appropriately even when the baseline length of the stereo imaging unit 103 or the shape of a lens has changed from the state in which the shielded area was previously stored in the storage unit 104.
 〔Image processing〕
 Next, control processing of the image processing apparatus by the imaging apparatus 102 (a method for controlling the image processing apparatus) will be described with reference to FIG. 2. FIG. 2 is a flowchart showing the flow of image processing performed by the imaging apparatus 102.
 Step S201: The shielded area setting unit 801 acquires the first image and the second image, which are a plurality of moving images (frame images) captured by the first imaging unit 107 and the second imaging unit 108 in synchronization with each other.
 Step S202: The shielded area setting unit 801 extracts a stationary region in the plurality of frame images captured by one of the imaging units and sets (determines) the stationary region as the shielded area of that image. The shielded area setting unit 801 outputs the set shielded area to the image correction unit 106 (determination step).
 According to the above configuration, the shielded area can be set appropriately even when the baseline length of the stereo imaging unit 103 or the shape of a lens has changed from the state in which the shielded area was previously stored in the storage unit 104.
 Step S203: Based on the shielded area, the image correction unit 106 complements each pixel of the shielded area in one image with the pixel corresponding to that pixel (a first complementing pixel or a second complementing pixel) in the other image captured by the other imaging unit, thereby generating a corrected image (complementing step).
 Step S204: The display unit 105 displays the corrected image, and the image processing ends.
 The imaging apparatus 102 corrects at least one of the first image and the second image by performing the image processing of steps S201 to S204 on one or both of the first image and the second image.
 When the first imaging unit 107 and the second imaging unit 108 are adjacent to each other, and in particular when the imaging units form a stereo camera with wide-angle lenses, one imaging unit appears in the image captured by the other. For example, when the first imaging unit 107 and the second imaging unit 108, each having a wide-angle lens, are arranged in parallel, the second imaging unit 108, which captures the second image, appears in the first image, which is the left-eye image. Similarly, in this case, the first imaging unit 107, which captures the left-eye image, appears in the second image, which is the right-eye image. When a stereoscopic image in which the adjacent imaging unit appears in this way is displayed on the display unit 105, different subjects are shown in the first image (left-eye image) and the second image (right-eye image). As a result, fusion of corresponding points does not work and it may be difficult for the user to view the image stereoscopically. In contrast, according to the imaging apparatus 102, by correcting the shielded area in one image using the other image, the same subject is shown in the first image and the second image, and a suitable image can be generated.
 (Details of setting the shielded area)
 The shielded area setting unit 801 extracts, in one image, the region that is occluded by the other imaging unit, i.e., the imaging unit other than the one that captured that image (step S202). That is, based on a first stationary region in the first moving image captured by the first imaging unit 107, the shielded area (first complemented area) of the first image captured by the first imaging unit 107 is determined. Similarly, based on a second stationary region in the second moving image captured by the second imaging unit 108, the shielded area (second complemented area) of the second image captured by the second imaging unit 108 may be determined. When the stereo imaging unit 103 captures a moving image composed of a plurality of frame images, the occluded region is captured at the same position across the plurality of frames. This will be described in detail below with reference to FIG. 4.
 FIGS. 4(a) to 4(c) are diagrams for explaining shielded areas (first shielded areas) 904 to 906 in a plurality of frame images (first images) 901 to 903. Although FIG. 4 describes, for convenience, the case where the plurality of frame images 901 to 903 are the first moving image captured by the first imaging unit 107, the shielded area can be set in the same manner for a plurality of frame images captured by the second imaging unit 108.
 As shown in FIGS. 4(a) to 4(c), in the plurality of frame images 901 to 903, the shielded areas 904 to 906 are captured at the same position, whereas the person who is the other subject is captured at different positions. This is because the first imaging unit 107 and the second imaging unit 108 are attached to the same pedestal in order to keep them parallel, so that even when the imaging units as a whole move, the first imaging unit 107 and the second imaging unit 108 move in the same way, and each imaging unit appears stationary in the image captured by the other. Therefore, the shielded area setting unit 801 can obtain the shielded area by extracting the region that is stationary (the stationary region), that is, the region that does not move, across the plurality of frame images.
 Here, in a region that is not occluded, the captured subject differs between frames, so the difference in pixel values between frame images captured at different times is large. In the stationary region, on the other hand, the same subject appears, so the difference in pixel values between the frame images is small. Therefore, the shielded area setting unit 801 can treat a region in which the difference in pixel values between frame images captured at different times is smaller than a threshold as the stationary region, and set that stationary region as the shielded area.
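 As a concrete illustration of this step (not part of the embodiment itself), the following Python/OpenCV sketch marks as stationary the pixels whose inter-frame difference stays below a threshold; the threshold value and the use of consecutive frame pairs are assumptions made for illustration only.

    import cv2
    import numpy as np

    def extract_static_region(frames, diff_threshold=8):
        # frames: grayscale images of identical size captured at different times
        # by one imaging unit; diff_threshold is an illustrative value.
        static = np.ones(frames[0].shape[:2], dtype=bool)
        for prev, curr in zip(frames, frames[1:]):
            diff = cv2.absdiff(curr, prev)       # per-pixel |I_t - I_(t-1)|
            static &= diff < diff_threshold      # stationary only if similar in every pair
        return static

    # Usage sketch: mask = extract_static_region([frame_a, frame_b, frame_c])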
 Further, as shown in FIGS. 4(a) to 4(c), the other imaging unit, as seen from one imaging unit, is assumed to be located at the edge of the angle of view (the edge of the captured image). Likewise, an object that is fixed to the same pedestal as the imaging units and acts as an occluder is assumed to be located at the edge of the angle of view (the edge of the captured image).
 Therefore, when the stationary region is divided into a plurality of regions, the shielded area setting unit 801 may set, as the shielded area, only those regions that touch the edge of the angle of view. As a result, even when a subject that does not move relative to the image center, such as a cable, appears in the plurality of frame images 901 to 903, the shielded area setting unit 801 can extract only the shielded areas 904 to 906 caused by the first imaging unit 107 or the second imaging unit 108.
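 One possible way to keep only the stationary regions that touch the image edge is sketched below with OpenCV connected components; the helper name and the border-contact criterion are illustrative assumptions.

    import cv2
    import numpy as np

    def keep_border_touching_regions(static_mask):
        # static_mask: boolean mask of stationary pixels (e.g. from extract_static_region)
        num_labels, labels = cv2.connectedComponents(static_mask.astype(np.uint8))
        border_labels = set(np.unique(labels[0, :])) | set(np.unique(labels[-1, :])) \
            | set(np.unique(labels[:, 0])) | set(np.unique(labels[:, -1]))
        border_labels.discard(0)                 # label 0 is the non-stationary background
        return np.isin(labels, list(border_labels))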
 The shielded area setting unit 801 outputs the shielded area set as described above to at least one of the image correction unit 106 and the storage unit 104.
 (Overview of image correction)
 Correction of an image by the image correction unit 106 will be described below with reference to FIGS. 5 and 6.
 First, the relationship between the angle of view and image correction will be described with reference to FIG. 5. FIG. 5 is a diagram showing the relationship between the first imaging unit 107 and the second imaging unit 108, each using a wide-angle lens, and a subject 505 captured in stereo. Here, a wide-angle lens means a lens with a maximum angle of view of 180 degrees, and the angle of view means the angle between the optical axis of the lens and the subject 505.
 In the example shown in FIG. 5, the angle of view θ1 501 of the first imaging unit 107 means the angle between the optical axis 506 and the subject 505, and the angle of view θ2 502 of the second imaging unit 108 means the angle between the optical axis 507 and the subject 505. In FIG. 5, the second imaging unit 108 lies within the angle of view θ1 501 of the first imaging unit 107, and the subject 505, which is farther away than the second imaging unit 108, is occluded by the second imaging unit 108 and therefore is not captured.
 Here, the imaging distance to the subject 505 that is assumed to appear in the shielded area of the first imaging unit 107 (the subject 505 corresponding to the shielded area of the first imaging unit 107) is defined as the assumed imaging distance 503. In this case, the angle of view θ1 501 of the first imaging unit 107 and the angle of view θ2 502 of the second imaging unit 108 satisfy the relationship of the following equation (1). Based on the relationship of equation (1), the image correction unit 106 can correct the shielded area within the angle of view θ1 501 using the non-occluded area within the angle of view θ2 502.
 [Equation (1): relationship between the angles of view θ1 and θ2, the assumed imaging distance D, and the baseline length B]
 In equation (1), the angle of view θ1 501 and the angle of view θ2 502 are expressed in radians. The parameter D represents the assumed imaging distance 503, and the parameter B represents the distance 504 between the imaging units, that is, the baseline length. A measured value of the distance between the imaging units is used as the baseline length. Methods of measuring the distance 504 between the imaging units include visual measurement using a scale or the like and estimation by camera calibration. The image correction unit 106 sets the assumed imaging distance 503 in advance to, for example, 5 m.
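 The exact form of equation (1) appears only as an image in the original publication and is not reproduced here. Purely as an illustration, the sketch below computes a corresponding viewing angle under the simplifying assumptions that the two optical axes are parallel, the baseline B is perpendicular to them, and the subject lies at the assumed distance D along the first camera's viewing ray; it is a geometric sketch, not the patent's equation.

    import math

    def corresponding_angle(theta1, D=5.0, B=0.05):
        # theta1: viewing angle from the first camera [rad]; D: assumed distance [m];
        # B: baseline length [m] (both default values are illustrative).
        x = D * math.sin(theta1) - B     # lateral offset seen from the second camera
        z = D * math.cos(theta1)         # depth along the parallel optical axes
        return math.atan2(x, z)          # viewing angle from the second camera [rad]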
 FIGS. 6(a) and 6(b) are diagrams for explaining shifts 602 and 603 in the corrected image when the assumed imaging distance 503 differs from the actual imaging distance 601. As shown in FIG. 6(a), when the assumed imaging distance 503 and the actual imaging distance 601 differ, a shift 602 occurs in the corrected image. This shift becomes smaller toward the maximum angle of view, and becomes minute around an angle of view of 180 degrees (around an angle of view of 90 degrees when the image center is taken as 0 degrees). For example, the shift 603 at the angle of view θ1 604 near the maximum angle of view, shown in FIG. 6(b), is smaller than the shift 602 at the angle of view θ1 501 of the first imaging unit 107, shown in FIG. 6(a). In FIG. 6(b), the first imaging unit 107 and the second imaging unit 108 capture the same subject at substantially the same position in the image. Therefore, the image correction unit 106 can generate a corrected image that does not look unnatural even when the second image captured by the second imaging unit 108 is combined with the first image captured by the first imaging unit 107.
 Next, the correction of the shielded area in an image will be described in detail. First, the method by which the image correction unit 106 corrects the shielded area in the first image captured by the first imaging unit 107 using the second image captured by the second imaging unit 108 will be described in detail, that is, the method by which the image correction unit 106 corrects the pixel at pixel position I1(u1, v1) in the first image using the pixel at pixel position I2(u2, v2) in the second image. First, the image correction unit 106 converts the pixel position I1(u1, v1) in the first image into an angle of view (θ1, φ1), and calculates the angle of view (θ2, φ2) in the second image that corresponds to the angle of view (θ1, φ1) in the first image. The image correction unit 106 then converts the angle of view (θ2, φ2) into the pixel position I2(u2, v2) in the second image, thereby associating the pixel in the first image with the pixel in the second image. The image correction unit 106 corrects the shielded area in the first image using the associated pixels in the second image. This is described more specifically below using equations (2) to (12).
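 The per-pixel flow just described can be summarized structurally as follows; pixel_to_angle, angle_first_to_second, angle_to_pixel, and sample are placeholders for equations (2) to (4), equations (1) and (5), equations (6) to (11), and equation (12) respectively, whose exact forms appear only as images in the original publication.

    def complement_shielded_area(first_img, second_img, shielded_mask,
                                 pixel_to_angle, angle_first_to_second,
                                 angle_to_pixel, sample):
        # Structural sketch: replace every shielded pixel of the first image with
        # the corresponding sample of the second image.
        corrected = first_img.copy()
        height, width = shielded_mask.shape
        for v1 in range(height):
            for u1 in range(width):
                if not shielded_mask[v1, u1]:
                    continue
                theta1, phi1 = pixel_to_angle(u1, v1)               # eqs. (2)-(4)
                theta2, phi2 = angle_first_to_second(theta1, phi1)  # eqs. (1), (5)
                u2, v2 = angle_to_pixel(theta2, phi2)               # eqs. (6)-(11)
                corrected[v1, u1] = sample(second_img, u2, v2)      # eq. (12) or bilinear
        return corrected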
 First, the image correction unit 106 converts the pixel position I1(u1, v1) in the first image into the angle of view (θ1, φ1) using the following equations (2) and (3). Here, the angle of view θ1 is the horizontal angle of view [radians], and the angle of view φ1 is the vertical angle of view [radians].
 [Equations (2) and (3): conversion from the pixel position I1(u1, v1) to the angle of view (θ1, φ1)]
 Here, r is the distance [pixels] from the image center and is expressed by the following equation (4). R is the distance r [pixels] from the image center when the angle of view θ = π/2.
 [Equation (4): distance r from the image center]
 Here, w represents the number of horizontal pixels of the image [pixels], and h represents the number of vertical pixels of the image [pixels]. As the focal length f, a value determined in advance as a lens specification may be used, or a value obtained by camera calibration may be used.
 The angle of view (θ1, φ1) is obtained by substituting u1 for u and v1 for v in equations (3) and (4).
 Next, the image correction unit 106 converts the angle of view (θ1, φ1) in the first image into the angle of view (θ2, φ2) in the second image. The horizontal angle of view θ is converted using equation (1) above, and the vertical angle of view is converted using equation (5).
 [Equation (5): conversion of the vertical angle of view]
 Subsequently, the image correction unit 106 converts the angle of view (θ2, φ2) in the second image into the pixel position I2(u2, v2) in the second image using the following equations (6) and (7).
 [Equations (6) and (7): conversion from the angle of view (θ2, φ2) to the pixel position I2(u2, v2)]
 Here, f(v, R) is expressed by equation (8), and the parameters λ, μ, and v are expressed by equations (9), (10), and (11).
 [Equations (8) to (11): definitions of f(v, R) and the parameters λ, μ, and v]
 The pixel position I2(u2, v2) is obtained by substituting θ2 for θ and φ2 for φ in equations (9) to (11), and substituting the resulting f(v, R), λ, μ, and v into equations (6) and (7).
 As described above, the image correction unit 106 calculates the correspondence between pixel positions in the first image and pixel positions in the second image, and corrects the pixels of the shielded area in the first image using pixels of the second image based on this correspondence. For example, the image correction unit 106 may perform the correction using nearest-neighbor interpolation, which uses the pixel value at the closest position as in the following equation (12), or bilinear interpolation, which linearly interpolates the pixel value from the four surrounding pixels. In this case, the image correction unit 106 obtains the pixel value at the pixel position I2(u2, v2) in the second image by nearest-neighbor or bilinear interpolation and assigns it to the pixel position I1(u1, v1) in the first image. When the second image has more pixels than the first image, the pixel values of a larger number of pixels in the second image may be used to complement the pixels of the first image.
 [Equation (12): nearest-neighbor interpolation]
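 Generic nearest-neighbor and bilinear sampling of the second image at a non-integer position (u2, v2) can be sketched as follows; the boundary clamping is an assumption added for robustness and is not taken from the embodiment.

    import numpy as np

    def sample_nearest(img, u, v):
        # Nearest-neighbor sampling (cf. equation (12)).
        h, w = img.shape[:2]
        ui = int(np.clip(round(u), 0, w - 1))
        vi = int(np.clip(round(v), 0, h - 1))
        return img[vi, ui]

    def sample_bilinear(img, u, v):
        # Bilinear sampling from the four surrounding pixels.
        h, w = img.shape[:2]
        u = float(np.clip(u, 0, w - 1))
        v = float(np.clip(v, 0, h - 1))
        u0, v0 = int(np.floor(u)), int(np.floor(v))
        u1, v1 = min(u0 + 1, w - 1), min(v0 + 1, h - 1)
        du, dv = u - u0, v - v0
        top = (1 - du) * img[v0, u0] + du * img[v0, u1]
        bottom = (1 - du) * img[v1, u0] + du * img[v1, u1]
        return (1 - dv) * top + dv * bottom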
 In this way, the image correction unit 106 obtains the correspondence between the pixel position I1(u1, v1) in the first image and the pixel position I2(u2, v2) in the second image, and corrects all pixels of the shielded area in the first image using the second image based on this correspondence.
 A detailed description of the method by which the image correction unit 106 corrects the shielded area in the second image captured by the second imaging unit 108 using the first image captured by the first imaging unit 107 is omitted, since the correction can be performed in the same manner as described above.
 In this way, the image correction unit 106 complements each pixel of the shielded area (first complemented area) of the first image with the pixel (first complementing pixel) in the second image that corresponds to that pixel (each pixel in the shielded area of the first image). The image correction unit 106 may also complement each pixel of the shielded area (second complemented area) of the second image with the pixel (second complementing pixel) in the first image that corresponds to that pixel (each pixel in the shielded area of the second image). By determining the shielded area dynamically in this way, the area of the image occluded by the other lens can be complemented to generate a suitable image while reducing the burden of prior setup.
 The image correction unit 106 may also perform the various calculations by treating the imaging distance to the subject 505 corresponding to the first complemented area as a predetermined assumed imaging distance D. As described above, even when the predetermined assumed imaging distance D is used, a corrected image with little unnaturalness can be generated.
 (Modification 1)
 In the above example, the image correction unit 106 corrects both the first image and the second image, but in the present embodiment the image correction unit 106 may correct only one of the images. For example, when the display unit 105 displays a preview of the captured image, the display unit 105 may display only the corrected first image as the corrected image. As a result, even when the display unit 105 is a display with a simple configuration that cannot show stereoscopic images, the user can conveniently check the quality of the corrected image.
 (Modification 2)
 In the above example, the image correction unit 106 corrects the first image and the second image as they were input, but the first image and the second image input to the image correction unit 106 may be converted to a different projection scheme before correction. For example, an image captured with a fisheye lens is expressed by an equidistant projection or equisolid-angle projection scheme, and the image correction unit 106 may convert the first image and the second image expressed by such projection schemes into images expressed by a perspective projection scheme before performing the correction. After the correction, the image correction unit 106 converts the corrected image back into an image of the equidistant projection or equisolid-angle projection scheme. The image correction unit 106 corrects one image, whose image edges are greatly stretched by the conversion to the perspective projection scheme, using the other image, whose image edges are likewise greatly stretched. Because an image whose edges have been greatly stretched has more pixels there, the image correction unit 106 can use the pixels at the edge of the other image to correct the pixels of the shielded area at the edge of the one image with high accuracy.
 (Modification 3)
 In the above example, the image correction unit 106 corrects only the shielded area, but it may also correct an adjacent area that adjoins the shielded area. For example, when correcting the shielded area of the first image using the second image, the image correction unit 106 may correct each pixel in the adjacent area adjoining the shielded area based on both that pixel (each pixel of the adjacent area) and the pixel in the second image corresponding to that pixel (each pixel in the adjacent area). The image correction unit 106 can calculate the pixel in the second image corresponding to each pixel in the adjacent area by the same method as for the pixels corresponding to each pixel in the shielded area.
 In one aspect, the image correction unit 106 may correct each pixel of the adjacent area adjoining the shielded area using a pixel obtained by blending the pixel of the first image with the corresponding pixel of the second image. By blending and correcting the pixels of the first image and the pixels of the second image in this way, the image correction unit 106 can make the boundary between the corrected and uncorrected areas of the image less noticeable. The image correction unit 106 may also set the blending ratio of the first image and the second image so that the proportion of first-image pixels increases from the vicinity of the maximum angle of view toward the image center. This allows the boundary between the corrected and uncorrected areas of the image to be smoothed. The same method can also be used to smooth the boundary between corrected and uncorrected areas when the image correction unit 106 corrects the shielded area of the second image using the first image.
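 A minimal sketch of this blending is given below, assuming the weight of the first image grows linearly with the distance from the image edge over a transition band; the weight profile and the band width are illustrative assumptions.

    import numpy as np

    def blend_adjacent_area(first_img, warped_second, adjacent_mask, dist_from_edge, band=40.0):
        # dist_from_edge: per-pixel distance [pixels] from the nearest image edge;
        # band: assumed width of the transition zone.
        alpha = np.clip(dist_from_edge / band, 0.0, 1.0)   # 0 at the edge, 1 toward the centre
        if first_img.ndim == 3:
            alpha = alpha[..., None]
        blended = alpha * first_img + (1.0 - alpha) * warped_second
        out = first_img.astype(np.float32).copy()
        out[adjacent_mask] = blended[adjacent_mask]
        return out.astype(first_img.dtype)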
 (Modification 4)
 In the above example, the image correction unit 106 outputs, as it is, the image in which each pixel of the shielded area of the first image has been corrected using pixels of the second image, but it may output the image after applying further image processing to the corrected shielded area. For example, the image correction unit 106 may apply blurring to the corrected shielded area or apply processing that suppresses contrast. This makes the corrected shielded area less noticeable than an uncorrected shielded area would be. The image correction unit 106 may also strengthen the blurring or suppress the contrast more strongly as the distance from the image center increases, that is, toward the image edge. This makes the boundary between the corrected and uncorrected areas even less noticeable. Furthermore, the image correction unit 106 may apply the above image processing not only to the corrected shielded area but also to its surroundings. This makes the boundary between the corrected and uncorrected areas still less noticeable and smoother.
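 One way to realize the idea of blurring more strongly toward the image edge is sketched below by mixing two Gaussian blur levels with a radial weight; the kernel sizes and the radial weighting are illustrative assumptions.

    import cv2
    import numpy as np

    def blur_toward_edge(img, corrected_mask, light_ksize=5, strong_ksize=21):
        # Blur only inside the corrected (complemented) area, more strongly far from the centre.
        h, w = img.shape[:2]
        yy, xx = np.mgrid[0:h, 0:w]
        weight = np.hypot(xx - w / 2.0, yy - h / 2.0)
        weight /= weight.max()                     # 0 at the centre, 1 at the corners
        if img.ndim == 3:
            weight = weight[..., None]
        light = cv2.GaussianBlur(img, (light_ksize, light_ksize), 0)
        strong = cv2.GaussianBlur(img, (strong_ksize, strong_ksize), 0)
        mixed = (1.0 - weight) * light + weight * strong
        out = img.astype(np.float32).copy()
        out[corrected_mask] = mixed[corrected_mask]
        return out.astype(img.dtype)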
 (Modification 5)
 In the above example, the image correction unit 106 obtains the correspondence between the pixels of the first image and the pixels of the second image from the predetermined assumed imaging distance D, but the correspondence may also be obtained while changing the assumed imaging distance D according to the angle of view. An example of a method for changing the assumed imaging distance D of the subject assumed to appear in the shielded area according to the angle of view is described below with reference to FIG. 7.
 FIG. 7 is a diagram for explaining an example of a method in which the image correction unit 106 changes the assumed imaging distance according to the angle of view. For example, when calculating the corresponding pixel (first complementing pixel) in the second image for a pixel in the shielded area of the first image whose angle of view is θ1 701, the image correction unit 106 changes the assumed imaging distance to the assumed imaging distance 702. When calculating the corresponding pixel (first complementing pixel) in the second image for a pixel in the shielded area of the first image whose angle of view is θ1 703, the image correction unit 106 changes the assumed imaging distance to the assumed imaging distance 704. When calculating the corresponding pixel (first complementing pixel) in the second image for a pixel in the shielded area of the first image whose angle of view is 90 degrees (the image edge), the image correction unit 106 changes the assumed imaging distance to the assumed imaging distance 705.
 In this way, by setting the assumed imaging distance to become longer as the angle of view becomes larger, the image correction unit 106 can generate a corrected image whose projection plane 706 is obliquely inclined, as shown in FIG. 7(a). This makes it possible to generate a corrected image that takes the perspective of the subject into account. As shown in FIG. 7(b), the image correction unit 106 may also change the relationship between the assumed imaging distance and the angle of view so that the projection plane 707 becomes a curved surface. This makes it possible to generate a corrected image that takes the distortion of the wide-angle lens into account.
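 A small sketch of letting the assumed imaging distance grow with the angle of view is shown below; both the linear profile (tilted projection plane) and the commented curved profile are illustrative choices, not values from the embodiment.

    import math

    def assumed_distance(theta, d_near=2.0, d_far=8.0, theta_max=math.pi / 2):
        # theta: angle of view [rad]; d_near / d_far: illustrative distances [m].
        t = min(max(theta / theta_max, 0.0), 1.0)
        return d_near + (d_far - d_near) * t       # linear growth: tilted projection plane
        # a curved projection surface could use, e.g., d_near + (d_far - d_near) * t ** 2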
 (Modification 6)
 In the above example, the image correction unit 106 calculates, for each pixel of one image, the corresponding pixel of the other image, but the pixel correspondence between the first image and the second image may be calculated in advance. In this case, the image correction unit 106 stores the correspondence in a lookup table or the like. This eliminates the need for the image correction unit 106 to compute the correspondence between the first image and the second image every time correction is performed, so the processing time from capturing the first and second images to displaying the corrected image can be shortened.
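 The lookup-table idea can be sketched as precomputing, once, the source coordinates in the second image for every pixel of the first image and then applying them with cv2.remap; the correspondence callable is a placeholder for the chain of equations (1) to (11).

    import cv2
    import numpy as np

    def build_correspondence_maps(width, height, correspondence):
        # correspondence(u1, v1) -> (u2, v2): placeholder for the patent's pixel mapping.
        map_x = np.zeros((height, width), dtype=np.float32)
        map_y = np.zeros((height, width), dtype=np.float32)
        for v1 in range(height):
            for u1 in range(width):
                map_x[v1, u1], map_y[v1, u1] = correspondence(u1, v1)
        return map_x, map_y

    def apply_maps(second_img, map_x, map_y):
        # Warp the second image onto the first image's pixel grid in one call.
        return cv2.remap(second_img, map_x, map_y, interpolation=cv2.INTER_LINEAR)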
 (Modification 7)
 In the above example, the image correction unit 106 outputs only the corrected image, but it may output both the corrected image and the shielded area. By outputting not only the corrected image but also the shielded area, it becomes easier for the user to understand which part of the image has been corrected.
 (Modification 8)
 In the above example, a first image and a second image with a maximum angle of view of 180 degrees are input to the imaging apparatus 102, but wide-angle images with an even wider angle of view may be input, for example images with a horizontal angle of view of 360 degrees captured by an omnidirectional camera using a mirror. When the image correction unit 106 corrects a wide-angle image with a wider angle of view, the extent of the shielded area occluded by the adjacent imaging unit or the like is larger than when an image with a maximum angle of view of 180 degrees is captured. Therefore, a greater correction effect can be obtained by having the image correction unit 106 correct a wide-angle image with a wider angle of view.
 (Modification 9)
 In the above example, the image correction unit (output unit) 106 outputs the entire corrected image, but it may output an image obtained by cutting out a part of the corrected image. When the user views a wide-angle image using a display device such as a head-mounted display, the image correction unit 106 can change the range of the image cut out from the corrected image according to the user's viewing angle and display the cut-out image on the display unit 105.
 (Modification 10)
 In the above example, the imaging apparatus 102 generates the first image and the second image with the stereo imaging unit 103, but the present embodiment is not limited to this. The imaging apparatus 102 according to the present embodiment can be applied to any imaging unit in which a shielded area may be formed in the first image and the second image. However, when the first image and the second image are generated by the stereo imaging unit 103, the image correction unit 106 can generate a corrected image with a particularly large correction effect, which is preferable.
 (Modification 11)
 In the above example, the shielded area setting unit 801 extracts and sets the shielded area for every capture by the stereo imaging unit 103, but the shielded area may instead be set when a signal for setting the shielded area is input via an input unit (not shown). For example, the shielded area setting unit 801 may be provided with a shielded area setting button, and may extract and set the shielded area when the user presses that button. Alternatively, the shielded area setting unit 801 may detect rotation of the imaging apparatus 102a with a gyro sensor and extract the shielded area while the imaging apparatus 102a is rotating. In particular, while the imaging apparatus 102a is rotating, subjects other than the stereo imaging unit 103 move significantly in the captured images, which makes it easier for the shielded area setting unit 801 to extract the stereo imaging unit 103 as the stationary region, which is preferable.
 (Modification 12)
 In the above example, the image correction unit 106 corrects the first image and the second image based on the shielded area set by the shielded area setting unit 801. However, under some conditions, the first image and the second image may be corrected based on a shielded area read out from the storage unit 104. For example, the image correction unit 106 may correct the first image and the second image based on the shielded area read out from the storage unit 104 in the state in which the imaging apparatus 102a has just been powered on and shooting has started. Once the first and second images have been captured to some extent and shielded areas have accumulated in the storage unit 104, the image correction unit 106 may correct the first image and the second image based on the shielded area set by the shielded area setting unit 801. As a result, even in a situation where the shielded area setting unit 801 cannot yet extract a stationary region from the captured images, the first image and the second image can be corrected based on the shielded area stored in advance in the storage unit 104.
 (Modification 13)
 In the above example, the shielded area setting unit 801 sets the shielded area of the first image using the first image, but it may set the shielded area of the first image using both the first image and the second image. For example, the shielded area setting unit 801 may set the shielded area of the first image by referring to both the result of extracting a stationary region from the first images and the result of estimating corresponding points between the first image and the second image using template matching or the like. This allows the shielded area setting unit 801 to distinguish between a stationary region common to the first image and the second image and a stationary region that exists in only one of the images. A stationary region common to the first image and the second image can be regarded not as a shielded area but as a distant subject such as the background, so the shielded area setting unit 801 excludes the common stationary region from the shielded area. A stationary region that exists only in the first image or only in the second image can be regarded as the imaging unit other than the one that captured that image, so the shielded area setting unit 801 sets that stationary region as the shielded area. In this way, by setting, based on the first image and the second image, the stationary region captured only in the first image and the stationary region captured only in the second image as the shielded areas of the respective images, the shielded areas can be set appropriately.
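 A rough sketch of excluding a common stationary region is given below: a stationary patch of the first image is searched for in the second image by normalized cross-correlation, and if it is also found there it is treated as distant background rather than as an occluding camera part; the patch size and the score threshold are illustrative assumptions.

    import cv2

    def is_common_static_patch(first_img, second_img, x, y, size=32, score_thresh=0.9):
        # Returns True if the size x size patch at (x, y) of the first image also
        # appears in the second image (then it is excluded from the shielded area).
        patch = first_img[y:y + size, x:x + size]
        if patch.shape[0] != size or patch.shape[1] != size:
            return False
        result = cv2.matchTemplate(second_img, patch, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, _ = cv2.minMaxLoc(result)
        return max_val >= score_thresh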
 〔Example of realization by software〕
 The control blocks of the imaging apparatuses 102 and 102a (in particular, the image correction unit 106 and the shielded area setting unit 801 in the image processing apparatuses 101 and 101a) may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software.
 In the latter case, the imaging apparatuses 102 and 102a each include a computer that executes the instructions of an image processing program, which is software that realizes each function. This computer includes, for example, at least one processor (control device) and at least one computer-readable recording medium storing the image processing program. The computer also includes hardware such as an OS (operating system) and peripheral devices. In the computer, the object of the present invention is achieved by the processor reading the image processing program from the recording medium and executing it.
 As the processor, for example, a CPU (Central Processing Unit) can be used. As the recording medium, a "non-transitory tangible medium" can be used, for example a ROM (Read Only Memory), a portable medium such as a CD-ROM, a disk such as a tape, a flexible disk, or a magneto-optical disk, a card, a semiconductor memory, a programmable logic circuit, or a hard disk built into a computer system. The computer-readable recording medium also includes media that hold the image processing program dynamically for a short time, such as the communication line used when the image processing program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that hold the image processing program for a certain period of time, such as the volatile memory inside the computer systems serving as the server and the client in that case.
 The imaging apparatuses 102 and 102a may further include a RAM (Random Access Memory) or the like into which the image processing program is loaded. The image processing program may also be supplied to the computer via any transmission medium (a communication network, a broadcast wave, or the like) capable of transmitting the program. One aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the image processing program is embodied by electronic transmission.
 Part or all of the imaging apparatuses 102 and 102a may also be realized as an LSI, which is typically an integrated circuit. In this case, the control blocks of the imaging apparatuses 102 and 102a may be individually formed into chips, or some or all of them may be integrated into a single chip. The method of circuit integration is not limited to LSI, and dedicated circuits or general-purpose processors may be used. If a circuit integration technology replacing LSI emerges as a result of advances in semiconductor technology, an integrated circuit based on that technology may also be used.
 In the embodiments described above, the control lines and information lines shown are those considered necessary for the description, and not all control lines and information lines of a product are necessarily shown; all components may be connected to one another.
 〔Summary〕
 An imaging apparatus (102) according to Aspect 1 of the present invention includes: a determination unit (shielded area setting unit 801) that determines, based on a first stationary region in a moving image (frame images 901 to 903) captured by a first imaging unit (107), a first complemented area (shielded areas 904 to 906) of a first image captured by the first imaging unit; and a complementing unit (image correction unit 106) that complements each pixel in the first complemented area with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit (108).
 According to the above configuration, by dynamically determining the shielded area, the area of the image occluded by the other lens can be complemented to generate a suitable image while reducing the burden of prior setup.
 In the imaging apparatus according to Aspect 2 of the present invention, in Aspect 1 above, the first imaging unit and the second imaging unit may constitute a stereo camera.
 According to the above configuration, a suitable stereoscopic image can be generated.
 In the imaging apparatus according to Aspect 3 of the present invention, in Aspect 1 or 2 above, when the first stationary region is divided into a plurality of regions, the determination unit may determine a region close to the image edge of the moving image as the first complemented area.
 According to the above configuration, even when a subject that does not move relative to the image center, such as a cable, appears in the plurality of frame images, only the shielded areas caused by the first imaging unit and the second imaging unit can be extracted.
 In the imaging apparatus according to Aspect 4 of the present invention, in any one of Aspects 1 to 3 above, the complementing unit may further correct each pixel in an adjacent area adjoining the first complemented area in the first image based on both that pixel and the pixel corresponding to that pixel in the second image.
 According to the above configuration, the boundary between the corrected and uncorrected areas of the image can be made less noticeable.
 The imaging apparatus according to Aspect 5 of the present invention may, in any one of Aspects 1 to 4 above, include an output unit that cuts out and outputs a part of the first image complemented by the complementing unit.
 According to the above configuration, useful output can be provided.
 In the image processing apparatus according to Aspect 6 of the present invention, in any one of Aspects 1 to 5 above, the determination unit may further determine a second complemented area of the second image based on a second stationary region in a moving image captured by the second imaging unit, and the complementing unit may further complement each pixel in the second complemented area with a second complementing pixel corresponding to that pixel in the first image.
 According to the above configuration, both the first image and the second image can be suitably corrected.
 In the image processing apparatus according to Aspect 7 of the present invention, in Aspect 6 above, a region common to the first stationary region and the second stationary region may be excluded from the first complemented area and the second complemented area.
 According to the above configuration, the first complemented area and the second complemented area can be suitably determined.
 An imaging apparatus according to Aspect 8 of the present invention includes the image processing apparatus according to any one of Aspects 1 to 7 above, as well as the first imaging unit and the second imaging unit.
 According to the above configuration, the same effects as the image processing apparatus according to one aspect of the present invention are obtained.
 A method for controlling an image processing apparatus according to Aspect 9 of the present invention includes: a determination step in which the image processing apparatus determines, based on a first stationary region in a moving image captured by a first imaging unit, a first complemented area of a first image captured by the first imaging unit; and a complementing step in which the image processing apparatus complements each pixel in the first complemented area with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit.
 According to the above configuration, the same effects as the image processing apparatus according to one aspect of the present invention are obtained.
 The image processing apparatus according to each aspect of the present invention may be realized by a computer. In this case, an image processing program that causes the computer to realize the image processing apparatus by operating the computer as each unit (software element) included in the image processing apparatus, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
 The present invention is not limited to the embodiments described above, and various modifications are possible within the scope of the claims; embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in each embodiment.
 (Cross-reference to related applications)
 This application claims the benefit of priority to Japanese Patent Application No. 2017-207312 filed on October 26, 2017, the entire contents of which are incorporated herein by reference.
 101 Image processing apparatus
 102 Imaging apparatus
 104 Storage unit
 105 Display unit
 106 Image correction unit (complementing unit)
 107 First imaging unit
 108 Second imaging unit
 301 First image
 303 Second image
 302, 904 to 906 Shielded area (first complemented area)
 304 Shielded area (second complemented area)
 501, 502, 604, 605, 701, 703 Angle of view
 801 Shielded area setting unit (determination unit)
 901, 902, 903 Frame image (moving image)

Claims (11)

1. An image processing device comprising:
a determination unit configured to determine, based on a first still area in a moving image captured by a first imaging unit, a first area to be complemented in a first image captured by the first imaging unit; and
a complementing unit configured to complement each pixel in the first area to be complemented with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit.
2. The image processing device according to claim 1, wherein the first imaging unit and the second imaging unit constitute a stereo camera.
3. The image processing device according to claim 1 or 2, wherein, when the first still area is divided into a plurality of areas, the determination unit determines an area close to an image edge of the moving image as the first area to be complemented.
4. The image processing device according to any one of claims 1 to 3, wherein the complementing unit further corrects each pixel in an adjacent area that is adjacent to the first area to be complemented in the first image, based on both that pixel and a pixel corresponding to that pixel in the second image.
5. The image processing device according to any one of claims 1 to 4, further comprising an output unit configured to cut out and output a part of the first image complemented by the complementing unit.
6. The image processing device according to any one of claims 1 to 5, wherein the determination unit further determines, based on a second still area in a moving image captured by the second imaging unit, a second area to be complemented in the second image, and
the complementing unit further complements each pixel in the second area to be complemented with a second complementing pixel corresponding to that pixel in the first image.
7. The image processing device according to claim 6, wherein the determination unit excludes an area common to the first still area and the second still area from the first area to be complemented and the second area to be complemented.
8. An imaging device comprising: the image processing device according to any one of claims 1 to 7; the first imaging unit; and the second imaging unit.
9. A method for controlling an image processing device, comprising:
a determining step in which the image processing device determines, based on a first still area in a moving image captured by a first imaging unit, a first area to be complemented in a first image captured by the first imaging unit; and
a complementing step in which the image processing device complements each pixel in the first area to be complemented with a first complementing pixel corresponding to that pixel in a second image captured by a second imaging unit.
10. An image processing program for causing a computer to function as the image processing device according to any one of claims 1 to 7, the image processing program causing the computer to function as the determination unit and the complementing unit.
11. A computer-readable recording medium on which the image processing program according to claim 10 is recorded.
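The following sketch illustrates, under the same assumptions as the earlier examples (NumPy color images and a single fixed disparity standing in for the true correspondence between the views), operations in the spirit of claims 3 and 4: selecting the candidate still area nearest an image edge, and correcting pixels adjacent to the complemented area by mixing them with their counterparts in the second image. The function names and the fixed blending weight are illustrative assumptions, not details taken from the claims.

import numpy as np

def choose_area_near_edge(areas: list) -> np.ndarray:
    # Claim 3: among several candidate still areas (non-empty boolean
    # masks of the same frame), pick the one whose pixels come closest
    # to an image border.
    def distance_to_border(mask: np.ndarray) -> int:
        h, w = mask.shape
        ys, xs = np.nonzero(mask)
        return int(np.min(np.minimum.reduce([ys, xs, h - 1 - ys, w - 1 - xs])))
    return min(areas, key=distance_to_border)

def blend_adjacent_area(first_image: np.ndarray,
                        second_image: np.ndarray,
                        adjacent_mask: np.ndarray,
                        weight: float = 0.5,
                        disparity: int = 0) -> np.ndarray:
    # Claim 4: correct each pixel adjacent to the complemented area by
    # mixing it with the corresponding pixel of the second image, so
    # that the seam between complemented and original pixels is less
    # visible.  The blending weight and disparity are assumptions.
    result = first_image.astype(np.float32).copy()
    ys, xs = np.nonzero(adjacent_mask)
    src_xs = np.clip(xs + disparity, 0, second_image.shape[1] - 1)
    result[ys, xs] = (1.0 - weight) * result[ys, xs] \
        + weight * second_image[ys, src_xs].astype(np.float32)
    return result.astype(first_image.dtype)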
PCT/JP2018/014401 2017-10-26 2018-04-04 Image processing device, imaging apparatus, method for controlling image processing device, image processing program, and recording medium WO2019082415A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019549820A JPWO2019082415A1 (en) 2017-10-26 2018-04-04 Image processing device, imaging device, control method of image processing device, image processing program and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-207312 2017-10-26
JP2017207312 2017-10-26

Publications (1)

Publication Number Publication Date
WO2019082415A1

Family

ID=66247366

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/014401 WO2019082415A1 (en) 2017-10-26 2018-04-04 Image processing device, imaging apparatus, method for controlling image processing device, image processing program, and recording medium

Country Status (2)

Country Link
JP (1) JPWO2019082415A1 (en)
WO (1) WO2019082415A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006013609A (en) * 2004-06-22 2006-01-12 Sharp Corp Multi-viewpoint stereoscopic image photographing device
JP2010041586A (en) * 2008-08-07 2010-02-18 Olympus Corp Imaging device
JP2010288253A (en) * 2009-05-14 2010-12-24 Fujifilm Corp Apparatus, method, and program for processing images
JP2011160223A (en) * 2010-02-01 2011-08-18 Nikon Corp Imaging apparatus
JP2012034147A (en) * 2010-07-30 2012-02-16 Hitachi Ltd Surveillance camera system having camera abnormality detection device
JP2015133691A (en) * 2013-12-13 2015-07-23 パナソニックIpマネジメント株式会社 Imaging apparatus, image processing system, imaging method and recording medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022004038A1 (en) * 2020-06-29 2022-01-06 株式会社シータ Method, program, and device for outputting image
TWI759136B (en) * 2020-06-29 2022-03-21 日商Theta股份有限公司 A method for outputting an image, and program product and apparatus therefor

Also Published As

Publication number Publication date
JPWO2019082415A1 (en) 2020-11-05

Similar Documents

Publication Publication Date Title
JP6484349B2 (en) Camera rig and 3D image capture
KR101991080B1 (en) Omni-stereo capture and rendering of panoramic virtual reality content
US8581961B2 (en) Stereoscopic panoramic video capture system using surface identification and distance registration technique
EP2328125B1 (en) Image splicing method and device
JP5679978B2 (en) Stereoscopic image alignment apparatus, stereoscopic image alignment method, and program thereof
US20160301868A1 (en) Automated generation of panning shots
WO2015081870A1 (en) Image processing method, device and terminal
KR20180111798A (en) Adaptive stitching of frames in the panorama frame creation process
KR20130055002A (en) Zoom camera image blending technique
JP2010171503A (en) Obstacle detection display device
KR101991754B1 (en) Image processing method and apparatus, and electronic device
CN109785390B (en) Method and device for image correction
CN110651275A (en) System and method for correcting panoramic digital overlay images
CN109785225B (en) Method and device for correcting image
JP2022183177A (en) Head-mounted display device
JP2019029721A (en) Image processing apparatus, image processing method, and program
JP6016180B2 (en) Image processing method and image processing apparatus
WO2019082415A1 (en) Image processing device, imaging apparatus, method for controlling image processing device, image processing program, and recording medium
JP6732440B2 (en) Image processing apparatus, image processing method, and program thereof
JP6625654B2 (en) Projection device, projection method, and program
KR20160115043A (en) Method for increasing film speed of video camera
JP6579706B2 (en) Image processing apparatus, image processing method, and image processing program
JP5689693B2 (en) Drawing processor
JP2017028606A (en) Imaging apparatus
JP2014049895A (en) Image processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18871239

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019549820

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18871239

Country of ref document: EP

Kind code of ref document: A1