WO2020170723A1 - Imaging device - Google Patents

Imaging device

Info

Publication number
WO2020170723A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
frame
video signal
imaging device
focal plane
Prior art date
Application number
PCT/JP2020/002771
Other languages
French (fr)
Japanese (ja)
Inventor
拓洋 澁谷
Original Assignee
株式会社日立国際電気 (Hitachi Kokusai Electric Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立国際電気 (Hitachi Kokusai Electric Inc.)
Priority to JP2021501759A (patent JP7121852B2)
Publication of WO2020170723A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules

Definitions

  • the present invention relates to an image pickup apparatus using an image pickup element that operates in a rolling shutter system.
  • in a solid-state image sensor, particularly a CMOS image sensor, pixels are arranged two-dimensionally, and image capture is often performed by a rolling shutter system that exposes each horizontal line in sequence; this makes the imaging device inexpensive.
  • in this case, the exposure timing of the pixels on an upper line and that of the pixels on a lower line are shifted relative to each other.
  • when a subject with little motion is imaged, such a shift in exposure timing generally causes no major problem.
  • in the simplest case, the image of a frame in which focal plane distortion has occurred can be replaced with a composite of the images of the preceding and following frames in which no focal plane distortion has occurred.
  • an in-vehicle imaging device recognizes the movement speed information of the vehicle carrying it and performs coordinate conversion of pixels based on that information, thereby correcting the image of a frame in which focal plane distortion has occurred.
  • the present invention has been made in view of such a situation, and an object thereof is to solve the above problems.
  • the present invention is an imaging device using a camera module in which an image sensor operating in a rolling shutter system is combined with an optical system to output a video signal forming a moving image, the device comprising: a fluctuation detection unit that detects the temporal change rate of the imaging state in the camera module; a blur processing unit that applies blur processing to the video signal; and a control unit that instructs the blur processing unit to apply blur processing to the image of the frame corresponding to a time point at which the temporal change rate is equal to or greater than a threshold.
  • the present invention is also an imaging device using a camera module in which an image sensor operating in a rolling shutter system is combined with an optical system to output a video signal forming a moving image, the device comprising: a fluctuation detection unit that detects the temporal change rate of the imaging state in the camera module; an external process information creation unit that creates external process information indicating, in association with frames of the video signal, the time points at which the temporal change rate is equal to or greater than a threshold; and a control unit that outputs the video signal and the external process information in association with each other.
  • FIG. 2 is a diagram showing an example of video (when there is a zoom operation) output by the imaging device according to the embodiment.
  • FIG. 3 is a diagram showing an example of video (when there is a pan angle change) output by the imaging device according to the embodiment.
  • FIG. 4 is a diagram showing an example of video (when there is a zoom operation) output by a conventional imaging device.
  • FIG. 5 is a diagram showing an example of video (when there is a pan angle change) output by a conventional imaging device.
  • FIG. 7 is a flowchart showing the operation of the imaging device according to the embodiment. FIG. 8 is a diagram showing the configuration of a modification of the imaging device according to the embodiment.
  • FIG. 1 is a block diagram showing a configuration of an image pickup apparatus 1 according to the embodiment of the present invention.
  • a camera module 10 in which an optical system 11 and an image pickup element 12 are fixed is used.
  • the optical system 11 is configured by combining a plurality of lenses, and the enlargement/reduction of the angle of view (zoom out, zoom in) is mechanically controlled by the zoom adjustment unit 13.
  • the image sensor 12 is composed of a two-dimensional CMOS sensor.
  • the camera module 10 is configured to be rotatable in the horizontal and vertical directions with respect to a main body fixed to a facility or the like. By adjusting its set angle in the horizontal direction (pan angle) and in the vertical direction (tilt angle), the imaged object (viewing direction) can be adjusted. For this purpose, a pan/tilt angle control unit 21 is provided, comprising a pan axis motor and a tilt axis motor for controlling the pan angle and tilt angle of the camera module 10, and a drive mechanism that drives the camera module with these motors.
  • the control unit 20, configured using a CPU or the like, controls the zoom adjustment unit 13 and the pan/tilt angle control unit 21 according to a user operation or an external remote signal, and can thereby control the size and direction of the angle of view of the camera module 10.
  • the gamma correction unit 22 performs gamma correction on the video signal of the moving image output from the image sensor 12. Finally, the video signal is output to the outside by the video signal output unit 23. With this video signal, the video imaged by the camera module 10 can be displayed on an external display.
  • the image sensor 12 sequentially outputs the values of the two-dimensionally arranged pixels in a fixed format. The exposure timing is set sequentially for each row (horizontal line) of the two-dimensional array, from the top line downward. That is, the image sensor 12 is driven by the rolling shutter method. Therefore, when the size or direction of the angle of view in the camera module 10 changes abruptly, focal plane distortion may occur in the video signal output from the image sensor 12.
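To make the rolling-shutter timing concrete, the following sketch (an illustration, not taken from the patent) computes the exposure start offset of each horizontal line, assuming the line-to-line offset spreads the frame period evenly over the lines:

```python
def line_exposure_start_times(num_lines, frame_period_s):
    """Rolling shutter: each horizontal line starts its exposure slightly
    later than the one above. The line-to-line offset is assumed here to
    be the frame period divided evenly over the number of lines."""
    offset = frame_period_s / num_lines
    return [i * offset for i in range(num_lines)]
```

With 1080 lines at 60 fps this gives an offset of roughly 15 microseconds per line; it is this skew in exposure timing that turns a fast zoom or pan into focal plane distortion.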
  • the blurring process is a filtering process for removing spatial high frequency components in the two-dimensional image signal, and is performed by a method using a well-known algorithm.
  • the video signal passes through a blur processing unit 24, which performs such blur processing using a filter, before reaching the video signal output unit 23.
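The patent leaves the blur algorithm open ("a well-known algorithm"); as one hedged example, a simple box blur removes spatial high-frequency components by averaging each pixel with its neighborhood:

```python
def box_blur(image, radius=1):
    """Spatial low-pass filter: each output pixel is the mean of the
    (2*radius+1)^2 neighborhood, clamped at the image borders.
    `image` is a 2D list of grayscale values."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out
```

A Gaussian or any other low-pass kernel would serve the same purpose; the radius controls how strongly high frequencies, and hence the visible distortion, are suppressed.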
  • the temporal variation of the imaging state includes the variation of the size of the angle of view and the variation of the direction of the angle of view.
  • in order to detect the former variation, the zoom operation detection unit 14 is provided in the camera module 10. In order to detect the latter variation, a pan/tilt operation detection unit 25 that detects changes in the pan angle and tilt angle of the camera module 10 is provided.
  • the zoom operation detecting unit 14 and the pan/tilt operation detecting unit 25 are an example of a variation detecting unit for recognizing the temporal change rate of the imaging state.
  • the zoom operation detection unit 14 recognizes the change rate (temporal change rate) of the zoom operation.
  • This is composed of, for example, a sensor that recognizes the position of the lens.
  • the pan/tilt operation detection unit 25 is configured by a sensor that recognizes the change rates of the pan angle and tilt angle, for example from the rotation speeds of the pan axis motor and tilt axis motor. These change rates need not be recognized with high precision.
  • an acceleration sensor or the like that can detect the change rate of the pan angle and the tilt angle may be used.
  • the zoom operation detection unit 14 and the pan/tilt operation detection unit 25 are set so that these change rates can be recognized in real time for each frame of the video signal. If the control unit 20 can likewise recognize these change rates in real time from the signals it uses to control the zoom adjustment unit 13 and the pan/tilt angle control unit 21, it is not necessary to actually measure them; the zoom operation detection unit 14 and the pan/tilt operation detection unit 25 then become unnecessary, and the zoom adjustment unit 13 and the pan/tilt angle control unit 21 themselves can serve as the fluctuation detection unit.
  • FIG. 2A shows an ideal image in which focal plane distortion does not exist in this case
  • FIG. 2B shows an image in which focal plane distortion exists as an image for each frame.
  • in frame 2 of FIG. 2B, the character "F" is greatly distorted by focal plane distortion compared with the ideal image of FIG. 2A.
  • when the images of frames 1 to 3 are sequentially output as a moving image by this video signal, the greatly distorted image of frame 2 is conspicuous.
  • FIG. 2(c) is an example of the case where the above-described blur processing is performed.
  • the images of frames 1 and 3 are the same as in FIGS. 2A and 2B, and the blur processing is applied only to the image of frame 2, in which the zoom operation was performed.
  • when the three images in FIG. 2C are displayed in succession, it is difficult for the user to visually perceive the brief image in which focal plane distortion occurred, so the effect of focal plane distortion on the user's overall impression of the moving image is reduced.
  • the subject is the letter “F”.
  • when the target imaged by the imaging device 1 is a person, particularly a well-known person or the like, an image showing that person with focal plane distortion may infringe the person's portrait rights. As described above, such a situation is avoided by applying the blurring process to the image of a frame estimated to contain focal plane distortion.
  • FIG. 3 similarly shows the image of each frame when the pan angle (horizontal angle) is changed at the same timing as the zoom operation described above, while the tilt angle is kept constant.
  • the fluctuation of the pan angle occurs only in the frame 2.
  • in the ideal image without focal plane distortion (FIG. 3A), the character "F" moves from the left side to the right side from frame 1 to frame 3.
  • in FIG. 3B, where no processing is performed, the character "F" in frame 2 is displayed distorted by focal plane distortion, in a manner different from the case of FIG. 2B.
  • also in this case, in FIG. 3C the blur processing is applied only to the image of frame 2, so that only the brief image in which focal plane distortion occurred becomes difficult to perceive, and the influence of focal plane distortion on the moving image as a whole is reduced.
  • FIG. 4 shows an example of images corresponding to FIG. 2 (FIG. 2C) when the conventional correction using the preceding and following frames is performed instead (the zoom operation occurring in frame 2).
  • the ideal image, and the image when focal plane distortion is present and no processing is performed, are as shown in FIGS. 2A and 2B, respectively.
  • this correction is performed so that the image corresponds to a certain reference time point, assumed here to be the initial time point at which the image of frame 2 is captured. For this reason, in frame 2 of FIG. 4, it is the lower area, where the exposure timing is latest, that is most affected by the correction.
  • this correction is applied to frame 2 (as shown in FIG. 2B), but since the lower area of frame 2 is a zoomed-in image, the area to be corrected in the lower part of frame 2 (the area below the letter "F") falls outside the range of the frame images used for the correction. Therefore, in the image of frame 2 in FIG. 4, an image similar to that of frame 2 in FIG. 2A, which should originally be obtained, appears in the upper part, but in the lower part a region where no corrected image can be obtained (a black area) occurs. That is, when correction is performed using the images of the preceding and following frames in this manner, the image is partially missing.
  • here the object imaged is the letter "F", but as described above, when the object is a person or the like, such a partially missing image can be a particular problem.
  • FIG. 5 similarly shows the images corresponding to FIG. 3 (FIG. 3C) when the same correction is performed (the pan angle changing in frame 2).
  • the ideal image, and the image when focal plane distortion is present and no blur processing is performed, are as shown in FIGS. 3A and 3B.
  • here too, a region outside the range of the images used for the correction would be needed, so the corrected image of frame 2 again has a missing region, different from that in FIG. 4.
  • FIG. 6 shows the case where, for the images corresponding to FIG. 3 (FIG. 3C), the image of frame 2 is instead replaced by a composite of the images of frames 1 and 3.
  • the image of frame 2 is significantly different from the actual image (frame 2 of FIG. 3A).
  • the difference between the image obtained by such composition processing and the original image thus becomes extremely large.
  • in contrast, with the blur processing of the present embodiment, the image loss described above does not occur in frame 2, and the difference between the processed image and the original image can be made inconspicuous. Further, since blurring the image of each frame is simpler than the correction described in Patent Document 1 and the like, it takes little time and consumes little power, and can therefore easily be performed in real time during image capture.
  • FIG. 7 is a flowchart of a process of determining whether to perform a blur process on each frame image. This process is performed for each frame.
  • the control unit 20 detects, via the fluctuation detection units (the zoom operation detection unit 14 and the pan/tilt operation detection unit 25), the change in the imaging state during the exposure period of the image from the image sensor 12 (the period from the first exposure timing to the last exposure timing for this image) (S1).
  • the zoom operation detection unit 14 recognizes as the zoom change amount, for example, the difference (in mm) between the lens position (focal length) when exposure starts and when it ends, as a quantity that directly reflects the zoom change; the pan/tilt operation detection unit 25 can recognize, for example, the pan/tilt angle change speed (°/sec).
  • the control unit 20 determines whether the recognized zoom change amount is equal to or greater than a predetermined threshold (S2). If so (S2: Yes), the control unit 20 instructs the blur processing unit 24 to apply blur processing to the obtained image (S3), and the processed image is output as the image of that frame (S4).
  • otherwise, the control unit 20 determines whether the change in the pan angle and tilt angle is large (S5). This determination considers the pan angle and tilt angle changes together: for example, if either the pan angle change speed is equal to or greater than a preset threshold (pan angle change speed threshold), or the tilt angle change speed is equal to or greater than a preset threshold (tilt angle change speed threshold), the change is determined to be large (S5: Yes).
  • alternatively, this determination may be made by comparing the sum of the squares of the pan angle change speed and the tilt angle change speed with a correspondingly preset threshold.
  • if the determination in S5 is Yes, the image is subjected to the blurring process in the same manner as above (S3) and then output (S4).
  • otherwise, the blur processing is not performed (S6), and the image is output as-is as the image of this frame (S4).
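A minimal sketch of this per-frame decision (steps S2 to S6); the threshold values and units are illustrative assumptions, not figures from the patent:

```python
# Hypothetical thresholds, chosen for illustration only.
ZOOM_THRESHOLD_MM = 2.0      # focal-length change during one exposure (mm)
PAN_THRESHOLD_DEG_S = 10.0   # pan angle change speed (deg/s)
TILT_THRESHOLD_DEG_S = 10.0  # tilt angle change speed (deg/s)

def needs_blur(zoom_change_mm, pan_speed, tilt_speed):
    """Per-frame decision mirroring S2 and S5 of the flowchart: blur if
    the zoom change amount, or either angular change speed, is at or
    above its threshold."""
    if abs(zoom_change_mm) >= ZOOM_THRESHOLD_MM:        # S2
        return True
    if (abs(pan_speed) >= PAN_THRESHOLD_DEG_S or        # S5
            abs(tilt_speed) >= TILT_THRESHOLD_DEG_S):
        return True
    return False                                        # S6: output as-is
```

The S5 test could equally be replaced by the sum-of-squares comparison described above, using a single combined angular-speed threshold.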
  • in this way, a video signal is output in which the blur processing has been applied only to the images of the frames in which focal plane distortion occurs.
  • the thresholds used in the above determinations (S2, S5) are not zero for the zoom change amount or the pan/tilt angle change speeds; they are set appropriately so that the blur processing is not performed in the range where these values are small and the focal plane distortion is inconspicuous.
  • these thresholds may be settable by the user.
  • the degree of the blurring process may be changed according to the amount of variation: for example, when the focal plane distortion is large because the variation is large, the degree of blurring can be increased.
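One way to scale the blur strength with the detected change rate is a simple ratio-based mapping; the scaling rule and cap below are assumptions for illustration, not from the patent:

```python
def blur_radius_for(change_rate, threshold, max_radius=5):
    """Map the detected change rate to a blur strength: no blur below
    the threshold, then a radius growing with change_rate/threshold,
    capped at max_radius."""
    if change_rate < threshold:
        return 0
    return min(max_radius, int(change_rate / threshold))
```

The returned radius can then parameterize whatever low-pass filter the blur processing unit 24 applies.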
  • the imaging device 1 can output the video signal processed as described above to the outside and have it displayed as-is on an external display. However, it is also possible to perform the same processing outside the imaging device, immediately before displaying the video signal.
  • the imaging device 2 as a modified example is configured assuming such a case.
  • FIG. 8 is a block diagram showing the configuration corresponding to FIG.
  • the imaging device 2 does not use the blurring processing unit 24; no blurring is performed, and a video signal in which focal plane distortion has occurred in some frames is transmitted to the outside via the video signal output unit 23.
  • instead, the control unit 20 determines for each frame whether the change in the imaging state described above is large (whether blur processing is required), and an external process information creation unit 26 is provided that creates external process information indicating the result (S2, S5 in FIG. 7).
  • the external process information may be created in any format as long as it identifies the frames requiring blur processing in the output video signal; for example, it may take the form of metadata accompanying the video signal.
  • with this imaging device 2, an external device receives the unprocessed video signal and performs the blur processing using a blur processing unit like the one described above, and the resulting video can be displayed just as when the imaging device 1 is used. In this case, if necessary, it is also possible to view, on the external device side, the video that contains focal plane distortion but has not been blurred. In addition, the degree of blurring can be adjusted by the external device.
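Since the patent leaves the format open, the external process information could for instance be emitted as one small metadata record per frame; the field names and thresholds below are assumptions:

```python
def make_external_process_info(frame_index, zoom_change_mm, pan_speed,
                               tilt_speed, zoom_threshold=2.0,
                               angle_threshold=10.0):
    """Illustrative per-frame metadata record: flags the frame as
    needing blur processing by an external device when any change
    rate is at or above its threshold."""
    needs_blur = (abs(zoom_change_mm) >= zoom_threshold
                  or abs(pan_speed) >= angle_threshold
                  or abs(tilt_speed) >= angle_threshold)
    return {"frame": frame_index, "needs_blur": needs_blur}
```

A stream of such records, associated with the video signal, is all the external device needs to decide which frames to blur before display.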
  • in the above description, the blurring process is performed when the size of the angle of view in the camera module 10 changes (zoom operation) or when its direction changes (changes in the pan and tilt angles). However, when focal plane distortion arises from other changes in the imaging state, the same effect can be obtained by performing the blurring process in the same manner.
  • the same applies when there is sudden horizontal or vertical movement of the imaging apparatus main body rather than of the camera module.
  • the operation recognized as the relevant change in the imaging state is set appropriately according to the configuration of the camera module (imaging apparatus).
  • the fluctuation detection unit, the threshold value for the above determination, and the like are also appropriately set.
  • the degree of the blurring process can also be set appropriately. For example, when the frames in which focal plane distortion occurs are consecutive, it is not necessary to apply the same processing to all of them: when focal plane distortion occurs continuously in three or more frames, the degree of blurring may be reduced in the temporally first and last frames and increased in the central frames.
  • the threshold value for determining that the blurring process is performed may be set lower as the exposure time is longer.
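Lowering the decision threshold for longer exposures, as the last point suggests, might look like the following; the inverse scaling rule and baseline values are illustrative assumptions:

```python
def zoom_threshold_for(exposure_time_s, base_threshold=2.0,
                       base_exposure_s=1 / 60):
    """Lower the blur-decision threshold for longer exposures, since the
    same change rate produces more focal plane distortion when the
    line-readout period is longer. Inverse scaling is an assumption."""
    return base_threshold * min(1.0, base_exposure_s / exposure_time_s)
```

At the baseline exposure the threshold is unchanged; doubling the exposure halves it, making blur processing more likely exactly when the rolling-shutter skew is worst.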

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention reduces the effects of focal plane distortion without causing missing frames or pixels. Fig. 2(a) depicts video in which focal plane distortion is not present, and Fig. 2(b) depicts video in which focal plane distortion is present. Fig. 2(c) depicts video obtained by applying blur processing to the video in which focal plane distortion is present. When the three images in Fig. 2(c) are displayed in succession, the effects of focal plane distortion on the video as a whole, as perceived by a user, are reduced.

Description

Imaging device
 The present invention relates to an image pickup apparatus using an image pickup element that operates in a rolling shutter system.
 In a solid-state image pickup device, particularly a CMOS image pickup device, pixels are two-dimensionally arranged, and image capture is often performed by a rolling shutter system that sequentially exposes each horizontal line; this makes the imaging device inexpensive. In this case, the exposure timing of the pixels on an upper line and that of the pixels on a lower line are shifted relative to each other. When imaging a subject with little motion, such a shift in exposure timing generally causes no major problem.
 On the other hand, when the subject moves at high speed, or when the size or direction of the angle of view changes during image capture, such a shift in exposure timing causes the CMOS image sensor to output video that is distorted relative to the actual scene. Such image distortion is known as focal plane distortion.
 As a method of reducing the influence of such focal plane distortion, it is known, for example, to replace the image of a frame in which focal plane distortion has occurred with an image created from the images of the immediately preceding and following frames. In the simplest case, the image of the distorted frame can be replaced with a composite of the images of the preceding and following frames in which no focal plane distortion has occurred.
 In the technique described in Patent Document 1, in order to perform a more sophisticated correction, a reference time point is set within the frame in which focal plane distortion has occurred, the immediately preceding and following frame images are analyzed, and the positional relationship between each pixel at the reference time point and the corresponding pixels in those frame images is calculated. The image of the distorted frame is then corrected by linear interpolation using the images of the preceding and following frames, as if each pixel were at its position at the reference time point.
 In the technique described in Patent Document 2, an in-vehicle imaging device recognizes the movement speed information of the vehicle carrying it and performs coordinate conversion of pixels based on that information, thereby correcting the image of a frame in which focal plane distortion has occurred.
[Patent Document 1] Japanese Patent Laid-Open No. 2006-148496
[Patent Document 2] Japanese Patent Laid-Open No. 2012-222374
 When, as in the technique of Patent Document 1, the image of a frame with focal plane distortion is created from, or corrected (interpolated) using, the images of the immediately preceding and following frames without focal plane distortion, the distortion can be corrected if only a single frame is affected. However, when focal plane distortion spans multiple frames, no distortion-free frame exists immediately before and after, and the above processing becomes difficult; the frame must then either be dropped or its distorted image used as-is. Moreover, since this processing is complicated, it is difficult to perform in real time, or it consumes considerable power.
 With the technique of Patent Document 2, the correction can be highly accurate if highly accurate movement speed information is available. However, since the corrected position of each pixel differs from its actual position, the real-space range shown in the corrected image differs from that shown in the uncorrected image, and this difference becomes especially pronounced at high movement speeds. A pixel required for the corrected image may then lie outside the range of the image before correction, in which case that pixel is missing from the corrected image. As a result, the corrected video (moving image) is unnatural: frames with such missing pixels are inserted between frames in which no focal plane distortion has occurred.
 Therefore, a technology that reduces the influence of focal plane distortion without causing missing frames or pixels has been desired.
 The present invention has been made in view of this situation, and its object is to solve the above problems.
 The present invention is an imaging device using a camera module in which an image sensor operating in a rolling shutter system is combined with an optical system to output a video signal forming a moving image, the device comprising: a fluctuation detection unit that detects the temporal change rate of the imaging state in the camera module; a blur processing unit that applies blur processing to the video signal; and a control unit that instructs the blur processing unit to apply blur processing to the image of the frame corresponding to a time point at which the temporal change rate is equal to or greater than a threshold.
 The present invention is also an imaging device using a camera module in which an image sensor operating in a rolling shutter system is combined with an optical system to output a video signal forming a moving image, the device comprising: a fluctuation detection unit that detects the temporal change rate of the imaging state in the camera module; an external process information creation unit that creates external process information indicating, in association with frames of the video signal, the time points at which the temporal change rate is equal to or greater than a threshold; and a control unit that outputs the video signal and the external process information in association with each other.
 According to the present invention, it is possible to reduce the influence of focal plane distortion without causing missing frames or pixels.
FIG. 1 is a diagram showing the configuration of the imaging device according to the embodiment.
FIG. 2 is a diagram showing an example of video (when there is a zoom operation) output by the imaging device according to the embodiment.
FIG. 3 is a diagram showing an example of video (when there is a pan angle change) output by the imaging device according to the embodiment.
FIG. 4 is a diagram showing an example of video (when there is a zoom operation) output by a conventional imaging device.
FIG. 5 is a diagram showing an example of video (when there is a pan angle change) output by a conventional imaging device.
FIG. 6 is a diagram showing an example of video (when there is a pan angle change) output by another conventional imaging device.
FIG. 7 is a flowchart showing the operation of the imaging device according to the embodiment.
FIG. 8 is a diagram showing the configuration of a modification of the imaging device according to the embodiment.
Next, a mode for carrying out the present invention will be described concretely with reference to the drawings. FIG. 1 is a block diagram showing the configuration of an imaging device 1 according to an embodiment of the present invention. The imaging device 1 uses a camera module 10 in which an optical system 11 and an image sensor 12 are fixed. The optical system 11 is composed of a combination of lenses, and enlargement and reduction of the angle of view (zooming out and zooming in) are mechanically controlled by a zoom adjustment unit 13. The image sensor 12 is a two-dimensional CMOS sensor.
The camera module 10 is configured to be rotatable in the horizontal and vertical directions with respect to a main body fixed to a facility or the like. By adjusting the set angle of the camera module 10 in the horizontal direction (pan angle) and in the vertical direction (tilt angle), the imaged object (viewing direction) can be adjusted. For this purpose, a pan/tilt angle control unit 21 is provided, consisting of a pan-axis motor and a tilt-axis motor for controlling the pan angle and tilt angle of the camera module 10, together with a drive mechanism that drives the camera module with these motors. A control unit 20, implemented with a CPU or the like, controls the zoom adjustment unit 13 and the pan/tilt angle control unit 21 in response to user operations or external remote signals, and can thereby control the size and direction of the angle of view of the camera module 10.
The video signal of the moving image output from the image sensor 12 is gamma-corrected by a gamma correction unit 22. Finally, the video signal is output to the outside by a video signal output unit 23. With this video signal, the video captured by the camera module 10 can be displayed on an external display.
The image sensor 12 sequentially outputs the outputs of its two-dimensionally arranged pixels in a fixed format. The exposure timing of the pixels is set sequentially for each row (horizontal line) of the two-dimensional array, from the top line toward the bottom. That is, the image sensor 12 is driven by the rolling shutter method. For this reason, when the size or direction of the angle of view of the camera module 10 changes abruptly, focal plane distortion may occur in the video signal output from the image sensor 12.
In this imaging device 1, blur processing can be applied to the image of each frame of the output video signal. Blur processing here is a filtering process that removes spatial high-frequency components from a two-dimensional image signal, and is performed by a method using a well-known algorithm. Before reaching the video signal output unit 23, the video signal passes through a blur processing unit 24 that performs this blur processing with a filter.
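The disclosure only requires "a well-known algorithm" for removing spatial high-frequency components. As a minimal sketch, not part of the patent, such a filter can be written as a separable box (averaging) filter; the kernel size is an assumed parameter:

```python
import numpy as np

def blur_frame(frame: np.ndarray, kernel_size: int = 5) -> np.ndarray:
    """Remove spatial high-frequency components from a 2-D frame with a
    separable box filter; larger kernel_size means stronger blurring."""
    k = np.ones(kernel_size) / kernel_size  # 1-D averaging kernel
    # Convolve each row, then each column (separable 2-D low-pass filter).
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, frame)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    return blurred
```

A Gaussian or any other low-pass kernel would serve equally; the separable form keeps the per-pixel cost linear in the kernel size, which matters for the real-time operation emphasized later in the text.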
This blur processing is applied only to the images of frames in which the temporal variation of the imaging state is large. The temporal variation of the imaging state includes variation in the size of the angle of view and variation in its direction. To detect the former, a zoom operation detection unit 14 is provided in the camera module 10. To detect the latter, a pan/tilt operation detection unit 25 is provided that detects variation in the pan angle and tilt angle of the camera module 10. The zoom operation detection unit 14 and the pan/tilt operation detection unit 25 are examples of a variation detection unit for recognizing the temporal change rate of the imaging state.
The zoom operation detection unit 14 recognizes the change rate (temporal change rate) of the zoom operation. It can consist of, for example, a sensor that recognizes the lens position. The temporal change rate of the zoom operation need not be recognized with high accuracy; it is sufficient to recognize the amount of change in the lens position (focal length). The pan/tilt operation detection unit 25 consists of a sensor that recognizes the change rates of the pan angle and tilt angle, for example from the rotational speeds of the pan-axis and tilt-axis motors; the pan and tilt angles themselves need not be recognized with high accuracy. Alternatively, an acceleration sensor or the like capable of detecting the change rates of the pan and tilt angles may be used.
The zoom operation detection unit 14 and the pan/tilt operation detection unit 25 are set so that these change rates can be recognized in real time for each frame of the video signal. If the control unit 20 can likewise recognize these change rates in real time from the signals used to control the zoom adjustment unit 13 and the pan/tilt angle control unit 21, then the zoom operation detection unit 14 and the pan/tilt operation detection unit 25 for actually measuring the change rates are unnecessary, and the zoom adjustment unit 13 and the pan/tilt angle control unit 21 themselves can be used as the variation detection unit.
An example of a moving image produced by this processing will be described concretely for the case where an image containing the character "F" together with a grid at regular intervals is captured. First, the case where a zoom operation is performed without changing the pan angle or tilt angle is described. In FIG. 2, the processing that reduces the angle of view (zoom-in) is performed only during the second frame (frame 2); in frame 3, immediately after it, "F" appears enlarged compared with frame 1, immediately before it.
FIG. 2(a) shows the ideal video without focal plane distortion in this case, and FIG. 2(b) shows, frame by frame, the video when focal plane distortion is present. As described above, the image sensor 12 operates in the rolling shutter method, so in the frame-2 image of FIG. 2(b) the character "F" is displayed greatly distorted by focal plane distortion compared with the ideal case (frame 2 of FIG. 2(a)). When the images of frames 1 to 3 are output in sequence as a moving image from this video signal, the greatly distorted image of frame 2 is conspicuous.
FIG. 2(c), on the other hand, shows an example in which the above blur processing has been performed. The images of frames 1 and 3 are unchanged from FIGS. 2(a) and 2(b); blur processing is applied only to the image of frame 2, during which the zoom operation takes place. When the three images of FIG. 2(c) are displayed in succession, only the brief image in which focal plane distortion occurred becomes hard for the user to see, so the influence of focal plane distortion on the impression the user receives from the moving image as a whole is reduced.
In the example of FIG. 2 the subject is the character "F", but when the subject imaged by the imaging device 1 is a person, and particularly a well-known person, an image containing focal plane distortion could even infringe personal rights. Applying blur processing to the images of frames in which focal plane distortion is estimated to have occurred, as described above, avoids such situations.
FIG. 3 likewise shows the image of each frame when the same subject is imaged with no zoom operation at all, keeping the tilt angle constant and changing only the pan angle (horizontal angle). As before, the pan angle is assumed to change only during frame 2. In the ideal video without focal plane distortion (FIG. 3(a)), the character "F" moves from left to right between frame 1 and frame 3. In FIG. 3(b), where focal plane distortion is present and none of the above processing is performed, the character "F" in frame 2 is distorted by focal plane distortion in a manner different from FIG. 2(b). Here too, in FIG. 3(c), where the above blur processing is applied only to the image of frame 2, only the brief image in which focal plane distortion occurred becomes hard to see, and the influence of focal plane distortion on the moving image as a whole is reduced.
By contrast, consider the conventional technique in which the image of frame 2 is corrected on the basis of the images of neighboring frames. FIG. 4 shows an example of the video corresponding to FIG. 2 (FIG. 2(c)) in this case (when the zoom operation is performed during frame 2). The ideal video, and the video with focal plane distortion and no blur processing, are those of FIGS. 2(a) and 2(b), respectively. As described above, the correction aims to make the image correspond to a certain reference time; here this reference time is assumed to be the initial moment at which the frame-2 image is captured. Consequently, the part of frame 2 in FIG. 4 most affected by the correction is the lower region, whose exposure timing is particularly late.
This correction is applied to frame 2 (of FIG. 2(b)) as described above, but because the lower region of frame 2 is a zoomed-in image, the lower region that should be corrected in frame 2 (the region below the character "F") falls outside the range of the frame image used for the correction. Therefore, in the frame-2 image of FIG. 4, while the upper part yields an image like the one that should properly be obtained (frame 2 of FIG. 2(a)), in the lower part a region appears (black region) for which no corrected image can be obtained. In other words, when correction is performed using the images of the preceding and following frames in this way, the image effectively suffers a defect. In this example the imaged subject is the character "F", but as noted above, when the subject is a person or the like, such a partially missing image can be a particular problem.
FIG. 5 likewise shows the video corresponding to FIG. 3 (FIG. 3(c)) when the same correction is performed (when the pan angle changes during frame 2). The ideal video, and the video with focal plane distortion and no blur processing, are those of FIGS. 3(a) and 3(b). In this case too, a region outside the range of the image used for the correction is needed, so the corrected frame-2 image again has a defect, in a region different from that of FIG. 4.
FIG. 6 shows the case where, in the video corresponding to FIG. 3 (FIG. 3(c)), the correction replaces the image of frame 2 with a composite of frames 1 and 3. In this case the frame-2 image differs greatly from the actual image (frame 2 of FIG. 3(a)). Depending on the subject, the difference between the image obtained by such composition and the true image can become extremely large.
In the imaging device 1 of FIG. 1, as shown in FIGS. 2 and 3, no such image defect occurs in frame 2, and the difference between the processed image and the true image can be made inconspicuous. Moreover, applying blur processing to each frame's image as described above is simpler than the correction processing described in Patent Document 1 and the like, so the processing time is shorter and the resulting power consumption is lower. This processing can therefore easily be performed in real time during imaging.
FIG. 7 is a flowchart of the processing that determines whether to apply blur processing to each frame's image. This processing is performed for each frame. The control unit 20 uses the zoom operation detection unit 14 and the pan/tilt operation detection unit 25 (variation detection unit) to detect changes in the imaging state during the exposure period of the image sensor 12 (the period from the first exposure timing to the last exposure timing when obtaining the image) (S1). Here, as a quantity directly reflecting the zoom change, the zoom operation detection unit 14 can recognize, for example, the difference (mm) between the lens position (focal length) when exposure starts and when it ends as the zoom change amount, and the pan/tilt operation detection unit 25 can recognize, for example, the change speeds (°/sec) of the pan angle and tilt angle.
Next, the control unit 20 determines whether the recognized zoom change amount is equal to or greater than a predetermined threshold (S2). If it is (S2: Yes), the control unit 20 instructs the blur processing unit 24 to apply blur processing to the obtained image (S3), and the processed image is output as the image of this frame (S4).
If the recognized zoom change amount is below the predetermined threshold (S2: No), the control unit 20 determines whether the change in pan angle and tilt angle is large (S5). This determination considers the pan-angle and tilt-angle changes together. For example, the pan/tilt change can be judged large (S5: Yes) if the pan-angle change speed is at or above a preset threshold (pan-angle change-speed threshold), or if the tilt-angle change speed is at or above a preset threshold (tilt-angle change-speed threshold). Alternatively, to evaluate these angular change rates jointly, the determination may be made by comparing the sum of the squares of the pan-angle and tilt-angle change speeds against a correspondingly preset threshold. If the pan/tilt change is judged large (S5: Yes), the image is blurred as above (S3) and then output (S4).
If the pan/tilt change is not judged large (S5: No), no blur processing is applied (S6), and the image is output unchanged as the image of this frame (S4). By performing the above operation for each frame, a video signal is output in which, as described above, blur processing has been applied only to the images of frames in which focal plane distortion occurred.
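The per-frame determination of FIG. 7 (S1 to S6) can be sketched as follows. The threshold values are illustrative assumptions, and the squared-sum pan/tilt test is only one of the variants the text describes:

```python
import math

# Illustrative threshold values; the disclosure leaves the actual values
# to the implementation (they may also be user-settable).
ZOOM_THRESHOLD_MM = 0.5        # zoom change over one exposure period (S2)
ANGLE_SPEED_THRESHOLD = 10.0   # joint pan/tilt change speed, deg/sec (S5)

def needs_blur(zoom_change_mm: float, pan_speed: float, tilt_speed: float) -> bool:
    """Per-frame decision of FIG. 7: True -> blur the frame (S3),
    False -> output it unchanged (S6)."""
    if zoom_change_mm >= ZOOM_THRESHOLD_MM:                 # S2: Yes
        return True
    # S5: evaluate pan and tilt jointly; comparing the root of the squared
    # sum against one threshold is one of the variants described above.
    return math.hypot(pan_speed, tilt_speed) >= ANGLE_SPEED_THRESHOLD
```

The alternative variant in the text, separate per-axis thresholds combined with "or", would replace the last line with two independent comparisons.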
The thresholds used in the above determinations (S2, S5) are set appropriately so that blur processing is not applied in the range where the zoom change amount or the pan/tilt change speed is nonzero but small enough that focal plane distortion is inconspicuous. Alternatively, the imaging device 1 may allow the user to set these thresholds. When blur processing is applied, its degree may also be varied according to these change amounts; for example, when the change amounts are large and the degree of focal plane distortion is therefore large, the degree of blurring can be increased.
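One hypothetical way to vary the degree of blurring with the detected change, as suggested above; the linear mapping and its bounds are assumptions, not part of the disclosure:

```python
def blur_kernel_size(change_ratio: float, max_kernel: int = 15) -> int:
    """Map the normalized change rate (detected value / threshold) to a
    blur strength: no blur below the threshold, stronger blur as the
    change grows. The linear mapping and the cap are illustrative."""
    if change_ratio < 1.0:
        return 1  # a kernel of 1 leaves the frame untouched (S6)
    size = min(max_kernel, 3 + 2 * int(change_ratio))
    return size if size % 2 == 1 else size + 1  # keep kernel sizes odd
```

The returned value could feed directly into a filter such as the box blur sketched earlier.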
Next, a modification of the imaging device 1 will be described. The imaging device 1 outputs the processed video signal to the outside, where it can be displayed as-is on a display or the like. However, the same processing can also be performed outside the imaging device, immediately before the video signal is displayed. The imaging device 2 of the modification is configured with this case in mind. FIG. 8 is a block diagram, corresponding to FIG. 1, showing its configuration.
Unlike the imaging device 1, the imaging device 2 does not use the blur processing unit 24, so a video signal to which no blur processing has been applied, and in which focal plane distortion occurs in some frames, is sent to the outside via the video signal output unit 23. In place of controlling the blur processing unit 24, the control unit 20 is provided with an external processing information creation unit 26 that creates external processing information indicating the per-frame determination results (S2, S5 in FIG. 7) of whether the change in imaging conditions was large (whether blur processing is required). The external processing information may be created in any format, as long as it identifies the frames of the output video signal that require blur processing, and may also take a format that becomes metadata attached to the video signal.
With this imaging device 2, an external device receives the unprocessed video signal, performs blur processing using a blur processing unit like the one described above, and can display the resulting video just as when the imaging device 1 is used. In this case, the external device can also, as needed, display the video in which focal plane distortion is present but no blur processing has been applied. Adjustments such as the degree of blur processing also become possible on the external device.
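The disclosure leaves the format of the external processing information open ("any format ... may become metadata attached to the video signal"). As one purely illustrative encoding, a JSON sidecar could list the frame indices for which the receiving device should apply blur processing:

```python
import json

def make_external_processing_info(frame_flags):
    """Build one possible external-processing-information record from the
    per-frame decisions (True = blur required): a JSON document listing
    the indices of frames the receiving device should blur. The format
    itself is an assumption, not taken from the disclosure."""
    return json.dumps({
        "blur_frames": [i for i, flag in enumerate(frame_flags) if flag]
    })
```

The receiving device would parse this record and run its own blur processing unit over exactly the listed frames before display.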
In the above example, blur processing is applied when the size of the angle of view of the camera module 10 varies (zoom operation) or its direction varies (changes in pan and tilt angle), but the same effect is obtained by applying blur processing in the same way whenever a temporal change in the imaging state occurs that causes focal plane distortion. For example, the same applies when it is not the camera module but the imaging device body that moves abruptly in the horizontal or vertical direction. Which operations should be recognized as such changes in the imaging state is set appropriately according to the configuration of the camera module (imaging device), and the variation detection unit, the determination thresholds, and so on are set accordingly.
The degree of blur processing can also be set as appropriate. For example, when frames with focal plane distortion are consecutive, the same processing need not be applied to all of them. When focal plane distortion occurs in three or more consecutive frames, for instance, the degree of blur processing may be made small in the temporally first and last frames and large in the middle frame.
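Such graded blurring over a run of consecutive distorted frames can be sketched with a triangular weight profile, weakest at the ends and strongest in the middle; the exact profile and the 0.5 floor at the ends are assumptions:

```python
def taper_weights(n_frames: int):
    """Relative blur strength for each frame in a run of n consecutive
    distorted frames: 1.0 in the middle, falling off linearly to 0.5 at
    the first and last frames. The triangular shape is illustrative."""
    mid = (n_frames - 1) / 2.0
    return [1.0 - 0.5 * abs(i - mid) / mid if mid > 0 else 1.0
            for i in range(n_frames)]
```

Each weight could scale, for example, the blur kernel size used for the corresponding frame.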
Further, since focal plane distortion occurs more readily as the exposure time of the image sensor grows longer, it may be determined that blur processing should be applied when the exposure time exceeds a threshold. Alternatively, the thresholds for determining that blur processing should be applied (the zoom change threshold and the pan/tilt change-speed thresholds) may be lowered as the exposure time grows longer.
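A hypothetical exposure-dependent threshold along these lines; the inverse-proportional scaling and the reference exposure time are assumptions, not taken from the disclosure:

```python
def scaled_threshold(base_threshold: float, exposure_ms: float,
                     reference_exposure_ms: float = 16.7) -> float:
    """Lower the blur-decision threshold for exposures longer than the
    reference (one 60 fps frame period assumed here), since longer
    exposures make focal plane distortion more likely."""
    return base_threshold * min(1.0, reference_exposure_ms / exposure_ms)
```

At or below the reference exposure the threshold is unchanged; doubling the exposure halves it, making the blur decision correspondingly more sensitive.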
The present invention has been described above on the basis of an embodiment. This embodiment is an example; those skilled in the art will understand that various modifications of the combinations of its components are possible, and that such modifications also fall within the scope of the present invention.
1, 2 imaging device; 10 camera module; 11 optical system; 12 image sensor; 13 zoom adjustment unit; 14 zoom operation detection unit (variation detection unit); 20 control unit; 21 pan/tilt angle control unit; 22 gamma correction unit; 23 video signal output unit; 24 blur processing unit; 25 pan/tilt operation detection unit (variation detection unit); 26 external processing information creation unit

Claims (4)

  1.  An imaging device using a camera module in which an image sensor operating in a rolling shutter system and an optical system are combined to output a video signal constituting a moving image, the imaging device comprising:
     a variation detection unit that detects a temporal change rate of an imaging state in the camera module;
     a blur processing unit that applies blur processing to the video signal; and
     a control unit that instructs the blur processing unit to apply blur processing to the image of a frame corresponding to a time at which the temporal change rate is equal to or greater than a threshold.
  2.  The imaging device according to claim 1, wherein the blur processing unit is set so that the degree of the blur processing becomes larger when the temporal change rate is large.
  3.  The imaging device according to claim 1, wherein the variation detection unit recognizes, as the temporal change rate, an amount of change of a zoom operation of the optical system or a speed of an operation that changes a viewing direction of the camera module.
  4.  An imaging device using a camera module in which an image sensor operating in a rolling shutter system and an optical system are combined to output a video signal constituting a moving image, the imaging device comprising:
     a variation detection unit that detects a temporal change rate of an imaging state in the camera module;
     an external processing information creation unit that creates external processing information indicating, in association with frames in the video signal, times at which the temporal change rate is equal to or greater than a threshold; and
     a control unit that outputs the video signal and the external processing information in association with each other.
PCT/JP2020/002771 2019-02-18 2020-01-27 Imaging device WO2020170723A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021501759A JP7121852B2 (en) 2019-02-18 2020-01-27 Imaging device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-026249 2019-02-18
JP2019026249 2019-02-18

Publications (1)

Publication Number Publication Date
WO2020170723A1 true WO2020170723A1 (en) 2020-08-27

Family

ID=72145048

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/002771 WO2020170723A1 (en) 2019-02-18 2020-01-27 Imaging device

Country Status (2)

Country Link
JP (1) JP7121852B2 (en)
WO (1) WO2020170723A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010035133A (en) * 2008-02-26 2010-02-12 Canon Inc Moving image encoding apparatus and moving image encoding method
JP2016058876A (en) * 2014-09-09 2016-04-21 キヤノン株式会社 Imaging apparatus, control method thereof, and imaging system


Also Published As

Publication number Publication date
JP7121852B2 (en) 2022-08-18
JPWO2020170723A1 (en) 2021-12-02


Legal Events

Date Code Title Description

121 EP: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 20758930; Country of ref document: EP; Kind code of ref document: A1

ENP Entry into the national phase
    Ref document number: 2021501759; Country of ref document: JP; Kind code of ref document: A

NENP Non-entry into the national phase
    Ref country code: DE

122 EP: PCT application non-entry in European phase
    Ref document number: 20758930; Country of ref document: EP; Kind code of ref document: A1