WO2011155161A1 - Image processing apparatus and image processing method - Google Patents
- Publication number
- WO2011155161A1 (PCT/JP2011/003105)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- inclination
- information
- angle
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20061—Hough transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the present invention relates to an image processing apparatus and an image processing method, and more particularly to an image correction technique.
- Japanese Patent Application Laid-Open No. 2004-228688 provides a method in which an acceleration sensor is provided in an imaging apparatus, the inclination of the imaging apparatus with respect to the direction of gravity is detected by measuring gravitational acceleration, and the tilt of the captured image is corrected based on the detected tilt angle.
- Patent Document 2 discloses a method of detecting line segments present in a captured image, estimating the tilt angle of the entire image from the tilt angles of those line segments, and correcting the tilt of the captured image.
- Patent Document 3 discloses a technique for improving the accuracy of tilt angle calculation by using a combination of a sensor and image processing.
- Patent Document 4 discloses a method in which, when calculating the tilt angle of an image by image processing, the image is divided into small blocks, the directionality of the texture in each block is determined, and the tilt angle is calculated only for regions whose texture is unidirectional. This is equivalent to extracting tilt information only from the structures in the image, and contributes to improving the accuracy and stability of the tilt angle calculation.
- However, because the sensor output value includes fluctuation components such as inertial noise and cross-axis sensitivity, highly accurate tilt correction is difficult. Further, the method of Patent Document 2 places limitations on the composition of the captured image, which is not practical in general usage situations.
- To compensate for the above drawbacks, Patent Document 3 and Patent Document 4 propose combining a sensor with image processing. That is, they show a method of selecting, using sensor information, an angle component that satisfies a certain criterion from among a plurality of tilt angle candidates; however, when a fluctuation component is superimposed on the output value of the sensor, a decrease in accuracy is inevitable.
- Patent Document 4 also discloses a method of improving accuracy by removing in advance the tilt angle information from non-structures, which can act as a fluctuation component in tilt angle estimation. However, this is ineffective when the tilt angle information from a structure is itself the fluctuation component.
- FIGS. 1A and 1B are diagrams for explaining an example in which inclination angle information from a structure is a fluctuation component.
- FIG. 1A is an image taken using a wide-angle lens such as a fisheye lens.
- FIG. 1B is an example of an image obtained by cutting out the vicinity of the center of FIG. 1A and performing distortion correction.
- FIG. 2 is an angle histogram of the image shown in FIG. 1B, in which the luminance gradient of each pixel is represented with the angle on the horizontal axis and the frequency on the vertical axis. If this image is tilted by an angle θ, the bin at θ should ideally be the mode, but it can be seen that the frequency of the bin at the angle obtained from the horizontal lines of the structure exceeds it. Since this characteristic is particularly noticeable in wide-angle images, it cannot be resolved by the method of removing non-structures in advance disclosed in Patent Document 4.
- Patent Document 5 shows a method in which a vanishing point candidate and edges extracted from the image are connected by line segments, a histogram accumulating the frequencies of the tilt angles of the line segments that satisfy a predetermined criterion is computed while the vanishing point candidate is varied, and the candidate with the highest frequency is taken as the vanishing point. If the vanishing point can be determined, the horizontal lines of the structure can be identified and the fluctuation component from the structure removed. However, when the method disclosed in Patent Document 5 is used, the vertical lines of the structure may also be removed. In addition, the predetermined criterion must be set in advance, so the method cannot be used when no assumption can be made about the tilt angle at which the image was taken.
- An object of the present invention is to select, from the edge components obtained from an image and their tilt angles, the information that can be used for estimating the tilt angle of the image, thereby making it possible to estimate the tilt angle and correct the tilt of the image.
- Patent Document 1: JP 2006-245726 A
- Patent Document 2: Japanese Patent No. 3676360
- Patent Document 3: International Publication No. WO 2009/001512
- Patent Document 4: International Publication No. WO 2009/008174
- Patent Document 5: JP 63-106875 A
- An image processing apparatus according to the present invention includes: an image acquisition unit; a tilt information calculation unit that calculates, for each pixel of the image acquired by the image acquisition unit, a plurality of pieces of tilt information for estimating the tilt of the image; an angle histogram generation unit that generates an angle histogram, a frequency distribution over tilt angles, from the plurality of pieces of tilt information calculated by the tilt information calculation unit; a tilt information-position recording unit that records the correspondence between each piece of tilt information and the position on the image from which it was obtained; a tilt information distribution degree calculation unit that calculates the degree of spatial distribution of the tilt information from the tilt information and its positions; a false information determination unit that determines, from the distribution degree, which tilt information can and cannot be used for tilt estimation; and a peak detection unit that extracts the maximum or local maxima of the angle histogram using only the tilt information determined by the false information determination unit to be usable.
- The plurality of pieces of tilt information for estimating the tilt may be obtained by detecting line segments in the image together with their tilt angles, and the tilt information calculation unit may perform the calculation only over a predetermined range of the image, at a predetermined pixel interval, determined in advance.
- Alternatively, the plurality of pieces of tilt information for estimating the tilt may be calculated by substituting the luminance information in the image into a predetermined arithmetic expression to obtain a luminance gradient, and the tilt information calculation unit may likewise restrict the calculation to a predetermined range of the image at a predetermined pixel interval.
- The angle histogram generation unit may weight each piece of tilt angle information, increasing the weight with the length of the line segment detected from the image.
- The angle histogram generation unit may also weight each piece of tilt angle information, increasing the weight as the luminance gradient detected from the image becomes closer to vertical.
- The distribution degree of the tilt information may be the variance or standard deviation of the positions for each group of tilt angles that are identical or within a predetermined range of each other.
- The false information determination unit may compare the calculation result of the tilt information distribution degree calculation unit with a preset value, and determine a tilt angle whose distribution degree falls within a certain or predetermined range to be false information. The tilt estimation unit may then select, from among the tilt angles whose frequency in the angle histogram satisfies a predetermined criterion, the angle with the highest frequency among those determined by the false information determination unit not to be false information.
- the image processing apparatus may further include an image correction unit that corrects an image according to the tilt information output by the tilt estimation unit.
- This configuration makes it possible to correct an image based on the tilt information estimated by the tilt estimation unit.
- The image processing apparatus may further include an image storage unit that stores the acquired images, and a panoramic image creation unit that selects and reads out at least two of the images stored in the image storage unit and combines them to create a panoramic image. The tilt information calculation unit then performs a predetermined calculation on each pixel of the panoramic image to extract a plurality of pieces of information for estimating the tilt of the panoramic image.
- The tilt estimation unit may estimate the tilt angle of the panoramic image and, using the relative rotation angles between the two or more images constituting the panoramic image, calculate the tilt angle of each of those images.
- According to the present invention, when estimating the tilt angle of an image, for example an image shot with a wide-angle lens in which the horizontal components of a structure are dominant, the information that correctly represents the tilt angle of the image can be selected, and the tilt angle can be estimated and corrected correctly.
- FIG. 1A is a diagram illustrating an image photographed using a wide-angle lens for explaining an example in which tilt angle information from a structure is a fluctuation component.
- FIG. 1B is a diagram illustrating an image in which the vicinity of the center in FIG. 1A is cut out and distortion is removed for explaining an example in which the tilt angle information from the structure is a fluctuation component.
- FIG. 2 is a diagram showing an angle histogram of the image of FIG. 1B.
- FIG. 3 is a diagram showing the configuration of the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 4 is a flowchart showing the image processing method according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram for explaining the luminance gradient at points on the line segment.
- FIG. 6A is a diagram illustrating a coefficient matrix used in the Sobel filter.
- FIG. 6B is a diagram illustrating a coefficient matrix used in the Sobel filter.
- FIG. 7A is a diagram for explaining parameters characterizing a line segment in the Hough transform.
- FIG. 7B is a diagram for explaining parameters characterizing a line segment in the Hough transform.
- FIG. 8 is a diagram for explaining an example of a format for recording an edge extracted from an image and its parameters.
- FIG. 9A is a diagram showing the difference in the spatial distribution characteristics of the vertical and horizontal lines of the structure.
- FIG. 9B is a diagram showing the difference in the spatial distribution characteristics of the vertical and horizontal lines of the structure.
- FIG. 9C is a diagram illustrating the difference in the spatial distribution characteristics of the vertical and horizontal lines of the structure.
- FIG. 9D is a diagram showing the difference in the spatial distribution characteristics of the vertical and horizontal lines of the structure.
- FIG. 10 is a diagram showing the configuration of the image processing apparatus according to Embodiment 2 of the present invention.
- FIG. 11A is a diagram illustrating an example of creating a panoramic image from a plurality of images and advantages of using the panoramic image in the second embodiment of the present invention.
- FIG. 11B is a diagram illustrating an example of creating a panoramic image from a plurality of images and advantages of using a panoramic image in the second embodiment of the present invention.
- FIG. 12 is a diagram showing the configuration of the image processing apparatus according to Embodiment 3 of the present invention.
- FIG. 13 is a diagram illustrating an example of triaxial output values of the acceleration sensor.
- FIG. 14 is a diagram illustrating an example of output values in the x and y directions when tilt detection is performed using an acceleration sensor.
- Embodiment 1 will now be described with reference to FIG. 3, which shows the configuration of the image processing apparatus according to Embodiment 1 of the present invention, and FIG. 4, which is a flowchart of the corresponding image processing method.
- the image acquisition unit 301 takes the image data acquired by the imaging unit 307 into the image processing apparatus 300 and sends it to the inclination information calculation unit 302 (S401).
- the inclination information calculation unit 302 calculates inclination information for the captured image (S402).
- For example, the image is converted to grayscale, vertical and horizontal Sobel filters are applied to each pixel, and the results are combined to calculate a gradient vector; alternatively, a Hough transform that extracts line segments from the image may be used. The present invention is not limited to these methods.
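As an illustrative sketch of the gradient-vector step (not the patent's implementation; it assumes the standard 3×3 Sobel coefficients, which FIGS. 6A and 6B are understood to show):

```python
import numpy as np

# 3x3 Sobel coefficient matrices for contrast change in x and y (cf. FIGS. 6A/6B).
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def gradient_vectors(gray):
    """Apply vertical and horizontal Sobel filters to each interior pixel of a
    grayscale image and combine the results into per-pixel gradient vectors."""
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = gray[y - 1:y + 2, x - 1:x + 2]
            gx[y, x] = np.sum(patch * SOBEL_X)
            gy[y, x] = np.sum(patch * SOBEL_Y)
    magnitude = np.hypot(gx, gy)          # gradient strength per pixel
    angle = np.degrees(np.arctan2(gy, gx))  # gradient direction per pixel
    return gx, gy, magnitude, angle
```

A vertical step edge yields a purely horizontal gradient (angle 0°), which is the raw material for the angle histogram described below in the embodiment.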
- When extracting line segments from an image, the image may be divided into a plurality of small regions, and each small region may be checked for whether it contains a line segment or a pixel having the tilt angle to be determined (that is, whether such a line segment or pixel is contained within a predetermined range examined at a predetermined pixel interval). With this configuration, the processing load of the tilt information calculation unit 302, or of the stages following it, can be reduced.
- FIGS. 6A and 6B show the filter coefficients for measuring the contrast change in the x direction and the y direction, respectively.
- line segment detection by Hough transform will be described with reference to FIGS. 7A and 7B.
- As shown in FIG. 7A, a straight line passing through the point (x_i, y_i) satisfies the following equation:
- ρ = x_i·cos θ + y_i·sin θ
- where ρ represents the distance between the straight line and the origin, and θ represents the angle between the perpendicular dropped from the origin to the straight line and the x axis.
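The (ρ, θ) voting described by this equation can be sketched as a minimal Hough accumulator (an illustration only; the discretization and parameter names are assumptions, not the patent's):

```python
import numpy as np

def hough_lines(edge_points, img_diag, theta_steps=180):
    """Minimal Hough accumulator: each edge point (x, y) votes for every
    (rho, theta) pair satisfying rho = x*cos(theta) + y*sin(theta)."""
    thetas = np.deg2rad(np.arange(0, 180, 180 / theta_steps))
    rho_max = int(np.ceil(img_diag))
    acc = np.zeros((2 * rho_max + 1, len(thetas)), dtype=int)
    for x, y in edge_points:
        for t_idx, theta in enumerate(thetas):
            rho = int(round(x * np.cos(theta) + y * np.sin(theta)))
            acc[rho + rho_max, t_idx] += 1  # vote for this line hypothesis
    return acc, thetas, rho_max

def strongest_line(acc, thetas, rho_max):
    """Return (rho, theta in degrees) of the cell with the most votes."""
    r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return r_idx - rho_max, np.rad2deg(thetas[t_idx])
```

Collinear points concentrate their votes in one accumulator cell, so the strongest cell gives the line's distance from the origin and its angle.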
- The angle histogram generation unit 303 generates an angle histogram by, for example, accumulating the gradient vectors having the same direction component, or adding up the lengths of the line segments obtained by the Hough transform that have the same direction component (S403).
- This angle histogram is an index representing how many edges exist in each angular direction. In general, when an untilted (and undistorted) image shows only a structure, the horizontal and vertical line components dominate, so the histogram has strong peaks at 0 degrees and 90 degrees. If the image is tilted by θ as in FIGS. 1A and 1B, the resulting angle histogram is the untilted histogram translated by θ, as shown in FIG. 2.
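The histogram accumulation of S403 can be sketched as follows (a hypothetical illustration; the 1-degree bin width and the folding into [0, 180) are assumptions):

```python
import numpy as np

def angle_histogram(angles_deg, weights=None, bin_width=1.0):
    """Build a frequency distribution over tilt angles (cf. S403).
    Angles are folded into [0, 180) since an edge direction is unoriented.
    Optional weights implement, e.g., line-length or gradient weighting."""
    angles = np.asarray(angles_deg, dtype=float) % 180.0
    if weights is None:
        weights = np.ones_like(angles)
    n_bins = int(180 / bin_width)
    hist = np.zeros(n_bins)
    for a, w in zip(angles, weights):
        hist[int(a // bin_width) % n_bins] += w
    return hist

def mode_angle(hist, bin_width=1.0):
    """Angle of the bin with the highest frequency (the histogram mode)."""
    return int(np.argmax(hist)) * bin_width
```

With uniform weights the mode follows the most numerous edge direction; with weights it follows the most heavily weighted one, as in the weighting variants described above.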
- the present invention is not limited to this.
- When obtaining a luminance gradient from an image, the image may be divided into a plurality of small areas, and each small area may be checked for whether it contains a pixel having the same tilt angle in the direction at ±90° to the obtained tilt angle (that is, whether a pixel with the same tilt angle is found, at a predetermined pixel interval within a predetermined range, in the direction perpendicular to the obtained tilt angle).
- the processing load of the inclination information calculation unit 302 or the processing load after the inclination information calculation unit 302 can be reduced.
- In the simplest processing, the peak detection unit 304 searches for the angle giving the maximum value of the angle histogram, the tilt estimation unit 305 uses that angle directly as the tilt angle of the image, and the image correction unit 306 corrects the tilt by that angle.
- Alternatively, the peak detection unit 304 may select several local maxima including the global maximum, and the tilt estimation unit 305 may compute an envelope that smoothly connects them by polynomial interpolation, the least squares method, the gradient method, or the like; the maximum of this envelope is then calculated analytically and used as the tilt angle of the image, and the tilt is corrected by the image correction unit 306.
- The image correction unit 306 corrects the rotation by an affine transformation. In the present invention, the same processing is used both for the estimation of the tilt angle from the angle histogram and for the correction by the affine transformation.
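A minimal sketch of the affine rotation correction (nearest-neighbor inverse mapping about the image center; an illustration under these assumptions, not the patent's code):

```python
import numpy as np

def rotate_image(img, angle_deg):
    """Correct tilt by rotating the whole image about its center with an
    affine (pure rotation) transform, using inverse mapping so every output
    pixel is filled, and nearest-neighbor sampling."""
    h, w = img.shape
    theta = np.deg2rad(angle_deg)
    c, s = np.cos(theta), np.sin(theta)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            # Inverse rotation: where did this output pixel come from?
            sx = c * (x - cx) + s * (y - cy) + cx
            sy = -s * (x - cx) + c * (y - cy) + cy
            isx, isy = int(round(sx)), int(round(sy))
            if 0 <= isx < w and 0 <= isy < h:
                out[y, x] = img[isy, isx]
    return out
```

Inverse mapping (looping over output pixels and sampling the source) avoids the holes that forward mapping would leave.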
- the tilt information-position recording unit 310 records the tilt information obtained by the tilt information calculation unit 302 in association with the position in the image (S411).
- FIG. 8 is an example in which the tilt information and the coordinates representing the position in the image are recorded in association with each other.
- the position of the tilt information in the image is not limited to coordinates, and may be vector data or the like as long as the position can be expressed.
- In FIG. 8, a frame ID, edge ID, x/y-direction edge components, angle, edge strength, coordinates, and so on are recorded in association with each other. A feature amount computed at extraction time may be added, or a minimal configuration may be used, such as edge ID, x/y-direction edge components, and coordinates only, or edge ID, angle, and coordinates only.
- The tilt information distribution degree calculation unit 311 extracts from the database a plurality of pieces of tilt information whose angles fall within a predetermined angular width, and calculates the variance or standard deviation of their coordinates (S412). Here, V_x and V_y denote the variances of the coordinates in the x and y directions (Equation 1), and σ_x and σ_y denote the corresponding standard deviations (Equation 2).
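The computation of S412 can be sketched as follows (an illustration; the record layout and the angular-width parameter are assumptions):

```python
import numpy as np

def distribution_degree(records, center_angle, width=2.0):
    """For all tilt-information records whose angle lies within +/- width/2 of
    center_angle, compute the variance (cf. Equation 1) and standard deviation
    (cf. Equation 2) of their x and y coordinates (step S412).
    records: iterable of (angle_deg, x, y) tuples."""
    xs, ys = [], []
    for angle, x, y in records:
        if abs(angle - center_angle) <= width / 2.0:
            xs.append(x)
            ys.append(y)
    xs, ys = np.array(xs), np.array(ys)
    v_x, v_y = xs.var(), ys.var()
    return {"V_x": v_x, "V_y": v_y,
            "sigma_x": np.sqrt(v_x), "sigma_y": np.sqrt(v_y)}
```

Edges lying along a single horizontal line give a y-variance near zero while the x-variance stays large, which is exactly the locality signature used in the determination below.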
- FIGS. 9B to 9D show the extracted horizontal lines of the structure. A horizontal line of the structure has the same tilt angle on the image only along the same horizontal line (as in FIG. 9D), or at most its tilt angle coincides with that of the horizontal line of the structure on the opposite side of the vanishing point.
- The false information determination unit 312 determines whether a given tilt angle is false information by evaluating its locality, using the features of structure edges described above (S413 to S415).
- A specific method of determining locality is to compute Equation 1 or Equation 2 after rotating the coordinates of each edge by its tilt angle: for a set of tilt angles having locality, as in FIGS. 9B to 9D, the variance or standard deviation in the x or y direction after rotation becomes a very small value, whereas in FIG. 9A both the x and y directions retain values of a certain size. This characteristic can be used for the determination.
- Alternatively, a method may be considered in which the image is divided into a plurality of small areas and each small area is examined to determine whether it contains a line segment or a pixel having the tilt angle to be determined. It may also be checked whether a pixel with the same tilt angle exists in the direction at ±90° to the obtained tilt angle, or the result of the Hough transform may be used as it is.
- the locality determination method is not limited to the method described here, and any method may be used as long as it can determine the locality of the feature amount in the image.
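One concrete form of the rotate-then-measure-variance test can be sketched as follows (an illustration; the threshold value is a hypothetical parameter, not a value from the patent):

```python
import numpy as np

def is_false_information(points, tilt_angle_deg, threshold=1.0):
    """Locality test for one tilt-angle candidate (cf. S413-S415, a sketch).
    Rotate the edge positions by the candidate tilt angle; if the variance
    along either axis after rotation is very small, the angle comes from a
    localized set of edges (e.g. a single horizontal line of a structure)
    and is judged to be false information for tilt estimation."""
    pts = np.asarray(points, dtype=float)
    theta = np.deg2rad(tilt_angle_deg)
    c, s = np.cos(theta), np.sin(theta)
    rot = pts @ np.array([[c, -s], [s, c]]).T  # rotate each (x, y) by theta
    v_x, v_y = rot[:, 0].var(), rot[:, 1].var()
    return min(v_x, v_y) < threshold
```

Points spread over the whole image (the FIG. 9A case) keep both variances large and survive the test; points confined to one line (the FIG. 9B-9D cases) collapse one variance to near zero and are rejected.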
- The peak detection unit 304 and the tilt estimation unit 305 receive, from the angle histogram generation unit 303, information such as the tilt angles giving the local maxima of the angle histogram or the tilt angle giving the mode, together with the false information determination unit 312's decision for each tilt angle. Among the tilt angles determined not to be false information, the angle giving the mode is taken as the estimated tilt angle, and this information is sent to the image correction unit 306 (S404, S405).
- The image correction unit 306 can correct the amount of rotational movement between images by rotating the entire image, for example by performing an affine transformation.
- If no valid tilt angle remains, the tilt estimation unit 305 may discard the estimation result and use the angle estimated immediately before as the tilt angle. Alternatively, an appropriate error tolerance range (for example, ±1°) may be set around the tilt angle estimated immediately before by the tilt estimation unit 305, and the tilt angle within this range that gives the highest frequency among those determined not to be false information may be selected as the estimated angle.
- the image correction unit 306 finally rotates the entire image to complete the correction of the target image (S406).
- The corrected image may be stored in the storage unit 308 and then displayed on a monitor device (not shown), saved as a file on an external storage device (not shown), or transmitted via a network.
- The angle histogram generation unit 303 may weight each piece of tilt angle information, increasing the weight with the length of the line segment detected from the image, because a longer line segment is more likely to be a reliable horizontal or vertical line component.
- The angle histogram generation unit 303 may also weight each piece of tilt angle information, increasing the weight as the luminance gradient detected from the image becomes closer to vertical.
- A buffer for storing the estimated tilt angles may also be provided. This buffer stores the estimated tilt angles calculated in time series; when the tilt angle of a single image cannot be calculated, it may be estimated by interpolating between the successful estimates before and after it. This works because, in a device that acquires continuous images such as video, the tilt angle of the image generally changes continuously; using this property, the tilt angle of the target image can also be estimated by extrapolation from past estimation results.
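The buffered interpolation described above can be sketched as follows (an illustration only; linear interpolation and nearest-value extrapolation at the ends are assumptions):

```python
def fill_missing_angles(angles):
    """Given a time series of estimated tilt angles, with None marking frames
    whose estimation failed, fill each gap by linear interpolation between the
    nearest successful estimates before and after it; gaps at either end copy
    the nearest known value (a simple form of extrapolation)."""
    out = list(angles)
    known = [i for i, a in enumerate(out) if a is not None]
    if not known:
        return out  # nothing to interpolate from
    for i, a in enumerate(out):
        if a is not None:
            continue
        prev = max((k for k in known if k < i), default=None)
        nxt = min((k for k in known if k > i), default=None)
        if prev is None:
            out[i] = out[nxt]
        elif nxt is None:
            out[i] = out[prev]
        else:
            t = (i - prev) / (nxt - prev)
            out[i] = out[prev] + t * (out[nxt] - out[prev])
    return out
```

This exploits the continuity assumption directly: a missing frame between estimates of 0° and 2° is filled with 1°.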
- FIG. 10 is a diagram showing the configuration of the image processing apparatus according to Embodiment 2 of the present invention. In FIG. 10, the same components as those in FIG. 3 are denoted by the same reference numerals, and their description is omitted.
- The image storage unit 1001 records the image data obtained by the image acquisition unit 301 for a predetermined time width, and has a FIFO (first-in first-out) configuration.
- the panorama image creation unit 1002 sequentially reads the image data recorded in the image storage unit 1001 and creates a panorama image.
- A known method may be used to create the panoramic image itself. When a panoramic image is created, the relative tilt angles between the plurality of source images are removed.
- the panorama image is sent to the tilt information calculation unit 302, and thereafter, the tilt angle is estimated and the panorama image is corrected by the same method as in the first embodiment.
- FIGS. 11A and 11B are examples of panoramic images.
- FIG. 11A shows a single panoramic image created by combining four images: the overlapping portions of images 1 through 4 are detected, and the images are combined so that the displacement in those portions is eliminated.
- FIG. 11B is an example of line segments and pixels with the same tilt angle extracted from the panoramic image. For a single frame such as image 3 in FIG. 11B, there may be no difference in the locality of the spatial distribution between the vertical lines of the structure (solid lines) and the horizontal lines of the structure (dotted lines). In such a case it is difficult to determine false information according to the present invention, but it can be seen that the locality becomes pronounced when the panoramic image is used.
- By creating the panoramic image, the false information in it can thus be identified and the tilt angle estimated; the tilt angle of each individual image can then be determined by feeding this information back to the single images. That is, the panorama image creation unit 1002 stores, at panorama creation time, the tilt angle of each image relative to the reference image or to its adjacent image; by adding or subtracting these relative angles from the result obtained by the tilt estimation unit 305, tilt correction is performed for each of the plurality of images.
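The feedback step reduces to simple angle arithmetic (a sketch; the sign convention and the assumption that relative angles are stored with respect to the reference image are illustrative):

```python
def per_image_tilts(panorama_tilt, relative_angles):
    """Given the tilt angle estimated for the panoramic image and the stored
    rotation of each source image relative to the reference image (recorded
    at panorama creation time), recover the tilt angle of each individual
    image by adding the relative angle (cf. the feedback in Embodiment 2)."""
    return [panorama_tilt + rel for rel in relative_angles]
```

For example, if the panorama is estimated to be tilted 3° and image 2 was rotated 1.5° relative to the reference during stitching, image 2 is corrected by 4.5°.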
- FIG. 12 is a diagram showing the configuration of the image processing apparatus according to Embodiment 3 of the present invention. In FIG. 12, the same components as those in FIG. 3 are denoted by the same reference numerals, and their description is omitted.
- the sensor unit 1201 is installed inside or outside the imaging unit 307, and measures and outputs the movement of the imaging unit 307 at a predetermined cycle.
- FIG. 13 shows an example of sensor output.
- the sensor unit 1201 includes an acceleration sensor, a gyro sensor, an orientation sensor, and the like.
- The sensor unit 1201 requires three measurement axes to measure all movements of the imaging unit 307, but when the movements to be measured are restricted, three axes are not necessarily required.
- Image data and sensor data are handled so as to be synchronized inside the image processing apparatus 1200.
- the sensor acquisition unit 1202 processes the acquired sensor data and calculates auxiliary information for calculating the estimated tilt angle.
- The false information determination unit 312 uses the information from the sensor acquisition unit 1202 to estimate which direction in the image corresponds to vertical, and determines which of the line segments or pixel tilt angles extracted from the image are likely to be vertical lines of the structure. Specifically, the inner product of the unit vector indicating the vertical direction obtained from the sensor information and a unit vector having the tilt angle extracted from the image is computed, and an edge whose result is close to 1 (or, when the tilt angle is represented by a normal vector, close to 0) is judged likely to be a vertical line.
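The inner-product check can be sketched as follows (an illustration; the tolerance value is a hypothetical parameter, and the absolute value folds the two opposite orientations of an unoriented edge together):

```python
import numpy as np

def is_likely_vertical(edge_angle_deg, vertical_deg, tol=0.98):
    """Compare an edge's direction with the vertical direction estimated from
    the sensor: form unit vectors for both and take their inner product. A
    value near +/-1 means the edge is nearly parallel to vertical and is
    likely a vertical line of a structure (cf. the check in unit 312)."""
    e = np.deg2rad(edge_angle_deg)
    v = np.deg2rad(vertical_deg)
    edge_vec = np.array([np.cos(e), np.sin(e)])
    vert_vec = np.array([np.cos(v), np.sin(v)])
    return abs(edge_vec @ vert_vec) >= tol
```

The inner product of two unit vectors equals the cosine of the angle between them, so this is a direct thresholded angle comparison.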
- FIG. 14 is an example showing changes in acceleration in the x and y directions.
- FIG. 14 shows the acceleration obtained in each direction at the timings (a), (b), and (c), mapped onto a two-dimensional plane, and vectorized.
- The vectors at (a), (b), and (c) indicate the tilt angle with respect to the vertical direction at the moments the imaging unit 307 captured the images; the tilt can be removed by rotating each image so that this direction points vertically.
- However, the sensor value may not be accurate due to inertial noise or cross-axis sensitivity. In that case, correction by the method set forth in the present invention is required, and the effect of the present invention can be further enhanced by inputting the tilt direction measured by the sensor to the tilt estimation unit 305.
- As a specific method, a predetermined range is set around the direction indicated by the sensor, and the mode of the frequencies within that range in the angle histogram obtained by the angle histogram generation unit 303 is taken as the tilt angle.
- The range may be a fixed value or a variable value. If variable, it may be changed according to the magnitude of the movement, that is, the amplitude or stability of the sensor (for which the variance within a predetermined time window can be used). When the movement is small, the error of the sensor output value is judged to be small and the range is set narrow; when the movement is large, the error is judged to be large and the range is set wide. The change may be continuous, or may take discrete values in two or more steps.
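The sensor-restricted search can be sketched as follows (an illustration; the window half-widths and the variance threshold are hypothetical values, not values from the patent):

```python
def tilt_within_sensor_range(hist, sensor_angle, sensor_var,
                             narrow=3, wide=10, var_threshold=0.5):
    """Take the mode of the angle histogram only within a window centered on
    the direction indicated by the sensor. The window is narrow when the
    sensor output is stable (small variance over a recent time window) and
    wide when it is not, as described above. hist has one bin per degree."""
    half = narrow if sensor_var < var_threshold else wide
    n = len(hist)
    idxs = [(int(round(sensor_angle)) + d) % n for d in range(-half, half + 1)]
    return max(idxs, key=lambda i: hist[i])  # mode restricted to the window
```

A spurious global peak far from the sensor direction (for example, from the dominant horizontal lines of a wide-angle image) is thereby excluded from the search.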
- When the information for calculating the estimated angle cannot be obtained from the image, the estimated angle may be calculated from past estimation results, from the estimation results before and after, and from the sensor values. In that case, applying the assumption that the tilt angle changes continuously when images are acquired continuously in time series, a tolerance may be set around the already estimated angle, and if the sensor value falls within that range, it may be used as the correction value.
- Each of the above devices is specifically a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
- a computer program is stored in the RAM or hard disk unit.
- Each device achieves its functions by the microprocessor operating according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- a part or all of the components constituting each of the above devices may be configured by one system LSI (Large Scale Integration).
- The system LSI is a super multifunctional LSI manufactured by integrating a plurality of components on one chip, and is specifically a computer system including a microprocessor, a ROM, a RAM, and the like.
- a computer program is stored in the RAM.
- the system LSI achieves its functions by the microprocessor operating according to the computer program.
- a part or all of the constituent elements constituting each of the above devices may be constituted by an IC card or a single module that can be attached to and detached from each device.
- the IC card or the module is a computer system including a microprocessor, a ROM, a RAM, and the like.
- the IC card or the module may include the super multifunctional LSI described above.
- the IC card or the module achieves its function by the microprocessor operating according to the computer program. This IC card or this module may have tamper resistance.
- The present invention may be the methods described above. The present invention may also be a computer program that realizes these methods by a computer, or a digital signal composed of the computer program.
- The present invention may also be a computer-readable recording medium on which the computer program or the digital signal is recorded, such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc), or a semiconductor memory.
- The present invention may also be the digital signal recorded on these recording media.
- the computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, a data broadcast, or the like.
- the present invention may be a computer system including a microprocessor and a memory, the memory storing the computer program, and the microprocessor operating according to the computer program.
- The program or the digital signal may be recorded on the recording medium and transferred, or transferred via the network or the like, so that it can be executed by another independent computer system.
- The image processing apparatus can be incorporated into a photographing apparatus, an image display apparatus, or a video display apparatus, thereby correcting the tilt of an acquired image and generating an image with the correct orientation. Even for images that were difficult to handle with conventional image-processing-based tilt correction apparatuses, the tilt information of a desired image can be extracted by integrating a plurality of pieces of image information.
- The present invention is applicable not only to photographing apparatuses and display apparatuses but also to tilt correction in other devices that handle images, such as printers and scanners.
Description
FIG. 3 shows the configuration of the image processing apparatus according to Embodiment 1 of the present invention. FIG. 4 is a flowchart showing the image processing method according to Embodiment 1 of the present invention.
dx = ∂I(P)/∂x, dy = ∂I(P)/∂y.
Letting θ be the direction of contrast change,
θ = tan^(-1)(dy/dx)
holds, and this corresponds to the direction of the gradient vector described above. The Sobel filters shown in FIGS. 6A and 6B are used to obtain dx and dy; FIGS. 6A and 6B are the filter kernels for measuring contrast change in the x direction and the y direction, respectively. Next, line-segment detection by the Hough transform is described with reference to FIGS. 7A and 7B. In FIG. 7A, a straight line passing through the point (x_i, y_i) satisfies
ρ = x_i cos θ + y_i sin θ.
Here, ρ is the distance between the straight line and the origin, and θ is the angle between the x-axis and the perpendicular dropped from the origin to the line.
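The gradient-direction and Hough-transform steps above can be sketched as follows. This is a minimal, illustrative Python version: the 3×3 Sobel kernels are the standard ones corresponding to FIGS. 6A/6B, while the accumulator resolution and the `round`-based binning are assumptions made for the sketch.

```python
import math

# Standard 3x3 Sobel kernels for the x- and y-direction contrast change.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def gradient_direction(image, x, y):
    """Gradient direction theta = atan(dy/dx) at interior pixel (x, y)."""
    dx = dy = 0.0
    for j in range(3):
        for i in range(3):
            p = image[y + j - 1][x + i - 1]
            dx += SOBEL_X[j][i] * p
            dy += SOBEL_Y[j][i] * p
    return math.atan2(dy, dx)

def hough_accumulate(points, n_theta=180):
    """Vote each edge point into a (rho, theta) accumulator using
    rho = x*cos(theta) + y*sin(theta); collinear points accumulate
    in the same cell, and the best-supported line is returned as
    (rho, theta_index)."""
    votes = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            votes[(rho, t)] = votes.get((rho, t), 0) + 1
    return max(votes, key=votes.get)
```

For a vertical luminance edge the gradient direction is horizontal (θ = 0), and four points on the line y = 2 all vote for the cell (ρ = 2, θ = 90°).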
(Equation 2) σ_x = √(V_x), σ_y = √(V_y)
Here, x̄ and ȳ are the mean values of the respective coordinates, n is the number of edges, and V_x and V_y denote the variances of the x- and y-coordinates. Each piece of tilt information registered in this database belongs to one of the sets of angle ranges, and the standard deviation or variance of the coordinates is obtained for every such set. Note that the calculation method is not limited to this, as long as it obtains the degree of dispersion of a set.
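The per-angle-bin dispersion of Equation 2 can be sketched as follows. The helper names and the judgment direction (treating a tightly clustered angle bin as unusable, e.g. cues coming from a single local object) are illustrative assumptions; the text only requires comparing the dispersion with a preset value.

```python
import math

def position_spread(entries):
    """sigma_x and sigma_y of Equation 2 for the (x, y) positions of the
    tilt cues that fall into one angle bin."""
    n = len(entries)
    mean_x = sum(x for x, _ in entries) / n
    mean_y = sum(y for _, y in entries) / n
    var_x = sum((x - mean_x) ** 2 for x, _ in entries) / n
    var_y = sum((y - mean_y) ** 2 for _, y in entries) / n
    return math.sqrt(var_x), math.sqrt(var_y)

def is_false_information(entries, threshold):
    """Judge an angle bin as false information when its positions are too
    concentrated; the threshold and comparison direction are assumptions."""
    sigma_x, sigma_y = position_spread(entries)
    return max(sigma_x, sigma_y) < threshold
```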
FIG. 10 shows the configuration of the image processing apparatus according to Embodiment 2 of the present invention. In FIG. 10, the same reference numerals are used for the same components as in FIG. 3, and their description is omitted.
FIG. 12 shows the configuration of the image processing apparatus according to Embodiment 3 of the present invention. In FIG. 12, the same reference numerals are used for the same components as in FIG. 3, and their description is omitted.
Although the present invention has been described based on the above embodiments, the present invention is of course not limited to these embodiments. The following cases are also included in the present invention.
302 tilt information calculation unit
303 angle histogram generation unit
304 peak detection unit
305 tilt estimation unit
306 image correction unit
307 imaging unit
308 storage unit
310 tilt information-position recording unit
311 tilt information dispersion degree calculation unit
312 false information determination unit
1001 image storage unit
1002 panoramic image creation unit
1201 sensor unit
1202 sensor acquisition unit
Claims (18)
- An image processing apparatus comprising:
an image acquisition unit;
a tilt information calculation unit that calculates, for each pixel of the image acquired by the image acquisition unit, a plurality of pieces of tilt information for estimating the tilt of the image;
an angle histogram generation unit that generates, using the plurality of pieces of tilt information calculated by the tilt information calculation unit, an angle histogram that is a frequency distribution for each tilt angle;
a tilt information-position recording unit that records the correspondence between the plurality of pieces of tilt information and the positions on the image at which the information was obtained;
a tilt information dispersion degree calculation unit that calculates the degree of dispersion of the tilt information from the tilt information and the positions;
a false information determination unit that determines, from the degree of dispersion of the tilt information, which pieces of the tilt information can be used for tilt estimation and which cannot;
a peak detection unit that extracts the usable maximum value or local maximum value of the angle histogram based on the determination result of the false information determination unit; and
a tilt estimation unit that estimates the tilt angle of the image from the tilt angle indicated by the maximum value or the local maximum value detected by the peak detection unit.
- The image processing apparatus according to claim 1, wherein the plurality of pieces of tilt information for estimating the tilt of the image are line segments in the image calculated together with their tilt angles, and
the tilt information calculation unit performs the calculation processing on a predetermined range of the image at predetermined pixel intervals.
- The image processing apparatus according to claim 1, wherein the plurality of pieces of tilt information for estimating the tilt of the image are calculated by substituting luminance information in the image into a predetermined arithmetic expression and obtaining the luminance gradient of the luminance information, and
the tilt information calculation unit performs the calculation processing on a predetermined range of the image at predetermined pixel intervals.
- The image processing apparatus according to claim 3, wherein the tilt information calculation unit performs the calculation processing by examining whether there is a pixel having the same tilt angle in a direction at ±90° to the tilt angle obtained from the luminance gradient in a predetermined range of the image.
- The image processing apparatus according to claim 2, wherein the angle histogram generation unit weights each piece of tilt angle information, the weight being larger as the line segment detected from the image is longer.
- The image processing apparatus according to claim 3, wherein the angle histogram generation unit weights each piece of tilt angle information, the weight being larger as the luminance gradient detected from the image is closer to vertical.
- The image processing apparatus according to claim 1, wherein the degree of dispersion of the tilt information is the variance or standard deviation of the positions for each tilt angle that is identical or falls within a predetermined range.
- The image processing apparatus according to claim 1, wherein the false information determination unit compares the calculation result of the tilt information dispersion degree calculation unit with a preset value to judge whether a tilt angle that is identical or falls within a predetermined range is false information, and
the tilt estimation unit selects, from among the tilt angles satisfying a predetermined criterion in the angle histogram generation unit, a tilt angle judged by the false information determination unit not to be false information.
- The image processing apparatus according to claim 1, wherein the tilt estimation unit selects, from among the tilt angles having frequencies satisfying a predetermined criterion in the angle histogram generation unit, the tilt angle having the highest frequency among those judged by the false information determination unit not to be false information.
- The image processing apparatus according to claim 1, further comprising an image correction unit that corrects the image in accordance with the tilt information output by the tilt estimation unit.
- The image processing apparatus according to claim 1, further comprising:
an image storage unit that stores acquired images; and
a panoramic image creation unit that selects and reads out at least two of the images stored in the image storage unit and combines these images to create a panoramic image,
wherein the tilt information calculation unit performs a predetermined operation on each pixel of the panoramic image to extract a plurality of pieces of information for estimating the tilt of the panoramic image.
- The image processing apparatus according to claim 11, wherein the tilt estimation unit estimates the tilt angle of the panoramic image, and
calculates the tilt angle of each of the two or more images constituting the panoramic image using the relative rotation angles between the two or more images constituting the panoramic image.
- The image processing apparatus according to claim 1, further comprising a sensor acquisition unit that acquires data from a sensor unit that measures the motion of the imaging unit,
wherein the sensor acquisition unit calculates auxiliary information for estimating the tilt angle of the image from the tilt angle.
- The image processing apparatus according to claim 13, wherein the false information determination unit uses the auxiliary information obtained from the sensor acquisition unit to estimate in which direction of the image the vertical direction lies, and determines which of the tilt angles of the line segments or pixels extracted from the image is a vertical line.
- An image processing method comprising:
an image acquisition step;
a tilt information calculation step of calculating, for each pixel of the image acquired in the image acquisition step, a plurality of pieces of tilt information for estimating the tilt of the image;
an angle histogram generation step of generating, using the plurality of pieces of tilt information calculated in the tilt information calculation step, an angle histogram that is a frequency distribution for each tilt angle;
a tilt information-position recording step of recording the correspondence between the plurality of pieces of tilt information and the positions on the image at which the information was obtained;
a tilt information dispersion degree calculation step of calculating the degree of dispersion of the tilt information from the tilt information and the positions;
a false information determination step of determining, from the degree of dispersion of the tilt information, which pieces of the tilt information can be used for tilt estimation and which cannot;
a peak detection step of extracting the usable maximum value or local maximum value of the angle histogram based on the determination result of the false information determination step; and
a tilt estimation step of estimating the tilt angle of the image from the tilt angle indicated by the maximum value or the local maximum value detected in the peak detection step.
- An integrated circuit including the image processing apparatus according to claim 1.
- A program for causing a computer to execute the image processing method according to claim 15.
- A storage medium storing the program according to claim 17.
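The overall flow of the method claim above (tilt cues → angle histogram → peak detection → tilt estimate → correction) can be sketched in simplified form. This Python sketch omits the false-information filtering and weighting steps, and the binning scheme and function names are assumptions for illustration.

```python
from collections import Counter

def estimate_tilt(tilt_cues, bin_width=1):
    """Estimate the image tilt from per-pixel tilt cues (angles in
    degrees): build an angle histogram and take the angle of the
    highest peak as the estimated tilt (equal weights, no false-
    information filtering)."""
    histogram = Counter(round(a / bin_width) * bin_width for a in tilt_cues)
    peak_angle, _ = histogram.most_common(1)[0]
    return peak_angle

def correction_angle(estimated_tilt):
    """Rotation to apply so that the corrected image becomes upright."""
    return -estimated_tilt
```

Here a few consistent cues near 2° outvote isolated outliers, and the image would then be rotated by −2° in the correction step.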
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11792121.3A EP2582127B1 (en) | 2010-06-11 | 2011-06-02 | Image processing apparatus and image processing method |
CN201180003255.8A CN102474573B (zh) | 2010-06-11 | 2011-06-02 | 图像处理装置以及图像处理方法 |
JP2012500753A JP5756455B2 (ja) | 2010-06-11 | 2011-06-02 | 画像処理装置および画像処理方法 |
US13/381,399 US9088715B2 (en) | 2010-06-11 | 2011-06-02 | Image processing apparatus and image processing method for image correction |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-133623 | 2010-06-11 | ||
JP2010133623 | 2010-06-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011155161A1 true WO2011155161A1 (ja) | 2011-12-15 |
Family
ID=45097780
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/003105 WO2011155161A1 (ja) | 2010-06-11 | 2011-06-02 | 画像処理装置および画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9088715B2 (ja) |
EP (1) | EP2582127B1 (ja) |
JP (1) | JP5756455B2 (ja) |
CN (1) | CN102474573B (ja) |
WO (1) | WO2011155161A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013094154A1 (ja) * | 2011-12-21 | 2013-06-27 | パナソニック株式会社 | 画像処理装置および画像処理方法 |
JP5843033B1 (ja) * | 2014-05-15 | 2016-01-13 | 株式会社リコー | 撮像システム、撮像装置、プログラムおよびシステム |
JP5843034B1 (ja) * | 2014-05-15 | 2016-01-13 | 株式会社リコー | 動画表示装置およびプログラム |
JP2016111585A (ja) * | 2014-12-09 | 2016-06-20 | 日本電気株式会社 | 画像処理装置、システム、画像処理方法、およびプログラム |
JP2017175616A (ja) * | 2012-09-11 | 2017-09-28 | 株式会社リコー | 撮像装置、画像処理装置および方法 |
CN117078913A (zh) * | 2023-10-16 | 2023-11-17 | 第六镜科技(成都)有限公司 | 对象倾斜矫正方法、装置、电子设备和存储介质 |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5267794B2 (ja) * | 2008-12-26 | 2013-08-21 | 株式会社リコー | 画像処理装置及び車載カメラ装置 |
JP5696419B2 (ja) * | 2010-09-30 | 2015-04-08 | カシオ計算機株式会社 | 画像処理装置及び方法、並びにプログラム |
CN102789642B (zh) | 2011-05-16 | 2017-08-25 | 索尼公司 | 消失方向确定方法和装置、摄像机自标定方法和装置 |
JP2013214947A (ja) * | 2012-03-09 | 2013-10-17 | Ricoh Co Ltd | 撮像装置、撮像システム、画像処理方法、情報処理装置、及びプログラム |
US9094540B2 (en) * | 2012-12-13 | 2015-07-28 | Microsoft Technology Licensing, Llc | Displacing image on imager in multi-lens cameras |
US9025823B2 (en) * | 2013-03-12 | 2015-05-05 | Qualcomm Incorporated | Tracking texture rich objects using rank order filtering |
JP6070449B2 (ja) * | 2013-07-08 | 2017-02-01 | 富士ゼロックス株式会社 | 傾斜角度補正装置、画像読取装置、画像形成装置およびプログラム |
KR101482448B1 (ko) * | 2013-11-01 | 2015-01-15 | 경북대학교 산학협력단 | 허프 변환을 통한 직선 정보 검출 방법 및 장치 |
JP5967504B1 (ja) * | 2015-05-18 | 2016-08-10 | パナソニックIpマネジメント株式会社 | 全方位カメラシステム |
US10937168B2 (en) * | 2015-11-02 | 2021-03-02 | Cognex Corporation | System and method for finding and classifying lines in an image with a vision system |
DE102016120775A1 (de) | 2015-11-02 | 2017-05-04 | Cognex Corporation | System und Verfahren zum Erkennen von Linien in einem Bild mit einem Sichtsystem |
JP6604831B2 (ja) * | 2015-11-30 | 2019-11-13 | キヤノン株式会社 | 画像処理装置、画像処理装置の制御方法及びプログラム |
US10148875B1 (en) * | 2016-05-17 | 2018-12-04 | Scott Zhihao Chen | Method and system for interfacing multiple channels of panoramic videos with a high-definition port of a processor |
KR102540236B1 (ko) * | 2016-12-05 | 2023-06-02 | 삼성전자주식회사 | 이미지 처리 장치 및 시스템 |
CN108038820B (zh) * | 2017-11-14 | 2021-02-02 | 影石创新科技股份有限公司 | 一种实现子弹时间拍摄效果的方法、装置及全景相机 |
US10652472B2 (en) * | 2018-02-22 | 2020-05-12 | Adobe Inc. | Enhanced automatic perspective and horizon correction |
CN108462838B (zh) * | 2018-03-16 | 2020-10-02 | 影石创新科技股份有限公司 | 一种全景视频防抖方法、装置及便携式终端 |
CN109447070B (zh) * | 2018-10-31 | 2020-08-28 | 新华三信息安全技术有限公司 | 一种信息确定方法及装置 |
US11128814B2 (en) * | 2018-11-30 | 2021-09-21 | Vecnos Inc. | Image processing apparatus, image capturing apparatus, video reproducing system, method and program |
CN109902695B (zh) * | 2019-03-01 | 2022-12-20 | 辽宁工程技术大学 | 一种面向像对直线特征匹配的线特征矫正与提纯方法 |
CN110312070B (zh) * | 2019-04-23 | 2021-08-24 | 维沃移动通信有限公司 | 一种图像处理方法及终端 |
JP7316867B2 (ja) * | 2019-07-25 | 2023-07-28 | キオクシア株式会社 | 半導体画像処理装置 |
CN111639642B (zh) * | 2020-05-07 | 2023-07-07 | 浙江大华技术股份有限公司 | 一种图像处理方法、设备及装置 |
CN112818991B (zh) * | 2021-02-18 | 2024-04-09 | 长江存储科技有限责任公司 | 图像处理方法及图像处理装置、电子设备、可读存储介质 |
US20220406003A1 (en) * | 2021-06-17 | 2022-12-22 | Fyusion, Inc. | Viewpoint path stabilization |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63106875A (ja) | 1986-10-24 | 1988-05-11 | Nissan Motor Co Ltd | 消点検出装置 |
JPH06290260A (ja) * | 1993-03-31 | 1994-10-18 | Toppan Printing Co Ltd | 台紙入力装置 |
JPH0737103A (ja) * | 1993-07-23 | 1995-02-07 | Olympus Optical Co Ltd | 傾き角度検出装置 |
JP2002207963A (ja) * | 2001-01-11 | 2002-07-26 | Ricoh Co Ltd | 画像処理装置 |
JP3676360B2 (ja) | 2003-02-25 | 2005-07-27 | 松下電器産業株式会社 | 画像撮像処理方法 |
JP2006245726A (ja) | 2005-03-01 | 2006-09-14 | Fuji Photo Film Co Ltd | デジタルカメラ |
WO2009001512A1 (ja) | 2007-06-27 | 2008-12-31 | Panasonic Corporation | 撮像装置、方法、システム集積回路、及びプログラム |
WO2009001510A1 (ja) * | 2007-06-28 | 2008-12-31 | Panasonic Corporation | 画像処理装置、画像処理方法、プログラム |
WO2009008174A1 (ja) | 2007-07-12 | 2009-01-15 | Panasonic Corporation | 画像処理装置、画像処理方法、画像処理プログラム、画像処理プログラムを記録した記録媒体、および、画像処理プロセッサ |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1184808A3 (en) * | 2000-08-29 | 2003-10-29 | Eastman Kodak Company | Method for detection of skew angle and border of document images in a production scanning environment |
JP4285287B2 (ja) * | 2004-03-17 | 2009-06-24 | セイコーエプソン株式会社 | 画像処理装置、画像処理方法およびそのプログラム、記録媒体 |
JP4755490B2 (ja) | 2005-01-13 | 2011-08-24 | オリンパスイメージング株式会社 | ブレ補正方法および撮像装置 |
CN100487522C (zh) * | 2005-01-13 | 2009-05-13 | 奥林巴斯映像株式会社 | 模糊校正方法及摄像装置 |
US8422788B2 (en) * | 2008-08-26 | 2013-04-16 | Microsoft Corporation | Automatic image straightening |
GB2473248A (en) * | 2009-09-04 | 2011-03-09 | Sony Corp | Determining image misalignment by comparing image characteristics at points along a line |
KR100976138B1 (ko) * | 2009-09-16 | 2010-08-16 | (주)올라웍스 | 건축물 이미지의 계층적 매칭 방법, 시스템 및 컴퓨터 판독 가능한 기록 매체 |
US8610708B2 (en) * | 2010-09-22 | 2013-12-17 | Raytheon Company | Method and apparatus for three-dimensional image reconstruction |
2011
- 2011-06-02 JP JP2012500753A patent/JP5756455B2/ja not_active Expired - Fee Related
- 2011-06-02 CN CN201180003255.8A patent/CN102474573B/zh not_active Expired - Fee Related
- 2011-06-02 WO PCT/JP2011/003105 patent/WO2011155161A1/ja active Application Filing
- 2011-06-02 EP EP11792121.3A patent/EP2582127B1/en not_active Not-in-force
- 2011-06-02 US US13/381,399 patent/US9088715B2/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of EP2582127A4 |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103460249B (zh) * | 2011-12-21 | 2016-08-17 | 松下知识产权经营株式会社 | 图像处理装置以及图像处理方法 |
CN103460249A (zh) * | 2011-12-21 | 2013-12-18 | 松下电器产业株式会社 | 图像处理装置以及图像处理方法 |
JPWO2013094154A1 (ja) * | 2011-12-21 | 2015-04-27 | パナソニックIpマネジメント株式会社 | 画像処理装置および画像処理方法 |
US9183633B2 (en) | 2011-12-21 | 2015-11-10 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus and image processing method |
WO2013094154A1 (ja) * | 2011-12-21 | 2013-06-27 | パナソニック株式会社 | 画像処理装置および画像処理方法 |
JP2019097178A (ja) * | 2012-09-11 | 2019-06-20 | 株式会社リコー | 撮像装置、画像処理装置および方法 |
JP2017175616A (ja) * | 2012-09-11 | 2017-09-28 | 株式会社リコー | 撮像装置、画像処理装置および方法 |
JP5843033B1 (ja) * | 2014-05-15 | 2016-01-13 | 株式会社リコー | 撮像システム、撮像装置、プログラムおよびシステム |
JP2016149752A (ja) * | 2014-05-15 | 2016-08-18 | 株式会社リコー | ファイル |
JP2016149733A (ja) * | 2014-05-15 | 2016-08-18 | 株式会社リコー | 撮像システム、撮像装置、プログラムおよびシステム |
JP2016149734A (ja) * | 2014-05-15 | 2016-08-18 | 株式会社リコー | 動画表示装置およびプログラム |
JP5843034B1 (ja) * | 2014-05-15 | 2016-01-13 | 株式会社リコー | 動画表示装置およびプログラム |
US10681268B2 (en) | 2014-05-15 | 2020-06-09 | Ricoh Company, Ltd. | Imaging system, imaging apparatus, and system |
JP2016111585A (ja) * | 2014-12-09 | 2016-06-20 | 日本電気株式会社 | 画像処理装置、システム、画像処理方法、およびプログラム |
CN117078913A (zh) * | 2023-10-16 | 2023-11-17 | 第六镜科技(成都)有限公司 | 对象倾斜矫正方法、装置、电子设备和存储介质 |
CN117078913B (zh) * | 2023-10-16 | 2024-02-02 | 第六镜科技(成都)有限公司 | 对象倾斜矫正方法、装置、电子设备和存储介质 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2011155161A1 (ja) | 2013-08-01 |
CN102474573A (zh) | 2012-05-23 |
EP2582127A1 (en) | 2013-04-17 |
JP5756455B2 (ja) | 2015-07-29 |
US9088715B2 (en) | 2015-07-21 |
CN102474573B (zh) | 2016-03-16 |
US20120105578A1 (en) | 2012-05-03 |
EP2582127A4 (en) | 2014-07-23 |
EP2582127B1 (en) | 2018-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5756455B2 (ja) | 画像処理装置および画像処理方法 | |
JP5694300B2 (ja) | 画像処理装置、画像処理方法およびプログラム | |
JP4950290B2 (ja) | 撮像装置、方法、システム集積回路、及びプログラム | |
CN102714696B (zh) | 图像处理装置、图像处理方法及摄影装置 | |
US8798387B2 (en) | Image processing device, image processing method, and program for image processing | |
CN105407271B (zh) | 图像处理设备和方法、摄像设备以及图像生成设备 | |
JP5074322B2 (ja) | 画像処理装置、画像処理方法、画像処理プログラム、及び、撮像装置 | |
EP1670237A2 (en) | Matching un-synchronized image portions | |
US9224212B2 (en) | Image processing apparatus and image processing method | |
JP2007228154A (ja) | 画像処理装置および画像処理方法 | |
US8774550B2 (en) | Picture processing device, picture processing method, integrated circuit, and program | |
JP2011259342A (ja) | 画像処理装置および画像処理方法 | |
KR101845612B1 (ko) | 투구 연습을 통한 3차원 정보 획득 시스템 및 카메라 파라미터 산출 방법 | |
CN111279352A (zh) | 通过投球练习的三维信息获取***及摄像头参数算出方法 | |
JP2019149717A (ja) | 画像処理装置及び画像処理方法、撮像装置、プログラム、記憶媒体 | |
US20230377102A1 (en) | Image processing apparatus, image processing method, and non-transitory computer-readable storage medium | |
JP2010041416A (ja) | 画像処理装置、画像処理方法、画像処理プログラム、及び、撮像装置 | |
JP2016146546A (ja) | 画像処理装置、情報処理方法及びプログラム | |
JP2014007579A (ja) | 映像振れ補正装置及び映像振れ補正方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180003255.8 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13381399 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012500753 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11792121 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011792121 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |