WO2019187967A1 - Image processing device and image processing method - Google Patents

Image processing device and image processing method

Info

Publication number
WO2019187967A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
image
projection
medium
unit
Prior art date
Application number
PCT/JP2019/007979
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroshi Nakamura (中村 宏)
Original Assignee
Nidec Sankyo Corporation (日本電産サンキョー株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nidec Sankyo Corporation (日本電産サンキョー株式会社)
Publication of WO2019187967A1 publication Critical patent/WO2019187967A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras

Definitions

  • the present invention relates to an image processing apparatus and an image processing method for processing information on a rectangular medium.
  • the Hough transform is well known as a technique for detecting and recognizing straight lines, or specific geometric figures composed of straight lines, on a rectangular medium to be processed using digital images, and is widely used in various industrial fields.
  • this method generally takes the following procedure: preprocessing such as noise removal and edge enhancement is performed on an image including the target figure; the Hough transform is applied to the extracted image pattern to convert it into accumulation points; and the inverse Hough transform is applied to the accumulation point with the maximum accumulated frequency to obtain straight lines in the image space (see, for example, Patent Documents 1 and 2).
  • in the technique described in Patent Document 1, position coordinates of a workpiece are detected by the Hough transform for positioning in a semiconductor device manufacturing process and used for position correction; after the maximum accumulation point is obtained and unnecessary patterns are removed, the inverse Hough transform yields a reference straight line from which the workpiece angle correction is obtained.
  • in the technique described in Patent Document 2, when calculating shape features of a polygon, the Hough transform is applied to estimate line segments constituting part of the periphery of an object: the number of points converted to each polar coordinate is counted, high-frequency polar coordinates are selected, and for each selected polar coordinate a straight-line equation is calculated by the inverse Hough transform and the rotation angle is detected.
  • the present invention has been made in view of the above situation, and aims to provide a technique capable of reducing the calculation load of image processing that detects the rotation angle of a rectangular medium on an image.
  • the present invention is an image processing apparatus that detects the rotation angle of a rectangular medium on an image using a digital image, comprising: a projection generation unit that generates a projection of pixel values by luminance projection onto each of the horizontal axis and the vertical axis of the image data of the rectangular medium; an end point detection unit that determines both end points of the projection pattern of the projection waveform for each of the projection onto the horizontal axis and the projection onto the vertical axis; a processing target cutout unit that cuts out, as the processing target, first image data of the rectangular portion defined by the four (left, right, upper, and lower) end points; an image adjustment unit that generates third image data by superimposing the cut-out first image data and second image data obtained by rotating the first image data by 180 degrees; a medium edge point deviation calculation unit that draws, on the third image data, two parallel lines parallel to at least the horizontal axis or the vertical axis at positions passing through the processing target area, obtains the edge position of the rectangular medium on each parallel line, and obtains the edge interval between the two edge positions; and an angle calculation unit that calculates an inclination angle from the edge interval and the separation distance between the two parallel lines.
  • the image processing apparatus of the present invention can thus detect the rotation angle of the rectangular medium on the image using the digital image, so the calculation load can be reduced without using the Hough transform.
  • preferably, the image adjustment unit generates the third image data by taking the average of the pixel values of the first image data and of the second image data obtained by rotating the first image data by 180 degrees.
  • by taking this average, the image adjustment unit becomes less susceptible to noise and the like, and a difference in luminance can be produced between the background and the edge of the medium.
  • the present invention is also an image processing method for detecting the rotation angle of a rectangular medium on an image using a digital image, comprising: a projection generation step of generating a projection of pixel values by luminance projection onto each of the horizontal axis and the vertical axis of the image data of the rectangular medium; an end point detection step of determining both end points of the projection pattern of the projection waveform for each of the projection onto the horizontal axis and the projection onto the vertical axis; a processing target cutout step of cutting out, as the processing target, first image data of the rectangular portion defined by the four (left, right, upper, and lower) end points; an image adjustment step of generating third image data by superimposing the cut-out first image data and second image data obtained by rotating the first image data by 180 degrees; a medium edge point deviation calculating step of drawing, on the third image data, two parallel lines parallel to at least the horizontal axis or the vertical axis at positions passing through the processing target area, obtaining the edge position of the rectangular medium on each parallel line, and obtaining the edge interval between the two edge positions; and an angle calculating step of calculating an inclination angle from the edge interval and the separation distance between the two parallel lines.
  • according to the present invention, it is possible to detect the rotation angle of a rectangular medium on an image using a digital image, so the calculation load can be reduced without using the Hough transform.
  • FIG. 5 is a diagram illustrating an example of an image captured from the card and its luminance projections according to the embodiment: FIG. 5A is an example of the first image data cut out from the captured image data, FIG. 5B is an example of the projection onto the horizontal axis (X axis), and FIG. 5C is an example of the projection onto the vertical axis (Y axis).
  • FIG. 6 is a diagram for explaining the image adjustment processing of the image adjustment unit according to the embodiment: FIG. 6A shows the first image data cut out by the processing target cutout unit, FIG. 6B shows the second image data obtained by rotating the first image data by 180 degrees, and FIG. 6C shows the third image data obtained by superimposing the first image data and the second image data.
  • FIG. 7 is a diagram showing the relationship between the rectangular area of the extracted third image data, the first horizontal line, and the second horizontal line according to the embodiment.
  • FIG. 8 is a diagram showing the change in luminance in the vicinity of the medium edge position on the parallel lines (first horizontal line, second horizontal line) according to the embodiment: FIG. 8A corresponds to the left region of the rectangular area, and FIG. 8B corresponds to the right region of the rectangular area.
  • FIG. 9 is a diagram showing the relationship between the rectangular area of the extracted third image data, the first vertical line, and the second vertical line according to the embodiment.
  • FIG. 1 is a diagram illustrating a configuration example of a main part of an image processing apparatus 10 according to the embodiment.
  • FIG. 2 is a diagram schematically showing an appearance of a card 100 which is an example of a rectangular medium.
  • the image processing apparatus 10 is an apparatus that detects a rotation angle on an image of a rectangular medium 100 using a digital image.
  • the medium 100 having a rectangular shape is a general card (hereinafter referred to as “card”) compliant with JIS.
  • the card 100 has a rectangular shape, and is, for example, a plastic card having a width of 86 mm, a height of 54 mm, and a thickness of 0.76 mm.
  • the rectangular medium is not limited to the card 100 described above, and may be, for example, a passport book with a width of 125 mm and a height of 88 mm.
  • it is not limited to a plastic card or a passport book, but may be an ID card or a driver's license.
  • the longitudinal direction of the card 100 is the X-axis direction
  • the short direction is the Y-axis direction
  • a black magnetic stripe 110 is formed on the card 100 in the longitudinal direction of the card 100
  • a barcode 120 is formed on the card 100 in the longitudinal direction.
  • the origin of the image of the card 100 is the origin O at the upper left as shown in FIGS. 2 and 5; the direction along the longitudinal direction of the card 100 (rightward from the origin O) is taken as the X-axis direction, and the direction perpendicular to it (downward from the origin O) is taken as the Y-axis direction.
  • the image processing apparatus 10 includes a table 20 on which the card 100 is placed, an image reading unit 30 as an image data input unit, an analog/digital converter (A/D converter) 40, an image memory 50, and a data processing unit 60.
  • the image reading unit 30 has a CCD (Charge Coupled Device) image sensor, which is a solid-state imaging device using photoelectric conversion elements that detect light and generate charges, and an optical system (lens and the like) that guides incident light to the pixel area of the image sensor to form a subject image; it images a predetermined area including the entire card 100 placed on the table 20 and illuminated by the illumination light source 31.
  • a CMOS (Complementary Metal Oxide Semiconductor) image sensor may instead be used as the solid-state imaging device (image sensor).
  • the A / D converter 40 converts an image including the card 100 imaged by the image reading unit 30 into digital image data and stores the digital image data in the image memory 50.
  • the function of the A/D converter 40 may instead be included in the image reading unit 30.
  • the image memory 50 stores the digitized image data of the card 100, imaged by the image reading unit 30, including the magnetic stripe 110 and the barcode 120 on which the information to be read is recorded.
  • the original image stored in the image memory 50 is formed by arranging a plurality of pixels in a matrix; specifically, although not shown in the drawing, M pixels in the X-axis direction and N pixels in the Y-axis direction are arranged. Each pixel has a pixel value (luminance value); in the present embodiment, each pixel value takes a value between 0 and 255 when expressed in 8 bits, for example, with smaller values closer to black and larger values closer to white.
  • the image memory 50 may be any memory such as RAM, SDRAM, DDRSDRAM, and RDRAM as long as it can store image data.
  • the data processing unit 60 uses the digital image to detect the rotation angle on the image of the card 100, and refers to the detected rotation angle to record the magnetic stripe 110, barcode 120, characters, etc. recorded on the card 100. It has a function of recognizing information.
  • the data processing unit 60 is configured as a part of a CPU or the like that controls the entire image processing apparatus 10.
  • the data processing unit 60 reads multi-valued image data (multi-gradation grayscale image, for example, 256 gradations) from the image memory 50.
  • FIG. 3 is a block diagram illustrating a configuration example of the data processing unit 60 in the image processing apparatus 10.
  • FIG. 4 is a flowchart showing medium rotation angle detection processing by the data processing unit 60.
  • FIG. 5 is a diagram illustrating an example of first image data and luminance projection (hereinafter simply referred to as “projection”) cut out from image data obtained by capturing the card 100.
  • FIG. 5A shows an example of the first image data IMG1 cut out from the image data obtained by capturing the card 100,
  • FIG. 5B shows an example of the projection onto the horizontal axis (X axis), and
  • FIG. 5C shows an example of the projection onto the vertical axis (Y axis).
  • in FIG. 5B, the horizontal axis indicates the X position and the vertical axis indicates the pixel value P;
  • in FIG. 5C, the horizontal axis indicates the pixel value P and the vertical axis indicates the Y position.
  • the data processing unit 60 includes a projection generation unit 610, an end point detection unit 620, a processing target cutout unit 630, an image adjustment unit 640, a medium edge position deviation calculation unit 650, and an angle calculation unit 660, and performs the medium rotation angle detection processing with these.
  • the projection generation unit 610 executes the projection generation process ST11. Specifically, it acquires the image data IMG from the image memory 50 and generates, for each of the horizontal axis (X axis) and the vertical axis (Y axis) of the image IMG shown in FIG. 5A, a first projection prjX (FIG. 5B) and a second projection prjY (FIG. 5C) of pixel values by luminance projection.
  • the first projection prjX is an average of pixel values (luminance values) for each line in the direction perpendicular to the X axis.
  • the second projection prjY is an average of pixel values (luminance values) for each line in the direction perpendicular to the Y axis.
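The projection generation step ST11 above can be sketched in a few lines of NumPy. This is a minimal illustrative version, assuming an 8-bit grayscale array indexed as img[y, x]; the function name is hypothetical, not from the source:

```python
import numpy as np

def luminance_projections(img):
    """Return (prjX, prjY): the mean pixel value per column and per row.

    prjX averages each line perpendicular to the X axis (each column);
    prjY averages each line perpendicular to the Y axis (each row), as
    described for the first and second projections.
    """
    img = np.asarray(img, dtype=np.float64)
    prj_x = img.mean(axis=0)  # one value per X position
    prj_y = img.mean(axis=1)  # one value per Y position
    return prj_x, prj_y
```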
  • the endpoint detection unit 620 executes endpoint detection processing ST12. Specifically, the end point detection unit 620 detects four end points of the area in order to extract the area including the card 100 based on the first projection prjX and the second projection prjY. More specifically, the end point detection unit 620 first obtains the points at which the output values of the left end portion and the right end portion are minimum in the first projection prjX, and sets these as the left end point XL and the right end point XR, respectively. Similarly, the end point detection unit 620 obtains points at which the output values of the upper end and the lower end are minimum in the second projection prjY, and sets these as the upper end point YU and the lower end point YL, respectively.
  • the processing target cutout unit 630 executes the processing target cutout process ST13. Specifically, it cuts out, as the first image data IMG1 to be processed, the rectangular area 150 whose four vertices are determined by the positions of both end points on the horizontal axis (left end point XL, right end point XR) and both end points on the vertical axis (upper end point YU, lower end point YL), and removes the other areas as outside the processing target.
  • that is, the processing target cutout unit 630 cuts out the first image data IMG1 of the rectangular area (ABCD) 150 having the vertices point A (XL, YU), point B (XL, YL), point C (XR, YL), and point D (XR, YU); thereby, parts unnecessary for processing are separated.
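The end point detection (ST12) and cutout (ST13) steps can be sketched as follows. The source does not define the "end portion" in which each minimum is searched, so this sketch simply takes the minimum in each half of the projection profile; the helper names are hypothetical:

```python
import numpy as np

def detect_endpoints(prj):
    """Find the two end points of a projection pattern.

    Assumption: the minimum of the left (upper) end portion is taken as
    the minimum over the first half of the profile, and likewise for the
    right (lower) end portion over the second half.
    """
    prj = np.asarray(prj)
    mid = len(prj) // 2
    left = int(np.argmin(prj[:mid]))         # XL (or YU)
    right = mid + int(np.argmin(prj[mid:]))  # XR (or YL)
    return left, right

def cut_out_region(img, prj_x, prj_y):
    """Cut out the rectangular area 150 spanned by the four end points."""
    xl, xr = detect_endpoints(prj_x)
    yu, yl = detect_endpoints(prj_y)
    return img[yu:yl + 1, xl:xr + 1]
```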
  • the image adjustment unit 640 executes an image adjustment process ST14.
  • FIG. 6 is a diagram for explaining the image adjustment processing of the image adjustment unit 640.
  • 6A shows the first image data IMG1 cut out by the processing target cutout unit 630
  • FIG. 6B shows the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees
  • FIG. (C) shows the third image data IMG3 obtained by superimposing the first image data IMG1 and the second image data IMG2.
  • the image adjustment unit 640 superimposes the first image data IMG1 cut out by the processing target cutout unit 630, shown in FIG. 6A, with the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees, to generate the third image data IMG3 as shown in FIG. 6C.
  • specifically, the image adjustment unit 640 generates the third image data IMG3 by taking the average of the respective pixel values of the original first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees.
  • by taking this average, the pixel values of the portion of the magnetic stripe 110 in the magnetic recording area and of the portion without the magnetic stripe 110 are averaged, so that the luminance of the averaged portion becomes higher than the luminance of the magnetic stripe 110, producing a luminance difference between the background and the card edge.
  • since the luminance of the background portion is unchanged, a luminance difference remains between the background and the card edge, which prevents erroneous detection of the card edge position in the medium edge position deviation calculation process that follows. That is, without the image adjustment unit 640, the card/background luminance difference at the position to be determined as the card edge can become almost zero in the card edge specification process, which may hinder accurate angle detection; by providing the image adjustment unit 640 as in the present embodiment, the luminance difference between the card and the background is secured, and erroneous detection of the card edge position is less likely to occur.
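The image adjustment step ST14 reduces to a pixel-wise average with the 180-degree rotation. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def adjust_image(img1):
    """Average img1 with its 180-degree rotation (second image data IMG2).

    A dark feature such as the magnetic stripe is averaged with the
    brighter area it lands on after rotation, raising its luminance,
    while the background luminance is unchanged, so the card/background
    contrast at the edges is preserved.
    """
    img1 = np.asarray(img1, dtype=np.float64)
    img2 = img1[::-1, ::-1]      # 180-degree rotation
    return (img1 + img2) / 2.0   # pixel-wise average = third image data IMG3
```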
  • the medium edge position deviation calculation unit 650 executes medium edge position deviation calculation processing ST15.
  • FIG. 7 is a diagram illustrating a relationship between the rectangular region 150 of the extracted third image data IMG3, the first horizontal line L1, and the second horizontal line L2.
  • as shown in FIG. 7, for the third image data IMG3 of the rectangular area 150 that is the processing target area, the medium edge position deviation calculation unit 650 draws two parallel lines (first horizontal line L1 and second horizontal line L2) at positions passing through the rectangular area 150; the first horizontal line L1 is on the lower side and the second horizontal line L2 is on the upper side.
  • the first horizontal line L1 consists of the pixels (XL to XR) of row Y1 along the X-axis direction, and the second horizontal line L2 consists of the pixels (XL to XR) of row Y2 along the X-axis direction.
  • the medium edge position deviation calculation unit 650 obtains the medium edge positions (first edge position X1, second edge position X2) on the respective parallel lines (first horizontal line L1, second horizontal line L2), and obtains the distance W between the two horizontal edge positions by the following equation (1).
  • the two parallel lines (first horizontal line L1 and second horizontal line L2) are drawn where the medium edge positions (first edge position X1 and second edge position X2) are obtained on the same side of the card 100; they are set at the same predetermined width above and below the vertical middle of the rectangular area 150, and the predetermined width can be about 1/4 of the vertical length.
  • FIG. 8 shows the change in luminance in the vicinity of the medium edge position on the parallel lines (first horizontal line L1, second horizontal line L2); FIG. 8A corresponds to the region Q1 on the left side of the rectangular area 150, and FIG. 8B corresponds to the region Q2 on the right side of the rectangular area 150.
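The edge search implied by FIG. 8 can be sketched as a luminance-threshold crossing on each scan line. The source gives neither the exact criterion nor equation (1), so the midpoint threshold, the brighter-card assumption, and W = |X1 - X2| are assumptions of this sketch:

```python
import numpy as np

def edge_position_on_line(row, from_left=True):
    """Locate the card edge on one scan line of the third image data.

    Assumption: the edge is the first pixel, scanning inward, whose
    value crosses the midpoint between the line's minimum and maximum
    luminance (card assumed brighter than the background at the edge).
    """
    row = np.asarray(row, dtype=np.float64)
    thresh = (row.min() + row.max()) / 2.0
    idx = range(len(row)) if from_left else range(len(row) - 1, -1, -1)
    for i in idx:
        if row[i] >= thresh:
            return i
    return None

def edge_interval(img3, y1, y2, from_left=True):
    """Edge interval W between edge positions X1 (row y1) and X2 (row y2)."""
    x1 = edge_position_on_line(img3[y1], from_left)
    x2 = edge_position_on_line(img3[y2], from_left)
    return abs(x1 - x2)
```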
  • the vertical distance H is obtained by the following equation 2.
  • the angle calculation unit 660 executes the angle calculation process ST16. Specifically, it calculates the inclination angle θ from the horizontal edge position distance W and the vertical distance H between the two parallel lines (first horizontal line L1 and second horizontal line L2) according to the following equation (3).
  • the calculated inclination angle θ may be adopted as the final inclination angle as it is; however, from the viewpoint of improving accuracy, the same operation may be performed on the right-side region Q2 of the rectangular area 150 and an angle calculated together with the result on the left end side, for example by taking the average value as the final inclination angle.
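Equations (2) and (3) are not reproduced in this text; geometrically, an edge shifted by W over a line separation H corresponds to a tilt of atan(W / H), so a hedged sketch of the angle calculation step ST16 might look like:

```python
import math

def inclination_angle(w, h):
    """Inclination angle (degrees) from edge interval W and separation H.

    Assumption: equation (3) is the arctangent of the edge shift over
    the line separation, consistent with the geometry described.
    """
    return math.degrees(math.atan2(w, h))

def final_angle(theta_left, theta_right):
    """Average the angles obtained from the left and right regions (Q1, Q2)."""
    return (theta_left + theta_right) / 2.0
```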
  • FIG. 9 is a diagram showing the relationship between the rectangular area 150 of the extracted third image data IMG3, the first vertical line V1, and the second vertical line V2. From the viewpoint of further improving accuracy, as shown in FIG. 9, two vertical lines (first vertical line V1 and second vertical line V2) are drawn in the image of the same rectangular area 150; in the upper region R1, the intersection of the first vertical line V1 and the upper edge of the medium is defined as a first edge position YY1, the intersection of the second vertical line V2 and the upper edge of the medium is defined as a second edge position YY2, and the distance HH between the two Y-direction edge positions is obtained.
  • the first vertical line V1 consists of the pixels (YU to YL) of column X1 along the Y-axis direction, and the second vertical line V2 consists of the pixels (YU to YL) of column X2 along the Y-axis direction.
  • the horizontal distance WW is obtained by the following Expression 5.
  • the inclination angle φ is calculated by the following equation (6).
  • the same operation may be performed on the lower region R2 side of the medium, and the angle calculated together with the result on the upper region R1 side; for example, the average value is set as the final inclination angle.
  • the inclination angle θ (or inclination angle φ) determined by the angle calculation unit 660 is output to an information recognition processing unit (not shown) in the data processing unit 60.
  • the information recognition processing unit corrects the inclination on the image of the magnetic stripe 110, which is the recording area on the card 100, according to the inclination angle θ (and/or the inclination angle φ), and performs recognition processing of the information recorded on the magnetic stripe 110 on the corrected image.
  • the image processing apparatus 10 detects the rotation angle on the image of the card 100 using the digital image.
  • the image processing apparatus 10 includes: a projection generation unit 610 that generates a projection of pixel values by luminance projection onto each of the horizontal axis and the vertical axis of the image data of the card 100; an end point detection unit 620 that determines both end points of the projection pattern of the projection waveform for each of the projection onto the horizontal axis (X axis) and the projection onto the vertical axis (Y axis); a processing target cutout unit 630 that cuts out, as the processing target, the first image data IMG1 of the rectangular portion (in this embodiment, the rectangular area 150) defined by the four left, right, upper, and lower end points; an image adjustment unit 640 that generates the third image data IMG3 by superimposing the cut-out first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees; a medium edge point deviation calculation unit 650 that draws, on the third image data IMG3, two parallel lines (a pair of the first horizontal line L1 and the second horizontal line L2, or a pair of the first vertical line V1 and the second vertical line V2) parallel to at least the horizontal axis (X axis) or the vertical axis (Y axis) at positions passing through the processing target area, obtains the edge position (edge point) of the card 100 on each parallel line, and obtains the edge interval between the two edge positions; and an angle calculation unit 660 that calculates an inclination angle from the edge interval and the separation distance between the two parallel lines.
  • since the calculated inclination angle is used as the rotation angle of the card 100, the calculation load can be reduced without using the Hough transform; in addition, erroneous detection of the card edge position can be prevented.
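Putting steps ST11 to ST16 together, a compact end-to-end sketch could read as follows. The half-profile minima for end point detection, the threshold-based edge finding, the scan-line rows at roughly 1/4 and 3/4 of the region height, and the atan(W/H) formula are all assumptions of this sketch, since the referenced equations are not reproduced in this text:

```python
import math
import numpy as np

def detect_rotation_angle(img):
    """End-to-end sketch of the rotation-angle detection flow (ST11 to ST16)."""
    img = np.asarray(img, dtype=np.float64)
    # ST11: luminance projections onto the X and Y axes
    prj_x, prj_y = img.mean(axis=0), img.mean(axis=1)
    # ST12: end points = minima in each half of each projection (assumed)
    def ends(p):
        m = len(p) // 2
        return int(np.argmin(p[:m])), m + int(np.argmin(p[m:]))
    xl, xr = ends(prj_x)
    yu, yl = ends(prj_y)
    # ST13: cut out the rectangular area 150
    img1 = img[yu:yl + 1, xl:xr + 1]
    # ST14: average with the 180-degree rotation (third image data IMG3)
    img3 = (img1 + img1[::-1, ::-1]) / 2.0
    # ST15: edge positions on two horizontal lines through the region
    h, _ = img3.shape
    y1, y2 = h // 4, 3 * h // 4
    def edge(row):
        t = (row.min() + row.max()) / 2.0
        return int(np.argmax(row >= t))  # first crossing from the left
    w = abs(edge(img3[y1]) - edge(img3[y2]))
    # ST16: inclination angle from edge interval W and line separation H
    return math.degrees(math.atan2(w, abs(y2 - y1)))
```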
  • the image adjustment unit 640 generates the third image data IMG3 by averaging the pixel values of the first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees.
  • by taking this average, the pixel values of the portion of the magnetic stripe 110 in the magnetic recording area and of the portion without the magnetic stripe 110 are averaged, so that the luminance of the averaged portion becomes higher than the luminance of the magnetic recording area, producing a luminance difference between the background and the edge of the medium; more specifically, by rotating the first image data IMG1 by 180 degrees, the portion of the magnetic stripe 110 is averaged with a portion without the stripe, and the luminance of this portion becomes higher than that of the magnetic stripe 110.
  • since the luminance of the background portion is unchanged, a luminance difference remains between the background and the card edge, which prevents erroneous detection of the card edge position in the medium edge position deviation calculation process.
  • the medium edge position deviation calculation unit 650 can obtain the edge interval in the third image data IMG3 at one place on each of two opposite sides of the card 100 using the two parallel lines, and the angle calculation unit 660 can calculate two inclination angles from the edge intervals at the two places and the separation distance between the two parallel lines, and use their average value as the rotation angle.
  • since a pair of edge positions can be obtained with the two parallel lines on each of two mutually opposite sides (long sides or short sides) of the card 100, the inclination angle can be obtained at the two opposite places, and the detection accuracy can be further improved.
  • the medium edge position deviation calculation unit 650 may draw at least one pair of two parallel lines in the third image data IMG3 parallel to each of the horizontal axis (X axis) and the vertical axis (Y axis) and obtain the edge interval and the separation distance for each pair, and the angle calculation unit 660 may calculate an inclination angle for each obtained pair of edge interval and separation distance and use their average value as the rotation angle.
  • in this way, one or more inclination angles are obtained on each of the long side and the short side of the card 100, and the detection accuracy can be further improved.
  • if the inclination angles are obtained on all four sides of the card 100 and their average value is set as the rotation angle, the detection accuracy can be improved still further.
  • the image processing method for detecting the rotation angle of the card 100 on an image using a digital image comprises: a projection generation step ST11 of generating a projection of pixel values by luminance projection onto each of the horizontal axis (X axis) and the vertical axis (Y axis) of the image data of the card 100; an end point detection step ST12 of determining both end points of the projection pattern of the projection waveform for each of the projection onto the horizontal axis and the projection onto the vertical axis; a processing target cutout step ST13 of cutting out the first image data IMG1 to be processed; an image adjustment step ST14 of generating the third image data IMG3; a medium edge point deviation calculating step ST15 of drawing two parallel lines (a pair of the first horizontal line L1 and the second horizontal line L2, or a pair of the first vertical line V1 and the second vertical line V2) parallel to at least the horizontal axis (X axis) or the vertical axis (Y axis), obtaining the edge position of the card 100 on each parallel line, and obtaining the edge interval between the edge positions; and an angle calculating step ST16 of calculating the inclination angle from the edge interval and the distance between the two parallel lines.
  • since the calculated inclination angle is used as the rotation angle of the card 100, the calculation load can be reduced without using the Hough transform, and erroneous detection of the card edge position can be prevented.
  • SYMBOLS 10 ... Image processing apparatus, 20 ... Table, 30 ... Image reading unit, 31 ... Illumination light source, 40 ... Analog/digital converter (A/D converter), 50 ... Image memory, 60 ... Data processing unit, 100 ... Card (medium), 110 ... Magnetic stripe, 120 ... Barcode, 610 ... Projection generation unit, 620 ... End point detection unit, 630 ... Processing target cutout unit, 640 ... Image adjustment unit, 650 ... Medium edge position deviation calculation unit (medium edge point deviation calculation unit), 660 ... Angle calculation unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

Provided is technology that is related to image processing technology for detecting the rotational angle of a medium on an image, and that is capable of reducing the computational load and preventing the erroneous detection of a card edge position. A data processing unit of this image processing device comprises: a projection generating unit 610 that generates a projection of pixel values of image data, said pixel values being from a luminance projection with respect to an X-axis and a Y-axis; an end point detecting unit 620 that determines both end points of a projection pattern of a projection waveform on the X-axis and the Y-axis; a processing target extracting unit 630 that extracts first image data using a rectangular section comprising four left, right, upper, and lower end points as a processing target; an image adjusting unit 640 that generates third image data by overlapping the extracted first image data and second image data, which is the first image data rotated 180 degrees; and a medium edge point deviation calculating unit 650 that finds, with regards to the third image data, card edge positions on two parallel lines, which are parallel to the X-axis, at positions that pass through the processing target area, and finds edge intervals.

Description

Image processing apparatus and image processing method
 The present invention relates to an image processing apparatus and an image processing method for processing information on a rectangular medium.
 The Hough transform is well known as a technique for detecting and recognizing straight lines, or specific geometric figures composed of straight lines, on a rectangular medium to be processed using digital images, and it is widely used in various industrial fields.
 In this method, generally, preprocessing such as noise removal and edge enhancement is first applied to an image containing the target figure, the extracted image pattern is then subjected to a Hough transform to convert it into accumulation points, and finally an inverse Hough transform is applied to the accumulation point with the maximum accumulated frequency to obtain a straight line in the image space (see, for example, Patent Documents 1 and 2).
 In the technique described in Patent Document 1, for positioning a workpiece in a semiconductor device manufacturing process, position coordinates are detected using a Hough transform and used for position correction. After the maximum accumulation point is obtained and unnecessary patterns are removed, an inverse Hough transform is performed to obtain a reference straight line, from which the angle correction of the workpiece is obtained.
 In the technique described in Patent Document 2, when calculating the shape feature amount of a polygonal shape, a Hough transform is applied to estimate line segments constituting part of the periphery of an object. The number of points converted to each polar coordinate is counted, and polar coordinates with high frequency are selected. For each selected polar coordinate, a straight-line equation is calculated by inverse Hough transform and the rotation angle is detected.
Patent Document 1: JP H11-97512 A; Patent Document 2: JP 2010-79643 A
 However, the techniques shown in Patent Documents 1 and 2 obtain the maximum accumulation point in the Hough space. Obtaining the maximum accumulation point in the Hough space means finding a local maximum in a two-dimensional space, which is generally not easy, since noise caused by unnecessary patterns must be removed, and the computational load is large. Performing high-speed processing therefore requires a high-performance processor, which leads to increased cost.
 The present invention has been made in view of the above circumstances, and an object thereof is to provide a technique capable of reducing the computational load of image processing for detecting the rotation angle of a rectangular medium on an image.
 The present invention is an image processing apparatus that detects the rotation angle of a rectangular medium on an image using a digital image, comprising: a projection generation unit that generates projections of pixel values by luminance projection onto each of the horizontal axis and the vertical axis of the image data of the rectangular medium; an end point detection unit that determines, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of the projection pattern of the projection waveform; a processing target cutout unit that cuts out first image data, taking as the processing target the rectangular portion defined by the four left, right, upper, and lower end points; an image adjustment unit that generates, for the processing target area, third image data by superimposing the cut-out first image data and second image data obtained by rotating the first image data by 180 degrees; a medium edge point deviation calculation unit that, for the third image data of the processing target area, draws two parallel lines parallel to at least the horizontal axis or the vertical axis at positions passing through the processing target area, finds the edge position of the rectangular medium on each parallel line, and finds the edge interval between the two edge positions; and an angle calculation unit that calculates the inclination angle from the edge interval and the separation distance between the two parallel lines.
 Thus, the image processing apparatus of the present invention can detect the rotation angle of a rectangular medium on an image using a digital image. The computational load can therefore be reduced without using a Hough transform.
 In the present invention, the image adjustment unit preferably generates the third image data by averaging the pixel values of the first image data and the second image data obtained by rotating the first image data by 180 degrees.
 With this configuration, by averaging the pixel values of the first image data and the second image data obtained by rotating the first image data by 180 degrees, the image adjustment unit becomes less susceptible to noise and can produce a luminance difference between the background and the medium edge.
 The present invention is also an image processing method that detects the rotation angle of a rectangular medium on an image using a digital image, comprising: a projection generation step of generating projections of pixel values by luminance projection onto each of the horizontal axis and the vertical axis of the image data of the rectangular medium; an end point detection step of determining, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of the projection pattern of the projection waveform; a processing target cutout step of cutting out first image data, taking as the processing target the rectangular portion defined by the four left, right, upper, and lower end points; an image adjustment step of generating, for the processing target area, third image data by superimposing the cut-out first image data and second image data obtained by rotating the first image data by 180 degrees; a medium edge point deviation calculation step of, for the third image data of the processing target area, drawing two parallel lines parallel to at least the horizontal axis or the vertical axis at positions passing through the processing target area, finding the edge position of the rectangular medium on each parallel line, and finding the edge interval between the two edge positions; and an angle calculation step of calculating the inclination angle from the edge interval and the separation distance between the two parallel lines.
 Thereby, the image processing method of the present invention can likewise detect the rotation angle of a rectangular medium on an image using a digital image, reducing the computational load without using a Hough transform.
 According to the present invention, it is possible to detect the rotation angle of a rectangular medium on an image using a digital image. The computational load can therefore be reduced without using a Hough transform.
FIG. 1 is a diagram showing a configuration example of the main parts of an image processing apparatus according to an embodiment.
FIG. 2 is a diagram schematically showing the appearance of a card, which is an example of a rectangular medium, according to the embodiment.
FIG. 3 is a block diagram showing a configuration example of a data processing unit in the image processing apparatus according to the embodiment.
FIG. 4 is a flowchart showing medium rotation angle detection processing by the data processing unit according to the embodiment.
FIG. 5 shows an image of a captured card and luminance projections according to the embodiment: FIG. 5(a) is an example of first image data cut out from image data of the captured card, FIG. 5(b) is an example of the projection onto the horizontal axis (X-axis), and FIG. 5(c) is an example of the projection onto the vertical axis (Y-axis).
FIG. 6 illustrates the image adjustment processing of the image adjustment unit according to the embodiment: FIG. 6(a) shows the first image data cut out by the processing target cutout unit, FIG. 6(b) shows second image data obtained by rotating the first image data by 180 degrees, and FIG. 6(c) shows third image data obtained by superimposing the first image data and the second image data.
FIG. 7 is a diagram showing the relationship between the rectangular region of the extracted third image data and the first and second horizontal lines according to the embodiment.
FIG. 8 shows changes in luminance near the medium edge position on the parallel lines (first and second horizontal lines) according to the embodiment: FIG. 8(a) corresponds to the region on the left side of the rectangular region, and FIG. 8(b) corresponds to the region on the right side.
FIG. 9 is a diagram showing the relationship between the rectangular region of the extracted third image data and the first and second vertical lines according to the embodiment.
 Hereinafter, embodiments for carrying out the invention will be described with reference to the drawings.
 FIG. 1 is a diagram showing a configuration example of the main parts of an image processing apparatus 10 according to the embodiment. FIG. 2 is a diagram schematically showing the appearance of a card 100, which is an example of a rectangular medium.
 The image processing apparatus 10 is an apparatus that detects the rotation angle of the rectangular medium 100 on an image using a digital image.
 The rectangular medium 100 is a general JIS-compliant card (hereinafter referred to as the "card"). The card 100 is rectangular and is, for example, a plastic card with a width of 86 mm, a height of 54 mm, and a thickness of 0.76 mm. The rectangular medium is not limited to the card 100 described above; it may be, for example, a passport book with a width of 125 mm and a height of 88 mm, and it is not limited to a plastic card or passport book, but may be an ID card, a driver's license, or the like.
 In this embodiment, as shown in FIG. 2, the longitudinal direction of the card 100 is the X-axis direction and the transverse direction is the Y-axis direction.
 A black magnetic stripe 110 is formed on the card 100 in its longitudinal direction, and a barcode 120 is also formed on the card 100 in the longitudinal direction.
 In this embodiment, to simplify the explanation, the origin O of the image of the card 100 is at the upper left, as shown in FIGS. 2 and 5; the longitudinal direction of the card 100 (the direction from the origin O to the right) is the X-axis direction, and the direction perpendicular to it (downward from the origin O) is the Y-axis direction.
 As shown in FIG. 1, the image processing apparatus 10 includes a table 20 on which the card 100 is placed, an image reading unit 30 serving as an image data input unit, an analog-digital converter (A/D converter) 40, an image memory 50, and a data processing unit 60.
 The image reading unit 30 includes a CCD (Charge Coupled Device) image sensor, which is a solid-state imaging device using photoelectric conversion elements that detect light and generate electric charge, and an optical system (lens, etc.) that guides incident light to the pixel region of the image sensor to form a subject image. It images a predetermined area including the entire card 100 placed on the table 20 and illuminated by the illumination light source 31. A CMOS (Complementary Metal Oxide Semiconductor) image sensor may instead be used as the solid-state imaging device.
 The A/D converter 40 converts the image including the card 100 captured by the image reading unit 30 into digital image data and stores it in the image memory 50. The function of the A/D converter 40 may also be incorporated into the image reading unit 30.
 The image memory 50 stores the digitized image data of the card 100, captured by the image reading unit 30, including the magnetic stripe 110 and the barcode 120 on which the information to be read is recorded. The original image stored in the image memory 50 is formed of a plurality of pixels arranged in a matrix; specifically, although not shown, M rows of pixels are arranged in the X-axis direction and N columns in the Y-axis direction. Each pixel has a pixel value (luminance value). In this embodiment, each pixel value expressed in 8 bits takes a value between 0 and 255; the pixel value is smaller the closer it is to black and larger the closer it is to white. The image memory 50 may be any memory capable of storing image data, such as RAM, SDRAM, DDR SDRAM, or RDRAM.
 The data processing unit 60 uses the digital image to detect the rotation angle of the card 100 on the image, and has a function of recognizing information recorded on the card 100, such as the magnetic stripe 110, the barcode 120, and characters, with reference to the detected rotation angle. The data processing unit 60 is configured as part of a CPU or the like that controls the entire image processing apparatus 10.
[Configuration and Function of Each Unit of the Data Processing Unit 60]
 Next, the basic configuration and function of each unit of the data processing unit 60 will be described.
 The data processing unit 60 reads multi-valued image data (a multi-gradation grayscale image, for example, 256 gradations) from the image memory 50.
 FIG. 3 is a block diagram showing a configuration example of the data processing unit 60 in the image processing apparatus 10. FIG. 4 is a flowchart showing the medium rotation angle detection processing performed by the data processing unit 60. FIG. 5 shows an example of the first image data cut out from image data of the captured card 100 and of the luminance projections (hereinafter simply "projections"): FIG. 5(a) is an example of the first image data IMG1 cut out from the image data of the captured card 100, FIG. 5(b) is an example of the projection onto the horizontal axis (X-axis), and FIG. 5(c) is an example of the projection onto the vertical axis (Y-axis). In FIG. 5(b), the horizontal axis indicates the X-axis position and the vertical axis indicates the pixel value P; in FIG. 5(c), the horizontal axis indicates the pixel value P and the vertical axis indicates the Y-axis position.
 The data processing unit 60 includes a projection generation unit 610, an end point detection unit 620, a processing target cutout unit 630, an image adjustment unit 640, a medium edge position deviation calculation unit 650, and an angle calculation unit 660, and performs the medium rotation angle detection processing.
 The projection generation unit 610 executes projection formation processing ST11. Specifically, the projection generation unit 610 acquires the image data IMG from the image memory 50 and generates, for each of the horizontal axis (X-axis) and the vertical axis (Y-axis) of the image IMG shown in FIG. 5(a), a first projection prjX (FIG. 5(b)) and a second projection prjY (FIG. 5(c)) of the pixel values by luminance projection.
 Here, the first projection prjX is the average of the pixel values (luminance values) of each pixel line perpendicular to the X-axis, and the second projection prjY is the average of the pixel values (luminance values) of each pixel line perpendicular to the Y-axis.
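As a minimal sketch of this step (assuming the image is held as a NumPy array of luminance values, which the description does not prescribe), the two projections can be formed as follows:

```python
import numpy as np

def make_projections(img):
    """ST11: luminance projections of a grayscale image (rows x columns).

    prj_x[i] is the mean pixel value of pixel column i (projection onto
    the X-axis); prj_y[j] is the mean pixel value of pixel row j
    (projection onto the Y-axis).
    """
    img = np.asarray(img, dtype=float)
    prj_x = img.mean(axis=0)  # average along each line perpendicular to the X-axis
    prj_y = img.mean(axis=1)  # average along each line perpendicular to the Y-axis
    return prj_x, prj_y
```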
 The end point detection unit 620 executes end point detection processing ST12. Specifically, based on the first projection prjX and the second projection prjY, the end point detection unit 620 detects the four end points of the region containing the card 100 in order to extract that region.
 More specifically, the end point detection unit 620 first obtains, in the first projection prjX, the points at which the output value is minimal at the left end portion and the right end portion, and takes these as the left end point XL and the right end point XR, respectively. Similarly, in the second projection prjY, it obtains the points at which the output value is minimal at the upper end portion and the lower end portion, and takes these as the upper end point YU and the lower end point YL, respectively.
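A sketch of this end point search, under the assumption that each projection attains its minima near the card edges and that splitting the waveform at its midpoint separates the two end regions (the description does not fix how the left and right portions are delimited):

```python
import numpy as np

def find_end_points(prj):
    """ST12: locate the two end points of a projection waveform.

    The minimal point is sought from the left within the left half and
    from the right within the right half, so that a flat minimum region
    yields its outermost indices.
    """
    prj = np.asarray(prj, dtype=float)
    mid = len(prj) // 2
    left = int(np.argmin(prj[:mid]))                        # first minimum from the left
    right = len(prj) - 1 - int(np.argmin(prj[mid:][::-1]))  # first minimum from the right
    return left, right
```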
 The processing target cutout unit 630 executes processing target cutout processing ST13. Specifically, the processing target cutout unit 630 cuts out, as the first image data IMG1 to be processed, the rectangular region 150 whose four vertices are given by the positions of the two end points on the horizontal axis X (left end point XL, right end point XR) and the two end points on the vertical axis Y (upper end point YU, lower end point YL), and removes the remaining region as being outside the processing target area.
 That is, the processing target cutout unit 630 cuts out the first image data IMG1 of the rectangular region (ABCD) 150 formed by point A (XL, YU), point B (XL, YL), point C (XR, YL), and point D (XR, YU). The portions unnecessary for processing are thereby separated off.
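Given the four end points, the cutout itself is a plain array slice; treating the end points as inclusive is an assumption of this sketch:

```python
import numpy as np

def cut_out_processing_target(img, xl, xr, yu, yl):
    """ST13: cut out, as the first image data IMG1, the rectangular
    region with vertices A(XL, YU), B(XL, YL), C(XR, YL), D(XR, YU);
    everything outside it is discarded."""
    return np.asarray(img)[yu:yl + 1, xl:xr + 1]
```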
 The image adjustment unit 640 executes image adjustment processing ST14.
 FIG. 6 illustrates the image adjustment processing of the image adjustment unit 640. FIG. 6(a) shows the first image data IMG1 cut out by the processing target cutout unit 630, FIG. 6(b) shows the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees, and FIG. 6(c) shows the third image data IMG3 obtained by superimposing the first image data IMG1 and the second image data IMG2.
 The image adjustment unit 640 superimposes the first image data IMG1 shown in FIG. 6(a), cut out by the processing target cutout unit 630, and the second image data IMG2 shown in FIG. 6(b), obtained by rotating the first image data IMG1 by 180 degrees, to generate the third image data IMG3 shown in FIG. 6(c).
 In this embodiment, the image adjustment unit 640 generates the third image data IMG3 by averaging the pixel values of the original first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees.
 With this configuration, when averaging the pixel values of the first image data IMG1 and the 180-degree-rotated second image data IMG2, the image adjustment unit 640 averages the pixel values of the magnetic stripe 110 portion, which is the magnetic recording region, with those of a portion without the magnetic stripe 110. The luminance of the averaged portion thereby becomes higher than the luminance of the magnetic stripe 110, producing a luminance difference between the background and the card edge.
 More specifically, by rotating the first image data IMG1 by 180 degrees, the magnetic stripe 110 portion is averaged with a portion without the magnetic stripe, so the luminance of this portion becomes higher than the luminance of the magnetic stripe 110, while the luminance of the background portion is unchanged. A luminance difference thus arises between the background and the card edge, which prevents erroneous detection of the card edge position in the subsequent medium edge position deviation calculation processing.
 That is, if the image adjustment unit 640 were not provided, the luminance difference between the card and the background at the position that should be determined as the card edge would be nearly zero in the processing that identifies the card edge position, so erroneous detection of the card edge could occur and accurate angle detection could be hindered. By providing the image adjustment unit 640 as in this embodiment, the luminance difference between the card and the background is secured, and erroneous detection of the card edge position becomes less likely.
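The superposition described above, taking the pixel-wise average of IMG1 and its 180-degree rotation, can be sketched as follows:

```python
import numpy as np

def adjust_image(img1):
    """ST14: generate the third image data IMG3 as the pixel-wise
    average of the first image data IMG1 and the second image data
    IMG2, which is IMG1 rotated by 180 degrees."""
    img1 = np.asarray(img1, dtype=float)
    img2 = np.rot90(img1, 2)      # 180-degree rotation of IMG1
    return (img1 + img2) / 2.0    # pixel-wise average
```

A dark stripe averaged against a plain region brightens, while a uniform background averages to itself, which is the luminance-difference effect described above.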
 The medium edge position deviation calculation unit 650 executes medium edge position deviation calculation processing ST15.
 FIG. 7 shows the relationship between the rectangular region 150 of the extracted third image data IMG3 and the first horizontal line L1 and second horizontal line L2.
 Specifically, for the third image data IMG3 of the rectangular region 150, which is the processing target area, the medium edge position deviation calculation unit 650 draws two parallel lines (first horizontal line L1, second horizontal line L2) at positions passing through the rectangular region 150, as shown in FIG. 7. Here, as illustrated, the first horizontal line L1 is on the lower side and the second horizontal line L2 is on the upper side. In this embodiment, the first horizontal line L1 consists of the pixels (XL to XR) in the X-axis direction of row Y1; similarly, the second horizontal line L2 consists of the pixels (XL to XR) in the X-axis direction of row Y2.
 The medium edge position deviation calculation unit 650 obtains the medium edge position (first edge position X1, second edge position X2) on each parallel line (first horizontal line L1, second horizontal line L2), and obtains the distance W between the two horizontal edge positions by the following Equation 1.
W = |X1 - X2|    (Equation 1)
 The two parallel lines (first horizontal line L1, second horizontal line L2) are drawn at positions where the medium edge positions (first edge position X1, second edge position X2) are found on the same side of the card 100, for example at the same predetermined distance above and below the vertical center of the rectangular region 150. The predetermined distance can be about one quarter of the vertical length.
 FIG. 8 shows the change in luminance near the medium edge position on the parallel lines (first horizontal line L1, second horizontal line L2); FIG. 8(a) corresponds to the region Q1 on the left side of the rectangular region 150, and FIG. 8(b) corresponds to the region Q2 on the right side.
 Here, an example of determining the medium edge positions (first edge position X1, second edge position X2) in the region Q1 on the left side of the rectangular region 150 is described.
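One way to locate the card edge on a scan line, consistent with the luminance change shown in FIG. 8, is to take the first pixel whose luminance departs from the background level by more than a threshold; both the background level and the threshold delta here are assumptions of this sketch, not values given in the description:

```python
import numpy as np

def edge_position_on_line(img3, y, background, delta=30.0):
    """ST15 (per scan line): return the first X position on row y of
    IMG3 whose luminance differs from the background level by more
    than delta, scanning from the left; None if no edge is found."""
    row = np.asarray(img3, dtype=float)[y]
    hits = np.where(np.abs(row - background) > delta)[0]
    return int(hits[0]) if hits.size else None
```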
 Letting the positions of the parallel lines (first horizontal line L1, second horizontal line L2) be the first horizontal line position Y1 and the second horizontal line position Y2, the vertical distance H is obtained by the following Equation 2.
H = |Y1 - Y2|    (Equation 2)
 The angle calculation unit 660 executes angle calculation processing ST16. Specifically, the angle calculation unit 660 calculates the inclination angle θ from the distance W between the horizontal edge positions and the vertical distance H between the two parallel lines (first horizontal line L1, second horizontal line L2) by the following Equation 3.
θ = arctan(W / H)    (Equation 3)
 The calculated inclination angle θ may be adopted as the final inclination angle, but from the viewpoint of improving accuracy, the same operation may be performed on the region Q2 on the right side of the rectangular region 150 and the angle calculated by combining the result with that for the left side; for example, the average value can be used as the final inclination angle.
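The left-side angle computation combines into a few lines, assuming the inclination angle follows from the right triangle with legs W and H, i.e. θ = arctan(W / H), which is the natural reading of the description of ST16:

```python
import math

def inclination_angle(x1, x2, y1, y2):
    """ST16: inclination angle (in degrees) from the edge positions
    X1, X2 found on the two horizontal lines at rows Y1, Y2."""
    w = abs(x1 - x2)                       # distance between the horizontal edge positions
    h = abs(y1 - y2)                       # vertical distance between the two lines
    return math.degrees(math.atan2(w, h))  # theta = arctan(W / H)
```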
 FIG. 9 shows the relationship between the rectangular region 150 of the extracted third image data IMG3 and the first vertical line V1 and second vertical line V2.
 Furthermore, from the viewpoint of improving accuracy still further, as shown in FIG. 9, two vertical lines (first vertical line V1, second vertical line V2) are drawn in the image of the same rectangular region 150. With the intersection of the first vertical line V1 and the upper medium edge in the upper region R1 taken as the first edge position YY1, and the intersection of the second vertical line V2 and the upper medium edge taken as the second edge position YY2, the distance HH between the two vertical edge positions is obtained by the following Equation 4. In this embodiment, the first vertical line V1 consists of the pixels in the Y-axis direction of the column at the first vertical line position XX1; similarly, the second vertical line V2 consists of the pixels in the Y-axis direction of the column at the second vertical line position XX2.
    HH = |YY1 - YY2|   (Equation 4)
 With the positions of the first vertical line V1 and the second vertical line V2 denoted as the first vertical line position XX1 and the second vertical line position XX2, respectively, the horizontal distance WW is obtained by the following Equation 5.
    WW = |XX1 - XX2|   (Equation 5)
 The inclination angle θθ is then calculated by the following Equation 6.
    θθ = arctan(HH / WW)   (Equation 6)
 The same operation may also be performed on the region R2 side at the lower edge of the medium, and the angle calculated by combining that result with the result on the upper region R1 side; for example, the average value is taken as the final inclination angle.
 Furthermore, the inclination angles at a total of four locations (left, right, top, and bottom) may be calculated and their overall average taken as the inclination angle. Alternatively, any three of the four locations may be used.
 As described above, the inclination angle θ (or inclination angle θθ) determined by the angle calculation unit 660 is output to an information recognition processing unit (not shown) in the data processing unit 60. For example, the information recognition processing unit corrects the inclination, on the image, of the magnetic stripe 110, which is the recording area on the card 100, according to the inclination angle θ (and/or the inclination angle θθ), and performs recognition processing on the information recorded in the magnetic stripe 110 in the corrected image.
 The features of the present embodiment are summarized as follows.
 (1) The image processing apparatus 10 detects the rotation angle of the card 100 on the image using a digital image. The image processing apparatus 10 includes: a projection forming unit 610 that generates projections of pixel values by luminance projection onto each of the horizontal axis and the vertical axis of the image data of the card 100; an end point detection unit 620 that determines, for each of the projection onto the horizontal axis (X axis) and the projection onto the vertical axis (Y axis), both end points of the projection pattern of the projection waveform; a processing target cutout unit 630 that cuts out first image data IMG1 with the rectangular portion defined by the four left, right, top, and bottom end points (in this embodiment, the rectangular region 150) as the processing target; an image adjustment unit 640 that generates third image data IMG3 by superimposing the first image data IMG1 cut out by the processing target cutout unit 630 and second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees; a medium edge point deviation calculation unit 650 that, for the third image data IMG3 of the area to be processed, draws two parallel lines (the pair of the first horizontal line L1 and the second horizontal line L2, or the pair of the first vertical line V1 and the second vertical line V2) at positions passing through that area, parallel to at least the horizontal axis (X axis) or the vertical axis (Y axis), obtains the edge position (edge point) of the card 100 on each parallel line, and obtains the edge interval between the two edge positions; and an angle calculation unit 660 that calculates the inclination angle from the edge interval and the separation distance between the two parallel lines.
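The units enumerated in (1) can be sketched end to end. The following Python/NumPy sketch is ours, not the patent's: an illustrative half-of-peak-deviation threshold stands in for the patent's end point detection, the sample image values are invented, and the sketch stops after generating IMG3 (the medium edge point deviation calculation unit 650 and the angle calculation unit 660 would then operate on that array):

```python
import numpy as np

def prepare_third_image(img):
    """Projection (610), end points (620), cutout (630), adjustment (640)."""
    # Projection generation: project pixel values onto each axis.
    proj_x = img.sum(axis=0, dtype=np.int64)   # onto the horizontal (X) axis
    proj_y = img.sum(axis=1, dtype=np.int64)   # onto the vertical (Y) axis

    # End point detection: both end points of each projection pattern.
    # A simple half-of-peak-deviation threshold is used for illustration.
    def end_points(p):
        dev = np.abs(p - p[0])                 # deviation from the background
        idx = np.flatnonzero(dev > 0.5 * dev.max())
        return idx[0], idx[-1]

    xl, xr = end_points(proj_x)
    yt, yb = end_points(proj_y)

    # Processing target cutout: the rectangle spanned by the four end points.
    img1 = img[yt:yb + 1, xl:xr + 1].astype(np.float64)

    # Image adjustment: superimpose IMG1 with its 180-degree rotation.
    img3 = (img1 + np.rot90(img1, 2)) / 2.0
    return img3, (xl, xr, yt, yb)

# Bright background (200) with a darker card occupying rows 60-140,
# columns 40-160; the values are illustrative only.
img = np.full((200, 200), 200, dtype=np.uint8)
img[60:141, 40:161] = 50
img3, (xl, xr, yt, yb) = prepare_third_image(img)
```

By construction IMG3 is invariant under a further 180-degree rotation, which is what lets the subsequent edge search treat opposing sides symmetrically.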
 Since the calculated inclination angle is thereby used as the rotation angle of the card 100, the computational load can be reduced without using the Hough transform. Moreover, erroneous detection of the card edge position can be prevented.
 (2) The image adjustment unit 640 generates the third image data IMG3 by averaging the pixel values of the first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees.
 With this configuration, when averaging the pixel values of the first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees, the image adjustment unit 640 averages the pixel values of the portion containing the magnetic stripe 110 (the magnetic recording area) with those of a portion without the magnetic stripe 110, making the luminance of the averaged portion higher than the luminance of the magnetic recording area and thereby producing a luminance difference between the background and the medium edge.
 More specifically, by rotating the first image data IMG1 by 180 degrees, the portion of the magnetic stripe 110 is averaged with a portion without the magnetic stripe, so the luminance of this portion becomes greater than the luminance of the magnetic stripe 110, while the luminance of the background portion remains unchanged. A luminance difference therefore arises between the background and the card edge, which prevents erroneous detection of the card edge position in the subsequent medium edge position deviation calculation process.
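The brightening effect described in (2) can be seen directly on a toy cut-out; the array size and pixel values below are illustrative, not taken from the patent:

```python
import numpy as np

# Toy cut-out: bright card (200) with a dark magnetic stripe (30)
# near the top edge.
img1 = np.full((8, 10), 200.0)
img1[1, :] = 30.0                 # the magnetic stripe row

img2 = np.rot90(img1, 2)          # 180-degree rotation of IMG1
img3 = (img1 + img2) / 2.0        # pixel-wise average -> IMG3

# The stripe row lands on a stripe-free row after rotation, so its
# brightness rises from 30 to (30 + 200) / 2 = 115, while the card
# body (and any uniform background) keeps its original value of 200.
```

The averaged stripe (115) is now much closer to the card body (200) than the original stripe (30) was, so an edge detector thresholding against the background no longer mistakes the stripe for the card edge.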
 (3) The medium edge position deviation calculation unit 650 may obtain the edge interval in the third image data IMG3 at one location on each of the two opposing sides of the card 100 crossed by the two parallel lines, and the angle calculation unit 660 may calculate two inclination angles from the edge intervals at the two locations and the separation distance between the two parallel lines, and take their average value as the rotation angle.
 As a result, a pair of edge positions is obtained with the two parallel lines on each of the two corresponding opposing sides of the card 100 (long sides or short sides), so the inclination angle can be obtained at two opposing locations. By taking the average of these inclination angles as the rotation angle of the card 100, the detection accuracy can be further improved.
 (4) The medium edge position deviation calculation unit 650 may draw at least one pair of two parallel lines in the third image data IMG3 parallel to each of the horizontal axis (X axis) and the vertical axis (Y axis), and obtain the edge interval and the separation distance for each pair of parallel lines; the angle calculation unit 660 may then calculate an inclination angle for each obtained pair of edge interval and separation distance, and take the average of these values as the rotation angle.
 Since one or more pairs of parallel lines are thus drawn parallel to each of the horizontal axis and the vertical axis, one or more inclination angles are obtained for each of the long side and the short side of the card 100. By taking the average of these inclination angles as the rotation angle of the card 100, the detection accuracy can be further improved. Furthermore, if inclination angles are obtained on all four sides of the card 100 and their average value is taken as the rotation angle, the detection accuracy can be improved still more.
 (5) An image processing method for detecting the rotation angle of the card 100 on an image using a digital image includes: a projection generation step ST11 of generating projections of pixel values by luminance projection onto each of the horizontal axis (X axis) and the vertical axis (Y axis) of the image data of the card 100; an end point detection step ST12 of determining, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of the projection pattern of the projection waveform; a processing target cutout step ST13 of cutting out the first image data IMG1 with the rectangular portion defined by the four left, right, top, and bottom end points (in this embodiment, the rectangular region 150) as the processing target; an image adjustment step ST14 of generating the third image data IMG3 by superimposing the first image data IMG1 cut out by the processing target cutout unit 630 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees; a medium edge point deviation calculation step ST15 of drawing, for the rectangular region 150 of the third image data IMG3 of the area to be processed, two parallel lines (the pair of the first horizontal line L1 and the second horizontal line L2, or the pair of the first vertical line V1 and the second vertical line V2) at positions passing through that area, parallel to at least the horizontal axis (X axis) or the vertical axis (Y axis), obtaining the edge position of the card 100 on each parallel line, and obtaining the edge interval between the two edge positions; and an angle calculation step ST16 of calculating the inclination angle from the edge interval and the separation distance between the two parallel lines. Since the calculated inclination angle is thereby used as the rotation angle of the card 100, the computational load can be reduced without using the Hough transform. Moreover, erroneous detection of the card edge position can be prevented.
 The present invention has been described based on an embodiment, but this embodiment is an example; it will be understood by those skilled in the art that various modifications are possible in the combinations of the respective components and the like, and that such modifications also fall within the scope of the present invention.
10: image processing apparatus; 20: table; 30: image reading unit; 31: illumination light source; 40: analog-to-digital converter (A/D converter); 50: image memory; 60: data processing unit; 100: card (medium); 110: magnetic stripe; 120: bar code; 610: projection forming unit; 620: end point detection unit; 630: processing target cutout unit; 640: image adjustment unit; 650: medium edge position deviation calculation unit (medium edge point deviation calculation unit); 660: angle calculation unit.

Claims (4)

  1.  An image processing apparatus that detects a rotation angle of a rectangular medium on an image using a digital image, comprising:
     a projection generation unit that generates projections of pixel values by luminance projection onto each of a horizontal axis and a vertical axis of image data of the rectangular medium;
     an end point detection unit that determines, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of a projection pattern of the projection waveform;
     a processing target cutout unit that cuts out first image data with a rectangular portion defined by the four left, right, top, and bottom end points as a processing target;
     an image adjustment unit that, for the area to be processed, generates third image data by superimposing the first image data in which the medium is cut out and second image data obtained by rotating the first image data by 180 degrees;
     a medium edge point deviation calculation unit that, for the third image data of the area to be processed, draws two parallel lines at positions passing through the area to be processed, parallel to at least the horizontal axis or the vertical axis, obtains an edge position of the rectangular medium on each parallel line, and obtains an edge interval between the two edge positions; and
     an angle calculation unit that calculates an inclination angle from the edge interval and a separation distance between the two parallel lines.
  2.  The image processing apparatus according to claim 1, wherein the image adjustment unit generates the third image data by averaging pixel values of the first image data and of the second image data obtained by rotating the first image data by 180 degrees.
  3.  An image processing method for detecting a rotation angle of a rectangular medium on an image using a digital image, comprising:
     a projection generation step of generating projections of pixel values by luminance projection onto each of a horizontal axis and a vertical axis of image data of the rectangular medium;
     an end point detection step of determining, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of a projection pattern of the projection waveform;
     a processing target cutout step of cutting out first image data with a rectangular portion defined by the four left, right, top, and bottom end points as a processing target;
     an image adjustment step of, for the area to be processed, generating third image data by superimposing the first image data in which the medium is cut out and second image data obtained by rotating the first image data by 180 degrees;
     a medium edge point deviation calculation step of, for the third image data of the area to be processed, drawing two parallel lines at positions passing through the area to be processed, parallel to at least the horizontal axis or the vertical axis, obtaining an edge position of the rectangular medium on each parallel line, and obtaining an edge interval between the two edge positions; and
     an angle calculation step of calculating an inclination angle from the edge interval and a separation distance between the two parallel lines.
  4.  The image processing method according to claim 3, wherein, in the image adjustment step, the third image data is generated by averaging pixel values of the first image data and of the second image data obtained by rotating the first image data by 180 degrees.
PCT/JP2019/007979 2018-03-30 2019-03-01 Image processing device and image processing method WO2019187967A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018067419A JP2019179342A (en) 2018-03-30 2018-03-30 Image processing device and image processing method
JP2018-067419 2018-03-30

Publications (1)

Publication Number Publication Date
WO2019187967A1 true WO2019187967A1 (en) 2019-10-03

Family

ID=68061354

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/007979 WO2019187967A1 (en) 2018-03-30 2019-03-01 Image processing device and image processing method

Country Status (2)

Country Link
JP (1) JP2019179342A (en)
WO (1) WO2019187967A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115229804A (en) * 2022-09-21 2022-10-25 荣耀终端有限公司 Method and device for attaching component

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
JP2022169874A (en) 2021-04-28 2022-11-10 株式会社Pfu Image processing apparatus, image processing method, and program
CN113828948B (en) * 2021-11-23 2022-03-08 济南邦德激光股份有限公司 Plate edge searching method, calibration system and edge searching system of laser cutting machine
CN113829673B (en) * 2021-11-26 2022-03-18 武汉宏博纸品包装有限公司 Honeycomb paper core stretching control method and system based on Hough transform

Citations (4)

Publication number Priority date Publication date Assignee Title
JPH0735509A (en) * 1993-07-16 1995-02-07 Toshiba Corp Postage stamp detecting device
JP2006031166A (en) * 2004-07-13 2006-02-02 Glory Ltd Image collation device, image collation method and image collation program
JP2006135211A (en) * 2004-11-09 2006-05-25 Nikon Corp Surface inspection apparatus, surface inspection method, and exposure system
JP2010245788A (en) * 2009-04-03 2010-10-28 Sharp Corp Image output apparatus, portable terminal apparatus, captured image processing system, image output method, program, and recording medium



Also Published As

Publication number Publication date
JP2019179342A (en) 2019-10-17

Similar Documents

Publication Publication Date Title
WO2019187967A1 (en) Image processing device and image processing method
JP4911340B2 (en) Two-dimensional code detection system and two-dimensional code detection program
US8649593B2 (en) Image processing apparatus, image processing method, and program
US7949187B2 (en) Character string recognition method and device
JP2012243307A (en) Method for detecting strain in input image, device for detecting strain in input image and computer readable medium
US20050196070A1 (en) Image combine apparatus and image combining method
CN111164959B (en) Image processing apparatus, image processing method, and recording medium
WO2012172817A1 (en) Image stabilization apparatus, image stabilization method, and document
JP2015173430A (en) Projection system, semiconductor integrated circuit and image correction method
JP2016513320A (en) Method and apparatus for image enhancement and edge verification using at least one additional image
JP2011058812A (en) Method and device for parallax calculation
JP2000244729A (en) Image processor
JP2009020613A (en) Image processing program, image processing method, and image processor
WO2018147059A1 (en) Image processing device, image processing method, and program
JP2008147976A (en) Image inclination correction device and image inclination correcting method
WO2012029658A1 (en) Imaging device, image-processing device, image-processing method, and image-processing program
US7079265B2 (en) Distortion correction device for correcting imaged object to produce plane image without distortion
JP2004129271A (en) Method for determining skew angle and position for document in overscanned image
CN108335266B (en) Method for correcting document image distortion
JP2003304561A (en) Stereo image processing apparatus
WO2018061997A1 (en) Medium recognition device and medium recognition method
WO2019107141A1 (en) Image processing device and image processing method
JP6006675B2 (en) Marker detection apparatus, marker detection method, and program
JP6068080B2 (en) Image combining device, image combining method, and program
JP4696239B2 (en) Method and apparatus for correcting inclination of character string

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19777502; Country of ref document: EP; Kind code of ref document: A1)

122 Ep: pct application non-entry in european phase (Ref document number: 19777502; Country of ref document: EP; Kind code of ref document: A1)