JPH0218681A - Line segment detecting method - Google Patents

Line segment detecting method

Info

Publication number
JPH0218681A
JPH0218681A JP63170557A JP17055788A
Authority
JP
Japan
Prior art keywords
line segment
pixels
pixel
edge direction
same
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP63170557A
Other languages
Japanese (ja)
Inventor
Satoshi Hino
聡 日野
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Priority to JP63170557A priority Critical patent/JPH0218681A/en
Publication of JPH0218681A publication Critical patent/JPH0218681A/en
Pending legal-status Critical Current

Links

Landscapes

  • Image Analysis (AREA)

Abstract

PURPOSE: To reduce the number of small regions dismissed and to attain efficient line segment detection of satisfactory accuracy by comparing the edge directions of adjacent (n)×(n) masks and generating the small regions that constitute the same line segment, while also integrating nearby discontinuous pixels into the same small region. CONSTITUTION: The edge directions of the pixels within the (n)×(n) masks are compared, and pixels whose directions are nearly equal are classified into the same pixel set. Pixel sets found between neighboring masks are recorded as mutually neighboring, and the connection of line segments is judged only for segments obtained from neighboring pixel sets. Thus, undetected line segments caused by the dismissal of generated small regions do not increase, the load of merge processing caused by fragmentation of the generated small regions is decreased, the load of judging segment connectivity is also reduced, and efficient line segment detection of satisfactory accuracy can be executed.

Description

【発明の詳細な説明】 A、産業上の利用分野 本発明は、供試画像中の線分位置を効率的に精度良く検
出するための線分検出方法に関する。
DETAILED DESCRIPTION OF THE INVENTION A. Field of Industrial Application The present invention relates to a line segment detection method for efficiently and accurately detecting the position of a line segment in a test image.

B、従来の技術 従来の線分検出方法を第3図および第4図tコ基づいて
説明する。
B. Prior Art A conventional line segment detection method will be explained with reference to FIGS. 3 and 4.

第3図において、画像入力部101を通して得られたデ
ータ画像(例えば第4図(a)に示す直方体状の箱)を
エツジ強度・方向検出部102に入力することにより、
各画素のエツジ強度およびエツジ方向を検出する。すな
わち、局所領域内における座標(x、y)の画素の濃度
をf(x、y)とした時、X方向およびX方向の微分値
をそれぞエツジ強度およびエツジ方向は次式で示される
In FIG. 3, a data image obtained through the image input section 101 (for example, the rectangular parallelepiped box shown in FIG. 4(a)) is input to the edge strength/direction detection section 102, which detects the edge strength and edge direction of each pixel. That is, when the density of the pixel at coordinates (x, y) in a local region is f(x, y), the edge strength and edge direction are expressed by the following equations in terms of the partial derivatives in the x and y directions.

aa エツジ方向二arctan (a 、 f(Xr y 
)/ a 、 f(Xr y ))ここで、この1次微
分をデジタル計算器で差分として近似するための微分オ
ペレータが各種提案されており、例えば5obelオペ
レータを用いてエツジ強度およびエツジ方向が求められ
る。第4図(b)の矢印はエツジ方向を示す。上の式か
られかるように、エツジ強度とは、各画素の濃度勾配の
大きさであり、エツジ方向とは、その濃度勾配の方向で
ある。
aa Edge direction 2 arctan (a, f(Xr y
)/a, f(Xry)) Here, various differential operators have been proposed for approximating this first-order differential as a difference using a digital calculator. It will be done. The arrow in FIG. 4(b) indicates the edge direction. As can be seen from the above equation, the edge strength is the magnitude of the density gradient of each pixel, and the edge direction is the direction of the density gradient.
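As a concrete illustration of the gradient computation described here, the following is a minimal Python sketch (not from the patent) that applies the Sobel operator to one interior pixel of a grayscale image; the function name and the list-of-lists image representation are our own assumptions.

```python
import math

# 3x3 Sobel kernels approximating the partial derivatives df/dx and df/dy.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def edge_strength_direction(img, x, y):
    """Edge strength (gradient magnitude) and edge direction (degrees)
    at interior pixel (x, y) of a 2-D grayscale image (list of rows)."""
    gx = sum(SOBEL_X[j][i] * img[y - 1 + j][x - 1 + i]
             for j in range(3) for i in range(3))
    gy = sum(SOBEL_Y[j][i] * img[y - 1 + j][x - 1 + i]
             for j in range(3) for i in range(3))
    strength = math.hypot(gx, gy)                # magnitude of the gradient
    direction = math.degrees(math.atan2(gy, gx)) # orientation of the gradient
    return strength, direction
```

For a vertical step edge the horizontal response dominates and the direction comes out along the x axis, matching the "direction of the density gradient" interpretation above.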

次に同−線分小領域生成部およびモーメント訂算部10
3では、まず、ある値以上のエツジ強度を持ち、かつ同
等のエツジ方向を持つ隣接する画素同志を、同じ線分を
構成する小領域(画素集合)とみなす処理を行なう。す
なわち、第4図(c)に示すように画素行列の画素0と
画素rE”(1,。
Next, in the same-line-segment small region generation and moment calculation section 103, adjacent pixels that have an edge strength above a certain value and an equivalent edge direction are first treated as a small region (pixel set) constituting the same line segment. That is, as shown in FIG. 4(c), for pixel o and pixels r ∈ {1,

2.3,4.)において、 画素o、rのエツジ強度≧αで、かつ 画素0のエツジ方向一画素rのエツジ方向≦βのとき、
0とrk同じ小領域とみなして、これらの画素o、rに
より同一線分小領域を生成する。
2, 3, 4}) of the pixel matrix, when the edge strength of pixels o and r is ≥ α and the difference between the edge direction of pixel o and that of pixel r is ≤ β, o and r are regarded as belonging to the same small region, and a same-line-segment small region is generated from these pixels.
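The prior-art grouping step described above can be sketched as a flood fill over 4-connected neighbors; this Python fragment is our illustrative reconstruction, and the name `group_same_segment` and the exact neighborhood are assumptions, not the patent's specification.

```python
def group_same_segment(strength, direction, alpha, beta):
    """Label adjacent pixels whose edge strength is >= alpha and whose
    edge directions differ by at most beta as one small region."""
    h, w = len(strength), len(strength[0])
    label = [[0] * w for _ in range(h)]  # 0 = unassigned / weak edge
    next_label = 1
    for sy in range(h):
        for sx in range(w):
            if label[sy][sx] or strength[sy][sx] < alpha:
                continue
            # flood fill from the seed pixel (sx, sy)
            stack, label[sy][sx] = [(sx, sy)], next_label
            while stack:
                x, y = stack.pop()
                for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                    if (0 <= nx < w and 0 <= ny < h and not label[ny][nx]
                            and strength[ny][nx] >= alpha
                            and abs(direction[ny][nx] - direction[y][x]) <= beta):
                        label[ny][nx] = next_label
                        stack.append((nx, ny))
            next_label += 1
    return label
```

Note how a one-pixel gap splits a line into two labels here; this is exactly the fragmentation problem the invention addresses later.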

次いで、生成された各小領域の0次モーメント、1次モーメント、2次モーメントを求める。Next, the zeroth-order, first-order, and second-order moments of each generated small region are computed.

Zeroth-order moment: m00 = Σ_{(x,y)∈R} f(x, y)
First-order moments: m10 = Σ_{(x,y)∈R} x·f(x, y),  m01 = Σ_{(x,y)∈R} y·f(x, y)
Second-order moments: m20 = Σ_{(x,y)∈R} x²·f(x, y),  m11 = Σ_{(x,y)∈R} x·y·f(x, y),  m02 = Σ_{(x,y)∈R} y²·f(x, y)
where R denotes one small region.
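The moment sums translate directly into code. Here is a short Python sketch (our own helper, not the patent's implementation) that accumulates the six moments over one region R, given as a list of pixel coordinates and a density function.

```python
def moments(region, f):
    """Geometric moments m00..m02 of one small region R.
    region: list of (x, y) pixel coordinates; f(x, y): pixel density."""
    m = {"m00": 0.0, "m10": 0.0, "m01": 0.0,
         "m20": 0.0, "m11": 0.0, "m02": 0.0}
    for x, y in region:
        v = f(x, y)
        m["m00"] += v          # total mass
        m["m10"] += x * v      # first-order sums
        m["m01"] += y * v
        m["m20"] += x * x * v  # second-order sums
        m["m11"] += x * y * v
        m["m02"] += y * y * v
    return m
```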

次に、直線方程式および線分の端点計算部104では、
生成された各小領域の中で、ある画素数(例えば10画
素)以」二の領域について、各小領域のモーメンI・か
ら直線の方程式と線分の端点の座標を求める。すなわち
、第4図(d)に示すように、ある画素数以上の小領域
の直線方程式yax+bを求め、さらに当該線分の端点
の座標(X□y y、) +  (Xi+ 3’2)を
求める。
Next, the straight-line equation and segment end point calculation section 104 determines, for each generated small region containing at least a certain number of pixels (for example, 10 pixels), the equation of its straight line and the coordinates of its end points from the moments of the region. That is, as shown in FIG. 4(d), the line equation y = ax + b of each sufficiently large small region is obtained, and the coordinates (x₁, y₁), (x₂, y₂) of the end points of the segment are then computed.
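One common way to recover y = ax + b from the moments is a weighted least-squares fit; the patent does not spell out its formula, so the following Python sketch is our reconstruction under that assumption, with end points taken at the region's extreme x coordinates. It handles only non-vertical lines.

```python
def line_from_moments(m, xs):
    """Fit y = a*x + b to a region from its moments (weighted least squares),
    and place end points at the extreme x coordinates of the region.
    m: dict with keys m00, m10, m01, m20, m11; xs: x coords of region pixels."""
    denom = m["m00"] * m["m20"] - m["m10"] ** 2   # zero for a vertical line
    a = (m["m00"] * m["m11"] - m["m10"] * m["m01"]) / denom
    b = (m["m01"] - a * m["m10"]) / m["m00"]
    x1, x2 = min(xs), max(xs)
    return a, b, (x1, a * x1 + b), (x2, a * x2 + b)
```

For pixels lying exactly on y = 2x + 1 with unit density, the fit returns a = 2, b = 1 and the end points at the smallest and largest x.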

その後、小領域合併部105において、第4図(d)の
ようにして特定された線分(小領域)の中で、第4図(
e)に示す如く隣接する一対の線分の主軸角度O工、0
□の差がある値以下でかつ両者間の距離t(端点の座標
から求める)がある値以下の場合、すなわち10□−1
921≦αでかっt≦βである場合に、一対の線分をそ
れぞれ特定した一対の小領域同志を合併し、合併した小
領域について、新たにその直線の方程式と該線分の端点
の座標を求めなおす。
Thereafter, in the small region merging section 105, among the line segments (small regions) identified as in FIG. 4(d), a pair of adjacent segments is merged when, as shown in FIG. 4(e), the difference between their principal-axis angles θ₁, θ₂ is below a certain value and the distance t between them (obtained from the end point coordinates) is below a certain value, i.e. when |θ₁ − θ₂| ≤ α and t ≤ β. For the merged small region, the equation of its straight line and the coordinates of its end points are then recomputed.
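The merge test can be sketched as follows; the segment representation (principal-axis angle plus two end points) and the name `can_merge` are our illustrative assumptions, with the inter-segment distance t taken as the smallest gap between end points.

```python
import math

def can_merge(seg1, seg2, alpha, beta):
    """Merge test: principal-axis angles differ by at most alpha (degrees)
    and the nearest end points are at most beta apart.
    Each segment is (angle_deg, start_point, end_point)."""
    (a1, e1s, e1e), (a2, e2s, e2e) = seg1, seg2
    if abs(a1 - a2) > alpha:
        return False
    gap = min(math.dist(p, q) for p in (e1s, e1e) for q in (e2s, e2e))
    return gap <= beta
```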

このようにして、入力画像である立方体を構成する各線
分が検出されることになるが、この段階では、多角形の
角部などのように本来複数の線分と端点同志を共有すべ
き部分ても実際に共有していない場合が多い。換言する
と、角を形成する複数の線分同志が連結されていない場
合が多い。従って、小領域合併部105で得られる線分
をさらに修正し、このような線分同志を連結させる必要
がある。
In this way, each line segment making up the box in the input image is detected. At this stage, however, portions that should share end points among multiple line segments, such as the corners of a polygon, often do not actually share them. In other words, the line segments that form a corner are frequently left unconnected. It is therefore necessary to further correct the segments obtained by the small region merging section 105 and connect such segments to each other.

そこで、線分連結性判定部106では、第4図(f)に
示すように、小領域合併部105て得られた複数の線分
の交点Pと各線分の端点との距離s、tを基準として、
線分の連結性を判断する。
Therefore, as shown in FIG. 4(f), the segment connectivity judgment section 106 judges the connectivity of segments based on the distances s and t between the intersection point P of the line segments obtained by the small region merging section 105 and the end points of each segment.

すなわち、距離し、Sがそれぞれt≦α、S≦βてあれ
ば、各線分が連結可能であると判断する。
That is, if the distances t and s satisfy t ≤ α and s ≤ β respectively, the segments are judged to be connectable.

そして、線分連結部1.07において、各線分をその交
点まで延長、もしくは短縮して線分の連結を行ない、第
4図(g)に示す処理結果の画像を得る。以−ヒの方法
は、日本ロボソ1へ学会誌4巻3号1986.6rJ度
勾配の方向による領域生成と最小二乗法あてほめを用い
た線分抽出」に示されている。
Then, in the segment connection section 107, each segment is extended or shortened to the intersection point and connected, yielding the processed image shown in FIG. 4(g). This method is described in the Journal of the Robotics Society of Japan, Vol. 4, No. 3, June 1986, "Line segment extraction using region generation by gradient direction and least-squares fitting."

C0発明が解決しようとする問題点 しかしながら、上述のような従来の線分検出方法では、
隣接する画素のみとの比較で同一線分を構成する小領域
を生成するようにしているので、第5図に示すように同
一線分を表わす画素a同志がノイズの影響により隣接し
ていないと、−本の同一線分とみなす入き画素群であっ
ても第5図の楕円で囲んだように例えば3つの線分I、
n、Hの画素狼合と認識してしまい、これに伴い生成さ
れる小領域が必要以」二に細分化される。このため、小
領域を併合する処理時間が長くなったり、上述した計算
部104において、画素数の少ない小領域が棄却される
可能性が高くなって線分として検出されない領域が多数
少ずるおそれがある。また、線分の連結性を判定する前
に線分間の位置関係を示す情報を得る方式になっていな
いため検出された全ての線分について、相互に連結性を
判定しなければならないため、線分検出の処理効率が悪
くなるという問題があった。
C. Problems to Be Solved by the Invention
However, in the conventional line segment detection method described above, small regions constituting the same line segment are generated by comparison with adjacent pixels only. Consequently, as shown in FIG. 5, if pixels a that represent the same line segment are not adjacent because of noise, a pixel group that should be treated as one line segment is instead recognized as, for example, the three pixel sets of segments I, II, and III circled by ellipses in FIG. 5, and the generated small regions are subdivided more finely than necessary. This lengthens the processing time for merging small regions, and raises the likelihood that small regions with few pixels are rejected in the calculation section 104 described above, so that many regions may go undetected as line segments. Moreover, since no information indicating the positional relationships between segments is obtained before judging their connectivity, connectivity must be judged mutually for all detected segments, which degrades the processing efficiency of line segment detection.

本発明は、上述のような問題点を解決するためになされ
たもので、検出精度が良く、かつ処理効率の高い線分検
出方法を提供することを目的とする。
The present invention has been made to solve the above-mentioned problems, and it is an object of the present invention to provide a line segment detection method with high detection accuracy and high processing efficiency.

D0問題点を解決するための手段 本発明は、 ■認識対称物体の画像データの各画素についてエツジ強
度およびエツジ方向を演算し、■エツジ強度とエツジ方
向とが略等しい画素同志を画素集合として分類し、 ■この画素集合のモーメン1−から各画素集合により形
成される線分および各線分の端点の座標を求め、 ■これから得た各線分の交点と端点間の距離かある値以
下のものについて連結補正して線分後検出する線分検出
方法に適用される。
D. Means for Solving the Problems
The present invention applies to a line segment detection method that (1) calculates the edge strength and edge direction for each pixel of the image data of an object to be recognized, (2) classifies pixels whose edge strengths and edge directions are approximately equal into pixel sets, (3) determines, from the moments of each pixel set, the line segment formed by the set and the coordinates of the end points of each segment, and (4) connects and corrects those segments for which the distance between an intersection point and the end points is below a certain value.

そして上述の問題点は、 0画像データをn×nの画素から成る複数のマスク領域
に分割し、 ■この各マスク領域において略等しいエツジ方向をもつ
画素を同一のエツジ方向を持つ画素として分類すると共
に、隣接するマスク領域のエツジ方向が略等しい画素を
画素集合として分類し、■次いて隣接するマスク領域間
で分類される画素集合が互いに近傍にあるとして該画素
集合の組を記憶し、 ■この記憶値をもつ画素集合から演算された線分同志に
ついてのみ線分の交点と端点間の距離を求めて線分の連
結を行なうことにより解決される。
The problems described above are solved by (1) dividing the image data into a plurality of mask regions each consisting of n×n pixels, (2) classifying, within each mask region, pixels with approximately equal edge directions as pixels having the same edge direction, and classifying pixels of adjacent mask regions whose edge directions are approximately equal into one pixel set, (3) then storing the pairs of pixel sets classified between adjacent mask regions as mutually neighboring sets, and (4) determining the distance between the intersection point and the end points, and connecting the segments, only for segments computed from pixel sets stored in this way.

E1作用 本発明においては、最小の画素集合(小領域)の生成段
階で、隣接するn×nのマスク同志のエツジ方向を比較
し、これにより近傍にある不連続な画素も同一の小領域
に組込みながら同一線分を構成する小領域を生成するか
ら、従来のように細い小領域に細分化されて棄却される
小領域が少なくなる。また、互いに近傍にある画素集合
を互いにある組として記憶し、線分の連結性判定は、そ
の同−組内の画素集合から生成された線分についてのみ
行なう。したがって、線分の連結性判定の負荷が軽減さ
れ、精度良い効率的な線分検出が可能になる。
E. Operation
In the present invention, at the stage of generating the minimal pixel sets (small regions), the edge directions of adjacent n×n masks are compared, and small regions constituting the same line segment are generated while nearby discontinuous pixels are also incorporated into the same small region. Fewer small regions are therefore subdivided into thin fragments and rejected than in the conventional method. In addition, pixel sets that are near one another are stored as pairs, and the connectivity of segments is judged only for segments generated from pixel sets in the same pair. The load of judging segment connectivity is thus reduced, enabling accurate and efficient line segment detection.

F、実施例 以下、本発明の実施例を第1図および第2図に基づいて
詳細に説明する。
F. EXAMPLE Hereinafter, an example of the present invention will be explained in detail based on FIGS. 1 and 2.

第1図は本発明による線分検出の機能ブロックを示す構
成図、第2図は本実施例における処理内容を示す説明図
である。
FIG. 1 is a block diagram showing functional blocks of line segment detection according to the present invention, and FIG. 2 is an explanatory diagram showing processing contents in this embodiment.

第1図において、画像入力部201では、画像情報をデ
ジタル信号に変換して、例えば第2図(a)に示すよう
な直方体状の箱の画像データを得る。この画像データは
、各画素を濃度情報f(x、y)で表わしたものであり
、その−次微分値は」二連の る。この画像データを次のエツジ強度・エツジ方向検出
部202で処理することにより、各画素のエツジ強度お
よびエツジ方向を検出する。すなわち、従来と同様に局
所的な微分オペレータにより各画素ごとにエツジ強度と
エツジ方向とを求める。
In FIG. 1, the image input section 201 converts image information into a digital signal to obtain, for example, the image data of the rectangular parallelepiped box shown in FIG. 2(a). This image data represents each pixel by its density f(x, y), whose first-order derivatives are as described above. The image data is then processed by the edge strength/direction detection section 202 to detect the edge strength and edge direction of each pixel; that is, as in the conventional method, a local differential operator gives the edge strength and edge direction of every pixel.

第2図(b)の矢印は各画素のエツジ方向を表わしてい
る。
The arrows in FIG. 2(b) indicate the edge direction of each pixel.

次に、n×nマスクによるエツジ方向設定部203ては
、エツジ方向を示す画像データの画素行列を、n×n例
えば4×4の画素で区画されたマスク領域(以下、マス
クと呼ぶ)に分割し、このマスク内で近似角度をもつ各
画素のエツジ方向を平均化し、これをマスクのエツジ方
向とする。
Next, the n×n-mask edge direction setting section 203 divides the pixel matrix of the edge direction image into mask regions (hereinafter, masks) of n×n pixels, for example 4×4 pixels, averages the edge directions of pixels within each mask whose angles are approximately equal, and takes the result as an edge direction of the mask.
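The per-mask averaging step can be sketched as a simple angle clustering; this is our illustrative reconstruction, where `tol` plays the role of the "approximately equal" threshold and `None` marks a pixel whose edge strength fell below the threshold.

```python
def mask_directions(dirs, tol):
    """Cluster the edge directions of one n x n mask: angles within `tol`
    degrees of a cluster's first member join it, and each cluster is
    replaced by its mean direction (an edge direction of the mask).
    dirs: flat list of per-pixel directions, None for weak-edge pixels."""
    clusters = []                    # each cluster is a list of member angles
    for d in dirs:
        if d is None:                # weak-edge pixel, ignored
            continue
        for c in clusters:
            if abs(d - c[0]) <= tol:
                c.append(d)
                break
        else:
            clusters.append([d])
    return [sum(c) / len(c) for c in clusters]
```

With the angles from the example above (30°, 31°, 29° and 70°, 69°, 71°), the mask's edge directions come out as 30° and 70°, matching the text.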

すなわち、第2図(c)に示すようにエツジ方向画像の
4×4マスクの領域内で、30°、31゜29°または
70°、69°、71°のような近似角度をもつ画素を
、該画素のエツジ方向の平均30°、70°のエツジ方
向に置き換え、この30°と70°をマスクのエツジ方
向とする。このように同一マスク内のエツジ方向は複数
種ある場合もあるので、マスクの各エツジ方向とそのエ
ツジ方向を構成する画素とを対応付ける必要がある。例
えば、30’ を1,70°を2と表現して、マスク内
で互いにエツジ方向が近似している各画素を、第2図(
c)のようにエツジ方向の平均値を示す記号1,2・・
・で対応付けして分類する。
That is, as shown in FIG. 2(c), within a 4×4 mask of the edge direction image, pixels with approximately equal angles such as 30°, 31°, 29° or 70°, 69°, 71° are replaced by their average edge directions of 30° and 70°, and these 30° and 70° become the edge directions of the mask. Since a single mask may thus hold several edge directions, each edge direction of the mask must be associated with the pixels that produced it. For example, expressing 30° as "1" and 70° as "2", pixels within the mask whose edge directions are close to each other are labeled and classified, as in FIG. 2(c), with the symbols 1, 2, ... denoting the averaged edge directions.

次に、同−線分小領域生成部204では、互いに隣接す
るマスク間において、例えば第2図(d)のマスクOと
マスク1〜4との間において、各マスク内で上記エツジ
方向の平均値を示す記号で対応付けられた画素同志を同
じ線分を構成する小領域(画素集合)とみなす。
Next, the same-line-segment small region generation section 204 treats, between mutually adjacent masks (for example, between mask 0 and masks 1 to 4 in FIG. 2(d)), the pixels associated within each mask by the same averaged-edge-direction symbol as a small region (pixel set) constituting the same line segment.

例えば、第2図(cl)に示す4×4の画素で区画され
たマスク0,1,2,3.4の行列から、マスク0とマ
スクrE (i、2,3.4)において、 マスク0の各画素のエツジ方向:=(dO,、d○z+
dO”+ ・)マスクrの各画素のエツジ方向=(dr
□Idr21・drj、・)とすると、 d□1−drj  ≦d の時、dOi、 drjに対応する画素を同じ小領域と
みなす。すなわち、隣接する各マスク内において上述の
エツジ方向設定部203てエツジ方向の平均化処理に寄
与した画素の全てを、ある線分を表わす画素の集合とす
る。これを第2図(e)により詳細に説明する。
For example, in the matrix of masks 0, 1, 2, 3, 4 partitioned into 4×4 pixels shown in FIG. 2(d), let the edge directions of the pixels of mask 0 be (d0₁, d0₂, ..., d0ᵢ, ...) and those of a neighboring mask r ∈ {1, 2, 3, 4} be (dr₁, dr₂, ..., drⱼ, ...). When |d0ᵢ − drⱼ| ≤ d, the pixels corresponding to d0ᵢ and drⱼ are regarded as the same small region. That is, all pixels that contributed to the edge direction averaging in the edge direction setting section 203 within adjacent masks are gathered into a set of pixels representing one line segment. This is explained in more detail with reference to FIG. 2(e).
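The cross-mask comparison |d0ᵢ − drⱼ| ≤ d can be sketched as follows; the function name `link_masks` and the list-of-averaged-directions representation are our assumptions. Each returned pair records that the pixels behind the two averaged directions share one region, which is also how the "neighboring pixel set" pairs are remembered.

```python
def link_masks(dirs0, dirs_r, d):
    """Compare the averaged edge directions of mask 0 (dirs0) with those of a
    neighboring mask r (dirs_r); pairs differing by at most d degrees share a
    region label and are recorded as a neighboring pair."""
    links = []
    for i, d0 in enumerate(dirs0):
        for j, dr in enumerate(dirs_r):
            if abs(d0 - dr) <= d:
                links.append((i, j))  # pixels behind d0 and dr: same region
    return links
```

With mask 0's directions (29°, 3°) and mask 1's (30°, 70°) from the example below, only the 29°/30° pair links, mirroring the label "1" region of FIG. 2(e₃).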

第2図(e)は、上述したエツジ方向設定部203およ
び同−線分小領域生成部20/4による隣接する2つの
マスク0,1内の処理内容を示す。
FIG. 2(e) shows the processing contents of two adjacent masks 0 and 1 by the edge direction setting section 203 and the same line segment small region generation section 20/4 described above.

同図(e□)は、マスク0,1内における近似角度をも
つ画素のエツジ方向を示し、空白部はエツジ強度が所定
値以下の画素である。また同図(e2)は、平均処理に
より得たマスク0.1のエツジ方向を示し、同図(e3
)は、マスク0.1の各エツジ方向とそのエツジ方向を
構成した画素とが対応付けられた状態を表わす。
FIG. 2(e₁) shows the edge directions of pixels with approximately equal angles in masks 0 and 1; blank cells are pixels whose edge strength is below the threshold. FIG. 2(e₂) shows the edge directions of masks 0 and 1 obtained by the averaging process, and FIG. 2(e₃) shows each edge direction of masks 0 and 1 associated with the pixels that formed it.

例えば、第2図(e2)のように、マスク1のエツジ方
向が(30’、70°)、マスク0のエツジ方向が(2
9°、3°)と演算されると、マスク1の平均エツジ方
向30°の算出に寄与した各画素と、マスクOの平均エ
ツジ方向29°の算出に寄与した各画素との差がd以下
か否かを求め、d以下であるマスク0.1の各画素を、
記号「1」で特定される同一のノ」1領域(画素集合)
に分類する。この場合、この画素集合のエツジ方向が3
0’であることを示す「1」を各画素に付与する。また
、平均エツジ方向70’および平均エツジ方向3°の各
画素はそれぞれ一方のマスクにのみ存在し、それぞれの
平均エツジ方向の算出に寄与した各画素にr2」、T3
」を付与して画素集合として分類する。この結果が第2
図(e3)に示されている。
For example, as in FIG. 2(e₂), suppose the edge directions of mask 1 are computed as (30°, 70°) and those of mask 0 as (29°, 3°). For each pixel that contributed to mask 1's average edge direction of 30° and each pixel that contributed to mask 0's average edge direction of 29°, it is checked whether their difference is at most d; the qualifying pixels of masks 0 and 1 are classified into a single small region (pixel set) identified by the symbol "1", and each such pixel is given the label "1" indicating that the edge direction of this pixel set is 30°. The pixels of the average edge directions 70° and 3°, on the other hand, each exist in only one of the masks; the pixels contributing to those averages are labeled "2" and "3" respectively and classified as pixel sets. The result is shown in FIG. 2(e₃).

また、このように隣接するマスク間で(do1+doz
+ ”’doJ+ ”’drxr dr、、、 ”’d
rJ+ ”l に対して求められた小領域(1,2,3
)が、互いに近傍にある画素集合の組であることを記憶
する。
It is also stored that the small regions (1, 2, 3) obtained in this way for (d0₁, d0₂, ..., d0ⱼ, ..., dr₁, dr₂, ..., drⱼ, ...) between adjacent masks form a group of mutually neighboring pixel sets.

つまり、このような小領域は1つの画像データに対して
複数個得られるが、隣接するマスク内の画像データによ
り生成された各小領域については、それらが互いに近傍
にある小領域であることが識別できるように記憶する。
In other words, although many such small regions are obtained from one image, the small regions generated from the image data of adjacent masks are stored so that they can be identified as mutually neighboring small regions.

次に、同一線分領域のモーメント計算部205では、従
来と同様に」二連のように分類される各小領域のモーメ
ンI・を割算し、そして、直線方程式および線分の端点
計算部206では、小領域を構成する画素数がある数似
上の時、算出されたモーメントから直線の方程式(y=
ax十b)および該線分の端点の座標(X□+ yl)
 、(X213’2)を算出する(第2図(f)参照)
Next, the same-line-segment moment calculation section 205 calculates, as in the conventional method, the moments of each small region classified as above. Then, when the number of pixels constituting a small region is at least a certain value, the straight-line equation and segment end point calculation section 206 computes from the moments the equation of the line (y = ax + b) and the coordinates (x₁, y₁), (x₂, y₂) of the end points of the segment (see FIG. 2(f)).

線分連結性判定部207では、直線方程式および線分の
端点計算部206で得た複数の線分の交点Pと端点の基
準距離t、sとを基準として、線分の連結性を判定する
(第2図(g)参照)。すなわち、距離t、sがそれぞ
れt≦α、S≦βの時、各線分が連結可能と判定する。
The segment connectivity judgment section 207 judges the connectivity of segments based on the distances t and s between the intersection point P of the segments obtained by the calculation section 206 and their end points (see FIG. 2(g)). That is, when the distances t and s satisfy t ≤ α and s ≤ β respectively, the segments are judged connectable.

この連結性の判定は、同−線分小領域生成部204で記
憶した互いに近傍にある小領域(画素集合)同志につい
てのみ行なう。つまり、ある画像データの画像処理にお
いて得られた複数の小領域の全てに対して従来のように
連結性を判定するのではなく、多角形の角などのように
連結する必要のある小領域はそれぞれ隣接しているもの
であることに着目し、例えば、第2図(e3)の小領域
(1,2,3)のように隣接するマスク間の小領域同志
について連結性を判定する。
This connectivity judgment is performed only between the mutually neighboring small regions (pixel sets) stored by the same-line-segment small region generation section 204. That is, rather than judging connectivity among all of the small regions obtained from the image data as in the conventional method, attention is paid to the fact that small regions needing connection, such as those at the corners of a polygon, are adjacent to one another; connectivity is therefore judged only between small regions of adjacent masks, such as the small regions (1, 2, 3) of FIG. 2(e₃).

そして、次の線分連結部208では、従来と同様にして
、連結性ありと判定された各線分をその交点まで延長、
もしくは短縮して第2図(h)のように線分の連結を行
なう。
Then, in the next line segment connection section 208, each line segment that has been determined to have connectivity is extended to its intersection point in the same manner as before.
Alternatively, the line segments can be shortened and connected as shown in FIG. 2(h).

従って、」二連のような本実施例にあっては、隣接する
4×4のマスク間において同一線分の小領域を求めるよ
うにしているから、第5図に示すように画素に少々の飛
びがある場合でも同一の領域とみなすことができ、線分
をノイズにより細分化して検出することが防止されると
ともに、小領域の棄却という問題も解決できる。さらに
、隣接するマスク間において求められた小領域について
のみ線分の連結性判定を行なうようにしたから、その判
定処理の負荷を軽減できると共に、精度の良い効率的な
線分検出が可能になる。
Therefore, in this embodiment, as described above, small regions of the same line segment are found between adjacent 4×4 masks. Even when there are small gaps between pixels, as shown in FIG. 5, the pixels can still be treated as one region, which prevents a line segment from being detected in noise-induced fragments and also solves the problem of small regions being rejected. Furthermore, since segment connectivity is judged only for the small regions found between adjacent masks, the load of that judgment is lightened, and accurate, efficient line segment detection becomes possible.

G0発明の効果 以」二のように本発明によれば、隣接するn×n構成の
マスク内における画素同志のエツジ方向を比較し、その
差が略等しい画素を同一の画素集合として分類し、かつ
それらの画素集合が互いに近傍にあるとし、近傍にある
画素集合から求められた線分についてのみ相互の連結性
を判定するようにしたので、生成される小領域の棄却に
よる不検出線分が増加することがなく、かつ生成される
小領域の細分化による併合処理の負荷が軽減されるとと
もに、線分の連結性判定の負荷も軽減でき、精度の良い
効率的な線分検出ができるという効果が得られる。
G. Effects of the Invention
As described above, according to the present invention, the edge directions of pixels within adjacent n×n masks are compared, pixels whose directions are approximately equal are classified into the same pixel set, those pixel sets are treated as mutually neighboring, and connectivity is judged only for segments obtained from neighboring pixel sets. As a result, undetected segments due to the rejection of generated small regions do not increase, the load of merge processing due to fragmentation of the generated small regions is reduced, and the load of judging segment connectivity is also reduced, so that accurate and efficient line segment detection is achieved.

【図面の簡単な説明】[Brief explanation of the drawing]

第1図は本発明に係る線分検出方法を実施する装置を機
能別に表わして示す構成図、第2図はその処理内容を示
す説明図、第3図は従来の線分検出方法を実施する装置
を機能別に表わして示す構成図、第4図はその処理内容
を示す説明図、第5図は同じ〈従来における線分生成例
を示す説明図である。 201:画像入力部 202ニ工ツジ強度・方向検出部 203:エツジ方向設定部 204:同−線分小領域生成部 205:モーメント計算部 206:直線方程式および線分の端点計算部207:線
分連結性判定部 208:線分連結部 第1図 特許出願人 日産自動車株式会社 代理人弁理士 永井冬紀 第3図 第4図 第5図
FIG. 1 is a block diagram showing, by function, an apparatus implementing the line segment detection method according to the present invention; FIG. 2 is an explanatory diagram showing its processing contents; FIG. 3 is a block diagram showing, by function, an apparatus implementing the conventional line segment detection method; FIG. 4 is an explanatory diagram showing its processing contents; and FIG. 5 is an explanatory diagram showing a conventional example of line segment generation. 201: image input section; 202: edge strength/direction detection section; 203: edge direction setting section; 204: same-line-segment small region generation section; 205: moment calculation section; 206: straight-line equation and segment end point calculation section; 207: segment connectivity judgment section; 208: segment connection section. Patent applicant: Nissan Motor Co., Ltd. Representative patent attorney: Fuyuki Nagai.

Claims (1)

【特許請求の範囲】 認識対象物体の画像データの各画素についてエッジ強度
およびエッジ方向を演算し、エッジ強度とエッジ方向と
が略等しい画素同志を画素集合として分類し、この画素
集合のモーメントから各画素集合により形成される線分
および各線分の端点の座標を求め、これから得た各線分
の交点と端点間の距離がある値以下のものについて連結
補正して線分を検出する線分検出方法において、 画像データをn×nの画素から成る複数のマスク領域に
分割し、この各マスク領域において略等しいエッジ方向
をもつ画素を同一のエッジ方向を持つ画素として分類す
ると共に、隣接するマスク領域のエッジ方向が略等しい
画素を同一の画素集合として分類し、次いで隣接するマ
スク領域間で分類される画素集合が互いに近傍にあると
して該画素集合の組を記憶し、この記憶値をもつ画素集
合から演算された線分同志についてのみ線分の交点と端
点間の距離を求めて線分の連結を行なうことを特徴とす
る線分検出方法。
[Claims] A line segment detection method in which edge strength and edge direction are calculated for each pixel of the image data of an object to be recognized; pixels whose edge strengths and edge directions are approximately equal are classified into pixel sets; the line segment formed by each pixel set and the coordinates of its end points are determined from the moments of the set; and segments whose intersection-to-end-point distances are below a certain value are connected and corrected, the method being characterized in that: the image data is divided into a plurality of mask regions each consisting of n×n pixels; within each mask region, pixels having approximately equal edge directions are classified as pixels of the same edge direction, and pixels of adjacent mask regions whose edge directions are approximately equal are classified into the same pixel set; the pairs of pixel sets classified between adjacent mask regions are then stored as mutually neighboring sets; and the distance between the intersection point and the end points is determined, and the segments are connected, only for segments computed from pixel sets having such stored values.
JP63170557A 1988-07-07 1988-07-07 Line segment detecting method Pending JPH0218681A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP63170557A JPH0218681A (en) 1988-07-07 1988-07-07 Line segment detecting method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP63170557A JPH0218681A (en) 1988-07-07 1988-07-07 Line segment detecting method

Publications (1)

Publication Number Publication Date
JPH0218681A true JPH0218681A (en) 1990-01-22

Family

ID=15907063

Family Applications (1)

Application Number Title Priority Date Filing Date
JP63170557A Pending JPH0218681A (en) 1988-07-07 1988-07-07 Line segment detecting method

Country Status (1)

Country Link
JP (1) JPH0218681A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996023278A1 (en) * 1995-01-27 1996-08-01 Advantest Corporation Image processor
US7550539B2 (en) 2007-04-03 2009-06-23 Dupont Performance Elastomers Llc Partially neutralized chlorosulfonated polyolefin elastomers
US7680339B2 (en) 2004-10-27 2010-03-16 Canon Kabushiki Kaisha Image processing method and apparatus for edge detection in an image
US7838601B2 (en) 2007-04-03 2010-11-23 Dupont Performance Elastomers L.L.C. Partially neutralized chlorosulfonated polyolefin elastomers
US8013072B2 (en) 2009-05-13 2011-09-06 Dupont Performance Elastomers L.L.C. Partially hydrolyzed chlorosulfonated polyolefin elastomers
US8985975B2 (en) 2009-02-10 2015-03-24 Bp Exploration Operating Company Limited Multistage pump suitable for use in wells
JP2019106012A (en) * 2017-12-12 2019-06-27 富士通株式会社 Determination program, determination method, and information processing device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996023278A1 (en) * 1995-01-27 1996-08-01 Advantest Corporation Image processor
KR100257786B1 (en) * 1995-01-27 2000-06-01 오우라 히로시 Picture processing device
US7680339B2 (en) 2004-10-27 2010-03-16 Canon Kabushiki Kaisha Image processing method and apparatus for edge detection in an image
US7550539B2 (en) 2007-04-03 2009-06-23 Dupont Performance Elastomers Llc Partially neutralized chlorosulfonated polyolefin elastomers
US7838601B2 (en) 2007-04-03 2010-11-23 Dupont Performance Elastomers L.L.C. Partially neutralized chlorosulfonated polyolefin elastomers
US8985975B2 (en) 2009-02-10 2015-03-24 Bp Exploration Operating Company Limited Multistage pump suitable for use in wells
US8013072B2 (en) 2009-05-13 2011-09-06 Dupont Performance Elastomers L.L.C. Partially hydrolyzed chlorosulfonated polyolefin elastomers
JP2019106012A (en) * 2017-12-12 2019-06-27 富士通株式会社 Determination program, determination method, and information processing device

Similar Documents

Publication Publication Date Title
KR101698700B1 (en) Pattern inspecting and measuring device and program
US4908872A (en) Method and apparatus for extracting pattern contours in image processing
EP1343116B1 (en) Image angle detector and scanning line interpolating apparatus
JP5533091B2 (en) Method for identifying data point distribution region on coordinate plane and identification program thereof
JPS6332666A (en) Method for pattern defect
JP2003057019A (en) Pattern inspection device and inspection method using the same
CN111402316B (en) Rapid detection method for ellipses in image based on anti-fake links
JPH0218681A (en) Line segment detecting method
JP4062987B2 (en) Image area dividing method, image area dividing apparatus, and image area dividing program
US7151855B2 (en) Pattern measurement method, manufacturing method of semiconductor device, pattern measurement apparatus, and program
JP5486403B2 (en) Image processing apparatus, image processing method, and computer program
JP6018802B2 (en) Dimension measuring device and computer program
JP4760362B2 (en) Character reader
JP6800743B2 (en) Master image data creation device
JPH04323503A (en) Image processor
JP2516844B2 (en) Parts detection method and device
CN110889875A (en) Binocular camera calibration method based on partial corner points
JP2001243479A (en) Method and device for image processing and recording medium having image processing program recorded thereon
JP4080750B2 (en) Character extraction method, character extraction device, and program
JPH05108800A (en) Picture defect discrimination processor
JPH1091788A (en) Device for positioning pattern and method therefor
JPH0896137A (en) Input coordinate deciding device
JPH09265540A (en) Processor and method for image processing
JPH02138677A (en) Detecting device for picture feature point
JP2022060505A (en) Fingerprint processing device, fingerprint processing method, program, and fingerprint processing circuit