JP3553698B2 - White line recognition device that recognizes inclined white lines in images - Google Patents

White line recognition device that recognizes inclined white lines in images

Info

Publication number
JP3553698B2
Authority
JP
Japan
Prior art keywords
edge
white line
image
edges
horizontal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
JP23052695A
Other languages
Japanese (ja)
Other versions
JPH0972716A (en)
Inventor
伸和 島
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Ten Ltd
Original Assignee
Denso Ten Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Ten Ltd filed Critical Denso Ten Ltd
Priority to JP23052695A priority Critical patent/JP3553698B2/en
Publication of JPH0972716A publication Critical patent/JPH0972716A/en
Application granted granted Critical
Publication of JP3553698B2 publication Critical patent/JP3553698B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a white line recognition device that recognizes the white lines in an image that define the traveling lane of the own vehicle, in order to detect a departure from the traveling lane or to detect and recognize a preceding vehicle in the traveling lane, and in particular aims to improve the accuracy of recognition of inclined white lines.
[0002]
[Prior art]
In such a white line recognition device, a technique called edge extraction is used, in which image processing is applied to extract only the outline of the white line as its characteristic feature. Edge extraction means classifying pixels, based on the image data, into the two values of an "edge portion" and a "non-edge portion" to obtain an edge image. In order to reliably recognize the white line from the edge image of the traveling lane, edges separated by the width of the white line are extracted as follows.
[0003]
FIG. 10 shows an image of the white line width of a traveling lane. Part a of the traveling lane white line image of FIG. 10(a) is shown enlarged in FIG. 10(b). As an example, scanning horizontally from left to right, a portion that changes from bright to dark is extracted as a positive edge (right edge), a portion that changes from dark to bright is extracted as a negative edge (left edge), and a portion with no change in brightness is treated as having no edge, so that each pixel is classified into one of three values. From the edge image obtained in this way, the right edge and the left edge of the white line, which has a certain width, are obtained. If the interval between the right edge and the left edge matches the white line width defined by law, the right edge and the left edge are recognized as a white line of the traveling lane. Since the white line width obtained from this three-value classification is constant, it can be distinguished from the edges of other objects, and white line recognition that is robust against noise can be realized. The white line width can be obtained after recognizing the white line of the traveling lane shown in the perspective view of FIG. 10 and converting the perspective view into a plan view; since this conversion is well known, its description is omitted for simplicity.
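As a toy illustration of this three-value scan and width test, the sketch below classifies one image row and checks the interval between a left edge and the following right edge. The intensity values, the threshold and the width of 4 pixels are made-up numbers, and the comparison is done directly in pixels here rather than after the plan-view conversion mentioned above.

```python
import numpy as np

# One image row; the bright run in the middle stands for a white line on dark asphalt.
row = np.array([40, 42, 41, 200, 205, 203, 201, 45, 43], dtype=np.int32)

diff = np.diff(row)                                 # change from each pixel to the next
threshold = 50
left_edges = np.where(diff > threshold)[0] + 1      # dark -> bright: negative edge (left edge)
right_edges = np.where(diff < -threshold)[0] + 1    # bright -> dark: positive edge (right edge)

LEGAL_WIDTH_PX = 4                                  # hypothetical white line width in pixels
for left in left_edges:
    for right in right_edges[right_edges > left]:
        if right - left == LEGAL_WIDTH_PX:
            print(f"white line candidate between columns {left} and {right}")
```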
[0004]
[Problems to be solved by the invention]
FIG. 11 shows the left edge of a white line obtained by the above white line recognition device. In this device, pixels are classified into three values in the horizontal direction from left to right. As shown in FIGS. 11(a) and 11(b), when the white line is vertical or nearly vertical, the left-edge pixels are continuous; but as shown in FIG. 11(c), when the white line is nearly horizontal, the left-edge pixels become sparse and discontinuous. In that case, if a negative edge (left edge) caused by noise lies nearby, it may be mistaken for the left edge of the white line. In other words, as the white line tilts and its inclination toward the horizontal grows, non-edge portions increase along the boundary of the white line, the boundary becomes susceptible to noise, and appropriate edges become difficult to extract, so that highly precise and accurate recognition processing cannot be realized. Furthermore, since the magnitude of the inclination of the white line changes with the camera's angle of view and depression angle and with the mounting height of the camera on the vehicle, the installation conditions of the camera become problematic.
[0005]
Accordingly, it is an object of the present invention to provide a white line recognition device capable of extracting an appropriate edge for the inclination of a white line in view of the above problems.
[0006]
[Means for Solving the Problems]
In order to solve the above problems, the present invention provides a white line recognition device that recognizes an inclined white line in an image and has the following configuration. That is, a white line recognition device that extracts changes in brightness as edges in order to recognize the white line image defining the traveling lane of the own vehicle is provided with a horizontal/vertical operator that extracts edges in the horizontal and vertical directions for the same portion of the image, and with edge extraction determination means that combines the edge extracted in the horizontal direction and the edge extracted in the vertical direction into a pair and determines that an edge has been extracted when the pair contains at least one edge. When the interval between two edges judged to be edges by the edge extraction determination means is constant, they are recognized as a white line.
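Stated as a predicate, the judgment is simply that the horizontal/vertical pair is not "no edge" in both directions. A minimal sketch, assuming the three states are encoded as -1 (negative edge), 0 (no edge) and +1 (positive edge):

```python
def edge_extracted(horizontal_state: int, vertical_state: int) -> bool:
    """True when the (horizontal, vertical) pair contains at least one edge,
    i.e. at least one of the two states is something other than 'no edge' (0)."""
    return horizontal_state != 0 or vertical_state != 0
```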
[0007]
The horizontal/vertical operator may be an operator that, for the same portion of the image, extracts edges in three states in each of the horizontal and vertical directions, a change from dark to bright being a negative edge, a change from bright to dark being a positive edge, and no change in brightness being no edge. The edge extraction determination means may comprise nine-value identification means that classifies the combinations of the horizontal and vertical edge states extracted by the operator into nine values to identify the image, left-white-line left/right edge selection means that selects the left and right edges of the left white line based on the nine values of the image for the left white line of the traveling lane, and right-white-line left/right edge selection means that selects the left and right edges of the right white line based on the nine values of the image for the right white line of the traveling lane.
[0008]
The edge extraction determination means may further comprise shared left/right edge selection means that selects the left and right edges of the left and right white lines based on the state of one edge in the horizontal and vertical directions, and the shared left/right edge selection means may be selectively switched with the left-white-line and right-white-line left/right edge selection means according to the degree of recognition of the white lines.
[0009]
The edge extraction determination means may further comprise shared left/right edge selection means that selects the left and right edges of the left and right white lines based on the state of one edge in the horizontal and vertical directions; the shared left/right edge selection means may select edges in the far-distance region of the screen based on the nine values of the image of the upper part of the traveling lane, while the left-white-line and right-white-line left/right edge selection means may select edges in the near-distance region of the screen based on the nine values of the image of the upper part of the traveling lane.
[0010]
According to the white line recognition device of the present invention for recognizing an inclined white line in an image, the edges extracted in the horizontal and vertical directions are combined into a pair, and an edge is judged to have been extracted when the pair contains at least one edge. Therefore, even when the white line tilts and its inclination toward the horizontal or vertical grows, the boundary of the white line no longer contains non-edge portions and is not easily affected by noise, and appropriate edges can be extracted. As a result, highly precise and accurate recognition processing can be realized. Furthermore, although the magnitude of the inclination of the white line changes with the camera's angle of view and depression angle and with the mounting height of the camera on the vehicle, appropriate edges can still be extracted, so that accurate recognition processing can be performed regardless of the camera installation conditions.
[0011]
By selectively switching between the shared left/right edge selection means and the left-white-line and right-white-line left/right edge selection means according to the degree of recognition of the white lines, the device can also cope with cases where there is much image noise (isolated points).
The shared left/right edge selection means selects edges in the far-distance region of the screen based on the nine values of the image of the upper part of the traveling lane, and the left-white-line and right-white-line left/right edge selection means select edges in the near-distance region of the screen based on the nine values of the image of the upper part of the traveling lane; this raises the precision and accuracy in the near-distance region while shortening the processing time in the far-distance region.
[0012]
BEST MODE FOR CARRYING OUT THE INVENTION
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIG. 1 is a diagram showing a white line recognition device according to an embodiment of the present invention, which recognizes an inclined white line in an image.
As shown in the figure, the white line recognition device comprises a camera 1 that is mounted on a vehicle and images the road ahead of the own vehicle, and an in-vehicle image processing device 10 that processes the image captured by the camera 1 to determine the white lines of the traveling lane.
[0013]
The in-vehicle image processing device 10 comprises an image recording unit 2, consisting of an image memory, that stores the image data; an edge extraction unit 3 that converts the image data into an edge image in which the features have been extracted; and a recognition processing unit 4, consisting of a CPU (Central Processing Unit), that recognizes the white lines of the traveling lane on the road from the edge image.
The image recording unit 2 may be provided after the edge extracting unit 3.
[0014]
Here, the edge extraction unit 3 includes a white line edge extraction unit 30 that classifies a portion of the image data having a large amount of change in shading value based on a certain threshold to obtain an edge image.
FIG. 2 shows the configuration of the white line edge extraction unit 30. As shown in the figure, the white line edge extraction unit 30 comprises a Sobel operator 31 and an edge extraction determination unit 32 that has a nine-value identification unit 33 and left/right edge selection units 34 and 35 for the left and right white lines.
[0015]
FIG. 3 illustrates the Sobel operator 31. The Sobel operator 31 is an operator that classifies the pixels of the image data; the operator shown in FIG. 3(a) has directionality in the vertical direction, and the operator shown in FIG. 3(b) has directionality in the horizontal direction. As shown in the figure, the luminance values of a 3 × 3 block of pixels are multiplied by the corresponding 3 × 3 operator values, the products are summed, and the center pixel of the 3 × 3 block is classified based on this sum as follows.
[0016]
With the operator having vertical directionality, when the 3 × 3 block of pixels changes from a bright portion to a dark portion on the screen, the sum is "positive" and the image formed by such pixels is called a "positive edge"; conversely, when it changes from a dark portion to a bright portion, the sum is "negative" and the image formed by such pixels is called a "negative edge"; and when there is no change in the bright or dark portions, the sum is "zero" and the image formed by such pixels is called "no edge". In this way, each pixel is classified into one of three states according to the sum.
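The following sketch applies two such 3 × 3 operators and classifies each pixel into the three states. The kernel values, the sign convention (chosen so that a bright-to-dark change yields a positive sum, as in the text) and the threshold are assumptions; the actual operator values are those shown in FIG. 3.

```python
import numpy as np

# Assumed 3x3 operators. OP_VERTICAL responds to changes across the image from
# left to right (vertically oriented edges); OP_HORIZONTAL responds to changes
# from top to bottom (horizontally oriented edges).
OP_VERTICAL = np.array([[1, 0, -1],
                        [2, 0, -2],
                        [1, 0, -1]])
OP_HORIZONTAL = np.array([[1, 2, 1],
                          [0, 0, 0],
                          [-1, -2, -1]])

def filter3x3(img, kernel):
    """Cross-correlate the image with a 3x3 kernel, zero-padded to keep the size."""
    h, w = img.shape
    padded = np.pad(img.astype(np.int32), 1)
    out = np.zeros((h, w), dtype=np.int32)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

def three_state(response, threshold=100):
    """+1 = positive edge, -1 = negative edge, 0 = no edge (threshold is an assumption)."""
    states = np.zeros(response.shape, dtype=np.int8)
    states[response > threshold] = 1
    states[response < -threshold] = -1
    return states

# Usage: two three-state maps for the same image.
# vertical_states = three_state(filter3x3(gray_image, OP_VERTICAL))
# horizontal_states = three_state(filter3x3(gray_image, OP_HORIZONTAL))
```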
[0017]
Similarly, pixels are further classified into three by the Sobel operator having a directionality in the horizontal direction. In this way, it is possible to classify pixels into 3 × 3 = 9 types in both the vertical direction and the horizontal direction.
FIG. 4 illustrates the nine-value identification unit 33. As shown in the figure, the nine-value identification unit 33 labels the pixels of the whole screen classified by the Sobel operator 31 with the numbers 0 to 8. Specifically, for a "negative edge" from the vertical operator, the identification numbers "1", "2" and "3" are assigned to pixels for which the horizontal operator gives a "negative edge", "no edge" and "positive edge", respectively. For "no edge" from the vertical operator, the identification numbers "4", "0" and "5" are assigned to pixels for which the horizontal operator gives a "negative edge", "no edge" and "positive edge", respectively. For a "positive edge" from the vertical operator, the identification numbers "6", "7" and "8" are assigned to pixels for which the horizontal operator gives a "negative edge", "no edge" and "positive edge", respectively.
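Written out as a lookup over the two three-state maps of the previous sketch, this assignment could look as follows (the function name is an assumption; the ID table is the one just described):

```python
import numpy as np

# (vertical state, horizontal state) -> identification number 0..8.
ID_TABLE = {
    (-1, -1): 1, (-1, 0): 2, (-1, +1): 3,
    ( 0, -1): 4, ( 0, 0): 0, ( 0, +1): 5,
    (+1, -1): 6, (+1, 0): 7, (+1, +1): 8,
}

def nine_value_identify(vertical_states, horizontal_states):
    """Combine the two three-state maps into one 9-value identification number per pixel."""
    ids = np.zeros(vertical_states.shape, dtype=np.uint8)
    for (v, h), label in ID_TABLE.items():
        ids[(vertical_states == v) & (horizontal_states == h)] = label
    return ids
```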
[0018]
FIG. 5 illustrates the left/right edge selection units 34 and 35 for the left and right white lines. The traveling lane of a vehicle on the road is bounded by two white lines, left and right. The left/right edge selection unit 34 for the left white line receives the nine values of the left half of the screen from the nine-value identification unit 33 and selects identification numbers in order to determine the left and right edges of the left white line. For the left edge, the identification numbers "1", "2", "3" and "4" are selected. Thus a pixel for which the vertical operator gives a "negative edge" and the horizontal operator also gives a "negative edge" (identification number 1) is a "left edge"; a pixel for which the vertical operator gives "no edge" but the horizontal operator gives a "negative edge" (identification number 4) is also a "left edge"; and a pixel for which the horizontal operator gives "no edge" but the vertical operator gives a "negative edge" (identification number 2) is likewise a "left edge".
[0019]
For the right edge, the identification numbers "5", "6", "7" and "8" are selected. Thus a pixel for which the vertical operator gives a "positive edge" and the horizontal operator also gives a "positive edge" (identification number 8) is a "right edge"; a pixel for which the vertical operator gives "no edge" but the horizontal operator gives a "positive edge" (identification number 5) is also a "right edge"; and a pixel for which the horizontal operator gives "no edge" but the vertical operator gives a "positive edge" (identification number 7) is likewise a "right edge".
[0020]
Next, the left/right edge selection unit 35 for the right white line receives the nine values of the right half of the screen from the nine-value identification unit 33 and selects identification numbers in order to determine the left and right edges of the right white line. For the left edge, the identification numbers "4", "6", "7" and "8" are selected. Thus a pixel for which the vertical operator gives a "positive edge" and the horizontal operator gives a "negative edge" (identification number 6) is a "left edge"; a pixel for which the vertical operator gives "no edge" but the horizontal operator gives a "negative edge" (identification number 4) is also a "left edge"; and a pixel for which the horizontal operator gives "no edge" but the vertical operator gives a "positive edge" (identification number 7) is likewise a "left edge".
[0021]
For the right edge, the identification numbers "1", "2", "3" and "5" are selected. Thus a pixel for which the vertical operator gives a "negative edge" and the horizontal operator gives a "positive edge" (identification number 3) is a "right edge"; a pixel for which the vertical operator gives "no edge" but the horizontal operator gives a "positive edge" (identification number 5) is also a "right edge"; and a pixel for which the horizontal operator gives "no edge" but the vertical operator gives a "negative edge" (identification number 2) is likewise a "right edge".
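The selection rules of paragraphs [0018] to [0021] thus reduce to membership tests on fixed sets of identification numbers; a minimal sketch (the set literals are transcribed from the text, the names and the half-screen split are assumptions):

```python
import numpy as np

# Selection unit 34 (left white line) and selection unit 35 (right white line).
LEFT_WHITE_LINE = {"left_edge": {1, 2, 3, 4}, "right_edge": {5, 6, 7, 8}}
RIGHT_WHITE_LINE = {"left_edge": {4, 6, 7, 8}, "right_edge": {1, 2, 3, 5}}

def select_edges(ids, id_set):
    """Boolean mask of pixels whose 9-value identification number lies in the given set."""
    return np.isin(ids, sorted(id_set))

# Usage: left-edge candidates of the left white line in the left half of the screen.
# left_half = ids[:, :ids.shape[1] // 2]
# left_edge_mask = select_edges(left_half, LEFT_WHITE_LINE["left_edge"])
```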
[0022]
Therefore, according to the present embodiment, taking the left edge of the left white line as an example: with the vertical-direction operator alone, as the left white line approaches the horizontal, "no edge" pixels increase relative to "negative edge" pixels along the left boundary of the white line and the negative-edge pixels become discontinuous; however, these "no edge" pixels are turned into "negative edge" pixels by the horizontal-direction operator, so the continuity of negative-edge pixels along the left boundary of the white line is preserved.
[0023]
Conversely, with the horizontal-direction operator alone, as the left white line approaches the vertical, "no edge" pixels increase relative to "negative edge" pixels along the left boundary of the white line and the negative-edge pixels become discontinuous; however, these "no edge" pixels are turned into "negative edge" pixels by the vertical-direction operator, so the continuity of negative-edge pixels along the left boundary of the white line is preserved. Appropriate edges can therefore be extracted regardless of the inclination of the white line, and depending on installation conditions such as the camera's angle of view, depression angle and mounting height on the vehicle, using this method realizes more accurate recognition.
[0024]
However, if only the left-white-line left/right selection unit 34 and the right-white-line left/right selection unit 35 are used, many image noise points (isolated points, in image-processing terms) appear, and highly precise, accurate recognition processing cannot be realized. This drawback is prevented as follows.
FIG. 6 shows another configuration of the white line edge extraction unit 30. As shown in the figure, a shared left/right edge selection unit 36 is newly provided in parallel with the left-white-line left/right selection unit 34 and the right-white-line left/right selection unit 35, and a switch 37 switches between the former and the latter based on information from the recognition processing unit 4.
[0025]
FIG. 7 illustrates the shared left/right edge selection unit 36. The shared left/right edge selection unit 36 receives the nine values of the left screen from the nine-value identification unit 33 and selects identification numbers to determine left and right edges common to the two, left and right, white lines of the traveling lane. That is, as shown in the figure, the identification numbers "1", "2" and "3" are selected for the left edge, and the identification numbers "6", "7" and "8" are selected for the right edge. In detail, the identification numbers "1" and "2" indicate the left edge of the left white line, the identification number "3" indicates the left edge of the right white line, the identification numbers "7" and "8" indicate the right edge of the left white line, and the identification number "6" indicates the edge for the right white line.
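In the same style as the earlier sketch, the shared selection unit 36 can be expressed as one more pair of ID sets (the names are assumptions; select_edges is the helper from the sketch above):

```python
# Shared selection unit 36: one pair of ID sets applied to both white lines.
SHARED = {"left_edge": {1, 2, 3}, "right_edge": {6, 7, 8}}

# Per the text, IDs 1 and 2 mark the left edge of the left white line, ID 3 the left
# edge of the right white line, IDs 7 and 8 the right edge of the left white line,
# and ID 6 the edge for the right white line.
# shared_left_mask = select_edges(ids, SHARED["left_edge"])
# shared_right_mask = select_edges(ids, SHARED["right_edge"])
```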
[0026]
The switch 37 may be switched from the left-white-line left/right selection unit 34 and the right-white-line left/right selection unit 35 to the shared left/right edge selection unit 36, or vice versa, when the recognition processing unit 4 cannot recognize a white line from the nine-value distribution of the screen produced by the nine-value identification unit 33. In this case, the recognition processing unit 4 may use the nine-value distribution of the previous screen from the nine-value identification unit 33.
[0027]
FIG. 8 illustrates the division of the screen into regions and the feature extraction scheme. As shown in the figure, the road screen is divided vertically into a near-distance region close to the vehicle and a far-distance region far from the vehicle, and the white line edge extraction unit 30 may extract edges as follows.
FIG. 9 shows another configuration of the white line edge extraction unit 30. As shown in the figure, the left-white-line left/right selection unit 34 extracts the left and right edges of the left white line using the values of the near-distance region of the screen from the nine-value identification unit 33, the right-white-line left/right selection unit 35 extracts the left and right edges of the right white line using the values of the near-distance region of the screen from the nine-value identification unit 33, and the shared left/right edge selection unit 36 extracts the left and right edges of both white lines using the values of the far-distance region of the screen from the nine-value identification unit 33. The recognition processing unit 4 tracks the edges in the far-distance region, starting from the edges recognized as white lines in the near-distance region, and recognizes them as white lines. When a white line is discontinuous in the far-distance region, the tracking jumps over the gap. In the near-distance region, as described above, if the interval between a right edge and a left edge matches the white line width defined by the Road Structure Ordinance, these edges can be recognized as a white line of the traveling lane; in the far-distance region, however, the interval between the right and left edges gradually narrows and this interval can no longer be measured precisely, so the tracking and jump tracking described above make recognition of the white line possible.
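A compact sketch of this region split, reusing the ID sets from the earlier sketches (the dividing row and all names are assumptions; FIG. 8 defines the actual split):

```python
import numpy as np

# ID sets as in the earlier sketches.
LEFT_WHITE_LINE = {"left_edge": {1, 2, 3, 4}, "right_edge": {5, 6, 7, 8}}
RIGHT_WHITE_LINE = {"left_edge": {4, 6, 7, 8}, "right_edge": {1, 2, 3, 5}}
SHARED = {"left_edge": {1, 2, 3}, "right_edge": {6, 7, 8}}

def select_edges(ids, id_set):
    return np.isin(ids, sorted(id_set))

def near_far_edges(ids, split_row):
    """Upper rows = far-distance region (shared unit 36),
    lower rows = near-distance region (per-line units 34 and 35)."""
    far_ids, near_ids = ids[:split_row, :], ids[split_row:, :]
    near_left = {k: select_edges(near_ids, v) for k, v in LEFT_WHITE_LINE.items()}
    near_right = {k: select_edges(near_ids, v) for k, v in RIGHT_WHITE_LINE.items()}
    far_shared = {k: select_edges(far_ids, v) for k, v in SHARED.items()}
    return near_left, near_right, far_shared
```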
[0028]
In the near-distance region, the left-white-line left/right selection unit 34 and the right-white-line left/right selection unit 35 are used to perform highly precise, accurate recognition processing. In the far-distance region, the precision of the white line interval is inherently low and such precision and accuracy are therefore not demanded of the white line recognition, so the structurally simpler shared left/right edge selection unit 36 is used there, which instead shortens the processing time.
[0029]
The Sobel operator is only an example, and another operator may be used. Although the scheme described above makes it possible to distinguish the left and right edges of the white line, the pixels may instead be classified into only two states, a change from dark to bright or vice versa and no change in brightness, and the white line edges may simply be obtained without distinguishing left from right, in order to speed up the processing.
[0030]
[Effects of the invention]
As described above, according to the present invention, the edges extracted in the horizontal direction and in the vertical direction are combined into a pair, and an edge is judged to have been extracted when the pair contains at least one edge. Therefore, even when the white line tilts and its inclination toward the horizontal or vertical grows, the boundary of the white line no longer contains non-edge portions and is not easily affected by noise, and appropriate edges can be extracted. As a result, highly precise and accurate recognition processing can be realized. Furthermore, although the magnitude of the inclination of the white line changes with the camera's angle of view and depression angle and with the mounting height of the camera on the vehicle, appropriate edges can still be extracted, so the camera installation conditions no longer cause difficulty. Since the shared left/right edge selection means and the left-white-line and right-white-line left/right edge selection means are selectively switched according to the degree of recognition of the white lines, the presence of much image noise (isolated points) can be prevented. The shared left/right edge selection means selects edges in the far-distance region of the screen based on the nine values of the image of the upper part of the traveling lane, and the left-white-line and right-white-line left/right edge selection means select edges in the near-distance region of the screen based on the nine values of the image of the upper part of the traveling lane, so the precision and accuracy in the near-distance region are raised while the processing time in the far-distance region is shortened.
[Brief description of the drawings]
FIG. 1 is a diagram showing a white line recognition device according to an embodiment of the present invention, which recognizes an inclined white line in an image.
FIG. 2 is a diagram showing a configuration of a white line edge extraction unit 30.
FIG. 3 is a diagram illustrating a Sobel operator 31.
FIG. 4 is a diagram for explaining the nine-value identification means 33.
FIG. 5 is a diagram illustrating left and right edge selection means for right and left white lines.
FIG. 6 is a diagram showing another configuration of the white line edge extraction unit 30.
FIG. 7 is a view for explaining the shared left and right edge selection means 36.
FIG. 8 is a diagram illustrating a screen area division and a feature extraction method.
FIG. 9 is a diagram showing another configuration of the white line edge extraction unit 30.
FIG. 10 is a diagram showing an image of a white line width of a traveling lane.
FIG. 11 is a diagram showing a left edge of a white line by the white line recognition device.
[Explanation of symbols]
31 ... horizontal/vertical operator
32 ... edge extraction determination means
33 ... nine-value identification means
34 ... left/right edge selection means for the left white line
35 ... left/right edge selection means for the right white line
36 ... shared left/right edge selection means

Claims (3)

A white line recognition device that extracts changes in brightness as edges in order to recognize a white line image defining the traveling lane of the own vehicle, comprising:
a horizontal/vertical operator (31) that extracts edges in the horizontal and vertical directions for the same portion of the image; and
edge extraction determination means (32) that combines the edge extracted in the horizontal direction and the edge extracted in the vertical direction into a pair and determines that an edge has been extracted when the pair contains at least one edge,
the device recognizing an inclined white line in the image by recognizing two edges determined as edges by the edge extraction determination means (32) as a white line when the interval between them is constant, wherein
the horizontal/vertical operator (31) is edge extraction means that, for the same portion of the image, extracts edges in three states in each of the horizontal and vertical directions, a change from dark to bright being a negative edge, a change from bright to dark being a positive edge, and no change in brightness being no edge, and
the edge extraction determination means (32) comprises:
nine-value identification means (33) that classifies the combinations of the horizontal and vertical edge states extracted by the edge extraction means into nine values to identify the image;
left-white-line left/right edge selection means (34) that selects the left and right edges of the left white line based on the nine values of the image for the left white line of the traveling lane; and
right-white-line left/right edge selection means (35) that selects the left and right edges of the right white line based on the nine values of the image for the right white line of the traveling lane.
The white line recognition device for recognizing an inclined white line in an image according to claim 1, wherein the edge extraction determination means (32) further comprises shared left/right edge selection means (36) that selects the left and right edges of the left and right white lines based on the state of one edge in the horizontal and vertical directions, and
the shared left/right edge selection means, the left-white-line left/right edge selection means (34) and the right-white-line left/right edge selection means (35) are selectively switched according to the degree of recognition of the white lines.
The white line recognition device for recognizing an inclined white line in an image according to claim 1, wherein the edge extraction determination means (32) further comprises shared left/right edge selection means (36) that selects the left and right edges of the left and right white lines based on the state of one edge in the horizontal and vertical directions,
the shared left/right edge selection means (36) selects edges in the far-distance region of the screen based on the nine values of the image of the upper part of the traveling lane, and
the left-white-line left/right edge selection means (34) and the right-white-line left/right edge selection means (35) select edges in the near-distance region of the screen based on the nine values of the image of the upper part of the traveling lane.
JP23052695A 1995-09-07 1995-09-07 White line recognition device that recognizes inclined white lines in images Expired - Lifetime JP3553698B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP23052695A JP3553698B2 (en) 1995-09-07 1995-09-07 White line recognition device that recognizes inclined white lines in images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP23052695A JP3553698B2 (en) 1995-09-07 1995-09-07 White line recognition device that recognizes inclined white lines in images

Publications (2)

Publication Number Publication Date
JPH0972716A JPH0972716A (en) 1997-03-18
JP3553698B2 (en) 2004-08-11

Family

ID=16909134

Family Applications (1)

Application Number Title Priority Date Filing Date
JP23052695A Expired - Lifetime JP3553698B2 (en) 1995-09-07 1995-09-07 White line recognition device that recognizes inclined white lines in images

Country Status (1)

Country Link
JP (1) JP3553698B2 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008101985A (en) * 2006-10-18 2008-05-01 Xanavi Informatics Corp On-vehicle device
JP5035284B2 (en) 2009-03-25 2012-09-26 株式会社日本自動車部品総合研究所 Vehicle periphery display device
JP5955404B2 (en) * 2012-10-22 2016-07-20 ヤマハ発動機株式会社 Distance measuring device and vehicle using the same
JP2015200976A (en) 2014-04-04 2015-11-12 富士通株式会社 Movement amount estimation device, movement amount estimation method, and program

Also Published As

Publication number Publication date
JPH0972716A (en) 1997-03-18

Similar Documents

Publication Publication Date Title
US10521676B2 (en) Lane detection device, lane departure determination device, lane detection method and lane departure determination method
JP3503230B2 (en) Nighttime vehicle recognition device
US9558412B2 (en) Vehicle exterior environment recognition device
CN101900562B (en) Clear path detection using divide approach
JP6274557B2 (en) Moving surface information detection apparatus, moving body device control system using the same, and moving surface information detection program
US10552706B2 (en) Attachable matter detection apparatus and attachable matter detection method
JP2917661B2 (en) Traffic flow measurement processing method and device
US20100121561A1 (en) Car navigation system
JP4864043B2 (en) Image processing apparatus, method, and program
JP6678552B2 (en) Vehicle type identification device and vehicle type identification method
JPH11195127A (en) Method for recognizing white line and device therefor
JP2007193702A (en) Image processing device and image processing method
JP3553698B2 (en) White line recognition device that recognizes inclined white lines in images
JP2000306097A (en) Road area decision device
JPH11213284A (en) Vehicle kind discrimination device
JPH10320559A (en) Traveling path detector for vehicle
JP2002008019A (en) Railway track recognition device and rolling stock using railway track recognition device
JP3333468B2 (en) Roadway image processing device for autonomous running of vehicles
JP5056931B2 (en) Vehicle color determination device, vehicle color determination system, and vehicle color determination method
JP3232064B2 (en) Target area extraction method for color images
JP2924063B2 (en) Image processing type traffic flow measurement device
JPH1166490A (en) Vehicle detecting method
JP2946620B2 (en) Automatic number reading device with speed measurement function
JPH0863549A (en) Vehicle number recognition device, binarization device and picture processor
JP3271743B2 (en) Serial number cutout device for license plate

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20040120

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20040127

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20040226

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20040406

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20040430

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090514

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100514

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110514

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120514

Year of fee payment: 8

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130514

Year of fee payment: 9

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140514

Year of fee payment: 10

EXPY Cancellation because of completion of term