CN106415602A - Method and device for detecting paired lane lines - Google Patents

Method and device for detecting paired lane lines

Info

Publication number
CN106415602A
CN106415602A (application number CN201680000880.XA)
Authority
CN
China
Prior art keywords
lane line
artificial neural
sampling point
distance
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201680000880.XA
Other languages
Chinese (zh)
Other versions
CN106415602B (en)
Inventor
黄凯明
韩永刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Streamax Technology Co Ltd
Original Assignee
Streamax Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Streamax Technology Co Ltd filed Critical Streamax Technology Co Ltd
Publication of CN106415602A publication Critical patent/CN106415602A/en
Application granted granted Critical
Publication of CN106415602B publication Critical patent/CN106415602B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Biology (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Image Analysis (AREA)

Abstract

A method for detecting paired lane lines is provided. The method includes: acquiring two lines to be detected, selecting sampling points on the two lines according to a preset spacing, and obtaining a distance vector between the sampling points and a preset common point; substituting the distance vector into a preset artificial neural network to calculate an excitation value, wherein the weight vectors of the artificial neural network are obtained by training on pre-collected paired lane line sample data; and determining, according to the excitation value output by the artificial neural network, whether the two lines are paired lane lines. The method effectively ensures that paired lane lines are determined in real time while improving the accuracy of the determination.

Description

Method and device for detecting paired lane lines
Technical field
The invention belongs to the field of automatic driving, and more particularly relates to a method and a device for detecting paired lane lines.
Background art
A lane departure warning system is a driver-assistance system that helps the driver reduce traffic accidents caused by lane departure by issuing warnings. When the vehicle deviates from its lane, the lane departure warning system can issue an early-warning alert, which may include an alarm tone, steering-wheel vibration, automatic steering correction, and the like.
In a lane departure warning system, lane lines must be correctly extracted and recognized to ensure accurate warnings. Existing methods for detecting paired lane lines generally consume considerable system resources: when higher accuracy is required, extra computation time is needed and real-time detection cannot be guaranteed; conversely, simplifying the detection to improve its real-time performance may cause missed detections and raise the false detection rate.
Summary of the invention
It is an object of the present invention to provide a method for detecting paired lane lines, to solve the prior-art problem that accuracy and real-time performance cannot both be effectively ensured when detecting paired lane lines.
In a first aspect, an embodiment of the present invention provides a method for detecting paired lane lines, the method including:
acquiring two straight lines to be detected, selecting sampling points on the two straight lines respectively according to a preset spacing, and obtaining a distance vector between the sampling points and a predetermined common point;
substituting the distance vector into a preset artificial neural network to calculate an excitation value, wherein the weight vectors of the artificial neural network are obtained by training on pre-collected paired lane line sample data;
determining, according to the excitation value output by the artificial neural network, whether the two straight lines are paired lane lines.
With reference to the first aspect, in a first possible implementation of the first aspect, the step of selecting sampling points on the two straight lines according to the preset spacing and obtaining the distance vector between the sampling points and the predetermined common point includes:
selecting sampling points on the two straight lines respectively according to the preset spacing;
taking the center point of the image as the common point, and obtaining the distance vector between the sampling points and the common point.
With reference to the first aspect, in a second possible implementation of the first aspect, before the step of substituting the distance vector into the preset artificial neural network to calculate the excitation value, the method further includes:
collecting a large number of paired lane line samples and unpaired lane line samples, and selecting sampling points on the lane line samples according to the spacing;
calculating the distances between the sampling points and the common point;
substituting the distances into the neuron layer of the artificial neural network and, according to whether each sample is paired, calculating the weight vectors corresponding to the neuron layer of the artificial neural network.
With reference to the first aspect, in a third possible implementation of the first aspect, the sampling points include N sampling points selected on each lane line, N being a natural number greater than or equal to 2.
With reference to the first aspect, in a fourth possible implementation of the first aspect, the step of determining, according to the excitation value output by the artificial neural network, whether the two straight lines are paired lane lines includes:
obtaining the excitation value output by the artificial neural network, and comparing the excitation value with a preset threshold;
if the excitation value is greater than the threshold, determining that the two straight lines are paired lane lines; if the excitation value is less than the threshold, determining that the two straight lines are not paired lane lines.
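For orientation only, the following is a minimal, self-contained sketch of the method of the first aspect in Python. Every name in it (is_paired, hidden_weights, and so on) is an illustrative assumption, the distance is taken to be Euclidean, and the network is read as producing a single excitation value that is compared with the threshold, as in the fourth implementation above.

```python
# Minimal end-to-end sketch of the first-aspect method (illustrative only: the
# names, the Euclidean distance metric and the single-excitation-value reading
# of the network are assumptions, not part of the disclosure).
import math
from typing import List, Tuple

Point = Tuple[float, float]

def is_paired(line_a: List[Point], line_b: List[Point], common_point: Point,
              hidden_weights: List[List[float]], output_weights: List[float],
              threshold: float = 0.0) -> bool:
    # Sampling points selected on the two lines according to the preset spacing
    # are assumed to be passed in as line_a and line_b.
    sampling_points = line_a + line_b
    # Distance vector between every sampling point and the common point.
    cx, cy = common_point
    distances = [math.hypot(x - cx, y - cy) for x, y in sampling_points]
    # Excitation value of the preset artificial neural network
    # (a hidden layer with identity activation feeding one output neuron).
    hidden = [sum(d * w for d, w in zip(distances, ws)) for ws in hidden_weights]
    excitation = sum(h * w for h, w in zip(hidden, output_weights))
    # The two lines are judged to be paired lane lines if the excitation value
    # exceeds the preset threshold.
    return excitation > threshold
```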
In a second aspect, an embodiment of the present invention provides a device for detecting paired lane lines, the device including:
a lane line acquiring unit, configured to acquire two straight lines to be detected, select sampling points on the two straight lines respectively according to a preset spacing, and obtain a distance vector between the sampling points and a predetermined common point;
a computing unit, configured to substitute the distance vector into a preset artificial neural network to calculate an excitation value, wherein the weight vectors of the artificial neural network are obtained by training on pre-collected paired lane line sample data;
a judging unit, configured to determine, according to the excitation value output by the artificial neural network, whether the two straight lines are paired lane lines.
With reference to the second aspect, in a first possible implementation of the second aspect, the lane line acquiring unit includes:
a sampling point selection subunit, configured to select sampling points on the two straight lines respectively according to the preset spacing;
a common point acquiring subunit, configured to take the center point of the image as the common point and obtain the distance vector between the sampling points and the common point.
With reference to the second aspect, in a second possible implementation of the second aspect, the device further includes:
a sample collection unit, configured to collect a large number of paired lane line samples and unpaired lane line samples, and select sampling points on the lane line samples according to the spacing;
a distance calculation unit, configured to calculate the distances between the sampling points and the common point;
a weight vector calculation unit, configured to substitute the distances into the neuron layer of the artificial neural network and, according to whether each sample is paired, calculate the weight vectors corresponding to the neuron layer of the artificial neural network.
With reference to the second aspect, in a third possible implementation of the second aspect, the sampling points include N sampling points selected on each lane line, N being a natural number greater than or equal to 2.
With reference to the second aspect, in a fourth possible implementation of the second aspect, the judging unit includes:
a comparison subunit, configured to obtain the excitation value output by the artificial neural network and compare the excitation value with a preset threshold;
a paired lane line determination subunit, configured to determine that the two straight lines are paired lane lines if the excitation value is greater than the threshold, and determine that the two straight lines are not paired lane lines if the excitation value is less than the threshold.
In the present invention, two straight lines to be detected are acquired; sampling points are selected on the two straight lines according to a preset spacing; the distance vector between the sampling points and a common point is obtained; and the distance vector is substituted into a pre-trained artificial neural network to obtain the excitation value output by the artificial neural network, according to which it is determined whether the two straight lines are paired lane lines. With the method of the invention, only the acquired distance data needs to be substituted into the artificial neural network to quickly determine whether the lines are paired lane lines, which effectively ensures the real-time performance of the determination of paired lane lines while improving its accuracy.
Brief description of the drawings
Fig. 1 is a flowchart of the method for detecting paired lane lines provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of an example of obtaining a distance vector according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of another example of obtaining a distance vector according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the artificial neural network provided by an embodiment of the present invention;
Fig. 5 is a flowchart of the training of the artificial neural network provided by an embodiment of the present invention;
Fig. 5a, Fig. 5b and Fig. 5c are schematic diagrams of training samples provided by an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of the device for detecting paired lane lines provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are only intended to explain the present invention, not to limit it.
The method for detecting paired lane lines described in the embodiments of the present invention is intended to overcome a defect of the prior-art detection methods for paired lane lines: to improve detection accuracy, they generally rely on complex detection algorithms, so the detection computation consumes a certain amount of time and, at high vehicle speed, the detection result lags and the real-time performance of the detection is low; conversely, simple lane line judgment methods are prone to erroneous detection results, which mislead the user. The present invention is further illustrated below with reference to the drawings.
Fig. 1 shows the implementation flow of the method for detecting paired lane lines provided by an embodiment of the present invention, detailed as follows:
In step S101, two straight lines to be detected are acquired, sampling points are selected on the two straight lines respectively according to a preset spacing, and a distance vector between the sampling points and a predetermined common point is obtained.
Specifically, the paired lane lines described in the embodiments of the present invention refer to the auxiliary lines that delimit the lane in which the vehicle travels. While the vehicle is travelling, markings other than lane lines may also appear; for example, as shown in Fig. 3, besides the lane lines there is also an arrow marking, and the pair formed by the arrow line and a lane line should not be recognized as paired lane lines.
The two straight lines to be detected may be obtained by recognition in an image. For example, the straight lines may be recognized by color, e.g. straight lines whose color in the image is white or yellow.
The preset spacing may be set according to the size of the image; for example, the spacing may be set to 1/3 of the screen width. Of course, the spacing may also be chosen according to the number of sampling points required, and set so that the selected sampling points include the end positions of the straight lines.
The common point may be chosen flexibly according to the user's needs. For example, it may be set as the midpoint of the top edge of the image, the midpoint of the bottom edge of the image, or the center point of the image. Different choices of the common point correspond to different weight vectors of the artificial neural network, and the position of the common point used when training the weight vectors must be the same as the position of the common point used for the two lines to be detected.
The distance vector between the sampling points and the common point may be obtained by measuring the distance between each sampling point and the common point. For the two straight lines in Fig. 2, two sampling points are selected on each straight line (one possible implementation chosen for this example), giving four sampling points in total, and the common point is the center of the image; the distance of the upper-left segment is 7.7 cm, the distance of the lower-left segment is 10.5 cm, the distance of the upper-right segment is 8.5 cm, and the distance of the lower-right segment is 12 cm, so the resulting distance vector is <7.7, 10.5, 8.5, 12>.
For the two straight lines shown in Fig. 3, two sampling points are likewise selected on each straight line (one possible implementation chosen for this example), giving four sampling points in total, and the common point is the center of the image; the distance of the upper-left segment is 2.7 cm, the distance of the lower-left segment is 8.2 cm, the distance of the upper-right segment is 6.2 cm, and the distance of the lower-right segment is 4.3 cm, so the resulting distance vector is <2.7, 8.2, 6.2, 4.3>.
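As an illustration of how such a distance vector might be assembled programmatically, the following sketch measures the Euclidean distance from each sampling point to the image center; the function name distance_vector and the example coordinates are assumptions and do not correspond to Fig. 2 or Fig. 3.

```python
# Sketch only: builds the distance vector <d1, d2, d3, d4> from four sampling
# points (two per candidate line) and a common point such as the image center.
import math
from typing import List, Tuple

Point = Tuple[float, float]

def distance_vector(sampling_points: List[Point], common_point: Point) -> List[float]:
    """Euclidean distance from each sampling point to the common point."""
    cx, cy = common_point
    return [math.hypot(x - cx, y - cy) for x, y in sampling_points]

# Hypothetical usage with made-up coordinates (same units as the image):
# distance_vector([(1.0, 2.0), (0.5, 9.0), (9.0, 2.5), (10.0, 9.0)], (5.0, 6.0))
```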
In step S102, the distance vector is substituted into the preset artificial neural network to calculate the excitation value, wherein the weight vectors of the artificial neural network are obtained by training on pre-collected paired lane line sample data.
Specifically, the weight vectors of the artificial neural network described in the embodiments of the present invention may be obtained by training on a number of preset samples. The training of the weight vectors may include the following steps shown in Fig. 5:
In step S501, a large number of paired lane line samples and unpaired lane line samples are collected, and sampling points are selected on the lane line samples according to the spacing;
In step S502, the distances between the sampling points and the common point are calculated;
In step S503, the distances are substituted into the neuron layer of the artificial neural network, and the weight vectors corresponding to the neuron layer of the artificial neural network are calculated according to whether each sample is paired.
Specifically, the artificial neural network may include a neuron layer and an output layer, and a hidden layer may also be included between the neuron layer and the output layer. Fig. 4 is a schematic structural diagram of an artificial neural network provided by an embodiment of the present invention. As shown in Fig. 4, the artificial neural network includes input-layer nodes X1, X2, X3 and X4 and neurons Y1, Y2, Z1 and Z2, where Y1 and Y2 form the hidden layer and Z1 and Z2 form the output layer.
The number of inputs of the input layer is set according to the dimension of the input vector. For example, there are 4 sampling points in Fig. 5a, Fig. 5b and Fig. 5c, so the corresponding input layer also has 4 inputs.
Assume that Fig. 5a, Fig. 5b and Fig. 5c are three of the large number of paired lane line samples, and that their distance vectors are, in turn, as follows:
For the two straight lines shown in Fig. 5a, two sampling points are selected on each straight line (one possible implementation chosen for this example), giving four sampling points in total, and the common point is the center of the image; the distance of the upper-left segment is 5 cm, the distance of the lower-left segment is 14 cm, the distance of the upper-right segment is 11 cm, and the distance of the lower-right segment is 9.5 cm, so the distance vector is <5, 14, 11, 9.5>.
For the two straight lines shown in Fig. 5b, two sampling points are selected on each straight line (one possible implementation chosen for this example), giving four sampling points in total, and the common point is the center of the image; the distance of the upper-left segment is 5 cm, the distance of the lower-left segment is 6.5 cm, the distance of the upper-right segment is 10 cm, and the distance of the lower-right segment is 16 cm, so the distance vector is <5, 6.5, 10, 16>.
For the two straight lines shown in Fig. 5c, two sampling points are selected on each straight line (one possible implementation chosen for this example), giving four sampling points in total, and the common point is the center of the image; the distance of the upper-left segment is 7 cm, the distance of the lower-left segment is 11 cm, the distance of the upper-right segment is 7.7 cm, and the distance of the lower-right segment is 12 cm, so the distance vector is <7, 11, 7.7, 12>.
The expected output for Fig. 5a, Fig. 5b and Fig. 5c is "paired lane lines". In the artificial neural network of Fig. 4, W11, W13, W15 and W17 are the weights of the four inputs of neuron Y1; W12, W14, W16 and W18 are the weights of the four inputs of neuron Y2; W21 and W23 are the weights of the two inputs of neuron Z1; and W22 and W24 are the weights of the two inputs of neuron Z2.
The excitation value of a neuron is the sum of the products of its inputs and their weights; for example, in Fig. 4, the excitation value of Y1 = X1*W11 + X2*W13 + X3*W15 + X4*W17.
During training, an excitation function may be set for each neuron, for example: the output layer outputs 1 if its excitation value exceeds a certain threshold, and outputs 0 otherwise. In this example, the excitation function of the hidden-layer neurons Y1 and Y2 is set to: output = excitation value.
The inputs of neurons Z1 and Z2 are the outputs of Y1 and Y2, i.e. the excitation value of Z1 = Y1*W21 + Y2*W23. The excitation function set for Z1 and Z2 in the present invention may be: if (excitation value >= 0), output 1; otherwise, output 0.
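Written as code, the forward pass just described (identity activation for Y1 and Y2, a step at zero for Z1 and Z2) could look as follows; this is a sketch of one reading of Fig. 4, and the function names excitation and forward are illustrative.

```python
# Sketch of the forward pass of the network in Fig. 4: four distance inputs,
# hidden neurons Y1/Y2 with identity activation, output neurons Z1/Z2 with a
# step activation at 0 (output 1 if the excitation value >= 0, else 0).
from typing import List, Tuple

def excitation(inputs: List[float], weights: List[float]) -> float:
    """Excitation value of a neuron: sum of products of inputs and weights."""
    return sum(x * w for x, w in zip(inputs, weights))

def forward(distances: List[float],
            w_y1: List[float], w_y2: List[float],
            w_z1: List[float], w_z2: List[float]) -> Tuple[int, int]:
    y = [excitation(distances, w_y1), excitation(distances, w_y2)]  # identity output
    z1 = 1 if excitation(y, w_z1) >= 0 else 0
    z2 = 1 if excitation(y, w_z2) >= 0 else 0
    return z1, z2   # (1, 0) is read as "paired lane lines", (0, 1) as "not paired"
```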
With the above excitation functions, each weight of the artificial neural network can be determined, i.e. the artificial neural network can be trained.
In the present invention, each weight of the artificial neural network is initialized to an arbitrary random decimal in [-1, 1]. The samples of the training set are then input into the artificial neural network one by one and each weight is adjusted so that all "positive" samples produce an output of 1 at Z1 and an output of 0 at Z2, while all "negative" samples produce an output of 0 at Z1 and an output of 1 at Z2. The three vectors of Fig. 5a, Fig. 5b and Fig. 5c are input one by one as positive samples, the training is repeated, and the weights are adjusted; of course, negative samples may also be input for training. The final weight vector of neuron Y1 is <0.8, -0.2, 0.65, -0.3>, the input weight vector of Y2 is <0.7, -0.3, 0.9, -0.4>, the input weight vector of Z1 is <1, -1>, and the input weight vector of Z2 is <-1, 1>.
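The paragraph above fixes the initialization ([-1, 1]) and the target outputs (positive samples: Z1 = 1, Z2 = 0; negative samples: Z1 = 0, Z2 = 1) but does not name the rule used to adjust the weights. The sketch below therefore assumes a simple perceptron-style correction applied to the output-layer weights, purely as one plausible illustration; it reuses excitation() and forward() from the previous sketch.

```python
# Training-loop sketch (assumption: perceptron-style updates on the output layer;
# the disclosure only prescribes the random initialization and the target outputs).
import random
from typing import List, Tuple

def train(samples: List[List[float]], paired: List[bool],
          epochs: int = 100, lr: float = 0.1) -> Tuple[List[float], ...]:
    rand = lambda n: [random.uniform(-1.0, 1.0) for _ in range(n)]
    w_y1, w_y2 = rand(4), rand(4)       # hidden-layer weights, random in [-1, 1]
    w_z1, w_z2 = rand(2), rand(2)       # output-layer weights, random in [-1, 1]
    for _ in range(epochs):
        for x, is_pair in zip(samples, paired):
            y = [excitation(x, w_y1), excitation(x, w_y2)]   # hidden outputs
            z1, z2 = forward(x, w_y1, w_y2, w_z1, w_z2)      # current outputs
            t1, t2 = (1, 0) if is_pair else (0, 1)           # target outputs
            # Correct the output-layer weights whenever an output is wrong.
            w_z1 = [w + lr * (t1 - z1) * yi for w, yi in zip(w_z1, y)]
            w_z2 = [w + lr * (t2 - z2) * yi for w, yi in zip(w_z2, y)]
    return w_y1, w_y2, w_z1, w_z2
```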
In step S103, whether the two straight lines are paired lane lines is determined according to the excitation value output by the artificial neural network.
The distance vectors obtained from Fig. 2 and Fig. 3 are substituted into the artificial neural network respectively, and the excitation values of the output layer of the artificial neural network can be calculated:
For Fig. 2, the input vector is <7.7, 10.5, 8.5, 12>; the weight vector of Y1 is <0.8, -0.2, 0.65, -0.3>, giving an excitation value of Y1 of 5.985; the weight vector of Y2 is <0.7, -0.3, 0.9, -0.4>, giving an excitation value of Y2 of 5.09.
With the excitation function of Y1 and Y2 set to "output = excitation value", the excitation value of Z1 = <5.985, 5.09> * <1, -1> = 0.895, and the excitation value of Z2 = <5.985, 5.09> * <-1, 1> = -0.895.
With the excitation function of Z1 and Z2 set to "if (excitation value >= 0), output 1; otherwise, output 0", Z1 outputs 1 and Z2 outputs 0, and the judgment "the two straight lines are paired lane lines" is obtained.
For the two lines in Fig. 3, the vector obtained is <2.7, 8.2, 6.2, 4.3>; the weight vector of Y1 is <0.8, -0.2, 0.65, -0.3>, giving an excitation value of Y1 of 3.26; the weight vector of Y2 is <0.7, -0.3, 0.9, -0.4>, giving an excitation value of Y2 of 3.29.
The excitation value of Z1 = <3.26, 3.29> * <1, -1> = -0.03, and the excitation value of Z2 = <3.26, 3.29> * <-1, 1> = 0.03. Z1 outputs 0 and Z2 outputs 1, and the judgment "the two straight lines are not paired lane lines" is obtained.
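Feeding the trained weight vectors and the two worked distance vectors into the forward() sketch given earlier reproduces these judgments (illustrative snippet only):

```python
# Reproducing the two worked examples with the forward() sketch above.
w_y1 = [0.8, -0.2, 0.65, -0.3]   # trained weight vector of Y1
w_y2 = [0.7, -0.3, 0.9, -0.4]    # trained weight vector of Y2
w_z1, w_z2 = [1, -1], [-1, 1]    # trained weight vectors of Z1 and Z2

print(forward([7.7, 10.5, 8.5, 12], w_y1, w_y2, w_z1, w_z2))  # (1, 0): paired (Fig. 2)
print(forward([2.7, 8.2, 6.2, 4.3], w_y1, w_y2, w_z1, w_z2))  # (0, 1): not paired (Fig. 3)
```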
In the present invention, two straight lines to be detected are acquired; sampling points are selected on the two straight lines according to a preset spacing; the distance vector between the sampling points and a common point is obtained; and the distance vector is substituted into a pre-trained artificial neural network to obtain the excitation value output by the artificial neural network, according to which it is determined whether the two straight lines are paired lane lines. With the method of the invention, only the acquired distance data needs to be substituted into the artificial neural network to quickly determine whether the lines are paired lane lines, which effectively ensures the real-time performance of the determination of paired lane lines while improving its accuracy.
Fig. 6 shows a schematic structural diagram of the device for detecting paired lane lines provided by an embodiment of the present invention, detailed as follows:
The device for detecting paired lane lines described in the embodiment of the present invention includes:
a lane line acquiring unit 601, configured to acquire two straight lines to be detected, select sampling points on the two straight lines respectively according to a preset spacing, and obtain a distance vector between the sampling points and a predetermined common point;
a computing unit 602, configured to substitute the distance vector into a preset artificial neural network to calculate an excitation value, wherein the weight vectors of the artificial neural network are obtained by training on pre-collected paired lane line sample data;
a judging unit 603, configured to determine, according to the excitation value output by the artificial neural network, whether the two straight lines are paired lane lines.
Preferably, the lane line acquiring unit includes:
a sampling point selection subunit, configured to select sampling points on the two straight lines respectively according to the preset spacing;
a common point acquiring subunit, configured to take the center point of the image as the common point and obtain the distance vector between the sampling points and the common point.
Preferably, the device further includes:
a sample collection unit, configured to collect a large number of paired lane line samples and unpaired lane line samples, and select sampling points on the lane line samples according to the spacing;
a distance calculation unit, configured to calculate the distances between the sampling points and the common point;
a weight vector calculation unit, configured to substitute the distances into the neuron layer of the artificial neural network and, according to whether each sample is paired, calculate the weight vectors corresponding to the neuron layer of the artificial neural network.
Preferably, the sampling points include N sampling points selected on each lane line, N being a natural number greater than or equal to 2.
Preferably, the judging unit includes:
a comparison subunit, configured to obtain the excitation value output by the artificial neural network and compare the excitation value with a preset threshold;
a paired lane line determination subunit, configured to determine that the two straight lines are paired lane lines if the excitation value is greater than the threshold, and determine that the two straight lines are not paired lane lines if the excitation value is less than the threshold.
The device for detecting paired lane lines described in the embodiment of the present invention corresponds to the above method for detecting paired lane lines, and its details are not repeated here.
It should be understood that the device and method disclosed in the several embodiments provided by the present invention may be implemented in other ways. For example, the device embodiments described above are merely illustrative; the division into units is only a division by logical function, and other divisions are possible in actual implementation, e.g. multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the objects of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the method described in each embodiment of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a portable hard drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement and improvement made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. A method for detecting paired lane lines, characterized in that the method includes:
acquiring two straight lines to be detected, selecting sampling points on the two straight lines respectively according to a preset spacing, and obtaining a distance vector between the sampling points and a predetermined common point;
substituting the distance vector into a preset artificial neural network to calculate an excitation value, wherein the weight vectors of the artificial neural network are obtained by training on pre-collected paired lane line sample data;
determining, according to the excitation value output by the artificial neural network, whether the two straight lines are paired lane lines.
2. The method according to claim 1, characterized in that the step of selecting sampling points on the two straight lines according to the preset spacing and obtaining the distance vector between the sampling points and the predetermined common point includes:
selecting sampling points on the two straight lines respectively according to the preset spacing;
taking the center point of the image as the common point, and obtaining the distance vector between the sampling points and the common point.
3. The method according to claim 1, characterized in that, before the step of substituting the distance vector into the preset artificial neural network to calculate the excitation value, the method further includes:
collecting a large number of paired lane line samples and unpaired lane line samples, and selecting sampling points on the lane line samples according to the spacing;
calculating the distances between the sampling points and the common point;
substituting the distances into the neuron layer of the artificial neural network and, according to whether each sample is paired, calculating the weight vectors corresponding to the neuron layer of the artificial neural network.
4. The method according to claim 1, characterized in that the sampling points include N sampling points selected on each lane line, N being a natural number greater than or equal to 2.
5. The method according to claim 1, characterized in that the step of determining, according to the excitation value output by the artificial neural network, whether the two straight lines are paired lane lines includes:
obtaining the excitation value output by the artificial neural network, and comparing the excitation value with a preset threshold;
if the excitation value is greater than the threshold, determining that the two straight lines are paired lane lines; if the excitation value is less than the threshold, determining that the two straight lines are not paired lane lines.
6. A device for detecting paired lane lines, characterized in that the device includes:
a lane line acquiring unit, configured to acquire two straight lines to be detected, select sampling points on the two straight lines respectively according to a preset spacing, and obtain a distance vector between the sampling points and a predetermined common point;
a computing unit, configured to substitute the distance vector into a preset artificial neural network to calculate an excitation value, wherein the weight vectors of the artificial neural network are obtained by training on pre-collected paired lane line sample data;
a judging unit, configured to determine, according to the excitation value output by the artificial neural network, whether the two straight lines are paired lane lines.
7. The device according to claim 6, characterized in that the lane line acquiring unit includes:
a sampling point selection subunit, configured to select sampling points on the two straight lines respectively according to the preset spacing;
a common point acquiring subunit, configured to take the center point of the image as the common point and obtain the distance vector between the sampling points and the common point.
8. The device according to claim 6, characterized in that the device further includes:
a sample collection unit, configured to collect a large number of paired lane line samples and unpaired lane line samples, and select sampling points on the lane line samples according to the spacing;
a distance calculation unit, configured to calculate the distances between the sampling points and the common point;
a weight vector calculation unit, configured to substitute the distances into the neuron layer of the artificial neural network and, according to whether each sample is paired, calculate the weight vectors corresponding to the neuron layer of the artificial neural network.
9. The device according to claim 6, characterized in that the sampling points include N sampling points selected on each lane line, N being a natural number greater than or equal to 2.
10. The device according to claim 6, characterized in that the judging unit includes:
a comparison subunit, configured to obtain the excitation value output by the artificial neural network and compare the excitation value with a preset threshold;
a paired lane line determination subunit, configured to determine that the two straight lines are paired lane lines if the excitation value is greater than the threshold, and determine that the two straight lines are not paired lane lines if the excitation value is less than the threshold.
CN201680000880.XA 2016-08-25 2016-08-25 Method and device for detecting paired lane lines Active CN106415602B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/096761 WO2018035815A1 (en) 2016-08-25 2016-08-25 Method and device for detecting paired lane lines

Publications (2)

Publication Number Publication Date
CN106415602A true CN106415602A (en) 2017-02-15
CN106415602B CN106415602B (en) 2019-12-03

Family

ID=58087907

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680000880.XA Active CN106415602B (en) 2016-08-25 2016-08-25 Method and device for detecting paired lane lines

Country Status (2)

Country Link
CN (1) CN106415602B (en)
WO (1) WO2018035815A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106778791A (en) * 2017-03-01 2017-05-31 成都天衡电科科技有限公司 A kind of timber visual identity method based on multiple perceptron
WO2018053836A1 (en) * 2016-09-26 2018-03-29 深圳市锐明技术股份有限公司 Paired lane line detection method and device

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1431918A1 (en) * 2002-12-20 2004-06-23 Valeo Vision Method and system for detecting road bends
CN102201167A (en) * 2010-04-07 2011-09-28 宫宁生 Video-based driveway automatic identification method
CN102750825A (en) * 2012-06-19 2012-10-24 银江股份有限公司 Urban road traffic condition detection method based on neural network classifier cascade fusion
KR101261409B1 (en) * 2012-04-24 2013-05-10 이엔지정보기술 주식회사 System for recognizing road markings of image
CN104063877A (en) * 2014-07-16 2014-09-24 中电海康集团有限公司 Hybrid judgment identification method for candidate lane lines
CN105046235A (en) * 2015-08-03 2015-11-11 百度在线网络技术(北京)有限公司 Lane line recognition modeling method and apparatus and recognition method and apparatus
CN105069415A (en) * 2015-07-24 2015-11-18 深圳市佳信捷技术股份有限公司 Lane line detection method and device
US20160012300A1 (en) * 2014-07-11 2016-01-14 Denso Corporation Lane boundary line recognition device
CN105260713A (en) * 2015-10-09 2016-01-20 东方网力科技股份有限公司 Method and device for detecting lane line
CN105260699A (en) * 2015-09-10 2016-01-20 百度在线网络技术(北京)有限公司 Lane line data processing method and lane line data processing device
CN105608429A (en) * 2015-12-21 2016-05-25 重庆大学 Differential excitation-based robust lane line detection method
CN105718916A (en) * 2016-01-27 2016-06-29 大连楼兰科技股份有限公司 Lane line detection method based on Hough transform

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104102905B (en) * 2014-07-16 2018-03-16 中电海康集团有限公司 A kind of adaptive detection method of lane line

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1431918A1 (en) * 2002-12-20 2004-06-23 Valeo Vision Method and system for detecting road bends
CN102201167A (en) * 2010-04-07 2011-09-28 宫宁生 Video-based driveway automatic identification method
KR101261409B1 (en) * 2012-04-24 2013-05-10 이엔지정보기술 주식회사 System for recognizing road markings of image
CN102750825A (en) * 2012-06-19 2012-10-24 银江股份有限公司 Urban road traffic condition detection method based on neural network classifier cascade fusion
US20160012300A1 (en) * 2014-07-11 2016-01-14 Denso Corporation Lane boundary line recognition device
CN104063877A (en) * 2014-07-16 2014-09-24 中电海康集团有限公司 Hybrid judgment identification method for candidate lane lines
CN105069415A (en) * 2015-07-24 2015-11-18 深圳市佳信捷技术股份有限公司 Lane line detection method and device
CN105046235A (en) * 2015-08-03 2015-11-11 百度在线网络技术(北京)有限公司 Lane line recognition modeling method and apparatus and recognition method and apparatus
CN105260699A (en) * 2015-09-10 2016-01-20 百度在线网络技术(北京)有限公司 Lane line data processing method and lane line data processing device
CN105260713A (en) * 2015-10-09 2016-01-20 东方网力科技股份有限公司 Method and device for detecting lane line
CN105608429A (en) * 2015-12-21 2016-05-25 重庆大学 Differential excitation-based robust lane line detection method
CN105718916A (en) * 2016-01-27 2016-06-29 大连楼兰科技股份有限公司 Lane line detection method based on Hough transform

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XUE LI 等: "Lane detection based on spiking neural network and hough transform", 《2015 8TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING (CISP)》 *
韦唯: "Research on Lane Line Recognition Methods Based on Monocular Vision", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018053836A1 (en) * 2016-09-26 2018-03-29 深圳市锐明技术股份有限公司 Paired lane line detection method and device
CN106778791A (en) * 2017-03-01 2017-05-31 成都天衡电科科技有限公司 A kind of timber visual identity method based on multiple perceptron

Also Published As

Publication number Publication date
CN106415602B (en) 2019-12-03
WO2018035815A1 (en) 2018-03-01

Similar Documents

Publication Publication Date Title
CN106462757B Method and device for rapidly detecting paired lane lines
US20240149882A1 (en) Multiple exposure event determination
JP6565967B2 (en) Road obstacle detection device, method, and program
CN108944939B (en) Method and system for providing driving directions
CN106740457A (en) Vehicle lane-changing decision-making technique based on BP neural network model
CN105404947A (en) User quality detection method and device
CN109415057B (en) Method for preferably identifying object by driver assistance system
KR20210040415A (en) Object classification method and device
WO2019053052A1 (en) A method for (re-)training a machine learning component
Peng et al. Intelligent method for identifying driving risk based on V2V multisource big data
US20200097004A1 (en) Evolutionary algorithmic state machine for autonomous vehicle planning
CN108108703A (en) Deceleration strip missing detection method, device and electronic equipment
CN106415602A (en) Method and device for detecting paired lane lines
CN106448258B Method and device for detecting parking space state
Taherifard et al. Attention-based event characterization for scarce vehicular sensing data
CN108027896A (en) System and method for decoding the pulse reservoir with continuous synaptic plasticity
CN103049747A (en) Method for re-identifying human body images by utilization skin color
CN111881706A (en) Living body detection, image classification and model training method, device, equipment and medium
CN111352414A (en) Decoy removal apparatus and method for vehicle and vehicle including the same
CN115270381A (en) Simulation scene generation method and device, automatic driving equipment and readable storage medium
CN107301487A (en) Automotive occupant is enabled to report the method and system for associating harm with vehicle environmental
CN106415603B Method and device for efficiently detecting paired lane lines
Muñoz-Organero et al. Detecting different road infrastructural elements based on the stochastic characterization of speed patterns
EP4141807A1 (en) Method and device for generating synthetic training data for an ultrasonic sensor model
CN115841712A (en) Driving data processing method, device and equipment based on V2X technology

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant