CN106991389B - Device and method for determining road edge - Google Patents


Info

Publication number
CN106991389B
CN106991389B
Authority
CN
China
Prior art keywords
road
vehicle
road edge
curve
edge curve
Prior art date
Legal status
Active
Application number
CN201710196345.2A
Other languages
Chinese (zh)
Other versions
CN106991389A (en)
Inventor
胡传远
付晶玮
Current Assignee
NIO Holding Co Ltd
Original Assignee
NIO Anhui Holding Co Ltd
Priority date
Filing date
Publication date
Application filed by NIO Anhui Holding Co Ltd
Priority to CN201710196345.2A
Publication of CN106991389A
Priority to PCT/CN2018/075130 (published as WO2018177026A1)
Application granted
Publication of CN106991389B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a device and a method for determining road edges, belonging to the technical field of intelligent automobiles. The device for determining a road edge comprises: a radar detector mounted on the vehicle and capable of detecting at least stationary objects beside a road edge of the road on which the vehicle is located; and a processing component configured to receive the stationary targets detected by the radar detector and extract arrangement information of the stationary targets that are arranged approximately regularly with respect to the road, thereby obtaining road edge information based on the arrangement information. The device and the method are very suitable for acquiring road edge information on unstructured roads, and road edge information can be acquired relatively accurately even at long distances.

Description

Device and method for determining road edge
Technical Field
The invention belongs to the technical field of intelligent automobiles, and relates to a device and a method for determining a road edge based on a static target beside the road edge.
Background
Automatic driving (including assisted driving) is an important direction in the development of intelligent automobiles, and more and more vehicles employ automatic driving systems to realize automatic driving functions. In general, an automatic driving system must determine the drivable area of the vehicle at any time, and an important aspect of determining the drivable area is determining the road edge of the road currently being driven.
At present, automatic driving systems commonly determine the road edge from images of lane lines captured by an image sensor (e.g., a camera mounted on the vehicle), the road edge being determined by image processing of the lane lines in images captured in real time. This technique presents at least one of the following problems:
in the first aspect, because it depends on the lane lines of the road, it is difficult to determine the road edge, or the determined road edge deviates greatly from the real road edge, for roads whose lane lines are blurred, partially missing or entirely absent;
in the second aspect, the technique relies on an image sensor, yet in practical applications the amount of information carried by the near and distant regions of an image differs. Generally, the actual physical distance represented between two adjacent pixels is smaller near the lens center point of the image sensor than in the lens boundary area, which easily leads to poor recognition of distant lane sections, that is, inaccurate determination or detection of the road edge at a long distance (relative to the vehicle).
Disclosure of Invention
To solve at least one aspect of the above technical problems, the present invention provides the following technical solutions.
According to an aspect of the present invention, there is provided an apparatus for determining an edge of a road, comprising:
a radar detector mounted on the vehicle and capable of detecting at least stationary objects beside a road edge of a road on which the vehicle is located; and
a processing component configured to receive the stationary targets detected by the radar detector and extract arrangement information of the stationary targets that are arranged approximately regularly with respect to the road, thereby obtaining road edge information based on the arrangement information.
According to yet another aspect of the present invention, there is provided a method of determining a road edge, comprising the steps of:
(a) detecting a stationary object beside a road edge of a road on which a vehicle is located; and
(b) extracting arrangement information of the stationary objects that are arranged approximately regularly with respect to the road, and obtaining road edge information based on the arrangement information.
According to a further aspect of the present invention there is provided a vehicle provided with an autonomous driving system in which is provided any of the above described means for determining the edge of a road.
The above features and operation of the present invention will become more apparent from the following description and the accompanying drawings.
Drawings
The above and other objects and advantages of the present invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings, in which like or similar elements are designated by like reference numerals.
Fig. 1 is a schematic view showing the structure of an apparatus for determining a road edge according to an embodiment of the present invention.
Fig. 2 is a schematic view of an application scenario of the apparatus of the embodiment shown in fig. 1 when determining a road edge.
FIG. 3 is a flow chart of a method of determining a road edge in accordance with an embodiment of the invention.
Detailed Description
The present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the invention to those skilled in the art. In the drawings, the same reference numerals denote the same elements or components, and thus, their description will be omitted.
Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
Fig. 1 is a schematic structural diagram of an apparatus for determining a road edge according to an embodiment of the present invention, and fig. 2 is a schematic application scenario diagram of the apparatus shown in fig. 1 when determining a road edge. The following describes an apparatus according to an embodiment of the present invention and its operation principle with reference to fig. 1 and 2.
As shown in fig. 1, a device for determining a road edge (hereinafter simply referred to as the "determination device") is mounted on a vehicle 100; the specific type of the vehicle 100 is not limited, and the vehicle 100 is the host vehicle of the determination device. The determination device may be applied in an automatic driving system installed in the vehicle 100.
Taking fig. 2 as an example, the vehicle 100 is driving on a road 900 that has corresponding road edges 901a and 901b, where 901a is the left road edge and 901b is the right road edge. In this application scenario, the road edges 901a and 901b are not explicitly identified by lane lines, or a section of the road 900 in this example has no corresponding lane lines to identify the road edges. On both sides of the road 900 there are various objects that are stationary with respect to the road; these are the targets detected by the determination device and are therefore also referred to as "stationary targets". By way of example, stationary objects are shown beside the left road edge 901a of the road 900, such as trees 801, utility poles 802 and isolation piers 803 (three of which, 803a, 803b and 803c, are shown). It should be understood that the stationary objects beside the road edge are not limited to these types of objects, and may also include, for example, fences, sign uprights and the like.
the determination means essentially comprise radar detectors 110 mounted on the vehicle 100 and capable of detecting at least stationary objects at least one of the two sides of the road 900 on which the vehicle 100 is located. In one embodiment, radar detector 110 is a millimeter wave radar, mounted at the front end of vehicle 100, capable of detecting various objects ahead at a 90 detection angle range on the road plane, including, for example, stationary objects beside road edge 901a as shown in FIG. 2. The radar detector 110 emits electromagnetic waves of a certain wavelength and receives reflections from a preceding object when detecting, and thus, can detect the positions of various objects, particularly, for an object at a long distance (for example, over 40 meters), can be detected relatively accurately (with respect to the image sensor 120) as a short-distance object, and thus, has a better long-distance detection characteristic with respect to the image sensor 120.
It is noted that a vehicle coordinate system, that is, an XY coordinate system, may be defined in advance in the determination device: the centroid of the vehicle 100 is defined as the origin O, the X axis as the longitudinal (straight-ahead) direction of the vehicle 100, and the Y axis as the lateral direction of the vehicle 100. When detecting various objects (including stationary objects), the radar detector 110 essentially determines the coordinates (X, Y) of each object, where the X coordinate represents the longitudinal offset of the object from the centroid of the vehicle 100 (i.e., the offset along the X axis) and the Y coordinate represents the lateral offset of the object from the centroid of the vehicle 100 (i.e., the offset along the Y axis) in the vehicle coordinate system.
The millimeter-wave radar is configured to determine stationary targets, that is, objects stationary with respect to the road edge 901a or 901b, from among the various detected objects based on the Doppler effect and the speed of the host vehicle. It can therefore output information about stationary objects (for example, their coordinates in the vehicle coordinate system) substantially in real time.
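The Doppler-based stationary-target test can be sketched as follows. This is a minimal illustration, not code from the patent: the function name and the 0.5 m/s tolerance are assumptions, and the check simply compares the measured radial speed with the projection of the ego speed onto the line of sight.

```python
import math

def is_stationary(radial_speed_mps, ego_speed_mps, azimuth_rad, tol_mps=0.5):
    # For an object fixed to the road, the measured Doppler (radial) speed
    # equals the projection of the ego velocity onto the line of sight,
    # with opposite sign (the object appears to approach the vehicle).
    expected = -ego_speed_mps * math.cos(azimuth_rad)
    return abs(radial_speed_mps - expected) <= tol_mps
```

For example, at an ego speed of 20 m/s an object dead ahead returning a radial speed of -20 m/s is classified as stationary, while one returning -15 m/s is moving.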
When a millimeter-wave radar is used, the radar detector 110 has the advantages of relatively low cost and accurate detection of long-distance (e.g., over 40 meters) stationary targets. It should be understood, however, that the radar detector 110 is not limited to millimeter-wave radar; it may, for example, be a lidar, which detects various stationary targets (including long-distance ones) relatively more accurately but is comparatively expensive and demands higher data-processing capability from the subsequent processing component 130.
Continuing with fig. 1, the determination device further comprises a processing component 130, also provided on the vehicle 100, which may be implemented by a processing device in the automatic driving system of the vehicle 100 or by a processor provided separately from the automatic driving system. The processing component 130 may execute algorithm code stored in it and execute instructions from the automatic driving system or the vehicle; the specific hardware implementation of the processing component 130 is known and will not be described in detail herein.
The processing component 130 mainly performs data processing on the stationary-target information transmitted by the radar detector 110 to obtain a road edge curve, and is configured to: receive the stationary targets detected by the radar detector 110 and extract arrangement information of the stationary targets that are arranged approximately regularly with respect to the road, thereby obtaining road edge information based on the arrangement information. The road edge information may specifically be expressed as road edge curve information. The specific operation of the processing component 130 is illustrated below taking the road edge curve of the left road edge 901a shown in fig. 2 as an example.
In an embodiment, the number of stationary targets transmitted by the radar detector 110 after one scan may be on the order of several tens. A corresponding screening unit 131 is therefore provided in the processing component 130, which screens out at least three stationary targets as reference targets from among the plurality of stationary targets transmitted by the radar detector 110. As shown in fig. 2, trees 801, utility poles 802 and isolation piers 803 having an approximately regular arrangement with respect to the road 900 may be screened out from among the stationary targets as reference targets for the left road edge 901a, while other stationary targets, such as trees or poles that are irregularly arranged with respect to the left road edge 901a, are not selected as reference targets, i.e., are filtered out.
Both sides of the road 900 will typically have objects, such as the trees 801 and utility poles 802, arranged approximately regularly with respect to the road 900. In determining whether a stationary target is approximately regularly arranged with respect to the road 900, a predicted travel trajectory, which corresponds approximately to the current road curve, may be obtained based on the current yaw rate of the vehicle 100 (which may be collected, for example, from a component such as the steering system of the vehicle 100). Whether a stationary target is approximately regularly arranged with respect to the road 900 can thus be judged essentially by whether it is approximately regularly arranged with respect to the predicted travel trajectory, and the corresponding stationary targets can be screened out as reference targets. It should be understood that "approximately" in "approximately regular arrangement" reflects that stationary objects on either side of the road 900 are not necessarily aligned exactly according to a fixed rule; for example, their alignment with respect to the road may have a tolerance on the order of several meters.
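One possible screening rule along these lines is sketched below. It is an illustrative assumption, not the patent's algorithm: the predicted trajectory is approximated by the yaw-rate arc Y ≈ 0.5·k·X², k = yaw rate / speed, and targets are kept when their lateral offset from that arc is roughly constant (here within an assumed 2 m tolerance of the median offset).

```python
def screen_reference_targets(targets, speed_mps, yaw_rate_rps, lateral_tol_m=2.0):
    # targets: list of (X, Y) vehicle-frame coordinates of stationary objects.
    # Curvature of the predicted travel trajectory; zero when driving straight.
    k = yaw_rate_rps / speed_mps if speed_mps > 1e-3 else 0.0
    # Lateral offset of each target from the predicted trajectory.
    offsets = [(x, y, y - 0.5 * k * x * x) for (x, y) in targets]
    if not offsets:
        return []
    med = sorted(o for (_, _, o) in offsets)[len(offsets) // 2]  # median offset
    kept = [(x, y) for (x, y, o) in offsets if abs(o - med) <= lateral_tol_m]
    # At least three reference targets are needed for the later curve fit.
    return kept if len(kept) >= 3 else []
```

On a straight road, targets lying near Y ≈ 3 m survive while an isolated object far from that band is filtered out.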
As shown in fig. 2, the trees 801, utility poles 802, isolation piers 803 and the like regularly arranged near the left road edge 901a are determined by the screening unit 131 to be approximately regularly arranged with respect to the predicted travel trajectory, and at least three of them are therefore used as reference targets; for example, three or more trees 801 are selected as reference targets, the isolation piers 803a, 803b and 803c are selected as reference targets, or several trees 801 together with one utility pole 802 and one isolation pier 803a are selected as reference targets. The more reference targets there are, the more accurately the road edge curve can be obtained in the subsequent processing.
In an embodiment, the processing component 130 is provided with a target curve fitting unit 132 configured to perform curve fitting on at least three reference targets in the vehicle coordinate system to obtain the corresponding reference target arrangement curve. Specifically, the reference target arrangement curve to be obtained is defined in advance by a quadratic function, that is, the following relation (1):
Y = C2 × X² + C1 × X + C0'    (1)
where X is the independent variable corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the longitudinal offset of the distance from the centroid of the vehicle; Y is the dependent variable corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the lateral offset of the distance from the centroid of the vehicle; C2 is the quadratic coefficient, C1 is the linear coefficient, and C0' is a constant term.
Then, the coordinates of the several reference targets are substituted into the quadratic relation (1), and the quadratic coefficient C2, the linear coefficient C1 and the constant term C0' in relation (1) are calculated. Relation (1) is thus fully determined, that is, the reference target arrangement curve is determined.
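The fit of relation (1) can be sketched with an ordinary least-squares polynomial fit; this is an illustrative implementation choice (numpy's `polyfit`), not specified by the patent.

```python
import numpy as np

def fit_reference_curve(xs, ys):
    # Relation (1), Y = C2*X^2 + C1*X + C0', has three unknowns, so at
    # least three reference targets are needed; with more targets,
    # np.polyfit performs a least-squares fit.
    if len(xs) < 3:
        raise ValueError("at least three reference targets are required")
    c2, c1, c0_prime = np.polyfit(xs, ys, deg=2)
    return c2, c1, c0_prime
```

With more than three reference targets the fit averages out individual placement tolerances, which is consistent with the remark that more reference targets give a more accurate curve.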
In yet another embodiment, the turning radius of the vehicle may also be calculated from its current yaw rate, and the quadratic coefficient C2 in relation (1) calculated from the turning radius. Relation (1) is thereby reduced to a linear relation, and the linear coefficient C1 and the constant term C0' can then be calculated from the coordinates of the reference targets. Relation (1) is thus determined, that is, the reference target arrangement curve is determined.
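This variant can be sketched as follows, under the assumption (not stated explicitly in the text) that a circular arc of turning radius R = v/ω is approximated near the vehicle by Y ≈ X²/(2R), giving C2 ≈ ω/(2v):

```python
import numpy as np

def fit_with_known_curvature(xs, ys, speed_mps, yaw_rate_rps):
    # Turning radius R = v / omega, and a circular arc satisfies
    # Y ≈ X**2 / (2 * R) near the vehicle, so C2 ≈ yaw_rate / (2 * speed).
    c2 = 0.5 * yaw_rate_rps / speed_mps
    # With C2 fixed, relation (1) reduces to the linear relation
    # Y - C2 * X**2 = C1 * X + C0'.
    residual = np.asarray(ys) - c2 * np.asarray(xs) ** 2
    c1, c0_prime = np.polyfit(xs, residual, deg=1)
    return c2, c1, c0_prime
```

Fixing C2 removes one unknown, so the remaining fit is better conditioned and needs fewer reference targets.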
In an embodiment, the processing component 130 is provided with a road edge estimation unit 133 for estimating a road edge curve based on the reference target arrangement curve. In one embodiment, the road edge estimation unit 133 obtains the corresponding road edge curve as follows: the quadratic relation (2) below is obtained from the quadratic relation (1) as the road edge curve:
Y = C2 × X² + C1 × X + C0    (2)
where C0 is a constant term, C0 = C0' + D, and D is a distance constant of the road edge with respect to the stationary objects arranged approximately regularly beside it. For example, the distance constant D of the stationary objects corresponding to the left road edge 901a (trees 801, utility poles 802, isolation piers 803, etc.) with respect to the left road edge 901a is estimated in advance; in general, there are corresponding specifications governing the distances of trees 801, utility poles 802 and isolation piers 803 from the road edge, and the distance constant D (for example, 0.5 m) can be estimated on the basis of these specifications.
In this way, the quadratic function relation (2) is determined, i.e. the road edge curve is determined.
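The step from relation (1) to relation (2) is a constant lateral shift and can be sketched in a few lines. The default D = 0.5 m follows the example in the text; the sign convention for D is an assumption that depends on which side of the targets the edge lies.

```python
def road_edge_curve(c2, c1, c0_prime, d=0.5):
    # Relation (2): C0 = C0' + D. The quadratic and linear coefficients are
    # unchanged; only the constant term is shifted by the distance constant D
    # between the regularly arranged stationary objects and the road edge.
    c0 = c0_prime + d
    return lambda x: c2 * x * x + c1 * x + c0
```

For a straight arrangement curve with C0' = 3 m, the resulting edge curve is the parallel line at 3.5 m.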
It should be noted that although the above examples are described with reference to determining the reference target arrangement curve and the road edge curve in a quadratic functional relationship, it should be understood that the reference target arrangement curve and the road edge curve may also be determined based on a higher-order functional relationship (for example, a cubic functional relationship or a quartic functional relationship), and of course, the higher the order of the functional relationship, the larger the number of reference targets required.
The determination device of the above embodiment determines road edge information based on stationary objects at the sides of the road and does not depend on lane lines; it is therefore very suitable for acquiring road edge information on unstructured roads (for example, roads with blurred, faded or missing lane lines). Nor does it depend on an image sensor, so the problem that road edge information cannot be obtained accurately for distant road sections is eliminated, and road edge information can be obtained relatively accurately even at long distances.
The determination device of the above embodiment may be applied to the vehicle 100 having an automatic driving system that can give not only a near-end travelable region but also a relatively accurate far-end travelable region based on the road edge curve provided by the determination device, wherein the algorithm for determining the travelable region based on the road edge curve is not limited.
As shown in fig. 1, in yet another embodiment, the determination device may further include an image sensor 120, which may be installed, for example, at the position of the rear-view mirror inside the vehicle. The image sensor 120 may be a camera or the like and can acquire in real time lane-line image information of the lane lines of the road 900 (if they exist). In practical applications, of course, the image information acquired by the image sensor 120 is not limited to lane-line image information and may also include, for example, image information of vehicles ahead, pedestrians, obstacles and the like.
In yet another embodiment, the processing component 130 in the determination device also receives the above lane-line image information and further calculates a road edge curve of the road 900 in real time based on it; image processing and the calculation of road edge curves from lane-line image information are well known in the art and will not be described in detail herein. The processing component 130 may thus obtain two road edge curves from the two mechanisms, and may determine the road edge curve of the road 900 from these two curves according to the scenario.
For example, in a scenario where the lane lines of the road 900 exist and stationary targets approximately regularly arranged with respect to the road 900 are present beside it, the two mechanisms may be combined: for the near road section the road edge curve calculated from the lane-line image information may be used, and for the far road section the road edge curve calculated from the stationary targets may be used, overcoming the inaccuracy of the lane-line-based road edge curve in distant sections.
For another example, in a scenario where part of the lane lines of the road 900 is missing or unclear and stationary targets approximately regularly arranged with respect to the road 900 are present beside it, the road edge curve calculated from the stationary targets may be used for the sections where lane lines are missing or unclear. This overcomes the problem that on certain road sections a road edge curve cannot be obtained, or cannot be obtained accurately, based on the image sensor 120.
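The scenario-dependent selection between the two curves can be sketched as a piecewise rule. This is an illustrative assumption, not the patent's fusion logic: the 40 m switch-over stands in for the camera's effective detection range, and a missing source is modelled as `None`.

```python
def fused_edge_curve(lane_curve, radar_curve, camera_range_m=40.0):
    # lane_curve: edge curve from lane-line image processing (or None).
    # radar_curve: edge curve from the stationary-target mechanism (or None).
    def edge(x):
        # Near range: prefer the camera-derived curve when available.
        if lane_curve is not None and x <= camera_range_m:
            return lane_curve(x)
        # Far range, or missing/unclear lane lines: fall back to radar.
        if radar_curve is not None:
            return radar_curve(x)
        return lane_curve(x) if lane_curve is not None else None
    return edge
```

When lane lines are absent the radar-derived curve is used over the whole range; otherwise it takes over only beyond the camera's reach.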
Fig. 3 is a flow chart illustrating a method of determining a road edge according to an embodiment of the present invention. A method of determining a road edge according to an embodiment of the present invention is illustrated with reference to fig. 1 to 3.
First, in step S310, stationary objects beside the road edge of the road on which the vehicle is located are detected.
This step S310 may be implemented in the radar detector 110, such as a millimeter-wave radar. The radar detector 110 detects stationary targets at at least one of the two sides of the road 900 on which the vehicle 100 is located (e.g., the left road edge 901a side); in particular, a long-distance object (e.g., 40 meters or more) can be detected about as accurately as a short-distance object. The millimeter-wave radar is configured to determine stationary targets, that is, objects stationary with respect to the road edge, including, for example, the trees 801, utility poles 802 and isolation piers 803, from among the various detected objects based on the Doppler effect. It can therefore output information about stationary objects (for example, coordinates in the vehicle coordinate system) substantially in real time.
Further, in step S320, a reference object is selected from the still objects.
Step S320 is mainly implemented in the screening unit 131 of the processing component 130: at least three stationary targets are screened out as reference targets from among the plurality of stationary targets transmitted by the radar detector 110. The screening principle is to judge whether a stationary target is approximately regularly arranged with respect to the road 900. Specifically, the road 900 may be associated with a predicted travel trajectory, which may be obtained based on the current yaw rate of the vehicle 100 (collected, for example, from the steering system of the vehicle 100); the trees 801, utility poles 802, isolation piers 803 and the like regularly arranged near the left road edge 901a are determined to be approximately regularly arranged with respect to the predicted travel trajectory. The more reference targets there are, the more accurately the road edge curve can be obtained in the subsequent processing.
Further, in step S330, a curve fitting is performed on the reference target to obtain a reference target arrangement curve.
Step S330 is mainly implemented in the target curve fitting unit 132 of the processing component 130. In one embodiment, the reference target arrangement curve to be obtained is defined in advance by a quadratic function, that is, the following relation (1):
Y = C2 × X² + C1 × X + C0'    (1)
where X is the independent variable corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the longitudinal offset of the distance from the centroid of the vehicle; Y is the dependent variable corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the lateral offset of the distance from the centroid of the vehicle; C2 is the quadratic coefficient, C1 is the linear coefficient, and C0' is a constant term.
Then, the coordinates of the several reference targets (for example, the coordinates of the trees 801, utility poles 802 and isolation piers 803 in the vehicle coordinate system) are substituted into the quadratic relation (1), and the quadratic coefficient C2, the linear coefficient C1 and the constant term C0' in relation (1) are calculated to obtain relation (1), that is, the reference target arrangement curve is determined.
In still another embodiment, the turning radius of the vehicle may also be calculated from its current yaw rate, so as to calculate the quadratic coefficient C2 in relation (1). Relation (1) is thereby reduced to a linear relation, and the linear coefficient C1 and the constant term C0' can then be calculated from the coordinates of the reference targets. Relation (1) is thus determined, that is, the reference target arrangement curve is determined.
Further, in step S340, a road edge curve is estimated based on the reference target arrangement curve.
This step S340 is mainly implemented in the road edge estimation unit 133 of the processing component 130. In one embodiment, the corresponding road edge curve is obtained as follows: the following quadratic relation (2) is obtained from the quadratic relation (1) as the road edge curve:
Y = C2 × X² + C1 × X + C0    (2)
where C0 is a constant term, C0 = C0' + D, and D is a distance constant of the road edge with respect to the stationary objects arranged approximately regularly beside it. For example, the distance constant D of the stationary objects corresponding to the left road edge 901a (trees 801, utility poles 802, isolation piers 803, etc.) with respect to the left road edge 901a is estimated in advance; in general, there are corresponding specifications governing the distances of trees 801, utility poles 802 and isolation piers 803 from the road edge, and the distance constant D (for example, 0.5 m) can be estimated on the basis of these specifications.
In this way, the quadratic function relation (2) is determined, i.e. the road edge curve is determined.
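The steps S310 to S340 above can be sketched end to end. This is a minimal illustration under stated assumptions, not the patent's implementation: step S310 is assumed done (the radar's stationary detections arrive as (X, Y) coordinates), the screening tolerance of 2 m and the distance constant D = 0.5 m are illustrative, and the predicted trajectory is the yaw-rate arc approximation.

```python
import numpy as np

def estimate_road_edge(targets, speed_mps, yaw_rate_rps, d=0.5,
                       lateral_tol_m=2.0):
    # S320: keep targets whose lateral offset from the predicted travel
    # trajectory (yaw-rate arc Y ≈ 0.5 * k * X**2, k = yaw_rate / speed)
    # is roughly constant, i.e. roughly regularly arranged along the road.
    k = yaw_rate_rps / speed_mps if speed_mps > 1e-3 else 0.0
    offs = [(x, y, y - 0.5 * k * x * x) for (x, y) in targets]
    med = sorted(o for (_, _, o) in offs)[len(offs) // 2]
    refs = [(x, y) for (x, y, o) in offs if abs(o - med) <= lateral_tol_m]
    if len(refs) < 3:
        return None  # not enough reference targets for the fit
    # S330: fit relation (1) to the reference targets.
    xs, ys = zip(*refs)
    c2, c1, c0_prime = np.polyfit(xs, ys, deg=2)
    # S340: offset the constant term by D to obtain relation (2).
    return lambda x: c2 * x * x + c1 * x + (c0_prime + d)
```

On a straight road with targets lined up 3 m to the left and one spurious detection, the estimated edge curve evaluates to about 3.5 m, i.e. the target line shifted by D.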
The method of determining a road edge according to the embodiment shown in fig. 3 relies on neither lane lines nor an image sensor; it is therefore very suitable for acquiring road edge information on unstructured roads (for example, roads with blurred, faded or missing lane lines), and road edge information can be acquired relatively accurately even at long distances.
Herein, the terms "short range" and "long range" are based essentially on the effective detection range of the radar detector and that of the image sensor, respectively; in general, the effective detection range of the radar detector is farther than that of the image sensor. In the present application, therefore, a distance less than or equal to the effective detection distance of the image sensor is defined as "short range", and a distance exceeding it is defined as "long range". It is to be understood that the division between "short range" and "long range" is not based on a fixed distance value: the effective detection distances of different models of image sensor differ and, as image sensor technology develops, newer image sensors may detect even farther.
The above examples mainly illustrate the apparatus and method of determining road edges of the present invention. Although only a few embodiments of the present invention have been described, those skilled in the art will appreciate that the present invention may be embodied in many other forms without departing from the spirit or scope thereof. Accordingly, the present examples and embodiments are to be considered as illustrative and not restrictive, and various modifications and substitutions may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims.

Claims (13)

1. An apparatus for determining a road edge, comprising:
a radar detector mounted on the vehicle and capable of detecting at least stationary objects beside a road edge of a road on which the vehicle is located; and
a processing component configured to: receive the stationary objects detected by the radar detector, extract arrangement information of the stationary objects that are arranged approximately regularly with respect to the road, and obtain road edge information based on the arrangement information;
wherein the processing component comprises:
a screening unit for screening out, from among the stationary objects, three or more stationary objects arranged approximately regularly with respect to the road as reference targets;
a target curve fitting unit for performing curve fitting on the three or more reference targets in a vehicle coordinate system to obtain a corresponding reference target arrangement curve; and
a road edge estimation unit for estimating a first road edge curve based on the reference target arrangement curve;
wherein the screening unit is configured to: calculate a predicted travel track of the vehicle based on the current yaw rate of the vehicle, and screen out a corresponding stationary object as a reference target based on whether the stationary object is arranged approximately regularly with respect to the predicted travel track.
2. The apparatus of claim 1, wherein the reference target arrangement curve is the following quadratic function relation (1):
Y = C2 × X² + C1 × X + C0′ (1)
wherein X is an independent variable corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the offset of the distance from the center of mass of the vehicle in the longitudinal direction; Y is a dependent variable corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the offset of the distance from the center of mass of the vehicle in the lateral direction; C2 is the quadratic term coefficient, C1 is the first order term coefficient, and C0′ is a constant term;
wherein the quadratic term coefficient C2, the first order term coefficient C1 and the constant term C0′ in the quadratic function relation (1) are calculated based on the coordinate information of the reference targets in the vehicle coordinate system;
wherein the current yaw rate of the vehicle is used to calculate the current turning radius of the vehicle so as to obtain the quadratic term coefficient C2;
wherein the road edge estimation unit is further configured to: obtain the following quadratic function relation (2) from the quadratic function relation (1) as the first road edge curve:
Y = C2 × X² + C1 × X + C0 (2)
wherein C0 is a constant term, C0 = C0′ + D, and D is a distance constant between the road edge and the stationary objects arranged approximately regularly beside it.
3. The apparatus of claim 1, wherein the radar detector is a millimeter wave radar.
4. The apparatus of claim 1, wherein the apparatus further comprises: an image sensor mounted on the vehicle for acquiring lane line image information of the road;
wherein the processing component is further configured to: calculate a second road edge curve of the road based on the lane line image information, and determine the road edge curve of the road based on the first road edge curve and the second road edge curve.
5. The apparatus of claim 4, wherein the processing component is further configured to: adopt the second road edge curve as the road edge curve of the road at near distance, and adopt the first road edge curve as the road edge curve of the road at far distance.
6. The apparatus of claim 4, wherein the processing component is further configured to: adopt the first road edge curve as the road edge curve of the road on road sections where the lane line is missing or unclear.
7. A method of determining a road edge, comprising the steps of:
(a) detecting a stationary object beside a road edge of a road on which a vehicle is located; and
(b) extracting arrangement information of stationary objects that are arranged approximately regularly with respect to the road, and acquiring road edge information based on the arrangement information;
wherein step (b) comprises the sub-steps of:
(b1) screening out, from among the stationary objects, three or more stationary objects arranged approximately regularly with respect to the road as reference targets;
(b2) performing curve fitting on the three or more reference targets in a vehicle coordinate system to obtain a corresponding reference target arrangement curve; and
(b3) estimating a first road edge curve based on the reference target arrangement curve;
wherein in sub-step (b1), a predicted travel track of the vehicle is calculated based on the current yaw rate of the vehicle, and a stationary object is screened out as a reference target based on whether the stationary object is arranged approximately regularly with respect to the predicted travel track.
8. The method of claim 7, wherein in step (b2), the reference target arrangement curve is the following quadratic function relation (1):
Y = C2 × X² + C1 × X + C0′ (1)
wherein X is an independent variable corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the offset of the distance from the center of mass of the vehicle in the longitudinal direction; Y is a dependent variable corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the offset of the distance from the center of mass of the vehicle in the lateral direction; C2 is the quadratic term coefficient, C1 is the first order term coefficient, and C0′ is a constant term;
wherein the quadratic term coefficient C2, the first order term coefficient C1 and the constant term C0′ in the quadratic function relation (1) are calculated based on the coordinate information of the reference targets in the vehicle coordinate system;
wherein the current turning radius of the vehicle is calculated based on the current yaw rate of the vehicle to derive the quadratic term coefficient C2;
wherein in step (b3), the following quadratic function relation (2) is obtained from the quadratic function relation (1) as the first road edge curve:
Y = C2 × X² + C1 × X + C0 (2)
wherein C0 is a constant term, C0 = C0′ + D, and D is a distance constant between the road edge and the stationary objects arranged approximately regularly beside it.
9. The method of claim 7, further comprising the step of:
acquiring lane line image information of the road;
calculating to obtain a second road edge curve of the road based on the lane line image information; and
determining a road edge curve of the road based on the first road edge curve and a second road edge curve.
10. The method of claim 9, wherein in the step of determining the road edge curve of the road, the second road edge curve is adopted as the road edge curve at near distance, and the first road edge curve is adopted as the road edge curve at far distance.
11. The method of claim 10, wherein in the step of determining the road edge curve of the road, the first road edge curve is adopted as the road edge curve on road sections where the lane line is missing or unclear.
12. An automatic driving system for a vehicle, comprising the apparatus for determining a road edge according to any one of claims 1 to 6.
13. A vehicle provided with an automatic driving system, characterized in that the automatic driving system comprises the apparatus for determining a road edge according to any one of claims 1 to 6.
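As an illustrative sketch of the screening recited in claims 1 and 7 (sub-step (b1)): the predicted travel track can be approximated by an arc whose curvature comes from the current yaw rate and speed, and stationary targets kept as reference targets when their lateral offset from that track is approximately constant. The tolerance value, the function names, and the median-offset heuristic are assumptions for illustration, not taken from the patent:

```python
def predicted_lateral(x, curvature):
    """Predicted travel track as a small-angle arc: y ≈ 0.5 * curvature * x^2."""
    return 0.5 * curvature * x * x

def screen_reference_targets(stationary_xy, yaw_rate, speed, tol=0.5):
    """Keep stationary targets whose lateral offset from the predicted travel
    track is approximately constant, i.e. that are arranged approximately
    regularly alongside it. Returns the kept targets if at least three
    qualify (enough for a quadratic fit), otherwise an empty list."""
    curvature = yaw_rate / speed  # arc curvature from yaw rate and speed
    offsets = [y - predicted_lateral(x, curvature) for x, y in stationary_xy]
    median = sorted(offsets)[len(offsets) // 2]  # robust typical offset
    kept = [p for p, off in zip(stationary_xy, offsets)
            if abs(off - median) <= tol]
    return kept if len(kept) >= 3 else []
```

For example, three targets sitting a constant 3.5 m beside the predicted arc survive the screen, while an isolated obstacle far off that offset is rejected.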
CN201710196345.2A 2017-03-29 2017-03-29 Device and method for determining road edge Active CN106991389B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710196345.2A CN106991389B (en) 2017-03-29 2017-03-29 Device and method for determining road edge
PCT/CN2018/075130 WO2018177026A1 (en) 2017-03-29 2018-02-02 Device and method for determining road edge

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710196345.2A CN106991389B (en) 2017-03-29 2017-03-29 Device and method for determining road edge

Publications (2)

Publication Number Publication Date
CN106991389A CN106991389A (en) 2017-07-28
CN106991389B true CN106991389B (en) 2021-04-27

Family

ID=59412995

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710196345.2A Active CN106991389B (en) 2017-03-29 2017-03-29 Device and method for determining road edge

Country Status (2)

Country Link
CN (1) CN106991389B (en)
WO (1) WO2018177026A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106991389B (en) * 2017-03-29 2021-04-27 蔚来(安徽)控股有限公司 Device and method for determining road edge
CN109895694B (en) * 2017-12-08 2020-10-20 郑州宇通客车股份有限公司 Lane departure early warning method and device and vehicle
CN108573272B (en) * 2017-12-15 2021-10-29 蔚来(安徽)控股有限公司 Lane fitting method
CN108572642B (en) * 2017-12-15 2022-02-18 蔚来(安徽)控股有限公司 Automatic driving system and transverse control method thereof
CN108693517B (en) * 2018-05-22 2020-10-09 森思泰克河北科技有限公司 Vehicle positioning method and device and radar
US11035943B2 (en) * 2018-07-19 2021-06-15 Aptiv Technologies Limited Radar based tracking of slow moving objects
CN109254289B (en) * 2018-11-01 2021-07-06 百度在线网络技术(北京)有限公司 Detection method and detection equipment for road guardrail
CN110174113B (en) * 2019-04-28 2023-05-16 福瑞泰克智能***有限公司 Positioning method, device and terminal for vehicle driving lane
CN110244696A (en) * 2019-06-24 2019-09-17 北京经纬恒润科技有限公司 Vehicle body crosswise joint method and electronic control unit ECU
CN110320504B (en) * 2019-07-29 2021-05-18 浙江大学 Unstructured road detection method based on laser radar point cloud statistical geometric model
CN111198370B (en) * 2020-01-02 2022-07-08 北京百度网讯科技有限公司 Millimeter wave radar background detection method and device, electronic equipment and storage medium
CN111289980B (en) * 2020-03-06 2022-03-08 成都纳雷科技有限公司 Roadside stationary object detection method and system based on vehicle-mounted millimeter wave radar
CN112132109A (en) * 2020-10-10 2020-12-25 北京百度网讯科技有限公司 Lane line processing and lane positioning method, device, equipment and storage medium
CN112597839B (en) * 2020-12-14 2022-07-08 上海宏景智驾信息科技有限公司 Road boundary detection method based on vehicle-mounted millimeter wave radar
CN112949489B (en) * 2021-03-01 2023-05-12 成都安智杰科技有限公司 Road boundary identification method and device, electronic equipment and storage medium
CN113167886B (en) * 2021-03-02 2022-05-31 华为技术有限公司 Target detection method and device
CN113525368A (en) * 2021-06-23 2021-10-22 清华大学 Lane keeping emergency control strategy and safety control method and device for vehicle
CN113879312B (en) * 2021-11-01 2023-02-28 无锡威孚高科技集团股份有限公司 Forward target selection method and device based on multi-sensor fusion and storage medium
CN114067120B (en) * 2022-01-17 2022-09-20 腾讯科技(深圳)有限公司 Augmented reality-based navigation paving method, device and computer readable medium
CN114475614A (en) * 2022-03-21 2022-05-13 中国第一汽车股份有限公司 Method, device, medium and equipment for screening dangerous targets
CN114872712B (en) * 2022-06-29 2022-10-18 小米汽车科技有限公司 Static vehicle detection method, device, equipment, vehicle and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
US6728392B1 (en) * 2001-01-30 2004-04-27 Navigation Technologies Corp. Shape comparison using a rotational variation metric and applications thereof
CN101210825A (en) * 2006-12-27 2008-07-02 爱信艾达株式会社 Map information generating systems
CN104002809A (en) * 2014-05-28 2014-08-27 长安大学 Vehicle fork road segment detection device and detection method

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
CN102275587B (en) * 2011-06-07 2015-12-09 长安大学 A kind of following vehicle collision danger monitoring device and monitoring method thereof
KR101787996B1 (en) * 2013-04-11 2017-10-19 주식회사 만도 Apparatus of estimating traffic lane in vehicle, control method of thereof
EP3159866B1 (en) * 2014-06-19 2022-04-13 Hitachi Astemo, Ltd. Object recognition apparatus and vehicle travel controller using same
CN105404844B (en) * 2014-09-12 2019-05-31 广州汽车集团股份有限公司 A kind of Method for Road Boundary Detection based on multi-line laser radar
CN106476689A (en) * 2015-08-27 2017-03-08 长城汽车股份有限公司 A kind of road limit for width alert device and method for vehicle
CN105711588B (en) * 2016-01-20 2018-05-11 奇瑞汽车股份有限公司 A kind of track keeps auxiliary system and track to keep householder method
CN105922991B (en) * 2016-05-27 2018-08-17 广州大学 Based on the lane departure warning method and system for generating virtual lane line
CN106326850A (en) * 2016-08-18 2017-01-11 宁波傲视智绘光电科技有限公司 Fast lane line detection method
CN106991389B (en) * 2017-03-29 2021-04-27 蔚来(安徽)控股有限公司 Device and method for determining road edge


Non-Patent Citations (3)

Title
Use of Salient Features for the Design of a Multistage Framework to Extract Roads From High-Resolution Multispectral Satellite Images; Das S et al.; IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING; Oct. 2011; pp. 3906-3931 *
Curved orchard road detection using an improved random sample consensus algorithm; Lin Guichao et al.; Transactions of the Chinese Society of Agricultural Engineering; Feb. 2015; Vol. 31, No. 4; pp. 168-174 *
Research on unstructured road recognition and fitting algorithms; Cao Xuguang; China Masters' Theses Full-text Database, Engineering Science and Technology II; Jan. 2016; C034-679 *

Also Published As

Publication number Publication date
CN106991389A (en) 2017-07-28
WO2018177026A1 (en) 2018-10-04

Similar Documents

Publication Publication Date Title
CN106991389B (en) Device and method for determining road edge
US9846812B2 (en) Image recognition system for a vehicle and corresponding method
Jung et al. A lane departure warning system using lateral offset with uncalibrated camera
US7327855B1 (en) Vision-based highway overhead structure detection system
US9292750B2 (en) Method and apparatus for detecting traffic monitoring video
US10074021B2 (en) Object detection apparatus, object detection method, and program
US20030011509A1 (en) Method for detecting stationary object on road
JP2021510227A (en) Multispectral system for providing pre-collision alerts
CN110809792B (en) Collision estimation device and collision estimation method
CN110008891B (en) Pedestrian detection positioning method and device, vehicle-mounted computing equipment and storage medium
Sehestedt et al. Robust lane detection in urban environments
CN112149460A (en) Obstacle detection method and device
US20190362512A1 (en) Method and Apparatus for Estimating a Range of a Moving Object
CN114415171A (en) Automobile travelable area detection method based on 4D millimeter wave radar
CN115856872A (en) Vehicle motion track continuous tracking method
Hultqvist et al. Detecting and positioning overtaking vehicles using 1D optical flow
CN114084129A (en) Fusion-based vehicle automatic driving control method and system
CN107886036B (en) Vehicle control method and device and vehicle
CN115512542B (en) Track restoration method and system considering shielding based on roadside laser radar
Oniga et al. A fast ransac based approach for computing the orientation of obstacles in traffic scenes
CN114037977B (en) Road vanishing point detection method, device, equipment and storage medium
EP3288260B1 (en) Image processing device, imaging device, equipment control system, equipment, image processing method, and carrier means
KR102385907B1 (en) Method And Apparatus for Autonomous Vehicle Navigation System
CN107256382A (en) Virtual bumper control method and system based on image recognition
CN108416305B (en) Pose estimation method and device for continuous road segmentation object and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200731

Address after: Susong Road West and Shenzhen Road North, Hefei Economic and Technological Development Zone, Anhui Province

Applicant after: Weilai (Anhui) Holding Co., Ltd

Address before: Room 502, Minsheng Bank Building, 12 Cecil Harcourt Road, central, Hongkong, China

Applicant before: NIO NEXTEV Ltd.

GR01 Patent grant