US9550529B2 - Apparatus and method for recognizing driving field of vehicle - Google Patents

Apparatus and method for recognizing driving field of vehicle

Info

Publication number
US9550529B2
Authority
US
United States
Prior art keywords
vehicle
lane
candidate group
road
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US14/323,350
Other languages
English (en)
Other versions
US20150175204A1 (en)
Inventor
Young Chul Oh
Tae Sung Choi
Byung Yong YOU
Chang Young JUNG
Su Rim Kwon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hyundai Motor Co
Original Assignee
Hyundai Motor Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hyundai Motor Co filed Critical Hyundai Motor Co
Assigned to HYUNDAI MOTOR COMPANY. Assignment of assignors interest (see document for details). Assignors: CHOI, TAE SUNG; JUNG, CHANG YOUNG; KWON, SU RIM; OH, YOUNG CHUL; YOU, BYUNG YONG
Publication of US20150175204A1 publication Critical patent/US20150175204A1/en
Application granted granted Critical
Publication of US9550529B2 publication Critical patent/US9550529B2/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/029Steering assistants using warnings or proposing actions to the driver without influencing the steering system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18Propelling the vehicle
    • B60W30/18009Propelling the vehicle related to particular drive situations
    • B60W30/18163Lane change; Overtaking manoeuvres
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • B60W2552/50Barriers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2300/00Purposes or special features of road vehicle drive control systems
    • B60Y2300/18Propelling the vehicle
    • B60Y2300/18008Propelling the vehicle related to particular drive situations
    • B60Y2300/18166Overtaking, changing lanes

Definitions

  • the present invention relates to an apparatus and method for recognizing a driving field of a vehicle, and more particularly, to an apparatus and method that recognize a driving field of a vehicle by extracting the locations of road lanes, other vehicles, and a guard rail adjacent to the vehicle using a global positioning system (GPS), an image sensor, and a radar sensor to detect the driving field of the vehicle.
  • a driving field refers to the area along the path on which the vehicle is traveling.
  • a driving field of the vehicle is estimated using a global positioning system (GPS) installed in a navigation device, for guidance for stopping on a road shoulder, accurate guidance for entering and exiting an interchange (IC)/junction (JC), and accurate guidance along a path to a destination, or is estimated using sensors installed within the vehicle, such as an image recognition sensor, a radar sensor, or the like.
  • when a driving field of a vehicle is estimated using a GPS, it may be difficult to estimate the accurate lane of the road on which the vehicle drives due to errors of the GPS.
  • when a driving field of a vehicle is estimated using a sensor such as an image recognition sensor, a radar sensor, or the like, it may be difficult to estimate a road lane on which numerous vehicles are present and to estimate a lane while the vehicle is being driven in an intermediate lane on a road having a plurality of lanes.
  • the present invention provides an apparatus and method for recognizing a driving field of a vehicle by extracting the locations of road lanes, other vehicles, and a guard rail adjacent to the vehicle using a global positioning system (GPS), an image sensor, and a radar sensor to check the driving field of the vehicle.
  • the present invention also provides an apparatus and method for recognizing a driving field of a vehicle that track whether a lane is changed in real time to detect the driving field of the vehicle.
  • an apparatus for recognizing a driving field of a vehicle may include a sensor configured to sense a location of a vehicle driving on a road and sense whether an object adjacent to the vehicle is present, a controller configured to detect whether the object is present and whether a lane of the road on which the vehicle is driven is changed to detect a final lane candidate group on which the vehicle is positioned, and an output unit executed by the controller to display the final lane candidate group.
  • the controller may also be configured to set, on the right and left sides of the vehicle, a virtual road having the same number of lanes as the detected road, containing the lane of the vehicle (e.g., the lane in which the vehicle is being driven), and a virtual road having at least one lane, and to detect a previous lane candidate group based on the detected object.
  • the controller may be configured to detect a first lane candidate group, using the same method as the method for detecting the previous lane candidate group, when the lane of the road on which the vehicle is being driven is changed, and to reduce or increase the number of lanes by as much as the changed number from the previous lane candidate group to detect a second lane candidate group.
  • the controller may be configured to detect the first lane candidate group, using the same method as the method for detecting the previous lane candidate group, when the lane of the road on which the vehicle is being driven is not changed, and to set the previous lane candidate group as the second lane candidate group.
  • the controller may also be configured to combine the first lane candidate group and the second lane candidate group to detect the final lane candidate group.
  • the object may include a vehicle object adjacent to the vehicle and a still object including a guard rail and a median strip positioned on the road on which the vehicle is positioned.
  • a method for recognizing a driving field of a vehicle may include entering, by a controller, a driving field recognition mode according to external input, detecting, by the controller, a position of a vehicle driving a road, detecting, by the controller, whether an object is present adjacent to the vehicle, determining, by the controller, whether to change a lane of a road on which the vehicle is being driven, combining, by the controller, results to detect a final lane candidate group on which the vehicle is positioned, and displaying, by the controller, the detected final lane candidate group.
  • the method may further include detecting, by the controller, a previous lane candidate group after the detecting whether the object is present.
  • the detection of the previous lane candidate group may include detecting a lane of a road on which the vehicle is positioned, additionally setting a virtual road having the same number of lanes as the detected lane of the road on the right and left side of the vehicle, and a virtual road having at least one lane, and detecting the previous lane candidate group based on the object on the virtual road.
  • the method may further include detecting, by the controller, a first lane candidate group and a second lane candidate group after the detection of whether to change the lane of the road.
  • the detection of the first lane candidate group and the second lane candidate group may include detecting, by the controller, a first lane candidate group using the same method as the method for detecting the previous lane candidate group in response to detecting that the lane of the road on which the vehicle is being driven is changed, and reducing or increasing the number of lanes by as much as the changed number from the previous lane candidate group to detect the second lane candidate group; and detecting the first lane candidate group using the same method in response to detecting that the lane of the road is not changed, and setting the previous lane candidate group as the second lane candidate group.
  • the detection of the final lane candidate group may include combining the first lane candidate group and the second lane candidate group to detect the final lane candidate group.
  • FIG. 1 is an exemplary diagram illustrating a main configuration of a driving field recognizing apparatus according to an exemplary embodiment of the present invention
  • FIG. 2 is an exemplary flowchart illustrating a method of recognizing a driving field of a vehicle according to an exemplary embodiment of the present invention
  • FIGS. 3A-3B are exemplary diagrams illustrating a method for recognizing a driving field of a vehicle when a still object adjacent to the vehicle is not detected, according to an exemplary embodiment of the present invention
  • FIGS. 4A-4B are exemplary diagrams illustrating a method for recognizing a driving field of a vehicle when a still object adjacent to the vehicle is detected, according to an exemplary embodiment of the present invention
  • FIGS. 5A-5B are exemplary diagrams illustrating a method for recognizing a driving field of a vehicle when a lane of a road on which the vehicle is being driven is maintained, according to an exemplary embodiment of the present invention.
  • FIGS. 6A-6B are exemplary diagrams illustrating a method for recognizing a driving field of a vehicle when a lane of a road on which the vehicle is being driven is changed, according to an exemplary embodiment of the present invention.
  • the term "vehicle" or "vehicular" or other similar term as used herein is inclusive of motor vehicles in general, such as passenger automobiles including sports utility vehicles (SUV), buses, trucks, various commercial vehicles, watercraft including a variety of boats and ships, aircraft, and the like, and includes hybrid vehicles, electric vehicles, combustion vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles, and other alternative fuel vehicles (e.g., fuels derived from resources other than petroleum).
  • the term "controller/control unit" refers to a hardware device that includes a memory and a processor.
  • the memory is configured to store the modules and the processor is specifically configured to execute said modules to perform one or more processes which are described further below.
  • control logic of the present invention may be embodied as non-transitory computer readable media on a computer readable medium containing executable program instructions executed by a processor, controller/control unit or the like.
  • the computer readable mediums include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards and optical data storage devices.
  • the computer readable recording medium can also be distributed in network coupled computer systems so that the computer readable media is stored and executed in a distributed fashion, e.g., by a telematics server or a Controller Area Network (CAN).
  • FIG. 1 is an exemplary diagram illustrating a main configuration of a driving field recognizing apparatus 100 according to an exemplary embodiment of the present invention.
  • the driving field recognizing apparatus 100 may include a communicator 110 , a sensor 120 , an input unit 130 , an output unit 140 , a storage 150 , and a controller 160 .
  • in the exemplary embodiment, the driving field recognizing apparatus 100 is used, but embodiments of the present invention are not limited thereto.
  • the exemplary embodiments of the present invention may be applied to audio, video, and navigation (AVN) devices installed within a vehicle.
  • the communicator 110 may be configured to perform controller area network (CAN) communication for communication between the sensor 120 and the controller 160 .
  • the sensor 120 may be configured to sense a location of a vehicle being driven on a road and sense an object adjacent to the vehicle. Accordingly, the sensor 120 may include a global positioning system (GPS) sensor, an image sensor, and a radar sensor.
  • the GPS sensor may be installed within the vehicle and may be configured to convert the measured location of the vehicle into a substantially accurate coordinate value according to operation of the controller 160.
  • the image sensor may be configured to obtain image data of the front of a road on which the vehicle is being driven based on operation of the controller 160 .
  • the radar sensor may be installed in the front of the vehicle and may be configured to measure the presence of a vehicle object and a still object, including a median strip and a guardrail, adjacent to the vehicle.
  • the radar sensor may use light detection and ranging (LiDAR) laser radar.
  • the radar sensor is not limited thereto, and thus, may use various sensors corresponding thereto.
  • the input unit 130 may be configured to receive a signal for entrance of the vehicle to a driving field recognition mode from a driver.
  • the input unit 130 may be configured by a keypad, a touchpad, a touchscreen, or the like.
  • the input unit 130 may also perform a function of the output unit 140 .
  • the output unit 140 may be configured to output the image data acquired by the image sensor, provide the image data to the driver, and output information about a confirmed lane of a road on which the vehicle is being driven based on the operation of the controller 160 .
  • the storage 150 may be configured to store map data received from a map server (not shown) for providing the map data, extract map data that corresponds to a current location of the vehicle, and provide the map data to the controller 160 according to operation of the controller 160 .
  • the storage 150 may be configured to store a program and the like for recognition of information of the lane of the road on which the vehicle is being driven.
  • the storage 150 may be configured to store a previous lane candidate group of the vehicle, detected by the controller 160 .
  • the controller 160 may be configured to detect presence of an object and whether to change a lane of a road on which the vehicle is being driven to detect a final lane candidate group of a road in which the vehicle is positioned. In particular, upon receiving a signal for entrance to a driving field recognition mode from the input unit 130 , the controller 160 may be configured to enter the driving field recognition mode and operate the sensor 120 . The controller 160 may be configured to operate the GPS sensor, the image sensor, and the radar sensor, included within the sensor 120 , to receive sensing information from each sensor. The controller 160 may be configured to detect a current location of the vehicle from the sensing information received from the GPS sensor and access map data that corresponds to the detected current location from the storage 150 to determine the number of lanes of the road on which the vehicle is being driven.
  • the controller 160 may be configured to set a driving field for detection of an object.
  • the controller 160 may be configured to additionally set a virtual road having the same number of lanes as the detected number of lanes and a virtual road having at least one lane to set the driving field of the object.
  • the controller 160 may be configured to virtually generate four lanes containing the vehicle on the left side of the vehicle based on the vehicle position and virtually generate four lanes containing the vehicle on the right side of the vehicle based on the vehicle position.
  • the controller 160 may be configured to virtually generate one lane for a median strip and two lanes for a guardrail and a road shoulder.
  • an object including the median strip and guardrail positioned in a road may be referred to as a still object.
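The virtual driving field described above can be sketched as lane offsets relative to the vehicle. The following is a hypothetical illustration only; the function name and the offset convention (vehicle at offset 0) are assumptions, not taken from the patent:

```python
def build_driving_field(num_lanes):
    """Sketch of the object-detection driving field for a num_lanes road.

    Lanes are expressed as offsets from the vehicle's own lane (offset 0):
    LA: num_lanes virtual lanes on the left, containing the vehicle's lane.
    RA: num_lanes virtual lanes on the right, containing the vehicle's lane.
    CA: one virtual lane beyond LA for the median strip.
    SA: two virtual lanes beyond RA for the guardrail and road shoulder.
    """
    LA = list(range(-(num_lanes - 1), 1))   # e.g. [-3, -2, -1, 0] for 4 lanes
    RA = list(range(0, num_lanes))          # e.g. [0, 1, 2, 3]
    CA = [-num_lanes]                       # median strip
    SA = [num_lanes, num_lanes + 1]         # guardrail and road shoulder
    return {"CA": CA, "LA": LA, "RA": RA, "SA": SA}

field = build_driving_field(4)
```

For a four-lane road this yields four virtual lanes on each side of the vehicle, one median lane, and two guardrail/shoulder lanes, mirroring the CA/LA/RA/SA fields of FIG. 3B.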
  • the controller 160 may be configured to analyze the sensing information received from the radar sensor and the image sensor to detect a vehicle object positioned adjacent to the vehicle and a still object present on the right and left side of the vehicle based on the vehicle position.
  • the controller 160 may be configured to detect whether the still object is present on the right and left side of the vehicle.
  • the controller 160 may be configured to detect the previous lane candidate group based on a moving vehicle object adjacent to the vehicle being driven.
  • the controller 160 may be configured to detect previous lane candidates based on the still object and vehicle object present on the right and left side of the vehicle.
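How a still object narrows the previous lane candidate group can be sketched as follows. This is an illustrative interpretation, assuming lanes are numbered 1 (next to the median strip) through N (next to the guardrail/shoulder); the function and its parameters are hypothetical:

```python
def lane_candidates(num_lanes, median_offset=None, guard_offset=None):
    """Sketch: narrow the lane candidate group using sensed still objects.

    median_offset: number of lanes between the vehicle and the median (>= 1).
    guard_offset:  number of lanes between the vehicle and the guardrail (>= 1).
    When neither is sensed, every lane remains a candidate.
    """
    candidates = set(range(1, num_lanes + 1))
    if median_offset is not None:
        # a median sensed k lanes to the left pins the vehicle to lane k
        candidates &= {median_offset}
    if guard_offset is not None:
        # a guardrail sensed k lanes to the right pins it to lane N + 1 - k
        candidates &= {num_lanes + 1 - guard_offset}
    return candidates
```

Under this convention, a median strip detected two lanes to the left of the vehicle on a four-lane road pins the vehicle to lane 2, while an absent still object leaves the group to be narrowed by adjacent vehicle objects instead.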
  • the controller 160 may be configured to detect whether the vehicle changes a lane.
  • the controller 160 may be configured to detect a first lane candidate group irrespective of whether the vehicle changes a lane. Additionally, the controller 160 may be configured to detect the first lane candidate group using the same method as a method of detecting the previous lane candidate group.
  • the controller 160 may be configured to detect a second lane candidate group. In particular, when the vehicle changes a lane to the left, the controller 160 may be configured to reduce the number of lanes from the previously-detected previous lane candidate group to detect the second lane candidate group.
  • the controller 160 may be configured to increase the number of lanes from the previous lane candidate group to detect the second lane candidate group.
  • the controller 160 may be configured to set the previous lane candidate group as the second lane candidate group.
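The derivation of the second lane candidate group from the previous one can be sketched as a shift of lane numbers. The sign convention (negative for a left lane change) is an assumption for illustration:

```python
def second_candidate_group(previous, num_lanes, lane_change):
    """Sketch: derive the second lane candidate group from the previous one.

    lane_change: -1 for a lane change to the left, +1 to the right,
    0 when the lane is maintained.
    """
    if lane_change == 0:
        # no lane change: the previous group carries over unchanged
        return set(previous)
    # shift every candidate by the changed number of lanes, dropping
    # any lane number that falls outside the road
    return {lane + lane_change
            for lane in previous
            if 1 <= lane + lane_change <= num_lanes}
```

A left lane change thus reduces every candidate lane number by one (candidates {2, 3, 4} become {1, 2, 3} on a four-lane road), and a right change increases them, matching the reduce/increase behavior described above.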
  • the controller 160 may be configured to set a final lane candidate group using a combination of the first lane candidate group and the second lane candidate group and display the final lane candidate group on the output unit 140 .
  • FIG. 2 is an exemplary flowchart illustrating a method of recognizing a driving field of a vehicle according to an exemplary embodiment of the present invention.
  • the controller 160 may be configured to receive a signal for entrance to a driving field recognition mode from the input unit 130 and enter the driving field recognition mode in response to receiving the signal.

  • the controller 160 may be configured to detect whether the previous lane candidate group is stored in the storage 150 . As the detection result of operation S 13 , when the previous lane candidate group is stored in the storage 150 , the controller 160 may proceed to operation S 15 to extract the previous lane candidate group and then proceed to operation S 33 .
  • when the previous lane candidate group is not stored, the controller 160 may proceed to operation S 17 to operate the sensor 120 .
  • the controller 160 may be configured to operate the GPS sensor, the image sensor, and the radar sensor, included within the sensor 120 , to receive sensing information from each sensor.
  • the controller 160 may be configured to detect the number of lanes of a road on which the vehicle is being driven. Accordingly, the controller 160 may be configured to detect a current location of the vehicle from the sensing information received from the GPS sensor and access map data that corresponds to the detected current location from the storage 150 to detect the number of lanes of the road on which the vehicle is being driven.
  • the controller 160 may be configured to set a driving field for detection of an object.
  • the controller 160 may be configured to additionally set a virtual road having the same number of lanes as the detected number of lanes and a virtual road having at least one lane to set the driving field of the object.
  • the controller 160 may be configured to virtually generate four lanes containing the vehicle on the left side of the vehicle being driven based on the vehicle position and virtually generate four lanes containing the vehicle on the right side of the vehicle being driven based on the vehicle position.
  • the controller 160 may be configured to virtually generate one lane for a median strip and two lanes for a guardrail and a road shoulder.
  • an object including the median strip and guardrail positioned in a road may be referred to as a still object.
  • the controller 160 may be configured to detect an object.
  • the controller 160 may be configured to analyze the sensing information received from the radar sensor and the image sensor to detect a vehicle object positioned adjacent to the vehicle being driven and a still object present on the right and left of the vehicle based on the vehicle position.
  • the controller 160 may be configured to detect whether the still object is present on the right and left side of the vehicle based on the detection result of operation S 23 . As the detection result of operation S 25 , when a still object is not present on the right and left side of the vehicle, the controller 160 may proceed to operation S 27 . When the still object is present on the right and left side of the vehicle, the controller 160 may proceed to operation S 29 .
  • the controller 160 may be configured to detect the previous lane candidate group based on a moving vehicle object adjacent to the vehicle being driven and proceed to operation S 31 .
  • the controller 160 may be configured to detect the previous lane candidates based on the still object and moving vehicle object present on the right and left side of the vehicle and proceed to operation S 31 .
  • the controller 160 may be configured to store the previous lane candidates in the storage 150 and proceed to operation S 15 to extract the stored previous lane candidate group. In addition, the controller 160 may proceed to operation S 33 .
  • the controller 160 may be configured to detect a first lane candidate group using the same method as the method for detecting the previous lane candidate group. The first lane candidate group may be detected since the lane of the road on which the vehicle is being driven may be changed and the lane of the road on which a vehicle object adjacent to the vehicle is positioned may be changed.
  • the controller 160 may be configured to detect whether to change a lane of a road on which the vehicle is being driven based on detecting other objects surrounding the vehicle.
  • when the lane of the road is changed, the controller 160 may proceed to operation S 37 .
  • when the lane of the road is not changed, the controller 160 may proceed to operation S 39 .
  • the controller 160 may proceed to operation S 37 to detect the second lane candidate group and then proceed to operation S 41 .
  • the controller 160 may be configured to reduce the number of lanes from the previous lane candidate group by as much as the changed number of lanes to detect the second lane candidate group.
  • the controller 160 may be configured to increase the number of lanes from the previous lane candidate group by as much as the changed number of lanes to detect the second lane candidate group.
  • the controller 160 may proceed to operation S 41 .
  • the controller 160 may proceed to operation S 39 upon detecting that the lane of the road is not changed.
  • the controller 160 may be configured to set the previous lane candidate group extracted in operation S 15 as the second lane candidate group and may proceed to operation S 41 .
  • the controller 160 may be configured to detect a final lane candidate group using a combination of the first lane candidate group and the second lane candidate group.
  • the controller 160 may be configured to display the detected final lane candidate group on the output unit 140 .
  • the controller 160 may be configured to terminate the aforementioned process upon receiving a termination signal for termination of a driving field recognition mode through the input unit 130 , and proceed to operation S 47 when the controller 160 does not receive the termination signal.
  • the controller 160 may be configured to re-set the final lane candidate group detected in operation S 41 as the previous lane candidate group and return to operation S 33 to re-perform the aforementioned operations.
  • a driving field on which the vehicle currently is being driven may be recognized in real time until the driving field recognition mode is terminated.
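The real-time loop of operations S 33 through S 47 can be sketched end to end: each cycle detects a first candidate group from the sensors, derives a second group from the previous group and any lane change, intersects the two, and feeds the result back as the next cycle's previous group. All names, the input format, and the sign convention for lane changes are illustrative assumptions:

```python
def track_lane(cycles, num_lanes):
    """Sketch of the S33-S47 loop. Each element of `cycles` is a pair
    (first_group, lane_change) with lane_change -1/0/+1."""
    previous = set(range(1, num_lanes + 1))  # initially, any lane is possible
    history = []
    for first, change in cycles:
        if change:
            # S37: shift the previous group by the changed number of lanes
            second = {l + change for l in previous
                      if 1 <= l + change <= num_lanes}
        else:
            # S39: no lane change, previous group carries over
            second = previous
        final = set(first) & second          # S41: combine the two groups
        history.append(final)
        previous = final                     # S47: final becomes next previous
    return history

# two cycles on a four-lane road: no change, then one change to the left
history = track_lane([({2, 3, 4}, 0), ({1, 2, 3}, -1)], 4)
```

Because the final group of each cycle seeds the next, the candidate set tends to shrink over time, which is how the driving field is recognized in real time until the mode is terminated.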
  • FIGS. 3A-3B are exemplary diagrams illustrating a method for recognizing a driving field of a vehicle when a still object adjacent to the vehicle is not detected, according to an exemplary embodiment of the present invention.
  • FIGS. 4A-4B are exemplary diagrams illustrating a method for recognizing a driving field of a vehicle when a still object adjacent to the vehicle is detected, according to an exemplary embodiment of the present invention.
  • FIGS. 5A-5B are exemplary diagrams illustrating a method for recognizing a driving field of a vehicle when a lane of a road on which the vehicle is being driven is maintained, according to an exemplary embodiment of the present invention.
  • FIGS. 6A-6B are exemplary diagrams illustrating a method for recognizing a driving field of a vehicle when a lane of a road on which the vehicle is being driven is changed, according to an exemplary embodiment of the present invention.
  • the controller 160 may be configured to set object detection driving fields CA, LA, RA, and SA, as illustrated in FIG. 3B .
  • the object detection driving fields CA, LA, RA, and SA may be set by a virtual road that includes the same number of lanes as the detected lanes on the right and left sides of the vehicle V and a virtual road that includes at least one lane.
  • the controller 160 may be configured to generate a virtual four-lane road LA on the left side of the vehicle V, including the lane on which the vehicle V is positioned, and generate a virtual four-lane road RA on the right side of the vehicle V, including the lane on which the vehicle V is positioned.
  • the controller 160 may be configured to generate a virtual road CA for a median strip on the left of the LA and generate a virtual road SA for a guardrail on the right of the RA.
  • a two-lane road may be allocated to the SA for a guardrail and a road shoulder.
  • the controller 160 may be configured to detect an object from the object detection driving fields CA, LA, RA, and SA and extract a road on which the object is positioned.
  • the controller 160 may be configured to detect a candidate group of roads on which the vehicle V is positioned as two, three, and four-lane roads, based on a vehicle object (a), and may be configured to detect a candidate group of roads on which the vehicle V is positioned as one, two, and three-lane roads, based on a vehicle object (c).
  • the controller 160 may further be configured to detect the set candidate group of two and three-lane roads as the previous lane candidate group using a combination of candidate groups.
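The combination of the per-object candidate groups described above behaves like a set intersection: only the lane positions consistent with every detected object survive. A minimal sketch of this idea follows; the function name is illustrative only and is not part of the disclosed apparatus:

```python
from functools import reduce

def combine_candidate_groups(groups):
    """Combine per-object lane candidate groups by intersection:
    a lane remains a candidate only if every detected object allows it."""
    return reduce(set.intersection, (set(g) for g in groups))

# Vehicle object (a) implies the vehicle is on a two-, three-, or four-lane
# position; vehicle object (c) implies a one-, two-, or three-lane position.
previous_group = combine_candidate_groups([{2, 3, 4}, {1, 2, 3}])
print(previous_group)  # -> {2, 3}
```

With the still objects of FIGS. 4A-4B added (median strip implying a two-lane position, guardrail implying two or three), the same helper would narrow the candidate group to the two-lane road alone.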
  • the controller 160 may be configured to set object detection driving fields CA, LA, RA, and SA, as illustrated in FIG. 4B .
  • the setting of the object detection driving fields CA, LA, RA, and SA is the same as that described with reference to FIG. 3A .
  • the controller 160 may be configured to detect an object from the object detection driving fields CA, LA, RA, and SA and extract a driving field on which an object is present. As the detection result, as illustrated in FIGS. 4A-4B , upon detecting still objects SO 1 and SO 2 , the controller 160 may be configured to determine that still objects are present on the right and left sides of the vehicle V. In particular, the still objects may be reflected when two or more still objects are detected. The controller 160 may be configured to detect a candidate group of roads on which the vehicle V is positioned as two-lane roads, based on a median strip SO 1 , and detect a candidate group of roads on which the vehicle V is positioned as two and three-lane roads, based on a guardrail SO 2 .
  • a candidate group of locations of the vehicle may be two and three-lane roads, based on the guardrail SO 2 .
  • the controller 160 may be configured to detect a candidate group of driving fields on which the vehicle V is positioned as two, three, and four-lane roads.
  • the controller 160 may be configured to detect a candidate group of driving fields on which the vehicle V is positioned as one, two, and three-lane roads.
  • the controller 160 may also be configured to detect the set candidate group of a two-lane road as the previous lane candidate group using a combination of candidate groups.
  • the controller 160 may be configured to set object detection driving fields CA, LA, RA, and SA.
  • the setting of the object detection driving fields CA, LA, RA, and SA is the same as that described with reference to FIG. 3A .
  • the controller 160 may be configured to detect a first lane candidate group using the same method as the method for detecting the previous lane candidate group. In FIGS. 5A-5B , the controller 160 may be configured to detect a candidate group of roads on which the vehicle V is positioned as one, two, three, and four-lane roads, and detect a candidate group of roads on which the vehicle V is positioned as one and two-lane roads, based on a vehicle object e present in a last lane on the right of the vehicle V.
  • the first lane candidate group detected using a combination of the detected candidates may be one and two-lane roads.
  • the controller 160 may be configured to detect a second lane candidate group.
  • the second lane candidate group may be the same as the previous lane candidate group detected in FIGS. 3A-3B .
  • the controller 160 may also be configured to detect a two-lane road, obtained using a combination of the detected first lane candidate group of one and two-lane roads and the second lane candidate group of two and three-lane roads, as a final lane candidate group of the vehicle V.
  • the controller 160 may be configured to set object detection driving fields CA, LA, RA, and SA.
  • the setting of the object detection driving fields CA, LA, RA, and SA is the same as that described with reference to FIG. 3A .
  • the controller 160 may be configured to detect a first lane candidate group using the same method as the method for detecting the previous lane candidate group. In FIG. 6B , the controller 160 may be configured to detect a candidate group of roads on which the vehicle V is positioned as two, three, and four-lane roads, based on a vehicle object d present on the left of the vehicle V, and detect a candidate group of roads on which the vehicle V is positioned as one, two, and three-lane roads, based on a vehicle object e present on the right of the vehicle V.
  • the first lane candidate group detected using a combination of the detected candidates may be two and three-lane roads.
  • the controller 160 may be configured to detect a second lane candidate group.
  • since the lane of the vehicle V is changed to the right by one lane compared with FIG. 6A , the controller 160 may be configured to add one lane (+1) to the previous lane candidate group detected in FIGS. 3A-3B to detect the second lane candidate group.
  • the second lane candidate group may be three and four-lane roads.
  • the controller 160 may also be configured to detect a three-lane road, obtained using a combination of the detected first lane candidate group of two and three-lane roads and the second lane candidate group of three and four-lane roads, as a final lane candidate group of the vehicle V.
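Using the numbers above, the lane-change adjustment and the final combination can be illustrated with plain set arithmetic; the variable names are illustrative only, and the intersection is an assumed reading of the "combination" of candidate groups:

```python
# Previous lane candidate group detected in FIGS. 3A-3B:
previous_group = {2, 3}

# The vehicle V moved one lane to the right, so shift every candidate by +1
# to obtain the second lane candidate group:
second_group = {lane + 1 for lane in previous_group}

# First lane candidate group from vehicle objects d ({2, 3, 4}) and
# e ({1, 2, 3}), combined by intersection:
first_group = {2, 3, 4} & {1, 2, 3}

# Final lane candidate group = combination of the first and second groups:
final_group = first_group & second_group
print(second_group, final_group)  # -> {3, 4} {3}
```

The result, a three-lane position, matches the final lane candidate group described for FIGS. 6A-6B.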
  • locations of lanes of a road, another vehicle, and a guard rail, adjacent to a vehicle may be extracted using a GPS, an image sensor, and a radar sensor to detect a driving field of the vehicle.
  • whether to change a lane of a road of the vehicle may be determined in real time to detect a driving field of the vehicle in real time.
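Assuming the combination of candidate groups is a set intersection, the real-time recognition loop of operations S 33 through S 47 described above may be sketched as follows; all function and variable names are illustrative and the sketch is not the claimed implementation:

```python
def recognize_driving_field(previous_group, first_group, lane_change_delta,
                            total_lanes=4):
    """One pass of the driving-field recognition loop (hypothetical sketch).

    previous_group: lane candidates carried over from the prior cycle.
    first_group: lane candidates detected from surrounding objects.
    lane_change_delta: +1 / -1 when the vehicle changed lanes, 0 if maintained.
    """
    if lane_change_delta == 0:
        # Lane maintained (operation S 39): reuse the previous candidate group.
        second_group = set(previous_group)
    else:
        # Lane changed (operation S 37): shift the previous candidate group
        # by the number of changed lanes, clipping at the road edges.
        second_group = {lane + lane_change_delta for lane in previous_group
                        if 1 <= lane + lane_change_delta <= total_lanes}

    # Operation S 41: combine the two groups into the final candidate group.
    final_group = set(first_group) & second_group

    # Operation S 47 would re-set final_group as previous_group and repeat.
    return final_group
```

For example, with the lane maintained, `recognize_driving_field({2, 3}, {1, 2}, 0)` yields `{2}`, matching FIGS. 5A-5B, while a one-lane change to the right, `recognize_driving_field({2, 3}, {2, 3}, 1)`, yields `{3}`, matching FIGS. 6A-6B.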

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Transportation (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
US14/323,350 2013-12-24 2014-07-03 Apparatus and method for recognizing driving field of vehicle Active US9550529B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130162430A KR101519277B1 (ko) 2013-12-24 2013-12-24 Apparatus and method for recognizing driving road of a vehicle
KR10-2013-0162430 2013-12-24

Publications (2)

Publication Number Publication Date
US20150175204A1 US20150175204A1 (en) 2015-06-25
US9550529B2 true US9550529B2 (en) 2017-01-24

Family

ID=53275568

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/323,350 Active US9550529B2 (en) 2013-12-24 2014-07-03 Apparatus and method for recognizing driving field of vehicle

Country Status (4)

Country Link
US (1) US9550529B2 (de)
KR (1) KR101519277B1 (de)
CN (1) CN104724121B (de)
DE (1) DE102014212702B4 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10504052B2 (en) * 2015-01-16 2019-12-10 Volvo Car Corporation Navigation unit and method for providing navigation instructions for an autonomous vehicle

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101782362B1 (ko) * 2015-12-14 2017-10-13 Hyundai Motor Company Vehicle and control method thereof
JP6614353B2 (ja) * 2016-07-12 2019-12-11 Nissan Motor Co., Ltd. Travel control method and travel control device
KR20180099280A (ko) * 2017-02-28 2018-09-05 Samsung Electronics Co., Ltd. Apparatus and method for generating virtual lane
WO2020132938A1 (en) * 2018-12-26 2020-07-02 Baidu.Com Times Technology (Beijing) Co., Ltd. Methods for obstacle filtering for non-nudge planning system in autonomous driving vehicle
CN111241900B (zh) * 2019-04-12 2021-02-05 Ningxia Aite Yunxiang Information Technology Co., Ltd. Traffic environment on-site maintenance method

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080195315A1 (en) * 2004-09-28 2008-08-14 National University Corporation Kumamoto University Movable-Body Navigation Information Display Method and Movable-Body Navigation Information Display Unit
JP2010070012A (ja) 2008-09-17 2010-04-02 Toyota Motor Corp Lane recognition device
JP2010069922A (ja) 2008-09-16 2010-04-02 Toyota Motor Corp Lane recognition device
KR20100063372A (ko) 2008-12-03 2010-06-11 Han Min Hong Method for recognizing a driving lane using multiple cameras
KR20100131681A (ko) 2009-06-08 2010-12-16 Mando Corporation Driver convenience apparatus and method for resolving traffic congestion thereof
JP2012127772A (ja) 2010-12-15 2012-07-05 Honda Motor Co Ltd Driving support device for a vehicle
KR20130015746A (ko) 2011-08-04 2013-02-14 LG Electronics Inc. Lane recognition apparatus and method thereof
KR20130021987A (ko) 2011-08-24 2013-03-06 Hyundai Mobis Co., Ltd. Method and apparatus for recognizing a lane of a vehicle
JP2013097714A (ja) 2011-11-04 2013-05-20 Toyota Motor Corp Lane recognition device
KR20130054660A (ko) 2011-11-17 2013-05-27 Hyundai Mobis Co., Ltd. System and method for improving lane recognition using front and side images of a vehicle
JP2013168016A (ja) 2012-02-15 2013-08-29 Toyota Motor Corp Driving lane recognition device
US20140156182A1 (en) * 2012-11-30 2014-06-05 Philip Nemec Determining and displaying auto drive lanes in an autonomous vehicle

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10327869A1 (de) 2003-06-18 2005-01-13 Siemens Ag Navigation system with lane guidance
DE102005039103A1 (de) 2005-08-18 2007-03-01 Robert Bosch Gmbh Method for detecting a traffic space
JP4770702B2 (ja) * 2006-10-31 2011-09-14 Aisin AW Co., Ltd. Route guidance system and route guidance method
KR100874107B1 (ko) * 2008-03-25 2008-12-15 Thinkware Co., Ltd. Method for providing lane information and apparatus for performing the method
US20110291874A1 (en) * 2010-06-01 2011-12-01 De Mersseman Bernard Vehicle radar system and method for detecting objects
JP5971020B2 (ja) * 2011-08-22 2016-08-17 Nissan Motor Co., Ltd. Lane recognition device


Also Published As

Publication number Publication date
CN104724121B (zh) 2018-10-16
DE102014212702B4 (de) 2024-03-07
US20150175204A1 (en) 2015-06-25
KR101519277B1 (ko) 2015-05-11
DE102014212702A1 (de) 2015-06-25
CN104724121A (zh) 2015-06-24

Similar Documents

Publication Publication Date Title
US9550529B2 (en) Apparatus and method for recognizing driving field of vehicle
US10816984B2 (en) Automatic data labelling for autonomous driving vehicles
CN103786729B Lane recognition method and system
US9273971B2 (en) Apparatus and method for detecting traffic lane using wireless communication
US9792506B2 (en) Lane change determining apparatus, junction entry determining apparatus and method thereof
US10139832B2 (en) Computer-assisted or autonomous driving with region-of-interest determination for traffic light analysis
US20140365109A1 (en) Apparatus and method for recognizing driving lane
US20170320500A1 (en) Path planning apparatus and method for autonomous vehicle
US20150166069A1 (en) Autonomous driving style learning
US9694851B2 (en) Apparatus and method of generating travel route of vehicle
CN110490217B (zh) 用于改进对象检测和对象分类的方法和***
US20160245659A1 (en) Method and system of route guidance for a towing vehicle
US20140114500A1 (en) Method and system for adjusting side mirror
KR20190100855A (ko) 자율 주행 차량을 위한 자기 위치 측정 방법, 시스템 및 기계 판독 가능한 매체
US11195027B2 (en) Automated crowd sourcing of road environment information
US20200307630A1 (en) Vehicle control device, vehicle control method and non-transitory computer-readable medium
WO2020164090A1 (en) Trajectory prediction for driving strategy
US20210221395A1 (en) Information processing apparatus and information processing method
US20220388506A1 (en) Control apparatus, movable object, control method, and computer-readable storage medium
US11347240B2 (en) Method and apparatus for determining path
US20150168155A1 (en) Method and system for measuring a vehicle position indoors
US20180088587A1 (en) Controlling Method and System for Autonomous Vehicle
US9111451B2 (en) Apparatus and method for driving guide of vehicle
US20220327317A1 (en) Apparatus and method for predicting trajectory of surrounding vehicle
WO2021214871A1 (ja) State estimation method, state estimation device, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: HYUNDAI MOTOR COMPANY, KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OH, YOUNG CHUL;CHOI, TAE SUNG;YOU, BYUNG YONG;AND OTHERS;REEL/FRAME:033239/0730

Effective date: 20140513

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8