US10962980B2 - System and methods for reverse braking during automated hitch alignment


Info

Publication number
US10962980B2
Authority
US
United States
Prior art keywords
trailer
vehicle
proximity
coupler
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US16/117,076
Other versions
US20200073398A1 (en)
Inventor
Luke Niewiadomski
Erick Michael Lavoie
Eric L. Reed
Chen Zhang
Roger Arnold Trombley
Douglas J. Rogan
Bruno Sielly Jales Costa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Priority to US16/117,076
Assigned to FORD GLOBAL TECHNOLOGIES, LLC reassignment FORD GLOBAL TECHNOLOGIES, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Rogan, Douglas J., TROMBLEY, ROGER ARNOLD, ZHANG, CHEN, Jales Costa, Bruno Sielly, LAVOIE, ERICK MICHAEL, NIEWIADOMSKI, LUKE, REED, ERIC L.
Priority to CN201910797174.8A
Priority to DE102019123125.6A
Publication of US20200073398A1
Application granted
Publication of US10962980B2

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0094Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60DVEHICLE CONNECTIONS
    • B60D1/00Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D1/24Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions
    • B60D1/245Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for facilitating push back or parking of trailers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60DVEHICLE CONNECTIONS
    • B60D1/00Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D1/24Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions
    • B60D1/248Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for measuring, indicating or displaying the weight
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60DVEHICLE CONNECTIONS
    • B60D1/00Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D1/24Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions
    • B60D1/36Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for facilitating connection, e.g. hitch catchers, visual guide means, signalling aids
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60DVEHICLE CONNECTIONS
    • B60D1/00Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D1/58Auxiliary devices
    • B60D1/62Auxiliary devices involving supply lines, electric circuits, or the like
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0255Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2201/00Application
    • G05D2201/02Control of position of land vehicles
    • G05D2201/0213Road vehicle, e.g. car or truck

Definitions

  • the present disclosure generally relates to a system for assisting in a vehicle-trailer hitching operation.
  • the present disclosure relates to a system for detecting a force applied to a hitch assembly and related applications.
  • Hitching a trailer to a vehicle can be a difficult and time-consuming experience.
  • aligning a vehicle hitch ball with the desired trailer hitch can, depending on the initial location of the trailer relative to the vehicle, require repeated forward and reverse driving coordinated with multiple steering maneuvers to appropriately position the vehicle.
  • the trailer hitch cannot be seen and, under ordinary circumstances, the hitch ball is never actually visible to the driver.
  • This lack of sight lines requires an inference of the positioning of the hitch ball and hitch based on experience with a particular vehicle and trailer, and can still require multiple instances of stopping and stepping out of the vehicle to confirm alignment or to note an appropriate correction for a subsequent set of maneuvers.
  • the closeness of the hitch ball to the rear bumper of the vehicle means that any overshoot can cause a collision of the vehicle with the trailer. Accordingly, further advancements may be desired.
  • a vehicle system comprising a hitch ball mounted on a vehicle.
  • the system comprises a plurality of sensor devices comprising an ultrasonic sensor and an image sensor.
  • a controller is configured to process image data from the image sensor identifying a coupler position of a trailer.
  • the controller is further configured to process ultrasonic data from the ultrasonic sensor identifying a proximity of the trailer. Based on the proximity of the trailer, the system is configured to identify the trailer in a detection range of the image sensor.
  • a method for controlling a vehicle comprises processing image data identifying a coupler position of a trailer and processing ultrasonic data identifying a proximity of the trailer.
  • the method further comprises controlling a motion of the vehicle to align the hitch ball with the coupler position and monitoring the proximity of the trailer relative to the coupler position.
  • based on the monitored proximity, the method comprises halting the motion of the vehicle.
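The method steps above can be sketched as one iteration of a control loop. This is a minimal illustration under assumed names and an assumed stop threshold, not the patented implementation:

```python
import math

def hitch_align_step(coupler_pos, hitch_pos, proximity_m, stop_threshold_m=0.3):
    """One iteration of the method: steer toward the coupler position and
    halt the motion of the vehicle when the monitored ultrasonic proximity
    reaches a stop threshold. Positions are (x, y) in meters in a
    vehicle-local frame; returns (halt, heading_error_rad)."""
    if proximity_m <= stop_threshold_m:
        return True, 0.0  # halt the motion of the vehicle
    dx = coupler_pos[0] - hitch_pos[0]
    dy = coupler_pos[1] - hitch_pos[1]
    return False, math.atan2(dy, dx)  # bearing from hitch ball to coupler
```

A real controller would feed the heading error to the steering system and the halt flag to the brake control system.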
  • a vehicle system comprising a hitch ball mounted on a vehicle and an ultrasonic sensor configured to capture ultrasonic data rearward of the vehicle.
  • An image sensor is configured to capture image data rearward of the vehicle.
  • a controller is configured to process image data from the image sensor identifying a coupler position of a trailer and control a motion of the vehicle aligning the hitch ball with the coupler position.
  • the controller is further configured to process ultrasonic data from the ultrasonic sensor identifying a proximity of the trailer and monitor the proximity of the trailer relative to the coupler position. The controller corrects for a misidentification of the coupler position identified in the image data based on the proximity of the trailer identified from the ultrasonic data.
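The correction described in this aspect can be illustrated with a simple consistency check between the two sensing modalities. This is only a sketch; the function name, tolerance, and fallback rule are assumptions, and the patent does not disclose a specific fusion rule:

```python
def corrected_coupler_distance(image_dist_m, ultrasonic_dist_m, tolerance_m=0.5):
    """If the coupler distance identified in the image data disagrees with
    the ultrasonic proximity by more than a tolerance, treat the image
    estimate as a possible misidentification and fall back on the
    ultrasonic range. Illustrative rule only; tolerance_m is assumed."""
    if abs(image_dist_m - ultrasonic_dist_m) > tolerance_m:
        return ultrasonic_dist_m
    return image_dist_m
```

In practice the two ranges measure slightly different points (the coupler versus the trailer body), so a real system would also account for that offset.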
  • FIG. 1 is a perspective view of a vehicle in an unhitched position relative to a trailer;
  • FIG. 2 is a diagram of a system according to an aspect of the disclosure for assisting in aligning the vehicle with a trailer in a position for hitching the trailer to the vehicle;
  • FIG. 3 is an overhead schematic view of a vehicle during a step of the alignment sequence with the trailer;
  • FIG. 4 is an overhead schematic view of a vehicle during a step of the alignment sequence with the trailer;
  • FIG. 5 is a projected view of image data demonstrating an alignment sequence of a vehicle with the trailer;
  • FIG. 6A is a side profile view of a vehicle approaching a trailer demonstrating a plurality of sensors detecting the trailer;
  • FIG. 6B is a side profile view of a vehicle approaching a trailer demonstrating a plurality of sensors detecting the trailer;
  • FIG. 7 is a flow chart demonstrating a method for controlling an alignment of a vehicle with a trailer utilizing proximity data and image data;
  • FIG. 8 is a flow chart demonstrating a method for controlling an alignment of a vehicle with a trailer identifying a minimum alignment distance for an alignment operation;
  • FIG. 9 is a flow chart demonstrating a method for controlling an alignment of a vehicle with a trailer utilizing proximity data in combination with additional sensor data in accordance with the disclosure.
  • the terms “upper,” “lower,” “right,” “left,” “rear,” “front,” “vertical,” “horizontal,” “interior,” “exterior,” and derivatives thereof shall relate to the device as oriented in FIG. 1 .
  • the device may assume various alternative orientations, except where expressly specified to the contrary.
  • the specific devices and processes illustrated in the attached drawing, and described in the following specification are simply exemplary embodiments of the inventive concepts defined in the appended claims. Hence, specific dimensions and other physical characteristics relating to the embodiments disclosed herein are not to be considered as limiting, unless the claims expressly state otherwise.
  • reference numeral 10 designates a hitch assistance system (also referred to as a “hitch assist” system) for a vehicle 12 .
  • hitch assist system 10 includes a controller 14 configured to acquire position data of a coupler 16 of a trailer 18 .
  • the controller 14 may be configured to derive a vehicle path 20 to align a hitch ball 22 of the vehicle 12 with the coupler 16 .
  • Deriving the vehicle path 20 may include a variety of steps including detecting and compensating for a change in a coupler position 24 in order to control the vehicle 12 to locate a hitch position 26 aligned with the coupler 16 .
  • the vehicle path 20 may comprise a plurality of segments 28 , which may correspond to changes in the operating direction or steering direction of the vehicle 12 .
  • deriving the vehicle path 20 may include navigating around intervening objects or structures, operating over uneven terrain, following a desired path indicated by an operator or user U, etc.
  • the disclosure may provide for the hitch assist system 10 to provide for improved navigation of the vehicle 12 and/or interaction with the coupler 16 such that trailer 18 may be effectively connected to the vehicle 12 without complication.
  • the system 10 may be configured to detect a proximity of the coupler 16 in connection with the trailer 18 .
  • the proximity of the trailer 18 may be detected in response to a signal received by the controller 14 from one or more proximity sensors 30 .
  • the proximity sensors 30 may correspond to various sensors, including, but not limited to, ultrasonic sensors, electromagnetic sensors, radar sensors, laser sensors, and/or various types of sensors that may be configured to detect a distance of an object along the vehicle path 20 .
  • the system 10 may utilize proximity information from the one or more proximity sensors 30 in combination with additional sensors, which may be utilized to detect and track the coupler position 24 .
  • the system 10 may provide for improved operational accuracy and error reduction by utilizing the proximity sensor 30 in combination with at least one additional sensor (e.g., a camera or image system). In this way, the system may accurately identify the coupler position 24 and control the vehicle 12 to align the hitch position 26 with the coupler position 24 .
  • the system 10 includes various sensors and devices that obtain or otherwise provide vehicle status-related information.
  • This information includes positioning information from a positioning system 32 , which may include a dead reckoning device 34 or, in addition or as an alternative, a global positioning system (GPS), to determine a coordinate location of the vehicle 12 based on the one or more locations of the devices within the positioning system 32 .
  • the dead reckoning device 34 can establish and track the coordinate location of the vehicle 12 within a localized coordinate system 36 based at least on vehicle speed and steering angle δ as shown in FIG. 3 .
  • the hitch assist system 10 may receive the speed of the vehicle 12 from a speed sensor 38 and the yaw rate of the vehicle 12 from a yaw rate sensor 40 . It is contemplated that in additional embodiments, a distance or proximity sensor or an array thereof, and other vehicle sensors and devices, may provide sensor signals or other information, such as sequential images of the trailer 18 , including the detected coupler 16 , that the controller 14 of the hitch assist system 10 may process with various routines to determine the height H and position (e.g., based on the distance Dc and angle αc) of coupler 16 .
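Dead reckoning from vehicle speed and steering angle, as described above, is commonly done with a kinematic bicycle model. The sketch below assumes that model and illustrative names; the patent does not specify a motion model:

```python
import math

def dead_reckon(x, y, heading, v, delta, wheelbase, dt):
    """Advance the vehicle pose in the localized coordinate system from
    speed v (m/s), steering angle delta (rad), wheelbase (m), and
    timestep dt (s), using a kinematic bicycle model."""
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += (v / wheelbase) * math.tan(delta) * dt
    return x, y, heading
```

Integrating this each control cycle gives the tracked coordinate location used elsewhere in the system.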
  • the hitch assist system 10 is in communication with the steering system 50 of vehicle 12 .
  • the steering system 50 may be a power assist steering system 50 including a steering motor 52 to operate the steered wheels 54 ( FIG. 1 ) of the vehicle 12 for moving the vehicle 12 in such a manner that the vehicle yaw changes with the vehicle velocity and the steering angle δ.
  • the power assist steering system 50 is an electric power-assisted steering (“EPAS”) system including electric steering motor 52 for turning the steered wheels 54 to a steering angle δ based on a steering command, whereby the steering angle δ may be sensed by a steering angle sensor 56 of the power assist steering system 50 .
  • the steering command may be provided by the hitch assist system 10 for autonomously steering during a trailer hitch alignment maneuver and may alternatively be provided manually via a rotational position (e.g., steering wheel angle) of a steering wheel of vehicle 12 .
  • the steering wheel of the vehicle 12 is mechanically coupled with the steered wheels 54 of the vehicle 12 , such that the steering wheel moves in concert with steered wheels 54 , preventing manual intervention with the steering wheel during autonomous steering.
  • a torque sensor 58 is provided on the power assist steering system 50 that senses torque on the steering wheel that is not expected from autonomous control of the steering wheel and therefore indicative of manual intervention.
  • the hitch assist system 10 may alert the driver to discontinue manual intervention with the steering wheel and/or discontinue autonomous steering.
  • some vehicles have a power assist steering system 50 that allows a steering wheel to be partially decoupled from movement of the steered wheels 54 of such a vehicle.
  • the power assist steering system 50 provides the controller 14 of the hitch assist system 10 with information relating to a rotational position of the steered wheels 54 of the vehicle 12 , including a steering angle δ.
  • the controller 14 in the illustrated embodiment processes the current steering angle, in addition to other conditions of the vehicle 12 , to guide the vehicle 12 along the desired path 20 ( FIG. 3 ).
  • the hitch assist system 10 in additional embodiments, may be an integrated component of the power assist steering system 50 .
  • the power assist steering system 50 may include a hitch assist algorithm for generating vehicle steering information and commands as a function of all or a portion of information received from an imaging system 60 , the power assist steering system 50 , a vehicle brake control system 62 , a powertrain control system 64 , and other vehicle sensors and devices, as well as a human-machine interface (“HMI”) 66 , as discussed further below.
  • the vehicle brake control system 62 may also communicate with the controller 14 to provide the hitch assist system 10 with braking information, such as vehicle wheel speed, and to receive braking commands from the controller 14 .
  • the brake control system 62 may be configured to control service brakes 62 a and a parking brake 62 b.
  • the parking brake 62 b may correspond to an electronic parking brake system that may be in communication with the controller 14 .
  • the controller 14 may be configured to control the brakes 62 a and 62 b as well as detect vehicle speed information, which may be determined from individual wheel speed sensors monitored by the brake control system 62 .
  • Vehicle speed may also be determined from the powertrain control system 64 , the speed sensor 38 , and/or the positioning system 32 , among other conceivable means.
  • individual wheel speeds can also be used to determine a vehicle yaw rate, which can be provided to the hitch assist system 10 in the alternative or in addition to the vehicle yaw rate sensor 40 .
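The relationship between individual wheel speeds and vehicle yaw rate mentioned above is, for wheels on a common axle, a standard differential-odometry relation (names are illustrative):

```python
def yaw_rate_from_wheels(v_left, v_right, track_width):
    """Estimate yaw rate (rad/s) from individual left/right wheel speeds
    (m/s) on one axle and the track width (m): a faster outer wheel
    implies the vehicle is turning toward the slower side."""
    return (v_right - v_left) / track_width
```

This estimate can serve as an alternative or cross-check to the dedicated yaw rate sensor 40.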
  • the hitch assist system 10 can further provide vehicle braking information to the brake control system 62 for allowing the hitch assist system 10 to control braking of the vehicle 12 during backing of the trailer 18 .
  • the hitch assist system 10 may regulate speed of the vehicle 12 during alignment of the vehicle 12 with the coupler 16 of trailer 18 , which can reduce the potential for a collision with trailer 18 , and can bring vehicle 12 to a complete stop at a determined endpoint 70 of the path 20 .
  • the hitch assist system 10 can additionally or alternatively issue an alert signal corresponding to a notification of an actual, impending, and/or anticipated collision with a portion of trailer 18 .
  • regulation of the speed of the vehicle 12 may be advantageous to prevent collision with trailer 18 .
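One generic way to regulate reverse speed so the vehicle can always stop before the endpoint 70 is to cap the commanded speed by the stopping-distance relation v = sqrt(2·a·d). This is a sketch, not the patent's control law; the deceleration value is an assumption:

```python
import math

def speed_limit_for_distance(remaining_dist_m, max_decel_mps2=1.0):
    """Maximum speed (m/s) from which the vehicle can still brake to a
    complete stop within the remaining distance to the endpoint,
    assuming a constant achievable deceleration."""
    return math.sqrt(2.0 * max_decel_mps2 * max(remaining_dist_m, 0.0))
```

As the remaining distance to the coupler shrinks, the permitted speed tapers to zero, reducing the potential for an overshoot collision.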
  • the powertrain control system 64 may also interact with the hitch assist system 10 for regulating speed and acceleration of the vehicle 12 during partial or autonomous alignment with trailer 18 .
  • the powertrain control system 64 may further be utilized and configured to control a throttle as well as a drive gear selection of a transmission of the vehicle 12 .
  • the controller 14 may be configured to control a gear of the transmission system and/or prompt the user U to shift to a desired gear to complete semi-automated operations of the vehicle 12 .
  • the hitch assist system 10 may communicate with human-machine interface (“HMI”) 66 of the vehicle 12 .
  • the HMI 66 may include a vehicle display 72 , such as a center-stack mounted navigation or entertainment display ( FIG. 1 ).
  • HMI 66 further includes an input device, which can be implemented by configuring display 72 as a portion of a touchscreen 74 with circuitry 76 to receive an input corresponding with a location over display 72 .
  • Other forms of input including one or more joysticks, digital input pads, or the like, can be used in place or in addition to touchscreen 74 .
  • the hitch assist system 10 may communicate via wireless communication with another embodiment of the HMI 66 , such as with one or more handheld or portable devices 80 ( FIG.
  • the portable device 80 may also include the display 72 for displaying one or more images and other information to a user U.
  • the portable device 80 may display one or more images of the trailer 18 on the display 72 and may be further configured to receive remote user inputs via touchscreen circuitry 76 .
  • the portable device 80 may provide feedback information, such as visual, audible, and tactile alerts.
  • the hitch assist system 10 may further be in communication with one or more indicator devices 78 .
  • the indicator devices 78 may correspond to conventional vehicle indicators, such as a vehicle horn 78 a, lights 78 b, a speaker system 78 c, vehicle accessories 78 d, etc.
  • the indicator devices 78 may further include one or more accessories 78 d, which may correspond to communication devices, remote controls, and a variety of devices that may provide for status and operational feedback between the user U and the vehicle 12 .
  • the HMI 66 , the display 72 , and the touchscreen 74 may be controlled by the controller 14 to provide status updates identifying the operation, or to receive instructions or feedback for controlling the hitch assist system 10 .
  • the portable device 80 may be in communication with the controller 14 and configured to display or otherwise indicate one or more alerts or messages related to the operation of the hitch assist system 10 .
  • the controller 14 is configured with a microprocessor 82 to process logic and routines stored in memory 84 that receive information from the above-described sensors and vehicle systems, including the imaging system 60 , the power assist steering system 50 , the vehicle brake control system 62 , the powertrain control system 64 , and other vehicle sensors and devices.
  • the controller 14 may generate vehicle steering information and commands as a function of all or a portion of the information received. Thereafter, the vehicle steering information and commands may be provided to the power assist steering system 50 for effecting steering of the vehicle 12 to achieve a commanded path 20 ( FIG. 3 ) of travel for alignment with the coupler 16 of trailer 18 .
  • the controller 14 may include the microprocessor 82 and/or other analog and/or digital circuitry for processing one or more routines. Also, the controller 14 may include the memory 84 for storing one or more routines, including an image processing routine 86 and/or hitch detection routine, a path derivation routine 88 , and an operating routine 90 .
  • controller 14 may be a stand-alone dedicated controller or may be a shared controller integrated with other control functions, such as integrated with a vehicle sensor system, the power assist steering system 50 , and other conceivable onboard or off-board vehicle control systems.
  • image processing routine 86 may be carried out by a dedicated processor, for example, within a stand-alone imaging system for vehicle 12 that can output the results of its image processing to other components and systems of vehicle 12 , including microprocessor 82 .
  • any system, computer, processor, or the like, that completes image processing functionality, such as that described herein, may be referred to herein as an “image processor” regardless of other functionality it may also implement (including simultaneously with executing image processing routine 86 ).
  • System 10 may also incorporate the imaging system 60 that includes one or more exterior cameras. Examples of exterior cameras are illustrated in FIG. 4 and include rear camera 60 a, center high-mount stop light (CHMSL) camera 60 b, and side-view cameras 60 c and 60 d, although other arrangements including additional or alternative cameras are possible.
  • imaging system 60 can include rear camera 60 a alone or can be configured such that system 10 utilizes only rear camera 60 a in a vehicle with multiple exterior cameras.
  • the various cameras 60 a - 60 d included in imaging system 60 can be positioned to generally overlap in their respective fields of view, which in the depicted arrangement include fields of view 92 (e.g.
  • image data from two or more of the cameras can be combined in image processing routine 86 , or in another dedicated image processor within imaging system 60 , into a single image.
  • the image data can be used to derive stereoscopic image data that can be used to reconstruct a three-dimensional scene of the area or areas within overlapped areas of the various fields of view 92 a, 92 b, 92 c, and 92 d, including any objects (obstacles or coupler 16 , for example) therein.
  • the use of two images including the same object can be used to determine a location of the object relative to the two image sources, given a known spatial relationship between the image sources.
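Determining an object's location from two overlapping views with a known spatial relationship reduces, for rectified pinhole cameras, to triangulation from disparity: Z = f·B/d. A minimal sketch with assumed names:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth (m) of a point matched in two rectified camera views:
    Z = f * B / d, where f is the focal length in pixels, B the baseline
    between the cameras in meters, and d the disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: bad correspondence")
    return focal_px * baseline_m / disparity_px
```

The same principle extends to the overlapped fields of view 92 of cameras 60a-60d once their relative poses are known.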
  • the image processing routine 86 can use known programming and/or functionality to identify an object within image data from the various cameras 60 a, 60 b, 60 c, and 60 d within imaging system 60 .
  • the image processing routine 86 can include information related to the positioning of any cameras 60 a, 60 b, 60 c, and 60 d present on vehicle 12 or utilized by system 10 , including relative to a center 96 ( FIG. 1 ) of vehicle 12 , for example, such that the positions of cameras 60 a, 60 b, 60 c, and 60 d relative to center 96 and/or to each other can be used for object positioning calculations and to result in object position data relative to the center 96 of vehicle 12 , for example, or other features of vehicle 12 , such as hitch ball 22 ( FIG. 1 ), with known positions relative to center 96 of the vehicle 12 .
  • the image processing routine 86 can be specifically programmed or otherwise configured to locate coupler 16 within image data.
  • the image processing routine 86 can identify the coupler 16 within the image data based on stored or otherwise known visual characteristics of coupler 16 or hitches in general.
  • a marker in the form of a sticker, or the like, may be affixed to trailer 18 in a specified position relative to coupler 16 in a manner similar to that which is described in commonly-assigned U.S. Pat. No. 9,102,271, the entire disclosure of which is incorporated by reference herein.
  • image processing routine 86 may be programmed with identifying characteristics of the marker for location in image data, as well as the positioning of coupler 16 relative to such a marker so that the position 24 of the coupler 16 can be determined based on the marker location.
  • controller 14 may seek confirmation of the detected coupler 16 , via a prompt on touchscreen 74 . If the coupler 16 determination is not confirmed, further image processing may be provided, or user-adjustment of the position 24 of coupler 16 may be facilitated, either using touchscreen 74 or another input to allow the user U to move the depicted position 24 of coupler 16 on touchscreen 74 , which controller 14 uses to adjust the determination of position 24 of coupler 16 with respect to vehicle 12 based on the above-described use of image data. Alternatively, the user U can visually determine the position 24 of coupler 16 within an image presented on HMI 66 and can provide a touch input in a manner similar to that which is described in commonly-assigned U.S. Pat. No.
  • the image processing routine 86 can then correlate the location of the touch input with the coordinate system 36 applied to image data shown on the display 72 , which may be depicted as shown in FIG. 3 .
  • the image processing routine 86 and operating routine 90 may be used in conjunction with each other to determine the path 20 along which hitch assist system 10 can guide vehicle 12 to align hitch ball 22 and coupler 16 of trailer 18 .
  • an initial position of vehicle 12 relative to trailer 18 may be such that coupler 16 is only in the field of view 92 c of side camera 60 c, with vehicle 12 being positioned laterally from trailer 18 but with coupler 16 being almost longitudinally aligned with hitch ball 22 .
  • image processing routine 86 can identify coupler 16 within the image data of camera 60 c and estimate the position 24 of coupler 16 relative to hitch ball 22 .
  • the position 24 of the coupler 16 may be identified by the system 10 using the image data, for example, by receiving focal length information within the image data to determine a distance Dc to coupler 16 and an angle αc of offset between coupler 16 and the longitudinal axis of vehicle 12 . This information may also be used in light of the position 24 of coupler 16 within the field of view of the image data to determine or estimate the height Hc of coupler 16 .
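The use of focal length information to recover a distance and an offset angle can be illustrated with pinhole-camera geometry. Both helpers below assume a calibrated camera and, for the distance, a flat ground plane; they are illustrative, not the patented routine:

```python
import math

def coupler_offset_angle(u_px, focal_px):
    """Lateral offset angle (rad) of the coupler from the camera axis,
    from its horizontal pixel coordinate u_px relative to the image
    center, using the pinhole model."""
    return math.atan2(u_px, focal_px)

def ground_distance(camera_height_m, v_px, focal_px):
    """Flat-ground distance (m) to a point imaged v_px below the horizon
    row: D = h * f / v. Simplified geometry for illustration only."""
    return camera_height_m * focal_px / v_px
```

A real implementation would also correct for lens distortion and camera pitch before applying these relations.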
  • the controller 14 can take control of at least the vehicle steering system 50 to control the movement of vehicle 12 along the desired path 20 to align the hitch position 26 of the vehicle hitch ball 22 with coupler 16 .
  • controller 14 , having estimated the positioning Dc, αc of coupler 16 , as discussed above, can, in one example, execute path derivation routine 88 to determine vehicle path 20 to align the vehicle hitch ball 22 with coupler 16 .
  • controller 14 can have stored in memory 84 various characteristics of vehicle 12 , including the wheelbase W, the distance from the rear axle to the hitch ball 22 , which is referred to herein as the drawbar length L, as well as the maximum angle δmax to which the steered wheels 54 can be turned.
  • the wheelbase W and the current steering angle δ can be used to determine a corresponding turning radius ρ for vehicle 12 according to the equation: ρ = W/tan δ.
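The kinematic turning-radius relation ρ = W/tan δ, and the minimum radius at the steering limit δmax, can be computed directly (helper names are illustrative):

```python
import math

def turning_radius(wheelbase_m, steering_angle_rad):
    """Kinematic (bicycle-model) turning radius: rho = W / tan(delta)."""
    return wheelbase_m / math.tan(steering_angle_rad)

def min_turning_radius(wheelbase_m, max_steering_angle_rad):
    """Minimum achievable radius rho_min, at the steering limit delta_max."""
    return turning_radius(wheelbase_m, max_steering_angle_rad)
```

The path derivation routine bounds every arc of the planned path 20 by this minimum radius.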
  • Path derivation routine 88 can be programmed to derive vehicle path 20 to align a known location of the vehicle hitch ball 22 with the estimated position 24 of coupler 16 that takes into account the determined minimum turning radius ρmin to allow path 20 to use the minimum amount of space and maneuvers. In this manner, path derivation routine 88 can use the position of vehicle 12 , which can be based on the center 96 of vehicle 12 , a location along the rear axle, the location of the dead reckoning device 34 , or another known location on the coordinate system 36 , to determine both a lateral distance to the coupler 16 and a forward or rearward distance to coupler 16 and derive a path 20 that achieves the needed lateral and forward-backward movement of vehicle 12 within the limitations of steering system 50 .
  • path 20 further takes into account the positioning of hitch ball 22 , based on length L, relative to the tracked location of vehicle 12 (which may correspond with the center 96 of mass of vehicle 12 , the location of a GPS receiver, or another specified, known area) to determine the needed positioning of vehicle 12 to align hitch ball 22 with coupler 16 .
  • FIG. 5 demonstrates a projected view of image data demonstrating an alignment sequence with the trailer 18 .
  • FIGS. 6A and 6B demonstrate side profile views of the vehicle 12 approaching the trailer 18 along the vehicle path 20 .
  • the system 10 may be configured to detect a proximity of the coupler 16 in connection with the trailer 18 .
  • the proximity of the trailer 18 may be detected in response to a signal received by the controller 14 from one or more proximity sensors 30 .
  • the proximity sensors 30 may correspond to various sensors including but not limited to ultrasonic sensors, electromagnetic sensors, radar sensors, laser sensors, and/or various types of sensors that may be configured to detect a distance of an object along the vehicle path 20 .
  • the controller 14 may be configured to utilize the proximity data in combination with image data or additional location information to verify and track the coupler position 24 .
  • the coupler position 24 may be identified in the image data captured by the imaging system 60 . Additionally, a position of the trailer 18 , represented as an outline 102 , may be identified in the image data. Based on the position of the trailer 18 (outline 102 ) in combination with the coupler position 24 , the controller 14 may generally be operable to reliably identify the coupler position 24 . However, in some circumstances, the controller 14 may identify a false position 104 of the coupler 16 in the image data. In such situations, the operation of the operating routine 90 aligning the hitch position 26 with coupler position 24 may result in the vehicle 12 coming in contact with the coupler 16 due to an overshoot condition.
  • the system 10 may be in communication with the at least one proximity sensor 30 . Based on proximity data received from the proximity sensor 30 , the controller 14 may verify the coupler position 24 in relation to the hitch position 26 along the vehicle path 20 . In this way, the controller 14 may compare the proximity data identifying a proximity of the trailer 18 with the image data identifying the coupler position 24 to ensure that the distance D c to the coupler 16 is accurately identified.
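The cross-check described above can be sketched as a simple consistency test between the two measurements. The tolerance value and the absolute-difference criterion are assumptions; the disclosure does not specify how the comparison is performed.

```python
def verify_coupler_distance(d_c_image_m, trailer_proximity_m, tolerance_m=0.5):
    """Return True when the camera-derived coupler distance Dc agrees with
    the proximity-sensor trailer range to within an assumed tolerance."""
    return abs(d_c_image_m - trailer_proximity_m) <= tolerance_m
```

A disagreement beyond the tolerance would indicate a possible false position 104 of the coupler in the image data.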
  • the vehicle 12 is demonstrated in an approach configuration 110 and an aligned configuration 112 in relation to the trailer 18 .
  • the system 10 may also have a limited operating range over which the vehicle path 20 may be identified in the image data via the image processing routine 86 .
  • when the distance Dc to the coupler 16 is less than a tracking threshold (e.g., a predetermined minimum tracking distance), the controller 14 may be inoperable to accurately identify the coupler position 24 in the image data.
  • the controller 14 may be unable to calculate the vehicle path 20 via the path derivation routine 88 . Under such circumstances, operation of the system 10 in response to a request to complete the operating routine 90 may result in an error or failure.
  • the controller 14 may be in communication with the at least one proximity sensor 30 to identify whether the distance Dc to the coupler 16 is less than the minimum tracking threshold. Based on the proximity data from the proximity sensor 30 , the controller 14 may identify the proximity of the trailer 18 to approximate the distance Dc . Once the distance Dc is approximated via the proximity data from the proximity sensor 30 , the controller 14 may output instructions to the user U via the HMI 66 to move the vehicle 12 away from the trailer 18 . In this way, the controller 14 may be configured to detect that the trailer 18 is too close to the vehicle 12 to successfully process path derivation and operating routines 88 , 90 and instruct the user U to increase the distance Dc beyond the tracking threshold.
  • a proximity signal 114 is shown emitted from the at least one proximity sensor 30 .
  • the controller 14 may not be operable to distinguish a specific portion of the trailer 18 based solely on the proximity data from the proximity sensor 30 .
  • the controller 14 may accurately identify the general proximity or distance of the trailer 18 from the proximity sensor 30 to accurately indicate whether the distance D c of the coupler 16 is less than the tracking threshold.
  • the controller 14 may be configured to utilize the proximity data to determine whether the system 10 is sufficiently far from the trailer 18 to accurately identify the coupler position 24 and process the operating routine 90 .
  • the controller 14 may apply the proximity data communicated from the proximity sensor 30 in combination with the image data communicated by the imaging system 60 to provide for various operating methods that may improve the accuracy and operation of the system 10 .
  • the method 120 may begin in response to the initiation of the hitch connection routine ( 122 ).
  • the controller 14 may control the at least one proximity sensor 30 to scan a region proximate the vehicle 12 to determine the proximity of the trailer 18 .
  • the controller 14 may receive proximity data and detect the proximity of the trailer 18 ( 124 ).
  • the controller 14 may activate or control the imaging system 60 to capture image data and detect or attempt to detect the trailer 18 and coupler position 24 in the image data ( 126 ).
  • the controller 14 may compare the proximity or distance to the trailer 18 with the minimum distance tracking threshold.
  • the minimum distance tracking threshold may correspond to a minimum distance required for the system 10 to accurately identify the coupler position 24 in the image data. If the trailer distance identified based on the proximity data from the proximity sensor 30 is greater than the minimum tracking threshold, the controller 14 may continue to step 132 and monitor the proximity data for the coupler range or distance to the trailer 18 . If the trailer distance is not greater than the minimum distance tracking threshold in step 128 , the controller 14 may output an instruction via the HMI 66 instructing the user U to move the vehicle 12 away from the trailer 18 ( 130 ).
  • the controller 14 may continue the method 120 by controlling the movement of the vehicle 12 aligning the hitch position 26 with the coupler position 24 ( 134 ).
  • the controller 14 may compare the distance D c to the coupler 16 as identified from the image data as the coupler position 24 to the trailer proximity as identified by the proximity data from the proximity sensor 30 ( 136 ).
  • the controller 14 may apply coarse braking via the brake control system 62 to halt the vehicle 12 . In this way, the system 10 may prevent the potential collision between the vehicle 12 and trailer 18 ( 138 ).
  • the method 120 may return to step 130 instructing the user to move the vehicle 12 away from the trailer 18 . If the distance D c to the coupler 16 identified based on the image data is not greater than the trailer proximity in step 136 , the controller 14 may continue to complete the operating routine 90 aligning the hitch position 26 with the coupler position 24 ( 140 ). In this way, the method 120 may provide for the system 10 to utilize the proximity data in combination with the image data to improve the accuracy and operation of the system 10 .
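The decision logic of method 120 (steps 128 through 140) can be summarized in a short sketch. The threshold value and the returned action labels are illustrative assumptions; the patent describes the decisions, not an implementation.

```python
MIN_TRACKING_THRESHOLD_M = 2.0  # assumed minimum tracking distance

def method_120_step(trailer_proximity_m, d_c_image_m):
    """One decision pass over steps 128-140 of method 120 (a sketch)."""
    if trailer_proximity_m <= MIN_TRACKING_THRESHOLD_M:
        return "instruct_user_to_back_away"  # step 130
    if d_c_image_m > trailer_proximity_m:
        # The image data places the coupler beyond the sensed trailer,
        # suggesting a misidentified coupler position: halt the vehicle.
        return "apply_coarse_braking"        # step 138
    return "complete_alignment"              # step 140
```

In practice the controller would run such a pass repeatedly as new proximity and image data arrive during the maneuver.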
  • a flow chart of a method 150 for controlling alignment of the vehicle 12 with the trailer 18 is shown, demonstrating identification of a minimum alignment distance or minimum tracking threshold that may be required in some cases for accurate operation of the system 10 .
  • the method may begin by initiating a hitch connection routine (e.g., the operating routine 90 ) for the vehicle 12 ( 152 ).
  • the controller 14 may activate the proximity sensor 30 and detect the proximity of the trailer 18 via the proximity data ( 154 ). Based on the proximity data, in step 156 , the controller 14 may identify if the trailer 18 is within a maximum detection range from which the system 10 can accurately identify and maneuver the hitch ball 22 of the vehicle 12 to align with the coupler 16 of the trailer 18 .
  • step 156 if the trailer 18 is beyond the maximum detection range, the controller 14 may display an instruction on the HMI 66 instructing the user to decrease the distance D c to the trailer 18 ( 158 ). If the trailer 18 is within the maximum detection range in step 156 , the controller 14 may process the proximity data from the proximity sensor 30 to estimate the proximity of the trailer 18 ( 160 ). Additionally, the controller 14 may control the imaging system 60 to capture image data to identify the trailer 18 and the coupler position 24 ( 162 ). In step 164 , the controller 14 may determine whether the trailer 18 and/or the coupler position 24 are detected.
  • the controller 14 may notify the user U of the non-detection of the trailer 18 and display instructions on the HMI 66 to assist the user U in aligning the vehicle 12 with the trailer 18 ( 166 ).
  • the controller 14 may utilize the proximity data from the proximity sensor 30 as well as the image data from the imaging system 60 to detect the trailer 18 and/or the corresponding coupler position 24 . If the trailer 18 is detected in step 164 , the controller 14 may process the proximity data from the proximity sensor 30 to identify if the trailer distance is less than the minimum distance tracking threshold ( 168 ). If the trailer distance or proximity is less than the minimum tracking threshold in step 168 , the controller 14 may continue to step 170 and display instructions to the user U to increase the distance between the vehicle 12 and the trailer 18 on the HMI 66 ( 170 ).
  • step 168 if the trailer distance or proximity is greater than the minimum tracking threshold, the controller 14 may continue to step 172 and control the vehicle 12 along the vehicle path 20 aligning the hitch position 26 with the coupler position 24 . Accordingly, the method 150 may provide for the proximity sensor 30 to be used in combination with the imaging system 60 to improve the robustness of the operation of the system 10 .
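The pre-alignment checks of method 150 (steps 156 through 172) can likewise be sketched as a small decision function. The range constants and action labels are assumptions chosen for illustration.

```python
MAX_DETECTION_RANGE_M = 8.0      # assumed maximum detection range
MIN_TRACKING_DISTANCE_M = 2.0    # assumed minimum tracking threshold

def method_150_precheck(trailer_proximity_m, trailer_detected_in_image):
    """Pre-alignment decisions of steps 156-172 of method 150 (a sketch)."""
    if trailer_proximity_m > MAX_DETECTION_RANGE_M:
        return "instruct_user_to_move_closer"  # step 158
    if not trailer_detected_in_image:
        return "notify_and_assist_user"        # step 166
    if trailer_proximity_m < MIN_TRACKING_DISTANCE_M:
        return "instruct_user_to_back_away"    # step 170
    return "align_hitch_with_coupler"          # step 172
```

The ordering mirrors the flow chart: range gating first, then image detection, then the minimum-distance check before alignment begins.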
  • the method 180 may begin by the controller 14 initiating the hitch connection routine ( 182 ).
  • the hitch connection routine may begin by detecting or attempting to detect the trailer 18 and/or the coupler position 24 in the image data as provided by the imaging system 60 ( 184 ). Additionally, the controller 14 may activate and monitor the proximity data from the proximity sensor 30 to identify the proximity of the trailer 18 ( 186 ). With the coupler position 24 identified in the image data, the controller 14 may apply the operating routine 90 to align the hitch position 26 with the coupler position 24 ( 188 ).
  • the controller 14 may monitor the proximity of the trailer 18 as identified from the proximity data in order to identify the approximate distance traveled by the vehicle 12 . Based on the distance traveled in step 190 , the controller 14 may compare the distance Dc to the coupler 16 with the approximate distance traveled plus a predetermined distance threshold ( 190 ). In this way, the controller 14 may compare the approximate distance traveled plus the distance threshold with the distance Dc to the coupler 16 to identify whether the coupler position 24 is misidentified or changing in the image data. If the distance Dc to the coupler 16 is greater than the distance traveled plus the threshold in step 190 , the controller 14 may apply coarse braking by the brake control system 62 to halt the vehicle 12 and prevent collision ( 192 ). In step 190 , if the distance Dc to the coupler 16 is not greater than the distance traveled plus the threshold, the method 180 may continue by completing the alignment of the hitch position 26 with the coupler position 24 ( 194 ).
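The overshoot guard of step 190 in method 180 reduces to a single comparison, sketched below. The threshold constant and action labels are assumptions for illustration.

```python
DISTANCE_THRESHOLD_M = 0.3  # assumed predetermined distance threshold

def method_180_check(d_c_image_m, distance_traveled_m):
    """Overshoot guard of step 190 in method 180 (a sketch)."""
    if d_c_image_m > distance_traveled_m + DISTANCE_THRESHOLD_M:
        # The reported coupler distance cannot be reconciled with the
        # distance actually traveled: halt with coarse braking.
        return "apply_coarse_braking"  # step 192
    return "complete_alignment"        # step 194
```

The guard is cheap enough to evaluate on every proximity-data update during the maneuver.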
  • the disclosure provides for various solutions that may improve the operation of the system 10 in accuracy and robustness. Accordingly, the disclosure may provide for an improved experience of the user U in various settings. Though specific detailed steps were discussed in reference to the exemplary embodiments, such examples are merely provided to demonstrate useful applications of the systems and devices disclosed by the application. It shall be understood that the system 10 and corresponding methods are provided strictly as exemplary illustrations of the disclosure that may vary or be combined in various ways without departing from the spirit of the disclosure. Additionally, the detailed embodiment shall not be considered limiting to the scope of the disclosure unless expressly required by the claims.
  • the term “coupled” in all of its forms, couple, coupling, coupled, etc. generally means the joining of two components (electrical or mechanical) directly or indirectly to one another. Such joining may be stationary in nature or movable in nature. Such joining may be achieved with the two components (electrical or mechanical) and any additional intermediate members being integrally formed as a single unitary body with one another or with the two components. Such joining may be permanent in nature or may be removable or releasable in nature unless otherwise stated.
  • elements shown as integrally formed may be constructed of multiple parts or elements shown as multiple parts may be integrally formed, the operation of the interfaces may be reversed or otherwise varied, the length or width of the structures and/or members or connector or other elements of the system may be varied, the nature or number of adjustment positions provided between the elements may be varied.
  • the elements and/or assemblies of the system may be constructed from any of a wide variety of materials that provide sufficient strength or durability, in any of a wide variety of colors, textures, and combinations. Accordingly, all such modifications are intended to be included within the scope of the present innovations. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the desired and other exemplary embodiments without departing from the spirit of the present innovations.

Abstract

A vehicle system comprises a hitch ball mounted on a vehicle. The system further comprises a plurality of sensor devices comprising an ultrasonic sensor and an image sensor. A controller is configured to process image data from the image sensor identifying a coupler position of a trailer. The controller is further configured to process ultrasonic data from the ultrasonic sensor identifying a proximity of the trailer. Based on the proximity of the trailer, the system is configured to identify the trailer in a detection range of the image sensor.

Description

FIELD OF THE DISCLOSURE
The present disclosure generally relates to a system for assisting in a vehicle-trailer hitching operation. In particular, the present disclosure relates to a system for detecting a proximity of a trailer during an automated hitch alignment and related applications.
BACKGROUND OF THE DISCLOSURE
Hitching a trailer to a vehicle can be a difficult and time-consuming experience. In particular, aligning a vehicle hitch ball with the desired trailer hitch can, depending on the initial location of the trailer relative to the vehicle, require repeated forward and reverse driving coordinated with multiple steering maneuvers to appropriately position the vehicle. Further, through a significant portion of the driving needed for appropriate hitch ball alignment, the trailer hitch cannot be seen, and the hitch ball can, under ordinary circumstances, never actually be seen by the driver. This lack of sight lines requires an inference of the positioning of the hitch ball and hitch based on experience with a particular vehicle and trailer, and can still require multiple instances of stopping and stepping out of the vehicle to confirm alignment or to note an appropriate correction for a subsequent set of maneuvers. Even further, the closeness of the hitch ball to the rear bumper of the vehicle means that any overshoot can cause a collision of the vehicle with the trailer. Accordingly, further advancements may be desired.
SUMMARY OF THE DISCLOSURE
According to one aspect of the present disclosure, a vehicle system comprising a hitch ball mounted on a vehicle is disclosed. The system comprises a plurality of sensor devices comprising an ultrasonic sensor and an image sensor. A controller is configured to process image data from the image sensor identifying a coupler position of a trailer. The controller is further configured to process ultrasonic data from the ultrasonic sensor identifying a proximity of the trailer. Based on the proximity of the trailer, the system is configured to identify the trailer in a detection range of the image sensor.
Embodiments of the first aspect of the invention can include any one or a combination of the following features:
    • the controller is configured to control a motion of the vehicle aligning the hitch ball with the coupler position;
    • the controller is further configured to correct for a misidentification of the coupler position identified in the image data based on the proximity of the trailer identified from the ultrasonic data;
    • the controller is further configured to: monitor the proximity of the trailer based on the ultrasonic data; monitor the location of the coupler based on the image data; and suppress a motion instruction of the location based on the proximity of the trailer;
    • the suppression of the motion instruction comprises stopping the motion of the vehicle;
    • the suppression of the motion instruction is in response to the proximity of the trailer being less than a change in the proximity detected via the ultrasonic data in addition to a predetermined distance constant;
    • the controller is further configured to: control a notification indicating that the trailer is outside the detection range; and
    • the detection range of the notification provides for the vehicle to be repositioned at a greater distance within the detection range.
According to another aspect of the present disclosure, a method for controlling a vehicle is disclosed. The method comprises processing image data identifying a coupler position of a trailer and processing ultrasonic data identifying a proximity of the trailer. The method further comprises controlling a motion of the vehicle aligning the hitch ball with the coupler position and monitoring proximity of the trailer relative to the coupler position. In response to a comparison of the coupler position with the proximity, the method comprises halting the motion of the vehicle.
According to yet another aspect of the present disclosure, a vehicle system is disclosed. The system comprises a hitch ball mounted on a vehicle and an ultrasonic sensor configured to capture ultrasonic data rearward of the vehicle. An image sensor is configured to capture image data rearward of the vehicle. A controller is configured to process image data from the image sensor identifying a coupler position of a trailer and control a motion of the vehicle aligning the hitch ball with the coupler position. The controller is further configured to process ultrasonic data from the ultrasonic sensor identifying a proximity of the trailer and monitor the proximity of the trailer relative to the coupler position. The controller corrects for a misidentification of the coupler position identified in the image data based on the proximity of the trailer identified from the ultrasonic data.
These and other aspects, objects, and features of the present disclosure will be understood and appreciated by those skilled in the art upon studying the following specification, claims, and appended drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings:
FIG. 1 is a perspective view of a vehicle in an unhitched position relative to a trailer;
FIG. 2 is a diagram of a system according to an aspect of the disclosure for assisting in aligning the vehicle with a trailer in a position for hitching the trailer to the vehicle;
FIG. 3 is an overhead schematic view of a vehicle during a step of the alignment sequence with the trailer;
FIG. 4 is an overhead schematic view of a vehicle during a step of the alignment sequence with the trailer;
FIG. 5 is a projected view of image data demonstrating an alignment sequence of a vehicle with the trailer;
FIG. 6A is a side profile view of a vehicle approaching a trailer demonstrating a plurality of sensors detecting the trailer;
FIG. 6B is a side profile view of a vehicle approaching a trailer demonstrating a plurality of sensors detecting the trailer;
FIG. 7 is a flow chart demonstrating a method for controlling an alignment of a vehicle with a trailer utilizing proximity data and image data;
FIG. 8 is a flow chart demonstrating a method for controlling an alignment of a vehicle with a trailer identifying a minimum alignment distance for an alignment operation; and
FIG. 9 is a flow chart demonstrating a method for controlling an alignment of a vehicle with a trailer utilizing proximity data in combination with additional sensor data in accordance with the disclosure.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
For purposes of description herein, the terms “upper,” “lower,” “right,” “left,” “rear,” “front,” “vertical,” “horizontal,” “interior,” “exterior,” and derivatives thereof shall relate to the device as oriented in FIG. 1. However, it is to be understood that the device may assume various alternative orientations, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawing, and described in the following specification are simply exemplary embodiments of the inventive concepts defined in the appended claims. Hence, specific dimensions and other physical characteristics relating to the embodiments disclosed herein are not to be considered as limiting, unless the claims expressly state otherwise. Additionally, unless otherwise specified, it is to be understood that discussion of a particular feature or component extending in or along a given direction or the like does not mean that the feature or component follows a straight line or axis in such a direction or that it only extends in such direction or on such a plane without other directional components or deviations, unless otherwise specified.
Referring generally to FIGS. 1-5, reference numeral 10 designates a hitch assistance system (also referred to as a “hitch assist” system) for a vehicle 12. In various embodiments, hitch assist system 10 includes a controller 14 configured to acquire position data of a coupler 16 of a trailer 18. The controller 14 may be configured to derive a vehicle path 20 to align a hitch ball 22 of the vehicle 12 with the coupler 16. Deriving the vehicle path 20 may include a variety of steps including detecting and compensating for a change in a coupler position 24 in order to control the vehicle 12 to locate a hitch position 26 aligned with the coupler 16. The vehicle path 20 may comprise a plurality of segments 28, which may correspond to changes in the operating direction or steering direction of the vehicle 12. In various embodiments, deriving the vehicle path 20 may include navigating around intervening objects or structures, operating over uneven terrain, following a desired path indicated by an operator or user U, etc. Accordingly, the disclosure may provide for the hitch assist system 10 to provide for improved navigation of the vehicle 12 and/or interaction with the coupler 16 such that trailer 18 may be effectively connected to the vehicle 12 without complication.
In some embodiments, the system 10 may be configured to detect a proximity of the coupler 16 in connection with the trailer 18. The proximity of the trailer 18 may be detected in response to a signal received by the controller 14 from one or more proximity sensors 30. The proximity sensors 30 may correspond to various sensors, including, but not limited to, ultrasonic sensors, electromagnetic sensors, radar sensors, laser sensors, and/or various types of sensors that may be configured to detect a distance of an object along the vehicle path 20. As further discussed in reference to FIGS. 5-9, the system 10 may utilize proximity information from the one or more proximity sensors 30 in combination with additional sensors, which may be utilized to detect and track the coupler position 24. Accordingly, the system 10 may provide for improved operational accuracy and error reduction by utilizing the proximity sensor 30 in combination with at least one additional sensor (e.g., a camera or image system). In this way, the system may accurately identify the coupler position 24 and control the vehicle 12 to align the hitch position 26 with the coupler position 24.
With respect to the general operation of the hitch assist system 10, as illustrated in the system diagram of FIGS. 2-4, the system 10 includes various sensors and devices that obtain or otherwise provide vehicle status-related information. This information includes positioning information from a positioning system 32, which may include a dead reckoning device 34 or, in addition or as an alternative, a global positioning system (GPS), to determine a coordinate location of the vehicle 12 based on the one or more locations of the devices within the positioning system 32. In particular, the dead reckoning device 34 can establish and track the coordinate location of the vehicle 12 within a localized coordinate system 36 based at least on vehicle speed and steering angle δ as shown in FIG. 3. Other vehicle information received by hitch assist system 10 may include a speed of the vehicle 12 from a speed sensor 38 and a yaw rate of the vehicle 12 from a yaw rate sensor 40. It is contemplated that in additional embodiments, a distance or proximity sensor or an array thereof, and other vehicle sensors and devices may provide sensor signals or other information, such as sequential images of the trailer 18, including the detected coupler 16, that the controller 14 of the hitch assist system 10 may process with various routines to determine the height H and position (e.g., based on the distance Dc and angle αc) of coupler 16.
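The dead reckoning described above, which tracks the vehicle within the localized coordinate system 36 from speed and steering angle δ, can be sketched with bicycle-model kinematics. This formulation is an assumption for illustration, not the estimator used by device 34.

```python
import math

def dead_reckon(x, y, heading, speed, steering_angle, wheelbase, dt):
    """Advance the vehicle pose in a local coordinate system from vehicle
    speed and steering angle delta (bicycle-model kinematics; a sketch,
    not the patented estimator)."""
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += (speed / wheelbase) * math.tan(steering_angle) * dt
    return x, y, heading

# Driving straight for one second at 1 m/s moves the pose 1 m forward.
pose = dead_reckon(0.0, 0.0, 0.0, 1.0, 0.0, 3.0, 1.0)  # (1.0, 0.0, 0.0)
```

Integrating such updates at a fixed time step yields the tracked coordinate location that the path derivation and operating routines consume.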
As further shown in FIG. 2, one embodiment of the hitch assist system 10 is in communication with the steering system 50 of vehicle 12. The steering system 50 may be a power assist steering system 50 including a steering motor 52 to operate the steered wheels 54 (FIG. 1) of the vehicle 12 for moving the vehicle 12 in such a manner that the vehicle yaw changes with the vehicle velocity and the steering angle δ. In the illustrated embodiment, the power assist steering system 50 is an electric power-assisted steering (“EPAS”) system including electric steering motor 52 for turning the steered wheels 54 to a steering angle δ based on a steering command, whereby the steering angle δ may be sensed by a steering angle sensor 56 of the power assist steering system 50. The steering command may be provided by the hitch assist system 10 for autonomously steering during a trailer hitch alignment maneuver and may alternatively be provided manually via a rotational position (e.g., steering wheel angle) of a steering wheel of vehicle 12.
In the illustrated embodiment, the steering wheel of the vehicle 12 is mechanically coupled with the steered wheels 54 of the vehicle 12, such that the steering wheel moves in concert with steered wheels 54, preventing manual intervention with the steering wheel during autonomous steering. More specifically, a torque sensor 58 is provided on the power assist steering system 50 that senses torque on the steering wheel that is not expected from autonomous control of the steering wheel and therefore indicative of manual intervention. In this configuration, the hitch assist system 10 may alert the driver to discontinue manual intervention with the steering wheel and/or discontinue autonomous steering. In alternative embodiments, some vehicles have a power assist steering system 50 that allows a steering wheel to be partially decoupled from movement of the steered wheels 54 of such a vehicle.
With continued reference to FIG. 2, the power assist steering system 50 provides the controller 14 of the hitch assist system 10 with information relating to a rotational position of steered wheels 54 of the vehicle 12, including a steering angle δ. The controller 14 in the illustrated embodiment processes the current steering angle, in addition to other vehicle 12 conditions to guide the vehicle 12 along the desired path 20 (FIG. 3). It is conceivable that the hitch assist system 10, in additional embodiments, may be an integrated component of the power assist steering system 50. For example, the power assist steering system 50 may include a hitch assist algorithm for generating vehicle steering information and commands as a function of all or a portion of information received from an imaging system 60, the power assist steering system 50, a vehicle brake control system 62, a powertrain control system 64, and other vehicle sensors and devices, as well as a human-machine interface (“HMI”) 66, as discussed further below.
As also illustrated in FIG. 2, the vehicle brake control system 62 may also communicate with the controller 14 to provide the hitch assist system 10 with braking information, such as vehicle wheel speed, and to receive braking commands from the controller 14. The brake control system 62 may be configured to control service brakes 62 a and a parking brake 62 b. The parking brake 62 b may correspond to an electronic parking brake system that may be in communication with the controller 14. Accordingly in operation, the controller 14 may be configured to control the brakes 62 a and 62 b as well as detect vehicle speed information, which may be determined from individual wheel speed sensors monitored by the brake control system 62. Vehicle speed may also be determined from the powertrain control system 64, the speed sensor 38, and/or the positioning system 32, among other conceivable means. In some embodiments, individual wheel speeds can also be used to determine a vehicle yaw rate, which can be provided to the hitch assist system 10 in the alternative or in addition to the vehicle yaw rate sensor 40.
The hitch assist system 10 can further provide vehicle braking information to the brake control system 62 for allowing the hitch assist system 10 to control braking of the vehicle 12 during backing of the trailer 18. For example, the hitch assist system 10, in some embodiments, may regulate speed of the vehicle 12 during alignment of the vehicle 12 with the coupler 16 of trailer 18, which can reduce the potential for a collision with trailer 18, and can bring vehicle 12 to a complete stop at a determined endpoint 70 of the path 20. It is disclosed herein that the hitch assist system 10 can additionally or alternatively issue an alert signal corresponding to a notification of an actual, impending, and/or anticipated collision with a portion of trailer 18. As mentioned above, regulation of the speed of the vehicle 12 may be advantageous to prevent collision with trailer 18.
In some embodiments, the powertrain control system 64, as shown in the embodiment illustrated in FIG. 2, may also interact with the hitch assist system 10 for regulating speed and acceleration of the vehicle 12 during partial or autonomous alignment with trailer 18. During autonomous operation, the powertrain control system 64 may further be utilized and configured to control a throttle as well as a drive gear selection of a transmission of the vehicle 12. Accordingly, in some embodiments, the controller 14 may be configured to control a gear of the transmission system and/or prompt the user U to shift to a desired gear to complete semi-automated operations of the vehicle 12.
As previously discussed, the hitch assist system 10 may communicate with human-machine interface (“HMI”) 66 of the vehicle 12. The HMI 66 may include a vehicle display 72, such as a center-stack mounted navigation or entertainment display (FIG. 1). HMI 66 further includes an input device, which can be implemented by configuring display 72 as a portion of a touchscreen 74 with circuitry 76 to receive an input corresponding with a location over display 72. Other forms of input, including one or more joysticks, digital input pads, or the like, can be used in place of or in addition to touchscreen 74. Further, the hitch assist system 10 may communicate via wireless communication with another embodiment of the HMI 66, such as with one or more handheld or portable devices 80 (FIG. 1), including one or more smartphones. The portable device 80 may also include the display 72 for displaying one or more images and other information to a user U. For instance, the portable device 80 may display one or more images of the trailer 18 on the display 72 and may be further configured to receive remote user inputs via touchscreen circuitry 76. In addition, the portable device 80 may provide feedback information, such as visual, audible, and tactile alerts.
In some embodiments, the hitch assist system 10 may further be in communication with one or more indicator devices 78. The indicator devices 78 may correspond to conventional vehicle indicators, such as a vehicle horn 78 a, lights 78 b, a speaker system 78 c, vehicle accessories 78 d, etc. In some embodiments, the indicator devices 78 may further include one or more accessories 78 d, which may correspond to communication devices, remote controls, and a variety of devices that may provide for status and operational feedback between the user U and the vehicle 12. For example, in some embodiments, the HMI 66, the display 72, and the touchscreen 74 may be controlled by the controller 14 to provide status updates identifying the operation or to receive instructions or feedback to control the hitch assist system 10. Additionally, in some embodiments, the portable device 80 may be in communication with the controller 14 and configured to display or otherwise indicate one or more alerts or messages related to the operation of the hitch assist system 10.
Still referring to the embodiment shown in FIG. 2, the controller 14 is configured with a microprocessor 82 to process logic and routines stored in memory 84 that receive information from the above-described sensors and vehicle systems, including the imaging system 60, the power assist steering system 50, the vehicle brake control system 62, the powertrain control system 64, and other vehicle sensors and devices. The controller 14 may generate vehicle steering information and commands as a function of all or a portion of the information received. Thereafter, the vehicle steering information and commands may be provided to the power assist steering system 50 for affecting steering of the vehicle 12 to achieve a commanded path 20 (FIG. 3) of travel for alignment with the coupler 16 of trailer 18. The controller 14 may include the microprocessor 82 and/or other analog and/or digital circuitry for processing one or more routines. Also, the controller 14 may include the memory 84 for storing one or more routines, including an image processing routine 86 and/or hitch detection routine, a path derivation routine 88, and an operating routine 90.
It should be appreciated that the controller 14 may be a stand-alone dedicated controller or may be a shared controller integrated with other control functions, such as integrated with a vehicle sensor system, the power assist steering system 50, and other conceivable onboard or off-board vehicle control systems. It should further be appreciated that the image processing routine 86 may be carried out by a dedicated processor, for example, within a stand-alone imaging system for vehicle 12 that can output the results of its image processing to other components and systems of vehicle 12, including microprocessor 82. Further, any system, computer, processor, or the like, that completes image processing functionality, such as that described herein, may be referred to herein as an “image processor” regardless of other functionality it may also implement (including simultaneously with executing image processing routine 86).
System 10 may also incorporate the imaging system 60 that includes one or more exterior cameras. Examples of exterior cameras are illustrated in FIG. 4 and include rear camera 60 a, center high-mount stop light (CHMSL) camera 60 b, and side-view cameras 60 c and 60 d, although other arrangements including additional or alternative cameras are possible. In one example, imaging system 60 can include rear camera 60 a alone or can be configured such that system 10 utilizes only rear camera 60 a in a vehicle with multiple exterior cameras. In another example, the various cameras 60 a-60 d included in imaging system 60 can be positioned to generally overlap in their respective fields of view, which in the depicted arrangement include fields of view 92 (e.g., 92 a, 92 b, 92 c, and 92 d) to correspond with rear camera 60 a, center high-mount stop light (CHMSL) camera 60 b, and side-view cameras 60 c and 60 d, respectively. In this manner, image data from two or more of the cameras can be combined in image processing routine 86, or in another dedicated image processor within imaging system 60, into a single image.
As an example of combining image data from multiple cameras, the image data can be used to derive stereoscopic image data that can be used to reconstruct a three-dimensional scene of the area or areas within overlapped areas of the various fields of view 92 a, 92 b, 92 c, and 92 d, including any objects (obstacles or coupler 16, for example) therein. In an embodiment, the use of two images including the same object can be used to determine a location of the object relative to the two image sources, given a known spatial relationship between the image sources. In this respect, the image processing routine 86 can use known programming and/or functionality to identify an object within image data from the various cameras 60 a, 60 b, 60 c, and 60 d within imaging system 60. In either example, the image processing routine 86 can include information related to the positioning of any cameras 60 a, 60 b, 60 c, and 60 d present on vehicle 12 or utilized by system 10, including relative to a center 96 (FIG. 1) of vehicle 12, for example, such that the positions of cameras 60 a, 60 b, 60 c, and 60 d relative to center 96 and/or to each other can be used for object positioning calculations and to result in object position data relative to the center 96 of vehicle 12, for example, or other features of vehicle 12, such as hitch ball 22 (FIG. 1), with known positions relative to center 96 of the vehicle 12.
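The two-camera localization described above can be sketched as a ray-intersection computation: each camera contributes a bearing to the object, and the known spatial relationship between the cameras fixes the object's position. This is a simplified two-dimensional sketch under assumed conditions, not the patent's implementation; the function name and coordinate conventions are illustrative.

```python
import math


def locate_from_bearings(cam1, bearing1, cam2, bearing2):
    """Localize an object seen by two cameras at known (x, y) positions.

    Each camera reports a bearing angle (radians, measured in a shared
    ground-plane frame).  Solves cam1 + t1*d1 = cam2 + t2*d2 for the
    ray parameters via Cramer's rule and returns the intersection.
    """
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Determinant of the 2x2 system with columns [d1, -d2].
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-9:
        raise ValueError("rays are parallel; no unique intersection")
    bx, by = cam2[0] - cam1[0], cam2[1] - cam1[1]
    t1 = (bx * (-d2[1]) - by * (-d2[0])) / det
    return (cam1[0] + t1 * d1[0], cam1[1] + t1 * d1[1])
```

For instance, two cameras two meters apart, each sighting the object at 45 degrees inward, place it one meter ahead of the midpoint of the baseline.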
The image processing routine 86 can be specifically programmed or otherwise configured to locate coupler 16 within image data. In one example, the image processing routine 86 can identify the coupler 16 within the image data based on stored or otherwise known visual characteristics of coupler 16 or hitches in general. In another embodiment, a marker in the form of a sticker, or the like, may be affixed with trailer 18 in a specified position relative to coupler 16 in a manner similar to that which is described in commonly-assigned U.S. Pat. No. 9,102,271, the entire disclosure of which is incorporated by reference herein. In such an embodiment, image processing routine 86 may be programmed with identifying characteristics of the marker for location in image data, as well as the positioning of coupler 16 relative to such a marker so that the position 24 of the coupler 16 can be determined based on the marker location.
Additionally or alternatively, controller 14 may seek confirmation of the detected coupler 16, via a prompt on touchscreen 74. If the coupler 16 determination is not confirmed, further image processing may be provided, or user-adjustment of the position 24 of coupler 16 may be facilitated, either using touchscreen 74 or another input to allow the user U to move the depicted position 24 of coupler 16 on touchscreen 74, which controller 14 uses to adjust the determination of position 24 of coupler 16 with respect to vehicle 12 based on the above-described use of image data. Alternatively, the user U can visually determine the position 24 of coupler 16 within an image presented on HMI 66 and can provide a touch input in a manner similar to that which is described in commonly-assigned U.S. Pat. No. 10,266,023, the entire disclosure of which is incorporated by reference herein. The image processing routine 86 can then correlate the location of the touch input with the coordinate system 36 applied to image data shown on the display 72, which may be depicted as shown in FIG. 3.
As shown in FIG. 3, the image processing routine 86 and operating routine 90 may be used in conjunction with each other to determine the path 20 along which hitch assist system 10 can guide vehicle 12 to align hitch ball 22 and coupler 16 of trailer 18. In the example shown, an initial position of vehicle 12 relative to trailer 18 may be such that coupler 16 is only in the field of view 92 c of side camera 60 c, with vehicle 12 being positioned laterally from trailer 18 but with coupler 16 being almost longitudinally aligned with hitch ball 22. In this manner, upon initiation of hitch assist system 10, such as by user input on touchscreen 74, for example, image processing routine 86 can identify coupler 16 within the image data of camera 60 c and estimate the position 24 of coupler 16 relative to hitch ball 22. The position 24 of the coupler 16 may be identified by the system 10 using the image data, for example, by receiving focal length information within the image data to determine a distance Dc to coupler 16 and an angle αc of offset between coupler 16 and the longitudinal axis of vehicle 12. This information may also be used in light of the position 24 of coupler 16 within the field of view of the image data to determine or estimate the height Hc of coupler 16. Once the positioning Dc, αc of coupler 16 has been determined and, optionally, confirmed by the user U, the controller 14 can take control of at least the vehicle steering system 50 to control the movement of vehicle 12 along the desired path 20 to align the hitch position 26 of the vehicle hitch ball 22 with coupler 16.
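A minimal sketch of how the offset angle αc and a ground range such as Dc might be estimated under a simple pinhole-camera assumption follows. The parameters (principal point, focal length in pixels, camera mounting height, depression angle) are illustrative assumptions, not values from the disclosure.

```python
import math


def offset_angle(u, cx, focal_px):
    """Horizontal offset angle (rad) of a detected feature at image
    column u, for a pinhole camera with principal point column cx and
    focal length focal_px (both in pixels)."""
    return math.atan2(u - cx, focal_px)


def ground_range(camera_height, target_height, depression_rad):
    """Range along the ground to a target of known height, from the
    camera mounting height and the sighting ray's depression angle
    below horizontal."""
    return (camera_height - target_height) / math.tan(depression_rad)
```

A feature at the principal point has zero offset; a coupler half a meter tall seen from a camera one meter up, along a ray depressed by atan(0.25), lies about two meters away.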
Continuing with reference to FIGS. 3 and 4 with additional reference to FIG. 2, controller 14, having estimated the positioning Dc, αc of coupler 16, as discussed above, can, in one example, execute path derivation routine 88 to determine vehicle path 20 to align the vehicle hitch ball 22 with coupler 16. In particular, controller 14 can have stored in memory 84 various characteristics of vehicle 12, including the wheelbase W, the distance from the rear axle to the hitch ball 22, which is referred to herein as the drawbar length L, as well as the maximum angle to which the steered wheels 54 can be turned δmax. As shown, the wheelbase W and the current steering angle δ can be used to determine a corresponding turning radius ρ for vehicle 12 according to the equation:
ρ = W/tan δ  (1)
in which the wheelbase W is fixed and the steering angle δ can be controlled by controller 14 by communication with steering system 50, as discussed above. In this manner, when the maximum steering angle δmax is known, the smallest possible value for the turning radius ρmin is determined as:
ρmin = W/tan δmax  (2)
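Equations (1) and (2) can be evaluated directly. The sketch below assumes the standard bicycle-model form, with the wheelbase W in meters and steering angles in radians; the function names are illustrative.

```python
import math


def turning_radius(wheelbase_m, steering_angle_rad):
    """Bicycle-model turning radius: rho = W / tan(delta)."""
    return wheelbase_m / math.tan(steering_angle_rad)


def min_turning_radius(wheelbase_m, max_steering_angle_rad):
    """Smallest achievable radius, reached at the steering limit delta_max."""
    return wheelbase_m / math.tan(max_steering_angle_rad)
```

Note that a larger steering angle yields a smaller radius, so the minimum radius corresponds to the maximum steering angle, as the text states.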
Path derivation routine 88 can be programmed to derive vehicle path 20 to align a known location of the vehicle hitch ball 22 with the estimated position 24 of coupler 16 that takes into account the determined minimum turning radius ρmin to allow path 20 to use the minimum amount of space and maneuvers. In this manner, path derivation routine 88 can use the position of vehicle 12, which can be based on the center 96 of vehicle 12, a location along the rear axle, the location of the dead reckoning device 34, or another known location on the coordinate system 36, to determine both a lateral distance to the coupler 16 and a forward or rearward distance to coupler 16 and derive a path 20 that achieves the needed lateral and forward-backward movement of vehicle 12 within the limitations of steering system 50. The derivation of path 20 further takes into account the positioning of hitch ball 22, based on length L, relative to the tracked location of vehicle 12 (which may correspond with the center 96 of mass of vehicle 12, the location of a GPS receiver, or another specified, known area) to determine the needed positioning of vehicle 12 to align hitch ball 22 with coupler 16.
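One way to make the minimum-radius constraint concrete is to test whether a single circular arc, tangent to the vehicle's current heading, can reach the coupler. This is a simplified geometric sketch, not the patent's path derivation routine 88; the coordinate convention (heading along +x, lateral offset along y) is an assumption.

```python
def arc_radius_to_target(dx, dy):
    """Radius of the single circular arc that starts at the vehicle
    reference point, is tangent to the current heading (the +x axis),
    and passes through the target at (dx, dy).

    Geometry: a circle through the origin tangent to the x-axis has
    center (0, rho), so dx**2 + (dy - rho)**2 = rho**2, giving
    rho = (dx**2 + dy**2) / (2 * dy).  dy == 0 means the target is
    straight ahead (infinite radius, i.e. drive straight).
    """
    if dy == 0:
        return float("inf")
    return (dx * dx + dy * dy) / (2.0 * dy)


def single_arc_feasible(dx, dy, rho_min):
    """True if the single-arc path respects the minimum turning radius."""
    return abs(arc_radius_to_target(dx, dy)) >= rho_min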
FIG. 5 demonstrates a projected view of image data demonstrating an alignment sequence with the trailer 18. Additionally, FIGS. 6A and 6B demonstrate side profile views of the vehicle 12 approaching the trailer 18 along the vehicle path 20. Referring to FIGS. 5, 6A, and 6B, in some embodiments, the system 10 may be configured to detect a proximity of the coupler 16 in connection with the trailer 18. The proximity of the trailer 18 may be detected in response to a signal received by the controller 14 from one or more proximity sensors 30. The proximity sensors 30 may correspond to various sensors including but not limited to ultrasonic sensors, electromagnetic sensors, radar sensors, laser sensors, and/or various types of sensors that may be configured to detect a distance of an object along the vehicle path 20. In this way, the controller 14 may be configured to utilize the proximity data in combination with image data or additional location information to verify and track the coupler position 24.
As demonstrated in FIG. 5, the coupler position 24 may be identified in the image data captured by the imaging system 60. Additionally, a position of the trailer 18, represented as an outline 102, may be identified in the image data. Based on the position of the trailer 18 (outline 102) in combination with the coupler position 24, the controller 14 may generally be operable to reliably identify the coupler position 24. However, in some circumstances, the controller 14 may identify a false position 104 of the coupler 16 in the image data. In such situations, the operation of the operating routine 90 aligning the hitch position 26 with coupler position 24 may result in the vehicle 12 coming in contact with the coupler 16 due to an overshoot condition.
In order to improve the operational accuracy and reduce the likelihood of errors in the detection of the coupler position 24, the system 10 may be in communication with the at least one proximity sensor 30. Based on proximity data received from the proximity sensor 30, the controller 14 may verify the coupler position 24 in relation to the hitch position 26 along the vehicle path 20. In this way, the controller 14 may compare the proximity data identifying a proximity of the trailer 18 with the image data identifying the coupler position 24 to ensure that the distance Dc to the coupler 16 is accurately identified.
Referring now to FIGS. 6A and 6B, the vehicle 12 is demonstrated in an approach configuration 110 and an aligned configuration 112 in relation to the trailer 18. In addition to the challenges that may be related to the identification of the coupler 16 in the false position 104, the system 10 may also have a limited operating range over which the vehicle path 20 may be identified in the image data via the image processing routine 86. For example, in response to the distance Dc to the coupler 16 being less than a tracking threshold (e.g., a predetermined minimum tracking distance), the controller 14 may be inoperable to accurately identify the coupler position 24 in the image data. Additionally, if the distance Dc to the coupler 16 is less than a tracking threshold, the controller 14 may be unable to calculate the vehicle path 20 via the path derivation routine 88. Under such circumstances, operation of the system 10 in response to a request to complete the operating routine 90 may result in an error or failure.
Accordingly, in order to improve the operation of the system 10, the controller 14 may be in communication with the at least one proximity sensor 30 to identify whether the distance Dc to the coupler 16 is less than the minimum tracking threshold. Based on the proximity data from the proximity sensor 30, the controller 14 may identify the proximity of the trailer 18 to approximate the distance Dc. Once the distance Dc is approximated via the proximity data from the proximity sensor 30, the controller 14 may output instructions to the user U via the HMI 66 to move the vehicle 12 away from the trailer 18. In this way, the controller 14 may be configured to detect that the trailer 18 is too close to the vehicle 12 to successfully process the path derivation and operating routines 88, 90 and instruct the user U to increase the distance Dc beyond the tracking threshold.
In FIGS. 6A and 6B, a proximity signal 114 is shown emitted from the at least one proximity sensor 30. In operation, the controller 14 may not be operable to distinguish a specific portion of the trailer 18 based solely on the proximity data from the proximity sensor 30. However, the controller 14 may accurately identify the general proximity or distance of the trailer 18 from the proximity sensor 30 to accurately indicate whether the distance Dc of the coupler 16 is less than the tracking threshold. Accordingly, the controller 14 may be configured to utilize the proximity data to determine whether the system 10 is sufficiently far from the trailer 18 to accurately identify the coupler position 24 and process the operating routine 90. As further discussed in reference to FIGS. 7, 8, and 9, the controller 14 may apply the proximity data communicated from the proximity sensor 30 in combination with the image data communicated by the imaging system 60 to provide for various operating methods that may improve the accuracy and operation of the system 10.
Referring to FIG. 7, a flow chart demonstrating a method 120 for controlling an alignment of the vehicle 12 with the trailer 18 is shown describing an exemplary operation utilizing the proximity data in combination with the image data. The method 120 may begin in response to the initiation of the hitch connection routine (122). For example, prior to the activation of the image processing routine 86, the controller 14 may control the at least one proximity sensor 30 to scan a region proximate the vehicle 12 to determine the proximity of the trailer 18. In response to the activation of the proximity sensor 30, the controller 14 may receive proximity data and detect the proximity of the trailer 18 (124). Additionally, the controller 14 may activate or control the imaging system 60 to capture image data and detect or attempt to detect the trailer 18 and coupler position 24 in the image data (126).
Based on the proximity data, in step 128, the controller 14 may compare the proximity or distance to the trailer 18 with the minimum distance tracking threshold. As previously discussed, the minimum distance tracking threshold may correspond to a minimum distance required for the system 10 to accurately identify the coupler position 24 in the image data. If the trailer distance identified based on the proximity data from the proximity sensor 30 is greater than the minimum tracking threshold, the controller 14 may continue to step 132 and monitor the proximity data for the coupler range or distance to the trailer 18. If the trailer distance is not greater than the minimum distance tracking threshold in step 128, the controller 14 may output an instruction via the HMI 66 instructing the user U to move the vehicle 12 away from the trailer 18 (130).
Following step 132, the controller 14 may continue the method 120 by controlling the movement of the vehicle 12 aligning the hitch position 26 with the coupler position 24 (134). During the alignment operation, the controller 14 may compare the distance Dc to the coupler 16 as identified from the image data as the coupler position 24 to the trailer proximity as identified by the proximity data from the proximity sensor 30 (136). In step 136, if the distance Dc to the coupler 16 as identified from the image data is greater than the trailer proximity identified based on the proximity data, the controller 14 may apply braking via the brake control system 62 to halt the vehicle 12. In this way, the system 10 may prevent a potential collision between the vehicle 12 and the trailer 18 (138). Following step 138, the method 120 may return to step 130 instructing the user to move the vehicle 12 away from the trailer 18. If the distance Dc to the coupler 16 identified based on the image data is not greater than the trailer proximity in step 136, the controller 14 may continue to complete the operating routine 90 aligning the hitch position 26 with the coupler position 24 (140). In this way, the method 120 may provide for the system 10 to utilize the proximity data in combination with the image data to improve the accuracy and operation of the system 10.
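The decision logic of method 120 above can be condensed into a single supervisory step: instruct the user to pull away when the trailer is inside the minimum tracking threshold, brake when the image-derived coupler distance exceeds the ultrasonic proximity, and otherwise continue the alignment. This is a hedged sketch; the function name, parameter names, and string return values are illustrative, not from the patent.

```python
def hitch_alignment_step(image_distance_dc, trailer_proximity,
                         min_tracking_threshold):
    """One supervision step of the alignment loop sketched in FIG. 7.

    image_distance_dc      -- coupler distance Dc from the image data
    trailer_proximity      -- trailer distance from the proximity sensor
    min_tracking_threshold -- minimum distance for reliable image tracking
    Returns 'move_away', 'brake', or 'continue'.
    """
    if trailer_proximity <= min_tracking_threshold:
        # Trailer too close for the image routines to track the coupler.
        return "move_away"
    if image_distance_dc > trailer_proximity:
        # Image-based distance exceeds the measured proximity: the
        # coupler position is likely misidentified; halt to avoid
        # overshooting into the trailer.
        return "brake"
    return "continue"
```

In use, the controller would call this each control cycle and map 'brake' to a brake command and 'move_away' to an HMI instruction.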
Referring now to FIG. 8, a flow chart of a method 150 for controlling alignment of the vehicle 12 with the trailer 18 is shown, demonstrating the identification of a minimum alignment distance or minimum tracking threshold that may be required in some cases for accurate operation of the system 10. The method may begin by initiating a hitch connection routine (e.g., the operating routine 90) for the vehicle 12 (152). Upon initiation of the routine, the controller 14 may activate the proximity sensor 30 and detect the proximity of the trailer 18 via the proximity data (154). Based on the proximity data, in step 156, the controller 14 may identify whether the trailer 18 is within a maximum detection range from which the system 10 can accurately identify and maneuver the hitch ball 22 of the vehicle 12 to align with the coupler 16 of the trailer 18.
In step 156, if the trailer 18 is beyond the maximum detection range, the controller 14 may display an instruction on the HMI 66 instructing the user to decrease the distance Dc to the trailer 18 (158). If the trailer 18 is within the maximum detection range in step 156, the controller 14 may process the proximity data from the proximity sensor 30 to estimate the proximity of the trailer 18 (160). Additionally, the controller 14 may control the imaging system 60 to capture image data to identify the trailer 18 and the coupler position 24 (162). In step 164, the controller 14 may determine whether the trailer 18 and/or the coupler position 24 are detected. If the trailer 18 or corresponding coupler position 24 are not identified in step 164, the controller 14 may notify the user U of the non-detection of the trailer 18 and display instructions on the HMI 66 to assist the user U in aligning the vehicle 12 with the trailer 18 (166).
In step 164, the controller 14 may utilize the proximity data from the proximity sensor 30 as well as the image data from the imaging system 60 to detect the trailer 18 and/or the corresponding coupler position 24. If the trailer 18 is detected in step 164, the controller 14 may process the proximity data from the proximity sensor 30 to identify if the trailer distance is less than the minimum distance tracking threshold (168). If the trailer distance or proximity is less than the minimum tracking threshold in step 168, the controller 14 may continue to step 170 and display instructions to the user U on the HMI 66 to increase the distance between the vehicle 12 and the trailer 18 (170). In step 168, if the trailer distance or proximity is greater than the minimum tracking threshold, the controller 14 may continue to step 172 and control the vehicle 12 along the vehicle path 20 aligning the hitch position 26 with the coupler position 24. Accordingly, the method 150 may provide for the proximity sensor 30 to be used in combination with the imaging system 60 to improve the robustness of the operation of the system 10.
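The range gating of method 150 reduces to a small decision function: the trailer must lie inside the maximum detection range but beyond the minimum tracking threshold before alignment proceeds. The sketch below is illustrative; the state names are assumptions, not terms from the patent.

```python
def range_gate(trailer_proximity, min_tracking, max_detection):
    """Classify the measured trailer proximity against the operating
    window required by the alignment routines (FIG. 8).

    Returns 'move_closer' if the trailer is beyond the maximum
    detection range, 'move_away' if it is inside the minimum tracking
    threshold, and 'proceed' if it is within the usable window.
    """
    if trailer_proximity > max_detection:
        return "move_closer"
    if trailer_proximity < min_tracking:
        return "move_away"
    return "proceed"
```

Only the 'proceed' case advances to controlling the vehicle along the path 20; the other two cases map to HMI instructions to the user.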
Referring now to FIG. 9, a flow chart demonstrating a method 180 for controlling an alignment of the vehicle 12 with the trailer 18 utilizing the proximity data in combination with additional sensor data is shown. The method 180 may begin by the controller 14 initiating the hitch connection routine (182). The hitch connection routine may begin by detecting or attempting to detect the trailer 18 and/or the coupler position 24 in the image data as provided by the imaging system 60 (184). Additionally, the controller 14 may activate and monitor the proximity data from the proximity sensor 30 to identify the proximity of the trailer 18 (186). With the coupler position 24 identified in the image data, the controller 14 may apply the operating routine 90 to align the hitch position 26 with the coupler position 24 (188).
During the alignment, the controller 14 may monitor the proximity of the trailer 18 as identified from the proximity data in order to identify the approximate distance traveled by the vehicle 12. Based on the distance traveled in step 190, the controller 14 may compare the distance Dc to the coupler 16 with the approximate distance traveled plus a predetermined distance threshold (190). In this way, the controller 14 may compare the approximate distance traveled plus the distance threshold with the distance Dc to the coupler 16 to identify whether the coupler position 24 is misidentified or changing in the image data. If the distance Dc to the coupler 16 is greater than the distance traveled plus the threshold in step 190, the controller 14 may apply braking via the brake control system 62 to halt the vehicle 12 and prevent a collision (192). In step 190, if the distance Dc to the coupler 16 is not greater than the distance traveled plus the threshold, the method 180 may continue by completing the alignment of the hitch position 26 with the coupler position 24 (194).
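The check in method 180 can be sketched as a single predicate: brake when the image-based coupler distance Dc exceeds the distance actually covered (the change in measured proximity) plus a predetermined margin, which indicates the coupler position is misidentified or drifting in the image data. The function and parameter names below are illustrative assumptions.

```python
def should_brake(dc_image, proximity_change, distance_threshold):
    """Supervisory brake decision from FIG. 9.

    dc_image           -- coupler distance Dc from the image data
    proximity_change   -- change in trailer proximity (distance covered)
    distance_threshold -- predetermined tolerance margin
    Returns True when Dc exceeds the covered distance plus the margin,
    i.e. when the image track disagrees with the proximity data.
    """
    return dc_image > proximity_change + distance_threshold
```

For example, with a 0.5 m margin, a reported Dc of 3.0 m after covering only 2.0 m triggers braking, while 2.3 m does not.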
As discussed herein, the disclosure provides for various solutions that may improve the accuracy and robustness of the operation of the system 10. Accordingly, the disclosure may provide for an improved experience of the user U in various settings. Though specific detailed steps were discussed in reference to the exemplary embodiments, such examples are provided merely to demonstrate useful applications of the systems and devices disclosed by the application. It shall be understood that the system 10 and corresponding methods are provided strictly as exemplary illustrations of the disclosure that may vary or be combined in various ways without departing from the spirit of the disclosure. Additionally, the detailed embodiment shall not be considered limiting to the scope of the disclosure unless expressly required by the claims.
It is to be understood that variations and modifications can be made on the aforementioned structure without departing from the concepts of the present disclosure, and further it is to be understood that such concepts are intended to be covered by the following claims unless these claims by their language expressly state otherwise.
For purposes of this disclosure, the term “coupled” (in all of its forms, couple, coupling, coupled, etc.) generally means the joining of two components (electrical or mechanical) directly or indirectly to one another. Such joining may be stationary in nature or movable in nature. Such joining may be achieved with the two components (electrical or mechanical) and any additional intermediate members being integrally formed as a single unitary body with one another or with the two components. Such joining may be permanent in nature or may be removable or releasable in nature unless otherwise stated.
It is also important to note that the construction and arrangement of the elements of the disclosure as shown in the exemplary embodiments is illustrative only. Although only a few embodiments of the present innovations have been described in detail in this disclosure, those skilled in the art who review this disclosure will readily appreciate that many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.) without materially departing from the novel teachings and advantages of the subject matter recited. For example, elements shown as integrally formed may be constructed of multiple parts or elements shown as multiple parts may be integrally formed, the operation of the interfaces may be reversed or otherwise varied, the length or width of the structures and/or members or connector or other elements of the system may be varied, the nature or number of adjustment positions provided between the elements may be varied. It should be noted that the elements and/or assemblies of the system may be constructed from any of a wide variety of materials that provide sufficient strength or durability, in any of a wide variety of colors, textures, and combinations. Accordingly, all such modifications are intended to be included within the scope of the present innovations. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the desired and other exemplary embodiments without departing from the spirit of the present innovations.
It will be understood that any described processes or steps within described processes may be combined with other disclosed processes or steps to form structures within the scope of the present disclosure. The exemplary structures and processes disclosed herein are for illustrative purposes and are not to be construed as limiting.

Claims (17)

What is claimed is:
1. A vehicle system, comprising:
a hitch ball mounted on a vehicle;
a plurality of sensor devices comprising an ultrasonic sensor and an image sensor; and
a controller configured to:
process image data from the image sensor identifying a coupler position of a trailer in a detection range of the image sensor, wherein the detection range comprises a minimum tracking distance within which the controller is inoperable to accurately track the coupler position via the image data;
process ultrasonic data from the ultrasonic sensor identifying a proximity of the trailer; and
identify the trailer within the minimum tracking distance of the image sensor based on the proximity of the trailer.
2. The system according to claim 1, wherein the controller is further configured to:
control a motion of the vehicle aligning the hitch ball with the coupler position.
3. The system according to claim 2, wherein the controller is further configured to:
correct for a misidentification of the coupler position identified in the image data based on the proximity of the trailer identified from the ultrasonic data.
4. The system according to claim 1, wherein the controller is further configured to:
monitor the proximity of the trailer based on the ultrasonic data;
monitor the location of the coupler based on the image data; and
suppress a motion instruction calculated based on the location of the coupler detected in the image data in response to the proximity of the trailer.
5. The system according to claim 4, wherein the suppression of the motion instruction comprises stopping the motion of the vehicle.
6. The system according to claim 4, wherein the suppression of the motion instruction is in response to the proximity of the trailer being less than a sum of a change in the proximity detected via the ultrasonic data and a predetermined distance constant.
7. The system according to claim 1, wherein the controller is further configured to:
control a notification indicating that the trailer is outside the detection range.
8. The system according to claim 7, wherein the notification provides for the vehicle to be repositioned at a greater distance within the detection range.
9. A method for controlling a vehicle comprising:
processing image data from an image sensor identifying a coupler position of a trailer in a detection range of the image sensor, wherein the detection range comprises a minimum tracking distance within which the controller is inoperable to accurately track the coupler position via the image data;
processing ultrasonic data identifying a proximity of the trailer;
controlling a motion of the vehicle aligning the hitch ball with the coupler position;
monitoring proximity of the trailer relative to the coupler position;
identifying the trailer within the minimum tracking distance of the image sensor based on the proximity of the trailer; and
halting the motion of the vehicle in response to the proximity of the trailer being within the minimum tracking distance.
10. The method according to claim 9, further comprising:
capturing the image data with an image sensor; and
capturing the ultrasonic data with an ultrasonic sensor.
11. The method according to claim 9, wherein the halting of the motion of the vehicle comprises:
correcting for a misidentification of the coupler position identified in the image data based on the proximity of the trailer identified from the ultrasonic data.
12. The method according to claim 9, further comprising:
comparing the coupler position identified based on the image data to the proximity of the trailer identified based on the ultrasonic data, wherein the comparison comprises comparing a distance to the coupler position with a change in the proximity.
13. The method according to claim 12, wherein the comparison further comprises a sum adding a predetermined distance coefficient to the change in proximity and comparing the distance to the coupler position with the sum of the change in proximity and the predetermined distance coefficient.
14. A vehicle system, comprising:
a hitch ball mounted on a vehicle;
an ultrasonic sensor configured to capture ultrasonic data rearward of the vehicle;
an image sensor configured to capture image data rearward of the vehicle; and
a controller configured to:
process image data from the image sensor identifying a coupler position of a trailer in a detection range of the image sensor, wherein the detection range comprises a minimum tracking distance within which the controller is inoperable to accurately track the coupler position via the image data;
control a motion of the vehicle aligning the hitch ball with the coupler position;
process ultrasonic data from the ultrasonic sensor identifying a proximity of the trailer;
monitor the proximity of the trailer relative to the coupler position; and
correct for a misidentification of the coupler position identified in the image data within the minimum tracking distance based on the proximity of the trailer identified from the ultrasonic data.
15. The system according to claim 14, wherein the correcting for the misidentification comprises halting the motion of the vehicle in response to a comparison of the coupler position with the proximity.
16. The system according to claim 15, wherein the comparison comprises comparing a distance to the coupler position with a change in the proximity detected via the ultrasonic data.
17. The system according to claim 16, wherein the comparison further comprises calculating a sum of a predetermined distance coefficient and the change in proximity and comparing the distance to the coupler position with the sum of the change in proximity and the predetermined distance coefficient.
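The comparison recited in claims 6, 13, and 17 and the minimum-tracking-distance halt of claim 9 can be illustrated with a short sketch. This is not the patented implementation; all names, thresholds, and the 0.15 m / 0.3 m constants below are hypothetical values chosen only for illustration.

```python
# Hypothetical sketch of the braking logic described in the claims:
# motion is suppressed when the camera-estimated distance to the coupler
# falls below the sum of the change in ultrasonic proximity and a
# predetermined distance constant, or when the trailer's proximity is
# within the camera's minimum tracking distance.

DISTANCE_CONSTANT = 0.15        # hypothetical predetermined constant, meters
MIN_TRACKING_DISTANCE = 0.3     # hypothetical camera minimum tracking range, meters


def suppress_motion(coupler_distance: float,
                    previous_proximity: float,
                    current_proximity: float,
                    distance_constant: float = DISTANCE_CONSTANT) -> bool:
    """Return True when the reversing motion instruction should be suppressed.

    coupler_distance: distance to the coupler estimated from image data (m).
    previous_proximity / current_proximity: successive ultrasonic readings (m).
    """
    # Change in proximity between consecutive ultrasonic samples (claims 6, 16).
    delta_proximity = abs(previous_proximity - current_proximity)
    # Claims 6, 13, 17: compare the image-based distance with the sum of the
    # proximity change and the predetermined distance constant.
    return coupler_distance < delta_proximity + distance_constant


def within_minimum_tracking_distance(proximity: float,
                                     min_tracking: float = MIN_TRACKING_DISTANCE) -> bool:
    """Claim 9: the trailer is too close for the camera to track reliably."""
    return proximity < min_tracking
```

For example, with a coupler distance of 0.2 m while the ultrasonic proximity drops from 1.0 m to 0.9 m, the 0.2 m reading is less than the 0.1 m change plus the 0.15 m constant, so motion would be suppressed; at a coupler distance of 0.5 m it would not.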
US16/117,076 2018-08-30 2018-08-30 System and methods for reverse braking during automated hitch alignment Active 2039-04-06 US10962980B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/117,076 US10962980B2 (en) 2018-08-30 2018-08-30 System and methods for reverse braking during automated hitch alignment
CN201910797174.8A CN110871652A (en) 2018-08-30 2019-08-27 System and method for reverse braking during automatic hitch alignment
DE102019123125.6A DE102019123125A1 (en) 2018-08-30 2019-08-28 SYSTEM AND METHOD FOR REVERSE BRAKING DURING AUTOMATIC CLUTCH ORIENTATION

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/117,076 US10962980B2 (en) 2018-08-30 2018-08-30 System and methods for reverse braking during automated hitch alignment

Publications (2)

Publication Number Publication Date
US20200073398A1 US20200073398A1 (en) 2020-03-05
US10962980B2 true US10962980B2 (en) 2021-03-30

Family

ID=69526912

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/117,076 Active 2039-04-06 US10962980B2 (en) 2018-08-30 2018-08-30 System and methods for reverse braking during automated hitch alignment

Country Status (3)

Country Link
US (1) US10962980B2 (en)
CN (1) CN110871652A (en)
DE (1) DE102019123125A1 (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11030476B2 (en) * 2018-11-29 2021-06-08 Element Ai Inc. System and method for detecting and tracking objects
US11090991B2 (en) * 2018-12-04 2021-08-17 Ford Global Technologies, Llc Human machine interface for vehicle alignment in an acceptable hitch zone
US10864848B2 (en) * 2018-12-21 2020-12-15 Continental Automotive Systems, Inc. Reverse lights trailer hitch assist
US11022972B2 (en) 2019-07-31 2021-06-01 Bell Textron Inc. Navigation system with camera assist
CN111898460A (en) * 2020-07-08 2020-11-06 中国神华能源股份有限公司神朔铁路分公司 Locomotive auxiliary trailer system, method, device, equipment and storage medium
US11667165B1 (en) * 2020-09-29 2023-06-06 Orbcomm Inc. System, method and apparatus for multi-zone container monitoring
CN114228410A (en) * 2021-12-24 2022-03-25 上海创时汽车科技有限公司 Automatic connection device and control method for intelligent driving automobile

Patent Citations (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040226768A1 (en) 2000-08-09 2004-11-18 Deluca Michael J. Automatic hold parking brake
US6480104B1 (en) 2001-04-23 2002-11-12 Darby S. Wall Trailer alignment method and apparatus
US20060293800A1 (en) 2004-02-24 2006-12-28 Bayerische Motoren Werke Aktiengesellschaft Process for coupling a trailer with the use of a vehicle level regulation system
US20050246081A1 (en) 2004-04-10 2005-11-03 Christophe Bonnet Method and system to prevent unintended rolling of a vehicle
US20070058838A1 (en) * 2005-09-13 2007-03-15 Kabushiki Kaisha Toshiba Object position detecting apparatus, map creating apparatus, autonomous mobile apparatus, object position detecting method, and computer program product for object position detection
JP2009051725A (en) 2007-08-24 2009-03-12 Kofukin Seimitsu Kogyo (Shenzhen) Yugenkoshi Thermal conductive sheet containing high-density carbon nanotube array, and method of manufacturing the same
US20100126985A1 (en) 2008-06-13 2010-05-27 Tsinghua University Carbon nanotube heater
US9102271B2 (en) 2011-04-19 2015-08-11 Ford Global Technologies, Llc Trailer monitoring system and method
US20120283909A1 (en) 2011-05-03 2012-11-08 Dix Peter J System and method for positioning a vehicle with a hitch using an automatic steering system
EP2682329A1 (en) 2012-07-05 2014-01-08 Uusi, LLC Vehicle trailer connect system
US20140183841A1 (en) 2012-12-21 2014-07-03 Dustin Jones Tow Hitch System with Brake Sensor
US9550399B2 (en) 2012-12-21 2017-01-24 Intelli-Hitch, Llc Tow hitch with brake sensor system and method of use
US20160001632A1 (en) 2013-02-22 2016-01-07 Lg Hausys, Ltd. Automotive sheet heater using radiant heat
US20160185169A1 (en) 2013-07-11 2016-06-30 Smart Innovation As System and method for connecting or disconnecting a trailer to a vehicle
US20160272024A1 (en) 2013-11-18 2016-09-22 Robert Bosch Gmbh Overhead view for hitch connection
US20160320477A1 (en) * 2013-12-17 2016-11-03 Valeo Schalter Und Sensoren Gmbh Method for detecting a mark made on a ground, driver assistance device and motor vehicle
US20150234045A1 (en) * 2014-02-20 2015-08-20 Mobileye Vision Technologies Ltd. Navigation based on radar-cued visual imaging
US20160374147A1 (en) 2014-03-31 2016-12-22 Lg Hausys, Ltd. Heating seat with high efficiency for vehicle
US20170139419A1 (en) * 2014-06-30 2017-05-18 Husqvarna Ab Improved robotic working tool
US9434381B2 (en) 2014-10-31 2016-09-06 Fca Us Llc Collision avoidance method including determining height of an object fixed to a vehicle
US20160167482A1 (en) 2014-12-10 2016-06-16 Hyundai Motor Company Heating panel for vehicle
US20160288601A1 (en) * 2015-04-01 2016-10-06 Robert Bosch Gmbh Trailer coupling assistance system with vehicle video camera
US9499018B2 (en) 2015-04-01 2016-11-22 Robert Bosch Gmbh Trailer coupling assistance system with vehicle video camera
US20160347263A1 (en) 2015-05-28 2016-12-01 Daehan Solution Co., Ltd Headlining having heat shielding function for vehicle and manufacturing method thereof
US9457632B1 (en) 2015-06-11 2016-10-04 Fca Us Llc Collision avoidance method including determining height of an object fixed to a vehicle
US20170158007A1 (en) * 2015-12-08 2017-06-08 Ford Global Technologies, Llc Trailer backup assist system with hitch assist
US20170274827A1 (en) * 2016-03-23 2017-09-28 GM Global Technology Operations LLC Rear vision system for a vehicle and method of using the same
US20190228239A1 (en) * 2016-08-23 2019-07-25 Suteng Innovation Technology Co., Ltd. Target detection method and system
US20190346557A1 (en) * 2016-11-18 2019-11-14 Denso Corporation Vehicle control device and vehicle control method
US20180157920A1 (en) * 2016-12-01 2018-06-07 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for recognizing obstacle of vehicle
US20190366929A1 (en) * 2017-01-16 2019-12-05 Aisin Seiki Kabushiki Kaisha Periphery monitoring device
US20180312022A1 (en) 2017-05-01 2018-11-01 Ford Global Technologies, Llc System to automate hitching a trailer
US20180361929A1 (en) * 2017-06-20 2018-12-20 Ford Global Technologies, Llc Vehicle rear object proximity system using multiple cameras
US10471591B1 (en) * 2018-06-01 2019-11-12 X Development Llc Object hand-over between robot and actor
US20200001919A1 (en) * 2018-06-27 2020-01-02 Ford Global Technologies, Llc Hitch assist system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220250681A1 (en) * 2021-02-05 2022-08-11 Ford Global Technologies, Llc Trailer backup assist systems and methods
US11511801B2 (en) * 2021-02-05 2022-11-29 Ford Global Technologies, Llc Trailer backup assist systems and methods

Also Published As

Publication number Publication date
US20200073398A1 (en) 2020-03-05
DE102019123125A1 (en) 2020-03-05
CN110871652A (en) 2020-03-10

Similar Documents

Publication Publication Date Title
US10962980B2 (en) System and methods for reverse braking during automated hitch alignment
US11385651B2 (en) System and methods for detection and response to interference between trailer coupler and hitch ball
US11427199B2 (en) System for aligning a vehicle hitch location identifier with a trailer coupler
US11433722B2 (en) Trailer and vehicle collision detection and response during autohitch maneuvering
US11505247B2 (en) System and method for trailer height adjustment
US10870323B2 (en) Compensation for trailer coupler geometry in automatic hitch operation
US11155298B2 (en) Modified steering angle at completion of hitch assist operation
US11433944B2 (en) Automated hitching system with variable backing paths
US11505124B2 (en) Alignment position adaptation for fifth wheel trailers in hitch assist operation
US11180148B2 (en) Detection and response to confined trailer in system-assisted hitch operation
US20200361466A1 (en) Brake control technique to stop a vehicle for assisting automatic trailer hitching
US11155133B2 (en) System and methods for vehicle alignment control
US11491833B2 (en) System and methods for vehicle alignment control
US11247520B2 (en) System and method for trailer alignment
US10960721B2 (en) System for detection and response to retreating trailer
US10933914B2 (en) Trailer hitching aid
US10946897B2 (en) System and methods for steering control in assisted vehicle operation
US11148488B2 (en) System and method for positioning a vehicle with reduced variation
US11479297B2 (en) Overshoot protection in hitch assist operation
US11192552B2 (en) Vehicle motion control for trailer alignment
US11077729B2 (en) System and method for trailer height adjustment
US11358639B2 (en) Trailer hitching assistance system with contact mitigation measures
US11214105B2 (en) Saturated steering detection and operation for guided vehicle operation

Legal Events

Date Code Title Description
AS Assignment

Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NIEWIADOMSKI, LUKE;LAVOIE, ERICK MICHAEL;REED, ERIC L.;AND OTHERS;SIGNING DATES FROM 20180827 TO 20180829;REEL/FRAME:046751/0570

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE