GB2580401A - A control system, system and method for providing assistance to an occupant of a vehicle - Google Patents


Info

Publication number
GB2580401A
GB2580401A (application GB1900333.4A)
Authority
GB
United Kingdom
Prior art keywords
vehicle
control system
trajectory
topography
image frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1900333.4A
Other versions
GB2580401B (en)
GB201900333D0 (en)
Inventor
Aitdis Llias
Current Assignee
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1900333.4A
Publication of GB201900333D0
Priority to DE112020000391.4T
Priority to US17/422,163
Priority to PCT/EP2020/050119
Publication of GB2580401A
Application granted
Publication of GB2580401B
Legal status: Active


Classifications

    • B60R 1/24: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles, for viewing an area in front of the vehicle with a predetermined field of view
    • B60R 2300/305: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of image processing: merging camera image with lines or icons
    • B60R 2300/804: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the intended use of the viewing arrangement: lane monitoring
    • G01C 21/3635: Details of the output of route guidance instructions: guidance using 3D or perspective road maps
    • G01C 21/3644: Details of the output of route guidance instructions: landmark guidance, e.g. using POIs or conspicuous other objects
    • G01C 21/3658: Details of the output of route guidance instructions: lane guidance
    • G01C 21/365: Details of the output of route guidance instructions: guidance using head-up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • G06T 19/003: Manipulating 3D models or images for computer graphics: navigation within 3D models or images

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A control system (10, see Fig. 1), a system (30, see Fig. 1) and a method (100, see Fig. 3) are provided for assisting an occupant of a vehicle (50, see Fig. 2). The control system (10) receives data indicative of the topography of a surface (36, see Figs. 6A-D) within an environment. A vehicle trajectory is determined in dependence on vehicle parameters for use in a composite image sequence comprising image frames 32, each comprising a captured image and a trajectory indicator 40, 42 indicative of the determined vehicle trajectory, for display to an occupant. The trajectory indicator 40, 42 is positioned within the image sequence in dependence on the topography data. The invention avoids the unnatural impression of trajectory lines that are not closely associated with the topography of the ground surface 36.

Description

A CONTROL SYSTEM, SYSTEM AND METHOD FOR PROVIDING ASSISTANCE TO AN
OCCUPANT OF A VEHICLE
TECHNICAL FIELD
The present disclosure relates to a control system, a system and a method for providing assistance to an occupant of a vehicle. Aspects of the invention relate to a control system, to a system, to a vehicle, to a method, to a non-transitory computer readable medium, and to computer software for providing assistance whilst performing manoeuvres within a vehicle.
BACKGROUND
It is known to provide driver assistance systems which provide a visual representation of an environment external to a vehicle. Some of these systems include an image sequence of the environment captured from one or more cameras mounted on or within the vehicle. In addition, some systems include some form of indication of a predicted path of the vehicle through the image sequence. This generally takes the form of one or more trajectory lines.
Conventionally, trajectory lines may illustrate the predicted path of one or more wheels of the vehicle within the environment.
A disadvantage of prior art systems is that the trajectory lines themselves appear fixed with respect to the vehicle in the direction of travel of the vehicle. This can appear unnatural and/or confusing to a user as the trajectory lines may appear to "float" over a surface as the vehicle moves within the environment, particularly where the terrain over which the vehicle is travelling is not flat with respect to the vehicle. This can make it difficult to assess any correspondence between the location of the trajectory lines within the image sequence and any objects within the environment external to the vehicle present in the image sequence.
It is an aim of the present invention to address one or more of the disadvantages associated with the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system, a system, a vehicle, a method, and a non-transitory computer readable medium as claimed in the appended claims.
According to an aspect of the invention there is provided a control system for providing assistance to an occupant of a vehicle, the control system comprising one or more controllers configured to: determine a composite image sequence comprising one or more image frames, the or each image frame comprising a captured image of an environment external to the vehicle and a trajectory indicator indicative of a determined vehicle trajectory through the environment; and output a signal indicative of the composite image sequence to a display for displaying the composite image sequence to the occupant of the vehicle; wherein the control system is configured to position the trajectory indicator at one or more locations within the or each image frame in dependence on a topography of a surface within the environment.
According to a further aspect of the invention there is provided a control system for providing assistance to an occupant of a vehicle, the control system comprising one or more controllers, configured to: receive one or more vehicle parameters; receive image data from an imaging device comprising captured images of an environment external to the vehicle; receive or determine topography data indicative of a topography of a surface within the environment; determine a vehicle trajectory in dependence on the or each parameter; determine a composite image sequence comprising one or more image frames, the or each image frame comprising one of said captured images and a trajectory indicator indicative of the determined vehicle trajectory; and output a signal indicative of the composite image sequence to a display for displaying the composite image sequence to the occupant of the vehicle; wherein the control system is configured to position the trajectory indicator at one or more locations within the or each image frame in dependence on the received topography data.
Advantageously, the trajectory indicator may be made to appear "anchored" to, or aligned with, the ground within the composite image sequence. This increases an occupant's situational awareness of the driving environment, providing the user with a more natural and less confusing representation of the vehicle's movement through the environment when compared with prior art systems. The control system of the present invention provides a visual representation of an environment of a vehicle which enables a user to better assess any correspondence between the location of a trajectory indicator, and hence a future position of a vehicle, with respect to one or more objects within the environment external to the vehicle, particularly in environments where the topography of the terrain over which the vehicle is travelling is not substantially flat, but includes inclines, declines and other variations.
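By way of illustration only, the receive-determine-compose pipeline described above might be sketched as follows. All names and interfaces here are hypothetical; the patent does not specify an implementation.

```python
from dataclasses import dataclass


@dataclass
class ImageFrame:
    image: object            # the captured camera image (placeholder)
    indicator_points: list   # (x, y, z) positions of the trajectory indicator


def compose_frame(image, trajectory, surface_height):
    """Place the trajectory indicator so that it follows the surface topography.

    `trajectory` is a list of (x, y) ground-plane points along the determined
    vehicle trajectory; `surface_height` maps a ground point to its elevation.
    Both are illustrative stand-ins for the topography data described above.
    """
    points = [(x, y, surface_height(x, y)) for x, y in trajectory]
    return ImageFrame(image=image, indicator_points=points)


# Flat ground: every indicator point sits at elevation 0, so the indicator
# appears "anchored" to the surface rather than floating above it.
frame = compose_frame(image=None,
                      trajectory=[(0, 1), (0, 2), (0, 3)],
                      surface_height=lambda x, y: 0.0)
```

On undulating terrain the same `surface_height` callback would return varying elevations, moving each indicator point onto the ground as seen in the image frame.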
In embodiments, the one or more controllers collectively comprise: at least one electronic processor having an electrical input for receiving the one or more vehicle parameters and/or the image data; and at least one electronic memory device operatively coupled to the at least one electronic processor and having instructions stored therein; wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions stored therein so as to determine the composite image sequence.
The control system may be configured to position the trajectory indicator within the or each image frame such that an apparent orientation of the trajectory indicator at the or each image frame location is substantially equivalent to the orientation of the surface at the image frame location. Optionally, the control system is configured to position the trajectory indicator within the or each image frame such that the trajectory indicator is indicative of the determined trajectory of the vehicle over the surface.
In some embodiments the topography data comprises a surface profile of the surface. In such embodiments the control system may be configured to: correlate the surface profile with the surface within the or each image frame; and position the trajectory indicator in accordance with an orientation of the surface profile at the one or more image frame locations. Alternatively, the topography data comprises a point cloud comprising a plurality of data points indicative of the topography of the surface within the environment. In such embodiments the control system may be configured to: correlate the point cloud with the surface within the or each image frame; and position the trajectory indicator in accordance with the point cloud at the one or more image frame locations. Correlating the surface profile or point cloud with the surface in the or each image frame may comprise overlaying the surface profile or point cloud onto a representation of the surface in the or each image frame in order to align the topography data with the representation of the surface.
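One simple reading of the surface-profile variant above is a sampled elevation profile along the path, interpolated at each image frame location. The following sketch assumes a profile of (distance, elevation) samples; this representation is an assumption, not something the patent prescribes.

```python
import bisect


def profile_height(profile, d):
    """Linearly interpolate the surface elevation at distance `d` along the path.

    `profile` is a sorted list of (distance, elevation) samples, a simple
    stand-in for the surface profile correlated with the image frame.
    """
    dists = [p[0] for p in profile]
    i = bisect.bisect_left(dists, d)
    if i == 0:
        return profile[0][1]          # before the first sample
    if i == len(profile):
        return profile[-1][1]         # beyond the last sample
    (d0, h0), (d1, h1) = profile[i - 1], profile[i]
    return h0 + (h1 - h0) * (d - d0) / (d1 - d0)


# A surface that rises 1 m over 10 m: halfway along, the elevation is 0.5 m,
# so an indicator section placed there is lifted by 0.5 m in the frame.
profile = [(0.0, 0.0), (10.0, 1.0)]
```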
In embodiments, the topography data comprises sensor data from one or more sensors associated with the vehicle. In such embodiments the control system may be configured to: determine a topography of the surface within the environment in dependence on the sensor data; and position the trajectory indicator within the or each image frame in dependence on the determined topography. Advantageously, the control system of the present invention may be utilised to determine a topography of a surface from raw sensor data.
In embodiments, the control system may be configured to: determine the topography data from the sensor data. Advantageously, the control system of the present invention may be utilised to determine the topography data from raw sensor data. For example, in some embodiments the sensor data may comprise image data and the control system may be configured to extract the topography data from the image data.
The control system may be configured to: determine a topography of the surface within the environment by determining a surface profile of the surface in dependence on the sensor data; correlate the surface profile with the surface within the or each image frame; and position the trajectory indicator in accordance with an orientation of the surface profile at the one or more image frame locations. Alternatively, the control system may be configured to: determine a topography of the surface within the environment by determining a point cloud comprising a plurality of data points indicative of the topography of the surface in dependence on the sensor data; correlate the point cloud with the surface within the or each image frame; and position the trajectory indicator in accordance with the point cloud at the one or more image frame locations. As described herein, correlating the surface profile or point cloud with the surface in the or each image frame may comprise overlaying the surface profile or point cloud onto a representation of the surface in the or each image frame in order to align the topography data with the representation of the surface.
The one or more sensors may comprise one or more cameras. The camera(s) may comprise a mono camera, stereo camera, optical, and/or infrared camera, for example. Additionally or alternatively, the one or more sensors may comprise a RADAR sensor, a LIDAR sensor, and/or an ultrasonic sensor. In embodiments the one or more sensors may comprise a position sensor, which may be a GNSS sensor for determining an absolute position of an associated object, e.g. the vehicle. The one or more sensors may comprise one or more attitude sensors for determining an orientation of the vehicle with respect to the environment.
The attitude sensor(s) may comprise an inertial measurement unit (IMU), accelerometer, inclinometer, and/or a gyroscope, for example.
The one or more vehicle parameters may comprise a steering angle of the vehicle. The steering angle may comprise an angle of a steering wheel of the vehicle, or an angle of one or more steerable wheels of the vehicle. The one or more vehicle parameters may comprise a velocity of the vehicle; and/or an orientation of the vehicle about one or more axes.
In embodiments, the control system may be configured to: receive or determine vehicle orientation data indicative of an orientation of the vehicle about one or more axes; determine a relative orientation of the vehicle with respect to the surface in dependence on the vehicle orientation data; and position the trajectory indicator at the one or more image frame locations in dependence on the relative orientation of the vehicle. In this way, the orientation of the vehicle within the environment may be utilised to determine a required position for a trajectory indicator within the composite image sequence.
In some embodiments the trajectory indicator may comprise a plurality of indicator sections.
In such embodiments, the control system may be configured to position each trajectory indicator section at a respective image frame location in dependence on the topography of the surface at the respective image frame location. In some embodiments each indicator section is moveable about a respective pivot point associated with an end point of a preceding indicator section. In such embodiments, the control system may be configured to move one or more of the plurality of indicator sections about its respective pivot point in dependence on the topography of the surface at a corresponding image frame location. In this way, a trajectory indicator may be constructed from a plurality of indicator sections which avoids any discontinuities along the length of the trajectory indicator which may otherwise be encountered if indicator sections were moved without taking into account the position of adjacent sections.
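The pivoting-section construction above can be sketched as a chained polyline: each section starts at the end point of the preceding one and pivots to match the local surface inclination, so no discontinuities can arise. Function and parameter names are illustrative only.

```python
import math


def chain_sections(section_length, slopes):
    """Build a trajectory indicator from sections that pivot end-to-end.

    Each section pivots about the end point of the preceding section, so the
    polyline stays continuous. `slopes` gives an assumed surface inclination
    (in radians) at each section along the trajectory.
    """
    points = [(0.0, 0.0)]
    for angle in slopes:
        x, y = points[-1]
        points.append((x + section_length * math.cos(angle),
                       y + section_length * math.sin(angle)))
    return points


# Two flat sections followed by one on a 45-degree incline: each section
# begins exactly where the previous one ended, avoiding discontinuities.
pts = chain_sections(1.0, [0.0, 0.0, math.pi / 4])
```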
According to a further aspect of the invention there is provided a system for providing assistance to an occupant of a vehicle comprising a control system of any preceding aspect of the invention, and one or more selected from: an imaging device configured to capture one or more images of an environment external to the vehicle; and a display configured to display the composite image sequence to the occupant of the vehicle.
According to another aspect of the invention there is provided a vehicle comprising a control system or a system of any aspect of the invention described herein.
According to yet a further aspect of the invention there is provided a method for providing assistance to an occupant of a vehicle, the method comprising: receiving one or more vehicle parameters; receiving image data from an imaging device comprising captured images of an environment external to the vehicle; receiving or determining topography data indicative of a topography of a surface within the environment; determining a vehicle trajectory in dependence on the or each parameter; determining a composite image sequence comprising one or more image frames, the or each image frame comprising one of said captured images and a trajectory indicator indicative of the determined vehicle trajectory; and outputting a signal indicative of the composite image sequence to a display for displaying the composite image sequence to the occupant of the vehicle; wherein the method comprises positioning the trajectory indicator at one or more locations within the or each image frame in dependence on the topography data.
In some embodiments the method comprises positioning the trajectory indicator within the or each image frame such that an apparent orientation of the trajectory indicator at the or each image frame location is substantially equivalent to the orientation of the surface at the image frame location. Optionally, the method comprises positioning the trajectory indicator within the or each image frame such that the trajectory indicator is indicative of the determined trajectory of the vehicle over the surface.
In embodiments, the topography data comprises sensor data from one or more sensors, and the method comprises: determining a topography of the surface in dependence on the sensor data; and positioning the trajectory indicator within the or each image frame in dependence on the determined topography.
In embodiments, the method may comprise determining the topography data from the sensor data. For example, in some embodiments the sensor data may comprise image data and the method may comprise extracting the topography data from the image data.
In some embodiments the topography data comprises a surface profile of the surface. In such embodiments the method comprises correlating the surface profile with the surface within the or each image frame; and positioning the trajectory indicator in accordance with an orientation of the surface profile at the one or more image frame locations. Alternatively, the topography data comprises a point cloud comprising a plurality of data points indicative of the topography of the surface within the environment. In such embodiments the method may comprise correlating the point cloud with the surface within the or each image frame; and positioning the trajectory indicator in accordance with the point cloud at the one or more image frame locations. Correlating the surface profile or point cloud with the surface in the or each image frame may comprise overlaying the surface profile or point cloud onto a representation of the surface in the or each image frame in order to align the topography data with the representation of the surface.
In embodiments, the topography data comprises sensor data from one or more sensors associated with the vehicle. In such embodiments the method may comprise determining a topography of the surface within the environment in dependence on the sensor data; and positioning the trajectory indicator within the or each image frame in dependence on the determined topography.
In some embodiments the method comprises determining a topography of the surface within the environment by determining a surface profile of the surface in dependence on the sensor data; correlating the surface profile with the surface within the or each image frame; and positioning the trajectory indicator in accordance with an orientation of the surface profile at the one or more image frame locations. Alternatively, the method may comprise determining a topography of the surface within the environment by determining a point cloud comprising a plurality of data points indicative of the topography of the surface in dependence on the sensor data; correlating the point cloud with the surface within the or each image frame; and positioning the trajectory indicator in accordance with the point cloud at the one or more image frame locations. As described herein, correlating the surface profile or point cloud with the surface in the or each image frame may comprise overlaying the surface profile or point cloud onto a representation of the surface in the or each image frame in order to align the topography data with the representation of the surface.
The method may comprise receiving or determining vehicle orientation data indicative of an orientation of the vehicle about one or more axes; determining a relative orientation of the vehicle with respect to the surface in dependence on the vehicle orientation data; and positioning the trajectory indicator at the one or more image frame locations in dependence on the relative orientation of the vehicle.
In some embodiments the trajectory indicator may comprise a plurality of indicator sections.
In such embodiments, the method may comprise positioning each trajectory indicator section at a respective image frame location in dependence on the topography of the surface at the respective image frame location. In some embodiments each indicator section is moveable about a respective pivot point associated with an end point of a preceding indicator section. In such embodiments, the method may comprise moving one or more of the plurality of indicator sections about its respective pivot point in dependence on the topography of the surface at a corresponding image frame location.
According to another aspect of the invention there is provided a non-transitory computer readable medium having instructions stored therein which, when executed by a computing means, perform a method according to the preceding aspect of the invention.
According to a further aspect of the invention there is provided computer software which, when executed by one or more processors, causes performance of a method in accordance with any preceding aspect of the invention.
Any controller or controllers described herein may suitably comprise a control unit or computational device having one or more electronic processors. Thus the system may comprise a single control unit or electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term "controller" or "control unit" will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memories associated with said controller to be executed on said computational device. A first controller may be implemented in software run on one or more processors. One or more other controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a schematic representation of an embodiment of a control system in accordance with the present invention;
Figure 2 shows a schematic representation of an embodiment of a vehicle in accordance with the present invention;
Figure 3 is a flowchart illustrating an embodiment of a method in accordance with the present invention;
Figure 4 illustrates an example method for use in embodiments of the present invention;
Figure 5 illustrates an example method for use in embodiments of the present invention;
Figures 6A-D illustrate an operational use of embodiments of the present invention;
Figure 7 illustrates an example method for use in embodiments of the present invention; and
Figures 8-11 illustrate the operational use of embodiments of the invention, showing a series of graphical representations of composite image sequences formed in embodiments of the present invention.
DETAILED DESCRIPTION
A control system 10, system 30, vehicle 50 and method 100 in accordance with embodiments of the present invention are described herein with reference to the accompanying Figures.
With reference to Figure 1, a system 30 in accordance with the invention comprises a control system 10. The control system 10 is operatively coupled to a vehicle system 16, an imaging device in the form of a camera 20, one or more sensors 28 and a display 24 as shown in Figure 1. The control system 10 includes a processor 12, memory device 26, electrical inputs 14, 18, 27, and an electrical output 22.
The camera 20 is configured to capture images of an environment external to the vehicle. As will be described herein, the control system 10 is configured to receive image data representative of the images captured by the camera 20, and use this data to determine a composite image sequence.
The vehicle system 16 may be any system capable of outputting a signal indicative of one or more vehicle parameters, specifically relating to the motion of the vehicle. The vehicle system 16 may comprise a steering system of the vehicle which may be capable of outputting a signal indicative of a steering angle of the vehicle. The steering angle may be an angular position of a steering wheel of the vehicle. Additionally or alternatively, the steering angle may comprise an angular position of one or more steerable wheels of the vehicle. The steering angle may relate to a predicted radius of curvature of a vehicle path due to the angular position of one or more steerable wheels of the vehicle. The vehicle system 16 may comprise a braking system of the vehicle, such as an anti-lock braking system (ABS) which may be configured to output a signal indicative of a wheel speed of the vehicle, and hence a speed of the vehicle. The vehicle system 16 may comprise a power unit management system which may be configured to output a signal indicative of an engine and/or motor speed of the vehicle, for example. In use, the control system 10 may be configured to determine the vehicle trajectory in dependence on any one or more such types of vehicle parameter.
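The mapping from steering angle to a predicted radius of curvature mentioned above can be illustrated with the kinematic bicycle model, a common approximation that the patent does not itself prescribe; all names below are hypothetical.

```python
import math


def turn_radius(wheelbase_m, steer_angle_rad):
    """Predicted radius of curvature from the road-wheel steering angle,
    using the kinematic bicycle model (an illustrative assumption)."""
    if steer_angle_rad == 0:
        return math.inf  # straight ahead
    return wheelbase_m / math.tan(steer_angle_rad)


def predict_path(wheelbase_m, steer_angle_rad, step_m, n_steps):
    """Sample (x, y) points along the resulting circular arc, starting at
    the vehicle origin and heading along +y."""
    r = turn_radius(wheelbase_m, steer_angle_rad)
    if math.isinf(r):
        return [(0.0, i * step_m) for i in range(n_steps)]
    return [(r - r * math.cos(i * step_m / r), r * math.sin(i * step_m / r))
            for i in range(n_steps)]


# A 2.9 m wheelbase and a 0.1 rad road-wheel angle give a turn radius of
# roughly 29 m; the sampled arc could seed the trajectory indicator.
```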
In embodiments, the vehicle system 16 is an imaging system, and may comprise the camera 20, for example. In such embodiments, the vehicle system 16 comprises an image processing unit configured to analyse movement of one or more objects within image data captured by the imaging system. Such analysis may be used to infer a speed of the vehicle relative to those objects, and hence a speed of the vehicle within the environment.
The one or more sensors 28 may comprise sensors capable of detecting obstacles, surfaces and other such objects within the environment of the vehicle, and/or information relating to the position / orientation of the vehicle within that environment. For example, the one or more sensors may comprise an imaging device such as a camera, RADAR, LIDAR, ultrasonic sensors, etc. The one or more sensors may comprise the camera 20 configured to capture images of the environment external to the vehicle. The data received from the one or more sensors 28, hereinafter referred to as topography data, may be used to map the environment external to the vehicle. For example, in embodiments the control system 10 is configured to utilise the topography data to determine a topography of a surface over which the vehicle is currently travelling or will travel were it to continue along the determined trajectory. The topography of the surface is used to determine the composite image sequence as described herein. The sensor(s) 28 may additionally include an inertial measurement unit (IMU) for determining an orientation of the vehicle along one or more axes, and/or sensors relating to a GNSS module (e.g. a GPS module) within the vehicle suitable for determining a position of the vehicle within a mapped environment.
In the illustrated embodiments described herein, the data received from the one or more sensors 28 comprises topography data, and the control system 10 is configured to receive the data from sensors 28 as "topography" data. However, it will be appreciated that in other embodiments the control system 10 may be configured to receive raw sensor data from the sensors 28. For example, in such embodiments the control system 10 may be configured to determine topography data by extracting the topography data from raw sensor data received from the sensors 28.
In use, the control system 10 receives, at electrical input 14, one or more vehicle parameters from the vehicle system 16. The one or more vehicle parameters may comprise a speed of the vehicle, velocity of the vehicle, steering angle (e.g. an angle of a steering wheel or an angle of one or more steerable wheels of the vehicle), and/or orientation of the vehicle (e.g. roll, pitch and/or yaw angle), for example. The control system 10 is further configured to receive image data, at electrical input 18, from the camera 20. The image data comprises captured images of an environment external to the vehicle. The processor 12 determines a vehicle trajectory using the received vehicle parameters, and uses the determined vehicle trajectory along with the received image data to determine a composite image sequence.
Specifically, the processor 12 is configured to form a composite image sequence comprising a sequence of one or more image frames, each comprising a captured image and a trajectory indicator indicative of the determined vehicle trajectory. A control signal indicative of the composite image sequence is output via electrical output 22 to the display 24 for displaying the composite image sequence to an occupant of the vehicle.
As will be described in further detail herein, in generating the composite image sequence the control system 10 is configured to position the trajectory indicator within the composite image sequence in dependence on the received topography data. Accordingly, the control system 10 may be configured to position the trajectory indicator within the composite image sequence such that an apparent orientation of the trajectory indicator at one or more locations within the or each image frame is substantially equivalent to the orientation of the surface at the respective image frame location. Specifically, data from the sensor(s) 28 is used to determine a topography of a traversable surface within the environment of the vehicle. This is achieved by mapping an environment using the one or more sensor(s) 28, e.g. using a point cloud map, contour map, sparse map, etc. In embodiments, the mapping of the environment is used to determine a three-dimensional representation of the environment, and specifically of a traversable surface within the environment of the vehicle. Overlaying or otherwise correlating the mapped topography with the images obtained by camera 20, and positioning the trajectory indicator in dependence on the topography, ensures that, to a user, the trajectory indicator(s) appear to lie on the surface (and are fixed with respect thereto).
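The correlation step described above can be sketched as follows, assuming the mapped topography is available as a height function over ground-plane coordinates; the function and parameter names are illustrative assumptions, not part of the disclosed system:

```python
def drape_trajectory(trajectory_xy, surface_height):
    """Fix a 2-D trajectory to the mapped surface by sampling the
    topography at each point, so that the trajectory indicator appears
    to lie on the surface rather than float above or sink into it.

    trajectory_xy  -- iterable of (x, y) ground-plane points
    surface_height -- callable (x, y) -> z derived from the mapped
                      topography (e.g. a point cloud or contour map)
    """
    return [(x, y, surface_height(x, y)) for x, y in trajectory_xy]
```

The resulting 3-D points can then be projected into the camera image so the indicator tracks the surface as the vehicle moves.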
As will be appreciated, any process step (or method step) described herein may be performed by running computer software, e.g. computer software 29, on one or more processors 12 as shown in Figure 1. Any such software 29 may be stored in a location accessible by the processor 12, e.g. at memory 26.
Figure 2 illustrates a vehicle 50 in accordance with an embodiment of the invention. As shown, the vehicle 50 comprises a system 30 which includes control system 10, imaging devices in the form of cameras 20a, 20b, the vehicle system 16, sensor 28 and the display 24.
An embodiment of a method 100 in accordance with the invention will now be described with reference to Figure 3.
The method comprises receiving 102 one or more vehicle parameters. As described herein, the one or more parameters may be received from a vehicle system and relate to the motion of the vehicle. At 104, the method comprises receiving image data from an imaging device. Typically, this comprises receiving image data from a camera mounted on or within the vehicle configured to capture images of an environment external to the vehicle. It will, however, be appreciated that the method may comprise receiving image data from a plurality of imaging devices. At 106, the method comprises receiving topography data indicative of a topography of a surface within the environment. The topography data may be received directly, or indirectly (i.e. via one or more additional vehicle systems) from the or each sensor 28. At 108, the method comprises determining a vehicle trajectory (as discussed in more detail below). The vehicle trajectory is determined in dependence on the received one or more vehicle parameters. At 110, the method comprises determining a composite image sequence. The composite image sequence comprises a sequence of one or more image frames, the or each image frame comprising a captured image (received from the imaging device) and a trajectory indicator indicative of the determined vehicle trajectory. At 112, a control signal indicative of the composite image sequence is output to a display for displaying the composite image sequence to an occupant of the vehicle.
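Steps 108 and 110 of method 100 can be sketched as a single processing iteration; the callables stand in for the sub-processes described herein, and all names are illustrative assumptions:

```python
def method_100_step(vehicle_params, image, topography,
                    determine_trajectory, position_indicator):
    """One iteration of method 100: determine the vehicle trajectory
    from the received vehicle parameters (step 108), then build a
    composite image frame whose trajectory indicator is positioned in
    dependence on the topography data (step 110)."""
    trajectory = determine_trajectory(vehicle_params)        # step 108
    indicator = position_indicator(trajectory, topography)   # step 110
    return {"image": image, "trajectory_indicator": indicator}
```

The returned frame would then be appended to the composite image sequence and output to the display (step 112).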
In determining the composite image sequence, the method 100 comprises positioning the trajectory indicator at one or more locations within the or each image frame of the composite image sequence, in dependence on received topography data. In this way, the trajectory indicator may be made to appear to have substantially the same orientation as the surface within the environment at a particular image frame location. Accordingly, the trajectory indicator may be made to appear within the composite image sequence to be positioned on top of the surface, having a direct correspondence therewith. Determination of the composite image sequence is discussed in detail below.
In the illustrated embodiments described herein, the data received from the one or more sensors comprises topography data, and the method comprises receiving the data from sensors as "topography" data. However, it will be appreciated that in other embodiments the method may comprise receiving raw sensor data from the sensors. In such embodiments the method may include determining the topography data by extracting the topography data from raw sensor data received from the sensors.
Figure 4 illustrates a method for determining a vehicle trajectory. Specifically, Figure 4 shows how a Bicycle Model is used to model the trajectory of a vehicle 50.
In the model shown in Figure 4, the vehicle 50 comprises four steerable wheels 52, 54, 56, 58, two front wheels 52, 54 associated with a front axle 53 of the vehicle 50, and two rear wheels 56, 58 associated with a rear axle 57 of the vehicle 50. In this example, the front wheels 52, 54 are configured to be steered at the same angle, φfront, with respect to the longitudinal axis of the vehicle, y. Accordingly, front wheels 52, 54 can be modelled as a single wheel positioned at the centre of the front axle 53 at an angle φfront with respect to the longitudinal axis of the vehicle, y. Similarly, the rear wheels 56, 58 are configured to be steered at the same angle, φrear, with respect to the longitudinal axis of the vehicle, y. Accordingly, rear wheels 56, 58 can be modelled as a single wheel positioned at the centre of the rear axle 57 at an angle φrear with respect to the longitudinal axis of the vehicle, y.
It will be appreciated that the vehicle 50 may comprise other steering configurations, each requiring to be modelled in a similar but different way. For example, where a vehicle comprises only two steerable wheels, e.g. front wheels 52, 54, with the rear wheels 56, 58 being rotationally fixed with respect to the longitudinal axis of the vehicle y, the front wheels 52, 54 can be modelled as described above. The rear wheels 56, 58 can be modelled as a single wheel positioned at the centre of the rear axle 57 at an angle of 0° with respect to the longitudinal axis of the vehicle, y. Where each of the wheels 52, 54, 56, 58 is steerable by a different angle with respect to the others, each must be modelled individually.
The model shown in Figure 4 is used to determine a centre of rotation C for the vehicle 50 in dependence on the steering angle φfront of the front wheels 52, 54 and the steering angle φrear of the rear wheels 56, 58. Defining the frame of reference with respect to the longitudinal axis y and the lateral axis x of the vehicle 50, with the origin at the centre of the rear axle 57, the centre of rotation C of the vehicle 50 is determined as the point of intersection of the straight lines yfront, yrear passing through the centres of the front and rear axles 53, 57, normal to the steering directions of the relevant axles 53, 57. The equations of these lines are given by the following equations:

yfront = tan(φfront) · x + W [Equation 1]

yrear = tan(φrear) · x [Equation 2]

where W is the wheelbase of the vehicle 50. Solving Equations 1 and 2 for x and y gives the centre of rotation C, where x = R (the radius) and y gives the offset of C from the lateral axis.
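The intersection of Equations 1 and 2 can be sketched as follows; this is a minimal illustration under the frame of reference defined above, and the function name, units and degenerate-case handling are assumptions:

```python
import math

def centre_of_rotation(phi_front_deg, phi_rear_deg, wheelbase):
    """Solve Equations 1 and 2 for the centre of rotation C.

    The origin is at the centre of the rear axle, with:
      y_front = tan(phi_front) * x + W   (line through the front axle)
      y_rear  = tan(phi_rear)  * x       (line through the rear axle)
    Returns (x, y) of C, or None when the lines are parallel
    (straight-line motion, C at infinity).
    """
    t_front = math.tan(math.radians(phi_front_deg))
    t_rear = math.tan(math.radians(phi_rear_deg))
    if math.isclose(t_front, t_rear):
        return None
    # Equate the two line equations: t_front*x + W = t_rear*x
    x = wheelbase / (t_rear - t_front)
    y = t_rear * x
    return x, y
```

For example, with an unsteered rear axle (φrear = 0°) the centre of rotation lies on the line through the rear axle (y = 0), consistent with the model.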
Once the centre of rotation C has been found, the trajectory of the vehicle 50 is defined as a circle about the centre of rotation C at a radius R. This is shown in Figure 5 by a trajectory indicator in the form of a pair of parallel trajectory lines 40, 42. A first trajectory line 40 represents a predicted path to be traversed by a first point on the vehicle 50 and a second trajectory line 42 represents a predicted path to be traversed by a second point on the vehicle 50. Typically, and as shown in Figure 5, the first point on the vehicle 50 will be the point on the vehicle which is furthest in distance from the identified centre of rotation C. This will typically be the far-side front corner of the vehicle 50, where "far-side" refers to the side of the vehicle 50 furthest from the centre of rotation C. The second point on the vehicle 50 may correspond to the nearside rear wheel of the vehicle 50, where "nearside" refers to the side of the vehicle 50 closest to the centre of rotation C. In embodiments, the trajectory lines 40, 42 represent a predicted path traversed by other vehicle components. For example, the first trajectory line 40 represents a predicted path to be traversed by a first front wheel of the vehicle 50, and the second trajectory line 42 represents a predicted path to be traversed by a second front wheel of the vehicle 50.
The linear speed of the vehicle 50 may be used to determine an angular velocity, ωvehicle, of the vehicle 50 when moving along the determined trajectory, using the following:

ωvehicle = vlinear / Rvehicle [Equation 3]

where vlinear is the linear speed of the vehicle (which may be determined from the one or more vehicle parameters as described herein), and Rvehicle is the perpendicular distance between the longitudinal axis of the vehicle 50 and the centre of rotation C. This equation may be used to determine the angular velocity of any point on the vehicle, for instance the first point and second point for defining first and second trajectory lines 40, 42.
For calculating the angular velocity, ωout, of the point on the vehicle 50 defining the first trajectory line 40, the following equation may be used:

ωout = vlinear / Rout [Equation 4]

where Rout is the radius of curvature of the first trajectory line 40 about the centre of rotation C. For calculating the angular velocity, ωin, of the point on the vehicle 50 defining the second trajectory line 42, the following equation may be used:

ωin = vlinear / Rin [Equation 5]

where Rin is the radius of curvature of the second trajectory line 42 about the centre of rotation C. When the vehicle is travelling in a straight line (i.e. forwards or backwards along the longitudinal axis of the vehicle), Rout = Rin (the centre of rotation C is at infinity), meaning that ωout = ωin and the distance travelled by a first point on the vehicle 50 (e.g. the far-side front corner) along the first trajectory line 40 is the same as the distance travelled by a second point on the vehicle 50 (e.g. the nearside rear wheel) along the second trajectory line 42. However, when any steering angle is applied, ωout is less than ωin. Similarly, the distance travelled by a first point on the vehicle 50 (e.g. the far-side front corner) along the first trajectory line 40 is different to the distance travelled by a second point on the vehicle 50 (e.g. the nearside rear wheel) along the second trajectory line 42.
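Equations 4 and 5 can be sketched directly; the function name is an assumption for illustration:

```python
def angular_velocities(v_linear, r_out, r_in):
    """Angular rates of the points tracing the outer and inner
    trajectory lines about the centre of rotation C.

    v_linear -- linear speed of the vehicle
    r_out    -- radius of curvature of the first (outer) trajectory line
    r_in     -- radius of curvature of the second (inner) trajectory line
    """
    omega_out = v_linear / r_out   # Equation 4
    omega_in = v_linear / r_in     # Equation 5
    return omega_out, omega_in
```

Since Rout exceeds Rin whenever a steering angle is applied, ωout is less than ωin, as noted above.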
Figures 6A to 6D illustrate the operational use of embodiments of the present invention.
Specifically, each of Figures 6A-6D illustrates a vehicle 50 traversing a surface 36, a position of a camera 20 and a virtual display 60 with respect to the vehicle 50, and projection outlines 62, 64. The projection outlines 62, 64 link the position of a trajectory line 40 to the position of the camera 20. Specifically, projection outline 62 links the position of a first end of the trajectory line 40 to the camera 20, and projection outline 64 links the position of a second end of the trajectory line 40 to the camera 20. The position at which the projection outlines 62, 64 intersect the virtual display 60 illustrates how the position of a representation of the trajectory line 40 within an image/image sequence, e.g. the composite image sequence, can convey varying orientations of the trajectory line 40 to a user viewing the display, as is described herein.
Figure 6A illustrates the vehicle 50 traversing a surface 36 having a surface profile 36a, which is substantially level with respect to the orientation of the vehicle 50. Trajectory line 40 is shown having a first orientation along surface profile 36a of surface 36. Figure 6A also illustrates the surface 36 having two further surface profiles: 36b (showing an incline in gradient ahead of the vehicle 50) and 36c (showing a decline in gradient ahead of the vehicle 50). This Figure illustrates how a trajectory line 40 may appear to be positioned either in the surface (with respect to surface profile 36b) or "floating" above the surface (with respect to surface profile 36c) if, as in prior art systems, a topography of the surface 36 is not taken into account when positioning the trajectory line 40.
To account for this, in Figure 6B, the trajectory line 40 is shown having a second orientation along surface profile 36b of surface 36. Likewise, in Figure 6C, the trajectory line 40 is shown having a third orientation along surface profile 36c of surface 36. As described herein, in order to correctly position the trajectory line 40 such that, to the user, the trajectory line 40 appears to sit on top of the surface 36 rather than "floating" or positioned in the surface 36, the trajectory line 40 is oriented in dependence on topography data indicative of the topography of the surface 36.
As shown, the projection outlines 62, 64 intersect the virtual display 60 at respective first positions for surface profile 36a, respective second positions for surface profile 36b and respective third positions for surface profile 36c. Specifically, by adjusting the points of intersection of the projection outlines 62, 64 in dependence on a topography of the surface 36 it is possible to convey an apparent orientation of the trajectory line 40 within the composite image sequence.
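The side-view geometry of Figures 6A-6D can be sketched with similar triangles: the height at which the ray from a surface point to the camera crosses the virtual display plane gives the on-screen position of that end of the trajectory line. The names and the reduction to two dimensions are illustrative assumptions:

```python
def display_intersection(camera_height, display_distance,
                         point_forward, point_height):
    """Height at which the projection outline from a surface point to
    the camera crosses the virtual display 60, in the 2-D side view
    of Figures 6A-6D.

    camera_height    -- height of camera 20 above the reference ground plane
    display_distance -- forward distance from the camera to the virtual display
    point_forward    -- forward distance from the camera to the surface point
    point_height     -- height of the surface point, from the topography data
    """
    # Interpolate along the ray from the camera (0, camera_height)
    # to the surface point (point_forward, point_height).
    t = display_distance / point_forward
    return camera_height + t * (point_height - camera_height)
```

A raised surface profile (such as 36b) raises the intersection point on the virtual display, which is how the composite image sequence conveys an inclined trajectory line.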
Figures 6A to 6C illustrate various surface profiles 36a, 36b, 36c having a substantially constant gradient. However, it will be appreciated that the invention is not limited in this sense. Equally, and as shown in Figure 6D, the surface 36 can have a profile 36d which varies in gradient along its length. To account for this, the trajectory line 40 may be separated into indicator sections 39 (a, b, c, d, ...) each of which may be positioned and/or oriented independently in order to map the trajectory line 40 to the surface profile 36d. For example, each indicator section 39 (a, b, c, d, ...) may be moveable about respective pivot points along the trajectory of the vehicle 50.
As shown in Figure 7, in order to convey uniformity along the length of a trajectory line 40 comprising a plurality of indicator sections 39 (a, b, c, d, ...), each indicator section 39 (a, b, c, d, ...) may be moveable about a respective pivot point 44 (a, b, c, d, ...) associated with an end point of a preceding indicator section 39 (a, b, c, d, ...) in the trajectory line 40. Specifically, indicator section 39b is pivotable about pivot point 44a associated with an end point of preceding indicator section 39a, indicator section 39c is pivotable about pivot point 44b associated with an end point of preceding indicator section 39b, and so on for each indicator section 39 (a, b, c, d, ...) making up trajectory line 40. It will be appreciated that the indicator sections 39 (a, b, c, d, ...) may be pivotable about a single axis, or multiple axes, e.g. three axes, as required. Constructing a trajectory indicator in this manner ensures that any discontinuities in the trajectory lines 40, 42 can be avoided thereby conveying a more natural representation of the vehicle's trajectory over the surface 36.
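The chained indicator sections of Figure 7 can be sketched as a 2-D polyline in which each section pivots about the end point of its predecessor, with its pitch taken from the local surface gradient; the function name and the single-axis simplification are assumptions:

```python
import math

def chain_sections(start, section_length, pitch_angles_deg):
    """Build indicator sections 39a, 39b, ... of a trajectory line,
    each pivoting about the pivot point (44a, 44b, ...) formed by the
    end point of the preceding section, so the line has no
    discontinuities.

    start            -- (x, z) of the first pivot point
    section_length   -- length of each indicator section
    pitch_angles_deg -- pitch of each section about its pivot point,
                        taken from the surface topography
    """
    points = [start]
    x, z = start
    for pitch in pitch_angles_deg:
        x += section_length * math.cos(math.radians(pitch))
        z += section_length * math.sin(math.radians(pitch))
        points.append((x, z))  # end point = pivot for the next section
    return points
```

Because each section starts exactly where the previous one ends, the trajectory line remains continuous however the surface profile (e.g. 36d) varies along its length.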
Although shown in Figures 6A -6D as a 1-dimensional virtual display 60, the same technique can be applied equally for a virtual display in 2 (or more) dimensions, as will be appreciated. Figures 8 to 11 illustrate such embodiments.
Figures 8 to 11 illustrate representations of a composite image sequence comprising an image frame 32. The image frame 32 comprises a representation of an image obtained from a camera mounted on, within, or integral to, a vehicle 50 and a trajectory indicator in the form of trajectory lines 40, 42. Each of the trajectory lines comprises a series of trajectory indicator sections. As illustrated, the representation of the image from the camera comprises a surface 36 over which the vehicle 50 is predicted to travel. In the illustrated embodiments, the representation of the image additionally includes a surface profile in the form of a surface map illustrative of the topography of the surface. As discussed herein, the topography of the surface may be determined via image analysis of the image data obtained by the camera, or through other sensing means.
In Figure 8, the topography of the surface 36 is such that there is an incline in gradient in the path of the vehicle 50. In Figure 9, the topography of the surface 36 is such that there is an incline in gradient in the path of the right hand side of the vehicle 50. Accordingly, trajectory line 42 is shown to follow the incline in gradient of the surface 36. In Figure 10, the topography of the surface 36 is such that there is a decline in gradient in the path of the right hand side of the vehicle 50. Accordingly, trajectory line 42 is shown to follow the decline in gradient of the surface 36. Each of Figures 8-10 show a trajectory of the vehicle 50 generally straight ahead of the vehicle 50. However, as will be appreciated and as shown in Figure 11, the trajectory of the vehicle 50 may not be straight ahead of the vehicle 50, but may follow a path defined by a steering angle of the vehicle, as described herein, as well as in accordance with inclines and/or declines in the topography of the surface 36.
In the embodiments described herein, the control system 10 is configured to utilise raw sensor data from the sensor(s) 28 in order to determine a topography of the surface. However, it will be appreciated that the topography data may comprise a surface profile, point cloud or other formatted set of data points compiled by a controller, processor or further control system external to the control system 10. In such embodiments, the control system 10 of the present invention is configured to use the formatted topography data to determine the topography of the surface 36.
It will be appreciated that embodiments of the present invention can be realised in the form of hardware, software or a combination of hardware and software. Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a ROM, whether erasable or rewritable or not, or in the form of memory such as, for example, RAM, memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a CD, DVD, magnetic disk or magnetic tape. It will be appreciated that the storage devices and storage media are embodiments of machine-readable storage that are suitable for storing a program or programs that, when executed, implement embodiments of the present invention. Accordingly, embodiments provide a computer program comprising code for implementing a system or method as claimed, and a machine-readable storage storing such a program (e.g. a non-transitory computer readable medium). Still further, embodiments of the present invention may be conveyed electronically via any medium such as a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.
It will be appreciated that various changes and modifications can be made to the present invention without departing from the scope of the present application.

Claims (20)

  1. A control system for providing assistance to an occupant of a vehicle, the control system comprising one or more controllers, configured to: receive one or more vehicle parameters; receive image data from an imaging device comprising captured images of an environment external to the vehicle; receive or determine topography data indicative of a topography of a surface within the environment; determine a vehicle trajectory in dependence on the one or more parameters; determine a composite image sequence comprising one or more image frames, the or each image frame comprising one of said captured images and a trajectory indicator indicative of the determined vehicle trajectory; and output a signal indicative of the composite image sequence to a display for displaying the composite image sequence to the occupant of the vehicle; wherein the control system is configured to position the trajectory indicator at one or more locations within the or each image frame in dependence on the topography data.
  2. A control system of claim 1, wherein the one or more controllers collectively comprise: at least one electronic processor having an electrical input for receiving the one or more vehicle parameters and/or the image data; and at least one electronic memory device operatively coupled to the at least one electronic processor and having instructions stored therein; wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions stored therein so as to determine the composite image sequence.
  3. A control system as claimed in claim 1 or claim 2, configured to position the trajectory indicator within the or each image frame such that an apparent orientation of the trajectory indicator at the or each image frame location is substantially equivalent to the orientation of the surface at the image frame location.
  4. A control system as claimed in claim 3, configured to position the trajectory indicator within the or each image frame such that the trajectory indicator is indicative of the determined trajectory of the vehicle over the surface.
  5. A control system as claimed in any preceding claim, wherein the topography data comprises a surface profile of the surface, and the control system is configured to: correlate the surface profile with the surface within the or each image frame; and position the trajectory indicator in accordance with an orientation of the surface profile at the one or more image frame locations.
  6. A control system as claimed in any of claims 1 to 4, wherein the topography data comprises a point cloud comprising a plurality of data points indicative of the topography of the surface within the environment, and the control system is configured to: correlate the point cloud with the surface within the or each image frame; and position the trajectory indicator in accordance with the point cloud at the one or more image frame locations.
  7. A control system as claimed in any of claims 1 to 4, wherein the topography data comprises sensor data from one or more sensors associated with the vehicle, and the control system is configured to: determine a topography of the surface within the environment in dependence on the sensor data; and position the trajectory indicator within the or each image frame in dependence on the determined topography.
  8. A control system as claimed in claim 7, configured to: determine a topography of the surface within the environment by determining a surface profile of the surface in dependence on the sensor data; correlate the surface profile with the surface within the or each image frame; and position the trajectory indicator in accordance with an orientation of the surface profile at the one or more image frame locations.
  9. A control system as claimed in claim 7, configured to: determine a topography of the surface within the environment by determining a point cloud comprising a plurality of data points indicative of the topography of the surface in dependence on the sensor data; correlate the point cloud with the surface within the or each image frame; and position the trajectory indicator in accordance with the point cloud at the one or more image frame locations.
  10. A control system as claimed in any of claims 7 to 9, wherein the one or more sensors comprise any one or more selected from: a camera; a RADAR sensor; a LIDAR sensor; an ultrasonic sensor; a position sensor; and an attitude sensor.
  11. A control system as claimed in any preceding claim, wherein the one or more vehicle parameters comprise any one or more selected from: a steering angle of the vehicle; a velocity of the vehicle; and an orientation of the vehicle about one or more axes.
  12. A control system as claimed in any preceding claim, configured to: receive or determine vehicle orientation data indicative of an orientation of the vehicle about one or more axes; determine a relative orientation of the vehicle with respect to the surface in dependence on the vehicle orientation data; and position the trajectory indicator at the one or more image frame locations in dependence on the relative orientation of the vehicle.
  13. A control system as claimed in any preceding claim, wherein the trajectory indicator comprises a plurality of indicator sections; and wherein the control system is configured to position each trajectory indicator section at a respective image frame location in dependence on the topography of the surface at the respective image frame location.
  14. A control system as claimed in claim 13, wherein each indicator section is moveable about a respective pivot point associated with an end point of a preceding indicator section, and wherein the control system is configured to move one or more of the plurality of indicator sections about its respective pivot point in dependence on the topography of the surface at a corresponding image frame location.
  15. A system comprising the control system of any one of claims 1 to 14, and one or more selected from: an imaging device configured to capture one or more images of an environment external to the vehicle; and a display configured to display the composite image sequence to the occupant of the vehicle.
  16. A vehicle comprising the control system of any one of claims 1 to 14, or a system according to claim 15.
  17. A method for providing assistance to an occupant of a vehicle, the method comprising: receiving one or more vehicle parameters; receiving image data from an imaging device comprising captured images of an environment external to the vehicle; receiving or determining topography data indicative of a topography of a surface within the environment; determining a vehicle trajectory in dependence on the or each parameter; determining a composite image sequence comprising one or more image frames, the or each image frame comprising a captured image and a trajectory indicator indicative of the determined vehicle trajectory; and outputting a signal indicative of the composite image sequence to a display for displaying the composite image sequence to the occupant of the vehicle; wherein the method comprises positioning the trajectory indicator at one or more locations within the or each image frame in dependence on the topography data.
  18. A method as claimed in claim 17, comprising: positioning the trajectory indicator within the or each image frame such that an apparent orientation of the trajectory indicator at the or each image frame location is substantially equivalent to the orientation of the surface at the image frame location.
  19. A method as claimed in claim 17 or claim 18, wherein the topography data comprises sensor data from one or more sensors, and the method comprises: determining a topography of the surface in dependence on the sensor data; and positioning the trajectory indicator within the or each image frame in dependence on the determined topography.
  20. A non-transitory computer readable medium having instructions stored therein which, when executed by a computing means, perform the method according to any of claims 17 to 19.
GB1900333.4A 2019-01-10 2019-01-10 A control system, system and method for providing assistance to an occupant of a vehicle Active GB2580401B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
GB1900333.4A GB2580401B (en) 2019-01-10 2019-01-10 A control system, system and method for providing assistance to an occupant of a vehicle
DE112020000391.4T DE112020000391T5 (en) 2019-01-10 2020-01-06 A control system, system and method for assisting a vehicle occupant
US17/422,163 US11919513B2 (en) 2019-01-10 2020-01-06 Control system, system and method for providing assistance to an occupant of a vehicle
PCT/EP2020/050119 WO2020144129A1 (en) 2019-01-10 2020-01-06 A control system, system and method for providing assistance to an occupant of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1900333.4A GB2580401B (en) 2019-01-10 2019-01-10 A control system, system and method for providing assistance to an occupant of a vehicle

Publications (3)

Publication Number Publication Date
GB201900333D0 GB201900333D0 (en) 2019-02-27
GB2580401A true GB2580401A (en) 2020-07-22
GB2580401B GB2580401B (en) 2021-06-30

Family

ID=65528058

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1900333.4A Active GB2580401B (en) 2019-01-10 2019-01-10 A control system, system and method for providing assistance to an occupant of a vehicle

Country Status (1)

Country Link
GB (1) GB2580401B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011007562A (en) * 2009-06-24 2011-01-13 Toshiba Alpine Automotive Technology Corp Navigation device for vehicle and navigation method
JP2012108047A (en) * 2010-11-18 2012-06-07 Denso Corp Vehicle navigation device
GB2525053A (en) * 2014-04-09 2015-10-14 Jaguar Land Rover Ltd Apparatus and method for displaying information
WO2016188547A1 (en) * 2015-05-22 2016-12-01 Here Global B.V. An apparatus and associated methods for providing turn guidance
US20180164115A1 (en) * 2016-12-13 2018-06-14 Inventec (Pudong) Technology Corporation Vehicle Navigation Projection System And Method Thereof
US20190005727A1 (en) * 2017-06-30 2019-01-03 Panasonic Intellectual Property Management Co., Ltd. Display system, information presentation system, control method of display system, storage medium, and mobile body

Similar Documents

Publication Publication Date Title
EP3640599B1 (en) Vehicle localization method and apparatus
US10354151B2 (en) Method of detecting obstacle around vehicle
CN108692699B (en) Vehicle and method for collision avoidance assistance
CN105984464B (en) Controller of vehicle
KR102508843B1 (en) Method and device for the estimation of car egomotion from surround view images
Gehrig et al. Dead reckoning and cartography using stereo vision for an autonomous car
US10046803B2 (en) Vehicle control system
US9902425B2 (en) System for guiding trailer along target route during reversing maneuver
US8233660B2 (en) System and method for object motion detection based on multiple 3D warping and vehicle equipped with such system
RU2721860C2 (en) Steering column torque control system and method
US20050201593A1 (en) Vehicle state sensing system and vehicle state sensing method
US20100315505A1 (en) Object motion detection system based on combining 3d warping techniques and a proper object motion detection
JP2021049969A (en) Systems and methods for calibrating steering wheel neutral position
US10907972B2 (en) 3D localization device
CN110316197A (en) Tilt evaluation method, inclination estimation device and the non-transitory computer-readable storage media for storing program
CN112433531A (en) Trajectory tracking method and device for automatic driving vehicle and computer equipment
CN114167470A (en) Data processing method and device
CN111123950A (en) Driving control method and device and vehicle
CN111284477A (en) System and method for simulating steering characteristics
CN114518119A (en) Positioning method and device
Miao et al. Path-following control based on ground-watching navigation
Muro et al. Moving-object detection and tracking by scanning LiDAR mounted on motorcycle based on dynamic background subtraction
GB2580400A (en) A control system, system and method for providing assistance to an occupant of a vehicle
US11919513B2 (en) Control system, system and method for providing assistance to an occupant of a vehicle
GB2580401A (en) A control system, system and method for providing assistance to an occupant of a vehicle