GB2571983A - Method of controlling an in-vehicle monitoring system - Google Patents

Method of controlling an in-vehicle monitoring system

Info

Publication number
GB2571983A
GB2571983A (Application GB1804177.2A)
Authority
GB
United Kingdom
Prior art keywords
image capture
driver
obscuration
capture system
dependence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1804177.2A
Other versions
GB201804177D0 (en)
Inventor
Singh Harpreet
Dias Eduardo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1804177.2A
Publication of GB201804177D0
Publication of GB2571983A
Legal status: Withdrawn

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1103 - Detecting eye twinkling
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/163 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00 - Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02 - Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • B60K28/06 - Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/59 - Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 - Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/06 - Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 - Evaluating a particular growth phase or type of persons or animals
    • A61B2503/20 - Workers
    • A61B2503/22 - Motor vehicles operators, e.g. drivers, pilots, captains

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • Child & Adolescent Psychology (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Ophthalmology & Optometry (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Transportation (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Emergency Management (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed is a method for controlling an in-vehicle monitoring system, the method comprising: monitoring a driver of the vehicle using an image capture system comprising one or more image capture devices (e.g. cameras) 210; determining that the driver will be obscured from the image capture system by an obscuration 212, creating an obscuration event; generating an obscuration signal in dependence on said determination; and, in dependence on the obscuration signal, altering the viewpoint of the image capture system so as to reduce the time during which the driver is obscured from the image capture system by the obscuration. Altering the viewpoint may be achieved by moving the image capture device (fig. 5) or by switching from a first image capture device 210 to a second image capture device 214 (fig. 6). The obscuration signal may be generated in response to a detected obstruction (e.g. the driver's hand, a steering wheel spoke, etc.). Alternatively, the system may predict that an obscuration event is going to occur by analysing vehicle or road parameters and, for instance, establishing that the steering wheel will be turned. The analysis may also account for past driver behaviour, e.g. likely hand position.

Description

METHOD OF CONTROLLING AN IN-VEHICLE MONITORING SYSTEM
TECHNICAL FIELD
The present disclosure relates to a method of controlling an in-vehicle monitoring system. Aspects of the invention relate to a method, to an in-vehicle monitoring system, and to a vehicle.
BACKGROUND
In-vehicle cameras and other visual sensors are capable of monitoring the driver of the vehicle for signs that their attention is being diverted from the road. Such lapses in attention for any length of time can be critical to the safety of the driver, their passengers and other road users. If the camera's view is obstructed or obscured for even a short period of time, there is a risk that the collection of information regarding the driver's attention or lack thereof will be interrupted.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a method, an in-vehicle monitoring system, a controller for an in-vehicle monitoring system, a computer program product, and a vehicle, as claimed in the appended claims.
According to an aspect of the invention, there is provided a method for controlling an in-vehicle monitoring system, the method comprising: monitoring the driver of the vehicle using an image capture system, the image capture system comprising one or more image capture devices; and, in dependence on an obscuration signal, altering the viewpoint of the image capture system to reduce the time during which the driver is obscured from the image capture system by an obscuration event.
According to another aspect of the invention, there is provided a method for controlling an in-vehicle monitoring system, the method comprising:
monitoring the driver of the vehicle using an image capture system, the image capture system comprising one or more image capture devices;
determining that the driver will be obscured from the image capture system by an obscuration, creating an obscuration event;
generating an obscuration signal in dependence on said determination; and, in dependence on the obscuration signal, altering the viewpoint of the image capture system so as to reduce the time during which the driver is obscured from the image capture system by the obscuration.
The obscuration event may begin before the obscuration signal is transmitted. The method may comprise determining that the driver is obscured from the image capture system and generating the obscuration signal in dependence on said determination.
The obscuration event may begin after the obscuration signal is transmitted. In this way, the alteration of the viewpoint of the image capture system anticipates the obscuration event. The method may further comprise: analysing one or more vehicle parameters and in dependence on the analysis, determining that the driver will be obscured for a future period of time; and generating the obscuration signal in response to the determination. The method may further comprise: analysing one or more parameters relating to the road ahead and in dependence on the analysis, determining that the driver will be obscured for a future period of time; and generating the obscuration signal in response to the determination. The analysis may be made in dependence on data relating to past driver behaviour.
In an example, altering the viewpoint of the image capture system may comprise moving one or more of the image capture devices. In another example, the monitoring of the driver may be carried out by one of a plurality of image capture devices and altering the viewpoint of the image capture system may comprise changing which of the plurality of image capture devices is carrying out the monitoring of the driver. In an example, one or more of the image capture devices may be arranged to observe the face of the driver, and in particular, observe the driver’s eyes. Advantageously, one or more image capture devices may be arranged to observe one or more of the following: direction of gaze of the driver; orientation of the driver’s head; and blink behaviour of the driver.
According to a further aspect of the invention, there is provided an in-vehicle monitoring system comprising: an image capture system, the image capture system comprising one or more image capture devices; and a processor configured to generate an obscuration signal, the signal indicating that it is necessary to alter the viewpoint of the image capture system to reduce the time during which the driver is obscured from the image capture system by an obscuration event; wherein the image capture system is configured, in dependence on the signal, to alter the viewpoint of the image capture system.
The in-vehicle monitoring system may comprise an analysis unit that, in dependence on detecting that the driver is obscured from the image capture system, provides an output to the processor, wherein the processor generates the obscuration signal in response to receiving the output from the analysis unit.
The in-vehicle monitoring system may comprise an analysis unit that, in dependence on determining that the driver will be obscured from the image capture system for a future period of time, provides an output to the processor, wherein the processor generates the obscuration signal in dependence on receiving the output from the analysis unit. The analysis unit may be configured to learn driver behavioural information, the driver behavioural information relating to past driver behaviour.
Altering the viewpoint of the image capture system may comprise moving one or more of the image capture devices.
The monitoring of the driver may be carried out by one of a plurality of image capture devices and altering the viewpoint of the image capture system may comprise changing which of the plurality of image capture devices is carrying out the monitoring of the driver.
According to a still further aspect of the invention, there is provided a vehicle comprising an in-vehicle monitoring system as outlined above.
According to still another aspect of the invention, there is provided a controller for an in-vehicle monitoring system, the controller comprising a data processor arranged to perform the method of any one of the preceding paragraphs.
According to a yet further aspect of the invention, there is provided a computer program product comprising computer executable instructions which, when executed by a data processor, cause the data processor to perform the method of any one of the preceding paragraphs.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic illustration of a vehicle including an in-vehicle monitoring system according to embodiments of the present invention;
Figure 2 is a schematic illustration of a side view of the inside of a vehicle incorporating an in-vehicle monitoring system in accordance with embodiments of the present invention;
Figure 3 is a schematic view of an in-vehicle monitoring system in accordance with an embodiment of the present invention;
Figure 4 is a further schematic view of an in-vehicle monitoring system operating in accordance with an embodiment of the present invention;
Figure 5 is a further schematic view of an in-vehicle monitoring system operating in accordance with an embodiment of the present invention;
Figure 6 is a further schematic view of an in-vehicle monitoring system operating in accordance with an embodiment of the present invention; and
Figure 7 shows a schematic illustration of a method for controlling an in-vehicle monitoring system, in accordance with embodiments of the present invention.
DETAILED DESCRIPTION
Referring first to Figure 1, a vehicle 100 comprising an in-vehicle monitoring system 102 is shown. The in-vehicle monitoring system 102 comprises an image capture system 104 and a processor 106 in communication with the image capture system 104. The in-vehicle monitoring system 102 further comprises an analysis system 108 in communication with the image capture system 104 and the processor 106.
Referring now to Figure 2, an in-vehicle monitoring system 102 is shown in a vehicle 100. The vehicle 100 comprises a steering wheel 202 mounted on a steering column 203 extending towards a driver 206 from a dashboard 204, behind which the driver 206 sits. The in-vehicle monitoring system 102 comprises an image capture system 104. In this example, the image capture system 104 is mounted on the dashboard. Equally, it could be mounted on the steering wheel support, or anywhere else in the vehicle with a view of the driver.
The image capture system 104 comprises one or more image capture devices, at least one of the image capture devices being positioned so as to monitor the driver. In an example, one or more of the image capture devices are arranged to observe the driver and, in particular, the driver’s face. If more than one image capture device is used, it may be that only one of the image capture devices monitors the driver's face at any one time. The remaining image capture devices may remain switched off or inactive until they are required. One or more of the image capture devices may take the form of electronic video or still image cameras, arranged to capture images of the driver. The or each camera may be sensitive to wavelengths of light visible to the driver, or may additionally or alternatively be sensitive to light in the infrared and/or ultraviolet regions of the electromagnetic spectrum as may be desired. The image capture system 104 may further comprise subject illumination means, not shown, arranged to illuminate the driver to aid the capture of useful images by one or more image capture devices, making them less susceptible to variations in ambient lighting.
Monitoring the driver's face may refer to, in particular, monitoring the driver's eyes, or gaze, and the image capture system 104 may obtain information relating to the gaze direction, for example. Alternatively, it may refer to the image capture system 104 being configured to monitor the driver's head position or rotation.
Such monitoring may be useful, for example, for determining whether a driver is paying attention to the road whilst driving. If it is determined, through monitoring the driver's face, that the driver is not paying attention to the road, an audible or haptic alarm may be issued to the driver. The monitoring of the face may comprise observing the driver’s face and monitoring their eyes to capture blink data, such as blink rate data, blink duration data, and eyelid movement rate data. For example, the blink data may include data indicative of the start point and end point of a blink, as well as data indicative of the degree to which the driver’s eyes are open or closed. The received data may also include data indicative of more general head movement of the driver.
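By way of illustration only, the following sketch shows one hypothetical shape such blink data could take and how a blink rate might be derived from it; the field names and the helper function are assumptions, not anything specified in this disclosure.

```python
# Illustrative record of the blink data mentioned above; the fields are
# assumptions intended only to show what such data could look like.
from dataclasses import dataclass


@dataclass
class BlinkSample:
    timestamp_s: float
    eye_openness: float   # 0.0 = fully closed, 1.0 = fully open
    blink_start: bool     # frame marks the start point of a blink
    blink_end: bool       # frame marks the end point of a blink


def blink_rate_per_minute(samples) -> float:
    """Estimate blink rate from a sequence of BlinkSample records."""
    if not samples:
        return 0.0
    blinks = sum(1 for s in samples if s.blink_start)
    duration_s = max(samples[-1].timestamp_s - samples[0].timestamp_s, 1e-6)
    return 60.0 * blinks / duration_s
```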
Figure 3 shows a view of the vehicle dashboard from the driver's perspective. In this scenario, both the rim of the steering wheel 202 and an image capture device 210 of the image capture system 104 are visible. The image capture device 210 is able to monitor the driver's face, as its view of the driver's face is not being obstructed.
Referring now to Figure 4, the same perspective is shown as is shown in Figure 3. However, in this instance, an obscuration 212 obscures the image capture device 210, thereby blocking the driver's face from the image capture device 210. As such, the image capture device 210 is unable to monitor the driver's face for the duration of the obscuration event. The obscuration 212 may, for example, be the driver's hand positioned on the steering wheel 202, or one of the spokes of the steering wheel 202 itself.
In one example, after determining that the driver's face is obscured from the image capture system 104 due to the obscuration 212, an obscuration signal is generated, indicating that the driver's face is obscured from the viewpoint of the image capture system 104. Viewpoint is a term that may, for example, refer to the position from which the driver's face is viewed by the image capture system 104.
To determine that the driver's face is obscured from the image capture system 104 due to an obscuration 212, an analysis unit of the analysis system 108 may be configured to analyse the output of the image capture system 104 to recognise key facial features of the driver 206, for example. If these key facial features are not present in the output image, it may signify that an obscuration event is taking place. This analysis unit may be the same analysis unit that analyses data from the image capture system 104 to determine if, for example, the driver 206 is concentrating on the road.
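As an illustration of this step, the sketch below flags an obscuration event when key facial features are missing from a frame. The landmark detector, feature names and confidence threshold are assumptions introduced purely for the example, not part of the disclosure.

```python
# Illustrative sketch only: one possible way to flag an obscuration event
# when key facial features are missing from the camera output. The landmark
# detector is an assumed callable returning feature name -> confidence.
from typing import Callable, Mapping

KEY_FEATURES = ("left_eye", "right_eye", "nose_bridge")  # assumed feature set
CONFIDENCE_THRESHOLD = 0.5                               # assumed threshold


def is_face_obscured(frame,
                     detect_landmarks: Callable[[object], Mapping[str, float]]) -> bool:
    """Return True if any key facial feature is absent or weakly detected."""
    landmarks = detect_landmarks(frame)
    return any(landmarks.get(name, 0.0) < CONFIDENCE_THRESHOLD
               for name in KEY_FEATURES)


def generate_obscuration_signal(frame, detect_landmarks) -> bool:
    # In this sketch the "signal" is simply a boolean; a real system would
    # publish it to the processor that alters the camera viewpoint.
    return is_face_obscured(frame, detect_landmarks)
```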
In response to, or in dependence on, receiving the obscuration signal, the in-vehicle monitoring system 102 alters the viewpoint of the image capture system 104 so as to reduce the time during which the driver's face is obscured from the image capture system 104.
One possible way of altering the viewpoint of the image capture system 104 so as to reduce the time during which the driver's face is obscured from the image capture system 104 is shown in Figure 5. In dependence on, or in response to, the obscuration signal, the image capture device 210 that was monitoring the driver's face is moved until the obscuration 212 no longer obscures the driver's face from the image capture device 210, meaning that the image capture device 210 can continue to monitor the driver's face. Such movement of the image capture device 210 may comprise a translation, a rotation, or a translation combined with a rotation.
An alternative way of altering the viewpoint of the image capture system 104 so as to reduce the time during which the driver's face is obscured from the image capture system 104 is shown in Figure 6. The image capture system 104 shown in Figure 6 comprises at least two image capture devices 210, 214. Following the obscuration of the first image capture device 210 due to the obscuration 212, the viewpoint of the image capture system 104 is altered by switching the monitoring feed from the first, obscured image capture device 210 to a second, unobscured image capture device 214, thus ensuring that the image capture system 104 can continue to monitor the driver's face. If it is determined that the obscuration 212 has been removed, the image capture system 104 may switch the monitoring feed back to the first image capture device 210. In this case therefore, altering the viewpoint of the image capture system 104 means changing which of the image capture devices 210, 214 of the image capture system 104 is carrying out the monitoring of the driver's face.
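The two ways of altering the viewpoint described above, moving an image capture device or switching which device carries out the monitoring and switching back once the obscuration is removed, could be organised along the lines of the following sketch. The class names, the pan/tilt interface and the fallback behaviour are assumptions for illustration only.

```python
# Illustrative sketch of the two viewpoint-altering strategies: repositioning
# the active device, or switching the monitoring feed between devices.
# Names and interfaces are assumed, not taken from the disclosure.
from dataclasses import dataclass


@dataclass
class ImageCaptureDevice:
    device_id: int
    pan_deg: float = 0.0
    tilt_deg: float = 0.0

    def move(self, d_pan: float, d_tilt: float) -> None:
        """Stand-in for a translation/rotation: adjust pan and tilt."""
        self.pan_deg += d_pan
        self.tilt_deg += d_tilt


class ImageCaptureSystem:
    def __init__(self, devices):
        self.devices = list(devices)
        self.active = self.devices[0]   # device currently monitoring the driver
        self._previous = None

    def alter_viewpoint(self, obscured: set[int]) -> None:
        """On an obscuration signal, switch to any unobscured device,
        or nudge the active device if no alternative is available."""
        if self.active.device_id in obscured:
            for device in self.devices:
                if device.device_id not in obscured:
                    self._previous, self.active = self.active, device
                    return
            self.active.move(d_pan=10.0, d_tilt=0.0)  # fall back to moving the camera

    def restore_if_clear(self, obscured: set[int]) -> None:
        """Switch back to the original device once the obscuration is removed."""
        if self._previous and self._previous.device_id not in obscured:
            self.active, self._previous = self._previous, None
```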
In the example outlined above, an obscuration event takes place, obscuring the image capture system 104, after which the viewpoint of the image capture system 104 is altered such that the driver's face is no longer obscured by the obscuration 212. Such a scenario is an example of a reactive system. However, in another example, the invention also provides for a pre-emptive system, in which it is determined that an obscuration event is about to take place, and the viewpoint of the image capture system 104 is altered in anticipation of the obscuration 212 such that the image capture system 104 is not obscured by the obscuration 212, or the time for which it is obscured is reduced or minimised.
An impending obscuration event may be determined by an analysis unit which monitors and analyses one or more of: vehicle parameters, the road ahead, and driver behaviour.
Vehicle parameters may include: vehicle speed; steering wheel angle, direction of steering wheel input (clockwise or anti-clockwise), and rate of change of steering wheel input; the operating status of features such as the on-board radio or other infotainment system; and if the vehicle reverse gear is engaged (which would imply that the driver may be looking backwards in the direction of travel, or looking at rear view mirrors rather than towards the road directly in front of the vehicle), for example. Features of the road ahead may be derived from sensors on the vehicle 100, or alternatively or additionally using a GPS satellite navigation system. Driver behaviour may include: information on past driving behaviour and how the driver typically behaves or reacts to certain situations. For example, this information may include how the driver typically holds the steering wheel 202 (i.e. placement of the driver's hands on the steering wheel 202) in various scenarios. For instance, the driver may move their hand above the steering wheel 202 after 20-30 minutes of a long journey, or they may hold the steering wheel 202 differently while overtaking.
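Purely to make these categories concrete, the snippet below gathers a few of the listed inputs into simple records; the particular fields, units and names are assumptions chosen to mirror the parameters listed above.

```python
# Illustrative records for the inputs the analysis unit might consider.
# Field names and units are assumptions, not part of the disclosure.
from dataclasses import dataclass, field


@dataclass
class VehicleParameters:
    speed_kph: float
    steering_angle_deg: float
    steering_rate_deg_s: float    # rate of change of steering input
    steering_clockwise: bool      # direction of steering input
    reverse_gear_engaged: bool
    infotainment_active: bool


@dataclass
class RoadAhead:
    curvature_1_per_m: float      # from forward sensors or satellite navigation
    distance_to_bend_m: float


@dataclass
class DriverBehaviour:
    # Learned habits, e.g. typical hand placement per driving scenario.
    typical_hand_positions: dict = field(default_factory=dict)
```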
As an example, the vehicle 100 may be approaching a bend in the road. The analysis system recognises that when the driver 206 steers around a bend such as the upcoming bend, they place their hand on the steering wheel 202 in a manner that obscures their face from the image capture system 104. Thus, the analysis system recognises that the presence of the bend on the road ahead means that an obscuration 212 is upcoming. The analysis system may then generate an obscuration signal that causes the viewpoint of the image capture system 104 to be altered, such that any obscuration caused by the obscuration 212 is avoided, or at least reduced or minimised.
Figure 7 shows a schematic illustration of a method 300 by which the in-vehicle monitoring system 102 as described above may be controlled.
In Figure 7, the method 300 comprises the steps of monitoring, at step 302, the driver 206 using an image capture system 104 as described above. The image capture system 104 comprises one or more image capture devices 210, 214, as has been described with reference to the system 102 above. The method then moves to step 304, where it determines whether the driver 206 will be obscured from the image capture system 104 by an obscuration 212, such as the driver’s hand or a spoke of the steering wheel 202 (as shown in Figure 4), which would create an obscuration event 312. At step 310, the method generates an obscuration signal 320 in dependence on that determination. In dependence on the obscuration signal 320, the method at step 324 alters the viewpoint of the image capture system 104 so as to reduce the time during which the driver is obscured from the image capture system 104 by the obscuration 212.
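A compact sketch of the loop that steps 302, 304, 310 and 324 describe is given below; the helper objects and method names are placeholders for the behaviour described in the text, not an implementation of the claimed method.

```python
# Illustrative control loop for the method of Figure 7. The helpers
# (capture_frame, obscuration_expected_or_present, alter_viewpoint) stand in
# for behaviour described in the surrounding text and are assumptions.
import time


def monitoring_loop(image_capture_system, analysis_system, period_s: float = 0.05):
    while True:
        frame = image_capture_system.capture_frame()                # step 302: monitor the driver
        obscured = analysis_system.obscuration_expected_or_present(frame)  # step 304: determine obscuration
        if obscured:
            obscuration_signal = True                               # step 310: generate the signal
            image_capture_system.alter_viewpoint(obscuration_signal)  # step 324: alter the viewpoint
        time.sleep(period_s)
```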
In an example, the method 300 operates in a reactive mode, in which the obscuration event 312 begins before the obscuration signal 320 is transmitted. In this example, the determination at step 304 that the driver is obscured from the image capture system 104 and the subsequent generation at step 310 of the obscuration signal 320 are sequential and are made in dependence on the determination that an obscuration event 312 has already occurred.
In another example, the method 300 operates in a predictive mode, in which the obscuration event 312 begins after the obscuration signal 320 is transmitted. In this example, the determination at step 304 that the driver will be obscured from the image capture system 104 and the subsequent generation at step 310 of the obscuration signal 320 are predictive and are made in dependence on monitoring of a potential obscuration 212 which is moving relative to the image capture device. In dependence on the path of the potential obscuration 212 and its movement along that path towards a position where it will cause an obscuration event 312, the in-vehicle monitoring system 102, by means of the analysis system 108, predicts when an obscuration event 312 will occur. At step 310, the obscuration signal 320 is then generated before the obscuration event 312 occurs, allowing time for the viewpoint of the image capture system 104 to be altered at step 324, either by repositioning an image capture device or by switching to another image capture device that is not obscured by the obscuration 212.
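One way such a prediction might be made, assuming the potential obscuration is tracked in image coordinates, is sketched below; the blocking region, prediction horizon and sampling scheme are all illustrative assumptions.

```python
# Illustrative sketch of the predictive mode: extrapolate the path of a
# potential obscuration (e.g. the driver's hand) and raise the obscuration
# signal before it reaches the camera's line of sight to the driver's face.
def predict_obscuration(position_px, velocity_px_s, blocking_region, horizon_s=0.5):
    """Return True if the tracked object is expected to enter the region of
    the image that blocks the driver's face within the prediction horizon."""
    x, y = position_px
    vx, vy = velocity_px_s
    x_min, y_min, x_max, y_max = blocking_region
    steps = 10
    for i in range(1, steps + 1):
        t = horizon_s * i / steps
        px, py = x + vx * t, y + vy * t
        if x_min <= px <= x_max and y_min <= py <= y_max:
            return True
    return False
```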
In this example of a predictive mode, the method 300 may be made more reliable by analysing one or more vehicle parameters, shown generally at 202’. As vehicle parameter data 202’ is optional for the method 300, the signal is shown as a dashed line 202’ in Figure 7. The vehicle parameters include one or more of: vehicle speed; steering wheel angle; direction of steering wheel input (clockwise or anti-clockwise); and/or rate of change of steering wheel input.
Additionally or alternatively, the analysis system 108 may further analyse one or more parameters relating to the road ahead by means of a vehicle-mounted forward-looking imaging means 410, such as a radar, lidar or stereoscopic camera arranged to view the road ahead of the vehicle. As analysis of data indicative of parameters relating to the road ahead of the vehicle is optional for the method 300, the signal is shown as a dashed line 410 in Figure 7.
In dependence on analysis of the one or more parameters relating to the road ahead, the analysis system 108 may determine that the driver will be obscured for a future period of time because of the movements they are expected to make in order to control the vehicle on its current path. This may be because, for example, there is a sharp left-hand bend approaching and the driver will probably move their right hand in front of the imaging device as they steer left. The processor 106 will therefore generate an obscuration signal 320 in response to the prediction.
It will be appreciated that the method 300 may further be enhanced by the provision of a learning function, where the in-vehicle monitoring system 102 is further provided with a memory means M accessible to the processor 106 and arranged to store data indicative of past vehicle and driver behaviour. For example, a driver may exhibit the habit of crossing their hands past the 12 o'clock position of the steering wheel, obscuring an image capture device mounted directly above the steering wheel 202, when negotiating tight turns in the road. Where this driver is identified by the system 102, the analysis system 108 may switch from a reactive mode to a predictive or pre-emptive mode and change the viewpoint of the image capture system 104 before the driver's hand has reached a position that obscures an image capture device of the image capture system 104. However, if it is determined, based on historical data captured by the system 102 and stored in the memory means M, that the driver only brings their hands above the 10 o'clock and 2 o'clock positions (for the left and right hand respectively) during low-speed manoeuvres such as parking, then the system 102 may only switch to a predictive mode when travelling slowly or when the vehicle is reversing.
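A minimal sketch of how such a learning function might decide when to switch from the reactive mode to the predictive mode is given below, assuming obscuration events are logged per driver and per driving context; the data layout and threshold are assumptions, not part of the disclosure.

```python
# Illustrative sketch of the learning function: count how often a given
# driver obscures the camera in a given driving context, and enable the
# predictive mode only for contexts where that habit has been observed.
from collections import defaultdict


class BehaviourMemory:
    def __init__(self, min_observations: int = 5):
        self._counts = defaultdict(int)   # (driver_id, context) -> obscuration count
        self._min = min_observations

    def record_obscuration(self, driver_id: str, context: str) -> None:
        """Store that this driver obscured the camera in this context
        (e.g. 'tight_turn', 'low_speed_parking')."""
        self._counts[(driver_id, context)] += 1

    def use_predictive_mode(self, driver_id: str, context: str) -> bool:
        """Switch from reactive to predictive mode once the habit is established."""
        return self._counts[(driver_id, context)] >= self._min
```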

Claims (19)

1. A method for controlling an in-vehicle monitoring system, the method comprising: monitoring the driver of the vehicle using an image capture system, the image capture system comprising one or more image capture devices;
determining that the driver will be obscured from the image capture system by an obscuration, creating an obscuration event;
generating an obscuration signal in dependence on said determination; and in dependence on the obscuration signal, altering the viewpoint of the image capture system so as to reduce the time during which the driver is obscured from the image capture system by the obscuration.
2. The method of claim 1, wherein the obscuration event begins before the obscuration signal is transmitted.
3. The method of claim 2, comprising determining that the driver is obscured from the image capture system and generating the obscuration signal in dependence on said determination.
4. The method of claim 1, wherein the obscuration event begins after the obscuration signal is transmitted.
5. The method of claim 4, comprising analysing one or more vehicle parameters and in dependence on the analysis, determining that the driver will be obscured for a future period of time; and generating the obscuration signal in response to the determination.
6. The method of claim 4, comprising analysing one or more parameters relating to the road ahead and in dependence on the analysis, determining that the driver will be obscured for a future period of time; and generating the obscuration signal in response to the determination.
7. The method of claim 5 or 6, wherein the analysis is made in dependence on data relating to past driver behaviour.
8. The method of any preceding claim, wherein altering the viewpoint of the image capture system comprises moving one or more of the image capture devices.
9. The method of any preceding claim, wherein the monitoring of the driver is carried out by one of a plurality of image capture devices and altering the viewpoint of the image capture system comprises changing which of the plurality of image capture devices is carrying out the monitoring of the driver.
10. An in-vehicle monitoring system comprising:
an image capture system, the image capture system comprising one or more image capture devices; and a processor configured to generate an obscuration signal, the signal indicating that it is necessary to alter the viewpoint of the image capture system to reduce the time during which the driver is obscured from the image capture system by an obscuration event;
wherein the image capture system is configured, in dependence on the signal, to alter the viewpoint of the image capture system.
11. An in-vehicle monitoring system according to claim 10, comprising an analysis unit that, in dependence on detecting that the driver is obscured from the image capture system, provides an output to the processor, wherein the processor generates the obscuration signal in response to receiving the output from the analysis unit.
12. An in-vehicle monitoring system according to claim 10, comprising an analysis unit that, in dependence on determining that the driver will be obscured for a future period of time from the image capture system, provides an output to the processor, wherein the processor generates the obscuration signal in dependence on receiving the output from the analysis unit.
13. An in-vehicle monitoring system according to claim 12, wherein the analysis unit is configured to learn driver behavioural information, the driver behavioural information relating to past driver behaviour.
14. An in-vehicle monitoring system according to any of claims 10-13, wherein altering the viewpoint of the image capture system comprises moving one or more of the image capture devices.
15. An in-vehicle monitoring system according to any of claims 10-14, wherein the monitoring of the driver is carried out by one of a plurality of image capture devices and altering the viewpoint of the image capture system comprises changing which of the plurality of image capture devices is carrying out the monitoring of the driver.
16. A vehicle comprising an in-vehicle monitoring system in accordance with any one of claims 10-15.
17. A controller for an in-vehicle monitoring system, the controller comprising a data processor arranged to perform the method of any one of claims 1-9.
18. A computer program product comprising computer executable instructions which, when executed by a data processor, cause the data processor to perform the method of any one of claims 1-9.
GB1804177.2A 2018-03-15 2018-03-15 Method of controlling an in-vehicle monitoring system Withdrawn GB2571983A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1804177.2A GB2571983A (en) 2018-03-15 2018-03-15 Method of controlling an in-vehicle monitoring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1804177.2A GB2571983A (en) 2018-03-15 2018-03-15 Method of controlling an in-vehicle monitoring system

Publications (2)

Publication Number Publication Date
GB201804177D0 GB201804177D0 (en) 2018-05-02
GB2571983A true GB2571983A (en) 2019-09-18

Family

ID=62017861

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1804177.2A Withdrawn GB2571983A (en) 2018-03-15 2018-03-15 Method of controlling an in-vehicle monitoring system

Country Status (1)

Country Link
GB (1) GB2571983A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10360176A1 (en) * 2003-12-20 2005-07-21 Volkswagen Ag Optical monitoring system observing vehicle drivers eyes e.g. for signs of tiredness, is positioned to avoid obstruction to beam path during driving
EP1657664A2 (en) * 2004-11-11 2006-05-17 Delphi Technologies, Inc. Vehicular optical system
JP2009143279A (en) * 2007-12-11 2009-07-02 Mazda Motor Corp Vehicle-mounted equipment controller
JP2009201756A (en) * 2008-02-28 2009-09-10 Omron Corp Information processor and information processing method, and program
EP3018629A1 (en) * 2013-07-01 2016-05-11 Pioneer Corporation Imaging system
US20160018889A1 (en) * 2014-07-21 2016-01-21 Tobii Ab Method and apparatus for detecting and following an eye and/or the gaze direction thereof
DE102016213066A1 (en) * 2016-07-18 2018-01-18 Volkswagen Aktiengesellschaft Device for driver observation

Also Published As

Publication number Publication date
GB201804177D0 (en) 2018-05-02

Similar Documents

Publication Publication Date Title
US10789490B2 (en) Method for calculating a display of additional information for an advertisement, a display unit, apparatus for carrying out the method, and transportation vehicle and computer program
RU2514924C2 (en) Forecasting man-machine interface exploiting technology of stare detection, dead zone indicators and driver experience
US9649936B2 (en) In-vehicle device, control method of in-vehicle device, and computer-readable storage medium
US10181266B2 (en) System and method to provide driving assistance
JP4847178B2 (en) Vehicle driving support device
US10040350B2 (en) Control apparatus and related method
US9851715B2 (en) Method for the automatic operation of a vehicle
JP6690581B2 (en) Operation mode switching control device, method and program
US20200039535A1 (en) Method and device for assisting a driver during the deactivation of a highly automated driving mode of a vehicle
US11021103B2 (en) Method for enriching a field of view of a driver of a transportation vehicle with additional information, device for use in an observer transportation vehicle, device for use in an object, and transportation vehicle
JP6460019B2 (en) Vehicle control device
WO2016117272A1 (en) Driving assistance device, method and program
GB2500690A (en) Driver monitoring and vehicle control system
CN111605551B (en) Vehicle control device, vehicle, and vehicle control method
US11214279B2 (en) Controlling the operation of a head-up display apparatus
US10930148B2 (en) Method and device for reminding a driver about an approach to a light signal apparatus
JP6503285B2 (en) Operation control device, operation control method and program
JP6624016B2 (en) Automatic driving control device for vehicles
JP2008030617A (en) Traffic-lane-keeping assist system
US11062149B2 (en) System and method for recording images reflected from a visor
GB2571983A (en) Method of controlling an in-vehicle monitoring system
US20200239073A1 (en) Display controller
CN112429013A (en) Apparatus and method for controlling behavior of autonomous vehicle
JP5289920B2 (en) Vehicle alarm device
WO2023171458A1 (en) Vehicular notification control device and vehicular notification control method

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)