WO2020125943A1 - Automated driving system and method for controlling a cabin illumination - Google Patents

Automated driving system and method for controlling a cabin illumination Download PDF

Info

Publication number
WO2020125943A1
WO2020125943A1 (PCT/EP2018/085435)
Authority
WO
WIPO (PCT)
Prior art keywords
illumination
automated driving
driving system
vehicle
exterior
Prior art date
Application number
PCT/EP2018/085435
Other languages
French (fr)
Inventor
Alexandre GENTNER
Original Assignee
Toyota Motor Europe
Priority date
Filing date
Publication date
Application filed by Toyota Motor Europe filed Critical Toyota Motor Europe
Priority to PCT/EP2018/085435 priority Critical patent/WO2020125943A1/en
Publication of WO2020125943A1 publication Critical patent/WO2020125943A1/en

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q3/00Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
    • B60Q3/80Circuits; Control arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Arrangements Of Lighting Devices For Vehicle Interiors, Mounting And Supporting Thereof, Circuits Therefore (AREA)

Abstract

The invention relates to an automated driving system (30) for a self-driving vehicle comprising a passenger cabin with at least one window, the automated driving system comprising: • a cabin illumination control unit (4) configured to control the interior illumination being inside the passenger cabin, • a first sensor unit (3) configured to sense exterior illumination being outside the vehicle and directed toward the window, and • an electronic control unit (1) configured to: receive a sensor output of the first sensor unit (3), determine at least one first parameter of the exterior illumination based on the sensor output, and control the cabin illumination control unit (4) to generate an interior illumination pattern as a function of the at least one first parameter. The invention further relates to a self-driving vehicle and a method.

Description

AUTOMATED DRIVING SYSTEM AND METHOD FOR CONTROLLING
A CABIN ILLUMINATION
FIELD OF THE DISCLOSURE
[0001] The present disclosure is related to an automated driving system, a self-driving vehicle, and a method of cabin illumination in a self-driving vehicle having an automated driving system.
[0002] An automated driving system is a motor vehicle driving automation system that is capable of performing part or all of the dynamic driving task (DDT) on a sustained basis. An automated driving system may be mounted or is to be mounted in a vehicle (such as a car, a truck, an airplane).
[0003] In the case of road vehicles in particular, it may range in level from no driving automation (level 0) to full driving automation (level 5) according to SAE norm J3016.
[0004] In order to realize this function, an automated driving system normally comprises at least one sensor, an electronic control unit, and feedback devices which transmit information to the driver and/or act on control member(s) of the vehicle (for instance the steering shaft, the brake, the accelerator pedal or the like) instead of the driver to take some driving load off the driver.
[0005] An automated driving system is at least capable of assuming part of the driving task (for instance, to perform longitudinal control of the vehicle). In particular, many automated driving systems are designed to assist the driver and are therefore called Advanced Driver Assistance Systems (ADAS). Some automated driving systems are capable of assuming the whole driving task, at least during some periods. Such systems are classified at level 3, 4 or 5 according to SAE norm J3016.
[0006] The present disclosure concerns an automated driving system classified desirably at level 4 or 5 according to SAE norm J3016.
BACKGROUND OF THE DISCLOSURE
[0007] During the periods when the automated driving system is activated (level 4 or 5 according to SAE norm J3016), all occupants (i.e. passengers or riders) will be able to focus their full attention on non-driving/monitoring related tasks.
[0008] Because the occupants are no longer focusing on the road and on upcoming events, a sensory conflict might occur between visual perception and vestibular perception (i.e. the inner ear). Such upcoming events may include (sudden or unexpected) changes in light, in particular due to changing (exterior) illumination outside the vehicle. Since the vehicle usually has one or several windows, such changes also affect the (interior) illumination inside the vehicle cabin. In other words, changes of exterior illumination may reach the cabin through the windows.
[0009] For example, exterior illumination may be influenced by e.g. shadow areas on the road (e.g. caused by landscape, building, tunnel...), punctual light sources (other vehicle lights, street lighting, etc.), changes in illuminance from sun/moon (e.g. clouds, etc.).
[0010] As a consequence, the occupants of the vehicle may feel a visual discomfort, as also described in Boyce, P.R, (2014). Human Factors in Lighting, Third Edition, CRC Press.
[0011] US20170253254A1 further discloses a sensory stimulation system for an autonomous vehicle (AV) which can monitor a set of maneuvers of the AV. Based on each respective maneuver, the sensory stimulation system can determine a set of sensory stimulation outputs to provide a rider of the AV with sensory indications of the respective maneuver. The sensory stimulation system can then output the set of sensory stimulation outputs via an interior output system. For example, maneuvers like vehicle acceleration, braking or change of direction are indicated by a respectively changing stimulation output.
[0012] However, such systems may be a source of further illumination changes inside the cabin and hence of an additional visual discomfort.
SUMMARY OF THE DISCLOSURE
[0013] Currently, it remains desirable to provide an automated driving system and a method of cabin illumination in a self-driving vehicle having an automated driving system which ameliorates the visual comfort for vehicle occupants when the automated driving system is activated.
[0014] Therefore, according to the embodiments of the present disclosure, an automated driving system for a self-driving vehicle is provided. The automated driving system comprises: - a cabin illumination control unit (4) configured to control the interior illumination, which is the illumination inside the passenger cabin,
- a first sensor unit (3) configured to sense exterior illumination which is the illumination outside the vehicle and which is directed toward the window, and
- an electronic control unit (1) configured to:
o receive a sensor output of the first sensor unit (3), o determine at least one first parameter of the exterior illumination based on the sensor output, and
o control the cabin illumination control unit (4) to generate an interior illumination pattern as a function of the at least one first parameter.
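By way of illustration only, the following Python sketch outlines one possible realisation of this sense-determine-control loop. It is a minimal sketch under assumed interfaces: the functions read_exterior_lux, set_window_opacity and set_light_intensity, as well as the target value and the window transmission factor, are hypothetical and are not taken from the application.

```python
# Minimal, hypothetical sketch of the sense-determine-control loop described above.
TARGET_LUX = 300.0         # assumed predetermined target interior illumination
WINDOW_TRANSMISSION = 0.7  # assumed fraction of exterior light reaching the cabin at zero opacity

def control_step(read_exterior_lux, set_window_opacity, set_light_intensity):
    """One iteration: sense exterior illumination, derive a first parameter,
    and drive the cabin illumination control unit accordingly."""
    exterior_lux = read_exterior_lux()                 # sensor output of the first sensor unit
    transmitted = exterior_lux * WINDOW_TRANSMISSION   # first parameter: light reaching the cabin

    if transmitted > TARGET_LUX:
        # Too much exterior light: shade the window so that only TARGET_LUX passes.
        opacity = 1.0 - TARGET_LUX / transmitted
        set_window_opacity(min(max(opacity, 0.0), 1.0))
        set_light_intensity(0.0)
    else:
        # Not enough exterior light: keep the window clear and add interior light.
        set_window_opacity(0.0)
        set_light_intensity(TARGET_LUX - transmitted)
```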
[0015] By providing such a control device, it becomes possible to improve the visual comfort of the occupants of the self-driving vehicle. In particular, by limiting the changes in illuminance in the field of view of the occupants, it desirably remains possible for them to have some visibility to the exterior (i.e. without completely shading or blackening the windows).
[0016] In particular, in this way it is e.g. possible to target a stable illuminance. Therefore, the occupants of the vehicle will not be confronted with e.g. glare, flicker, or shadows. These aspects of lighting are all sources of visual discomfort. By removing sources of visual discomfort, the performance and pleasure related to in-car activities involving vision can be improved (e.g. reading, discussing, watching a picture or movie). Additionally, an improvement of passengers' visual comfort will also have a positive impact on their holistic comfort.
[0017] Sensing the exterior illumination may comprise sensing the illuminance of at least one exterior light source being outside the vehicle and/or the relative position of the exterior light source with regard to the vehicle.
[0018] Accordingly, in particular light spots, as e.g. electrical light sources or the sun can be detected and their light can be compensated.
[0019] The at least one first parameter may represent at least one of: the illuminance of the sensed exterior illumination, the colour of the sensed exterior illumination, and the spatial and/or temporal distribution of the sensed exterior illumination.
[0020] The interior illumination pattern may be generated such that the interior illumination corresponds to a predetermined target illumination.
Accordingly, a stable or at least partially stabilized illuminance can be achieved.
[0021] The interior illumination pattern may be generated such that temporal and/or spatial changes of the interior illumination are minimized.
Accordingly, any temporal or spatial changes of the external light can be compensated.
[0022] The at least one first parameter may represent an exterior illumination pattern based on the sensor output.
[0023] The exterior illumination pattern may comprise temporal and spatial characteristics of the exterior illumination. These may be caused e.g. by shadow areas on the road (e.g. caused by landscape, building, tunnel...) and/or punctual light sources (other vehicle lights, street lighting, etc.). It is hence possible that periodic illumination patterns occur (e.g. due to tunnel lights).
[0024] The interior illumination pattern may comprise temporal and spatial characteristics which may be selected as a function of the exterior illumination pattern.
[0025] Accordingly, once the exterior illumination pattern has been recognized, the interior illumination pattern may be set such that it compensates the exterior illumination pattern (i.e. to complement it). In particular periodic illumination patterns (e.g. tunnel lights at a constant vehicle speed) may be recognized and compensated reliably due to their stable frequency.
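As a purely illustrative sketch (not part of the application text), a periodic exterior pattern could be recognised from sampled exterior illuminance and a complementary interior pattern derived as follows; the FFT-based frequency estimate and the simple anti-phase compensation rule are assumptions made for this example.

```python
import numpy as np

def complementary_pattern(exterior_lux_samples, sample_rate_hz, target_lux):
    """Estimate the dominant frequency of a periodic exterior illumination
    pattern (e.g. tunnel lights passed at constant speed) and derive an
    interior pattern that compensates the samples towards a constant target."""
    samples = np.asarray(exterior_lux_samples, dtype=float)
    centred = samples - samples.mean()

    spectrum = np.abs(np.fft.rfft(centred))
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / sample_rate_hz)
    dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC component

    # Anti-phase compensation: the interior light adds what the exterior lacks,
    # clipped at zero because light sources cannot emit "negative" light.
    interior_pattern = np.clip(target_lux - samples, 0.0, None)
    return dominant_hz, interior_pattern
```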
[0026] The cabin illumination control unit may comprise: at least one dimmable light source configured to illuminate the cabin interior with a variable illuminance and/or colour and/or spatial distribution, and/or at least one shading unit configured to variably shade the at least one window including a variable opacity degree and/or colour filter and/or spatial distribution across the window. Alternatively, the cabin illumination control unit may be connected to and control the at least one dimmable light source and/or at least one shading unit.
[0027] Accordingly, the illumination control unit is able to generate light, i.e. to additionally illuminate, and to filter/block light from outside, i.e. to selectively reduce the illumination.
[0028] The shading unit comprises at least one controllable filter arranged on or inside the window.
[0029] The automated driving system may further comprise a geo localization unit, and a database configured to store map data.
[0030] The electronic control unit may be configured to determine the at least one first parameter of exterior illumination further based on the geo localization of the vehicle and the map data. Accordingly, by consulting a database of geotagged constant illuminance environments (e.g. shadows from tunnels, other elements), the at least one first parameter of exterior illumination may be determined, refined, or verified.
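A minimal sketch of such a geotagged lookup is given below, assuming a hypothetical keying of map segments to expected exterior illuminance; the segment identifiers, stored values and fusion rule are illustrative only.

```python
# Hypothetical geotagged constant-illuminance lookup: map data tag road segments
# (e.g. tunnels) with an expected exterior illuminance, which can be used to
# refine or verify the value measured by the first sensor unit.
GEOTAGGED_ILLUMINANCE = {
    "tunnel_segment_12": 80.0,          # illustrative value in lux
    "open_highway_segment_13": 25000.0,
}

def refine_exterior_lux(measured_lux, segment_id, tolerance=0.3):
    """Fuse the sensed value with the stored value for the current map segment."""
    stored = GEOTAGGED_ILLUMINANCE.get(segment_id)
    if stored is None:
        return measured_lux                   # no map information: trust the sensor
    if abs(measured_lux - stored) <= tolerance * stored:
        return 0.5 * (measured_lux + stored)  # consistent: average the two estimates
    return measured_lux                       # inconsistent: fall back to the sensor
```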
[0031] The automated driving system may further comprise: a second sensor unit configured to sense the interior illumination.
[0032] The electronic control unit may be configured to receive a sensor output of the second sensor unit, determine at least one second parameter of the interior illumination based on the second sensor output, and control the cabin illumination control unit to generate the interior illumination pattern as a function of the at least one second parameter.
[0033] Accordingly, the cabin illumination control unit can also be controlled (in addition) based on the detected interior illumination. In this way it can be verified that the interior illumination is stable.
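For illustration, a simple proportional feedback step based on the second sensor output could look as follows; the gain and the intensity limit are assumed values, not taken from the application.

```python
def feedback_correction(measured_interior_lux, target_lux, current_intensity,
                        gain=0.2, max_intensity=1000.0):
    """Hypothetical closed-loop step: adjust the interior light intensity so that
    the measured interior illuminance (the second parameter) converges to the
    target, thereby verifying that the interior illumination stays stable."""
    error = target_lux - measured_interior_lux
    new_intensity = current_intensity + gain * error
    return min(max(new_intensity, 0.0), max_intensity)
```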
[0034] The first sensor unit may comprise a plurality of light sensors (e.g. luxmeters) arranged to sense the exterior illumination, in particular illumination originating from the front and/or sides and/or back of the vehicle (and/or e.g. omnidirectionally).
[0035] For example, the illuminance from different incidence angles coming from light sources outside the vehicle may be measured by luxmeters located outside the vehicle.
[0036] The automated driving system may further comprise at least one camera sensor whose sensor data are used for controlling automated driving, wherein the camera sensor is additionally used as the first sensor unit. In other words, the sensors which are already provided by the automated driving system to carry out (control) the automated driving function may additionally be used to detect the exterior illumination.
[0037] The present disclosure further relates to a self-driving vehicle comprising an automated driving system according to any one of the preceding claims.
[0038] Finally, the present disclosure further relates to a method of cabin illumination in a self-driving vehicle having an automated driving system. The method controls a cabin illumination control unit to control the interior illumination being inside the passenger cabin. The method comprises the steps of:
- sensing exterior illumination being outside the vehicle and directed toward the window,
- determining at least one first parameter of the exterior illumination based on the sensor output, and
- controlling the cabin illumination control unit to generate an interior illumination pattern as a function of the at least one first parameter.
[0039] The method may comprise further method steps which correspond to the functions of the automated driving system, as described above.
[0040] It is intended that combinations of the above-described elements and those within the specification may be made, except where otherwise contradictory.
[0041] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure, as claimed.
[0042] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, and serve to explain the principles thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0043] Fig. 1 shows a block diagram of an automated driving system according to embodiments of the present disclosure;
[0044] Fig. 2 shows a schematic representation of a vehicle in a top view with a plurality of sensors, light sources and shading units according to embodiments of the present disclosure; and
[0045] Fig. 3 shows a schematic representation of an electronic control unit and its inputs according to embodiments of the present disclosure.
DESCRIPTION OF THE EMBODIMENTS
[0046] Reference will now be made in detail to exemplary embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
[0047] Fig. 1 shows a block diagram of an automated driving system 30 with an electronic control unit 1 according to embodiments of the present disclosure. The automated driving system 30 is part of a self-driving vehicle 10. The vehicle 10 has a vehicle cabin for transporting passengers (i.e. the occupants).
[0048] The electronic control unit 1 is connected to or comprises a data storage 2. Said data storage may be used e.g. to store map data.
[0049] The electronic control unit 1 is further connected to a geo localization unit 6 which provides geo localization data of the vehicle to the control unit 1.
[0050] The electronic control unit 1 may additionally carry out further functions in the vehicle 10. For example, the control device may also act as the general purpose ECU (electronic control unit) of the vehicle. In particular the electronic control unit 1 may control the automated driving function. The electronic control unit 1 may comprise an electronic circuit, a processor (shared, dedicated, or group), a combinational logic circuit, a memory that executes one or more software programs, and/or other suitable components that provide the described functionality.
[0051] The electronic control unit 1 is further connected to a (first) sensor unit 3, in particular comprising a plurality of light sensors, e.g. luxmeters and/or digital cameras. The sensor unit 3 is configured such that it can sense exterior illumination, i.e. the illumination (comprising light sources) around the vehicle 10. The sensors are desirably oriented to the exterior of the vehicle, in a plurality of directions and preferably in all directions.
[0052] The electronic control unit 1 may be further connected to a second sensor unit 5 (optional), in particular comprising at least one light sensor, e.g. a luxmeter and/or digital camera. The sensor unit 5 is configured such that it can sense interior illumination, i.e. the illumination inside the vehicle cabin.
[0053] The output of sensor unit 3 (and, if present, of the sensor unit 5), in particular a recorded stream or single samples recorded at a predetermined sampling frequency (e.g. every second), is transmitted to the electronic control unit 1. Desirably, the output is transmitted instantaneously, i.e. in real time or in quasi real time. Hence, the exterior (and optionally also the interior) illumination can also be determined by the electronic control unit in real time or in quasi real time.
[0054] The electronic control unit 1 is further connected to a cabin illumination control unit 4. The cabin illumination control unit 4 may be configured to control the interior illumination, i.e. the illumination inside the passenger cabin. This may be done by generating artificial light inside the cabin and/or shading light coming from outside. Based on the output of the sensor unit 3, the electronic control unit determines the at least one first parameter of the exterior illumination based on which the cabin illumination control unit 4 is controlled.
[0055] Fig. 2 shows a schematic representation of a vehicle 10 in a top view with a plurality of sensors 3a-f, light sources 7a-h and shading units 8a-e according to embodiments of the present disclosure. Also more or less sensors 3a-f, light sources 7a-h and shading units 8a-e may be used. The light sources 7a-h and shading units 8a-e may be comprised by the illumination control unit 4 (cf. fig. 1) or external elements controlled by the illumination control unit 4.
[0056] The sensors 3a-f may be situated at the front and back (3a and 3f) of the vehicle and/or on the corners of the cabin roof (3b-e). They are desirably oriented radially to the exterior of the vehicle, in order to sense light from any direction.
[0057] The light sources 7a-h are arranged inside the vehicle cabin, e.g. on the inner side of the cabin roof and/or in the vicinity of the cabin windows. Each light source desirably comprises a plurality of LEDs. The LEDs may be able to change their illuminance and/or the color of the emitted light. The light sources may additionally or alternatively comprise OLED surface(s) (that may be printed on the vehicle cabin's interior surfaces) or electroluminescent surfaces. Furthermore, it is desirable that the orientation of the light sources, or at least of their emitted light, is changeable and controllable by the control unit 1.
[0058] The shading units 8a-e are desirably integrated into the cabin windows. In other words, each cabin window may be provided with a shading unit in the form of a selectively controllable filter layer. The shading units may therefore be configured to selectively set the opacity degree of the windows or specific areas of the windows. The shading units may also act as color filters. The shading units may comprise or consist of electrochromic glass. For example, the shading units may comprise printed electrochromic films (e.g. formed on or in the vehicle window(s)) configured to control the opacity degree and/or the color filtered from specific areas of the surface they are printed on.
[0059] Accordingly, in order to control illuminance in various cabin locations for a vehicle equipped with windows, two light source categories need to be controlled: light coming from outside the vehicle and light coming from within the vehicle.
[0060] Illuminance from different incidence angles coming from light sources outside the vehicle can be obtained by the luxmeters 3a-f located outside the vehicle and/or by consulting a database of geotagged constant illuminance environments (e.g. shadows from tunnels, other elements). The impact of light sources from outside the vehicle on the luminance inside the cabin can be controlled by changing the opacity of the windows by means of the shading units 8a-e. In a possible embodiment, the opacity degree of a complete window may be adjustable. In a further possible embodiment, areas within the window may be controlled to have opacity degrees that differ from one another.
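A hypothetical sketch of such per-area shading is given below; the clear-window transmission factor and the per-zone cap are assumptions used only to illustrate how different opacity degrees per window area could be derived.

```python
def zone_opacities(exterior_lux_by_zone, max_transmitted_lux, clear_transmission=0.7):
    """Choose an opacity degree for each window zone so that the light
    transmitted through that zone stays below a given cap."""
    opacities = {}
    for zone, lux in exterior_lux_by_zone.items():
        transmitted_if_clear = lux * clear_transmission
        if transmitted_if_clear <= max_transmitted_lux:
            opacities[zone] = 0.0  # no shading needed for this zone
        else:
            opacities[zone] = 1.0 - max_transmitted_lux / transmitted_if_clear
    return opacities
```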
[0061] The control of light generated inside the vehicle can be done by controlling the orientation and intensity of the artificial light sources 7a-h.
[0062] Fig. 3 shows a schematic representation of an electronic control unit 1 and its inputs according to embodiments of the present disclosure.
[0063] The control unit 1 (i.e. the data processing system E/) is part of an automated driving system and is configured to limit the changes in illuminance in the field of view of occupants (i.e. riders) of a self-driving vehicle, while still allowing them to have some visibility to the vehicle exterior (this means that, e.g., the windows of the vehicle are not simply removed or blackened). The control unit 1 receives several inputs. A first input may be an input set by a vehicle occupant (i.e. a vehicle rider). This input may control the desired target illuminance and/or light color inside the vehicle cabin. In other words, it is a target parameter for the electronic control unit 1.
[0064] In particular, the control unit 1 may receive two different target values: A/ The target illumination for the vehicle interior set by the main rider (i.e. Lux_int_target), and B/ Local illumination targets that can be set by each rider located at one of the X locations available in the vehicle (i.e. Lux_int_target + α_locationX). Accordingly, the illumination may be set individually and hence differ within the vehicle cabin.
[0065] The control unit 1 further receives sensor data, in order to control the cabin illumination as close as possible to the set target parameter. In particular, the control unit 1 may receive an input of at least one of the following sources:
[0066] C/ Upcoming exterior illuminance related to different incidence angles outside the vehicle, originating from a database 2 in combination with global positioning data of a GPS sensor 6 and optionally time information: Lux_ext_store_Y. This may e.g. include illumination information of external artificial light sources, e.g. street lights or tunnel lighting. This input is in particular useful in areas which are registered as constant illuminance environments, e.g. a tunnel.
[0067] D/ Real time data regarding illuminance from different incidence angles outside the vehicle: Lux_ext_sense_Y. This input may originate from a sensor unit 3, in particular from a plurality of light sensors.
[0068] Said input, in particular the real time data D/ of the sensor unit 3 may be raw data. The control unit therefore desirably determines at least one (measurement) parameter based on the received input. For example, the parameter may represent at least one of: the illuminance of the sensed exterior illumination, the colour of the sensed exterior illumination, and the spatial and/or temporal distribution of the sensed exterior illumination.
[0069] In order to keep interior illuminance close to target, the control unit 1 is configured to control the cabin illumination control unit (not shown in fig. 3) as a function of the determined parameter. In this regard the control unit 1 may determine at least one of the following control parameters which are provided to the cabin illumination control unit: F/ The opacity of each of the Z windows: Opa_win_Z, G/ The intensity of each individual lighting system (total: N) inside the vehicle cabin: Intens_light_N, and H/ The orientation of each individual lighting system (total: N) inside the vehicle cabin: Orient_light_N.
[0070] In order to maintain a constant interior illuminance at every location the control unit 1 desirably is configured to keep the equilibrium point of the following equation system:
(1) Lux_int_target + α_location1 = f(Lux_ext_Y, Intens_light_N, Orient_light_N, Opa_win_Z)
...
(N) Lux_int_target + α_locationN = f(Lux_ext_Y, Intens_light_N, Orient_light_N, Opa_win_Z)
[0071] The function f may be specific to a vehicle configuration (e.g. according to its interior layout, window size, sensor location, etc.).
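As an illustrative sketch only, the vehicle-specific function f can be approximated as a linear combination of window and light contributions, and the control parameters obtained by least squares; the coupling-matrix model and the clipping of the solution are assumptions made for this example, not the application's own formulation.

```python
import numpy as np

def solve_equilibrium(targets, exterior_lux, window_coupling, light_coupling):
    """Least-squares sketch of the equilibrium system: for each cabin location X,
    the modelled interior illuminance
        window_coupling @ (transmission * exterior_lux) + light_coupling @ intensities
    should match Lux_int_target + alpha_locationX.

    targets         : (X,)   target illuminance per cabin location
    exterior_lux    : (Z,)   exterior illuminance reaching each of the Z windows
    window_coupling : (X, Z) contribution of each window to each location
    light_coupling  : (X, N) contribution of each interior light to each location
    Returns window transmissions (opacity = 1 - transmission) and light intensities.
    """
    targets = np.asarray(targets, dtype=float)
    exterior_lux = np.asarray(exterior_lux, dtype=float)
    window_coupling = np.asarray(window_coupling, dtype=float)
    light_coupling = np.asarray(light_coupling, dtype=float)

    # Unknown vector: Z window transmissions followed by N light intensities.
    A = np.hstack([window_coupling * exterior_lux, light_coupling])
    solution, *_ = np.linalg.lstsq(A, targets, rcond=None)

    n_windows = window_coupling.shape[1]
    transmissions = np.clip(solution[:n_windows], 0.0, 1.0)  # physical bounds
    intensities = np.clip(solution[n_windows:], 0.0, None)   # no negative light
    return transmissions, intensities
```

Because the solution is clipped to physical bounds afterwards, this is only an approximation; a constrained solver would be the natural refinement, but the structure of the equation system stays the same.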
[0072] Throughout the description, including the claims, the term "comprising a" should be understood as being synonymous with "comprising at least one" unless otherwise stated. In addition, any range set forth in the description, including the claims should be understood as including its end value(s) unless otherwise stated. Specific values for described elements should be understood to be within accepted manufacturing or industry tolerances known to one of skill in the art, and any use of the terms "substantially" and/or "approximately" and/or "generally" should be understood to mean falling within such accepted tolerances.
[0073] Although the present disclosure herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present disclosure.
[0074] It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims.

Claims

1. An automated driving system (30) for a self-driving vehicle comprising a passenger cabin with at least one window, the automated driving system comprising:
a cabin illumination control unit (4) configured to control the interior illumination being inside the passenger cabin,
a first sensor unit (3) configured to sense exterior illumination being outside the vehicle and directed toward the window, and
an electronic control unit (1) configured to:
receive a sensor output of the first sensor unit (3),
determine at least one first parameter of the exterior illumination based on the sensor output, and
control the cabin illumination control unit (4) to generate an interior illumination pattern as a function of the at least one first parameter.
2. The automated driving system (30) according to claim 1, wherein sensing the exterior illumination comprises sensing the illuminance of at least one exterior light source being outside the vehicle and/or the relative position of the exterior light source with regard to the vehicle.
3. The automated driving system (30) according to claim 1 or 2, wherein
the at least one first parameter represents at least one of: the illuminance of the sensed exterior illumination, the colour of the sensed exterior illumination, and the spatial and/or temporal distribution of the sensed exterior illumination.
4. The automated driving system (30) according to any one of the preceding claims, wherein
the interior illumination pattern is generated such that the interior illumination corresponds to predetermined target illumination.
5. The automated driving system (30) according to any one of the preceding claims, wherein the interior illumination pattern is generated such that temporal and/or spatial changes of the interior illumination are minimized.
6. The automated driving system (30) according to any one of the preceding claims, wherein
the at least one first parameter represents an exterior illumination pattern determined based on the sensor output, the exterior illumination pattern comprising temporal and spatial characteristics of the exterior illumination, and the interior illumination pattern comprises temporal and spatial characteristics which are selected as a function of the exterior illumination pattern.
7. The automated driving system (30) according to any one of the preceding claims, wherein
the cabin illumination control unit (4) comprises:
at least one dimmable light source configured to illuminate the cabin interior with a variable illuminance and/or colour and/or spatial distribution, and/or at least one shading unit configured to variably shade the at least one window including a variable opacity degree and/or colour filter and/or spatial distribution across the window.
8. The automated driving system (30) according to any one of the preceding claims, wherein
the shading unit comprises at least one controllable filter arranged on or inside the window.
9. The automated driving system (30) according to any one of the preceding claims, further comprising:
a geo localization unit, and
a database configured to store map data, wherein
the electronic control unit (1) is configured to:
determine the at least one first parameter of exterior illumination further based on the geo localization of the vehicle and the map data.
10. The automated driving system (30) according to any one of the preceding claims, further comprising: a second sensor unit (5) configured to sense the interior illumination, wherein the electronic control unit (1) is configured to:
receive a sensor output of the second sensor unit (5),
determine at least one second parameter of the interior illumination based on the second sensor output, and
control the cabin illumination control unit (4) to generate the interior illumination pattern as a function of the at least one second parameter.
11. The automated driving system (30) according to any one of the preceding claims, wherein
the first sensor unit comprises a plurality of light sensors arranged to sense the exterior illumination, in particular illumination originating from the front and/or sides and/or back of the vehicle.
12. The automated driving system (30) according to any one of the preceding claims, further comprising:
at least one camera sensor whose sensor data are used for controlling automated driving, wherein the camera sensor is additionally used as the first sensor unit (3).
13. A self-driving vehicle comprising an automated driving system according to any one of the preceding claims.
14. A method of cabin illumination in a self-driving vehicle having an automated driving system, the method controlling a cabin illumination control unit (4) which controls the interior illumination being inside the passenger cabin, comprising the steps of:
sensing exterior illumination being outside the vehicle and directed toward the window,
determining at least one first parameter of the exterior illumination based on the sensor output, and
controlling the cabin illumination control unit (4) to generate an interior illumination pattern as a function of the at least one first parameter.
PCT/EP2018/085435 2018-12-18 2018-12-18 Automated driving system and method for controlling a cabin illumination WO2020125943A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2018/085435 WO2020125943A1 (en) 2018-12-18 2018-12-18 Automated driving system and method for controlling a cabin illumination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2018/085435 WO2020125943A1 (en) 2018-12-18 2018-12-18 Automated driving system and method for controlling a cabin illumination

Publications (1)

Publication Number Publication Date
WO2020125943A1 true WO2020125943A1 (en) 2020-06-25

Family

ID=65003357

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/085435 WO2020125943A1 (en) 2018-12-18 2018-12-18 Automated driving system and method for controlling a cabin illumination

Country Status (1)

Country Link
WO (1) WO2020125943A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014009264A1 (en) * 2012-07-11 2014-01-16 Trw Automotive Electronics & Components Gmbh Method for controlling an interior lighting system in a vehicle and interior lighting system
DE102016203164A1 (en) * 2015-05-26 2016-12-01 Ford Global Technologies, Llc Light therapy lighting system for a vehicle interior
DE102015115578A1 (en) * 2015-09-16 2017-03-16 Hella Kgaa Hueck & Co. Method for operating at least one lighting unit for the interior of a vehicle
US20170253254A1 (en) 2016-03-03 2017-09-07 Uber Technologies, Inc. Sensory stimulation system for an autonomous vehicle
EP3231667A1 (en) * 2016-04-11 2017-10-18 Philips Lighting Holding B.V. Vehicle interior lighting control module and method of controlling vehicle interior lighting
US20180043825A1 (en) * 2016-08-10 2018-02-15 Toyota Jidosha Kabushiki Kaisha Automatic driving system
DE102016217595A1 (en) * 2016-09-15 2018-03-15 Bayerische Motoren Werke Aktiengesellschaft Driver assistance system for driver-independent operation of a motor vehicle
EP3300953A1 (en) * 2016-09-28 2018-04-04 Valeo Vision Interior lighting system for an autonomous motor vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BOYCE, P.R: "Human Factors in Lighting", 2014, CRC PRESS

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022223309A1 (en) * 2021-04-20 2022-10-27 Bayerische Motoren Werke Aktiengesellschaft Motor vehicle comprising a plurality of interior light modules

Similar Documents

Publication Publication Date Title
KR101859047B1 (en) Headlamp, Adaptive Driver Assistance System and Vehicle
KR102511055B1 (en) Redundancy Hardware System for Autonomous Vehicles
EP3093192B1 (en) Rear combination lamp for vehicle comprising a display
US10377212B2 (en) Dynamic anti-glare system for a windshield of a vehicle
US7289085B2 (en) Combined instrument and structure provided with a combined instrument
KR102129478B1 (en) How to improve light conditions from the driver's perspective
US8009977B2 (en) On-vehicle lighting apparatus
US9566946B2 (en) Systems, methods, and computer readable media for protecting an operator against glare
US20180096668A1 (en) Hue adjustment of a vehicle display based on ambient light
KR20160133223A (en) Lamp for vehicle and Vehicle including the same
CN107719082B (en) Window system for a vehicle passenger compartment
JP6489084B2 (en) Automated driving system
CN106575036A (en) Method for reducing reflection when operating a head-up display of a motor vehicle
JP2005508785A (en) Method for controlling an air conditioner for a vehicle
CN109302568A (en) The indirect image system of vehicle
US11295704B2 (en) Display control device, display control method, and storage medium capable of performing appropriate luminance adjustment in case where abnormality of illuminance sensor is detected
US20230158942A1 (en) Headlamp encapsulated with camera and artificial intelligence processor to adjust illumination
WO2020125943A1 (en) Automated driving system and method for controlling a cabin illumination
US11541825B2 (en) System for providing color balance in automotive display
JP2010012995A (en) Lighting system
CN207328275U (en) Drive assist system
CN110203413A (en) The dynamic control method of runway illumination
JPH07323899A (en) Device for replacing artificial image indicated to aircraft pilot with corresponding actual image
CN111448095A (en) Method for visualizing sensor data and/or measurement data
KR101827707B1 (en) Apparatus of camera for vehicle, vehicle having the same and controlling method of camera for vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18830448

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18830448

Country of ref document: EP

Kind code of ref document: A1