WO2024086260A1 - Ambient light detection using machine learning - Google Patents

Ambient light detection using machine learning

Info

Publication number
WO2024086260A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
sensor
signals
ambient light
optical sensor
Prior art date
Application number
PCT/US2023/035476
Other languages
French (fr)
Inventor
Sue Hui
Jian Liu
Doug NELSON
Matthew SAMPSELL
Original Assignee
ams Sensors USA Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ams Sensors USA Inc.
Publication of WO2024086260A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 - Photometry, e.g. photographic exposure meter
    • G01J1/42 - Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J1/4204 - Photometry, e.g. photographic exposure meter using electric radiation detectors with determination of ambient light
    • G01J1/02 - Details
    • G01J1/04 - Optical or mechanical part supplementary adjustable parts
    • G01J1/0488 - Optical or mechanical part supplementary adjustable parts with spectral filtering
    • G01J1/0492 - Optical or mechanical part supplementary adjustable parts with spectral filtering using at least two different filters
    • G01J1/4228 - Photometry, e.g. photographic exposure meter using electric radiation detectors: arrangements with two or more detectors, e.g. for sensitivity compensation

Definitions

  • a method for evaluating optical sensor signals, an optical sensor device and a computer program product are provided.
  • a measurement of ambient light is useful to control a brightness of a display.
  • Ambient light can be detected e.g. by a multispectral sensor. Signals provided by the multispectral sensor have to be evaluated to provide an ambient light signal.
  • the multispectral sensor is placed behind a glass or behind a display. Thus, an accurate ambient light signal shall be provided even when a glass or a display is located in front of the multispectral sensor.
  • a method for evaluating optical sensor signals comprises: generating at least three optical sensor signals by a multispectral sensor, digitalizing the at least three optical sensor signals into at least three digital sensor signals, and providing a first ambient light signal by a trained model as a function of the at least three digital sensor signals.
  • the trained model generates the first ambient light signal as a function of at least three optical sensor signals.
  • the trained model is e.g. configured to minimize an influence of a display attached to the multispectral sensor.
  • the optical sensor device can be controlled using the first ambient light signal.
  • the trained model is realized as trained software, trained code or trained instructions.
  • the first ambient light signal is one of a group consisting of an illuminance value, a correlated color temperature value, chromaticity values, tristimulus values and a flicker value.
  • the first ambient light signal is designed to adjust a display, e.g. a brightness of a display.
  • a display can also be named screen.
  • the illuminance value is the luminous flux value measured in LUX.
  • a second ambient light signal is provided by the trained model as a function of the at least three digital sensor signals.
  • the second ambient light signal is another one of the group described above.
  • the first ambient light signal is the illuminance and the second ambient light signal is the correlated color temperature value.
  • the first and the second ambient light signal are designed to adjust the display, e.g. the brightness of the display.
  • the trained model is trained by a machine learning algorithm.
  • the machine learning algorithm is one of a group consisting of linear regression, artificial neural network, multilayer perceptron, decision trees, gradient-boosted decision trees, random forest, K nearest neighbors, clustering, K-means clustering and principal components analysis.
  • Gradient-boosted decision trees can also be named gradient-boosting decision trees.
  • the machine learning algorithm is an artificial neural network comprising: an input layer receiving on an input side the at least three digital sensor signals, a hidden layer receiving on an input side signals of the input layer, an output layer receiving on an input side directly or indirectly signals of the hidden layer, and at least one summation function receiving on an input side directly or indirectly signals of the output layer and providing the first ambient light signal.
  • the artificial neural network comprises an activation function receiving on an input side signals of the output layer, and providing signals to at least one summation function.
  • the activation function is one of rectified linear unit activation function, logistic activation function and hyperbolic tangent activation function.
  • the rectified linear unit activation function can be named rectifier activation function.
  • the output layer provides more than one signal.
  • each of the signals provided by the output layer is processed by the same type of activation function.
  • the parameters of the activation function are different for the different signals provided by the output layer.
  • the artificial neural network further comprises at least a further hidden layer receiving on an input side signals of the hidden layer and providing signals to the output layer.
  • the machine learning algorithm is linear regression.
  • the first ambient light signal AL1 is calculated according to the equation:
  • AL1 = a0 + a1·x1 + a2·x2 + ... + aN+M·xN+M, wherein a0, a1, a2, ... aN+M are coefficients determined by linear regression and x1, x2 ... xN+M are signals.
  • the signals comprise the at least three digital sensor signals.
  • the second ambient light signal is provided by the trained model as a function of the at least three digital sensor signals.
  • the machine learning algorithm is linear regression.
  • the second ambient light signal AL2 is calculated according to the equation:
  • AL2 = a20 + a21·x1 + a22·x2 + ... + a2N+M·xN+M, wherein a20, a21, a22, ... a2N+M are coefficients determined by linear regression and x1, x2 ... xN+M are the signals.
  • the signals comprise the at least three digital sensor signals.
  • providing the first ambient light signal is performed by the trained model as a function of the at least three digital sensor signals and at least one additional signal.
  • the at least one additional signal is a signal of a group consisting of display buffer data red averaged information, display buffer data green averaged information, display buffer data blue averaged information, display frame buffer information, display brightness information, display buffer information, display refresh rate information, display frame synchronization information, and display pulse-width-modulation information.
  • the signals x1, x2 ... xN+M comprise the at least three digital sensor signals and the at least one additional signal.
  • the optical sensor signals and, thus, also the digital sensor signals are only determined in the visible range.
  • an optical sensor device comprises the multispectral sensor and a processor.
  • the processor is configured to execute the method for evaluating an optical sensor signal, e.g. as described above.
  • the optical sensor device comprises a smart sensor.
  • the smart sensor comprises the multispectral sensor and the processor.
  • the processor is configured to provide the first ambient light signal.
  • the multispectral sensor and the processor are integrated together on one semiconductor substrate.
  • the processor is realized as a host processor.
  • the processor is configured to generate the first ambient light signal.
  • the multispectral sensor and the host processor are realized on two separate semiconductor bodies.
  • the optical sensor device further comprises a display.
  • the multispectral sensor is located such that the display is in an optical path between an ambient and the multispectral sensor.
  • the display can be named screen.
  • the display is realized as an organic light-emitting diode display, abbreviated OLED display.
  • an optical path between an ambient and the multispectral sensor is free of a display.
  • the optical sensor device further comprises a glass or a polymer sheet which is colorless or colored.
  • a colored glass can be named tinted glass.
  • a colorless glass can be named clear glass.
  • the multispectral sensor is located such that the glass or the polymer sheet is in the optical path between the ambient and the multispectral sensor.
  • the glass or the polymer sheet and the display are in the optical path between the ambient and the multispectral sensor.
  • the optical sensor device is realized as one of a group consisting of a cell phone, tablet, television set, laptop and computer with computer display.
  • a computer program product comprises instructions to cause the optical sensor device to execute the method for evaluating an optical sensor signal.
  • the method for evaluating an optical sensor signal is performed e.g. in real-time.
  • the method for evaluating an optical sensor signal is performed e.g. on-line.
  • the optical sensor device and the computer program product described above are particularly suitable for the method for evaluating optical sensor signals.
  • Features described in connection with the optical sensor device and the computer program product can therefore be used for the method and vice versa.
  • the method for evaluating optical sensor signals provides an ambient light detection using machine learning.
  • the method can be implemented in any product that uses an organic-light-emitting-diode display (abbreviated OLED display) or another type of display, such as a cell phone, tablet, television set etc.
  • OLED display: organic-light-emitting-diode display
  • the ambient light detection is configured for a configuration of the multispectral sensor behind an OLED or another display.
  • the ambient light detection is configured for a configuration of the multispectral sensor without display, e.g. the multispectral sensor being located in open air.
  • an accuracy of ambient light detection is improved.
  • the method detects ambient light intensity in terms of an illuminance, measured as luminous flux per unit area, unit LUX, and/or spectral content of the ambient light in terms of correlated color temperature, abbreviated CCT.
  • the optical sensor device uses machine learning algorithms such as an artificial neural network combined with a multispectral sensor for detecting ambient light behind OLED (abbreviated BOLED) or behind other screens or in open air environments.
  • Figures 1A to 1C show exemplary embodiments of a multispectral sensor and its characteristic;
  • Figures 2A to 2C show exemplary embodiments of an optical sensor device with a multispectral sensor;
  • Figures 3A to 3E show exemplary embodiments of a method for evaluating optical sensor signals;
  • Figures 4A to 4D show exemplary embodiments of a method for evaluating optical sensor signals;
  • Figures 5A to 5C show exemplary embodiments of test results of a method for evaluating optical sensor signals.
  • Figure 1A shows an exemplary embodiment of a multispectral sensor 20.
  • the multispectral sensor 20 comprises more than one light sensor 21.
  • the more than one light sensor 21 is realized e.g. as a photodiode.
  • the multispectral sensor 20 comprises a semiconductor substrate 22.
  • the more than one light sensor 21 is integrated on a first main surface of the semiconductor substrate 22.
  • the multispectral sensor 20 has a sensing window 23.
  • the sensing window 23 is located at the more than one light sensor 21.
  • the sensing window 23 is configured to allow light to reach the more than one light sensor 21.
  • An ambient light IL is applied to the multispectral sensor 20.
  • the ambient light IL can be named illumination.
  • the ambient light IL is applied to the more than one light sensor 21.
  • the multispectral sensor 20 generates more than one optical sensor signal SO1 to SO5 as a function of the ambient light IL.
  • the more than one optical sensor signals SO1 to SO5 can be named channel data outputs.
  • the multispectral sensor 20 can be named multispectral light sensor or multispectral ambient light sensor.
  • Figure 1B shows an exemplary embodiment of a multispectral sensor 20 which is a further development of the embodiment shown in Figure 1A.
  • the multispectral sensor 20 comprises a digital core 27.
  • the multispectral sensor 20 comprises seven light sensors 21.
  • the more than one light sensors 21 are coupled via a multiplexer 24, an analog circuit 25 and one or more than one analog-digital converters 26 to the digital core 27.
  • the digital core 27 comprises a register 28.
  • the multispectral sensor 20 comprises an interface 29 coupled to the register 28.
  • the optical sensor signals SO1 to SO7 are digitized into digital sensor signals SD1 to SD7 by the multispectral sensor 20 (more exactly by the one or more than one analog-digital converters 26).
  • the digital sensor signals SD1 to SD7 are stored in the register 28 and provided via the interface 29 to another circuit not shown in Figure 1B.
  • the interface 29 is a bus interface, e.g. an inter-integrated circuit interface, abbreviated I2C interface.
  • the interface 29 receives and provides a data signal SDA and receives a clock signal SCL.
  • the multispectral sensor 20 provides the digital sensor signals SD1 to SD7 via the data signal SDA.
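
The following Python sketch illustrates how a host could read such digitized channel data over the I2C bus. It is an illustration only: it assumes the smbus2 library, and the device address and register layout (I2C_ADDR, REG_CH_BASE) are hypothetical placeholders, not values from this publication or any datasheet.

    # Sketch: reading digital sensor signals SD1..SD7 over I2C.
    # The address and register values are hypothetical placeholders.
    from smbus2 import SMBus

    I2C_ADDR = 0x39      # hypothetical device address on the bus
    REG_CH_BASE = 0x10   # hypothetical base register of the channel data

    def read_channels(bus, n_channels=7):
        # Each channel is assumed to be a 16-bit little-endian register pair.
        raw = bus.read_i2c_block_data(I2C_ADDR, REG_CH_BASE, 2 * n_channels)
        return [raw[2 * i] | (raw[2 * i + 1] << 8) for i in range(n_channels)]

    with SMBus(1) as bus:        # bus number 1 is typical on a Raspberry Pi
        sd = read_channels(bus)  # digital sensor signals SD1 to SD7
        print(sd)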
  • Figure 1C shows an exemplary embodiment of a characteristic of a multispectral sensor 20 which is a further development of the embodiments shown in Figures 1A and 1B.
  • responsivities R1 to R5 are shown as a function of a wavelength λ (measured in nm) for different filters.
  • the multispectral sensor 20 comprises e.g. a blue filter, a green filter, a red filter, a clear filter and a near infrared filter (abbreviated NIR filter).
  • the responsivity R1 (given in arbitrary units) is obtained for the blue filter, R2 is obtained for the green filter, R3 is obtained for the red filter, R4 is obtained for the clear filter and R5 is obtained for the NIR filter.
  • the multispectral sensor 20 comprises several optical filters.
  • each filter has a center wavelength and a relatively narrow bandwidth.
  • an example of these filter spectra is illustrated in Figure 1C, where the red filter has a center wavelength of 650 nm, the green filter 550 nm, the blue filter 450 nm, and the NIR filter 750 nm.
  • the outputs of each light sensor 21 covered with a filter are referred to as optical sensor signals SO1 to SO5 and, after digitalization, as digital sensor signals SD1 to SD5 of the multispectral sensor 20.
  • the digital sensor signals SD1 to SD5 can be named channel output data.
  • the intensity of the ambient light IL will affect the absolute values of the channel data, and the color temperature of the ambient light IL will affect the relative values of the channel data.
  • if the ambient light IL is a fluorescent light, the green and blue channel data will have higher values than the red channel data.
  • if the ambient light IL is an incandescent light, the red and near IR channel will have higher values than the green and blue channel.
  • the multispectral sensor 20 allows a machine learning algorithm to computationally derive an illuminance value and a correlated color temperature value, abbreviated CCT value, of the ambient light IL, as described below.
  • the illuminance value is the luminous flux value measured in LUX.
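
As an illustration of this point (not taken from the publication): normalizing the channel data removes the overall intensity, so the remaining channel ratios carry the spectral information that a model can map to a color temperature estimate. The channel values below are invented for the example.

    # Illustration: absolute channel values encode intensity, while
    # normalized (relative) channel values encode spectral content.
    def normalize(channels):
        total = sum(channels)
        return [round(c / total, 3) for c in channels]

    # Hypothetical channel data in the order [blue, green, red, clear, NIR]:
    fluorescent = [900, 1100, 600, 2400, 150]    # blue/green dominate
    incandescent = [300, 600, 1200, 2300, 900]   # red/NIR dominate

    print(normalize(fluorescent))
    # Doubling the intensity changes the absolute values only;
    # the relative values (and thus the inferred CCT) stay the same:
    print(normalize([2 * c for c in fluorescent]))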
  • Figure 2A shows an exemplary embodiment of an optical sensor device 10 with a multispectral sensor 20 which is a further development of the embodiments shown in Figures 1A to 1C.
  • the optical sensor device 10 is realized as a cell phone, mobile phone or smartphone.
  • the optical sensor device 10 comprises the multispectral sensor 20.
  • the optical sensor device 10 comprises a display 12.
  • the display 12 can also be named screen.
  • the display 12 is located between the multispectral sensor 20 and an ambient.
  • the ambient light IL is provided to the sensing window 23 via the display 12.
  • the multispectral sensor 20 generates (e.g. internally) at least three optical sensor signals SO1 to SO8 and provides at least three digital sensor signals SD1 to SD8 by digitalizing the at least three optical sensor signals SO1 to SO8.
  • the at least three digital sensor signals SD1 to SD8 are applied to a computer program product 40.
  • the computer program product 40 comprises a trained model 41.
  • the trained model 41 provides a first ambient light signal AL1 as a function of the at least three digital sensor signals SD1 to SD8.
  • the first ambient light signal AL1 is at least one of an illuminance value, a correlated color temperature value, chromaticity values, tristimulus values and a flicker value.
  • the trained model 41 is trained by a machine learning algorithm comprised by the computer program product 40, as described below. As shown in Figure 2A, at least one additional signal SG1 to SG9 is provided to the computer program product 40 and thus to the trained model 41.
  • the at least one additional signal SG1 to SG9 is a signal of a group consisting of display buffer data R averaged information, display buffer data G averaged information, display buffer data B averaged information, display frame buffer information, display brightness information, display buffer information, display refresh rate information, display frame synchronization information, and display pulse-width-modulation information (which can be named display pulse-width-modulation duty cycle information).
  • the at least one additional signal SG1 to SG9 is e.g. not generated by any of the light sensors 21.
  • the trained model 41 of the computer program product 40 comprises instructions or software code to cause the optical sensor device 10 to execute the method for evaluating the optical sensor signals SO1 to SO8 (or in other words the digital sensor signals SD1 to SD8) and the at least one additional signal SG1 to SG9.
  • the first ambient light signal AL1 is provided by the trained model 41 as a function of the at least three digital sensor signals SD1 to SD8 and optionally as a function of the at least one additional signal SG1 to SG9.
  • the first ambient light signal AL1 is applied to a control input 14 of the optical sensor device 10.
  • the trained model 41 provides a second ambient light signal AL2 as a function of the at least three digital sensor signals SD1 to SD8.
  • the second ambient light signal AL2 is generated similarly to the first ambient light signal AL1.
  • the optical sensor device 10 comprises the display 12.
  • the multispectral sensor 20 is located such that the display 12 is in an optical path between an ambient and the multispectral sensor 20.
  • the display 12 is realized as an OLED display.
  • the optical sensor device 10 further comprises a glass 13 which is a colorless or a colored glass.
  • the glass 13 covers the display 12.
  • the glass 13 is located between the ambient and the display 12.
  • the multispectral sensor 20 is located such that the glass 13 and the display 12 are in an optical path between an ambient and the multispectral sensor 20.
  • a possible function of the glass 13 is to protect the display 12.
  • the optical sensor device 10 comprises a polymer sheet in place of the glass 13.
  • BOLED screens: behind-OLED screens
  • a higher ambient light detection accuracy can be achieved.
  • the multispectral sensor 20 provides raw channel data as inputs to the machine learning algorithm that runs on a cell phone, a laptop personal computer or any other target platform that contains one or more processors which execute a pre-trained model to predict the first ambient light signal AL1, which comprises e.g. an intensity as well as a spectrum of the ambient light IL.
  • the multispectral sensor 20 is placed behind an OLED screen or any other screen material that allows transmission of ambient light IL to reach the multispectral sensor 20, or in open air.
  • in Figure 2A, an overview of a system and a hardware setup is shown.
  • the multispectral sensor 20 is arranged behind an OLED display screen.
  • the OLED display screen is on a cell phone.
  • the multispectral sensor 20 contains multiple optical filters to filter out many narrow-band light waves such as e.g. red, green, blue, yellow, magenta, etc. from the ambient light IL.
  • the filters shown in Figure 2A are for illustrative purposes. The actual number of filters and the exact wavelengths could vary depending on specific applications.
  • the outputs of the multispectral sensor 20 are used as feature inputs to the machine learning algorithm.
  • the outputs comprise e.g. a red, green, blue, yellow, magenta, cyan, near IR and clear filter output.
  • various display data SG1 to SG9 are used as additional feature inputs to the machine learning algorithms, for example: display buffer data, display brightness data, display refresh rate data, display frame sync data, and display pulse-width-modulation duty-cycle data.
  • one or more machine learning algorithms as described below are appropriate for finding the illuminance and CCT values of the ambient light IL.
  • the outputs are illuminance and CCT values of the ambient light IL.
  • the illuminance and CCT values are used to control e.g. the brightness and/or color temperature of the display 12.
  • the trained model 41 is shown outside of the optical sensor device 10 for illustration purposes.
  • the trained model 41 is realized inside the optical sensor device 10, as explained e.g. in Figures 2B and 2C.
  • the control input 14 is internal to the optical sensor device 10.
  • an optical path between an ambient and the multispectral sensor 20 is free of a display.
  • the optical sensor device 10 comprises a tablet, television set, laptop or computer with computer display.
  • Figure 2B shows an exemplary embodiment of an optical sensor device 10 with a multispectral sensor 20 which is a further development of the embodiments shown in Figures 1A to 1C and 2A.
  • the optical sensor device 10 additionally comprises a processor 44 which is configured to execute the method for evaluating the optical sensor signals SO1 to SO8 or the digital sensor signals SD1 to SD8.
  • the optical sensor device 10 comprises a smart sensor 45.
  • the smart sensor 45 can be named intelligent sensor.
  • the smart sensor 45 comprises the multispectral sensor 20 and the processor 44.
  • the multispectral sensor 20 and the processor 44 are combined in the smart sensor 45.
  • the semiconductor substrate 22 comprises the multispectral sensor 20 and the processor 44.
  • the multispectral sensor 20 and the processor 44 are integrated on the first main surface of the semiconductor substrate 22.
  • the processor 44 is an embedded processor.
  • the processor 44 provides the first ambient light signal AL1.
  • the processor 44 additionally provides the second ambient light signal AL2.
  • the first and the second ambient light signal AL1, AL2 are the illuminance, measured as luminous flux per unit area, unit LUX, and the correlated color temperature, abbreviated CCT.
  • the processor 44 is configured for running the machine learning model. Once a machine learning model has been successfully trained to reach a certain performance level, the trained model 41 is deployed and run in the optical sensor device 10. If a smart sensor 45 incorporating the multispectral sensor 20 also contains the processor 44, the trained model 41 can be run on the processor 44.
  • in Figure 2B, a hardware configuration is shown for running the model on the embedded processor 44 inside of the smart sensor 45.
  • the smart sensor 45 combines the multispectral sensor 20 and the embedded processor 44.
  • the multispectral sensor 20 generates the sensor data.
  • the sensor data and other data as explained above are applied to the embedded processor 44 which runs the machine learning model in real time and produces the first and the second ambient light signal AL1, AL2.
  • the optical sensor device additionally comprises a host processor 46.
  • the host processor 46 is e.g. one of the processors of the optical sensor device 10 which is e.g. a cell phone.
  • the host processor 46 takes the first and the second ambient light signal AL1, AL2 as inputs to adjust a brightness of the display 12 and/or a correlated color temperature of the display 12.
  • the host processor 46 provides the at least one additional signal SG1 to SG9 to the processor 44.
  • Figure 2C shows an alternative exemplary embodiment of an optical sensor device 10 with a multispectral sensor 20 which is a further development of the embodiments shown in Figures 1A to 1C, 2A and 2B.
  • the multispectral sensor 20 does not contain an embedded processor 44.
  • the trained model 41 is run on the host processor 46.
  • the host processor 46 is one of the processors of the optical sensor device 10 (that is, on a cell phone).
  • the model runs on the host processor 46.
  • the multispectral sensor 20 is e.g. free from a processor.
  • the digital sensor signals SD1 to SD8 are sent to the host processor 46.
  • the at least one additional signal SG1 to SG9 (not including the reference LUX and CCT data) is sent to the host processor 46 or is present in the host processor 46.
  • the host processor 46 takes a data vector DV' as an input, runs the machine learning model in real time and produces the first and the second ambient light signal AL1, AL2.
  • the illuminance value and the CCT value are designed for adjusting the brightness of the display 12 and/or the correlated color temperature of the display 12.
  • the host processor 46 is realized on a further semiconductor substrate 47 which is separate from the semiconductor substrate 22 of the multispectral sensor 20.
  • Figure 3A shows an exemplary embodiment of a test setup 50 for realizing a method for evaluating optical sensor signals which is a further development of the embodiments shown in Figures 1A to 1C and 2A to 2C.
  • the test setup 50 comprises the optical sensor device 10 and a test multispectral sensor 51.
  • the optical sensor device 10 comprises the multispectral sensor 20 and the display 12.
  • the multispectral sensor 20 is behind the display 12.
  • the multispectral sensor 20 is implemented as an ambient light sensor behind an OLED screen 12 on a cell phone.
  • the test multispectral sensor 51 is separate from the optical sensor device 10.
  • an optical path from the ambient to the test multispectral sensor 51 is free from a display 12 (such as e.g. an OLED screen).
  • the test multispectral sensor 51 is an ambient light sensor in open air.
  • the test setup 50 optionally comprises a glass or polymer cover that is in the optical path from the ambient to the test multispectral sensor 51.
  • the test multispectral sensor 51 and the multispectral sensor 20 are oriented in the same direction with respect to the ambient. Thus, ambient light IL that falls on the test multispectral sensor 51 is identical or nearly identical to ambient light IL that falls on the display 12 above the multispectral sensor 20.
  • the ambient light IL that falls on the test multispectral sensor 51 is different from light that falls on the multispectral sensor 20.
  • the display 12 can influence the light falling on the multispectral sensor 20, e.g. by providing a shield, mask or shadow structure and/or by emitting light not only in the direction of a user but also in the direction towards the multispectral sensor 20.
  • the shield, mask or shadow structure reduces the light falling on the multispectral sensor 20.
  • the light emission of the display 12 increases the light falling on the multispectral sensor 20.
  • the multispectral sensor 20 and the test multispectral sensor 51 are realized identically.
  • the test multispectral sensor 51 has a sensing window 53 that is identical e.g. to the sensing window 23 of the multispectral sensor 20.
  • the light sensors 21 and the filters of the multispectral sensor 20 and the test multispectral sensor 51 are e.g. identical.
  • the multispectral sensor 20 generates digital sensor signals SD1 to SD8.
  • the test multispectral sensor 51 generates further digital sensor signals SD1' to SD8'.
  • the test setup 50 comprises a computer 54 that comprises a software 55 for collecting the digital sensor signals SD1 to SD8 and the further digital sensor signals SD1' to SD8' and for constructing a data vector DV (shown in Figure 3B) for training.
  • the software 55 is configured to implement one or more than one machine learning algorithm and to train machine learning models.
  • in Figure 3A, the method and device assembly for collecting training data is illustrated.
  • the data collection assembly comprises the multispectral sensor 20 placed behind the OLED screen 12.
  • the multispectral sensor 20 and its physical placement are the same as for its intended use in a consumer product.
  • the test multispectral sensor 51 is another multispectral sensor placed in close proximity to the multispectral sensor 20.
  • the test multispectral sensor 51 is used to measure the illuminance and correlated color temperature of the ambient light IL accurately and to serve as a reference for the purpose of collecting the training data.
  • a first reference signal SREF1 is a reference value for the illuminance.
  • a second reference signal SREF2 is a reference value for the CCT.
  • the first and the second reference signal SREF1, SREF2 are obtained from the further digital sensor signals SD1' to SD8' and are used as reference or target values for the training data.
  • the entire collection of the data vectors DV is used to train the machine learning model.
  • Figure 3B shows an exemplary embodiment of a vector DV which is a further development of the embodiments shown above.
  • the vector DV comprises the digital data signals SD1 to SD8, the additional signals SG1 to SG9 and the reference values SREF1, SREF2.
  • SD1: sensor 1 channel data
  • SD3: sensor 3 channel data
  • SD4: sensor 4 channel data
  • SG1: display buffer data R averaged information
  • the vector DV comprises eight digital data signals SD1 to SD8, nine additional signals SG1 to SG9 and two reference signals SREF1, SREF2.
  • the vectors DV are determined at predetermined time steps.
  • a large set of vectors DV is determined and stored.
  • each vector DV is measured and determined at a different point in time.
  • a time sequence of the vectors DV is stored.
  • SD1 to SD8 are measured values.
  • SG1 to SG9 are values available in the host processor 46 or at another part of the optical sensor device 10.
  • the first and the second reference signal SREF1, SREF2 are calculated by the computer 54 using the further digital sensor signals SD1' to SD8' provided by the test multispectral sensor 51.
  • the vector DV includes fewer parameters.
  • the vector DV comprises at least three digital data signals SD1 to SD8 and at least one of the reference signals SREF1, SREF2.
  • the vector DV is e.g. free of any of the additional signals SG1 to SG9.
  • the vector DV comprises between three and eight digital data signals SD1 to SD8, between one and nine additional signals SG1 to SG9 and one or two reference signals SREF1, SREF2.
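
A minimal Python sketch of how one such vector DV could be assembled, assuming eight channel values, nine display-related values and two reference values. The field meanings come from the figure description; the container layout itself is an assumption of this sketch.

    # Sketch: one training vector DV = channel data + display data + references.
    from dataclasses import dataclass

    @dataclass
    class TrainingVector:
        sd: list       # digital sensor signals SD1..SD8 from the BOLED sensor
        sg: list       # additional display signals SG1..SG9
        sref1: float   # reference illuminance (LUX) from the open-air sensor
        sref2: float   # reference CCT from the open-air sensor

        def features(self):
            # Vector DV' used as model input: the references are excluded.
            return self.sd + self.sg

    dv = TrainingVector(sd=[512] * 8, sg=[0.5] * 9, sref1=350.0, sref2=4000.0)
    assert len(dv.features()) == 17   # N + M = 8 + 9 signals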
  • Figure 3C shows an exemplary embodiment for training of a method for evaluating optical sensor signals which is a further development of the embodiments shown above.
  • a supervised training of a model such as e.g. a neural network model can be performed as follows:
  • a plurality of vectors DV' is provided as input data to a machine learning algorithm 61 or a combination of algorithms described below.
  • the vector DV' is equal to the vector DV with the exception of the first and the second reference signal SREF1, SREF2.
  • the first and the second reference signal SREF1, SREF2 are not part of the vector DV'.
  • the machine learning algorithm 61 provides model output data.
  • the model output data are the first and the second ambient light signal AL1, AL2.
  • the model output data are compared with target data.
  • an error ER between the target data and the model output data is computed.
  • the error ER between the first and the second ambient light signal AL1, AL2 on one side and the first and the second reference signal SREF1, SREF2 (which are known from the vector DV) on the other side is calculated.
  • the error ER can be calculated according to the method of least squares.
  • the error ER is calculated e.g. according to a least-squares equation.
  • in a comparison process 63, the error ER is compared with a preset value PV.
  • if the error ER is smaller than the preset error value PV or the training iteration is greater than a preset iteration value, then the training is stopped.
  • the parameters of the model are stored in a parameter memory 56 of the computer 54.
  • the parameters can be used for testing and, in case of a successful test, in the optical sensor device 10 (that means in the product series of the optical sensor device 10).
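
The training loop can be condensed into the following hedged sketch. The least-squares form of the error ER is an assumed reading of the error calculation process 62 (the publication states the method of least squares, but the exact equation is not reproduced here), and the model interface (predict, update, parameters) is hypothetical.

    # Sketch of the supervised training loop of Figure 3C.
    def train(model, vectors, preset_error=1e-3, preset_iterations=1000):
        for iteration in range(preset_iterations):
            er = 0.0
            for dv in vectors:
                al1, al2 = model.predict(dv.features())   # model output data
                # Assumed least-squares error against the reference signals:
                er += (al1 - dv.sref1) ** 2 + (al2 - dv.sref2) ** 2
            if er < preset_error:     # comparison process 63
                break                 # training is stopped
            model.update(vectors)     # one algorithm-specific training step
        return model.parameters()     # stored in the parameter memory 56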
  • Figure 3D shows an exemplary embodiment for testing of a method for evaluating optical sensor signals which is a further development of the embodiments shown above.
  • a trained model 41 is used for testing.
  • input data are applied to the trained model 41.
  • the trained model 41 uses the parameters which result from the training as described in Figure 3C.
  • the trained model 41 generates the first and the second ambient light sensor signals AL1, AL2.
  • the first and the second ambient light sensor signals AL1, AL2 are compared with the first and the second reference signal SREF1, SREF2, e.g. using the error calculation process 62.
  • the trained model is appropriate for implementation in an optical sensor arrangement 10 that can be sold e.g. as a product.
  • a set of vectors DV' used for training is different from a set of vectors DV' used for testing (Figure 3D).
  • 75% of the vectors DV' generated with the test setup 50 are used for training and the other 25% of the vectors DV' generated with the test setup 50 are used for testing.
  • a percentage between 50% and 90% of the vectors DV' is used for training and the rest of the vectors DV' are used for testing.
  • a trained model achieves a low error with the vectors DV' which were used for training. Testing the trained model with other vectors DV' results in a reliable quality check of the trained model 41 and the stored parameters of the trained model 41.
  • Figure 3E shows an exemplary embodiment for using a method for evaluating optical sensor signals which is a further development of the embodiments shown above.
  • the method for evaluating optical sensor signals comprises generating at least three optical sensor signals SO1 to SO9 by the multispectral sensor 20, digitalizing the at least three optical sensor signals SO1 to SO9 into at least three digital sensor signals SD1 to SD9, and providing the first ambient light signal AL1 by the trained model 41 as a function of the at least three digital sensor signals SD1 to SD9.
  • the first ambient light signal AL1 is at least one of an illuminance value, a correlated color temperature value, chromaticity values, tristimulus values and a flicker value.
  • the trained model 41 is trained by a machine learning algorithm as described e.g. in Figures 3C and 3D.
  • the first ambient light signal AL1 is provided by the trained model 41 additionally as a function of at least one additional signal SG1 to SG9.
  • the at least one additional signal SG1 to SG9 is a signal of a group consisting of a display buffer data R averaged information, a display buffer data G averaged information, a display buffer data B averaged information, a display frame buffer information, a display brightness information, a display buffer information, a display refresh rate information, a display frame synchronization information, and a display pulse-width-modulation information.
  • one or more than one of these pieces of information is used by the trained model 41.
  • the computer program product 40 comprises instructions or software code such as the trained model 41 to cause the optical sensor device 10 to execute the method for evaluating an optical sensor signal.
  • the processor 44 or the host processor 46 executes the method for evaluating an optical sensor signal.
  • the processor 44 provides the first ambient light signal AL1.
  • the host processor 46 calculates the first ambient light signal AL1 (e.g. in case the optical sensor device 10 does not comprise a smart sensor or the processor 44 of the smart sensor 45 is not configured for processing the trained model 41).
  • the host processor 46 uses the first ambient light signal AL1 e.g. internally for control of the display 12.
  • the display 12 is realized as an OLED display.
  • the multispectral sensor 20 is located such that the glass 13 and the display 12, or only the display 12, are in an optical path between the ambient and the multispectral sensor 20.
  • the optical sensor device 10 is realized as a device of a group consisting of a cell phone, tablet, television set, laptop and computer with computer display.
  • Figure 4A shows an exemplary embodiment of a method for evaluating optical sensor signals which is a further development of the embodiments shown above.
  • the ambient light IL falls through the display 12 on the multispectral sensor 20 that generates N digital sensor signals SD1 to SDN.
  • the N digital sensor signals SD1 to SDN and M additional signals SG1 to SGM are provided to the trained model 41.
  • the trained model 41 implements a result of a machine learning algorithm.
  • the machine learning algorithm is one of a group consisting of linear regression 70, artificial neural network 71 (abbreviated ANN), multilayer perceptron 72 (abbreviated MLP), decision trees 73, gradient-boosted decision trees 74, random forest 75, K nearest neighbors 76 (abbreviated kNN), clustering 77, K-means clustering 78 and principal components analysis 79 (abbreviated PCA).
  • Multilayer perceptron 72 can be described as an example of an artificial neural network 71.
  • Gradient-boosted decision trees 74 can be described as an example of decision trees 73.
  • Another name for gradient-boosted decision trees 74 is e.g. gradient-boosting decision trees.
  • the trained model 41 implements more than one of the machine learning algorithms 70 to 79 listed above.
  • the N digital sensor signals SD1 to SDN and the M additional signals SG1 to SGM are provided to at least two machine learning algorithms listed above and the results of the at least two machine learning algorithms are provided to a combiner (not shown in Figure 4A) which provides the first and the second ambient light signal AL1, AL2.
  • the combiner can be realized as a summation function.
  • two or three or more than three machine learning algorithms 70 to 79 are implemented in an AND/OR combination.
  • an ambient light detection architecture is configured to implement several machine learning algorithms 70 to 79.
  • Figure 4B shows an exemplary embodiment of a method for evaluating optical sensor signals which is a further development of the embodiments shown above.
  • the machine learning algorithm is an artificial neural network 71 comprising an input layer 81 receiving on an input side the at least three digital sensor signals SD1 to SD8.
  • the input layer 81 receives on the input side the N digital sensor signals SD1 to SDN and the M additional signals SG1 to SGM.
  • the M additional signals SG1 to SGM comprise e.g. a phone display frame buffer information, a phone display brightness information and/or other relevant information.
  • a hidden layer 91 of the ANN 71 receives on an input side signals of the input layer 81.
  • an output layer 111 receives on an input side directly or indirectly signals of the hidden layer 91.
  • an activation function 121 receives on an input side signals of the output layer 111.
  • at least one summation function 130 of the ANN 71 receives on an input side signals provided by the activation function 121 and provides the first ambient light signal AL1 or the first and the second ambient light signal AL1, AL2.
  • the activation function 121 is one of rectified linear unit, logistic and hyperbolic tangent.
  • an ambient light detection architecture uses an ANN 71 as the machine learning algorithm.
  • each of the number M1 of neurons 82 to 88 of the input layer 81 comprises a bias bi.
  • the number M1 of neurons 82 to 88 of the input layer 81 are connected to a number M2 of neurons 92 to 101 of the hidden layer 91 via weights Wij.
  • each of the number M2 of neurons 92 to 101 of the hidden layer 91 comprises a bias bj.
  • each of the number M2 of neurons 92 to 101 of the hidden layer 91 is coupled to a number M3 of neurons 112 to 116 of the output layer 111 via weights Wjk.
  • each of the number M3 of neurons 112 to 116 of the output layer 111 is coupled via the activation function 121 to the at least one summation function 130.
  • the activation function 121 comprises a number M3 of functions 122 to 127.
  • each neuron is coupled via one function to the at least one summation function 130.
  • a first summation function of the at least one summation function 130 generates the first ambient light signal AL1.
  • a second summation function of the at least one summation function 130 generates the second ambient light signal AL2.
  • the first and the second ambient light signal AL1, AL2 are e.g. the illuminance and the CCT of the ambient light IL.
  • two separate models are trained, one for predicting illuminance (measured in LUX) and another for predicting CCT.
  • the hidden layer 91, the output layer 111, the activation function 121 and the summation function 130 are different for generating the first ambient light signal AL1 and for generating the second ambient light signal AL2.
  • the values for the bias and the weights are different.
  • the two trained models 41 may or may not use the same machine learning algorithm 70 to 79.
  • the training process typically takes 20 min to 30 min on a laptop PC and uses a large set of data (for example > 200000 sets used so far). Once trained, running predictions from a single set of input data is reasonably fast even in case of a cell phone, or whatever target device will be used.
  • the algorithm is capable of determining the illuminance value (measured in LUX) and the CCT value based on the pre-trained model 41 or pre-trained models if the current ambient condition as well as display conditions were included in the training set. It is capable of providing predictions under static or changing displays.
  • a possible disadvantage of using a machine learning algorithm for BOLED ambient light detection is that if a new ambient light and/or new display conditions were not included in the training set, the model may completely fail to predict the illuminance and CCT values correctly for the new ambient light.
  • the ANN 71 comprises one or more than one further hidden layers which are arranged between the hidden layer 91 and the output layer 111.
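
For orientation, here is a numpy sketch of the forward pass through the layer structure described above: input layer 81, hidden layer 91, output layer 111, activation function 121 (ReLU) and two summation functions 130 producing AL1 and AL2. The layer sizes and random weights are illustrative assumptions, not values from the publication.

    # Numpy sketch of the ANN forward pass of Figure 4B (sizes illustrative).
    import numpy as np

    rng = np.random.default_rng(0)
    M1, M2, M3 = 17, 10, 5              # neurons in input/hidden/output layers
    W_ij = rng.normal(size=(M1, M2))    # weights input layer -> hidden layer
    b_j = np.zeros(M2)                  # biases of the hidden layer
    W_jk = rng.normal(size=(M2, M3))    # weights hidden layer -> output layer
    b_k = np.zeros(M3)
    W_sum = rng.normal(size=(M3, 2))    # two summation functions for AL1, AL2

    def relu(v):                        # rectified linear unit activation
        return np.maximum(v, 0.0)

    def forward(x):                     # x: SD1..SDN plus SG1..SGM, length M1
        h = x @ W_ij + b_j              # hidden layer 91
        o = h @ W_jk + b_k              # output layer 111
        a = relu(o)                     # activation function 121
        al1, al2 = a @ W_sum            # summation functions 130
        return al1, al2

    print(forward(rng.normal(size=M1)))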
  • Figure 4C shows a further exemplary embodiment of a method for evaluating optical sensor signals which is a further development of the embodiments shown above.
  • the machine learning algorithm is linear regression 70.
  • the first ambient light signal AL1 is calculated according to the equation: AL1 = a0 + a1·x1 + a2·x2 + ... + aN+M·xN+M, wherein a0, a1, a2, ... aN+M are coefficients determined by linear regression and x1, x2 ... xN+M are the N digital sensor signals SD1 to SDN and M additional signals SG1 to SGM.
  • the number N > 2.
  • the number M is e.g. 0, 1, 2, 3 or more than 0 or more than 1 or more than 2.
  • the second ambient light signal AL2 can be calculated with the same equation using other values for the coefficients a0, a1, a2, ... aN+M.
  • linear regression 70 is implemented as the machine learning algorithm.
  • the linear regression 70 operates with multiple variables.
  • in Figure 4C, a linear regression is illustrated with a single variable x and with a single result AL1.
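
A short sketch of evaluating this trained linear-regression model. The coefficient values below are placeholders; in practice a0 ... aN+M result from the fit on the training vectors.

    # Sketch: AL1 = a0 + a1*x1 + ... + a(N+M)*x(N+M), coefficients from a fit.
    def predict_al(coefficients, signals):
        a0, *a = coefficients
        return a0 + sum(ai * xi for ai, xi in zip(a, signals))

    x = [512] * 8 + [0.5] * 9           # SD1..SD8 plus SG1..SG9 (N + M = 17)
    a_lux = [10.0] + [0.01] * len(x)    # placeholder coefficients for AL1
    a_cct = [3000.0] + [0.5] * len(x)   # separate placeholder set for AL2
    print(predict_al(a_lux, x), predict_al(a_cct, x))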
  • Figure 4D shows an additional exemplary embodiment of a method for evaluating optical sensor signals which is a further development of the embodiments shown above.
  • random forest 75 is implemented as the machine learning algorithm.
  • the random forest 75 comprises several decision trees 141 to 143.
  • the decision trees 141 to 143 are implemented as shallow decision trees.
  • the random forest 75 comprises a decision tree for each of the N digital sensor signals SD1 to SDN and M additional signals SG1 to SGM.
  • the random forest 75 comprises a combining function 144 which receives the outputs of the decision trees 141 to 143.
  • the combining function 144 provides e.g. the first ambient light signal AL1 as a function of an average of all outputs of the decision trees 141 to 143.
  • the second ambient light signal AL2 can be calculated with a similar random forest 75.
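
As a sketch of such a configuration with a standard library (a choice made for this illustration; the publication names no library): scikit-learn's RandomForestRegressor predicts by averaging the outputs of its decision trees, which corresponds to the combining function 144. The data below are synthetic stand-ins.

    # Sketch: random forest of shallow trees for the illuminance prediction.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1000, size=(5000, 17))   # stand-ins for vectors DV'
    y_lux = X[:, :8].sum(axis=1) * 0.1          # synthetic illuminance targets

    forest = RandomForestRegressor(n_estimators=100, max_depth=6)  # shallow trees
    forest.fit(X, y_lux)
    print(forest.predict(X[:1]))                # first ambient light signal AL1
    # A second forest trained on CCT targets would provide AL2 analogously.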
  • Figure 5A shows an exemplary embodiment of test results of a method for evaluating optical sensor signals which is realized according to an embodiment shown above.
  • an evaluation process includes a training data description.
  • the test setup 50 shown in Figure 3A was used.
  • the two multispectral sensors 20 and 51 were placed behind an OLED phone screen and in open air close to one another for data collection.
  • for data preparation, behind-OLED data and open-air data were collected simultaneously on a single Raspberry Pi.
  • the two sets of data were synchronized and combined into a single data file.
  • the lux and CCT values produced by the test multispectral sensor 51 for each data point are used as reference signals SREF1, SREF2 in the training of the machine learning algorithms.
  • the data files were collected during motion or walk-around, or collected in a laboratory (with lux values of 200, 250, 300, 350, 400, 450, 500, 550, 600, 650, 700, 750 and 800), or collected in an office and a conference room with ceiling lights and some sunlight through a window.
  • a MLP 72 is an embodiment of an ANN 71.
  • the MLP 72 is realized as a fully connected feedforward ANN 71.
  • the activation function used is rectified linear unit (abbreviated ReLU).
  • Other activation functions such as logistic (sigmoid) and hyperbolic tangent (tanh) were also studied (the results are not illustrated here). All data listed above total 234701 vectors, with 75% randomly selected for training the model. The further data were used for testing the trained model 41.
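
A hedged reconstruction of this experiment with scikit-learn (the publication names no library, and the hidden-layer sizes are assumptions): a fully connected feedforward MLP with ReLU activation, trained on a random 75% split and evaluated on the held-out 25%. The data here are synthetic stand-ins for the collected vectors.

    # Sketch of the Figure 5A experiment: MLP with ReLU and a 75/25 split.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1000, size=(10000, 17))  # stand-ins for the data vectors
    y = X[:, :8].sum(axis=1) * 0.1              # synthetic lux reference SREF1

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, train_size=0.75, random_state=0)  # 75% randomly selected

    mlp = MLPRegressor(hidden_layer_sizes=(32, 16), activation="relu",
                       max_iter=500).fit(X_train, y_train)
    print("test R^2:", mlp.score(X_test, y_test))  # check on held-out data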
  • Figure 5B shows a further exemplary embodiment of test results of a method for evaluating optical sensor signals which is realized according to the embodiment described above.
  • the first ambient light signal AL1 and the first reference signal SREF1 (which both refer to the illuminance in LUX) are shown versus a number NU of the data vector DV'.
  • the data were collected during a walk-around in an office, wherein the screen 12 is active.
  • Figure 5C shows an additional exemplary embodiment of test results of a method for evaluating optical sensor signals which is realized according to the embodiment described above.
  • the first ambient light signal AL1 and the first reference signal SREF1 (which both refer to the illuminance in LUX) are shown versus a number NU of the data vector DV'.
  • the data were collected in a laboratory, wherein the ambient light IL has been changed at the points of time indicated by arrows.
  • the reference signal SREF1 and the ambient light signal AL1 match well in the experiments shown in Figures 5B and 5C.
  • the invention is not limited to the description of the embodiments. Rather, the invention comprises each new feature as well as each combination of features, particularly each combination of features of the claims, even if the feature or the combination of features itself is not explicitly given in the claims or embodiments.

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • Spectrometry And Color Measurement (AREA)

Abstract

A method for evaluating optical sensor signals comprises generating at least three optical sensor signals by a multispectral sensor, digitalizing the at least three optical sensor signals into at least three digital sensor signals, and providing a first ambient light signal by a trained model as a function of the at least three digital sensor signals. Furthermore, an optical sensor device and a computer program product are described.

Description

DESCRIPTION
AMBIENT LIGHT DETECTION USING MACHINE LEARNING
CROSS-REFERENCE TO RELATED APPLICATIONS
The present application claims the benefit of U.S. Provisional Application No. 63/417,791, filed on October 20, 2022, the disclosure of which is incorporated by reference herein in its entirety.
TECHNICAL FIELD
A method for evaluating optical sensor signals, an optical sensor device and a computer program product are provided.
BACKGROUND
A measurement of ambient light is useful to control a brightness of a display. Ambient light can be detected e.g. by a multispectral sensor. Signals provided by the multispectral sensor have to be evaluated to provide an ambient light signal. In order to save area, in some cases, the multispectral sensor is placed behind a glass or behind a display. Thus, an accurate ambient light signal shall be provided even when a glass or a display is located in front of the multispectral sensor.
Thus, there is a need for a method for evaluating optical sensor signals, an optical sensor device and a computer program product that is able to generate an ambient light signal with high accuracy.
SUMMARY
In an embodiment, a method for evaluating optical sensor signals comprises: generating at least three optical sensor signals by a multispectral sensor, digitalizing the at least three optical sensor signals into at least three digital sensor signals, and providing a first ambient light signal by a trained model as a function of the at least three digital sensor signals.
Advantageously, the trained model generates the first ambient light signal as a function of at least three optical sensor signals. The trained model is e.g. configured to minimize an influence of a display attached to the multispectral sensor. Thus, the optical sensor device can be controlled using the first ambient light signal. The trained model is realized as trained software, trained code or trained instructions.
In an embodiment of the method, the first ambient light signal is one of a group consisting of an illuminance value, a correlated color temperature value, chromaticity values, tristimulus values and a flicker value. Advantageously, the first ambient light signal is designed to adjust a display, e.g. a brightness of a display. A display can also be named screen. The illuminance value is the luminous flux value measured in LUX.
In an embodiment of the method, a second ambient light signal is provided by the trained model as a function of the at least three digital sensor signals. The second ambient light signal is another one of the group described above. For example, the first ambient light signal is the illuminance and the second ambient light signal is the correlated color temperature value. Advantageously, the first and the second ambient light signal are designed to adjust the display, e.g. the brightness of the display.
In an embodiment of the method, the trained model is trained by a machine learning algorithm.
In an embodiment of the method, the machine learning algorithm is one of a group consisting of linear regression, artificial neural network, multilayer perceptron, decision trees, gradient-boosted decision trees, random forest, K nearest neighbors, clustering, K-means clustering and principal components analysis. Gradient-boosted decision trees can also be named gradient-boosting decision trees.
In an embodiment of the method, the machine learning algorithm is an artificial neural network comprising: an input layer receiving on an input side the at least three digital sensor signals, a hidden layer receiving on an input side signals of the input layer, an output layer receiving on an input side directly or indirectly signals of the hidden layer, and at least one summation function receiving on an input side directly or indirectly signals of the output layer and providing the first ambient light signal.
In a further development of the method, the artificial neural network comprises an activation function receiving on an input side signals of the output layer, and providing signals to at least one summation function.
In a further development of the method, the activation function is one of rectified linear unit activation function, logistic activation function and hyperbolic tangent activation function. The rectified linear unit activation function can be named rectifier activation function.
In an example, the output layer provides more than one signal. Thus, each of the signals provided by the output layer is processed by the same type of activation function. The parameters of the activation function are different for the different signals provided by the output layer.
In a further development of the method, the artificial neural network further comprises at least a further hidden layer receiving on an input side signals of the hidden layer and providing signals to the output layer.
In an embodiment of the method, the machine learning algorithm is linear regression . The first ambient light signal AL1 is calculated according to the equation :
AL1 = aO + al • xl + a2 • x2 + ... + aN+M • xN+M, wherein aO , al , a2 , ... aN+M are coef ficients determined by linear regression and xl , x2 ... xN+M are signals . The signals comprises the at least three digital sensor signals .
In an embodiment of the method, the second ambient light signal is provided by the trained model as a function of the at least three digital sensor signals. The machine learning algorithm is linear regression. The second ambient light signal AL2 is calculated according to the equation:

AL2 = a20 + a21·x1 + a22·x2 + ... + a2(N+M)·x(N+M),

wherein a20, a21, a22, ..., a2(N+M) are coefficients determined by linear regression and x1, x2, ..., x(N+M) are the signals. The signals comprise the at least three digital sensor signals.
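For illustration, the two linear-regression predictions above can be sketched in Python as follows; the coefficient and signal values are invented for this example and would in practice result from the training described below.

import numpy as np

# Illustrative coefficients a0 ... a(N+M) for AL1 and a20 ... a2(N+M) for AL2;
# real values would be determined by linear regression on training data.
a  = np.array([0.5, 1.2, -0.3, 2.1, 0.8])         # a0, a1, ..., a4
a2 = np.array([4100.0, -30.0, 12.0, 85.0, -4.0])  # a20, a21, ..., a24

# x1 ... x(N+M): here four signals, e.g. at least three digital sensor
# signals plus one additional signal.
x = np.array([310.0, 145.0, 97.0, 210.0])

AL1 = a[0] + np.dot(a[1:], x)    # AL1 = a0 + a1*x1 + ... + a(N+M)*x(N+M)
AL2 = a2[0] + np.dot(a2[1:], x)  # same form with the second coefficient set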
In an embodiment of the method, providing the first ambient light signal is performed by the trained model as a function of the at least three digital sensor signals and at least one additional signal. The at least one additional signal is a signal of a group consisting of display buffer data red averaged information, display buffer data green averaged information, display buffer data blue averaged information, display frame buffer information, display brightness information, display buffer information, display refresh rate information, display frame synchronization information, and display pulse-width-modulation information.
Thus, in an example, the signals x1, x2, ..., x(N+M) comprise the at least three digital sensor signals and the at least one additional signal.
In an example, the optical sensor signals and, thus, also the digital sensor signals are only determined in the visible range.
In an embodiment, an optical sensor device comprises the multispectral sensor and a processor. The processor is configured to execute the method for evaluating an optical sensor signal, e.g. as described above.
In an embodiment, the optical sensor device comprises a smart sensor. The smart sensor comprises the multispectral sensor and the processor. The processor is configured to provide the first ambient light signal. For example, the multispectral sensor and the processor are integrated together on one semiconductor substrate.
In an embodiment of the optical sensor device, the processor is realized as a host processor. The processor is configured to generate the first ambient light signal. The multispectral sensor and the host processor are realized on two separate semiconductor bodies.
In an embodiment, the optical sensor device further comprises a display. The multispectral sensor is located such that the display is in an optical path between an ambient and the multispectral sensor. The display can be named screen.
In a further development of the optical sensor device, the display is realized as an organic light-emitting diode display, abbreviated OLED display.
In an embodiment of the optical sensor device, an optical path between an ambient and the multispectral sensor is free of a display.
In an embodiment, the optical sensor device further comprises a glass or a polymer sheet which is colorless or colored. A colored glass can be named tinted glass. A colorless glass can be named clear glass. The multispectral sensor is located such that the glass or the polymer sheet is in the optical path between the ambient and the multispectral sensor.
In an example, the glass or the polymer sheet and the display are in the optical path between the ambient and the multispectral sensor.
In an embodiment, the optical sensor device is realized as one of a group consisting of a cell phone, tablet, television set, laptop and computer with computer display.
In an embodiment, a computer program product comprises instructions to cause the optical sensor device to execute the method for evaluating an optical sensor signal.
The method for evaluating an optical sensor signal is performed e.g. in real-time. The method for evaluating an optical sensor signal is performed e.g. on-line.
The optical sensor device and the computer program product described above are particularly suitable for the method for evaluating optical sensor signals. Features described in connection with the optical sensor device and the computer program product can therefore be used for the method and vice versa.
In an example, the method for evaluating optical sensor signals provides an ambient light detection using machine learning.
In an example, the method can be implemented in any product that uses an organic light-emitting diode display (abbreviated OLED display) or another type of display, such as a cell phone, tablet, television set etc. The ambient light detection is configured for a configuration of the multispectral sensor behind an OLED or another display. Alternatively, the ambient light detection is configured for a configuration of the multispectral sensor without display, e.g. with the multispectral sensor being located in open air.
Advantageously, an accuracy of ambient light detection is improved. In an example, the method detects an ambient light intensity in terms of an illuminance (measured as luminous flux per unit area, unit LUX) and/or a spectral content of the ambient light in terms of a correlated color temperature, abbreviated CCT. The optical sensor device uses machine learning algorithms such as an artificial neural network combined with a multispectral sensor for detecting ambient light in behind-OLED (abbreviated BOLED) configurations, behind other screens or in open-air environments.
BRIEF DESCRIPTION OF THE DRAWINGS
The following description of figures of examples or embodiments may further illustrate and explain aspects of the method for evaluating optical sensor signals, the optical sensor device and the computer program product. Arrangements, devices, circuit blocks, methods and steps with the same structure and the same effect, respectively, appear with equivalent reference symbols. Insofar as arrangements, devices, circuit blocks, methods and steps correspond to one another in terms of their function in different figures, the description thereof is not repeated for each of the following figures.

Figures 1A to 1C show exemplary embodiments of a multispectral sensor and its characteristic;
Figures 2A to 2C show exemplary embodiments of an optical sensor device with a multispectral sensor;
Figures 3A to 3E show exemplary embodiments of a method for evaluating optical sensor signals;
Figures 4A to 4D show exemplary embodiments of a method for evaluating optical sensor signals; and
Figures 5A to 5C show exemplary embodiments of test results of a method for evaluating optical sensor signals.
DETAILED DESCRIPTION
Figure 1A shows an exemplary embodiment of a multispectral sensor 20. The multispectral sensor 20 comprises more than one light sensor 21. The more than one light sensor 21 is realized e.g. as a photodiode. The multispectral sensor 20 comprises a semiconductor substrate 22. The more than one light sensor 21 is integrated on a first main surface of the semiconductor substrate 22. The multispectral sensor 20 has a sensing window 23. The sensing window 23 is located at the more than one light sensor 21. The sensing window 23 is configured to allow light to the more than one light sensor 21.
An ambient light IL is applied to the multispectral sensor 20. The ambient light IL can be named illumination. Thus, the ambient light IL is applied to the more than one light sensor 21. The multispectral sensor 20 generates more than one optical sensor signal SO1 to SO5 as a function of the ambient light IL. The more than one optical sensor signals SO1 to SO5 can be named channel data outputs. The multispectral sensor 20 can be named multispectral light sensor or multispectral ambient light sensor.
Figure 1B shows an exemplary embodiment of a multispectral sensor 20 which is a further development of the embodiment shown in Figure 1A. The multispectral sensor 20 comprises a digital core 27. In the example shown in Figure 1B, the multispectral sensor 20 comprises seven light sensors 21. The more than one light sensors 21 are coupled via a multiplexer 24, an analog circuit 25 and one or more than one analog-digital converters 26 to the digital core 27. The digital core 27 comprises a register 28. The multispectral sensor 20 comprises an interface 29 coupled to the register 28.
The optical sensor signals SO1 to SO7 are digitized into digital sensor signals SD1 to SD7 by the multispectral sensor 20 (more exactly by the one or more than one analog-digital converters 26). The digital sensor signals SD1 to SD7 are stored in the register 28 and provided via the interface 29 to another circuit not shown in Figure 1B. The interface 29 is a bus interface, e.g. an inter-integrated circuit interface, abbreviated I2C interface. The interface 29 receives and provides a data signal SDA and receives a clock signal SCL. Thus, the multispectral sensor 20 provides the digital sensor signals SD1 to SD7 via the data signal SDA. A supply voltage VDD and a ground potential VSS are provided to the multispectral sensor 20.

Figure 1C shows an exemplary embodiment of a characteristic of a multispectral sensor 20 which is a further development of the embodiments shown in Figures 1A and 1B. In Figure 1C, responsivities R1 to R5 are shown as a function of a wavelength λ (measured in nm) for different filters. The multispectral sensor 20 comprises e.g. a blue filter, a green filter, a red filter, a clear filter and a near infrared filter (abbreviated NIR filter). The responsivity R1 (given in arbitrary units) is obtained for the blue filter, R2 is obtained for the green filter, R3 is obtained for the red filter, R4 is obtained for the clear filter and R5 is obtained for the NIR filter.
Thus, the multispectral sensor 20 comprises several optical filters. Each filter has a center wavelength and a relatively narrow bandwidth. An example of these filter spectra is illustrated in Figure 1C, where the red filter has a center wavelength of 650 nm, the green filter 550 nm, the blue filter 450 nm, and the NIR filter 750 nm. The outputs of each light sensor 21 covered with a filter are referred to as optical sensor signals SO1 to SO5 and, after digitalization, as digital sensor signals SD1 to SD5 of the multispectral sensor 20. The digital sensor signals SD1 to SD5 can be named channel output data.
When the ambient light IL is incident upon the sensing window 23 of the multispectral sensor 20, the intensity of the ambient light IL will affect the absolute values of the channel data, and the color temperature of the ambient light IL will affect the relative values of the channel data. For example, if the ambient light IL is a fluorescent light, the green and blue channel data will have higher values than the red channel data. If the ambient light IL is an incandescent light, the red and near IR channels will have higher values than the green and blue channels. These characteristics provided by the multispectral sensor 20 allow a machine learning algorithm to computationally derive an illuminance value and a correlated color temperature value, abbreviated CCT value, of the ambient light IL, as described below. The illuminance value is the luminous flux per unit area, measured in LUX.
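For intuition only, the following Python sketch turns such relative channel values into a rough light-source guess; the channel ordering and the threshold rule are invented for this illustration, whereas the actual method derives the illuminance and CCT values with a trained model.

import numpy as np

# Assumed channel order: [blue, green, red, clear, NIR]; the values are invented.
channels = np.array([820.0, 900.0, 410.0, 2100.0, 260.0])
blue, green, red, clear, nir = channels

# Crude rule of thumb mirroring the observation above, not the trained model:
if green > red and blue > red:
    guess = "fluorescent-like (cool) ambient light"
elif red > green and nir > blue:
    guess = "incandescent-like (warm) ambient light"
else:
    guess = "mixed or unknown ambient light"
print(guess)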
Figure 2A shows an exemplary embodiment of an optical sensor device 10 with a multispectral sensor 20 which is a further development of the embodiments shown in Figures 1A to 1C. The optical sensor device 10 is realized as a cell phone, mobile phone or smart phone. The optical sensor device 10 comprises the multispectral sensor 20. The optical sensor device 10 comprises a display 12. The display 12 can also be named screen. The display 12 is located between the multispectral sensor 20 and an ambient. Thus, the ambient light IL is provided to the sensing window 23 via the display 12. The multispectral sensor 20 generates (e.g. internally) at least three optical sensor signals SO1 to SO8 and provides at least three digital sensor signals SD1 to SD8 by digitalizing the at least three optical sensor signals SO1 to SO8.
The at least three digital sensor signals SD1 to SD8 are applied to a computer program product 40. The computer program product 40 comprises a trained model 41. The trained model 41 provides a first ambient light signal AL1 as a function of the at least three digital sensor signals SD1 to SD8. The first ambient light signal AL1 is at least one of an illuminance value, a correlated color temperature value, chromaticity values, tristimulus values and a flicker value. The trained model 41 is trained by a machine learning algorithm comprised by the computer program product 40, as described below. As shown in Figure 2A, at least one additional signal SG1 to SG9 is provided to the computer program product 40 and thus to the trained model 41. The at least one additional signal SG1 to SG9 is a signal of a group consisting of display buffer data R averaged information, display buffer data G averaged information, display buffer data B averaged information, display frame buffer information, display brightness information, display buffer information, display refresh rate information, display frame synchronization information, and display pulse-width-modulation information (which can be named display pulse-width-modulation duty cycle information). The at least one additional signal SG1 to SG9 is not generated by any of the light sensors 21.
The trained model 41 of the computer program product 40 comprises instructions or software code to cause the optical sensor device 10 to execute the method for evaluating the optical sensor signals SO1 to SO8 (or in other words the digital sensor signals SD1 to SD8) and the at least one additional signal SG1 to SG9. Thus, the first ambient light signal AL1 is provided by the trained model 41 as a function of the at least three digital sensor signals SD1 to SD8 and optionally as a function of the at least one additional signal SG1 to SG9. The first ambient light signal AL1 is applied to a control input 14 of the optical sensor device 10.
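A minimal sketch of assembling and evaluating such a model input follows; the placeholder model, the signal values and the weights are assumptions for illustration and do not represent the trained model 41 itself.

import numpy as np

def trained_model(features: np.ndarray) -> float:
    """Stand-in for the trained model 41; here simply a linear map with
    invented weights, returning one ambient light signal."""
    w = np.linspace(0.05, 0.5, features.size)  # illustrative weights
    return float(np.dot(w, features))

sd = np.array([512, 498, 133, 742, 88, 301, 265, 190], dtype=float)  # SD1..SD8
sg = np.array([120, 118, 95, 1, 200, 60, 1, 0.45, 0.3])              # SG1..SG9

AL1 = trained_model(np.concatenate([sd, sg]))  # first ambient light signal AL1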
The trained model 41 provides a second ambient light signal AL2 as a function of the at least three digital sensor signals SD1 to SD8. The second ambient light signal AL2 is generated similarly as the first ambient light signal AL1.
Since the optical sensor device 10 comprises the display 12, the multispectral sensor 20 is located such that the display 12 is in an optical path between an ambient and the multispectral sensor 20. In an example, the display 12 is realized as an OLED display.
In an example, the optical sensor device 10 further comprises a glass 13 which is a colorless or a colored glass. For example, the glass 13 covers the display 12. The glass 13 is located between the ambient and the display 12. The multispectral sensor 20 is located such that the glass 13 and the display 12 are in an optical path between an ambient and the multispectral sensor 20. A possible function of the glass 13 is to protect the display 12. Alternatively, the optical sensor device 10 comprises a polymer sheet at the place of the glass 13.
By combining the multispectral sensor 20 with machine learning algorithms, more accurate ambient light detection behind OLED screens (abbreviated BOLED screens), behind other screens or in open air can be performed. Advantageously, a higher ambient light detection accuracy can be achieved.
The multispectral sensor 20 provides raw channel data as inputs to the machine learning algorithm that runs on a cell phone, a laptop personal computer or any other target platform that contains one or more processors which execute a pre-trained model to predict the first ambient light signal AL1, which comprises e.g. an intensity as well as a spectrum of the ambient light IL. The multispectral sensor 20 is placed behind an OLED screen or any other screen material that allows transmission of ambient light IL to reach the multispectral sensor 20, or in open air.
In Figure 2A, an overview of a system and a hardware setup is shown. The multispectral sensor 20 is arranged behind an OLED display screen. The OLED display screen is on a cell phone. The multispectral sensor 20 contains multiple optical filters to filter out many narrow-band light waves such as e.g. red, green, blue, yellow, magenta, etc. from the ambient light IL. The filters shown in Figure 2A are for illustrative purposes. The actual number of filters and the exact wavelengths could vary depending on specific applications.
The outputs of the multispectral sensor 20 are used as feature inputs to the machine learning algorithm. The outputs comprise e.g. a red, green, blue, yellow, magenta, cyan, near IR and clear filter output.
Furthermore, various display data SG1 to SG9 are used as additional feature inputs to the machine learning algorithms, for example: display buffer data, display brightness data, display refresh rate data, display frame sync data, and display pulse-width-modulation duty-cycle data. One or more machine learning algorithms as described below are appropriate for finding the illuminance and CCT values of the ambient light IL. Thus, the outputs are illuminance and CCT values of the ambient light IL. The illuminance and CCT values are used to control e.g. the brightness and/or color temperature of the display 12. In Figure 2A, the trained model 41 is shown outside of the optical sensor device 10 for illustration purposes. The trained model 41 is realized inside the optical sensor device 10, as explained e.g. in Figures 2B and 2C. The control input 14 is internal in the optical sensor device 10.
In an alternative embodiment, not shown, an optical path between an ambient and the multispectral sensor 20 is free of a display.
In an alternative embodiment, not shown, the optical sensor device 10 is realized as a tablet, television set, laptop or computer with computer display.
Figure 2B shows an exemplary embodiment of an optical sensor device 10 with a multispectral sensor 20 which is a further development of the embodiments shown in Figures 1A to 1C and 2A. The optical sensor device 10 additionally comprises a processor 44 which is configured to execute the method for evaluating the optical sensor signals SO1 to SO8 or the digital sensor signals SD1 to SD8.
The optical sensor device 10 comprises a smart sensor 45. The smart sensor 45 can be named intelligent sensor. The smart sensor 45 comprises the multispectral sensor 20 and the processor 44. Thus, the multispectral sensor 20 and the processor 44 are combined in the smart sensor 45. In an example, the semiconductor substrate 22 comprises the multispectral sensor 20 and the processor 44. The multispectral sensor 20 and the processor 44 are integrated on the first main surface of the semiconductor substrate 22. The processor 44 is an embedded processor. The processor 44 provides the first ambient light signal AL1. Optionally, the processor 44 additionally provides the second ambient light signal AL2. In an example, the first and the second ambient light signal AL1, AL2 are the illuminance (measured as luminous flux per unit area, unit LUX) and the correlated color temperature, abbreviated CCT.
The processor 44 is configured for running the machine learning model. Once a machine learning model has been successfully trained to have reached a certain performance level, the trained model 41 is deployed and run in the optical sensor device 10. If a smart sensor 45 incorporating the multispectral sensor 20 also contains the processor 44, the trained model 41 can be run on the processor 44.
In Figure 2B, a hardware configuration is shown for running the model on the embedded processor 44 inside of the smart sensor 45. The smart sensor 45 combines the multispectral sensor 20 and the embedded processor 44. The multispectral sensor 20 generates the sensor data. The sensor data and other data as explained above (not including the reference LUX and CCT data) are applied to the embedded processor 44 which runs the machine learning model in real time and produces the first and the second ambient light signal AL1, AL2.
The optical sensor device additionally comprises a host processor 46. The host processor 46 is e.g. one of the processors of the optical sensor device 10 which is e.g. a cell phone. The host processor 46 takes the first and the second ambient light signal AL1, AL2 as inputs to adjust a brightness of the display 12 and/or a correlated color temperature of the display 12. The host processor 46 provides the at least one additional signal SG1 to SG9 to the processor 44.
Figure 2C shows an alternative exemplary embodiment of an optical sensor device 10 with a multispectral sensor 20 which is a further development of the embodiments shown in Figures 1A to 1C, 2A and 2B. The multispectral sensor 20 does not contain an embedded processor 44. The trained model 41 is run on the host processor 46. The host processor 46 is one of the processors of the optical sensor device 10 (that is, on a cell phone). In the hardware configuration shown in Figure 2C, the model runs on the host processor 46. The multispectral sensor 20 is e.g. free from a processor.
The digital sensor signals SD1 to SD8 are sent to the host processor 46. The at least one additional signal SG1 to SG9 (not including the reference LUX and CCT data) is sent to the host processor 46 or is present in the host processor 46. The host processor 46 takes a data vector DV' as an input, runs the machine learning model in real time and produces the first and the second ambient light signal AL1, AL2. The illuminance value and the CCT value are designed for adjusting the brightness of the display 12 and/or the correlated color temperature of the display 12. The host processor 46 is realized on a further semiconductor substrate 47 which is separate from the semiconductor substrate 22 of the multispectral sensor 20.
Figure 3A shows an exemplary embodiment of a test setup 50 for realizing a method for evaluating optical sensor signals which is a further development of the embodiments shown in Figures 1A to 1C and 2A to 2C. The test setup 50 comprises the optical sensor device 10 and a test multispectral sensor 51.
The optical sensor device 10 comprises the multispectral sensor 20 and the display 12. The multispectral sensor 20 is behind the display 12. Thus, the multispectral sensor 20 is implemented as an ambient light sensor behind an OLED screen 12 on a cell phone.
The test multispectral sensor 51 is separate from the optical sensor device 10. An optical path from the ambient to the test multispectral sensor 51 is free from a display 12 (such as e.g. an OLED screen). The test multispectral sensor 51 is an ambient light sensor in open air. The test setup 50 optionally comprises a glass or polymer cover that is in the optical path from the ambient to the test multispectral sensor 51. The test multispectral sensor 51 and the multispectral sensor 20 are oriented in the same direction with respect to the ambient. Thus, ambient light IL that falls on the test multispectral sensor 51 is identical or nearly identical to ambient light IL that falls on the display 12 above the multispectral sensor 20.
Of course, due to the influence of the display 12, the ambient light IL that falls on the test multispectral sensor 51 is different from the light that falls on the multispectral sensor 20. The display 12 can influence the light falling on the multispectral sensor 20 e.g. by providing a shield, mask or shadow structure and/or by emitting light not only in the direction of a user but also in the direction towards the multispectral sensor 20. The shield, mask or shadow structure reduces the light falling on the multispectral sensor 20. The light emission of the display 12 increases the light falling on the multispectral sensor 20.
In an example, the multispectral sensor 20 and the test multispectral sensor 51 are realized identically. The test multispectral sensor 51 has a sensing window 53 that is identical e.g. to the sensing window 23 of the multispectral sensor 20. The light sensors 21 and the filters of the multispectral sensor 20 and the test multispectral sensor 51 are e.g. identical. The multispectral sensor 20 generates digital sensor signals SD1 to SD8. The test multispectral sensor 51 generates further digital sensor signals SD1' to SD8'.
The test setup 50 comprises a computer 54 that comprises a software 55 for collecting the digital sensor signals SD1 to SD8 and the further digital sensor signals SD1' to SD8' and for constructing a data vector DV (shown in Figure 3B) for training. The software 55 is configured to implement one or more than one machine learning algorithm and to train machine learning models.
Possible advantages of machine learning are: Due to significantly increased computation power and the large amount of data available, machine learning algorithms are highly effective in recognizing complex patterns in a world of seemingly disparate data. The data detected by a multispectral sensor 20 behind an OLED screen 12 is quite complex because it is a composite of random screen displays and arbitrary and changing ambient lights. Due to the complexity of this data, a machine learning algorithm is well suited for solving this task. In Figure 3A, the method and device assembly for collecting training data is illustrated. The data collection assembly comprises the multispectral sensor 20 placed behind the OLED screen 12. The multispectral sensor 20 and its physical placement are the same as for its intended use in a consumer product. The test multispectral sensor 51 is another multispectral sensor placed in close proximity to the multispectral sensor 20. The test multispectral sensor 51 is used to measure the illuminance and correlated color temperature of the ambient light IL accurately and to serve as a reference for the purpose of collecting the training data.
For collection of the training data, multiple channel data from the multispectral sensor 20 and the test multispectral sensor 51 are captured simultaneously and transmitted to the computer 54. The software 55 inside the computer 54 first computes the illuminance (measured as luminous flux per unit area, unit LUX) and the correlated color temperature (abbreviated CCT) of the ambient light IL from the further digital sensor signals SD1' to SD8', and then combines them with the digital sensor signals SD1 to SD8 and other input data to create a data vector DV as shown in Figure 3B. A first reference signal SREF1 is a reference value for the illuminance. A second reference signal SREF2 is a reference value for the CCT. The first and the second reference signal SREF1, SREF2 are obtained from the further digital sensor signals SD1' to SD8' and are used as reference or target values for the training data. After numerous data vectors DV are collected for many types of ambient light conditions, the entire collection of the data vectors DV is used to train the machine learning model.

Figure 3B shows an exemplary embodiment of a vector DV which is a further development of the embodiments shown above. The vector DV comprises the digital data signals SD1 to SD8, the additional signals SG1 to SG9 and the reference values SREF1, SREF2:
SD1 = sensor 1 channel data
SD2 = sensor 2 channel data
SD3 = sensor 3 channel data
SD4 = sensor 4 channel data
SD5 = sensor 5 channel data
SD6 = sensor 6 channel data
SD7 = sensor 7 channel data
SD8 = sensor 8 channel data
SG1 = display buffer data R averaged information
SG2 = display buffer data G averaged information
SG3 = display buffer data B averaged information
SG4 = display brightness information
SG5 = display refresh rate information
SG6 = display frame synchronization information
SG7 = display PWM information
SG8 = display frame buffer information
SG9 = display buffer information
SREF1 = reference illuminance value
SREF2 = reference CCT value
The vector DV comprises eight digital data signals SD1 to SD8, nine additional signals SG1 to SG9 and two reference signals SREF1, SREF2. During data acquisition with the test setup 50 shown in Figure 3A, the vectors DV are determined at predetermined time steps. Thus, a huge set of vectors DV is determined and stored. Each vector DV is measured and determined at another point of time. Thus, a time sequence of the vectors DV is stored. SD1 to SD8 are measured values; SG1 to SG9 are values available in the host processor 46 or at another part of the optical sensor device 10. The first and the second reference signal SREF1, SREF2 are calculated by the computer 54 using the further digital sensor signals SD1' to SD8' provided by the test multispectral sensor 51.
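A minimal sketch of assembling one vector DV at a single time step is given below; all numerical values are invented, and in practice SREF1 and SREF2 would be computed from the further digital sensor signals SD1' to SD8'.

import numpy as np

sd   = np.array([512, 498, 133, 742, 88, 301, 265, 190], dtype=float)  # SD1..SD8
sg   = np.array([120, 118, 95, 1, 200, 60, 1, 0.45, 0.3])              # SG1..SG9
sref = np.array([350.0, 4100.0])  # SREF1 (LUX) and SREF2 (CCT) from the reference

dv = np.concatenate([sd, sg, sref])  # one data vector DV with 19 entries
dataset = np.vstack([dv, dv])        # in practice: one vector DV per time step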
In an alternative embodiment, not shown, the vector DV includes fewer parameters. The vector DV comprises at least three digital data signals SD1 to SD8 and at least one of the reference signals SREF1, SREF2. The vector DV is e.g. free of any of the additional signals SG1 to SG9.
Alternatively, the vector DV comprises between three and eight digital data signals SD1 to SD8, between one and nine additional signals SG1 to SG9 and one or two reference signals SREF1, SREF2.
Figure 3C shows an exemplary embodiment for training of a method for evaluating optical sensor signals which is a further development of the embodiments shown above. In Figures 3C and 3D, an overview of model training and testing is provided. A supervised training of a model such as e.g. a neural network model can be performed as follows: A plurality of vectors DV' is provided as input data to a machine learning algorithm 61 or a combination of algorithms described below. The vector DV' is equal to the vector DV with the exception of the first and the second reference signal SREF1, SREF2. The first and the second reference signal SREF1, SREF2 are not part of the vector DV'. The machine learning algorithm 61 provides model output data. The model output data are the first and the second ambient light signal AL1, AL2. In an error calculation process 62, the model output data are compared with target data. An error ER between the target data and the model output data is computed. In the error calculation process 62, the error ER between the first and the second ambient light signal AL1, AL2 on one side and the first and the second reference signal SREF1, SREF2 (which are known from the vector DV) on the other side is calculated. For example, the error ER can be calculated according to the method of least squares. Thus, the error ER is calculated e.g. according to the equation:

ER = √((AL1 − SREF1)² + (AL2 − SREF2)²)
In a comparison process 63, the error ER is compared with a preset error value PV. If the error ER is smaller than the preset error value PV or if the training iteration is greater than a preset iteration value, then the training is stopped. If the error ER is greater than the preset error value PV and the training iteration is less than the preset iteration value, then the training is continued, e.g. by providing feedback to the optimization engine in the process 61.
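The loop of Figure 3C can be sketched as follows; the model object with predict and update methods is an invented interface, since the optimization engine is not specified in detail here.

import numpy as np

def train(model, dataset, preset_error, preset_iterations):
    """Training loop of Figure 3C: compare the model output with SREF1, SREF2
    and stop when the error ER is small enough or the iteration limit is hit."""
    x = dataset[:, :-2]                        # vectors DV' (DV without SREF1, SREF2)
    sref1, sref2 = dataset[:, -2], dataset[:, -1]
    for iteration in range(preset_iterations):
        al1, al2 = model.predict(x)            # model output data
        # error ER per vector, aggregated here as a mean over the training set
        er = np.mean(np.sqrt((al1 - sref1) ** 2 + (al2 - sref2) ** 2))
        if er < preset_error:
            break                              # training is stopped
        model.update(x, sref1, sref2)          # feedback to the optimization engine
    return model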
In case the training is terminated, the parameters of the model are stored in a parameter memory 56 of the computer 54. The parameters can be used for testing and, in case of a successful test, in the optical sensor device 10 (that means in the product series of the optical sensor device 10).
Figure 3D shows an exemplary embodiment for testing of a method for evaluating optical sensor signals which is a further development of the embodiments shown above. For testing, a trained model 41 is used. Input data are applied to the trained model 41. The trained model 41 uses the parameters which result from the training as described in Figure 3C. The trained model 41 generates the first and the second ambient light signal AL1, AL2. The first and the second ambient light signal AL1, AL2 are compared with the first and the second reference signal SREF1, SREF2, e.g. using the error calculation process 62. In case the error ER is less than a predetermined value, the trained model is appropriate for implementation in an optical sensor device 10 that can be sold e.g. as a product.
In an embodiment, a set of vectors DV' used for training (Figure 3C) is different from a set of vectors DV' used for testing (Figure 3D). For example, 75% of the vectors DV' generated with the test setup 50 (shown in Figure 3A) are used for training and the other 25% of the vectors DV' generated with the test setup 50 are used for testing. Alternatively, a percentage between 50% and 90% of the vectors DV' is used for training and the rest of the vectors DV' is used for testing. Obviously, a trained model achieves a low error with the vectors DV' which were used for training. Testing the trained model with other vectors DV' results in a reliable quality check of the trained model 41 and the stored parameters of the trained model 41.
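Such a split could be implemented e.g. as follows; scikit-learn is an implementation choice rather than part of the description, and the random arrays stand in for the collected vectors DV' and the reference values.

import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
features = rng.random((1000, 17))  # stand-in for the vectors DV' (SD1..SD8, SG1..SG9)
targets  = rng.random((1000, 2))   # stand-in for the reference values SREF1, SREF2

x_train, x_test, y_train, y_test = train_test_split(
    features, targets, train_size=0.75, random_state=0  # 75 % training, 25 % testing
)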
Figure 3E shows an exemplary embodiment for using a method for evaluating optical sensor signals which is a further development of the embodiments shown above. The method for evaluating optical sensor signals comprises generating at least three optical sensor signals SO1 to SO9 by the multispectral sensor 20, digitalizing the at least three optical sensor signals SO1 to SO9 into at least three digital sensor signals SD1 to SD9, and providing the first ambient light signal AL1 by the trained model 41 as a function of the at least three digital sensor signals SD1 to SD9. The first ambient light signal AL1 is at least one of an illuminance value, a correlated color temperature value, chromaticity values, tristimulus values and a flicker value. The trained model 41 is trained by a machine learning algorithm as described e.g. in Figures 3C and 3D.
Optionally, the first ambient light signal AL1 is provided by the trained model 41 additionally as a function of at least one additional signal SG1 to SG9. The at least one additional signal SG1 to SG9 is a signal of a group consisting of a display buffer data R averaged information, a display buffer data G averaged information, a display buffer data B averaged information, a display frame buffer information, a display brightness information, a display buffer information, a display refresh rate information, a display frame synchronization information, and a display pulse-width-modulation information. One or more than one of these pieces of information are used by the trained model 41.
The computer program product 40 comprises instructions or software code such as the trained model 41 to cause the optical sensor device 10 to execute the method for evaluating an optical sensor signal. The processor 44 or the host processor 46 executes the method for evaluating an optical sensor signal.
In case the optical sensor device 10 comprises the smart sensor 45 which comprises the multispectral sensor 20 and the processor 44, the processor 44 provides the first ambient light signal AL1. Alternatively, the host processor 46 calculates the first ambient light signal AL1 (e.g. in case the optical sensor device 10 does not comprise a smart sensor or the processor 44 of the smart sensor 45 is not configured for processing the trained model 41). The host processor 46 uses e.g. internally the first ambient light signal AL1 for control of the display 12.
Advantageously, an influence of the display 12 or an influence of the display 12 and the glass 13 on the light received by the multispectral sensor 20 is reduced. For example, the display 12 is realized as an OLED display. The multispectral sensor 20 is located such that the glass 13 and the display 12 or only the display 12 are in an optical path between the ambient and the multispectral sensor 20. The optical sensor device 10 is realized as a device of a group consisting of a cell phone, tablet, television set, laptop and computer with computer display.
Figure 4A shows an exemplary embodiment of a method for evaluating optical sensor signals which is a further development of the embodiments shown above. The ambient light IL falls through the display 12 on the multispectral sensor 20 that generates N digital sensor signals SD1 to SDN. The N digital sensor signals SD1 to SDN and M additional signals SG1 to SGM are provided to the trained model 41.
The trained model 41 implements a result of a machine learning algorithm. The machine learning algorithm is one of a group consisting of linear regression 70, artificial neural network 71 (abbreviated ANN), multilayer perceptron 72 (abbreviated MLP), decision trees 73, gradient-boosted decision trees 74, random forest 75, K nearest neighbors 76 (abbreviated kNN), clustering 77, K-means clustering 78 and principal components analysis 79 (abbreviated PCA). Multilayer perceptron 72 can be described as an example of an artificial neural network 71. Gradient-boosted decision trees 74 can be described as an example of decision trees 73. Another name for gradient-boosted decision trees 74 is e.g. gradient-boosting decision trees.
Alternatively, the trained model 41 implements more than one machine learning algorithm 70 to 79 listed above. Thus, the N digital sensor signals SD1 to SDN and the M additional signals SG1 to SGM are provided to at least two machine learning algorithms listed above, and the results of the at least two machine learning algorithms are provided to a combiner (not shown in Figure 4A) which provides the first and the second ambient light signal AL1, AL2. The combiner can be realized as a summation function. Thus, two or three or more than three machine learning algorithms 70 to 79 are implemented in an AND/OR combination. As shown in the example of the architecture overview in Figure 4A, an ambient light detection architecture is configured to implement several machine learning algorithms 70 to 79.
Figure 4B shows an exemplary embodiment of a method for evaluating optical sensor signals which is a further development of the embodiments shown above. The machine learning algorithm is an artificial neural network 71 comprising an input layer 81 receiving on an input side the at least three digital sensor signals SD1 to SD8. In an example, the input layer 81 receives on the input side the N digital sensor signals SD1 to SDN and the M additional signals SG1 to SGM. The M additional signals SG1 to SGM comprise e.g. a phone display frame buffer information, a phone display brightness information and/or other relevant information. A hidden layer 91 of the ANN 71 receives on an input side signals of the input layer 81. An output layer 111 receives on an input side directly or indirectly signals of the hidden layer 91. An activation function 121 receives on an input side signals of the output layer 111. At least one summation function 130 of the ANN 71 receives on an input side signals provided by the activation function 121 and provides the first ambient light signal AL1 or the first and the second ambient light signal AL1, AL2. The activation function 121 is one of rectified linear unit, logistic and hyperbolic tangent.
In the architecture overview shown in Figure 4B, an ambient light detection architecture uses an ANN 71 as the machine learning algorithm. The input layer 81 comprises a number M1 of neurons 82 to 88 to which the N digital sensor signals SD1 to SDN and the M additional signals SG1 to SGM are provided (thus, M1 = N + M). Each of the number M1 of neurons 82 to 88 of the input layer 81 comprises a bias bi. The number M1 of neurons 82 to 88 of the input layer 81 are connected to a number M2 of neurons 92 to 101 of the hidden layer 91 via weights Wij.
Each of the number M2 of neurons 92 to 101 of the hidden layer 91 comprises a bias bj. Each of the number M2 of neurons 92 to 101 of the hidden layer 91 is coupled to a number M3 of neurons 112 to 116 of the output layer 111 via weights Wjk.
Each of the number M3 of neurons 112 to 116 of the output layer 111 is coupled via the activation function 121 to the at least one summation function 130. The activation function 121 comprises a number M3 of functions 122 to 127. Thus, each neuron is coupled via one function to the at least one summation function 130. A first summation function of the at least one summation function 130 generates the first ambient light signal AL1. Optionally, a second summation function of the at least one summation function 130 generates the second ambient light signal AL2. The first and the second ambient light signal AL1, AL2 are e.g. the illuminance and the CCT of the ambient light IL.
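One forward pass through the structure of Figure 4B can be sketched numerically as follows; the layer sizes and all parameter values are invented, and trained values would result from the procedure of Figures 3C and 3D.

import numpy as np

def relu(z):
    """Rectified linear unit, one possible activation function 121."""
    return np.maximum(0.0, z)

M1, M2, M3 = 17, 10, 2       # assumed sizes of input layer, hidden layer, output layer
rng = np.random.default_rng(0)

x    = rng.random(M1)        # SD1..SDN and SG1..SGM
W_ij = rng.random((M2, M1))  # weights Wij, input layer 81 -> hidden layer 91
b_j  = rng.random(M2)        # biases bj of the hidden-layer neurons
W_jk = rng.random((M3, M2))  # weights Wjk, hidden layer 91 -> output layer 111
b_k  = rng.random(M3)        # biases of the output-layer neurons

# Literal reading of the description: the activation 121 follows the output
# layer 111 (a common variant also applies an activation inside the hidden layer).
hidden    = W_ij @ x + b_j       # hidden layer 91
out       = W_jk @ hidden + b_k  # output layer 111
activated = relu(out)            # activation function 121 (functions 122 to 127)
AL1 = float(np.sum(activated))   # summation function 130, first ambient light signal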
Multiple machine learning algorithms such as linear regression 70, random forest 75, gradient boosting (gradient-boosted decision trees 74) and ANN 71 have been studied. The ANN 71 offers the best performance for the BOLED task. However, other machine learning algorithms could also be used and could provide other advantages such as lower processing power requirements. The performance was evaluated using simulation data as well as real data. However, the results shown in Figures 5A to 5C are focused on using the real data.
Two separate models are trained, one for predicting the illuminance (measured in LUX) and another for predicting the CCT. In an example, the hidden layer 91, the output layer 111, the activation function 121 and the summation function 130 are different for generating the first ambient light signal AL1 and for generating the second ambient light signal AL2. The values for the bias and the weights are different.
The two trained models 41 may or may not use the same machine learning algorithm 70 to 79.
The training process typically takes 20 min to 30 min on a laptop PC and uses a large set of data (for example, more than 200000 sets used so far). Once trained, running predictions from a single set of input data is reasonably fast even on a cell phone or whatever target device is used.
An advantage of the machine learning algorithm for BOLED ambient light detection is that it does not need a hand wave or any other temporal learning event. The algorithm is capable of determining the illuminance value (measured in LUX) and the CCT value based on the pre-trained model 41 or pre-trained models if the current ambient condition as well as the display conditions were included in the training set. It is capable of providing predictions under static or changing displays.
A possible disadvantage of using a machine learning algorithm for BOLED ambient light detection is that, if a new ambient light and/or new display conditions were not included in the training set, the model may completely fail to predict the illuminance and CCT values correctly for the new ambient light.
In an alternative embodiment, not shown, the ANN 71 comprises one or more than one further hidden layers which are arranged between the hidden layer 91 and the output layer 111.
Figure 4C shows a further exemplary embodiment of a method for evaluating optical sensor signals which is a further development of the embodiments shown above. The machine learning algorithm is linear regression 70. The first ambient light signal AL1 is calculated according to the equation:

AL1 = a0 + a1·x1 + a2·x2 + ... + a(N+M)·x(N+M),

wherein a0, a1, a2, ..., a(N+M) are coefficients determined by linear regression and x1, x2, ..., x(N+M) are the N digital sensor signals SD1 to SDN and the M additional signals SG1 to SGM. The number N > 2. The number M is e.g. 0, 1, 2, 3 or more than 0 or more than 1 or more than 2. The second ambient light signal AL2 can be calculated with the same equation using other values for the coefficients a0, a1, a2, ..., a(N+M).
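Determining such coefficients can be sketched with ordinary least squares; the random arrays below stand in for the collected signals and the reference illuminance values.

import numpy as np

rng = np.random.default_rng(0)
x_signals = rng.random((500, 17))    # rows: signals x1 ... x(N+M) per time step
sref1     = rng.random(500) * 800.0  # reference illuminance values (LUX)

# Prepend a column of ones so that the first fitted coefficient becomes a0.
X = np.hstack([np.ones((x_signals.shape[0], 1)), x_signals])
coeffs, *_ = np.linalg.lstsq(X, sref1, rcond=None)
a0, a_rest = coeffs[0], coeffs[1:]

AL1 = a0 + x_signals @ a_rest        # predicted first ambient light signal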
In the architecture overview of the ambient light detection architecture, linear regression 70 is implemented as the machine learning algorithm. The linear regression 70 operates with multiple variables. In Figure 4C, a linear regression is illustrated with a single variable x and with a single result AL1.
Figure 4D shows an additional exemplary embodiment of a method for evaluating optical sensor signals which is a further development of the embodiments shown above. In the architecture overview of the ambient light detection architecture, random forest 75 is implemented as the machine learning algorithm. The random forest 75 comprises several decision trees 141 to 143. For example, the decision trees 141 to 143 are implemented as shallow decision trees. For example, the random forest 75 comprises a decision tree for each of the N digital sensor signals SD1 to SDN and M additional signals SG1 to SGM. The random forest 75 comprises a combining function 144 which receives the outputs of the decision trees 141 to 143. The combining function 144 provides e.g. an average of the outputs of the decision trees 141 to 143, a sum of the outputs of the decision trees 141 to 143 or a weighted sum of the outputs of the decision trees 141 to 143 (similar to the equation for linear regression described above, but with the outputs of the decision trees 141 to 143 as signals or variables). Thus, the first ambient light signal AL1 is a function of an average of all outputs of the decision trees 141 to 143. The second ambient light signal AL2 can be calculated with a similar random forest 75.
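A sketch of this variant with scikit-learn follows; the library is an implementation choice, and its internal averaging of the tree outputs corresponds to the combining function 144.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
x_signals = rng.random((500, 17))    # stand-in for SD1..SDN and SG1..SGM
sref1     = rng.random(500) * 800.0  # stand-in reference illuminance (LUX)

# Several shallow decision trees; the average of their outputs yields AL1.
forest = RandomForestRegressor(n_estimators=50, max_depth=4, random_state=0)
forest.fit(x_signals, sref1)
AL1 = forest.predict(x_signals[:1])[0]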
Figure 5A shows an exemplary embodiment of test results of a method for evaluating optical sensor signals which is realized according to an embodiment shown above. An evaluation process includes a training data description. The test setup 50 shown in Figure 3A was used. The two multispectral sensors 20 and 51 were placed behind an OLED phone screen and in open air close to one another for data collection. For data preparation, behind-OLED data and open-air data were collected simultaneously on a single Raspberry Pi. Using a post-processing tool, the two sets of data were synchronized and combined into a single data file. The LUX and CCT values produced by the test multispectral sensor 51 for each data point are used as reference signals SREF1, SREF2 in the training of the machine learning algorithms. The data files were collected during motion or walk-around, or collected in a laboratory (with LUX values in the following range: 200, 250, 300, 350, 400, 450, 500, 550, 600, 650, 700, 750, 800), or collected in an office and a conference room with ceiling lights and some sunlight through a window.
The results shown in Figures 5A to 5C were achieved by multilayer perceptron 72 (MLP) as the machine learning model. An MLP 72 is an embodiment of an ANN 71. The MLP 72 is realized as a fully connected feedforward ANN 71. The activation function used is the rectified linear unit (abbreviated ReLU). Other activation functions such as logistic (sigmoid) and hyperbolic tangent (tanh) were also studied (the results are not illustrated here). All data listed above total 234701 vectors, with 75% randomly selected for training the model. The further data were used for testing the trained model 41.
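This setup could be reproduced e.g. as follows; scikit-learn, the hidden-layer size and the stand-in data are assumptions, since only the network type, the ReLU activation and the 75% split are stated above.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
features = rng.random((2000, 17))    # stand-in for the collected vectors DV'
lux      = rng.random(2000) * 800.0  # stand-in reference illuminance SREF1

x_tr, x_te, y_tr, y_te = train_test_split(features, lux, train_size=0.75,
                                          random_state=0)

# Fully connected feedforward network with ReLU activation, as described above.
mlp = MLPRegressor(hidden_layer_sizes=(64,), activation="relu",
                   max_iter=500, random_state=0)
mlp.fit(x_tr, y_tr)
print("test score:", mlp.score(x_te, y_te))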
In Figure 5A, results are shown when the trained model 41 is tested on the entire set of data. MAPE LUX is the abbreviation for the mean absolute percent error for the illuminance (measured in LUX); MAPE CCT is the abbreviation for the mean absolute percent error for the CCT. The error is smaller for values of the illuminance above 200 LUX.
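The mean absolute percent error used for both figures of merit can be written as a small helper; this is the standard definition and not specific to this description.

import numpy as np

def mape(predicted: np.ndarray, reference: np.ndarray) -> float:
    """Mean absolute percent error, as used for MAPE LUX and MAPE CCT."""
    return float(np.mean(np.abs((predicted - reference) / reference)) * 100.0)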
Figure 5B shows a further exemplary embodiment of test results of a method for evaluating optical sensor signals which is realized according to the embodiment described above. The first ambient light signal AL1 and the first reference signal SREF1 (which both refer to the illuminance in LUX) are shown versus a number NU of the data vector DV'. The data were collected during a walk-around in an office, wherein the screen 12 is active.
Figure 5C shows an additional exemplary embodiment of test results of a method for evaluating optical sensor signals which is realized according to the embodiment described above. The first ambient light signal AL1 and the first reference signal SREF1 (which both refer to the illuminance in LUX) are shown versus a number NU of the data vector DV'. The data were collected in a laboratory, wherein the ambient light IL has been changed at the points of time indicated by arrows. The reference signal SREF1 and the ambient light signal AL1 have a good match in the experiments shown in Figures 5B and 5C.

The invention is not limited to the description of the embodiments. Rather, the invention comprises each new feature as well as each combination of features, particularly each combination of features of the claims, even if the feature or the combination of features itself is not explicitly given in the claims or embodiments.
Reference numerals
10 optical sensor device
12 display
13 glass
14 input
20 multispectral sensor
21 light sensor
22 semiconductor substrate
23 sensing window
24 multiplexer
25 analog circuit
26 analog-to-digital converter
27 digital core
28 register
29 interface
40 computer program product
41 trained model
44 processor
45 smart sensor
46 host processor
47 further semiconductor substrate
50 test setup
51 test multispectral sensor
53 sensing window
54 computer
55 software
56 parameter memory
61 machine learning algorithm
62 error calculation process
63 comparison process
70 to 79 algorithm
81 input layer
82 to 88 neuron
91 hidden layer
92 to 101 neuron
111 output layer
112 to 116 neuron
121 activation function
122 to 127 function
130 summation function
141 to 143 decision tree
144 combining function
AL1 , AL2 ambient light signal
DV vector
ER error
IL ambient light
NU number
R1-R5 responsivity
SCL clock signal
SDA data signal
SD, SD1-SD9 digital sensor signal
SD1 ' -SD9 ' further digital sensor signal
SG, SG1-SG9 additional signal
SO1-SO9 optical sensor signal
SREF1 , SREF2 reference signal
VDD supply voltage
VSS ground potential
λ wavelength

Claims

We claim:
1. A method for evaluating optical sensor signals, wherein the method comprises: generating at least three optical sensor signals by a multispectral sensor, digitalizing the at least three optical sensor signals into at least three digital sensor signals, and providing a first ambient light signal by a trained model as a function of the at least three digital sensor signals.
2. The method of claim 1, wherein the first ambient light signal is at least one of an illuminance value, a correlated color temperature value, chromaticity values, tristimulus values and a flicker value.
3. The method of claim 1, wherein the trained model is trained by a machine learning algorithm.
4. The method of claim 3, wherein the machine learning algorithm is one of a group consisting of linear regression, artificial neural network, multilayer perceptron, decision trees, gradient-boosted decision trees, random forest, K nearest neighbors, clustering, K-means clustering and principal components analysis.
5. The method of claim 3, wherein the machine learning algorithm is an artificial neural network comprising: an input layer receiving on an input side the at least three digital sensor signals, a hidden layer receiving on an input side signals of the input layer, an output layer receiving on an input side directly or indirectly signals of the hidden layer, and at least one summation function receiving on an input side directly or indirectly signals of the output layer and providing the first ambient light signal.
6. The method of claim 5, wherein the artificial neural network comprises an activation function receiving on an input side signals of the output layer, and providing signals to at least one summation function.
7. The method of claim 6, wherein the activation function is one of rectified linear unit, logistic and hyperbolic tangent.
8. The method of claim 3, wherein the machine learning algorithm is linear regression, and wherein the first ambient light signal AL1 is calculated according to the equation:

AL1 = a0 + a1·x1 + a2·x2 + ... + a(N+M)·x(N+M),

wherein a0, a1, a2, ..., a(N+M) are coefficients determined by linear regression and x1, x2, ..., x(N+M) are signals.
9. The method of claim 3, wherein the first ambient light signal is provided by the trained model as a function of the at least three digital sensor signals and at least one additional signal, and wherein the at least one additional signal is a signal of a group consisting of display buffer data R averaged information, display buffer data G averaged information, display buffer data B averaged information, display frame buffer information, display brightness information, display buffer information, display refresh rate information, display frame synchronization information, and display pulse-width-modulation information.
10. An optical sensor device, comprising: a multispectral sensor, and a processor which is configured to execute the method for evaluating an optical sensor signal of claim 1.
11. The optical sensor device of claim 10, further comprising a smart sensor, wherein the smart sensor comprises the multispectral sensor and the processor, and wherein the processor is configured to provide the first ambient light signal.
12. The optical sensor device of claim 10, wherein the processor is realized as a host processor and is configured to generate the first ambient light signal, and wherein the multispectral sensor and the host processor are realized on two separate semiconductor bodies.
13. The optical sensor device of claim 10, further comprising a display, wherein the multispectral sensor is located such that the display is in an optical path between an ambient and the multispectral sensor.
14. The optical sensor device of claim 10, wherein an optical path between an ambient and the multispectral sensor is free of a display.
15. The optical sensor device of claim 10, further comprising a glass or a polymer sheet which is colorless or colored, wherein the multispectral sensor is located such that the glass or the polymer sheet is in an optical path between an ambient and the multispectral sensor.
16. The optical sensor device of claim 10, wherein the optical sensor device is realized as a device of a group consisting of a cell phone, tablet, television set, laptop and computer with computer display.
17. A computer program product comprising instructions to cause the optical sensor device of claim 10 to execute the method for evaluating an optical sensor signal of claim 1.