LU101028B1 - Method for Online Training of an Artificial Intelligence (AI) Sensor System - Google Patents


Info

Publication number
LU101028B1
Authority
LU
Luxembourg
Prior art keywords
neural network
quality
confidence level
artificial neural
sensor system
Prior art date
Application number
LU101028A
Other languages
German (de)
Inventor
Hans Peter Beise
Udo Schröder
Da Cruz Steve Dias
Original Assignee
Iee Sa
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Iee Sa filed Critical Iee Sa
Priority to LU101028A priority Critical patent/LU101028B1/en
Priority to PCT/EP2019/083536 priority patent/WO2020115066A1/en
Application granted granted Critical
Publication of LU101028B1 publication Critical patent/LU101028B1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/217 Validation; Performance evaluation; Active pattern learning techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133 Distances to prototypes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/98 Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/993 Evaluation of the quality of the acquired pattern
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

A method of operating an artificial intelligence sensor system (10) for supervised training purposes is presented, the artificial intelligence sensor system (10) having one or more sensors (12, 14) and at least one artificial neural network (18) that is configured for receiving and processing signals (x_A1, x_A2, x_B1) from the sensor or sensors (12, 14). The at least one artificial neural network (18) derives an output representing a quality with a confidence level regarding the provided signals (x_A1, x_A2, x_B1). If the derived confidence level of the quality is lower than a predetermined confidence level, the at least one provided signal (x_A1) and the derived quality are temporarily stored (60). By use of at least one independent sensor signal (x_A2, x_B1), the quality having a derived confidence level lower than the predetermined confidence level is confirmed (62, 66), and the temporarily stored signal or signals (x_A1) and the confirmed quality are permanently stored (70) as labeled online training data, using the derived quality as the label.

Description

Method for Online Training of an Artificial Intelligence (AI) Sensor System Technical field
[0001] The invention relates to a method of operating an artificial intelligence sensor system for supervised training purposes and an artificial intelligence sensor system configured for executing such method. Background of the Invention
[0002] Systems based on sensor input are subject to fast-growing demands in various fields. In the automotive field, for instance, they constitute the backbone of almost all Advanced Driver-Assistance Systems (ADAS): these monitor the exterior environment or the interior of a vehicle and its occupants to provide improved safety, either by facilitating an optimized reaction of the driver with appropriate warnings or by automatically taking over control of the vehicle, for instance in collision avoidance systems.
[0003] In this function, such systems are requested to perform tasks of increasing complexity. For example, they should be capable of anticipating potential risks that might occur in complex traffic scenarios within the next few seconds. In conventional ADAS, an electronic processing unit such as a central processing unit (CPU) is usually employed for executing the program code of a software module that has been manually designed for controlling an automatic execution of a monitoring method.
[0004] By way of example, patent application publication US 2014/0139670 A1 describes a system and method directed to augmenting advanced driver assistance systems (ADAS) features of a vehicle with image processing support in an on-board vehicle platform. Images may be received from one or more image sensors associated with an ADAS of a vehicle. The received images may be processed. An action is determined based upon, at least in part, the processed images. A message is transmitted to an ADAS controller responsive to the determination. To that end, the vehicle may include one or more processor units, networking interfaces, and other computing devices that may enable it to capture image data, process the image data, and augment ADAS features of the vehicle with image processing support in the on-board vehicle platform. A computing system may include single-feature fixed-function devices such as an ADAS image system on chip (SoC).
[0005] The complexity of tasks to be performed by such ADAS tends to grow more and more, as it does in other technical fields such as, for instance, medical diagnostic appliances, smartphone technology and drone technology.
[0006] In such complex sensor-based systems it has been proposed to exploit the capabilities of artificial intelligence (AI) systems and artificial neural networks, respectively. In contrast to conventional processing units, artificial neural networks provide the possibility of learning.
[0007] Artificial neural networks are known to comprise a plurality of interconnected artificial neurons and to have an input side and an output side. As is well known in the field of artificial neural networks, each artificial neuron of the plurality of interconnected artificial neurons (also called nodes) can transmit a signal to another artificial neuron connected to it, and the received signal can further be processed and transmitted to the next artificial neuron. The output of each artificial neuron may be calculated using a non-linear function of the sum of its inputs. In a learning process of an artificial neural network, the weights of the non-linear function are usually adjusted. A complex task may be learned by determining a set of weights for the artificial neurons such that the output signal of the artificial neural network is close to a desired output signal, which is performed when the artificial neural network is trained.
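The per-neuron computation described in paragraph [0007] can be sketched as follows; this is a minimal illustration, and the choice of a sigmoid activation is an assumption for the example only, since the patent does not fix a particular non-linear function:

```python
import math

def neuron_output(inputs, weights, bias):
    """Output of a single artificial neuron: a non-linear function
    (here a sigmoid, chosen only for illustration) applied to the
    weighted sum of the neuron's inputs plus a bias."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))
```

Training then consists of adjusting `weights` and `bias` across all neurons so that the network output approaches the desired output.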
[0008] Multiple methods for training an artificial neural network are known in the art. For instance, in supervised learning a function is learned that maps an input to an output based on exemplary input-output pairs. An artificial neural network that has been submitted to a learning scheme is often called a “trained” artificial neural network.
[0009] Reliability and performance of AI systems including an artificial neural network improve with the quantity and quality of the training data. Typical AI systems using neural networks require a vast amount of data. Therefore, acquisition of training data constitutes a major challenge in the creation of such systems.
Object of the invention
[0010] It is therefore an object of the invention to provide a method for training a single- or multiple-sensor-based artificial intelligence (AI) system including at least one artificial neural network, wherein the training method ensures a specified reliability and a specified performance of the AI system and is efficient in terms of computational effort and digital data memory hardware cost.
General Description of the Invention
[0011] For the purpose of training a single- or multiple-sensor-based AI system including at least one artificial neural network, it is virtually impossible to produce a data set that covers everything the system will have to process in the field.
[0012] Within the scope of the invention, it is therefore proposed to (re-)train AI systems in an online manner during their lifetime so as to improve the AI system performance and reliability over time.
[0013] The invention addresses and overcomes at least the following obstacles of online training:
- Supervised training, for which labeled training data are needed, is at present considered the most efficient training method. Hence, the question arises of how to automatically label data received during the AI system lifetime.
- The outcome of an online training method is in principle not predictable. Therefore, there has to be a control mechanism for ensuring a specified reliability and a specified performance of the AI system.
- (Online) training of an AI system including an artificial neural network requires high computational effort and expense, i.e. it requires expensive devices exhibiting high computational capabilities.
[0014] In one aspect of the present invention, the object is achieved by a method of operating an artificial intelligence sensor system for supervised training purposes. The artificial intelligence sensor system includes one or more sensors and at least one classifier or artificial neural network that is configured for receiving and processing signals from the sensor or the sensors. It should be noted that the present invention is not limited to AI or machine learning using artificial neural networks. The skilled person will recognize that the invention also relates to other classifier techniques using an algorithm that gives a reliable classification/decision, e.g. with a high confidence level.
[0015] The method comprises at least the following steps, which are to be executed iteratively:
- providing signals from the sensor or the sensors as input data to the at least one classifier or artificial neural network,
- operating the at least one classifier or artificial neural network to derive, e.g. based on labeled training data resident within the at least one artificial neural network, an output representing a quality with a confidence level regarding the provided signals,
- if the derived confidence level of the quality is equal to or larger than a predetermined confidence level, permanently storing at least a portion of the provided signal and the derived quality as labeled online training data, using the derived quality as the label,
- if the derived confidence level of the quality is lower than the predetermined confidence level, temporarily storing the at least one provided signal and the derived quality,
- confirming the quality having a derived confidence level lower than the predetermined confidence level by use of at least one independent sensor signal, and
- after completion of the step of confirming, permanently storing at least a portion of the temporarily stored signal or signals and the derived quality as labeled online training data, using the derived quality as the label.
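One iteration of the steps of paragraph [0015] can be sketched as follows. This is a minimal illustration, not the claimed implementation: `classify`, `confirm`, and the threshold value 0.9 (standing in for the "predetermined confidence level") are assumptions introduced for the example.

```python
# Permanently stored labeled online training data; the derived quality is the label.
permanent_store = []

def process_signal(signal, classify, confirm, threshold=0.9):
    """One iteration of the claimed method (illustrative sketch).

    classify(signal) -> (quality, confidence) stands in for the classifier /
    artificial neural network; confirm(quality) stands in for the check
    against at least one independent sensor signal."""
    quality, confidence = classify(signal)
    if confidence >= threshold:
        # Confidence sufficient: store directly as labeled online training data.
        permanent_store.append((signal, quality))
        return
    # Confidence too low: hold the sample back (temporary storage) ...
    temporarily_stored = (signal, quality)
    # ... and store it permanently only once an independent signal confirms it.
    if confirm(quality):
        permanent_store.append(temporarily_stored)
```

A sample thus only ever enters the permanent training store with a quality label that was either derived with sufficient confidence or confirmed independently.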
[0016] The phrase "being configured to", as used in this application, shall in particular be understood as being specifically programmed, laid out, furnished or arranged. The term "quality", as used in this application, shall particularly encompass, without being limited to, abstract objects such as classes as used for classification purposes, such as "pedestrian", "vehicle", "cyclist", and so forth, as well as properties of objects, such as color and/or size.
[0017] By using an independent sensor signal for confirming a quality that has been derived by the at least one classifier or artificial neural network with a confidence level lower than the predetermined confidence level, labeled training data can readily be provided in a sufficient number for the purpose of supervised training sessions. It should be noted that after the completion of the step of confirming, at least a portion of the provided signal and the derived quality derived from the independent sensor signal may also be stored as labeled online training data, using the derived quality as the label, as the confidence level of this quality is also equal to or larger than the predetermined confidence level.
[0018] The invention is particularly beneficially employable in, without being limited to, automotive applications, smartphone technology and drone technology, but may as well be used in any other technical field in which complex sensor-based systems including a suitable classifier or an artificial neural network are used. The term "automotive", as used in this patent application, shall particularly be understood as being suitable for use in vehicles including passenger cars, trucks, semi-trailer trucks and buses.
[0019] In preferred embodiments of the method, the step of confirming the quality by use of at least one independent sensor signal includes using a signal of the same sensor that is provided within a predetermined time period after the deriving of the quality having a derived confidence level lower than the predetermined confidence level, and from which the at least one classifier or artificial neural network device derives an output representing a quality with a confidence level that is equal to or larger than the predetermined confidence level.
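The retrospective confirmation of paragraph [0019] might be sketched as follows; the buffer layout, the threshold of 0.9, and the time window `max_age` are illustrative assumptions, not values from the patent:

```python
# Buffer of (timestamp, signal, quality) triples whose confidence was too low.
low_conf_buffer = []

def confirm_retrospectively(t_now, quality_now, confidence_now,
                            threshold=0.9, max_age=2.0):
    """If the same sensor now yields `quality_now` with sufficient confidence,
    release buffered samples of the same quality that fall within the
    predetermined time period (`max_age` seconds), labeled with that quality."""
    confirmed = []
    if confidence_now >= threshold:
        remaining = []
        for (t, signal, quality) in low_conf_buffer:
            if quality == quality_now and t_now - t <= max_age:
                confirmed.append((signal, quality))  # quality becomes the label
            else:
                remaining.append((t, signal, quality))
        low_conf_buffer[:] = remaining
    return confirmed
```

Samples that are never confirmed within the time window simply remain unlabeled and can be discarded.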
[0020] By confirming and labeling acquired and temporarily stored sensor signals in retrospect, confirmed and labeled training data can beneficially and readily be provided for the purpose of supervised training sessions.
[0021] In preferred embodiments of the method, the step of confirming the quality by use of at least one independent sensor signal includes using a signal of another sensor from which the at least one classifier or artificial neural network device derives an output representing a quality with a confidence level that is equal to or larger than the predetermined confidence level.
[0022] By making use of independent and complementary information obtained from another sensor, confirmed and labeled training data can beneficially and readily be provided for the purpose of supervised training sessions.
[0023] The sensor that provided the signals from which the at least one classifier or artificial neural network derived the quality with a confidence level lower than the predetermined confidence level and the sensor that provided the at least one independent sensor signal, from which the at least one classifier or artificial neural network device derives an output representing a quality with a confidence level equal to or larger than the predetermined confidence level, may be sensors of the same type, i.e. sensors that are based on the same working principle. In other embodiments, the two sensors may be based on different working principles.
[0024] In preferred embodiments, the method further comprises a preceding step of providing the at least one classifier or artificial neural network in an offline mode with initial training results. In this way, an online training of the at least one classifier or artificial neural network by using the permanently stored labeled online training data can start from a higher level, and a larger training effect can be achieved in a shorter time period.
[0025] In preferred embodiments, the method further comprises a step of executing an online supervised training phase at least once in a predetermined time period by using at least the permanently stored labeled online training data. In this way, the at least one classifier or artificial neural network can be trained in a quasi-continuous manner, and the option of virtually continuously improving reliability and performance of the artificial intelligence sensor system can be provided.
[0026] Preferably, in cases in which the at least one artificial neural network includes a plurality of layers between an input layer and an output layer, the step of executing an online supervised training phase includes training only preselected layers out of the plurality of layers. In this way, computational costs and hardware requirements and costs for digital data memory units can be reduced. Further, the set of data required for the online supervised training, given by the permanently stored labeled online training data, can be stored in a compressed manner and hence requires less memory space. Examples of artificial neural networks including a plurality of layers between an input layer and an output layer are deep neural networks (DNNs), in particular recurrent neural networks (RNNs) and convolutional deep neural networks (CNNs).
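Training only preselected layers can be sketched as follows. This is a deliberately simplified illustration under assumed names: each "layer" is reduced to a single scalar weight, and the layer names and learning rate are invented for the example.

```python
def train_step(layers, gradients, learning_rate=0.1,
               trainable=frozenset({"output_head"})):
    """Apply one gradient step only to the preselected (trainable) layers;
    all other layers keep their weights unchanged, which saves compute
    during the online supervised training phase.

    `layers` and `gradients` map layer names to scalar 'weights' and
    gradients; real networks would hold weight tensors instead."""
    return {
        name: weight - learning_rate * gradients[name] if name in trainable
        else weight
        for name, weight in layers.items()
    }
```

Frozen layers also need no stored optimizer state or gradients, which is one way the training data and memory footprint shrink.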
[0027] In preferred embodiments, the method further comprises a step of performance control by validating the current online training status, using a validation sensor data set with assigned correct labels that resides within the classifier or artificial neural network. In this way, reliability and performance of the at least one artificial neural network that is being modified during online supervised training phases can be ensured.
[0028] Preferably, the step of performance control comprises the steps of:
- operating the at least one classifier or artificial neural network to derive, e.g. based on labeled training data that is resident within the at least one artificial neural network at present, outputs with confidence levels representing qualities for the provided validation sensor data set with assigned correct labels,
- comparing the derived outputs with the assigned correct labels,
- calculating a current performance figure that represents the result of the step of comparing,
- comparing the current performance figure with a predetermined performance figure,
- accepting adaptations made in the at least one classifier or artificial neural network since executing the latest online supervised training phase if the current performance figure is equal to or exceeds the predetermined performance figure, and
- rejecting adaptations made in the at least one classifier or artificial neural network since executing the latest online supervised training phase if the current performance figure is lower than the predetermined performance figure.
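The performance-control steps of paragraph [0028] reduce to an accept/reject decision that can be sketched as follows; using plain accuracy as the "performance figure" is an assumption for the example, as the patent does not prescribe a particular figure:

```python
def performance_control(network, validation_set, predetermined_figure):
    """Validate the adapted network on a validation sensor data set with
    assigned correct labels. Returns True (accept the adaptations made since
    the latest online supervised training phase) if the current performance
    figure, here simple accuracy, reaches the predetermined figure, and
    False (reject / roll back the adaptations) otherwise."""
    correct = sum(1 for sample, label in validation_set
                  if network(sample) == label)
    current_figure = correct / len(validation_set)
    return current_figure >= predetermined_figure
```

On a False result the system would restore the network state from before the latest online training phase, so performance can never silently degrade below the specified level.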
[0029] In this way it can be ensured that only such adaptations within the at least one classifier or artificial neural network are implemented that improve the performance of the artificial intelligence sensor system, and adaptations that may diminish the performance of the artificial intelligence sensor system are rejected.
[0030] In another aspect of the invention an artificial intelligence sensor system is provided. The artificial intelligence sensor system includes at least one classifier or artificial neural network with an input side and an output side, and at least one sensor system that is operatively connected to the input side of the classifier or artificial neural network. Each sensor system comprises at least one sensor.
[0031] The classifier or artificial neural network is configured to provide, at the output side, an output representing a quality with a confidence level with regard to at least one object to be monitored or surveilled by applying at least one trained task to signals that have been received from the at least one sensor system.
[0032] The at least one task is trained by executing a method as disclosed herein, depending on a total number of sensors of the artificial intelligence sensor system.
[0033] The benefits described in context with the disclosed method of operating an artificial intelligence sensor system for supervised training purposes apply to the artificial intelligence sensor system to the full extent.
[0034] Preferably, the at least one sensor system comprises at least one out of an optical camera, a RADAR sensor system, a LIDAR (light detection and ranging) device and an acoustics-based sensor, such as an ultrasonic sensor. In this way, sensor signals can be provided that allow detection of qualities representing characteristic features of an object to be monitored or surveilled in a variety of ways, wherein appropriate sensors can be chosen depending on the specific application.
[0035] Preferably, the at least one artificial neural network comprises at least one deep neural network. As explained before, this can provide perspectives of substantial savings of hardware costs such as for digital data memory units, and of computational effort.
[0036] In another aspect of the invention, it is proposed to use the disclosed artificial intelligence sensor system, including at least one out of an optical camera, a RADAR sensor system, a LIDAR device and an acoustics-based sensor, e.g. an ultrasonic sensor, as an automotive vehicle exterior sensing system. In this way, an improved monitoring and surveying of other traffic participants can be accomplished.
[0037] The RADAR sensor system may be configured to be operated at a RADAR carrier frequency that lies in a frequency range between 20 GHz and 140 GHz.
[0038] In yet another aspect of the invention, it is proposed to use the disclosed artificial intelligence sensor system, including at least one out of an optical camera and a RADAR sensor system, as an automotive interior sensing system. Such automotive interior sensing systems can beneficially be employed for, without being limited to, a detection of left-behind pets and/or children, vital sign monitoring, vehicle seat occupancy detection for seat belt reminder (SBR) systems, and anti-theft alarm.
[0039] Here too, the RADAR sensor system may be configured to be operated at a RADAR carrier frequency that lies in a frequency range between 20 GHz and 140 GHz.
[0040] These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
[0041] It shall be pointed out that the features and measures detailed individually in the preceding description can be combined with one another in any technically meaningful manner and show further embodiments of the invention. The description characterizes and specifies the invention in particular in connection with the figures.
Brief Description of the Drawings
[0042] Further details and advantages of the present invention will be apparent from the following detailed description of non-limiting embodiments with reference to the attached drawings, wherein:
Fig. 1 schematically illustrates an artificial intelligence sensor system in accordance with the invention installed in a vehicle, in a detail side view,
Fig. 2 illustrates a schematic diagram of the artificial intelligence sensor system pursuant to Fig. 1,
Fig. 3 is a flow chart of a method in accordance with the invention of operating the sensor-based monitoring system pursuant to Fig. 1 for supervised online training purposes, and
Fig. 4 schematically illustrates a general structure of artificial neural modules of the artificial neural network of the sensor-based monitoring system pursuant to Fig. 1.
Description of Preferred Embodiments
[0043] Fig. 1 schematically illustrates an artificial intelligence sensor system 10 in accordance with the invention installed in a vehicle 44, which is designed as a sedan passenger car, in a detail side view. The artificial intelligence sensor system 10 is configured to be used as an automotive vehicle interior sensing system for a detection of, for instance but not limited to, left-behind pets and/or children, vital sign monitoring, vehicle seat occupancy detection for seat belt reminder (SBR) systems, and/or anti-theft alarm. To that end, the artificial intelligence sensor system includes two sensor systems 12, 14.
[0044] One of the two sensor systems 12, 14 is formed by an optical camera 12 that may be fixedly or movably connected to a chassis of the vehicle 44, or it may be integrated in the vehicle dashboard. The other one of the two sensor systems 12, 14 is designed as a RADAR sensor system 14 that is configured for monitoring vital signs of vehicle occupants 48, which may encompass at least one out of heartbeat and breathing. The RADAR sensor system 14 comprises a plurality of RADAR sensor devices formed as RADAR transceivers 16 that are attached in a front region of the car roof interior. In this specific embodiment, the RADAR sensor system 14 is formed as a phase-modulated continuous wave (PMCW) RADAR system configured to be operated at a RADAR carrier frequency that lies in a frequency range between 20 GHz and 140 GHz, for example at a RADAR carrier frequency of 79 GHz.
[0045] Although in this specific embodiment the artificial intelligence sensor system 10 is configured for use as an automotive vehicle interior sensing system, the artificial intelligence sensor system may also be used in automotive vehicle exterior sensing systems. In this case, the artificial intelligence sensor system may include at least one sensor system that comprises at least one out of an optical camera and a RADAR sensor system that are arranged in the vehicle to be directed towards an oncoming traffic.
[0046] With reference to Fig. 2 and Fig. 4, the artificial intelligence sensor system 10 further comprises an artificial neural network 18 that is configured for receiving and processing signals from sensors of the two sensor systems 12, 14. The artificial neural network 18 includes two neural network modules 26, 34 designed as deep neural networks (DNN), each DNN having an input side 28, 36 connected to an input layer 20, an output side 30, 38 connected to an output layer 22 and a plurality of intermediate layers 24 between the input layer 20 and the output layer 22, wherein each one of the layers 20, 22, 24 comprises a plurality of interconnected artificial neurons, as is well known in the art.
[0047] Each one of the two sensor systems 12, 14 is operatively connected to the input side 28, 36 of one of the neural network modules 26, 34. A first one 26 of the two neural network modules 26, 34 is configured to provide at its output side 30 an output representing a quality with a confidence level with regard to at least one object to be monitored or surveilled by applying a trained task to signals x_A that have been received from the optical camera 12. A second one 34 of the two neural network modules 26, 34 is configured to provide at its output side 38 an output representing a quality with a confidence level with regard to at least one object to be monitored or surveilled by applying a trained task to signals x_B that have been received from the RADAR sensor system 14. As will be described later, the task is trained by executing a method in accordance with the invention.
[0048] It is noted herewith that the terms “first”, “second”, etc. are used in this application for distinction purposes only, and are not meant to indicate or anticipate a sequence or a priority in any way.
[0049] The artificial neural network 18 further includes an external memory unit 42. Data are independently exchanged between the two neural network modules 26, 34 and the external memory unit 42 by means of individually assigned, cooperating data management systems 32, 40 (Fig. 3) that form part of the artificial intelligence sensor system 10.
[0050] In the following, an embodiment of a method in accordance with the invention of operating the artificial intelligence sensor system 10 for supervised training purposes will be described with reference to Fig. 3, which schematically illustrates a chronological sequence of steps of the method that are to be executed iteratively. In preparation of operating the artificial intelligence sensor system 10, it shall be understood that all involved units and devices are in an operational state and configured as illustrated in Figs. 1 and 2.
[0051] Furthermore, in a preceding step the artificial neural network 18 has been provided with initial permanently stored training results in an offline mode. The initial permanently stored and labeled training results reside in the external memory unit 42 of the artificial neural network 18.
[0052] In a first step 50 of the method, signals x_A1 from the optical camera 12, usually representing the contents of at least one image, are provided at a time t_1 as input data to the first neural network module 26. Simultaneously or virtually simultaneously, signals x_B1 from the RADAR sensor system 14 are provided as input data to the second neural network module 34 in another step 52.
[0053] In a next step 54, the first neural network module 26 is operated to derive, e.g. based on labeled training data resident within the external memory unit 42 of the artificial neural network 18, an output representing a quality with a confidence level regarding the provided optical camera signals x_A1. For the optical camera 12, the quality is given by classes such as, but not being limited to, "adult", "child", "pet", "empty child restraint system" and "occupied child restraint system". At the same time t_1, in another step 56, the second neural network module 34 is operated to derive, e.g. based on labeled training data resident within the external memory unit 42 of the artificial neural network 18, an output representing a quality with a confidence level regarding the provided RADAR sensor system signals x_B1. For the RADAR sensor system 14, the quality is given by the same classes as for the optical camera 12, but derived e.g. from a breathing amplitude determined by the RADAR sensor system 14.
[0054] For the sake of argumentation it is assumed that the confidence level of the class derived from the RADAR sensor system signals x_B,1 by the second artificial neural module 34 is equal to or larger than a predetermined confidence level. In this case, the provided signal and the derived quality are permanently stored in the external memory unit 42 as labeled online training data, using the derived quality as the label, in another step 58 that is executed by the data management system 40 assigned to the second artificial neural module 34.
[0055] For the sake of argumentation it is further assumed that the confidence level of the class derived from the optical camera signals x_A,1 by the first artificial neural module 26 is lower than a predetermined confidence level. This can for instance be due to bad illumination in the car interior. In this case, the data management system 32 that is assigned to the first artificial neural module 26 may temporarily store the optical camera signals x_A,1 and the derived quality, i.e. class, in the external memory unit 42 in another step 60 of the method.
[0056] In a next step then, the class derived from the optical camera signals x_A,1, having the derived confidence level that is lower than the predetermined confidence level, is confirmed by use of an independent sensor signal.
[0057] In one embodiment, the step 62 of confirming is carried out by use of the signals x_B,1 of the RADAR sensor system 14 as the independent sensor signal, which were taken at the same time t_1, and on the basis of which the second artificial neural module 34 derived a class with a confidence level that is equal to or larger than the predetermined confidence level. In the example, thus, although it cannot be derived from the optical camera signals x_A,1 with a large enough confidence level whether the vehicle seat 46 (Fig. 1) may be occupied by an adult, a child or a pet, the signals x_B,1 from the RADAR sensor system 14 allow for a classification with a sufficiently large confidence level that the vehicle seat 46 is occupied by an adult. After completion of the step 62 of confirming, in which the quality derived from the camera signals x_A,1 and the quality derived from the RADAR sensor system signals x_B,1 are affirmatively checked to be identical, the temporarily stored signals x_A,1 provided by the optical camera 12 at time t_1 and the derived quality are permanently stored in the external memory unit 42 by the assigned data management system 32 as labeled online training data in another step 64, using the derived quality as the label.
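The storage logic of steps 58 to 72 can be sketched as follows, with the data management systems 32/40 reduced to a single minimal class. The threshold value and all names are illustrative assumptions, not part of the described embodiment:

```python
CONF_THRESHOLD = 0.9  # assumed value for the "predetermined confidence level"

class DataManagement:
    """Toy model of the data management systems 32/40."""
    def __init__(self):
        self.permanent = []   # labeled online training data
        self.temporary = []   # samples awaiting confirmation (step 60)

    def ingest(self, signal, quality, confidence):
        """Step 58: store directly if confident; otherwise hold (step 60)."""
        if confidence >= CONF_THRESHOLD:
            self.permanent.append((signal, quality))
        else:
            self.temporary.append((signal, quality))

    def confirm(self, confirmed_quality):
        """Steps 62/64 (or 66-72): an independent sensor signal yielded
        `confirmed_quality` with sufficient confidence. Matching temporary
        samples become labeled training data; non-matching ones are deleted."""
        for signal, quality in self.temporary:
            if quality == confirmed_quality:
                self.permanent.append((signal, quality))
        self.temporary = []
```

A high-confidence RADAR sample is stored immediately, while a low-confidence camera sample is held until an independent signal confirms (step 64) or contradicts (step 72) its class.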
[0058] In another embodiment, the next step 66 of confirming is carried out by use of signals x_A,2 of the optical camera 12 acquired at a later point in time t_2 as the independent sensor signal, on the basis of which the first artificial neural module 26 can derive a class with a confidence level that is equal to or larger than the predetermined confidence level. In the example, thus, although it cannot be derived from the optical camera signals x_A,1 at time t_1 with a large enough confidence level whether the vehicle seat 46 may be occupied by an adult, a child or a pet, the signals x_A,2 from the optical camera 12 taken at time t_2 allow for a classification with a sufficiently large confidence level of the vehicle seat 46 being occupied by an adult. In this case, the data management system 32 assigned to the first artificial neural module 26 retrieves the temporarily stored optical camera signals x_A,1 and the quality derived at time t_1, and compares the quality derived at time t_1 with the quality derived at time t_2 in a step 68 of the method. If the qualities are identical, the data management system 32 assigned to the first artificial neural module 26 permanently stores the temporarily stored optical camera signals x_A,1 and the derived quality in the external memory unit 42 in a next step 70. If the qualities are not identical, the data management system 32 assigned to the first artificial neural module 26 deletes the temporarily stored optical camera signals x_A,1 and the derived quality in an alternative step 72.
[0059] The method further comprises executing an online supervised training phase at least once in a predetermined time period, which in this specific embodiment is one week, by using the permanently stored labeled online training data that are available at that point of time.
[0060] It can in principle not be taken for granted that executing online training phases leads to improved system performance and reliability. Hence, for monitoring and controlling the effect of online training phases, the method comprises a step of performance control by validating a current online training status, using a validation sensor data set with assigned correct labels that resides within the external memory unit 42 of the artificial neural network 18.
[0061] A validation data set X_val with labels Y_val has been installed in the external memory unit 42, and a desired reference performance rev_val has been predefined. After each execution of an online supervised training phase, the artificial neural modules 26, 34 process the data set X_val, and the derived outcome is compared with the correct labels Y_val to calculate a current performance curr_val. The latter is then compared to the reference performance rev_val. Based on the result of the comparison, the artificial intelligence sensor system 10 decides whether to accept or to reject the adaptations performed during the executed online supervised training phase.
[0062] In a potential embodiment, a corresponding pseudo code may look as follows:
[0063] Notation:
(X_val, Y_val) - validation samples
(X_online, Y_online) - online recorded training samples
M_θ - AI module with trainable parameters θ
M_θ(X) - application of the AI module on input data set X
θ_0 - parameters of the module operating in the system
d(M_θ(X), Y) - suitable distance measure to compare the output M_θ(X) with a desired output Y
R(M_θ, X, Y) - online (re-)training function of the AI module with current parameters θ. This function returns the new parameters θ̂ defining the (re-)trained model M_θ̂.
[0064] Pseudo code:

if number of (X_online, Y_online) > retrain threshold
    θ̂ = R(M_θ0, X_online, Y_online)
    if d(M_θ̂(X_val), Y_val) < acceptance threshold
        θ_0 = θ̂
        clear (X_online, Y_online)
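The pseudo code above can be rendered as a runnable Python sketch. The linear stand-in for the AI module, the mean-squared distance measure and both threshold values are placeholder assumptions; the gating logic itself follows the pseudo code:

```python
import numpy as np

RETRAIN_THRESHOLD = 4      # assumed "retrain threshold"
ACCEPTANCE_THRESHOLD = 0.5 # assumed "acceptance threshold"

def module(theta, X):
    """M_theta(X): apply the (toy linear) module to an input data set."""
    return X @ theta

def distance(theta, X, Y):
    """d(M_theta(X), Y): mean squared error as a simple distance measure."""
    return float(np.mean((module(theta, X) - Y) ** 2))

def retrain(X, Y):
    """R(M_theta, X, Y): least-squares refit returning new parameters."""
    theta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return theta_hat

def maybe_retrain(theta0, X_online, Y_online, X_val, Y_val):
    """Accept the retrained parameters only if validation performance passes."""
    if len(X_online) > RETRAIN_THRESHOLD:
        theta_hat = retrain(X_online, Y_online)
        if distance(theta_hat, X_val, Y_val) < ACCEPTANCE_THRESHOLD:
            return theta_hat, True   # theta_0 = theta_hat; clear the buffer
    return theta0, False             # reject the adaptation
```

The key design point is that the operating parameters θ_0 are replaced only after the candidate θ̂ passes the validation check, so a bad retraining run cannot degrade the deployed system.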
[0065] Suitable distance measures to evaluate the performance of the online- trained artificial neural modules 26, 34 could be the following:
1. d(M_θ(X), Y) = Σ_{(x,y) ∈ X×Y} ||M_θ(x) − y||, with ||·|| denoting some mathematical norm such as the Euclidean norm.
2. d(M_θ(X), Y) = Σ_{(x,y) ∈ X×Y} w_{x,y} ||M_θ(x) − y||, with ||·|| denoting some mathematical norm such as the Euclidean norm and w_{x,y} denoting weights that encode the importance of the different sample-label pairs (x, y) for the system performance.
3. Versions of 1. or 2. with the norm ||·|| replaced by some other distance measure of the output (such as cross-entropy, Kullback-Leibler divergence or any other distance measure that appears suitable to those skilled in the art).
4. d(M_θ(X), Y) could as well be a whole programmed module that performs a dedicated comparison of M_θ(X) and Y.
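Distance measures 1. and 2. can be sketched directly in Python; the concrete sample weights in the weighted variant are assumptions to be chosen by the system designer:

```python
import numpy as np

def distance_unweighted(preds, targets):
    """Measure 1: sum of norms ||M(x) - y|| over all sample/label pairs."""
    return float(sum(np.linalg.norm(p - y) for p, y in zip(preds, targets)))

def distance_weighted(preds, targets, weights):
    """Measure 2: weighted sum, with w_xy encoding the importance of each
    sample-label pair for the system performance."""
    return float(sum(w * np.linalg.norm(p - y)
                     for p, y, w in zip(preds, targets, weights)))
```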
[0066] A substantial reduction of hardware and computational costs is achieved since, during the step of executing an online supervised training phase, only preselected layers of the plurality of layers 20, 22, 24 of the DNNs are being trained (Fig. 4). The training is implemented in the form of an optimization problem, wherein an objective function encodes that the training input should be mapped to the desired output, namely the labels, with respect to some distance. A detailed description can be found for instance in Goodfellow, I., Bengio, Y., & Courville, A.: “Deep learning”, The MIT Press, Cambridge, MA, USA (2016), ISBN 978-0262035613, which shall hereby be incorporated by reference in its entirety with effect for those jurisdictions permitting incorporation by reference.
[0067] A general structure of the artificial neural modules 26, 34 designed as deep neural networks (DNN) of the artificial neural network 18 of the artificial intelligence sensor system 10 pursuant to Fig. 1 is provided in Fig. 4.
[0068] Within the additional steps of the method, only the parameters of some preselected deeper layers are being adjusted, which in this specific embodiment are all the parameters corresponding to layer k+1 to layer L. Firstly, this has the advantage that the effort for optimization can be reduced significantly with regard to computational costs, as fewer parameters must be adapted. Secondly, the architecture of the DNN is chosen such that the input dimension to layer k is significantly lower than d_in. In this specific embodiment, only the parameters in the last layers that receive data of dimension < 100 are adjusted, whereas the input dimension d_in of an image of the optical camera 12 is in the range of several thousands. This has the advantage that the labeled online training data acquired during the lifetime of the artificial intelligence sensor system 10 by the methods described above can be stored in a compressed way, namely in the form of the lower-dimensional output of layer k.
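The split described above can be sketched as follows. The concrete dimensions and the linear stand-ins for the network layers are illustrative assumptions; the point is that only the low-dimensional output of layer k is stored, and only the head above it is trainable:

```python
import numpy as np

rng = np.random.default_rng(42)
D_IN, D_K, D_OUT = 4096, 64, 5       # d_in >> d_k, as in the embodiment

W_frozen = rng.normal(size=(D_IN, D_K)) / np.sqrt(D_IN)  # layers 1..k (fixed)
W_head = rng.normal(size=(D_K, D_OUT))                   # layers k+1..L (trainable)

def encode(x):
    """M_A,k(x): output of layer k, stored in place of the raw sensor frame."""
    return x @ W_frozen

def head(x_k, W):
    """Layers k+1..L applied to the d_k-dimensional encoded data."""
    return x_k @ W

x = rng.normal(size=D_IN)   # one flattened camera frame
x_k = encode(x)             # 64 values stored instead of 4096
```

Storing x_k rather than x compresses each training sample by the ratio d_in/d_k (here 64:1) while still allowing the head to be retrained on it later.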
[0069] For this specific embodiment of the method, the pseudo code that realizes the steps of
- data labeling,
- validation of online training and performance control, and
- efficient online training
may be given by the following.
[0070] Pseudo code for online training of the neural network module 26 (module A) that processes data of the optical camera 12 (sensor A), utilizing the information from the neural network module 34 (module B) that processes data from the RADAR sensor system 14 (sensor B). Both modules should perform the same classification/prediction task but based on different input data (different sensors).
[0071] Notation:
x_A, x_B - data received at sensor A, respectively sensor B
M_B - module B
M_A,θ - module A, an implementation of neural network module 26 as described, with online trainable parameters θ that define only layers k+1 to L
M_B(x) - output of module B when applied to data x
M_A,θ(x) - output of module A when applied to data x
M_A,k(x) - output of the neural network module 26, which represents module A, at layer k. It is assumed that dimension d_k is significantly lower than the input dimension d_in
M_A,θ,k+1..L(x_k) - output of the network consisting only of layers k+1 to L when applied to data x_k having dimension d_k, i.e. x_k = M_A,k(x) with x being some data sensed by sensor A
(X_online, Y_online) - online recorded training samples and labels stored as input for M_A,θ,k+1..L (d_k-dimensional)
(X_val,k, Y_val,k) - validation data set for module A, stored as input for M_A,θ,k+1..L (d_k-dimensional). That is, offline generated/measured sensor A data on which the encoding M_A,k has been applied.
θ_0 - parameters of module A operating in the system
d(M_θ(X), Y) - suitable distance measure to compare the output M_A,θ,k+1..L(X) with a desired output Y
R(M_A,θ,k+1..L, X, Y) - online (re-)training function of layers k+1, ..., L of module A with current parameters θ. This function returns the new parameters θ̂ defining the trained model M_A,θ̂,k+1..L. (This changes the whole module, as M_A,θ̂ = M_A,θ̂,k+1..L ∘ M_A,k, with ∘ denoting the composition of the two network parts.)

while lifetime of the system
    if M_A,θ0(x_A) is below minimum confidence and M_B(x_B) gives y with sufficiently high confidence
        write (M_A,k(x_A), y) to (X_online, Y_online)
    if number of samples/labels (X_online, Y_online) > retrain threshold
        θ̂ = R(M_A,θ0,k+1..L, X_online, Y_online)
        if d(M_A,θ̂,k+1..L(X_val,k), Y_val,k) < acceptance threshold
            θ_0 = θ̂
            clear (X_online, Y_online)
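The lifetime loop above can be rendered as a compact, runnable toy sketch. Module B is modeled as an always-confident oracle and the trainable layers k+1..L as a linear least-squares head over the frozen encoding; every concrete model, dimension and threshold here is an illustrative assumption rather than part of the embodiment:

```python
import numpy as np

rng = np.random.default_rng(7)
D_IN, D_K, N_OUT, RETRAIN_N = 32, 8, 3, 20

W_enc = rng.normal(size=(D_IN, D_K))        # frozen layers 1..k (M_A,k)
theta0 = np.zeros((D_K, N_OUT))             # operating parameters, layers k+1..L
true_theta = rng.normal(size=(D_K, N_OUT))  # "ground truth" head for the toy task

X_val = rng.normal(size=(16, D_K))          # validation set, already encoded
Y_val = X_val @ true_theta

buf_x, buf_y = [], []                       # (X_online, Y_online)
for _ in range(50):                         # "while lifetime of the system"
    x_k = rng.normal(size=D_IN) @ W_enc     # M_A,k(x_A), the stored encoding
    conf_a, conf_b = 0.0, 0.95              # module A unsure, module B confident
    if conf_a < 0.8 and conf_b > 0.9:
        buf_x.append(x_k)
        buf_y.append(x_k @ true_theta)      # label y supplied by module B
    if len(buf_x) > RETRAIN_N:
        theta_hat, *_ = np.linalg.lstsq(np.array(buf_x), np.array(buf_y),
                                        rcond=None)
        if np.mean((X_val @ theta_hat - Y_val) ** 2) < 1e-6:  # acceptance check
            theta0 = theta_hat              # accept the adaptation
            buf_x, buf_y = [], []           # clear (X_online, Y_online)
```

Only the d_k-dimensional encodings ever enter the buffer, and the operating head theta0 changes only when the validation check passes, mirroring the two cost-saving and safety mechanisms of the pseudo code.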
[0072] While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
[0073] Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality, which is meant to express a quantity of at least two. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
List of Reference Symbols

10 AI sensor system
12 optical camera
14 RADAR sensor system
16 RADAR transceiver
18 artificial neural network
20 input layer
22 output layer
24 intermediate layer
26 neural network module
28 input side
30 output side
32 data management system
34 neural network module
36 input side
38 output side
40 data management system
42 external memory unit
44 vehicle
46 vehicle seat
48 vehicle occupant
x_A,1 optical camera signals acquired at time t_1
x_A,2 optical camera signals acquired at time t_2
x_B,1 RADAR sensor system signals acquired at time t_1

Method steps:
50 provide optical camera signals to first neural network module
52 provide RADAR sensor system signals to second neural network module
54 operate first neural network module to derive output
56 operate second neural network module to derive output
58 permanently store RADAR sensor system signals and derived quality
60 temporarily store optical camera signals and derived quality
62 confirm class derived from optical camera signals by use of RADAR sensor system signals
64 permanently store temporarily stored optical camera signals and derived quality
66 confirm class derived from optical camera signals by use of later acquired optical camera signals
68 retrieve temporarily stored optical camera signals and derived quality and compare quality with later derived quality
70 permanently store temporarily stored optical camera signals and derived quality
72 delete temporarily stored optical camera signals and derived quality

Claims (13)

1. A method of operating an artificial intelligence sensor system (10) for supervised training purposes, the artificial intelligence sensor system (10) having one or more sensors (12, 14) and at least one classifier or artificial neural network (18) that is configured for receiving and processing signals (x_A,1, x_A,2, x_B,1) from the sensor or the sensors (12, 14), wherein the method comprises at least the following steps that are to be executed iteratively:
- providing (50, 52) signals (x_A,1, x_A,2, x_B,1) from the sensor or the sensors (12, 14) as input data to the at least one classifier or artificial neural network (18),
- operating (54, 56) the at least one classifier or artificial neural network (18) to derive an output representing a quality with a confidence level regarding the provided signals (x_A,1, x_A,2, x_B,1),
- if the derived confidence level of the quality is equal to or larger than a predetermined confidence level, permanently storing (58) at least a portion of the provided signals (x_A,1, x_A,2, x_B,1) and the derived quality as labeled online training data, using the derived quality as the label,
- if the derived confidence level of the quality is lower than the predetermined confidence level, temporarily storing (60) the at least one provided signal (x_A,1) and the derived quality,
- confirming (62, 66) the quality having a derived confidence level lower than the predetermined confidence level by use of at least one independent sensor signal (x_A,2, x_B,1), and
- after completion of the step of confirming (62, 66), permanently storing (70) at least a portion of the temporarily stored signal (x_A,1) or signals and the derived quality as labeled online training data, using the derived quality as the label.
2. The method as claimed in claim 1, wherein the step (66) of confirming the quality by use of at least one independent sensor signal includes using a signal (x_A,2) of the same sensor (12) that is provided within a predetermined time period after the deriving of the quality having a derived confidence level lower than the predetermined confidence level, and from which the at least one classifier or artificial neural network (18) derives an output representing a quality with a confidence level that is equal to or larger than the predetermined confidence level.
3. The method as claimed in claim 1 or 2, wherein the step of confirming (62) the quality by use of at least one independent sensor signal (x_B,1) includes using a signal (x_B,1) of another sensor (14) from which the at least one classifier or artificial neural network (18) derives an output representing a quality with a confidence level that is equal to or larger than the predetermined confidence level.
4. The method as claimed in any one of the preceding claims, further comprising a preceding step of providing the at least one classifier or artificial neural network (18) in an offline mode with initial permanently stored training results.
5. The method as claimed in any one of the preceding claims, further comprising a step of executing an online supervised training phase at least once in a predetermined time period by using at least the permanently stored labeled online training data.
6. The method as claimed in claim 5, wherein the at least one artificial neural network (18) includes a plurality of layers (24) between an input layer (20) and an output layer (22), and the step of executing an online supervised training phase includes training only of preselected layers out of the plurality of layers (20, 22, 24).
7. The method as claimed in any one of the preceding claims, further comprising a step of performance control by validating the current online training status, using a validation sensor data set with assigned correct labels that resides within the classifier or artificial neural network (18).
8. The method as claimed in claim 7, wherein the step of performance control comprises steps of - operating the at least one classifier or artificial neural network (18) to derive outputs representing qualities with confidence levels for the provided validation sensor data set with assigned correct labels, - comparing the derived outputs with the assigned correct labels,
- calculating a current performance figure that represents the result of the step of comparing, - comparing the current performance figure with a predetermined performance figure, - accepting adaptations made in the at least one classifier or artificial neural network (18) since executing the latest online supervised training phase if the current performance figure is equal to or exceeds the predetermined performance figure, and - rejecting adaptations made in the at least one classifier or artificial neural network since executing the latest online supervised training phase if the current performance figure is lower than the predetermined performance figure.
9. An artificial intelligence sensor system (10), including
- at least one classifier or artificial neural network (18) having an input side (28, 36) and an output side (30, 38),
- at least one sensor system (12, 14), operatively connected to the input side (28, 36) of the classifier or artificial neural network (18), and each sensor system (12, 14) comprising at least one sensor,
wherein the classifier or artificial neural network (18) is configured to provide, at the output side (30, 38), an output representing a quality with a confidence level with regard to at least one object (48) to be monitored or surveilled by applying at least one trained task to signals (x_A,1, x_A,2, x_B,1) that have been received from the at least one sensor system (12, 14), and wherein the at least one task is trained by executing a method as claimed in any one of claims 1 to 8.
10. The artificial intelligence sensor system (10) as claimed in claim 9, wherein the at least one sensor system (12, 14) comprises at least one out of an optical camera (12), a RADAR sensor system (14), a LIDAR device and an acoustics based sensor device.
11. The artificial intelligence sensor system (10) as claimed in claim 9 or 10, wherein the at least one artificial neural network (18) comprises at least one deep neural network (26, 34).
12. Use of the artificial intelligence sensor system (10) as claimed in claims 9 to 11 in an automotive vehicle exterior sensing system.
13. Use of the artificial intelligence sensor system (10) as claimed in claims 9 to 11, comprising at least one out of an optical camera (12) and a RADAR sensor system (14), as an automotive vehicle interior sensing system.
LU101028A 2018-12-06 2018-12-06 Method for Online Training of an Artificial Intelligence (AI) Sensor System LU101028B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
LU101028A LU101028B1 (en) 2018-12-06 2018-12-06 Method for Online Training of an Artificial Intelligence (AI) Sensor System
PCT/EP2019/083536 WO2020115066A1 (en) 2018-12-06 2019-12-03 Method for online training of an artificial intelligence (ai) sensor system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
LU101028A LU101028B1 (en) 2018-12-06 2018-12-06 Method for Online Training of an Artificial Intelligence (AI) Sensor System

Publications (1)

Publication Number Publication Date
LU101028B1 true LU101028B1 (en) 2020-06-08

Family

ID=65013749

Family Applications (1)

Application Number Title Priority Date Filing Date
LU101028A LU101028B1 (en) 2018-12-06 2018-12-06 Method for Online Training of an Artificial Intelligence (AI) Sensor System

Country Status (2)

Country Link
LU (1) LU101028B1 (en)
WO (1) WO2020115066A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3117966B1 (en) * 2020-12-17 2023-06-02 Faurecia Interieur Ind Intrusion detection system for a vehicle and associated vehicle and method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9760827B1 (en) * 2016-07-22 2017-09-12 Alpine Electronics of Silicon Valley, Inc. Neural network applications in resource constrained environments
US20180253645A1 (en) * 2017-03-03 2018-09-06 International Business Machines Corporation Triage of training data for acceleration of large-scale machine learning

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9165196B2 (en) 2012-11-16 2015-10-20 Intel Corporation Augmenting ADAS features of a vehicle with image processing support in on-board vehicle platform


Also Published As

Publication number Publication date
WO2020115066A1 (en) 2020-06-11


Legal Events

Date Code Title Description
FG Patent granted

Effective date: 20200608