EP1656650B1 - Method and system for detecting a body in a zone near an interface - Google Patents

Method and system for detecting a body in a zone near an interface

Info

Publication number
EP1656650B1
Authority
EP
European Patent Office
Prior art keywords
data
interface
image
representative
blue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP04767924A
Other languages
English (en)
French (fr)
Other versions
EP1656650A1 (de)
Inventor
Thierry Cohignac
Frédéric Guichard
Christophe Migliorini
Fanny Rousson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MG International SA
Original Assignee
MG International SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MG International SA filed Critical MG International SA
Publication of EP1656650A1 publication Critical patent/EP1656650A1/de
Application granted granted Critical
Publication of EP1656650B1 publication Critical patent/EP1656650B1/de
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/08 - Alarms for ensuring the safety of persons responsive to the presence of persons in a body of water, e.g. a swimming pool; responsive to an abnormal condition of a body of water
    • G08B21/082 - Alarms for ensuring the safety of persons responsive to the presence of persons in a body of water, e.g. a swimming pool; responsive to an abnormal condition of a body of water by monitoring electrical characteristics of the water

Definitions

  • The present invention relates to a method, a system and devices for detecting a body in an area near an interface between two liquid and/or gaseous media, in particular of the water/air type.
  • Here, "near" also means "at the interface".
  • The problem addressed is the detection of the presence of bodies in the vicinity of a water/air interface. In addition to this main problem, there is the discrimination between bodies located on one side or the other of the interface, and the detection of stationary bodies.
  • In one known approach, a video camera uses two filters, in the blue-green and red ranges.
  • The red signal, strongly attenuated in water, is subtracted from the signal in the blue-green wavelength range; the reflections and signals corresponding to the surface of the water are thus eliminated.
  • The present invention solves the problem of detecting bodies located in the vicinity of a water/air interface by proposing a method and a system for evaluating the position of a body with respect to an interface, particularly of the water/air type, for discriminating moving bodies from stationary bodies, for generating alerts, compiling statistics and providing trajectory elements, and for detecting entries of bodies into, or exits from, the area under surveillance.
  • the invention relates to a method for detecting a body in an area near an interface between two liquid and / or gaseous media, especially of the water / air type.
  • the body is illuminated by electromagnetic radiation comprising at least two different wavelengths, in particular located in the ranges corresponding to the near infra-red on the one hand and the blue-green on the other hand.
  • Steps (c) to (f) are hereinafter referred to as the process of deducing the presence of a body. It follows from the combination of the technical features that it is thus possible to detect the presence of a body and/or to determine the position of the detected body with respect to the interface, by discriminating between a body located entirely under the interface and a body located at least partly above the interface.
  • the method further comprises the step of integrating the results of the step of comparing the data groups in time.
  • the method further comprises the step of triggering an alarm if a human-sized body is detected under the interface for a time greater than a determined threshold.
  • The method is such that, in order to extract, from the data corresponding to each image, two groups of data respectively representative of at least a part of the body in the near infra-red range and in the blue-green range, caps (within the meaning of the present invention) are generated.
  • The method is such that, to compare the groups of data, a search is made for data representative of at least a part of the body in the blue-green range for which there are no corresponding data, within a determined geometric neighborhood, representative of at least a part of the body in the near infra-red range.
  • The method is such that, to compare the groups of data, a search is made for data representative of at least a part of the body in the blue-green range for which there are, within a determined geometric neighborhood, corresponding data representative of at least a part of the body in the infra-red range.
  • the conversion means, the digitization means, the computer processing means, the calculation means are hereinafter referred to as the means for deducing the presence of a body. It follows from the combination of technical features that it is thus possible to detect the presence of a body and / or to determine the position of the body detected with respect to the interface, by discriminating between a body located under the interface and a body located at least partly above the interface.
  • the system further comprises integration means for integrating in time the results of the calculation means of the groups of data.
  • the system further comprises activation means for actuating an alarm if a human-sized body is detected under the interface for a time greater than a determined threshold.
  • the system is such that the computer processing means can generate caps (within the meaning of the present invention).
  • The system is such that the calculation means make it possible to search for data representative of at least a part of the body in the blue-green range for which there are no corresponding data, within a determined geometric neighborhood, representative of at least a part of the body in the near infra-red range.
  • The system is such that the calculation means make it possible to search for data representative of at least a part of said body in the blue-green range for which there are, within a determined geometric neighborhood, corresponding data representative of at least a part of said body in the near infra-red range.
  • A pixel is an elementary zone of an image obtained by creating a generally regular tiling of said image.
  • The image is obtained by a sensor such as a video camera, or a thermal or acoustic camera.
  • Figure 1a shows an image 101 (symbolized by a man swimming at the surface of a swimming pool, whose contours are not perfectly visible). In figure 1b, this image has been superimposed on a tiling 102 of pixels 103. Figure 1c shows a tiling on which the values of the pixels have been indicated.
  • Two pixels of the tiling are said to be adjacent if their edges or their corners touch.
  • A path on the tiling is a finite, ordered set of pixels in which each pixel is adjacent to the next (in the sense of the ordering).
  • The size of a path is given by the number of pixels constituting it.
  • Two pixels are said to be contiguous when the shortest path starting at one and ending at the other is smaller than a determined number of pixels.
  • A set of pixels is said to be connected if, for each pair of pixels in the set, there is a path beginning at one and ending at the other, this path consisting of pixels of the set.
  • the figure 2a represents a tiling 202 of 16 pixels 203, among which 3 pixels, denoted A, B and C, have been highlighted. It may be noted that the pixels A and B are adjacent and that the pixels B and C are adjacent. There is therefore a path (A->B-> C) that connects these pixels. The set of pixels ⁇ A, B, C ⁇ is therefore connected.
  • Figure 2b also shows a tiling 202 of 16 pixels 203, designated by the letters A to P. If we select the set of pixels {A, B, C, E, F, I}, we can see that pixels A and B are adjacent, that pixels B and C are adjacent, and so on. There are therefore paths A -> B -> C and C -> B -> F -> E -> I. Each pair of pixels of the set is connected by a path of pixels belonging to the set; the set of pixels {A, B, C, E, F, I} is therefore connected.
  • The same tiling 202 as in figure 2b is shown, this time selecting the set of pixels {A, C, F, N, P}. There is a path A -> C -> F connecting the pixels A, C and F, but there is no path of pixels belonging to the set connecting N to P, or N to A.
  • the set of pixels ⁇ A, C, F, N, P ⁇ is not connected.
  • the set ⁇ A, C, F ⁇ is connected.
  • A pixel not belonging to a set is said to be adjacent to said set when it is adjacent to at least one pixel belonging to said set.
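The adjacency and connectivity notions defined above translate directly into code. The following Python sketch is an editor's illustration (not part of the patent): it checks whether a set of pixel coordinates is connected under 8-adjacency, i.e. with edges or corners touching.

```python
from collections import deque

def is_connected(pixels):
    """Return True if the set of (row, col) pixels is connected under
    8-adjacency: every pair of pixels is linked by a path of pixels of
    the set, each adjacent (edge or corner touching) to the next."""
    pixels = set(pixels)
    if not pixels:
        return True
    start = next(iter(pixels))
    seen = {start}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for dr in (-1, 0, 1):              # visit the 8 neighbours
            for dc in (-1, 0, 1):
                if (dr, dc) == (0, 0):
                    continue
                n = (r + dr, c + dc)
                if n in pixels and n not in seen:
                    seen.add(n)
                    queue.append(n)
    return seen == pixels

# Example in the spirit of figures 2a-2c (coordinates are hypothetical):
print(is_connected([(0, 0), (0, 1), (1, 1)]))   # True: a path exists
print(is_connected([(0, 0), (0, 1), (3, 3)]))   # False: (3, 3) is isolated
```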
  • Figures 3a, 3b, 4a and 4b represent images composed of tilings 302 (respectively 402) of pixels 303 (respectively 403) on which their values have been indicated.
  • The set of pixels considered is therefore not an upper cap of level 1.
  • This set of pixels is therefore an upper cap of level 2.
  • The set of pixels considered is therefore an upper cap of level 1.
  • Characteristic(s) associated with a cap: one or more values obtained by predefined arithmetic and/or logical operations from the values of the pixels of the cap, and/or the positions of the pixels in the tiling, and/or the level of the cap.
  • An arithmetic operation could, for example, consist in using the sum of the differences between the value of each pixel of the cap and the level of the cap, or the size (number of pixels) of said cap.
  • Realized upper cap (or realized lower cap): an upper (respectively lower) cap whose associated characteristics lie within a certain range of values.
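As an illustration of the cap notion and of the two characteristics mentioned above (area and contrast), here is a minimal Python sketch. It assumes that an upper cap of level t is a maximal connected set of pixels whose values are all at least t; this reading of "cap", the use of scipy and the toy image are editorial assumptions, not material from the patent.

```python
import numpy as np
from scipy import ndimage

def upper_caps(image, level):
    """Extract upper caps of the given level: 8-connected components of
    the pixels whose value is >= level, with their area and contrast."""
    mask = image >= level
    labels, count = ndimage.label(mask, structure=np.ones((3, 3)))
    caps = []
    for k in range(1, count + 1):
        rows, cols = np.nonzero(labels == k)
        values = image[rows, cols]
        caps.append({
            "pixels": list(zip(rows.tolist(), cols.tolist())),
            "level": level,
            "area": int(values.size),                     # number of pixels
            "contrast": float(np.sum(values - level)),    # sum of (value - level)
        })
    return caps

# Toy 4x4 image in the spirit of figures 3a-4b.
img = np.array([[0, 0, 1, 2],
                [0, 3, 3, 2],
                [0, 3, 1, 0],
                [0, 0, 0, 0]])
for cap in upper_caps(img, level=2):
    print(cap["area"], cap["contrast"])
```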
  • the figure 5 is a schematic view of the system for detecting bodies located in the vicinity of a water / air type interface.
  • the data or the images can be placed in a virtual common reference frame 503.
  • The virtual reference frame may correspond to the water surface 504, so that a point 505 of the surface of the water, seen by the blue-green camera 506 and by the near infra-red camera 507, will be at the same place 508 in the virtual common frame. In this way, two points that are close in real space will correspond to close points in this virtual common frame.
  • The notion of geometric neighborhood will then correspond to the notion of proximity in the virtual common coordinate system.
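One simple way to realize such a virtual common reference frame, assuming each camera has been calibrated against the (planar) water surface, is to map pixel coordinates into pool-surface coordinates with a 3x3 homography. The matrices below are hypothetical calibration results used only for illustration.

```python
import numpy as np

def to_common_frame(pixel_xy, H):
    """Map an (x, y) pixel coordinate into the common pool-surface frame
    using a 3x3 homography H obtained by a prior calibration."""
    x, y = pixel_xy
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]                     # perspective division

# Hypothetical calibrations for the blue-green and near infra-red cameras.
H_bluegreen = np.array([[0.010, 0.0, -1.0],
                        [0.0, 0.010, -2.0],
                        [0.0, 0.0, 1.0]])
H_infrared  = np.array([[0.011, 0.0, -1.1],
                        [0.0, 0.011, -2.1],
                        [0.0, 0.0, 1.0]])

# A point of the water surface seen by both cameras lands at (nearly)
# the same place in the common frame, so proximity can be tested there.
print(to_common_frame((320, 240), H_bluegreen))
print(to_common_frame((300, 230), H_infrared))
```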
  • the figure 6 represents, in the case of a pool, a general view of the system for the detection of bodies located in the vicinity of a water / air type interface, in particular the detection and monitoring of swimmers.
  • the system according to the invention comprises means, hereinafter described, for detecting a body 601 in a zone 603 located near an interface 602 between two liquid media 604 and / or gaseous 605 in particular of the water / air type; said body being illuminated by electromagnetic radiation comprising at least two different wavelengths, in particular lying in the ranges corresponding to the near infra-red on the one hand and blue-green on the other hand; said media having different absorption coefficients as a function of the wavelengths of the electromagnetic radiation.
  • Here, "near" also means "at the interface".
  • Each of the observation points 607a and 607b is located on one side of said interface 602. In this case, the observation points 607a and 607b are situated above the pool.
  • The video cameras 606a and 606b and their housings are overhead; they are in the open air.
  • The system further includes digital conversion means 609 for generating digital data from the electrical signals 608a and 608b representative of the blue-green and near infra-red video images.
  • the cameras 606a and 606b are equipped with polarizing filters 611a and 611b at least partially eliminating the reflections of the light on said interface in said images.
  • This variant embodiment is particularly suitable in the case of a pool reflecting the sun's rays or those of artificial lighting.
  • Said system further comprises computer processing means 700 described below.
  • the figure 7 represents a flowchart of the computer processing means 700.
  • The computer processing means 700 make it possible to discriminate the data corresponding to the blue-green video images of a part of a real body (figure 1a) from those corresponding to the apparent blue-green video images (figure 1b) generated by said interface 602.
  • The computer processing means 700 also make it possible to discriminate the data corresponding to the near infra-red video images of a part of a real body (figure 1a) from those corresponding to the apparent near infra-red video images (figure 1b) generated by said interface 602.
  • Said computer processing means 700 comprise calculation means, in particular a processor 701, and a memory 702.
  • the computer processing means 700 include extraction means 712 for extracting a group of data representative of at least a part of the body in the near infra-red range.
  • the computer processing means 700 further include extraction means 713 for extracting a group of data representative of at least a part of the body in the blue-green range.
  • An example of a characteristic associated with a cap may be its area defined by the number of pixels constituting it.
  • Another characteristic associated with a cap may be its contrast defined as being the sum of the differences between the value of each pixel of the cap and the level of the cap.
  • An example of a group of data representative of a part of a body may then be a cap having a contrast greater than a threshold SC and an area between a threshold TailleMin and a threshold TailleMax representative of the minimum and maximum dimensions of the body parts sought.
  • The computer processing means 700 make it possible to select, from among the extracted groups of data, those that do not correspond to a part of a swimmer.
  • The system comprises means for eliminating the caps corresponding to reflections, lane lines, floating mats and any object potentially present in a pool that does not correspond to a swimmer. Such a selection may be made, for example, by calculating the level of the caps, which must be less than a threshold SR corresponding to the average gray level of the reflections; by checking the alignment of the caps, corresponding to the usual position of the lane lines; or by estimating the shape of the caps, which must not be rectangular, in order to eliminate the mats.
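As an illustration of this selection step, the sketch below filters the caps produced by the extraction sketch given earlier. The thresholds SC, TailleMin, TailleMax and SR are placeholders; the patent names them but gives no numerical values.

```python
def select_swimmer_caps(caps, SC=50.0, TailleMin=20, TailleMax=5000, SR=200):
    """Keep caps whose contrast and area are compatible with a body part
    and whose level stays below the average gray level of reflections."""
    selected = []
    for cap in caps:
        if cap["contrast"] <= SC:
            continue                      # too faint to be a body part
        if not (TailleMin <= cap["area"] <= TailleMax):
            continue                      # too small or too large
        if cap["level"] >= SR:
            continue                      # as bright as a reflection
        selected.append(cap)
    return selected

# Typical use, chained after the cap extraction sketched earlier:
# swimmer_caps = select_swimmer_caps(upper_caps(img, level=2))
```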
  • the extraction means 712 and 713 can proceed otherwise than by means of the extraction of caps.
  • The extraction means 712 and 713 may extract groups of pixels sharing one or more predetermined properties, then associate characteristics with each group of pixels, and deduce the presence of a group of data representative of at least a part of the body if the characteristics exceed a predetermined threshold SC.
  • The predetermined property or properties may, for example, be chosen so as to exclude the appearance of the water/air interface in the image. For example, in the case of infra-red images, one can extract the groups of pixels whose brightness is much higher than the average brightness of the image of the interface and whose size is comparable to that of a human body.
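This variant can be sketched as follows for near infra-red images; the brightness ratio and the size bounds are illustrative placeholders, not values from the patent.

```python
import numpy as np
from scipy import ndimage

def bright_groups(ir_image, ratio=2.0, min_size=20, max_size=5000):
    """Extract 8-connected groups of pixels much brighter than the average
    brightness of the image of the interface, whose size is comparable to
    that of a human body part."""
    threshold = ratio * ir_image.mean()        # "much higher than average"
    labels, count = ndimage.label(ir_image > threshold,
                                  structure=np.ones((3, 3)))
    groups = []
    for k in range(1, count + 1):
        size = int(np.sum(labels == k))
        if min_size <= size <= max_size:
            groups.append(np.argwhere(labels == k))
    return groups
```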
  • Said computer processing means 700 further comprise comparison means 714 for comparing said data groups.
  • Said comparison means 714 search for data representative of at least a portion of said body in the blue-green range for which there are no corresponding data, within a geometric comparison neighborhood, representative of at least a portion of said body in the near infra-red range; so that, if the search is positive, it can be concluded that said body is located under the interface.
  • The geometric comparison neighborhood may be, for example, a circular neighborhood of radius 50 cm centered on the center of gravity of the caps extracted in the blue-green image, within which caps extracted in the near infra-red image are sought. If this search is negative, the swimmer is considered to be below the surface of the water.
  • Conversely, the comparison means may search for data representative of at least a part of said body in the blue-green range for which there are, in a geometric comparison neighborhood, corresponding data representative of at least a part of said body in the near infra-red range; so that, if the search is positive, it can be concluded that said body is located at least partly above the interface.
  • The geometric comparison neighborhood may again be a circular neighborhood of radius 50 cm centered on the center of gravity of the caps extracted in the blue-green image, within which caps extracted in the near infra-red image are sought. If this search is positive, the swimmer is considered to be at least partly above the surface of the water.
  • The caps extracted in the blue-green image are paired with those extracted in the near infra-red image if the shortest distance between them (between the two closest pixels) is less than 30 cm. Unmatched blue-green image caps are then considered to correspond to a swimmer below the surface of the water; paired blue-green image caps are considered to correspond to swimmers at least partly above the surface of the water.
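The comparison between the two spectral bands can be sketched as follows. The 50 cm circular neighborhood comes from the example above; the data layout (caps as produced by the earlier sketches, with positions already mapped into the common frame, in metres) is an assumption.

```python
import numpy as np

def centroid(cap):
    """Center of gravity of a cap; positions are assumed to be expressed
    in the virtual common frame, in metres."""
    return np.mean(np.asarray(cap["pixels"], dtype=float), axis=0)

def classify_caps(bluegreen_caps, infrared_caps, radius_m=0.5):
    """Label a blue-green cap 'above' if a near infra-red cap lies within
    the comparison neighborhood, and 'underwater' otherwise."""
    ir_centroids = [centroid(c) for c in infrared_caps]
    results = []
    for cap in bluegreen_caps:
        c = centroid(cap)
        matched = any(np.linalg.norm(c - ic) < radius_m for ic in ir_centroids)
        results.append((cap, "above" if matched else "underwater"))
    return results
```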
  • The geometric comparison neighborhood is not necessarily predetermined.
  • The system described in the present invention can be used to complement a system based on stereovision such as that described in patent No. FR 00/15803.
  • The system described in the present invention can also advantageously use stereovision principles such as those described in patent No. FR 00/15803.
  • The system may comprise several blue-green cameras and/or several near infra-red cameras; these will then work in stereovision.
  • Said system includes time integration means 703, associated with a clock 704, for iterating at determined time intervals said process of deducing the presence of a body described above.
  • the video images are taken at determined time intervals from said observation point.
  • said computer processing means 700 comprise totalisers 705 for calculating the number of times the body is detected for a determined period of time T1.
  • Said computer processing means 700 further comprise discriminators 706 for discriminating, at a point of said zone, between the bodies that are present a number of times greater than a determined threshold S1 and the bodies that are present a number of times below said determined threshold S1.
  • The former bodies are hereinafter referred to as stationary bodies.
  • The latter bodies are hereinafter referred to as moving bodies.
  • Said computer processing means 700 further comprise means for calculating the number of times a body is detected as stationary and new during a determined period of time T2. Said period of time T2 is chosen to be greater than the duration of the phenomena observed, and in particular greater than T1.
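A minimal sketch of this time integration, assuming the deduction process is iterated at a fixed rate and detections are binned per cell of the monitored zone; the class and its default values are illustrative, only the roles of T1 and S1 come from the text.

```python
from collections import defaultdict

class StationaryDiscriminator:
    """Count, per cell of the monitored zone, how often a body is detected
    during a sliding period T1, and flag cells whose count exceeds S1 as
    containing a stationary body."""

    def __init__(self, period_T1=15.0, threshold_S1=10, frame_dt=0.5):
        self.window = int(period_T1 / frame_dt)   # iterations covered by T1
        self.threshold_S1 = threshold_S1
        self.history = defaultdict(list)          # cell -> list of 0/1 detections

    def update(self, detected_cells):
        """detected_cells: cells where a body was detected this iteration.
        Returns the set of cells holding a stationary body."""
        detected_cells = set(detected_cells)
        for cell in set(self.history) | detected_cells:
            self.history[cell].append(1 if cell in detected_cells else 0)
            self.history[cell] = self.history[cell][-self.window:]
        return {cell for cell, hits in self.history.items()
                if sum(hits) > self.threshold_S1}
```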
  • Said computer processing means 700 further comprise transmission means 716 for transmitting an alert signal 711 according to the detection criteria described above.
  • the system emits a warning signal 711, in the presence of a human-sized body, stationary and located under the interface.
  • An additional step of integration over time can advantageously be performed by accumulating images from the same blue-green camera and/or near infra-red camera.
  • The accumulated image is calculated, for example, by averaging the gray levels of the pixels of successive images taken over a given time interval.
  • An accumulated image obtained by accumulating images from a blue-green camera will be called the accumulated blue-green image.
  • An accumulated image obtained by accumulating images from a near infra-red camera will be called the accumulated near infra-red image.
  • The extraction means 712 and 713 can then also use the accumulated blue-green and/or near infra-red images.
  • The extraction means 712 may extract only the caps of the blue-green image for which there is no similar cap, located in a neighborhood, in the accumulated blue-green image.
  • The extraction means 712 and 713 may then also use composite images consisting of accumulated blue-green images and blue-green images, as well as composite images consisting of accumulated near infra-red images and near infra-red images.
  • The extraction means 712 may use the difference between the blue-green image and the accumulated blue-green image.
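This accumulation amounts to maintaining a background image per camera and working with its difference from the current frame. The sketch below uses an exponential running average as a stand-in for the plain average over a time interval described above; the smoothing factor is an assumption.

```python
import numpy as np

class ImageAccumulator:
    """Accumulate successive frames of one camera by averaging their gray
    levels, and expose the difference between a frame and the accumulated
    image, usable as input to the cap extraction step."""

    def __init__(self, alpha=0.05):
        self.alpha = alpha                  # weight given to the newest frame
        self.accumulated = None

    def update(self, frame):
        frame = frame.astype(np.float32)
        if self.accumulated is None:
            self.accumulated = frame.copy()
        else:
            # Running average of the gray levels of successive images.
            self.accumulated = (1 - self.alpha) * self.accumulated + self.alpha * frame
        return self.accumulated

    def difference(self, frame):
        return np.abs(frame.astype(np.float32) - self.accumulated)
```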
  • Figure 8 represents a schematic overview of the system according to the invention.
  • The conversion means 816, the digitization means 817, the computer processing means 818, and the calculation means 819 are hereinafter referred to as the means for deducing the presence of a body 801. It is thus possible to detect the presence of a body 801 and/or to determine the position of the detected body with respect to the interface 803, by discriminating between a body 801 situated under the interface 803 and a body 801 located at least partly above the interface 803.
  • The system furthermore comprises integration means 820 for integrating, over time, the results of the calculation means 819 on the data groups 807.
  • the system further comprises activation means 821 for actuating an alarm 808 if a human-sized body is detected under the interface for a time greater than a determined threshold.
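Putting the preceding sketches together, the deduction process of steps (c) to (f), combined with the time integration, can be read as the following illustrative pipeline. Every function and class name refers to the hypothetical sketches above, none of them to the patent itself.

```python
import numpy as np

def deduce_presence(bluegreen_frame, infrared_frame, discriminator):
    """One iteration of the deduction process: extract and filter caps in
    each spectral band, compare them, then feed the underwater detections
    to the stationary-body discriminator."""
    bg_caps = select_swimmer_caps(upper_caps(bluegreen_frame, level=2))
    ir_caps = select_swimmer_caps(upper_caps(infrared_frame, level=2))
    labelled = classify_caps(bg_caps, ir_caps)
    underwater_cells = {tuple(np.round(centroid(cap)).astype(int))
                        for cap, state in labelled if state == "underwater"}
    stationary = discriminator.update(underwater_cells)
    if stationary:
        print("ALERT: stationary human-sized body detected under the surface")
    return labelled, stationary
```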

Landscapes

  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Analysis (AREA)
  • Emergency Alarm Devices (AREA)
  • Studio Devices (AREA)

Claims (16)

  1. Method for detecting a body (1) in a zone (2) near an interface (3) between two liquid and/or gaseous media, in particular of the water/air type; said body (1) being illuminated by electromagnetic radiation (4) comprising at least two different wavelengths, in particular lying in the ranges corresponding to the near infra-red on the one hand and to blue-green on the other hand; said media having different absorption coefficients depending on the wavelength of the electromagnetic radiation (4);
    said method comprising the following steps:
    - (a) the step of selecting, from among the wavelengths of the electromagnetic radiation (4), at least two wavelengths or two wavelength ranges,
    - (b) the step of capturing, for each of these wavelengths or wavelength ranges, an image (5) of the interface (3) and of the zone (2),
    - (c) the step of generating electrical signals (6) representative of each image (5),
    - (d) the step of digitizing the electrical signals (6) in order to produce data (7) corresponding to each image (5),
    - (e) the step of extracting, from these data (7) corresponding to each image (5), two groups of data (7) respectively representative, in the near infra-red range and in the blue-green range, of at least a part of the body (1),
    - (f) the step of comparing these groups of data (7);
    steps (c) to (f) being referred to hereinafter as the process of deducing the presence of a body (1);
    whereby it is possible to detect the presence of a body (1) and/or to determine the position of the detected body (1) with respect to the interface (3), by discriminating between a body (1) lying entirely under the interface (3) and a body (1) lying at least partly above the interface (3).
  2. Method according to claim 1; said method further comprising:
    - the step of integrating over time the results of the step of comparing the groups of data (7).
  3. Method according to claim 2; said method further comprising:
    - the step of triggering an alarm (8) if a body (1) of human size is detected under the interface (3) for a time exceeding a determined threshold.
  4. Method according to any one of claims 1 to 3; the method being such that, in order to extract, from the data (7) corresponding to each image (5), two groups of data (7) respectively representative, in the near infra-red range and in the blue-green range, of at least a part of the body (1), caps (9) (within the meaning of the present invention) are generated.
  5. Method according to claim 4; said method further comprising the following steps:
    - the step of associating characteristics (10) (within the meaning of the present invention) with each cap (9),
    - the step of deducing the presence of a group of data (7) representative of at least a part of the body (1) if the characteristics (10) exceed a predetermined threshold SC.
  6. Method according to any one of claims 1 to 5; the method being such that, in order to compare the groups of data (7), a search is made for data (7) representative of at least a part of the body (1) in the blue-green range for which there are no corresponding data (7), within a determined geometric neighborhood (11), representative of at least a part of the body (1) in the infra-red range;
    so that, if the search is positive, it can be concluded that the body (1) lies under the interface (3).
  7. Method according to any one of claims 1 to 5; the method being such that, in order to compare the groups of data (7), a search is made for data (7) representative of at least a part of the body (1) in the blue-green range for which there are corresponding data (7), within a determined geometric neighborhood (11), representative of at least a part of the body (1) in the infra-red range;
    so that, if the search is positive, it can be concluded that the body (1) lies at least partly above the interface (3).
  8. Method according to claim 2 in combination with any one of claims 1 to 7; intended in particular to distinguish between a stationary body (1) and a moving body (1); in order to integrate over time the results of the step of comparing the groups of data (7), said method further comprising the following steps:
    - the step of repeating, at determined time intervals, the process of deducing the presence of the body (1);
    - the step of calculating the number of times the body (1) is detected during a determined period of time T1,
    - the step of discriminating, at a point of the zone (2), between the bodies (1) that are present a number of times greater than a determined threshold S1 (these bodies (1) being referred to hereinafter as stationary bodies (1)) and the bodies (1) that are present a number of times below said determined threshold S1 (these bodies (1) being referred to hereinafter as moving bodies (1));
    whereby it is possible to detect the presence of a stationary body (1) lying entirely under the interface (3), and hence to trigger an alarm (8).
  9. System for detecting a body (1) in a zone (2) near an interface (3) between two liquid (12) and/or gaseous (13) media, in particular of the water/air type; said body (1) being illuminated by electromagnetic radiation (4) comprising at least two different wavelengths, in particular lying in the ranges corresponding to the near infra-red on the one hand and to blue-green on the other hand; said media having different absorption coefficients depending on the wavelength of the electromagnetic radiation (4); said system comprising:
    - (a) selection means (14) for selecting, from among the wavelengths of the electromagnetic radiation (4), at least two wavelengths or two wavelength ranges,
    - (b) image capture means (15), comprising a video camera with a filter allowing the capture of a video image in the blue-green wavelength range and a video camera with a filter allowing the capture of at least one video image in the near infra-red wavelength range, for capturing, for each of these wavelengths or wavelength ranges, an image (5) of the interface (3) and of the zone (2),
    - (c) conversion means (16) for generating electrical signals (6) representative of each image (5),
    - (d) digitization means (17) for digitizing the electrical signals (6) in order to produce data (7) corresponding to each image (5),
    - (e) computer processing means (18) for extracting, from these data (7) corresponding to each image (5), two groups of data (7) respectively representative, in the near infra-red range and in the blue-green range, of at least a part of the body (1),
    - (f) calculation means (19) for comparing these groups of data (7);
    the conversion means (16), digitization means (17), computer processing means (18) and calculation means (19) being referred to hereinafter as the means for deducing the presence of a body (1); whereby it is possible to detect the presence of a body (1) and/or to determine the position of the detected body (1) with respect to the interface (3), by discriminating between a body (1) lying entirely under the interface (3) and a body (1) lying at least partly above the interface (3).
  10. System according to claim 9; said system further comprising:
    - integration means (20) for integrating over time the results of the calculation means (19) on the groups of data (7).
  11. System according to claim 10; said system further comprising:
    - activation means (21) for triggering an alarm (8) if a body (1) of human size is detected under said interface (3) for a time exceeding a determined threshold.
  12. System according to any one of claims 9 to 11; the system being such that the computer processing means (18) make it possible to generate caps (9) (within the meaning of the present invention).
  13. System according to claim 12; said system being such that the computer processing means (18) make it possible:
    - to associate characteristics (10) (within the meaning of the present invention) with each cap (9),
    - to deduce the presence of a group of data (7) representative of at least a part of the body (1) if the characteristics (10) exceed a predetermined threshold SC.
  14. System according to any one of claims 9 to 13; the system being such that the calculation means (19) make it possible to search for data (7) representative of at least a part of the body (1) in the blue-green range for which there are no corresponding data (7), within a determined geometric neighborhood (11), representative of at least a part of the body (1) in the infra-red range;
    so that, if the search is positive, it can be concluded that the body (1) lies under the interface (3).
  15. System according to any one of claims 9 to 13; the system being such that the calculation means (19) make it possible to search for data (7) representative of at least a part of the body (1) in the blue-green range for which there are corresponding data (7), within a determined geometric neighborhood (11), representative of at least a part of the body (1) in the infra-red range;
    so that, if the search is positive, it can be concluded that the body (1) lies at least partly above the interface (3).
  16. System according to claim 10 in combination with any one of claims 9 to 15; intended in particular to distinguish between a stationary body (1) and a moving body (1); wherein the integration means (20) for integrating over time the results of the calculation means (19) make it possible:
    - to repeat, at determined time intervals, the execution of the means for deducing the presence of the body (1);
    - to calculate the number of times the body (1) is detected during a determined period of time T,
    - to discriminate, at a point of the zone (2), between the bodies (1) that are present a number of times greater than a determined threshold S1 (these bodies (1) being referred to hereinafter as stationary bodies (1)) and the bodies (1) that are present a number of times below said determined threshold S1 (these bodies (1) being referred to hereinafter as moving bodies (1));
    whereby it is possible to detect the presence of a stationary body (1) lying entirely under the interface (3);
    so that it is possible to trigger an alarm (8).
EP04767924A 2003-07-28 2004-07-28 Verfahren und system zur erkennung eines körpers in einer zone in der nähe einer grenzfläche Expired - Lifetime EP1656650B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0350378A FR2858450B1 (fr) 2003-07-28 2003-07-28 Procede et systeme pour detecter un corps dans une zone situee a proximite d'une interface
PCT/FR2004/050363 WO2005013226A1 (fr) 2003-07-28 2004-07-28 Procede et systeme pour detecter un corps dans une zone situee a proximite d'une interface

Publications (2)

Publication Number Publication Date
EP1656650A1 EP1656650A1 (de) 2006-05-17
EP1656650B1 true EP1656650B1 (de) 2008-03-05

Family

ID=34043805

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04767924A Expired - Lifetime EP1656650B1 (de) 2003-07-28 2004-07-28 Verfahren und system zur erkennung eines körpers in einer zone in der nähe einer grenzfläche

Country Status (8)

Country Link
US (1) US7583196B2 (de)
EP (1) EP1656650B1 (de)
JP (1) JP4766492B2 (de)
AT (1) ATE388460T1 (de)
DE (1) DE602004012283D1 (de)
ES (1) ES2303092T3 (de)
FR (1) FR2858450B1 (de)
WO (1) WO2005013226A1 (de)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008066619A1 (en) * 2006-10-19 2008-06-05 Travis Sparks Pool light with safety alarm and sensor array
US7839291B1 (en) * 2007-10-02 2010-11-23 Flir Systems, Inc. Water safety monitor systems and methods
US8390685B2 (en) * 2008-02-06 2013-03-05 International Business Machines Corporation Virtual fence
US8345097B2 (en) * 2008-02-15 2013-01-01 Harris Corporation Hybrid remote digital recording and acquisition system
WO2012145800A1 (en) * 2011-04-29 2012-11-01 Preservation Solutions Pty Ltd Monitoring the water safety of at least one person in a body of water
US8544120B1 (en) * 2012-03-02 2013-10-01 Lockheed Martin Corporation Device for thermal signature reduction
CN103646511A (zh) * 2013-11-25 2014-03-19 银川博聚工业产品设计有限公司 游泳池溺水动态监控装置
US20170167151A1 (en) * 2015-12-10 2017-06-15 Elazar Segal Lifesaving system and method for swimming pool
US10329785B2 (en) 2016-04-08 2019-06-25 Robson Forensic, Inc. Lifeguard positioning system
JP7313811B2 (ja) * 2018-10-26 2023-07-25 キヤノン株式会社 画像処理装置、画像処理方法、及びプログラム
CN109584509B (zh) * 2018-12-27 2020-08-11 太仓市小车东汽车服务有限公司 一种基于红外线与可见光组合的游泳池溺水监测方法
CN115278119B (zh) * 2022-09-30 2022-12-06 中国科学院长春光学精密机械与物理研究所 用于红外辐射特性测量的红外相机积分时间自动调整方法

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0683451B2 (ja) * 1986-08-01 1994-10-19 東芝エンジニアリング株式会社 水没検知システム
US4779095A (en) * 1986-10-28 1988-10-18 H & G Systems, Inc. Image change detection system
GB8811355D0 (en) * 1988-05-13 1997-09-17 Secr Defence An electro-optical detection system
US4862257A (en) * 1988-07-07 1989-08-29 Kaman Aerospace Corporation Imaging lidar system
JPH0378577A (ja) * 1989-08-19 1991-04-03 Mitsubishi Electric Corp 真空装置
US5043705A (en) * 1989-11-13 1991-08-27 Elkana Rooz Method and system for detecting a motionless body in a pool
GB9115537D0 (en) * 1991-07-18 1991-09-04 Secr Defence An electro-optical detection system
US5959534A (en) * 1993-10-29 1999-09-28 Splash Industries, Inc. Swimming pool alarm
US5638048A (en) * 1995-02-09 1997-06-10 Curry; Robert C. Alarm system for swimming pools
FR2741370B1 (fr) * 1995-11-16 1998-05-29 Poseidon Systeme de surveillance d'une piscine pour la prevention des noyades
US6963354B1 (en) * 1997-08-07 2005-11-08 The United States Of America As Represented By The Secretary Of The Navy High resolution imaging lidar for detecting submerged objects
US6628835B1 (en) * 1998-08-31 2003-09-30 Texas Instruments Incorporated Method and system for defining and recognizing complex events in a video sequence
US6327220B1 (en) * 1999-09-15 2001-12-04 Johnson Engineering Corporation Sonar location monitor
FR2802653B1 (fr) * 1999-12-21 2003-01-24 Poseidon Procede et systeme pour detecter un objet devant un fond
JP2002077897A (ja) * 2000-08-25 2002-03-15 Nippon Hoso Kyokai <Nhk> オブジェクト抽出型tvカメラ
WO2002046795A1 (fr) * 2000-12-06 2002-06-13 Poseidon Procede pour detecter des corps nouveaux dans une scene eclairee
SG95652A1 (en) * 2001-05-25 2003-04-23 Univ Nanyang Drowning early warning system
US6642847B1 (en) * 2001-08-31 2003-11-04 Donald R. Sison Pool alarm device

Also Published As

Publication number Publication date
US20070052697A1 (en) 2007-03-08
ES2303092T3 (es) 2008-08-01
EP1656650A1 (de) 2006-05-17
ATE388460T1 (de) 2008-03-15
FR2858450B1 (fr) 2005-11-11
WO2005013226A1 (fr) 2005-02-10
US7583196B2 (en) 2009-09-01
DE602004012283D1 (de) 2008-04-17
JP4766492B2 (ja) 2011-09-07
FR2858450A1 (fr) 2005-02-04
JP2007500892A (ja) 2007-01-18

Similar Documents

Publication Publication Date Title
EP1656650B1 (de) Verfahren und system zur erkennung eines körpers in einer zone in der nähe einer grenzfläche
EP2071280B1 (de) Vorrichtung zur erzeugung normaler informationen und verfahren zur erzeugung normaler informationen
US8594455B2 (en) System and method for image enhancement and improvement
FR3065307A1 (fr) Dispositif de capture d'une empreinte d'une partie corporelle.
FR2832528A1 (fr) Determination d'un illuminant d'une image numerique en couleur par segmentation et filtrage
EP3114831B1 (de) Optimierte videorauschunterdrückung für heterogenes multisensorsystem
EP1240622B1 (de) Verfahren und vorrichtung zur detektion eines gegenstandes in bezug auf eine oberfläche
FR2989198A1 (fr) Procede et dispositif de detection d'un objet dans une image
EP3388976B1 (de) Betrugserkennungsverfahren
EP1340104B1 (de) Verfahren und vorrichtung zur detektion eines körpers in der nähe einer wasser/luft schichtgrenze
EP2307948A2 (de) Interaktive vorrichtung und verfahren zu ihrer verwendung
EP2756483B1 (de) Verfahren und system zur erfassung und verarbeitung von bildern zur bewegungsdetektion
FR2945883A1 (fr) Procede et systeme de detection de l'etat ouvert ou ferme des yeux d'un visage.
FR2911984A1 (fr) Procede pour identifier des points symboliques sur une image d'un visage d'une personne
FR2817624A1 (fr) Procede, systeme et dispositif pour detecter un corps a proximite d'une interface de type eau/air
Chen et al. Extraction of oil slicks on the sea surface from optical satellite images by using an anomaly detection technique
EP4254250A1 (de) Verfahren und vorrichtung zur erkennung eines mobilen geräte-zuschauers auf der basis von tiefendaten
WO2023057726A1 (fr) Dispositif et procede opto-informatique d'analyse en lumiere traversante d'un recipient en materiau transparent ou translucide a l'aide d'une camera numerique polarimetrique
FR3141788A1 (fr) Système de surveillance volumétrique d’un espace et programme d’ordinateur correspondant.
FR3135812A1 (fr) Procédé de surveillance automatique des personnes dans un bassin d’eau, programme d’ordinateur et dispositif associés
FR3127613A1 (fr) Œuvre d’art panoptique
FR2994008A1 (fr) Methode de suivi de position 6d monocamera
FR2817625A1 (fr) Procede pour detecter des corps nouveaux dans une scene eclairee par des lumieres non forcement contraintes
FR3055997A1 (fr) Systeme pour la determination d'au moins une caracteristique relative au contour d'un sujet contenu dans au moins une image numerique

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060228

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20060811

DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: MG INTERNATIONAL

111Z Information provided on other rights and legal means of execution

Free format text: FR

Effective date: 20070919

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: FRENCH

REF Corresponds to:

Ref document number: 602004012283

Country of ref document: DE

Date of ref document: 20080417

Kind code of ref document: P

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2303092

Country of ref document: ES

Kind code of ref document: T3

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

REG Reference to a national code

Ref country code: IE

Ref legal event code: FD4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080605

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080805

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080606

Ref country code: IE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

26N No opposition filed

Effective date: 20081208

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080605

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080731

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

REG Reference to a national code

Ref country code: FR

Ref legal event code: GC

Ref country code: FR

Ref legal event code: AU

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080906

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080606

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 13

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 14

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20230711

Year of fee payment: 20

Ref country code: LU

Payment date: 20230711

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230831

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230710

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20231103

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: BE

Payment date: 20230929

Year of fee payment: 20