EP2805313B1 - Method and device for determining and adjusting an area to be monitored by a video camera - Google Patents

Method and device for determining and adjusting an area to be monitored by a video camera

Info

Publication number
EP2805313B1
EP2805313B1 (application EP12809829.0A)
Authority
EP
European Patent Office
Prior art keywords
camera
area
monitored
georeferenced
visual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP12809829.0A
Other languages
German (de)
French (fr)
Other versions
EP2805313A1 (en)
Inventor
Michael Hoeynck
Alexander Flaig
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of EP2805313A1
Application granted
Publication of EP2805313B1
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 - Burglar, theft or intruder alarms
    • G08B13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639 - Details of the system layout
    • G08B13/19652 - Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 - Burglar, theft or intruder alarms
    • G08B13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19678 - User interface
    • G08B13/19686 - Interfaces masking personal details for privacy, e.g. blurring faces, vehicle license plates

Definitions

  • the invention relates to a method for determining and setting an area to be monitored by a video camera. Furthermore, the invention relates to a corresponding device and a corresponding camera arrangement.
  • Position refers to the translational coordinates of the mounting location or the center of the image recording.
  • Orientation refers to the spatial angles, i.e. the pitch, roll and yaw angles, which describe the inclination of the camera in the three spatial directions.
  • Position and orientation together describe the pose of the camera.
  • So-called video surveillance systems usually comprise several surveillance cameras, which are to be directed to surveillance areas.
  • Such video surveillance systems can be found, for example, in companies or businesses, but also in public places, at intersections, at railway stations or in other buildings.
  • the monitoring areas are to be secured and, if necessary, a tracking of suspicious objects is made possible.
  • a collection of personal data in public areas is subject to strict regulations.
  • Surveillance may be carried out under certain conditions if it does not extend to public space, such as a street or a square.
  • The privacy of a person who is located in an image area, for example a street, that is not primarily to be monitored but is nevertheless captured by the camera is thereby protected.
  • Such a function is referred to in today's camera systems as a "Privacy Mask".
  • When installing the camera, in particular for static and also dynamic (pan-tilt-zoom) cameras, certain areas are often selected in the form of polygons under control of the corresponding video image; these areas are then grayed out or rendered unrecognizable in the transmitted and possibly recorded or displayed video image.
  • A masking module for a monitoring system is described, wherein the monitoring system comprises at least one monitoring camera and is suitable for and arranged to observe monitoring areas with moving objects. Furthermore, the masking module comprises a selection device for selecting objects, wherein the masking module is designed to output the selected objects in masked form, the masking being limited to at least one selected spatial subregion of the monitoring area.
  • a camera surveillance system that includes a camera and a computing unit coupled thereto that is configured to obscure some areas of an image captured by the camera with a privacy mask.
  • a method for determining and setting an area to be monitored by a video camera.
  • A visual area that results during installation and calibration of the camera and is captured by the camera is tailored to the area to be monitored in that the position and orientation of the camera are described with georeferenced coordinates.
  • The georeferenced coordinates are taken from a georeferenced map of the area to be monitored.
  • Based on these coordinates, a mask is automatically determined which, when mapped onto the visual area, masks content of the visual area outside the area to be monitored.
  • Position and orientation of the camera are defined, for example, by a corresponding 3D camera pose.
  • a pose corresponds to a combination of position and orientation of the camera in three-dimensional space.
  • Advantageously, a database containing the data of the georeferenced map is accessed over a network via a programming interface (API, Application Programming Interface).
  • It is conceivable, for example, to use the Google Map API®.
  • The mask is computed on the basis of visual rays of the camera that are computed under geometric assumptions. With the help of the mask, the content of the visual area outside the area to be monitored is grayed out and/or removed and/or hidden.
  • the setting of the area to be monitored by the video camera is carried out fully automatically after an initial installation and calibration.
  • the present invention relates to a device for a camera arrangement with at least one camera, wherein the device has at least one module which has access to a georeferenced map and is configured to describe the position and orientation of the camera by means of georeferenced coordinates.
  • The device further comprises a computing unit which is configured to use the georeferenced coordinates to automatically determine a mask which, when mapped onto a visual area captured by the camera, masks content of the visual area outside a defined area to be monitored.
  • the present invention relates to the use of a method according to the invention and / or a camera arrangement according to the invention for or in a video surveillance system.
  • The camera arrangement 1, shown in a schematic representation, comprises a camera 2 having a first module 4 for recording and digitizing images of an area to be monitored by the camera 2. A further module 6 is used to evaluate images from the camera and to detect moving objects. Furthermore, the camera 2 has a further module 8, which has a connection via a wireless or wired interface 10 to an external memory or to the Internet 11, via which the module 8 has access to a georeferenced map.
  • The georeferenced map can therefore either be stored in an external memory or, for example, be retrievable from an external server via the Internet 11.
  • The interface can be realized via the Transmission Control Protocol/Internet Protocol (TCP/IP). However, other network protocols, such as UDP, can also be used.
  • Access to the georeferenced map makes it possible to describe the position and orientation of the camera 2 by means of georeferenced coordinates (see the coordinate-conversion sketch after this list).
  • The Google Map API®, for example, can be used as the georeferenced map.
  • The description of the position and orientation of the camera 2 by means of such georeferenced coordinates is unique worldwide and thus allows a clearly defined statement about the position and orientation of the camera 2.
  • The area to be monitored as well as the visual area actually captured by the camera 2 can be determined relatively accurately by means of the georeferenced map, so that a further module 12 can use the georeferenced coordinates to automatically determine a mask which, when mapped onto the visual area captured by the camera 2, masks content of the visual area outside a defined area to be monitored.
  • the present embodiment of the device according to the invention is thus arranged and integrated in the camera 2 and has at least the modules 8 and 12.
  • The visual area captured by the camera 2 is actively restricted, for reasons of privacy protection, to the defined area to be monitored, using the position and orientation of the camera, i.e. a 3D camera pose, and georeferenced maps, so that content outside the defined area to be monitored can be masked, for example hidden or grayed out.
  • The position and orientation of the camera are no longer referenced, as before, to the building or plot on which the camera is installed, but are mapped to geographic coordinates.
  • The area to be monitored, such as a defined plot or an acceptable surveillance area extending beyond it, for example as far as the street or one meter into the street, can then be defined.
  • The part of the visual area captured by the camera that lies outside the area to be monitored can then be grayed out or removed automatically. Such an operation can be performed automatically after an initial installation and calibration, and image data can thereby be actively suppressed from the outset.
  • During camera installation, the calibration of the camera is extended with the aid of the device according to the invention to take the aforementioned georeferenced data into account.
  • The defined area to be monitored is already marked on a digital, georeferenced map, so that all content outside this area, but adjacent to it, is automatically removed from the visual area captured by the camera, whether in the form of graying out or complete removal.
  • The device according to the invention does not necessarily have to be integrated into a housing of the camera 2, as shown in FIG. 1.
  • An example of a further embodiment of a device according to the invention is shown in FIG. 2.
  • a camera arrangement 15 comprises a first camera 20 and a second camera 30, which are designed to observe a region to be monitored.
  • the camera arrangement 15 comprises a device 25 with at least one module 26, which is designed to determine the respective position and orientation of the first camera 20 and the second camera 30.
  • The cameras 20 and 30 are each described with respect to their position and orientation by means of georeferenced coordinates, so that a further computing unit 27 can, if necessary, use the georeferenced coordinates to automatically determine a mask common to the cameras 20 and 30 which, when mapped onto the visual area captured jointly by the two cameras, masks content of the visual area outside a defined area to be monitored.
  • Communication for exchanging information takes place between the first camera 20 and the device 25 via a line 40.
  • the device 25 is connected to the second camera 30 via a further line 41.
  • The device 25 of this embodiment establishes a connection 42 to the Internet 28 in order to be able to provide, via this connection, the information of a correspondingly stored georeferenced map.
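The bullets above rely on expressing the camera pose in georeferenced coordinates. The following minimal Python sketch shows one common way to convert latitude/longitude values taken from a georeferenced map into local metric east/north offsets around the monitored site; the equirectangular approximation, the function name geo_to_local_enu and all example values are illustrative assumptions and are not prescribed by the patent.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS84 equatorial radius

def geo_to_local_enu(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Convert georeferenced (lat, lon) to local east/north offsets in metres
    relative to a reference point, using a small-area equirectangular
    approximation (sufficient for a single surveillance site)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = (lon - ref_lon) * math.cos(ref_lat) * EARTH_RADIUS_M
    north = (lat - ref_lat) * EARTH_RADIUS_M
    return east, north

# Example: camera pose taken from a georeferenced map (values are made up).
camera_geo = {"lat": 48.7758, "lon": 9.1829, "height_m": 6.0,
              "yaw_deg": 120.0, "pitch_deg": -25.0, "roll_deg": 0.0}
site_origin = (48.7757, 9.1827)  # reference point of the monitored plot

east, north = geo_to_local_enu(camera_geo["lat"], camera_geo["lon"], *site_origin)
print(f"camera at E={east:.1f} m, N={north:.1f} m, h={camera_geo['height_m']} m")
```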

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)
  • Alarm Systems (AREA)

Description

The invention relates to a method for determining and setting an area to be monitored by a video camera. Furthermore, the invention relates to a corresponding device and a corresponding camera arrangement.

State of the art

From the publication DE 10 2006 042 318 A1, a method for operating at least one camera is known in which a position and orientation of the at least one camera are determined and information about a scene to be observed from that position and orientation is provided for image processing. Position here refers to the translational coordinates of the mounting location or the center of the image recording; orientation refers to the spatial angles, i.e. the pitch, roll and yaw angles, which describe the inclination of the camera in the three spatial directions. Position and orientation together describe the pose of the camera.

Due to a growing need for security, more and more surveillance cameras are being installed in public places and in the area of property protection. These surveillance cameras are often followed by downstream image processing that is configured to automatically evaluate the images from the surveillance cameras.

The use of video cameras in security technology for monitoring fences, entrance areas and access points, with the aim of protecting property or ensuring the safety of persons by means of monitoring that is in individual cases continuous, is largely established and increasingly widespread.

So-called video surveillance systems usually comprise several surveillance cameras, which are directed at surveillance areas. Such video surveillance systems can be found, for example, in companies or businesses, but also in public places, at intersections, at railway stations or in other buildings. With the help of such surveillance systems, the surveillance areas are to be secured and, if necessary, tracking of suspicious objects is to be made possible. Set against this is the privacy, which must be protected, of persons who may, for example, also be located in such video-monitored areas. Currently, in order to respect privacy sufficiently, it is common practice to position and orient surveillance cameras so that only relevant surveillance areas are observed and recorded.

In connection with the increasing spread of video systems at railway stations, on public squares and outdoors, there is a growing desire and a concrete need to render the persons and, where appropriate, objects contained in certain image areas unrecognizable, for example to protect privacy or for data protection reasons.

According to applicable case law, the collection of personal data in public areas is subject to strict regulations. In the exercise of domiciliary rights, for example, surveillance may be carried out under certain conditions provided it does not extend to public space such as a street or a square. For this purpose, today's cameras offer a software function with which certain areas of the camera's video image can be grayed out or rendered unrecognizable in order to protect the privacy of persons. By preventing the recording of image information in the areas concerned, the privacy of a person is protected who is located in an image area, for example a street, that is not primarily to be monitored but is nevertheless captured by the camera.

Such a function is also referred to in today's camera systems as a "Privacy Mask". When installing the camera, in particular for static and also dynamic (pan-tilt-zoom) cameras, certain areas are often selected in the form of polygons under control of the corresponding video image; these areas are then grayed out or rendered unrecognizable in the transmitted and possibly recorded or displayed video image.

The publication DE 10 2008 007 199 A1 describes a masking module for a monitoring system, wherein the monitoring system comprises at least one monitoring camera and is suitable for and arranged to observe monitoring areas with moving objects. Furthermore, the masking module comprises a selection device for selecting objects, wherein the masking module is designed to output the selected objects in masked form, the masking being limited to at least one selected spatial subregion of the monitoring area.

From the publication DE 10 2007 029 476 A1, an image processing device for shadow detection and suppression in a camera image of an observed scene is known.

From the publication US 2005/0270371 A1, a camera surveillance system is known that includes a camera and a computing unit coupled thereto which is configured to obscure some areas of an image captured by the camera with a privacy mask.

The publication US 2008/0074494 A1 describes a video surveillance system with which a moving object can be tracked using a geospatial model.

Against the background of the cited prior art, it was an object of the present invention to make it possible in the future to compute the "privacy masks" for privacy protection, which until now had to be defined manually, in an at least partially automated way.

Disclosure of the invention

To achieve this object, a method having the features of claim 1, a device having the features of claim 5 and a suitable camera arrangement having the features of claim 7 are now provided.

According to the invention, a method is provided for determining and setting an area to be monitored by a video camera. In this method, a visual area that results during installation and calibration of the camera and is captured by the camera is tailored to the area to be monitored in that the position and orientation of the camera are described with georeferenced coordinates. The georeferenced coordinates are taken from a georeferenced map of the area to be monitored.

Based on the georeferenced coordinates of the camera, a mask is automatically determined which, when mapped onto the visual area, masks content of the visual area outside the area to be monitored.
The position and orientation of the camera are defined, for example, by a corresponding 3D camera pose. A pose corresponds to a combination of the position and orientation of the camera in three-dimensional space.
Advantageously, in the method a database containing the data of the georeferenced map is accessed over a network via a programming interface (API, Application Programming Interface). It is conceivable, for example, to use the Google Map API®.
It is conceivable that the mask is computed on the basis of visual rays of the camera that are computed under geometric assumptions.
With the help of the mask, the content of the visual area outside the area to be monitored is grayed out and/or removed and/or hidden.
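To make the mask determination more concrete, the following Python sketch implements one possible reading of this description: a ray is cast through every pixel under a pinhole camera model, intersected with a flat ground plane, and the hit point is tested against the polygon of the area to be monitored, expressed in a local metric frame. The pinhole and flat-ground assumptions, the frame conventions and all names and calibration parameters (fx, fy, cx, cy, privacy_mask, ...) are illustrative choices and are not specified by the patent.

```python
import numpy as np

def camera_to_world_rotation(yaw, tilt):
    """Camera-to-world rotation for a camera heading `yaw` (radians, measured
    from north towards east) and tilted downwards by `tilt` radians.
    World frame: x=east, y=north, z=up; camera frame: x=right, y=down, z=forward."""
    forward = np.array([np.sin(yaw) * np.cos(tilt),
                        np.cos(yaw) * np.cos(tilt),
                        -np.sin(tilt)])
    right = np.array([np.cos(yaw), -np.sin(yaw), 0.0])
    down = np.cross(forward, right)
    return np.column_stack([right, down, forward])

def point_in_polygon(x, y, polygon):
    """Even-odd ray-casting test; `polygon` is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def privacy_mask(width, height, fx, fy, cx, cy, cam_pos, cam_rot, monitored_polygon):
    """Boolean mask (True = keep pixel): a ray is cast through every pixel,
    intersected with the ground plane z = 0 and the hit point is tested against
    the local-frame polygon of the area to be monitored."""
    keep = np.zeros((height, width), dtype=bool)
    for v in range(height):
        for u in range(width):
            ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # pinhole model
            ray_world = cam_rot @ ray_cam
            if ray_world[2] >= 0.0:               # ray never reaches the ground
                continue
            t = -cam_pos[2] / ray_world[2]        # intersection with plane z = 0
            hit = cam_pos + t * ray_world
            keep[v, u] = point_in_polygon(hit[0], hit[1], monitored_polygon)
    return keep

# Example: camera 6 m above the ground, tilted 30 degrees downwards (values made up).
cam_pos = np.array([0.0, 0.0, 6.0])
cam_rot = camera_to_world_rotation(yaw=np.radians(20.0), tilt=np.radians(30.0))
monitored_polygon = [(-5.0, 2.0), (15.0, 2.0), (15.0, 25.0), (-5.0, 25.0)]  # metres east/north
mask = privacy_mask(160, 120, fx=180.0, fy=180.0, cx=80.0, cy=60.0,
                    cam_pos=cam_pos, cam_rot=cam_rot, monitored_polygon=monitored_polygon)
print(f"{mask.mean():.1%} of the pixels lie inside the area to be monitored")
```

The resulting boolean mask marks the pixels to keep; everything outside it can then be grayed out, removed or hidden as described above.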

According to one possible embodiment of the method according to the invention, the setting of the area to be monitored by the video camera is carried out fully automatically after an initial installation and calibration.
Furthermore, the present invention relates to a device for a camera arrangement with at least one camera, wherein the device has at least one module which has access to a georeferenced map and is configured to describe the position and orientation of the camera by means of georeferenced coordinates.
It is conceivable that the device further comprises a computing unit which is configured to use the georeferenced coordinates to automatically determine a mask which, when mapped onto a visual area captured by the camera, masks content of the visual area outside a defined area to be monitored.
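The division into a map-access module and a computing unit could, for example, be organised as in the following Python sketch; the class names, fields and the placeholder mask function are illustrative assumptions and do not reflect an interface defined by the patent.

```python
from dataclasses import dataclass
from typing import Callable, Sequence, Tuple

GeoPoint = Tuple[float, float]  # (latitude, longitude)

@dataclass
class GeoReferencedMapModule:
    """Module with access to a georeferenced map: it holds the camera's
    georeferenced pose and the polygon of the area to be monitored."""
    camera_lat_lon: GeoPoint
    camera_height_m: float
    camera_yaw_deg: float
    camera_tilt_deg: float
    monitored_polygon: Sequence[GeoPoint]

@dataclass
class MaskComputingUnit:
    """Computing unit that turns the georeferenced description into a privacy
    mask for the camera image; the actual projection is delegated to `mask_fn`
    (for instance the ray-casting sketch shown earlier)."""
    mask_fn: Callable[..., object]

    def determine_mask(self, geo_module: GeoReferencedMapModule, image_size):
        width, height = image_size
        return self.mask_fn(geo_module, width, height)

# Usage with a trivial placeholder mask function (keeps everything unmasked):
unit = MaskComputingUnit(mask_fn=lambda geo, w, h: [[True] * w for _ in range(h)])
module = GeoReferencedMapModule(
    camera_lat_lon=(48.7758, 9.1829), camera_height_m=6.0,
    camera_yaw_deg=120.0, camera_tilt_deg=25.0,
    monitored_polygon=[(48.77575, 9.18285), (48.77585, 9.18285),
                       (48.77585, 9.18300), (48.77575, 9.18300)])
mask = unit.determine_mask(module, image_size=(640, 480))
```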

Furthermore, the present invention relates to the use of a method according to the invention and/or a camera arrangement according to the invention for or in a video surveillance system.

Further advantages and embodiments of the invention emerge from the description and the accompanying drawings.

It is understood that the features mentioned above and those still to be explained below can be used not only in the particular combination indicated but also in other combinations or on their own without departing from the scope of the present invention.

Brief description of the drawings

FIG. 1
shows an embodiment of a camera arrangement according to the invention in a schematic representation.
FIG. 2
shows a further embodiment of a camera arrangement according to the invention in a schematic representation.

The embodiment of a camera arrangement 1 according to the invention shown schematically in FIG. 1 comprises a camera 2 having a first module 4 for recording and digitizing images of an area to be monitored by the camera 2. A further module 6 is used to evaluate images from the camera and to detect moving objects. Furthermore, the camera 2 has a further module 8, which has a connection via a wireless or wired interface 10 to an external memory or to the Internet 11, via which the module 8 has access to a georeferenced map. The georeferenced map can therefore either be stored in an external memory or, for example, be retrievable from an external server via the Internet 11. The interface can be realized via the Transmission Control Protocol/Internet Protocol (TCP/IP); however, other network protocols, such as UDP, can also be used. Access to the georeferenced map makes it possible to describe the position and orientation of the camera 2 by means of georeferenced coordinates. The Google Map API®, for example, can be used as the georeferenced map. The description of the position and orientation of the camera 2 by means of such georeferenced coordinates is unique worldwide and thus allows a clearly defined statement about the position and orientation of the camera 2.
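Access to the georeferenced map over the network interface could, for instance, look like the following Python sketch; the server URL, the query parameter and the GeoJSON-style response are assumptions made for illustration and are not part of the patent or of any particular map provider's API.

```python
from typing import List, Tuple

import requests  # any HTTP client on top of TCP/IP would do; UDP-based transports are also conceivable

# Hypothetical geodata endpoint holding the georeferenced map; the URL, the query
# parameter and the GeoJSON-style response are assumptions for this sketch and
# are not an interface defined by the patent or by a particular map provider.
GEO_SERVER_URL = "https://geodata.example.com/monitored-areas"

def fetch_monitored_polygon(site_id: str) -> List[Tuple[float, float]]:
    """Retrieve the polygon of the area to be monitored as (lat, lon) pairs."""
    response = requests.get(GEO_SERVER_URL, params={"site": site_id}, timeout=5.0)
    response.raise_for_status()
    feature = response.json()["features"][0]        # first GeoJSON feature
    ring = feature["geometry"]["coordinates"][0]    # outer ring, stored as (lon, lat)
    return [(lat, lon) for lon, lat in ring]

if __name__ == "__main__":
    polygon = fetch_monitored_polygon("plant-entrance-01")
    print(f"monitored area has {len(polygon)} vertices")
```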

Furthermore, on this basis the area to be monitored as well as the visual area actually captured by the camera 2 can be determined relatively accurately by means of the georeferenced map, so that a further module 12 can use the georeferenced coordinates to automatically determine a mask which, when mapped onto the visual area captured by the camera 2, masks content of the visual area outside a defined area to be monitored.

The present embodiment of the device according to the invention is thus arranged and integrated in the camera 2 and comprises at least the modules 8 and 12. By means of the device according to the invention it is possible, as described above, to actively restrict the visual area captured by the camera 2 to the defined area to be monitored for reasons of privacy protection, using the position and orientation of the camera, i.e. a 3D camera pose, and drawing on georeferenced maps, so that content outside the defined area to be monitored can be masked, for example hidden or grayed out.

An essential point is that the position and orientation of the camera are no longer mapped to the building or plot on which the camera is installed, as before, but to geographic coordinates. Taking the georeferenced coordinates of the camera 2 into account, the area to be monitored can then be defined on this basis, such as a defined plot or an acceptable surveillance area extending beyond it, for example as far as the street or one meter into the street. Under geometric assumptions about the mapping of visual rays in the camera 2, the part of the visual area captured by the camera that lies outside the area to be monitored can then be grayed out or removed automatically. Such an operation can be performed automatically after an initial installation and calibration, and image data can thereby be actively suppressed from the outset.
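Given a mask of the kind described above, suppressing image data outside the area to be monitored is a simple per-pixel operation, as the following minimal Python sketch illustrates; the function name, the gray value and the synthetic test data are illustrative only.

```python
import numpy as np

def apply_privacy_mask(frame, keep_mask, mode="gray"):
    """Suppress image content outside the area to be monitored.
    `frame` is an HxWx3 uint8 image, `keep_mask` an HxW boolean array that is
    True inside the area to be monitored (e.g. from the ray-casting sketch).
    mode="gray" overwrites masked pixels with a uniform gray value,
    mode="remove" blacks them out entirely."""
    out = frame.copy()
    fill_value = 128 if mode == "gray" else 0
    out[~keep_mask] = fill_value
    return out

# Example with synthetic data: a 480x640 frame and a mask keeping the left half.
frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
keep_mask = np.zeros((480, 640), dtype=bool)
keep_mask[:, :320] = True
masked_frame = apply_privacy_mask(frame, keep_mask, mode="gray")
```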

During camera installation, the calibration of the camera is extended with the aid of the device according to the invention to take the aforementioned georeferenced data into account. Furthermore, the defined area to be monitored is already marked on a digital, georeferenced map, so that all content outside this area, but adjacent to it, is automatically removed from the visual area captured by the camera, whether in the form of graying out or complete removal.
In this way, an outdoor video surveillance system can ensure privacy protection.
The device according to the invention does not necessarily have to be integrated into a housing of the camera 2, as shown in FIG. 1. A spatially separated arrangement of the individual modules is also conceivable.
An example of a further embodiment of a device according to the invention is shown in FIG. 2. Here, a camera arrangement 15 comprises a first camera 20 and a second camera 30, which are designed to observe an area to be monitored. In addition, the camera arrangement 15 comprises a device 25 with at least one module 26, which is designed to determine the respective position and orientation of the first camera 20 and the second camera 30. The cameras 20 and 30 are each described with respect to their position and orientation by means of georeferenced coordinates, so that a further computing unit 27 can, if necessary, use the georeferenced coordinates to automatically determine a mask common to the cameras 20 and 30 which, when mapped onto the visual area captured jointly by the two cameras, masks content of the visual area outside a defined area to be monitored.
Communication for exchanging information between the first camera 20 and the device 25 takes place via a line 40. The device 25 is connected to the second camera 30 via a further line 41. In addition, the device 25 of this embodiment establishes a connection 42 to the Internet 28 in order to be able to provide, via this connection, the information of a correspondingly stored georeferenced map.
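For a two-camera arrangement of this kind, a single shared georeferenced definition of the area to be monitored can drive the mask determination for each camera. The following Python sketch outlines that structure with a placeholder mask function; all names and values are illustrative assumptions rather than details taken from the patent.

```python
import numpy as np

def per_camera_masks(cameras, monitored_polygon, mask_fn):
    """Determine masks for several cameras from one shared georeferenced
    description of the area to be monitored. `mask_fn(camera, polygon)` is
    expected to return a boolean mask for that camera's image (for instance the
    ray-casting sketch shown earlier); here a dummy mask function is used."""
    return {camera["name"]: mask_fn(camera, monitored_polygon) for camera in cameras}

# Dummy mask function: keep the lower half of each image (placeholder only).
def dummy_mask(camera, polygon):
    h, w = camera["image_size"]
    mask = np.zeros((h, w), dtype=bool)
    mask[h // 2:, :] = True
    return mask

cameras = [
    {"name": "camera_20", "image_size": (480, 640), "lat": 48.7758, "lon": 9.1829},
    {"name": "camera_30", "image_size": (480, 640), "lat": 48.7759, "lon": 9.1830},
]
monitored_polygon = [(48.77575, 9.18285), (48.77585, 9.18285),
                     (48.77585, 9.18300), (48.77575, 9.18300)]
masks = per_camera_masks(cameras, monitored_polygon, dummy_mask)
print({name: mask.shape for name, mask in masks.items()})
```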

Claims (5)

  1. Method for determining and setting an area to be monitored by a video camera, wherein a visual area that results from installation and calibration of the camera and is captured by the camera is tailored to the area to be monitored, wherein the captured visual area is tailored to the area to be monitored by describing the position and orientation of the camera using georeferenced coordinates taken from a georeferenced map of the area to be monitored, and by using the georeferenced coordinates to automatically determine a mask that, when mapped onto the visual area, masks content of the visual area outside the area to be monitored, characterized in that
    the mask is computed on the basis of visual rays of the camera that are computed under geometric assumptions, wherein the camera has a module that is used to access an external memory containing data of the georeferenced map via a programming interface over the Internet, the area to be monitored already being marked on the georeferenced map.
  2. Method according to Claim 1, in which the position and orientation of the camera are defined by a 3D camera pose, i.e. a combination of the position and orientation of the camera in three-dimensional space.
  3. Method according to either of Claims 1 or 2, in which the mask is used to grey out and/or remove and/or hide the content of the visual area outside the area to be monitored.
  4. Device for a camera arrangement, in particular for performing the method according to one of Claims 1 to 3, wherein the device has at least one module that has access to a georeferenced map and that is configured to describe the position and orientation of the camera by means of georeferenced coordinates, wherein the device further has a computation unit configured to tailor a visual area that results from installation and calibration of the camera and is captured by the camera to the area to be monitored and to use the georeferenced coordinates to automatically determine a mask that, when mapped onto a visual area captured by the camera, masks content of the visual area outside a defined area to be monitored, characterized in that
    the computation unit is configured to compute the mask on the basis of visual rays of the camera that are computed under geometric assumptions, wherein the module is used to access an external memory containing data of the georeferenced map via a programming interface over the Internet, the area to be monitored already being marked on the georeferenced map.
  5. Camera arrangement having a camera and a device according to Claim 4.
EP12809829.0A 2012-01-17 2012-12-28 Method and device for determining and adjusting an area to be monitored by a video camera Active EP2805313B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102012200573A DE102012200573A1 (en) 2012-01-17 2012-01-17 Method and device for determining and setting an area to be monitored by a video camera
PCT/EP2012/077011 WO2013107606A1 (en) 2012-01-17 2012-12-28 Method and device for determining and adjusting an area to be monitored by a video camera

Publications (2)

Publication Number Publication Date
EP2805313A1 EP2805313A1 (en) 2014-11-26
EP2805313B1 true EP2805313B1 (en) 2018-12-26

Family

ID=47501277

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12809829.0A Active EP2805313B1 (en) 2012-01-17 2012-12-28 Method and device for determining and adjusting an area to be monitored by a video camera

Country Status (3)

Country Link
EP (1) EP2805313B1 (en)
DE (1) DE102012200573A1 (en)
WO (1) WO2013107606A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6509926B1 (en) * 2000-02-17 2003-01-21 Sensormatic Electronics Corporation Surveillance apparatus for camera surveillance system
GB2404247B (en) * 2003-07-22 2005-07-20 Hitachi Int Electric Inc Object tracing method and object tracking apparatus
US8212872B2 (en) * 2004-06-02 2012-07-03 Robert Bosch Gmbh Transformable privacy mask for video camera images
DE102006042318B4 (en) 2006-09-08 2018-10-11 Robert Bosch Gmbh Method for operating at least one camera
US20080074494A1 (en) * 2006-09-26 2008-03-27 Harris Corporation Video Surveillance System Providing Tracking of a Moving Object in a Geospatial Model and Related Methods
DE102007029476A1 (en) 2007-06-26 2009-01-08 Robert Bosch Gmbh Image processing apparatus for shadow detection and suppression, method and computer program
US8098282B2 (en) * 2007-07-13 2012-01-17 Honeywell International Inc. Privacy zone algorithm for ptz dome cameras
DE102008007199A1 (en) 2008-02-01 2009-08-06 Robert Bosch Gmbh Masking module for a video surveillance system, method for masking selected objects and computer program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
WO2013107606A1 (en) 2013-07-25
DE102012200573A1 (en) 2013-07-18
EP2805313A1 (en) 2014-11-26

Similar Documents

Publication Publication Date Title
DE102005063217B4 (en) Method for configuring a monitoring device for monitoring a room area and corresponding monitoring device
EP1454285B1 (en) Video monitoring system with object masking
DE102005021735B4 (en) Video surveillance system
DE102010038341B4 (en) Video surveillance system and method for configuring a video surveillance system
WO2009095106A1 (en) Masking module for a video monitoring system method for masking selected objects and computer programme
EP2104904B1 (en) Method and apparatus for monitoring a spatial volume and a calibration method
DE102008039130A1 (en) Automatic tracing and identification system for movable object e.g. human, in building, has safety monitoring sensor connected with safety monitoring system such that tracing camera receives desired data when sensor is operated
DE102007053812A1 (en) Video surveillance system configuration module, configuration module monitoring system, video surveillance system configuration process, and computer program
DE102006042318B4 (en) Method for operating at least one camera
WO2015028294A1 (en) Monitoring installation and method for presenting a monitored area
DE102012211298A1 (en) Display device for a video surveillance system and video surveillance system with the display device
DE10151983A1 (en) Method for automatic documentation of a traffic accident and recording of the layout of vehicles and objects involved in it, by use of a laser measurement device with an associated differential global positioning system
EP3236440B1 (en) Device, system and method for marking-free slope monitoring and/or building inspection
EP3021256A1 (en) Method for image processing, presence detector and lighting system
EP2805313B1 (en) Method and device for determining and adjusting an area to be monitored by a video camera
DE10049366A1 (en) Security area monitoring method involves using two image detection units whose coverage areas overlap establishing monitored security area
WO2017144033A1 (en) Method for identifying and displaying changes in a real environment comprising a real terrain and real objects situated therein
DE102009000810A1 (en) Device for segmenting an object in an image, video surveillance system, method and computer program
EP3542528B1 (en) Display apparatus for a monitoring installation of a monitoring area
LU102252B1 (en) Computer-implemented method for determining a shading state of an object
DE102007056835A1 (en) Image processing module for estimating an object position of a surveillance object, method for determining an object position of a surveillance object and computer program
DE102014223433A1 (en) Dynamic masking of video recordings
DE102017216372A1 (en) Method for determining work priorities and / or danger points on a construction site
EP3489842A1 (en) Forensic database
DE10214306A1 (en) Procedure for monitoring data movements

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140818

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20161026

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20180730

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1082494

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190115

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502012014067

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190326

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190326

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20181226

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190426

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190426

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181228

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502012014067

Country of ref document: DE

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20181231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181228

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181231

26N No opposition filed

Effective date: 20190927

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181231

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 1082494

Country of ref document: AT

Kind code of ref document: T

Effective date: 20181228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181226

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20121228

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231220

Year of fee payment: 12

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20231219

Year of fee payment: 12

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240227

Year of fee payment: 12