EP2260469B1 - Automatic face detection and identity masking in images, and applications thereof

Automatic face detection and identity masking in images, and applications thereof

Info

Publication number
EP2260469B1
Authority
EP
European Patent Office
Prior art keywords
face
image
regions
region
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP09755189.9A
Other languages
German (de)
English (en)
Other versions
EP2260469A2 (fr)
EP2260469A4 (fr)
Inventor
Sergey Ioffe
Lance Williams
Dennis Strelow
Andrea Frome
Luc Vincent
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Publication of EP2260469A2
Publication of EP2260469A4
Application granted
Publication of EP2260469B1
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/37 Determination of transform parameters for the alignment of images, i.e. image registration using transform domain methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/77 Retouching; Inpainting; Scratch removal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/168 Feature extraction; Face representation
    • G06V 40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships

Definitions

  • the present invention relates to image processing, and more particularly to identity masking by automatically detecting and processing face regions in an image, and applications thereof.
  • WO 03/049035 proposes an image processing system that provides automatic face or skin blurring for images.
  • a particular face is blurred on an image or on a series of images in a video.
  • the faces are determined in an image and face matching is performed to match a particular face to faces in the image. If a match is found, the face or a portion of the face is blurred in the image. Blurring is performed on a portion of the image containing a particular face.
  • the sensitivity of the face detector can be adjusted to detect possible regions that may correspond to a face. Then a pre-defined verification analysis is used to reject false positives i.e. features which do not correspond to human faces in the image. In an embodiment, a skin color analysis is performed to reject false positives detected by the face detector. Alternatively, a blur algorithm based on such verification criteria can be used to process potentially false positives. In an embodiment, a blur algorithm is applied on the basis of the probability that an area of color is a natural skin color. Higher probability results in greater blurring.
  • the present invention relates to identity masking by automatically detecting and processing face regions in an image, and applications thereof.
  • references to "one embodiment", "an embodiment", "an example embodiment", etc. indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • identity masking is performed to process the image before it can be viewed by others.
  • a face detection algorithm is applied to detect regions in the image that may contain faces, then an identity masking algorithm is selected to process faces in the detected regions in order to obscure the corresponding identities.
  • after identity masking, the processed image can be stored in an image database and is ready to be accessed by other viewers.
  • a motion blur algorithm can make a detected face appear as if photographed while in motion but its identity is obscured.
  • a face replacement algorithm can replace the detected face with some other facial image to obscure its identity.
  • FIG. 1 illustrates an exemplary system 100 for identity masking according to one embodiment of the present invention.
  • System 100 includes an image database of unprocessed images (or raw images), raw image database 102.
  • Raw image database 102 is connected to processing pipeline server 110, which includes a face detector 112 and an identity masker 114.
  • Processing pipeline server 110 detects faces in an image using face detector 112, and obscures the corresponding identities using identity masker 114.
  • System 100 further includes one or more image storage components, such as an image database for storing processed images.
  • Such a database is shown as processed image database 120, which is accessible by an image server 130.
  • Image server 130 can be accessed by image viewers. In the illustrated embodiment, access can be provided through network 140.
  • Image browser 150 is connected to network 140 in order to access the processed images through image server 130.
  • Identity masker 114 includes a set of identity masking tools using different identity masking algorithms. These tools include face replacer 116, which implements face replacement algorithms to replace a detected face by a substitute facial image. Another tool is motion blurrer 118, which implements motion blur algorithms to blur a face detected by face detector 112 as if it were photographed while in motion.
  • raw images may be used instead of a raw image database.
  • a particular raw image may be provided directly by a user, for example.
  • Raw images may also be taken from a video.
  • server 110 and the logic shown therein may be implemented in software, hardware, or firmware, or a combination thereof.
  • Server 110 may, for example, be implemented on one or more customized or general purpose computers, where the face detector 112, identity masker 114, face replacer 116, and motion blurrer 118 are implemented as software.
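
As a rough illustration of such a software arrangement (not the patent's implementation), the sketch below mirrors the component names of FIG. 1 but uses a stock OpenCV Haar cascade as the detector and a plain Gaussian blur as the masker; all class names, parameters, and the cascade file are assumptions made only for demonstration.

```python
import cv2

class FaceDetector:
    """Stand-in for face detector 112: a stock OpenCV Haar cascade.
    Assumes the opencv-python wheel, which ships the cascade files."""
    def __init__(self):
        path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
        self._cascade = cv2.CascadeClassifier(path)

    def detect(self, image):
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        # Permissive thresholds, so as many candidate face regions as possible are returned.
        return self._cascade.detectMultiScale(gray, scaleFactor=1.05, minNeighbors=2)

class IdentityMasker:
    """Stand-in for identity masker 114: here it simply Gaussian-blurs each region."""
    def mask(self, image, regions):
        out = image.copy()
        for (x, y, w, h) in regions:
            out[y:y + h, x:x + w] = cv2.GaussianBlur(out[y:y + h, x:x + w], (31, 31), 0)
        return out

class ProcessingPipelineServer:
    """Rough analogue of processing pipeline server 110 in FIG. 1."""
    def __init__(self):
        self.face_detector = FaceDetector()
        self.identity_masker = IdentityMasker()

    def process(self, raw_image):
        regions = self.face_detector.detect(raw_image)        # detect face regions
        return self.identity_masker.mask(raw_image, regions)  # obscure identities
```
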
  • Network 140 can be any network or combination of networks that can carry data communications, and may be referred to herein as a computer network. Such a network can include, but is not limited to, a local area network, medium area network, and/or wide area network such as the Internet. Network 140 can support protocols and technology including, but not limited to, World Wide Web protocols and/or services. Intermediate web servers, gateways, or other servers (not shown) may be provided between components of system 100 depending upon a particular application or environment.
  • to mask the identity of a face in an image, the region that contains the face needs to be detected first. This can be done by a face detection algorithm. Because the purpose of identity masking is to obscure the identities of individuals whose faces appear in an image, the face detection algorithm needs to identify possible face regions in the image.
  • processing pipeline server 110 in FIG. 1 gets a raw image from raw image database 102.
  • Processing pipeline server 110 uses face detector 112 to detect regions corresponding to faces (or face regions) in the raw image.
  • the sensitivity of face detector 112 is adjusted to detect as many face regions as possible.
  • the initially detected face regions may include true hits containing faces and false positives that do not actually correspond to faces.
  • face detector 112 may use verification criteria to verify the detected regions and reject false positives.
  • skin color analysis is used to verify if a detected region has a natural skin color. The regions that are mistakenly detected as faces are considered false positives.
  • Processing pipeline server 110 can also ask identity masker 114 to use identity masking algorithms to handle potential false positives based on verification criteria. For example, in one embodiment, a blur algorithm based on such verification criteria can be used to process potentially false positives.
  • the blur algorithm is applied on the basis of the probability that an area of color is a natural skin color. Higher probability results in greater blurring.
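
As an illustration of blurring in proportion to skin-color probability, the following sketch (not taken from the patent) estimates a crude skin-likelihood map from a YCrCb range test and blends each pixel toward a heavily blurred copy of the image according to that likelihood; the color thresholds and kernel sizes are assumptions chosen only for demonstration.

```python
import cv2
import numpy as np

def skin_probability(bgr):
    """Rough skin-likelihood map in [0, 1]: a YCrCb range test softened by a
    Gaussian. The range (Cr in [133, 173], Cb in [77, 127]) is illustrative only."""
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127)).astype(np.float32) / 255.0
    return cv2.GaussianBlur(mask, (15, 15), 0)

def blur_by_skin_probability(bgr, kernel_size=31):
    """Blur each pixel in proportion to its probability of being a natural skin color:
    the higher the probability, the more the output leans toward the blurred copy."""
    prob = skin_probability(bgr)[..., None]                       # H x W x 1 weight map
    blurred = cv2.GaussianBlur(bgr, (kernel_size, kernel_size), 0).astype(np.float32)
    mixed = prob * blurred + (1.0 - prob) * bgr.astype(np.float32)
    return mixed.astype(np.uint8)
```
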
  • face detector 112 may search a database of stored public images to verify whether a detected region matches one of those images. If the detected region has a match in the database, it is unmarked and is not processed for identity masking.
  • an identity masking algorithm can be applied to make the face regions unrecognizable so that the corresponding identities are obscured.
  • the faces in the face regions can be blurred, replaced by substitute facial images not subject to privacy issues, etc.
  • processing pipeline server 110 calls identity masker 114 to obscure identities corresponding to the detected face regions.
  • identity masker 114 uses motion blurrer 118 to make a detected face region appear as if it is in motion.
  • identity masker 114 uses face replacer 116 to replace a detected face region with a substitute facial image.
  • both motion blurrer 118 and face replacer 116 are used by identity masker 114.
  • image 200 is a raw image containing two faces.
  • face detector 112 takes image 200 as input, detects two regions containing two respective faces, and outputs information about the two detected face regions, region 222 and region 224.
  • the identity masker 114 chooses to motion blur the detected face regions in process 230. Region 222 and region 224 are changed to region 242 and region 244 using the motion blur algorithm in process 230. The blurred face regions 240 containing regions 242 and 244 are output to processed image 250.
  • the identity masker 114 replaces the detected face regions with substitute facial images as illustrated in FIG. 2B .
  • Region 222 and region 224 are replaced by regions 262 and 264 using a face replacement algorithm in process 230.
  • the replaced face regions 260 containing regions 262 and 264 are output to a processed image 270.
  • the identity masker can also use different identity masking algorithms to process different detected face regions respectively. For example, as illustrated in FIG. 2C , region 222 is motion blurred to create region 282, and region 224 is replaced by region 284. The identity masked face regions 280 are output to create a processed image 290. Alternatively, the identity masker can apply two or more different identity masking algorithms on the same detected face regions to mask their identities.
  • Motion blurrer 118 can use a motion blur algorithm to make the original face region in an image appear as if the face has been photographed while in motion or out of focus.
  • FIG. 3A shows an illustration of motion blur.
  • the original detected face region 310 is processed by motion blurrer 118 using a motion blur algorithm in process 320.
  • the output is a motion blurred face region 330.
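
A minimal way to produce such an effect, offered only as an illustration and not as the algorithm used by motion blurrer 118, is to convolve the detected region with a one-dimensional line kernel oriented along an assumed motion direction; the kernel length and angle below are placeholder parameters.

```python
import cv2
import numpy as np

def motion_blur_region(image, region, length=25, angle_deg=0.0):
    """Linear motion blur of one face region: convolve it with a line kernel
    of the given length, rotated to angle_deg (the assumed motion direction)."""
    x, y, w, h = region
    kernel = np.zeros((length, length), dtype=np.float32)
    kernel[length // 2, :] = 1.0                                   # horizontal line
    center = ((length - 1) / 2.0, (length - 1) / 2.0)
    rot = cv2.getRotationMatrix2D(center, angle_deg, 1.0)
    kernel = cv2.warpAffine(kernel, rot, (length, length))
    kernel /= max(kernel.sum(), 1e-8)                              # keep overall brightness
    out = image.copy()
    out[y:y + h, x:x + w] = cv2.filter2D(image[y:y + h, x:x + w], -1, kernel)
    return out
```
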
  • the substitute facial image can be a facial image not subject to privacy concerns, or a generated face different than the original face.
  • a face may be generated from a 3D computer graphics model, which can match the lighting in the image. Face replacement using such generated faces can result in a more natural appearance of the image than other replacement methods.
  • FIG. 3B illustrates one way to replace a detected face region with a substitute facial image that is selected from a facial database.
  • a substitute facial image is selected based on the profile of the detected face region 340.
  • the profile may include orientation, facial features (e.g. size, eyes, nose, mouth, etc.), or even three-dimensional information such as depth of the face.
  • the substitute facial image should have a similar orientation and size as the detected face. It can also have similar positions of facial features.
  • the detected face region 340 is replaced by the substitute facial image 360.
  • a substitute facial image can be generated by mixing the selected facial image with the detected face region. Because the generated facial image is different than the original detected face region, the identity of detected face region 340 is obscured.
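
As a sketch of the replace-and-mix idea (an illustration under assumed inputs, not the patent's algorithm), the substitute can be resized to the detected region and optionally blended with the original face, so that the synthesized face matches neither image exactly; the `mix` parameter is a hypothetical knob for that blend.

```python
import cv2
import numpy as np

def replace_face(image, region, substitute, mix=0.0):
    """Replace the detected face region with a substitute facial image.
    mix=0.0 pastes the substitute directly; 0 < mix < 1 blends it with the
    original face, producing a new face that differs from both."""
    x, y, w, h = region
    sub = cv2.resize(substitute, (w, h)).astype(np.float32)       # match the region size
    original = image[y:y + h, x:x + w].astype(np.float32)
    blended = (1.0 - mix) * sub + mix * original
    out = image.copy()
    out[y:y + h, x:x + w] = blended.astype(np.uint8)
    return out
```
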
  • FIG. 4 is a flow chart of an exemplary process 400 for identity masking according to one embodiment of the invention.
  • a raw image is selected from a raw image database.
  • the raw image database can be any storage means to store images. For example, it can be raw image database 102 in FIG. 1 . In alternative embodiments, the raw image can come from other sources such as video, etc.
  • a face detector e.g. face detector 112 is used to detect face regions in the selected raw image using a face detection algorithm in stage 420. The detected face regions will be processed to obscure corresponding identities.
  • a detected face region is selected.
  • an identity masking algorithm is chosen in stage 440.
  • a motion blur algorithm can be applied to obscure the identity in stage 452, or a face replacement algorithm can be applied in stage 454.
  • there is no selection stage 440 and one or more fixed masking algorithms are used each time.
  • a blur algorithm based on skin color can be chosen to obscure the identity.
  • Each pixel in the selected face region is blurred in proportion to its probability of having a natural skin color. Therefore, if the selected region has a low probability of corresponding to a human face based on color, the blurring effect performed on the region will be slight.
  • processing pipeline server 110 will determine in stage 460 if all detected face regions have been processed. If there are detected face regions which have not been processed, the routine will go back to stage 430 to select the next detected face region. Otherwise, if all the detected face regions in the selected raw image are processed, the processed image will be output in stage 470 to processed image database 120.
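
The stages of process 400 can be summarized in the following sketch; the detector object, the list of masking callables, and the trivial selection policy are illustrative assumptions rather than the patent's code.

```python
def mask_identities(raw_image, detector, maskers):
    """Sketch of process 400. `detector.detect` returns (x, y, w, h) regions;
    `maskers` is a list of callables (image, region) -> image."""
    image = raw_image.copy()
    regions = detector.detect(image)               # stage 420: detect face regions
    for region in regions:                         # stages 430/460: iterate over regions
        masker = choose_masker(maskers, region)    # stage 440: pick a masking algorithm
        image = masker(image, region)              # stage 452 or 454: apply it
    return image                                   # stage 470: output the processed image

def choose_masker(maskers, region):
    """Trivial stand-in for the selection in stage 440: always use the first algorithm."""
    return maskers[0]
```
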
  • a selected region is processed by one identity masking algorithm.
  • one or more identity masking algorithms can be applied on the same selected region to mask the corresponding identity.
  • selected face regions are processed serially.
  • the selected face regions may be processed in parallel.
  • FIG. 5 is a flow chart for an exemplary process 452 of identity masking using motion blur according to one embodiment of the invention. Once a detected face region is selected and motion blurring is chosen, a particular motion blur algorithm needs to be chosen to obscure the identity of the selected face region.
  • motion blur algorithms are available to obscure the selected face region such as the Line Integral Convolution motion blur, motion blur based on a Canonical Map Function, motion blur based on face orientation, etc.
  • other blur algorithms may be used.
  • more than one blur algorithm may be applied to a selected face region.
  • LIC: Line Integral Convolution
  • the Line Integral Convolution motion blur is applied to the selected face region in stage 522 for the motion blur effect.
  • LIC is well known in the art for visualizing a vector field of an image. It can involve selectively blurring the image as a function of the vector field to be displayed.
  • a vector field associated with the face region is created to represent the direction and extent of motion for each pixel in the blur. By varying the direction and the extent of motion of the vector field, the face region can be motion blurred in different directions with different amounts of blur.
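
A very simplified, unoptimized rendition of that idea is sketched below (an illustration, not the patent's LIC implementation): each output pixel averages input samples gathered by stepping along the local vector-field direction, so the direction and length of the walk control the direction and amount of blur. The single-channel input and the step count are assumptions.

```python
import numpy as np

def lic_blur(gray, vx, vy, length=15):
    """Simplified Line Integral Convolution over a single-channel image.
    vx, vy give the per-pixel motion direction; `length` controls blur extent."""
    h, w = gray.shape
    out = np.zeros((h, w), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            acc, count = 0.0, 0
            for sign in (1.0, -1.0):               # walk forward and backward
                px, py = float(x), float(y)
                for _ in range(length):
                    ix, iy = int(round(px)), int(round(py))
                    if not (0 <= ix < w and 0 <= iy < h):
                        break
                    acc += float(gray[iy, ix])
                    count += 1
                    dx, dy = vx[iy, ix], vy[iy, ix]
                    norm = (dx * dx + dy * dy) ** 0.5 + 1e-8
                    px += sign * dx / norm         # step along the field direction
                    py += sign * dy / norm
            out[y, x] = acc / max(count, 1)
    return out
```
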
  • the Canonical Map Function is also well known in the art; it provides an average estimate of three-dimensional depth when aligned with the selected face. The selected face region can then be motion blurred according to the face depth.
  • the orientation of the selected face region needs to be calculated first in stage 542.
  • the orientation of a face in an image relates to where the corresponding individual is facing.
  • the individual may directly face the camera, i.e., to the front.
  • the individual may face to the left or right of the camera.
  • the orientation of the selected face region may comprise a face direction vector, an image vector, and an angle between them.
  • the face direction vector is a vector representing the direction of the face.
  • the image vector is a vector associated with the image.
  • the face direction vector can be a vector in the direction of the nose, and the image vector can be a vector perpendicular to the image.
  • the motion blur algorithm based on face orientation is applied to the selected face region in stage 544.
  • the blurring corresponds to the motion of the face turning in the direction of increasing/decreasing the angle between the face direction vector and the image vector.
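
For illustration only (an assumption-laden sketch, not the patent's formulation), the turn angle can be computed from the two vectors with a dot product, and its value then used to scale the blur extent handed to a motion-blur routine such as the linear kernel sketched earlier; the mapping from angle to kernel length below is arbitrary.

```python
import numpy as np

def face_turn_angle(face_direction, image_normal=(0.0, 0.0, 1.0)):
    """Angle, in radians, between the face direction vector (e.g. along the nose)
    and the vector perpendicular to the image plane."""
    f = np.asarray(face_direction, dtype=np.float64)
    n = np.asarray(image_normal, dtype=np.float64)
    cos_a = np.dot(f, n) / (np.linalg.norm(f) * np.linalg.norm(n) + 1e-12)
    return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example: a face turned noticeably to one side gets a longer blur kernel.
angle = face_turn_angle((0.5, 0.0, 1.0))           # roughly 0.46 rad
blur_length = int(5 + 40 * angle / np.pi)          # illustrative mapping to a kernel length
```
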
  • the present invention is not limited to the above mentioned motion blur algorithms for identity masking. In alternative embodiments of the invention, other motion blur or image blur algorithms can also be applied upon selection. In some embodiments of the invention, because the blur algorithms have different extents of blur at different locations of the face region, the blurring process is irreversible or irrecoverable.
  • FIG. 6 is a flow chart of an exemplary process 454 for identity masking using face replacement algorithms according to one embodiment of the invention.
  • a substitute facial image replaces a selected face region in an image, so that the identity of the selected face region is obscured.
  • a face profile is determined for the selected face region in stage 620.
  • the face profile is often used in face recognition algorithms to identify a face.
  • the face profile can include locations and shapes of eyes, nose and mouth, face outline, face orientation, etc.
  • a substitute facial image can be either generated or selected from a face database in stage 630. There are different ways to generate the substitute facial image.
  • the substitute facial image can be generated by mixing one or more corresponding features of the selected face region and a facial image selected from the face database.
  • the substitute facial image can be generated by mixing some features of two or more detected face regions in the image.
  • the substitute facial image should have a size similar to the selected face region. For example, in one embodiment, the difference between lengths or heights of the two image regions can be less than 5% of the length or height of the selected face region.
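
A check of that kind reduces to a simple comparison; the 5% default below mirrors the example above, while the function name and argument layout are assumptions made for illustration.

```python
def similar_size(face_region, substitute_shape, tolerance=0.05):
    """Return True when the substitute's width and height each differ from the
    detected face region by less than `tolerance` (5% by default)."""
    _, _, w, h = face_region                       # (x, y, width, height)
    sub_h, sub_w = substitute_shape[:2]            # e.g. substitute.shape for a NumPy image
    return abs(sub_w - w) <= tolerance * w and abs(sub_h - h) <= tolerance * h
```
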
  • FIG. 7 is a flow chart of an exemplary process 700 for rejecting false positives of the face detection algorithm according to one embodiment of the invention.
  • the face detector marks regions of an image that possibly include a face.
  • the sensitivity of the face detector is tuned to mark as many face regions as possible, including both true hits and false positives. Then a marked region is selected in stage 710.
  • in stage 720, the selected region is tested using pre-defined verification criteria to verify that the region contains a face.
  • the verification criterion can be defined based on skin color, three-dimensional face profile, etc. If the selected region does not contain a face, it will be unmarked in stage 722. In stage 730, if all marked regions are tested, the procedure ends. Otherwise, the procedure goes back to stage 710, to select another marked region for verification.
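
Process 700 amounts to filtering the marked regions with a verification callable; the sketch below is an illustration under that assumption rather than the patent's code.

```python
def reject_false_positives(image, marked_regions, verify):
    """Sketch of process 700: keep only marked regions that pass verification
    (for example a skin-color test); rejected regions are simply unmarked."""
    kept = []
    for region in marked_regions:                  # stage 710: select a marked region
        if verify(image, region):                  # stage 720: apply verification criteria
            kept.append(region)
        # else: the region is unmarked (stage 722) by not being kept
    return kept                                    # stage 730: done once all regions are tested
```
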
  • the process of rejecting false positives is performed serially. In alternative embodiments, the process may be performed in parallel.
  • FIG. 8 shows some examples for excluding false positives based on skin color analysis.
  • Column 810 of the table 800 contains the original detected face regions from input images.
  • the face regions 811 and 812 in column 810 correspond to faces.
  • the regions 813 and 814 are falsely detected regions other than human faces, and region 815 is a poster with human figures in black and white.
  • a skin color analysis is applied to verify the above mentioned regions.
  • the results of skin color analysis of the above five detected regions are listed in column 820 of the table.
  • the skin color analysis results 821 and 822 indicate that the corresponding detected regions 811 and 812 include human skin colors and may therefore contain human faces.
  • the skin color analysis results 823-825 indicate that the corresponding detected regions 813-815 are false positives.
  • region 815 contains human faces, but they are in black and white; thus region 815 is indicated as a false positive.
  • the detected regions are then blurred based on the skin color analysis.
  • the final results are listed in column 830, and only regions containing human faces are blurred, i.e. images 831 and 832.
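
A region-level version of that skin-color test, usable as the `verify` callable in the rejection sketch above, might look as follows; the YCrCb range and the 30% threshold are illustrative assumptions, chosen so that grayscale content such as region 815 fails the test.

```python
import cv2
import numpy as np

def region_contains_skin(image, region, min_fraction=0.3):
    """Accept a region as a face candidate only if enough of its pixels fall
    inside a rough YCrCb skin-color range; black-and-white faces fail this test."""
    x, y, w, h = region
    patch = image[y:y + h, x:x + w]
    ycrcb = cv2.cvtColor(patch, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    return float(np.count_nonzero(mask)) / mask.size >= min_fraction
```
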

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Claims (11)

  1. A method for masking identities in an image, comprising:
    detecting, in an image, a region corresponding to a face;
    selecting a portion of the region that includes one or more features of the face; and
    motion blurring the portion to mask the one or more features of the face, based at least in part on an orientation of the face.
  2. The method of claim 1, wherein the detecting comprises using a skin color analysis to reject false positives.
  3. The method of claim 2, wherein the analysis to reject false positives comprises verifying that the region includes a color consistent with human skin.
  4. The method of claim 1, wherein the detecting comprises verifying the detected region against stored public images, and rejecting the detected region where the detected region matches a stored public image; or the selecting comprises selecting at least one of a nose, an eye, a mouth, and an outline of the face.
  5. The method of claim 1, wherein the blurring comprises motion blurring the one or more features of the face.
  6. The method of claim 5, wherein the blurring comprises: motion blurring the one or more features of the face based on a canonical map function or using a line integral convolution; or determining an orientation of the face, and motion blurring the portion of the region based on the orientation of the face.
  7. The method of claim 6, wherein determining an orientation of the face comprises associating a first vector with the face; and determining a relationship between the first vector and a second vector associated with the image.
  8. The method of claim 1, wherein the blurring comprises blurring each pixel of the one or more features of the face in proportion to its probability of being a natural skin color.
  9. A system for masking identities in an image, comprising:
    a processing pipeline server configured to mask the identities in the image, the processing pipeline server comprising:
    a face detector configured to detect face regions in the image;
    an identity masker configured to blur the face regions so as to mask the identities corresponding to the detected face regions, based at least in part on an orientation of the face regions, without otherwise distorting the image, the identity masker comprising a motion blurrer configured to blur the detected face regions as if they were in motion.
  10. The system of claim 9, wherein the face detector is configured to reject false positives.
  11. The system of claim 10 or 11, wherein the identity masker comprises a motion blurrer for blurring the detected face regions as if they were in motion, or a face replacer for replacing the detected face regions with substitute facial images; or the face detector is configured to use skin color to reject detected regions that are not face regions.
EP09755189.9A 2008-03-31 2009-03-31 Détection de visage automatique et masquage d'identité dans des images, et applications apparentées Active EP2260469B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/078,464 US8098904B2 (en) 2008-03-31 2008-03-31 Automatic face detection and identity masking in images, and applications thereof
PCT/US2009/001988 WO2009145826A2 (fr) 2008-03-31 2009-03-31 Détection de visage automatique et masquage d'identité dans des images, et applications apparentées

Publications (3)

Publication Number Publication Date
EP2260469A2 EP2260469A2 (fr) 2010-12-15
EP2260469A4 EP2260469A4 (fr) 2013-03-06
EP2260469B1 true EP2260469B1 (fr) 2014-05-07

Family

ID=41201130

Family Applications (1)

Application Number Title Priority Date Filing Date
EP09755189.9A Active EP2260469B1 (fr) 2008-03-31 2009-03-31 Détection de visage automatique et masquage d'identité dans des images, et applications apparentées

Country Status (8)

Country Link
US (2) US8098904B2 (fr)
EP (1) EP2260469B1 (fr)
JP (1) JP5361987B2 (fr)
KR (1) KR101572995B1 (fr)
CN (1) CN102067175B (fr)
AU (1) AU2009251833B2 (fr)
CA (1) CA2719992C (fr)
WO (1) WO2009145826A2 (fr)

Families Citing this family (256)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8554868B2 (en) 2007-01-05 2013-10-08 Yahoo! Inc. Simultaneous sharing communication interface
US9721148B2 (en) * 2007-12-31 2017-08-01 Applied Recognition Inc. Face detection and recognition
US9639740B2 (en) * 2007-12-31 2017-05-02 Applied Recognition Inc. Face detection and recognition
US8098904B2 (en) * 2008-03-31 2012-01-17 Google Inc. Automatic face detection and identity masking in images, and applications thereof
JP5352150B2 (ja) * 2008-08-01 2013-11-27 パナソニック株式会社 撮像装置
US9641537B2 (en) 2008-08-14 2017-05-02 Invention Science Fund I, Llc Conditionally releasing a communiqué determined to be affiliated with a particular source entity in response to detecting occurrence of one or more environmental aspects
US20110041185A1 (en) * 2008-08-14 2011-02-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Obfuscating identity of a source entity affiliated with a communiqué directed to a receiving user and in accordance with conditional directive provided by the receiving user
US9659188B2 (en) * 2008-08-14 2017-05-23 Invention Science Fund I, Llc Obfuscating identity of a source entity affiliated with a communiqué directed to a receiving user and in accordance with conditional directive provided by the receiving use
US8571354B2 (en) * 2008-09-25 2013-10-29 Tomtom Global Content B.V. Method of and arrangement for blurring an image
JP5237055B2 (ja) * 2008-11-07 2013-07-17 キヤノン株式会社 映像送信装置、映像送信方法、およびコンピュータプログラム
US8345921B1 (en) 2009-03-10 2013-01-01 Google Inc. Object detection with false positive filtering
CN101859370A (zh) * 2009-04-07 2010-10-13 佛山普立华科技有限公司 成像***及其成像方法
WO2011037579A1 (fr) * 2009-09-25 2011-03-31 Hewlett-Packard Development Company, L.P. Dispositif et procédés de reconnaissance du visage
CN102044064A (zh) * 2009-10-23 2011-05-04 鸿富锦精密工业(深圳)有限公司 影像处理***及方法
KR101611440B1 (ko) * 2009-11-16 2016-04-11 삼성전자주식회사 이미지 처리 방법 및 장치
US9672332B2 (en) 2010-02-18 2017-06-06 Nokia Technologies Oy Method and apparatus for preventing unauthorized use of media items
US9077950B2 (en) * 2010-04-28 2015-07-07 Thomas William Hickie System, method, and module for a content control layer for an optical imaging device
EP2577607A1 (fr) * 2010-05-28 2013-04-10 QUALCOMM Incorporated Création d'ensemble de données pour le suivi de cibles dotées de parties changeant de manière dynamique
JP5701007B2 (ja) * 2010-10-19 2015-04-15 キヤノン株式会社 監視カメラ装置及び監視カメラ装置の制御方法
TWI420405B (zh) * 2010-10-20 2013-12-21 Hon Hai Prec Ind Co Ltd 人臉影像代換系統及方法
CN102456232A (zh) * 2010-10-20 2012-05-16 鸿富锦精密工业(深圳)有限公司 人脸影像代换***以及方法
US20140192238A1 (en) 2010-10-24 2014-07-10 Linx Computational Imaging Ltd. System and Method for Imaging and Image Processing
US11080513B2 (en) 2011-01-12 2021-08-03 Gary S. Shuster Video and still image data alteration to enhance privacy
US8744119B2 (en) 2011-01-12 2014-06-03 Gary S. Shuster Graphic data alteration to enhance online privacy
US8842931B2 (en) * 2011-02-18 2014-09-23 Nvidia Corporation System, method, and computer program product for reducing noise in an image using depth-based sweeping over image samples
JP5156108B2 (ja) * 2011-04-23 2013-03-06 オリンパスイメージング株式会社 撮像装置および撮像方法
EP4009651A1 (fr) 2011-07-12 2022-06-08 Snap Inc. Procédés et systèmes permettant de fournir des fonctions d'édition de contenu visuel
JP2013069187A (ja) * 2011-09-26 2013-04-18 Dainippon Printing Co Ltd 画像処理システム、画像処理方法、サーバおよびプログラム
EP2771865A4 (fr) * 2011-10-25 2015-07-08 Sony Corp Appareil, procédé et produit programme d'ordinateur de traitement d'image
CN103186312A (zh) * 2011-12-29 2013-07-03 方正国际软件(北京)有限公司 终端、漫画形象处理***和漫画形象处理方法
US8761498B1 (en) 2012-01-26 2014-06-24 Google Inc. Face and license plate detection in street level images with 3-D road width features estimated from laser data
US11734712B2 (en) 2012-02-24 2023-08-22 Foursquare Labs, Inc. Attributing in-store visits to media consumption based on data collected from user devices
US8972357B2 (en) 2012-02-24 2015-03-03 Placed, Inc. System and method for data collection to validate location data
WO2013138268A1 (fr) * 2012-03-12 2013-09-19 Diy, Co. Détection faciale automatique et accord parental dans images et vidéo et ses applications
JP5845988B2 (ja) * 2012-03-16 2016-01-20 大日本印刷株式会社 画像処理システム、画像処理方法、サーバおよびプログラム
JP5880182B2 (ja) * 2012-03-19 2016-03-08 カシオ計算機株式会社 画像生成装置、画像生成方法及びプログラム
WO2013160539A1 (fr) 2012-04-27 2013-10-31 Nokia Corporation Procédé et appareil de protection de la confidentialité dans des images
WO2013166588A1 (fr) 2012-05-08 2013-11-14 Bitstrips Inc. Système et procédé pour avatars adaptables
US11590053B2 (en) 2012-05-17 2023-02-28 Zoll Medical Corporation Cameras for emergency rescue
US10420701B2 (en) 2013-05-17 2019-09-24 Zoll Medical Corporation Cameras for emergency rescue
US9378207B2 (en) 2012-06-29 2016-06-28 Nokia Technologies Oy Methods and apparatus for multimedia creation
US8953843B1 (en) 2012-07-17 2015-02-10 Google Inc. Selecting objects in a sequence of images
US8977003B1 (en) * 2012-07-17 2015-03-10 Google Inc. Detecting objects in a sequence of images
CN102841354B (zh) * 2012-08-09 2014-06-11 广东欧珀移动通信有限公司 一种具有显示屏幕的电子设备的保护视力实现方法
US20150206349A1 (en) 2012-08-22 2015-07-23 Goldrun Corporation Augmented reality virtual content platform apparatuses, methods and systems
JP2014078910A (ja) * 2012-10-12 2014-05-01 Sony Corp 画像処理装置、画像処理システム、画像処理方法、及びプログラム
US8977012B2 (en) 2012-10-31 2015-03-10 Google Inc. Image denoising system and method
CN103886549A (zh) * 2012-12-21 2014-06-25 北京齐尔布莱特科技有限公司 自动对图片中的车牌进行马赛克处理的方法及装置
KR101999140B1 (ko) * 2013-01-03 2019-07-11 삼성전자주식회사 카메라장치 및 카메라를 구비하는 휴대단말기의 이미지 촬영장치 및 방법
US20140253727A1 (en) * 2013-03-08 2014-09-11 Evocentauri Inc. Systems and methods for facilitating communications between a user and a public official
US10439972B1 (en) 2013-05-30 2019-10-08 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9705831B2 (en) 2013-05-30 2017-07-11 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9742713B2 (en) 2013-05-30 2017-08-22 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
KR101381343B1 (ko) * 2013-09-14 2014-04-04 주식회사 코어셀 얼굴인식기술을 이용하여 영상처리기기에서 촬영된 영상물로부터 얼굴영역을 검출하여 개인 프라이버시 보호를 위한 마스킹을 제공하는 방법
CN105917360A (zh) * 2013-11-12 2016-08-31 应用识别公司 面部检测和识别
EP2876605B1 (fr) * 2013-11-22 2016-04-06 Axis AB Masques de confidentialité de gradient
US9569656B2 (en) 2013-12-06 2017-02-14 Google Inc. Local real-time facial recognition
CA2863124A1 (fr) 2014-01-03 2015-07-03 Investel Capital Corporation Systeme et procede de partage de contenu utilisateur avec integration automatisee de contenu externe
US9628950B1 (en) 2014-01-12 2017-04-18 Investment Asset Holdings Llc Location-based messaging
US10082926B1 (en) 2014-02-21 2018-09-25 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
KR101426785B1 (ko) * 2014-03-05 2014-08-06 성결대학교 산학협력단 스마트 디바이스에서 예술감성 반영을 통한 민감영역에 대한 사용자 친화적 영상왜곡 방법
US8909725B1 (en) 2014-03-07 2014-12-09 Snapchat, Inc. Content delivery network for ephemeral objects
CN103914634A (zh) * 2014-03-26 2014-07-09 小米科技有限责任公司 图片加密方法、装置及电子设备
US9571785B2 (en) * 2014-04-11 2017-02-14 International Business Machines Corporation System and method for fine-grained control of privacy from image and video recording devices
US9276886B1 (en) 2014-05-09 2016-03-01 Snapchat, Inc. Apparatus and method for dynamically configuring application component tiles
CN104021350B (zh) * 2014-05-13 2016-07-06 小米科技有限责任公司 隐私信息隐藏方法及装置
US9396354B1 (en) 2014-05-28 2016-07-19 Snapchat, Inc. Apparatus and method for automated privacy protection in distributed images
US9537811B2 (en) 2014-10-02 2017-01-03 Snap Inc. Ephemeral gallery of ephemeral messages
JPWO2015186447A1 (ja) * 2014-06-03 2017-04-20 ソニー株式会社 情報処理装置、撮影装置、画像共有システム、情報処理方法およびプログラム
IL239237B (en) 2014-06-05 2018-12-31 Rotem Efrat Network document extension
US9113301B1 (en) 2014-06-13 2015-08-18 Snapchat, Inc. Geo-location based event gallery
US9225897B1 (en) 2014-07-07 2015-12-29 Snapchat, Inc. Apparatus and method for supplying content aware photo filters
WO2016004595A1 (fr) * 2014-07-09 2016-01-14 Splunk Inc. Réduction au minimum des opérations de flou pour créer un effet de flou pour une image
US9679194B2 (en) * 2014-07-17 2017-06-13 At&T Intellectual Property I, L.P. Automated obscurity for pervasive imaging
US10055717B1 (en) 2014-08-22 2018-08-21 Snap Inc. Message processor with application prompts
US10423983B2 (en) 2014-09-16 2019-09-24 Snap Inc. Determining targeting information based on a predictive targeting model
US10824654B2 (en) 2014-09-18 2020-11-03 Snap Inc. Geolocation-based pictographs
US11216869B2 (en) 2014-09-23 2022-01-04 Snap Inc. User interface to augment an image using geolocation
US10284508B1 (en) 2014-10-02 2019-05-07 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US9015285B1 (en) 2014-11-12 2015-04-21 Snapchat, Inc. User interface for accessing media at a geographic location
US9385983B1 (en) 2014-12-19 2016-07-05 Snapchat, Inc. Gallery of messages from individuals with a shared interest
US10311916B2 (en) 2014-12-19 2019-06-04 Snap Inc. Gallery of videos set to an audio time line
US9754355B2 (en) 2015-01-09 2017-09-05 Snap Inc. Object recognition based photo filters
US11388226B1 (en) 2015-01-13 2022-07-12 Snap Inc. Guided personal identity based actions
US10133705B1 (en) 2015-01-19 2018-11-20 Snap Inc. Multichannel system
CN110874571B (zh) * 2015-01-19 2023-05-05 创新先进技术有限公司 人脸识别模型的训练方法及装置
US9521515B2 (en) 2015-01-26 2016-12-13 Mobli Technologies 2010 Ltd. Content request by location
US9898836B2 (en) * 2015-02-06 2018-02-20 Ming Chuan University Method for automatic video face replacement by using a 2D face image to estimate a 3D vector angle of the face image
CA2977139C (fr) 2015-02-24 2021-01-12 Axon Enterprise, Inc. Systemes et procedes de caviardage en masse de donnees enregistrees
US10223397B1 (en) 2015-03-13 2019-03-05 Snap Inc. Social graph based co-location of network users
KR102662169B1 (ko) 2015-03-18 2024-05-03 스냅 인코포레이티드 지오-펜스 인가 프로비저닝
US9692967B1 (en) 2015-03-23 2017-06-27 Snap Inc. Systems and methods for reducing boot time and power consumption in camera systems
US10135949B1 (en) 2015-05-05 2018-11-20 Snap Inc. Systems and methods for story and sub-story navigation
US9881094B2 (en) 2015-05-05 2018-01-30 Snap Inc. Systems and methods for automated local story generation and curation
CN104992120A (zh) * 2015-06-18 2015-10-21 广东欧珀移动通信有限公司 一种图片加密方法及移动终端
CN104966067B (zh) * 2015-06-29 2018-05-18 福建天晴数码有限公司 保护隐私的图像处理方法及***
US10993069B2 (en) 2015-07-16 2021-04-27 Snap Inc. Dynamically adaptive media content delivery
CN105117122B (zh) * 2015-07-30 2019-05-14 深圳市金立通信设备有限公司 一种终端截屏方法及终端
US10284558B2 (en) 2015-08-12 2019-05-07 Google Llc Systems and methods for managing privacy settings of shared content
US10817898B2 (en) 2015-08-13 2020-10-27 Placed, Llc Determining exposures to content presented by physical objects
KR102407133B1 (ko) 2015-08-21 2022-06-10 삼성전자주식회사 전자 장치 및 이의 콘텐트 변형 방법
CN106484704A (zh) * 2015-08-25 2017-03-08 中兴通讯股份有限公司 一种图片处理方法及装置
US20170094019A1 (en) * 2015-09-26 2017-03-30 Microsoft Technology Licensing, Llc Providing Access to Non-Obscured Content Items based on Triggering Events
JP6675082B2 (ja) * 2015-09-30 2020-04-01 パナソニックIpマネジメント株式会社 見守り装置、見守り方法、およびコンピュータプログラム
US9652896B1 (en) 2015-10-30 2017-05-16 Snap Inc. Image based tracking in augmented reality systems
US9471852B1 (en) 2015-11-11 2016-10-18 International Business Machines Corporation User-configurable settings for content obfuscation
US10846895B2 (en) * 2015-11-23 2020-11-24 Anantha Pradeep Image processing mechanism
WO2017088340A1 (fr) * 2015-11-25 2017-06-01 腾讯科技(深圳)有限公司 Procédé et appareil de traitement d'informations d'image et support de stockage informatique
CN106803877A (zh) * 2015-11-26 2017-06-06 西安中兴新软件有限责任公司 一种摄影摄像方法和装置
US10474321B2 (en) 2015-11-30 2019-11-12 Snap Inc. Network resource location linking and visual content sharing
US9984499B1 (en) 2015-11-30 2018-05-29 Snap Inc. Image and point cloud based tracking and in augmented reality systems
CN105550592B (zh) * 2015-12-09 2018-06-29 上海斐讯数据通信技术有限公司 一种人脸图片的保护方法、***及移动终端
US10354425B2 (en) 2015-12-18 2019-07-16 Snap Inc. Method and system for providing context relevant media augmentation
US20170178287A1 (en) * 2015-12-21 2017-06-22 Glen J. Anderson Identity obfuscation
US11023514B2 (en) 2016-02-26 2021-06-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US10285001B2 (en) 2016-02-26 2019-05-07 Snap Inc. Generation, curation, and presentation of media collections
US10679389B2 (en) 2016-02-26 2020-06-09 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
JP2019114821A (ja) * 2016-03-23 2019-07-11 日本電気株式会社 監視システム、装置、方法およびプログラム
US10339365B2 (en) 2016-03-31 2019-07-02 Snap Inc. Automated avatar generation
US11900418B2 (en) 2016-04-04 2024-02-13 Snap Inc. Mutable geo-fencing system
US10805696B1 (en) 2016-06-20 2020-10-13 Pipbin, Inc. System for recording and targeting tagged content of user interest
US11876941B1 (en) 2016-06-20 2024-01-16 Pipbin, Inc. Clickable augmented reality content manager, system, and network
US11044393B1 (en) 2016-06-20 2021-06-22 Pipbin, Inc. System for curation and display of location-dependent augmented reality content in an augmented estate system
US11201981B1 (en) 2016-06-20 2021-12-14 Pipbin, Inc. System for notification of user accessibility of curated location-dependent content in an augmented estate
US10638256B1 (en) 2016-06-20 2020-04-28 Pipbin, Inc. System for distribution and display of mobile targeted augmented reality content
US11785161B1 (en) 2016-06-20 2023-10-10 Pipbin, Inc. System for user accessibility of tagged curated augmented reality content
US10334134B1 (en) 2016-06-20 2019-06-25 Maximillian John Suiter Augmented real estate with location and chattel tagging system and apparatus for virtual diary, scrapbooking, game play, messaging, canvasing, advertising and social interaction
US9681265B1 (en) 2016-06-28 2017-06-13 Snap Inc. System to track engagement of media items
US10430838B1 (en) 2016-06-28 2019-10-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections with automated advertising
US10181073B2 (en) * 2016-06-29 2019-01-15 Intel Corporation Technologies for efficient identity recognition based on skin features
US10387514B1 (en) 2016-06-30 2019-08-20 Snap Inc. Automated content curation and communication
US10855632B2 (en) 2016-07-19 2020-12-01 Snap Inc. Displaying customized electronic messaging graphics
US11068696B2 (en) 2016-08-24 2021-07-20 International Business Machines Corporation Protecting individuals privacy in public through visual opt-out, signal detection, and marker detection
WO2018045076A1 (fr) 2016-08-30 2018-03-08 C3D Augmented Reality Solutions Ltd Systèmes et procédés de cartographie et localisation simultanées
WO2018048895A1 (fr) * 2016-09-06 2018-03-15 Apple Inc. Réglages d'image en fonction des estimations de profondeur de champ
CN106384058B (zh) * 2016-09-12 2019-02-05 Oppo广东移动通信有限公司 发布图片的方法和装置
CN106446185A (zh) * 2016-09-28 2017-02-22 北京小米移动软件有限公司 产品推荐方法、装置以及服务器
US10169894B2 (en) * 2016-10-06 2019-01-01 International Business Machines Corporation Rebuilding images based on historical image data
EP3526964B1 (fr) * 2016-10-14 2024-02-21 Genetec Inc. Masquage dans un flux vidéo
US10432559B2 (en) 2016-10-24 2019-10-01 Snap Inc. Generating and displaying customized avatars in electronic messages
KR102163443B1 (ko) 2016-11-07 2020-10-08 스냅 인코포레이티드 이미지 변경자들의 선택적 식별 및 순서화
DE102016223859A1 (de) 2016-11-30 2018-05-30 Robert Bosch Gmbh Kamera zur Überwachung eines Überwachungsbereiches und Überwachungsvorrichtung sowie Verfahren zur Überwachung eines Überwachungsbereiches
US10203855B2 (en) 2016-12-09 2019-02-12 Snap Inc. Customized user-controlled media overlays
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US20180150697A1 (en) * 2017-01-09 2018-05-31 Seematics Systems Ltd System and method for using subsequent behavior to facilitate learning of visual event detectors
US10454857B1 (en) 2017-01-23 2019-10-22 Snap Inc. Customized digital avatar accessories
US10915911B2 (en) 2017-02-03 2021-02-09 Snap Inc. System to determine a price-schedule to distribute media content
US11250075B1 (en) 2017-02-17 2022-02-15 Snap Inc. Searching social media content
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US10074381B1 (en) 2017-02-20 2018-09-11 Snap Inc. Augmented reality speech balloon system
US10552500B2 (en) * 2017-03-02 2020-02-04 International Business Machines Corporation Presenting a data instance based on presentation rules
US10565795B2 (en) 2017-03-06 2020-02-18 Snap Inc. Virtual vision system
US10523625B1 (en) 2017-03-09 2019-12-31 Snap Inc. Restricted group content collection
US10582277B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US10581782B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US10212541B1 (en) 2017-04-27 2019-02-19 Snap Inc. Selective location-based identity communication
CN110800018A (zh) 2017-04-27 2020-02-14 斯纳普公司 用于社交媒体平台的朋友位置共享机制
US10467147B1 (en) 2017-04-28 2019-11-05 Snap Inc. Precaching unlockable data elements
US10803120B1 (en) 2017-05-31 2020-10-13 Snap Inc. Geolocation based playlists
WO2019002521A1 (fr) * 2017-06-29 2019-01-03 Koninklijke Philips N.V. Masquage de caractéristiques faciales d'un sujet dans une image
CN107330904B (zh) * 2017-06-30 2020-12-18 北京乐蜜科技有限责任公司 图像处理方法、装置、电子设备及存储介质
US11475254B1 (en) 2017-09-08 2022-10-18 Snap Inc. Multimodal entity identification
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US10499191B1 (en) 2017-10-09 2019-12-03 Snap Inc. Context sensitive presentation of content
US10573043B2 (en) 2017-10-30 2020-02-25 Snap Inc. Mobile-based cartographic control of display content
US10586070B2 (en) 2017-11-14 2020-03-10 International Business Machines Corporation Privacy protection in captured image for distribution
US11265273B1 (en) 2017-12-01 2022-03-01 Snap, Inc. Dynamic media overlay with smart widget
CN109886864B (zh) * 2017-12-06 2021-03-09 杭州海康威视数字技术股份有限公司 隐私遮蔽处理方法及装置
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US10678818B2 (en) 2018-01-03 2020-06-09 Snap Inc. Tag distribution visualization system
US10574890B2 (en) 2018-01-12 2020-02-25 Movidius Ltd. Methods and apparatus to operate a mobile camera for low-power usage
US11507614B1 (en) 2018-02-13 2022-11-22 Snap Inc. Icon based tagging
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US10327096B1 (en) 2018-03-06 2019-06-18 Snap Inc. Geo-fence selection system
KR20230129617A (ko) 2018-03-14 2023-09-08 스냅 인코포레이티드 위치 정보에 기초한 수집가능한 항목들의 생성
GB2572435B (en) * 2018-03-29 2022-10-05 Samsung Electronics Co Ltd Manipulating a face in an image
US11163941B1 (en) 2018-03-30 2021-11-02 Snap Inc. Annotating a collection of media content items
US10219111B1 (en) 2018-04-18 2019-02-26 Snap Inc. Visitation tracking system
US10902274B2 (en) * 2018-04-30 2021-01-26 Adobe Inc. Opting-in or opting-out of visual tracking
US10896197B1 (en) 2018-05-22 2021-01-19 Snap Inc. Event detection system
CN108765561A (zh) * 2018-05-30 2018-11-06 链家网(北京)科技有限公司 房屋虚拟三维模型生成过程中隐私信息处理方法及装置
CN108495049A (zh) * 2018-06-15 2018-09-04 Oppo广东移动通信有限公司 拍摄控制方法及相关产品
US10679393B2 (en) 2018-07-24 2020-06-09 Snap Inc. Conditional modification of augmented reality object
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US10915995B2 (en) * 2018-09-24 2021-02-09 Movidius Ltd. Methods and apparatus to generate masked images based on selective privacy and/or location tracking
US10698583B2 (en) 2018-09-28 2020-06-30 Snap Inc. Collaborative achievement interface
CN109389076B (zh) * 2018-09-29 2022-09-27 深圳市商汤科技有限公司 图像分割方法及装置
US10778623B1 (en) 2018-10-31 2020-09-15 Snap Inc. Messaging and gaming applications communication platform
US12026284B2 (en) 2018-11-20 2024-07-02 HCL Technologies Italy S.p.A System and method for facilitating a secure access to a photograph over a social networking platform
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US10939236B1 (en) 2018-11-30 2021-03-02 Snap Inc. Position service to determine relative position to map features
US11301966B2 (en) * 2018-12-10 2022-04-12 Apple Inc. Per-pixel filter
US11032670B1 (en) 2019-01-14 2021-06-08 Snap Inc. Destination sharing in location sharing system
US10939246B1 (en) 2019-01-16 2021-03-02 Snap Inc. Location-based context information sharing in a messaging system
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11972529B2 (en) 2019-02-01 2024-04-30 Snap Inc. Augmented reality system
US10936066B1 (en) 2019-02-13 2021-03-02 Snap Inc. Sleep detection in a location sharing system
US10838599B2 (en) 2019-02-25 2020-11-17 Snap Inc. Custom media overlay system
US10964082B2 (en) 2019-02-26 2021-03-30 Snap Inc. Avatar based on weather
US10852918B1 (en) 2019-03-08 2020-12-01 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11249614B2 (en) 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US10810782B1 (en) 2019-04-01 2020-10-20 Snap Inc. Semantic texture mapping system
US10885606B2 (en) * 2019-04-08 2021-01-05 Honeywell International Inc. System and method for anonymizing content to protect privacy
US10762607B2 (en) 2019-04-10 2020-09-01 Alibaba Group Holding Limited Method and device for sensitive data masking based on image recognition
US10560898B1 (en) 2019-05-30 2020-02-11 Snap Inc. Wearable device location systems
US10582453B1 (en) 2019-05-30 2020-03-03 Snap Inc. Wearable device location systems architecture
US10893385B1 (en) 2019-06-07 2021-01-12 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11307747B2 (en) 2019-07-11 2022-04-19 Snap Inc. Edge gesture interface with smart interactions
KR102284700B1 (ko) * 2019-08-19 2021-08-02 재단법인 아산사회복지재단 의료 영상 비식별화를 위한 신체 부위 위치 특정 방법, 프로그램 및 컴퓨팅 장치
EP3796655B1 (fr) 2019-09-20 2021-11-03 Axis AB Post-masquage sans transcodage
WO2021061551A1 (fr) * 2019-09-24 2021-04-01 Qsinx Management Llc Procédé et dispositif de traitement d'images de caméra
US11821742B2 (en) 2019-09-26 2023-11-21 Snap Inc. Travel based notifications
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11275972B2 (en) * 2019-11-08 2022-03-15 International Business Machines Corporation Image classification masking
KR102313554B1 (ko) * 2019-11-28 2021-10-18 한국전자기술연구원 개인정보 비식별화 방법 및 시스템
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US10880496B1 (en) 2019-12-30 2020-12-29 Snap Inc. Including video feed in message thread
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US11169658B2 (en) 2019-12-31 2021-11-09 Snap Inc. Combined map icon with action indicator
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
KR102215056B1 (ko) * 2020-01-23 2021-02-10 울산대학교 산학협력단 의료 영상 비식별화 시스템, 방법 및 컴퓨터 판독 가능한 기록매체
US11228551B1 (en) 2020-02-12 2022-01-18 Snap Inc. Multiple gateway message exchange
US11516167B2 (en) 2020-03-05 2022-11-29 Snap Inc. Storing data based on device location
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11120523B1 (en) * 2020-03-12 2021-09-14 Conduent Business Services, Llc Vehicle passenger detection system and method
US10956743B1 (en) 2020-03-27 2021-03-23 Snap Inc. Shared augmented reality system
US11430091B2 (en) 2020-03-27 2022-08-30 Snap Inc. Location mapping for large scale augmented-reality
US11615205B2 (en) 2020-05-28 2023-03-28 Bank Of America Corporation Intelligent dynamic data masking on display screens based on viewer proximity
US11290851B2 (en) 2020-06-15 2022-03-29 Snap Inc. Location sharing using offline and online objects
US11483267B2 (en) 2020-06-15 2022-10-25 Snap Inc. Location sharing using different rate-limited links
US11503432B2 (en) 2020-06-15 2022-11-15 Snap Inc. Scalable real-time location sharing framework
US11314776B2 (en) 2020-06-15 2022-04-26 Snap Inc. Location sharing using friend list versions
US11308327B2 (en) 2020-06-29 2022-04-19 Snap Inc. Providing travel-based augmented reality content with a captured image
KR20220015019A (ko) * 2020-07-30 2022-02-08 삼성전자주식회사 데이터 마스킹을 이용하여 이미지를 변환하는 전자 장치 및 방법
US11349797B2 (en) 2020-08-31 2022-05-31 Snap Inc. Co-location connection service
US12013968B2 (en) 2020-10-22 2024-06-18 Robert Bosch Gmbh Data anonymization for data labeling and development purposes
CN112258388A (zh) * 2020-11-02 2021-01-22 公安部第三研究所 一种公共安全视图脱敏测试数据生成方法、***以及存储介质
US11468617B2 (en) 2021-03-10 2022-10-11 International Business Machines Corporation Selective redaction of images
US11094042B1 (en) 2021-03-12 2021-08-17 Flyreel, Inc. Face detection and blurring methods and systems
US11606756B2 (en) 2021-03-29 2023-03-14 Snap Inc. Scheduling requests for location data
US11645324B2 (en) 2021-03-31 2023-05-09 Snap Inc. Location-based timeline media content system
WO2022225375A1 (fr) * 2021-04-22 2022-10-27 서울대학교산학협력단 Procédé et dispositif de reconnaissance faciale basée sur des dnn multiples à l'aide de pipelines de traitement parallèle
US12026362B2 (en) 2021-05-19 2024-07-02 Snap Inc. Video editing application for mobile devices
WO2023027853A1 (fr) * 2021-08-26 2023-03-02 Hypatia Group, Inc. Système de surveillance de santé avec filtre de blocage de l'œil de précision
US11829834B2 (en) 2021-10-29 2023-11-28 Snap Inc. Extended QR code
KR102500252B1 (ko) * 2022-01-20 2023-02-17 (주)에이아이매틱스 얼굴 개인정보 보호 기술을 이용한 기계학습 데이터베이스 구축 시스템
US12001750B2 (en) 2022-04-20 2024-06-04 Snap Inc. Location-based shared augmented reality experience system
US12020384B2 (en) 2022-06-21 2024-06-25 Snap Inc. Integrating augmented reality experiences with other components
US12020386B2 (en) 2022-06-23 2024-06-25 Snap Inc. Applying pregenerated virtual experiences in new location
JP2024059288A (ja) * 2022-10-18 2024-05-01 ソニーセミコンダクタソリューションズ株式会社 画像処理装置、画像処理方法および記録媒体

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6067399A (en) * 1998-09-02 2000-05-23 Sony Corporation Privacy mode for acquisition cameras and camcorders
US7106887B2 (en) 2000-04-13 2006-09-12 Fuji Photo Film Co., Ltd. Image processing method using conditions corresponding to an identified person
US6959099B2 (en) * 2001-12-06 2005-10-25 Koninklijke Philips Electronics N.V. Method and apparatus for automatic face blurring
JP2003319158A (ja) * 2002-04-18 2003-11-07 Toshiyuki Tani 画像処理システム
US6801642B2 (en) 2002-06-26 2004-10-05 Motorola, Inc. Method and apparatus for limiting storage or transmission of visual information
CN1237485C (zh) * 2002-10-22 2006-01-18 中国科学院计算技术研究所 利用快速人脸检测对新闻被采访者进行脸部遮挡的方法
US7660482B2 (en) * 2004-06-23 2010-02-09 Seiko Epson Corporation Method and apparatus for converting a photo to a caricature image
US7702131B2 (en) * 2005-10-13 2010-04-20 Fujifilm Corporation Segmenting images and simulating motion blur using an image sequence
CN1979558A (zh) * 2005-11-30 2007-06-13 中国科学院半导体研究所 Image restoration method based on high-dimensional space point distribution analysis
US7787664B2 (en) 2006-03-29 2010-08-31 Eastman Kodak Company Recomposing photographs from multiple frames
US8660319B2 (en) 2006-05-05 2014-02-25 Parham Aarabi Method, system and computer program product for automatic and semi-automatic modification of digital images of faces
WO2007136779A2 (fr) * 2006-05-19 2007-11-29 New Jersey Institute Of Technology Fiber-optic sensor based on an aligned embossed diaphragm
WO2008016645A2 (fr) * 2006-07-31 2008-02-07 Onlive, Inc. System and method for performing motion capture and image reconstruction

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120070042A1 (en) * 2008-03-31 2012-03-22 Google Inc. Automatic Face Detection and Identity Masking In Images, and Applications Thereof

Also Published As

Publication number Publication date
US20120070042A1 (en) 2012-03-22
CN102067175B (zh) 2014-08-27
JP5361987B2 (ja) 2013-12-04
EP2260469A2 (fr) 2010-12-15
WO2009145826A2 (fr) 2009-12-03
AU2009251833A1 (en) 2009-12-03
AU2009251833B2 (en) 2014-08-07
KR20100134079A (ko) 2010-12-22
US8098904B2 (en) 2012-01-17
KR101572995B1 (ko) 2015-11-30
EP2260469A4 (fr) 2013-03-06
WO2009145826A3 (fr) 2010-01-21
US8509499B2 (en) 2013-08-13
CA2719992A1 (fr) 2009-12-03
CN102067175A (zh) 2011-05-18
US20090262987A1 (en) 2009-10-22
CA2719992C (fr) 2016-09-06
JP2011516965A (ja) 2011-05-26

Similar Documents

Publication Publication Date Title
EP2260469B1 (fr) Automatic face detection and identity masking in images, and applications thereof
JP4755202B2 (ja) Facial feature detection method
WO2022156640A1 (fr) Method and apparatus for gaze correction of an image, electronic device, computer-readable storage medium, and computer program product
CN110210276A (zh) Movement trajectory acquisition method and device, storage medium, and terminal
Mosaddegh et al. Photorealistic face de-identification by aggregating donors’ face components
JP2008152789A (ja) Method and apparatus for calculating facial image similarity, method and apparatus for retrieving facial images using the same, and face synthesis method
JP7419080B2 (ja) Computer system and program
Chavan et al. Real time emotion recognition through facial expressions for desktop devices
CN112396053A (zh) Surround-view fisheye image target detection method based on cascaded neural networks
CN112802081B (zh) Depth detection method and apparatus, electronic device, and storage medium
CN112633221A (zh) Face orientation detection method and related apparatus
Mushtaq et al. Image copy move forgery detection: a review
JP2021068056A (ja) Road obstacle detection device, road obstacle detection method, and road obstacle detection program
CN110543813B (zh) Scene-based face portrait and gaze counting method and ***
US11341612B1 (en) Method and system for automatic correction and generation of facial images
CN110502961A (zh) Facial image detection method and apparatus
CN116309494B (zh) Method, apparatus, device, and medium for determining point-of-interest information in an electronic map
Çiftçi et al. Using false colors to protect visual privacy of sensitive content
CN116778533A (zh) Method, apparatus, device, and medium for extracting a palmprint full region-of-interest image
CN114332982A (zh) Face recognition model attack defense method, apparatus, device, and storage medium
Prinosil et al. Facial image de-identification using active appearance model
KR20130013261A (ko) Automatic face detection and identity masking in images, and applications thereof
CN113837018B (zh) Makeup progress detection method, apparatus, device, and storage medium
CN111666908B (zh) Method, apparatus, device, and storage medium for generating an interest profile of a video user
CN117765521A (zh) License plate recognition method, electronic device, and storage medium

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100930

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA RS

RIN1 Information on inventor provided before grant (corrected)

Inventor name: FROME, ANDREA

Inventor name: WILLIAMS, LANCE

Inventor name: IOFFE, SERGEY

Inventor name: STRELOW, DENNIS

Inventor name: VINCENT, LUC

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20130204

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/00 20060101ALI20130129BHEP

Ipc: G06K 9/00 20060101ALI20130129BHEP

Ipc: G06T 7/40 20060101AFI20130129BHEP

Ipc: G06T 5/00 20060101ALI20130129BHEP

Ipc: G06T 7/60 20060101ALI20130129BHEP

17Q First examination report despatched

Effective date: 20130927

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20131120

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 667186

Country of ref document: AT

Kind code of ref document: T

Effective date: 20140515

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602009023939

Country of ref document: DE

Effective date: 20140618

REG Reference to a national code

Ref country code: NL

Ref legal event code: T3

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 667186

Country of ref document: AT

Kind code of ref document: T

Effective date: 20140507

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140807

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140907

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140808

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140908

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602009023939

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 7

26N No opposition filed

Effective date: 20150210

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602009023939

Country of ref document: DE

Effective date: 20150210

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150331

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150331

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150331

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 8

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20090331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 602009023939

Country of ref document: DE

Representative's name: MARKS & CLERK (LUXEMBOURG) LLP, LU

Ref country code: DE

Ref legal event code: R081

Ref document number: 602009023939

Country of ref document: DE

Owner name: GOOGLE LLC (N.D.GES.D. STAATES DELAWARE), MOUN, US

Free format text: FORMER OWNER: GOOGLE, INC., MOUNTAIN VIEW, CALIF., US

REG Reference to a national code

Ref country code: FR

Ref legal event code: CD

Owner name: GOOGLE INC., US

Effective date: 20180213

Ref country code: FR

Ref legal event code: CJ

Effective date: 20180213

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 10

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140507

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IE

Payment date: 20230327

Year of fee payment: 15

Ref country code: FR

Payment date: 20230327

Year of fee payment: 15

Ref country code: FI

Payment date: 20230327

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230327

Year of fee payment: 15

Ref country code: DE

Payment date: 20230329

Year of fee payment: 15

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230505

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20230326

Year of fee payment: 15