EP0871331A2 - Method and apparatus for adaptively coding a contour of an object - Google Patents

Method and apparatus for adaptively coding a contour of an object

Info

Publication number
EP0871331A2
EP0871331A2 (application numbers EP19970304110, EP97304110A)
Authority
EP
European Patent Office
Prior art keywords
contour
current
overlapping
matching
contours
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP19970304110
Other languages
English (en)
French (fr)
Other versions
EP0871331A3 (de)
EP0871331B1 (de)
Inventor
Jin-Hun Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WiniaDaewoo Co Ltd
Original Assignee
Daewoo Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daewoo Electronics Co Ltd filed Critical Daewoo Electronics Co Ltd
Publication of EP0871331A2 publication Critical patent/EP0871331A2/de
Publication of EP0871331A3 publication Critical patent/EP0871331A3/de
Application granted granted Critical
Publication of EP0871331B1 publication Critical patent/EP0871331B1/de
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00Image coding
    • G06T9/20Contour coding, e.g. using detection of edges
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding

Definitions

  • the present invention relates to a method and apparatus for encoding a contour of an object expressed in a video signal; and, more particularly, to a method and apparatus capable of reducing the amount of transmission data through the use of a contour motion estimation technique.
  • One of the techniques for encoding video signals in a low bit-rate encoding system is the so-called object-oriented analysis-synthesis coding technique (see Michael Hötter, "Object-Oriented Analysis-Synthesis Coding Based on Moving Two-Dimensional Objects", Signal Processing: Image Communication 2, 409-428 (December 1990)).
  • an input video image is divided into objects; and three sets of parameters for defining the motion, contour and pixel data of each object are processed through different encoding channels.
  • contour information is important for the analysis and synthesis of the object shape.
  • a classical coding method for representing the contour information is a chain coding method.
  • the chain coding method entails no loss of contour information, but requires a substantial number of bits for the representation thereof.
  • DST: discrete sine transform
  • the present invention provides a method and apparatus for contour coding employing a contour motion estimation technique based on a difference between a current and previous contours.
  • a method for encoding a current contour based on one or more previously reconstructed contours of video image signals comprising the steps of: (a) choosing one of the previously reconstructed contours as a predicted contour for the current contour; (b) overlapping the predicted contour with the current contour; (c) finding matching segments between the predicted and the current contours and setting two end points of each matching segment as major vertices; (d) widening the overlapped predicted contour and determining matching sections on the current contour, wherein each matching section represents a portion of the current contour overlapping with the widened contour between two major vertices; (e) setting the two major vertices at the ends of each matching section as primary vertices; (f) polygonal-approximating each non-matching section on the current contour, thereby determining secondary vertices on non-matching sections; and (g) encoding the current contour by representing each matching section by a portion of the predicted contour between the two primary vertices thereof and each non-matching section by the secondary vertices determined thereon.
  • Referring to Fig. 1, there is shown a schematic block diagram of an inventive apparatus 10 for encoding contours in a frame signal.
  • a contour image signal of a current frame having therein one or more objects is inputted to a contour detection unit 100 and a motion estimation unit 300 via a line L50 in the form of a segmentation mask, wherein each pixel in the segmentation mask has a label identifying the region it belongs to. For instance, a pixel in the background has the label "0" and each pixel in an object is labeled by one of the non-zero values.
  • the contour detection unit 100 detects contours from the input segmentation mask and assigns index data to each of the contours included in the contour image signal according to a processing order thereof; and sequentially outputs contour information for each of the contours in the current frame, wherein the contour information includes contour data representing positions of contour pixels on a contour and index data thereof.
  • the contour information on a line L10 from the contour detection unit 100 is provided as current contour information to an inter-coding unit 200, and a motion estimation unit 300.
  • the motion estimation unit 300 detects a most similar contour to the current contour on the line L10 based on the current contour image signal on the line L50, the current contour information on the line L10 and a reconstructed previous contour image signal coupled thereto from a frame memory 700 via a line L30.
  • the previous contour image signal is also in the form of a segmentation mask, each pixel therein having a label identifying a region it belongs to.
  • Outputs of the motion estimation unit 300 on lines L20 and L40 are index data of the most similar contour and motion information representing a displacement between the current contour and the most similar contour.
  • the motion estimation unit 300 will be described in detail with reference to Figs. 2 and 3.
  • the motion estimation unit 300 includes a global motion vector detection block 310, a predicted contour image generation block 320, and an optimum contour detection block 330.
  • the global motion vector detection block 310, based on the previous contour image signal on the line L30 and the current contour image signal on the line L50, detects a global motion vector (GMV) yielding a largest number of object pixels overlapping with each other. Detection of the GMV is carried out within a predetermined search range of, e.g., +/- 16 pixels.
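  The GMV search described above amounts to an exhaustive overlap count over all displacements in the search range. The following is a minimal sketch, assuming the segmentation masks are NumPy arrays whose non-zero entries are object pixels; the function name, array layout, and (dx, dy) convention are illustrative, not taken from the patent:

```python
import numpy as np

def global_motion_vector(prev_mask, cur_mask, search=16):
    """Brute-force GMV search: return the (dx, dy) displacement within
    +/- `search` pixels that maximizes the number of object pixels
    (non-zero labels) of the shifted previous mask overlapping with
    object pixels of the current mask."""
    prev_obj = prev_mask != 0
    cur_obj = cur_mask != 0
    h, w = cur_obj.shape
    best, gmv = -1, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # Shift prev_obj by (dy, dx), zero-padding at the borders.
            shifted = np.zeros_like(prev_obj)
            ys_dst = slice(max(0, dy), min(h, h + dy))
            xs_dst = slice(max(0, dx), min(w, w + dx))
            ys_src = slice(max(0, -dy), min(h, h - dy))
            xs_src = slice(max(0, -dx), min(w, w - dx))
            shifted[ys_dst, xs_dst] = prev_obj[ys_src, xs_src]
            overlap = int(np.logical_and(shifted, cur_obj).sum())
            if overlap > best:
                best, gmv = overlap, (dx, dy)
    return gmv
```

  A full ±16-pixel search evaluates 33 × 33 = 1089 displacements; real encoders typically restrict or hierarchically refine this search for speed.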
  • the GMV detected at the global motion vector detection block 310 is fed to the predicted contour image generation block 320 and provided on lines L22 and L42 leading to the motion compensation unit 400 and a multiplexor (MUX) 800, respectively.
  • At the predicted contour image generation block 320, the previous contour image coupled thereto via the line L30 is shifted by the GMV to produce a predicted contour image. Further, as in the contour detection unit 100 shown in Fig. 1, the predicted contour image generation block 320 detects contours in the predicted contour image. Contour information for each of the predicted contours detected at the predicted contour image generation block 320 is fed on a line L60.
  • At the optimum contour detection block 330, based on the predicted contour information on the line L60 and the current contour information on the line L10, an optimum contour, i.e., a predicted contour most similar to the current contour, is detected among predicted contours residing within a preset search range, e.g., +/- 8 pixels of the current contour; and a local motion vector (LMV) representing a displacement between the current contour and the optimum contour and the index data of the optimum contour are outputted on the lines L20 and L40.
  • the index data of the predicted contour has a same value as a label of the object pixels corresponding to the predicted contour.
  • Referring to Fig. 3, there is shown a detailed block diagram of the optimum contour detection block 330, which includes a candidate contour determination sector 331, a matching sector 333, and an optimum contour determination sector 335.
  • the candidate contour determination sector 331 detects predicted contours residing within the preset search range from the current contour and calculates the lengths of the current and the detected predicted contours based on the current contour information and the predicted contour information on the lines L10 and L60, respectively. Thereafter, the lengths of the current contour and each of those predicted contours within the preset search range are compared. If the difference between the lengths of the current and a predicted contour is smaller than M times the length of the shorter of the two contours, the predicted contour is determined as a candidate contour, M being a predetermined number. After determining one or more candidate contours for the current contour, indication signals identifying those candidate contours, e.g., index data of the candidate contours, are fed to the matching sector 333.
  • the length of a contour can be defined by, for example, the number of contour pixels which constitute the contour.
  • the candidate contours can be determined based on the numbers of object pixels positioned inside the respective contours in lieu of the lengths thereof.
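  The candidate test above is a simple relative-length comparison. A minimal sketch, assuming contour lengths are measured as contour-pixel counts and predicted contours are keyed by their index data; the function name and the choice M = 0.5 are illustrative only:

```python
def select_candidates(cur_len, predicted, m=0.5):
    """Return index data of predicted contours whose length differs
    from the current contour's length by less than M times the length
    of the shorter of the two contours.

    cur_len   -- length (pixel count) of the current contour
    predicted -- dict mapping index data -> predicted-contour length
    m         -- the predetermined number M from the description
    """
    return [idx for idx, plen in predicted.items()
            if abs(cur_len - plen) < m * min(cur_len, plen)]
```

  The same test applies unchanged when lengths are replaced by the numbers of object pixels inside the respective contours, as the alternative above allows.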
  • the matching sector 333 retrieves contour information for each candidate contour from the predicted contour image generation block 320 via the line L60 in response to the indication signals inputted thereto. Subsequently, the current contour and a candidate contour are matched based on the current contour information on the line L10 and the candidate contour information on the line L60. After performing the matching process between the current contour and each candidate contour, the matching sector 333 provides the optimum contour determination sector 335 with matching information for each candidate contour.
  • the matching information includes index data, a motion displacement and a matching length for a candidate contour.
  • the candidate contour is displaced by, e.g., a one pixel basis within the preset search range, and matching segments of the current and the candidate contours at each displacement are determined.
  • Fig. 5A depicts a current contour 10 and a candidate contour 20 overlapping with each other. After overlapping the contours 10 and 20, the intersection points therebetween, such as PV1 to PV6, P1 and P2, are detected and the lengths of the overlapping segments PV6-PV1, PV2-PV3, P1-P2, and PV4-PV5 are calculated.
  • if the length of an overlapping segment is greater than a predetermined threshold TH1, the overlapping segment is determined as a matching segment.
  • in Fig. 5A, the length of the overlapping segment between P1 and P2 is not greater than TH1, whereas the lengths of the remaining overlapping segments are. Therefore, the remaining overlapping segments, e.g., PV2 to PV3, PV4 to PV5, and PV6 to PV1, are determined as the matching segments.
  • determination of the matching segment can be carried out based on the number of contour pixels residing on a given overlapping segment in lieu of the length thereof.
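  Using the pixel-count alternative just mentioned, matching-segment detection can be sketched as grouping the contour pixels common to both contours into consecutive runs and thresholding each run against TH1. A toy version, assuming contours are given as ordered lists of (x, y) pixel coordinates; the contour is treated as an open pixel sequence (wrap-around at the closing point is ignored for brevity), and all names are illustrative:

```python
def matching_segments(cur_pixels, cand_pixels, th1=3):
    """Group pixels shared by the current and candidate contours into
    consecutive overlapping runs along the current contour; a run with
    more than th1 pixels is declared a matching segment."""
    common = set(cand_pixels)
    runs, run = [], []
    for p in cur_pixels:
        if p in common:
            run.append(p)          # extend the current overlapping run
        elif run:
            runs.append(run)       # run ended: an overlapping segment
            run = []
    if run:
        runs.append(run)
    # Keep only segments exceeding the threshold TH1.
    return [r for r in runs if len(r) > th1]
```

  The end pixels of each returned run correspond to the major vertices (PV1 to PV6 in Fig. 5A); short runs such as P1-P2 are discarded.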
  • if a contour intersects with an edge of the video frame, a portion of the edge between the intersection points is regarded as a portion of the contour. For instance, as shown in Fig. 5B, if a current contour 50 and a candidate contour 60 intersect with a right edge 40-1 of the video frame 40 at points 50-1, 50-2 and 60-1, 60-2, the portions of the edge 40-1 between the points 50-1 and 50-2 and the points 60-1 and 60-2 are treated as parts of the contours 50 and 60, respectively.
  • at the optimum contour determination sector 335, the matching lengths of the candidate contours are compared with each other; and the candidate contour having the maximum matching length is declared as the optimum contour of the current contour.
  • the motion displacement corresponding to the optimum contour is set as the local motion vector (LMV).
  • Outputs from the optimum contour determination sector 335 on the lines L20 and L40 are the LMV and the index data of the optimum contour.
  • the motion compensation unit 400 generates a predicted current contour by retrieving the optimum contour information from the frame memory 700 via the line L30 based on the GMV on the line L22, and the LMV and the index data of the optimum contour on the line L20, wherein the predicted current contour represents the optimum contour shifted by the sum of the GMV and the LMV.
  • the output to the inter-coding unit 200 and a contour reconstruction unit 600 provided via a line L55 from the motion compensation unit 400 is the predicted current contour information representing position data of contour pixels of the predicted current contour and index data thereof.
  • Referring to Fig. 4, there is depicted a detailed block diagram of the inter-coding unit 200, which includes a matching block 420, a widening block 430 and an encoding block 440.
  • the predicted current contour information on the line L55 and the current contour information on the line L10 are fed to the matching block 420.
  • at the matching block 420, the predicted current contour is matched with the current contour.
  • the matching procedure executed at the matching block 420 is similar to the matching procedure of the matching sector 333 in Fig. 3 described hereinabove with reference to Figs. 5A and 5B. For instance, if the contours 10 and 20 in Fig. 5A are the current contour and the predicted current contour, those points PV1 to PV6, each constituting an end point of a matching segment, are determined as major vertices. The intersecting points P1 and P2 are not major vertices since the overlapping segment therebetween is not a matching segment.
  • the matched contour is fed to the widening block 430, wherein matched contour information includes position information of contour pixels on the predicted current and the current contours and the major vertices.
  • the widening block 430 widens the predicted current contour 20 by a predetermined threshold Dmax to create a predicted contour band 20'. Then, the widening block 430 matches the predicted contour band 20' with the current contour 10 to find portions of the current contour overlapping with the predicted contour band 20' between pairs of major vertices inclusive. It is found, in the example given in Fig. 6, that the current contour 10 overlaps with the predicted contour band 20' between the pairs of vertices PV6 and PV3, and PV4 and PV5. Such overlapping parts PV6 to PV3 and PV4 to PV5 of the current contour 10 are set as overlapping sections.
  • a length of each overlapping section is compared with a threshold TH2 and an overlapping section longer than the threshold TH2 is determined as a matching section and major vertices at the ends of the matching section are considered as primary vertices.
  • those pairs of the vertices PV6-PV3 and PV4-PV5 become primary vertices if the lengths therebetween are greater than TH2.
  • those matching sections are treated as being matched with the portions of the predicted current contour between each pair of the primary vertices.
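  The widening step above is equivalent to testing whether every pixel of a current-contour section lies within Dmax of the predicted current contour (i.e., inside the band 20'). A brute-force sketch under that reading, with illustrative names; a real implementation would use a distance transform rather than pairwise distances:

```python
import math

def section_in_band(cur_section, pred_pixels, dmax=1.5):
    """Return True when every pixel of the current-contour section lies
    within dmax of some pixel of the predicted current contour, i.e.,
    the section falls entirely inside the widened contour band."""
    def near(p):
        # Distance from pixel p to the nearest predicted-contour pixel.
        return any(math.hypot(p[0] - q[0], p[1] - q[1]) <= dmax
                   for q in pred_pixels)
    return all(near(p) for p in cur_section)
```

  Sections passing this test (and exceeding the length threshold TH2) need no vertex data of their own: the decoder reproduces them from the predicted current contour between the primary vertices.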
  • the output from the widening block 430 to the encoding block 440 is matching data including position information of the primary vertices and the contour pixels of the current contour.
  • vertices are determined on each of the non-matching sections of the current contour, e.g., the portions of the current contour 10 between the primary vertices PV6-PV5 and PV4-PV3, through the use of the conventional polygonal approximation technique based on the predetermined threshold Dmax. That is, according to the conventional polygonal approximation, a contour pixel on a contour segment which has the farthest distance to the line segment corresponding thereto is determined as a vertex when that farthest distance is greater than Dmax. Those vertices determined by the polygonal approximation are set as secondary vertices.
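  The farthest-point rule described above can be sketched as a recursive split (the well-known Douglas-Peucker scheme follows the same rule). A minimal version, assuming a non-matching section is given as an ordered list of (x, y) contour pixels whose first and last points are the bounding primary vertices; function names are illustrative:

```python
import math

def polygonal_approximation(points, dmax):
    """Approximate an open contour section by vertices: the pixel
    farthest from the chord between the section's end points becomes a
    vertex whenever that distance exceeds dmax; recurse on both halves."""
    def perp_dist(p, a, b):
        # Perpendicular distance from pixel p to the line through a and b.
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        chord = math.hypot(dx, dy)
        if chord == 0:
            return math.hypot(px - ax, py - ay)
        return abs(dx * (ay - py) - (ax - px) * dy) / chord

    def recurse(lo, hi, verts):
        if hi - lo < 2:
            return
        far_i, far_d = lo, 0.0
        for i in range(lo + 1, hi):
            d = perp_dist(points[i], points[lo], points[hi])
            if d > far_d:
                far_i, far_d = i, d
        if far_d > dmax:           # exceeds Dmax -> new secondary vertex
            verts.add(far_i)
            recurse(lo, far_i, verts)
            recurse(far_i, hi, verts)

    verts = {0, len(points) - 1}   # the primary vertices at the ends
    recurse(0, len(points) - 1, verts)
    return [points[i] for i in sorted(verts)]
```

  Interior vertices returned here play the role of the secondary vertices; the end points are the primary vertices already fixed by the matching step.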
  • the encoding block 440 encodes position data of the primary and the secondary vertices and provides the encoded vertex data to the MUX 800 and an inter-decoding unit 500.
  • at the MUX 800, the encoded vertex data, the GMV, the LMV and the index data of the optimum contour are multiplexed to provide encoded contour data to a transmitter (not shown) for the transmission thereof.
  • at the inter-decoding unit 500, the encoded vertex data is decoded into decoded vertex data representing the decoded primary and secondary vertices; and the decoded vertex data is provided to the contour reconstruction unit 600.
  • the decoded vertex data from the inter-decoding unit 500 is utilized in reconstructing the current image signal together with the predicted current contour information fed via the line L55 at the contour reconstruction unit 600. That is, in order to provide a reconstructed current image signal having a reconstructed current contour, portions of the reconstructed current contour which correspond to the non-matching sections of the current contour are reconstructed based on the decoded vertex data, while the remaining portions are reconstructed from the predicted current contour.
  • the reconstructed current image signal is stored at the frame memory 700 and is utilized as a reconstructed previous image signal for the next image signal.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Image Processing (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)
  • Image Analysis (AREA)
  • Compression Of Band Width Or Redundancy In Fax (AREA)
EP19970304110 1997-04-11 1997-06-12 Verfahren und Vorrichtung zur adaptiven Kodierung einer Objektkontur Expired - Lifetime EP0871331B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1019970013367A KR100229544B1 (ko) 1997-04-11 1997-04-11 움직임 추정기법을 이용한 윤곽선 부호화 장치
KR9713367 1997-04-11

Publications (3)

Publication Number Publication Date
EP0871331A2 true EP0871331A2 (de) 1998-10-14
EP0871331A3 EP0871331A3 (de) 2000-09-27
EP0871331B1 EP0871331B1 (de) 2007-08-15

Family

ID=19502525

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19970304110 Expired - Lifetime EP0871331B1 (de) 1997-04-11 1997-06-12 Verfahren und Vorrichtung zur adaptiven Kodierung einer Objektkontur

Country Status (6)

Country Link
US (1) US5929917A (de)
EP (1) EP0871331B1 (de)
JP (1) JP3924048B2 (de)
KR (1) KR100229544B1 (de)
CN (1) CN1147156C (de)
DE (1) DE69738016T2 (de)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2914467A1 (fr) * 2007-03-29 2008-10-03 Canon Kk Procedes et dispositifs de codage et de decodage de signaux numeriques multidimensionnels.
US8249372B2 (en) 2007-03-16 2012-08-21 Canon Kabushiki Kaisha Methods and devices for coding and decoding multidimensional digital signals

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3363039B2 (ja) * 1996-08-29 2003-01-07 ケイディーディーアイ株式会社 動画像内の移動物体検出装置
KR100229546B1 (ko) * 1997-04-11 1999-11-15 전주범 윤곽선 비디오 신호 부호화 방법 및 그 장치
KR100244769B1 (ko) * 1997-06-26 2000-02-15 전주범 스케일러빌리티를 갖는 간 윤곽선 부호화 방법 및 장치
KR19990008977A (ko) * 1997-07-05 1999-02-05 배순훈 윤곽선 부호화 방법
KR100295798B1 (ko) * 1997-07-11 2001-08-07 전주범 스케일러빌리티를구현한이진현상신호부호화장치
JPH11308610A (ja) * 1998-04-02 1999-11-05 Daewoo Electronics Co Ltd 映像信号適応的符号化装置
JP3753578B2 (ja) * 1999-12-07 2006-03-08 Necエレクトロニクス株式会社 動きベクトル探索装置および方法
JP2001266159A (ja) * 2000-03-17 2001-09-28 Toshiba Corp 物体領域情報生成方法及び物体領域情報生成装置並びに近似多角形生成方法及び近似多角形生成装置
US7336713B2 (en) * 2001-11-27 2008-02-26 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding data
US7809204B2 (en) * 2002-10-18 2010-10-05 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding key value data of coordinate interpolator
RU2225035C1 (ru) * 2003-04-21 2004-02-27 Общество с ограниченной ответственностью "Р.Т.С.-Сервис" Способ кодирования координат перемещающегося на экране вычислительного устройства видеоизображения, устройство для декодирования визуального объекта, закодированного этим способом, и система, предназначенная для визуализации активного видео с помощью этого устройства
CN108832935B (zh) * 2018-05-31 2022-05-10 郑州云海信息技术有限公司 一种rle算法实现方法、***、设备及计算机存储介质

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2296839A (en) * 1994-12-29 1996-07-10 Hyundai Electronics Ind Shape information reduction apparatus
US5635986A (en) * 1996-04-09 1997-06-03 Daewoo Electronics Co., Ltd Method for encoding a contour of an object in a video signal by using a contour motion estimation technique

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5748231A (en) * 1992-10-13 1998-05-05 Samsung Electronics Co., Ltd. Adaptive motion vector decision method and device for digital image stabilizer system
US5592228A (en) * 1993-03-04 1997-01-07 Kabushiki Kaisha Toshiba Video encoder using global motion estimation and polygonal patch motion estimation
KR100235345B1 (ko) * 1994-12-29 1999-12-15 전주범 분할영역에서의 움직임 추정방법 및 장치
KR100235343B1 (ko) * 1994-12-29 1999-12-15 전주범 영역분할 기법을 이용한 동영상신호 부호화기의 움직임 벡터 측정장치
KR100209798B1 (ko) * 1995-04-08 1999-07-15 전주범 확장-내삽을 이용한 윤곽선 물체의 부호화 장치

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2296839A (en) * 1994-12-29 1996-07-10 Hyundai Electronics Ind Shape information reduction apparatus
US5635986A (en) * 1996-04-09 1997-06-03 Daewoo Electronics Co., Ltd Method for encoding a contour of an object in a video signal by using a contour motion estimation technique

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CHAN K H ET AL: "CONTOUR-BASED IMAGE WARPING" PROCEEDINGS OF THE SPIE, 4 November 1996 (1996-11-04), XP000198824 *
CORONNA G ET AL: "SOME EXPERIMENTS IN INTERFRAME CONTOUR CODING" ALTA FREQUENZA,IT,UFFICIO CENTRALE AEI-CEI. MILANO, vol. 57, no. 10, 1 December 1988 (1988-12-01), pages 95-101, XP000112164 *
NAONORI UEDA ET AL: "A CONTOUR TRACKING METHOD USING AN ELASTIC CONTOUR MODEL AND AN ENERGY-MINIMIZATION APPROACH" SYSTEMS & COMPUTERS IN JAPAN,US,SCRIPTA TECHNICA JOURNALS. NEW YORK, vol. 24, no. 8, 1993, pages 59-69, XP000432441 ISSN: 0882-1666 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8249372B2 (en) 2007-03-16 2012-08-21 Canon Kabushiki Kaisha Methods and devices for coding and decoding multidimensional digital signals
FR2914467A1 (fr) * 2007-03-29 2008-10-03 Canon Kk Procedes et dispositifs de codage et de decodage de signaux numeriques multidimensionnels.

Also Published As

Publication number Publication date
DE69738016T2 (de) 2008-05-15
DE69738016D1 (de) 2007-09-27
US5929917A (en) 1999-07-27
CN1147156C (zh) 2004-04-21
JP3924048B2 (ja) 2007-06-06
KR100229544B1 (ko) 1999-11-15
JPH10290466A (ja) 1998-10-27
KR19980076589A (ko) 1998-11-16
EP0871331A3 (de) 2000-09-27
CN1196642A (zh) 1998-10-21
EP0871331B1 (de) 2007-08-15

Similar Documents

Publication Publication Date Title
US5978512A (en) Polygonal approximation method and apparatus for use in a contour encoding system
JP3725250B2 (ja) 輪郭符号化方法
US5737449A (en) Apparatus for encoding a contour of regions contained in a video signal
EP0720377B1 (de) Verfahren zum Ermitteln von Bewegungsvektoren zur Verwendung in einem auf Segmentation basierenden Kodierungssystem
JP3977494B2 (ja) 輪郭線符号化装置
US5691769A (en) Apparatus for encoding a contour of an object
US5929917A (en) Method and apparatus for adaptively coding a contour of an object
US5774595A (en) Contour approximation method for representing a contour of an object
US5870501A (en) Method and apparatus for encoding a contour image in a video signal
US5774596A (en) Adaptive contour coding method for encoding a contour image in a video signal
US5881174A (en) Method and apparatus for adaptively coding a contour of an object
JP3924032B2 (ja) 輪郭線符号化方法及び輪郭線符号化装置
US5896467A (en) Method and apparatus for encoding a contour image of an object in a video signal
US5828790A (en) Method and apparatus for approximating a contour image of an object in a video signal
US5754703A (en) Method for encoding a contour of an object in a video signal
CN1062701C (zh) 用于编码目标轮廓的装置
JP3694349B2 (ja) 輪郭符号化装置
KR100207389B1 (ko) 물체의 윤곽부호화 장치
KR100307617B1 (ko) 동영상부호화기에있어서움직임평가방법
JP3859786B2 (ja) 映像信号における物体の輪郭線符号化方法
EP0854651A2 (de) Verfahren und Vorrichtung zur Kodierung einer Objektkontur in einem Videosignal
GB2321359A (en) Polygonal approximation in video contour encoding system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): DE FR GB NL

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

17P Request for examination filed

Effective date: 20010228

AKX Designation fees paid

Free format text: DE FR GB NL

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: DAEWOO ELECTRONICS CORPORATION

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB NL

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 69738016

Country of ref document: DE

Date of ref document: 20070927

Kind code of ref document: P

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20080516

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69738016

Country of ref document: DE

Representative=s name: KLUNKER, SCHMITT-NILSON, HIRSCH, DE

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20130404 AND 20130410

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69738016

Country of ref document: DE

Representative=s name: KLUNKER, SCHMITT-NILSON, HIRSCH, DE

Effective date: 20130313

Ref country code: DE

Ref legal event code: R081

Ref document number: 69738016

Country of ref document: DE

Owner name: MAPLE VISION TECHNOLOGIES INC., CA

Free format text: FORMER OWNER: DAEWOO ELECTRONICS CORP., SEOUL/SOUL, KR

Effective date: 20130313

REG Reference to a national code

Ref country code: FR

Ref legal event code: TP

Owner name: MAPLE VISION TECHNOLOGIES INC., CA

Effective date: 20131226

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 19

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20150609

Year of fee payment: 19

Ref country code: GB

Payment date: 20150610

Year of fee payment: 19

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20150609

Year of fee payment: 19

Ref country code: FR

Payment date: 20150608

Year of fee payment: 19

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 69738016

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MM

Effective date: 20160701

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20160612

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20170228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170103

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160701

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160612