EP3575257A1 - Elevator control with eye tracking - Google Patents

Elevator control with eye tracking

Info

Publication number
EP3575257A1
Authority
EP
European Patent Office
Prior art keywords
control panel
camera
eye
control
person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP18175194.2A
Other languages
German (de)
English (en)
Inventor
Gerald REES
Ankush DAVE
Daniel Place
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inventio AG
Original Assignee
Inventio AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inventio AG filed Critical Inventio AG
Priority to EP18175194.2A priority Critical patent/EP3575257A1/fr
Publication of EP3575257A1 publication Critical patent/EP3575257A1/fr
Withdrawn legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66BELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00Control systems of elevators in general
    • B66B1/34Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/46Adaptations of switches or switchgear
    • B66B1/468Call registering systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66BELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B2201/00Aspects of control systems of elevators
    • B66B2201/40Details of the change of control mode
    • B66B2201/46Switches or switchgear
    • B66B2201/4607Call registering systems
    • B66B2201/4638Wherein the call is registered without making physical contact with the elevator system

Definitions

  • the present invention relates to a method for controlling an elevator system, a control system for the elevator system and to the elevator system.
  • buttons may be out of reach for individuals in a wheelchair, the use of crutches or other aids may pose a problem, or decreased motor function may make it difficult to press the intended button. Others may be hesitant to press the buttons for sanitary reasons or may want the convenience of calling a floor hands-free when they are holding items in both hands.
  • JP 2010 100 370 describes that a line of sight of a passenger of an elevator car is determined and that the elevator car is stopped at the next floor when the passenger looks at a specific location. This may be used as an alternative to a security button.
  • WO 2011 114 489 A1 relates to a guide device for an elevator.
  • the guide device comprises a camera, which takes pictures around the entrance of the elevator. Based on the pictures, it is detected whether or not a person has entered the elevator.
  • WO 2005 56251 A1 describes an elevator system with a camera, which detects the face of a person and determines therefrom whether the person uses a wheelchair or not.
  • An aspect of the present invention relates to a method for controlling an elevator system.
  • the method may be automatically performed by a control system of the elevator system and/or may be implemented as a computer program.
  • an elevator system may be any device adapted for transporting persons and/or goods vertically with an elevator car.
  • An elevator system may be installed in a building in an elevator shaft.
  • the method comprises: detecting the presence of a person in front of a control panel of the elevator system with a presence detection sensor; in the case that a person has been detected, detecting the presence of at least one eye of a person in front of the control panel from a video stream of a camera of the control panel; in the case that the eye has been detected, determining a gaze point of the eye on the control panel from a video stream of the camera and/or a further camera, wherein a line of sight of the eye is determined from an image of the eye in the video stream and the gaze point is determined by intersecting the line of sight with a component of the control panel stored in a virtual layout of the control panel; determining a selected floor from the gaze point; and controlling the elevator system to move an elevator car to the selected floor.
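The claimed sequence of steps can be sketched as a simple decision chain. The callback interfaces below (`person_present`, `find_eye`, and so on) are illustrative assumptions, not part of the patent:

```python
def select_floor(person_present, find_eye, gaze_point_on_panel, floor_at):
    """Sketch of the claimed control flow: presence -> eye -> gaze -> floor."""
    if not person_present():
        return None                   # control system stays passive
    eye = find_eye()                  # eye detection on the video stream
    if eye is None:
        return None                   # e.g. play an audio prompt instead
    point = gaze_point_on_panel(eye)  # line of sight intersected with layout
    if point is None:
        return None
    return floor_at(point)            # selected floor, or None

# Toy wiring: a detected person gazing at the "floor 3" button selects floor 3.
floor = select_floor(
    person_present=lambda: True,
    find_eye=lambda: {"position": (0.0, 1.6, 0.5), "direction": (0.0, -0.1, -1.0)},
    gaze_point_on_panel=lambda eye: (0.1, 1.4),
    floor_at=lambda point: 3,
)
```

Each stage returning `None` corresponds to the method falling back to an earlier state rather than registering a call.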
  • the control system of the elevator system may wait for a person to appear in front of a control panel.
  • a camera may start to work and/or the video stream of the camera may be analysed to determine whether an eye of the person is visible to the camera or not.
  • an eye detection module analyses the video stream and/or one or more images of the video stream.
  • the eye detection module may be based on a neural network and/or machine learning and/or may have been trained to detect a portion of an image that contains an eye.
  • the video stream may be analysed to determine a line of sight of the eye and/or whether the person is looking at the control panel.
  • a gaze tracking module that analyses one or more video streams of one or more of the cameras may perform the determination of the line of sight. For example, from the portion of the image and/or video stream that has been detected, the gaze tracking module may determine reflections on the eye. The reflections may have been caused by infrared lights and/or by other light sources, such as an elevator car lighting. From the reflections an orientation of the eye and/or view direction may be determined. From the position of the eye and the orientation a line of sight of the eye may be determined.
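One common way to turn such reflections into a view direction is the pupil-centre / corneal-reflection (PCCR) technique. The sketch below is a heavily simplified, uncalibrated stand-in for it; the `gain` parameter and the camera-aligned coordinate frame are assumptions, not taken from the patent:

```python
import math

def gaze_direction(pupil_center, glint_center, gain=0.6):
    """Map the 2-D pupil-to-glint offset in the image to a 3-D unit view
    direction in a camera-aligned frame (simplified PCCR sketch)."""
    dx = gain * (pupil_center[0] - glint_center[0])
    dy = gain * (pupil_center[1] - glint_center[1])
    dz = -1.0                          # toward the camera / control panel
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)
```

When pupil centre and glint coincide, the sketch reports an eye looking straight at the camera; real gaze trackers replace the fixed gain with a per-user calibration.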
  • a gaze point on the control panel may be determined, which also may be performed by the gaze tracking module.
  • the position of the camera and/or of the control panel and/or optionally a virtual layout of the control panel may be stored in a controller, which performs the method. From this information, it may be determined at which part of the control panel the person is looking. For example, such a part may be a display and/or may be a button, such as a floor selection button, of the control panel.
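Intersecting the line of sight with the stored layout can be done with a standard ray-plane test followed by a rectangle lookup. The planar panel model and the rectangle layout below are illustrative assumptions about how the virtual layout might be stored:

```python
def gaze_point_on_panel(origin, direction, panel_z=0.0):
    """Intersect a gaze ray with the panel plane z = panel_z; return the
    2-D hit point in panel coordinates, or None if there is no forward hit."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dz) < 1e-9:
        return None                     # gaze runs parallel to the panel
    t = (panel_z - oz) / dz
    if t <= 0:
        return None                     # panel lies behind the person
    return (ox + t * dx, oy + t * dy)

def component_at(layout, point):
    """Return the name of the stored component containing the gaze point."""
    x, y = point
    for name, (x0, y0, x1, y1) in layout.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# Toy virtual layout: two floor buttons as rectangles (metres on the panel).
layout = {"floor_1": (0.0, 1.00, 0.10, 1.10), "floor_2": (0.0, 1.20, 0.10, 1.30)}
hit = gaze_point_on_panel(origin=(0.05, 1.25, 0.60), direction=(0.0, 0.0, -1.0))
```

A person standing 60 cm away and looking straight ahead at the second button yields a hit point inside the `floor_2` rectangle.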
  • the elevator car then may be controlled to move to this floor.
  • the virtual layout may comprise the positions and/or extensions of buttons on the control panel, the positions and/or extensions of visual control commands on a display of the control panel, etc.
  • a floor selection is cancelled by looking at a visual cue, such as a text, a symbol, etc. and then moving the gaze point to the previously selected floor.
  • a visual cue also may be displayed as visual control command on a display.
  • an elevator floor destination may be selected requiring no tactile input.
  • the method may use eye-tracking and gaze point detection algorithm in order that a person can send command prompts to the control panel hands-free.
  • the method further comprises: playing an audio prompt asking the person to look at the control panel, when a person has been detected and no eye has been detected.
  • an audio prompt with instructions on how a floor can be selected with gaze tracking may be output via a loudspeaker.
  • the method further comprises: determining the presence of the eye with a second camera of the control panel, when no eye has been detected with the camera, which is then regarded as a first camera.
  • the control panel may comprise more than one camera, which is used for eye tracking.
  • when the controller analyses the video stream of the first camera and is not able to detect an eye, it may switch to the video stream of another camera.
  • the video stream from the second camera may be analysed for the presence of an eye and/or, when an eye has been detected for a line of sight and/or the gaze point.
  • a position of the second camera relative to the control panel may be stored in the controller.
  • the second camera is installed lower than the first camera.
  • the first camera may be installed in a height adapted for eye tracking of a standing grown-up person.
  • the gaze point is determined with the second camera. It may be that the first camera is solely used for determining whether an eye is visible or not and the second camera is then used for eye tracking. For example, the first camera may be less power consuming than the second one, and this may save power consumed by the control system.
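The first-then-second camera strategy amounts to trying cameras in priority order until one yields an eye. `FakeCamera` and the `detect_eye` callback below are stand-ins for the real hardware interface:

```python
class FakeCamera:
    """Minimal stand-in for a panel camera delivering one frame."""
    def __init__(self, name, frame):
        self.name, self._frame = name, frame

    def capture(self):
        return self._frame

def find_eye_with_fallback(cameras, detect_eye):
    """Try each camera in order (e.g. upper first, lower second) until
    the detector reports an eye; return (camera, eye) or (None, None)."""
    for camera in cameras:
        eye = detect_eye(camera.capture())
        if eye is not None:
            return camera, eye
    return None, None

# The upper camera sees no eye (e.g. a seated person); the lower one does.
upper = FakeCamera("upper", frame=None)
lower = FakeCamera("lower", frame="eye-region")
camera, eye = find_eye_with_fallback([upper, lower], detect_eye=lambda f: f)
```

Ordering the list by expected power cost realises the power-saving idea: the cheaper camera is consulted first, the second only on a miss.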
  • the method further comprises: displaying visual control commands on a display of the control panel.
  • control commands may comprise commands like "Do you need help? Yes/No", "Do you want to move up? Yes/No", etc.
  • the control commands may provide possibilities to control the movement of the elevator car and/or the selection of a floor.
  • the display may show visual symbols and/or text as control commands.
  • the display and/or screen may be integrated into the control panel or may be provided in the elevator car. It may allow a person to interact with the control system with his or her eye movement. The person may interact with control command prompts on the display screen in an emergency scenario, and/or the display may allow a person at an offsite location to control what is being displayed on it.
  • the method further comprises: selecting a control command by determining, whether the gaze point is on the control command.
  • the selected floor then may be determined with the selected control command.
  • the control command may be a number representing a floor. When the gaze point stays on this number, the control system may decide that the respective floor has been selected.
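A gaze point "staying on" a number is typically implemented as a dwell-time threshold. The sampling format and the one-second threshold below are assumptions for illustration, not values from the patent:

```python
def dwell_selected(samples, target, dwell_time=1.0):
    """Return True once the gazed component stays equal to `target` for at
    least `dwell_time` seconds. `samples` is a chronological list of
    (timestamp_seconds, component_name) pairs from the gaze tracker."""
    start = None
    for t, component in samples:
        if component == target:
            if start is None:
                start = t
            if t - start >= dwell_time:
                return True
        else:
            start = None                # gaze left the target: reset the timer
    return False

# The gaze rests on the "3" command for a bit over a second.
samples = [(0.0, "3"), (0.4, "3"), (0.8, "3"), (1.1, "3")]
```

Resetting the timer whenever the gaze leaves the target prevents accidental selections from brief glances.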
  • the method further comprises: selecting a button on the control panel by determining, whether the gaze point is on the button.
  • the selected floor then may be determined from the selected button. It has to be noted that the same button also may be pressed for selecting the same floor. With the method, the floor may not solely be selected by pressing a button, but also by looking at the button. It also may be possible that other operations are initiated on the control panel, such as generating an emergency call, when the person looks at an emergency button.
  • a button press on a special button of the control panel may be used to cancel the last stored floor call and/or any other operation.
  • the presence detection sensor is a motion detection sensor.
  • the presence detection sensor may be an infrared sensor, an electromagnetic sensor, an ultrasonic sensor, etc.
  • the presence detection sensor may be integrated into the control panel.
  • the presence detection sensor may be a sensor different from the camera providing the video stream for gaze tracking. This may provide the ability to interface with a secondary sensor to control when the one or more cameras and/or other parts of the control system are active, off, in a power-saving state and/or in any other mode of operation.
  • presence detection of a person in front of the control panel also may be performed by analysing the video stream from a camera, such as the first and/or second camera.
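Using the separate presence sensor to gate the cameras' power state could look like the following. The `powered` attribute and the gate class are illustrative, not from the patent:

```python
class Camera:
    """Minimal camera model with an on/off power state."""
    def __init__(self):
        self.powered = False

class PresenceGate:
    """Switch the cameras (and other gated parts) on only while a person
    is detected, realising the described power-saving behaviour."""
    def __init__(self, cameras):
        self.cameras = cameras

    def on_presence(self, detected):
        for camera in self.cameras:
            camera.powered = detected

gate = PresenceGate([Camera(), Camera()])
gate.on_presence(True)      # presence sensor fires: cameras wake up
```

The same hook could also move the controller between active, off, and power-saving modes rather than toggling a single flag.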
  • a further aspect of the invention relates to a control system for an elevator system, which comprises a control panel and a controller adapted for performing the method as described in the above and in the following.
  • the control panel may be installed in an elevator car and/or at a door of the elevator system.
  • the controller may be part of the control panel. It also may be that the controller comprises several parts, for example a part in the control panel and a further part in the central controller of the elevator system, which, for example, may be installed near a drive of the elevator system.
  • the control system may be adapted for controlling elevator calls that select the desired floor by tracking eye movements based on where and/or what a person is looking at.
  • the controller may comprise one or more processors, which are adapted for performing the method, when a corresponding computer program is executed with them.
  • the control panel comprises at least one camera adapted for eye tracking.
  • the video stream from the at least one camera may be analysed to determine a line of sight of the eye. It also may be that the video stream of the at least one camera is transmitted to a central processing center for safety monitoring.
  • the central processing center may be connected to a control system of the elevator system via the Internet, and/or the video stream may be transmitted via the Internet.
  • the control panel comprises a first module and a second module.
  • a module may be a set of mechanically interconnected components that can be installed as one unit in the elevator car and/or at another position.
  • the first module of the control panel may comprise buttons, such as floor selection buttons, and a camera.
  • the camera may be used for eye tracking.
  • the second module of the control panel may comprise a display and a further camera.
  • the display may be used for presenting control commands to a person in front of the control panel.
  • the further camera alternately and/or additionally may be used for eye tracking.
  • a video stream of the further camera may be transmitted to a central processing center for monitoring the place in front of the control panel.
  • the control panel as described in the above and in the following also may be provided as one module.
  • the control panel comprises buttons for manually selecting a floor. These buttons may be part of the first module.
  • the control panel comprises a display for displaying control commands.
  • the display also may be used as an output device for deaf users. For example, information on how to use the gaze tracking method may be displayed as text on the display.
  • the display may be part of the second module.
  • the control panel comprises a loudspeaker for outputting audio prompts.
  • the loudspeaker also may offer audible information for blind users.
  • a loudspeaker may be part of the first module and/or the second module.
  • the control panel comprises a presence detection sensor.
  • the presence detection sensor may be part of the first module.
  • a further aspect of the invention relates to an elevator system, which comprises an elevator car movable in an elevator shaft and a control system as described herein.
  • the control panel of the control system may be installed in the elevator car. However, it also may be possible that the control panel is installed at a door of the elevator system for getting access to the elevator car.
  • features of the elevator system and the control system as described herein may be features of the method for controlling the elevator system, and vice versa.
  • Fig. 1 shows an elevator system 10 comprising an elevator car 12 movable in an elevator shaft 14 by a drive 16.
  • the elevator system 10 furthermore comprises a central controller 18 (which may be a part of the drive 16 or at least arranged near the drive 16) for controlling the drive 16 and further equipment of the elevator system 10.
  • the central controller 18 may also control elevator doors 20.
  • the central controller 18 may receive electronic control commands from a control panel 22 inside the elevator car 12. It also may be that, here and in the following, the control panel 22 is installed outside of the elevator car 12, for example beside one of the doors 20. The control panel 22 and the central controller 18 may be seen as a control system 24 of the elevator system 10.
  • the control panel 22 comprises a first module 26 and a second module 28, which will be described in more detail with respect to Fig. 2 .
  • the first module 26 comprises floor select buttons 30, an upper camera 32 arranged above the buttons 30, a lower camera 34 arranged below the buttons 30 and a presence detection sensor 36.
  • the floor select buttons 30 may be used for selecting a floor to which the elevator car should move. There may be a button 30 for each floor. When a person pushes the respective button 30, a corresponding electronic command is sent via a local controller 38 to the central controller 18, which then controls the elevator system 10 to move the elevator car 12 to the respective floor.
  • the local controller 38 may be part of the control panel 22.
  • the camera 32 may generate a video stream that may be analysed by the local controller 38 to determine face data and/or retinal data of a person. Also the camera 34 may generate a video stream to determine face data and/or retinal data of a person. This may be performed additionally or alternatively with respect to the video stream of the camera 32.
  • the camera 34 may be a low-angle emergency and/or disability camera 34, for example for high-stress situations and/or for persons with eyes on a lower level, such as persons in a wheelchair or children.
  • the lower camera 34 may provide more accurate registering of information for a person in a wheelchair or of shorter stature.
  • the camera 34 may be seen as a secondary camera 34 and/or may be used instead of the primary camera 32, when the secondary camera 34 is more viable based on the position of the person in the elevator car 12. Additionally or alternatively, the camera 32 and the camera 34 may be used in conjunction with each other.
  • the local controller 38 may determine a line of sight of an eye of a person and a gaze point of the person, in particular a gaze point on the module 26 and/or the module 28.
  • the presence detection sensor 36 may be arranged above the buttons 30 and/or the camera 32.
  • the presence detection sensor 36 may be adapted for detecting the presence of a person in front of the control panel 22.
  • the presence detection sensor 36 is adapted for sensing changes in infrared radiation and/or in ultrasonic sound, which may be caused by a human body in front of the control panel 22.
  • the module 28 may be arranged above the module 26.
  • Module 28 comprises a display 40 and one or more further cameras 42.
  • the camera(s) 42 may generate a video stream, which is evaluated by the controller 38 for face tracking and/or gaze tracking data. For example, with the camera 42, gaze tracking data relating to the display 40 may be generated and analysed by the controller 38.
  • the video stream of the camera 42 may be transmitted to a central processing center, for example via the controller 38 to the controller 18, which may be connected to the central processing center via the Internet.
  • the camera 42 may be used for multiple purposes besides gaze detection including but not limited to in-car monitoring.
  • the display 40 may be used for displaying text prompts or visual control commands 44, such as "Yes", "No", "Up", "Down", etc. For example, a selection of choices may be stated, such as "Yes" and "No". A text may be added, such as "Do you need help?". When the display 40 is not used for any other functions, it may serve as an emergency services device.
  • Module 28 also may comprise a loudspeaker or audio speaker 46, for example for prompting and interacting with a person inside the elevator car 12.
  • Fig. 3 shows a flow diagram of a method that may be executed by the control system 24.
  • in the control system 24, most of the control functions are described with respect to the local controller 38. However, it has to be understood that these functions also may be performed by the controller 18 or by a combination of both controllers 18, 38.
  • in step S10, only the presence detection sensor 36 may be active, i.e. measurements may be performed with the presence detection sensor 36 and evaluated with the local controller 38.
  • Other components of the control panel 22, such as the cameras 32, 34, 42 and the display 40, may be inactive.
  • in step S12, if the presence detection sensor 36 does not detect a human presence, the control system 24 continues in a passive state. The control system 24 returns to step S10 and the components 32, 34, 42 may remain inactive.
  • in step S14, when the presence of a person in front of the control panel 22 of the elevator system 10 is detected, the control system 24 is switched to an active state, which may mean that the eye-tracking functionality is powered on.
  • the camera 32 may be powered on, which then generates a video stream, which is analysed by the local controller 38.
  • in step S16, the local controller 38 starts to detect whether there is an eye visible in the video stream.
  • the local controller 38 may detect whether or not human eyes are readable by scanning and analysing retinal data in the video stream.
  • the local controller 38 may comprise an eye detection module, which, for example based on a neural network and/or machine learning, has been trained to detect portions of images that contain an eye.
  • the presence of an eye of a person in front of the control panel 22 may be detected with one or more video streams from one or more of the cameras 32, 34, 42 of the control panel 22.
  • camera 34 may be powered on and the video stream of the camera 34 may be analysed. The presence of the eye may then be determined with the second camera 34.
  • in step S18, when no eye has been detected (either with one or with several cameras 32, 34, 42), the local controller 38 plays an audio prompt asking the person to look at the control panel 22.
  • in step S20, when the eye has been detected, the video stream of the respective camera 32, 34, 42 is analysed for determining a gaze point of the eye. It may be that the gaze point is first determined with the first camera 32 and, if this is not possible, then with the second camera 34.
  • the local controller 38 may comprise a gaze tracking module that analyses one or more video streams of one or more of the cameras. For example, from the portion of the image and/or video stream that has been detected in step S14, the gaze tracking module may determine reflections on the eye. The reflections may have been caused by infrared lights included into the control panel 22 and/or by other light sources, such as an elevator car lighting. From the reflections an orientation of the eye and/or view direction may be determined. From the position of the eye and the orientation a line of sight of the eye may be determined.
  • the gaze point then may be determined by intersecting the line of sight with a component of the control panel 22, which component is stored in a virtual layout of the control panel.
  • the virtual layout may comprise the positions and/or extensions of the first and/or second module 26, 28, the positions and/or extensions of the buttons 30, the positions and/or extensions of the visual control commands 44, etc.
  • the virtual layout may be stored in the local controller 38 and/or may be updated by the local controller 38, for example when the visual control commands 44 change.
  • visual control commands 44 may be displayed on the display 40 of the control panel 22.
  • in step S22, the local controller 38 detects whether or not the person is looking at the control panel 22 and/or at which component of the control panel 22 the person is looking.
  • the control system 24 may return to step S20 and, for example, may output user instructions via the display 40 and/or the loudspeaker 46. For example, if the eyes are not readable, an audio and/or visual prompt may alert the person, stating instructions on how to improve the chances of the person's eyes being readable. Similarly, if the person is not looking at the control panel 22, additional user instructions may be given.
  • in step S24, if the gaze point is on the control panel 22, the control system 24 identifies which floor is selected with the gaze point, i.e. a selected floor is determined from the gaze point. This may be done by determining whether the person is looking at a specific button 30 or at the visual control commands 44 on the display 40.
  • a control command 44 may be selected by determining whether the gaze point is on the control command 44.
  • the selected floor then may be determined with the selected control command 44. For example, when the control command is "Up", the selected floor may be the next floor above the current floor. By looking longer at the control command "Up", the number of floors above the current floor may be increased. Analogously, with the control command "Down", a floor below the current floor may be selected.
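Mapping a dwelled "Up"/"Down" command to a floor number can be sketched as below. The one-floor-per-second step rate is an assumed parameter; the text does not specify how dwell time maps to floor count:

```python
def relative_floor(current_floor, command, dwell_seconds, step_time=1.0):
    """Each full `step_time` of continued gazing moves the target one more
    floor in the commanded direction; a short glance still counts as one."""
    steps = max(1, int(dwell_seconds // step_time))
    if command == "Up":
        return current_floor + steps
    if command == "Down":
        return current_floor - steps
    return current_floor                # unknown command: no change
```

For instance, a brief glance at "Up" from floor 3 targets floor 4, while holding the gaze for two and a half seconds targets floor 5.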
  • another possibility is that the person is looking at one of the buttons 30.
  • a button 30 on the control panel 22 may be selected by determining whether the gaze point is on the button 30. The selected floor then may be determined from the selected button 30.
  • a layout of the control panel 22 may be stored in the local controller 38, which then can determine from the gaze point the component of the control panel 22 at which the person is looking.
  • This layout may comprise the positions of the buttons 30 and/or the display 40.
  • in step S26, an electronic control command indicating which floor has been selected is sent to the central controller 18.
  • This electronic control command may be the same as the one which is generated when a corresponding floor select button 30 is pushed.
  • in step S28, the elevator system 10 controls the drive 16 to move the elevator car 12 to the selected floor. Furthermore, other equipment, such as the doors 20, may be controlled based on the electronic control command.
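The whole flow of Fig. 3 can be condensed into one pass of a loop. Step numbers follow the description above, while the callback interfaces and prompt texts are assumptions:

```python
def run_once(presence, detect_eye, gaze_point, floor_at, move_car, prompt):
    """One iteration of the Fig. 3 flow; returns a label for the path taken."""
    if not presence():                  # S10/S12: stay passive, cameras off
        return "passive"
    eye = detect_eye()                  # S14/S16: power on, look for an eye
    if eye is None:
        prompt("Please look at the control panel")   # S18: audio prompt
        return "prompted"
    point = gaze_point(eye)             # S20/S22: gaze point on the panel?
    if point is None:
        prompt("Instructions for eye tracking")      # back toward step S20
        return "retry"
    floor = floor_at(point)             # S24: selected floor from gaze point
    if floor is None:
        return "retry"
    move_car(floor)                     # S26/S28: command sent, car moves
    return "moved"

calls = []
result = run_once(
    presence=lambda: True,
    detect_eye=lambda: "eye",
    gaze_point=lambda eye: (0.05, 1.25),
    floor_at=lambda point: 5,
    move_car=calls.append,
    prompt=lambda text: None,
)
```

In a real controller this function would run repeatedly, with the split between local controller 38 and central controller 18 hidden behind `move_car`.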

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Elevator Control (AREA)
  • Indicating And Signalling Devices For Elevators (AREA)
EP18175194.2A 2018-05-30 2018-05-30 Elevator control with eye tracking Withdrawn EP3575257A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP18175194.2A EP3575257A1 (fr) 2018-05-30 2018-05-30 Elevator control with eye tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP18175194.2A EP3575257A1 (fr) 2018-05-30 2018-05-30 Elevator control with eye tracking

Publications (1)

Publication Number Publication Date
EP3575257A1 true EP3575257A1 (fr) 2019-12-04

Family

ID=62492512

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18175194.2A Withdrawn EP3575257A1 (fr) 2018-05-30 2018-05-30 Elevator control with eye tracking

Country Status (1)

Country Link
EP (1) EP3575257A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022003231A1 (fr) 2020-06-29 2022-01-06 Kone Corporation Control of an elevator system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005056251A1 (fr) 2003-12-10 2005-06-23 Koninklijke Philips Electronics N.V. Shaving head with a skin-stretching member
JP2007161420A (ja) * 2005-12-14 2007-06-28 Hitachi Ltd Elevator call registration device
JP2010100370A (ja) 2008-10-22 2010-05-06 Hitachi Ltd Elevator operation input device and method
WO2011114489A1 (fr) 2010-03-18 2011-09-22 Mitsubishi Electric Corporation Guide device for elevator
US20150309570A1 (en) * 2009-04-09 2015-10-29 Dynavox Systems Llc Eye tracking systems and methods with efficient text entry input features
JP2017013984A (ja) * 2015-07-03 2017-01-19 Hitachi, Ltd. Elevator apparatus



Similar Documents

Publication Publication Date Title
AU2021200009B2 (en) System and method for alternatively interacting with elevators
US8958910B2 (en) Elevator system that autonomous mobile robot takes together with person
US11097924B2 (en) Hand detection for elevator operation
KR101775735B1 (ko) Call input device for an elevator installation
JP5951834B1 (ja) Elevator
JP5996725B1 (ja) Elevator operating panel
KR20170030662A (ko) Elevator system, call input device, method of operating an elevator system comprising such a call input device, and method of retrofitting an elevator system with such a call input device
JP5074088B2 (ja) Elevator apparatus
JP5550334B2 (ja) Elevator system and control method thereof
EP1633669B1 (fr) Elevator call button with tactile feedback
EP3575257A1 (fr) Elevator control with eye tracking
JP6251638B2 (ja) Elevator system
JP2013159471A (ja) Secret operation elevator
JP7294538B2 (ja) Building traffic management system
Ekanayaka et al. Elderly supportive intelligent wheelchair
JP4237876B2 (ja) Elevator
JP7478690B2 (ja) Elevator
JP2011033837A (ja) Dialogue support device, dialogue support method and program
JPH09255243A (ja) Voice-activated elevator
JP7518738B2 (ja) Elevator call operation control device and method
JP6643221B2 (ja) Elevator guidance device
JPWO2017195354A1 (ja) Guidance presentation device and guidance presentation method
JP2013142026A (ja) Elevator display device
JP4519468B2 (ja) Function guidance device
JP7375134B1 (ja) Elevator apparatus and display method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20191216