EP4069440A1 - Method and device for detecting fallen and/or damaged containers in a container mass flow - Google Patents

Method and device for detecting fallen and/or damaged containers in a container mass flow

Info

Publication number
EP4069440A1
Authority
EP
European Patent Office
Prior art keywords
containers
container
fallen
images
mass flow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20808055.6A
Other languages
German (de)
English (en)
Inventor
Siddiqui Aurangzaib AHMED
Udo BAYER
Josef PAUKER
Stefan AWISZUS
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Krones AG
Original Assignee
Krones AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Krones AG filed Critical Krones AG
Publication of EP4069440A1
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B07SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07CPOSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/34Sorting according to other particular properties
    • B07C5/3404Sorting according to other particular properties according to properties of containers or receptacles, e.g. rigidity, leaks, fill-level
    • B07C5/3408Sorting according to other particular properties according to properties of containers or receptacles, e.g. rigidity, leaks, fill-level for bottles, jars or other glassware
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B13/00Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
    • G05B13/02Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
    • G05B13/0265Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion
    • G05B13/027Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion using neural networks only
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B07SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07CPOSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C2501/00Sorting according to a characteristic or feature of the articles or material to be sorted
    • B07C2501/0063Using robots
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10141Special mode during image acquisition
    • G06T2207/10144Varying exposure
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10141Special mode during image acquisition
    • G06T2207/10152Varying illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection

Definitions

  • The deep neural network can be trained with a training data set comprising images of standing and of fallen and/or damaged containers, so that the deep neural network uses the training data set to develop a model with which the standing and the fallen and/or damaged containers of the container mass flow can be distinguished from one another.
  • The deep neural network can be trained with a large number of different cases, so that the evaluation is largely independent of the container type and/or environmental influences.
  • The training data set can include images of containers of different sizes, orientations or positions.
  • The images of the training data set can be recorded with the at least one camera. It is conceivable that this takes place in a test system or directly on site at an operator of the beverage processing system. It is also conceivable that the manufacturer of the beverage processing system creates a database with images of standing and of fallen and/or damaged containers in order to then use them for the training data set.
  • The images of the training data set can be automatically duplicated in order to create additional images with additional combinations of standing and fallen and/or damaged containers. As a result, the effort involved in creating the training data set can be reduced considerably. It is conceivable that image sections each showing a standing or a fallen and/or damaged container are created during the duplication. The image sections can come from an original data set. It is conceivable that the image sections are rotated and/or enlarged individually during the duplication. It is also conceivable that at least one exposure parameter of the image sections is changed during the duplication. The image sections can then be reassembled to form the images of the training data set. As a result, a very large number of different images for the training data set can be provided from a few original images (see the augmentation sketch after this list).
  • The exposure parameter can refer to a brightness and/or a contrast of an image section.
  • The device offers the advantage of active accident and personal protection, since fallen and/or damaged containers do not have to be removed manually from the container mass flow by the operator. This is all the more true since the containers in the container mass flow exert dynamic pressure on one another, and intervention by the operating personnel to remove a container harbors the risk of accidents such as crushing and cuts as a result of the sudden relief of the container flow.
  • The device for identifying the fallen and/or damaged containers in the container mass flow can be arranged in a beverage processing system. It is conceivable that at least one container treatment machine is arranged upstream and/or downstream of the conveyor. In other words, the conveyor can connect two container treatment machines to one another.
  • FIG. 3 shows an exemplary embodiment of a method according to the invention for recognizing fallen containers as a flow chart.
  • FIG. 4 shows an exemplary embodiment of a section of the method from FIG. 3 for training the deep neural network.
  • The camera 6, which captures the standing containers 2 and the fallen containers 3 obliquely from above, is arranged on the conveyor 5.
  • The arrangement of the camera 6 is shown here only by way of example. It is also conceivable that there are several cameras that look obliquely from above in the same direction or in opposite directions. An arrangement directly from above, perpendicular to a transport surface of the conveyor 5, is also conceivable.
  • The neural network 71 is designed to recognize and localize the fallen containers 3. On the basis of the evaluation, the fallen containers 3 can then be removed from the conveyor 5 with a switch (not shown here) or by means of a gripping arm.
  • In FIG. 2, two exemplary images I1 and I2 of the image data stream output by the camera 6 from FIG. 1 are shown.
  • The training data set can be obtained from a set of more than 1000, preferably more than 5000 and particularly preferably more than 10000 images.
  • In step 120, the containers 2 of the container mass flow are transported standing on the conveyor 5. It can occasionally happen that one of the containers 2 falls over and then lies on the conveyor 5 as a fallen container 3.
  • In step 111, images of different container types and/or different lighting conditions are acquired. It is conceivable, for example, that this is done on a test system or that images of the container mass flow M of various beverage processing systems are collected in a database.
  • In step 112, the images are scaled to a standard size so that they can be evaluated uniformly.
  • The fallen and standing containers 2, 3 are marked and classified. This can be done manually, semi-automatically or automatically, for example by an operator on a screen or with a particularly computationally intensive image processing algorithm.
  • The marking can be, for example, a surrounding bounding box, and the classification can be a container type or a container size (a minimal annotation sketch is given after this list).
  • In step 114, the images are automatically duplicated in order to create further images with additional combinations of standing and fallen containers 2, 3.
  • For this purpose, image sections each containing one standing or one fallen container 2, 3 are first created; these are then rotated and/or enlarged individually during the duplication. It is also conceivable that the exposure parameters of the image sections are changed during the duplication. The image sections can then be put together in a wide variety of combinations as further images, from which the training data set is then created in step 115.
  • In step 116, features are automatically extracted by means of the deep neural network 71.
  • For this, a multi-stage filtering process is applied to the training data set. It is conceivable that edge filters or the like are used to extract the outer boundary of each individual container 2, 3 (see the edge-filter sketch after this list).
  • The extraction of features can here very generally mean a method for recognizing and/or localizing distinguishing features of the fallen containers 3 compared with the standing containers 2 in the images of the training data set.
  • This can also be done manually by an operator.
  • The extracted features can include a container closure, a contour of a standing or fallen container 2, 3, a container label and/or light reflections.
  • The extracted features can each include a feature classification and a 2D and/or 3D coordinate.
  • The deep neural network 71 is trained with the training data set (a minimal training-loop sketch follows this list).
  • For this, images of the training data set with the extracted features and the associated markings and classifications of the fallen and standing containers 2, 3 are iteratively fed to the deep neural network 71. From this, the deep neural network 71 develops a model in step 118 with which the fallen and standing containers 2, 3 can be recognized.
  • The model can then be verified by means of the training data set without specifying the markings and classifications: it is checked whether the deep neural network 71 actually recognizes the previously specified markings and classifications in the training data set. Likewise, further images with fallen and standing containers 2, 3 on which the deep neural network 71 was not trained can be used for this purpose.
  • The substep 140 of the method 100 from FIG. 3 for evaluating the image data stream with the deep neural network 71 is shown in more detail as a flowchart.
  • The features are then extracted in step 142. This takes place in the same way as described for step 116 with reference to FIG. 4.
  • The deep neural network then recognizes the orientation and the location of the respective container 2, 3 in step 143 and indicates a probability as to whether this container 2, 3 is transported lying or standing on the conveyor 5 (see the inference sketch after this list).
  • This information is then visualized in step 144 and output on a screen in accordance with FIG. 2. In this way, an operator can check whether the recognition is proceeding properly.
  • For a detected fallen container 3, a signal is output in step 145 in order to remove it from the conveyor 5, for example with a switch or a gripper arm.
  • Because the container mass flow M is recorded as an image data stream with the at least one camera 6 and the image data stream is evaluated by the image processing unit 7 with the deep neural network 71, the images of the image data stream can be evaluated on the basis of empirical values previously learned by the deep neural network 71 in order to classify the standing and fallen containers 2, 3 accordingly. Because it is possible to train the deep neural network 71 with images of the most varied container types and/or environmental conditions, it is no longer necessary to adapt the evaluation of the image data stream for each specific application. Consequently, the method according to the invention is particularly flexible and easy to use.
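
The marking and classification described above (a surrounding bounding box per container plus a class such as standing or fallen and a container type or size) is not tied to any particular data format. The following minimal sketch in Python merely illustrates one possible annotation structure; the field names and example values are assumptions for illustration and are not part of the application.

    from dataclasses import dataclass

    @dataclass
    class ContainerAnnotation:
        """One marked container in a training image (illustrative structure only)."""
        x_min: int            # surrounding box, pixel coordinates
        y_min: int
        x_max: int
        y_max: int
        orientation: str      # "standing" or "fallen"
        container_type: str   # e.g. an assumed container type or size class

    # Example annotations for one image of the training data set (values are invented).
    annotations = [
        ContainerAnnotation(120, 40, 180, 260, "standing", "0.5 l PET bottle"),
        ContainerAnnotation(300, 180, 520, 300, "fallen", "0.5 l PET bottle"),
    ]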
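
The automatic duplication of training images (cutting out image sections with a single standing or fallen container, rotating and enlarging them individually, changing brightness or contrast, and reassembling the sections into new images) could look roughly like the sketch below. OpenCV/NumPy, the function names and all numeric ranges are assumptions chosen for illustration; the application does not prescribe a specific library or parameter set.

    import random

    import cv2
    import numpy as np

    def augment_section(section):
        """Rotate, scale and re-expose a single container image section (illustrative values)."""
        h, w = section.shape[:2]
        angle = random.uniform(-15, 15)                   # individual rotation
        scale = random.uniform(0.9, 1.2)                  # individual enlargement
        M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
        section = cv2.warpAffine(section, M, (w, h), borderMode=cv2.BORDER_REFLECT)
        alpha = random.uniform(0.8, 1.2)                  # contrast (exposure parameter)
        beta = random.uniform(-20, 20)                    # brightness (exposure parameter)
        return cv2.convertScaleAbs(section, alpha=alpha, beta=beta)

    def compose_training_image(sections, canvas_size=(512, 512)):
        """Reassemble augmented sections into one additional training image.

        Assumes BGR sections that are each smaller than the canvas.
        """
        canvas = np.zeros((canvas_size[0], canvas_size[1], 3), dtype=np.uint8)
        for section in sections:
            section = augment_section(section)
            sh, sw = section.shape[:2]
            y = random.randint(0, canvas_size[0] - sh)
            x = random.randint(0, canvas_size[1] - sw)
            canvas[y:y + sh, x:x + sw] = section
        return canvas

Calling compose_training_image repeatedly on the same few original sections yields a large number of different synthetic images, which is the stated purpose of the duplication step.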
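
As one example of the multi-stage filtering mentioned for feature extraction, a simple edge filter can extract the outer boundary of an individual container. The sketch below uses a Canny filter from OpenCV with invented threshold values; the application only speaks of "edge filters or the like", so this particular filter chain is an assumption, not the method actually used.

    import cv2

    def container_outline(image_bgr):
        """Extract the outer boundary of a container with a simple edge filter."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)       # suppress sensor noise
        edges = cv2.Canny(blurred, 50, 150)               # thresholds are illustrative
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return max(contours, key=cv2.contourArea) if contours else None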
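
The application does not specify the architecture of the deep neural network 71 or the training procedure beyond the iterative presentation of images, markings and classifications. Purely as an illustrative stand-in, the sketch below fine-tunes a pre-trained Faster R-CNN detector from torchvision on two container classes ("standing", "fallen"); the choice of model, optimizer and hyperparameters is an assumption.

    import torch
    import torchvision
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

    NUM_CLASSES = 3  # background + "standing" + "fallen" (illustrative)

    def build_model():
        """A generic detection network as a stand-in for the deep neural network 71."""
        model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
        in_features = model.roi_heads.box_predictor.cls_score.in_features
        model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
        return model

    def train(model, data_loader, epochs=10, device="cuda"):
        """Iteratively feed images with boxes and class labels to the network."""
        model.to(device).train()
        optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
        for _ in range(epochs):
            for images, targets in data_loader:   # targets: dicts with "boxes" and "labels"
                images = [img.to(device) for img in images]
                targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
                loss_dict = model(images, targets)  # detection losses in training mode
                loss = sum(loss_dict.values())
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()

The data_loader is assumed to yield image tensors together with the bounding boxes and class labels from the annotated training data set.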
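
Evaluation of the image data stream (substep 140) can be sketched in the same assumed framework: each image is passed through the trained network, which returns localized containers with class probabilities, and a removal signal is raised for every container classified as fallen above a confidence threshold. The label index, threshold and signalling mechanism are invented for illustration.

    import torch

    FALLEN_LABEL = 2        # illustrative class index for "fallen"
    SCORE_THRESHOLD = 0.8   # illustrative confidence threshold

    @torch.no_grad()
    def evaluate_frame(model, frame_tensor, device="cuda"):
        """Localize containers in one image of the image data stream and flag fallen ones."""
        model.to(device).eval()
        prediction = model([frame_tensor.to(device)])[0]   # boxes, labels, scores
        fallen_boxes = [
            box.tolist()
            for box, label, score in zip(prediction["boxes"],
                                         prediction["labels"],
                                         prediction["scores"])
            if label.item() == FALLEN_LABEL and score.item() >= SCORE_THRESHOLD
        ]
        if fallen_boxes:
            # Placeholder for step 145: trigger a switch or gripper arm to remove the container.
            print(f"Fallen container(s) detected at {fallen_boxes}")
        return prediction

With build_model() and train() from the previous sketch, evaluate_frame(model, frame) would be called once per image of the camera's data stream.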

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Medical Informatics (AREA)
  • Automation & Control Theory (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

Method for detecting fallen and/or damaged containers (3) in a container mass flow, wherein the containers (2, 3) in the container mass flow are transported standing on a conveyor, the container mass flow is captured as an image data stream by means of at least one camera, and the image data stream is evaluated by an image processing unit, wherein the image data stream is evaluated by the image processing unit by means of a deep neural network in order to detect and localize the fallen and/or damaged containers (3).
EP20808055.6A 2019-12-03 2020-11-16 Procédé et dispositif de détection de conteneurs qui sont tombés et/ou sont endommagés dans un flux massique de conteneurs Pending EP4069440A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019132830.6A DE102019132830A1 (de) 2019-12-03 2019-12-03 Verfahren und Vorrichtung zur Erkennung von umgefallenen und/oder beschädigten Behältern in einem Behältermassenstrom
PCT/EP2020/082172 WO2021110392A1 (fr) 2019-12-03 2020-11-16 Procédé et dispositif de détection de conteneurs qui sont tombés et/ou sont endommagés dans un flux massique de conteneurs

Publications (1)

Publication Number Publication Date
EP4069440A1 true EP4069440A1 (fr) 2022-10-12

Family

ID=73455690

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20808055.6A Pending EP4069440A1 (fr) 2019-12-03 2020-11-16 Procédé et dispositif de détection de conteneurs qui sont tombés et/ou sont endommagés dans un flux massique de conteneurs

Country Status (5)

Country Link
US (1) US20230005127A1 (fr)
EP (1) EP4069440A1 (fr)
CN (1) CN114761145A (fr)
DE (1) DE102019132830A1 (fr)
WO (1) WO2021110392A1 (fr)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4200546A1 (de) * 1992-01-11 1993-07-15 Alfill Getraenketechnik Verfahren und vorrichtung zum behandeln von flaschen
DE20110686U1 (de) 2001-06-27 2002-08-01 Krones Ag Vorrichtung zum Erkennen liegender Gefäße
DE102007014802A1 (de) 2007-03-28 2008-10-09 Khs Ag Verfahren zur Überwachung, Steuerung und Optimierung von Abfüllanlagen für Lebensmittel, insbesondere für Getränkeflaschen
DE102009043976B4 (de) 2009-09-10 2021-07-29 Krones Aktiengesellschaft Fördereinrichtung und Verfahren zu deren Steuerung
DE102013207139A1 (de) * 2013-04-19 2014-10-23 Krones Ag Verfahren zur Überwachung und Steuerung einer Abfüllanlage und Vorrichtung zur Durchführung des Verfahrens
CA2949877C (fr) 2014-06-06 2022-07-12 Gebo Cermex Canada Inc. Dispositif et procede d'intervention sur ligne de convoyage
CN106000904B (zh) * 2016-05-26 2018-04-10 北京新长征天高智机科技有限公司 一种生活垃圾自动分拣***
DE102016124400A1 (de) * 2016-12-14 2018-06-14 Krones Ag Verfahren und Vorrichtung zum Erfassen von Störungen beim Objekttransport
JP6595555B2 (ja) * 2017-10-23 2019-10-23 ファナック株式会社 仕分けシステム
JP7131617B2 (ja) * 2018-03-06 2022-09-06 オムロン株式会社 照明条件を設定する方法、装置、システム及びプログラム並びに記憶媒体
DE102018105301B4 (de) * 2018-03-08 2021-03-18 Sick Ag Kamera und Verfahren zur Erfassung von Bilddaten
JP7076747B2 (ja) * 2018-05-09 2022-05-30 リョーエイ株式会社 分類器の学習支援システム、学習データの収集方法、検査システム
CN110154272B (zh) * 2019-05-17 2021-04-13 佛山市玖州智能装备技术有限公司 人工智能废品塑料瓶分拣方法
CN110321944A (zh) * 2019-06-26 2019-10-11 华中科技大学 一种基于接触网画质评估的深度神经网络模型的构建方法

Also Published As

Publication number Publication date
DE102019132830A1 (de) 2021-06-10
US20230005127A1 (en) 2023-01-05
WO2021110392A1 (fr) 2021-06-10
CN114761145A (zh) 2022-07-15

Similar Documents

Publication Publication Date Title
EP2092311B1 (fr) Dispositif d'inspection de bouteilles ou recipients analogues
EP2132129B1 (fr) Procédé de surveillance, de commande et d'optimisation d'installations de remplissage pour produits alimentaires, notamment pour bouteilles à boissons
EP2295157B1 (fr) Dispositif et procédé destinés au contrôle de fermetures de récipients
DE60028756T2 (de) Methode und vorrichtung zur handhabung von ausgeworfenen sprtizgussteilen
EP3625740B1 (fr) Système et méthode pour contrôler un flux de matière sur un point de noeud
EP2987136A1 (fr) Procédé de surveillance et de conduite d'une installation de remplissage et dispositif permettant la mise en uvre du procédé
EP2479123A1 (fr) Dispositif et procédé destinés au transport d'articles
EP3563941B1 (fr) Machine de nettoyage de bouteilles
EP2295156B1 (fr) Dispositif de transport avec moyens de détection d'articles renversés et son procédé de commande
EP3501676B1 (fr) Machine de nettoyage de bouteilles destinée au nettoyage de bouteilles
EP0996531B1 (fr) Procede et dispositif pour produire des corps creux en matiere plastique
WO2020003180A1 (fr) Dispositif et procédé d'inspection de sacs de transport pouvant être convoyés de manière suspendue
EP1600764A1 (fr) Procédé et dispositif d'inspection d'objets transparents
WO2021110682A1 (fr) Système de tri et procédé de tri pour tabac en feuilles
EP3541727A1 (fr) Procédé et dispositif de détection de perturbations lors d'un transport d'objets
EP4159329A1 (fr) Suppression des localisations erronées sur un élément sélecteur
EP3544780B1 (fr) Procédé et système pour le transport et la manipulation de récipients à boissons rassemblés en unités d'emballage ainsi que pour l'initialisation d'au moins un procédé déterminé dans le cas d'une dégradation d'une unité résultant d'une manipulation
EP4069440A1 (fr) Procédé et dispositif de détection de conteneurs qui sont tombés et/ou sont endommagés dans un flux massique de conteneurs
DE102019105834A1 (de) Greifvorrichtung, Vereinzelungsvorrichtung sowie Verfahren zum Greifen von Körpern und Verwendung einer Greifvorrichtung
DE19959623C2 (de) Verfahren und Anordnung zum Lokalisieren von zylinderförmigen Objekten
WO2023061840A1 (fr) Ligne de transformation alimentaire et procédé pour faire fonctionner une ligne de transformation alimentaire
EP2452291A1 (fr) Procédé pour déterminer en continu une position de préhension
WO2024017426A1 (fr) Dispositif et procédé de vidage de récipients
DE102022122554A1 (de) Verfahren zur Steuerung von Prozessen des Behälterhandlings und Behälterbehandlungsanlage zur Produktion, Abfüllung, Handhabung, Verpackung und/oder Förderung von Behältern
EP4276450A1 (fr) Dispositif et procédé d'inspection de récipients avec détection de position

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220408

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240513