AU709459B2 - Systems and methods for measuring at least two visual properties of an object - Google Patents


Info

Publication number
AU709459B2
AU709459B2 (application AU32944/95A, published as AU3294495A)
Authority
AU
Australia
Prior art keywords
parameter
measurement
tray
determining
signals
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU32944/95A
Other versions
AU3294495A
Inventor
Anthony L Adriaansen
Roger N. Caffin
David W. Crowe
John DEAR
Graham J. Higgerson
David E. Turvey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Commonwealth Scientific and Industrial Research Organization CSIRO
Original Assignee
Commonwealth Scientific and Industrial Research Organization CSIRO
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AUPM849394A0 (external priority)
Application filed by Commonwealth Scientific and Industrial Research Organization CSIRO
Priority to AU32944/95A
Publication of AU3294495A
Application granted
Publication of AU709459B2
Anticipated expiration
Ceased (current legal status)

Landscapes

  • Investigating Or Analysing Materials By Optical Means (AREA)

Description

SYSTEMS AND METHODS FOR MEASURING AT LEAST TWO VISUAL PROPERTIES OF AN OBJECT

TECHNICAL FIELD

This invention relates to apparatuses and methods for determining at least two different visual properties of an object.
BACKGROUND ART

The price achieved by a consignment of raw wool at auction in Australia is currently dependent on a combination of the measured values for some properties as listed in the pre-sale catalogue and the buyer's assessment of a sample of the consignment. Given the demands from the international market for more objective assessment of all wool properties, and the potential improvement in wool prices and other economies if the sale could be based entirely on objective measurement, work has been done to quantify most of the remaining known but unmeasured wool properties, typically known in the wool trade as "style" or "type". There is a need for objective assessment of wool staples to provide "style" or "type" measurements.
OBJECTS OF INVENTION

Objects of this invention are to provide apparatuses and methods for determining at least two different visual properties of an object.
DISCLOSURE OF INVENTION

According to a first embodiment of this invention there is provided a method for determining at least two different visual properties of an object, comprising:
(I) determining a first parameter of the object by: locating the object in a first parameter measurement interaction volume; illuminating the object in the first parameter measurement interaction volume with a light field selected from the group consisting of a substantially uniform measurement light field and an effectively uniform measurement light field to produce non surface relief measurement outgoing light containing visual information related to non surface relief features of the object; detecting the non surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the first parameter; and determining the first parameter from the signals; and
(II) determining a second parameter of the object by: locating the object in a second parameter measurement interaction volume; illuminating the object in the second parameter measurement interaction volume with a directional measurement light field so as to produce surface relief measurement outgoing light containing visual information related to surface relief features of the object; detecting the surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the second parameter; and determining the second parameter from the signals;
wherein at least one of steps (I) and (II) comprises illuminating said object at a location selected from the group consisting of: on a reflecting surface, on a gloss black surface, on a substantially non-reflecting surface, on a flat black matt surface, against a reflecting background, against a substantially non-reflecting background, against a gloss black background, and against a flat black matt background.
Steps (I) can be performed before, at the same time as, or after steps (II). Steps (I) and steps (II) can be performed automatically or non-automatically, such as manually or batchwise.
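The two-step method above can be sketched in outline as follows. This is an illustrative sketch only, not part of the patent disclosure: the function names, the use of a per-channel mean as the "first parameter" and of a mean horizontal gradient as the "second parameter" are assumptions chosen merely to make steps (I) and (II) concrete.

```python
import numpy as np

def measure_first_parameter(image, white_reference=None):
    """Step (I): a non-surface-relief parameter (here, mean colour) from an
    image taken under a uniform or effectively uniform light field."""
    if white_reference is not None:
        # Effectively uniform field: normalise against a white reference
        image = image / np.clip(white_reference, 1e-6, None)
    return image.mean(axis=(0, 1))  # per-channel mean as the parameter

def measure_second_parameter(image):
    """Step (II): a surface-relief parameter (here, texture strength) from an
    image taken under directional light that emphasises shadowing."""
    grad = np.abs(np.diff(image.astype(float), axis=1))
    return float(grad.mean())  # mean gradient as a crude relief measure

# The two steps are independent and may run in either order or together.
rgb = np.random.rand(64, 64, 3)        # stand-in for the uniform-field image
directional = np.random.rand(64, 64)   # stand-in for the directional-field image
colour = measure_first_parameter(rgb)
relief = measure_second_parameter(directional)
```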
According to a second embodiment of this invention there is provided a method for determining at least two different visual properties of an object, comprising:
(I) determining a first parameter of the object by: locating the object in a first parameter measurement interaction volume; illuminating the object in the first parameter measurement interaction volume with an effectively uniform measurement light field comprising a non flat light field to produce a non surface relief measurement outgoing light containing visual information related to non surface relief features of the object; normalising the image of the object against an image of a substantially uniform flat white or substantially uniform flat near white object; detecting the non surface relief measurement outgoing light in the same light field and generating signals therefrom whereby the signals are a function of the first parameter; and determining the first parameter from the signals; and
(II) determining a second parameter of the object by: locating the object in a second parameter measurement interaction volume; illuminating the object in the second parameter measurement interaction volume with a directional measurement light field so as to produce surface relief measurement outgoing light containing visual information related to surface relief features of the object; detecting the surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the second parameter; and determining the second parameter from the signals.
According to a third embodiment of this invention there is provided an apparatus for determining at least two different visual properties of an object, comprising:
(I) means for determining a first parameter of the object comprising: means for locating the object in a first parameter measurement interaction volume; means for illuminating the object in the first parameter measurement interaction volume with a light field selected from the group consisting of a substantially uniform measurement light field and an effectively uniform measurement light field to produce non surface relief measurement outgoing light containing visual information related to non surface relief features of the object; a detector for detecting the non surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the first parameter, the detector being operatively associated with the means for illuminating of (I); and means for determining the first parameter from the signals, the means for determining being operatively associated with the detector of (I); and
(II) means for determining a second parameter of the object comprising: means for locating the object in a second parameter measurement interaction volume; means for illuminating the object in the second parameter measurement interaction volume with a directional measurement light field so as to produce surface relief measurement outgoing light containing visual information related to surface relief features of the object; a detector for detecting the surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the second parameter, the detector being operatively associated with the means for illuminating of (II); and means for determining the second parameter from the signals, the means for determining being operatively associated with the detector of (II);
wherein in at least one of said measurement interaction volumes of (I) and (II) there is a surface or background proximate a location where said object is illuminated, or means for locating a surface or background proximate a location where said object is illuminated, said surface or background being selected from the group consisting of: a reflecting surface, a gloss black surface, a substantially non-reflecting surface, a flat black matt surface, a reflecting background, a substantially non-reflecting background, a gloss black background, and a flat black matt background, wherein when said object is illuminated at said location said object is at a location selected from the group consisting of: on a reflecting surface, on a gloss black surface, on a substantially non-reflecting surface, on a flat black matt surface, against a reflecting background, against a substantially non-reflecting background, against a gloss black background, and against a flat black matt background.
The apparatus of the second embodiment may be arranged to determine at least two different visual properties of an object, automatically or non-automatically such as manually or batchwise.
Typically the substantially uniform measurement light field is also a substantially constant measurement light field, and, in particular, is a flat light field. Typically the use of an effectively uniform measurement light field comprises illuminating the object with a non flat light field and normalising the image of the object against an image of a substantially uniform flat white or substantially uniform flat near white object as detected by detecting the outgoing light (by the detector) in the same light field.
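The normalisation against a flat white reference described above amounts to a per-pixel flat-field correction. The following is a hedged sketch, not the patent's implementation; the `eps` guard and the rescaling by the white image's mean are assumptions added to keep the output in a sensible range.

```python
import numpy as np

def flat_field_normalise(image, white_image, eps=1e-6):
    """Correct a non-flat light field by dividing the object image by an
    image of a uniform flat (near-)white reference taken in the SAME
    light field, then rescaling by the reference's mean brightness."""
    white = np.clip(white_image.astype(float), eps, None)
    return image.astype(float) / white * white.mean()

# A light field that falls off towards one side of the tray...
field = np.linspace(1.0, 0.5, 100)
white_img = 0.9 * field   # white card seen through the same field
obj_img = 0.5 * field     # uniform grey object seen through the same field
corrected = flat_field_normalise(obj_img, white_img)
# After normalisation the grey object reads uniformly despite the uneven field.
```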
The first parameter(s) measurement interaction volume may be different from, overlap with or be coincident with the second parameter(s) measurement interaction volume. When the first and second parameter(s) measurement interaction volumes are not spatially separated, the volumes are illuminated with the substantially uniform measurement light field or the effectively uniform measurement light field and the directional measurement light field at different times. In one particular form of the invention the first parameter(s) measurement interaction volume is different to the second parameter(s) measurement interaction volume. The apparatus may include means to separate the first parameter(s) measurement interaction volume from the second parameter(s) measurement interaction volume. The object may be measured (i.e. illuminated) on a reflecting surface (eg a gloss black surface) or substantially non-reflecting surface (eg a flat black matt surface) or against a reflecting background (eg a gloss black background) or substantially non-reflecting background (eg a flat black matt background) in the first and/or second interaction volume(s). Typically at least one of steps (I) and (II) (typically at least step (I), and more typically both steps (I) and (II)) comprises illuminating the object on a flat black matt surface or on a gloss black surface or against a flat black matt background or against a gloss black background.
The detector for detecting the non surface relief measurement outgoing light and the detector for detecting the surface relief measurement outgoing light may be separate detectors but are typically the same detector. Generally the detector is an image detector such as a video camera, line scan camera or a flying spot scanner.
Generally the video camera is a colour video camera which has a high resolution lens, such as an SLR lens of the type made and sold by Olympus, Pentax or Nikon (as opposed to a C-mount lens). Examples of suitable video cameras are those of broadcast quality, and studio or ENG cameras.
The object may be a solid or a collection of solid materials, such as a collection or clump of fibres, or other matter. Examples of objects include mechanical objects, mineral objects such as diamonds and other crystals, organic and inorganic contaminants, fibrous objects, randomly shaped objects, spherical objects or cylindrical objects. Fibrous objects may be woven or twisted fibrous objects such as a clump of fibres and in particular fibre staples, such as for example raw wool staples. The fibrous objects may be synthetic fibres or natural fibres, dyed fibres, a textile product such as a strand, filament or yarn, or clumps or staples thereof. Other examples of clumps, strands, filaments or yarns may be ones of fibreglass, hessian, nylon, glass, polynosic and polyester, abaca, silk, jute, flax and cellulose fibres (including paper, recycled paper, corn stalks, sugar cane, wood, wood shavings, bagasse, wood chips), regenerated fibres such as viscose, rayon, cuprammonium rayon and cellulose acetate, sisal, carbon, stainless steel, vegetable fibrous material, polyolefin clumps, strands, filaments or yarns such as polyethylenes and polypropylene, steel, boron, copper, brass, teflon, dacron, mylar, aluminium, aluminium alloy, polyamide, polyacrylic, or absorbent clumps, strands, filaments or yarns such as nylon 66, polyacrylonitrile, or polyvinyl alcohol and absorbent types of polyesters or polyacrylics, edible vegetable clumps, strands, filaments or yarns such as wheat or flax, or inedible vegetable clumps, strands, filaments or yarns such as wood pulp or cotton, animal clumps, strands, filaments or yarns such as meat, alpaca, wool fibres such as wool fibres from sheep or other wool producing animals, hairs such as human hairs, goat hairs, cattle hairs, or feathers, yarns including wool and cotton yarns (especially dyed wool, rabbit hair, kangaroo fur, mohair and cotton yarns, as well as staples), string, wire, or optical fibres, for example.
The interaction producing non surface relief measurement outgoing light is typically one or a combination of non specular reflection, scattering, fluorescence, stimulated emission, polarisation rotation, and other polarisation effects, or optical absorption.
The interaction producing surface relief measurement outgoing light is typically one or a combination of reflection, scattering, fluorescence, stimulated emission, shadowing, polarisation rotation, and other polarisation effects, occlusion, or optical absorption.
Means for illuminating the first parameter(s) measurement interaction volume may be one or more light sources selected from, for example, incandescent sources such as tungsten filament sources, tungsten-halogen lamps including quartz-iodine lamps, broad band light sources approximating white light, or a combination of broad band light sources approximating white light. Advantageously, a wide band, visible light source(s) that approximates a black body radiator and maintains a constant colour temperature is used.
Means for illuminating the second parameter(s) measurement interaction volume may be one or more light sources selected from, for example, incandescent sources such as tungsten filament sources, tungsten-halogen lamps including quartz-iodine lamps, fluorescent lights, vapour lamps such as halogen lamps including sodium and iodine vapour lamps, discharge lamps such as a xenon arc lamp and a Hg arc lamp, solid state light sources such as photo diodes, super radiant diodes, light emitting diodes, laser diodes, electroluminescent light sources, frequency doubled lasers, laser light sources including rare gas lasers such as an argon laser, argon/krypton laser, neon laser, helium neon laser, xenon laser and krypton laser, carbon monoxide and carbon dioxide lasers, metal ion lasers such as cadmium, zinc, mercury or selenium ion lasers, lead salt lasers, metal vapour lasers such as copper and gold vapour lasers, nitrogen lasers, ruby lasers, iodine lasers, neodymium glass and neodymium YAG lasers, dye lasers such as a dye laser employing rhodamine 640, Kiton Red 620 or rhodamine 590 dye, and a doped fibre laser.
Direct or indirect illumination may be used. Indirect illumination may employ optical fibres, reflectors such as mirrors or white surfaces, or lenses or other refractive optical devices, for example.
The first parameter(s) may be colour, colour distribution, shape, diameter, area, chemical composition, number of parts, width, length, absorptivity, reflectivity, dielectric constant, fluorescence, position, orientation, or density, for example.
The second parameter(s) may be one or more surface relief features such as shadowing including self shadowing, surface texture including surface periodicity or regularity, or other surface detail, surface waviness, surface crimp, surface roughness or surface profile, for example.
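Surface periodicity such as crimp can, for illustration, be estimated from a one-dimensional intensity profile by locating the dominant peak of its Fourier spectrum. This is a sketch of one plausible technique under that assumption, not the method the patent claims (its crimp analysis is depicted in Fig. 7):

```python
import numpy as np

def dominant_period(profile):
    """Estimate the dominant spatial period (e.g. a crimp wavelength,
    in pixels) of a 1-D intensity profile from its Fourier spectrum."""
    profile = np.asarray(profile, dtype=float)
    profile = profile - profile.mean()        # remove the DC component
    spectrum = np.abs(np.fft.rfft(profile))
    freqs = np.fft.rfftfreq(profile.size)     # cycles per pixel
    k = spectrum[1:].argmax() + 1             # skip the zero-frequency bin
    return 1.0 / freqs[k]

# Synthetic profile with a 16-pixel "crimp" wave:
x = np.arange(512)
profile = np.sin(2 * np.pi * x / 16)
```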
The apparatus of the invention may include: (dd) means for determining statistical information in respect of a measurement of the first and/or second parameter(s) for an object, operatively associated with the means for determining the first and/or second parameters; and optionally (ddd) means for determining statistical information in respect of a plurality of measurements of (dd) for a plurality of objects, operatively associated with the means for determining of (dd).
Examples of statistical information include mean, standard deviation, coefficient of variation, variance, skewness, kurtosis, and other moments about the mean, spline fits, line fits including linear, exponential, logarithmic, multiple and polynomial regressions, fractal fitting, mode, median, distribution fits including normal, Gaussian, Fermi, Poisson, binomial, Weibull, parabolic, frequency, probability, cumulative and top hat distributions, data smoothing including running medians, means and least squares, table formation such as histograms and two way contingency, data manipulation for graphs, forecasting, probability statistics, simulations, pattern recognition, t test, chi square test, sample size, Wilcoxon signed-rank test, rank sum test, Kolmogorov-Smirnov test and boundary value and limit statistics. A more detailed description of statistical techniques is disclosed in G.E.P. Box, W.G. Hunter and J.S. Hunter, Statistics for Experimenters, John Wiley & Sons, Inc, New York, USA, 1978, the contents of which are incorporated herein by cross reference.
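A few of the simpler statistics listed above (mean, standard deviation, coefficient of variation, median) can be computed with the Python standard library alone. The per-staple measurement values here are hypothetical, for illustration only:

```python
import statistics

# Hypothetical per-staple measurements (e.g. a colour or crimp parameter)
measurements = [17.2, 18.1, 16.9, 17.8, 18.4, 17.5]

mean = statistics.mean(measurements)
sd = statistics.stdev(measurements)   # sample standard deviation
cv = 100.0 * sd / mean                # coefficient of variation, in percent
median = statistics.median(measurements)
```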
Generally, the apparatus of the second embodiment further comprises at least one of the following items: (i) means for determining the first parameter(s) of the object from measurement parameter(s) determined from the signals which are a function of the first parameter(s), operatively associated with the detector(s); (ii) means for storing the measurement parameter(s) determined from the signals which are a function of the first parameter(s), operatively associated with the detector(s); (iii) means for storing the first parameter(s) of the object, operatively associated with the means for determining the first parameter(s); (iv) means for retrieving the measurement parameter(s) of the object, operatively associated with the means for storing the measurement parameter(s) determined from the signals which are a function of the first parameter(s); and (v) means for retrieving the first parameter(s) of the object, operatively associated with the means for storing the first parameter(s).
BRIEF DESCRIPTION OF DRAWINGS

Fig. 1 is a diagram of an implementation of an Instrument for measuring, automatically, the style of wool staples;
Fig. 1a is a diagram of a cross section through the centre of the Tray Dropper Unit of the instrument of Fig. 1;
Fig. 1b is a diagram of a cross section through the centre of the Tray Lifter Unit of the instrument of Fig. 1;
Fig. 2 is a diagram of a Staple Tray used on the Instrument of Fig. 1;
Fig. 3 is a detailed diagram of the Optical Set-up of the implementation of the Instrument of Fig. 1;
Fig. 4 is a diagram of the Optical Zones as seen by the TV colour camera of the Instrument of Fig. 1;
Fig. 5 is a flow diagram depicting the steps of staple colour analysis used by the Instrument of Fig. 1;
Fig. 6 is a flow diagram depicting the steps of staple geometry analysis used by the Instrument of Fig. 1; and
Fig. 7 is a flow diagram depicting the steps of staple crimp analysis used by the Instrument of Fig. 1.
BEST MODE AND OTHER MODES FOR CARRYING OUT THE INVENTION
Referring to Figure 1, an Instrument 1 for determining style properties of wool is depicted. Instrument 1 comprises several major subsystems:
Mechanical Tray Handling subsystem 2 for passing Tray Stack 701b containing wool staples on a Conveyor 300 under the TV colour camera 405, consisting of a Tray Dropper unit 100, a Tray Lifter unit 200 and a Conveyor 300, illustrated in Figure 1, with cross sections of Tray Dropper unit 100 in Figure 1a and of Tray Lifter unit 200 in Figure 1b;
Tray Stack 701b for the carriage of the wool staples to be measured through the Mechanical Tray Handling subsystem 2, as illustrated in Figure 2;
Imaging subsystem 400 comprising a Lighting arrangement 401 and a Baffles set 410, as illustrated in Figure 3, to combine the necessary lighting fields within the field of view of TV colour camera 405, both attached to the Mechanical Tray Handling subsystem 2;
Microcontroller 601 interfaced to actuators and sensors on the Mechanical Tray Handling subsystem 2 and to the Host computer 610, as shown in Figure 1;
Electrical Power Supply 620 and a Compressed Air Supply 630 to power the Mechanical Tray Handling subsystem 2, as shown in Figure 1; and
Host computer 610 with an internal or optionally external image acquisition interface (also known as a frame-grabber) linked to the TV colour camera 405, as shown in Figure 1.
Mechanical Tray Handling subsystem 2

Tray Dropper unit 100

The Tray Dropper unit 100 has four major components: the Tray Dropper Rails 101a,b, the Tray Dropper Flaps 102a,b, the Dropper Back Wall 108 and the Dropper Front Bench 111.
The two Tray Dropper Rails 101a,b are located on the front and the back of the Conveyor 300 at the input end. Each Dropper Rail consists of a stiff straight rail of approximately the length of a tray, arranged parallel to the Conveyor 300 and moving up and down. The movements are driven by Tray Dropper Rail pneumatic actuators 103a,b, which are double-acting pneumatic cylinders with built-in guide rails, typically Festo part FEN-25-80 plus DSNU-25-80-P; the stroke is vertical.
Pneumatic needle valves, typically Festo part GRLA-1/8, are included on both inlets to each pneumatic cylinder to control the speed of operation in both up and down directions. The stroke length of the Tray Dropper Rail pneumatic actuators 103a,b is 80 mm each and is not adjustable. The Tray Dropper Rail pneumatic actuators 103a,b are mounted on the Conveyor side panels 301a,b. Operation of the Tray Dropper Rails 101a,b is controlled by the Microcontroller 601 via Microcontroller-Dropper control and sense line 603 through a single-pole double-throw electro-pneumatic valve 605, typically Festo part 6068-MFH-5-3.3. There are Tray Dropper Rail pneumatic actuator sensors 104a-d, typically Festo part SMEO-4-K-LED-24, at both ends of the Tray Dropper Rail pneumatic actuators 103a,b to detect fully-up and fully-down positions. These sensors are linked to the Microcontroller 601 via the Microcontroller-Dropper control and sense line 603.
The two Tray Dropper Flaps 102a,b are located parallel to the Tray Dropper Rails 101a,b and outside them. Each flap consists of a stiff angle section mounted in an inverted L position, with a Tray Dropper Flap hinge 107a/b at the bottom connected to the Tray Dropper Frame 112, which is attached to the Conveyor side panels 301a,b. The tops of the Tray Dropper Flaps 102a,b point inwards across the Conveyor 300 and are moved in and out by Tray Dropper Flap pneumatic actuators 105a,b, which are double-acting pneumatic cylinders, typically Festo part 10-A; the stroke is horizontal. A pneumatic needle valve, typically Festo part GRLA-M5-Pk3, is included on one inlet to each pneumatic cylinder to control the speed of operation in the inwards direction. The stroke length of the Tray Dropper Flap pneumatic actuators 105a,b is about 10 mm. Operation of the Tray Dropper Flaps 102a,b is controlled by the Microcontroller 601 via Microcontroller-Dropper control and sense line 603 through a single-pole double-throw electro-pneumatic valve 606, typically Festo part 6068-MFH-5-3.3. There are Tray Dropper Flap pneumatic actuator sensors 106a-d, typically Festo part SME-3-LED-24, at both ends of the Tray Dropper Flap pneumatic actuators 105a,b to detect fully-in and fully-out positions. These sensors are linked to the Microcontroller 601 via the Microcontroller-Dropper control and sense line 603. The Dropper Back Wall 108 is a flat plate rising vertically from the back of the Tray Dropper unit 100 with two vertical Dropper Back Wall guides 109 and 110, one at each end of the Dropper Back Wall 108. The Dropper Back Wall 108 is located with respect to the Tray Dropper Rails 101a,b and Tray Dropper Flaps 102a,b such that a Tray Stack 701b correctly located on the Tray Dropper Flaps 102a,b is just touching the Dropper Back Wall 108; the two Dropper Back Wall guides 109 and 110 are located so as to constrain the two ends of the Tray Stack 701b in the stack.
The Dropper Front Bench 111 is a flat sheet arranged horizontally on the Tray Dropper Frame 112. This provides a safety guard over the Tray Dropper Flaps 102a,b and serves as a temporary work surface for the operator while stacking trays. There are a number of optical beam sensors on the Tray Dropper unit 100. A Tray-Stack-Fault sensor 505a,b is integrated onto the front Tray Dropper Flap 102a and the Dropper Back Wall 108. This sensor passes a beam from the Tray-Stack-Fault sensor emitter 505a on the Dropper Back Wall 108 to the Tray-Stack-Fault sensor receiver 505b on the front Tray Dropper Flap 102a through the stack of Tray Stack 701b. The beam passes above the bottom Tray 701 in the stack and under the second bottom Tray 701, through the Tray inter-slot gap 703 in the trays. The beam is angled slightly downwards, such that it can only pass through the Tray inter-slot gap 703 if the Tray Stack 701b are correctly stacked on the Tray Dropper unit 100.
Any vertical mis-stacking of the Tray Stack 701b will cause the beam to be blocked.
The Tray-Stack-Fault sensor 505a,b is typically an Erwin Sick model WLL6-P122.
There is a Tray-on-Stack sensor 501 located on the Tray Dropper Frame 112 at the outer end, and a Tray-on-Stack sensor reflector 502 mounted on the Tray Dropper Frame 112 at the inner end of the Tray Dropper unit 100. The beam from the Tray-on-Stack sensor 501 passes along the Conveyor 300 to hit the Tray-on-Stack sensor reflector 502, which reflects the beam back along the same path to the Tray-on-Stack sensor 501. This path is so positioned that the bottom Tray 701 in the stack on the Tray Dropper unit 100 will block the beam. The Tray-on-Stack sensor 501 is typically an Erwin Sick unit model WL6-P172; the Tray-on-Stack sensor reflector 502 is typically an Erwin Sick model PL31/9210. There is a Tray-under-Dropper sensor 503 and Tray-under-Dropper sensor reflector 504 located on the Conveyor 300 in such a position that the beam is blocked by a Tray 701 dropped onto the Conveyor 300 by the Tray Dropper unit 100. This blockage is interpreted to mean there is something (typically a Tray 701 but possibly unwanted rubbish) somewhere on the input end of the Conveyor 300. The Tray-under-Dropper sensor 503 is typically an Erwin Sick model WL6-P172; the Tray-under-Dropper sensor reflector 504 is typically an Erwin Sick model PL31/9210.
Tray Lifter unit 200

The Tray Lifter unit 200 is similar to the Tray Dropper unit 100 but uses a Tray Lifter Pawls 202 arrangement instead of flaps. The Tray Lifter Rails 201a,b are similar in design to the Tray Dropper Flaps 102a,b, and are driven by a similar arrangement of similar pneumatic equipment: Tray Lifter Rail pneumatic actuators 203a,b with Tray Lifter Rail pneumatic actuator sensors 204a-d, controlled by a single-pole double-throw electro-pneumatic valve 607. The Tray Lifter Pawls 202 are parallel to the Tray Lifter Rails 201a,b and are so arranged as to rotate upwards when a Tray 701a is lifted from the surface of the Conveyor 300. When the Tray 701a has passed, the Tray Lifter Pawls 202 descend into the rest position, such that the lifted Tray 701a can rest on them. The Tray Lifter unit 200 has vertical Tray Lifter Back guides 208a,b to provide vertical guidance to the outgoing Tray 701a.
There are a number of optical beam sensors on the Tray Lifter unit 200. There is a Tray-at-End sensor 506 located on the Conveyor 300 near the Conveyor driven roller 303 such that a light beam passes at right angles across the surface of the Conveyor 300 about 10 mm above the Conveyor belt 305 to sense when a Tray 701a reaches the output end under the Tray Lifter unit 200. The Tray-at-End sensor 506 is typically an Erwin Sick model WLL6-P122. Blockage of this beam is interpreted to mean that a Tray 701a has reached the end of the Conveyor 300 and is correctly positioned to be lifted by the Tray Lifter unit 200. There is a Tray-under-Lifter sensor 507 and a Tray-under-Lifter reflector 508, similar to the Tray-under-Dropper sensor 503 and Tray-under-Dropper sensor reflector 504, mounted on the output end of the Conveyor 300. The beam is positioned such that it will be blocked by a Tray 701a on the Conveyor 300 anywhere under the Tray-under-Lifter sensor 507, even when the Tray 701a is not interrupting the Tray-at-End sensor 506. This blockage is interpreted to mean there is something (usually a Tray 701a but possibly rubbish) somewhere on the output end of the Conveyor 300. The Tray-under-Lifter sensor 507 is typically an Erwin Sick model WL6-P172; the Tray-under-Lifter reflector 508 is typically an Erwin Sick model PL31/9210.
Conveyor 300

The Conveyor 300 consists of two Conveyor side panels 301a,b onto which the Tray Dropper unit 100, the Tray Lifter unit 200 and several optical sensors are attached, a Conveyor free roller 302 at the input end and a Conveyor driven roller 303 at the output end, a Conveyor belt motor 304 to drive the Conveyor driven roller 303, and a Conveyor belt 305 around the two rollers. The top surface of the Conveyor belt 305 moves from the input end to the output end.
The optical sensors attached to the Conveyor 300 include the Tray-under-Dropper sensor 503, the Tray-under-Lifter sensor 507, the Tray-at-End sensor 506, and three sensors located at the middle of the Conveyor 300 in the region seen by the TV colour camera 405: Tray Movement sensor A 509, Tray Movement sensor B 510 and Tray Movement sensor C 511. Each Tray Movement sensor places a sensing beam at right angles across the Conveyor 300 at about 10 mm above the surface of the Conveyor belt 305. Tray Movement sensor A 509 and Tray Movement sensor C 511 are placed across the Conveyor 300 with a separation corresponding to the distance between two tray slots, as shown in Figure 4. Their combined outputs provide a signal when a Tray 701a has advanced such a distance that the next Tray Slot 702 has moved into the TV camera field of view 460. Tray Movement sensor B 510 is placed in between Tray Movement sensor A 509 and Tray Movement sensor C 511, one and a half slot spacings from Tray Movement sensor A 509 and one half a slot spacing from Tray Movement sensor C 511, such that the beam from Tray Movement sensor B 510 is normally blocked when one or both of the beams from Tray Movement sensor A 509 and Tray Movement sensor C 511 are not blocked, and vice versa. Tray Movement sensor A 509, Tray Movement sensor B 510 and Tray Movement sensor C 511 are typically Erwin Sick part WLL6-P122. It is not possible for a Tray 701a to be on Conveyor 300 without blocking at least one of the beams from the set of Tray-under-Dropper sensor 503, Tray-under-Lifter sensor 507 and Tray Movement sensor B 510.
Staple Tray

A Staple Tray 701 is a rectangular tray approximately 715 mm long by 200 mm wide, with typically seventeen slots in it. Each Tray Slot 702 is about 180 mm long by about 15 to 30 mm wide at the base and 10 mm deep, arranged generally as shown in Figure 2. The slots may be offset as shown in Figure 2 so as to maintain a separation of about 10 mm between trays when correctly stacked, or other arrangements may be made to ensure that separation. There is a Tray inter-slot gap 703 as shown in Figure 2 between each Tray Slot 702 which is used by the Tray-Stack-Fault sensor 505a,b to check for correct tray stacking. An extended area is maintained at one end of Tray 701 to permit the attachment of a barcode label 642 or other means of automatic identification.
The under-side of the Tray 701 may be smooth or may have small protrusions to ensure correct location and stacking. The material of the Tray 701 is of a black colour. Typically it is made of plastic by vacuum-moulding. The top surface is smooth and of matt finish or gloss finish; the under surface is smooth but otherwise not specified. A tray newly dropped onto Conveyor 300 is referred to as Tray 701; a tray proceeding through the TV camera field of view 460 or on to the Tray Lifter unit 200 is referred to as Tray 701a; a stack of trays on the Tray Dropper unit 100 is referred to as Tray Stack 701b.
Imaging subsystem 400

One arrangement of the Imaging subsystem 400 has four major components: the Colour Region Lights 421-426, the Crimp Region Lights 431-432, the baffles 411-415 and the TV colour camera 405. This is described as follows.
Colour Region Lights 421-426

There are six quartz-halogen (or tungsten-iodine) Colour Region Lights 421-426 with built-in dichroic reflectors for the Colour Region 420. These are arranged in two pairs 421,422 and 423,424 illuminating across the top tray slot 710 and one pair 425,426 illuminating along the length of the top tray slot 710 as shown in Figure 4. The beams from the Colour Region Lights 421-426 are inclined at 30 degrees to the vertical. The Colour Region Lights 421-426 have round Infra Red block filters 428a-f in front of them to prevent heat from reaching the samples in Tray 701a. The placement of the Colour Region Lights 421-426 is designed to provide an even lighting field at the horizontal surface plane of the Tray 701 generally in the Colour Region 420 as described in "Operation of Instrument 1, Adjustment of Colour Region Lights 421-426". The lights are manufactured by General Electric and are 12v 50w type EXN.
The Colour Region Lights 421-426 are supported by the Colour Region Light support frame 427 which is supported by the TV Camera support plate 402.
Provision is made for adjustment of the position of the Colour Region Lights 421-426 to create a flat field, as detailed under "Operation of Instrument 1". The Colour Region Lights 421-426 are all connected to Lighting power distribution block 440 by equal length Colour Light leads 441-446. Lighting power distribution block 440 is supplied with 12.0 volts from Electrical Power Supply 620.
Crimp Region Lights 431-432

There are two quartz-halogen (or tungsten-iodine) Crimp Region Lights 431-432 with built-in dichroic reflectors for the Crimp Region 430. The beam from the Crimp Region Lights 431-432 is angled across the length of the bottom tray slot 712 as shown in Figure 3. The beams from the Crimp Region Lights 431-432 are inclined at about 30 degrees to the horizontal. The Crimp Region Lights 431-432 have round Infra Red block filters 438a,b in front of them to prevent heat from reaching the samples in Tray 701a. The placement of the Crimp Region Lights 431-432 is designed to provide a fairly even light field at the horizontal surface of the Tray 701 generally in the Crimp Region 430, but this does not have to be as uniform as for the Colour Region 420. The lights are manufactured by General Electric and are 12v 50w type EXZ. The Crimp Region Lights 431-432 are mounted close together on Crimp Region Lights support frame 433, which is supported by the Colour Region Light support frame 427.
The Crimp Region Lights 431-432 are all connected to Lighting power distribution block 440 by equal length Crimp Light leads 447-8.
Baffles set 410

There are five baffles 411-415 arranged around the Colour Region 420 and Crimp Region 430 on baffle support frame 416 attached to TV Camera support plate 402 as shown in Figure 3. Baffle 411 is arranged to prevent illumination from Colour Region Lights 421 & 423 from landing on Crimp Region 430. Baffle 412 is arranged to prevent illumination from Colour Region Lights 422 & 424 from landing on Crimp Region 430. Baffle 413 is arranged to prevent illumination from Colour Region Light 425 from landing on Crimp Region 430. Baffle 414 is arranged to prevent illumination from Colour Region Light 426 from landing on Crimp Region 430. Baffle 414 has a square hole in it at the bottom, situated adjacent to bottom tray slot 712, such that light from Crimp Region Lights 431-432 falls on Crimp Region 430. The first four members of Baffles set 410 listed above are placed symmetrically about the centre of TV camera field of view 460. Baffle 415 is arranged to prevent illumination from Colour Region Light 426 from landing on the Crimp Region, and also to prevent light from Crimp Region Lights 431-432 from landing on Colour Region 420. Baffle 415 is supported by Baffle 414.
All baffles 411-415 and baffle support frame 416 are painted matt black to minimise reflected light.
A second arrangement of the Imaging subsystem 400 is described as follows.
Colour Region Lights 421 & 423

There are two quartz-halogen (or tungsten-iodine) Colour Region Lights 421 & 423 with built-in dichroic reflectors for the Colour Region 420. These are arranged as shown in Figure 3, illuminating across the top tray slot as shown in Figure 4. The beams from the Colour Region Lights 421 & 423 are inclined at 30 degrees to the vertical. The Colour Region Lights 421 & 423 have round Infra Red block filters 428a,c in front of them to prevent heat from reaching the samples in Tray 701a.
The placement of the Colour Region Lights 421 & 423 is designed to provide a moderately even lighting field at the horizontal surface plane of the Tray 701 generally in the Colour Region 420 substantially as described in "Operation of Instrument 1, Adjustment of Colour Region Lights 421-426". The lights are manufactured by General Electric and are 12v 50w type EXN.
The Colour Region Lights 421 & 423 are supported by the Colour Region Light support frame 427 which is supported by the TV Camera support plate 402.
Provision is made for adjustment of the position of the Colour Region Lights 421 & 423 to create a moderately flat field, substantially as detailed under "Operation of Instrument 1". The Colour Region Lights 421 & 423 are connected to Lighting power distribution block 440 by equal length Colour Light leads 441 & 443.
Lighting power distribution block 440 is supplied with 12.0 volts from Electrical Power Supply 620.
Crimp Region Light 431

The quartz-halogen (or tungsten-iodine) Crimp Region Light 431 with built-in dichroic reflector illuminates the Crimp Region 430. The beam from the Crimp Region Light 431 is angled across the length of the bottom tray slot 712 as shown in Figure 3. The beam from the Crimp Region Light 431 is inclined at about 30 degrees to the horizontal. The Crimp Region Light 431 has a round Infra Red block filter 438a in front of it to prevent heat from reaching the samples in Tray 701a.
The placement of the Crimp Region Light 431 is designed to provide a fairly even light field at the horizontal surface of the Tray 701 generally in the Crimp Region 430, but this does not have to be as uniform as for the Colour Region 420. The light is manufactured by General Electric and is of 12v 50w type EXZ. The Crimp Region Light 431 is mounted on Crimp Region Lights support frame 433, which is supported by the Colour Region Light support frame 427.
The Crimp Region Light 431 is connected to Lighting power distribution block 440 by Crimp Light lead 447.
Mirror-Baffle set 412 & 415

There is one baffle 412 arranged between the Colour Region 420 and Crimp Region 430 on baffle support frame 416 attached to TV Camera support plate 402 as shown in Figure 3. Baffle 412 is arranged to prevent illumination from Colour Region Lights 421 & 423 from landing on Crimp Region 430. Baffle 412 is also a mirror arranged to reflect light from Colour Region Lights 421 & 423 onto Colour Region 420 from the side opposite to the Colour Region Lights 421 & 423. Baffle 415 is arranged to prevent illumination from Crimp Region Light 431 from landing on Colour Region 420.
Baffle 412 on the Crimp Region 430 side, Baffle 415 and baffle support frame 416 are painted matt black to minimise reflected light.
TV colour camera 405

TV colour camera 405 is mounted on TV Colour camera mount 404, which has provision for fine adjustment of alignment in the horizontal and vertical planes. TV Colour camera mount 404 is located on TV Camera support plate 402. TV Camera support plate 402 is mounted indirectly onto Conveyor 300 via an external TV Camera support plate frame not shown in Figure 1. The output of TV colour camera 405 goes to the frame grabber in Host computer 610 via TV camera coax cable 406.
Field of View of TV colour camera 405

TV colour camera 405 is pointed down at the Colour Region 420 and Crimp Region 430. The image obtained from TV colour camera 405 is shown in Figure 4.
The image shows three tray slots arranged horizontally: top tray slot 710, middle tray slot 711 and bottom tray slot 712. Vertically arranged on either side of the image are Imaging Region Black & White reference areas 450 and Imaging Region Cyan, Yellow & Magenta reference areas 451. Colour Region 420 is generally located within the area occupied by top tray slot 710 and as delineated by Colour Region virtual mask 720. Crimp Region 430 is generally located within the area occupied by bottom tray slot 712 and as delineated by Crimp Region virtual mask 721. There is an unused optical region generally located within the area occupied by middle tray slot 711 in between Colour Region virtual mask 720 and Crimp Region virtual mask 721.
Microcontroller 601

Microcontroller 601 is connected to Host computer 610 by Host-Microcontroller serial line 611, to Electrical Power Supply 620 by Microcontroller-Power Supply line 602, and to the pneumatic actuators and various sensors on Mechanical Tray Handling subsystem 2 by Microcontroller-Dropper control and sense line 603, Microcontroller-Lifter control and sense line 604 and Microcontroller-Tray Movement sensors signal line 608. Each of these lines may contain several control and sensing signals.
The Microcontroller 601 is typically an Arcom SC52 with optically isolated digital interface cards, typically Arcom SINP-16 for inputs and Arcom SD-16 for outputs. The program running on Microcontroller 601 takes high level commands such as "Clear conveyor" and "Advance to next tray slot" from Host computer 610 via Host-Microcontroller serial line 611 and controls the actuators and reads the sensors to cause the Mechanical Tray Handling subsystem 2 to perform those tasks.
On completion of the commands the Microcontroller 601 sends a "Done" reply to Host computer 610 over Host-Microcontroller serial line 611. Any problems encountered in executing those commands are similarly reported back to Host computer 610.
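The command/reply exchange between Host computer 610 and Microcontroller 601 can be sketched as follows. This is an illustrative stand-in only: the command strings are taken from the text, but the function, the command set as a Python structure and the error string are invented for this example and do not reflect the actual firmware on the Arcom SC52.

```python
# Hypothetical sketch of the Host <-> Microcontroller exchange over
# Host-Microcontroller serial line 611: the host sends a high-level
# command, the controller carries out the mechanical steps and answers
# "Done", or reports a problem back to the host.

VALID_COMMANDS = {
    "Initialise", "Clear conveyor", "Advance to next tray slot",
    "Start new tray", "Finish old tray",
}

def microcontroller_handle(command):
    """Stand-in for the program on Microcontroller 601: execute the
    command and return the reply sent back on serial line 611."""
    if command not in VALID_COMMANDS:
        return "Error: unknown command"
    # ... actuator control and sensor reading would happen here ...
    return "Done"
```

The point of the sketch is the synchronous request/acknowledge shape of the protocol: the host does not proceed until it receives "Done" or an error reply.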
Electrical Power Supply 620 and Compressed Air Supply 630

The Electrical Power Supply 620 draws power from the standard 240v mains supply and generates DC power at 24v for the pneumatic equipment and optical sensors, 12v for the Colour Region Lights 421-426 and Crimp Region Lights 431-432, +24v for Conveyor belt motor 304 and lower voltages as required for the Microcontroller 601 (typically +15v, -15v). The +24v power to Conveyor belt motor 304 is controlled through a relay in Electrical Power Supply 620 by a signal in Microcontroller-Power Supply line 602 from Microcontroller 601. The +24v power is delivered to Conveyor belt motor 304 by Power Supply-Motor line 621.
The +12v power to the Colour Region Lights 421-426 and Crimp Region Lights 431-432 is controlled through a relay in Electrical Power Supply 620 by a signal in Microcontroller-Power Supply line 602 from Microcontroller 601. The +12v power is delivered to Lighting power distribution block 440 by Power Supply-Lights line 622. The Compressed Air Supply 630 for the pneumatic equipment is derived from a source outside the Instrument 1. The primary supply of compressed air to the pneumatic subsystem (ie all the actuators) is controlled through a pneumatic dump valve in Compressed Air Supply 630 by a signal in Microcontroller-Air Supply control and sense line 609. Availability of compressed air for the pneumatic subsystem is sensed by a pressure sensor in Compressed Air Supply 630 connected to Microcontroller 601 by Microcontroller-Air Supply control and sense line 609.
Host computer 610

Host computer 610 is connected to Microcontroller 601 via Host-Microcontroller serial line 611, to the dual-head Barcode Reader unit 640a,b via Barcode-Host line 641 and to TV colour camera 405 through a typically internal frame grabber via TV camera coax cable 406. The program running on Host computer 610 effects control of Microcontroller 601 so as to cause it to move successive staples of wool into the field of view of TV colour camera 405, acquires the TV images of the staples through TV colour camera 405, processes these images to extract the required information concerning the previously listed staple properties and any other appropriate information, and displays or disposes of the resulting information as required.
Operation of Instrument 1

Adjustment of Colour Region Lights 421-426 in the first implementation for successful analysis of the RGB images as outlined requires that the Colour Region Lights 421-426 be adjusted to produce a substantially flat field of light within the Colour Region 420. This is done as follows: The radial light intensity from an ideal point source quartz-iodine lamp of the type used, projected onto a flat surface normal to the beam, tends to follow a pattern or distribution known as a Gaussian curve. While not exact, there is a region some distance from the centre of the pattern where the light intensity falls off in a roughly linear manner with the distance from the centre. The superposition of two such light patterns can result in one of three distinct combined patterns, depending on the separation s between the centres of the patterns:
a. a single peak when s is small;
b. two adjacent peaks when s is large;
c. a combined pattern with a reasonably flat centre at an appropriate value of s.
The explanation of the last case involves the overlap of the roughly linearly decreasing intensity region from one light with the roughly linearly increasing region from the second light. A pair of lamps, suitably positioned, can thereby create a region of substantially uniform and constant light intensity, with a roughly linearly decreasing intensity region around that reasonably uniform and constant region. By combining two such pairs of lights it is possible to create a larger area of uniform and constant intensity between the lights. This may be extended with additional pairs.
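The superposition argument can be illustrated numerically. The sketch below models each lamp as an ideal Gaussian spot (which, as noted, is only an approximation to a real lamp) and compares the flatness of the combined field over a central window for small, intermediate and large separations. All units and parameter values are arbitrary and for illustration only.

```python
import math

def lamp_intensity(x, centre, sigma=1.0):
    """Gaussian approximation to a single lamp's intensity profile."""
    return math.exp(-((x - centre) ** 2) / (2 * sigma ** 2))

def combined_profile(separation, sigma=1.0, half_width=1.0, n=201):
    """Intensity of two lamps centred at +/- separation/2, sampled over
    the central region [-half_width, +half_width]."""
    xs = [-half_width + 2 * half_width * i / (n - 1) for i in range(n)]
    return [lamp_intensity(x, -separation / 2, sigma)
            + lamp_intensity(x, +separation / 2, sigma) for x in xs]

def flatness(profile):
    """Peak-to-peak variation as a fraction of the mean intensity
    (smaller is flatter)."""
    mean = sum(profile) / len(profile)
    return (max(profile) - min(profile)) / mean

# Near s = 2*sigma the roughly linear fall-off of one lamp cancels the
# rise of the other, giving a much flatter centre than when s is too
# small (one broad peak) or too large (two distinct peaks).
flat_s = flatness(combined_profile(separation=2.0))
near_s = flatness(combined_profile(separation=0.5))
far_s = flatness(combined_profile(separation=4.0))
```

Sweeping `separation` in this sketch reproduces the three cases a, b and c above: `flat_s` is markedly smaller than either `near_s` or `far_s`.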
In the first implementation of Instrument 1 the Colour Region Lights 421-426 are arranged in such pairs: lamps 421 & 422, lamps 423 & 424, and lamps 425 & 426, in order to take advantage of the above theory. However, the beams from the Colour Region Lights 421-426 are inclined at 30 degrees away from the vertical, or at 30 degrees to the normal to the top surface of Tray 701a. This distorts the circular light pattern into an elliptical light pattern. In addition, the field from a quartz iodine lamp of the type used does not create a perfectly circular field in the first place, as the filament, which has finite dimensions, does not act as a perfect point source.
Adjustment of the Colour Region Lights 421-426 to achieve a flat field in Colour Region 420 requires that the position of each light along the Colour Region Light support frame 427 and the tilt of each light with respect to the frame be adjusted.
The adjustments are done on each pair of lights separately, with the light field being monitored with the aid of TV colour camera 405 and diagnostic software on Host computer 610. It is sufficient for the application to maintain a light field flat to within 1% RMS and this is done. This adjustment is done each time the Colour Region Lights 421-426 are changed.
In the second implementation of the Instrument 1 the adjustment of the Colour Region Lights 421 & 423 to produce a reasonably flat light field requires that the position of each light along the Colour Region Light support frame 427 and the tilt of each light with respect to the frame be adjusted. It is sufficient for the application to maintain a light field flat to within 30% of average and this is done. Normalisation of the image of the object to an effective flat light field is done against the image of the reference white field by the software when the image of the object is acquired.
The adjustment of the light field is done each time either of the Colour Region Lights 421 & 423 is changed.
Normal Operation

The normal mode of operation of Instrument 1 is for an operator to place a Tray 701 or a Tray Stack 701b onto Tray Dropper unit 100 and to activate the program in Host computer 610. Tray Dropper unit 100 will then drop a single Tray 701 onto Conveyor 300, which will then pass it through TV camera field of view 460, as shown by Tray 701a, where the images of the staples are captured for analysis, and finally Tray 701a will reach Tray Lifter unit 200 where it will be removed from Conveyor 300.
Tray Dropper unit 100 places each Tray 701 from Tray Stack 701b onto Conveyor 300, one at a time, with a two-phase mechanism, consisting of Tray Dropper Rails 101a,b and Tray Dropper Flaps 102a,b. Normally Tray Stack 701b is supported by the top horizontal section of Tray Dropper Flaps 102a,b. When a Tray 701 is to be placed on Conveyor 300 the Tray Dropper Rails 101a,b rise to just under Tray Dropper Flaps 102a,b. These then open, allowing Tray Stack 701b to drop a few millimetres onto Tray Dropper Rails 101a,b. Tray Dropper Flaps 102a,b then close, such that the top horizontal section is between the bottom tray and the next-to-bottom tray in Tray Stack 701b. The between-tray spacing of about 10 mm and the distance Tray Stack 701b drops on each cycle combine to ensure this. Then Tray Dropper Rails 101a,b descend to Conveyor 300, placing the bottom Tray 701 onto Conveyor belt 305. As Tray Dropper Rails 101a,b descend the rest of Tray Stack 701b, which had been supported by the bottom Tray 701, hits the top horizontal section of Tray Dropper Flaps 102a,b and becomes supported by them instead. The Tray Stack 701b now has a new bottom tray.
Tray Stack 701b is supported at the back by Dropper Back Wall 108 and lateral positioning of Tray Stack 701b in the direction of Conveyor 300 is constrained by Dropper Back Wall guides 109 & 110, such that the position of any Tray 701 when thus placed on Conveyor 300 is repeatable within a few millimetres each time.
Should Tray Stack 701b be incorrectly assembled or incorrectly placed on the Tray Dropper Flaps 102a,b, such that the operation of Tray Dropper unit 100 might be prone to failure, Tray-Stack-Fault sensor 505a,b will detect an improper stacking and provide an error signal to Microcontroller 601. In addition, Tray-under- Dropper sensor 503 serves to check that Tray 701 has been successfully placed on Conveyor 300. When there are no more trays on Tray Dropper unit 100 (ie Tray Stack 701b is empty), Tray-on-Stack sensor 501 is no longer activated, and this is sensed by Microcontroller 601.
Motion of Conveyor 300 is handled by Microcontroller 601. Conveyor belt motor 304 is started and Tray 701 under Tray Dropper unit 100 is advanced towards TV camera field of view 460. The mechanism for stopping Conveyor 300 depends on the situation: If there is already a Tray 701a in TV camera field of view 460 the motion of Conveyor 300 will be controlled by Tray 701a and a combination of Tray Movement sensor A 509 and Tray Movement sensor C 511.
If there is no Tray 701a already in TV camera field of view 460 then Conveyor 300 will advance the newly dropped Tray 701 until it interrupts the beam of Tray Movement sensor A 509. The exact halting position of Conveyor 300 is the instant that the beam from Tray Movement sensor A 509 ceases to be interrupted. At this stage Tray 701 becomes effectively Tray 701a.
When Tray 701a leaves TV camera field of view 460 only Tray Movement sensor C 511 will detect the passage of the final Tray Slot 702.
When Tray Dropper Rails 101a,b are fully down, as sensed by Tray Dropper Rail pneumatic actuator sensors 104a-d, and Tray 701 is on Conveyor 300, Barcode Reader unit 640a,b is triggered to do a read from barcode label 642 on the Tray 701. This should be successfully performed when Tray 701 has just been deposited on Conveyor 300 or within the first 10mm of movement. Barcode Reader unit 640a,b reports this barcode label 642 to Host computer 610 via Barcode-Host line 641. A successful reading of barcode label 642 is an essential precursor to Tray 701 being imaged. If barcode label 642 cannot be read successfully within a short time, typically a few seconds, Barcode Reader unit 640a,b reports this failure to Host computer 610 which then takes appropriate action. Such action is normally to complete the imaging of the previous Tray 701a and then to halt Conveyor 300 and alert the operator, who may either remove the unidentified Tray 701 or manually enter a barcode number for that tray.
On entering TV camera field of view 460 a particular staple in a Tray Slot 702 first enters Colour Region 420. Once Conveyor 300 is stationary Host computer 610 takes RGB images of TV camera field of view 460. The RGB images contain the colour image of the particular staple in Colour Region 420: this information is extracted from the images and analysed for staple presence, colour information and geometry information. At the same time, another staple may be in Crimp Region 430 and subject to crimp analysis. Once this has been done Conveyor 300 then advances the distance of one Tray Slot 702 plus one Tray inter-slot gap 703, which places the particular staple in-between Colour Region 420 and Crimp Region 430.
RGB images are again taken by Host computer 610 of the whole TV camera field of view 460, but no information is collected for the particular staple. Other staples in the RGB images may be subject to colour and crimp analysis. Once this has been done Conveyor 300 advances the same distance again such that the particular staple is in Crimp Region 430. RGB images are again taken by Host computer 610 and the crimp information for the particular staple is extracted and analysed. This completes the analysis for the particular staple. The analysis of other staples is interleaved with this analysis.
When the whole Tray 701a has passed through TV camera field of view 460 and all staples have been imaged and analysed, Tray 701a proceeds to Tray Lifter unit 200. Any failure during the passage of Tray 701a through TV camera field of view 460 may invalidate the results for Tray 701a. In this case Conveyor 300 is halted and the operator is alerted to remove Tray 701a for a repeat passage. If the measurement of the whole Tray 701a is successful the information concerning the staples in Tray 701a is accumulated, possibly averaged or otherwise statistically analysed, and the results are identified by barcode label 642 and other information which may include time, date, instrument number, operator ID, etc.
On passing under Tray Lifter unit 200 the Tray 701a is eventually physically blocked by Lifter Tray barrier 213, which acts at the level of the Tray Slot 702 rather than the top surface. At the same time the Tray Slot 702 interrupts the Tray-at-End sensor 506, which is sensed by Microcontroller 601. Tray Lifter Rails 201a,b are raised, lifting Tray 701a past Tray Lifter Pawls 202. After a brief pause, typically about one second, Tray Lifter Rails 201a,b are dropped back down to Conveyor 300. The positions of Tray Lifter Pawls 202 are such that Tray 701a is caught during descent and rests on them, in a manner similar to that performed by Tray Dropper Flaps 102a,b. The trays are all stopped at the same physical location by Lifter Tray barrier 213 which ensures that they stack together on Tray Lifter unit 200 in the same manner as on Tray Dropper unit 100.

The function of Microcontroller 601 is to relieve Host computer 610 of most of the real-time operation of Mechanical Tray Handling subsystem 2. Host computer 610 typically instructs Microcontroller 601 to "advance to the next slot", and the rest of the details of control are performed by Microcontroller 601. Other common commands from Host computer 610 to Microcontroller 601 include "initialise", "clear conveyor", "start new tray" and "finish old tray". "Initialise" ensures that Mechanical Tray Handling subsystem 2 is in a safe state before turning the compressed air on in Compressed Air Supply 630. "Clear conveyor" advances Conveyor 300 and unloads trays at Tray Lifter unit 200 as long as there is a blocked beam signal from any of Tray-under-Dropper sensor 503, Tray Movement sensor B 510 and Tray-under-Lifter sensor 507. "Start new tray" and "Finish old tray" ensure synchronisation between Host computer 610 and Microcontroller 601 for each tray.
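The "clear conveyor" behaviour described above can be sketched as a simple polling loop. The sensor numbers follow the text; the `read_sensor`/`advance` callback interface and the step limit are invented for this illustration and do not correspond to the actual Microcontroller 601 firmware.

```python
# Illustrative sketch of the "Clear conveyor" logic: keep advancing the
# conveyor while any of the three beams that a tray on Conveyor 300
# must block is still interrupted.

def clear_conveyor(read_sensor, advance, max_steps=1000):
    """Advance until no tray blocks the Tray-under-Dropper sensor 503,
    Tray Movement sensor B 510 or Tray-under-Lifter sensor 507 beams.
    read_sensor(beam_id) -> True if that beam is blocked;
    advance() steps the conveyor forward once."""
    beams = ("503", "510", "507")
    for _ in range(max_steps):
        if not any(read_sensor(b) for b in beams):
            return True          # conveyor is clear
        advance()
    return False                 # gave up: possible jam
```

The step limit stands in for whatever fault handling the real controller uses; the essential point is that the three beams together guarantee that an empty conveyor is detectable.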
Other diagnostic commands for detailed testing of the hardware may be included in the program for Microcontroller 601.
Algorithms

The information recorded by Host computer 610 forms two-dimensional (2-D) arrays of pixels (picture elements) representing the RGB images. A separate 2-D array is recorded for each of the three colour channels Red, Green and Blue (RGB) generated by the camera. The values of the pixels range from about zero, representing darkness, to a full-scale value dependent on the hardware used in the particular implementation and representing maximum brightness. The three values representing any single pixel in the Colour Region are referred to as an RGB vector.
The images are analysed by a computer program in three stages.
1. Analysis of colour for wool and dirt and dirt distribution; 2. Analysis of staple geometry: length, width, tip-shape etc; 3. Analysis of staple crimp frequency and crimp definition.
Tasks 1 & 2 are done with RGB information obtained from Colour Region 420; task 3 is done with the information obtained from Crimp Region 430 using one colour channel only.
The tray containing the staples goes through the TV camera field of view 460 in steps, advancing one tray slot at a time. A full TV image is taken at each step. A particular staple first appears in Colour Region 420 where the information for tasks 1 & 2 is collected from the image. It then is moved to the region between Colour Region 420 and Crimp Region 430. No information about the staple is collected at this step. On the next step the particular staple is moved into Crimp Region 430 where the information for task 3 is collected.
Colours normally associated with raw wool cover a limited range, and may be broken into three categories: dirt (typically industry classifications include red, brown, grey and black), raw wool (industry typically uses various shades of white and pale yellow), and other (for example identification colours used during lambing, colours due to biological damage, grass stains, etc). In general it is not necessary to know the exact colour of the wool or any dirt; it is usually sufficient to use a limited range of colour values such as are given as examples here. For research use the average XYZ values of the wool and dirt areas may instead be reported after this step has been done.
The information obtained from this analysis may be listed for each staple. It may also be collected for all staples in one tray and averaged, and listed as average information for the tray as identified by the operator or by the barcode label on the tray.
Image processing techniques are discussed in numerous texts, such as "Fundamentals of Digital Image Processing", A K Jain, Prentice Hall, 1989, ISBN 0-13-336165-9 (herein referred to as "Jain") and "Digital Picture Processing", vols 1 & 2, A Rosenfeld & A C Kak, Academic Press, 1982, ISBN 0-12-597301-2 (herein referred to as "A&K"), the contents of which are incorporated herein by cross reference. Computer code for common algorithms may be obtained from several sources including "Numerical Recipes: the Art of Scientific Computing" (Fortran version) by W H Press, B P Flannery, S A Teukolsky and W T Vetterling, Cambridge UP, 1992, ISBN 0-52-138330-7 (herein referred to as "NR"), the contents of which is incorporated herein by cross reference.
The colour analysis for wool staples (step 1) proceeds according to flow diagram 1200 in Figure 5. The individual steps conducted during the colour analysis by Host computer 610 are outlined below: In step 1201 the number of pixels above a particular intensity level within the region of interest is counted: if the number exceeds a threshold there is a staple present. The particular intensity level is a parameter dependent on the particular implementation and is chosen to distinguish between the staple region in the image and the tray background (which is not staple) in the image.
a. The region of interest is typically the area within the Tray Slot 702 in Colour Region 420, but may be altered according to the type of staple being examined.
b. The threshold number used to indicate that a staple is present may depend on the typical type and size of wool being examined.
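Step 1201 can be sketched as a simple count-and-compare. Both parameter values below are placeholders for the implementation-dependent intensity level and threshold described in the text, and the 2-D list format for the region of interest is an assumption.

```python
# Sketch of step 1201, staple-presence detection: count the pixels in
# the region of interest brighter than a chosen intensity level and
# compare the count with a threshold number.

def staple_present(region, intensity_level=40, count_threshold=50):
    """region: 2-D list of pixel intensities for one tray slot.
    Returns True if enough bright pixels are found for a staple."""
    bright = sum(1 for row in region for p in row if p > intensity_level)
    return bright > count_threshold
```

In practice both parameters would be tuned per installation, as noted in sub-points a and b above.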
In step 1202 the region of interest is subjected to a thresholding operation (see section 7.2 in Jain or section 10.1.2 in A&K) in which all pixels with values less than a particular level are set to a zero value in order to distinguish in subsequent operations between pixels considered to be representing a staple and pixels not representing a staple. The threshold used will depend on the hardware in use, which typically may have scale ranges of 0-63 (6 bits), 0-255 (8 bits) or 0-1023 (10 bits).
This operation will leave a large contiguous area of non-zero pixels representing a staple, but may also leave other small areas of non-zero pixels representing irrelevant objects such as dust particles. The removal of the small areas may be done by one of the following methods: a. erosion and dilation (see section 9.9 of Jain) over the region of interest, or b. labelling all contiguous non-zero regions within the area of interest (see section 9.13 of Jain) and then converting the pixels of all but the largest labelled region to zero.
The remaining non-zero region of pixels is termed the thresholded object.
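Method (b) above can be sketched in plain Python. This is a stand-in for the labelling routine cited from Jain, using 4-connectivity flood fill; the function name and the list-of-lists image format are assumptions for the example.

```python
# Sketch of method (b): label contiguous non-zero regions and zero out
# every region except the largest, leaving the "thresholded object".

def keep_largest_region(img):
    rows, cols = len(img), len(img[0])
    label = [[0] * cols for _ in range(rows)]
    sizes = {}
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] and not label[r][c]:
                # flood-fill a new connected region
                next_label += 1
                stack = [(r, c)]
                label[r][c] = next_label
                size = 0
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] and not label[ny][nx]):
                            label[ny][nx] = next_label
                            stack.append((ny, nx))
                sizes[next_label] = size
    if not sizes:
        return img
    biggest = max(sizes, key=sizes.get)
    return [[img[r][c] if label[r][c] == biggest else 0
             for c in range(cols)] for r in range(rows)]
```

A production implementation would more likely use a library connected-component routine; the sketch only shows the selection logic.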
In step 1203 the RGB vector for each pixel in the thresholded object is multiplied by a 4x3 conversion matrix to generate the CIE (Commission Internationale de l'Eclairage) XYZ vector for that pixel (see section 3.8 of Jain). X, Y and Z values range roughly from 0 to about 100 for the normal range of colours. The XYZ space is used as it is an international standard and is known to be obtainable from the RGB space with a linear transform (see p6 7 in Jain). The elements of the 4x3 conversion matrix are typically derived by linear regression for each system by a calibration process based on the use of coloured objects of known XYZ values.
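The affine RGB-to-XYZ conversion of step 1203 can be sketched as follows. The matrix values shown are arbitrary placeholders with zero offsets; in the instrument the twelve elements are fitted by linear regression during calibration against coloured objects of known XYZ values, as described above.

```python
# Sketch of step 1203: each RGB pixel, augmented with a constant 1, is
# multiplied by the 4x3 calibration matrix to give a CIE XYZ vector.
# The coefficients below are illustrative only.

# One row of four coefficients per output channel (X, Y, Z):
M = [
    [0.41, 0.36, 0.18, 0.0],   # X <- R, G, B, offset
    [0.21, 0.72, 0.07, 0.0],   # Y
    [0.02, 0.12, 0.95, 0.0],   # Z
]

def rgb_to_xyz(r, g, b, matrix=M):
    """Affine transform of one pixel's RGB vector to XYZ."""
    rgb1 = (r, g, b, 1.0)
    return tuple(sum(m * v for m, v in zip(row, rgb1)) for row in matrix)
```

The fourth (offset) column is what makes the transform affine rather than purely linear, which absorbs any constant bias left after black-level correction.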
The reduction of the XYZ colour space to a small number of typical industry classifications or colour values may be done by one of the following processes; others may be used to obtain equivalent information:
a. Maximum Likelihood Estimation (see section 8.15 of Jain and section 14.6 of NR)
b. Principal Component Analysis (see section 7.6 of Jain).
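As a stand-in for the MLE or PCA reduction (neither of which is reproduced here), a nearest-centroid classifier over XYZ values illustrates the idea of mapping each pixel to a small set of colour classes. The class names and centroid values below are invented for illustration only.

```python
def classify_xyz(xyz, centroids):
    """Assign an XYZ value to the nearest class centroid (illustrative
    stand-in for the MLE/PCA reduction; centroids are hypothetical)."""
    def d2(a, b):
        # squared Euclidean distance in XYZ space
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda name: d2(xyz, centroids[name]))

centroids = {"cream": [80, 82, 75], "yellow": [70, 72, 40]}
classify_xyz([78, 80, 70], centroids)  # nearest class is "cream"
```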
In step 1204 such a reduction is applied to each pixel in the thresholded object.
In step 1205 a count of the pixels in the thresholded object with each colour value is made. For those colours corresponding to dirt which may be found in wool, the colour with the highest count is selected as most probable, and pixels with the other dirt colours are altered to show the selected most probable dirt colour. This dirt colour according to typical industry classifications and/or the average XYZ values for the dirt area are reported.
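The vote over dirt colours in step 1205 is a simple majority count; a minimal sketch follows, with hypothetical class names.

```python
from collections import Counter

def dominant_class(pixel_classes, candidate_classes):
    """Return the most common class among pixels whose class is in
    `candidate_classes` (e.g. the dirt colours), or None if absent."""
    counts = Counter(c for c in pixel_classes if c in candidate_classes)
    return counts.most_common(1)[0][0] if counts else None

# Hypothetical per-pixel class labels for a small region:
pixels = ["wool", "dirt_brown", "dirt_brown", "dirt_grey"]
dominant_class(pixels, {"dirt_brown", "dirt_grey"})  # "dirt_brown"
```

The same function serves for step 1206 by passing the wool colour classes instead.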
In step 1206 those pixels with colours corresponding to raw wool are similarly all given the most likely raw wool colour, as in step 1205. This wool colour according to typical industry classifications and/or the average XYZ values for the wool area are reported.
In step 1207 the location of the transition from dirt region to raw wool region along the length of the staple, termed "dust penetration", is determined by: a. condensing the 2D array of pixels to two 1D vectors along the length of the staple, with one 1D vector representing the count of dirt pixels and the second 1D vector representing the count of raw wool pixels, each in the corresponding column in the 2D array; b. locating the 50% transition point between dirt and raw wool pixel counts from the two vectors.
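The 50% transition of step 1207 can be sketched as below, assuming the two per-column count vectors have already been produced. Scanning from the tip end and the handling of empty columns are assumptions of this sketch.

```python
def dust_penetration(dirt_counts, wool_counts):
    """Index of the first column (scanning tip to butt) where wool
    pixels reach 50% of the dirt+wool total -- the 50% transition."""
    for i, (d, w) in enumerate(zip(dirt_counts, wool_counts)):
        total = d + w
        if total and w / total >= 0.5:
            return i
    return None

# Dirt dominates the first columns, wool the later ones:
dust_penetration([9, 8, 5, 2, 0], [1, 2, 5, 8, 10])  # column 2
```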
If the tip region of the staple contains a significant quantity of clean wool pixels, the location of the transition from clean wool to dirt is located in a similar manner and this is recorded as a "wash-down" distance.
Other statistical properties of the colour image may be obtained from the above parameters, including:
a. percentage dirt region (proportion of dirt area to total staple area)
b. percentage wash-down region (proportion of wash-down area to total staple area).
In task 2 the geometry analysis is done according to flow diagram 1300 in Figure 6. In step 1301 the borders enclosing the thresholded object are found by the following steps:
a. generating a 1D staple width vector with elements corresponding to the count of non-zero pixels in the corresponding columns in the object image;
b. generating a 1D staple length vector with elements corresponding to the count of non-zero pixels in the corresponding rows;
c. scanning the elements in each 1D vector in two directions, from the first towards the last and from the last towards the first, and locating the elements where the number of non-zero pixels first exceeds a threshold appropriate to the system;
d. for the staple length vector the positions of the left and right transitions are referred to as the left and right borders; for the staple width vector the positions of the upper and lower transitions are referred to as the top and bottom borders.
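The two-direction scan of step 1301 can be sketched as a threshold-crossing search over one 1D projection vector (an equivalent formulation of scanning from both ends):

```python
def borders(profile, threshold):
    """First and last index where the pixel count exceeds `threshold`,
    i.e. the two borders found by scanning from either end of the
    1D staple width or length vector."""
    above = [i for i, v in enumerate(profile) if v > threshold]
    return (above[0], above[-1]) if above else None

# A length vector whose object spans columns 2..4:
borders([0, 1, 5, 6, 5, 1, 0], 2)  # (2, 4)
```

Applied to the length vector this yields the left and right borders; applied to the width vector, the top and bottom borders.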
In step 1302 the distance between the upper and lower borders is calculated and is termed the width of the object; the distance between the left and right borders is calculated and is termed the length of the object. The length and width of the object may be expressed in pixel counts or may be converted to millimetres according to the pixel space to image space conversion factor. This conversion factor may be derived simply by analysis of an object of known size. In step 1303 the staple width vector is analysed at the tip end of the staple for the slope of the transition from the end of the staple to the main body of the staple. The distance between the full-width and 75%-width points is termed the tip shape parameter. This is converted to a tip grade on a final scale of A-E by reference to a calibration scale derived from a number of experienced wool industry operators.
In task 3 the crimp analysis is done according to flow diagram 1400 in Figure 7. In step 1401 the presence of an object of staple size in the image is checked. This may be done by using the staple-present information from the colour analysis step 1202 if this was done first, or it may be done again in a similar manner.
In step 1402 a thresholded version of the image is created. This may be done by using the information from step 1202, or if there is a possibility that the staple may have moved within the slot since that information was calculated then the same procedure as in step 1202 may be repeated. The threshold used may or may not be the same as for step 1202, depending on the relative light intensities on the colour and crimp regions.
In step 1403 the crimp frequency for the staple is computed. This is done using a Fourier Transform. The transform may be done using any of several known methods (see Chapter 2 in Jain and Chapter 12 in NR). The transform is performed on each row in the thresholded staple region of the image, and the spectra for all lines are summed to produce an average spectrum for the whole thresholded region.
The analysis is performed for the frequency range of interest, which for raw wool staples is typically 1.0 crimp per centimetre to 12 crimps per centimetre.
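The row-wise transform and summation of step 1403 can be sketched with a direct DFT. A real implementation would use an FFT; the power-spectrum summation over rows here matches the description above.

```python
import cmath

def summed_power_spectrum(rows):
    """Sum the DFT power spectra of every row of the thresholded
    region, giving an average spectrum for the whole region."""
    n = len(rows[0])
    spectrum = [0.0] * (n // 2)
    for row in rows:
        for k in range(n // 2):
            # direct DFT of one row at frequency bin k
            s = sum(v * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, v in enumerate(row))
            spectrum[k] += abs(s) ** 2
    return spectrum

# A binary row with period 4 over 16 samples peaks in bin 4:
row = [1, 0, 0, 0] * 4
spec = summed_power_spectrum([row, row])
```

The bin index of the peak converts to crimps per centimetre via the pixel-to-millimetre factor of step 1302.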
In step 1404 the generated spectrum is simplified and scanned for peaks and the frequency of the highest amplitude peak, normally above a selected threshold frequency, is designated the crimp frequency. This is reported in crimps per centimetre to fit in with current wool trade practice.
In step 1405 the shape of the crimp frequency peak is analysed for a number of parameters which may be indicative of the crimp definition (where "crimp definition" is a term known in the wool trade). The parameters may include:
a. crimp peak height
b. crimp peak width at half-height, as measured in crimps per millimetre
c. the ratio of the crimp frequency to the peak width at half-height, also known in the field of electronics engineering as the Q of the peak
d. any other combinations of the above.
Crimp definition may be reported on a discrete step scale such as A-E or on a numeric scale: the former avoids problems with the different ranges for the possible definitions listed above. Conversion of the numeric value to a step scale value is done by reference to a calibration scale derived from a number of experienced wool industry operators.
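The peak-shape parameters a-c can be sketched as follows. The half-height width is measured in whole spectrum bins here, and the bin-to-frequency `resolution` is an assumed input derived from calibration.

```python
def crimp_peak_shape(spectrum, resolution):
    """Height, half-height width and Q of the tallest non-DC peak.
    `resolution` is crimps/cm per spectrum bin (assumed known)."""
    k = max(range(1, len(spectrum)), key=lambda i: spectrum[i])
    height = spectrum[k]
    half = height / 2.0
    lo, hi = k, k
    # walk outwards while the spectrum stays above half-height
    while lo > 0 and spectrum[lo - 1] >= half:
        lo -= 1
    while hi < len(spectrum) - 1 and spectrum[hi + 1] >= half:
        hi += 1
    width = (hi - lo + 1) * resolution      # half-height width
    freq = k * resolution                   # crimp frequency
    return {"freq": freq, "height": height, "width": width,
            "q": freq / width}              # Q = frequency / bandwidth
```

The numeric Q or width would then be mapped to the A-E step scale via the operator-derived calibration described above.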
Typical Use

The normal mode of operation of the Instrument 1 is for an operator to place a tray or a stack of trays onto Tray Dropper unit 100 and to activate the program in Host computer 610. Tray Dropper unit 100 will then drop trays one at a time onto Conveyor 300; this will pass the trays through TV camera field of view 460, where the RGB images are captured for analysis, and finally the trays will reach Tray Lifter unit 200 where they will be removed from Conveyor 300.
When a tray is lowered onto Conveyor 300, Barcode Reader unit 640a,b, which has two heads, attempts to read barcode label 642 from one end of the tray or the other. Assuming this is successful, the information passed from Barcode Reader unit 640a,b to Host computer 610 consists of a barcode head identifier (1 for the head closest to TV camera field of view 460 or 2 for the head further away) and the barcode label 642 itself. The information about which barcode head furnished barcode label 642 is used to predict the orientation of the staples in the tray: this information simplifies the image analysis process.
The tray enters TV camera field of view 460 and the necessary images are captured by Host computer 610, analysed and summarised as the tray leaves TV camera field of view 460 for Tray Lifter unit 200. The collection of information about the staples in the tray is then stored in a data file, from where it may be printed or transmitted to another computer for further processing. The data collected for a typical tray may appear thus:

Table 1

Staple  Width  Tip    Crimp  Crimp  Wool  Wool  Wool  Dirt   Dirt  Dirt  Dirt
No      mm     Grade  freq   def    X%    Y%    Z%    area%  X%    Y%    Z%
1       12     D      7      A      44    46    46    10     18    18    14
2       15     A      7      B      48    49    50    11     20    20
3       17     C      8      A      39    39    39    19     15    15    11
4       13     B      8      A      31    32    32    13     12    12    8
        19     B      7      B      33    33    32    6      7
        13     A      8      A      45    46    45    18     14    14    8
        14     D      7      A      49    50    50    14     19    19
9       16     D      7      A      45    46    47    16     18    18    14
10      17     A      8      B      52    53    53    15     20    20    16
11      15     D      8      B      48    49    49    14     15    15    11
12      14     E      6      A
13      16     D      8      A      40    41    41    11     16    16    13
14      11     B      7      D      38    39    39    15     16    16    12
15      17     A      8      A      44    45    46    10     21    21    17
16      10     B      6      A      43    44    46    10     20    19
17      11     B      6      A      47    48    49    4      25    25
No.     16     16     16     16     14    14    14    14     14    14    14
Mean    14     B      7      A      44    45    45    13     18    18    14

The following comments should be noted for the above results:
Not all staples have a dirt region (eg staple no. 6) or allow separation of the dirt and wool areas (eg staple no. 12). Staple no. 6 might be a second cut, for example, and too short for valid measurement.
XYZ values have been reported rather than industry colours in this case.
Clean wool is not a bright white with XYZ values near 100; the values shown are typical of many wool types.
The results shown in Table 1 are broadly consistent with the results obtained by two experienced human assessors, subject to the following notes:
Human ability to assess staple width is poor in comparison with objective measurement systems such as Instrument 1: an error range of 3 mm would be typical.
In addition, different operators will typically have different definitions of the end of a staple unless continuous retraining and normalisation is conducted.
Tip grade is assessed by different people in different ways, and is therefore a poor subjective metric. The algorithms used classify the tip into one of five categories as defined by mathematical criteria: for the purpose of staple assessment this is considered sufficient.
Crimp frequency is quoted in Table 1 to 1 crimp/cm, but is easily measured to 0.1 crimp/cm for values below 10 crimp/cm. Human assessment can easily be in error by 10-20%.
Crimp definition is assessed by different people in different ways, and is therefore a poor subjective metric. Resolution to better than three grades would not be common. The algorithms used are generally accurate to 1%.
Wool colour and dirt colour are rarely measured by humans; rather a classification into red, brown, grey etc categories is done. Considerable variation exists between humans at the borderlines between colours. Of greater significance is the generally accepted limit to colour measurement of 2 units in X, Y and Z. The measurements shown in Table 1 are typically repeatable to 1 unit, although natural variation across a staple may be up to 10 units.
The assessment of dirt area by humans would rarely resolve more than 10 percentage levels (deciles). The assessment by the algorithms may be affected by the selection of the threshold levels in the algorithms, but is reproducible to 1% for a given threshold.
Throughout the specification the term "comprising" is to be taken as meaning any one of "including", "consisting essentially of" or "consisting of". The term "comprise(s)" is to be taken as meaning any one of "include(s)", "consists essentially of" or "consists of".
The claims defining the invention are as follows:

1. A method for determining at least two different visual properties of an object, comprising:
(I) determining a first parameter of the object by: locating the object in a first parameter measurement interaction volume; illuminating the object in the first parameter measurement interaction volume with a light field selected from the group consisting of a substantially uniform measurement light field and an effectively uniform measurement light field to produce non surface relief measurement outgoing light containing visual information related to non surface relief features of the object; detecting the non surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the first parameter; and determining the first parameter from the signals;
(II) determining a second parameter of the object by: locating the object in a second parameter measurement interaction volume; illuminating the object in the second parameter measurement interaction volume with a directional measurement light field so as to produce surface relief measurement outgoing light containing visual information related to surface relief features of the object; detecting the surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the second parameter; and determining the second parameter from the signals;
wherein at least one of steps (I) and (II) comprises illuminating said object at a location selected from the group consisting of: on a reflecting surface, on a gloss black surface, on a substantially non-reflecting surface, on a flat black matt surface, against a reflecting background, against a substantially non-reflecting background, against a gloss black background, and against a flat black matt background.
2. The method of claim 1 wherein the substantially uniform measurement light field is also a substantially constant measurement light field.
3. The method of claim 2 wherein the substantially constant measurement light field is a flat light field.
4. The method of claim 1 wherein the use of an effectively uniform measurement light field comprises illuminating the object with a non flat light field and normalising the image of the object against an image of a substantially uniform flat white or substantially uniform flat near white object as detected by detecting the outgoing light (by detector) in the same light field.
The method of any one of claims 1 to 4 wherein step comprises illuminating said object in a location selected from the group consisting of: on a non-reflecting surface, on a substantially non-reflecting surface, on a flat black matt surface,


6. The method of any one of claims 1 to 5 wherein step comprises illuminating said object in a location selected from the group consisting of: on a reflecting surface, on a gloss black surface, on a substantially non-reflecting surface, on a flat black matt surface, against a reflecting background, against a substantially non-reflecting background, against a gloss black background, and against a flat black matt background.
7. The method of any one of claims 1 to 5 wherein the object comprises one or more fibrous objects selected from the group consisting of a clump of fibres, a staple of fibres, raw wool staples, wool staples, synthetic fibres, natural fibres, dyed fibres, a fibrous strand, a filament, a yarn, a clump of filaments, a clump of yarns, a clump of fibrous strands, a staple of filaments, a staple of yarns, a staple of fibrous strands, cotton staples, raw cotton staples, goat hair staples, and raw goat hair staples.
8. The method of any one of claims 1 to 5 wherein the first parameter is selected from the group consisting of colour, colour distribution, shape, diameter, area, chemical composition, number of parts, width, length, absorptivity, reflectivity, dielectric constant, fluorescence, position, orientation, and density.
9. The method of any one of claims 1 to 5 wherein the second parameter is selected from surface relief features selected from the group consisting of shadowing, self shadowing, surface texture, surface periodicity, surface regularity, other surface detail, surface waviness, surface crimp, surface roughness and surface profile.
10. A method for determining at least two different visual properties of an object, comprising:
(I) determining a first parameter of the object by: locating the object in a first parameter measurement interaction volume; illuminating the object in the first parameter measurement interaction volume with an effectively uniform measurement light field comprising a non flat light field to produce a non surface relief measurement outgoing light containing visual information related to non surface relief features of the object; normalising the image of the object against an image of a substantially uniform flat white or substantially uniform flat near white object; detecting the non surface relief measurement outgoing light in the same light field and generating signals therefrom whereby the signals are a function of the first parameter; and determining the first parameter from the signals;
(II) determining a second parameter of the object by: locating the object in a second parameter measurement interaction volume; illuminating the object in the second parameter measurement interaction volume with a directional measurement light field so as to produce surface relief
measurement outgoing light containing visual information related to surface relief features of the object; detecting the surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the second parameter; and determining the second parameter from the signals.

11. An apparatus for determining at least two different visual properties of an object, comprising:
(I) means for determining a first parameter of the object comprising: means for locating the object in a first parameter measurement interaction volume; means for illuminating the object in the first parameter measurement interaction volume with a light field selected from the group consisting of a substantially uniform measurement light field and an effectively uniform measurement light field to produce non surface relief measurement outgoing light containing visual information related to non surface relief features of the object; a detector for detecting the non surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the first parameter, the detector being operatively associated with the means for illuminating; and means for determining the first parameter from the signals, the means for determining being operatively associated with the detector;
(II) means for determining a second parameter of the object comprising: means for locating the object in a second parameter measurement interaction volume; means for illuminating the object in the second parameter measurement interaction volume with a directional measurement light field so as to produce surface relief measurement outgoing light containing visual information related to surface relief features of the object; a detector for detecting the surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the second parameter, the detector being operatively associated with the means for illuminating; and means for determining the second parameter from the signals, the means for determining being operatively associated with the detector;
wherein in at least one of said measurement interaction volumes of (I) and (II) there is a surface or background proximate a location where said object is
illuminated, or means for locating a surface or background proximate a location where said object is illuminated, said surface or background being selected from the group consisting of: a reflecting surface, a gloss black surface, a substantially non-reflecting surface, a flat black matt surface, a reflecting background, a substantially non-reflecting background, a gloss black background, and a flat black matt background, wherein when said object is illuminated at said location said object is at a location selected from the group consisting of: on a reflecting surface, on a gloss black surface, on a substantially non-reflecting surface, on a flat black matt surface, against a reflecting background, against a substantially non-reflecting background, against a gloss black background, and against a flat black matt background.
12. The apparatus of claim 11 wherein the means for illuminating the object in the first parameter measurement interaction volume comprises means for illuminating the object in the first parameter measurement interaction volume with a substantially constant light field.
13. The apparatus of claim 11 wherein the means for illuminating the object in the first parameter measurement interaction volume comprises means for illuminating the object in the first parameter measurement interaction volume with a substantially constant and flat light field.
14. The apparatus of claim 11 wherein the use of an effectively uniform measurement light field comprises illuminating the object with a non flat light field and normalising the image of the object against an image of a substantially uniform flat white or substantially uniform near white object as detected by detecting the outgoing light (by the detector) in the same light field.

15. The apparatus of claim 11 further including means to separate the first parameter measurement interaction volume from the second parameter measurement interaction volume.

16. The apparatus of claim 11 wherein the detector for detecting the non surface relief measurement outgoing light and the detector for detecting the surface relief measurement outgoing light is a single detector which is an image detector.
17. The apparatus of claim 16 wherein the image detector is selected from the group consisting of a video camera, a line scan camera and a flying spot scanner.
18. The apparatus of claim 11 further comprising: (dd) means for determining statistical information in respect of a measurement of the first and/or second parameter for an object, operatively associated with the means for determining the first and/or second parameters.
19. The apparatus of claim 18 further comprising: (ddd) means for determining statistical information in respect of a plurality of measurements of (dd) for a plurality of objects, operatively associated with the means for determining of (dd).

20. The apparatus of claim 11 further comprising at least one of the following means selected from the group consisting of: (i) means for determining the first parameter of the object from a measurement parameter determined from the signals which are a function of the first parameter, operatively associated with the detector(s); (ii) means for storing the measurement parameter determined from the signals which are a function of the first parameter, operatively associated with the detector(s); (iii) means for storing the first parameter of the object operatively associated with the means for determining the first parameter; (iv) means for retrieving the measurement parameter of the object operatively associated with means for storing the measurement parameter determined from the signals which are a function of the first parameter; and (v) means for retrieving the first parameter of the object operatively associated with the means for storing the first parameter.
21. A method for determining at least two different visual properties of an object, substantially as herein described with reference to the Figures.
22. An apparatus for determining at least two different visual properties of an object, substantially as herein described with reference to the Figures.

DATED 30 June 1999
Commonwealth Scientific and Industrial Research Organisation
Patent Attorneys for the Applicant
SPRUSON FERGUSON

Systems and Methods for Measuring at Least Two Visual Properties of an Object

ABSTRACT

A method and apparatus for determining at least two different visual properties of an object are disclosed. The method comprises (I) determining a first parameter of the object by: locating the object in a first parameter measurement interaction volume; illuminating the object in the first parameter measurement interaction volume with a light field selected from the group consisting of a substantially uniform measurement light field and an effectively uniform measurement light field to produce non surface relief measurement outgoing light containing visual information related to non surface relief features of the object; detecting the non surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the first parameter; and determining the first parameter from the signals; (II) determining a second parameter of the object by: locating the object in a second parameter measurement interaction volume; illuminating the object in the second parameter measurement interaction volume with a directional measurement light field so as to produce surface relief measurement outgoing light containing visual information related to surface relief features of the object; detecting the surface relief measurement outgoing light and generating signals therefrom whereby the signals are a function of the second parameter; and determining the second parameter from the signals.
AU32944/95A 1994-09-29 1995-09-28 Systems and methods for measuring at least two visual properties of an object Ceased AU709459B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU32944/95A AU709459B2 (en) 1994-09-29 1995-09-28 Systems and methods for measuring at least two visual properties of an object

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AUPM8493A AUPM849394A0 (en) 1994-09-29 1994-09-29 Systems and methods for measuring at least two visual properties of an object
AUPM8493 1994-09-29
AU32944/95A AU709459B2 (en) 1994-09-29 1995-09-28 Systems and methods for measuring at least two visual properties of an object

Publications (2)

Publication Number Publication Date
AU3294495A AU3294495A (en) 1996-04-18
AU709459B2 true AU709459B2 (en) 1999-08-26

Family

ID=25622269

Family Applications (1)

Application Number Title Priority Date Filing Date
AU32944/95A Ceased AU709459B2 (en) 1994-09-29 1995-09-28 Systems and methods for measuring at least two visual properties of an object

Country Status (1)

Country Link
AU (1) AU709459B2 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3641816A1 (en) * 1986-12-06 1988-06-16 Robert Prof Dr Ing Massen METHOD AND ARRANGEMENT FOR MEASURING AND / OR MONITORING PROPERTIES OF YARNS AND ROPES
US5270222A (en) * 1990-12-31 1993-12-14 Texas Instruments Incorporated Method and apparatus for semiconductor device fabrication diagnosis and prognosis

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3641816A1 (en) * 1986-12-06 1988-06-16 Robert Prof Dr Ing Massen METHOD AND ARRANGEMENT FOR MEASURING AND / OR MONITORING PROPERTIES OF YARNS AND ROPES
US4887155A (en) * 1986-12-06 1989-12-12 Robert Massen Method and arrangement for measuring and/or monitoring properties of yarns or ropes
US5270222A (en) * 1990-12-31 1993-12-14 Texas Instruments Incorporated Method and apparatus for semiconductor device fabrication diagnosis and prognosis

Also Published As

Publication number Publication date
AU3294495A (en) 1996-04-18

Similar Documents

Publication Publication Date Title
KR100374909B1 (en) Method and apparatus for detecting surface characteristics of translucent objects
US5106195A (en) Product discrimination system and method therefor
US5471311A (en) Information system for monitoring products in sorting apparatus
US6914678B1 (en) Inspection of matter
EP0700515B1 (en) An automatic inspection apparatus
AU2011277075B2 (en) A checkout counter
US5526119A (en) Apparatus & method for inspecting articles such as agricultural produce
US6061086A (en) Apparatus and method for automated visual inspection of objects
JPH0727718A (en) Continuous monitor device of thin web of fibrous substance in two dimension
CN113600508B (en) Tobacco leaf tobacco bale mildenes and rot and debris monitoring system based on machine vision
JPS6322423A (en) Element discriminating and arranging method and device
WO2003034049A1 (en) Automatic inspection apparatus and method for detection of anomalies in a 3-dimensional translucent object
AU709459B2 (en) Systems and methods for measuring at least two visual properties of an object
EP0193403A1 (en) Birdswing defect detection for glass containers
US20040047509A1 (en) Digital diagnostic apparatus and vision system with related methods
US5018864A (en) Product discrimination system and method therefor
NZ280124A (en) Determining visual properties of objects,in particular crimp and colour of wool fibres
EP1698888A2 (en) Inspection of matter
TW202339862A (en) Apparatus for illuminating matter
TW202338325A (en) Material identification apparatus and method
WO2012005661A1 (en) A checkout counter
DE60028731T2 (en) METHOD AND DEVICE FOR DETECTING CRACKS IN OBJECTS MADE OF HAZARDOUS OR TRANSLUCENT MATERIAL
KR102388752B1 (en) Hyperspectral inspection device capable of detecting soft foreign substances
JPH10160676A (en) Rice grain inspection device
WO2022269647A1 (en) Optical inspection apparatus and corresponding method