
Method and apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section

Info

Publication number
CA2153647A1
Authority
CA
Canada
Prior art keywords
face
camera
color
main face
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA 2153647
Other languages
French (fr)
Inventor
Raphael Vogrig
Bernard Karpp
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CENTRE TECHNIQUE DU BOIS ET DE L'AMEUBLEMENT
Original Assignee
CENTRE TECHNIQUE DU BOIS ET DE L'AMEUBLEMENT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CENTRE TECHNIQUE DU BOIS ET DE L'AMEUBLEMENT filed Critical CENTRE TECHNIQUE DU BOIS ET DE L'AMEUBLEMENT
Publication of CA2153647A1 publication Critical patent/CA2153647A1/en
Abandoned legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 - Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/46 - Wood
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/89 - Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
    • G01N21/892 - Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles characterised by the flaw, defect or object feature examined
    • G01N21/898 - Irregularities in textured or patterned surfaces, e.g. textiles, wood
    • G01N21/8986 - Wood

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Wood Science & Technology (AREA)
  • Chemical & Material Sciences (AREA)
  • Biochemistry (AREA)
  • Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Textile Engineering (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Food Science & Technology (AREA)
  • Medicinal Chemistry (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The method consists in: a) disposing a laser source projecting a plane beam of radiation in such a manner as to illuminate head-on a first face without illuminating an adjacent face; b) placing a linear color video camera facing the adjacent face under consideration so that it observes a line perpendicular to the longitudinal axis of the part in the plane of the beam of laser radiation, so as to observe both the adjacent face and the projection of the first face on the adjacent face; c) simultaneously acquiring the three R G B color signals from the camera;
d) processing the R G B signals in real time to provide a succession of adjacent pixels that are associated, as a function of their R G B components, with particular labelled domains in R G B space; e) producing relative longitudinal displacement between the part being scanned and the camera together with the laser source to observe the entire adjacent face together with the projection of the first face on the adjacent face; and f) recognizing geometrical features of the first face on the basis of pixels identified as corresponding to a particular labelled domain of R G B space that corresponds to the monochromatic illumination from the laser source.

Description


A METHOD AND APPARATUS FOR RECOGNIZING GEOMETRICAL
FEATURES OF PARALLELEPIPED-SHAPED PARTS OF POLYGONAL
SECTION
The present invention relates to a method and apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section including at least one secondary face adjacent to a main face, the maximum angle defined between the main face under consideration and said adjacent secondary face lying in the range 0° to 90°.
BACKGROUND OF THE INVENTION
Various systems are already known for observing objects, such as pieces of wood, using linear or matrix color video cameras. Such systems for automatic inspection of objects serve to spot defects of appearance such as defects of color or of structure on plane faces of objects under observation for the purpose, for example, of subsequently sorting the objects on a quality basis.
In known observation systems, each plane face of an object is examined by a color video camera in association with a lighting system. Relative movement is produced between the part to be observed and the camera together with the lighting system so as to inspect an entire plane face, even when using a linear camera or a matrix camera with a small field of view. Defects of structure are detected essentially on the basis of variation in the luminance of the video signals delivered by an observation camera. Such observation systems are nevertheless poorly adapted to recognizing geometrical features of parts, e.g. chamfered or waney portions along the sides of a piece of wood adjacent to an observed main face.
Also known are systems for observing travelling objects, such systems comprising laser sources projecting coherent light on an object to be observed, together with photoelectrical sensors receiving the coherent light from the laser sources as reflected by the object so as to detect defects on the basis of variations in the amplitude of the reflected light. Such object observation systems are nevertheless not adapted to complete inspection of an object for the purposes of detecting both geometrical features or defects and defects of coloration.
OBJECTS AND SUMMARY OF THE INVENTION
An object of the present invention is to remedy the drawbacks of the prior art and in particular to provide a method and apparatus for simple and effective optical detection of geometrical features affecting secondary faces of an object adjacent to a main face of the object being inspected by a video camera.
More particularly, the invention seeks to make it possible to use an optical method to detect simultaneously both geometrical features or defects affecting two secondary faces of an object adjacent to a main face, and defects of structure and defects of color present on said main face of the object.
These objects are achieved by a method of recognizing geometrical features of parallelepiped-shaped parts of polygonal section including at least one secondary face adjacent to a main face, the maximum angle defined between the main face under consideration and said adjacent secondary face lying in the range 0° to 90°, the method comprising the following steps:
a) disposing a monochromatic light source to project a plane beam of collimated and monochromatic radiation perpendicularly to the longitudinal axis of the part so as to light said secondary face head-on while avoiding lighting the plane surface of the main face;
b) placing a linear color video camera facing the main face under consideration, in such a manner that its axis is perpendicular to said main face and it observes a line perpendicular to the longitudinal axis of the part in the plane of the beam of collimated and monochromatic radiation, thereby enabling it to observe both said main face and the projection of said adjacent secondary face on said main face;
c) simultaneously acquiring three R G B color signals from the camera, each signal representing the amplitude of one of the three primary components R G and B at each point scanned along the observation line;
d) processing the R G B signals in real time to provide a succession of adjacent pixels that are associated, as a function of their R G and B components, with particular labelled domains in R G B space;
e) producing relative longitudinal displacement between the part to be scanned and the camera together with the monochromatic light source to observe all of the main face and the projection of the adjacent secondary face on said main face; and f) recognizing the geometrical features of said secondary face by identifying pixels that correspond to a particular domain in R G B space that is identified by a label corresponding to the monochromatic lighting from said monochromatic light source.
More particularly, according to the invention, between the camera and the main face of the part, there is also a diffuse polychromatic lighting source for lighting in diffuse manner the line on the main face that is observed by the camera, and defects of structure or of coloring in the main face are identified simultaneously with geometrical features of the adjacent secondary face in the image as picked up by the camera and preprocessed in real time by means of an operation of classifying pixels by color, each pixel of the picked-up image being associated with a specific subset identified by a label, thereby making it possible directly to characterize the specific image entity to which a pixel belongs independently of any other characteristic of size or shape in the region of the image to which said pixel belongs, thus making it possible to detect and distinguish both said geometrical features of said secondary face and defects of structure or of coloring in the main face as represented by pixels corresponding to particular labelled domains in R G B space that do not correspond to said monochromatic lighting from the monochromatic light source.
According to an advantageous characteristic, while simultaneously acquiring the three R G B color signals from the linear camera, the real time processing of the R G B signals includes a transformation of color space co-ordinates that is performed electronically in order to define point colors in a new three-dimensional space that favors color discrimination.
According to a preferred other characteristic, during simultaneous acquisition of the three R G B color signals from the linear camera, real time electronic correction is performed of the intensity of each of the R
G B components received by each pixel in order to compensate for possible non-uniformities in lighting.
The operation of classifying pixels into specific subsets provided with respective labels depending on color is performed in a prior first stage of learning the colors that are characteristic of each singularity to be identified in the image, and a second stage of classification proper in real time during which each pixel is no longer characterized by a triplet of co-ordinates in a color space, but solely by the color label corresponding to the co-ordinate triplet.
In a particular embodiment, the labelled color image is subjected to a filtering operation and is then segmented into regions, in order to define dimensional and geometrical attributes of each singularity of structure or of coloring and of each geometrical feature present in the image picked up by the linear camera.
More particularly, the method of the invention may include a defect-identifying operation that is performed in a prior first stage of learning typical defects and of ~ 21S3647 determining the most discriminating parameters for identifying each type of defect, and a second stage of recognizing defects in real time during which a set of parameters is determined describing each extracted defect, and depending on the value taken by each of these parameters, the probability of each previously learned defect occurring is calculated, and the types of defect associated with the greatest probabilities of occurrence are then considered as having been recognized.
In the method of the invention, geometrical features are recognized simultaneously on a plurality of secondary faces adjacent to main faces and defects of structure or of coloring of said main faces are simultaneously identified by placing as many monochromatic light sources as there are secondary faces, and as many linear color video cameras as there are main faces; while acquiring the R G B color signals from each linear camera, image lines are built up comprising a juxtaposition of pixels from the various linear cameras.
The invention also applies to apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section including at least one secondary face adjacent to a main face, with the maximum angle defined between the main face under consideration and said adjacent secondary face lying in the range 0° to 90°, the apparatus comprising at least:
a) a monochromatic light source projecting a plane beam of collimated and monochromatic radiation perpendicularly to the longitudinal axis of the part so as to illuminate said secondary face without illuminating the plane surface of the main face;
b) a linear color video camera situated facing the main face under consideration in such a manner that its axis is perpendicular to said main face and it observes a line perpendicular to the longitudinal axis of the part in the plane of the beam of collimated and monochromatic radiation;

c) support and drive means for causing relative movement between the part and the camera together with the monochromatic light source along the longitudinal axis of the part, in such a manner that successive portions of the main face and of the secondary face of the part are respectively observed and illuminated by the camera and the monochromatic light source; and d) electronic means for real time acquisition and processing of the R G B color signals delivered by the linear camera to provide a succession of adjacent pixels that are associated, as a function of their R G B
components, with particular labelled domains in R G B
space, and enabling geometrical features of said secondary face to be recognized on the basis of identifying pixels belonging to a particular labelled domain in R G B space that corresponds to the monochromatic lighting from said monochromatic light source.
Advantageously, the apparatus further includes a diffuse lighting source of polychromatic light interposed between the camera and the main face of the part in order to illuminate in diffuse manner the line on said main face that is observed by the camera, and the electronic means for real time acquisition and processing of the R G
B color signals include discrimination means for detecting and discriminating said geometrical features of said secondary face and defects of structure or of coloring in the main face.
In a particular embodiment, the diffuse lighting source of polychromatic light comprises first and second light sources disposed transversely relative to the part to be observed, on either side of the plane containing the axes of the camera and of the monochromatic light source, reflectors for directing the polychromatic light from said first and second light sources towards the transverse line observed by the camera, and a support housing for the light sources and for the reflectors, which housing includes a slot-shaped transverse opening enabling the camera to observe the line constituted by the intersection between said polychromatic light beams and the main face of the part.
The monochromatic light source may comprise a laser generator or a polychromatic light source associated with a color filter.
The apparatus of the invention is advantageously applied to parallelepiped-shaped parts of rectangular section having two opposite main faces and two secondary faces or sides.
In which case, in a first plane perpendicular to the axis of the part, the apparatus comprises first and second linear color video cameras respectively disposed facing the opposite first and second main faces of the part, first and second monochromatic light sources respectively situated facing opposite first and second secondary faces, first and second polychromatic light sources respectively interposed between the first and second cameras and the facing first and second main faces of the part, and electronic means for real time acquisition and processing of the R G B color signals delivered by the two linear cameras.
Advantageously, in order to enable all four faces of the part under observation to be examined, the apparatus further includes, on a second plane perpendicular to the axis of the part and offset longitudinally relative to said first plane perpendicular to the axis of the part, third and fourth linear color video cameras respectively disposed facing the opposite first and second secondary faces of the part, third and fourth monochromatic light sources respectively disposed facing the opposite first and second main faces, third and fourth polychromatic light sources respectively interposed between the third and fourth cameras and the opposite first and second secondary faces, and electronic means for real time acquisition and processing of the R G B color signals delivered by all four linear cameras.
The invention may be applied to parallelepiped-shaped parts of different kinds made of natural or of artificial material. Nevertheless, a particularly useful application is constituted by using the invention on pieces of wood in which it is desired to detect defects of structure such as knots or splits, and also defects of color, while simultaneously detecting geometrical features or defects, such as chamfered or waney sides adjacent to a main face under observation.
The invention, which makes it possible to scan parts of polygonal section over one or more faces while the parts are moving longitudinally, can be used in the following applications:
automatic and optimized sawing to obtain squared timber of rectangular section;
edge sawing of waney squared timber (pieces of wood having a section that is approximately trapezium-shaped);
and quality sorting of pieces output by a planing machine (pieces of wood machined to have two parallel faces including shaped profiles).
BRIEF DESCRIPTION OF THE DRAWINGS
Other characteristics and advantages of the invention appear from the following description of particular embodiments given as non-limiting examples and described with reference to the accompanying drawings, in which:
Figure 1 is a diagrammatic end view showing a first example of a simplified apparatus of the invention for detecting geometrical features of a parallelepiped-shaped part;
Figure 2 is an end view similar to Figure 1, but showing more complete optical observation apparatus of the invention, with a diffuse lighting device and two sources of laser radiation;

Figures 3 and 4 are views of the Figure 2 apparatus in side view and plan view respectively;
Figure 5 is an end view similar to Figure 2, but showing even more complete optical observation apparatus enabling all four faces of a parallelepiped-shaped part to be observed simultaneously and completely;
Figure 6 is a perspective view of the Figure 5 apparatus showing the shape of the monochromatic light beams and the fields of observation of the cameras, but in which the diffuse lighting devices have been removed for reasons of clarity;
Figures 7A, 7B, 7C, and 7D are views of the four faces of the part shown in Figure 6 as examined simultaneously by the apparatus of Figure 6;
Figure 8 is a side view of an example of the diffuse lighting device usable in the context of the present invention;
Figure 9 is an end view of the Figure 8 lighting device, shown partially in section on line IX-IX of Figure 8;
Figure 10 is a plan view of the lighting device of Figures 8 and 9;
Figure 10A is a cross-section view through a reflector incorporated in the diffuse lighting device of Figures 8 to 10;
Figure 11 is a flow chart showing the various steps in processing the signals delivered by the linear color cameras of the apparatus of the invention; and Figure 12 is a block diagram showing the various functional blocks of the electronic circuits for processing the signals delivered by the linear color cameras of the apparatus of the invention.
MORE DETAILED DESCRIPTION
Figure 1 shows the two essential elements of observation apparatus of the invention adapted to automatic inspection of a parallelepiped-shaped part 5 of polygonal section having a top main face 1, a bottom main face 2, and lateral secondary faces or sides 3, 4.
The part 5, which may be a piece of wood, includes a lateral face 3 having a portion 3a that is chamfered, presenting an acute angle with the top main face 1, and another portion 3b which is substantially perpendicular to said top face 1.
The observation apparatus includes a linear color video camera 11 which is situated above the main face 1, and which has its axis perpendicular to said main face 1.
The camera 11 observes a line perpendicular to the longitudinal axis of the part 5 and its field of view is large enough to enable it to observe both the main face 1 and the projection x of the secondary face 3 on the main face 1.
A source 21 of monochromatic light is also disposed on one side facing the lateral face 3 so as to project a plane beam of collimated and monochromatic radiation perpendicular to the longitudinal axis of the part 5 in such a manner as to illuminate the lateral face 3 head-on without illuminating the plane surface of the main face 1. The plane beam coming from the source 21 lies in the plane of the field of observation of the linear sensor of the camera 11.
The main plane face 1 of the part 5 is also illuminated by diffuse illumination using polychromatic light which may be the result of ambient lighting, but which preferably comes from a specific source incorporated in the observation apparatus as a whole, and as described below with reference to Figures 2, 3, 5, and 8 to 10.
The beam of monochromatic light may come from a laser source or it may be obtained from a source of polychromatic light having a color filter associated therewith.
The source 21 of monochromatic light further includes an optical device (e.g. of the semicylindrical lens type) or a mechanical device (e.g. of the rotating mirror or prism type) for generating a plane light beam perpendicular to the axis of the part 5 and projected onto the lateral face 3 thereof in the plane containing the optical axis of the camera 11 and the line under observation on the scanned surface 3.
The linear color video camera 11 may be selected from various possible types.
By way of example, in a first possible type, a single-line CCD linear camera is used in which the sensor is constituted by a juxtaposition of elements responsive to one of the three primary colors red (R), green (G), and blue (B).
Under such circumstances, the elements are organized along the sensor to constitute a succession of RGB, RGB, RGB, ... sequences. The number of image points identifiable by such a sensor is equal to the number of RGB triplets it contains.
In a second possible type, a triple-line CCD linear camera is used made up of three linear sensors.
In this case, each sensor is constituted by juxtaposing elements sensitive to one of the three primary colors. The camera thus comprises a first linear sensor sensitive to the color red, a second linear sensor sensitive to the color blue, and a third linear sensor sensitive to the color green.
These three sensors are physically parallel to one another and they are positioned in a plane that is parallel to the surface under observation. In this case, since the three sensors are physically offset by a certain distance, they do not observe simultaneously the same image line. It is then necessary for the image acquisition electronics to re-establish synchronization in time.
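By way of illustration only, and not part of the original disclosure, the following Python sketch shows one way the resynchronization could be done in software once the lines have been digitized; the sensor order and the one-line offset are assumptions:

def resync_trilinear(r_lines, g_lines, b_lines, line_offset=1):
    # The same physical line is assumed to reach the G sensor `line_offset`
    # acquisitions after the R sensor, and the B sensor 2 * line_offset
    # acquisitions after it (order and offset are hypothetical).
    n = len(r_lines) - 2 * line_offset
    return [(r_lines[i], g_lines[i + line_offset], b_lines[i + 2 * line_offset])
            for i in range(n)]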
In a third possible type, a triple-line CCD linear camera is used made up of sensors similar to those of the above-described model, but positioned in a plane perpendicular to the surface under observation. In this plane, the sensors are disposed in a channel section configuration where the web is parallel to the surface under observation and the two sides are perpendicular thereto. In this case, an optical device (of the prism type) illuminates all three sensors simultaneously so that they therefore observe the same image line.
Whatever the type of camera used in this application, the camera observes the diffuse reflection due to the polychromatic lighting or to the monochromatic lighting.
When using a camera of the second type, the camera is adjusted in such a manner that the collimated monochromatic lighting is in alignment with the sensor that is the most sensitive to the wavelength of the light coming from said lighting.
The apparatus of the invention includes means for simultaneous acquisition of the three color signals R, G, and B from the camera 11, each signal representing the amplitude of one of the three primary R G B components for each scanned point along the line of observation.
The R G B color signals are processed in real time to provide a succession of adjacent pixels that are associated, as a function of their R G B components, with particular labelled domains in R G B space.
Relative longitudinal displacement is produced between the part 5 to be scanned and the observation apparatus comprising the camera 11 and the monochromatic light source 21. Generally, it is more convenient to install the camera 11 and the source 21 so that they are stationary and to move the part 5 that is to be scanned; however, it is also possible to observe a stationary part 5 using observation apparatus comprising a camera 11 and a monochromatic light source 21 having defined relative positions and mounted on a carriage type support capable of moving them simultaneously, e.g. along rails that extend parallel to the longitudinal direction of the part 5. During relative longitudinal displacement between the part 5 and the observation apparatus, it is thus possible to inspect its entire main face and also the projection of the secondary face 3 on the plane of the main face 1.
Geometrical features of the lateral face 3, such as the presence of chamfers 3a, are observed by identifying pixels corresponding to a particular labelled domain in the R G B space that corresponds to the monochromatic lighting from the source 21 of monochromatic light.
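As a hedged illustration (the label value and pixel pitch below are hypothetical, not taken from the patent), the projection x of a chamfered side on the main face can be measured on each labelled line simply by counting the run of laser-labelled pixels at the edge of the line:

LASER_LABEL = 7  # hypothetical label assigned during learning to the laser colour domain

def chamfer_projection(labelled_line, pixel_pitch_mm=0.5):
    # Width of the run of laser-labelled pixels at the left edge of the
    # observation line, converted to millimetres with an assumed pixel pitch.
    run = 0
    for label in labelled_line:
        if label != LASER_LABEL:
            break
        run += 1
    return run * pixel_pitch_mm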
When it is desired to observe both lateral faces 3 and 4 that are adjacent to a main face 1 of the part 5 simultaneously by means of a single camera 11 whose field of vision covers the main face 1 plus the projections of the lateral faces 3 and 4 on the plane of the main face 1, then it suffices to place a second monochromatic light source 22 facing the other face 4, as shown in Figures 2 and 4. The sources 21 and 22 may be identical, both projecting a respective plane beam of collimated monochromatic radiation perpendicularly to the longitudinal axis of the part 5, the plane beams being situated in the observation plane of the camera 11 and not lighting the plane surface of the main face 1.
In Figures 3 and 4 there can be seen the trace 25 of the monochromatic light beam from the source 21 which makes it possible to detect the presence of a chamfered portion 3a in the lateral face 3 of the part 5.
Figures 2 and 3 also show a source 31 for diffuse lighting using polychromatic light and interposed between the camera 11 and the main face 1 so as to illuminate in diffuse manner the line of the main face 1 that is observed by the camera 11.
The diffuse lighting source 31 comprises a housing 300 having a slot-shaped opening 301 extending transversely relative to the part 5 so as to enable the camera 11 to observe the line constituted by the intersection between the beams of polychromatic light and the main face 1 of the part 5, and the lines 25 of monochromatic light determining the geometrical features of the lateral faces 3 and 4 adjacent to the main face 1.
The housing 300 supports first and second light sources 308 and 309 which extend transversely relative to the part 5 to be observed on either side of the plane containing the axes of the camera 11 and of the monochromatic light sources 21 and 22. The light sources 308 and 309 may be constituted by two lamps powered by DC
or by AC via high frequency ballasts which minimize variations in light level. The housing 300 also supports reflectors 302 and 303 each containing one of the lamps 308 and 309 and directing the light emitted thereby towards the line being observed by the camera 11. An example of the cylindrical reflectors 302 and 303 is shown in cross-section in Figure 10A, where the reflectors have a bottom opening that extends over an angle of about 100°.
Figures 8, 9, and 10 show a particular embodiment of the housing 300 for the diffuse lighting source 31.
These Figures 8 to 10 show end elements 304 and 305 in the form of disks provided with oblong openings 306 and 307 for receiving the ends of the lamps 308 and 309 (not shown in Figures 8 to 10). These disks 304 and 305 which are themselves fixed to the housing 300 may also serve as assembly elements for the reflectors 302 and 303.
Figures 5 and 6 show an example of apparatus of the invention enabling all four faces of a parallelepiped-shaped part 5 of rectangular or trapezium-shaped section to be scanned completely. In Figure 6, which is a perspective view, diffuse lighting devices such as 31 are not shown in order to clarify the drawing.
In the apparatus of Figures 5 and 6, an additional linear color camera 12 is disposed in the same plane as the camera 11, symmetrically about the part 5 so as to observe the bottom main face 2 of the part 5.
As can be seen in Figures 5 and 6, by implementing two linear cameras 11, 12, two monochromatic light sources 21, 22 and two diffuse sources 31 and 32 of polychromatic light in the same transverse plane for the purpose of detecting defects of structure and of coloring it is possible simultaneously to recognize geometrical features of the lateral faces 3 and 4 and to detect defects of structure and of coloring in the main faces 1 and 2 of the part 5.
When it is desirable to be able to detect, on each of the faces 1 to 4 of the part 5, both geometrical features and defects of structure and of coloring, the equipment mentioned in the preceding paragraph is duplicated in a transverse plane that is offset from the transverse plane defined by the elements 11, 12 and 21, 22.
The additional set of two cameras 13, 14, two diffuse polychromatic lighting sources 33, 34, and two collimated monochromatic lighting sources 23, 24 is rotated through 90° about the axis of symmetry of the device relative to the first set of two cameras 11, 12, two diffuse polychromatic lighting sources 31, 32, and two collimated monochromatic lighting sources 21, 22 (see Figures 5 and 6). The longitudinal offset between the two transverse planes defined by these two observation sets may be a few centimeters, e.g. about 20 cm, in order to avoid the collimated lighting in each of the two observation sets mutually interfering.
Figure 6 and Figures 7A to 7D show examples of defects or of geometrical features that can be observed using the apparatus of the invention, as described with reference to Figures 5 and 6.
Thus, there can be seen on the top plane face 1 of the piece of wood 5, the mark of a knot 6, of a split 7, and of a chamfer or of waneyness 8 (Figure 7A). In contrast, the bottom plane face 2 has no defects (Figure 7B), and the same applies to the lateral face 3 (Figure 7C). Figure 7D shows both the mark of the chamfer 8 as already identified in Figure 7A, and a split 9.

There follows a description with reference to Figures 11 and 12 of the apparatus for processing the R G
B color signals delivered by one or more linear color cameras 11, 12, 13, and 14.
Figure 11 is a flow chart showing the various steps implemented in the apparatus for processing the R G B
color signals delivered by the cameras 11 to 14.
In a preliminary step 101, the particular characteristics of the material of the part to be scanned are defined. For example, it may be specified that the part is constituted by sawn wood. Such information makes it possible to take account of the results of learning procedures 142, 162 or of a series of rules 172, 182 that have previously been established for the type of material under consideration.
A color acquisition module 110 serves to acquire the three R, G, and B signals simultaneously from the cameras 11 to 14. Each signal represents the amplitude of one of three primary R, G, and B components at each point on an observation line on the part 5 as scanned by a camera.
In the color acquisition module 110, image lines are formed that are constituted by juxtaposing pixels from one, two, three, or four cameras 11 to 14. It is possible, with each camera, to select an arbitrary zone of consecutive pixels, and then form a single image line by concatenating the zones from each camera.
This is performed by means of first-in/first-out (FIFO) circuits 211 to 214 that handle queues which receive the R G B color signals from interfaces 201 to 204 associated with the various linear color cameras 11 to 14 (Figure 12).
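Purely as an illustrative sketch of what the FIFO queues achieve (the data structures are assumptions, not the actual hardware interface), a software equivalent of building one image line from selected pixel zones of several cameras could look like this:

def build_image_line(camera_lines, zones):
    # camera_lines: one list of pixels per camera for the current scan line;
    # zones: one (start, stop) pair per camera selecting the useful pixels.
    line = []
    for pixels, (start, stop) in zip(camera_lines, zones):
        line.extend(pixels[start:stop])
    return line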
A lighting correction circuit 222 which operates in real time together with a synchronization controlling circuit 221 also forms a portion of the color acquisition module 110 from which an R G B color image is produced (step 111).

In the lighting correction circuit 222, the intensities of the R G B components as received by each pixel are electronically corrected in real time in order to compensate for possible non-uniformities in lighting.
The correction function is of the following type:

Ic = A x Ie + B
where:
Ic = corrected intensity
Ie = emitted intensity
A and B = correction coefficients.
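A minimal software sketch of this correction is given below; in practice the coefficients A and B would be calibrated per pixel (for example from an image of a uniform reference surface), which is an assumption not detailed in the text:

def correct_lighting(line, gains, offsets):
    # Apply Ic = A * Ie + B to each pixel of an acquired line; line, gains (A)
    # and offsets (B) are equal-length sequences, one entry per pixel.
    return [a * ie + b for ie, a, b in zip(line, gains, offsets)]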
In addition, during color signal acquisition, the color of each pixel is generally characterized by a triplet of co-ordinates defined in a three-dimensional co-ordinate system or "space" of the red, green, blue type = (R, G, B).
Nevertheless, this three-dimensional space is not necessarily the most advantageous and it may be preferable to characterize the color of each pixel by a triplet of co-ordinates defined in some other three-dimensional space, e.g. space of the hue, luminance, saturation type = (H, L, S).
When the linear color camera used does not directly deliver colorimetric co-ordinates in the most advantageous color space, a color co-ordinate transformation module 120 (corresponding to circuits 231 in Figure 12) automatically performs a co-ordinate change electronically.
The triplet of co-ordinates corresponding to the new image co-ordinates as provided at step 121 can then be expressed in the form:

(f(R, G, B); g(R, G, B); h(R, G, B))
where f, g, and h are linear combinations of R, G, and B enabling the color of points to be defined in some other three-dimensional space that facilitates color discrimination.
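To make the transformation concrete, the sketch below applies an arbitrary 3x3 matrix of linear combinations to each (R, G, B) triplet; the coefficients are a luminance/opponent-colour example chosen for illustration and are not those of the circuits 231:

# Rows of T are example linear combinations f, g and h of (R, G, B).
T = [
    [0.299, 0.587, 0.114],    # f: luminance-like component
    [0.500, -0.419, -0.081],  # g: red-green opponent component
    [-0.169, -0.331, 0.500],  # h: blue-yellow opponent component
]

def transform_pixel(r, g, b, matrix=T):
    # Return the triplet (f(R,G,B), g(R,G,B), h(R,G,B)) for one pixel.
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in matrix)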
Pixels characterized by co-ordinate triplets in the color space selected in the module 120 are processed in a module 130 for classifying pixels by color (corresponding to circuits 241 of Figure 12) in order to provide a labelled color image in step 131.
At step 121 where pixels are input into the classification module by color, the set of possible values in the space is specified in the form of triplets (C1, C2, C3). Each address triplet is associated with a form or label selected from the set of labels available:
Addresses              Data
(C1.1, C2.1, C3.1)  →  Li
(C1.1, C2.1, C3.2)  →  Lj
(C1.1, C2.1, C3.3)  →  Lk
...
(C1.n, C2.n, C3.n)  →  Ln
where:
C1i = f(R G B)
C2i = g(R G B)
C3i = h(R G B)
Physically, each label (Li, Lj, ...) groups together the set of colors that characterize a particular entity.
The mere fact of associating a pixel of the image with a specific subset of colors (label) makes it possible directly to characterize the specific image entity it belongs to independently of any other characteristic of dimension, shape, ..., of the image region to which said pixel belongs.
This pixel classification operation takes place in two stages:
a) learning the colors that are characteristic of each singularity that is to be identified in the image (module 132); and b) classification proper in which each pixel is no longer characterized by a triplet of co-ordinates (C1, C2, C3) but solely by its label (module 130).
The learning stage 132 is implemented by a human operator who locates within a previously recorded image each of the zones that correspond to a specific singularity.
The operator also tells the computer system what type of singularity has been located in this way.
This operation of locating and identifying singularities is repeated several times to give the computer system sufficiently diverse data to be able to select as accurately as possible which colors ought to be grouped together under a common label.
The classification stage proper is subsequently performed electronically, so that each pixel is then characterized by its label (step 131).
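The two-stage operation can be pictured with the following sketch, in which a coarse lookup table quantises colour triplets to keys; the quantisation step, class names and interface are assumptions made for illustration:

class ColourClassifier:
    def __init__(self, step=8, default_label="BACKGROUND"):
        self.step = step            # quantisation step of the colour space
        self.default = default_label
        self.lut = {}               # (C1, C2, C3) key -> label

    def _key(self, c1, c2, c3):
        s = self.step
        return (c1 // s, c2 // s, c3 // s)

    def learn(self, samples, label):
        # Learning stage: the operator supplies (C1, C2, C3) triplets taken
        # from zones of a recorded image and names the corresponding entity.
        for c1, c2, c3 in samples:
            self.lut[self._key(c1, c2, c3)] = label

    def classify_line(self, line):
        # Classification proper: each pixel is reduced to its label.
        return [self.lut.get(self._key(*px), self.default) for px in line]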
In the labelled color image present in step 131, the fact that a pixel belongs to a particular labelled subset suffices to define what type of singularity is involved.
This image is processed in a filter module 140 associated with a prior learning procedure in a step 142.
The filter module 140 seeks to distinguish between singularities of structure or of coloring and features that are geometrical.
Given that the particular subset to which a pixel belongs suffices to define the type of singularity in which it is included, each singularity of structure or of coloring shown up by the diffuse lighting 31 to 34 is directly identifiable by its specific color.
In contrast, geometrical features that may apply to the secondary lateral faces or "sides" 3, 4 adjacent to the main plane surface of the part 5 that is under observation (1 or 2) are directly illuminated by the monochromatic lighting 21 or 22. These features cannot be characterized solely on the basis of their own color, but can be characterized on the basis of the color of the monochromatic light that is projected over their entire surface area.
Thus, merely by using filtering to identify the label that corresponds to the color of the lighting, it is possible to characterize a geometrical feature concerning one of the sides 3, 4 of the scanned surface 1 or 2.
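A schematic way of performing this filtering on one labelled line is sketched below; the label names are hypothetical placeholders for the labels produced at the learning stage:

LASER = "LASER"            # label learned for the colour of the monochromatic lighting
SOUND_WOOD = "SOUND_WOOD"  # label learned for defect-free material

def split_singularities(labelled_line):
    # Pixel indices lit by the laser belong to a geometrical feature of a side;
    # any other label except sound wood marks a structure or colour defect.
    geometry = [i for i, lab in enumerate(labelled_line) if lab == LASER]
    defects = [i for i, lab in enumerate(labelled_line)
               if lab not in (LASER, SOUND_WOOD)]
    return geometry, defects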
The filtered and labelled image as obtained in step 141 after passing through the filter module 140 is subsequently processed by computation in a module 150 for segmenting regions, whose function is to assign dimensional and geometric attributes to each entity present in the image.
Each singularity of structure or of coloring, and each geometrical feature is thus characterized by its type, its shape, its size, and its location (X, Y).
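The attributes computed by the segmentation module can be summarised, for one region, along the lines of the sketch below (the attribute names, and the assumption that region extraction has already grouped the pixel coordinates, are illustrative):

def region_attributes(pixel_coords, label):
    # pixel_coords: list of (x, y) positions sharing the same label after
    # segmentation; returns type, size and location descriptors of the region.
    xs = [x for x, _ in pixel_coords]
    ys = [y for _, y in pixel_coords]
    return {
        "type": label,
        "area": len(pixel_coords),
        "bounding_box": (min(xs), min(ys), max(xs), max(ys)),
        "centroid": (sum(xs) / len(xs), sum(ys) / len(ys)),
    }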
Once the definition of the regions and of their attributes is available in step 151, defects are identified in a probabilistic classification module 160.
In this module, defects are identified by a statistical procedure organized in two steps:
In a first, learning step 162, the images of typical defects are stored in the memory of the image scanning system. A human operator identifies each defect present in the image and names it.
The system then determines which parameters are the most discriminating for identifying each type of defect.
In a second step, defects are identified in continuous operation, the vision system determining a set of parameters describing each extracted defect (area, outline, ...).
Depending on the value taken by each of these parameters, the system then calculates the probability of the occurrence of each previously-learned defect.
The type of defect which is associated with the highest probability of occurrence is then considered as having been recognized.
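The patent does not specify the statistical model; purely as an illustration, the sketch below scores each learned defect type with an independent Gaussian likelihood over the measured parameters and keeps the most probable type:

import math

def recognize_defect(measured, learned_stats):
    # measured: {parameter: value} for the extracted defect (area, outline, ...).
    # learned_stats: {defect_type: {parameter: (mean, std)}} from the learning step 162.
    scores = {}
    for defect, stats in learned_stats.items():
        log_p = 0.0
        for param, value in measured.items():
            mean, std = stats[param]
            log_p += -((value - mean) ** 2) / (2 * std ** 2) - math.log(std)
        scores[defect] = log_p
    best = max(scores, key=scores.get)
    return best, scores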

Once the list of detected defects is available in step 161, a defect interpretation module 170 proceeds to perform filtering as a function of decision-making rules previously determined in a step 172 so as to retain only those defects that are meaningful in considering the scanned part 5.
Thus, a filtered defect list is obtained in step 171. This information can, where appropriate, be used in an optimizing or grading module 180 which applies cutting or grading rules that have previously been input in a step 182.
After steps 170 and 171 in which the vision system generates the list of defects present on the parts being scanned (face number, defect type, surrounding rectangle, ...), this data is made use of in the module 180 by software for performing quality classification or for optimized throughput. Thus, at the outlet from the module 180, at a step 181, orders are obtained for classifying objects by quality or for controlling sawing, and the manufacturing process can be controlled as a function of the characteristics of the previously scanned product.
The processing and interface circuits used in the present invention can be controlled and run from a PC-type microcomputer.
In Figure 12, there is shown a bus 272 of a PC-type microcomputer, interface circuits 271 for said PC-type bus 272, and initialization registers 261 connected to the interface circuits 271. Also shown are interface circuits 251 for performing the functions specific to each of the modules 140, 150, 160, and 170.
The interface circuits 251 serve to interface data with processor cards for processing digital signals. The interface circuits 251 have three memory paths "MEM"
constituted by the three color paths R G and B used for screen display, and four paths "TRAI" constituted by the four labelled paths used for processing purposes.

The circuit 221, which has connections with the camera interfaces 201 to 204, the FIFO circuits 211 to 214, the lighting correction circuit 222, the color space co-ordinate changing circuit 231, and the classifier circuit 241, comprises a set of components that serve firstly to manage control signals for all of the circuits in the card constituting the processor device (generating clock signals and control signals for other functional blocks), and also to manage the "bitmap" function. This bitmap function serves, for an acquired line having N
pixels (e.g. N=2098 for color cameras of the kind usable in the context of the present invention) to store and thus process only a subset of the pixels that have been declared to be useful.
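In software terms the bitmap function amounts to masking each acquired line with a table of useful pixels, as in this small sketch (the boolean-mask representation is an assumption):

def apply_bitmap(line, useful):
    # line: the N acquired pixels (e.g. N = 2098); useful: N booleans declaring
    # which pixels are to be stored and processed.
    return [px for px, keep in zip(line, useful) if keep]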
It may be observed that by using monochromatic light sources 21 to 24 in the invention, it is possible not only to display and discriminate geometrical features of the parts under observation, but also, and additionally, to adjust each camera 11 to 14 on the basis of a laser trace obtained from one of the sources 21 to 24, thereby making it possible to improve camera alignment.

Claims (17)

1/ A method of recognizing geometrical features of parallelepiped-shaped parts of polygonal section including at least one secondary face adjacent to a main face, the maximum angle defined between the main face under consideration and said adjacent secondary face lying in the range 0° to 90°, the method comprising the following steps:
a) disposing a monochromatic light source to project a plane beam of collimated and monochromatic radiation perpendicularly to the longitudinal axis of the part so as to light said secondary face head-on while avoiding lighting the plane surface of the main face;
b) placing a linear color video camera facing the main face under consideration, in such a manner that its axis is perpendicular to said main face and it observes a line perpendicular to the longitudinal axis of the part in the plane of the beam of collimated and monochromatic radiation, thereby enabling it to observe both said main face and the projection of said adjacent secondary face on said main face;
c) simultaneously acquiring three R G B color signals from the camera, each signal representing the amplitude of one of the three primary components R G and B
at each point scanned along the observation line;
d) processing the R G B signals in real time to provide a succession of adjacent pixels that are associated, as a function of their R G and B components, with particular labelled domains in R G B space;
e) producing relative longitudinal displacement between the part to be scanned and the camera together with the monochromatic light source to observe all of the main face and the projection of the adjacent secondary face on said main face; and f) recognizing the geometrical features of said secondary face by identifying pixels that correspond to a particular domain in R G B space that is identified by a label corresponding to the monochromatic lighting from said monochromatic light source.
2/ A method according to claim 1, wherein, between the camera and the main face of the part, there is also a diffuse polychromatic lighting source for lighting in diffuse manner the line on the main face that is observed by the camera, and wherein defects of structure or of coloring in the main face are identified simultaneously with geometrical features of the adjacent secondary face in the image as picked up by the camera and preprocessed in real time by means of an operation of classifying pixels by color, each pixel of the picked-up image being associated with a specific subset identified by a label, thereby making it possible directly to characterize the specific image entity to which a pixel belongs independently of any other characteristic of size or shape in the region of the image to which said pixel belongs, thus making it possible to detect and distinguish both said geometrical features of said secondary face and defects of structure or of coloring in the main face as represented by pixels corresponding to particular labelled domains in R G B space that do not correspond to said monochromatic lighting from the monochromatic light source.
3/ A method according to claim 2, wherein, while simultaneously acquiring the three R G B color signals from the linear camera, the real time processing of the R G B signals includes a transformation of color space co-ordinates that is performed electronically in order to define point colors in a new three-dimensional space that favors color discrimination.
4/ A method according to claim 2, wherein, during simultaneous acquisition of the three R G B color signals from the linear camera, real time electronic correction is performed of the intensity of each of the R G B
components received by each pixel in order to compensate for possible non-uniformities in lighting.
5/ A method according to claim 2, wherein the operation of classifying pixels into specific subsets provided with respective labels depending on color is performed in a prior first stage of learning the colors that are characteristic of each singularity to be identified in the image, and a second stage of classification proper in real time during which each pixel is no longer characterized by a triplet of co-ordinates in a color space, but solely by the color label corresponding to the co-ordinate triplet.
6/ A method according to claim 2, wherein the labelled color image is subjected to a filtering operation and is then segmented into regions, in order to define dimensional and geometrical attributes of each singularity of structure or of coloring and of each geometrical feature present in the image picked up by the linear camera.
7/ A method according to claim 2, including an operation of identifying defects that is performed in a prior first stage of learning typical defects and of determining the most discriminating parameters for identifying each type of defect, and a second stage of recognizing defects in real time during which a set of parameters is determined describing each extracted defect, and depending on the value taken by each of these parameters, the probability of each previously learned defect occurring is calculated, and the types of defect associated with the greatest probabilities of occurrence are then considered as having been recognized.
8/ A method according to claim 2, wherein geometrical features are recognized simultaneously on a plurality of secondary faces adjacent to main faces and defects of structure or of coloring of said main faces are simultaneously identified by placing as many monochromatic light sources as there are secondary faces, and as many linear color video cameras as there are main faces, and wherein, while acquiring the R G B color signals from each linear camera, image lines are built up comprising a juxtaposition of pixels from the various linear cameras.
9/ Apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section including at least one secondary face adjacent to a main face, with the maximum angle defined between the main face under consideration and said adjacent secondary face lying in the range 0° to 90°, the apparatus comprising at least:
a) a monochromatic light source projecting a plane beam of collimated and monochromatic radiation perpendicularly to the longitudinal axis of the part so as to illuminate said secondary face without illuminating the plane surface of the main face;
b) a linear color video camera situated facing the main face under consideration in such a manner that its axis is perpendicular to said main face and it observes a line perpendicular to the longitudinal axis of the part in the plane of the beam of collimated and monochromatic radiation;
c) support and drive means for causing relative movement between the part and the camera together with the monochromatic light source along the longitudinal axis of the part, in such a manner that successive portions of the main face and of the secondary face of the part are respectively observed and illuminated by the camera and the monochromatic light source, and d) electronic means for real time acquisition and processing of the R G B color signals delivered by the linear camera to provide a succession of adjacent pixels that are associated, as a function of their R G B
components, with particular labelled domains in R G B
space, and enabling geometrical features of said secondary face to be recognized on the basis of identifying pixels belonging to a particular labelled domain in R G B space that corresponds to the monochromatic lighting from said monochromatic light source.
10/ Apparatus according to claim 9, further including a diffuse lighting source of polychromatic light interposed between the camera and the main face of the part in order to illuminate in diffuse manner the line on said main face that is observed by the camera, and wherein the electronic means for real time acquisition and processing of the R G B color signals comprise discrimination means for detecting and discriminating said geometrical features of said secondary face and defects of structure or of coloring in the main face.
11/ Apparatus according to claim 10, wherein the diffuse lighting source of polychromatic light comprises first and second light sources disposed transversely relative to the part to be observed, on either side of the plane containing the axes of the camera and of the monochromatic light source, reflectors for directing the polychromatic light from said first and second light sources towards the transverse line observed by the camera, and a support housing for the light sources and for the reflectors, which housing includes a slot-shaped transverse opening enabling the camera to observe the line constituted by the intersection between said polychromatic light beams and the main face of the part.
12/ Apparatus according to claim 9, wherein the monochromatic light source comprises a laser generator.
13/ Apparatus according to claim 9, wherein the monochromatic light source comprises a polychromatic light source associated with a color filter.
14/ Apparatus according to claim 9, wherein the monochromatic light source further includes an optical or a mechanical device for generating a plane light beam perpendicular to the axis of the part and projected in the plane containing the optical center of the camera and the line under observation of the scanned surface.
15/ Apparatus according to claim 9, applied to parallelepiped-shaped parts of rectangular section having two opposite main faces and two secondary faces or sides, wherein in a first plane perpendicular to the axis of the part it comprises first and second linear color video cameras respectively disposed facing the opposite first and second main faces of the part, first and second monochromatic light sources respectively situated facing opposite first and second secondary faces, first and second polychromatic light sources respectively interposed between the first and second cameras and the facing first and second main faces of the part, and electronic means for real time acquisition and processing of the R G B color signals delivered by the two linear cameras.
16/ Apparatus according to claim 15, further including, on a second plane perpendicular to the axis of the part and offset longitudinally relative to said first plane perpendicular to the axis of the part, third and fourth linear color video cameras respectively disposed facing the opposite first and second secondary faces of the part, third and fourth monochromatic light sources respectively disposed facing the opposite first and second main faces, third and fourth polychromatic light sources respectively interposed between the third and fourth cameras and the opposite first and second secondary faces, and electronic means for real time acquisition and processing of the R G B color signals delivered by all four linear cameras.
17/ The use of the method according to claim 2 and of the apparatus according to claim 10 in detecting defects of structure and defects of coloring present in a plane face of a piece of wood while simultaneously detecting geometrical features or defects that apply to the two sides adjacent to the plane face under observation.
CA 2153647 1994-07-12 1995-07-11 Method and apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section Abandoned CA2153647A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR9408630 1994-07-12
FR9408630A FR2722573B1 (en) 1994-07-12 1994-07-12 METHOD AND DEVICE FOR RECOGNIZING GEOMETRIC PARTICULARITIES OF PARALLELEPIPEDIC PARTS OF POLYGONAL SECTION

Publications (1)

Publication Number Publication Date
CA2153647A1 true CA2153647A1 (en) 1996-01-13

Family

ID=9465298

Family Applications (1)

Application Number Title Priority Date Filing Date
CA 2153647 Abandoned CA2153647A1 (en) 1994-07-12 1995-07-11 Method and apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section

Country Status (4)

Country Link
EP (1) EP0692714A1 (en)
CA (1) CA2153647A1 (en)
FI (1) FI953397A (en)
FR (1) FR2722573B1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6122065A (en) * 1996-08-12 2000-09-19 Centre De Recherche Industrielle Du Quebec Apparatus and method for detecting surface defects
CA2245412C (en) 1998-08-20 2001-06-26 Centre De Recherche Industrielle Du Quebec Apparatus and method for marking elongated articles
AU7270200A (en) * 1999-09-14 2001-04-17 Opti-Wood Aps Method for quality determination and handling of elongate wood items
ITBZ20020018A1 (en) * 2002-04-18 2003-10-20 Microtec Srl PROCEDURE FOR RECOGNIZING ON THE SURFACE OF WOODEN TABLES THE PRESENCE OF DEFECTS SUCH AS CRACKS OR BEVELED EDGES.
PT102835B (en) * 2002-09-03 2004-08-31 Continental Mabor Ind De Pneus MONITORING SYSTEM AND AUTOMATIC CONTROL OF TOLERANCE IN TEXTILE CLOTH OVERLAYS AMENDMENTS.
ITBZ20050027A1 (en) * 2005-05-31 2006-12-01 Microtec Srl PROCEDURE FOR PHOTOGRAPHICALLY REPRESENTING THE EXTERNAL APPEARANCE OF A TRUNK AND ASSOCIATING ITS PHOTOGRAPHIC REPRESENTATION WITH THE RESPECTIVE THREE-DIMENSIONAL STRUCTURE OF THE MIDDLE TRUNK AS WELL AS A DEVICE FOR IMPLEMENTING THIS PROCEDURE.
CN104097820A (en) * 2013-04-10 2014-10-15 苏州华觉智能科技有限公司 Detection device
CN115308218A (en) * 2022-10-11 2022-11-08 中材节能股份有限公司 Calcium silicate board surface defect on-line measuring system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4831545A (en) * 1987-06-26 1989-05-16 Weyerhaeuser Company Method for determination of pith location relative to lumber surfaces
FR2624601B1 (en) * 1987-12-11 1990-05-25 Tech Bois Ameublement Centre VIDEO-LASER DETECTION DEVICE FOR DETERMINING GEOMETRIC CHARACTERISTICS OF AN OBJECT
WO1990011488A1 (en) * 1989-03-29 1990-10-04 Visionsmart Inc. Real-time lumber grading process and system
SE468107B (en) * 1989-10-27 1992-11-02 Autosort Ab PROCEDURE SHOULD REVISE AATMINSTONE BY AN OPTELECTRIC CAMERA WHICH IS LISTED FROM AATMINSTONE TWO DIFFERENT HALLS AND WHICH THE PICTURE SHOWS AATMINSTONE CONTAINS TWO LENGTH BOARDS, ACCORDINGLY
DE4218971C2 (en) * 1992-06-10 1994-09-22 Grecon Greten Gmbh & Co Kg Process for calibrating an image processing system

Also Published As

Publication number Publication date
FR2722573B1 (en) 1996-10-04
EP0692714A1 (en) 1996-01-17
FI953397A (en) 1996-01-13
FR2722573A1 (en) 1996-01-19
FI953397A0 (en) 1995-07-11

Similar Documents

Publication Publication Date Title
KR101338576B1 (en) Defect inspection device for inspecting defect by image analysis
US5097516A (en) Technique for illuminating a surface with a gradient intensity line of light to achieve enhanced two-dimensional imaging
EP0452905B1 (en) Method and apparatus for inspecting surface pattern of object
US6175645B1 (en) Optical inspection method and apparatus
AU699751B2 (en) Lumber defect scanning including multi-dimensional pattern recognition
KR101298957B1 (en) Wood knot detecting method, device, and program
US20010030744A1 (en) Method of simultaneously applying multiple illumination schemes for simultaneous image acquisition in an imaging system
EP0382466A2 (en) Methods and apparatus for optically determining the acceptability of products
US7355692B2 (en) System and method for inspecting electrical circuits utilizing reflective and fluorescent imagery
US6084663A (en) Method and an apparatus for inspection of a printed circuit board assembly
US6556291B2 (en) Defect inspection method and defect inspection apparatus
CA2153647A1 (en) Method and apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section
CA2226473A1 (en) Inspection system for exterior article surfaces
JP2710527B2 (en) Inspection equipment for periodic patterns
KR20230139166A (en) Inspection Method for Wood Product
JP3311880B2 (en) Automatic detection device for fruits and vegetables injury
JPH05215694A (en) Method and apparatus for inspecting defect of circuit pattern
JPH05188006A (en) Surface flaw detecting device
Affolder et al. Automated visual inspection and defect detection of large-scale silicon strip sensors
JPS60207980A (en) Method and device for fetching picture
JPH1185980A (en) Method and device for inspecting print quantity of printed metal plate
JPH10160676A (en) Rice grain inspection device
JP3449469B2 (en) Color / shape identification method and device
JP2000258348A (en) Defect inspection apparatus
CN117949470A (en) Multi-station transparent material edging corner defect detection system and method

Legal Events

Date Code Title Description
EEER Examination request
FZDE Dead