CN110018174B - Method and device for detecting object appearance

Method and device for detecting object appearance

Info

Publication number
CN110018174B
CN110018174B (application CN201910431015.6A)
Authority
CN
China
Prior art keywords
line
preset
appearance
image
length
Prior art date
Legal status
Active
Application number
CN201910431015.6A
Other languages
Chinese (zh)
Other versions
CN110018174A (en)
Inventor
林广栋
刘浩
Current Assignee
Hefei Lianbao Information Technology Co Ltd
Original Assignee
Hefei Lianbao Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hefei Lianbao Information Technology Co Ltd filed Critical Hefei Lianbao Information Technology Co Ltd
Priority to CN201910431015.6A priority Critical patent/CN110018174B/en
Priority to CN202110950345.3A priority patent/CN113588667B/en
Publication of CN110018174A publication Critical patent/CN110018174A/en
Application granted granted Critical
Publication of CN110018174B publication Critical patent/CN110018174B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges, based on image processing techniques
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application provides a method and a device for detecting the appearance of an object. The method and the device find the four edges of the B surface of a notebook computer by image processing and adjust the image of the B surface to a standard position, so that the influence of the opening and closing angle of the B surface on the detection result is reduced.

Description

Method and device for detecting object appearance
Technical Field
The present application relates to the field of computer vision, and in particular, to a method for detecting the appearance of an object and an apparatus for detecting the appearance of an object.
Background
Detecting the appearance of a product with an intelligent vision system (automated optical inspection, AOI) is a common inspection means. Inspecting the appearance of the B surface of a notebook computer with machine vision is an important way of ensuring the appearance quality of key components on the B surface, such as the brand logo and the camera.
At present, detection of the B surface of a notebook computer follows the conventional product appearance inspection scheme. Through a user interface on a computer, the user selects several preset positioning areas on a captured standard photograph as references. When a product on the production line is inspected, the actual positioning areas are searched for in the product image, and the position of the product image is corrected according to the relation between the actual positioning areas and the preset positioning areas. Each area to be inspected is then checked on the position-corrected product image.
The opening and closing angle of the B surface of a notebook computer is adjustable, so when a notebook computer passes the AOI inspection station of the production line this angle is not fixed. The actual positioning area in the product image may therefore deviate from the preset positioning area by some angle, and searching for the actual positioning area directly in the product image fails with high probability. As a result, the failure rate of AOI inspection of the A surface and the B surface is high on current notebook computer production lines, and in practice these inspection functions are often disabled.
Disclosure of Invention
The application provides a method for detecting the appearance of an object and a device for detecting the appearance of an object, to solve the problem of a high failure rate in AOI detection.
In order to solve the above technical problem, an embodiment of the present application provides the following technical solutions:
the application provides a method for detecting the appearance of an object, which comprises the following steps:
acquiring a plurality of pieces of first characteristic point information of a first contour line of a first object appearance image; wherein the first object appearance image comprises a first part image of a plurality of parts to be inspected;
acquiring a first conversion model according to the first characteristic point information and a corresponding preset reference point;
adjusting the first object appearance image into a second adjusted image according to the first conversion model; wherein the first part image is adjusted to a second part image in the second adjusted image;
sequentially acquiring similarity matching results of the second component image and the corresponding preset reference image;
judging whether all the similarity matching results meet preset qualified conditions or not;
and if so, judging that the appearance of the first object is qualified.
Optionally, before the obtaining of the information of the plurality of first feature points of the first contour line of the first object appearance image, the method further includes:
acquiring a plurality of first appearance lines of a first object appearance image according to a preset line model;
determining the first appearance lines according to a preset contour condition to obtain a first contour line;
and acquiring a plurality of pieces of first characteristic point information of the first contour line according to preset characteristic point conditions.
Optionally, the preset feature point condition includes: a condition for determining the intersection of two lines and/or a condition for determining the center point of a circular arc.
Optionally, the preset line model is a Hough straight-line fitting algorithm.
Optionally, the front view of the first object is rectangular;
the preset contour condition comprises: a preset left line condition for determining a left line of the first contour line, a preset right line condition for determining a right line of the first contour line, a preset upper line condition for determining an upper line of the first contour line, and/or a preset lower line condition for determining a lower line of the first contour line;
the preset left line condition comprises: the length of the first appearance line meets a preset first length threshold, the absolute value of the slope of the first appearance line meets a preset first slope threshold, and the midpoint of the first appearance line is located at the leftmost side of the first object appearance image;
the preset right line condition comprises: the length of the first appearance line meets a preset first length threshold, the absolute value of the slope of the first appearance line meets a preset first slope threshold, and the midpoint of the first appearance line is located at the rightmost side of the first object appearance image;
the preset upper line condition comprises: the length of the first appearance line meets a preset second length threshold, the absolute value of the slope of the first appearance line meets a preset second slope threshold, and the midpoint of the first appearance line is located at the uppermost side of the first object appearance image;
the preset lower line condition comprises: the length of the first appearance line meets a preset second length threshold, and the absolute value of the slope of the first appearance line meets a preset second slope threshold.
Optionally, the preset lower line condition further includes: a ratio of the first height to the first length is less than or equal to an actual height-to-length ratio of the rectangle of the first object;
wherein the first height is a length from a midpoint of the upper line to a midpoint of the first appearance line; the first length is a length from a midpoint of the left line to a midpoint of the right line.
Optionally, the preset upper line condition or the preset lower line condition further includes: the average color value of a preset first area above the first appearance line meets a preset first color threshold and the average color value of a preset second area below the first appearance line meets a preset second color threshold.
Optionally, the first feature point information is an intersection of the left line, the right line, the upper line, and the lower line.
Optionally, the first conversion model is a perspective transformation matrix.
The application provides a device for detecting object appearance, includes:
the characteristic point obtaining unit is used for obtaining a plurality of first characteristic point information of a first contour line of the first object appearance image; wherein the first object appearance image comprises a first part image of a plurality of parts to be inspected;
the conversion model obtaining unit is used for obtaining a first conversion model according to the first characteristic point information and a corresponding preset reference point;
an adjusting unit, configured to adjust the first object appearance image into a second adjusted image according to the first conversion model; wherein the first part image is adjusted to a second part image in the second adjusted image;
The matching unit is used for sequentially acquiring similarity matching results of the second component image and the corresponding preset reference image;
the judging unit is used for judging whether all the similarity matching results meet preset qualified conditions;
and a determining unit, configured to determine that the appearance of the first object is qualified if the output result of the judging unit is "yes".
Based on the disclosure of the above embodiments, it can be known that the embodiments of the present application have the following beneficial effects:
the application provides a method and a device for detecting the appearance of an object, wherein the method comprises the following steps: acquiring a plurality of pieces of first characteristic point information of a first contour line of a first object appearance image; wherein the first object appearance image comprises a first part image of a plurality of parts to be inspected; acquiring a first conversion model according to the first characteristic point information and a corresponding preset reference point; adjusting the first object appearance image into a second adjusted image according to the first conversion model; wherein the first part image is adjusted to a second part image in the second adjusted image; sequentially acquiring similarity matching results of the second component image and the corresponding preset reference image; respectively judging whether the similarity matching results meet preset qualified conditions; and if so, judging that the part to be detected associated with the second part image is qualified.
The method and the device find the four edges of the B surface of the notebook computer through an image processing method, and adjust the B surface of the notebook computer to a standard position, so that the influence of the opening and closing angle of the B surface of the notebook computer on a detection result is reduced.
Drawings
Fig. 1 is a flowchart of a method for detecting an appearance of an object according to an embodiment of the present application;
fig. 2 is a block diagram of units of an apparatus for detecting an appearance of an object according to an embodiment of the present application.
Detailed Description
Specific embodiments of the present application will be described in detail below with reference to the accompanying drawings, but the present application is not limited thereto.
It will be understood that various modifications may be made to the embodiments disclosed herein. Accordingly, the foregoing description should not be construed as limiting, but merely as exemplifications of embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the application.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the application and, together with a general description of the application given above and the detailed description of the embodiments given below, serve to explain the principles of the application.
These and other characteristics of the present application will become apparent from the following description of preferred forms of embodiment, given as non-limiting examples, with reference to the attached drawings.
It should also be understood that, although the present application has been described with reference to some specific examples, a person of skill in the art shall certainly be able to achieve many other equivalent forms of application, having the characteristics as set forth in the claims and hence all coming within the field of protection defined thereby.
The above and other aspects, features and advantages of the present application will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present application are described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the application, which can be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail to avoid obscuring the application of unnecessary or unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present application in virtually any appropriately detailed structure.
The specification may use the phrases "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the application.
The present application provides a first embodiment, namely an embodiment of a method for detecting the appearance of an object.
The present embodiment is described in detail below with reference to fig. 1, where fig. 1 is a flowchart of a method for detecting an appearance of an object according to an embodiment of the present application.
Step S101, acquiring a plurality of first feature point information of a first contour line of a first object appearance image; wherein the first object appearance image comprises a first part image of a plurality of parts to be inspected.
The first contour line refers to an edge line, that is, a peripheral line of an object or an outer frame line of a figure.
The first feature point information describes points on the first contour line that represent the appearance contour of the object, for example the intersection of two straight lines.
The purpose of this embodiment is to detect the appearance of a product automatically, for example to check whether the brand logo on the B surface of a notebook computer is in the correct position and whether the camera is installed without misalignment. Here, the parts to be detected are the logo, the camera, and the like. However, because the opening and closing angle of the B surface of the notebook computer is not fixed relative to the camera that acquires the first object appearance image, a certain uncertainty is introduced into the automatic detection.
And step S102, acquiring a first conversion model according to the first characteristic point information and a corresponding preset reference point.
The preset reference points are the feature points on the contour line of a standard appearance image that shows the components correctly installed, and they correspond to the first feature point information. The purpose of setting the standard appearance image and the preset reference points is to adjust the first object appearance image to the position of the standard appearance image, so that the image of each part to be detected in the first object appearance image can be compared with its preset reference image.
The position of the preset reference point in the image is related to the size of the B surface of the notebook computer.
Optionally, the first conversion model is a perspective transformation matrix.
For example, the B surface of the notebook computer is rectangular, and the first feature point information consists of the 4 vertices of that rectangle; according to the positions of the 4 vertices in the B surface appearance image of the notebook computer and the positions of the 4 preset reference points, the perspective transformation matrix that maps the 4 vertices of the B surface appearance image onto the preset reference points is obtained by perspective transformation in computer vision.
Step S103, adjusting the first object appearance image into a second adjusted image according to the first conversion model; wherein the first part image is adjusted to a second part image in the second adjusted image.
For example, continuing the above example, the B surface appearance image of the notebook computer is adjusted to the standard image position by the perspective transformation matrix; the logo image and the camera image contained in the B surface appearance image are adjusted at the same time.
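For illustration only, a minimal OpenCV sketch of steps S102 and S103 follows; this is not the patented implementation, and the corner coordinates, output size and file names are assumed values:

    import cv2
    import numpy as np

    # Hypothetical values: the 4 vertices of the B surface found in the captured image
    # (top-left, top-right, bottom-right, bottom-left) and the 4 preset reference points
    # describing the standard image position.
    detected_corners = np.float32([[118, 86], [1852, 102], [1838, 1154], [125, 1147]])
    reference_corners = np.float32([[100, 100], [1860, 100], [1860, 1180], [100, 1180]])

    image = cv2.imread("b_surface.png")                        # first object appearance image
    M = cv2.getPerspectiveTransform(detected_corners, reference_corners)  # first conversion model
    adjusted = cv2.warpPerspective(image, M, (1960, 1280))     # second adjusted image
    cv2.imwrite("b_surface_adjusted.png", adjusted)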
And step S104, sequentially acquiring similarity matching results of the second component image and the corresponding preset reference image.
The preset reference image is a standard image used to determine whether a second component image is correct. The first object appearance image contains several images of parts to be detected, and each of them has a corresponding preset reference image.
For example, continuing with the above example, the positions and sizes of the second component images (e.g., the logo image and the camera image) are stored in the production management system; with this information a crop of each detection area can be obtained and then compared with the corresponding preset reference image to determine whether the component is qualified.
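As an illustrative sketch only: the region coordinates, file names and the 0.8 threshold are assumptions, and normalized cross-correlation is just one possible similarity measure; the method only requires some similarity matching result.

    import cv2

    def region_matches(adjusted_img, region, reference_img, threshold=0.8):
        # Crop the detection area (x, y, width, height) from the adjusted image and
        # compare it with the preset reference image by normalized cross-correlation.
        # The reference image must not be larger than the cropped area.
        x, y, w, h = region
        crop = adjusted_img[y:y + h, x:x + w]
        result = cv2.matchTemplate(crop, reference_img, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, _ = cv2.minMaxLoc(result)
        return max_val >= threshold

    adjusted = cv2.imread("b_surface_adjusted.png")
    logo_ref = cv2.imread("logo_reference.png")            # preset reference image
    logo_ok = region_matches(adjusted, (860, 40, 200, 120), logo_ref)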
And step S105, judging whether all the similarity matching results meet preset qualified conditions.
And step S106, if yes, judging that the appearance of the first object is qualified.
When judging the similarity matching results, each result may be checked against the preset qualified condition one by one; if a result satisfies the condition, the part to be detected associated with that second part image is judged to be qualified.
And when all the parts to be detected in the appearance of the first object are qualified, the appearance of the first object is qualified.
Before the obtaining of the information of the plurality of first feature points of the first contour line of the first object appearance image, the method further includes:
step S100-1, a plurality of first appearance lines of the first object appearance image are obtained according to a preset line model.
The first appearance lines are all the identifiable lines in the first object appearance image, including the first contour line.
The preset line model is a Hough straight-line fitting algorithm.
All first appearance lines in the first object appearance image can be found with the Hough straight-line fitting algorithm from computer vision; the first contour line then has to be selected from among all the first appearance lines.
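A minimal OpenCV sketch of this step is given below for illustration; the Canny and Hough parameters are assumed values, not values taken from the patent:

    import cv2
    import numpy as np

    image = cv2.imread("b_surface.png")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)

    # Probabilistic Hough transform; each detected segment is returned as (x1, y1, x2, y2).
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                               minLineLength=200, maxLineGap=10)
    appearance_lines = [tuple(s[0]) for s in segments] if segments is not None else []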
And S100-2, determining the first appearance line according to a preset contour condition to obtain a first contour line.
And S100-3, acquiring a plurality of pieces of first characteristic point information of the first contour line according to preset characteristic point conditions.
The outline of an object is mainly composed of lines, both straight lines and curves, and a curve can be described by circular arcs. The key to determining the object contour is therefore the intersections of two lines and the center points of circular arcs; capturing these points outlines the object.
Therefore, the preset feature point condition includes: a condition for determining the intersection of two lines and/or a condition for determining the center point of a circular arc.
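For the intersection condition, a small sketch using standard line-line intersection algebra is shown below; line_intersection is a hypothetical helper, not text from the patent, and each line is given by two endpoints:

    def line_intersection(l1, l2):
        # Intersection of two infinite lines, each given as (x1, y1, x2, y2).
        # Returns None when the lines are (nearly) parallel.
        x1, y1, x2, y2 = l1
        x3, y3, x4, y4 = l2
        denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        if abs(denom) < 1e-9:
            return None
        px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / denom
        py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / denom
        return px, py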
For the method of detecting the appearance of an object, the present embodiment provides a first application scenario. The first object has a rectangular front view.
The preset contour condition comprises: a preset left line condition for determining a left line of the first contour line, a preset right line condition for determining a right line of the first contour line, a preset upper line condition for determining an upper line of the first contour line, and/or a preset lower line condition for determining a lower line of the first contour line.
The preset left line condition comprises: the length of the first appearance line meets a preset first length threshold, the absolute value of the slope of the first appearance line meets a preset first slope threshold, and the midpoint of the first appearance line is located at the leftmost side of the first object appearance image.
The preset right line condition comprises: the length of the first appearance line meets a preset first length threshold, the absolute value of the slope of the first appearance line meets a preset first slope threshold, and the midpoint of the first appearance line is located at the rightmost side of the first object appearance image.
The preset upper line condition comprises: the length of the first appearance line meets a preset second length threshold, the absolute value of the slope of the first appearance line meets a preset second slope threshold, and the midpoint of the first appearance line is located at the uppermost side of the first object appearance image.
The preset lower line condition comprises: the length of the first appearance line meets a preset second length threshold, and the absolute value of the slope of the first appearance line meets a preset second slope threshold.
Optionally, the preset lower line condition further includes that the ratio of the first height to the first length is less than or equal to the actual height-to-length ratio of the rectangle of the first object, where the first height is the length from the midpoint of the upper line to the midpoint of the first appearance line, and the first length is the length from the midpoint of the left line to the midpoint of the right line.
For example, the search for the bottom line of the B surface of the notebook computer is disturbed by the lines on the keyboard: the space bar alone fits 2 straight lines, so the search is easily misled by straight lines formed on the keyboard. When the B surface of the notebook computer is exactly perpendicular to the camera, the height-to-width ratio in the real image of the B surface equals the theoretical height-to-width ratio of the B surface. In practice, however, the angle between the B surface and the camera cannot be exactly 90 degrees, so the B surface image formed in the camera is somewhat tilted and its height-to-width ratio before alignment is smaller than the actual ratio. If the height-to-width ratio obtained with a fitted straight line is larger than the theoretical ratio of the B surface, that fitted straight line must be a line on the keyboard rather than the bottom edge. With this constraint, part of the interference caused by the straight lines on the keyboard can be eliminated.
Optionally, the preset upper line condition or the preset lower line condition further includes: the average color value of a preset first area above the first appearance line meets a preset first color threshold and the average color value of a preset second area below the first appearance line meets a preset second color threshold.
For example, the lower line of the B surface of the notebook computer adjoins the hinge of the system side and is therefore easily confused with other lines. Because the area below the lower line is dark and the area above it is bright, judging the colors above and below a candidate line identifies the lower line and eliminates the interfering lines.
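A minimal sketch of this color check, assuming the candidate lower line is roughly horizontal and that gray is a grayscale image (e.g., from cv2.cvtColor); the band height and the brightness thresholds are assumed values:

    def lower_line_color_ok(gray, line, band=20, bright_min=120, dark_max=60):
        # line = (x1, y1, x2, y2), assumed roughly horizontal.
        x1, y1, x2, y2 = line
        xa, xb = sorted((x1, x2))
        y = int(round((y1 + y2) / 2))
        above = gray[max(y - band, 0):y, xa:xb]   # preset first area (should be bright)
        below = gray[y:y + band, xa:xb]           # preset second area (should be dark)
        if above.size == 0 or below.size == 0:
            return False
        return above.mean() >= bright_min and below.mean() <= dark_max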
Optionally, the first feature point information is an intersection of the left line, the right line, the upper line, and the lower line.
Optionally, the first slope threshold is greater than 2. Optionally, the second slope threshold is less than 0.3.
The larger the absolute value of the slope of the line, the more vertical the line is in the planar rectangular coordinate system. Conversely, the smaller the absolute value of the slope of the line, the more horizontal the line is in the planar rectangular coordinate system. Therefore, the purpose of finding a straight line having an absolute value of slope greater than 2 is to find a vertical line. For example, because the position of the notebook computer is not fixed when the notebook computer passes through the detection station, the opening and closing angle of the B surface of the notebook computer is not fixed, and the straight line formed by the two sides of the B surface of the notebook computer is not completely vertical.
Therefore, a threshold is set: when the absolute value of the slope of a straight line is larger than the threshold, the line is considered to be one of the lines formed by the two sides of the B surface of the notebook computer.
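Putting the above conditions together, the following sketch shows one way the candidate segments could be classified into the left, right, upper and lower contour lines; all values (the length thresholds, the slope thresholds 2 and 0.3, and the theoretical height-to-length ratio) are illustrative assumptions, not values prescribed by the patent:

    def classify_contour_lines(lines, len_v=300, len_h=500,
                               slope_v=2.0, slope_h=0.3, ratio_max=0.68):
        def slope(l):
            x1, y1, x2, y2 = l
            return abs((y2 - y1) / (x2 - x1)) if x2 != x1 else float("inf")

        def length(l):
            x1, y1, x2, y2 = l
            return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

        def midpoint(l):
            x1, y1, x2, y2 = l
            return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

        # Near-vertical candidates for the left/right lines, near-horizontal for the upper/lower lines.
        vertical = [l for l in lines if length(l) >= len_v and slope(l) >= slope_v]
        horizontal = [l for l in lines if length(l) >= len_h and slope(l) <= slope_h]
        if not vertical or not horizontal:
            return None

        left = min(vertical, key=lambda l: midpoint(l)[0])
        right = max(vertical, key=lambda l: midpoint(l)[0])
        upper = min(horizontal, key=lambda l: midpoint(l)[1])

        # First length: distance between the midpoints of the left and right lines.
        first_len = abs(midpoint(right)[0] - midpoint(left)[0])
        if first_len == 0:
            return None

        # Lower-line candidates must lie below the upper line and keep the height-to-length
        # ratio within the theoretical ratio of the B surface, which rejects keyboard lines.
        candidates = [l for l in horizontal
                      if 0 < midpoint(l)[1] - midpoint(upper)[1]
                      and (midpoint(l)[1] - midpoint(upper)[1]) / first_len <= ratio_max]
        if not candidates:
            return None
        lower = max(candidates, key=lambda l: midpoint(l)[1])
        return left, right, upper, lower

The four returned lines can then be intersected pairwise, for example with the line_intersection helper sketched earlier, to obtain the 4 vertices used as first feature point information.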
In the embodiment, the four edges of the B surface of the notebook computer are found by the image processing method, and the B surface of the notebook computer is adjusted to the standard position, so that the influence of the opening and closing angle of the B surface of the notebook computer on the detection result is reduced.
Corresponding to the first embodiment provided by the application, the application also provides a second embodiment, namely an apparatus for detecting the appearance of an object. Since the second embodiment is basically similar to the first embodiment, the description is simple, and the relevant portions should be referred to the corresponding description of the first embodiment. The device embodiments described below are merely illustrative.
Fig. 2 illustrates an embodiment of an apparatus for detecting an appearance of an object provided in the present application. Fig. 2 is a block diagram of units of an apparatus for detecting an appearance of an object according to an embodiment of the present application.
Referring to fig. 2, the present application provides an apparatus for detecting an appearance of an object, including:
an obtaining feature point unit 201, configured to obtain a plurality of pieces of first feature point information of a first contour line of an appearance image of a first object; wherein the first object appearance image comprises a first part image of a plurality of parts to be inspected;
an obtaining conversion model unit 202, configured to obtain a first conversion model according to the first feature point information and a corresponding preset reference point;
an adjusting unit 203, configured to adjust the first object appearance image into a second adjusted image according to the first conversion model; wherein the first part image is adjusted to a second part image in the second adjusted image;
A matching unit 204, configured to sequentially obtain similarity matching results between the second component image and corresponding preset reference images;
a determining unit 205, configured to determine whether all the similarity matching results satisfy a preset qualified condition;
a determination unit 206, configured to determine that the appearance of the first object is qualified if the output result of the determining unit 205 is "yes".
In the apparatus, further comprising: the preprocessing unit is used for acquiring a plurality of pieces of first characteristic point information according to the first object appearance image;
in the preprocessing unit, comprising:
the acquiring unit is used for acquiring a plurality of first appearance lines of the first object appearance image according to a preset line model;
the first contour line obtaining subunit is used for determining the first appearance line according to a preset contour condition to obtain a first contour line;
and the acquiring first feature point information subunit is used for acquiring a plurality of pieces of first feature point information of the first contour line according to preset feature point conditions.
Optionally, the preset feature point condition includes: a condition for determining the intersection of two lines and/or a condition for determining the center point of a circular arc.
Optionally, the preset line model is a Hough straight-line fitting algorithm.
Optionally, the front view of the first object is rectangular;
the preset contour condition comprises: a preset left line condition for determining a left line of the first contour line, a preset right line condition for determining a right line of the first contour line, a preset upper line condition for determining an upper line of the first contour line, and/or a preset lower line condition for determining a lower line of the first contour line;
the preset left line condition comprises: the length of the first appearance line meets a preset first length threshold, the absolute value of the slope of the first appearance line meets a preset first slope threshold, and the midpoint of the first appearance line is located at the leftmost side of the first object appearance image;
the preset right line condition comprises: the length of the first appearance line meets a preset first length threshold, the absolute value of the slope of the first appearance line meets a preset first slope threshold, and the midpoint of the first appearance line is located at the rightmost side of the first object appearance image;
the preset upper line condition comprises: the length of the first appearance line meets a preset second length threshold, the absolute value of the slope of the first appearance line meets a preset second slope threshold, and the midpoint of the first appearance line is located at the uppermost side of the first object appearance image;
the preset lower line condition comprises: the length of the first appearance line meets a preset second length threshold, and the absolute value of the slope of the first appearance line meets a preset second slope threshold.
Optionally, the preset lower line condition further includes: a ratio of the first height to the first length is less than or equal to an actual height-to-length ratio of the rectangle of the first object;
wherein the first height is a length from a midpoint of the upper line to a midpoint of the first appearance line; the first length is a length from a midpoint of the left line to a midpoint of the right line.
Optionally, the preset upper line condition or the preset lower line condition further includes: the average color value of a preset first area above the first appearance line meets a preset first color threshold and the average color value of a preset second area below the first appearance line meets a preset second color threshold.
Optionally, the first feature point information is an intersection of the left line, the right line, the upper line, and the lower line.
Optionally, the first conversion model is a perspective transformation matrix.
In the embodiment, the four edges of the B surface of the notebook computer are found by the image processing method, and the B surface of the notebook computer is adjusted to the standard position, so that the influence of the opening and closing angle of the B surface of the notebook computer on the detection result is reduced.
The above embodiments are only exemplary embodiments of the present application, and are not intended to limit the present application, and the protection scope of the present application is defined by the claims. Various modifications and equivalents may be made by those skilled in the art within the spirit and scope of the present application and such modifications and equivalents should also be considered to be within the scope of the present application.

Claims (8)

1. A method of detecting the appearance of an object, comprising:
acquiring a plurality of pieces of first characteristic point information of a first contour line of a first object appearance image; wherein the first object appearance image comprises a first part image of a plurality of parts to be inspected;
acquiring a first conversion model according to the first characteristic point information and a corresponding preset reference point;
adjusting the first object appearance image into a second adjusted image according to the first conversion model; wherein the first part image is adjusted to a second part image in the second adjusted image;
sequentially acquiring similarity matching results of the second component image and the corresponding preset reference image;
judging whether all the similarity matching results meet preset qualified conditions or not;
if yes, judging that the appearance of the first object is qualified;
before the obtaining of the information of the plurality of first feature points of the first contour line of the first object appearance image, the method further includes:
acquiring a plurality of first appearance lines of a first object appearance image according to a preset line model;
determining the plurality of first appearance lines according to a preset contour condition to obtain the first contour line;
acquiring a plurality of pieces of first characteristic point information of the first contour line according to preset characteristic point conditions;
the front view of the first object is rectangular; the preset contour condition comprises: a preset left line condition for determining a left line of the first contour line, a preset right line condition for determining a right line of the first contour line, a preset upper line condition for determining an upper line of the first contour line and/or a preset lower line condition for determining a lower line of the first contour line; the preset left line condition comprises: the length of the left first appearance line meets a preset first length threshold, the absolute value of the slope of the left first appearance line meets a preset first slope threshold, and the midpoint of the left first appearance line is located at the leftmost side of the first object appearance image; the preset right line condition comprises: the length of the right first appearance line meets a preset first length threshold, the absolute value of the slope of the right first appearance line meets a preset first slope threshold, and the midpoint of the right first appearance line is located at the rightmost side of the first object appearance image; the preset upper line condition comprises: the length of the upper first appearance line meets a preset second length threshold, the absolute value of the slope of the upper first appearance line meets a preset second slope threshold, and the midpoint of the upper first appearance line is located at the uppermost side of the first object appearance image; the preset lower line condition comprises: the length of the lower first appearance line meets a preset second length threshold, and the absolute value of the slope of the lower first appearance line meets a preset second slope threshold.
2. The method of claim 1, wherein the preset feature point condition comprises: a condition for determining the intersection of two lines and/or a condition for determining the center point of a circular arc.
3. The method of claim 1, wherein the preset line model is a Hough straight-line fitting algorithm.
4. The method of claim 1, wherein the preset lower line condition further comprises: a ratio of the first height to the first length is less than or equal to an actual height-to-length ratio of a rectangle of the first object;
wherein the first height is a length from a midpoint of the upper line to a midpoint of the lower line; the first length is a length from a midpoint of the left line to a midpoint of the right line.
5. The method of claim 1, wherein the preset top line condition or the preset bottom line condition further comprises: the average color value of a preset first area above the upper first appearance line meets a preset first color threshold and the average color value of a preset second area below the lower first appearance line meets a preset second color threshold.
6. The method according to any one of claims 1 to 5, wherein the first feature point information is an intersection of the left line, the right line, the upper line, and the lower line.
7. The method of claim 1, wherein the first transformation model is a perspective transformation matrix.
8. An apparatus for inspecting the appearance of an object using the method of any of claims 1-7, comprising:
the characteristic point obtaining unit is used for obtaining a plurality of first characteristic point information of a first contour line of the first object appearance image; wherein the first object appearance image comprises a first part image of a plurality of parts to be inspected;
the conversion model obtaining unit is used for obtaining a first conversion model according to the first characteristic point information and a corresponding preset reference point;
an adjusting unit, configured to adjust the first object appearance image into a second adjusted image according to the first conversion model; wherein the first part image is adjusted to a second part image in the second adjusted image;
The matching unit is used for sequentially acquiring similarity matching results of the second component image and the corresponding preset reference image;
the judging unit is used for judging whether all the similarity matching results meet preset qualified conditions;
and a determining unit, configured to determine that the appearance of the first object is qualified if the output result of the judging unit is "yes".
CN201910431015.6A 2019-05-22 2019-05-22 Method and device for detecting object appearance Active CN110018174B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910431015.6A CN110018174B (en) 2019-05-22 2019-05-22 Method and device for detecting object appearance
CN202110950345.3A CN113588667B (en) 2019-05-22 2019-05-22 Method and device for detecting appearance of object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910431015.6A CN110018174B (en) 2019-05-22 2019-05-22 Method and device for detecting object appearance

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110950345.3A Division CN113588667B (en) 2019-05-22 2019-05-22 Method and device for detecting appearance of object

Publications (2)

Publication Number Publication Date
CN110018174A (en) 2019-07-16
CN110018174B (en) 2021-10-22

Family

ID=67194316

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910431015.6A Active CN110018174B (en) 2019-05-22 2019-05-22 Method and device for detecting object appearance
CN202110950345.3A Active CN113588667B (en) 2019-05-22 2019-05-22 Method and device for detecting appearance of object

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110950345.3A Active CN113588667B (en) 2019-05-22 2019-05-22 Method and device for detecting appearance of object

Country Status (1)

Country Link
CN (2) CN110018174B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111192250B (en) * 2019-12-30 2022-02-08 合肥联宝信息技术有限公司 Computer B-side frame detection method and device, computer storage medium and computer

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1619574A (en) * 2003-10-07 2005-05-25 索尼株式会社 Image matching method, program, and image matching system
CN102622614A (en) * 2012-02-24 2012-08-01 山东鲁能智能技术有限公司 Knife switch closing reliability judging method based on distance between knife switch arm feature point and fixing end
CN103020945A (en) * 2011-09-21 2013-04-03 中国科学院电子学研究所 Remote sensing image registration method of multi-source sensor
CN103052971A (en) * 2010-07-30 2013-04-17 松下电器产业株式会社 Detection device and method for transition area in space
CN103175843A (en) * 2011-12-20 2013-06-26 西安扩力机电科技有限公司 Product quality inspection instrument based on image processing
KR101364046B1 (en) * 2012-11-05 2014-02-19 재단법인대구경북과학기술원 Method and apparatus for object tracking in video sequences
CN103605979A (en) * 2013-12-03 2014-02-26 苏州大学张家港工业技术研究院 Object identification method and system based on shape fragments
CN103916588A (en) * 2012-12-28 2014-07-09 三星电子株式会社 Image transformation apparatus and method
CN104237249A (en) * 2014-09-11 2014-12-24 苏州佳祺仕信息科技有限公司 Tag appearance inspection and detection technological method
CN104915957A (en) * 2015-05-29 2015-09-16 何再兴 Matching rectification method for improving three dimensional visual sense identification precision of industrial robot
CN106327483A (en) * 2016-08-12 2017-01-11 广州视源电子科技股份有限公司 Method, system and device for attaching logo of detection equipment
CN106501271A (en) * 2016-11-24 2017-03-15 深圳市博视科技有限公司 product appearance detection method
CN109738450A (en) * 2019-01-09 2019-05-10 合肥联宝信息技术有限公司 The detection method and device of keyboard of notebook computer

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2798349B2 (en) * 1993-11-08 1998-09-17 松下電器産業株式会社 Vehicle position detection device
KR100819614B1 (en) * 2006-06-20 2008-04-04 호서대학교 산학협력단 Method for Generating Image for Testing Flat Pannel of Displaying Device
DE102008031942A1 (en) * 2008-07-07 2010-01-14 Steinbichler Optotechnik Gmbh Method and device for 3D digitizing an object
ES2415240B1 (en) * 2011-12-21 2014-05-21 Abengoa Solar New Technologies, S.A. METHOD FOR AUTOMATED INSPECTION OF PHOTOVOLTAIC SOLAR COLLECTORS INSTALLED IN PLANTS.
CN105096299B (en) * 2014-05-08 2019-02-26 北京大学 Polygon detecting method and polygon detecting device
KR101705762B1 (en) * 2015-09-02 2017-02-14 주식회사 미르기술 Method for Correcting tilt of 3D shape measuring device
KR101761641B1 (en) * 2015-10-20 2017-08-08 주식회사 셀바스에이아이 Device and method for obtaining edge line by detecting outline
CN105654097B (en) * 2015-12-29 2019-04-16 上海珍岛信息技术有限公司 The detection method of quadrangle marker in image
CN107607542A (en) * 2017-08-31 2018-01-19 苏州诺维博得智能装备科技有限公司 notebook appearance quality detection method and device
CN107561087A (en) * 2017-08-31 2018-01-09 广东工业大学 A kind of mouse logo positioning and defect inspection method based on machine vision

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1619574A (en) * 2003-10-07 2005-05-25 索尼株式会社 Image matching method, program, and image matching system
CN103052971A (en) * 2010-07-30 2013-04-17 松下电器产业株式会社 Detection device and method for transition area in space
CN103020945A (en) * 2011-09-21 2013-04-03 中国科学院电子学研究所 Remote sensing image registration method of multi-source sensor
CN103175843A (en) * 2011-12-20 2013-06-26 西安扩力机电科技有限公司 Product quality inspection instrument based on image processing
CN102622614A (en) * 2012-02-24 2012-08-01 山东鲁能智能技术有限公司 Knife switch closing reliability judging method based on distance between knife switch arm feature point and fixing end
KR101364046B1 (en) * 2012-11-05 2014-02-19 재단법인대구경북과학기술원 Method and apparatus for object tracking in video sequences
CN103916588A (en) * 2012-12-28 2014-07-09 三星电子株式会社 Image transformation apparatus and method
CN103605979A (en) * 2013-12-03 2014-02-26 苏州大学张家港工业技术研究院 Object identification method and system based on shape fragments
CN104237249A (en) * 2014-09-11 2014-12-24 苏州佳祺仕信息科技有限公司 Tag appearance inspection and detection technological method
CN104915957A (en) * 2015-05-29 2015-09-16 何再兴 Matching rectification method for improving three dimensional visual sense identification precision of industrial robot
CN106327483A (en) * 2016-08-12 2017-01-11 广州视源电子科技股份有限公司 Method, system and device for attaching logo of detection equipment
CN106501271A (en) * 2016-11-24 2017-03-15 深圳市博视科技有限公司 product appearance detection method
CN109738450A (en) * 2019-01-09 2019-05-10 合肥联宝信息技术有限公司 The detection method and device of keyboard of notebook computer

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Direct Visual Odometry Using Lines for a Monocular Camera";Yingge Wang 等;《Journal of Physics: Conference Series》;20190501;第1229卷(第1期);第1-6页 *
"弱纹理环境下基于线条的图像位姿恢复";刘坤 等;《信息技术》;20190430;第43卷(第4期);第128-130+134页 *

Also Published As

Publication number Publication date
CN113588667B (en) 2024-06-14
CN110018174A (en) 2019-07-16
CN113588667A (en) 2021-11-02

Similar Documents

Publication Publication Date Title
CN109409374B (en) Joint-based same-batch test paper answer area cutting method
US8416314B2 (en) Method and system for processing images
US20140193034A1 (en) Object detection device, object detection method and object detection program
CN103810478A (en) Sitting posture detection method and device
CN105067638A (en) Tire fetal-membrane surface character defect detection method based on machine vision
CN104168478B (en) Based on the video image color cast detection method of Lab space and relevance function
CN109738450B (en) Method and device for detecting notebook keyboard
CN103925893B (en) Quality detection method of battery cells
CN108615030B (en) Title consistency detection method and device and electronic equipment
CN106709952B (en) A kind of automatic calibration method of display screen
WO2018006566A1 (en) View adjustment method and system
CN117252868B (en) Direct current screen defect detection method based on machine vision
CN106651837A (en) White glass plate surface edge breakage defect detecting method
CN110018174B (en) Method and device for detecting object appearance
CN111861979A (en) Positioning method, positioning equipment and computer readable storage medium
CN111027517A (en) Sitting posture correction reminding system and method based on vision and application
CN116337412A (en) Screen detection method, device and storage medium
CN113569859B (en) Image processing method and device, electronic equipment and storage medium
CN109840453B (en) Face matching method and device
CN112419225B (en) SOP type chip detection method and system based on pin segmentation
CN108596981B (en) Aerial view angle re-projection method and device of image and portable terminal
CN110610163A (en) Table extraction method and tool based on ellipse fitting in natural scene
TW201328312A (en) Image processing method and device for redeye correction
US8345965B2 (en) Method and apparatus for positioning edges of photograph
CN113610091A (en) Intelligent identification method and device for air switch state and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant