CN111783801B - Object contour extraction method and system and object contour prediction method and system - Google Patents


Info

Publication number
CN111783801B
Authority
CN
China
Prior art keywords
contour
matrix
pixel point
contours
frame
Prior art date
Legal status
Active
Application number
CN202010690822.2A
Other languages
Chinese (zh)
Other versions
CN111783801A (en)
Inventor
蒋晨晓
陆永健
徐杰
Current Assignee
Shanghai Bwave Technology Co ltd
Original Assignee
Shanghai Bwave Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Bwave Technology Co ltd
Priority to CN202010690822.2A
Publication of CN111783801A
Application granted
Publication of CN111783801B
Status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/30 - Noise filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an object contour extraction method, which comprises the steps of: acquiring the absolute temperature of an object to form a thermodynamic diagram; linearly constraining the thermodynamic diagram temperatures to a first preset real number range to form a thermodynamic diagram real matrix; converting the thermodynamic diagram real matrix into a gray map matrix; converting the gray map matrix into a binary gray map matrix of a second preset real number range; filtering noise points of the binary gray map matrix; converting the binary gray map matrix into a binary map of the first preset real number range; extracting all contours in the binary map and simultaneously generating a contour hierarchy table, wherein the contour hierarchy table represents the inclusion relations among the contours of the image; and eliminating all inner contours through the contour hierarchy table to obtain the measured peripheral contour set of one frame. The invention also discloses an object contour extraction system, an object contour prediction method and an object contour prediction system. The invention can accurately extract or predict all contours in a blurred thermal image and improves the precision of contour information acquisition in blurred thermal images.

Description

Object contour extraction method and system and object contour prediction method and system
Technical Field
The invention relates to the field of image processing and analysis, in particular to an object contour extraction method based on a temperature image and an object contour extraction system based on the temperature image. The invention also relates to an object contour prediction method or system based on the object contour extraction method or system.
Background
Current infrared temperature sensors come in two forms. The first collects a single temperature value and is typically used in simple hand-held body temperature detectors and basic equipment temperature checks; a specific temperature is analyzed after the reading is acquired. The second is a high-end sensor that outputs a temperature matrix; common applications include personnel detection, fire detection, building automation, light control and monitoring systems. In such applications, the temperature distribution of an object is visualized by fusing the temperature image with image data acquired by a conventional camera, which is widely used for thermodynamic diagram analysis of industrial equipment. The conventional way of acquiring environmental information is to capture an image with a camera and then process that image to obtain or output information; however, the image obtained from the camera is essentially an RGB color-gamut image, so a temperature sensor is still required to acquire a temperature image, and the temperature distribution of the scene is then obtained by combining the two.
A common feature of the above applications is that the required output is itself a temperature, so the processing problems they address are narrow; at present there is no related technology for directly and accurately extracting contour information from a blurred thermodynamic diagram.
Disclosure of Invention
This summary introduces a selection of concepts in simplified form that are described in further detail in the detailed description. This summary is not intended to identify the key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The invention aims to provide an object contour extraction method capable of accurately extracting an object contour in a blurred temperature image (thermodynamic diagram).
Another technical problem to be solved by the present invention is to provide an object contour extraction system capable of accurately extracting an object contour in a blurred temperature image (thermodynamic diagram).
Another technical problem to be solved by the present invention is to provide a prediction method capable of accurately predicting the position of an object contour in a blurred temperature image (thermodynamic diagram).
Still another object of the present invention is to provide a prediction system that can accurately predict the position of an object contour in a blurred temperature image (thermodynamic diagram).
The blurred temperature image (thermodynamic diagram) includes, but is not limited to, a temperature image obtained with an infrared temperature sensor.
In order to solve the technical problems, the object contour extraction method provided by the invention comprises the following steps:
S1, acquiring an absolute temperature of an object to form a thermodynamic diagram;
S2, linearly constraining the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix;
S3, converting the thermodynamic real image matrix into a gray image matrix;
S4, converting the gray scale image matrix into a binary gray scale image matrix of a second preset real number range;
S5, filtering noise points of the binary gray scale image matrix;
S6, converting the binary gray scale map matrix into a binary map of the first preset real number range;
S7, extracting all contours in the binary image and simultaneously generating a contour hierarchy table, wherein the contour hierarchy table is used for representing the inclusion relations among the contours of the image;
S8, eliminating all inner contours through the contour hierarchy table to obtain the measured peripheral contour set of one frame.
Alternatively, the first preset real number range is 0-1, and the second preset real number range is 0-255.
Alternatively, the hierarchy table uses an integer based on 2, indexed by an integer starting from 2.
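For illustration only, the following Python sketch (using NumPy, which the patent does not mention) shows one possible reading of steps S2-S6 with the 0-1 and 0-255 ranges given above; the temperature bounds and the binarization threshold are assumed values, not taken from the patent.

```python
import numpy as np

def thermal_to_binary_map(temps, t_min=20.0, t_max=40.0, thresh=0.5):
    """Minimal sketch of steps S2-S6 on a 2-D array of absolute temperatures
    (the thermodynamic diagram from S1). t_min, t_max and thresh are
    illustrative assumptions, not values stated in the patent."""
    # S2: linearly constrain temperatures to the first preset real range 0-1.
    real_matrix = np.clip((temps - t_min) / (t_max - t_min), 0.0, 1.0)

    # S3: treat the 0-1 real matrix as the gray map matrix (assumed reading).
    gray = real_matrix

    # S4: convert to a binary gray map matrix in the second preset range 0-255,
    #     i.e. every pixel becomes either 0 or 255.
    binary_gray = np.where(gray > thresh, 255, 0).astype(np.uint8)

    # S5 (noise filtering) would be applied to binary_gray here; see the
    # separate filtering sketch below.

    # S6: convert the binary gray map matrix back to a 0/1 binary map.
    binary_map = (binary_gray // 255).astype(np.uint8)
    return binary_map
```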
Optionally, when step S5 is implemented, the following sub-steps are adopted to filter noise points;
S5.1, sequentially calculating the sum of gray values of 8 neighborhood pixel points around all pixel points except the object frame;
S5.2, if the sum of the obtained gray values is smaller than a preset threshold value, the pixel point is set to be 1, otherwise, the pixel point is set to be 0.
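A minimal Python sketch of the S5.1-S5.2 rule as literally stated follows; the threshold value is an assumption, and whether 1 denotes a retained or a suppressed pixel depends on the convention of the original text.

```python
import numpy as np

def filter_noise(binary_gray, noise_thresh):
    """Literal sketch of S5.1-S5.2 on a 0/255 binary gray map matrix."""
    h, w = binary_gray.shape
    out = binary_gray.copy()
    # S5.1: visit every pixel except the one-pixel frame of the image.
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Sum of the 8 neighbours' gray values (exclude the centre pixel).
            window = binary_gray[y - 1:y + 2, x - 1:x + 2]
            neighbour_sum = int(window.sum()) - int(binary_gray[y, x])
            # S5.2: compare against the preset threshold, following the
            # patent's stated 1/0 assignment.
            out[y, x] = 1 if neighbour_sum < noise_thresh else 0
    return out
```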
Alternatively, when step S7 is performed, the following substeps are adopted to extract the contours;
S7.1, filling black pixel points to form the peripheral boundary of a frame, wherein the filling width is one pixel;
S7.2, searching for the first white pixel point, from top to bottom and from left to right, whose left neighbor is a black pixel point; the first white pixel point found is a point on a contour and is given a preset mark;
S7.3, starting from the first white pixel point, searching for other white pixel points counterclockwise within the 8-neighborhood; if the white pixel point is an isolated point, the search stops, otherwise the white pixel point is marked; points on the same contour share the same preset mark, and the preset mark increases sequentially with the inclusion relation;
S7.4, repeating steps S7.2-S7.3 until all starting white pixel points in a frame have been searched;
S7.5, grouping points with the same preset mark into the same contour, thereby obtaining all originally measured contours of a frame and the contour hierarchy table.
A neighborhood: in a digital image, neighborhoods are divided into the 4-neighborhood and the 8-neighborhood. The 4-neighborhood of a point (x, y) consists of the four points above, below, to the left and to the right of it; the 8-neighborhood additionally includes the four diagonal points (upper-left, upper-right, lower-left and lower-right). If p is among the 8 points around q, then p is in the 8-neighborhood of q. Adjacency: adjacency implies neighborhood; if p and q are said to be adjacent, then p and q must be within each other's neighborhood.
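The patent describes its own pixel-marking procedure in S7.1-S7.5. As a rough functional substitute, not the patent's exact method, OpenCV's findContours (assumed OpenCV 4) can return all contours of a binary image together with a hierarchy array that plays the role of the contour hierarchy table:

```python
import cv2
import numpy as np

def extract_contours_with_hierarchy(binary_map):
    """Rough functional equivalent of S7: all contours of a frame plus a
    hierarchy table describing which contours are nested inside which."""
    # S7.1: pad the frame with a one-pixel black border so outer contours close.
    padded = cv2.copyMakeBorder(binary_map, 1, 1, 1, 1,
                                cv2.BORDER_CONSTANT, value=0)
    # findContours expects white objects (255) on a black background.
    img = (padded * 255).astype(np.uint8)
    # RETR_TREE builds the full nesting hierarchy; hierarchy[0][i] is
    # [next, previous, first_child, parent] for contour i (OpenCV 4 returns
    # two values here).
    contours, hierarchy = cv2.findContours(img, cv2.RETR_TREE,
                                           cv2.CHAIN_APPROX_SIMPLE)
    return contours, hierarchy
```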
Alternatively, when step S8 is performed, the following substeps are used to extract the peripheral contour;
S8.1, according to the contour hierarchy table of all contours in a frame, marking the inner contours as black pixel points so that all inner contours are directly eliminated, thereby obtaining the peripheral contours;
Optionally, when step S8 is implemented, the method further includes;
S8.2, deleting noise contours from the peripheral contours according to contour perimeter to obtain the measured peripheral contour set of one frame.
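Continuing the OpenCV-based substitute above, a sketch of S8.1-S8.2 keeps only contours with no parent in the hierarchy table and drops short noise contours; the minimum perimeter is an assumed illustrative value.

```python
import cv2

def peripheral_contours(contours, hierarchy, min_perimeter=10.0):
    """Sketch of S8: keep outermost contours, drop noise contours by perimeter."""
    peripheral = []
    for i, contour in enumerate(contours):
        parent = hierarchy[0][i][3]      # S8.1: inner contours have a parent
        if parent != -1:
            continue                      # eliminate all inner contours
        perimeter = cv2.arcLength(contour, True)
        if perimeter < min_perimeter:     # S8.2: delete noise contours
            continue
        peripheral.append(contour)
    return peripheral                     # measured peripheral contour set
```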
The invention also provides an object contour prediction method using the above object contour extraction method, which comprises the following steps:
S9, calculating through a formula (1) to obtain an object prediction contour;
X(n)_t = X(n)_{t-1} + K[Z(n)_t - X(n)_{t-1}]    Formula (1)
The predicted contour set is an n-dimensional linear-space matrix X(n). The object is assumed to move at a constant speed between two frames, so the movement offset can be obtained from the peripheral contour values of the previous two frames; the predicted object contour X(n)_{t-1} is obtained by adding the movement offset of the previous frame. Z(n)_t is the measured peripheral contour, K is the specified gain coefficient with 0 < K < 1, and t is the time-sequence number. Preferably, K = 0.7.
Optionally, the method further comprises the steps of:
S10, correcting the predicted object contour with the measured peripheral contour set through a Gaussian product.
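A sketch of formula (1) and of one assumed reading of the S10 Gaussian-product correction follows; the patent does not state the variances or the exact fusion form, so the inverse-variance weighting is an assumption.

```python
import numpy as np

def predict_contour(x_prev, x_prev2, z_t, k=0.7):
    """Sketch of S9, formula (1), on contour points stored as NumPy arrays.

    x_prev, x_prev2 : peripheral contour matrices X(n) at times t-1 and t-2
    z_t             : measured peripheral contour Z(n)_t at time t
    k               : specified gain coefficient, 0 < K < 1 (0.7 preferred)
    """
    # Constant velocity between frames: add the movement offset of the
    # previous frame to form the prior prediction X(n)_{t-1}.
    prior = x_prev + (x_prev - x_prev2)
    # Formula (1): X(n)_t = X(n)_{t-1} + K [ Z(n)_t - X(n)_{t-1} ]
    return prior + k * (z_t - prior)

def gaussian_product_correction(pred, meas, var_pred=1.0, var_meas=1.0):
    """Assumed form of the S10 'Gaussian product' correction: fuse prediction
    and measurement as the product of two Gaussians (inverse-variance
    weighting). The variances are illustrative, not stated in the patent."""
    w = var_meas / (var_pred + var_meas)
    fused = w * pred + (1.0 - w) * meas
    fused_var = var_pred * var_meas / (var_pred + var_meas)
    return fused, fused_var
```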
The invention provides an object contour extraction system, comprising:
A parameter acquisition unit for acquiring an absolute temperature of an object and forming a thermodynamic diagram;
The first conversion unit is used for linearly restricting the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix, converting the thermodynamic real diagram matrix into a gray diagram matrix and converting the gray diagram matrix into a binary gray diagram matrix of a second preset real range;
the first filtering unit is used for filtering noise points of the binary gray scale image matrix;
a second conversion unit for converting the binary gray map matrix into a binary map of a first preset real range;
The contour extraction unit is used for extracting all contours in the binary image and simultaneously generating a contour hierarchy table, which is used for representing the inclusion relations among the contours of the image;
and the peripheral contour acquisition unit is used for eliminating all the inner contours through the contour hierarchy table and obtaining a measured peripheral contour set of one frame.
Alternatively, the first preset real number range is 0-1, and the second preset real number range is 0-255.
Alternatively, the hierarchy table uses an integer based on 2, indexed by an integer starting from 2.
Optionally, the first filtering unit sequentially calculates the sum of gray values of 8 neighboring pixel points around all pixel points except the object frame, if the sum of the obtained gray values is smaller than a preset threshold value, the pixel point is set to be 1, otherwise, the pixel point is set to be 0.
Alternatively, the contour extraction unit extracts the contour by the following steps;
filling black pixel points to form a peripheral boundary of a frame, wherein the filling width is one pixel;
Searching for the first white pixel point, from top to bottom and from left to right, whose left neighbor is a black pixel point; the first white pixel point found is a point on a contour and is given a preset mark;
Starting from the first white pixel point, searching for other white pixel points counterclockwise within the 8-neighborhood; if the white pixel point is an isolated point, the search stops, otherwise the white pixel point is marked; points on the same contour share the same preset mark, and the preset mark increases sequentially with the inclusion relation;
repeating the marking and searching until the initial white pixel point in a frame is completely searched;
and grouping contour points with the same preset mark into the same contour, thereby acquiring all originally measured contours of a frame and the contour hierarchy table.
Alternatively, the peripheral contour obtaining unit extracts the peripheral contour by the following steps;
According to the contour hierarchy table of all contours in a frame, marking the inner contours as black pixel points so that all inner contours are directly eliminated, thereby obtaining the peripheral contours;
Optionally, the system further comprises: a second filtering unit for deleting noise contours from the peripheral contours according to contour perimeter to obtain the measured peripheral contour set of one frame.
The invention provides an object contour prediction system with the object contour extraction system, which further comprises:
an object contour prediction module for obtaining an object predicted contour by calculation of formula (1);
X(n)_t = X(n)_{t-1} + K[Z(n)_t - X(n)_{t-1}]    Formula (1)
The predicted contour set is an n-dimensional linear-space matrix X(n). The object is assumed to move at a constant speed between two frames, so the movement offset can be obtained from the peripheral contour values of the previous two frames; the predicted object contour X(n)_{t-1} is obtained by adding the movement offset of the previous frame. Z(n)_t is the measured peripheral contour, K is the specified gain coefficient with 0 < K < 1, and t is the time-sequence number. Preferably, K = 0.7.
Optionally, the method further comprises: an object prediction contour correction module for correcting an object prediction contour using the measured peripheral contour set by a gaussian product.
The invention acquires the absolute temperature of an object to form a thermodynamic diagram, obtains a contour hierarchy table and a measured peripheral contour set through noise filtering, parameter conversion and contour extraction, and obtains a predicted object contour position by treating the object's moving speed between two frames as a constant. Even when the thermodynamic diagram acquired by the sensor is unstable, all contours in the thermal image can be accurately extracted or predicted, the precision of contour information acquisition in blurred thermal images is improved, and a contour set with time as the independent variable is acquired. The obtained contour set and predicted object contour position can be used for various subsequent applications, such as person identification and counting the number of people.
Drawings
The accompanying drawings are intended to illustrate the general features of methods, structures and/or materials used in accordance with certain exemplary embodiments of the invention, and to supplement the description in this specification. The drawings, however, are schematic illustrations that are not to scale; they may not accurately reflect the precise structural or performance characteristics of any given embodiment and should not be construed as limiting or restricting the scope of the values or attributes encompassed by the exemplary embodiments of the invention. The invention is described in further detail below with reference to the accompanying drawings and the detailed description:
Fig. 1 to Fig. 5 are flowcharts of the first to fifth embodiments of the present invention.
Fig. 6 is a diagram of searching for and marking white pixel points within the 8-neighborhood according to the present invention.
Fig. 7 is a schematic diagram of an object prediction contour correction process.
Description of the reference numerals
Predicted contour set X(n)
Contour set at time t-1: X(n)_{t-1}
Contour set at time t-2: X(n)_{t-2}
Difference between times t-2 and t-1: ΔX(n)
Contour set at time t: X(n)_t
Measured peripheral contour Z(n)_t
Specified gain coefficient at time t: K_t.
Detailed Description
Other advantages and technical effects of the present invention will become more fully apparent to those skilled in the art from the following disclosure, which is a detailed description of the present invention given by way of specific examples. The invention may be practiced or carried out in different embodiments, and details in this description may be applied from different points of view, without departing from the general inventive concept. It should be noted that the following embodiments and features in the embodiments may be combined with each other without conflict. The following exemplary embodiments of the present invention may be embodied in many different forms and should not be construed as limited to the specific embodiments set forth herein. It should be appreciated that these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the technical solution of these exemplary embodiments to those skilled in the art.
Furthermore, it will be understood that, although the terms "first," "second," etc. may be used herein to describe various elements, parameters, components, regions, layers and/or sections, these elements, parameters, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, parameter, component, region, layer or section from another element, parameter, component, region, layer or section. Thus, a first element, parameter, component, region, layer or section discussed below could be termed a second element, parameter, component, region, layer or section without departing from the teachings of the example embodiments of the present invention.
In a first embodiment, as shown in fig. 1, the object contour extraction method provided by the present invention includes the following steps:
S1, acquiring an absolute temperature of an object to form a thermodynamic diagram;
S2, linearly constraining the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix;
S3, converting the thermodynamic real image matrix into a gray image matrix;
S4, converting the gray scale image matrix into a binary gray scale image matrix of a second preset real number range;
S5, filtering noise points of the binary gray scale image matrix;
S6, converting the binary gray scale map matrix into a binary map of the first preset real number range;
S7, extracting all contours in the binary image and simultaneously generating a contour hierarchy table, wherein the contour hierarchy table is used for representing the inclusion relations among the contours of the image;
S8, eliminating all inner contours through the contour hierarchy table to obtain the measured peripheral contour set of one frame.
Alternatively, the first preset real number range is 0-1, and the second preset real number range is 0-255. The hierarchical table uses an integer with a base of 2, and is indexed by an integer starting from 2.
In a second embodiment, as shown in fig. 2, the object contour extraction method provided by the present invention includes the following steps:
S1, acquiring an absolute temperature of an object to form a thermodynamic diagram;
S2, linearly constraining the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix;
S3, converting the thermodynamic real image matrix into a gray image matrix;
S4, converting the gray scale image matrix into a binary gray scale image matrix of a second preset real number range;
S5, filtering noise points of the binary gray scale image matrix, wherein the method comprises the following substeps;
S5.1, sequentially calculating the sum of gray values of 8 neighborhood pixel points around all pixel points except the object frame;
S5.2, if the sum of the obtained gray values is smaller than a preset threshold value, the pixel point is set to be 1, otherwise, the pixel point is set to be 0.
S6, converting the binary gray scale map matrix into a binary map of a first preset real number range;
S7, extracting all contours in the binary image and simultaneously generating a contour hierarchy table, wherein the contour hierarchy table is used for representing the inclusion relations among the contours of the image, comprising the following substeps;
S7.1, filling to form a peripheral boundary of a frame by filling black pixel points, wherein the filling width is one pixel;
S7.2, searching for the first white pixel point, from top to bottom and from left to right, whose left neighbor is a black pixel point; the first white pixel point found is a point on a contour and is given a preset mark, for example, marking this pixel point as 2;
S7.3, referring to Fig. 6, starting from the first white pixel point, searching for other white pixel points counterclockwise within the 8-neighborhood; if the white pixel point is an isolated point, the search stops, otherwise the white pixel point is marked; points on the same contour share the same preset mark, and the preset mark increases sequentially with the inclusion relation;
S7.4, repeating steps S7.2-S7.3 until all starting white pixel points in a frame have been searched;
S7.5, grouping points with the same preset mark into the same contour, thereby acquiring all originally measured contours of a frame and the contour hierarchy table;
S8, eliminating all inner contours through the contour hierarchy table to obtain the measured peripheral contour set of one frame, comprising the following substep;
S8.1, according to the contour hierarchy table of all contours in a frame, marking the inner contours as black pixel points so that all inner contours are directly eliminated, thereby obtaining the peripheral contours.
In a third embodiment, as shown in fig. 3, the object contour extraction method provided by the present invention includes the following steps:
S1, acquiring an absolute temperature of an object to form a thermodynamic diagram;
S2, linearly constraining the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix;
S3, converting the thermodynamic real image matrix into a gray image matrix;
S4, converting the gray scale image matrix into a binary gray scale image matrix of a second preset real number range;
S5, filtering noise points of the binary gray scale image matrix, wherein the method comprises the following substeps;
S5.1, sequentially calculating the sum of gray values of 8 neighborhood pixel points around all pixel points except the object frame;
S5.2, if the sum of the obtained gray values is smaller than a preset threshold value, the pixel point is set to be 1, otherwise, the pixel point is set to be 0.
S6, converting the binary gray scale map matrix into a binary map of a first preset real number range;
S7, extracting all contours in the binary image and simultaneously generating a contour hierarchy table, wherein the contour hierarchy table is used for representing the inclusion relations among the contours of the image, comprising the following substeps;
S7.1, filling to form a peripheral boundary of a frame by filling black pixel points, wherein the filling width is one pixel;
S7.2, searching for the first white pixel point, from top to bottom and from left to right, whose left neighbor is a black pixel point; the first white pixel point found is a point on a contour and is given a preset mark, for example, marking this pixel point as 2;
S7.3, referring to Fig. 6, starting from the first white pixel point, searching for other white pixel points counterclockwise within the 8-neighborhood; if the white pixel point is an isolated point, the search stops, otherwise the white pixel point is marked; points on the same contour share the same preset mark, and the preset mark increases sequentially with the inclusion relation;
S7.4, repeating steps S7.2-S7.3 until all starting white pixel points in a frame have been searched;
S7.5, grouping points with the same preset mark into the same contour, thereby acquiring all originally measured contours of a frame and the contour hierarchy table;
S8, eliminating all inner contours through the contour hierarchy table to obtain the measured peripheral contour set of one frame, comprising the following substeps;
S8.1, according to the contour hierarchy table of all contours in a frame, marking the inner contours as black pixel points so that all inner contours are directly eliminated, thereby obtaining the peripheral contours;
S8.2, deleting noise contours from the peripheral contours according to contour perimeter to obtain the measured peripheral contour set of one frame.
A fourth embodiment, as shown in fig. 4, the present invention provides an object contour prediction method using the object contour extraction method of the first, second or third embodiment, including the steps of:
S9, calculating through a formula (1) to obtain an object prediction contour;
X(n)_t = X(n)_{t-1} + K[Z(n)_t - X(n)_{t-1}]    Formula (1)
The predicted contour set is an n-dimensional linear-space matrix X(n). The object is assumed to move at a constant speed between two frames, so the movement offset can be obtained from the peripheral contour values of the previous two frames; the predicted object contour X(n)_{t-1} is obtained by adding the movement offset of the previous frame. Z(n)_t is the measured peripheral contour, K is the specified gain coefficient with 0 < K < 1, and t is the time-sequence number. Preferably, K = 0.7.
A fifth embodiment, as shown in fig. 5, provides an object contour prediction method using the object contour extraction method described in the first, second or third embodiment, including the following steps:
S9, calculating through a formula (1) to obtain an object prediction contour;
X(n)_t = X(n)_{t-1} + K[Z(n)_t - X(n)_{t-1}]    Formula (1)
The predicted contour set is an n-dimensional linear-space matrix X(n). The object is assumed to move at a constant speed between two frames, so the movement offset can be obtained from the peripheral contour values of the previous two frames; the predicted object contour X(n)_{t-1} is obtained by adding the movement offset of the previous frame. Z(n)_t is the measured peripheral contour, K is the specified gain coefficient with 0 < K < 1, and t is the time-sequence number. Preferably, K = 0.7.
S10, correcting the predicted object contour with the measured peripheral contour set through a Gaussian product.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments in accordance with the invention. As used herein, the singular is intended to include the plural unless the context clearly indicates otherwise. Furthermore, it will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In a sixth embodiment, the present invention provides an object contour extraction system, comprising:
A parameter acquisition unit for acquiring an absolute temperature of an object and forming a thermodynamic diagram;
The first conversion unit is used for linearly restricting the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix, converting the thermodynamic real diagram matrix into a gray diagram matrix and converting the gray diagram matrix into a binary gray diagram matrix of a second preset real range;
the first filtering unit is used for filtering noise points of the binary gray scale image matrix;
a second conversion unit for converting the binary gray map matrix into a binary map of a first preset real range;
The contour extraction unit is used for extracting all contours in the binary image and simultaneously generating a contour hierarchy table, which is used for representing the inclusion relations among the contours of the image;
and the peripheral contour acquisition unit is used for eliminating all the inner contours through the contour hierarchy table and obtaining a measured peripheral contour set of one frame.
In a seventh embodiment, the present invention provides an object contour extraction system, comprising:
A parameter acquisition unit for acquiring an absolute temperature of an object and forming a thermodynamic diagram;
The first conversion unit is used for linearly restricting the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix, converting the thermodynamic real diagram matrix into a gray diagram matrix and converting the gray diagram matrix into a binary gray diagram matrix of a second preset real range;
The first filtering unit is used for filtering noise points of the binary gray image matrix; it sequentially calculates the sum of the gray values of the 8 neighborhood pixel points around all pixel points except the object frame, sets the pixel point to 1 if the sum of the gray values is smaller than a preset threshold value, and otherwise sets the pixel point to 0;
a second conversion unit for converting the binary gray map matrix into a binary map of a first preset real range;
The contour extraction unit fills black pixel points to form the peripheral boundary of a frame, with a filling width of one pixel; it searches for the first white pixel point, from top to bottom and from left to right, whose left neighbor is a black pixel point; the first white pixel point found is a point on a contour and is given a preset mark;
Starting from the first white pixel point, it searches for other white pixel points counterclockwise within the 8-neighborhood; if the white pixel point is an isolated point, the search stops, otherwise the white pixel point is marked; points on the same contour share the same preset mark, and the preset mark increases sequentially with the inclusion relation;
repeating the marking and searching until the initial white pixel point in a frame is completely searched;
points with the same preset mark are grouped into the same contour, thereby acquiring all originally measured contours of a frame and the contour hierarchy table;
A peripheral contour obtaining unit, which, according to the contour hierarchy table of all contours in a frame, marks the inner contours as black pixel points so that all inner contours are directly eliminated, thereby obtaining the peripheral contours;
alternatively, the first preset real number range is 0-1, the second preset real number range is 0-255, and the hierarchy table adopts an integer with 2 as a base number and is indexed by the integer from 2.
An eighth embodiment of the present invention provides an object contour extraction system, comprising:
A parameter acquisition unit for acquiring an absolute temperature of an object and forming a thermodynamic diagram;
The first conversion unit is used for linearly restricting the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix, converting the thermodynamic real diagram matrix into a gray diagram matrix and converting the gray diagram matrix into a binary gray diagram matrix of a second preset real range;
The first filtering unit is used for filtering noise points of the binary gray image matrix; it sequentially calculates the sum of the gray values of the 8 neighborhood pixel points around all pixel points except the object frame, sets the pixel point to 1 if the sum of the gray values is smaller than a preset threshold value, and otherwise sets the pixel point to 0;
a second conversion unit for converting the binary gray map matrix into a binary map of a first preset real range;
The contour extraction unit fills black pixel points to form the peripheral boundary of a frame, with a filling width of one pixel; it searches for the first white pixel point, from top to bottom and from left to right, whose left neighbor is a black pixel point; the first white pixel point found is a point on a contour and is given a preset mark;
Starting from the first white pixel point, it searches for other white pixel points counterclockwise within the 8-neighborhood; if the white pixel point is an isolated point, the search stops, otherwise the white pixel point is marked; points on the same contour share the same preset mark, and the preset mark increases sequentially with the inclusion relation;
repeating the marking and searching until the initial white pixel point in a frame is completely searched;
points with the same preset mark are grouped into the same contour, thereby acquiring all originally measured contours of a frame and the contour hierarchy table;
A peripheral contour obtaining unit, which, according to the contour hierarchy table of all contours in a frame, marks the inner contours as black pixel points so that all inner contours are directly eliminated, thereby obtaining the peripheral contours;
And a second filtering unit for deleting the noise contour of the peripheral contour through the contour perimeter to obtain a measured peripheral contour set of one frame.
Alternatively, the first preset real number range is 0-1, the second preset real number range is 0-255, and the hierarchy table adopts an integer with 2 as a base number and is indexed by the integer from 2.
A ninth embodiment of the present invention provides an object contour prediction system using the object contour extraction system described in the sixth, seventh or eighth embodiment, comprising:
A parameter acquisition unit for acquiring an absolute temperature of an object and forming a thermodynamic diagram;
The first conversion unit is used for linearly restricting the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix, converting the thermodynamic real diagram matrix into a gray diagram matrix and converting the gray diagram matrix into a binary gray diagram matrix of a second preset real range;
The first filtering unit is used for filtering noise points of the binary gray image matrix; it sequentially calculates the sum of the gray values of the 8 neighborhood pixel points around all pixel points except the object frame, sets the pixel point to 1 if the sum of the gray values is smaller than a preset threshold value, and otherwise sets the pixel point to 0;
a second conversion unit for converting the binary gray map matrix into a binary map of a first preset real range;
The contour extraction unit fills black pixel points to form the peripheral boundary of a frame, with a filling width of one pixel; it searches for the first white pixel point, from top to bottom and from left to right, whose left neighbor is a black pixel point; the first white pixel point found is a point on a contour and is given a preset mark;
Starting from the first white pixel point, it searches for other white pixel points counterclockwise within the 8-neighborhood; if the white pixel point is an isolated point, the search stops, otherwise the white pixel point is marked; points on the same contour share the same preset mark, and the preset mark increases sequentially with the inclusion relation;
repeating the marking and searching until the initial white pixel point in a frame is completely searched;
points with the same preset mark are grouped into the same contour, thereby acquiring all originally measured contours of a frame and the contour hierarchy table;
A peripheral contour obtaining unit, which, according to the contour hierarchy table of all contours in a frame, marks the inner contours as black pixel points so that all inner contours are directly eliminated, thereby obtaining the peripheral contours;
A second filtering unit for deleting the noise contour of the peripheral contour through the contour perimeter to obtain a measured peripheral contour set of one frame;
an object contour prediction module for obtaining an object predicted contour by calculation of formula (1);
X(n)_t = X(n)_{t-1} + K[Z(n)_t - X(n)_{t-1}]    Formula (1)
The predicted contour set is an n-dimensional linear-space matrix X(n). The object is assumed to move at a constant speed between two frames, so the movement offset can be obtained from the peripheral contour values of the previous two frames; the predicted object contour X(n)_{t-1} is obtained by adding the movement offset of the previous frame. Z(n)_t is the measured peripheral contour, K is the specified gain coefficient with 0 < K < 1, and t is the time-sequence number. Preferably, K = 0.7.
Alternatively, the first preset real number range is 0-1, the second preset real number range is 0-255, and the hierarchy table adopts an integer with 2 as a base number and is indexed by the integer from 2.
A tenth embodiment of the present invention provides an object contour prediction system using the object contour extraction system according to the sixth, seventh or eighth embodiment, including:
A parameter acquisition unit for acquiring an absolute temperature of an object and forming a thermodynamic diagram;
The first conversion unit is used for linearly restricting the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix, converting the thermodynamic real diagram matrix into a gray diagram matrix and converting the gray diagram matrix into a binary gray diagram matrix of a second preset real range;
The first filtering unit is used for filtering noise points of the binary gray image matrix; it sequentially calculates the sum of the gray values of the 8 neighborhood pixel points around all pixel points except the object frame, sets the pixel point to 1 if the sum of the gray values is smaller than a preset threshold value, and otherwise sets the pixel point to 0;
a second conversion unit for converting the binary gray map matrix into a binary map of a first preset real range;
The contour extraction unit fills black pixel points to form the peripheral boundary of a frame, with a filling width of one pixel; it searches for the first white pixel point, from top to bottom and from left to right, whose left neighbor is a black pixel point; the first white pixel point found is a point on a contour and is given a preset mark;
Starting from the first white pixel point, it searches for other white pixel points counterclockwise within the 8-neighborhood; if the white pixel point is an isolated point, the search stops, otherwise the white pixel point is marked; points on the same contour share the same preset mark, and the preset mark increases sequentially with the inclusion relation;
repeating the marking and searching until the initial white pixel point in a frame is completely searched;
points with the same preset mark are grouped into the same contour, thereby acquiring all originally measured contours of a frame and the contour hierarchy table;
A peripheral contour obtaining unit, which, according to the contour hierarchy table of all contours in a frame, marks the inner contours as black pixel points so that all inner contours are directly eliminated, thereby obtaining the peripheral contours;
A second filtering unit for deleting the noise contour of the peripheral contour through the contour perimeter to obtain a measured peripheral contour set of one frame;
an object contour prediction module for obtaining an object predicted contour by calculation of formula (1);
X(n)_t = X(n)_{t-1} + K[Z(n)_t - X(n)_{t-1}]    Formula (1)
The predicted contour set is an n-dimensional linear-space matrix X(n). The object is assumed to move at a constant speed between two frames, so the movement offset can be obtained from the peripheral contour values of the previous two frames; the predicted object contour X(n)_{t-1} is obtained by adding the movement offset of the previous frame. Z(n)_t is the measured peripheral contour, K is the specified gain coefficient with 0 < K < 1, and t is the time-sequence number. Preferably, K = 0.7.
Referring to Fig. 7, an object prediction contour correction module is provided for correcting the predicted object contour with the measured peripheral contour set through a Gaussian product.
Alternatively, the first preset real number range is 0-1, the second preset real number range is 0-255, and the hierarchy table adopts an integer with 2 as a base number and is indexed by the integer from 2.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The present invention has been described in detail by way of specific embodiments and examples, but these should not be construed as limiting the invention. Many variations and modifications may be made by one skilled in the art without departing from the principles of the invention, which is also considered to be within the scope of the invention.

Claims (20)

1. An object contour extraction method is characterized by comprising the following steps:
S1, acquiring an absolute temperature of an object to form a thermodynamic diagram;
S2, linearly constraining the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix;
S3, converting the thermodynamic real image matrix into a gray image matrix;
S4, converting the gray scale image matrix into a binary gray scale image matrix of a second preset real number range;
S5, filtering noise points of the binary gray scale image matrix;
S6, converting the binary gray scale map matrix into a binary map of the first preset real number range;
S7, extracting all contours in the binary image and simultaneously generating a contour hierarchy table, wherein the contour hierarchy table is used for representing the inclusion relations among the contours of the image;
S8, eliminating all inner contours through the contour hierarchy table to obtain the measured peripheral contour set of one frame.
2. The object contour extraction method as defined in claim 1, wherein: the first preset real range is 0-1.
3. The object contour extraction method as defined in claim 1, wherein: the second preset real number range is 0-255.
4. The object contour extraction method as defined in claim 1, wherein: the hierarchical table uses an integer with a base of 2, and is indexed by an integer starting from 2.
5. The object contour extraction method as defined in claim 1, wherein: when the step S5 is implemented, noise points are filtered by adopting the following substeps;
S5.1, sequentially calculating the sum of gray values of 8 neighborhood pixel points around all pixel points except the object frame;
S5.2, if the sum of the obtained gray values is smaller than a preset threshold value, the pixel point is set to be 1, otherwise, the pixel point is set to be 0.
6. The object contour extraction method as defined in claim 1, wherein: when the step S7 is implemented, the following substeps are adopted to extract the outline;
S7.1, filling to form a peripheral boundary of a frame by filling black pixel points, wherein the filling width is one pixel;
S7.2, searching a first white pixel point from top to bottom and from left to right, wherein the left neighbor of the white pixel point is a black pixel point, the first found white pixel point is a point on a contour, and marking the point as a preset mark;
S7.3, starting from the first white pixel point, starting to search other white pixel points in a counterclockwise mode by taking the 8 neighborhood as a range, if the white pixel point is an isolated point, stopping searching, otherwise marking the white pixel point, adopting the same preset mark of the same outline, and sequentially increasing along with the preset mark of the inclusion relation;
S7.4, repeating the steps S7.2-S7.3 until the initial white pixel point in a frame is completely searched;
S7.5, obtaining all contours of the original measurement of a frame and a contour hierarchy table by adopting the same contour with the same preset mark.
7. The object contour extraction method as defined in claim 1, wherein: when the step S8 is implemented, the peripheral outline is extracted by adopting the following substeps;
S8.1, marking the inner contour as black pixel points according to a contour hierarchy table of all contours in a frame to directly eliminate all the inner contours in the contours, and obtaining the peripheral contour.
8. The method of claim 7, wherein the step S8 is performed, further comprising;
S8.2, deleting the noise contours of the peripheral contours through the contour perimeter to obtain a measured peripheral contour set of one frame.
9. An object contour prediction method using the object contour extraction method according to any one of claims 1 to 8, characterized by comprising the steps of:
S9, calculating through a formula (1) to obtain an object prediction contour;
X(n)_t = X(n)_{t-1} + K[Z(n)_t - X(n)_{t-1}]    Formula (1)
The predicted contour set is an n-dimensional linear-space matrix X(n), the object moving speed between two frames is constant, the movement offset is obtained from the peripheral contour values of the previous two frames, the predicted object contour X(n)_{t-1} is obtained by adding the movement offset of the previous frame, Z(n)_t is the measured peripheral contour, K is the specified gain coefficient with 0 < K < 1, and t is the time-sequence number.
10. The object contour prediction method as defined in claim 9, further comprising the step of:
S10, correcting the predicted outline of the object by using the measured peripheral outline set through Gaussian product.
11. An object profile extraction system, comprising:
A parameter acquisition unit for acquiring an absolute temperature of an object and forming a thermodynamic diagram;
The first conversion unit is used for linearly restricting the thermodynamic diagram temperature to a first preset real range to form a thermodynamic diagram real matrix, converting the thermodynamic real diagram matrix into a gray diagram matrix and converting the gray diagram matrix into a binary gray diagram matrix of a second preset real range;
the first filtering unit is used for filtering noise points of the binary gray scale image matrix;
a second conversion unit for converting the binary gray map matrix into a binary map of a first preset real range;
The contour extraction unit is used for extracting all contours in the binary image and simultaneously generating a contour hierarchy table which is used for representing the inclusion relations among the contours of the image;
and the peripheral contour acquisition unit is used for eliminating all the inner contours through the contour hierarchy table and obtaining a measured peripheral contour set of one frame.
12. The object profile extraction system as defined in claim 11, wherein: the first preset real range is 0-1.
13. The object profile extraction system as defined in claim 11, wherein: the second preset real number range is 0-255.
14. The object profile extraction system as defined in claim 11, wherein: the hierarchical table uses an integer with a base of 2, and is indexed by an integer starting from 2.
15. The object profile extraction system as defined in claim 11, wherein: the first filtering unit sequentially calculates the sum of gray values of 8 neighborhood pixel points around all pixel points except the object frame, if the sum of the gray values is smaller than a preset threshold value, the pixel point is set to be 1, otherwise, the pixel point is set to be 0.
16. The object profile extraction system as defined in claim 11, wherein: the contour extraction unit extracts a contour by the following steps;
filling black pixel points to form a peripheral boundary of a frame, wherein the filling width is one pixel;
searching a first white pixel point from top to bottom from left to right, wherein the left neighbor of the white pixel point is a black pixel point, the first found white pixel point is a point on a contour, and marking the point as a preset mark;
Starting from the first white pixel point, starting to search other white pixel points in a counterclockwise mode by taking the 8 neighborhood as a range, if the white pixel point is an isolated point, stopping searching, otherwise marking the white pixel point, adopting the same preset mark of the same outline, and sequentially increasing along with the preset mark of the inclusion relation;
repeating the marking and searching until the initial white pixel point in a frame is completely searched;
and acquiring all the contours of the original measurement of a frame and a contour hierarchy table by adopting the same contour with the same preset mark.
17. The object profile extraction system as defined in claim 11, wherein: the peripheral contour obtaining unit adopts the following steps to extract the peripheral contour;
and marking the inner contour as black pixel points according to a contour hierarchy table of all contours in a frame to directly eliminate all the inner contours in the contours, thereby obtaining the peripheral contour.
18. The object profile extraction system as claimed in claim 15, further comprising:
And a second filtering unit for deleting the noise contour of the peripheral contour through the contour perimeter to obtain a measured peripheral contour set of one frame.
19. An object contour prediction system having an object contour extraction system as defined in any one of claims 11-18, further comprising:
an object contour prediction module for obtaining an object predicted contour by calculation of formula (1);
X(n)_t = X(n)_{t-1} + K[Z(n)_t - X(n)_{t-1}]    Formula (1)
The predicted contour set is an n-dimensional linear-space matrix X(n), the object moving speed between two frames is constant, the movement offset is obtained from the peripheral contour values of the previous two frames, the predicted object contour X(n)_{t-1} is obtained by adding the movement offset of the previous frame, Z(n)_t is the measured peripheral contour, K is the specified gain coefficient with 0 < K < 1, and t is the time-sequence number.
20. The object contour prediction system as defined in claim 19, further comprising:
An object prediction contour correction module for correcting an object prediction contour using the measured peripheral contour set by a gaussian product.
CN202010690822.2A 2020-07-17 2020-07-17 Object contour extraction method and system and object contour prediction method and system Active CN111783801B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010690822.2A CN111783801B (en) 2020-07-17 2020-07-17 Object contour extraction method and system and object contour prediction method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010690822.2A CN111783801B (en) 2020-07-17 2020-07-17 Object contour extraction method and system and object contour prediction method and system

Publications (2)

Publication Number Publication Date
CN111783801A CN111783801A (en) 2020-10-16
CN111783801B (en) 2024-04-23

Family

ID=72764209

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010690822.2A Active CN111783801B (en) 2020-07-17 2020-07-17 Object contour extraction method and system and object contour prediction method and system

Country Status (1)

Country Link
CN (1) CN111783801B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106056594A (en) * 2016-05-27 2016-10-26 四川桑莱特智能电气设备股份有限公司 Double-spectrum-based visible light image extraction system and method
CN109146924A (en) * 2018-07-18 2019-01-04 北京飞搜科技有限公司 A kind of method for tracking target and device based on thermodynamic chart
CN110246329A (en) * 2019-04-07 2019-09-17 武汉理工大学 A kind of taxi quantitative forecasting technique
WO2020010561A1 (en) * 2018-07-12 2020-01-16 华为技术有限公司 Method and apparatus for measuring object parameters
CN111178356A (en) * 2019-12-27 2020-05-19 宁波华高信息科技有限公司 Paper contour skew correction method
CN111242120A (en) * 2020-01-03 2020-06-05 中国科学技术大学 Character detection method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6937765B2 (en) * 2003-03-14 2005-08-30 The Regents Of The University Of California Method for contour extraction for object representation
JP6115545B2 (en) * 2014-11-11 2017-04-19 コニカミノルタ株式会社 Image processing apparatus and image processing method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106056594A (en) * 2016-05-27 2016-10-26 四川桑莱特智能电气设备股份有限公司 Double-spectrum-based visible light image extraction system and method
WO2020010561A1 (en) * 2018-07-12 2020-01-16 华为技术有限公司 Method and apparatus for measuring object parameters
CN109146924A (en) * 2018-07-18 2019-01-04 北京飞搜科技有限公司 A kind of method for tracking target and device based on thermodynamic chart
CN110246329A (en) * 2019-04-07 2019-09-17 武汉理工大学 A kind of taxi quantitative forecasting technique
CN111178356A (en) * 2019-12-27 2020-05-19 宁波华高信息科技有限公司 Paper contour skew correction method
CN111242120A (en) * 2020-01-03 2020-06-05 中国科学技术大学 Character detection method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Implementation and Comparison of Outer Contour Extraction Algorithms for Head MRI Images; 张欣然; 陈琪儒; 何煦佳; 杨荣骞; Computer Applications and Software (No. 05); 17-20 *

Also Published As

Publication number Publication date
CN111783801A (en) 2020-10-16

Similar Documents

Publication Publication Date Title
CN108665487B (en) Transformer substation operation object and target positioning method based on infrared and visible light fusion
EP3499414B1 (en) Lightweight 3d vision camera with intelligent segmentation engine for machine vision and auto identification
CN111260788B (en) Power distribution cabinet switch state identification method based on binocular vision
CN108614896A (en) Bank Hall client's moving-wire track describing system based on deep learning and method
CN114742799B (en) Industrial scene unknown type defect segmentation method based on self-supervision heterogeneous network
CN113674273A (en) Optical detection method and system based on product defects and readable storage medium
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN106023249A (en) Moving object detection method based on local binary similarity pattern
Zheng et al. Multisource-domain generalization-based oil palm tree detection using very-high-resolution (vhr) satellite images
CN113255452A (en) Extraction method and extraction system of target water body
CN111028263B (en) Moving object segmentation method and system based on optical flow color clustering
CN111783801B (en) Object contour extraction method and system and object contour prediction method and system
Tiwari et al. Potential of IRS P-6 LISS IV for agriculture field boundary delineation
CN111160262A (en) Portrait segmentation method fusing human body key point detection
CN113298755B (en) Method and device for rapidly detecting ecological environment change patch based on time sequence image
Sheikh et al. A multi-level approach for change detection of buildings using satellite imagery
CN107796323A (en) A kind of micro- change detecting system of bridge based on hot spot vision signal intellectual analysis
CN108280815B (en) Geometric correction method for monitoring scene structure
CN113962904A (en) Method for filtering and denoising hyperspectral image
CN115700541A (en) Single sand-dust meteorological disaster judgment method and judgment system
Hui et al. Camera calibration using a genetic algorithm
CN114322793B (en) Workpiece size measuring method and device based on global segmentation network and storage medium
Park et al. Unconstrained approach for isolating individual trees using high-resolution aerial imagery
CN117333675B (en) Monitoring and early warning method and system for GIS expansion joint
JP3895473B2 (en) Entry information extraction method and machine-readable recording medium recording a program for causing a computer to execute the entry information extraction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant