Mobile temperature measurement method based on background elimination and tracking algorithm in complex environment
Technical Field
The invention relates to the technical field of human body intelligent temperature measurement, in particular to a mobile temperature measurement method based on a background elimination and tracking algorithm in a complex environment.
Background
Most temperature measurement products currently on the market are designed for indoor use, where the ambient temperature is uniform and therefore has little influence on the measurement result. Outdoors, however, the uncertainty of the ambient temperature can cause large errors in the measurement results. In addition, because the human body is moving, the captured RGB image is likely to be mismatched with the infrared thermography.
Disclosure of Invention
The invention aims to provide a mobile temperature measurement method based on a background elimination and tracking algorithm in a complex environment, so as to overcome the defects in the prior art.
In order to achieve the technical purpose, the technical scheme of the invention is realized as follows:
a mobile temperature measurement method under a complex environment based on a background elimination and tracking algorithm comprises the following steps:
1) acquiring an RGB image and an infrared thermal imaging image;
2) processing the infrared thermal imaging image by adopting a frame difference method, specifically, after each infrared thermal imaging image is acquired, subtracting the previous frame from it;
3) comparing the difference obtained in step 2) with a set threshold: when the frame-to-frame difference at a pixel point is smaller than the set threshold, judging that the temperature measured by the infrared thermal imaging image obtained this time is the ambient temperature and that the ambient temperature has not changed significantly; when the difference at a pixel point is larger than the set threshold and the temperature of the current frame reaches the human body temperature range, performing the following step;
4) judging whether a human face exists in the RGB image acquired at this moment; if so, judging that the temperature of the current frame is the human body temperature, and if not, judging that the temperature of the current frame is the ambient temperature.
Further, the method comprises the following steps:
5) obtaining face position information in the RGB image through a face detection model and obtaining face region information in the infrared thermal imaging image through a human head detection algorithm, and numbering faces in the RGB image and the infrared thermal imaging image respectively;
6) carrying out binarization processing on the infrared thermal imaging image;
7) obtaining a candidate region from the thermal imaging image processed in the step 6) by using a connected component segmentation algorithm;
8) predicting a candidate area of each face of the next frame based on the candidate areas of each face numbered in the current frame and the previous frame of infrared thermal imaging image by adopting a Kalman filtering tracking algorithm;
9) comparing the area of each predicted candidate region with the area of the corresponding face region detected in step 5); if the candidate region area is smaller than 1.5 times the face region area, leaving the candidate region unchanged; if it is larger than or equal to 1.5 times the face region area, resetting the binarization threshold and repeating steps 6)-9) on the infrared thermal imaging image corresponding to the candidate region;
10) performing IOU calculation between the RGB face region and all predicted candidate regions, and taking the candidate region with the largest IOU value; if that IOU value is greater than 0.5, judging the temperature of the candidate region to be the temperature of the corresponding face in the RGB image, thereby obtaining the measured face temperature, and if the IOU value is less than 0.5, repeating steps 5)-10).
Further, in step 10), when the IOU value is less than 0.5, the distance between the RGB face region and the candidate region is further determined; if the distance is less than a set number of pixel points, the face in that region of the RGB image is judged to match the candidate region, otherwise steps 5)-10) are repeated.
Further, the binarization processing in the step 6) is specifically as follows: the threshold is set to 28 degrees centigrade, the pixel points higher than 28 degrees centigrade are set to 255, and the pixel points lower than 28 degrees centigrade are set to 0.
Further, if the area of the candidate region is greater than or equal to 1.5 times of the area of the face region in the step 9), the threshold is increased by 1 degree centigrade, and then the steps 6) -9) are repeated on the infrared thermal imaging image corresponding to the candidate region.
Further, a calculation formula for performing IOU calculation on the face region detected in the RGB image and the predicted candidate region is as follows:
I=(min{yRGB_max,yH_max}-max{yRGB_min,yH_min})×(min{xRGB_max,xH_max}-max{xRGB_min,xH_min})
U=(yRGB_max-yRGB_min)×(xRGB_max-xRGB_min)+(yH_max-yH_min)×(xH_max-xH_min)-I
the subscript RGB represents the position of the face frame of the RGB image, the subscript H represents the position of the face frame of the infrared thermal imaging image, U represents the union of the two frames, and I represents the intersection of the two frames.
The invention has the beneficial effects that: the frame difference method used by the invention can effectively reduce the influence of the environmental temperature change on the temperature measurement result, and simultaneously, the human face tracking algorithm is used, so that the probability of matching of the human face and the temperature measurement can be greatly improved compared with the original method of directly calculating the human face temperature by using the RGB image and the infrared image at the same moment.
Drawings
FIG. 1 is a flow chart of background removal by frame differencing;
FIG. 2 is a flow diagram of face detection;
fig. 3 is a face matching flow chart.
Detailed Description
The technical solution in the embodiments of the present invention is clearly and completely described below with reference to the drawings in the embodiments of the present invention.
As shown in fig. 1-3, according to an embodiment of the present invention, a method for mobile thermometry in a complex environment based on a background elimination and tracking algorithm includes the following steps:
1. and acquiring an RGB image and an infrared thermal imaging image through the double cameras.
2. To reduce the influence of the outdoor ambient temperature on temperature measurement, the infrared thermal imaging image is processed by a frame difference method; specifically, after each infrared thermal imaging image is acquired, the previous frame is subtracted from it.
3. The difference obtained in step 2 is compared with a set threshold. When the frame-to-frame difference at a pixel point is smaller than the set threshold, it is judged that the temperature measured by the infrared thermal imaging image obtained this time is the ambient temperature and that the ambient temperature has not changed significantly; when the difference at a pixel point is larger than the set threshold and the temperature of the current frame reaches the human body temperature range, the following steps are performed. The difference refers to the value produced by the frame difference method, namely the difference between the temperature of each pixel point of the infrared thermal image at the previous moment and that at the current moment. The threshold is determined according to the actual ambient temperature: it is set to a smaller value when the ambient temperature is high and a larger value when it is low. For example, normal human body temperature is 37 ℃; when the actual ambient temperature is 28 ℃, the threshold can be set to 7 ℃. The threshold need only be large enough to discriminate whether a temperature change is caused by a person passing through the field of view or by a change in the ambient temperature itself.
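The frame difference screening of steps 2 and 3 can be sketched as follows. This is an illustrative sketch only: the function name, the representation of thermal frames as NumPy arrays of per-pixel temperatures in ℃, and the default human body temperature range of 35-42 ℃ are assumptions for illustration, not part of the claims.

```python
import numpy as np

def frame_difference_mask(prev_frame, curr_frame, diff_threshold,
                          body_temp_range=(35.0, 42.0)):
    """Flag pixels whose temperature changed by more than `diff_threshold`
    between consecutive thermal frames AND whose current temperature lies
    in the human body range. `diff_threshold` is chosen from the ambient
    temperature: smaller when the environment is warm, larger when cold.
    """
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    changed = diff > diff_threshold            # frame-to-frame change test
    in_body_range = ((curr_frame >= body_temp_range[0]) &
                     (curr_frame <= body_temp_range[1]))
    return changed & in_body_range
```

Pixels where the mask is True trigger the face-detection check of step 4; everywhere else the reading is treated as ambient temperature.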
4. Whether a human face exists in the RGB image acquired at this moment is judged; if so, the temperature of the current frame is judged to be the human body temperature and the following steps are performed, otherwise it is judged to be the ambient temperature.
5. Face position information in the RGB image is obtained through a face detection model, face region information in the infrared thermal imaging image is obtained through a human head detection algorithm, and the faces in the two images are numbered respectively. Both the face detection model and the human head detection algorithm can use an RFSong-779 target detection model; specifically, a model for detecting faces in RGB pictures is trained on RGB pictures, and a model for detecting heads in infrared pictures is trained on infrared thermal imaging pictures.
6. The infrared thermal imaging image is subjected to binarization processing in order to separate the region of interest (the infrared human head region) from the background and reduce the interference of environmental noise. Specifically, the binarization threshold is set to 28 ℃: pixel points above 28 ℃ are set to 255, and pixel points below 28 ℃ are set to 0.
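The binarization of step 6 can be sketched as follows; the function name and the NumPy array representation of the thermal image are illustrative assumptions.

```python
import numpy as np

def binarize_thermal(thermal, threshold_c=28.0):
    """Binarize a thermal image of per-pixel temperatures (step 6):
    pixels above `threshold_c` become 255, all others become 0."""
    return np.where(thermal > threshold_c, 255, 0).astype(np.uint8)
```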
7. A candidate region is obtained from the thermal imaging image processed in step 6 by using a connected component segmentation algorithm.
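A simple 4-connected component segmentation over the binarized image, as in step 7, might look like the following sketch. The breadth-first labelling shown here is one possible implementation; the patent does not specify a particular connected component algorithm, and the bounding-box output format is an assumption.

```python
import numpy as np
from collections import deque

def connected_regions(binary, min_area=1):
    """4-connected component labelling on a 0/255 binary image.
    Returns one bounding box (x_min, y_min, x_max, y_max) per component,
    each serving as a candidate head region."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    boxes = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not seen[y, x]:
                # flood-fill one component breadth-first
                q = deque([(y, x)])
                seen[y, x] = True
                ys, xs = [y], [x]
                while q:
                    cy, cx = q.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            ys.append(ny)
                            xs.append(nx)
                            q.append((ny, nx))
                if len(ys) >= min_area:
                    boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes
```

In practice a library routine such as OpenCV's connected-component analysis would serve the same purpose.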
8. Based on the candidate region of each numbered face in the current and previous infrared thermal imaging frames, a Kalman filtering tracking algorithm is adopted to predict the candidate region of each face in the next frame. Because the infrared thermal imaging image is formed with a certain delay relative to the RGB image, matching the current RGB image with the predicted next-frame infrared thermal imaging image helps improve the accuracy of face temperature measurement.
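A constant-velocity Kalman filter over the box centre is one common way to realize the tracking of step 8. The state layout, noise settings, and class name below are illustrative assumptions; the patent only specifies that a Kalman filtering tracking algorithm is used.

```python
import numpy as np

class BoxTracker:
    """Constant-velocity Kalman filter over a box centre (cx, cy).
    State x = [cx, cy, vx, vy]; predict() yields the next-frame estimate
    used to compensate the thermal camera's lag behind the RGB camera."""

    def __init__(self, cx, cy):
        self.x = np.array([cx, cy, 0.0, 0.0])
        self.P = np.eye(4) * 10.0                 # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = 1.0         # position += velocity
        self.H = np.eye(2, 4)                     # we observe (cx, cy) only
        self.Q = np.eye(4) * 0.01                 # process noise
        self.R = np.eye(2) * 1.0                  # measurement noise

    def predict(self):
        """Propagate the state one frame ahead; returns predicted (cx, cy)."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, cx, cy):
        """Correct the state with a measured centre from the current frame."""
        z = np.array([cx, cy])
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

One tracker is kept per numbered face; the predicted centre is combined with the face's last known box size to form the predicted candidate region.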
9. The area of each predicted candidate region is compared with the area of the corresponding face region detected in step 5. If the candidate region area is smaller than 1.5 times the face region area, the candidate region is left unchanged; if it is larger than or equal to 1.5 times the face region area, the binarization threshold is reset and steps 6-9 are repeated on the infrared thermal imaging image corresponding to the candidate region, until the candidate region area is smaller than 1.5 times the face region area, thereby obtaining a real candidate region. Specifically, if the candidate region area is greater than or equal to 1.5 times the face region area, the threshold is increased by 1 degree centigrade and steps 6-9 are repeated on the corresponding infrared thermal imaging image. If the candidate region is much larger than the head region detected in the infrared image, the IOU calculated in the subsequent matching with the RGB face region will be small, making it difficult to match the actual face with the candidate region. Therefore, when the area is too large, the threshold is increased to narrow the candidate region.
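The iterative threshold adjustment of step 9 can be sketched as follows, assuming the candidate region is represented by the bounding box of the above-threshold pixels; the function name and the 36 ℃ upper bound on the threshold are illustrative assumptions.

```python
import numpy as np

def refine_candidate(thermal_patch, face_area, threshold_c=28.0,
                     max_threshold_c=36.0):
    """Raise the binarization threshold by 1 degC at a time (step 9) until
    the candidate region's bounding-box area falls below 1.5x the detected
    face area. Returns (box, final_threshold) with box as
    (x_min, y_min, x_max, y_max), or (None, threshold) if nothing remains."""
    while threshold_c <= max_threshold_c:
        mask = thermal_patch > threshold_c
        if not mask.any():
            return None, threshold_c
        ys, xs = np.nonzero(mask)
        x0, x1 = xs.min(), xs.max() + 1
        y0, y1 = ys.min(), ys.max() + 1
        area = (x1 - x0) * (y1 - y0)
        if area < 1.5 * face_area:               # small enough: accept
            return (x0, y0, x1, y1), threshold_c
        threshold_c += 1.0                       # too large: tighten threshold
    return None, threshold_c
```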
10. IOU calculation is performed between the RGB face region and all predicted candidate regions, and the candidate region with the largest IOU value is taken. If that IOU value is greater than 0.5, the temperature of the candidate region is judged to be the temperature of the corresponding face in the RGB image, giving the measured face temperature; if the IOU value is less than 0.5, the following steps are performed. The IOU threshold of 0.5 is selected from long-term practical experience: if the threshold is too large, matching precision improves but fewer heads are matched; if it is too small, matching accuracy decreases.
11. When the IOU value is less than 0.5 in step 10, the detected face position differs considerably from the predicted position. There are two possible reasons: the prediction may have drifted, or an erroneous face may have been detected. In this case, the distances between the RGB face region and all predicted candidate regions are further examined; a limit of 25 pixel points is generally chosen. If the distance between the RGB face region and a candidate region is less than this limit, the face in that region of the RGB image is judged to match the candidate region. If the distance is greater than 25 pixel points, matching fails, the face is treated as undetected, and steps 5-10 are repeated to detect again.
In this embodiment, the calculation formula for performing IOU calculation on the face region detected in the RGB image and the predicted candidate region is as follows:
I=(min{yRGB_max,yH_max}-max{yRGB_min,yH_min})×(min{xRGB_max,xH_max}-max{xRGB_min,xH_min})
U=(yRGB_max-yRGB_min)×(xRGB_max-xRGB_min)+(yH_max-yH_min)×(xH_max-xH_min)-I
the subscript RGB represents the position of the face frame of the RGB image, the subscript H represents the position of the face frame of the infrared thermal imaging image, U represents the union of the two frames, and I represents the intersection of the two frames.
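The I and U formulas above, together with the matching logic of steps 10 and 11, can be sketched as follows. Boxes are (x_min, y_min, x_max, y_max); the function names and the clamping of a negative overlap to zero for non-overlapping boxes are illustrative assumptions beyond the formulas as stated.

```python
import math

def iou(box_rgb, box_h):
    """IOU between an RGB face box and a predicted thermal candidate box,
    following the patent's I and U formulas."""
    x_rgb_min, y_rgb_min, x_rgb_max, y_rgb_max = box_rgb
    x_h_min, y_h_min, x_h_max, y_h_max = box_h
    iw = min(x_rgb_max, x_h_max) - max(x_rgb_min, x_h_min)
    ih = min(y_rgb_max, y_h_max) - max(y_rgb_min, y_h_min)
    inter = max(iw, 0) * max(ih, 0)              # I (clamped when disjoint)
    union = ((x_rgb_max - x_rgb_min) * (y_rgb_max - y_rgb_min)
             + (x_h_max - x_h_min) * (y_h_max - y_h_min) - inter)  # U
    return inter / union if union > 0 else 0.0

def match_face(rgb_box, candidates, iou_thresh=0.5, dist_thresh_px=25.0):
    """Steps 10-11: take the candidate with the largest IOU; accept it if the
    IOU exceeds 0.5, otherwise fall back to the 25-pixel centre-distance test.
    Returns the matched candidate box, or None if matching fails."""
    if not candidates:
        return None
    best = max(candidates, key=lambda c: iou(rgb_box, c))
    if iou(rgb_box, best) > iou_thresh:
        return best

    def centre(b):
        return ((b[0] + b[2]) / 2.0, (b[1] + b[3]) / 2.0)

    cx, cy = centre(rgb_box)
    nearest = min(candidates,
                  key=lambda c: math.hypot(centre(c)[0] - cx, centre(c)[1] - cy))
    nx, ny = centre(nearest)
    if math.hypot(nx - cx, ny - cy) < dist_thresh_px:
        return nearest
    return None  # matching failed; steps 5-10 are repeated
```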
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.