CN112926580B - Image positioning method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112926580B
Authority
CN
China
Prior art keywords
image
calibration
camera
position information
thermal imager
Prior art date
Legal status
Active
Application number
CN202110332251.XA
Other languages
Chinese (zh)
Other versions
CN112926580A (en)
Inventor
肖宣煜
杨帆
李若岱
陈朝军
马堃
Current Assignee
Shanghai Yuanluobu Intelligent Technology Co.,Ltd.
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd
Priority to CN202110332251.XA
Publication of CN112926580A
Application granted
Publication of CN112926580B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/22Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radiation Pyrometers (AREA)
  • Image Analysis (AREA)
  • Investigating Or Analyzing Materials Using Thermal Means (AREA)
  • Image Processing (AREA)

Abstract

The method comprises the steps of: acquiring a thermal imager image of an object to be processed acquired by a thermal imager camera, a first object image of the object to be processed acquired by a first preset camera, and a second object image of the object to be processed acquired by a second preset camera; determining first position information of a target area in the first object image and second position information of the target area in the second object image; acquiring calibration data between the thermal imager camera and the first preset camera and between the thermal imager camera and the second preset camera, respectively; determining first primary selection position information and second primary selection position information corresponding to the target area in the thermal imager image according to the calibration data, the first position information and the second position information; and determining target position information of the target area in the thermal imager image based on the first primary selection position information and the second primary selection position information. With the method and device of the present disclosure, the positioning accuracy of the region of interest in the thermal imager image can be improved.

Description

Image positioning method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer vision technologies, and in particular, to an image positioning method and apparatus, an electronic device, and a storage medium.
Background
A thermal imager camera is a device that, using infrared thermal imaging technology, converts the temperature distribution of a target object into a visible image by detecting the object's infrared radiation and performing signal processing, photoelectric conversion and other operations; it is widely applied in fields such as automatic driving and security inspection. Different colors in a thermal imager image acquired by a thermal imager camera represent different temperatures of the measured object, but because the temperatures of different areas of a measured object such as a person are relatively uniform, a region of interest (ROI) cannot be accurately distinguished in the thermal imager image. Therefore, it is desirable to provide a more effective image positioning method to improve the accuracy of locating the region of interest in the thermal imager image.
Disclosure of Invention
The disclosure provides an image positioning method and device, an electronic device and a storage medium, which can greatly improve the positioning accuracy of an interested area in a thermal imager image. The technical scheme of the disclosure is as follows:
according to an aspect of the embodiments of the present disclosure, there is provided an image positioning method, including:
acquiring a thermal imager image, a first object image and a second object image of an object to be processed, wherein the thermal imager image is acquired by a thermal imager camera in a trinocular camera, the first object image is an image of the object to be processed acquired by a first preset camera in the trinocular camera, the second object image is an image of the object to be processed acquired by a second preset camera in the trinocular camera, and the images acquired by the first preset camera and the second preset camera are images in which different areas can be distinguished based on gray scale information;
determining first position information of a target area in the first object image and second position information of the target area in the second object image;
acquiring first calibration data between the thermal imager camera and the first preset camera and second calibration data between the thermal imager camera and the second preset camera;
determining first primary selection position information corresponding to the target area in the thermal imager image according to the first calibration data and the first position information;
determining second primary selection position information corresponding to the target area in the thermal imager image according to the second calibration data and the second position information;
and determining target position information of the target area in the thermal imager image based on the first primary selection position information and the second primary selection position information.
According to the above technical scheme, by combining the first calibration data between the thermal imager camera and the first preset camera with the second calibration data between the thermal imager camera and the second preset camera, the target area of the object to be processed in the first object image and the second object image, in which different areas can be distinguished based on gray scale information, can be mapped to the thermal imager image. The target area can thereby be accurately positioned in the thermal imager image, and the positioning accuracy of the region of interest in the thermal imager image is greatly improved.
In an optional embodiment, the determining the target location information of the target area in the thermal imager image based on the first preliminary selection location information and the second preliminary selection location information includes:
determining intersection coordinate information between the first primary selection position information and the second primary selection position information;
and taking the intersection coordinate information as the target position information.
In the technical scheme, the matching points corresponding to the target areas in the first object image and the second object image can be mapped to two lines in the thermal imager image by combining the epipolar transfer principle, and the target area in the thermal imager image is accurately positioned by combining the intersection point of the two lines.
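The intersection step described above can be sketched with homogeneous coordinates: a 2D line ax + by + c = 0 is represented by the vector (a, b, c), and the intersection point of two such lines is their cross product. The sketch below is an illustrative assumption about how the two mapped lines might be intersected; the function name and representation are not taken from the patent itself.

```python
import numpy as np

def line_intersection(l1, l2):
    """Intersect two 2D lines given in homogeneous form (a, b, c),
    i.e. a*x + b*y + c = 0. In homogeneous coordinates the intersection
    point of two lines is simply their cross product."""
    p = np.cross(l1, l2)
    if abs(p[2]) < 1e-9:      # third coordinate ~ 0 means the lines are parallel
        return None
    return p[:2] / p[2]       # back to inhomogeneous pixel coordinates (x, y)

# Example: the vertical line x = 2 and the horizontal line y = 3
pt = line_intersection(np.array([1.0, 0.0, -2.0]),
                       np.array([0.0, 1.0, -3.0]))
# pt is the pixel where the two lines cross, here (2, 3)
```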
In an optional embodiment, the method further comprises:
acquiring a regional image of the target region in the thermal imager image according to the target position information;
and analyzing the area image to obtain the temperature information of the object to be processed.
According to the technical scheme, the region image of the target region in the thermal imager image can be extracted by combining the target position information of the target region in the thermal imager image, and then the temperature information of the object to be processed can be determined based on the analysis processing of the region image, so that the temperature identification accuracy of the region of interest is greatly improved.
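As a hedged illustration of this temperature-analysis step (function name, box layout and units are assumptions, not from the patent), the region image can be cropped from a per-pixel temperature map using the target position information and summarized, for example, by its maximum:

```python
import numpy as np

def roi_temperature(thermal, box):
    """Crop the target region from a thermal image and summarize its temperature.
    `thermal` is a 2D array of per-pixel temperatures (assumed to be in deg C);
    `box` is (x0, y0, x1, y1) target position information in pixel coordinates."""
    x0, y0, x1, y1 = box
    region = thermal[y0:y1, x0:x1]
    # Maximum is a common choice for body-temperature screening; mean is also shown.
    return float(region.max()), float(region.mean())

thermal = np.full((8, 8), 36.0)
thermal[2:4, 2:4] = 37.5          # warmer face-like region
tmax, tmean = roi_temperature(thermal, (2, 2, 4, 4))
```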
In an optional embodiment, the method further comprises: predetermining the first calibration data and the second calibration data;
the predetermining the first calibration data and the second calibration data comprises:
acquiring a calibration object image which is acquired by the trinocular camera and comprises at least one calibration object, wherein the calibration object comprises at least one preset pattern and a hollow area with the same size as the preset pattern, and one side of the hollow area, which is back to the trinocular camera, is provided with a temperature control device; the calibration object images comprise a first calibration image acquired by the first preset camera, a second calibration image acquired by the second preset camera and a third calibration image acquired by the thermal imager camera, and the temperature control device is used for controlling the temperature difference between the hollowed-out area and the non-hollowed-out area in the at least one calibration object;
determining the position information of the hollowed-out region in the first calibration image, the second calibration image and the third calibration image according to the relative position information between the at least one preset pattern and the hollowed-out region in the first calibration image, the second calibration image and the third calibration image;
determining first calibration data between the thermal imager camera and the first preset camera according to the position information of the hollow area in the first calibration image and the third calibration image;
and determining second calibration data between the thermal imager camera and the second preset camera according to the position information of the hollow area in the second calibration image and the third calibration image.
In the above technical scheme, at least one preset pattern and a hollowed-out area with the same size as the preset pattern are arranged in the calibration object corresponding to the calibration object images acquired by the trinocular camera, and the temperature control device is arranged on the side of the hollowed-out area facing away from the trinocular camera, so that the position information of the hollowed-out area can be accurately extracted in the calibration images acquired by the cameras of the trinocular camera. The first calibration data between the thermal imager camera and the first preset camera is determined by combining the position information of the hollowed-out area in the first calibration image and the third calibration image, and the second calibration data between the thermal imager camera and the second preset camera is determined by combining the position information of the hollowed-out area in the second calibration image and the third calibration image, so that the accuracy of the calibration data can be greatly improved and the calibration data can generalize better across application scenarios.
In an optional embodiment, the at least one calibration object includes at least two calibration objects, and the acquiring a calibration object image including the at least one calibration object acquired by the trinocular camera includes:
and acquiring a calibration object image which is acquired by the trinocular camera and comprises at least two calibration objects positioned on different planes.
In the above technical scheme, by arranging at least two calibration objects located on different planes, the amount of data available for calibration can be increased, which further improves the accuracy of the calibration data.
In an optional embodiment, the determining the first calibration data between the thermal imager camera and the first preset camera according to the position information of the hollow area in the first calibration image and the third calibration image includes:
determining a first number of pairs of matching points of the hollowed-out area in the first calibration image and the third calibration image;
expanding the position information of the first number of pairs of matching points into three-dimensional coordinate information;
and determining first calibration data between the thermal imager camera and the first preset camera based on the three-dimensional coordinate information corresponding to the first number of pairs of matching points.
According to the technical scheme, the two-dimensional coordinate information is expanded into the three-dimensional coordinate information, so that the projection relation between the cameras can be accurately determined, and then the thermal imager camera and the first preset camera are calibrated.
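The expansion of two-dimensional matching points into three-dimensional (homogeneous) coordinates can be sketched as follows. The patent does not state the concrete form of the calibration data; a common choice in epipolar geometry is a fundamental matrix F, for which the line corresponding to a point x in the other image is l = F @ x. The function names and the toy F below are assumptions for illustration only.

```python
import numpy as np

def to_homogeneous(pts):
    """Expand N two-dimensional points of shape (N, 2) into
    three-dimensional homogeneous coordinates of shape (N, 3)."""
    return np.hstack([pts, np.ones((len(pts), 1))])

def epipolar_line(F, x1):
    """Map a homogeneous point from one camera to its epipolar line
    (a, b, c) in the other image via the 3x3 calibration matrix F."""
    return F @ x1

# Toy fundamental matrix for a purely horizontal stereo pair:
# it maps a point (x, y, 1) to the horizontal line y' = y.
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
x1 = to_homogeneous(np.array([[3.0, 4.0]]))[0]
line = epipolar_line(F, x1)   # (0, -1, 4), i.e. the line y = 4
```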
In an optional embodiment, the determining the second calibration data between the thermal imager camera and the second preset camera according to the position information of the hollow area in the second calibration image and the third calibration image includes:
determining a second number of pairs of matching points of the hollowed-out area in the second calibration image and the third calibration image;
expanding the position information of the second number of pairs of matching points into three-dimensional coordinate information;
and determining second calibration data between the thermal imager camera and the second preset camera based on the three-dimensional coordinate information corresponding to the second number of pairs of matching points.
According to the technical scheme, the two-dimensional coordinate information is expanded into the three-dimensional coordinate information, so that the projection relation between the cameras can be accurately determined, and further the calibration between the thermal imager camera and the second preset camera is realized.
According to another aspect of the embodiments of the present disclosure, there is provided an image positioning apparatus including:
the system comprises a to-be-processed object image acquisition module, a first object image acquisition module and a second object image acquisition module, wherein the to-be-processed object image acquisition module is configured to acquire a thermal imager image, a first object image and a second object image of a to-be-processed object, the thermal imager image is acquired by a thermal imager camera in the three-mesh camera, the first object image is an image of the to-be-processed object acquired by a first preset camera in the three-mesh camera, the second object image is an image of the to-be-processed object acquired by a second preset camera in the three-mesh camera, and the images acquired by the first preset camera and the second preset camera are images capable of distinguishing different areas based on gray information;
a position information determination module configured to perform determining first position information of a target region in the first object image and second position information of the target region in the second object image;
a calibration data acquisition module configured to perform acquisition of first calibration data between the thermal imager camera and the first preset camera and second calibration data between the thermal imager camera and the second preset camera;
the first primary selection position information determining module is configured to determine first primary selection position information corresponding to the target area in the thermal imager image according to the first calibration data and the first position information;
the second primary selection position information determining module is configured to determine second primary selection position information corresponding to the target area in the thermal imager image according to the second calibration data and the second position information;
a target location information determination module configured to perform a determination of target location information for the target area in the thermal imager image based on the first preliminary location information and the second preliminary location information.
In an optional embodiment, the apparatus further comprises:
the area image acquisition module is configured to acquire an area image of the target area in the thermal imager image according to the target position information;
and the image analysis processing module is configured to perform analysis processing on the area image to obtain the temperature information of the object to be processed.
In an optional embodiment, the target location information determining module comprises:
an intersection coordinate information determination unit configured to perform determination of intersection coordinate information between the first preliminary selection position information and the second preliminary selection position information;
a target position information determination unit configured to take the intersection coordinate information as the target position information.
In an optional embodiment, the apparatus further comprises a calibration data determination module, the calibration data determination module comprising:
a calibration object image acquisition unit configured to perform acquisition of a calibration object image including at least one calibration object acquired by the trinocular camera, wherein the calibration object includes at least one preset pattern and a hollowed-out area having the same size as the preset pattern, and a temperature control device is arranged on one side of the hollowed-out area, which faces away from the trinocular camera; the calibration object images comprise a first calibration image acquired by the first preset camera, a second calibration image acquired by the second preset camera and a third calibration image acquired by the thermal imager camera, and the temperature control device is used for controlling the temperature difference between the hollowed-out area and the non-hollowed-out area in the at least one calibration object;
a position information determining unit configured to determine position information of the hollowed-out region in the first calibration image, the second calibration image and the third calibration image according to relative position information between the at least one preset pattern in the first calibration image, the second calibration image and the third calibration image and the hollowed-out region;
a first calibration data unit configured to determine first calibration data between the thermal imager camera and the first preset camera according to the position information of the hollow area in the first calibration image and the third calibration image;
a second calibration data unit configured to determine second calibration data between the thermal imager camera and the second preset camera according to the position information of the hollow area in the second calibration image and the third calibration image.
In an optional embodiment, the at least one calibration object comprises at least two calibration objects, and the calibration object image obtaining unit is further configured to perform obtaining calibration object images acquired by the trinocular camera and comprising at least two calibration objects located in different planes.
In an optional embodiment, the position information is two-dimensional coordinate information, and the first calibration data unit includes:
a first matching point determination unit configured to perform a determination of a first number of pairs of matching points of the hollowed-out area in the first calibration image and the third calibration image;
a first coordinate expansion unit configured to perform expansion of the position information of the first number of pairs of matching points into three-dimensional coordinate information;
a first calibration data determining unit configured to determine first calibration data between the thermal imager camera and the first preset camera based on the three-dimensional coordinate information corresponding to the first number of pairs of matching points.
In an optional embodiment, the position information is two-dimensional coordinate information, and the second calibration data unit includes:
a second matching point determination unit configured to perform a determination of a second number of pairs of matching points of the hollowed-out area in the second calibration image and the third calibration image;
a second coordinate expanding unit configured to perform expanding the position information of the second number of pairs of matching points into three-dimensional coordinate information;
a second calibration data determining unit configured to determine second calibration data between the thermal imager camera and the second preset camera based on the three-dimensional coordinate information corresponding to the second number of pairs of matching points.
According to another aspect of the embodiments of the present disclosure, there is provided an electronic device including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the method of any of the above.
According to another aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium, wherein instructions of the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform any one of the methods of the embodiments of the present disclosure.
According to another aspect of the embodiments of the present disclosure, there is provided a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the methods of the embodiments of the present disclosure described above.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
by combining first calibration data between the thermal imager camera and the first preset camera and second calibration data between the thermal imager camera and the second preset camera, a target area of an object to be processed in a first object image and a second object image which can be distinguished in different areas based on gray information can be mapped to the thermal imager image, so that the target area can be accurately positioned in the thermal imager image, and the positioning accuracy of an interested area in the thermal imager image is greatly improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a flow chart illustrating a method of image localization according to an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a method for calibrating a trinocular camera in accordance with an exemplary embodiment;
FIG. 3 is a schematic illustration of a calibration object shown in accordance with an exemplary embodiment;
FIG. 4 is a schematic illustration of a process for determining first calibration data between a thermal imager camera and a first predetermined camera based on location information of a hollow area in a first calibration image and a third calibration image, in accordance with an exemplary embodiment;
fig. 5 is a schematic flow chart illustrating a process of determining second calibration data between the thermal imager camera and a second predetermined camera according to the position information of the hollow area in the second calibration image and the third calibration image according to an exemplary embodiment;
FIG. 6 is a block diagram illustrating an image locating device according to an exemplary embodiment;
FIG. 7 is a block diagram illustrating an electronic device for image localization in accordance with an exemplary embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the disclosure, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating an image positioning method according to an exemplary embodiment. The image positioning method is applied to an electronic device such as a terminal, a server or an edge computing node and, as shown in fig. 1, includes the following steps.
Step S101: and acquiring a thermal imager image, a first object image and a second object image of the object to be processed, which are acquired by the trinocular camera.
In a particular embodiment, the trinocular camera may include a thermal imager camera, a first preset camera and a second preset camera. Optionally, the trinocular camera may be obtained by combining three independent cameras, or may be an integrated camera including the three cameras. Specifically, the images acquired by the first preset camera and the second preset camera are images in which different areas can be distinguished based on gray scale information. In a specific embodiment, the first preset camera and the second preset camera may include, but are not limited to, two of an RGB camera, an IR (infrared) camera and a gray scale camera.
In the embodiment of the present specification, the object to be processed may be different according to different actual application scenarios of the trinocular camera. In a specific embodiment, the object to be processed may be a person, for example in a scenario where a body temperature measurement is performed in combination with a trinocular camera.
In a specific embodiment, the thermal imager image may be an image of an object to be processed collected by a thermal imager camera of the trinocular camera, the first object image is an image of the object to be processed collected by a first preset camera of the trinocular camera, and the second object image is an image of the object to be processed collected by a second preset camera of the trinocular camera.
Step S103: first position information of a target area in a first object image and second position information of the target area in a second object image are determined.
In a specific embodiment, the target area may be a region of interest of the object to be processed; for example, in the case that the object to be processed is a person, the target area may be a face area. In a specific embodiment, since the first object image and the second object image can distinguish different areas based on gray scale (or color) information, the position information of the target area in the first object image and the second object image can be effectively identified directly from that information. Accordingly, the first position information may be the position information of the target area in the first object image, and the second position information may be the position information of the target area in the second object image.
Step S105: and acquiring first calibration data between the thermal imager camera and the first preset camera and second calibration data between the thermal imager camera and the second preset camera.
In a specific embodiment, the first calibration data may represent a transformation relationship between a coordinate system corresponding to the thermal imager camera and a coordinate system corresponding to the first preset camera. The second calibration data can represent a conversion relation between a coordinate system corresponding to the thermal imager camera and a coordinate system corresponding to the second preset camera.
In an alternative embodiment, the first calibration data and the second calibration data may be predetermined. Specifically, as shown in fig. 2, the step of predetermining the first calibration data and the second calibration data may include the following steps:
step S201: and acquiring a calibration object image which is acquired by the trinocular camera and comprises at least one calibration object.
In a specific embodiment, the calibration object images may include a first calibration image acquired by a first preset camera in the trinocular camera, a second calibration image acquired by a second preset camera in the trinocular camera, and a third calibration image acquired by a thermal imager camera in the trinocular camera.
In a specific embodiment, the calibration object may be a calibration plate. Specifically, the calibration object may include at least one preset pattern and a hollowed-out area having the same size as the preset pattern, and a temperature control device is disposed on the side of the hollowed-out area facing away from the trinocular camera. The temperature control device may be configured to control the temperature difference between the hollowed-out area and the non-hollowed-out area of the at least one calibration object; specifically, the temperature difference is greater than or equal to a preset temperature threshold, so that the hollowed-out area can be accurately located, and the preset temperature threshold may be preset in combination with the actual application. In a particular embodiment, the temperature control device may include, but is not limited to, a heating source, a cooling source, and the like.
In a specific embodiment, the attribute information of the preset pattern may include the size of the preset pattern and the shape of the preset pattern; in particular, the shape and size of the preset pattern may be preset in combination with the actual application. The color of the preset pattern is different from the color of the adjacent area of the calibration plate itself. In one embodiment, assuming that the preset pattern is white, the color of the calibration board itself may accordingly be black, which is clearly distinguished from white. Optionally, in order to avoid interference from the background of the calibration board during shooting, the preset pattern may be set to a pattern that is unlikely to occur in the background, such as a zigzag pattern.
In the above embodiment, the temperature control device is disposed on the side of the hollow area facing away from the trinocular camera, so that the hollow area and the non-hollow area of the calibration object have an obvious temperature difference, and the hollow area can therefore be accurately positioned in the third calibration image acquired by the thermal imager camera.
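As a minimal illustration of why such a temperature difference makes the hollow area separable in the thermal channel, the sketch below thresholds a synthetic temperature map. The function name, the 5 °C default threshold, and the toy values are assumptions for illustration only, not part of the disclosed method:

```python
import numpy as np

def locate_hot_region(thermal, ambient_temp, temp_threshold=5.0):
    """Return the bounding box (x0, y0, x1, y1) of pixels whose temperature
    differs from ambient by at least temp_threshold, or None if no pixel does."""
    mask = np.abs(thermal - ambient_temp) >= temp_threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# A toy 6x6 "thermal image" at 20 degC ambient with a warm 2x2 patch behind
# the hollow area (values are made up).
frame = np.full((6, 6), 20.0)
frame[2:4, 3:5] = 40.0
print(locate_hot_region(frame, ambient_temp=20.0))  # (3, 2, 4, 3)
```

The larger the controlled temperature difference relative to the threshold, the more robust this segmentation is to sensor noise, which is consistent with the requirement above that the difference be at least a preset temperature threshold.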
The at least one calibration object comprises at least two calibration objects, and acquiring a calibration object image comprising the at least one calibration object acquired by the trinocular camera comprises:
calibration object images acquired by the trinocular camera and comprising at least two calibration objects located on different planes are acquired.
In a specific embodiment, in order to acquire more data for calibration, calibration object images including at least two calibration objects located in different planes may be acquired by the trinocular camera. Optionally, also in order to obtain more data for calibration, calibration object images including the calibration object may be acquired by the trinocular camera from at least two preset positions. Specifically, the at least two preset positions may be positions at different distances from the calibration object.
In a specific embodiment, as shown in FIG. 3, FIG. 3 is a schematic diagram of a calibration object according to an exemplary embodiment. Specifically, fig. 3 shows four calibration objects located on different planes, where 100 is a preset pattern and 200 is a hollow area. In practical application, because the side of the hollow area facing away from the trinocular camera is provided with a temperature control device (a heating source or a cold source), the hollow area in the calibration object can be directly distinguished by the color corresponding to the heating source or cold source, for example, red, rather than by a white dotted frame. Optionally, the four calibration objects located on different planes are staggered from one another by 10 cm in depth, and optionally, images of the calibration objects can be captured at distances of 0.9 m and 1.0 m from the first calibration object (the first calibration object may be the calibration object closest to the trinocular camera).
In the above embodiment, the side of the hollow area facing away from the trinocular camera is provided with a temperature control device, so that the hollow area can be accurately positioned in the third calibration image acquired by the thermal imager camera; meanwhile, arranging at least two calibration objects located on different planes increases the data available for calibration, thereby better improving the accuracy of the calibration data.
Step S203: and determining the position information of the hollowed-out region in the first calibration image, the second calibration image and the third calibration image according to the relative position information between at least one preset pattern in the first calibration image, the second calibration image and the third calibration image and the hollowed-out region.
In a specific embodiment, at least one preset pattern in the calibration object may be used to locate the hollowed-out area. Specifically, the position information of the hollow area in the first calibration image, the second calibration image and the third calibration image may be determined by respectively combining the relative position information between at least one preset pattern in the first calibration image, the second calibration image and the third calibration image and the hollow area.
In an optional embodiment, the position information of the hollow area in the first calibration image, the second calibration image and the third calibration image may be determined by combining the relative position information between at least three preset patterns and the hollow area.
In another alternative embodiment, in a case where the relative position between the hollow area and the two preset patterns is determined, for example, the two preset patterns are placed in a diagonal line, and the hollow area is located at the lower right corner of the diagonal line. Correspondingly, the position information of the hollowed-out region in the first calibration image, the second calibration image and the third calibration image can also be determined by combining the relative position information between the two preset patterns and the hollowed-out region.
In another alternative embodiment, the relative position between the hollow area and one preset pattern may be fixed, and accordingly, the position information of the hollow area in the first calibration image, the second calibration image and the third calibration image may be determined by combining the relative position information between that one preset pattern and the hollow area.
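For the two-pattern diagonal variant described above, the hollow-area position might, for example, be extrapolated from the two pattern centers. The one-step-past-the-diagonal layout and the function name below are purely illustrative assumptions, not the layout mandated by the disclosure:

```python
def hollow_center_from_diagonal(p_top_left, p_bottom_right):
    """Given the centers (u, v) of two preset patterns placed on a diagonal,
    estimate the center of the hollow area, assumed here to sit one diagonal
    step past the lower-right pattern (a hypothetical layout)."""
    dx = p_bottom_right[0] - p_top_left[0]
    dy = p_bottom_right[1] - p_top_left[1]
    return (p_bottom_right[0] + dx, p_bottom_right[1] + dy)

# With pattern centers at (10, 10) and (30, 30), the hollow area would be
# predicted at (50, 50) under this assumed layout.
print(hollow_center_from_diagonal((10, 10), (30, 30)))  # (50, 50)
```

Because the same extrapolation rule is applied in the first, second and third calibration images, the predicted hollow-area position is consistent across all three cameras, which is what the subsequent matching-point step relies on.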
In a specific embodiment, the position information of the hollow area in the first calibration image may include, but is not limited to, coordinate information of the hollow area in a camera coordinate system corresponding to the first preset camera; the position information of the hollow area in the second calibration image may include, but is not limited to, coordinate information of the hollow area in a camera coordinate system corresponding to the second preset camera; the position information of the hollow area in the third calibration image may include, but is not limited to, coordinate information of the hollow area in a camera coordinate system corresponding to the thermal imager camera.
Step S205: and determining first calibration data between the thermal imager camera and the first preset camera according to the position information of the hollow area in the first calibration image and the third calibration image.
In a specific embodiment, the first calibration data may represent a transformation relationship between a coordinate system corresponding to the thermal imager camera and a coordinate system corresponding to the first predetermined camera.
In a specific embodiment, as shown in fig. 4, the determining the first calibration data between the thermal imager camera and the first preset camera according to the position information of the hollow area in the first calibration image and the third calibration image includes:
step S2051: determining a first number of pairs of matching points of the hollow area in the first calibration image and the third calibration image;
step S2053: expanding the position information of the first number of pairs of matching points into three-dimensional coordinate information;
step S2055: and determining first calibration data between the thermal imager camera and a first preset camera based on the three-dimensional coordinate information corresponding to the first number of pairs of matching points.
In a specific embodiment, a certain pair of matching points may be two corresponding points of a certain point in the hollow area in the first calibration image and the third calibration image. In a specific embodiment, taking the hollow area shown in fig. 3 as an example, 4 pairs of matching points corresponding to 4 vertices of the hollow area in the first calibration image and the third calibration image may be selected as the first number of pairs of matching points.
In a specific embodiment, the transformation relationship between the coordinate system corresponding to the thermal imager camera and the coordinate system corresponding to the first preset camera represents a projection relationship between the thermal imager camera and the first preset camera. In practical application, the projection relationship between the cameras is nonlinear in a two-dimensional coordinate system, and correspondingly, because the position information of the hollow-out region in the first calibration image and the third calibration image is two-dimensional coordinate information, the position information (two-dimensional coordinate information) of the first number of pairs of matching points can be expanded into three-dimensional coordinate information, and the first calibration data between the thermal imager camera and the first preset camera can be determined by combining the three-dimensional coordinate information corresponding to the first number of pairs of matching points.
In a specific embodiment, the extended coordinate data may be preset, for example, to 1. Assuming that the two-dimensional coordinates of a pair of matching points among the first number of pairs of matching points are (u1, v1) and (u2, v2), optionally, the expanded three-dimensional coordinate information may be (u1, v1, 1) and (u2, v2, 1).
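A minimal sketch of this expansion, assuming the extended coordinate is the constant 1 as in the example above (the helper name is my own):

```python
import numpy as np

def to_homogeneous(points_2d):
    """Append a fixed third coordinate of 1 to each (u, v) pixel coordinate,
    turning 2D matching points into homogeneous 3-vectors."""
    pts = np.asarray(points_2d, dtype=float)
    return np.hstack([pts, np.ones((pts.shape[0], 1))])

print(to_homogeneous([(120, 80), (240, 160)]))
# each row becomes (u, v, 1)
```

This is the standard homogeneous-coordinate lift; it is what lets the nonlinear 2D projective relationship between the cameras be expressed as a linear map on the 3-vectors.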
In a specific embodiment, determining the first calibration data between the thermal imager camera and the first preset camera based on the three-dimensional coordinate information corresponding to the first number of pairs of matching points may be performed using methods including, but not limited to, random sample consensus (RANSAC), the least squares method, and the like.
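The least-squares branch of this step can be sketched as a textbook direct linear transform (DLT) over the homogeneous point pairs. This is a generic estimator under assumed noise-free correspondences, not the patent's exact procedure; the RANSAC variant, which would wrap this estimator in an inlier-sampling loop, is omitted:

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct linear transform: estimate the 3x3 projective mapping H with
    dst ~ H @ [u, v, 1] from >= 4 point pairs, via the SVD null space."""
    rows = []
    for (u, v), (x, y) in zip(src, dst):
        rows.append([-u, -v, -1, 0, 0, 0, x * u, x * v, x])
        rows.append([0, 0, 0, -u, -v, -1, y * u, y * v, y])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)       # right singular vector of smallest value
    return H / H[2, 2]             # fix the projective scale

# Four point pairs related by a pure translation of (+5, -3), as a toy stand-in
# for the hollow-area vertex matches between two calibration images.
src = [(0, 0), (10, 0), (10, 10), (0, 10)]
dst = [(5, -3), (15, -3), (15, 7), (5, 7)]
H = estimate_homography(src, dst)
print(np.round(H, 3))
```

For the translation above, H recovers the expected [[1, 0, 5], [0, 1, -3], [0, 0, 1]] up to numerical precision; with the 4 vertex pairs mentioned in the embodiment, the system is exactly determined, and extra pairs from multiple calibration planes would be absorbed in the least-squares sense.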
In the above embodiment, the two-dimensional coordinate information is expanded into the three-dimensional coordinate information, so that the projection relationship between the cameras can be accurately determined, and calibration between the thermal imager camera and the first preset camera is further realized.
Step S207: and determining second calibration data between the thermal imager camera and a second preset camera according to the position information of the hollow area in the second calibration image and the third calibration image.
In a specific embodiment, the second calibration data may represent a transformation relationship between a coordinate system corresponding to the thermal imager camera and a coordinate system corresponding to the second preset camera.
In a specific embodiment, as shown in fig. 5, the determining the second calibration data between the thermal imager camera and the second preset camera according to the position information of the hollow area in the second calibration image and the third calibration image includes:
step S2071: determining a second number of pairs of matching points of the hollow area in the second calibration image and the third calibration image;
step S2073: expanding the position information of the second number of pairs of matching points into three-dimensional coordinate information;
step S2075: and determining second calibration data between the thermal imager camera and a second preset camera based on the three-dimensional coordinate information corresponding to the second number of pairs of matching points.
In a specific embodiment, a certain pair of matching points in the second number of pairs of matching points may be two corresponding points of a certain point in the hollow area in the second calibration image and the third calibration image. In a specific embodiment, taking the hollow area shown in fig. 3 as an example, 4 pairs of matching points corresponding to the 4 vertices of the hollow area in the second calibration image and the third calibration image may be selected as the second number of pairs of matching points.
In a specific embodiment, the transformation relationship between the coordinate system corresponding to the thermal imager camera and the coordinate system corresponding to the second preset camera represents a projection relationship between the thermal imager camera and the second preset camera. In practical application, the projection relationship between the cameras is nonlinear in a two-dimensional coordinate system, and correspondingly, because the position information of the hollow-out region in the second calibration image and the third calibration image is two-dimensional coordinate information, the position information (two-dimensional coordinate information) of the second number of pairs of matching points can be expanded into three-dimensional coordinate information, and the second calibration data between the thermal imager camera and the second preset camera can be determined by combining the three-dimensional coordinate information corresponding to the second number of pairs of matching points.
In a specific embodiment, determining the second calibration data between the thermal imager camera and the second preset camera based on the three-dimensional coordinate information corresponding to the second number of pairs of matching points may be performed using methods including, but not limited to, random sample consensus (RANSAC), the least squares method, and the like.
In the above embodiment, the two-dimensional coordinate information is expanded into the three-dimensional coordinate information, so that the projection relationship between the cameras can be accurately determined, and calibration between the thermal imager camera and the second preset camera is further realized.
In the above embodiment, at least one preset pattern and a hollow area with the same size as the preset pattern are arranged in the calibration object corresponding to the calibration object images acquired by the trinocular camera, and the temperature control device is arranged on the side of each hollow area facing away from the trinocular camera, so that the position information of the hollow areas in the calibration object images acquired by the cameras of the trinocular camera can be accurately extracted. First calibration data between the thermal imager camera and the first preset camera is determined by combining the position information of the hollow area in the first calibration image and the third calibration image, and second calibration data between the thermal imager camera and the second preset camera is determined by combining the position information of the hollow area in the second calibration image and the third calibration image, so that the accuracy of the calibration data can be greatly improved and the generalization of the calibration data across application scenarios can be improved.
Step S107: and determining first primary selection position information corresponding to the target area in the thermal imager image according to the first calibration data and the first position information.
In an alternative embodiment, the position information of a plurality of key points in the target area in the first object image and the second object image may be selected to determine the corresponding position information of the target area in the thermal imager image. Specifically, the key points may be key points capable of determining the shape of the target area, and specifically, the selection of the key points may be different according to the different shapes of the target area.
In a specific embodiment, based on the epipolar transfer principle, a straight line may be mapped on the thermal imager image by the point on the first object image and the first calibration data, and accordingly, each key point on the first object image may correspond to a line in the thermal imager image. The respective corresponding lines of the plurality of key points on the first object image in the thermal imager image can be used as first primary selection position information corresponding to the target area in the thermal imager image.
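The epipolar-transfer mapping described here is conventionally written as l = F·x, where F is a fundamental matrix. The sketch below assumes, purely for illustration, that the first calibration data takes the form of such a matrix; the toy values of F are made up:

```python
import numpy as np

def epipolar_line(F, point_uv):
    """Map a pixel (u, v) in the first object image to its epipolar line in
    the thermal imager image: l = F @ [u, v, 1], with l = (a, b, c) the
    coefficients of the line a*u + b*v + c = 0."""
    u, v = point_uv
    return F @ np.array([u, v, 1.0])

# Toy fundamental matrix standing in for the first calibration data
# (hypothetical values, chosen only to make the arithmetic easy to follow).
F = np.array([[0.0, -1.0, 50.0],
              [1.0, 0.0, -30.0],
              [-50.0, 30.0, 0.0]])
print(epipolar_line(F, (10.0, 20.0)))  # [ 30. -20. 100.]
```

Applying this to each key point of the target area in the first object image yields one line per key point in the thermal imager image, which together form the first primary selection position information described above.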
Step S109: and determining second primary selection position information corresponding to the target area in the thermal imager image according to the second calibration data and the second position information.
In a specific embodiment, based on the epipolar transfer principle, a straight line may be mapped on the thermal imager image by a point on the second object image and the second calibration data. Accordingly, each key point on the second object image may correspond to a line in the thermal imager image, and the respective corresponding lines of a plurality of key points on the second object image in the thermal imager image may be used as the second primary selection position information corresponding to the target area in the thermal imager image.
Step S111: and determining target position information of a target area in the thermal imager image based on the first primary selection position information and the second primary selection position information.
In an optional embodiment, the determining the target position information of the target area in the thermal imager image based on the first preliminary selection position information and the second preliminary selection position information includes: determining intersection coordinate information between the first primary selection position information and the second primary selection position information; and taking the intersection coordinate information as target position information.
In a specific embodiment, the intersection point of two lines of the same key point in the thermal imager image may be used as the intersection coordinate information between the first primary selection position information and the second primary selection position information, so that the target position information of the target area in the thermal imager image may be determined.
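Intersecting the two lines obtained for the same key point can be done with a homogeneous-coordinate cross product; this is a standard construction, sketched below with an assumed line format (a, b, c) and a helper name of my own:

```python
import numpy as np

def line_intersection(l1, l2):
    """Intersect two image lines given in homogeneous form (a, b, c), i.e.
    a*u + b*v + c = 0; returns pixel coordinates (u, v), or None if the
    lines are parallel (intersection at infinity)."""
    p = np.cross(l1, l2)
    if abs(p[2]) < 1e-12:
        return None
    return float(p[0] / p[2]), float(p[1] / p[2])

# The lines u = 4 (coefficients (1, 0, -4)) and v = 7 (coefficients
# (0, 1, -7)) meet at pixel (4, 7).
print(line_intersection([1.0, 0.0, -4.0], [0.0, 1.0, -7.0]))  # (4.0, 7.0)
```

Repeating this for each key point gives the intersection coordinate information, i.e. the target position information of the target area in the thermal imager image.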
In the above embodiment, by combining the epipolar transfer principle, the matching points corresponding to the target areas in the first object image and the second object image may be mapped to two lines in the thermal imager image, and the target area in the thermal imager image may be accurately located by combining the intersection point of the two lines.
In an optional embodiment, the method may further include:
acquiring a regional image of a target region in the thermal imager image according to the target position information;
and analyzing the area image to obtain the temperature information of the object to be processed.
In a specific embodiment, the region image of the target region in the thermal imager image can be extracted by combining the target position information of the target region in the thermal imager image, and then the temperature information of the object to be processed can be determined based on the analysis processing of the region image, so that the temperature identification accuracy of the region of interest is greatly improved.
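A minimal sketch of cropping the target region and reading out a temperature statistic; the inclusive bounding-box format and the choice of the maximum reading as the reported value are assumptions for illustration:

```python
import numpy as np

def region_temperature(thermal, box):
    """Crop the target region from a thermal frame (values in degC, one per
    pixel) and report its maximum reading as the temperature estimate."""
    x0, y0, x1, y1 = box  # inclusive pixel bounding box
    patch = thermal[y0:y1 + 1, x0:x1 + 1]
    return patch, float(patch.max())

# Toy 8x8 frame at 22 degC ambient with a 36.6 degC target region.
frame = np.full((8, 8), 22.0)
frame[3:5, 2:4] = 36.6
patch, t_max = region_temperature(frame, (2, 3, 3, 4))
print(t_max)  # 36.6
```

Because the box comes from the accurately located target position information, the statistic is computed only over the region of interest, which is what drives the improvement in temperature identification accuracy claimed above.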
According to the technical scheme provided by the embodiment of the specification, in the embodiment of the specification, by combining first calibration data between the thermal imager camera and the first preset camera and second calibration data between the thermal imager camera and the second preset camera, a target area of an object to be processed in a first object image and a second object image which can distinguish different areas based on gray information can be mapped to the thermal imager image, so that the target area can be accurately positioned in the thermal imager image, and the accuracy of an interest area in the thermal imager image is greatly improved.
FIG. 6 is a block diagram illustrating an image locating device according to an exemplary embodiment. Referring to fig. 6, the apparatus includes:
the to-be-processed object image acquisition module 610 is configured to acquire a thermal imager image, a first object image and a second object image of an object to be processed, wherein the thermal imager image is an image of the object to be processed, which is acquired by a thermal imager camera in the trinocular camera, the first object image is an image of the object to be processed, which is acquired by a first preset camera in the trinocular camera, the second object image is an image of the object to be processed, which is acquired by a second preset camera in the trinocular camera, and the images acquired by the first preset camera and the second preset camera are images of different areas which can be distinguished based on gray level information;
a position information determining module 620 configured to perform determining first position information of a target region in a first object image and second position information of the target region in a second object image;
a calibration data acquiring module 630 configured to perform acquiring first calibration data between the thermal imager camera and a first preset camera and second calibration data between the thermal imager camera and a second preset camera;
the first primary selection position information determining module 640 is configured to determine first primary selection position information corresponding to a target area in the thermal imager image according to the first calibration data and the first position information;
the second primary selection position information determining module 650 is configured to determine second primary selection position information corresponding to the target area in the thermal imager image according to the second calibration data and the second position information;
a target location information determination module 660 configured to perform determining target location information for a target area in the thermal imager image based on the first preliminary location information and the second preliminary location information.
In an optional embodiment, the apparatus further comprises:
the area image acquisition module is configured to acquire an area image of a target area in the thermal imager image according to the target position information;
and the image analysis processing module is configured to perform analysis processing on the area image to obtain the temperature information of the object to be processed.
In an alternative embodiment, the target location information determination module 660 includes:
an intersection coordinate information determination unit configured to perform determination of intersection coordinate information between the first preliminary selection position information and the second preliminary selection position information;
a target position information determination unit configured to perform the intersection coordinate information as the target position information.
In an optional embodiment, the apparatus further comprises a calibration data determination module, and the calibration data determination module comprises:
The calibration object image acquisition unit is configured to acquire a calibration object image which is acquired by the trinocular camera and comprises at least one calibration object, the calibration object comprises at least one preset pattern and a hollow area with the same size as the preset pattern, and the side of the hollow area facing away from the trinocular camera is provided with a temperature control device; the calibration object images comprise a first calibration image acquired by the first preset camera, a second calibration image acquired by the second preset camera and a third calibration image acquired by the thermal imager camera, and the temperature control device is used for controlling the temperature difference between the hollow area and the non-hollow area in the at least one calibration object;
the position information determining unit is configured to determine the position information of the hollow area in the first calibration image, the second calibration image and the third calibration image according to the relative position information between at least one preset pattern in the first calibration image, the second calibration image and the third calibration image and the hollow area;
the first calibration data unit is configured to determine first calibration data between the thermal imager camera and a first preset camera according to the position information of the hollow area in the first calibration image and the third calibration image;
and the second calibration data unit is configured to determine second calibration data between the thermal imager camera and the second preset camera according to the position information of the hollow area in the second calibration image and the third calibration image.
In an optional embodiment, the at least one calibration object comprises at least two calibration objects, and the calibration object image obtaining unit is further configured to perform obtaining calibration object images acquired by the trinocular camera comprising at least two calibration objects located in different planes.
In an alternative embodiment, the position information is two-dimensional coordinate information, and the first calibration data unit includes:
a first matching point determination unit configured to perform a determination of a first number of pairs of matching points of a hollowed-out area in a first calibration image and the third calibration image;
a first coordinate expansion unit configured to perform expansion of the position information of the first number of pairs of matching points into three-dimensional coordinate information;
and the first calibration data determining unit is configured to determine first calibration data between the thermal imager camera and a first preset camera based on the three-dimensional coordinate information corresponding to the first number of pairs of matching points.
In an alternative embodiment, the position information is two-dimensional coordinate information, and the second calibration data unit includes:
a second matching point determination unit configured to perform a determination of a second number of pairs of matching points of the hollowed-out area in the second calibration image and the third calibration image;
a second coordinate expanding unit configured to perform expanding the position information of the second number of pairs of matching points into three-dimensional coordinate information;
and the second calibration data determining unit is configured to determine second calibration data between the thermal imager camera and a second preset camera based on the three-dimensional coordinate information corresponding to the second number of pairs of matching points.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 7 is a block diagram illustrating an electronic device for image positioning, which may be a terminal according to an exemplary embodiment, and an internal structure thereof may be as shown in fig. 7. The electronic device comprises a processor, a memory, a network interface, a display screen and an input device which are connected through a system bus. Wherein the processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic equipment comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the electronic device is used for connecting and communicating with an external terminal through a network. The computer program is executed by a processor to implement an image localization method. The display screen of the electronic equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the electronic equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the electronic equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of some of the structures associated with the disclosed aspects and does not constitute a limitation on the electronic devices to which the disclosed aspects apply, as a particular electronic device may include more or less components than those shown, or combine certain components, or have a different arrangement of components.
In an exemplary embodiment, there is also provided an electronic device including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the image localization method as in the embodiments of the present disclosure.
In an exemplary embodiment, there is also provided a storage medium having instructions that, when executed by a processor of an electronic device, enable the electronic device to perform an image localization method in the embodiments of the present disclosure.
In an exemplary embodiment, a computer program product containing instructions is also provided, which when run on a computer, causes the computer to perform the image localization method in the embodiments of the present disclosure.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (9)

1. An image localization method, comprising:
acquiring a thermal imager image, a first object image and a second object image of an object to be processed, wherein the thermal imager image is acquired by a thermal imager camera in a trinocular camera, the first object image is acquired by a first preset camera in the trinocular camera, the second object image is acquired by a second preset camera in the trinocular camera, and the images acquired by the first preset camera and the second preset camera are images in which different areas can be distinguished based on gray information;
determining first position information of a target area in the first object image and second position information of the target area in the second object image;
acquiring first calibration data between the thermal imager camera and the first preset camera and second calibration data between the thermal imager camera and the second preset camera; the first calibration data and the second calibration data are obtained by the following method: acquiring a calibration object image which is acquired by the trinocular camera and comprises at least one calibration object, wherein the calibration object comprises at least one preset pattern and a hollowed-out area of the same size as the preset pattern, and a temperature control device is arranged on the side of the hollowed-out area facing away from the trinocular camera; the calibration object images comprise a first calibration image acquired by the first preset camera, a second calibration image acquired by the second preset camera and a third calibration image acquired by the thermal imager camera, and the temperature control device is used for controlling the temperature difference between the hollowed-out area and a non-hollowed-out area in the at least one calibration object; determining the position information of the hollowed-out area in the first calibration image, the second calibration image and the third calibration image according to the relative position information between the at least one preset pattern and the hollowed-out area in the first calibration image, the second calibration image and the third calibration image; determining the first calibration data between the thermal imager camera and the first preset camera according to the position information of the hollowed-out area in the first calibration image and the third calibration image; and determining the second calibration data between the thermal imager camera and the second preset camera according to the position information of the hollowed-out area in the second calibration image and the third calibration image;
determining first preliminary position information corresponding to the target area in the thermal imager image according to the first calibration data and the first position information;
determining second preliminary position information corresponding to the target area in the thermal imager image according to the second calibration data and the second position information;
and determining target position information of the target area in the thermal imager image based on the first preliminary position information and the second preliminary position information.
2. The image positioning method according to claim 1, wherein the determining the target position information of the target area in the thermal imager image based on the first preliminary position information and the second preliminary position information comprises:
determining intersection coordinate information between the first preliminary position information and the second preliminary position information;
and taking the intersection coordinate information as the target position information.
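As an illustrative sketch of claim 2 (assuming the preliminary position information is represented as axis-aligned boxes, which the claim does not mandate), the intersection step reduces to clipping one box against the other:

```python
def intersect_boxes(a, b):
    """Intersection of two axis-aligned boxes (x1, y1, x2, y2);
    returns None when the boxes do not overlap."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    if x1 >= x2 or y1 >= y2:
        return None  # empty intersection
    return (x1, y1, x2, y2)

print(intersect_boxes((0, 0, 10, 10), (5, 5, 15, 15)))  # → (5, 5, 10, 10)
```

Taking the intersection of the two independently projected regions tightens the localization, since errors in either camera's calibration tend to enlarge its projected region in different directions.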
3. The image positioning method according to claim 1, further comprising:
acquiring a regional image of the target region in the thermal imager image according to the target position information;
and analyzing the area image to obtain the temperature information of the object to be processed.
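A minimal sketch of the analysis step in claim 3, assuming the thermal imager image is available as a 2D array of temperature values and the target position is an integer pixel box; the statistics chosen here (mean and maximum) are illustrative, not the patent's specified analysis:

```python
import numpy as np

def region_temperature(thermal, box):
    """Mean and maximum temperature inside an integer box (x1, y1, x2, y2)
    of a 2D thermal image whose entries are temperatures in degrees C."""
    x1, y1, x2, y2 = box
    roi = thermal[y1:y2, x1:x2]           # crop the target region
    return float(roi.mean()), float(roi.max())

thermal = np.full((8, 8), 36.5)           # synthetic background temperature
thermal[2:4, 2:4] = 37.2                  # warmer target region
print(region_temperature(thermal, (2, 2, 4, 4)))  # → (37.2, 37.2)
```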
4. The image positioning method according to any one of claims 1 to 3, wherein the at least one calibration object comprises at least two calibration objects, and the acquiring the calibration object image including the at least one calibration object acquired by the trinocular camera comprises:
and acquiring a calibration object image which is acquired by the trinocular camera and comprises at least two calibration objects positioned on different planes.
5. The image positioning method according to any one of claims 1 to 3, wherein the position information is two-dimensional coordinate information, and the determining the first calibration data between the thermal imager camera and the first preset camera according to the position information of the hollowed-out area in the first calibration image and the third calibration image comprises:
determining a first number of pairs of matching points of the hollowed-out area in the first calibration image and the third calibration image;
expanding the position information of the first number of pairs of matching points into three-dimensional coordinate information;
and determining first calibration data between the thermal imager camera and the first preset camera based on the three-dimensional coordinate information corresponding to the first number of pairs of matching points.
6. The image positioning method according to any one of claims 1 to 3, wherein the position information is two-dimensional coordinate information, and the determining the second calibration data between the thermal imager camera and the second preset camera according to the position information of the hollowed-out area in the second calibration image and the third calibration image comprises:
determining a second number of pairs of matching points of the hollowed-out area in the second calibration image and the third calibration image;
expanding the position information of the second number of pairs of matching points into three-dimensional coordinate information;
and determining second calibration data between the thermal imager camera and the second preset camera based on the three-dimensional coordinate information corresponding to the second number of pairs of matching points.
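Claims 5 and 6 expand the 2D matching-point coordinates into three-dimensional coordinate information before computing the calibration data. As a simplified planar sketch only (not the patent's method), matching point pairs between two images can be used to fit a 2D affine transform by least squares; all names below are illustrative:

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares 2D affine transform mapping src points to dst points.
    src, dst: (N, 2) arrays of N >= 3 matching point pairs.
    Returns a 2x3 matrix M such that dst ~= [src | 1] @ M.T."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])     # homogeneous source coordinates
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M.T

# Synthetic pairs: scale by 2, translate by (3, 4).
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = src * 2.0 + np.array([3.0, 4.0])
M = estimate_affine(src, dst)
print(np.round(M, 6))
```

With an exact synthetic correspondence as above, the fit recovers the scale-and-translation matrix [[2, 0, 3], [0, 2, 4]]; with noisy real matching points, more pairs reduce the influence of individual localization errors.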
7. An image positioning apparatus, comprising:
a to-be-processed object image acquisition module configured to acquire a thermal imager image, a first object image and a second object image of an object to be processed, wherein the thermal imager image is an image of the object to be processed acquired by a thermal imager camera in a trinocular camera, the first object image is an image of the object to be processed acquired by a first preset camera in the trinocular camera, and the second object image is an image of the object to be processed acquired by a second preset camera in the trinocular camera;
a position information determination module configured to perform determining first position information of a target region in the first object image and second position information of the target region in the second object image;
a calibration data acquisition module configured to acquire first calibration data between the thermal imager camera and the first preset camera and second calibration data between the thermal imager camera and the second preset camera; the first calibration data and the second calibration data are obtained by the following method: acquiring a calibration object image which is acquired by the trinocular camera and comprises at least one calibration object, wherein the calibration object comprises at least one preset pattern and a hollowed-out area of the same size as the preset pattern, and a temperature control device is arranged on the side of the hollowed-out area facing away from the trinocular camera; the calibration object images comprise a first calibration image acquired by the first preset camera, a second calibration image acquired by the second preset camera and a third calibration image acquired by the thermal imager camera, and the temperature control device is used for controlling the temperature difference between the hollowed-out area and a non-hollowed-out area in the at least one calibration object; determining the position information of the hollowed-out area in the first calibration image, the second calibration image and the third calibration image according to the relative position information between the at least one preset pattern and the hollowed-out area in the first calibration image, the second calibration image and the third calibration image; determining the first calibration data between the thermal imager camera and the first preset camera according to the position information of the hollowed-out area in the first calibration image and the third calibration image; determining the second calibration data between the thermal imager camera and the second preset camera according to the position information of the hollowed-out area in the second calibration image and the third calibration image;
a first preliminary position information determination module configured to determine first preliminary position information corresponding to the target area in the thermal imager image according to the first calibration data and the first position information;
a second preliminary position information determination module configured to determine second preliminary position information corresponding to the target area in the thermal imager image according to the second calibration data and the second position information;
and a target position information determination module configured to determine target position information of the target area in the thermal imager image based on the first preliminary position information and the second preliminary position information.
8. An electronic device, comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the image positioning method of any one of claims 1 to 6.
9. A computer-readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the image positioning method of any one of claims 1 to 6.
CN202110332251.XA 2021-03-29 2021-03-29 Image positioning method and device, electronic equipment and storage medium Active CN112926580B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110332251.XA CN112926580B (en) 2021-03-29 2021-03-29 Image positioning method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112926580A CN112926580A (en) 2021-06-08
CN112926580B true CN112926580B (en) 2023-02-03

Family

ID=76176359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110332251.XA Active CN112926580B (en) 2021-03-29 2021-03-29 Image positioning method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112926580B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108921782A (en) * 2018-05-17 2018-11-30 腾讯科技(深圳)有限公司 A kind of image processing method, device and storage medium
CN110070083A (en) * 2019-04-24 2019-07-30 深圳市微埃智能科技有限公司 Image processing method, device, electronic equipment and computer readable storage medium
CN111950520A (en) * 2020-08-27 2020-11-17 重庆紫光华山智安科技有限公司 Image recognition method and device, electronic equipment and storage medium

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
CN111223139B (en) * 2018-11-26 2024-02-13 深圳市优必选科技有限公司 Target positioning method and terminal equipment
CN109584312B (en) * 2018-11-30 2020-09-11 Oppo广东移动通信有限公司 Camera calibration method, device, electronic equipment and computer-readable storage medium
CN110009687A (en) * 2019-03-14 2019-07-12 深圳市易尚展示股份有限公司 Color three dimension imaging system and its scaling method based on three cameras



Similar Documents

Publication Publication Date Title
US10129490B2 (en) Systems and approaches for thermal image corrections
CN110717942B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN113556977A (en) C-arm-based medical imaging system and method for matching 2D image with 3D space
US10771776B2 (en) Apparatus and method for generating a camera model for an imaging system
CN111583188A (en) Operation navigation mark point positioning method, storage medium and computer equipment
CN109901123B (en) Sensor calibration method, device, computer equipment and storage medium
CN111127559A (en) Method, device, equipment and storage medium for detecting marker post in optical dynamic capturing system
Yeung et al. In-situ calibration of laser/galvo scanning system using dimensional reference artefacts
CN113834571A (en) Target temperature measurement method, device and temperature measurement system
CN111652314A (en) Temperature detection method and device, computer equipment and storage medium
US20190313082A1 (en) Apparatus and method for measuring position of stereo camera
CN112926580B (en) Image positioning method and device, electronic equipment and storage medium
CN112635042B (en) Monitor calibration method, device, equipment and storage medium
Pollok et al. A visual SLAM-based approach for calibration of distributed camera networks
CN111721201A (en) Temperature detection method
JP3919722B2 (en) Skin shape measuring method and skin shape measuring apparatus
CN112241984A (en) Binocular vision sensor calibration method and device, computer equipment and storage medium
US10705217B2 (en) Controlling multiple imaging sensors
CN109727234B (en) Display panel generation method, scanning range planning method and equipment
US20230027236A1 (en) Dimensional calibration of the field-of-view of a single camera
WO2017107564A1 (en) Board image acquisition method and system
CN115457089A (en) Registration fusion method and device for visible light image and infrared image
CN112163519A (en) Image mapping processing method, device, storage medium and electronic device
Cardone et al. Warping-based co-registration of thermal infrared images: Study of factors influencing its applicability
Du et al. Grid-based matching for full-field large-area deformation measurement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240403

Address after: 200030, Units 6-77, 6th Floor, No. 1900 Hongmei Road, Xuhui District, Shanghai

Patentee after: Shanghai Yuanluobu Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 518000 Room 201, building A, 1 front Bay Road, Shenzhen Qianhai cooperation zone, Shenzhen, Guangdong

Patentee before: SHENZHEN SENSETIME TECHNOLOGY Co.,Ltd.

Country or region before: China