CN110213566B - Image matching method, device, equipment and computer readable storage medium - Google Patents


Info

Publication number
CN110213566B
Authority
CN
China
Prior art keywords
depth, information, image, module, matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910427813.1A
Other languages
Chinese (zh)
Other versions
CN110213566A (en)
Inventor
徐振宾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Optical Technology Co Ltd
Original Assignee
Goertek Optical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Optical Technology Co Ltd filed Critical Goertek Optical Technology Co Ltd
Priority to CN201910427813.1A priority Critical patent/CN110213566B/en
Publication of CN110213566A publication Critical patent/CN110213566A/en
Application granted granted Critical
Publication of CN110213566B publication Critical patent/CN110213566B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/156 Mixing image signals
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image matching method, an image matching device, image matching equipment and a computer-readable storage medium. The image matching method captures images with a depth module and includes the following steps: controlling the depth module to collect a target image of a target area, and acquiring first depth information corresponding to the target image; controlling the depth module to search and identify a matching source region, and acquiring second depth information from the search and identification; and acquiring, according to the first depth information and the second depth information, a matching image that corresponds to the target image. With this technical scheme, three-dimensional depth information of an image can be captured, and images placed in different directions and orientations can be effectively identified.

Description

Image matching method, device, equipment and computer readable storage medium
Technical Field
The present invention relates to the field of image matching and recognition technologies, and in particular, to an image matching method, an image matching device, an image matching apparatus, and a computer-readable storage medium.
Background
Image matching is the task of locating a template image within another image. It has many application scenarios, such as image retrieval and target tracking. However, existing image matching algorithms can only operate on two-dimensional images: if an image is flipped top-to-bottom or left-to-right, existing algorithms cannot distinguish it effectively. This is a particular problem in the packaging industry, where the attachment direction of an instruction sheet or trademark must meet requirements; when packaged products are stacked on top of one another, matching the trademark orientation becomes more difficult, and the placement posture of the packaged products is hard to determine at any given time.
Disclosure of Invention
The main object of the present invention is to provide an image matching method, an image matching device, image matching equipment and a computer-readable storage medium, so as to solve the problem that the prior art can only perform two-dimensional recognition of images.
In order to achieve the above object, the image matching method provided by the present invention uses a depth module to capture an image, and the image matching method includes:
controlling the depth module to collect a target image of a target area, and acquiring first depth information corresponding to the target image;
controlling the depth module to search and identify a matching source region, and acquiring second depth information from the search and identification;
and acquiring a matching image correspondingly matched with the target image according to the first depth information and the second depth information.
Optionally, the step of controlling the depth module to collect a target image of a target area, and acquiring first depth information corresponding to the target image includes:
and controlling the shooting direction of the depth module and the plane of the target image to generate a first shooting inclination angle, wherein the angle range of the first shooting inclination angle is between 0 and 180 degrees.
Optionally, the controlling the depth module to collect a target image of a target area, and the step of obtaining first depth information corresponding to the target image includes:
controlling the depth module to acquire first plane information of the target image;
controlling the depth module to obtain a first depth distance of the target image;
and generating the first depth information according to the first plane information and the first depth distance.
Optionally, the step of controlling the depth module to perform search identification on the matching source region, and acquiring second depth information after the search identification includes:
controlling the shooting direction of the depth module and the plane of the matching source region to generate a second shooting inclination angle, wherein the second shooting inclination angle is the same as the first shooting inclination angle;
controlling the depth module to acquire second plane information of the matching source region;
controlling the depth module to obtain a second depth distance of the matching source region;
and generating the second depth information according to the second plane information and the second depth distance.
Optionally, the step of obtaining a matching image corresponding to the target image according to the first depth information and the second depth information includes:
combining the first depth information and the second depth information to generate combined information;
comparing and analyzing the first depth information and the combination information to obtain matching information matched with the first depth information;
and acquiring a matching image correspondingly matched with the target image according to the matching information.
In addition, in order to achieve the above object, the present invention provides an image matching apparatus for capturing an image using a depth module, the image matching apparatus including:
the control module is used for controlling the depth module to collect a target image of a target area and controlling the depth module to search and identify a matched source area;
the acquisition module is used for acquiring first depth information corresponding to the target image, acquiring second depth information after searching and identifying, and acquiring a matching image correspondingly matched with the target image according to the first depth information and the second depth information.
Optionally, the control module is further configured to control the shooting direction of the depth module and the plane of the target image to generate a first shooting inclination angle, where an angle of the first shooting inclination angle ranges from 0 ° to 180 °, and control the shooting direction of the depth module and the plane of the matching source region to generate a second shooting inclination angle, where an angle of the second shooting inclination angle is the same as that of the first shooting inclination angle;
the acquisition module is further used for controlling the depth module to acquire first plane information of the target image and a first depth distance of the target image, and controlling the depth module to acquire second plane information of the matching source region and controlling the depth module to acquire a second depth distance of the matching source region;
the image matching apparatus further includes: the generating module is configured to generate the first depth information according to the first plane information and the first depth distance, and generate the second depth information according to the second plane information and the second depth distance.
Optionally, the image matching apparatus further includes:
a combining module, configured to combine the first depth information and the second depth information to generate combined information;
the analysis module is used for comparing and analyzing the first depth information and the combination information to obtain matching information matched with the first depth information;
the acquisition module is further used for acquiring a matching image which is correspondingly matched with the target image according to the matching information.
In addition, in order to achieve the above object, the present invention further provides an image matching apparatus, comprising: a memory, a processor, and an image matching program stored on the memory and executable on the processor; the image matching program when executed by the processor implements the steps of the image matching method as described above.
Further, in order to achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon an image matching program which, when executed by a processor, implements the steps of the image matching method as described above.
With this technical scheme, the depth module searches and identifies the target area and the matching source region to obtain first depth information corresponding to the target image and second depth information corresponding to the matching source region respectively. Because the information captured by the depth module contains not only plane information but also the distance from the depth module to the corresponding position, the depth module can capture three-dimensional depth information, and images facing different directions can be effectively identified.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a flowchart illustrating an image matching method according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating an image matching method according to a second embodiment of the present invention;
FIG. 3 is a flowchart illustrating an image matching method according to a third embodiment of the present invention;
FIG. 4 is a flowchart illustrating an image matching method according to a fourth embodiment of the present invention;
FIG. 5 is a flowchart illustrating an image matching method according to a fifth embodiment of the present invention;
FIG. 6 is a schematic diagram of a target image of the image matching method of the present invention;
FIG. 7 is a schematic diagram of a source image for the image matching method of the present invention;
FIG. 8 is a diagram illustrating first plane information of the target image of FIG. 6;
FIG. 9 is a diagram illustrating a first depth distance of the target image of FIG. 6;
FIG. 10 is a diagram illustrating first depth information of the target image of FIG. 6;
FIG. 11 is a diagram illustrating second plane information of the source image of FIG. 7;
FIG. 12 is a diagram illustrating a second depth distance of the source image of FIG. 7;
FIG. 13 is a diagram illustrating second depth information of the source image of FIG. 7;
FIG. 14 is a diagram illustrating the combination information of the image matching method according to the present invention;
fig. 15 is a schematic view of a connection structure of the image matching apparatus of the present invention.
The reference numerals illustrate:
100 Control module
200 Acquisition module
300 Generation module
400 Combination module
500 Analysis module
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a first embodiment of the present invention provides an image matching method, where the image matching method uses a depth module to capture an image, and the image matching method includes:
step S10, controlling the depth module to collect a target image of a target area, and obtaining first depth information corresponding to the target image, where the depth module is a TOF (Time of flight) camera, and the depth module is based on the principle of emitting light pulses, then recording and receiving light reflected from the surface of an object by a sensor, calculating a round-trip distance of the light pulses by multiplying Time by a speed of light, and obtaining a depth distance of the object, setting the target area so as to obtain the target image accurately in Time, and taking the target image as a standard, shooting the target image by the depth module, obtaining the first depth information, and storing and recording the first depth information.
Step S20: the depth module is controlled to search and identify a matching source region, and second depth information is acquired from the search and identification. The matching source region is the object being searched and contains a source image; setting the matching source region conveniently bounds the search and identification, which helps complete it quickly and saves search time. The matching source region is photographed and searched by the depth camera, and the second depth information is stored and recorded.
Step S30: a matching image corresponding to the target image is obtained according to the first depth information and the second depth information. The second depth information contains at least two portions similar to the first depth information; that is, with the first depth information as the reference standard, the second depth information is compared and analyzed to determine which of its portions is closest to the first depth information, and the closest or identical depth information is matched with the first depth information, thereby determining the matching image corresponding to the target image.
With this technical scheme, the depth module searches and identifies the target area and the matching source region to obtain first depth information corresponding to the target image and second depth information corresponding to the matching source region respectively. Because the information captured by the depth module contains not only plane information but also the distance from the depth module to the corresponding position, the depth module can capture three-dimensional depth information, and images facing different directions can be effectively identified.
Referring to fig. 2, in the second embodiment of the present invention, before the step S10 of controlling the depth module to acquire a target image of a target area and acquiring first depth information corresponding to the target image, the method includes:
and step S01, controlling the shooting direction of the depth module and the plane where the target image is located to generate a first shooting inclination angle, wherein the angle range of the first shooting inclination angle is 0-180 degrees, and due to the generation of the first shooting inclination angle, the depth distance from the closest point to the farthest point is different when the depth module shoots the target image, so that for some complex images, the placing direction of the images can be effectively recorded, and the subsequent searching and identification of the images close to the complex images are facilitated.
Referring to fig. 3, in a third embodiment of the present invention, the step S10 of controlling the depth module to acquire a target image of a target area and acquiring first depth information corresponding to the target image includes:
step S11, the depth module is controlled to obtain first plane information of the target image, for example, the target image is set in a target area, the target image is a T-shaped image as shown in fig. 6, the target image may also be a Y-shaped image, a cross-shaped image is used as a graphic for distinguishing display, or a segment of characters or characters, the target area is divided into a long and wide 3 × 3 square as shown in fig. 8, or divided into a long and wide 3 × 4 square or a long and wide 5 × 4 square, the occupied position of the target image in the target area is marked with a numeral 1, or with a numeral 2 or 3, and the rest of the free areas are marked with a numeral 0.
Step S12: the depth module is controlled to obtain a first depth distance of the target image. On the 3 × 3 grid, each column of cells is marked by the depth module as shown in FIG. 9, for example 3, 2, 1 from left to right; the marking may also run from right to left, or row by row from top to bottom or from bottom to top. The marks are not limited to 3, 2, 1; the rule is only that each column (or row) receives a different number.
Step S13: the first depth information is generated from the first plane information and the first depth distance, specifically by multiplying them. As shown in FIG. 10, the 3 × 3 first plane information in which the target image is marked with 1 is multiplied, cell by cell, by the 3 × 3 grid of column marks 3, 2, 1 from left to right, yielding a 3 × 3 first-depth product grid, so that the target image has first depth information that encodes its orientation.
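For illustration only, steps S11 to S13 can be sketched in Python/NumPy as below. The left-to-right column weights 3, 2, 1 follow FIGS. 8 to 10 as described above, but the concrete 3 × 3 occupancy layout of the T-shaped figure is an assumed example rather than a definitive implementation:

```python
import numpy as np

def depth_info(occupancy: np.ndarray, column_weights: np.ndarray) -> np.ndarray:
    """Multiply plane information (a 0/1 occupancy grid) by per-column depth
    distances to obtain a depth-information grid (steps S13 and S24)."""
    return occupancy * column_weights  # element-wise, broadcast across rows

# First plane information (FIG. 8): T-shaped target image on a 3 x 3 grid,
# occupied cells marked 1, free cells 0 (assumed layout).
target_plane = np.array([[1, 1, 1],
                         [0, 1, 0],
                         [0, 1, 0]])

# First depth distance (FIG. 9): columns marked 3, 2, 1 from left to right.
column_weights = np.array([3, 2, 1])

# First depth information (FIG. 10): plane information x depth distance.
first_depth = depth_info(target_plane, column_weights)
print(first_depth)
# [[3 2 1]
#  [0 2 0]
#  [0 2 0]]
```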
Referring to fig. 4, in the fourth embodiment of the present invention, the step S20 of controlling the depth module to perform search recognition on the matching source region and obtain the second depth information after the search recognition includes:
and step S21, controlling the shooting direction of the depth module and the plane of the matching source region to generate a second shooting inclination angle, wherein the second shooting inclination angle and the first shooting inclination angle have the same angle, namely the range of the second shooting inclination angle is also between 0 and 180 degrees, so as to ensure that the relative states of the shooting target region and the matching source region in the three-dimensional space are the same, and the three-dimensional shape of the graph can be more effectively distinguished for some complex images.
Step S22: the depth module is controlled to obtain second plane information of the matching source region; the second plane information contains the content of the first plane information. As shown in FIG. 7, the second plane information contains four groups of images, of which the top-right group is the same as the target image. As shown in FIG. 11, the matching source region is divided into a 6 × 6 grid of squares (a 6 × 7 or 9 × 8 grid is also possible); cells occupied by images in the matching source region are marked with the numeral 1 (a 2 or 3 could equally be used), and the remaining vacant cells are marked with 0.
Step S23: the depth module is controlled to obtain a second depth distance of the matching source region. On the 6 × 6 grid, each column of cells is marked by the depth module as shown in FIG. 12, for example 3, 2, 1 from left to right; the marking may also run from right to left, or row by row from top to bottom or from bottom to top, and the marks are not limited to 3, 2, 1.
Step S24: the second depth information is generated from the second plane information and the second depth distance, specifically by multiplying them. As shown in FIG. 13, the 6 × 6 second plane information in which the images are marked with 1 is multiplied, cell by cell, by the 6 × 6 grid of column marks 3, 2, 1 from left to right, yielding a 6 × 6 second-depth product grid, so that each image in the matching source region has second depth information that encodes its orientation.
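Continuing the sketch above (and reusing its `depth_info` function), steps S22 to S24 might look as follows. The 6 × 6 occupancy grid is an assumed layout in which only the top-right quadrant holds the figure in the same orientation as the target image, and the column marks are assumed to repeat 3, 2, 1 across the six columns so that every quadrant sees the same weights as the target grid:

```python
# Second plane information (FIG. 11): 6 x 6 occupancy grid of the matching
# source region; four groups of images, top-right group identical to the
# target image (assumed layout for illustration).
source_plane = np.array([[1, 0, 0, 1, 1, 1],
                         [1, 1, 1, 0, 1, 0],
                         [1, 0, 0, 0, 1, 0],
                         [0, 1, 0, 0, 0, 1],
                         [0, 1, 0, 1, 1, 1],
                         [1, 1, 1, 0, 0, 1]])

# Second depth distance (FIG. 12): column marks assumed to repeat 3, 2, 1.
source_weights = np.array([3, 2, 1, 3, 2, 1])

# Second depth information (FIG. 13).
second_depth = depth_info(source_plane, source_weights)
```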
Referring to fig. 5 and fig. 14, in a fifth embodiment of the present invention, the step S30 of obtaining a matching image corresponding to the target image according to the first depth information and the second depth information includes:
and step S31, combining the first depth information and the second depth information to generate combined information, wherein the second depth information of the length and width 6 × 6 square map is divided into four positions, namely, upper left, upper right, lower left and lower right, and the length and width 3 × 3 first depth information square map is multiplied by the grid numbers of the four positions, namely, the upper left, the upper right, the lower left and the lower right, to obtain the combined information of the length and width 6 × 6.
Step S32: the first depth information and the combined information are compared and analyzed to obtain matching information that matches the first depth information. Comparing the first depth information with the combined information, and knowing the arrangement rule of the numbers, the numbers in the upper-right quadrant are closest to those recorded in the first depth information, so the matching information is judged to correspond to the upper-right corner.
Step S33: the matching image corresponding to the target image is obtained according to the matching information. Since the matching information corresponds to the upper-right position, the matching image is determined to lie at the upper-right position as well, so the corresponding image is identified effectively.
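Steps S31 to S33 can then be sketched as below, continuing the code above (reusing `first_depth` and `second_depth`). The scoring rule is an assumption: the embodiment only states that the quadrant whose numbers are closest to those recorded in the first depth information is taken as the match, so here each combined quadrant is compared against the element-wise square of the first depth information, which is what it would equal for a perfectly matching quadrant:

```python
def find_matching_quadrant(first_depth: np.ndarray, second_depth: np.ndarray) -> str:
    """Combine the first depth information with each 3 x 3 quadrant of the
    second depth information (step S31), score how close each combined block
    is to a perfect match (step S32), and return the best position (step S33)."""
    h, w = first_depth.shape
    quadrants = {"upper left": (0, 0), "upper right": (0, w),
                 "lower left": (h, 0), "lower right": (h, w)}
    best_name, best_score = None, None
    for name, (r, c) in quadrants.items():
        block = second_depth[r:r + h, c:c + w]
        combined = first_depth * block                               # step S31
        score = np.abs(combined - first_depth * first_depth).sum()   # step S32
        if best_score is None or score < best_score:
            best_name, best_score = name, score
    return best_name                                                 # step S33

print(find_matching_quadrant(first_depth, second_depth))  # -> "upper right"
```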
As shown in fig. 15, the present invention further provides an image matching apparatus, wherein the image matching apparatus uses a depth module to capture an image, and the image matching apparatus includes: a control module 100 and an acquisition module 200.
The control module 100 is configured to control the depth module to collect a target image of a target area and to control the depth module to search and identify a matching source region. The depth module is a TOF camera: it emits light pulses, a sensor records the light reflected from the surface of an object, the round-trip distance of the light pulses is calculated by multiplying the flight time by the speed of light, and the depth distance to the object is obtained. The target area is set so that the target image can be obtained promptly and accurately. The matching source region is the object to be searched; setting it bounds the search and identification, which helps complete the search quickly and saves time. The matching source region is photographed and searched by the depth camera.
The obtaining module 200 is configured to obtain first depth information corresponding to the target image, obtain second depth information after search and identification, respectively store and record the first depth information and the second depth information, and obtain a matching image corresponding to and matching the target image according to the first depth information and the second depth information.
With this technical scheme, the control module 100 controls the depth module to search and identify the target area and the matching source region, and the acquisition module 200 obtains first depth information corresponding to the target image and second depth information corresponding to the matching source region respectively. Because the information captured by the depth module contains not only plane information but also the distance from the depth module to the corresponding position, the depth module can capture three-dimensional depth information, and images facing different directions can be effectively identified.
Further, the control module 100 is configured to control the shooting direction of the depth module relative to the plane of the target image so as to form a first shooting tilt angle in the range of 0° to 180°, and to control the shooting direction of the depth module relative to the plane of the matching source region so as to form a second shooting tilt angle equal to the first shooting tilt angle, hence also in the range of 0° to 180°.
The obtaining module 200 is further configured to control the depth module to obtain first plane information of the target image and a first depth distance of the target image. For example, the target image is placed in a target area and is a T-shaped figure; it may also be a Y-shaped figure, a cross-shaped figure or another distinctive graphic, or a segment of text or characters. The target area is divided into a 3 × 3 grid (a 3 × 4 or 5 × 4 grid is also possible); cells occupied by the target image are marked with the numeral 1 (a 2 or 3 could equally be used), and the remaining free cells are marked with 0, which forms the first plane information. On the 3 × 3 grid, each column of cells is marked by the depth module, for example 3, 2, 1 from left to right; the marking may also run from right to left, or row by row from top to bottom or from bottom to top, and the marks are not limited to 3, 2, 1, the rule being only that each column (or row) receives a different number. This forms the first depth distance.
The obtaining module 200 is further configured to control the depth module to obtain second plane information of the matching source region and a second depth distance of the matching source region. The second plane information contains the content of the first plane information; for example, it contains four groups of images, of which the top-right group is the same as the target image. The matching source region is divided into a 6 × 6 grid (a 6 × 7 or 9 × 8 grid is also possible); cells occupied by images in the matching source region are marked with the numeral 1 (a 2 or 3 could equally be used), and the remaining vacant cells are marked with 0. To obtain the second depth distance, each column of cells on the 6 × 6 grid is marked by the depth module, for example 3, 2, 1 from left to right; the marking may also run from right to left, or row by row from top to bottom or from bottom to top, and the marks are not limited to 3, 2, 1.
The image matching apparatus further includes a generating module 300 configured to generate the first depth information from the first plane information and the first depth distance, specifically by multiplying them. For example, the 3 × 3 first plane information in which the target image is marked with 1 is multiplied, cell by cell, by the 3 × 3 grid of column marks 3, 2, 1 from left to right, yielding a 3 × 3 first-depth product grid, so that the target image has first depth information that encodes its orientation.
The generating module 300 is further configured to generate the second depth information from the second plane information and the second depth distance by multiplying them. For example, the 6 × 6 second plane information in which the images are marked with 1 is multiplied, cell by cell, by the 6 × 6 grid of column marks 3, 2, 1 from left to right, yielding a 6 × 6 second-depth product grid, so that each image in the matching source region has second depth information that encodes its orientation.
Further, the image matching apparatus further includes: a combination module 400 and an analysis module 500.
The combining module 400 is configured to combine the first depth information and the second depth information to generate combined information: the 6 × 6 second depth information grid is divided into four 3 × 3 quadrants (upper left, upper right, lower left and lower right), and the 3 × 3 first depth information grid is multiplied, cell by cell, by the numbers in each of the four quadrants, giving 6 × 6 combined information.
The analysis module 500 is configured to compare and analyze the first depth information and the combined information to obtain matching information that matches the first depth information. From the arrangement rule of the numbers, the numbers in the upper-right quadrant are closest to those recorded in the first depth information, so the matching information corresponding to the upper-right corner is determined.
The obtaining module 200 is further configured to obtain a matching image corresponding to the target image according to the matching information, and the matching image can be determined to correspond to the upper right corner position through the matching information, so that the corresponding image can be effectively identified.
The invention also provides an image matching device, which comprises: a memory, a processor, and an image matching program stored on the memory and executable on the processor; the image matching device calls an image matching program stored in a memory through a processor and executes the following operations:
controlling the depth module to collect a target image of a target area, and acquiring first depth information corresponding to the target image;
controlling the depth module to search and identify a matching source region, and acquiring second depth information from the search and identification;
and acquiring a matching image correspondingly matched with the target image according to the first depth information and the second depth information.
Further, the processor calls the image matching program stored in the memory and further executes the following operations:
and controlling the shooting direction of the depth module and the plane of the target image to generate a first shooting inclination angle, wherein the angle range of the first shooting inclination angle is between 0 and 180 degrees.
Further, the processor calls the image matching program stored in the memory and further executes the following operations:
controlling the depth module to acquire first plane information of the target image;
controlling the depth module to obtain a first depth distance of the target image;
and generating the first depth information according to the first plane information and the first depth distance.
Further, the processor calls the image matching program stored in the memory and further executes the following operations:
controlling the shooting direction of the depth module and the plane of the matching source region to generate a second shooting inclination angle, wherein the second shooting inclination angle is the same as the first shooting inclination angle;
controlling the depth module to acquire second plane information of the matching source region;
controlling the depth module to obtain a second depth distance of the matching source region;
and generating the second depth information according to the second plane information and the second depth distance.
Further, the processor calls the image matching program stored in the memory and further executes the following operations:
combining the first depth information and the second depth information to generate combined information;
comparing and analyzing the first depth information and the combination information to obtain matching information matched with the first depth information;
and acquiring a matching image correspondingly matched with the target image according to the matching information.
In this embodiment, the processor controls the depth module to search and identify the target area and the matching source region, so that first depth information corresponding to the target image and second depth information corresponding to the matching source region are obtained respectively. Because the information captured by the depth module contains not only plane information but also the distance from the depth module to the corresponding position, the depth module can capture three-dimensional depth information, and images facing different directions can be effectively identified.
The present invention also provides a computer readable storage medium having stored thereon an image matching program executable by one or more processors for:
controlling the depth module to collect a target image of a target area, and acquiring first depth information corresponding to the target image;
controlling the depth module to search and identify a matching source region, and acquiring second depth information from the search and identification;
and acquiring a matching image correspondingly matched with the target image according to the first depth information and the second depth information.
Further, the image matching program when executed by the processor further performs the following operations:
and controlling the shooting direction of the depth module and the plane of the target image to generate a first shooting inclination angle, wherein the angle range of the first shooting inclination angle is between 0 and 180 degrees.
Further, the image matching program when executed by the processor further performs the following operations:
controlling the depth module to acquire first plane information of the target image;
controlling the depth module to obtain a first depth distance of the target image;
and generating the first depth information according to the first plane information and the first depth distance.
Further, the image matching program when executed by the processor further performs the following operations:
controlling the shooting direction of the depth module and the plane of the matching source region to generate a second shooting inclination angle, wherein the second shooting inclination angle is the same as the first shooting inclination angle;
controlling the depth module to acquire second plane information of the matching source region;
controlling the depth module to obtain a second depth distance of the matching source region;
and generating the second depth information according to the second plane information and the second depth distance.
Further, the image matching program when executed by the processor further performs the following operations:
combining the first depth information and the second depth information to generate combined information;
comparing and analyzing the first depth information and the combination information to obtain matching information matched with the first depth information;
and acquiring a matching image correspondingly matched with the target image according to the matching information.
In this embodiment, the processor reads the image matching program, and the depth module searches and identifies the target area and the matching source region so as to obtain first depth information corresponding to the target image and second depth information corresponding to the matching source region respectively. Because the information captured by the depth module contains not only plane information but also the distance from the depth module to the corresponding position, the depth module can capture three-dimensional depth information, and images facing different directions can be effectively identified.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (8)

1. An image matching method, characterized in that the image matching method captures images with a depth module, the image matching method comprising the following steps:
controlling the depth module to collect a target image of a target area, and acquiring first depth information corresponding to the target image;
controlling the depth module to search and identify a matching source region, and acquiring second depth information from the search and identification;
acquiring a matching image correspondingly matched with the target image according to the first depth information and the second depth information; multiplying the first depth information and the second depth information to generate combined information; comparing and analyzing the first depth information and the combination information to obtain matching information matched with the first depth information; and acquiring a matching image correspondingly matched with the target image according to the matching information.
2. The image matching method according to claim 1, wherein the step of controlling the depth module to collect a target image of a target area and acquiring first depth information corresponding to the target image comprises:
and controlling the shooting direction of the depth module and the plane of the target image to generate a first shooting inclination angle, wherein the angle range of the first shooting inclination angle is between 0 and 180 degrees.
3. The image matching method of claim 2, wherein the step of controlling the depth module to collect a target image of a target area and acquiring first depth information corresponding to the target image comprises:
controlling the depth module to acquire first plane information of the target image;
controlling the depth module to obtain a first depth distance of the target image;
and multiplying the first plane information and the first depth distance to generate the first depth information.
4. The image matching method as claimed in claim 3, wherein the step of controlling the depth module to perform search recognition on the matching source region, and the step of obtaining the second depth information after the search recognition comprises:
controlling the shooting direction of the depth module and the plane of the matching source region to generate a second shooting inclination angle, wherein the second shooting inclination angle is the same as the first shooting inclination angle;
controlling the depth module to acquire second plane information of the matching source region;
controlling the depth module to obtain a second depth distance of the matching source region;
and multiplying the second plane information and the second depth distance to generate the second depth information.
5. An image matching device, characterized in that the image matching device captures images with a depth module, the image matching device comprising:
the control module is used for controlling the depth module to collect a target image of a target area and controlling the depth module to search and identify a matched source area;
the acquisition module is used for acquiring first depth information corresponding to the target image, acquiring second depth information after searching and identifying, and acquiring a matching image correspondingly matched with the target image according to the first depth information and the second depth information;
a combining module, configured to multiply the first depth information and the second depth information to generate combined information;
the analysis module is used for comparing and analyzing the first depth information and the combination information to obtain matching information matched with the first depth information;
the acquisition module is further used for acquiring a matching image which is correspondingly matched with the target image according to the matching information.
6. The image matching device as claimed in claim 5, wherein the control module is further configured to control the shooting direction of the depth module and the plane of the target image to generate a first shooting tilt angle, the first shooting tilt angle being in a range of 0-180 °, and to control the shooting direction of the depth module and the plane of the matching source area to generate a second shooting tilt angle, the second shooting tilt angle being the same as the first shooting tilt angle;
the acquisition module is further used for controlling the depth module to acquire first plane information of the target image and a first depth distance of the target image, and controlling the depth module to acquire second plane information of the matching source region and controlling the depth module to acquire a second depth distance of the matching source region;
the depth module-based image matching device further comprises: a generating module, configured to multiply the first plane information and the first depth distance to generate the first depth information, and multiply the second plane information and the second depth distance to generate the second depth information.
7. An image matching apparatus characterized by comprising: a memory, a processor, and an image matching program stored on the memory and executable on the processor; the image matching program when executed by the processor implements the steps of the image matching method of any of claims 1-4.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon an image matching program which, when executed by a processor, implements the steps of the image matching method according to any one of claims 1-4.
CN201910427813.1A 2019-05-20 2019-05-20 Image matching method, device, equipment and computer readable storage medium Active CN110213566B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910427813.1A CN110213566B (en) 2019-05-20 2019-05-20 Image matching method, device, equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910427813.1A CN110213566B (en) 2019-05-20 2019-05-20 Image matching method, device, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110213566A CN110213566A (en) 2019-09-06
CN110213566B (en) 2021-06-01

Family

ID=67788170

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910427813.1A Active CN110213566B (en) 2019-05-20 2019-05-20 Image matching method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110213566B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10223136C1 (en) * 2002-05-24 2003-12-24 Fraunhofer Ges Forschung Procedure for adjusting and setting the depth measurement range of a studio camera
CN106251353A (en) * 2016-08-01 2016-12-21 上海交通大学 Weak texture workpiece and the recognition detection method and system of three-dimensional pose thereof
CN106507116B (en) * 2016-10-12 2019-08-06 上海大学 A kind of 3D-HEVC coding method predicted based on 3D conspicuousness information and View Synthesis
CN107330917B (en) * 2017-06-23 2019-06-25 歌尔股份有限公司 The track up method and tracking equipment of mobile target
CN107463945B (en) * 2017-07-12 2020-07-10 浙江大学 Commodity type identification method based on deep matching network
CN110543871B (en) * 2018-09-05 2022-01-04 天目爱视(北京)科技有限公司 Point cloud-based 3D comparison measurement method

Also Published As

Publication number Publication date
CN110213566A (en) 2019-09-06

Similar Documents

Publication Publication Date Title
US10347005B2 (en) Object state identification method, object state identification apparatus, and carrier
EP3407294B1 (en) Information processing method, device, and terminal
US10699147B2 (en) Systems and methods for fast identification and processing of an image area of interest
KR102326097B1 (en) Pallet detection using units of physical length
US10163225B2 (en) Object state identification method, object state identification apparatus, and carrier
CN110869974A (en) Point cloud processing method, point cloud processing device and storage medium
Lee et al. Low-cost 3D motion capture system using passive optical markers and monocular vision
KR20160003776A (en) Posture estimation method and robot
JP2011523742A (en) Rectangle table detection using RGB and depth measurement hybrid camera sensors
JP6101134B2 (en) Information processing apparatus and information processing method
US20110150300A1 (en) Identification system and method
CN111310667A (en) Method, device, storage medium and processor for determining whether annotation is accurate
Sehgal et al. Real-time scale invariant 3D range point cloud registration
CN110706278A (en) Object identification method and device based on laser radar and camera
CN111383261A (en) Mobile robot, pose estimation method and pose estimation device thereof
CN107438863B (en) Flight positioning method and device
CN110728684B (en) Map construction method and device, storage medium and electronic equipment
CN114170521A (en) Forklift pallet butt joint identification positioning method
CN110928959A (en) Method and device for determining relationship characteristic information between entities, electronic equipment and storage medium
Sadeghi et al. Ocrapose: An indoor positioning system using smartphone/tablet cameras and OCR-aided stereo feature matching
CN110213566B (en) Image matching method, device, equipment and computer readable storage medium
CN116309882A (en) Tray detection and positioning method and system for unmanned forklift application
JP6041710B2 (en) Image recognition method
CN112338910A (en) Space map determination method, robot, storage medium and system
Figueroa et al. Development of an Object Recognition and Location System Using the Microsoft Kinect TM Sensor

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20201012

Address after: 261031, north of Jade East Street, Dongming Road, Weifang hi tech Zone, Shandong province (GoerTek electronic office building, Room 502)

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 261031 Dongfang Road, Weifang high tech Industrial Development Zone, Shandong, China, No. 268

Applicant before: GOERTEK Inc.

GR01 Patent grant
GR01 Patent grant