CN112785646A - Landing pose determining method and electronic equipment


Info

Publication number
CN112785646A
CN112785646A
Authority
CN
China
Prior art keywords
target image
target
determining
feature
points
Prior art date
Legal status
Pending
Application number
CN202110107086.8A
Other languages
Chinese (zh)
Inventor
豆森
汪权
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date / Filing date: 2021-01-26
Publication date: 2021-05-11
Application filed by Lenovo Beijing Ltd
Priority to CN202110107086.8A
Publication of CN112785646A
Legal status: Pending

Classifications

    • G06T 7/73: Image analysis; determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80: Image analysis; analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/30204: Indexing scheme for image analysis or image enhancement; subject of image; marker
    • G06T 2207/30244: Indexing scheme for image analysis or image enhancement; subject of image; camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a landing pose determining method and an electronic device, which are used to improve landing accuracy. The method includes: acquiring a target image and the imaging plane of the target image; acquiring feature information in the target image, where the feature information includes feature points in the target image and the shape of a target object in the target image; determining, according to the feature information in the target image, a target area in the target image that represents the landing position of the electronic device; determining the actual position of the target area according to the target area; and determining the pose of the electronic device at landing according to the target area of the target image, the imaging plane of the target image and the actual position of the target area. With the scheme provided by the application, the accuracy of identifying the landing area is improved; moreover, both the landing position and the attitude at landing can be obtained, which further improves landing accuracy.

Description

Landing pose determining method and electronic equipment
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to a landing pose determining method and electronic equipment.
Background
In recent years, unmanned aerial vehicles have been applied in an increasingly wide range of fields, and unmanned aerial vehicle control technology is maturing day by day; automatic landing is a key step in making unmanned aerial vehicles further automated.
At present, unmanned aerial vehicle landing schemes mainly rely on an inertial navigation system and a GPS navigation system to guide the vehicle to land. Because the inertial navigation system accumulates error over time and the inherent error of GPS is relatively large, these schemes cannot guide the unmanned aerial vehicle to land at the target point with sufficient accuracy. Therefore, how to provide a landing pose determination method that improves landing accuracy is an urgent technical problem to be solved.
Disclosure of Invention
An object of the embodiment of the application is to provide a landing pose determination method and an electronic device, which are used for improving the landing accuracy.
In order to solve the technical problem, the embodiment of the application adopts the following technical scheme: a landing pose determination method, comprising:
acquiring a target image and an imaging plane of the target image;
acquiring feature information in the target image, wherein the feature information in the target image comprises feature points in the target image and the shape of a target object in the target image;
determining a target area used for representing the landing position of the electronic equipment in the target image according to the characteristic information in the target image;
determining the actual position of the target area according to the target area;
and determining the pose of the electronic equipment when the electronic equipment lands according to the target area of the target image, the imaging plane of the target image and the actual position of the target area.
In one embodiment, the determining a target area in the target image for characterizing the landing position of the electronic device according to the feature information in the target image includes:
determining whether a target object matched with a preset shape template exists in the target image;
under the condition that a target object matched with a preset shape template exists in a target image, if the number of feature points in the target object reaches a specific number and the relative position relationship among the feature points meets a preset requirement, determining that the target object is a target area used for representing the landing position of the electronic equipment in the target image.
In one embodiment, further comprising:
deleting the target image in a case where the target image meets any one of the target conditions;
generating an instruction to re-determine the target image;
wherein the target conditions include: no target object matching the preset shape template exists in the target image; the number of feature points in the target object does not reach the specific number; and the relative positional relationship among the feature points does not meet the preset requirement.
In one embodiment, the target area is composed of a circular ring and three rectangles within the ring; the three rectangles are arranged at equal intervals, and the center point of the middle rectangle coincides with the center of the ring.
In one embodiment, further comprising:
and under the condition that the number of the characteristic points exceeds a specific number, carrying out rejection operation on the characteristic points based on a preset mode until the number of the characteristic points is the specific number.
In one embodiment, the preset manner is performed as the following steps:
selecting the feature points one by one as feature points to be eliminated;
judging whether a target feature point with a distance smaller than a preset distance from the feature point to be eliminated exists in the target image;
and under the condition that no target feature point with the distance to the feature point to be eliminated being smaller than the preset distance exists in the target image, deleting the feature point to be eliminated.
In one embodiment, in the case that there is a target feature point in the target image whose distance from the feature point to be eliminated is smaller than a preset distance, the preset manner is executed as the following steps:
acquiring gray level change parameter values of the target characteristic points and the characteristic points to be eliminated;
and deleting the feature point to be eliminated if the gray-level change parameter value of the target feature point is greater than that of the feature point to be eliminated, and otherwise deleting the target feature point.
In one embodiment, further comprising:
determining whether a feature point in the target image is located within the circle;
under the condition that the feature points in the target image are located in the circular ring, determining whether the feature points in the target image are overlapped with the corner points of three rectangles in the circular ring;
and under the condition that the feature points in the target image are overlapped with the corner points of the three rectangles, determining that the relative position relation among the feature points meets the preset requirement.
In one embodiment, determining the pose of the electronic device when landing according to the target area of the target image, the imaging plane of the target image and the actual position of the target area comprises:
determining an imaging coordinate system of the target image according to the imaging plane of the target image;
establishing a world coordinate system based on a plane where the actual position of the target area is located;
determining attitude angle information and relative position information of a world coordinate system relative to an imaging coordinate system according to the coordinates of the characteristic points in the world coordinate system and the coordinates of the characteristic points projected to the imaging plane of the target image;
and determining the target pose of the electronic equipment according to the attitude angle information and the relative position information.
The present application further provides an electronic device, comprising:
the device comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a target image and an imaging plane of the target image;
the second acquisition module is used for acquiring feature information in the target image, wherein the feature information in the target image comprises feature points in the target image and the shape of a target object in the target image;
the first determining module is used for determining a target area which is used for representing the landing position of the electronic equipment in the target image according to the characteristic information in the target image;
the second determining module is used for determining the actual position of the target area according to the target area;
and the third determining module is used for determining the pose of the electronic equipment when the electronic equipment lands according to the target area of the target image, the imaging plane of the target image and the actual position of the target area.
The application further provides an unmanned aerial vehicle, comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
acquiring a target image and an imaging plane of the target image;
acquiring feature information in the target image, wherein the feature information in the target image comprises feature points in the target image and the shape of a target object in the target image;
determining a target area used for representing the landing position of the electronic equipment in the target image according to the characteristic information in the target image;
determining the actual position of the target area according to the target area;
and determining the pose of the electronic equipment when the electronic equipment lands according to the target area of the target image, the imaging plane of the target image and the actual position of the target area.
By judging, based on the feature points of the target image and the shape of the target object in the target image, the target area in the target image that represents the landing position of the electronic device, the accuracy of identifying the landing area is improved; furthermore, with the method and device, both the landing position and the attitude at landing can be obtained, which further improves landing accuracy.
Drawings
Fig. 1 is a flowchart of a landing pose determination method according to an embodiment of the present application;
FIG. 2A is a schematic diagram including an imaging plane and an actual position of a target image for characterizing a correspondence between an imaging coordinate system and a world coordinate system;
FIG. 2B is a schematic view of the shape of the landing marker in the target area of the target image;
FIG. 2C is a flow chart of a method for determining a landing pose in another embodiment of the present application;
FIG. 3 is a flow chart of a method for determining landing pose in another embodiment of the present application;
fig. 4 is a schematic view of a landing scene including an unmanned aerial vehicle and a landing vessel;
FIG. 5 is a schematic diagram of a mapping relationship between an imaging coordinate system and a world coordinate system according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a representation of the mapping between the imaging coordinate system and the world coordinate system according to another embodiment of the present application;
FIG. 7 is a schematic diagram of a representation of the mapping between the imaging coordinate system and the world coordinate system according to another embodiment of the present application;
FIG. 8 is a schematic diagram illustrating the relative positions of the imaging plane and the actual position plane according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of the relative position of the imaging plane and the actual position plane in another embodiment of the present application;
FIG. 10 is a block diagram of an electronic device in an embodiment of the present application;
fig. 11 is a block diagram of an electronic device according to another embodiment of the present application.
Detailed Description
Various aspects and features of the present application are described herein with reference to the drawings.
It will be understood that various modifications may be made to the embodiments of the present application. Accordingly, the foregoing description should not be construed as limiting, but merely as exemplifications of embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the application.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the application and, together with a general description of the application given above and the detailed description of the embodiments given below, serve to explain the principles of the application.
These and other characteristics of the present application will become apparent from the following description of preferred forms of embodiment, given as non-limiting examples, with reference to the attached drawings.
It should also be understood that, although the present application has been described with reference to some specific examples, a person of skill in the art shall certainly be able to achieve many other equivalent forms of application, having the characteristics as set forth in the claims and hence all coming within the field of protection defined thereby.
The above and other aspects, features and advantages of the present application will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present application are described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely exemplary of the application, which can be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail, to avoid obscuring the application with unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present application in virtually any appropriately detailed structure.
The specification may use the phrases "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the application.
Fig. 1 is a flowchart of a landing pose determination method according to an embodiment of the present application, and the method includes the following steps S11 to S15:
in step S11, a target image and an imaging plane of the target image are acquired;
in step S12, feature information in the target image is acquired, wherein the feature information in the target image includes feature points in the target image and a shape of the target object in the target image;
in step S13, determining a target area in the target image for representing the landing position of the electronic device according to the characteristic information in the target image;
in step S14, determining an actual position of the target area based on the target area;
in step S15, the pose of the electronic device at the time of landing is determined from the target area of the target image, the imaging plane of the target image, and the actual position of the target area.
In this embodiment, a target image and the imaging plane of the target image are acquired; as shown in FIG. 2A, the imaging plane of the target image is the plane in which π1 lies. For example, when an unmanned aerial vehicle is about to land, it needs to photograph its landing position, and the captured image is the target image; however, whether the image captured by the unmanned aerial vehicle actually contains the landing position still needs to be confirmed. Specifically, this can be done by judging whether the target image contains a target area that represents the landing position of the electronic device. To this end, feature information in the target image is obtained, where the feature information includes the feature points in the target image and the shape of the target object in the target image; the target area representing the landing position of the electronic device in the target image is then determined according to this feature information.
specifically, the target area may be a marker previously set at the landing position; as shown in fig. 2B, the target area is composed of a circular ring and three rectangles in the circular ring; wherein, three rectangles are arranged at equal intervals, and the center point of the rectangle in the middle is coincided with the circle center of the circular ring.
On the basis of the method, the target area used for representing the landing position of the electronic equipment in the target image can be determined by the following steps:
determining whether a target object matched with a preset shape template exists in the target image; under the condition that a target object matched with the preset shape template exists in the target image, if the number of the feature points in the target object reaches a specific number and the relative position relation among the feature points meets the preset requirement, determining that the target object is a target area used for representing the landing position of the electronic equipment in the target image.
That is, when determining that the target object is a target area for representing the landing position of the electronic device in the target image, the following three conditions need to be simultaneously satisfied:
the method comprises the following steps that firstly, a target object matched with a preset shape template exists in a target image; for example, the template with a predetermined shape may be a circular ring, or may be a circular ring and three rectangles arranged at equal intervals in the circular ring; second, the number of feature points in the target object reaches a specific number, for example, if the target region is a circular ring and three equally spaced rectangles contained in the circular ring, the specific number of feature points in the target object is 12; and thirdly, enabling the relative position relation among the characteristic points to meet the preset requirement.
And when the target object simultaneously meets the three conditions, determining the target object as a target area for representing the landing position of the electronic equipment in the target image.
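A rough sketch of how such a check might be implemented is given below; it is not the detector defined by the application. The use of Hough-circle detection, goodFeaturesToTrack and all thresholds are assumptions for illustration only.

```python
import cv2
import numpy as np

def find_candidate_region(gray):
    # condition one: a ring-shaped target object (approximated here by a circle)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=100, param2=40, minRadius=20, maxRadius=0)
    if circles is None:
        return None
    cx, cy, r = circles[0, 0]

    # candidate feature points: corners with a large grey-level change
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=50, qualityLevel=0.05,
                                      minDistance=5)
    if corners is None:
        return None
    corners = corners.reshape(-1, 2)
    inside = np.array([p for p in corners if np.hypot(p[0] - cx, p[1] - cy) < r])

    # condition two: exactly 12 feature points inside the ring
    return (cx, cy, r, inside) if len(inside) == 12 else None
```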
If the target image fails any one of the above conditions (condition one, condition two, or condition three), the target image is deleted and an instruction to re-determine the target image is generated.
When selecting feature points, points with a large gray-level change under translation are usually chosen. Since the geometric shape of the target area in the present application is formed by the circular ring and the three rectangles inside it, the points with a large translational gray-level change are the points at the four corners of each rectangle; the three rectangles therefore contribute 12 such points. If the number of detected feature points is greater or smaller than 12, a wrong target area may be located. Therefore, when judging whether the target object satisfies condition two, if the number of feature points is below the specific number, the filter window can be appropriately reduced to relax the acquisition condition and keep acquiring new feature points; if this still fails, the image is discarded and re-acquired. If the number of feature points exceeds the specific number, feature points are removed in a preset manner until the specific number (namely 12) remains. Two cases are considered during removal. The first is an adjacent feature point: such a point may be caused by the light source or the camera, and the estimated feature point lies near a theoretical target feature point; the gray-level change parameter values of the two adjacent points (for example, the variance of the gray-level change between each point and its surroundings) are therefore compared, the point with the larger value is kept, and the point with the smaller value is removed. The second is an isolated feature point: such a point may be caused by a pixel-level interference source in the target area and does not lie near any theoretical target feature point, so it can be removed based on whether any other feature point exists near it.
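A minimal sketch of the isolated-point rule described above, assuming a simple Euclidean-distance test; the function name and the 10-pixel threshold are illustrative only.

```python
import numpy as np

def drop_isolated(points, preset_distance=10.0):
    """Delete points that have no other feature point within preset_distance."""
    points = np.asarray(points, dtype=float)
    kept = []
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        d[i] = np.inf                       # ignore the point itself
        if d.min() < preset_distance:       # a neighbour exists, so keep the point
            kept.append(p)
    return np.array(kept)
```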
It should be noted that, in this embodiment, for convenience of introduction and subsequent calculation, the feature points are labeled in the following specific labeling manner:
An image origin is established, and the two points farthest from the origin are found and determined to be feature point No. 1 and feature point No. 12 (the distances are measured in the image). The midpoint of these two feature points is taken as the tail point of a reference vector, and the reference vector is determined by combining the two points.
Assume that the homogeneous coordinates of the 12 feature points extracted from the image are P1(u1, v1, 1), P2(u2, v2, 1), …, P12(u12, v12, 1);
where P1 to P12 are the coordinates, in the pixel coordinate system, of the feature points screened from the image; since all the feature points lie in one plane, they are represented by coordinates of the form (u, v, 1).
The homogeneous coordinate of the feature-point origin is Ocorner(u0, v0, 1), and the end point of the reference vector is Plast(ulast, vlast, 1).
Namely, the vector of each feature point is respectively:
OPi = Pi - Ocorner (i = 1, 2, …, 12),  C1O = Plast - Ocorner
where C1O denotes the reference vector, and Ocorner represents the pixel center of the target area, which is used as the reference point for subsequent rectification and for marking the valid feature points.
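The construction of the reference vector is only partially recoverable from the text above; the following sketch is one possible reading (an assumption on my part), taking the midpoint of the two corners farthest from the image origin as the tail of the vector and pointing it at the pixel centre Ocorner of the target area.

```python
import numpy as np

def reference_vector(corner_points, o_corner):
    pts = np.asarray(corner_points, dtype=float)
    d = np.linalg.norm(pts, axis=1)              # distance from the image origin
    far_two = pts[np.argsort(d)[-2:]]            # the two farthest corners (No. 1 and No. 12)
    tail = far_two.mean(axis=0)                  # their midpoint, used as the tail point
    return np.asarray(o_corner, float) - tail    # vector from the tail towards the centre
```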
The information amount of each feature point is expressed as follows:
sin θi = (C1O × OPi) / (|C1O| · |OPi|),  i = 1, 2, …, 12
where θ1, θ2, …, θ12 respectively denote the rotation angles from the reference vector to the 1st, 2nd, …, 12th feature points, taking the Z axis as the rotation axis and following the right-hand rule. In this way the sine value for each feature point is obtained, but the specific rotation angle cannot be determined unambiguously from the sine alone; the cosine must therefore also be evaluated:
cos θi = (C1O · OPi) / (|C1O| · |OPi|)
Rearranging gives:
tan θi = sin θi / cos θi
Based on the sine, cosine and tangent information, the rotation angle (in the range 0-360°) between each vector (from the central point to each of the 12 points) and the reference vector can be obtained, so the clockwise rotation angle from the reference vector to each feature point is known with certainty. As shown in fig. 2B, the feature points are numbered according to the angle: the smaller the clockwise rotation angle from the reference vector to a feature point, the larger its number. That is, the feature point with the smallest clockwise rotation angle from the reference vector is numbered 12, the feature point with the largest clockwise rotation angle is numbered 1, and so on. Once the numbers of all the feature points are determined, the marking of the feature points is complete.
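The numbering step can be sketched as below; the use of atan2 for the signed angle and the clockwise sign convention are my assumptions, chosen to match the description above.

```python
import numpy as np

def number_by_clockwise_angle(points, o_corner, ref_vec):
    points = np.asarray(points, dtype=float)
    ref = np.asarray(ref_vec, dtype=float)

    def clockwise_angle(v):
        # signed angle from ref to v (counter-clockwise positive), then flipped and
        # wrapped into [0, 360) so that it measures the clockwise rotation
        cross = ref[0] * v[1] - ref[1] * v[0]
        ang = np.degrees(np.arctan2(cross, np.dot(ref, v)))
        return (-ang) % 360.0

    angles = [clockwise_angle(p - o_corner) for p in points]
    order = np.argsort(angles)                # ascending clockwise angle
    labels = np.empty(len(points), dtype=int)
    for rank, idx in enumerate(order):
        labels[idx] = 12 - rank               # smallest angle gets number 12
    return labels
```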
In addition, when determining whether or not the target object satisfies the third condition, it is necessary to determine the positions and the positional relationships of the feature points, specifically:
determining whether the feature points in the target image are located within the ring; in a case where the feature points in the target image are located within the ring, determining whether the feature points in the target image coincide with the corner points of the three rectangles within the ring; and in a case where the feature points in the target image coincide with the corner points of the three rectangles, determining that the relative positional relationship among the feature points meets the preset requirement.
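A compact sketch of this condition-three check, assuming the expected corner positions of the three rectangles are available (for example from the marker design) and using a pixel tolerance that is purely illustrative:

```python
import numpy as np

def relative_positions_ok(corners, ring_center, ring_radius, expected_corners, tol=3.0):
    corners = np.asarray(corners, dtype=float)
    ring_center = np.asarray(ring_center, dtype=float)
    expected_corners = np.asarray(expected_corners, dtype=float)
    # every feature point must lie inside the ring
    if np.any(np.linalg.norm(corners - ring_center, axis=1) >= ring_radius):
        return False
    # every feature point must coincide (within tol pixels) with a rectangle corner
    for p in corners:
        if np.min(np.linalg.norm(expected_corners - p, axis=1)) > tol:
            return False
    return True
```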
The actual position of the target area is then determined according to the target area; specifically, the actual position may be the position of the plane in which π2 lies in FIG. 2A, as described above.
The pose of the electronic device at landing is then determined according to the target area of the target image, the imaging plane of the target image, and the actual position of the target area.
The target area of the target image is the area appearing in the image as shown in FIG. 2B, the imaging plane of the target image is the plane in which π1 lies in FIG. 2A, and the actual position may be the position of the π2 plane in FIG. 2A, as described above. The pose of the electronic device at landing is determined from these three.
In one embodiment, the above step S13 can be implemented as the following steps A1-A2:
in step a1, determining whether a target object matching a preset shape template exists in the target image;
in step a2, in a case that a target object matching the preset shape template exists in the target image, if the number of feature points in the target object reaches a specific number and the relative position relationship between the feature points meets a preset requirement, determining that the target object is a target area in the target image for representing the landing position of the electronic device.
In this embodiment, when determining that the target object is a target area for representing a landing position of the electronic device in the target image, the following three conditions need to be simultaneously satisfied:
the method comprises the following steps that firstly, a target object matched with a preset shape template exists in a target image; for example, the template with a predetermined shape may be a circular ring, or may be a circular ring and three rectangles arranged at equal intervals in the circular ring; second, the number of feature points in the target object reaches a specific number, for example, if the target region is a circular ring and three equally spaced rectangles contained in the circular ring, the specific number of feature points in the target object is 12; and thirdly, enabling the relative position relation among the characteristic points to meet the preset requirement.
And when the target object simultaneously meets the three conditions, determining the target object as a target area for representing the landing position of the electronic equipment in the target image.
In one embodiment, the method may also be implemented as steps B1-B2:
in step B1, in a case where the target image meets any one of the target conditions, the target image is deleted;
in step B2, an instruction to re-determine the target image is generated;
wherein the target conditions include: a target object matched with the preset shape template does not exist in the target image; the number of feature points in the target object does not reach a certain number; the relative position relation among the characteristic points does not meet the preset requirement.
In the present embodiment, when the target image fails any one of the above conditions (condition one, condition two, or condition three), the target image is deleted and an instruction to re-determine the target image is generated.
In one embodiment, the target area is composed of a circular ring and three rectangles within the ring; the three rectangles are arranged at equal intervals, and the center point of the middle rectangle coincides with the center of the ring.
In one embodiment, the method may also be implemented as the steps of:
and under the condition that the number of the characteristic points exceeds a specific number, carrying out rejection operation on the characteristic points based on a preset mode until the number of the characteristic points is the specific number.
In the present embodiment, when the geometric shape of the target area is composed of a circular ring and three rectangles located within the ring, the points having a large gray-level change under translation are the points at the four corners of each rectangle, so the number of such feature points is 12. When the number of detected feature points exceeds this specific number, feature points are removed in a preset manner until the specific number (namely 12) remains. Two cases are considered during removal: one is the adjacent type of feature point and the other is the isolated type of feature point.
In one embodiment, the preset manner is performed as the following steps C1-C3:
in step C1, selecting feature points one by one as feature points to be eliminated;
in step C2, it is determined whether there is a target feature point in the target image whose distance from the feature point to be removed is smaller than a preset distance;
in step C3, in the case that there is no target feature point in the target image whose distance from the feature point to be removed is smaller than the preset distance, the feature point to be removed is deleted.
This procedure removes isolated feature points. Such a point may be caused by a pixel-level interference source in the target area and does not lie near any theoretical target feature point, so it can be removed based on whether any other feature point exists near it.
In one embodiment, in the case that there is a target feature point in the target image whose distance from the feature point to be rejected is less than a preset distance, the preset manner is performed as the following steps D1-D2:
in step D1, obtaining gray scale change parameter values of the target characteristic points and the characteristic points to be eliminated;
in step D2, when the value of the grayscale variation parameter of the target feature point is greater than the value of the grayscale variation parameter of the feature point to be removed, the feature point to be removed is deleted, otherwise, the target feature point is deleted.
This procedure removes adjacent feature points. Such points may be caused by the light source or the camera, and the estimated feature point lies near a theoretical target feature point; the gray-level change parameter values of the two adjacent points (for example, the variance of the gray-level change between each point and its surroundings) are therefore compared, the point with the larger value is kept, and the point with the smaller value is removed.
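A small sketch of the comparison just described, using the variance of a square neighbourhood as the grey-level change parameter; the window size is an assumption.

```python
import numpy as np

def local_gray_variance(gray, pt, win=3):
    x, y = int(round(pt[0])), int(round(pt[1]))
    patch = gray[max(y - win, 0):y + win + 1, max(x - win, 0):x + win + 1]
    return float(patch.var())

def keep_stronger(gray, p, q):
    """Of two adjacent candidate corners, keep the one with the larger local variance."""
    return p if local_gray_variance(gray, p) > local_gray_variance(gray, q) else q
```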
In one embodiment, as shown in FIG. 2C, the method may also be implemented as the following steps S21-S23:
in step S21, it is determined whether the feature point in the target image is located within the circle;
in step S22, in a case where the feature point in the target image is located within the ring, it is determined whether the feature point in the target image coincides with each corner point of three rectangles within the ring;
in step S23, in the case where the feature points in the target image coincide with the respective corner points of the three rectangles, it is determined that the relative positional relationship between the respective feature points meets a preset requirement.
In one embodiment, as shown in FIG. 3, the above step S15 can be implemented as the following steps S31-S34:
in step S31, determining an imaging coordinate system of the target image from the imaging plane of the target image;
in step S32, a world coordinate system is established based on the plane in which the actual position of the target area is located;
in step S33, determining attitude angle information and relative position information of the world coordinate system with respect to the imaging coordinate system based on the coordinates of the feature point in the world coordinate system and the coordinates of the feature point projected onto the imaging plane of the target image;
in step S34, a target pose of the electronic device is determined from the pose angle information and the relative position information.
In this embodiment, the target area of the target image is the area appearing in the image as shown in FIG. 2B, the imaging plane of the target image is the plane in which π1 lies in FIG. 2A, and the actual position may be the position of the π2 plane in FIG. 2A, as described above. The pose of the electronic device at landing is determined from these three.
Fig. 4 shows a ship scene in which an unmanned aircraft is about to land on a marker on a ship. A coordinate system as shown in fig. 2A then needs to be constructed; that is, the imaging coordinate system of the target image is determined from the imaging plane of the target image (the plane in which π1 lies); a world coordinate system is established based on the plane in which the actual position of the target area lies (the plane in which π2 lies); attitude angle information and relative position information of the world coordinate system with respect to the imaging coordinate system are determined from the coordinates of the feature points in the world coordinate system and the coordinates of the feature points projected onto the imaging plane of the target image; and the target pose of the electronic device is determined from the attitude angle information and the relative position information.
The specific steps are as follows: a 3 x 3 relative-pose matrix model is established, that is, a model of the relative position information between the actual coordinate points and the feature points:
| X1pw  X2pw  ...  X12pw |   | h11  h12  h13 |   | X1P  X2P  ...  X12P |
| Y1pw  Y2pw  ...  Y12pw | = | h21  h22  h23 | · | Y1P  Y2P  ...  Y12P |
|  1     1    ...    1   |   | h31  h32   1  |   |  1    1   ...   1   |
This scheme provides 12 sets of matched feature information, where the leftmost matrix (the matrix containing the elements X1pw, X2pw, …) is formed by the feature points in the world coordinate system, the rightmost matrix (the matrix containing the elements X1P, X2P, …) is formed by the corresponding points in the imaging coordinate system of the target image, and the middle matrix (the matrix containing the elements h11, h12, …) is the homography matrix, i.e. the matrix representing the correspondence between the elements of the world coordinate system and the elements of the imaging coordinate system.
When the corresponding elements of the world coordinate system and the imaging coordinate system are known, the quantity to be solved is the middle homography matrix (the matrix containing h11, h12, …); that is, the purpose of the model is to solve for the parameter values h11 to h32.
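As an illustration, the parameters h11 to h32 can be estimated from the 12 correspondences with OpenCV's findHomography; this stands in for whatever solver the application intends and is only a sketch.

```python
import cv2
import numpy as np

def estimate_homography(image_uv, world_xy):
    # image_uv: 12x2 pixel coordinates; world_xy: 12x2 marker-plane coordinates (Z = 0)
    H, _ = cv2.findHomography(np.asarray(image_uv, np.float32),
                              np.asarray(world_xy, np.float32), method=0)
    if H is None:
        return None
    return H / H[2, 2]          # normalise so that h33 = 1, leaving h11..h32
```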
After the parameter values h11 to h32 are obtained, note that the imaging plane is, in theory, parallel to the horizontal plane, whereas the plane in which the actual landing position lies usually forms a certain angle with the imaging plane, so the relationship between the two must be calculated. Specifically, the angular relationship and the positional relationship between the imaging coordinate system and the world coordinate system about the three coordinate axes are calculated, so that the device to be landed can determine by how large an angle it must rotate about each of the three coordinate axes to align with the actual position corresponding to the target area, and by how large a distance it must be offset along each of the three coordinate axes to land on that actual position. The specific steps are as follows:
First, for simplicity of description, the imaging plane is referred to as the π1 plane and the plane of the actual position is referred to as the π2 plane. As shown in FIG. 5, the x-axis of the π1 plane is extended until it falls in the π2 plane as a straight line L1; the relative angle between them is φ.
According to the relative position information model of the actual coordinate points and the feature points, the following can be determined:
[equation image]
where Xw is the X axis of the world coordinate system, Yw is the Y axis of the world coordinate system, the Z axis of the world coordinate system is set to 0, Xp is the X axis of the imaging coordinate system, and the middle matrix is the inverse of the homography matrix described above. Letting f be the focal length of the camera, then:
[equation image]
From this formula φ can be obtained; the angle φ is the angle by which the device to be landed (such as an unmanned aerial vehicle or a helicopter) needs to rotate about the X axis.
Similarly, as shown in FIG. 6, the y-axis of the π1 plane is extended until it falls in the π2 plane as a straight line L2, at a relative angle θ. Based on the relative position information model of the actual coordinate points and the feature points, it can be determined that:
[equation image]
where Xw is the X axis of the world coordinate system, Yw is the Y axis of the world coordinate system, the Z axis of the world coordinate system is set to 0, Yp is the Y axis of the imaging coordinate system, and the middle matrix is the inverse of the homography matrix described above. Letting f be the focal length of the camera, then:
[equation image]
Since the remaining parameters are all known, the relative angle θ, i.e. the angle by which the device to be landed needs to rotate about the Y axis, can be determined.
It will be understood that the relative angle ψ shown in fig. 7 is the angle by which the unmanned aerial vehicle to be landed needs to rotate about the Z axis. The relative angle ψ is solved as follows:
[equation image]
Lw = (h11, h12, h13)T
Finally, the slope of Lw in homogeneous coordinates is obtained, and the angle between Lw and the Yw axis, which lie in the same plane, is the relative yaw angle ψ.
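The application derives φ, θ and ψ explicitly from the homography; as a rough cross-check only, OpenCV can also decompose the homography into candidate rotations and translations given the camera intrinsics K.

```python
import cv2
import numpy as np

def candidate_rotations(H, K):
    # returns the candidate 3x3 rotation matrices; the physically valid one still
    # has to be selected, e.g. by requiring the marker plane to face the camera
    n_solutions, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    return rotations
```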
After the angles by which the device to be landed needs to be adjusted about the three coordinate axes are obtained, the distances by which it needs to be offset along the three coordinate axes are calculated, specifically as follows:
FIG. 8 is a diagram of the relative positions of the imaging plane (the π1 plane of the image coordinate system) and the plane of the actual position (the π2 plane of the world coordinate system). The Xw axis of the π2 plane of the world coordinate system projects onto the π1 plane of the image coordinate system as the straight line l. The optical axis through Ow intersects the π1 plane at the point O'w. Extending the straight line l, the ray from the light-source point Oc meets it at the point p, where Oc p is parallel to the Xw axis, i.e. p is the image of the point of the π2 plane at infinity. The line through the light-source point parallel to the straight line l meets the extension of the Xw axis at the point q; since the straight line l and the Xw axis are coplanar, q is a point at infinity only when the straight line l and the Xw axis are parallel. The correspondence between the imaging coordinate system and the world coordinate system is then:
[equation image]
From the coordinates of Ow in the world coordinate system, (0, 0, 1)T, the coordinates of the corresponding point O'w in the image coordinate system are as follows:
[equation images]
light source point OcTo O'wThe distances of (a) are as follows:
Figure BDA0002917591880000183
light source point OcThe distances to p are as follows:
Figure BDA0002917591880000184
let the coordinate of the q point be
Figure BDA0002917591880000185
And is thrown through q pointsShadow to pi1At infinity of the plane, so according to the homography:
Figure BDA0002917591880000186
Figure BDA0002917591880000187
Using the distance parameters calculated above, the following equations can be obtained:
[equation images]
as shown in FIG. 9, per OwPerpendicular to ZcThe straight line of the axes is parallel to the point O', then OcO' is the origin O under the world coordinate systemwAlong Z in the camera coordinate systemcPosition information of the direction, i.e., depth information.
Figure BDA0002917591880000192
Then, using the conversion relationship between the image coordinate system and the camera coordinate system calculated in the previous step:
[equation image]
The position information of Ow along the other two axes of the camera coordinate system can then be solved:
[equation image]
Substituting
[equation image]
gives:
[equation image]
according to internal parameters of the calibration camera, three attitude angle information and position distance information of the world coordinate system relative to the camera coordinate system can be solved by coordinates of 12 target feature points in the world coordinate system and coordinates of the target feature points transmitted to the image coordinate system. Xc and Yc are distances needing to be offset on an X axis and a Y axis of the equipment to be landed respectively, and the depth information Zc is the distance needing to be offset on a Z axis of the equipment to be landed, namely the landing height.
Fig. 10 is a block diagram of an electronic device according to an embodiment of the present application, and as shown in fig. 10, the electronic device includes the following modules:
a first obtaining module 101, configured to obtain a target image and an imaging plane of the target image;
the second obtaining module 102 is configured to obtain feature information in a target image, where the feature information in the target image includes feature points in the target image and a shape of a target object in the target image;
the first determining module 103 is configured to determine a target area in the target image, which is used for representing a landing position of the electronic device, according to the feature information in the target image;
a second determining module 104, configured to determine an actual position of the target area according to the target area;
and the third determining module 105 is configured to determine the pose of the electronic device when the electronic device lands according to the target area of the target image, the imaging plane of the target image, and the actual position of the target area.
In one embodiment, the first determining module 103 includes:
a first determining sub-module 111, configured to determine whether a target object matching the preset shape template exists in the target image;
the second determining sub-module 112 is configured to, in a case that a target object matching the preset shape template exists in the target image, determine that the target object is a target area in the target image, where the target area is used for representing a landing position of the electronic device, if the number of feature points in the target object reaches a specific number and a relative position relationship between the feature points meets a preset requirement.
In one embodiment, further comprising:
the deleting module is used for deleting the target image under the condition that the target image does not meet any target condition;
a generation module for generating an instruction to re-determine the target image;
wherein the target conditions include: a target object matched with the preset shape template does not exist in the target image; the number of feature points in the target object does not reach a certain number; the relative position relation among the characteristic points does not meet the preset requirement.
In one embodiment, the target area is composed of a circular ring and three rectangles within the ring; the three rectangles are arranged at equal intervals, and the center point of the middle rectangle coincides with the center of the ring.
In one embodiment, further comprising:
and the rejecting module is used for rejecting the feature points based on a preset mode under the condition that the number of the feature points exceeds the specific number until the number of the feature points is the specific number.
In one embodiment, the predetermined manner is performed as the following steps:
selecting the feature points one by one as feature points to be eliminated;
judging whether a target characteristic point with a distance smaller than a preset distance from the characteristic point to be eliminated exists in the target image;
and under the condition that the target image does not have the target feature point with the distance to the feature point to be removed smaller than the preset distance, deleting the feature point to be removed.
In one embodiment, in the case that there is a target feature point in the target image whose distance from the feature point to be rejected is less than a preset distance, the preset manner is executed as the following steps:
acquiring gray level change parameter values of the target characteristic points and the characteristic points to be eliminated;
and deleting the characteristic points to be eliminated under the condition that the gray scale change parameter values of the target characteristic points are larger than the gray scale change parameter values of the characteristic points to be eliminated, otherwise deleting the target characteristic points.
In one embodiment, further comprising:
the fourth determining module is used for determining whether the characteristic point in the target image is positioned in the circular ring;
a fifth determining module, configured to determine whether a feature point in the target image coincides with each corner point of three rectangles in the ring under the condition that the feature point in the target image is located in the ring;
and the sixth determining module is used for determining that the relative position relation among the characteristic points meets the preset requirement under the condition that the characteristic points in the target image are overlapped with the corner points of the three rectangles.
In one embodiment, the third determining module includes:
the third determining submodule is used for determining an imaging coordinate system of the target image according to the imaging plane of the target image;
the establishing submodule is used for establishing a world coordinate system based on a plane where the actual position of the target area is located;
the fourth determining submodule is used for determining attitude angle information and relative position information of the world coordinate system relative to the imaging coordinate system according to the coordinates of the characteristic points in the world coordinate system and the coordinates of the characteristic points projected to the imaging plane of the target image;
and the fifth determining submodule is used for determining the target pose of the electronic equipment according to the pose angle information and the relative position information.
The application further provides an unmanned aerial vehicle, comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
the method corresponding to any one of the above embodiments is executed.
The above embodiments are only exemplary embodiments of the present application, and are not intended to limit the present application, and the protection scope of the present application is defined by the claims. Various modifications and equivalents may be made by those skilled in the art within the spirit and scope of the present application and such modifications and equivalents should also be considered to be within the scope of the present application.

Claims (10)

1. A landing pose determination method, comprising:
acquiring a target image and an imaging plane of the target image;
acquiring feature information in the target image, wherein the feature information in the target image comprises feature points in the target image and the shape of a target object in the target image;
determining a target area used for representing the landing position of the electronic equipment in the target image according to the characteristic information in the target image;
determining the actual position of the target area according to the target area;
and determining the pose of the electronic equipment when the electronic equipment lands according to the target area of the target image, the imaging plane of the target image and the actual position of the target area.
2. The method of claim 1, wherein determining a target area in the target image for characterizing a landing position of an electronic device according to the feature information in the target image comprises:
determining whether a target object matched with a preset shape template exists in the target image;
under the condition that a target object matched with a preset shape template exists in a target image, if the number of feature points in the target object reaches a specific number and the relative position relationship among the feature points meets a preset requirement, determining that the target object is a target area used for representing the landing position of the electronic equipment in the target image.
3. The method of claim 2, further comprising:
deleting the target image in a case where the target image meets any one of the target conditions;
generating an instruction to re-determine the target image;
wherein the target conditions include: a target object matched with a preset shape template does not exist in the target image; the number of feature points in the target object does not reach a specific number; the relative position relation among the characteristic points does not meet the preset requirement.
4. The method of claim 2, wherein the target area is composed of a circular ring and three rectangles within the ring; the three rectangles are arranged at equal intervals, and the center point of the middle rectangle coincides with the center of the ring.
5. The method of claim 2, further comprising:
and under the condition that the number of the characteristic points exceeds a specific number, carrying out rejection operation on the characteristic points based on a preset mode until the number of the characteristic points is the specific number.
6. The method of claim 5, the predetermined manner being performed as the steps of:
selecting the feature points one by one as feature points to be eliminated;
judging whether a target feature point with a distance smaller than a preset distance from the feature point to be eliminated exists in the target image;
and under the condition that no target feature point with the distance to the feature point to be eliminated being smaller than the preset distance exists in the target image, deleting the feature point to be eliminated.
7. The method according to claim 6, wherein in the case that there is a target feature point in the target image whose distance from the feature point to be eliminated is smaller than a preset distance, the preset manner is executed as the following steps:
acquiring gray level change parameter values of the target characteristic points and the characteristic points to be eliminated;
and deleting the feature points to be eliminated under the condition that the gray scale change parameter values of the target feature points are larger than the gray scale change parameter values of the feature points to be eliminated, otherwise deleting the target feature points.
8. The method of claim 4, further comprising:
determining whether a feature point in the target image is located within the circle;
under the condition that the feature points in the target image are located in the circular ring, determining whether the feature points in the target image are overlapped with the corner points of three rectangles in the circular ring;
and under the condition that the feature points in the target image are overlapped with the corner points of the three rectangles, determining that the relative position relation among the feature points meets the preset requirement.
9. The method of any one of claims 2-8, wherein determining the pose of the electronic device when landing according to the target area of the target image, the imaging plane of the target image, and the actual position of the target area comprises:
determining an imaging coordinate system of the target image according to the imaging plane of the target image;
establishing a world coordinate system based on the plane where the actual position of the target area is located;
determining attitude angle information and relative position information of the world coordinate system relative to the imaging coordinate system according to the coordinates of the feature points in the world coordinate system and the coordinates of the feature points projected onto the imaging plane of the target image;
determining the target pose of the electronic device according to the attitude angle information and the relative position information.
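Claim 9 amounts to recovering the rotation (attitude angles) and translation (relative position) of a marker-plane world frame with respect to the camera's imaging frame from 3D-2D point correspondences, i.e. the classical Perspective-n-Point problem. A non-limiting sketch using OpenCV's solver follows; the camera intrinsics, distortion coefficients, and the Euler-angle convention are assumptions, not specified by the claim.

```python
import numpy as np
import cv2

def landing_pose(object_points, image_points, camera_matrix, dist_coeffs):
    """Sketch of claim 9: pose of the world frame relative to the camera frame.

    object_points : (N, 3) feature-point coordinates in the world frame,
                    placed on the marker plane (Z = 0)
    image_points  : (N, 2) corresponding projections on the imaging plane
    """
    object_points = np.asarray(object_points, dtype=np.float32)
    image_points = np.asarray(image_points, dtype=np.float32)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("PnP failed")

    R, _ = cv2.Rodrigues(rvec)                  # rotation: world -> camera
    # Attitude angles extracted with one of several possible Euler conventions
    # (ZYX here); relative position is the translation vector.
    pitch = np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0]))
    yaw = np.arctan2(R[1, 0], R[0, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return np.degrees([yaw, pitch, roll]), tvec.ravel()
```

In use, `object_points` would be the known corner positions of the three rectangles on the landing marker (for instance as generated for the sketch after claim 4), and `image_points` the feature points retained after the elimination step.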
10. An electronic device, comprising:
a first acquisition module, configured to acquire a target image and an imaging plane of the target image;
a second acquisition module, configured to acquire feature information in the target image, wherein the feature information in the target image comprises feature points in the target image and a shape of a target object in the target image;
a first determining module, configured to determine, according to the feature information in the target image, a target area in the target image for representing a landing position of the electronic device;
a second determining module, configured to determine an actual position of the target area according to the target area;
a third determining module, configured to determine, according to the target area of the target image, the imaging plane of the target image, and the actual position of the target area, a pose of the electronic device when landing.
CN202110107086.8A 2021-01-26 2021-01-26 Landing pose determining method and electronic equipment Pending CN112785646A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110107086.8A CN112785646A (en) 2021-01-26 2021-01-26 Landing pose determining method and electronic equipment

Publications (1)

Publication Number Publication Date
CN112785646A (en) 2021-05-11

Family

ID=75757967

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110107086.8A Pending CN112785646A (en) 2021-01-26 2021-01-26 Landing pose determining method and electronic equipment

Country Status (1)

Country Link
CN (1) CN112785646A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108873917A (en) * 2018-07-05 2018-11-23 太原理工大学 Unmanned aerial vehicle autonomous landing control system and method for a mobile platform
CN109613926A (en) * 2018-12-22 2019-04-12 武汉新运维光电科技股份有限公司 High-precision automatic landing-zone identification method for autonomous landing of a multi-rotor unmanned aerial vehicle
KR102018892B1 (en) * 2019-02-15 2019-09-05 국방과학연구소 Method and apparatus for controlling take-off and landing of unmanned aerial vehicle
CN111680685A (en) * 2020-04-14 2020-09-18 上海高仙自动化科技发展有限公司 Image-based positioning method and device, electronic equipment and storage medium
CN112215860A (en) * 2020-09-23 2021-01-12 国网福建省电力有限公司漳州供电公司 Unmanned aerial vehicle positioning method based on image processing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
陈丽娟; 周鑫; 袁锁中; 王从庆: "Vision-based landing pose parameter estimation method for unmanned helicopters" (基于视觉的无人直升机着陆位姿参数估计方法), 计算机应用与软件 (Computer Applications and Software), no. 11, 15 November 2013 (2013-11-15), pages 21-23 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115937321A (en) * 2022-09-27 2023-04-07 荣耀终端有限公司 Attitude detection method and device for electronic equipment
CN115937321B (en) * 2022-09-27 2023-09-22 荣耀终端有限公司 Gesture detection method and device of electronic equipment

Similar Documents

Publication Publication Date Title
CN110163930B (en) Lane line generation method, device, equipment, system and readable storage medium
CN112907676B (en) Calibration method, device and system of sensor, vehicle, equipment and storage medium
CN107194399B (en) Visual calibration method, system and unmanned aerial vehicle
CN108827154B (en) Robot non-teaching grabbing method and device and computer readable storage medium
US20220215573A1 (en) Camera pose information detection method and apparatus, and corresponding intelligent driving device
JP2009288152A (en) Calibration method of on-vehicle camera
CN108489454A (en) Depth distance measurement method, device, computer readable storage medium and electronic equipment
CN111260539B (en) Fish eye pattern target identification method and system thereof
CN108898635A (en) A kind of control method and system improving camera calibration precision
CN112947526B (en) Unmanned aerial vehicle autonomous landing method and system
CN110083177A (en) A kind of quadrotor and control method of view-based access control model landing
CN114815871A (en) Vision-based autonomous landing method for vertical take-off and landing unmanned mobile platform
JP2020126647A (en) Method for correcting wrong arrangement of cameras by selectively using information generated by itself and information generated by other objects and device for using the same
CN110673622A (en) Unmanned aerial vehicle automatic carrier landing guiding method and system based on visual images
CN114900609B (en) Automatic shooting control method and system for unmanned aerial vehicle
CN112686149A (en) Vision-based autonomous landing method for near-field section of fixed-wing unmanned aerial vehicle
CN112785646A (en) Landing pose determining method and electronic equipment
CN107576329B (en) Fixed wing unmanned aerial vehicle landing guiding cooperative beacon design method based on machine vision
CN114428510A (en) Method and system for correcting surrounding route
CN111964665B (en) Intelligent vehicle positioning method and system based on vehicle-mounted all-around image and storage medium
CN111121779B (en) Real-time detection method for flight area where unmanned aerial vehicle is located
CN109764864B (en) Color identification-based indoor unmanned aerial vehicle pose acquisition method and system
CN114792343B (en) Calibration method of image acquisition equipment, method and device for acquiring image data
CN113781524B (en) Target tracking system and method based on two-dimensional label
CN113220020B (en) Unmanned aerial vehicle task planning method based on graphic labels

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination