CN113220027B - Concave polygon area unmanned aerial vehicle flight path planning based on remote sensing task - Google Patents


Info

Publication number
CN113220027B
CN113220027B (granted publication of application CN202110500864.XA)
Authority
CN
China
Prior art keywords
flight
coordinate system
area
point
aircraft
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110500864.XA
Other languages
Chinese (zh)
Other versions
CN113220027A
Inventor
刘旭林
赵红颖
程印乾
李芹
Current Assignee
Peking University
Original Assignee
Peking University
Priority date
Filing date
Publication date
Application filed by Peking University
Priority: CN202110500864.XA
Publication of CN113220027A (application)
Application granted; publication of CN113220027B
Legal status: Active


Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones


Abstract

The invention discloses unmanned aerial vehicle flight path planning for concave polygonal areas based on a remote sensing task. It plans flight paths for multiple unmanned aerial vehicles according to the specific imaging requirements of the remote sensing task, outputs the information of concrete aerial shooting points, and makes the total flight distance shortest while guaranteeing full coverage of the observation area. Because the planning is driven by the remote sensing task (multiple payloads, multiple aircraft, an actual geographic base map, heading overlap, lateral overlap, resolution, and so on), it jointly considers the acquisition of the corresponding aerial shooting points on the geographic base map and the path planning itself, so it satisfies the requirements of remote sensing tasks and is more practical. The method accounts for aerial image capture during flight and for later image stitching; its effect is significant and it is suitable for wide adoption.

Description

Concave polygon area unmanned aerial vehicle flight path planning based on remote sensing task
Technical Field
The invention relates to the technical field of flight path planning, and in particular to unmanned aerial vehicle flight path planning for concave polygonal areas based on a remote sensing task.
Background
Existing unmanned aerial vehicle flight path planning targets many different application scenarios. Most algorithms proposed so far do not consider the particularities of remote sensing tasks: they look at the problem only from the viewpoint of flying the aircraft and compute the spacing between flight strips from the turning radius of the vehicle, so they cannot be applied to remote sensing tasks.
At present, most flight path planning algorithms for unmanned aerial vehicles stay within the scope of convex polygons. A few studies address the path planning of concave polygons, mostly taking the scan-line method as the basic algorithm, but the existing algorithms are simplistic, belong to the idealised case, and do not support planning for multiple airframes. Existing planners also do not take the concrete aircraft and payload parameters into consideration, and these parameters are very important for the mapping stage that follows remote sensing observation.
Aiming at these problems, a flight path planning method is designed that solves the inability of the prior art to serve remote sensing tasks, its simplistic and idealised algorithms, and its lack of support for multi-airframe flight path planning.
Disclosure of Invention
Aiming at these defects, the invention provides unmanned aerial vehicle flight path planning for concave polygonal areas based on a remote sensing task, with the aim of solving the problems that the existing technology cannot be applied to remote sensing tasks, that its algorithms are simplistic and idealised, and that it does not support flight path planning for multiple airframes.
The invention provides a concave polygon area unmanned aerial vehicle track planning based on a remote sensing task, which comprises the following specific steps:
step 1, obtaining basic parameters of the target flight area and relevant information of the aircraft, wherein the basic parameters comprise the size of the task area, the flight direction, the expected lateral overlap, the expected course overlap and the maximum allowable flight altitude of the area;
step 2, acquiring related parameters related to the flight process according to the acquired basic parameters, wherein the related parameters comprise an actual flight direction, an actual flight height, an actual course overlapping degree, an actual lateral overlapping degree, an actual course shooting interval, an actual lateral shooting interval and an actual yaw distance;
step 3, acquiring a course end point set based on the flight area and the flight zone information, dividing the target flight area into a plurality of convex polygons, and merging the polygons of the small areas;
step 4, acquiring the course shooting points in an ideal coordinate system based on the course shooting interval, converting their coordinates to obtain the positions of the shooting points in the correct coordinate system, and obtaining the course list of the target flight area, ordered first by area and then by course;
step 5, distributing the aerial photographing points to corresponding aircrafts according to the aircraft parameters;
and 6, judging the reasonability of the flight path planning result information, and finally obtaining the flight path information of the target flight area and the planning result of aircraft distribution under the correct basic parameters.
Preferably, the specific steps of step 3 include:
3.1, establishing an ideal coordinate system according to the flight area information, taking the centre of the target flight area as the coordinate origin, the flight direction as the positive x-axis and the direction perpendicular to the x-axis as the y-axis, and transferring the target flight area from the real geographic coordinate system to the ideal coordinate system;
step 3.2, acquiring the lateral shooting distance to obtain suitable heading strips, and adding fly-back processing for the sar sensor, namely halving the lateral shooting interval and adding a fly-back strip;
step 3.3, obtaining a boundary point of the intersection of the strip where the air route is located and the area;
3.4, according to intersection point information on each flight band, dividing a target flight area into a plurality of convex polygons by adopting a scanning line method;
step 3.5, merging the concave polygons obtained after the scan-line segmentation with adjacent polygons, and merging any small-area polygon that meets any one of three criteria: an area containing fewer than 3 routes; an area whose total route length is less than 1/5 of the aircraft's total range; and an area whose merging with an adjacent convex polygon adds less route length than it removes.
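The three merge criteria in step 3.5 can be expressed as a single predicate. A minimal sketch, assuming the caller has already computed each piece's route count, total route length, and the added/removed route lengths that merging would cause (the function name and signature are illustrative, not from the patent):

```python
def should_merge(num_routes, total_route_len, aircraft_range,
                 added_len, reduced_len):
    """True if a small convex piece should be merged with a neighbour,
    i.e. it satisfies any one of the three criteria of step 3.5."""
    # Criterion 1: the piece contains fewer than 3 routes.
    if num_routes < 3:
        return True
    # Criterion 2: its total route length is under 1/5 of the range.
    if total_route_len < aircraft_range / 5:
        return True
    # Criterion 3: merging adds less route length than it removes.
    if added_len < reduced_len:
        return True
    return False
```

Only one criterion needs to hold, so the checks short-circuit in the order the text lists them.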
Preferably, the specific steps of step 5 include:
step 5.1, calculating the minimum number of the required aircrafts for meeting the flight path planning by using the maximum flight distance of the aircrafts and the total length of regional air routes;
step 5.2, carrying out aircraft distribution and flight path planning according to the actual number of the aircraft;
and 5.3, allocating aerial photography points to each area, wherein during allocation the aircraft must take off in the same direction, which by default is uniformly to the right.
Preferably, the step 3.1 of converting the real geographic coordinate system into the ideal coordinate system includes the following specific steps:
step 3.1.1, acquiring the minimum bounding rectangle of the longitude and latitude of the flight area in the real geographic coordinate system, putting it in correspondence with a bounding rectangle in a plane coordinate system, and establishing the matrix relation, namely the transformation matrix H1, through the four vertices of the real-coordinate bounding rectangle and the corresponding plane-coordinate vertices; according to the correspondence Y = H1·X between the coordinate systems, the real geographic coordinate system is converted into the plane coordinate system, where X is the coordinate of a real point in the real geographic coordinate system and Y is the coordinate of the corresponding point in the plane coordinate system;
step 3.1.2, establishing an ideal coordinate system with the flight direction as the x-axis and the direction perpendicular to the x-axis as the y-axis; the flight direction is transformed into the plane coordinate system through the H1 matrix and then rotated onto the x-axis by the corresponding rotation matrix H2, i.e. the plane coordinate system is rotated into the ideal coordinate system through H2;
step 3.1.3, based on the transformation matrix H1 and the rotation matrix H2, establishing the relation H = H2·H1 between the real geographic coordinate system and the ideal coordinate system; the real geographic coordinate system is obtained from the ideal coordinate system through the inverse matrix of H.
Preferably, the specific steps of step 3.3 include:
3.3.1, establishing a flight-band rectangle for each band, taking the ground width covered by one photo as its width and a certain multiple of the area's length along the flight direction as its length; obtaining the intersection of each flight-band rectangle with the flight area through an opencv library function; if the intersection contains only one element, the flight band is not cut by the flight area, otherwise it is cut;
step 3.3.2, calculating the minimum bounding rectangle of each intersection element, taking its left and right end points as intersection segmentation points, with the y-axis value of the segmentation points set to the centre y value of the flight band;
and 3.3.3, if the distance between two separated segmentation points is less than a certain threshold, deleting both of them, namely merging the line segments that the two segmentation points separate.
Preferably, the step 3.4 of dividing the convex polygon specifically includes:
3.4.1, dividing from top to bottom: taking the first two intersection points of the first flight band and the first two intersection points of the next flight band, and checking whether the x-axis segment between the first band's intersection points overlaps the segment between the next band's intersection points; if they overlap, comparing that band with the band after it, and so on, until the segments of two consecutive bands do not overlap or all flight bands have been compared;
3.4.2, after the comparison ends, forming a new convex polygon from the first two intersection points used in the comparison process, deleting those points from their flight bands, and deleting any flight band left without intersection points;
and 3.4.3, repeating steps 3.4.1-3.4.2 until no flight band remains, at which point the concave polygon has been converted into several convex polygons.
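The grouping in steps 3.4.1-3.4.3 can be sketched on per-band interval lists. This is a minimal illustration, assuming each flight band's intersection with the area is given as a sorted list of (x1, x2) intervals and that "coincide" means the x-intervals overlap (the patent does not spell out the exact test):

```python
def overlaps(a, b):
    # Two x-intervals overlap when neither lies strictly left of the other.
    return a[0] <= b[1] and b[0] <= a[1]

def decompose(bands):
    """bands: list (top to bottom) of sorted lists of (x1, x2) intervals,
    one list per flight band.  Chains the first interval of consecutive
    overlapping bands into one convex piece (steps 3.4.1-3.4.3)."""
    bands = [list(b) for b in bands]   # work on a copy
    pieces = []
    while bands:
        piece = [bands[0][0]]          # "first two intersection points"
        i = 1
        # extend downward while the next band's first interval overlaps
        while i < len(bands) and overlaps(piece[-1], bands[i][0]):
            piece.append(bands[i][0])
            i += 1
        pieces.append(piece)
        # delete the used intervals, then drop bands left empty
        for j in range(len(piece)):
            bands[j].pop(0)
        bands = [b for b in bands if b]
    return pieces
```

On a U-shaped area whose two prongs give two intervals per upper band and whose base gives one full-width interval, this yields two pieces: the left prong plus base, and the right prong.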
Preferably, the step of obtaining basic parameters for judging the three criteria in step 3.5 includes:
3.5.1, taking the end point of the last flight band of one area as the fly-out point and the start point of the first route of the adjacent area as the fly-in point; the sum of the absolute x-axis difference and the absolute y-axis difference between the fly-out point and the fly-in point is the part of the flight plan that merging the polygons removes;
and 3.5.2, acquiring the flight bands shared by the two merged polygons and calculating the separation gaps produced on them; the sum of the gaps over all shared bands is the part of the flight plan that merging the polygons adds.
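The two quantities of steps 3.5.1 and 3.5.2 reduce to simple sums. A sketch under the assumption that points are (x, y) pairs in the ideal coordinate system and that each separation gap is given as an (x1, x2) interval on a shared band:

```python
def reduced_part(fly_out, fly_in):
    """Transit removed by merging: sum of the absolute x and y
    differences between the fly-out point of one piece and the
    fly-in point of its neighbour (step 3.5.1)."""
    return abs(fly_out[0] - fly_in[0]) + abs(fly_out[1] - fly_in[1])

def increased_part(gaps):
    """Route length added by merging: sum of the separation gaps
    created on the flight bands shared by the two pieces (step 3.5.2)."""
    return sum(x2 - x1 for (x1, x2) in gaps)
```

Criterion 3 of step 3.5 then compares the two: merge when `increased_part(...) < reduced_part(...)`.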
Preferably, the specific steps of step 5.2 include:
step 5.2.1, if the number of the aircrafts is 0, planning the flight path according to the condition of the minimum number of the aircrafts;
step 5.2.2, if the number of aircraft is not 0 and is less than the minimum required number, reporting an error and recalculating the flight distances so that the workload of the aircraft in each area is as uniform as possible and the total flight time is shortest;
and 5.2.3, if the number of the aircrafts is not 0 and is not less than the minimum required number of the aircrafts, planning the flight path according to the condition of the minimum required number of the aircrafts.
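Step 5.1's lower bound on the fleet size can be sketched as follows; transit between areas and turning overhead are ignored here, so this is an optimistic bound, not the patent's full allocation:

```python
import math

def min_aircraft(route_lengths, max_range):
    """Minimum number of airframes needed to cover all regional routes
    (step 5.1): total route length divided by the per-aircraft maximum
    flight distance, rounded up.  Sketch only: transit legs and turning
    radius are not charged against the range here."""
    total = sum(route_lengths)
    return max(1, math.ceil(total / max_range))
```

With this bound, step 5.2 branches on the user-specified count: 0 means plan with the minimum, a positive count below the minimum is an error, and anything else still plans with the minimum.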
Preferably, the specific step of assigning aerial photography points in step 5.3 includes:
step 5.3.1, if the initial flight direction of the aircraft is rightward, normally recording all waypoints of the aircraft;
step 5.3.2, if the initial flight direction of the aircraft is leftward and the sensor is 'sar', omitting the first flight band and not counting the length;
step 5.3.3, if the initial flight direction of the aircraft is leftward and the sensor is 'camera' or 'sar', reversing all the flight bands;
and 5.3.4, if the direction of the aircraft is leftward and the sensor is 'sar' when the aircraft finishes flying, omitting the last flight band.
Preferably, the shooting sensors of the aircraft in step 1 comprise 3 types: the first is a 'camera'-type optical camera, which shoots vertically downward and takes one picture per shooting point; the second is a 'video'-type optical camera, which starts and stops based on shooting start-point and end-point information, shoots vertically downward and records video; the third is an 'sar'-type radar camera, which starts and stops based on shooting start-point and end-point information, shoots obliquely downward and images in a scan-line manner.
According to the scheme, the unmanned aerial vehicle flight path planning for concave polygonal areas based on a remote sensing task plans paths for multiple unmanned aerial vehicles according to the specific imaging requirements of the remote sensing task, outputs the information of concrete aerial shooting points, and makes the total flight distance shortest while guaranteeing full coverage of the observation area. The invention provides multi-payload, multi-airframe flight path planning for concave polygonal areas; because the planning is driven by the remote sensing task (multiple payloads, multiple aircraft, an actual geographic base map, heading overlap, lateral overlap, resolution, and so on), it jointly considers the acquisition of the corresponding aerial shooting points on the geographic base map and the path planning itself; it satisfies the requirements of remote sensing tasks, is more practical, has a significant effect and is suitable for wide adoption.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a first process block diagram of unmanned aerial vehicle track planning in a concave polygonal area based on a remote sensing task according to an embodiment of the present invention;
FIG. 2 is a second block diagram of a process of unmanned aerial vehicle track planning in a concave polygonal area based on a remote sensing task according to an embodiment of the present invention;
fig. 3 is a third process block diagram of the unmanned aerial vehicle track planning in the concave polygonal area based on the remote sensing task according to the embodiment of the present invention;
fig. 4 is a first simulation result diagram of the unmanned aerial vehicle track planning in the concave polygonal area based on the remote sensing task according to the embodiment of the present invention;
fig. 5 is a second simulation result diagram of the unmanned aerial vehicle flight path planning in the concave polygonal area based on the remote sensing task according to the embodiment of the present invention;
fig. 6 is a third simulation result diagram of the unmanned aerial vehicle track planning in the concave polygonal area based on the remote sensing task according to the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1 to fig. 6, an embodiment of the present invention for unmanned aerial vehicle flight path planning in a concave polygonal area based on a remote sensing task will now be described. The method comprises the following specific steps:
s1, obtaining basic parameters of the target flight area and the related information of the aircraft, wherein the basic parameters include the size of the mission area (a convex polygon or a concave polygon composed of a series of latitude and longitude, which may be a ground area or an air area flying by the aircraft, and since sar is skew, the air area may obtain the ground area by adding a horizontal offset of the camera shooting distance, and vice versa, the horizontal offset is obtained by multiplying the skew angle by the actual flight height), an area identifier (identifying that the area is the ground area or the air area), a proposed flight height, a calculation identifier (identifying that the actual flight height is calculated according to the resolution or the proposed flight height, the camera focal length/flight height is the resolution), a flight direction (obtaining the minimum width or specifying the flight direction, in order to avoid collision during the flight of the aircraft, the main flight directions of all the aircrafts are the same), the ground resolution (the ratio of the size of a pixel on an image to the actual geographic distance), the expected lateral overlapping degree (the overlapping degree between two adjacent images vertical to the flight direction of the aircraft, namely the overlapping percentage between two adjacent strips), the expected course overlapping degree (the overlapping degree between two adjacent images along the flight direction of the aircraft, namely the overlapping percentage between two adjacent images on the same flight band), the aircraft number (a manually specified natural number, if the natural number is 0, the planning result of the minimum flight frame number is selected automatically), the wind direction, the wind speed, the maximum allowable flight height of a region (an airspace needs to be applied, is limited in the flight process and cannot occupy the airspace of aviation), the aircraft parameters 
and the image acquirer parameters;
the aircraft parameters include the aircraft name, the aircraft number (to distinguish between different types of aircraft), the maximum flight speed, the minimum flight speed, the proposed flight speed (the speed to be considered during the proposed flight, which can be dynamically adjusted if otherwise limited in the experiment), the maximum flight altitude, the minimum flight altitude, the maximum flight path, the aircraft turning radius (to be considered when considering the total distance and the maximum flight path of the aircraft, to be considered when considering the length of the extra distance spent when the aircraft is turning).
The camera parameters include the camera name, the camera type ('camera', 'sar' or 'video'), the focal length (for fixed-focus shooting; it enters the calculation of the resolution and the flight height, and can be adjusted dynamically if other conditions constrain it), the pixel size (the physical pixel size of the camera image), the number of pixels on the x-axis (horizontal pixel count of the image), the number of pixels on the y-axis (vertical pixel count of the image), the minimum focal length and maximum focal length (objective properties of the camera), the minimum shooting time interval (the heading distance between two adjacent images must be no less than the minimum shooting time interval multiplied by the actual flight speed), the minimum observation radius of sar (the closest distance from a ground point on the sar scan line to the sensor), the maximum observation radius of sar (the farthest such distance), the suggested observation height of sar, the sar view angle (the angle between the minimum- and maximum-radius lines, i.e. between the two rays emitted by the sensor), and the angle between the sar view-angle centre line and the vertical.
The invention considers the actual geographic base map and different aircraft payloads. The payloads fall into three types, i.e. the shooting sensors of the aircraft in S1 comprise 3 types: the first is a 'camera'-type optical camera that shoots fixed points vertically downward, one picture per shooting point; the second is a 'video'-type optical camera that shoots vertically downward and records video, needing no fixed shooting points, only an entry point and an exit point of the route in each flight band, i.e. a shooting start point and end point; the third is an 'sar' (Synthetic Aperture Radar) radar camera that shoots obliquely downward in a scan-line manner and likewise only needs a shooting start point and end point.
Because sar shoots obliquely, there is a fixed horizontal offset between the geographic coordinates of the aircraft and the actual scan line; and because the flight directions of two adjacent strips are opposite, the scan lines of two adjacent strips do not overlap. The method therefore omits the strips flown in one direction, and at the same time halves the lateral interval so that the scan lines of two strips one apart overlap. Shooting with sar is performed along one direction, and the results in the opposite direction are discarded. For uniformity of the data structure during actual flight, 'sar' is still given the waypoints of the return routes when the final waypoints are produced, but the return waypoints are ignored.
Compared with the prior art, the method guarantees the heading and lateral overlap of the aerial images, which effectively ensures correct stitching of the images acquired by the unmanned aerial vehicle at a later stage; this requires fixed-point shooting, so the flight path planning algorithm must output concrete aerial shooting points. Camera shooting also imposes a minimum time interval, which makes observation based on remote sensing tasks different from past flight path planning.
S2, acquiring the relevant parameters of the flight process from the basic parameters, including the actual flight direction, actual flight height, actual heading overlap, actual lateral overlap, actual ground resolution, actual heading shooting interval, actual lateral shooting interval and actual yaw distance (mainly for sar; it is 0 for camera and video);
Determining the flight direction is the same for all three payloads: if the flight direction is min_width, it is calculated by the minimum-width algorithm, namely determining the minimum convex polygon (convex hull) of the area, processing each edge of the convex hull by calculating the maximum width of the area measured from that edge to the other points, and finding the edge whose maximum width is smallest; the direction of that edge is the flight direction of the aircraft.
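The min_width rule described above can be sketched with a standard monotone-chain convex hull; function names are illustrative, not from the patent:

```python
import math

def convex_hull(pts):
    # Andrew's monotone chain; returns hull vertices in CCW order.
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def min_width_direction(pts):
    """For each hull edge, take the greatest perpendicular distance
    from the edge to any hull point (the 'maximum width' of that edge);
    the edge minimising this width gives the flight direction."""
    hull = convex_hull(pts)
    best = (float("inf"), (1.0, 0.0))
    n = len(hull)
    for i in range(n):
        (x1, y1), (x2, y2) = hull[i], hull[(i + 1) % n]
        ex, ey = x2 - x1, y2 - y1
        norm = math.hypot(ex, ey)
        if norm == 0:
            continue
        width = max(abs((px-x1)*ey - (py-y1)*ex) / norm
                    for px, py in hull)
        if width < best[0]:
            best = (width, (ex / norm, ey / norm))
    return best  # (minimum width, unit direction vector)
```

For a 10 x 2 rectangle the minimum width is 2 and the direction runs along the long side, which matches the intuition that flight lines should follow the area's long axis.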
S3, establishing a coordinate system according to the flight area information, acquiring the relevant flight information of the aircraft, obtaining the set of route end points from that information, i.e. the flight band information, dividing the target flight area into several convex polygons, and merging the polygons of small areas, i.e. fragment polygons. The relevant flight information comprises the task area, an area identifier (identifying whether the area is a ground area or an air area), the x-axis and y-axis components of the flight direction, the coordinate system type (WGS84 by default), the actual heading overlap, the actual lateral overlap, the actual heading shooting distance, the actual lateral shooting distance, the actual flight height, the type of image acquirer ('camera', 'video' or 'sar'), the distance from the centre point, and the turning distance;
Because the sar sensor shoots in one direction only, a fly-back process is added to keep it uniform with the camera and video sensors, namely the lateral shooting interval is halved and fly-back strips are added. The specific implementation of S3 may be:
s3.1, establishing an ideal coordinate system according to the flight area information, taking the centre of the target flight area as the coordinate origin, the flight direction as the positive x-axis and the direction perpendicular to the x-axis as the y-axis, and transferring the target flight area from the real geographic coordinate system to the ideal coordinate system;
the traditional unmanned aerial vehicle flight path planning is carried out on a plane and only takes the effect of a two-dimensional plane into consideration, and the invention solves the problem that the deformation is generated in the process of map projection to influence the later image splicing due to the fact that the earth is three-dimensional and the flight area is too large under a real geographic coordinate system. S3.1 the specific steps of converting the real geographic coordinate system to the ideal coordinate system include:
s3.1.1, converting the real geographic coordinate system into a plane coordinate system:
Given the flight area, find the minimum bounding rectangle of its longitude and latitude (i.e. the maximum and minimum values of the longitude and latitude). Record the maximum and minimum longitudes as max_l and min_l and the maximum and minimum latitudes as max_w and min_w. Obtain the actual distance between two real points of the earth: the distance h between the points ((max_l + min_l)/2, max_w) and ((max_l + min_l)/2, min_w) corresponds to the latitude extent of the bounding rectangle in the plane coordinate system, and the distance w between the points (min_l, (max_w + min_w)/2) and (max_l, (max_w + min_w)/2) corresponds to the longitude extent. The upper-left point of the plane-coordinate bounding rectangle is then (-w/2, h/2) and its lower-right point is (w/2, -h/2). A matrix relation is then established through the four vertices of the real-coordinate bounding rectangle and the plane-coordinate vertices, namely Y = H1·X, where X is the coordinate of the real point, Y is the coordinate in the plane coordinate system and H1 is the transformation matrix;
step 3.1.2, converting the plane coordinate system into an ideal coordinate system through rotation:
the ideal coordinate system takes the flight direction as the x-axis direction and the direction perpendicular to the x-axis as the y-axis direction. After the flight direction is converted to the plane coordinate system by the H1 matrix, it still needs to be rotated onto the x-axis; the corresponding rotation matrix is H2, i.e., H2 transforms the plane coordinate system to the ideal coordinate system;
step 3.1.3, establishing the relation between the real geographic coordinate system and the ideal coordinate system
Directly establish the relation between the real geographic coordinate system and the ideal coordinate system, namely H = H2·H1; that is, the H matrix is the product of the two transformation matrices. The relation transforming from the ideal coordinate system back to the real coordinate system is the inverse matrix of H.
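The chain of S3.1.1–S3.1.3 can be sketched as an affine map: scale and translate the longitude-latitude bounding rectangle onto a metric rectangle centred at the origin (H1), then rotate the flight direction onto the x-axis (H2), the composite being H = H2·H1. This is a minimal sketch, not the patented implementation; the function name and the `earth_dist` helper (any ground-distance routine between two (lon, lat) points) are illustrative assumptions.

```python
import math

def latlon_to_ideal(points, flight_dir_deg, earth_dist):
    """Map (lon, lat) points into an 'ideal' frame whose x-axis is
    the flight direction. `earth_dist(p, q)` is an assumed helper
    returning the ground distance in metres between two (lon, lat)
    points, e.g. a haversine implementation."""
    lons = [p[0] for p in points]
    lats = [p[1] for p in points]
    min_l, max_l = min(lons), max(lons)
    min_w, max_w = min(lats), max(lats)
    mid_l, mid_w = (min_l + max_l) / 2, (min_w + max_w) / 2
    # Real ground extents of the bounding rectangle, measured along
    # its vertical (h) and horizontal (w) mid-lines as in S3.1.1.
    h = earth_dist((mid_l, max_w), (mid_l, min_w))
    w = earth_dist((min_l, mid_w), (max_l, mid_w))
    sx, sy = w / (max_l - min_l), h / (max_w - min_w)
    a = math.radians(flight_dir_deg)
    out = []
    for lon, lat in points:
        # H1: scale/translate onto [-w/2, w/2] x [-h/2, h/2]
        x, y = sx * (lon - mid_l), sy * (lat - mid_w)
        # H2: rotate the flight direction onto the +x axis
        out.append((x * math.cos(a) + y * math.sin(a),
                    -x * math.sin(a) + y * math.cos(a)))
    return out
```

With a flight direction of 0° the rotation is the identity, so the four corners of the bounding rectangle land at (±w/2, ±h/2) and the region centre at the origin, matching step S3.1.1.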
S3.2, acquiring the lateral shooting distance L to obtain proper course strips; for the "sar" sensor, fly-back processing is added, namely the lateral shooting interval is halved and fly-back strips are added;
the lateral shooting distance is L = b × (1 − η), where b is the real width of the image on the ground and η is the lateral overlap degree;
the real width of the photo on the ground is b = m × s, where m is the number of pixels in the width direction of the photo and s is the ground size subtended by each pixel;
the ground size subtended by each pixel is s = T × p / d, where T is the flying height, p is the physical size of a pixel, and d is the focal length of the camera.
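The three S3.2 formulas chain into a single expression, L = (1 − η)·m·T·p/d. The sketch below and its sample numbers (500 m altitude, 4.4 µm pixels, 35 mm focal length, 8000-pixel sensor width, 30 % lateral overlap) are illustrative assumptions, not values from the patent.

```python
def lateral_shooting_distance(T, p, d, m, eta):
    """S3.2 formulas: ground size per pixel s = T*p/d, image ground
    width b = m*s, lateral strip spacing L = b*(1 - eta)."""
    s = T * p / d          # ground size subtended by one pixel (GSD)
    b = m * s              # ground width covered by one image
    return b * (1 - eta)   # spacing between adjacent flight strips

# Example (assumed values): 500 m altitude, 4.4e-6 m pixels,
# 0.035 m focal length, 8000 px width, 0.3 lateral overlap
# -> s ~ 0.063 m, b ~ 502.9 m, L = 352 m
```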
S3.3, acquiring a boundary point of the intersection of the strip where the air route is located and the area;
s3.3 comprises the following specific steps:
s3.3.1, for each strip rectangle (the width of the rectangle is the ground width of a photo; its length is a fixed multiple of the longest extent of the region in the flight direction, for example 1.1 times, so that boundary-point checks can be omitted), find the intersection of the strip rectangle and the flight area (e.g., with an opencv library function). If the intersection contains only one element, the strip is not cut by the flight area; otherwise the strip is cut.
S3.3.2, calculate the minimum bounding rectangle of each intersection element, take its left and right end points as the intersection split points, and take the y-axis value as the center value of the flight band in y; other details are also recorded, such as the direction of the flight band.
S3.3.3, if the distance between two split points is less than a certain threshold (set, for example, to 3 aircraft turning radii), delete the two split points, i.e., merge the line segments they separate (two line segments that are too close to each other are merged into one). Because the unmanned aerial vehicle has a turning radius after each turn, splitting gives no benefit at such short distances, so the segments produced by the two split points are merged.
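The S3.3.3 rule reduces to plain interval merging along one flight band. A minimal sketch, assuming each band's segments are given as (x1, x2) pairs; the function name and the multiplier parameter `k` (defaulting to 3, per the example threshold of 3 turning radii) are assumptions:

```python
def merge_band_segments(segments, turn_radius, k=3.0):
    """S3.3.3: segments that the concave region cut out of one flight
    band are re-merged when the gap between them is under k turning
    radii, since the UAV cannot usefully break off and rejoin that
    close together."""
    segs = sorted(segments)
    merged = [list(segs[0])]
    for x1, x2 in segs[1:]:
        if x1 - merged[-1][1] < k * turn_radius:
            merged[-1][1] = max(merged[-1][1], x2)   # close the gap
        else:
            merged.append([x1, x2])                  # keep the split
    return [tuple(s) for s in merged]
```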
S3.4, according to the intersection point information on each flight band, divide the target flight area (a concave polygon area) into a plurality of convex polygons by the scan-line method;
s3.4 the specific steps of convex polygon division comprise:
s3.4.1, divide from top to bottom: take the first two intersections of the first band (S3.3.2 finds the intersections on each band; every two consecutive intersections form a group), take the first two intersections of the next band, and check whether the x-axis segment between the intersections of the current band overlaps the x-axis segment between the intersections of the next band (two segments do not overlap when the left end point of one is greater than the right end point of the other). If they overlap, compare with the following band, and so on, until two bands do not overlap or no band remains.
S3.4.2, once the comparison ends (no overlap, or no flight band remains), the intersections compared so far form a new convex polygon. These points (i.e., the first two intersection points of each involved band) are then deleted from the band intersection lists, and any flight band left without intersections is deleted.
S3.4.3, repeat S3.4.1–S3.4.2 until no flight band remains; the concave polygon has then become a plurality of convex polygons.
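The loop of S3.4.1–S3.4.3 can be sketched over the per-band interval lists produced by S3.3. This is a simplified sketch of the described procedure, not the patented algorithm; the function name and the (y, intervals) representation are assumptions.

```python
def scanline_convex_split(bands):
    """Sketch of the S3.4 scan-line split. `bands` is a list of
    (y, intervals) pairs ordered top to bottom, where `intervals`
    is the left-to-right list of (x1, x2) segments the band keeps
    inside the region. Vertically consecutive intervals that overlap
    in x are grouped into one convex piece; each piece is returned
    as a list of (y, x1, x2) rows."""
    queues = [(y, list(ivs)) for y, ivs in bands]  # mutable copies
    pieces = []
    while True:
        queues = [q for q in queues if q[1]]       # drop exhausted bands
        if not queues:
            break
        piece, prev = [], None
        for y, ivs in queues:
            x1, x2 = ivs[0]
            if prev is not None and (x1 > prev[1] or prev[0] > x2):
                break                              # x-overlap ends here
            piece.append((y, x1, x2))
            prev = (x1, x2)
            ivs.pop(0)                             # consume the interval
        pieces.append(piece)
    return pieces
```

For a concave region whose bottom band is split in two, the left column and the leftover right segment come out as separate convex pieces, matching S3.4.2's "new convex polygon" step.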
And S3.5, merge adjacent polygons among those obtained after the scan-line segmentation of the concave polygon: because too-small polygon areas waste the aircraft's outbound and return flight, any small-area (fragment) polygon meeting any one of three criteria is merged.
The three criteria are: first, the area contains fewer than 3 flight lines; second, the total route length of the area is less than 1/5 of the aircraft's total range; third, after merging adjacent convex polygons, the increased route portion is smaller than the decreased route portion. If any one criterion is met, merging is performed; for the first two criteria, the polygon is merged with the neighbor for which the difference between the increased and decreased route portions after merging is smallest. These criteria avoid dispatching an additional aircraft for a tiny convex polygon, which wastes resources and makes the total running time too long, since the aircraft must fly from its hangar to the designated area.
Because the concave polygon is divided by scan lines, adjacent polygons are the most likely merge candidates. According to the recorded flight bands, a polygon sharing the same flight band, or the polygon whose flight bands extend highest above or lowest below, is selected, and the increased and decreased portions of the UAV flight path after merging the two polygons are computed directly. The key of this step is solving for the increased and decreased portions of the UAV flight path planning after the two polygons are merged.
The step of obtaining basic parameters for judging the three criteria in S3.5 comprises:
s3.5.1, calculation of the decreased portion: this is the distance the aircraft spends flying from one area to the next. Take the end point of the last flight band of the first area as the fly-out point and the start point of the first flight line of the second area as the fly-in point; the sum of the absolute x difference and the absolute y difference between the two points is the portion of the UAV flight path removed by merging the polygons;
s3.5.2, the increased portion: after two polygons are merged, the parts of flight bands that the concave polygon had cut out must now be flown; these cut-out gaps are the extra coverage flown after merging. Find the flight bands shared by the two polygons and calculate each separation gap (3 turning radii must be subtracted here); the sum of the separation gaps over all shared bands is the increased portion. Because the gaps are produced by the scan lines, the calculated gaps are all connected.
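S3.5.1–S3.5.2 reduce to simple bookkeeping over the fly-out/fly-in points and the gaps on shared flight bands. A minimal sketch: the function and argument names are illustrative, and `shared_gaps` is assumed to already hold the gap lengths measured on the bands the two polygons share.

```python
def merge_gain(out_pt, in_pt, shared_gaps, turn_radius):
    """S3.5 bookkeeping. 'reduced' is the ferry leg saved by merging:
    |dx| + |dy| between the fly-out point of one area and the fly-in
    point of the next (S3.5.1). 'increased' is the extra coverage
    flown over the gaps the concave region cut out of shared bands,
    each gap shortened by 3 turning radii (S3.5.2)."""
    reduced = abs(out_pt[0] - in_pt[0]) + abs(out_pt[1] - in_pt[1])
    increased = sum(max(g - 3 * turn_radius, 0) for g in shared_gaps)
    return reduced, increased   # criterion 3: merge if increased < reduced
```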
S4, based on the course shooting interval, acquire the information of the specific flight path shooting points on the target flight area in the ideal coordinate system; perform coordinate conversion on the shooting points, converting their position information from the ideal coordinate system to the correct coordinate system to obtain their positions in the correct coordinate system; then obtain the list of flight routes on the target flight area according to the rule of different areas first, then different routes. The route information includes route shooting point information, route length and route direction; the aerial photography point information is the same as that in the final result;
s5, distributing the aerial photographing points to corresponding aircrafts according to the aircraft parameters;
the specific implementation step of S5 may be:
s5.1, calculating the minimum required number of aircrafts (possibly a plurality of regions) meeting the flight path planning by using the maximum flight distance of the aircrafts and the total length of regional routes;
s5.2, carrying out aircraft distribution and flight path planning according to the actual number of the aircraft;
the specific implementation steps of S5.2 may be:
s5.2.1, if the number of the aircrafts is 0, planning the flight path according to the condition of the minimum number of the aircrafts;
s5.2.2, if the number of the aircrafts is not 0 and is less than the minimum number of the aircrafts, reporting an error, and recalculating the distance that the aircrafts should fly, ensuring that the number of the aircrafts in each area is as uniform as possible, so as to minimize the total flight time;
s5.2.3, if the number of the aircrafts is not 0 and is not less than the minimum number of the aircrafts, the flight path planning is carried out according to the condition of the minimum number of the aircrafts.
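S5.1 is a ceiling division of the total route length by the maximum flight distance, and the "as uniform as possible" goal of S5.2.2 can be approximated with a greedy longest-first assignment. This heuristic is a sketch under those assumptions, not the patented allocation; the function names are illustrative.

```python
import math

def min_aircraft(total_route_len, max_range):
    """S5.1: minimum number of aircraft needed so that each flies
    at most its maximum flight distance."""
    return math.ceil(total_route_len / max_range)

def split_routes(route_lengths, n_aircraft):
    """S5.2.2 spirit: greedily assign routes (longest first) to the
    currently least-loaded aircraft so per-aircraft totals stay as
    even as possible. Returns per-aircraft route indices and loads."""
    loads = [0.0] * n_aircraft
    plan = [[] for _ in range(n_aircraft)]
    for i, length in sorted(enumerate(route_lengths),
                            key=lambda t: -t[1]):
        j = loads.index(min(loads))   # least-loaded aircraft
        loads[j] += length
        plan[j].append(i)
    return plan, loads
```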
And S5.3, distribute aerial photography points to each area; in the distribution process the aircraft take off in the same direction, which by default faces right and is uniformly taken as the positive direction.
Because an aircraft with the "sar" sensor images in only one direction, and in order to keep the total flight path shorter, the actual processing is divided into four cases. The specific implementation steps of S5.3 may be:
s5.3.1, if the initial flight direction of the aircraft is rightward, all waypoints of the aircraft are recorded normally;
s5.3.2, if the initial flight direction of the aircraft is leftward and the sensor is "sar", the first flight band is omitted and not counted in the length (if there is only one flight band it cannot be omitted);
s5.3.3, if the initial flight direction of the aircraft is to the left and the sensor is "camera" or "sar", then all the flight strips need to be reversed;
s5.3.4, if the aircraft is heading leftward when it finishes flying and the sensor is "sar", the last flight band is omitted (if there is only one flight band it cannot be omitted).
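The four cases of S5.3.1–S5.3.4 can be sketched as one strip-list transformation. The source is terse about how the cases compose, so this is a hedged interpretation: the function name, the `start_dir`/`end_dir` string arguments, and the composition order are all assumptions.

```python
def arrange_strips(strips, start_dir, end_dir, sensor):
    """Sketch of the four S5.3 cases. A 'sar' aircraft images in only
    one direction, so leftward passes at the start or end are dropped,
    but never when only one strip would remain."""
    strips = list(strips)
    if start_dir == 'left':                          # cases 2-3
        if sensor == 'sar' and len(strips) > 1:
            strips = strips[1:]                      # case 2: skip first band
        strips = strips[::-1]                        # case 3: reverse all bands
    if end_dir == 'left' and sensor == 'sar' and len(strips) > 1:
        strips = strips[:-1]                         # case 4: drop last band
    return strips                                    # case 1: unchanged
```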
And S6, judging the rationality of the flight path planning result information, and finally obtaining the flight path information of the target flight area and the planning result of aircraft distribution under the correct basic parameters.
The aircraft fly together, and the paths of the multiple aircraft are allocated for the shortest total time under the condition that the aerial images completely cover the flight area. The route planning result includes an identifier of whether planning succeeded, and the specific implementation steps of S6 may be:
s6.1, if the planning succeeds, return a list containing the planning result for each aircraft. The planning result comprises the specific aerial photography point information, the aircraft's course photo size, lateral photo size, actual ground resolution, actual flying height, actual course shooting interval, actual lateral shooting interval, number of routes, course length, the estimated time for the aircraft to finish the target flight area, the aircraft parameters and the image acquirer parameters. The aerial photography point information comprises the point's serial number, longitude, latitude and altitude, the longitude and latitude of the four vertices of the aerial image at that point, a control code (a mark of the aircraft payload type), and point information (whether the aerial photography point is an entry point, a middle point or an exit point);
s6.2, if the planning fails, feeding back failed specific information, wherein the failed specific information comprises incorrect input parameters and problems in parameter combination.
The invention provides flight path planning for multiple payloads and multiple unmanned aerial vehicles in a concave polygon area. Based on the requirements of the remote sensing task (multiple payloads, multiple aircraft, an actual geographic base map, course overlap degree, lateral overlap degree, resolution, and so on), the planning considers the acquisition of the flight points corresponding to the geographic base map, so that the remote sensing task requirements can be met. On the premise that the aircraft do not collide, the flight paths are optimized and aircraft are allocated so that the total parallel flight time is shortest, giving the parallel flight path planning high practicability. By contrast, traditional UAV track planning only requires the UAV path to cover the flight area, plans a single aircraft in the concave polygon area, and optimizes the path so that its total length is shortest.
Compared with the prior art, the method considers the particularity of the remote sensing task: based on the information obtained from different types of sensor images during UAV flight, flight path planning for several aircraft flying together is performed over the concave polygon flight area. The flight area is covered by the UAV paths, and, based on the required spatial resolution, course overlap degree and lateral overlap degree of the remote sensing images, both the capture of aerial images during flight and the later stitching of the captured remote sensing images are effectively guaranteed. Through the mutual transformation between the real geographic coordinate system and the ideal coordinate system, the deformation caused by map projection is resolved, enabling smooth stitching of the later aerial images. Because remote sensing imaging requires accurate positioning, the planning obtains the geographic position of each shooting point during UAV flight and realizes fixed-point shooting, solving the problem that factors in the flight process, such as wind speed and the aircraft's own mechanics, make timed shooting inaccurate and later stitching poor.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other. Details which are not described in detail in the embodiments of the invention belong to the prior art which is known to the person skilled in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (6)

1. A concave polygon area unmanned aerial vehicle flight path planning method based on a remote sensing task, characterized in that it comprises:
step 1, obtaining basic parameters of a target flight area and relevant information of an aircraft, wherein the basic parameters comprise the size of a task area, a flight direction, an expected sidewise overlapping degree, an expected course overlapping degree and a maximum allowable flight altitude of the area;
step 2, acquiring related parameters related to the flight process according to the acquired basic parameters, wherein the related parameters comprise an actual flight direction, an actual flight height, an actual course overlapping degree, an actual lateral overlapping degree, an actual course shooting interval, an actual lateral shooting interval and an actual yaw distance;
step 3, acquiring a course end point set based on the flight area and the flight zone information, dividing the target flight area into a plurality of convex polygons, and merging the polygons of the small areas;
step 4, acquiring a course shooting point in an ideal coordinate system based on the course shooting interval, performing coordinate conversion on the course shooting point to obtain position information of the shooting point in a correct coordinate system, and obtaining list information of courses in the target flight area according to rules of different areas firstly and then different courses;
step 5, distributing the aerial photographing points to corresponding aircrafts according to the aircraft parameters;
step 6, judging the rationality of the flight path planning result information, and finally obtaining the flight path information of the target flight area and the planning result of aircraft distribution under correct basic parameters;
the specific steps of the step 3 comprise:
3.1, establishing an ideal coordinate system by taking the center of the target flight area as the origin of coordinates, the flight direction as the positive direction of an x axis and the direction vertical to the x axis as a y axis according to the flight area information, and transferring the target flight area from a real geographic coordinate system to the ideal coordinate system;
step 3.2, acquiring a lateral shooting distance to obtain a proper heading strip, and adding fly-back processing to the sar sensor, namely halving a lateral shooting interval and adding a fly-back strip;
step 3.3, obtaining a boundary point of the intersection of the strip where the air route is located and the area;
3.4, according to intersection point information on each flight band, dividing a target flight area into a plurality of convex polygons by adopting a scanning line method;
step 3.5, merging the concave polygons obtained after the scan-line segmentation between adjacent polygons, merging the small-area polygons meeting any one of three criteria, the three criteria being: an area with fewer than 3 flight lines; an area whose total flight length is less than 1/5 of the total range of the aircraft; and areas for which, after adjacent convex polygons are merged, the increased route portion is smaller than the decreased route portion;
the specific steps of the step 5 comprise:
step 5.1, calculating the minimum number of the required aircrafts for meeting the flight path planning by using the maximum flight distance of the aircrafts and the total length of regional air routes;
step 5.2, carrying out aircraft distribution and flight path planning according to the actual number of the aircraft;
step 5.3, carrying out aerial photography point distribution on each area, wherein the aircrafts take off in the same direction in the aerial photography point distribution process, and the aircrafts take off in the same direction and face the right uniformly as a positive direction by default;
the specific steps of converting the real geographic coordinate system to the ideal coordinate system in step 3.1 include:
step 3.1.1, acquiring the minimum bounding rectangle of the longitude and latitude of the real geographic coordinate system according to the flight area, putting the bounding rectangle of the real geographic coordinate system in correspondence with the bounding rectangle of the plane coordinate system, and establishing the corresponding matrix relation, namely the transformation matrix H1, through the four vertices of the bounding rectangle of the real geographic coordinate system and the vertices of the plane coordinate system; according to the correspondence between the coordinate systems, Y = H1X converts the real geographic coordinate system into the plane coordinate system, where X is the coordinate of a real point in the real geographic coordinate system and Y is the coordinate of the corresponding point in the plane coordinate system;
step 3.1.2, establishing the ideal coordinate system with the flight direction as the x-axis direction and the direction perpendicular to the x-axis as the y-axis direction; the flight direction is transformed to the plane coordinate system direction by the H1 matrix and then rotated onto the x-axis according to the corresponding rotation matrix H2, i.e., the plane coordinate system is converted into the ideal coordinate system by the rotation H2;
step 3.1.3, based on the transformation matrix H1 and the rotation matrix H2, establishing the relation between the real geographic coordinate system and the ideal coordinate system as H = H2H1; the real geographic coordinate system is then obtained from the ideal coordinate system by the inverse matrix transformation of H;
the specific steps of step 3.3 include:
3.3.1, establishing flight zone rectangles by taking the width of the ground corresponding to the photo as the width and taking a certain multiple of the length in the flight direction of the obtained area as the length, obtaining the intersection of each flight zone rectangle and the flight area based on the opencv library function, wherein if only one element in the intersection is available, the flight zone is not cut by the flight area, and otherwise, the flight zone is cut;
step 3.3.2, calculating the minimum outsourcing rectangle of each intersection element, taking the left end point and the right end point of the minimum outsourcing rectangle as intersection segmentation points, and taking the y-axis value of the intersection segmentation points as the central value of the navigation band y;
and 3.3.3, if the distance between the two separated segmentation points is less than a certain threshold value, deleting the two separated segmentation points, namely merging the line segments obtained by segmenting the two segmentation points.
2. The unmanned aerial vehicle flight path planning system based on concave polygon area of remote sensing task as claimed in claim 1, wherein the specific step of convex polygon division in step 3.4 comprises:
3.4.1, dividing from top to bottom: taking the first two intersection points of the first flight band and the first two intersection points of the next flight band, and determining whether the x-axis segment between the intersection points of the current band overlaps the x-axis segment between the intersection points of the next band; if they overlap, continuing the comparison with the following band until the segments of two bands do not overlap or all flight bands have been compared;
3.4.2, obtaining a judgment result after the comparison is finished, forming a new convex polygon by the first two intersection points used in the comparison process, deleting the points from the intersection points of the flight band, and deleting the flight band without the intersection points;
and 3.4.3, repeating the steps 3.4.1-3.4.2 until no flight band remains, namely converting the concave polygon into a plurality of convex polygons.
3. The unmanned aerial vehicle flight path planning system based on concave polygonal areas of remote sensing tasks as claimed in claim 2, wherein the basic parameter obtaining step of judging three criteria in step 3.5 comprises:
3.5.1, taking the tail end point of the last flight band of one area as a flying-out point, taking the initial point of the first flight path of the adjacent area as a flying-in point, and obtaining the sum of the absolute value of the x-axis coordinate and the absolute value of the y-axis coordinate between the flying-out point and the flying-in point, namely the part of unmanned aerial vehicle flight path planning reduction after polygon combination;
and 3.5.2, acquiring the same flight band of the two combined polygons, and calculating separation intervals generated by the same flight band, wherein the sum of the separation intervals of all the strips is the increased part of the unmanned aerial vehicle flight path planning after the polygons are combined.
4. The unmanned aerial vehicle flight path planning system based on the concave polygonal area of the remote sensing task as claimed in claim 3, wherein the specific steps of the step 5.2 comprise:
step 5.2.1, if the number of the aircrafts is 0, planning the flight path according to the condition of the minimum number of the aircrafts;
step 5.2.2, if the number of the aircrafts is not 0 and is less than the minimum number of the aircrafts, reporting an error, and recalculating the flying distance of the aircrafts to ensure that the number of the aircrafts in each area is as uniform as possible, so that the total flying time is shortest;
and 5.2.3, if the number of the aircrafts is not 0 and is not less than the minimum required number of the aircrafts, planning the flight path according to the condition of the minimum required number of the aircrafts.
5. The unmanned aerial vehicle flight path planning system based on concave polygonal areas of remote sensing tasks as claimed in claim 4, wherein the specific step of aerial shoot point assignment in step 5.3 comprises:
step 5.3.1, if the initial flight direction of the aircraft is rightward, normally recording all waypoints of the aircraft;
step 5.3.2, if the initial flight direction of the aircraft is leftward and the sensor is 'sar', omitting the first flight band and not counting the length;
step 5.3.3, if the initial flight direction of the aircraft is leftward and the sensor is 'camera' or 'sar', all the flight belts need to be reversed;
and 5.3.4, if the direction of the aircraft is leftward and the sensor is 'sar' when the aircraft finishes flying, omitting the last flight band.
6. The unmanned aerial vehicle track planning system based on concave polygonal area of remote sensing task of claim 1, wherein the shooting sensor type of the aircraft in step 1 comprises 3 types, the first type is an optical camera of "camera" type, shooting is carried out vertically downwards, and one shooting point takes one picture; the second type is a 'vido' type optical camera which controls the starting and stopping based on the information of a shooting starting point and an end point, shoots vertically downwards and carries out video recording; the third type is an "sar" type radar camera, which controls start and stop based on shooting start point and end point information, shoots obliquely downward, and performs in a scan line manner.
CN202110500864.XA 2021-05-08 2021-05-08 Concave polygon area unmanned aerial vehicle flight path planning based on remote sensing task Active CN113220027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110500864.XA CN113220027B (en) 2021-05-08 2021-05-08 Concave polygon area unmanned aerial vehicle flight path planning based on remote sensing task

Publications (2)

Publication Number Publication Date
CN113220027A CN113220027A (en) 2021-08-06
CN113220027B (en) 2022-04-08

Family

ID=77094040

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110500864.XA Active CN113220027B (en) 2021-05-08 2021-05-08 Concave polygon area unmanned aerial vehicle flight path planning based on remote sensing task

Country Status (1)

Country Link
CN (1) CN113220027B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114111799B (en) * 2021-12-07 2023-08-15 青岛市勘察测绘研究院 Unmanned aerial vehicle aerial-shooting path planning method for high-macromonomer fine modeling
CN114637305B (en) * 2022-02-15 2023-08-15 山东省计算中心(国家超级计算济南中心) Unmanned aerial vehicle shortest path planning method and device
CN115410104B (en) * 2022-09-16 2023-06-16 湖南胜云光电科技有限公司 Data processing system for acquiring image acquisition points of aircraft
CN115793716B (en) * 2023-02-13 2023-05-09 成都翼比特自动化设备有限公司 Automatic optimization method and system for unmanned aerial vehicle route
CN117470199B (en) * 2023-12-27 2024-03-15 天津云圣智能科技有限责任公司 Swing photography control method and device, storage medium and electronic equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110675414A (en) * 2019-09-30 2020-01-10 广州极飞科技有限公司 Land parcel segmentation method and device, electronic equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106447671A (en) * 2016-09-09 2017-02-22 浙江大学 Automatic vector polygon segmentation method based on designated areas
CN107478231A (en) * 2017-08-10 2017-12-15 千寻位置网络有限公司 Unmanned plane Route Planning Algorithm based on polygon obstacle detection
JP6900333B2 (en) * 2018-02-13 2021-07-07 日本電信電話株式会社 Network data generator, network data generation method, and program

Also Published As

Publication number Publication date
CN113220027A (en) 2021-08-06

Similar Documents

Publication Publication Date Title
CN113220027B (en) Concave polygon area unmanned aerial vehicle flight path planning based on remote sensing task
US7363157B1 (en) Method and apparatus for performing wide area terrain mapping
CA2657957C (en) Geospatial modeling system providing building roof type identification features and related methods
Yahyanejad et al. Incremental mosaicking of images from autonomous, small-scale uavs
US5259037A (en) Automated video imagery database generation using photogrammetry
CN107251055B (en) Corridor capture
RU2562707C2 (en) Systems and methods of capture of large-area images by parts including cascade cameras and/or calibration attributes
JP5363752B2 (en) Road marking map generation method
US9230300B2 (en) Method for creating a mosaic image using masks
KR102195051B1 (en) System and method for creating spatial information using image information from drone and computer program for the same
CN109931912A (en) A kind of aviation oblique photograph method and device
US10877143B2 (en) Method and device for geo-referencing aerial image data with the aid of SAR image data
JP6080641B2 (en) 3D point cloud analysis method
Gerke Dense matching in high resolution oblique airborne images
CN112652065A (en) Three-dimensional community modeling method and device, computer equipment and storage medium
JP2015114954A (en) Photographing image analysis method
CN110825110A (en) Acquisition flight method for power line visible light point cloud resolving photo
JP2009217524A (en) System for generating and browsing three-dimensional moving image of city view
CN113188520B (en) Planning method and system for regional block surrounding type route and aerial photography method
CN115014361A (en) Air route planning method, device and computer storage medium
CN111047231A (en) Inventory method and system, computer system and computer readable storage medium
CN116433845A (en) Strange environment rapid modeling method and system based on multi-unmanned aerial vehicle cooperation
CN111506112A (en) Unmanned aerial vehicle oblique photography method based on offshore oil and gas field equipment facility
CN114757978B (en) Remote sensing satellite multi-camera multi-load image pairing method
CN113535863B (en) Moving track rendering method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant