CN111735458B - Navigation and positioning method of petrochemical inspection robot based on GPS, 5G and vision - Google Patents


Info

Publication number: CN111735458B (grant of application CN202010771058.1A; earlier published as CN111735458A)
Authority: CN (China)
Other languages: Chinese (zh)
Legal status: Active
Inventors: 赖欣, 张晨蕾, 江红, 李奥华, 李嘉禾, 王森, 王宝宝, 谈峻菡
Original and current assignee: Southwest Petroleum University
Application filed by Southwest Petroleum University; priority to CN202010771058.1A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; navigational instruments not provided for in groups G01C1/00–G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass


Abstract

The invention relates to a navigation and positioning method for a petrochemical inspection robot based on GPS, 5G and vision, which realizes seamless, accurate positioning of the inspection robot while it operates in a variety of complex environments. The technical scheme is as follows: a robot error model and a measurement model are established using a parallel implicit equal-weight particle filter positioning algorithm, and each particle is updated by adjusting the two parameters α and β to obtain a local state estimate and a local covariance. These are transmitted to the corresponding distributed multi-sensor fusion modules, and the fusion information positioning module fuses the combined state estimates using the optimal information fusion criterion to decide the global optimal state estimate. The invention integrates 5G technology and adopts multi-sensor data fusion; it has short time delay and high precision, and expands the application scenarios of the inspection robot.

Description

Navigation and positioning method of petrochemical inspection robot based on GPS, 5G and vision
Technical Field
The invention relates to a navigation and positioning method of a petrochemical inspection robot based on GPS, 5G and vision, belonging to the field of robot navigation and positioning.
Background
With the development of modern society and advances in technology, robotic automatic inspection has developed rapidly and is widely applied. Automatic inspection is needed in many settings, such as petrochemical plants, oil fields, natural gas stations and oil transfer stations. These sites share characteristics such as large working areas, complex working environments, and safety hazards that are difficult to troubleshoot, all of which make manual inspection difficult and strenuous. Traditional manual inspection is inefficient and labor-intensive, and in certain situations it is dangerous because of uncertain factors. On the one hand, manual inspection may leave hidden faults unchecked in time, potentially causing great property loss or even casualties for companies and enterprises. On the other hand, manual troubleshooting is inefficient. Inspection targets at these workplaces may be located both indoors and outdoors, the working environments are complex and varied, and severe weather such as rain, thunder and snow may be encountered. If an inspection robot can replace manual work for these tasks, it not only reduces the difficulty and intensity of the staff's work but also, to a certain extent, avoids casualties caused by operating in hazardous environments. Therefore, research on an intelligent inspection robot with powerful inspection functions, a wide application range, adaptability to the environment and accurate detection results is imperative.
Autonomous navigation and positioning is the key research problem for intelligent inspection robots: it answers the question "where am I?" during actual operation. This determines not only whether the inspection robot can complete its task according to the preset plan; the accuracy and speed of positioning are also the decisive factors in whether a mobile robot can be applied in practice.
At present, there are many theories and studies related to inspection robots, and certain applications and developments have been achieved. Several navigation and positioning methods and their advantages and disadvantages are described below:
(1) Navigation and positioning based on lidar and SLAM: this method is applied to indoor mobile robots and substation robots. Its main drawbacks are that the inspection range is limited, adaptability to the environment is insufficient, accurate positioning cannot be achieved in feature-sparse settings, and laser technology is strongly affected by weather, so adaptability to weather changes is poor.
(2) Navigation and positioning based on light tracking and infrared images: this method, together with magnetic-strip guidance and radio-frequency identification (RFID), achieves high navigation and positioning accuracy and good environmental adaptability within a certain range. However, navigation lights must be arranged in advance, which increases the work difficulty, and later maintenance costs are high. Moreover, it can only work in an environment equipped with navigation lights; elsewhere, navigation and positioning capability is essentially lost or navigation precision is degraded.
(3) GPS track positioning based on the Beidou satellite system: current GPS track positioning uses single-frequency positioning with a single data source, so positioning-accuracy problems easily occur when the data are abnormal.
In addition, some multi-sensor fusion positioning methods have been studied, but most are only usable on specific occasions; in most cases signal stability is difficult to guarantee, and adaptability to general settings is poor.
Multi-sensor data fusion can improve navigation and positioning accuracy and adapts well to common working environments. At present, 5G technology in China is developing rapidly; compared with 4G, 5G offers high speed, low power consumption, a ubiquitous network and low latency, bringing great convenience to fields such as communication and navigation positioning. Compared with traditional nonlinear filtering methods, particle filtering directly adopts a nonlinear model, so it has greater flexibility and better estimation accuracy. The particle filtering technique has therefore gradually gained a unique position in nonlinear research and in the optimal estimation of non-Gaussian systems. Nevertheless, particle filtering still has problems: it requires a large number of samples, the algorithmic complexity grows with the sample count, and the resampling stage causes a loss of sample validity and diversity, resulting in sample impoverishment. To solve these problems, a navigation and positioning method based on parallel implicit equal-weight particle filtering and multi-sensor fusion including 5G is proposed.
Summary of the invention:
In order to solve the technical problems in the background art, the invention aims to provide a navigation and positioning method for a petrochemical inspection robot based on GPS, 5G and vision, realizing seamless, accurate positioning of the robot while it operates in a variety of complex environments and ensuring fast, accurate positioning during inspection.
To achieve this technical purpose, the petrochemical inspection robot navigation and positioning method based on GPS, 5G and vision comprises a GPS module, a 5G module, a vision module, an odometer module, a GPS judgment module, a 5G judgment module, a vision judgment module, a filter A, a filter B, a filter C, a distributed multi-sensor fusion module A, a distributed multi-sensor fusion module B, a fusion information positioning module and an environment feature map. The specific method comprises the following steps:
(1) The GPS module collects measurement information Z1(k) and sends it to the GPS judgment module at a specific frequency; the 5G module collects measurement information Z2(k) and sends it to the 5G judgment module at a specific frequency; the vision module collects measurement information Z3(k) and sends it to the vision judgment module at a specific frequency;
(2) The output of the odometer module is sent to filter A, filter B and filter C at a specific frequency. The petrochemical inspection robot takes the error of the navigation parameters output by the odometer as the state quantity and uses it to construct the odometer error model

X^(m)(k) = f(X^(m)(k-1)) + W(k),

where m is the particle index, m = 1, 2, 3, …, M, M is the total number of particle samples, X^(m)(k) is the state value of the m-th particle in the particle sample at time k, X^(m)(k-1) is the state value of the m-th particle in the particle sample at time k-1, and W(k) is the model noise. The model is a prior (first-order Markov) estimate: the state X^(m)(k) at the current moment is related only to the previous state X^(m)(k-1);
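The error-model propagation in step (2) amounts to the standard particle prediction step. Below is a minimal sketch in Python, assuming a scalar state and a generic transition function f; the concrete odometer model f and the noise level are illustrative assumptions, since the patent does not spell them out:

```python
import random

def propagate_particles(particles, f, noise_std, rng):
    """Prediction step of the odometer error model: each particle is
    pushed through the (possibly nonlinear) transition f and perturbed
    by model noise, so X(k) depends only on X(k-1) (first-order Markov)."""
    return [f(x) + rng.gauss(0.0, noise_std) for x in particles]

rng = random.Random(0)
# Initial particle cloud (scalar states for illustration).
particles = [rng.gauss(0.0, 1.0) for _ in range(5)]
# Hypothetical transition f(x) = 0.9 * x with small process noise.
predicted = propagate_particles(particles, lambda x: 0.9 * x, 0.1, rng)
print(len(predicted))  # 5: one predicted state per particle
```

The same call would run once per filter (A, B, C) in the parallel scheme, each on its own particle set.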
(3) The GPS judgment module, combined with the environment feature map, judges the available measurement information and sends it to filter A; the 5G judgment module, combined with the environment feature map, judges the available measurement information and sends it to filter B; and the vision judgment module, combined with the environment feature map, judges the available measurement information and sends it to filter C. Upon receiving the information, filter A, filter B and filter C take the difference between the odometer output and the measurement information output by the GPS module, 5G module and vision module as the measurement equation to construct the measurement model

z_j(k) = h_j(X(k)) + V_j(k),

and calculate the importance weight:

w̃_j^(m)(k) = p(z_j(k) | X_j^(m)(k)) p(X_j^(m)(k) | X_j^(m)(k-1)) / [ p(z_j(k)) q(X_j^(m)(k) | X_j^(m)(k-1), z_j(k)) ],

then normalize the weights to obtain:

w_j^(m)(k) = w̃_j^(m)(k) / Σ_{i=1}^{M} w̃_j^(i)(k),

where z_j(k), for j = 1, 2, 3, denotes the measurement information collected by the GPS module, 5G module and vision module respectively; w̃_j^(m)(k), for j = 1, 2, 3, denotes the importance weight of the m-th particle in filter A, filter B and filter C respectively; w_j^(m)(k), for j = 1, 2, 3, denotes the normalized importance weight of the m-th particle in filter A, filter B and filter C respectively; p(z_j(k) | X_j^(m)(k)) is the likelihood probability density function, i.e. the probability of observing z_j(k) given the petrochemical inspection robot state X_j^(m)(k); p(z_j(k)) is the marginal probability; q(·) is the importance density function; and X_j^(m)(k), for j = 1, 2, 3, denotes the m-th sample value in the particle sample of filter A, filter B and filter C at time k, drawn according to the importance density function q(X_j^(m)(k) | X_j^(m)(k-1), z_j(k));
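The weighting and normalization of step (3) can be sketched as follows. This is a simplification under stated assumptions: a scalar state, a Gaussian likelihood, and the transition density used as the proposal q (so the prior and proposal terms cancel and the weight reduces to the likelihood); the patent's full expression also carries the marginal p(z_j(k)) as a normalizing constant:

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of N(mean, std^2) evaluated at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def importance_weights(particles, z, h, meas_std):
    """Importance weights w^(m) ∝ p(z | X^(m)) under a Gaussian
    measurement model z = h(X) + V, normalized to sum to 1."""
    raw = [gaussian_pdf(z, h(x), meas_std) for x in particles]
    total = sum(raw)
    return [w / total for w in raw]

# Three particles; the middle one best explains the measurement z = 0.5.
w = importance_weights([0.0, 0.5, 1.0], z=0.5, h=lambda x: x, meas_std=0.2)
print(round(sum(w), 6))  # 1.0 after normalization
```

Each of the three filters would apply this with its own measurement z_j(k).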
(4) Filter A, filter B and filter C use the parallel implicit equal-weight particle filter positioning algorithm with the robot error model and the measurement model to obtain the corresponding local state estimate X̂_j(k) and local covariance P̂_j(k), where X̂_j(k), for j = 1, 2, 3, denotes the local state estimate of the petrochemical inspection robot output by filter A, filter B and filter C respectively, and P̂_j(k), for j = 1, 2, 3, denotes the corresponding local covariance. Filter A sends its local state estimate X̂_1(k) and local covariance P̂_1(k) to distributed multi-sensor fusion module A at a specific frequency; filter B sends X̂_2(k) and P̂_2(k) to distributed multi-sensor fusion module A and distributed multi-sensor fusion module B; filter C sends X̂_3(k) and P̂_3(k) to distributed multi-sensor fusion module B.

The local state estimate of the petrochemical inspection robot is updated as:

X̂_j(k) = Σ_{m=1}^{M} w_j^(m)(k) X_j^(m)(k),

and the local covariance is updated as:

P̂_j(k) = Σ_{m=1}^{M} w_j^(m)(k) (X_j^(m)(k) - X̂_j(k)) (X_j^(m)(k) - X̂_j(k))^T;
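The local state estimate and local covariance of step (4) are the weighted mean and weighted spread of the particle cloud. A scalar sketch (the patent's states are vectors, so the covariance there is a matrix of outer products):

```python
def local_estimate(particles, weights):
    """Weighted local state estimate and (scalar) local covariance:
    x_hat = sum_m w_m * x_m
    p     = sum_m w_m * (x_m - x_hat)**2"""
    x_hat = sum(w * x for w, x in zip(weights, particles))
    p = sum(w * (x - x_hat) ** 2 for w, x in zip(weights, particles))
    return x_hat, p

x_hat, p = local_estimate([1.0, 2.0, 3.0], [0.2, 0.5, 0.3])
print(round(x_hat, 6), round(p, 6))  # 2.1 0.49
```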
(5) Distributed multi-sensor fusion module A receives the local state estimate X̂_1(k) and local covariance P̂_1(k) from filter A and the local state estimate X̂_2(k) and local covariance P̂_2(k) from filter B. Using the optimal information fusion criterion, distributed multi-sensor fusion module A obtains the combined state estimate:

X_1(k) = A_1 X̂_1(k) + A_2 X̂_2(k),

with the combined covariance:

P_1(k) = (E^T Σ_1^{-1} E)^{-1},

where [A_1, A_2] = (E^T Σ_1^{-1} E)^{-1} E^T Σ_1^{-1} is the optimal weight matrix, Σ_1 = diag(P̂_1(k), P̂_2(k)), E = (e_4, e_4)^T, and e_4 is the 4 × 4 identity matrix. The fused combined state estimate X_1(k) and combined covariance P_1(k) are sent to the fusion information positioning module;
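For scalar, uncorrelated local estimates, the matrix-weighted optimal information fusion criterion above reduces to inverse-covariance weighting; the patent's E and Σ formulation generalizes this to 4-dimensional state vectors. A minimal sketch of the scalar reduction:

```python
def fuse(x1, p1, x2, p2):
    """Fuse two uncorrelated estimates (x1, p1) and (x2, p2):
    weights are inversely proportional to each covariance, and the
    fused covariance is the harmonic combination of p1 and p2."""
    a1 = p2 / (p1 + p2)              # weight on estimate 1
    a2 = p1 / (p1 + p2)              # weight on estimate 2
    x = a1 * x1 + a2 * x2            # combined state estimate
    p = 1.0 / (1.0 / p1 + 1.0 / p2)  # combined covariance
    return x, p

x, p = fuse(10.0, 4.0, 12.0, 4.0)
print(x, p)  # 11.0 2.0 -- equal covariances give equal weights
```

Note that the fused covariance (2.0) is smaller than either input covariance, which is the point of the fusion step.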
Distributed multi-sensor fusion module B receives the local state estimate X̂_2(k) and local covariance P̂_2(k) from filter B and the local state estimate X̂_3(k) and local covariance P̂_3(k) from filter C. Using the optimal information fusion criterion, distributed multi-sensor fusion module B obtains the combined state estimate:

X_2(k) = A_3 X̂_2(k) + A_4 X̂_3(k),

with the combined covariance:

P_2(k) = (E^T Σ_2^{-1} E)^{-1},

where [A_3, A_4] = (E^T Σ_2^{-1} E)^{-1} E^T Σ_2^{-1} is the optimal weight matrix and Σ_2 = diag(P̂_2(k), P̂_3(k)). The fused combined state estimate X_2(k) and combined covariance P_2(k) are sent to the fusion information positioning module;
(6) The fusion information positioning module fuses the received combined state estimates and combined covariances using the optimal information fusion criterion to decide the global optimal state estimate X and global covariance P:

X = B_1 X_1(k) + B_2 X_2(k),    P = (E^T Σ_3^{-1} E)^{-1},

where [B_1, B_2] = (E^T Σ_3^{-1} E)^{-1} E^T Σ_3^{-1} is the optimal weight matrix and Σ_3 = diag(P_1(k), P_2(k));
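The two-level hierarchy of steps (5) and (6) can be illustrated with the same scalar reduction of the fusion criterion. This toy version, like the diag(·) form above, ignores the cross-correlation introduced by filter B feeding both fusion modules; a criterion carrying cross-covariances would account for it. All numeric inputs are illustrative:

```python
def fuse(x1, p1, x2, p2):
    """Scalar optimal fusion: inverse-covariance weighted mean."""
    x = (p2 * x1 + p1 * x2) / (p1 + p2)
    p = p1 * p2 / (p1 + p2)
    return x, p

# Level 1: module A fuses filters A and B; module B fuses filters B and C.
xA, pA = fuse(10.0, 4.0, 11.0, 2.0)   # filter A + filter B
xB, pB = fuse(11.0, 2.0, 12.0, 6.0)   # filter B + filter C
# Level 2: the fusion information positioning module combines both modules.
x_global, p_global = fuse(xA, pA, xB, pB)
print(p_global < pA and p_global < pB)  # True: each fusion level tightens the estimate
```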
Further, based on the above navigation and positioning method, in step (3) each particle is updated by adjusting the two parameters α and β:

X_j^(m)(k) = X̄_j^(m)(k) + √(α^(m)) P_j^{1/2}(k-1) ξ^(m) + √β P_j^{1/2}(k-1) S^(m).

Since X̄_j^(m)(k) is a deterministic movement of the particle, by the coordinate transformation ξ^(m) → X_j^(m)(k) the importance weight becomes

w_j^(m)(k) ∝ p(z_j(k) | X_j^(m)(k)) p(X_j^(m)(k) | X_j^(m)(k-1)) |J^(m)| / q(ξ^(m)),

where |J^(m)| is the absolute value of the Jacobian determinant of the transformation; P_j(k-1), for j = 1, 2, 3, is the local covariance output by filter A, filter B and filter C at time k-1; ξ^(m) is drawn from a standard multivariate Gaussian distribution; S^(m) is a random vector; and ξ^(m), S^(m) and q(ξ) all follow the normal distribution N(0, 1), the probability distribution with expectation 0 and standard deviation 1.
Further, based on the above positioning method, in step (4) the parallel implicit equal-weight particle filter positioning algorithm proceeds as follows:

① Initialization, k = 0: draw M samples {X^(m)(0)}, m = 1, …, M, from the prior distribution p(X(0)); the corresponding weights are all 1/M;

② Sampling: obtain the samples X^(m)(k) at time k, drawn according to the importance density function q(X^(m)(k) | X^(m)(k-1), z(k));

③ Calculate the particle weights according to the measurement model:

w̃^(m)(k) = p(z(k) | X^(m)(k)) p(X^(m)(k) | X^(m)(k-1)) / [ p(z(k)) q(X^(m)(k) | X^(m)(k-1), z(k)) ],

and normalize:

w^(m)(k) = w̃^(m)(k) / Σ_{i=1}^{M} w̃^(i)(k);

④ Output the local state estimate X̂(k) = Σ_{m=1}^{M} w^(m)(k) X^(m)(k) and the local covariance P̂(k) = Σ_{m=1}^{M} w^(m)(k) (X^(m)(k) - X̂(k))(X^(m)(k) - X̂(k))^T; when the measurement at the next moment arrives, return to step ②;
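Steps ①-④ can be sketched end to end as a simple bootstrap-style particle filter loop. Scalar state, Gaussian noise, and the transition density as proposal are simplifying assumptions for this sketch; the patent's algorithm uses the implicit equal-weight proposal instead:

```python
import math
import random

def gaussian_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def particle_filter(measurements, f, h, q_std, r_std, m_total, rng):
    """Steps 1-4 of the per-filter loop: initialize M samples from the
    prior, then for each incoming measurement propagate (step 2),
    weight and normalize (step 3), and output the local state estimate
    and local covariance (step 4)."""
    particles = [rng.gauss(0.0, 1.0) for _ in range(m_total)]        # step 1
    estimates = []
    for z in measurements:
        particles = [f(x) + rng.gauss(0.0, q_std) for x in particles]  # step 2
        raw = [gaussian_pdf(z, h(x), r_std) for x in particles]        # step 3
        total = sum(raw)
        w = [v / total for v in raw]
        x_hat = sum(wi * xi for wi, xi in zip(w, particles))           # step 4
        p = sum(wi * (xi - x_hat) ** 2 for wi, xi in zip(w, particles))
        estimates.append((x_hat, p))                                   # back to step 2
    return estimates

rng = random.Random(0)
est = particle_filter([0.2, 0.4], lambda x: x, lambda x: x, 0.1, 0.5, 200, rng)
print(len(est))  # one (estimate, covariance) pair per measurement
```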
Further, based on the above navigation and positioning method, the environment feature map is a 3D point cloud map model generated from the original map through a corresponding mapping matrix.
The method uses the parallel implicit equal-weight particle filter positioning algorithm, which has greater flexibility and better estimation accuracy and overcomes the drawbacks of common particle filter algorithms, such as the need for a large number of samples, high algorithmic complexity and a tendency toward sample impoverishment. 5G technology is integrated for positioning, with short time delay and high precision; indoor positioning is well supported, expanding the application scenarios of the inspection robot. Multi-sensor data fusion improves navigation and positioning accuracy with high speed, low consumption and low latency, adapts well to the working environment of the petrochemical inspection robot, and ensures seamless operation of the robot across all scenarios and working conditions.
Drawings
FIG. 1 shows the components of the system of the present invention.
FIG. 2 is a flow chart of the method of the present invention.
In the figures: 1. GPS module; 2. 5G module; 3. vision module; 4. odometer module; 5. GPS judgment module; 6. 5G judgment module; 7. vision judgment module; 8. filter A; 9. filter B; 10. filter C; 11. distributed multi-sensor fusion module A; 12. distributed multi-sensor fusion module B; 13. fusion information positioning module; 14. environment feature map.
Detailed Description
The positioning method of the present invention will be further described with reference to fig. 1 and 2.
The positioning method based on the navigation positioning system comprises the following specific implementation steps:
step S1, when the inspection robot is started to start normal operation, the robot state and the measurement information z of the GPS module 1, the 5G module 2 and the vision module 3 are initializedi(k) Wherein k represents discrete time, zi(k) When i is 1,2 and 3, the measurement information of the GPS module 1, the 5G module 2 and the vision module 3 is respectively represented;
step S2, in the normal operation process of the robot, the GPS module 1 automatically collects the GPS measurement information with specific frequency and transmits the information to the GPS judgment module 5 in the normal operation process of the petrochemical inspection robot, and the GPS judgment module 5 judges whether the measurement information is effective or not; the 5G module 2 automatically collects 5G measurement information at a specific frequency and transmits the 5G measurement information to the 5G judgment module 6 in the normal operation process of the petrochemical inspection robot, and the 5G judgment module 6 judges whether the measurement information is effective or not; the vision module 3 automatically collects image information data at a specific frequency and transmits the image information data to the vision judging module 7 in the normal operation process of the petrochemical inspection robot, the vision judging module 7 judges whether the measurement information is effective or not, and the specific judging method is introduced in detail in step S4;
step S3, the output quantity of the odometer module 4 is sent to a filter A8, a filter B9 and a filter C10 at a specific frequency, the error quantity of the navigation parameters output by the odometer is used as a state quantity by the petrochemical inspection robot, and an odometer error model is constructed by combining the state quantity
Figure GDA0002729297440000071
M is the particle number, and M is 1,2,3, …, M is the total number of particle samples,
Figure GDA0002729297440000072
is the state value of the m-th particle in the particle sample at time k,
Figure GDA0002729297440000073
is the state value of the mth particle in the particle sample at time k-1,
Figure GDA0002729297440000074
for a priori estimation, i.e. the state at the current moment
Figure GDA0002729297440000075
Only with the last state
Figure GDA0002729297440000076
(ii) related;
Step S4: the GPS judgment module 5, combined with the environment feature map 14, judges the available measurement information and sends it to filter A8; the 5G judgment module 6, combined with the environment feature map 14, judges the available measurement information and sends it to filter B9; and the vision judgment module 7, combined with the environment feature map 14, judges the available measurement information and sends it to filter C10. Upon receiving the information, filter A8, filter B9 and filter C10 take the difference between the odometer output and the measurement information output by the GPS module 1, 5G module 2 and vision module 3 as the measurement equation to construct the measurement model

z_j(k) = h_j(X(k)) + V_j(k),

and calculate the importance weight:

w̃_j^(m)(k) = p(z_j(k) | X_j^(m)(k)) p(X_j^(m)(k) | X_j^(m)(k-1)) / [ p(z_j(k)) q(X_j^(m)(k) | X_j^(m)(k-1), z_j(k)) ],

then normalize the weights to obtain:

w_j^(m)(k) = w̃_j^(m)(k) / Σ_{i=1}^{M} w̃_j^(i)(k),

where z_j(k), for j = 1, 2, 3, denotes the measurement information collected by the GPS module 1, 5G module 2 and vision module 3 respectively; w̃_j^(m)(k), for j = 1, 2, 3, denotes the importance weight of the m-th particle in filter A8, filter B9 and filter C10 respectively; w_j^(m)(k) denotes the corresponding normalized importance weight; p(z_j(k) | X_j^(m)(k)) is the likelihood probability density function, i.e. the probability of observing z_j(k) given the petrochemical inspection robot state X_j^(m)(k); p(z_j(k)) is the marginal probability; q(·) is the importance density function; and X_j^(m)(k), for j = 1, 2, 3, denotes the m-th sample value in the particle sample of filter A8, filter B9 and filter C10 at time k, drawn according to the importance density function q(X_j^(m)(k) | X_j^(m)(k-1), z_j(k)).
Each particle is updated by adjusting the two parameters α and β:

X_j^(m)(k) = X̄_j^(m)(k) + √(α^(m)) P_j^{1/2}(k-1) ξ^(m) + √β P_j^{1/2}(k-1) S^(m).

Since X̄_j^(m)(k) is a deterministic movement of the particle, by the coordinate transformation ξ^(m) → X_j^(m)(k) the importance weight becomes

w_j^(m)(k) ∝ p(z_j(k) | X_j^(m)(k)) p(X_j^(m)(k) | X_j^(m)(k-1)) |J^(m)| / q(ξ^(m)),

where |J^(m)| is the absolute value of the Jacobian determinant of the transformation; P_j(k-1), for j = 1, 2, 3, is the local covariance output by filter A8, filter B9 and filter C10 at time k-1; ξ^(m) is drawn from a standard multivariate Gaussian distribution; S^(m) is a random vector; and ξ^(m), S^(m) and q(ξ) all follow the normal distribution N(0, 1), the probability distribution with expectation 0 and standard deviation 1.
The GPS judgment module 5 judges whether the GPS module 1 can receive, in real time, the timestamps transmitted by at least four satellites. If the GPS module 1 can receive the timestamps of four or more satellites, the measurement information Z1(k) is available and the GPS judgment module 5 transmits it to filter A8; otherwise, the measurement information is unavailable and the GPS judgment module 5 cancels the data transmission;
the 5G judging module 6 judges the measurement information Z of the 5G module 22(k) Whether the method is available or not is specifically as follows: if the measurement information Z2(k) If the measurement information is available, the 5G determining module 6 transmits the measurement information to the filter B9; otherwise, if the data is unavailable, the 5G judgment module 6 cancels the data transmission;
the vision judging module 7 is used for judging whether the collected measurement information of the vision module 3 is valid or not. The specific implementation process comprises the following steps: extracting feature points in image frames acquired from a camera in real time, matching the extracted feature points with the feature point similarity of the 3D point cloud map, and if the similarity matching degree does not exceed a set threshold T, measuring information Z at the time3(k) Visual determination module, available information) passes this measurement information to filter C10; otherwise, the measurement information is not available, and the visual judgment module 7 cancels the information transmission.
Step S5: filter A8, filter B9 and filter C10 use the parallel implicit equal-weight particle filter positioning algorithm with the robot error model and the measurement model to obtain the corresponding local state estimate X̂_j(k) and local covariance P̂_j(k), where X̂_j(k), for j = 1, 2, 3, denotes the local state estimate of the petrochemical inspection robot output by filter A8, filter B9 and filter C10 respectively, and P̂_j(k), for j = 1, 2, 3, denotes the corresponding local covariance. Filter A8 sends its local state estimate X̂_1(k) and local covariance P̂_1(k) to the distributed multi-sensor fusion module A11 at a specific frequency; filter B9 sends X̂_2(k) and P̂_2(k) to the distributed multi-sensor fusion module A11 and the distributed multi-sensor fusion module B12; filter C10 sends X̂_3(k) and P̂_3(k) to the distributed multi-sensor fusion module B12.

The local state estimate of the petrochemical inspection robot is updated as:

X̂_j(k) = Σ_{m=1}^{M} w_j^(m)(k) X_j^(m)(k),

and the local covariance is updated as:

P̂_j(k) = Σ_{m=1}^{M} w_j^(m)(k) (X_j^(m)(k) - X̂_j(k)) (X_j^(m)(k) - X̂_j(k))^T.
When the GPS judgment module 5, 5G judgment module 6 or vision judgment module 7 determines that the corresponding measurement information is unavailable — it is assumed that the information collected by at least two of the modules remains available — the distributed multi-sensor fusion module A11 and distributed multi-sensor fusion module B12 behave as follows:
When the GPS judgment module 5 judges that the measurement information Z1(k) of the GPS module 1 is unavailable, it cancels the transmission of measurement information to filter A8. In this case, the distributed multi-sensor fusion module A11 receives only the local state estimate and local covariance from filter B9;
When the 5G judgment module 6 judges that the measurement information Z2(k) of the 5G module 2 is unavailable, it cancels the transmission of measurement information to filter B9. In this case, the distributed multi-sensor fusion module A11 receives only the local state estimate and local covariance from filter A8, and the distributed multi-sensor fusion module B12 receives only the local state estimate and local covariance from filter C10;
When the vision judgment module 7 judges that the measurement information Z3(k) of the vision module 3 is unavailable, it cancels the transmission of measurement information to filter C10. In this case, the distributed multi-sensor fusion module B12 receives only the local state estimate and local covariance from filter B9.
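The degraded-mode routing described above is a simple availability gate over the filter outputs. A sketch, using None to represent a cancelled transmission (the names and the tuple placeholders are illustrative):

```python
def module_inputs(est_a, est_b, est_c):
    """Availability gating for the two fusion modules: module A fuses
    filters A and B, module B fuses filters B and C. A None estimate
    means the judgment module cancelled that sensor's data; at least
    two sensors are assumed to remain available."""
    mod_a = [e for e in (est_a, est_b) if e is not None]
    mod_b = [e for e in (est_b, est_c) if e is not None]
    return mod_a, mod_b

# 5G measurement unavailable: filter B sends nothing to either module.
mod_a, mod_b = module_inputs(("xA", "pA"), None, ("xC", "pC"))
print(len(mod_a), len(mod_b))  # 1 1 -> each module just forwards its single estimate
```

A module left with a single estimate performs no fusion and simply forwards it, matching the pass-through behavior described for the fusion modules below.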
Step S6: the distributed multi-sensor fusion module A11 receives the local state estimate $\hat{X}_1(k)$ and local covariance $\hat{P}_1(k)$ from filter A8, and the local state estimate $\hat{X}_2(k)$ and local covariance $\hat{P}_2(k)$ from filter B9. Using the optimal information fusion criterion, the distributed multi-sensor fusion module A11 obtains the combined state estimate

$$X_1(k)=A_1(k)\bigl[\hat{X}_1^{T}(k),\hat{X}_2^{T}(k)\bigr]^{T}$$

and the combined covariance

$$P_1(k)=\bigl(E^{T}\Sigma_1^{-1}(k)E\bigr)^{-1},$$

where $A_1(k)=P_1(k)E^{T}\Sigma_1^{-1}(k)$ is the optimal weight matrix, $\Sigma_1(k)=\mathrm{diag}\bigl(\hat{P}_1(k),\hat{P}_2(k)\bigr)$, $E=(e_4,e_4)^{T}$, and $e_4$ is a 4×4 identity matrix. The fused combined state estimate $X_1(k)$ and combined covariance $P_1(k)$ are sent to the fusion information positioning module 13.
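The pairwise fusion above can be sketched numerically. This is a minimal sketch assuming the standard matrix-weighted optimal information fusion with uncorrelated local estimation errors (block-diagonal $\Sigma$); the function name and the sample state values are illustrative, not from the patent.

```python
import numpy as np

def optimal_fusion(x1, P1, x2, P2):
    """Matrix-weighted optimal information fusion of two local estimates.

    With E = (e_n, e_n)^T and Sigma = diag(P1, P2) (uncorrelated local
    errors assumed), the fused covariance is P = (E^T Sigma^-1 E)^-1
    and the fused state is X = A [x1; x2] with A = P E^T Sigma^-1.
    """
    n = len(x1)
    I = np.eye(n)
    E = np.vstack([I, I])                      # stacked identity blocks
    Sigma_inv = np.block([[np.linalg.inv(P1), np.zeros((n, n))],
                          [np.zeros((n, n)), np.linalg.inv(P2)]])
    P = np.linalg.inv(E.T @ Sigma_inv @ E)     # combined covariance
    A = P @ E.T @ Sigma_inv                    # optimal weight matrix
    x = A @ np.concatenate([x1, x2])           # combined state estimate
    return x, P

# Two hypothetical 4-dimensional local estimates (pose-style state).
x1 = np.array([1.0, 2.0, 0.5, 0.1]); P1 = 0.04 * np.eye(4)
x2 = np.array([1.2, 1.8, 0.4, 0.1]); P2 = 0.01 * np.eye(4)
x, P = optimal_fusion(x1, P1, x2, P2)
```

With diagonal covariances this reduces to inverse-variance weighting: the more certain local estimate (smaller covariance) dominates the fused result.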
The distributed multi-sensor fusion module B12 receives the local state estimate $\hat{X}_2(k)$ and local covariance $\hat{P}_2(k)$ from filter B9, and the local state estimate $\hat{X}_3(k)$ and local covariance $\hat{P}_3(k)$ from filter C10. Using the optimal information fusion criterion, the distributed multi-sensor fusion module B12 obtains the combined state estimate

$$X_2(k)=A_2(k)\bigl[\hat{X}_2^{T}(k),\hat{X}_3^{T}(k)\bigr]^{T}$$

and the combined covariance

$$P_2(k)=\bigl(E^{T}\Sigma_2^{-1}(k)E\bigr)^{-1},$$

where $A_2(k)=P_2(k)E^{T}\Sigma_2^{-1}(k)$ is the optimal weight matrix and $\Sigma_2(k)=\mathrm{diag}\bigl(\hat{P}_2(k),\hat{P}_3(k)\bigr)$. The fused combined state estimate $X_2(k)$ and combined covariance $P_2(k)$ are sent to the fusion information positioning module 13.
When the distributed multi-sensor fusion module A11 or the distributed multi-sensor fusion module B12 receives the processed information of only one filter, it performs no data processing and serves purely as a transmission channel, forwarding that filter's local state estimate and local covariance to the fusion information positioning module 13.
Step S7: the fusion information positioning module 13 fuses the received combined state estimates and combined covariances according to the optimal information fusion criterion to determine the global optimal state estimate $X$ and the global covariance $P$:

$$X=A_3(k)\bigl[X_1^{T}(k),X_2^{T}(k)\bigr]^{T},\qquad P=\bigl(E^{T}\Sigma_3^{-1}(k)E\bigr)^{-1},$$

where $A_3(k)=P\,E^{T}\Sigma_3^{-1}(k)$ is the optimal weight matrix and $\Sigma_3(k)=\mathrm{diag}\bigl([P_1(k),P_2(k)]\bigr)$.

The fused positioning result is then converted back to the local coordinate system in which the sensor coordinate systems lie, and the positioning estimates and covariance matrices of filter A8, filter B9 and filter C10 are reset through this coordinate transformation for the pose estimation of the next period.
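The two-level structure of steps S6–S7 (modules A11 and B12 each fuse a pair of filters, and module 13 then fuses their two outputs) can be composed by applying the same pairwise fusion rule twice. The sketch below assumes uncorrelated local errors and uses hypothetical 2-dimensional values for brevity; note that filter B's output enters both pairwise fusions, so this cascade illustrates the architecture rather than claiming equivalence to a single joint fusion of all three sensors.

```python
import numpy as np

def fuse(x_a, P_a, x_b, P_b):
    """Pairwise optimal information fusion (uncorrelated errors assumed):
    P = (P_a^-1 + P_b^-1)^-1, x = P (P_a^-1 x_a + P_b^-1 x_b)."""
    P = np.linalg.inv(np.linalg.inv(P_a) + np.linalg.inv(P_b))
    x = P @ (np.linalg.inv(P_a) @ x_a + np.linalg.inv(P_b) @ x_b)
    return x, P

# Hypothetical local outputs of filter A (GPS), filter B (5G), filter C (vision).
xA, PA = np.array([0.0, 0.0]), 0.09 * np.eye(2)
xB, PB = np.array([0.2, 0.1]), 0.04 * np.eye(2)
xC, PC = np.array([0.1, 0.3]), 0.16 * np.eye(2)

x1, P1 = fuse(xA, PA, xB, PB)   # distributed multi-sensor fusion module A11
x2, P2 = fuse(xB, PB, xC, PC)   # distributed multi-sensor fusion module B12
X,  P  = fuse(x1, P1, x2, P2)   # fusion information positioning module 13
# X, P: global state estimate and covariance for this period; the filters
# would then be reset from this result for the next cycle.
```

The global covariance comes out strictly smaller than either intermediate one, which is the point of the hierarchical fusion.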

Claims (2)

1. A navigation and positioning method for a petrochemical inspection robot based on GPS, 5G and vision, characterized in that:

(1) the GPS module (1) collects measurement information $Z_1(k)$ and sends it at a specific frequency to the GPS judgment module (5); the 5G module (2) collects measurement information $Z_2(k)$ and sends it at a specific frequency to the 5G judgment module (6); the vision module (3) collects measurement information $Z_3(k)$ and sends it at a specific frequency to the vision judgment module (7);
(2) the output of the odometer module (4) is sent at a specific frequency to filter A (8), filter B (9) and filter C (10); the petrochemical inspection robot takes the error of the navigation parameters output by the odometer as the state quantity and, combining the state quantity, constructs the odometer error model

$$X_k^{m}\sim p\bigl(X_k^{m}\mid X_{k-1}^{m}\bigr),$$

where $m$ is the particle number, $m=1,2,3,\dots,M$, and $M$ is the total number of particle samples; $X_k^{m}$ is the state value of the $m$-th particle in the particle sample at time $k$, and $X_{k-1}^{m}$ is the state value of the $m$-th particle in the particle sample at time $k-1$; $p\bigl(X_k^{m}\mid X_{k-1}^{m}\bigr)$ is the a priori estimate, i.e. the state $X_k^{m}$ at the current time is related only to the previous state $X_{k-1}^{m}$;
(3) the GPS judgment module (5), combined with the environment feature map (14), judges the available measurement information and sends it to filter A (8); the 5G judgment module (6), combined with the environment feature map (14), judges the available measurement information and sends it to filter B (9); the vision judgment module (7), combined with the environment feature map (14), judges the available measurement information and sends it to filter C (10). Filter A (8), filter B (9) and filter C (10), upon receiving the information, adopt the difference between the odometer output and the measurement information output by the GPS module (1), the 5G module (2) and the vision module (3) as the measurement equation to construct the measurement model, and calculate the importance weight

$$w_k^{m,j}=\frac{p\bigl(Z_j(k)\mid X_k^{m,j}\bigr)\,p\bigl(X_k^{m,j}\mid X_{k-1}^{m,j}\bigr)}{p\bigl(Z_j(k)\bigr)\,q\bigl(X_k^{m,j}\mid X_{k-1}^{m,j},Z_j(k)\bigr)},$$

which after weight normalization gives

$$\bar{w}_k^{m,j}=\frac{w_k^{m,j}}{\sum_{i=1}^{M}w_k^{i,j}},$$

where $Z_j(k)$, for $j=1,2,3$, respectively represents the measurement information collected by the GPS module (1), the 5G module (2) and the vision module (3); $w_k^{m,j}$, for $j=1,2,3$, respectively represents the importance weight of the $m$-th particle in filter A (8), filter B (9) and filter C (10); $\bar{w}_k^{m,j}$, for $j=1,2,3$, respectively represents the normalized importance weight of the $m$-th particle in filter A (8), filter B (9) and filter C (10); $p\bigl(Z_j(k)\mid X_k^{m,j}\bigr)$ is the likelihood probability density function, i.e. the probability of observing $Z_j(k)$ given the state $X_k^{m,j}$ of the petrochemical inspection robot; $p\bigl(Z_j(k)\bigr)$ is the marginal probability; $q\bigl(X_k^{m,j}\mid X_{k-1}^{m,j},Z_j(k)\bigr)$ is the importance density function; $X_k^{m,j}$, for $j=1,2,3$, respectively represents the $m$-th sample value of the particle sample at time $k$ in filter A (8), filter B (9) and filter C (10), drawn according to the importance density function $q\bigl(X_k^{m,j}\mid X_{k-1}^{m,j},Z_j(k)\bigr)$;
(4) filter A (8), filter B (9) and filter C (10) use a parallel implicit equal-weight particle filter positioning algorithm with the established robot error model and measurement model to obtain the corresponding local state estimates $\hat{X}_j(k)$ and local covariances $\hat{P}_j(k)$, where $\hat{X}_j(k)$, for $j=1,2,3$, respectively represents the local state estimate of the petrochemical inspection robot output by filter A (8), filter B (9) and filter C (10), and $\hat{P}_j(k)$, for $j=1,2,3$, respectively represents the local covariance of the petrochemical inspection robot output by filter A (8), filter B (9) and filter C (10). Filter A (8) sends its local state estimate $\hat{X}_1(k)$ and local covariance $\hat{P}_1(k)$ at a specific frequency to the distributed multi-sensor fusion module A (11); filter B (9) sends its local state estimate $\hat{X}_2(k)$ and local covariance $\hat{P}_2(k)$ at a specific frequency to the distributed multi-sensor fusion module A (11) and the distributed multi-sensor fusion module B (12); filter C (10) sends its local state estimate $\hat{X}_3(k)$ and local covariance $\hat{P}_3(k)$ at a specific frequency to the distributed multi-sensor fusion module B (12). The local state estimate of the petrochemical inspection robot is updated as

$$\hat{X}_j(k)=\sum_{m=1}^{M}\bar{w}_k^{m,j}X_k^{m,j}$$

and the local covariance as

$$\hat{P}_j(k)=\sum_{m=1}^{M}\bar{w}_k^{m,j}\bigl(X_k^{m,j}-\hat{X}_j(k)\bigr)\bigl(X_k^{m,j}-\hat{X}_j(k)\bigr)^{T};$$
(5) the distributed multi-sensor fusion module A (11) receives the local state estimate $\hat{X}_1(k)$ and local covariance $\hat{P}_1(k)$ from filter A (8), and the local state estimate $\hat{X}_2(k)$ and local covariance $\hat{P}_2(k)$ from filter B (9). Using the optimal information fusion criterion, the distributed multi-sensor fusion module A (11) obtains the combined state estimate

$$X_1(k)=A_1(k)\bigl[\hat{X}_1^{T}(k),\hat{X}_2^{T}(k)\bigr]^{T}$$

and the combined covariance

$$P_1(k)=\bigl(E^{T}\Sigma_1^{-1}(k)E\bigr)^{-1},$$

where $A_1(k)=P_1(k)E^{T}\Sigma_1^{-1}(k)$ is the optimal weight matrix, $\Sigma_1(k)=\mathrm{diag}\bigl(\hat{P}_1(k),\hat{P}_2(k)\bigr)$, $E=(e_4,e_4)^{T}$, and $e_4$ is a 4×4 identity matrix; the fused combined state estimate $X_1(k)$ and combined covariance $P_1(k)$ are sent to the fusion information positioning module (13).
The distributed multi-sensor fusion module B (12) receives the local state estimate $\hat{X}_2(k)$ and local covariance $\hat{P}_2(k)$ from filter B (9), and the local state estimate $\hat{X}_3(k)$ and local covariance $\hat{P}_3(k)$ from filter C (10). Using the optimal information fusion criterion, the distributed multi-sensor fusion module B (12) obtains the combined state estimate

$$X_2(k)=A_2(k)\bigl[\hat{X}_2^{T}(k),\hat{X}_3^{T}(k)\bigr]^{T}$$

and the combined covariance

$$P_2(k)=\bigl(E^{T}\Sigma_2^{-1}(k)E\bigr)^{-1},$$

where $A_2(k)=P_2(k)E^{T}\Sigma_2^{-1}(k)$ is the optimal weight matrix and $\Sigma_2(k)=\mathrm{diag}\bigl(\hat{P}_2(k),\hat{P}_3(k)\bigr)$; the fused combined state estimate $X_2(k)$ and combined covariance $P_2(k)$ are sent to the fusion information positioning module (13);
(6) the fusion information positioning module (13) fuses the received combined state estimates and combined covariances using the optimal information fusion criterion to decide the global optimal state estimate $X$ and the global covariance $P$:

$$X=A_3(k)\bigl[X_1^{T}(k),X_2^{T}(k)\bigr]^{T},\qquad P=\bigl(E^{T}\Sigma_3^{-1}(k)E\bigr)^{-1},$$

where $A_3(k)=P\,E^{T}\Sigma_3^{-1}(k)$ is the optimal weight matrix and $\Sigma_3(k)=\mathrm{diag}\bigl([P_1(k),P_2(k)]\bigr)$.
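Steps (3)–(4) of claim 1 — weighting each particle by the measurement likelihood, normalizing, and forming the local state estimate and covariance — can be sketched as follows. The Gaussian likelihood, the toy direct-observation measurement, and all numeric values are illustrative assumptions, not the patent's sensor models.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 500                                               # total number of particle samples
particles = rng.normal([1.0, 2.0], 0.3, size=(M, 2))  # X_k^m after the odometry (prior) step

# Toy measurement: the sensor observes the state directly with noise sigma.
z = np.array([1.1, 1.9])
sigma = 0.2

# Importance weights from a Gaussian likelihood p(Z(k) | X_k^m) (illustrative choice).
sq_err = np.sum((particles - z) ** 2, axis=1)
w = np.exp(-0.5 * sq_err / sigma**2)

# Normalized weights: w_bar^m = w^m / sum_i w^i.
w_bar = w / np.sum(w)

# Local state estimate: X_hat(k) = sum_m w_bar^m X_k^m.
x_hat = w_bar @ particles

# Local covariance: P_hat(k) = sum_m w_bar^m (X_k^m - X_hat)(X_k^m - X_hat)^T.
d = particles - x_hat
P_hat = (w_bar[:, None] * d).T @ d
```

The pair (`x_hat`, `P_hat`) plays the role of the local state estimate and local covariance that each filter forwards to its distributed fusion module.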
2. The navigation and positioning method of claim 1, wherein in step (3) each particle is updated by adjusting two parameters $\alpha$ and $\beta$:

$$X_k^{m,j}=\bar{X}_k^{m,j}+\alpha^{1/2}\,\beta\,\hat{P}_j^{1/2}(k-1)\,\xi_m,$$

where $\bar{X}_k^{m,j}$ is the deterministic movement of the particle; through the coordinate transformation from $\xi_m$ to $X_k^{m,j}$, the importance density is evaluated as

$$q\bigl(X_k^{m,j}\mid X_{k-1}^{m,j},Z_j(k)\bigr)=\frac{q(\xi_m)}{|J|},$$

where $|J|$ is the absolute value of the Jacobian determinant of the transformation; $\hat{P}_j(k-1)$, for $j=1,2,3$, respectively represents the local covariance output by filter A (8), filter B (9) and filter C (10) at time $k-1$; $\xi_m$ is drawn from a standard multivariate Gaussian distribution; $S_m$ is a random vector; $\xi_m$, $S_m$ and $q(\xi)$ all follow the normal distribution $N(0,1)$, the probability distribution with expectation 0 and standard deviation 1.
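As a rough illustration only: the $\alpha$/$\beta$-scaled particle move of claim 2 has the general shape below, drawing $\xi_m$ from a standard multivariate Gaussian and scaling it by a square root of the previous local covariance. The exact update and the rule for choosing $\alpha$ and $\beta$ in the patent's implicit equal-weight particle filter are given only as equation images in this extraction, so this is a schematic sketch, not the patented formula; the numeric values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

alpha, beta = 0.8, 1.0                     # tuning parameters (roles assumed)
P_prev = np.array([[0.04, 0.01],
                   [0.01, 0.09]])          # local covariance at time k-1
L = np.linalg.cholesky(P_prev)             # a matrix square root P^(1/2)

x_det = np.array([1.0, 2.0])               # deterministic particle move

xi = rng.standard_normal(2)                # xi_m ~ N(0, I)
x_new = x_det + np.sqrt(alpha) * beta * (L @ xi)

# The coordinate transform xi -> x has Jacobian sqrt(alpha)*beta*L, whose
# absolute determinant |J| rescales the importance density: q(x) = q(xi)/|J|.
J_abs = abs(np.linalg.det(np.sqrt(alpha) * beta * L))
```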