CN109099901A - Fully automatic road roller positioning method based on multi-source data fusion - Google Patents

Fully automatic road roller positioning method based on multi-source data fusion

Info

Publication number
CN109099901A
CN109099901A (application CN201810670215.2A)
Authority
CN
China
Prior art keywords
road roller
point
point cloud
laser radar
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810670215.2A
Other languages
Chinese (zh)
Other versions
CN109099901B (en)
Inventor
李煊鹏
李宇杰
谌中平
王东
张为公
许剑
邹承利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongke Weiyi (Suzhou) Intelligent Technology Co., Ltd.
Original Assignee
Suzhou Road Agent Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Road Agent Intelligent Technology Co Ltd
Priority to CN201810670215.2A
Publication of CN109099901A
Application granted
Publication of CN109099901B
Legal status: Active

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a fully automatic road roller positioning method based on multi-source data fusion. Boundary markers are arranged on the boundary of the construction area; multi-channel synchronized cameras acquire images containing the boundary markers, and a laser radar acquires point cloud data of the construction area; the images are matched with the laser point cloud data to identify the construction area markers and compute the construction fence region. The laser point cloud data and the IMU output are fused to compute the position and attitude of the road roller within the construction area, realizing positioning of the unmanned road roller in the construction area. Through this machine-vision-based positioning method, which fuses vision data with IMU data, the method adapts to the special construction environment of the road roller, effectively avoids the signal loss caused by relying solely on GPS positioning, and meets the positioning and path planning requirements of the unmanned road roller in the construction environment in terms of both accuracy and speed, so it can be applied efficiently at road roller construction sites and effectively solves the problem of poor road roller positioning accuracy.

Description

Fully automatic road roller positioning method based on multi-source data fusion
Technical field
The present invention relates to a fully automatic road roller positioning method based on multi-source data fusion.
Background technique
Construction machinery often operates at disaster sites such as earthquakes, floods, and tsunamis, and under harsh working conditions such as high temperature, extreme cold, high altitude, and high radiation. Not only is construction efficiency low, but operators frequently risk their lives. Upgrading existing construction machinery with unmanned technology can improve construction quality, reduce labor costs, and lower safety risks, and it lays an important foundation for upgrading the industry. The road roller in particular belongs to the category of road equipment and is widely used for embankment compaction in large engineering projects such as highways, railways, airport runways, dams, and stadiums; it can compact sandy, semi-cohesive, and cohesive soils, stabilized subgrade soil, and asphalt concrete pavement layers, so construction demand is extremely broad and the tasks are demanding. Standardizing road roller construction is also an urgent problem for construction units and machinery manufacturers worldwide, and an automatic road roller provides an effective guarantee for solving this problem.
A prerequisite for autonomous construction by a fully automatic road roller is accurately obtaining its own position. At present, the main positioning methods for construction machinery are GPS positioning or GPS fused with odometry. Relying on GPS devices introduces inherent errors, and when the signal is blocked, for example in special working environments such as tunnels or dense urban areas, the GPS signal weakens rapidly or is even lost, so automated construction that meets the required operation accuracy cannot be completed.
Current unmanned road rollers mostly suffer from shortcomings such as low positioning accuracy and high cost. Patent publication CN106127177A discloses an unmanned road roller that uses GPS to position the roller, but it does not consider that GPS signals cannot be received when the roller performs construction work in particular environments, which makes operation impossible in tunnels, under bridges, and in similar environments.
Machine vision uses visual sensors to acquire images and an image processing system to make various measurements and judgments. It is an important branch of computer science that combines optics, mechanics, electronics, and computer hardware and software, and involves fields such as computing, image processing, pattern recognition, artificial intelligence, signal processing, and opto-mechatronics. Visual navigation processes the images acquired by visual sensors to obtain the pose parameters of a carrier. Visual navigation technology is currently applied mainly in four areas: mobile robot competitions, industrial AGVs, autonomous navigation of intelligent vehicles, and defense technology research. Patent publication CN104835173A proposes a machine-vision-based positioning method that realizes AGV positioning through an on-board vision system, but that method relies solely on cameras for visual positioning and its stability is insufficient.
Summary of the invention
The purpose of the present invention is to overcome the shortcomings of the prior art and to provide a fully automatic road roller positioning method based on multi-source data fusion.
The purpose of the present invention is achieved through the following technical solution:
In the fully automatic road roller positioning method based on multi-source data fusion, boundary markers are arranged on the boundary of the construction area and serve as index points assisting positioning by fusing laser point cloud data and image data. Multi-channel synchronized cameras acquire images containing the boundary markers, and a laser radar acquires point cloud data of the construction area. The images are matched with the laser point cloud data to identify the construction area markers and compute the construction fence region. The laser point cloud data and the IMU output are fused to compute the position and attitude of the road roller within the construction area, realizing positioning of the unmanned road roller in the construction area.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, construction area boundary markers are arranged at the construction site; an industrial PC and an inertial navigation unit are installed on the road roller; cameras for acquiring image information of the roller's construction area are installed around the roller; and a three-dimensional laser radar is arranged on the front side of the roller.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, cameras are distributed on the front, rear, left, and right sides of the road roller, and each camera is fitted with a wide-angle lens so that 360° around the roller is covered; the cameras are calibrated with a checkerboard to obtain the camera intrinsics, and distortion correction is applied to the cameras;
the three-dimensional laser radar scans targets over 30 degrees in the vertical direction and 360 degrees in the horizontal direction within a radius of 100 meters, and the road roller's motion trajectory is labeled by matching consecutive point cloud data;
the construction area boundary markers are traffic cones; using the method of matching images with point cloud data, the traffic cones are effectively identified and their distances measured, realizing surveying of the construction area boundary;
the inertial navigation unit provides the road roller's velocity and position information, and an extended Kalman filter fuses the laser radar information with the inertial navigation information, realizing simultaneous mapping and positioning of the unmanned road roller.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, the cone is a triangular cone serving as the construction area boundary marker; the three-dimensional laser radar and the camera respectively acquire the three-dimensional laser point cloud and the image of the triangular cone under two viewpoints, and the established calibration relation between the laser point cloud data and the image data is used to obtain the point cloud semantics of the triangular cone marker, which serves as the vehicle positioning reference.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, the three-dimensional point cloud data of the scene is obtained with the three-dimensional laser radar, and three-dimensional SLAM mapping is realized with the LOAM method. In the LOAM method, the coordinate transformation is computed after feature points are extracted and matched. The point cloud and the IMU data are first preprocessed in order to extract feature points: the points of a single scan are classified by a curvature value C, computed as
C = (1 / (|S| · ||X_(k,i)||)) · || Σ_{j∈S, j≠i} (X_(k,i) − X_(k,j)) ||
where {L} is the three-dimensional coordinate system of the laser radar, with origin at the geometric center of the radar, the x-axis pointing left, the y-axis pointing up, and the z-axis pointing forward; P_k denotes the point cloud perceived during scan k; the coordinate of a point i ∈ P_k in {L_k} is written X_(k,i); and S is the set of consecutive points returned by the laser scanner within the same scan;
the points in the scan are sorted by C value and the feature points are selected: the points with the largest C are chosen as edge points and the points with the smallest C as planar points. A single scan is divided into 4 independent subregions so that the feature points are evenly distributed in the environment, and each subregion provides at most 2 edge points and 4 planar points. The registration between two adjacent frames of point cloud data is then carried out, i.e. the point cloud data at time t and at time t+1 are associated and the relative motion of the radar is estimated. The point cloud registration proceeds as follows: for a feature line, the point j nearest to point i in the point cloud at time t is found with a KD tree, and the second nearest point l is found around j; (j, l) is then called the correspondence of point i in the point cloud at time t. For a feature plane, similarly to the feature line, the nearest point j is found first, then l is found around j and m is found around j, and (j, l, m) is called the correspondence of point i in the point cloud at time t;
after the registration points are found, the constraint relationships between the point clouds at different times are obtained and the distance from each feature point to its correspondence is computed. Starting with the edge points: for a point i whose corresponding edge line is (j, l), the point-to-line distance is computed as
d_E = |(X_(k+1,i) − X_(k,j)) × (X_(k+1,i) − X_(k,l))| / |X_(k,j) − X_(k,l)|
where X_(k+1,i) is the coordinate of point i in {L} and X_(k,j), X_(k,l) are the coordinates of its corresponding points j and l at the previous time; if (j, l, m) is the corresponding plane, the point-to-plane distance is computed as
d_H = |(X_(k+1,i) − X_(k,j)) · ((X_(k,j) − X_(k,l)) × (X_(k,j) − X_(k,m)))| / |(X_(k,j) − X_(k,l)) × (X_(k,j) − X_(k,m))|
where X_(k,m) is the coordinate of point m in {L};
taking partial derivatives of the above distances with respect to the parameters T to be estimated gives the Jacobian matrix, and the estimate is solved with the L-M algorithm: stacking the constraints gives f(T) = d, where T = [t_x, t_y, t_z, θ_x, θ_y, θ_z]^T contains the rigid motion of the laser radar in 6 degrees of freedom, t_x, t_y and t_z are the translations along the x, y and z axes of the coordinate system {L}, θ_x, θ_y and θ_z are the rotation angles following the right-hand rule, each row of f corresponds to one feature point, d contains the corresponding distances, and J = ∂f/∂T is the Jacobian matrix of f. The transform is then obtained through nonlinear iteration by driving d toward zero:
T ← T − (J^T J + λ diag(J^T J))^(−1) J^T d.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, the raw data obtained by the vehicle-mounted three-dimensional laser radar is processed by data filtering and data clustering to obtain a candidate landmark set. The iterative closest point algorithm then searches for the correspondences between these landmarks and the landmarks in the map, the positional offset T and the angular offset r between the two point sets are computed from the resulting correspondences, and the pose of the laser radar is computed from this offset. Assume the current pose of the laser radar in the global coordinate system is expressed by the state variable (x_L, y_L, φ_L), where the first two terms are the position of the laser radar in the global coordinate system and the third term φ_L denotes the heading of the laser radar.
In the corresponding observation equation, (x_k, y_k, φ_k) is the system state variable, (x_{L,k}, y_{L,k}, φ_{L,k}) is the system observation variable, T is the positional offset, r is the angular offset, and v is the observation noise with error matrix R.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, during road roller operation there may be no obvious landmark features in the roller's field of view because the scanning range of the laser radar is limited, and the pose of the roller cannot be estimated at such times. Therefore, the roller pose is tracked to keep the pose estimation continuous: the extended Kalman filter fuses the data of the odometer and the laser radar, the increments of the roller's position and heading are computed from the odometer and taken as the input u = (ΔS, Δφ)^T, and the vehicle system state equation is established accordingly, where w is the process white noise with error matrix Q.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, the cameras are large-target-surface industrial cameras.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, the three-dimensional laser radar is arranged on the frame above the front steel wheel of the road roller.
Further, in the above fully automatic road roller positioning method based on multi-source data fusion, the three-dimensional laser radar is installed at a tilt angle.
Compared with the prior art, the present invention has significant advantages and beneficial effects, embodied in the following aspects:
Through the machine-vision-based positioning method, the present invention fuses vision data with IMU data, adapts to the special construction environment of the road roller, and effectively avoids the signal loss caused by relying solely on GPS positioning. In terms of positioning accuracy and speed it meets the positioning and path planning requirements of the unmanned road roller in the construction environment and can be effectively applied at road roller construction sites. The invention effectively solves the problem of poor road roller positioning accuracy, reduces the workload of construction personnel, and improves labor productivity.
Detailed description of the invention
Fig. 1: structural schematic diagram of the invention.
The meaning of each reference numeral in the figure is as follows: 1 road roller; 2 industrial PC; 3 camera; 4 three-dimensional laser radar; 5 inertial navigation unit.
Specific embodiment
To give a clearer understanding of the technical features, objects, and effects of the present invention, specific embodiments are now described in detail.
In the fully automatic road roller positioning method based on multi-source data fusion of the present invention, boundary markers are arranged on the boundary of the construction area; multi-channel synchronized cameras acquire images containing the boundary markers, and a laser radar acquires point cloud data of the construction area; the images are matched with the laser point cloud data to identify the construction area markers and compute the construction fence region; the laser point cloud data and the IMU output are fused to compute the position and attitude of the road roller in the construction area, realizing positioning of the unmanned road roller in the construction area.
As shown in Fig. 1, an industrial PC 2 is installed in the cab of the road roller 1, and an inertial navigation unit 5 is fixed at the industrial PC 2. Cameras 3 for acquiring image information of the roller's construction area are installed around the top of the cab, and a three-dimensional laser radar 4 is arranged on the front side of the road roller 1; construction area boundary markers are arranged at the construction site. The industrial PC provides rich expansion interfaces for each sensor unit and processes the data of each sensor. The cameras, the three-dimensional laser radar, the inertial navigation unit, and the industrial PC are connected through the publish/subscribe communication mechanism of Robot Operating System (ROS) node messages; that is, during ROS program execution, topics published by one node are received by the other nodes, completing the communication between them. The data processing flow is described in the following paragraphs.
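As an illustrative sketch only, not part of the original disclosure, the ROS publish/subscribe wiring described above could be organized as in the following minimal rospy node; the topic names (/camera/front/image_raw, /velodyne_points, /imu/data) and the message types are assumptions chosen for the example.

#!/usr/bin/env python
# Minimal sketch of the ROS publish/subscribe wiring described above.
# Topic names and message types are assumptions for illustration only.
import rospy
from sensor_msgs.msg import Image, PointCloud2, Imu

def image_cb(msg):
    rospy.loginfo("camera frame %s received", msg.header.seq)

def cloud_cb(msg):
    rospy.loginfo("lidar sweep with %d x %d points", msg.height, msg.width)

def imu_cb(msg):
    rospy.loginfo("imu yaw rate %.3f rad/s", msg.angular_velocity.z)

if __name__ == "__main__":
    rospy.init_node("roller_localization_io")
    rospy.Subscriber("/camera/front/image_raw", Image, image_cb)
    rospy.Subscriber("/velodyne_points", PointCloud2, cloud_cb)
    rospy.Subscriber("/imu/data", Imu, imu_cb)
    rospy.spin()  # the industrial PC node keeps receiving sensor topics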
Cameras 3 are distributed on the front, rear, left, and right sides of the road roller 1. The cameras 3 are large-target-surface industrial cameras fitted with wide-angle lenses so that 360° around the roller is covered. The cameras are calibrated with a checkerboard to obtain the camera intrinsics, and distortion correction is applied to the cameras.
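The checkerboard calibration and distortion correction step can be sketched with OpenCV as below; the board geometry (9 x 6 inner corners, 25 mm squares) and the file paths are assumptions for illustration, not values taken from the patent.

# Checkerboard calibration and distortion correction sketch (OpenCV).
# Board geometry (9x6 inner corners, 25 mm squares) is assumed for illustration.
import glob
import cv2
import numpy as np

pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 0.025

obj_pts, img_pts, shape = [], [], None
for path in glob.glob("calib/*.png"):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    shape = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Intrinsics K and distortion coefficients for one camera
_, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, shape, None, None)

def undistort(frame):
    # Apply the distortion correction mentioned above to a raw frame
    return cv2.undistort(frame, K, dist)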
The three-dimensional laser radar 4 is arranged on the frame above the front steel wheel of the road roller 1 and is installed at a tilt angle. It scans targets over 30 degrees in the vertical direction and 360 degrees in the horizontal direction within a radius of 100 meters, and the road roller's motion trajectory is labeled by matching consecutive point cloud data. This allows the roller to be positioned accurately and solves problems such as GPS signal loss.
The construction area boundary markers are traffic cones. Using the method of matching images with point cloud data, the cones are effectively identified and their distances measured, realizing surveying of the construction area boundary and accurate positioning of the unmanned road roller within the construction area.
The inertial navigation unit 5 provides the road roller's velocity and position information. An extended Kalman filter fuses the laser radar information with the inertial navigation information, realizing simultaneous mapping and positioning of the unmanned road roller, i.e. accurate positioning within the construction area.
The specific implementation includes the following aspects:
A) Boundary marker detection
An object detection method based on deep learning is used, specifically a region-based convolutional neural network, i.e. an object detection method that combines region proposals with convolutional neural networks. Considering both running speed and accuracy, a Faster R-CNN model is trained under the caffe framework. Faster R-CNN introduces an RPN network so that region extraction, classification, and regression share convolutional features, which improves running speed while maintaining accuracy.
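For illustration, the detection interface might look like the following sketch. The patent trains its own Faster R-CNN model under the caffe framework; here a pretrained torchvision detector is used only as a stand-in to show the inference call, and the score threshold and the notion of a "cone" class are assumed values for the example.

# Illustrative Faster R-CNN inference sketch. The patent trains its own model
# under the caffe framework; torchvision's pretrained detector is used here
# only as a stand-in to show the detection interface. The score threshold and
# the idea of a "cone" class are assumptions for the example.
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_markers(image_chw_float, score_thresh=0.7):
    """image_chw_float: 3xHxW tensor in [0, 1] from one roller camera."""
    with torch.no_grad():
        out = model([image_chw_float])[0]
    keep = out["scores"] > score_thresh
    # Return candidate boundary-marker boxes for matching with the point cloud
    return out["boxes"][keep], out["labels"][keep], out["scores"][keep]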
B) Fusion of camera and laser data
The cone is a triangular cone serving as the construction area boundary marker. The three-dimensional laser radar and the camera respectively acquire the three-dimensional laser point cloud and the image of the cone under two viewpoints, and the established calibration relation between the laser point cloud data and the image data is used to obtain the point cloud semantics of the cone marker, which serves as the vehicle positioning reference.
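One way the calibration relation could be applied is sketched below: lidar points are projected into the image with assumed extrinsics (R, t) and intrinsics K, and points falling inside a detected cone box inherit the cone label, giving the point cloud semantics mentioned above. The pinhole projection and the variable names are assumptions of this sketch.

# Sketch: transfer image-space marker detections onto the lidar point cloud
# via an assumed camera-lidar calibration (R, t extrinsics, K intrinsics).
import numpy as np

def label_cone_points(points_lidar, boxes, K, R, t):
    """points_lidar: Nx3 array in the lidar frame; boxes: Mx4 (x1, y1, x2, y2)."""
    pts_cam = points_lidar @ R.T + t          # lidar frame -> camera frame
    in_front = pts_cam[:, 2] > 0.1            # keep points in front of the camera
    uvw = pts_cam[in_front] @ K.T             # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]
    labels = np.zeros(points_lidar.shape[0], dtype=bool)
    idx = np.flatnonzero(in_front)
    for x1, y1, x2, y2 in boxes:
        hit = (uv[:, 0] >= x1) & (uv[:, 0] <= x2) & \
              (uv[:, 1] >= y1) & (uv[:, 1] <= y2)
        labels[idx[hit]] = True               # these lidar points belong to a cone
    return labels                             # per-point "cone" semantics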
C) LOAM positioning method based on boundary marker point clouds
The three-dimensional point cloud data of the scene is obtained with the three-dimensional laser radar, and three-dimensional SLAM mapping is realized with the LOAM method. In the LOAM method, the coordinate transformation is computed after feature points are extracted and matched. The point cloud and the IMU data are first preprocessed in order to extract feature points: the points of a single scan are classified by a curvature value C, computed as
C = (1 / (|S| · ||X_(k,i)||)) · || Σ_{j∈S, j≠i} (X_(k,i) − X_(k,j)) ||
where {L} is the three-dimensional coordinate system of the laser radar, with origin at the geometric center of the radar, the x-axis pointing left, the y-axis pointing up, and the z-axis pointing forward; P_k denotes the point cloud perceived during scan k; the coordinate of a point i ∈ P_k in {L_k} is written X_(k,i); and S is the set of consecutive points returned by the laser scanner within the same scan.
The points in the scan are sorted by C value and the feature points are selected: the points with the largest C are chosen as edge points and the points with the smallest C as planar points. A single scan is divided into 4 independent subregions so that the feature points are evenly distributed in the environment, and each subregion provides at most 2 edge points and 4 planar points. The registration between two adjacent frames of point cloud data is then carried out, i.e. the point cloud data at time t and at time t+1 are associated and the relative motion of the radar is estimated. The point cloud registration proceeds as follows: for a feature line, the point j nearest to point i in the point cloud at time t is found with a KD tree, and the second nearest point l is found around j; (j, l) is then called the correspondence of point i in the point cloud at time t. For a feature plane, similarly to the feature line, the nearest point j is found first, then l is found around j and m is found around j, and (j, l, m) is called the correspondence of point i in the point cloud at time t;
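An illustrative sketch of this feature selection step is given below; the neighborhood half-width and the array layout are assumptions, while the four subregions and the 2 edge / 4 planar points per subregion follow the text above.

# Sketch of LOAM-style curvature computation and edge/planar point selection.
# The 4 subregions and the 2/4 feature counts follow the text above; the
# neighborhood half-width and array layout are assumptions for illustration.
import numpy as np

def curvature(scan_xyz, half_width=5):
    """scan_xyz: Nx3 points of one scan line, ordered by scan angle."""
    n = scan_xyz.shape[0]
    c = np.full(n, np.nan)
    for i in range(half_width, n - half_width):
        nbrs = np.r_[scan_xyz[i - half_width:i], scan_xyz[i + 1:i + 1 + half_width]]
        diff = (scan_xyz[i] - nbrs).sum(axis=0)
        c[i] = np.linalg.norm(diff) / (len(nbrs) * np.linalg.norm(scan_xyz[i]))
    return c

def select_features(scan_xyz, c, n_edge=2, n_plane=4, n_regions=4):
    edges, planes = [], []
    for region in np.array_split(np.arange(scan_xyz.shape[0]), n_regions):
        valid = region[~np.isnan(c[region])]
        order = valid[np.argsort(c[valid])]
        planes.extend(order[:n_plane])        # smallest curvature -> planar points
        edges.extend(order[-n_edge:])         # largest curvature -> edge points
    return np.array(edges), np.array(planes)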
After the registration points are found, the constraint relationships between the point clouds at different times are obtained and the distance from each feature point to its correspondence is computed. Starting with the edge points: for a point i whose corresponding edge line is (j, l), the point-to-line distance may be computed as
d_E = |(X_(k+1,i) − X_(k,j)) × (X_(k+1,i) − X_(k,l))| / |X_(k,j) − X_(k,l)|
where X_(k+1,i) is the coordinate of point i in {L} and X_(k,j), X_(k,l) are the coordinates of its corresponding points j and l at the previous time. If (j, l, m) is the corresponding plane, the point-to-plane distance is computed as
d_H = |(X_(k+1,i) − X_(k,j)) · ((X_(k,j) − X_(k,l)) × (X_(k,j) − X_(k,m)))| / |(X_(k,j) − X_(k,l)) × (X_(k,j) − X_(k,m))|
where X_(k,m) is the coordinate of point m in {L}.
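The two distance computations translate directly into cross and dot products, as in the following short sketch, offered as an illustration under the notation above.

# Sketch: point-to-line and point-to-plane distances used as registration residuals.
import numpy as np

def point_to_line(xi, xj, xl):
    """Distance from feature point xi to the edge line through xj and xl."""
    num = np.linalg.norm(np.cross(xi - xj, xi - xl))
    return num / np.linalg.norm(xj - xl)

def point_to_plane(xi, xj, xl, xm):
    """Distance from feature point xi to the plane through xj, xl and xm."""
    normal = np.cross(xj - xl, xj - xm)
    return abs(np.dot(xi - xj, normal)) / np.linalg.norm(normal)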
Taking partial derivatives of the above distances with respect to the parameters T to be estimated gives the Jacobian matrix, and the estimate is solved with the L-M algorithm: stacking the constraints gives f(T) = d, where T = [t_x, t_y, t_z, θ_x, θ_y, θ_z]^T contains the rigid motion of the laser radar in 6 degrees of freedom, t_x, t_y and t_z are the translations along the x, y and z axes of the coordinate system {L}, θ_x, θ_y and θ_z are the rotation angles following the right-hand rule, each row of f corresponds to one feature point, d contains the corresponding distances, and J = ∂f/∂T is the Jacobian matrix of f. The transform is then obtained through nonlinear iteration by driving d toward zero:
T ← T − (J^T J + λ diag(J^T J))^(−1) J^T d.
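A compact way to run the nonlinear iteration is to hand the stacked residuals to a Levenberg-Marquardt solver; the sketch below uses scipy.optimize.least_squares and assumes the feature correspondences have already been fixed for the current iteration, which is a simplification of the full LOAM loop.

# Sketch: estimate the 6-DOF transform T = (tx, ty, tz, theta_x, theta_y, theta_z)
# by driving the stacked point-to-line / point-to-plane distances toward zero
# with L-M. Fixed correspondences are an assumption of this illustration.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def estimate_motion(edge_corr, plane_corr):
    """edge_corr: list of (xi, xj, xl); plane_corr: list of (xi, xj, xl, xm); all 3-vectors."""
    def residuals(T):
        Rm = Rotation.from_euler("xyz", T[3:]).as_matrix()
        d = []
        for xi, xj, xl in edge_corr:              # point-to-line residuals
            p = Rm @ xi + T[:3]
            d.append(np.linalg.norm(np.cross(p - xj, p - xl)) / np.linalg.norm(xj - xl))
        for xi, xj, xl, xm in plane_corr:         # point-to-plane residuals (signed)
            p = Rm @ xi + T[:3]
            n = np.cross(xj - xl, xj - xm)
            d.append(np.dot(p - xj, n) / np.linalg.norm(n))
        return np.asarray(d)
    # L-M needs at least as many residuals as the 6 motion parameters
    sol = least_squares(residuals, np.zeros(6), method="lm")
    return sol.x  # (tx, ty, tz, theta_x, theta_y, theta_z)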
D) Continuous estimation of the road roller's motion pose
The raw data obtained by the vehicle-mounted three-dimensional laser radar is processed by data filtering and data clustering to obtain a candidate landmark set. The iterative closest point algorithm then searches for the correspondences between these landmarks and the landmarks in the map, the positional offset T and the angular offset r between the two point sets are computed from the resulting correspondences, and the pose of the laser radar is computed from this offset. Assume the current pose of the laser radar in the global coordinate system is expressed by the state variable (x_L, y_L, φ_L), where the first two terms are the position of the laser radar in the global coordinate system and the third term φ_L denotes the heading of the laser radar.
In the corresponding observation equation, (x_k, y_k, φ_k) is the system state variable, (x_{L,k}, y_{L,k}, φ_{L,k}) is the system observation variable, T is the positional offset, r is the angular offset, and v is the observation noise with error matrix R.
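For illustration, one closed-form way to obtain the positional offset T and the angular offset r from matched landmark sets is the SVD-based alignment sketched below; the KD-tree nearest-neighbour matching and the 2-D landmark representation are assumptions of this sketch rather than prescriptions of the patent.

# Sketch: match candidate landmarks against map landmarks (nearest neighbour)
# and compute the 2-D positional offset T and angular offset r via SVD alignment.
import numpy as np
from scipy.spatial import cKDTree

def landmark_offset(candidates, map_landmarks):
    """candidates, map_landmarks: Nx2 / Mx2 arrays of 2-D landmark positions."""
    tree = cKDTree(map_landmarks)
    _, idx = tree.query(candidates)            # correspondence search
    matched = map_landmarks[idx]
    ca, cb = candidates.mean(0), matched.mean(0)
    H = (candidates - ca).T @ (matched - cb)   # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    r = np.arctan2(R[1, 0], R[0, 0])           # angular offset
    T = cb - R @ ca                            # positional offset
    return T, r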
During road roller operation there may be no obvious landmark features in the roller's field of view because the scanning range of the laser radar is limited, and the pose of the roller cannot be estimated at such times. Therefore, the roller pose is tracked to keep the pose estimation continuous: the extended Kalman filter fuses the data of the odometer and the laser radar, the increments of the roller's position and heading are computed from the odometer and taken as the input u = (ΔS, Δφ)^T, and the vehicle system state equation is established accordingly, where w is the process white noise with error matrix Q.
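As a hedged illustration of this fusion step (the patent's exact state and observation equations appear in figures that are not reproduced here), a standard planar EKF with odometry input u = (ΔS, Δφ) and a lidar pose observation could be organized as follows; the unicycle motion model and its Jacobian are assumed forms, not taken from the patent.

# Sketch: EKF fusing odometry increments u = (dS, dphi) with the lidar pose
# observation (x_L, y_L, phi_L). The unicycle motion model and its Jacobian
# are standard forms assumed for illustration; Q and R are the noise matrices.
import numpy as np

def ekf_predict(x, P, u, Q):
    dS, dphi = u
    th = x[2] + dphi / 2.0
    x_pred = x + np.array([dS * np.cos(th), dS * np.sin(th), dphi])
    F = np.array([[1, 0, -dS * np.sin(th)],
                  [0, 1,  dS * np.cos(th)],
                  [0, 0,  1]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z_lidar, R):
    H = np.eye(3)                      # the lidar pose observes the state directly
    y = z_lidar - H @ x
    y[2] = np.arctan2(np.sin(y[2]), np.cos(y[2]))   # wrap the heading residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(3) - K @ H) @ P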
In conclusion marker is arranged on construction area periphery in the present invention, marker is cone, as laser point cloud number According to the index point in fusing image data positioning auxiliary;Using the marker of the method identification construction area based on machine learning And mobile target, construction site image data is acquired, marker and mobile target, the training under caffe frame are marked Faster R-CNN model identifies construction area marker and mobile target;By the image data of camera acquisition, laser point cloud number Accordingly and IMU inertial data merges, and solves position and posture information of the road roller in construction area.
Through the machine-vision-based positioning method, which fuses vision data with IMU data, the invention adapts to the special construction environment of the road roller, effectively avoids the signal loss caused by relying solely on GPS positioning, and meets the positioning and path planning requirements of the unmanned road roller in the construction environment in terms of both accuracy and speed, so it can be effectively applied at road roller construction sites. The invention effectively solves the problem of poor road roller positioning accuracy, reduces the workload of construction personnel, and improves labor productivity.
It should be understood that the above is only a preferred embodiment of the present invention and is not intended to limit the scope of the claims; at the same time, the above description should enable those skilled in the relevant art to understand and implement the invention, so equivalent changes or modifications made without departing from the spirit of the disclosure should be included within the claims.

Claims (10)

1. A fully automatic road roller positioning method based on multi-source data fusion, characterized in that: boundary markers are arranged on the boundary of the construction area and serve as index points assisting positioning by fusing laser point cloud data and image data; multi-channel synchronized cameras acquire images containing the boundary markers, and a laser radar acquires point cloud data of the construction area; the images are matched with the laser point cloud data to identify the construction area markers and compute the construction fence region; the laser point cloud data and the IMU output are fused to compute the position and attitude of the road roller in the construction area, realizing positioning of the unmanned road roller in the construction area.
2. The fully automatic road roller positioning method based on multi-source data fusion according to claim 1, characterized in that: construction area boundary markers are arranged at the construction site; an industrial PC and an inertial navigation unit are installed on the road roller; cameras for acquiring image information of the roller's construction area are installed around the roller; and a three-dimensional laser radar is arranged on the front side of the roller.
3. The fully automatic road roller positioning method based on multi-source data fusion according to claim 2, characterized in that: cameras are distributed on the front, rear, left, and right sides of the road roller, and each camera is fitted with a wide-angle lens so that 360° around the roller is covered; the cameras are calibrated with a checkerboard to obtain the camera intrinsics, and distortion correction is applied to the cameras;
the three-dimensional laser radar scans targets over 30 degrees in the vertical direction and 360 degrees in the horizontal direction within a radius of 100 meters, and the road roller's motion trajectory is labeled by matching consecutive point cloud data;
the construction area boundary markers are traffic cones; using the method of matching images with point cloud data, the cones are effectively identified and their distances measured, realizing surveying of the construction area boundary;
the inertial navigation unit provides the road roller's velocity and position information, and an extended Kalman filter fuses the laser radar information with the inertial navigation information, realizing simultaneous mapping and positioning of the unmanned road roller.
4. The fully automatic road roller positioning method based on multi-source data fusion according to claim 3, characterized in that: the cone is a triangular cone serving as the construction area boundary marker; the three-dimensional laser radar and the camera respectively acquire the three-dimensional laser point cloud and the image of the triangular cone under two viewpoints, and the established calibration relation between the laser point cloud data and the image data is used to obtain the point cloud semantics of the triangular cone marker, which serves as the vehicle positioning reference.
5. The fully automatic road roller positioning method based on multi-source data fusion according to claim 3, characterized in that: the three-dimensional point cloud data of the scene is obtained with the three-dimensional laser radar, and three-dimensional SLAM mapping is realized with the LOAM method; in the LOAM method, the coordinate transformation is computed after feature points are extracted and matched; the point cloud and the IMU data are first preprocessed in order to extract feature points: the points of a single scan are classified by a curvature value C, computed as
C = (1 / (|S| · ||X_(k,i)||)) · || Σ_{j∈S, j≠i} (X_(k,i) − X_(k,j)) ||
where {L} is the three-dimensional coordinate system of the laser radar, with origin at the geometric center of the radar, the x-axis pointing left, the y-axis pointing up, and the z-axis pointing forward; P_k denotes the point cloud perceived during scan k; the coordinate of a point i ∈ P_k in {L_k} is written X_(k,i); and S is the set of consecutive points returned by the laser scanner within the same scan;
the points in the scan are sorted by C value and the feature points are selected, the points with the largest C being chosen as edge points and the points with the smallest C as planar points; a single scan is divided into 4 independent subregions so that the feature points are evenly distributed in the environment, and each subregion provides at most 2 edge points and 4 planar points; the registration between two adjacent frames of point cloud data is then carried out, i.e. the point cloud data at time t and at time t+1 are associated and the relative motion of the radar is estimated; the point cloud registration proceeds as follows: for a feature line, the point j nearest to point i in the point cloud at time t is found with a KD tree, and the second nearest point l is found around j, (j, l) then being called the correspondence of point i in the point cloud at time t; for a feature plane, similarly to the feature line, the nearest point j is found first, then l is found around j and m is found around j, and (j, l, m) is called the correspondence of point i in the point cloud at time t;
after the registration points are found, the constraint relationships between the point clouds at different times are obtained and the distance from each feature point to its correspondence is computed; starting with the edge points, for a point i whose corresponding edge line is (j, l), the point-to-line distance is computed as
d_E = |(X_(k+1,i) − X_(k,j)) × (X_(k+1,i) − X_(k,l))| / |X_(k,j) − X_(k,l)|
where X_(k+1,i) is the coordinate of point i in {L} and X_(k,j), X_(k,l) are the coordinates of its corresponding points j and l at the previous time; if (j, l, m) is the corresponding plane, the point-to-plane distance is computed as
d_H = |(X_(k+1,i) − X_(k,j)) · ((X_(k,j) − X_(k,l)) × (X_(k,j) − X_(k,m)))| / |(X_(k,j) − X_(k,l)) × (X_(k,j) − X_(k,m))|
where X_(k,m) is the coordinate of point m in {L};
taking partial derivatives of the above distances with respect to the parameters T to be estimated gives the Jacobian matrix, and the estimate is solved with the L-M algorithm: stacking the constraints gives f(T) = d, where T = [t_x, t_y, t_z, θ_x, θ_y, θ_z]^T contains the rigid motion of the laser radar in 6 degrees of freedom, t_x, t_y and t_z are the translations along the x, y and z axes of the coordinate system {L}, θ_x, θ_y and θ_z are the rotation angles following the right-hand rule, each row of f corresponds to one feature point, d contains the corresponding distances, and J = ∂f/∂T is the Jacobian matrix of f; the transform is then obtained through nonlinear iteration by driving d toward zero:
T ← T − (J^T J + λ diag(J^T J))^(−1) J^T d.
6. The fully automatic road roller positioning method based on multi-source data fusion according to claim 3, characterized in that: the raw data obtained by the vehicle-mounted three-dimensional laser radar is processed by data filtering and data clustering to obtain a candidate landmark set; the iterative closest point algorithm then searches for the correspondences between these landmarks and the landmarks in the map, the positional offset T and the angular offset r between the two point sets are computed from the resulting correspondences, and the pose of the laser radar is computed from this offset; assume the current pose of the laser radar in the global coordinate system is expressed by the state variable (x_L, y_L, φ_L), where the first two terms are the position of the laser radar in the global coordinate system and the third term φ_L denotes the heading of the laser radar;
in the corresponding observation equation, (x_k, y_k, φ_k) is the system state variable, (x_{L,k}, y_{L,k}, φ_{L,k}) is the system observation variable, T is the positional offset, r is the angular offset, and v is the observation noise with error matrix R.
7. The fully automatic road roller positioning method based on multi-source data fusion according to claim 6, characterized in that: during road roller operation there may be no obvious landmark features in the roller's field of view because the scanning range of the laser radar is limited, and the pose of the roller cannot be estimated at such times; therefore, the roller pose is tracked to keep the pose estimation continuous: the extended Kalman filter fuses the data of the odometer and the laser radar, the increments of the roller's position and heading are computed from the odometer and taken as the input u = (ΔS, Δφ)^T, and the vehicle system state equation is established accordingly, where w is the process white noise with error matrix Q.
8. The fully automatic road roller positioning method based on multi-source data fusion according to claim 2, characterized in that: the cameras are large-target-surface industrial cameras.
9. The fully automatic road roller positioning method based on multi-source data fusion according to claim 2, characterized in that: the three-dimensional laser radar is arranged on the frame above the front steel wheel of the road roller.
10. The fully automatic road roller positioning method based on multi-source data fusion according to claim 2, characterized in that: the three-dimensional laser radar is installed at a tilt angle.
CN201810670215.2A 2018-06-26 2018-06-26 Full-automatic road roller positioning method based on multi-source data fusion Active CN109099901B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810670215.2A CN109099901B (en) 2018-06-26 2018-06-26 Full-automatic road roller positioning method based on multi-source data fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810670215.2A CN109099901B (en) 2018-06-26 2018-06-26 Full-automatic road roller positioning method based on multi-source data fusion

Publications (2)

Publication Number Publication Date
CN109099901A true CN109099901A (en) 2018-12-28
CN109099901B CN109099901B (en) 2021-09-24

Family

ID=64845012

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810670215.2A Active CN109099901B (en) 2018-06-26 2018-06-26 Full-automatic road roller positioning method based on multi-source data fusion

Country Status (1)

Country Link
CN (1) CN109099901B (en)

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110082739A (en) * 2019-03-20 2019-08-02 深圳市速腾聚创科技有限公司 Method of data synchronization and equipment
CN110147095A (en) * 2019-03-15 2019-08-20 广东工业大学 Robot method for relocating based on mark information and Fusion
CN110244322A (en) * 2019-06-28 2019-09-17 东南大学 Pavement construction robot environment sensory perceptual system and method based on Multiple Source Sensor
CN110389590A (en) * 2019-08-19 2019-10-29 杭州电子科技大学 A kind of AGV positioning system and method merging 2D environmental map and sparse artificial landmark
CN110568447A (en) * 2019-07-29 2019-12-13 广东星舆科技有限公司 Visual positioning method, device and computer readable medium
CN110749327A (en) * 2019-08-08 2020-02-04 南京航空航天大学 Vehicle navigation method in cooperation environment
CN110782497A (en) * 2019-09-06 2020-02-11 腾讯科技(深圳)有限公司 Method and device for calibrating external parameters of camera
CN110849362A (en) * 2019-11-28 2020-02-28 湖南率为控制科技有限公司 Laser radar and vision combined navigation algorithm based on vehicle-mounted inertia
CN111364549A (en) * 2020-02-28 2020-07-03 江苏徐工工程机械研究院有限公司 Synchronous drawing and automatic operation method and system based on laser radar
WO2020154970A1 (en) * 2019-01-30 2020-08-06 Baidu.Com Times Technology (Beijing) Co., Ltd. Deep learning–based feature extraction for lidar localization of autonomous driving vehicles
CN111551976A (en) * 2020-05-20 2020-08-18 四川万网鑫成信息科技有限公司 Method for automatically completing abnormal positioning by combining various data
CN111595333A (en) * 2020-04-26 2020-08-28 武汉理工大学 Modularized unmanned vehicle positioning method and system based on visual inertial laser data fusion
CN111709355A (en) * 2020-06-12 2020-09-25 北京百度网讯科技有限公司 Method and device for identifying target area, electronic equipment and road side equipment
CN111754798A (en) * 2020-07-02 2020-10-09 上海电科智能***股份有限公司 Method for realizing detection of vehicle and surrounding obstacles by fusing roadside laser radar and video
CN111929699A (en) * 2020-07-21 2020-11-13 北京建筑大学 Laser radar inertial navigation odometer considering dynamic obstacles and mapping method and system
CN111947647A (en) * 2020-08-26 2020-11-17 四川阿泰因机器人智能装备有限公司 Robot accurate positioning method integrating vision and laser radar
CN111998832A (en) * 2020-08-12 2020-11-27 河北雷神科技有限公司 Laser point cloud-based inspection method for accurately positioning target object by using unmanned aerial vehicle
CN112254729A (en) * 2020-10-09 2021-01-22 北京理工大学 Mobile robot positioning method based on multi-sensor fusion
CN112334790A (en) * 2019-08-21 2021-02-05 深圳市大疆创新科技有限公司 Positioning system and positioning method for movable object, and storage medium
CN112540382A (en) * 2019-09-07 2021-03-23 山东大学 Laser navigation AGV auxiliary positioning method based on visual identification detection
WO2021056190A1 (en) * 2019-09-24 2021-04-01 Beijing Didi Infinity Technology And Development Co., Ltd. Semantic-assisted multi-resolution point cloud registration
CN112904395A (en) * 2019-12-03 2021-06-04 青岛慧拓智能机器有限公司 Mining vehicle positioning system and method
CN113091736A (en) * 2021-04-02 2021-07-09 京东数科海益信息科技有限公司 Robot positioning method, device, robot and storage medium
CN113359782A (en) * 2021-05-28 2021-09-07 福建工程学院 Unmanned aerial vehicle autonomous addressing landing method integrating LIDAR point cloud and image data
CN113777600A (en) * 2021-09-09 2021-12-10 北京航空航天大学杭州创新研究院 Multi-millimeter-wave radar cooperative positioning tracking method
CN114429432A (en) * 2022-04-07 2022-05-03 科大天工智能装备技术(天津)有限公司 Multi-source information layered fusion method and device and storage medium
CN114663992A (en) * 2022-03-18 2022-06-24 福建工程学院 Multi-source data fusion expressway portal positioning method
WO2022142827A1 (en) * 2020-12-30 2022-07-07 华为技术有限公司 Road occupancy information determination method and apparatus
CN115235478A (en) * 2022-09-23 2022-10-25 武汉理工大学 Intelligent automobile positioning method and system based on visual label and laser SLAM
WO2022256976A1 (en) * 2021-06-07 2022-12-15 深圳市大疆创新科技有限公司 Method and system for constructing dense point cloud truth value data and electronic device
US11550058B2 (en) 2020-04-10 2023-01-10 Caterpillar Paving Products Inc. Perception system three lidar coverage
CN116299367A (en) * 2023-05-18 2023-06-23 中国测绘科学研究院 Multi-laser space calibration method

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101017080A (en) * 2002-09-30 2007-08-15 石川岛播磨重工业株式会社 Device of measuring object
CN101469998A (en) * 2007-12-27 2009-07-01 爱信艾达株式会社 Feature information collecting apparatus and feature information collecting program, and own vehicle position recognition apparatus and navigation apparatus
CN103389103A (en) * 2013-07-03 2013-11-13 北京理工大学 Geographical environmental characteristic map construction and navigation method based on data mining
CN105210128A (en) * 2013-04-10 2015-12-30 谷歌公司 Mapping active and inactive construction zones for autonomous driving
CN105512646A (en) * 2016-01-19 2016-04-20 腾讯科技(深圳)有限公司 Data processing method, data processing device and terminal
CN105719284A (en) * 2016-01-18 2016-06-29 腾讯科技(深圳)有限公司 Data processing method, device and terminal
CN106127153A (en) * 2016-06-24 2016-11-16 南京林业大学 The traffic sign recognition methods of Vehicle-borne Laser Scanning cloud data
CN106774335A (en) * 2017-01-03 2017-05-31 南京航空航天大学 Guiding device based on multi-vision visual and inertial navigation, terrestrial reference layout and guidance method
CN106932780A (en) * 2017-03-14 2017-07-07 北京京东尚科信息技术有限公司 Object positioning method, device and system
CN107111885A (en) * 2014-12-30 2017-08-29 诺基亚技术有限公司 For the method for the position for determining portable set
CN107463918A (en) * 2017-08-17 2017-12-12 武汉大学 Lane line extracting method based on laser point cloud and image data fusion
CN107688184A (en) * 2017-07-24 2018-02-13 宗晖(上海)机器人有限公司 A kind of localization method and system
CN108196535A (en) * 2017-12-12 2018-06-22 清华大学苏州汽车研究院(吴江) Automated driving system based on enhancing study and Multi-sensor Fusion
CN108628206A (en) * 2017-03-17 2018-10-09 通用汽车环球科技运作有限责任公司 Road construction detecting system and method

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101017080A (en) * 2002-09-30 2007-08-15 石川岛播磨重工业株式会社 Device of measuring object
CN101469998A (en) * 2007-12-27 2009-07-01 爱信艾达株式会社 Feature information collecting apparatus and feature information collecting program, and own vehicle position recognition apparatus and navigation apparatus
CN105210128A (en) * 2013-04-10 2015-12-30 谷歌公司 Mapping active and inactive construction zones for autonomous driving
CN107976200A (en) * 2013-04-10 2018-05-01 伟摩有限责任公司 The method and system of vehicle of the operation with autonomous driving pattern
CN103389103A (en) * 2013-07-03 2013-11-13 北京理工大学 Geographical environmental characteristic map construction and navigation method based on data mining
CN107111885A (en) * 2014-12-30 2017-08-29 诺基亚技术有限公司 For the method for the position for determining portable set
CN105719284A (en) * 2016-01-18 2016-06-29 腾讯科技(深圳)有限公司 Data processing method, device and terminal
CN105512646A (en) * 2016-01-19 2016-04-20 腾讯科技(深圳)有限公司 Data processing method, data processing device and terminal
CN106127153A (en) * 2016-06-24 2016-11-16 南京林业大学 The traffic sign recognition methods of Vehicle-borne Laser Scanning cloud data
CN106774335A (en) * 2017-01-03 2017-05-31 南京航空航天大学 Guiding device based on multi-vision visual and inertial navigation, terrestrial reference layout and guidance method
CN106932780A (en) * 2017-03-14 2017-07-07 北京京东尚科信息技术有限公司 Object positioning method, device and system
CN108628206A (en) * 2017-03-17 2018-10-09 通用汽车环球科技运作有限责任公司 Road construction detecting system and method
CN107688184A (en) * 2017-07-24 2018-02-13 宗晖(上海)机器人有限公司 A kind of localization method and system
CN107463918A (en) * 2017-08-17 2017-12-12 武汉大学 Lane line extracting method based on laser point cloud and image data fusion
CN108196535A (en) * 2017-12-12 2018-06-22 清华大学苏州汽车研究院(吴江) Automated driving system based on enhancing study and Multi-sensor Fusion

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020154970A1 (en) * 2019-01-30 2020-08-06 Baidu.Com Times Technology (Beijing) Co., Ltd. Deep learning–based feature extraction for lidar localization of autonomous driving vehicles
CN110147095A (en) * 2019-03-15 2019-08-20 广东工业大学 Robot method for relocating based on mark information and Fusion
CN110082739A (en) * 2019-03-20 2019-08-02 深圳市速腾聚创科技有限公司 Method of data synchronization and equipment
CN110082739B (en) * 2019-03-20 2022-04-12 深圳市速腾聚创科技有限公司 Data synchronization method and device
CN110244322A (en) * 2019-06-28 2019-09-17 东南大学 Pavement construction robot environment sensory perceptual system and method based on Multiple Source Sensor
CN110568447A (en) * 2019-07-29 2019-12-13 广东星舆科技有限公司 Visual positioning method, device and computer readable medium
CN110749327A (en) * 2019-08-08 2020-02-04 南京航空航天大学 Vehicle navigation method in cooperation environment
CN110389590A (en) * 2019-08-19 2019-10-29 杭州电子科技大学 A kind of AGV positioning system and method merging 2D environmental map and sparse artificial landmark
CN110389590B (en) * 2019-08-19 2022-07-05 杭州电子科技大学 AGV positioning system and method integrating 2D environment map and sparse artificial landmark
CN112334790A (en) * 2019-08-21 2021-02-05 深圳市大疆创新科技有限公司 Positioning system and positioning method for movable object, and storage medium
CN110782497A (en) * 2019-09-06 2020-02-11 腾讯科技(深圳)有限公司 Method and device for calibrating external parameters of camera
CN110782497B (en) * 2019-09-06 2022-04-29 腾讯科技(深圳)有限公司 Method and device for calibrating external parameters of camera
CN112540382B (en) * 2019-09-07 2024-02-13 山东大学 Laser navigation AGV auxiliary positioning method based on visual identification detection
CN112540382A (en) * 2019-09-07 2021-03-23 山东大学 Laser navigation AGV auxiliary positioning method based on visual identification detection
WO2021056190A1 (en) * 2019-09-24 2021-04-01 Beijing Didi Infinity Technology And Development Co., Ltd. Semantic-assisted multi-resolution point cloud registration
CN110849362A (en) * 2019-11-28 2020-02-28 湖南率为控制科技有限公司 Laser radar and vision combined navigation algorithm based on vehicle-mounted inertia
CN110849362B (en) * 2019-11-28 2022-01-04 湖南率为控制科技有限公司 Laser radar and vision combined navigation algorithm based on vehicle-mounted inertia
CN112904395A (en) * 2019-12-03 2021-06-04 青岛慧拓智能机器有限公司 Mining vehicle positioning system and method
CN111364549B (en) * 2020-02-28 2021-11-09 江苏徐工工程机械研究院有限公司 Synchronous drawing and automatic operation method and system based on laser radar
CN111364549A (en) * 2020-02-28 2020-07-03 江苏徐工工程机械研究院有限公司 Synchronous drawing and automatic operation method and system based on laser radar
US11550058B2 (en) 2020-04-10 2023-01-10 Caterpillar Paving Products Inc. Perception system three lidar coverage
CN111595333A (en) * 2020-04-26 2020-08-28 武汉理工大学 Modularized unmanned vehicle positioning method and system based on visual inertial laser data fusion
CN111551976A (en) * 2020-05-20 2020-08-18 四川万网鑫成信息科技有限公司 Method for automatically completing abnormal positioning by combining various data
CN111709355B (en) * 2020-06-12 2023-08-29 阿波罗智联(北京)科技有限公司 Method and device for identifying target area, electronic equipment and road side equipment
CN111709355A (en) * 2020-06-12 2020-09-25 北京百度网讯科技有限公司 Method and device for identifying target area, electronic equipment and road side equipment
CN111754798A (en) * 2020-07-02 2020-10-09 上海电科智能***股份有限公司 Method for realizing detection of vehicle and surrounding obstacles by fusing roadside laser radar and video
CN111929699A (en) * 2020-07-21 2020-11-13 北京建筑大学 Laser radar inertial navigation odometer considering dynamic obstacles and mapping method and system
CN111998832A (en) * 2020-08-12 2020-11-27 河北雷神科技有限公司 Laser point cloud-based inspection method for accurately positioning target object by using unmanned aerial vehicle
CN111947647B (en) * 2020-08-26 2024-05-17 四川阿泰因机器人智能装备有限公司 Precise positioning method for vision and laser radar integrated robot
CN111947647A (en) * 2020-08-26 2020-11-17 四川阿泰因机器人智能装备有限公司 Robot accurate positioning method integrating vision and laser radar
CN112254729A (en) * 2020-10-09 2021-01-22 北京理工大学 Mobile robot positioning method based on multi-sensor fusion
WO2022142827A1 (en) * 2020-12-30 2022-07-07 华为技术有限公司 Road occupancy information determination method and apparatus
CN113091736A (en) * 2021-04-02 2021-07-09 京东数科海益信息科技有限公司 Robot positioning method, device, robot and storage medium
CN113359782B (en) * 2021-05-28 2022-07-29 福建工程学院 Unmanned aerial vehicle autonomous addressing landing method integrating LIDAR point cloud and image data
CN113359782A (en) * 2021-05-28 2021-09-07 福建工程学院 Unmanned aerial vehicle autonomous addressing landing method integrating LIDAR point cloud and image data
WO2022256976A1 (en) * 2021-06-07 2022-12-15 深圳市大疆创新科技有限公司 Method and system for constructing dense point cloud truth value data and electronic device
CN113777600B (en) * 2021-09-09 2023-09-15 北京航空航天大学杭州创新研究院 Multi-millimeter wave radar co-location tracking method
CN113777600A (en) * 2021-09-09 2021-12-10 北京航空航天大学杭州创新研究院 Multi-millimeter-wave radar cooperative positioning tracking method
CN114663992A (en) * 2022-03-18 2022-06-24 福建工程学院 Multi-source data fusion expressway portal positioning method
CN114429432B (en) * 2022-04-07 2022-06-21 科大天工智能装备技术(天津)有限公司 Multi-source information layered fusion method and device and storage medium
CN114429432A (en) * 2022-04-07 2022-05-03 科大天工智能装备技术(天津)有限公司 Multi-source information layered fusion method and device and storage medium
CN115235478A (en) * 2022-09-23 2022-10-25 武汉理工大学 Intelligent automobile positioning method and system based on visual label and laser SLAM
CN115235478B (en) * 2022-09-23 2023-04-07 武汉理工大学 Intelligent automobile positioning method and system based on visual label and laser SLAM
CN116299367B (en) * 2023-05-18 2024-01-26 中国测绘科学研究院 Multi-laser space calibration method
CN116299367A (en) * 2023-05-18 2023-06-23 中国测绘科学研究院 Multi-laser space calibration method

Also Published As

Publication number Publication date
CN109099901B (en) 2021-09-24

Similar Documents

Publication Publication Date Title
CN109099901A (en) Full-automatic road roller localization method based on multisource data fusion
CN110084272B (en) Cluster map creation method and repositioning method based on cluster map and position descriptor matching
US10897575B2 (en) Lidar to camera calibration for generating high definition maps
CN108571971B (en) AGV visual positioning system and method
CN106651953B (en) A kind of vehicle position and orientation estimation method based on traffic sign
CN111652179A (en) Semantic high-precision map construction and positioning method based on dotted line feature fusion laser
CN103411609B (en) A kind of aircraft return route planing method based on online composition
CN106296814B (en) Highway maintenance detection and virtual interactive interface method and system
CN105667518A (en) Lane detection method and device
CN104729485A (en) Visual positioning method based on vehicle-mounted panorama image and streetscape matching
CN110308729A (en) The AGV combined navigation locating method of view-based access control model and IMU or odometer
CN103837143B (en) Super-mapping machine
CN103885455B (en) Tracking measurement robot
CN103868504B (en) Autonomous surveying and mapping machine
CN110263607A (en) A kind of for unpiloted road grade global context drawing generating method
CN106705962A (en) Method and system for acquiring navigation data
CN111721279A (en) Tail end path navigation method suitable for power transmission inspection work
CN112749584B (en) Vehicle positioning method based on image detection and vehicle-mounted terminal
CN116883604A (en) Three-dimensional modeling technical method based on space, air and ground images
Hara et al. Vehicle localization based on the detection of line segments from multi-camera images
Alshaiba et al. Automatic manhole extraction from MMS data to update basemaps
CN112446915A (en) Picture-establishing method and device based on image group
CN110415299B (en) Vehicle position estimation method based on set guideboard under motion constraint
CN112530270B (en) Mapping method and device based on region allocation
KR20210098534A (en) Methods and systems for creating environmental models for positioning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20190404

Address after: Room 525, Building 333 Xingpu Road, Suzhou Industrial Park, Jiangsu Province

Applicant after: Zhongke Weiyi (Suzhou) Intelligent Technology Co., Ltd.

Address before: 215021 Linquan Street 399, Suzhou Industrial Park, Jiangsu Province

Applicant before: Suzhou Road Agent Intelligent Technology Co., Ltd.

GR01 Patent grant