CN112179362A - High-precision map data acquisition system and acquisition method - Google Patents

High-precision map data acquisition system and acquisition method

Info

Publication number
CN112179362A
CN112179362A (application CN201910593310.1A)
Authority
CN
China
Prior art keywords
unit
time
gps
laser radar
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910593310.1A
Other languages
Chinese (zh)
Inventor
李志伟
郭坤
薛周鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiaomi Automobile Technology Co Ltd
Original Assignee
Shendong Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shendong Technology Beijing Co ltd filed Critical Shendong Technology Beijing Co ltd
Priority to CN201910593310.1A priority Critical patent/CN112179362A/en
Publication of CN112179362A publication Critical patent/CN112179362A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C21/32 Structuring or formatting of map data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18 Stabilised platforms, e.g. by gyroscope
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The invention provides a high-precision map data acquisition system comprising a camera unit, a laser radar unit, an inertial measurement unit, a GPS unit, an FPGA synchronous time service unit, a processor unit and a power supply management unit. The camera unit, the laser radar unit, the inertial measurement unit and the GPS unit are used for acquiring data; the processor unit is used for processing the data collected by the camera unit, the laser radar unit and the GPS unit; and the power supply management unit provides a stable power supply for the other units. The mounting positions of the camera unit, the lidar unit, the inertial measurement unit and the GPS unit each remain relatively fixed. Using the system and method, strict timing synchronization of every sensor is ensured, so that effective and correct data fusion can be carried out and the high precision of the map data is guaranteed.

Description

High-precision map data acquisition system and acquisition method
Technical Field
The invention relates to the field of automatic driving, in particular to a high-precision map data acquisition system and a corresponding acquisition method for automatic driving.
Background
Automotive autonomous-driving technology uses video cameras, radar sensors, laser rangefinders and the like to perceive the surrounding traffic, and navigates the road ahead with the help of a detailed map collected in advance by manually driven vehicles. All of this is coordinated by a data center that processes the vast amount of information the car collects about its surroundings; in this respect, the autonomous vehicle acts as a remote-controlled or intelligent vehicle of the data center. According to the level of automation, autonomous driving is commonly divided into four stages: driver assistance, partial automation, high automation and full automation. At every one of these stages, pre-collected map data is indispensable.
As autonomous-driving research progresses and higher automation stages are pursued, the precision requirements on pre-collected map data keep increasing. The accuracy of a high-precision map is closely tied to the quality of the raw collected data: sensors such as lidar, cameras, inertial navigation sensors and GPS (Global Positioning System) receivers each capture information of a different dimension, and this information must be fused during high-precision map production. In current multi-sensor acquisition systems, however, the different sensors work independently; because they cannot cooperate, for example because strict timing synchronization cannot be performed, map data errors arise or accuracy is reduced. A high-precision map data acquisition approach is therefore needed that guarantees the cooperative operation of all sensors.
Disclosure of Invention
The invention provides a high-precision map data acquisition approach. With it, strict timing synchronization of the sensors is ensured, so that effective and correct data fusion can be carried out and the high precision of the map data is guaranteed.
According to one aspect of the invention, a high-precision map data acquisition system is provided; the system comprises a camera unit, a laser radar unit, an inertial measurement unit, a GPS unit, an FPGA synchronous time service unit, a processor unit and a power supply management unit;
the camera unit, the laser radar unit, the inertial measurement unit and the GPS unit are used for acquiring data; the processor unit is used for processing the data collected by the camera unit, the laser radar unit and the GPS unit; the power supply management unit is used for providing a stable power supply for the other units;
the installation positions of the camera unit, the laser radar unit, the inertial measurement unit and the GPS unit each remain relatively fixed;
the FPGA time service synchronization unit generates a time signal and a synchronization control signal and sends them to the camera unit, the laser radar unit and the inertial measurement unit for time service and synchronization control; and the FPGA time service synchronization unit also receives the time and synchronization signal of the GPS unit as an input correction.
Preferably, the synchronization control signal generated by the FPGA time service synchronization unit is sent to the camera unit, the laser radar unit and the inertial measurement unit for synchronization control; and the time signal it generates is sent to the camera unit and the laser radar unit for time service.
Preferably, the processor unit receives a time signal generated by the FPGA time service synchronization unit.
Preferably, the processor unit receives a time signal generated by the GPS unit.
Preferably, the inertial measurement unit gives attitude information of the camera unit and the laser radar unit; and/or the GPS unit also gives the position information of the camera unit and the laser radar unit.
Preferably, the data measured by the inertial measurement unit is returned to the FPGA time service synchronization unit and is then time-stamped.
Preferably, the data measured by the camera unit and the lidar unit are transmitted directly to the processor unit.
Preferably, the data measured by the camera unit is transmitted to the FPGA time service synchronization unit and then to the processor unit.
Preferably, the system further comprises a wheel speed measurement unit for measuring speed information of the system.
Preferably, the data measured by the wheel speed measuring unit is transmitted to the FPGA time service synchronization unit and then transmitted to the processor unit.
Preferably, the synchronization control signal is a pulse-per-second signal or a divided-per-second signal.
Preferably, the system further comprises an information interaction unit for displaying the real-time state of the system and sending a control command.
Preferably, the system further comprises a server unit for storing and/or further processing the data processed by the processor unit.
Preferably, the processor unit is installed in the server unit.
According to another aspect of the present invention, there is provided a high-precision map data acquisition method implemented by the aforementioned high-precision map data acquisition system, the method including the steps of:
causing the FPGA synchronous time service unit to generate a synchronization control signal;
causing the synchronization control signal to trigger the camera unit, the laser radar unit and the inertial measurement unit to acquire data;
receiving the time and synchronization signals generated by the GPS unit and checking whether they are valid;
when the time and synchronization signals generated by the GPS unit are valid, using them as input corrections for the FPGA synchronous time service unit.
Drawings
The foregoing summary, as well as the following detailed description, will be better understood when read in conjunction with the appended drawings. For the purpose of illustration, certain embodiments of the disclosure are shown in the drawings. It should be understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate implementations of systems and apparatus according to the invention and, together with the description, serve to explain the advantages and principles of the invention.
In the drawings:
FIG. 1 schematically illustrates a high-precision map data acquisition system in accordance with one embodiment of the present invention; and
Fig. 2 schematically shows a high-precision map data acquisition method according to an embodiment of the present invention.
Detailed Description
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The drawings and written description are provided to guide those skilled in the art in making and using the invention for which patent protection is sought. The invention is capable of other embodiments and of being practiced and carried out in various ways. Those skilled in the art will appreciate that not all features of a commercial embodiment are shown for the sake of clarity and understanding. Those skilled in the art will also appreciate that the development of an actual commercial embodiment incorporating aspects of the present inventions will require numerous implementation-specific decisions to achieve the developer's ultimate goal for the commercial embodiment. While these efforts may be complex and time consuming, these efforts will be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting. For example, use of singular terms, such as "a," "an," and "the" is not intended to limit the number of items. Also, the use of relational terms, such as, but not limited to, "top," "bottom," "left," "right," "upper," "lower," "down," "up," "side," and the like are used in this description with specific reference to the figures for clarity and are not intended to limit the scope of the invention or the appended claims. Furthermore, it will be appreciated that any of the features of the present invention may be used alone, or in combination with other features. Other systems, methods, features and advantages of the invention will be, or will become, apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims.
Reference will now be made in detail to embodiments of the invention as illustrated in the accompanying drawings.
Referring initially to FIG. 1, FIG. 1 schematically illustrates a high accuracy map data acquisition system in accordance with one embodiment of the present invention.
As shown in the figure, the system 100 includes a camera unit 101, a laser radar unit 102, an inertial measurement unit 103, a GPS unit 104, a wheel speed measurement unit 105, an FPGA synchronous time service unit 106, a processor unit 107, a server unit 108, an information interaction unit 109, a power supply management unit 110, and the like.
The camera unit 101, the laser radar unit 102, the inertial measurement unit 103, the GPS unit 104, and the like are key sensor units, and are all data-connected to the FPGA synchronous timing unit 106. The camera unit 101, the lidar unit 102, the GPS unit 104, etc. are also data connected to the processor unit. Meanwhile, data connection also exists between the FPGA synchronous timing unit 106 and the processor unit 107.
The camera unit 101 is used to capture images for later extraction of map element information. In some embodiments of the present invention, the interface between the camera unit 101 and the processor unit 107 includes, but is not limited to, a MIPI or LVDS interface, and the number of cameras is not limited; monocular and binocular arrangements are both possible. For example, the camera unit 101 may be a Basler acA1920-40uc camera or daA1280-54lm camera.
Lidar unit 102 is configured to provide 360 ° point cloud data. In some embodiments of the present invention, the number, installation manner, and the like of the laser radar are not limited. For example, lidar unit 102 may be VLP-16 from Velodyne or VUX-1HA from RIEGL.
The inertial measurement unit 103 is used to obtain attitude information of the system. In some embodiments of the invention, the inertial measurement unit 103 comprises at least a 3-axis acceleration sensor and a 3-axis gyroscope, although its grade is not limited. The inertial measurement unit 103 may be any commercially available 3-axis acceleration sensor and 3-axis gyroscope.
The GPS unit 104 is used to acquire the real-time position of the system, and the pulse-per-second synchronization signal it generates is used for system synchronization. It is to be understood that the GPS unit 104 refers to any unit that uses satellite positioning and navigation, and is not limited to a particular satellite system. For example, it may use the United States' GPS, China's BeiDou, Europe's Galileo, or Russia's GLONASS system.
In addition, it is to be noted that, regardless of how many of the above-described sensor units are mounted, the relative positions of these sensor units need to be kept constant after mounting.
The FPGA time service synchronization unit 106 handles synchronization control and time service management for each unit of the system. It receives the pulse-per-second signal and the position/time message of the GPS unit 104, issues different pulse signals matched to the characteristics of the different sensors to trigger data acquisition, and time-stamps the sensor data against GPS time to guarantee time synchronization. When the GPS signal is good, the FPGA time service synchronization unit 106 takes the GPS time and pulse-per-second synchronization signal as input corrections; the correction may be implemented, for example, with a local phase-locked loop. When the GPS signal is poor or absent, the FPGA time service synchronization unit performs synchronization control and time service for each unit of the system according to its self-maintained timing and time, ensuring the validity of the acquired data.
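The GPS-disciplined time-keeping policy described above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the class and method names are my own, and a simple proportional correction stands in for the phase-locked loop mentioned in the text. When the GPS pulse-per-second (PPS) is valid the local clock is steered toward GPS time; otherwise it free-runs on its own counter so timestamps remain valid.

```python
# Illustrative sketch of a GPS-disciplined clock (names are hypothetical).

class DisciplinedClock:
    def __init__(self, tick_hz=1_000_000):
        self.tick_hz = tick_hz      # local oscillator frequency (1 MHz here)
        self.offset_us = 0.0        # correction applied on top of the counter
        self.ticks = 0              # free-running local counter

    def advance(self, n_ticks):
        """Advance the free-running counter (simulates the FPGA oscillator)."""
        self.ticks += n_ticks

    def now_us(self):
        """Current corrected time in microseconds."""
        return self.ticks * 1_000_000 / self.tick_hz + self.offset_us

    def on_gps_pps(self, gps_time_us, gps_valid, gain=0.5):
        """On each PPS edge: if GPS is valid, steer the clock toward GPS time
        (a crude stand-in for the phase-locked-loop correction); if GPS is
        poor or absent, do nothing and keep free-running."""
        if gps_valid:
            error = gps_time_us - self.now_us()
            self.offset_us += gain * error

clk = DisciplinedClock()
clk.advance(1_000_000)                                   # one nominal second
clk.on_gps_pps(gps_time_us=1_000_050, gps_valid=True)    # GPS: we are 50 us slow
```

With a gain of 0.5, each PPS edge removes half of the remaining error, so repeated corrections converge toward GPS time while a single noisy fix cannot jerk the clock.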
As shown in the figure, the camera unit 101 receives the time signal and the synchronization control signal from the FPGA synchronous time service unit 106; after data is acquired, it is time-stamped and then transmitted to the processor unit 107 for preprocessing. The lidar unit 102 behaves similarly. It should be understood, however, that depending on the interface type used, in some embodiments the data collected by the camera unit 101 is first returned to the FPGA time service synchronization unit 106 and then forwarded to the processor unit 107.
In addition, as shown in the figure, after receiving the synchronization control signal of the FPGA synchronous time service unit 106, the inertial measurement unit 103 acquires data and returns it to the FPGA synchronous time service unit 106, where it is time-stamped. The time-stamped inertial data is then transmitted to the processor unit 107 via the FPGA synchronous time service unit 106.
The processor unit 107 is used for collecting, summarizing and preliminarily fusing data of each unit, and can be used for extracting map element information; and the processor unit 107 also passes the data to the server unit 108 for storage on a server or for processing by a back-end processor. In some embodiments of the present invention, the processor unit 107 includes at least a main control chip circuit, and a peripheral power supply circuit, a peripheral storage circuit, and an external communication interface (such as a network port or a USB), where the main control chip may be a processor with different architectures such as ARM, FPGA, TX2, or X86.
In the illustrated embodiment, the system 100 further includes a wheel speed measurement unit 105, and the wheel speed measurement unit 105 acquires velocity information of the moving carrier. The wheel speed measuring unit 105 is also in data connection with the FPGA synchronous time service unit 106. The wheel speed measurement unit 105 does not need to maintain a relative position with respect to other sensor units; however, it is of course also possible to keep the relative position unchanged. As shown in the figure, the wheel speed measurement unit 105 transmits a wheel speed signal to the FPGA synchronous timing unit 106, and it does not need to receive a time signal and a synchronous control signal of the FPGA synchronous timing unit 106; however, it is understood that it is also possible to receive the signal of the FPGA synchronous timing unit 106. It should also be understood that in some alternative embodiments, the wheel speed measurement unit 105 may not be retained depending on the actual application.
As shown in the figure, each sensor unit such as camera unit 101, laser radar unit 102, inertia measurement unit 103, and wheel speed measurement unit 105 receives only the synchronization signal of FPGA synchronization timing unit 106. Processor unit 107 may receive the synchronization signal from FPGA synchronization time service unit 106 or the synchronization signal from GPS unit 104.
In the illustrated embodiment, the system 100 further includes an information interaction unit 109, which displays the real-time status of the system and can send control commands. The information interaction unit 109 also maintains a data connection with the server unit 108. However, it should also be understood that in some alternative embodiments, the information interaction unit 109 as an auxiliary functional unit may also not be reserved according to the actual application.
In the illustrated embodiment, the system 100 further includes a server unit 108 for processing the sensor data collected by the system to facilitate map element extraction and map updating using deep learning. The server unit 108 may be deployed in the cloud or locally. Processor unit 107 also maintains a data connection with server unit 108. However, it should be understood that in other implementations, the processor unit 107 may be installed directly in the server unit 108; in this case, all data is uploaded to the server for processing. Of course, it is understood that the server unit 108 may not be reserved according to the actual application.
The power management unit 110 is used to provide stable power to the units of the system. A conventional suitable power management unit 110 may be used.
Referring now to FIG. 2, FIG. 2 schematically illustrates a high accuracy map data collection method in accordance with one embodiment of the present invention.
First, it should be understood that, in the present invention, the high-precision map elements to be collected and extracted include road markings such as lane lines, traffic lights, diversion strips, pedestrian crossings, stop lines, traffic regulation information and lane topology information, as well as roadside elements such as guardrails, curbs, street lamps, guideboards, symbols and signs. The high-precision map elements also include road attributes such as road curvature, heading, gradient and cross-slope.
As shown in fig. 2, in step 201, after the system starts operating, the FPGA synchronous time service unit 106 generates a synchronization control signal and a time signal. The time signal is transmitted, for example, through a serial port and provides time service to the camera unit 101, the laser radar unit 102, the inertial measurement unit 103, the processor unit 107 (and the optional wheel speed measurement unit 105). The synchronization control signal is transmitted directly, for example as high and low levels, to trigger camera exposure, lidar scanning, and data acquisition by the inertial measurement unit 103 (and the optional wheel speed measurement unit 105). In step 202, each sensor outputs data according to the trigger pulse and time given by the FPGA. In an exemplary embodiment, the FPGA synchronous time service unit 106 triggers each sensor unit with counted trigger pulses, and each sensor returns its data with a timestamp. The trigger pulse may be a pulse-per-second signal or a divided-per-second signal, for example every 1/10 s or 1/5 s, aligned on whole seconds. The timestamp can be accurate to the microsecond or nanosecond level depending on the requirements. It should be understood that the timestamp is preferably a relative time, its reference preferably being the moment at which the FPGA synchronous time service unit 106 issues the synchronization control signal.
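The whole-second alignment of divided-per-second trigger pulses can be sketched as follows (function names and rates are illustrative, not from the patent). Because every integer rate places a trigger exactly on each whole second, sensors running at different divided rates still share common trigger instants and their samples can be paired directly.

```python
# Illustrative sketch: whole-second-aligned trigger schedules at divided rates.

def trigger_times_us(start_s, duration_s, rate_hz):
    """Trigger instants in microseconds for an integer rate in Hz, aligned so
    that a trigger always falls on every whole second."""
    period_us = 1_000_000 // rate_hz
    t0 = start_s * 1_000_000
    return [t0 + i * period_us for i in range(duration_s * rate_hz)]

camera_10hz = trigger_times_us(start_s=0, duration_s=1, rate_hz=10)  # 1/10 s
lidar_5hz = trigger_times_us(start_s=0, duration_s=1, rate_hz=5)     # 1/5 s

# every lidar trigger coincides with a camera trigger, so frames pair exactly
shared = set(camera_10hz) & set(lidar_5hz)
```

Here every 1/5 s lidar trigger lands on a 1/10 s camera trigger, which is precisely what makes later timestamp-based fusion straightforward.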
With continued reference to fig. 2, during operation, in the case of good GPS signals (see step 204), the FPGA time service synchronization unit 106 continuously receives as input corrections, for example, GPS time and pulse-per-second synchronization signals given by the GPS unit 104 (see step 205).
The attitude information of the camera unit 101 is given by the inertial measurement unit 103, and its position information is given by the GPS unit 104. Together they provide the camera's six degrees of freedom: the XYZ spatial position and the pitch, roll and yaw angles. It should be understood that the parameters and relative positional relationships of the inertial measurement unit 103, the GPS unit 104 and the camera unit 101 are strictly calibrated in advance after installation. Because the shooting of the camera unit 101 is triggered and controlled by the FPGA synchronous time service unit 106, each image frame corresponds to the camera's attitude and position information.
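Since frames, IMU samples, and GPS fixes all carry timestamps from the same clock, associating a frame with its six-degree-of-freedom pose reduces to a timestamp lookup. The sketch below (a hypothetical illustration; real systems would interpolate between poses rather than pick the nearest one) shows the idea.

```python
# Illustrative sketch: pairing a camera frame with its 6-DOF pose by timestamp.

from dataclasses import dataclass

@dataclass
class Pose:
    t_us: int                              # timestamp from the shared clock
    x: float; y: float; z: float           # position from the GPS unit
    roll: float; pitch: float; yaw: float  # attitude from the IMU

def pose_for_frame(frame_t_us, poses):
    """Nearest-timestamp pose lookup for one camera frame."""
    return min(poses, key=lambda p: abs(p.t_us - frame_t_us))

poses = [Pose(0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0),
         Pose(100_000, 1.0, 0.0, 0.0, 0.0, 0.0, 0.1)]
p = pose_for_frame(95_000, poses)   # frame exposed just before t = 100 ms
```

The nearest pose (at t = 100 ms) is selected; interpolating position and attitude between the two bracketing poses would be the natural refinement.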
The lidar unit 102 collects laser point cloud data. Laser point cloud data is obtained by using the laser to measure the spatial coordinates of sampling points on an object's surface under a common spatial reference system, yielding a massive set of points that expresses the spatial distribution and surface characteristics of the object; this set of points is called a point cloud. Attributes of a point cloud include its spatial resolution, point position accuracy, and the surface normal vectors of the target surface.
Because all acquired data are expressed in the carrier coordinate system, the lidar unit 102 can accurately measure distance information. In addition, lidar unit 102 can also measure reflectivity. Using reflectivity, traffic markings such as lane lines can be extracted, buildings and trees around the road can be recognized, and even concave and convex road-surface features, such as pits and ditches, can be identified.
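The reflectivity idea above can be sketched with a simple intensity threshold: lane paint is typically far more retroreflective than asphalt, so a threshold already separates candidate lane-line returns. The threshold value and point format here are illustrative assumptions, not from the patent.

```python
# Illustrative sketch: extracting lane-line candidates by reflectivity.

def lane_candidates(points, min_reflectivity=0.8):
    """points: iterable of (x, y, z, reflectivity) tuples in the carrier
    coordinate system; returns the high-reflectivity subset."""
    return [p for p in points if p[3] >= min_reflectivity]

cloud = [(1.0, 0.0, -1.5, 0.15),   # asphalt return
         (2.0, 0.1, -1.5, 0.92),   # painted lane line (retroreflective)
         (3.0, 3.0,  0.5, 0.30)]   # roadside vegetation
lanes = lane_candidates(cloud)
```

In practice the threshold would be calibrated per lidar model, and geometric filtering (points near the road plane) would remove bright non-marking returns.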
In the system of the present invention, lidar unit 102 may be, for example, a multi-line three-dimensional sensor with a 360° field of view, achieved either with a mechanically rotating 360° radar or by arranging multiple radars to cover a 360° field of view. In some embodiments, the 360° can be, for example, 360° of the roof plane, i.e., one rotation in the plane of the roof. In a preferred embodiment, the 360° may be that of a plane tilted slightly from the roof plane toward the vehicle's forward direction; it may, of course, also be tilted slightly toward the rear of the vehicle. Once the tilt angle of lidar unit 102 is determined, it must remain constant during data acquisition.
It should be understood that the lidar unit 102 is also strictly calibrated against the camera unit 101, the inertial measurement unit 103, and the like before use; the acquired radar data is motion-compensated using the output of the inertial measurement unit 103, while the timestamp of the radar data is given by the FPGA time service synchronization unit.
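A minimal sketch of the motion compensation mentioned above, under the simplifying assumption of planar motion at constant velocity across one sweep (a real system would interpolate full IMU poses): each point is shifted by the carrier's displacement between the point's capture time and the sweep end time, so the whole sweep is expressed in a single carrier frame.

```python
# Illustrative sketch: de-skewing a lidar sweep with carrier velocity.

def motion_compensate(points, v_xy, sweep_end_us):
    """points: list of (x, y, z, t_us) in the carrier frame at capture time;
    v_xy: (vx, vy) carrier velocity in m/s. Returns the points expressed in
    the carrier frame at sweep_end_us."""
    out = []
    for x, y, z, t_us in points:
        dt = (sweep_end_us - t_us) / 1e6          # seconds until sweep end
        # the carrier moves forward by v*dt, so a stationary point moves
        # backward by the same amount in the carrier frame
        out.append((x - v_xy[0] * dt, y - v_xy[1] * dt, z, sweep_end_us))
    return out

pts = [(10.0, 0.0, 0.0, 0)]                        # captured at sweep start
fixed = motion_compensate(pts, v_xy=(10.0, 0.0), sweep_end_us=100_000)
```

At 10 m/s over a 0.1 s sweep, a point seen 10 m ahead at sweep start lies only 9 m ahead at sweep end; this per-point timestamp is exactly why the FPGA-issued time base matters.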
Camera unit 101 may be used in conjunction with lidar unit 102. In some implementations, the point cloud of a low-line-count lidar may be fused with the image. In a preferred embodiment, the point cloud of a high-line-count lidar is dense enough to be used directly for cluster analysis and extraction of map elements.
In step 203, the processor unit 107 collects the data of the sensor units such as the camera unit 101, the laser radar unit 102, the inertial measurement unit 103, and the optional wheel speed measurement unit 105, and performs preliminary fusion preprocessing on them. It should be appreciated that although preprocessing by the processor unit 107 is shown in the illustrated embodiment, this step is optional rather than necessary for data acquisition. It should also be understood that the processed data may be saved in local storage or uploaded to a server for high-precision map making (not shown).
It can be understood that, in a preferred embodiment, a deep-learning method may be applied to the image information collected by the camera unit 101: high-precision map elements in the images are labeled and used for training in an early stage to build an image-recognition model. Errors in the model's training results can be identified, for example through manual review, and the model iteratively optimized; once the model is fixed and the system deployed, map element information can be extracted from newly acquired images. By combining the extracted map element information with the attitude and position information of the camera unit 101 and the distance information of the lidar unit 102, the spatial position and geometric structure of the element information in the high-precision map are obtained.
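The final fusion step can be sketched in two dimensions: a detection in the image gives a bearing, the lidar gives the range along that bearing, and the camera's pose places the element in world coordinates. All names and the 2-D simplification below are illustrative assumptions; a full system would work in 3-D with calibrated extrinsics.

```python
# Illustrative 2-D sketch: placing a detected map element in the world frame.

import math

def element_world_position(cam_xy, cam_yaw, bearing, range_m):
    """cam_xy: camera position (x, y) in the world frame; cam_yaw: camera
    heading in radians; bearing: element direction relative to the camera
    axis in radians (from the image detection); range_m: lidar range."""
    a = cam_yaw + bearing
    return (cam_xy[0] + range_m * math.cos(a),
            cam_xy[1] + range_m * math.sin(a))

# element detected straight ahead, 5 m away, camera at origin facing +x
pos = element_world_position(cam_xy=(0.0, 0.0), cam_yaw=0.0,
                             bearing=0.0, range_m=5.0)
```

This is why the text insists on strict calibration and synchronization: an error in the pose timestamp translates directly into a position error of the mapped element.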
It should be appreciated that the system and method of the present invention is applicable to the collection of high precision map data, particularly for road navigation. However, the system and the method are also applicable to the requirements of similar scenes, such as digital city modeling, geographic information mapping, road disease detection and the like, and the system and the method are not limited by the use place.
It will be appreciated by those skilled in the art that changes could be made to the embodiments described above without departing from the broad inventive concept thereof. It is understood, therefore, that this invention is not limited to the particular embodiments disclosed, but it is intended to cover modifications within the spirit and scope of the present invention as defined by the appended claims.

Claims (10)

1. A high-precision map data acquisition system, comprising a camera unit, a laser radar unit, an inertial measurement unit, a GPS unit, an FPGA synchronous time service unit, a processor unit and a power supply management unit;
wherein the camera unit, the laser radar unit, the inertial measurement unit and the GPS unit are used for acquiring data; the processor unit is used for processing the data collected by the camera unit, the laser radar unit and the GPS unit; and the power supply management unit is used for supplying stable power to the camera unit, the laser radar unit, the inertial measurement unit, the GPS unit, the FPGA synchronous time service unit and the processor unit;
wherein the installation positions of the camera unit, the laser radar unit, the inertial measurement unit and the GPS unit are kept fixed relative to one another;
and wherein the FPGA synchronous time service unit generates a time signal and a synchronization control signal and sends them to the camera unit, the laser radar unit and the inertial measurement unit for time service and synchronization control; the FPGA synchronous time service unit also receives the time and synchronization signals of the GPS unit as input for correction.
2. The system of claim 1, wherein the processor unit receives the time signal generated by the FPGA synchronous time service unit.
3. The system of claim 1, wherein the processor unit receives a time signal generated by the GPS unit.
4. The system of claim 1, wherein the synchronization control signal is sent to the camera unit, the laser radar unit and the inertial measurement unit for synchronization control; and the time signal is sent to the camera unit and the laser radar unit for time service.
5. The system of claim 1, wherein the inertial measurement unit provides attitude information of the camera unit and the laser radar unit; and/or the GPS unit also provides position information of the camera unit and the laser radar unit.
6. The system of claim 1, further comprising a wheel speed measurement unit for measuring speed information of the system.
7. The system of claim 1, wherein the data measured by the inertial measurement unit is time-stamped after being returned to the FPGA synchronous time service unit.
8. The system of claim 1, further comprising an information interaction unit for displaying a real-time status of the system and transmitting a control command.
9. The system of claim 1, further comprising a server unit for storing and/or further processing the data processed by the processor unit.
10. A high-precision map data collection method implemented by the high-precision map data collection system according to any one of claims 1 to 9, comprising the steps of:
enabling the FPGA synchronous time service unit to generate a synchronization control signal;
enabling the synchronization control signal to trigger the camera unit, the laser radar unit and the inertial measurement unit to acquire data;
enabling the time and synchronization signals generated by the GPS unit and checking whether they are valid; and
in the case that the time and synchronization signals generated by the GPS unit are valid, using them as input to the FPGA synchronous time service unit for correction.
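The correction loop in the method above can be sketched as follows: a free-running local counter provides timestamps, and whenever a valid GPS time and synchronization signal arrives, the counter's offset is nudged toward GPS time. The class, its fields and the gain value are illustrative assumptions, not the patent's implementation.

```python
class TimeServiceUnit:
    """Illustrative model of a local clock disciplined by GPS time.

    `local_time` is the free-running counter (seconds); a valid GPS
    signal supplies the reference time, and the stored offset is
    adjusted toward it by a fraction `gain` of the residual.
    """

    def __init__(self, gain=0.5):
        self.offset = 0.0  # estimated (gps_time - local_time)
        self.gain = gain   # fraction of the residual corrected per update

    def corrected_time(self, local_time):
        return local_time + self.offset

    def on_gps_signal(self, local_time, gps_time, valid):
        # Only a signal flagged as valid is used for correction,
        # mirroring the "check whether the signals are valid" step.
        if valid:
            residual = gps_time - self.corrected_time(local_time)
            self.offset += self.gain * residual
```

With this structure the sensors keep receiving monotonically increasing timestamps even when GPS reception is lost, and are steered back to absolute time as soon as a valid signal returns.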
CN201910593310.1A 2019-07-03 2019-07-03 High-precision map data acquisition system and acquisition method Pending CN112179362A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910593310.1A CN112179362A (en) 2019-07-03 2019-07-03 High-precision map data acquisition system and acquisition method


Publications (1)

Publication Number Publication Date
CN112179362A true CN112179362A (en) 2021-01-05

Family

ID=73914398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910593310.1A Pending CN112179362A (en) 2019-07-03 2019-07-03 High-precision map data acquisition system and acquisition method

Country Status (1)

Country Link
CN (1) CN112179362A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201488737U (en) * 2009-06-30 2010-05-26 中国航天科技集团公司燎原无线电厂 Data acquisition system for inertia measuring unit
CN103743958A (en) * 2013-12-31 2014-04-23 国网电力科学研究院武汉南瑞有限责任公司 Thunder and lightning detection device on basis of Beidou satellite timing system
CN204945644U (en) * 2015-08-25 2016-01-06 中国石油天然气股份有限公司 A kind of long gas pipe line distributed monitoring inertia time dissemination system
CN108279014A (en) * 2017-01-05 2018-07-13 武汉四维图新科技有限公司 Automatic Pilot map data collecting apparatus and system, map Intelligent Production System
CN108445808A (en) * 2018-03-30 2018-08-24 深圳前海清科技有限公司 The sensing device and method that data synchronize
CN108919238A (en) * 2018-07-18 2018-11-30 浙江大学 A kind of bearing calibration of rotary laser radar data and system based on Inertial Measurement Unit
CN208506253U (en) * 2018-06-15 2019-02-15 百度在线网络技术(北京)有限公司 For acquiring device, system and the vehicle of map


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113343849A (en) * 2021-06-07 2021-09-03 西安恒盛安信智能技术有限公司 Fusion sensing equipment based on radar and video
CN113566833A (en) * 2021-07-28 2021-10-29 上海工程技术大学 Multi-sensor fusion vehicle positioning method and system
WO2023083271A1 (en) * 2021-11-15 2023-05-19 虹软科技股份有限公司 Data synchronization device and method, and computer readable storage medium
CN116156073A (en) * 2021-11-15 2023-05-23 虹软科技股份有限公司 Data synchronization device, method thereof and computer readable storage medium
CN115840234A (en) * 2022-10-28 2023-03-24 苏州知至科技有限公司 Radar data acquisition method and device and storage medium
CN115840234B (en) * 2022-10-28 2024-04-19 苏州知至科技有限公司 Radar data acquisition method, device and storage medium

Similar Documents

Publication Publication Date Title
US10928523B2 (en) Accuracy of global navigation satellite system based positioning using high definition map based localization
JP7432285B2 (en) Lane mapping and navigation
CN110057373B (en) Method, apparatus and computer storage medium for generating high-definition semantic map
CN109341706B (en) Method for manufacturing multi-feature fusion map for unmanned vehicle
US11915440B2 (en) Generation of structured map data from vehicle sensors and camera arrays
CN111448476B (en) Technique for sharing mapping data between unmanned aerial vehicle and ground vehicle
CN109583415B (en) Traffic light detection and identification method based on fusion of laser radar and camera
CN112179362A (en) High-precision map data acquisition system and acquisition method
US10970924B2 (en) Reconstruction of a scene from a moving camera
EP3137850B1 (en) Method and system for determining a position relative to a digital map
US10767990B2 (en) Device, method, and system for processing survey data, and program therefor
CN109374008A (en) A kind of image capturing system and method based on three mesh cameras
US20130293716A1 (en) Mobile mapping system for road inventory
CN113566833A (en) Multi-sensor fusion vehicle positioning method and system
EP4246088A1 (en) Surveying system, surveying method, and surveying program
Vallet et al. Development and experiences with a fully-digital handheld mapping system operated from a helicopter
CN114880334A (en) Map data updating method and electronic equipment
Andert et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation
CN114127738A (en) Automatic mapping and positioning
CN113405555B (en) Automatic driving positioning sensing method, system and device
EP3693702A1 (en) Method for localizing a vehicle
CN114966793B (en) Three-dimensional measurement system, method and GNSS system
Alamús et al. On the accuracy and performance of the GEOMÒBIL System
Skaloud et al. GPS/INS Integration: From Modern Methods of Data Acquisition fo New Applications
RU2772620C1 (en) Creation of structured map data with vehicle sensors and camera arrays

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220727

Address after: Room 618, 6 / F, building 5, courtyard 15, Kechuang 10th Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing 100176

Applicant after: Xiaomi Automobile Technology Co.,Ltd.

Address before: 1219, 11 / F, SOHO, Zhongguancun, 8 Haidian North 2nd Street, Haidian District, Beijing 100089

Applicant before: SHENDONG TECHNOLOGY (BEIJING) Co.,Ltd.
