CN111912418A - Method, device and medium for deleting obstacles in non-driving area of mobile carrier - Google Patents

Method, device and medium for deleting obstacles in non-driving area of mobile carrier

Info

Publication number
CN111912418A
CN111912418A (application CN202010685784.1A)
Authority
CN
China
Prior art keywords
map
mobile carrier
real
information
aerial view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010685784.1A
Other languages
Chinese (zh)
Inventor
王泽荔
王文爽
顾晨益
陈伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Imotion Automotive Technology Suzhou Co Ltd
Original Assignee
Imotion Automotive Technology Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Imotion Automotive Technology Suzhou Co Ltd filed Critical Imotion Automotive Technology Suzhou Co Ltd
Priority to CN202010685784.1A
Publication of CN111912418A
Legal status: Pending

Classifications

    • G01C21/32: Structuring or formatting of map data (Navigation; Map- or contour-matching)
    • G01C21/165: Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments
    • G01C21/20: Instruments for performing navigational calculations
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S17/93: Lidar systems specially adapted for anti-collision purposes
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application relates to a method, a device and a medium for deleting obstacles in the non-travelable area of a mobile carrier, and belongs to the field of computer technology. The method comprises: while the mobile carrier moves on a moving plane, constructing an environment map of the current moving environment of the mobile carrier based on a SLAM algorithm; acquiring real-time positioning information of the mobile carrier based on its initial positioning information; acquiring real-time orientation information of the mobile carrier based on its initial orientation information; acquiring the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information; and deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain a processed bird's-eye view. The deployment efficiency of the SLAM map in different scenes can thereby be improved, and environment information in the non-travelable area can be deleted without using a high-precision map.

Description

Method, device and medium for deleting obstacles in non-driving area of mobile carrier
Technical Field
The application relates to a method, a device and a medium for deleting obstacles in the non-travelable area of a mobile carrier, and belongs to the field of computer technology.
Background
With the rapid development of automatic driving technology, an automatic driving system needs to acquire information about surrounding obstacles during automatic driving in order to drive safely on the road.
An existing method for acquiring information about the obstacles around a moving carrier comprises: acquiring a high-precision map; deleting the obstacles outside the lane area in the high-precision map and displaying only the obstacles inside the lane area. For example, the vehicle in front of the moving carrier is displayed, while the green belt and similar objects near the moving carrier are not displayed.
However, a high-precision map is a highly precise, finely defined map whose precision needs to reach the decimeter level, so its production efficiency is low. This makes displaying the moving environment of the mobile carrier based on a high-precision map inefficient.
Disclosure of Invention
The application provides a method, a device and a medium for deleting obstacles in the non-travelable area of a mobile carrier, which can solve the problem that displaying the obstacles in the non-travelable area of the mobile carrier based on a high-precision map is inefficient. The application provides the following technical solution:
In a first aspect, a method for deleting obstacles in the non-travelable area of a mobile carrier is provided, and the method comprises:
constructing, while the mobile carrier moves on a moving plane, an environment map of the current moving environment of the mobile carrier based on a simultaneous localization and mapping (SLAM) algorithm, wherein the environment map comprises initial positioning information of the mobile carrier in an initial state and initial orientation information of the mobile carrier;
acquiring real-time positioning information of the mobile carrier in real time based on the initial positioning information;
acquiring real-time orientation information of the mobile carrier in real time based on the initial orientation information;
acquiring the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information, wherein the bird's-eye view represents the actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, and the preset spatial range comprises a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction;
and deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain a processed bird's-eye view.
Optionally, the deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map to obtain a processed bird's-eye view comprises:
determining the image intersection between the bird's-eye view and the local map;
performing a logical AND operation on the bird's-eye view and the image intersection to obtain the processed bird's-eye view;
or, alternatively,
performing a logical AND operation on the bird's-eye view and the local map to obtain the processed bird's-eye view.
Optionally, before the deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map to obtain the processed bird's-eye view, the method further comprises:
performing binarization processing on the environment map to obtain a binarized environment map, wherein the binarized environment map comprises a first pixel area and a second pixel area, the first pixel area indicating a travelable area and the second pixel area indicating the non-travelable area.
Optionally, the method further comprises:
determining whether to activate a delete-static-obstacle function;
and when it is determined that the delete-static-obstacle function is activated, triggering execution of the step of deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain the processed bird's-eye view.
Optionally, a laser detection assembly is mounted on the mobile carrier and is used for collecting point cloud data of reflectors within the preset spatial range; the acquiring of the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information comprises:
acquiring the point cloud data collected by the laser detection assembly at the position corresponding to the real-time positioning information, wherein the point cloud data comprises three-dimensional coordinates of sampling points, sampling point density and reflected signal intensity, the three-dimensional coordinates indicating the three-dimensional position of the corresponding sampling point relative to the laser detection assembly;
projecting the sampling points onto a two-dimensional plane according to the three-dimensional coordinates, with the height direction perpendicular to the moving plane as the projection direction, to obtain the bird's-eye view, wherein the two-dimensional plane is parallel to the moving plane or the two-dimensional plane is the moving plane.
Optionally, the projecting of the sampling points onto a two-dimensional plane according to the three-dimensional coordinates, with the height direction perpendicular to the moving plane as the projection direction, to obtain the bird's-eye view comprises:
acquiring the relative positional relationship between the laser detection assembly and the mobile carrier;
converting the three-dimensional coordinates of the sampling points into a common coordinate system based on the relative positional relationship to obtain converted three-dimensional coordinates, wherein the common coordinate system is a coordinate system established based on the position of the mobile carrier;
and projecting the converted three-dimensional coordinates onto the two-dimensional plane along the projection direction to obtain the bird's-eye view.
Optionally, after the deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map to obtain the processed bird's-eye view, the method further comprises:
performing obstacle recognition on the processed bird's-eye view to obtain a dynamic obstacle recognition result for the processed bird's-eye view.
In a second aspect, an apparatus for deleting obstacles in the non-travelable area of a mobile carrier is provided, the apparatus comprising:
a map building module, configured to construct an environment map of the current moving environment of the mobile carrier based on a simultaneous localization and mapping (SLAM) algorithm while the mobile carrier moves on a moving plane, wherein the environment map comprises initial positioning information of the mobile carrier in an initial state and initial orientation information of the mobile carrier;
a real-time positioning module, configured to acquire real-time positioning information of the mobile carrier in real time based on the initial positioning information, and to acquire real-time orientation information of the mobile carrier in real time based on the initial orientation information;
a bird's-eye view acquisition module, configured to acquire the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information, wherein the bird's-eye view represents the actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, and the preset spatial range comprises a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction;
and a bird's-eye view processing module, configured to delete, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain the processed bird's-eye view.
In a third aspect, an apparatus for deleting obstacles in the non-travelable area of a mobile carrier is provided, the apparatus comprising a processor and a memory, wherein the memory stores a program that is loaded and executed by the processor to implement the method for deleting obstacles in the non-travelable area of a mobile carrier according to the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, in which a program is stored, the program, when executed by a processor, implementing the method for deleting obstacles in the non-travelable area of a mobile carrier according to the first aspect.
The beneficial effects of the application lie in the following: while the mobile carrier moves on a moving plane, an environment map of the current moving environment of the mobile carrier is constructed based on a simultaneous localization and mapping (SLAM) algorithm, the environment map comprising initial positioning information of the mobile carrier in an initial state and initial orientation information of the mobile carrier; real-time positioning information of the mobile carrier is acquired in real time based on the initial positioning information; real-time orientation information of the mobile carrier is acquired in real time based on the initial orientation information; a bird's-eye view corresponding to the real-time positioning information and the real-time orientation information is acquired, the bird's-eye view representing the actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, the preset spatial range comprising a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction; and the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map is deleted from the bird's-eye view to obtain a processed bird's-eye view. In this way, the problem that displaying obstacles in the non-travelable area of the mobile carrier based on a high-precision map is inefficient can be solved, the deployment efficiency of the SLAM map in different scenes can be improved, and environment information in the non-travelable area can be deleted without using a high-precision map.
The foregoing description is only an overview of the technical solutions of the present application. To make the technical means of the present application clearer so that they can be implemented according to the content of the description, the preferred embodiments of the present application are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a schematic structural diagram of an apparatus for deleting obstacles in the non-travelable area of a mobile carrier according to an embodiment of the present application;
Fig. 2 is a flowchart of a method for deleting obstacles in the non-travelable area of a mobile carrier according to an embodiment of the present application;
Fig. 3 is a diagram illustrating a binarized environment map according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a local map being extracted from an environment map according to an embodiment of the present application;
Fig. 5 is a schematic diagram of deleting the environment information of the non-travelable area in the local map from the bird's-eye view according to an embodiment of the present application;
Fig. 6 is a block diagram of an apparatus for deleting obstacles in the non-travelable area of a mobile carrier according to an embodiment of the present application;
Fig. 7 is a block diagram of an apparatus for deleting obstacles in the non-travelable area of a mobile carrier according to another embodiment of the present application.
Detailed Description
The embodiments of the present application are described in detail below in conjunction with the accompanying drawings and examples. The following examples are intended to illustrate the present application but are not intended to limit its scope.
First, several terms referred to in the present application will be described.
Simultaneous localization and mapping (SLAM): a robot starts to move from an unknown position in an unknown environment, incrementally builds a map of that environment while estimating its own pose within it during the movement, and thereby realizes autonomous localization and navigation.
The principle of a SLAM algorithm includes: acquiring pose information and environmental observation data of the robot in real time; performing spatial uncertainty estimation on the pose information and the environmental observation data, so as to estimate the positioning information of the robot; and building a map based on the environmental observation data and the positioning information to obtain an environment map. The environmental observation data are obtained by observing environmental landmarks. In general, an environment map obtained based on a SLAM algorithm includes a travelable area, a non-travelable area and an area to be explored for the robot. SLAM algorithms include, but are not limited to, Gmapping, Karto, Hector and Cartographer, and this embodiment does not limit the type of SLAM algorithm.
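For illustration only (this sketch is an editorial addition and not part of the original disclosure; the function and variable names are assumptions), the mapping half of such a loop can be reduced to registering a lidar scan, taken at an already-estimated 2D pose, into an occupancy grid in which occupied cells are marked 255:

    import numpy as np

    def update_grid(grid, resolution, origin, pose, ranges, bearings):
        """Mark the cells hit by one scan as occupied (255).
        grid: 2D uint8 array; resolution: metres per cell;
        origin: world (x, y) of grid cell (0, 0); pose: (x, y, yaw) of the sensor;
        ranges/bearings: measured distances (m) and beam angles (rad)."""
        x, y, yaw = pose
        hx = x + ranges * np.cos(yaw + bearings)   # beam end points in the world frame
        hy = y + ranges * np.sin(yaw + bearings)
        cols = ((hx - origin[0]) / resolution).astype(int)
        rows = ((hy - origin[1]) / resolution).astype(int)
        ok = (rows >= 0) & (rows < grid.shape[0]) & (cols >= 0) & (cols < grid.shape[1])
        grid[rows[ok], cols[ok]] = 255             # occupied, i.e. non-travelable
        return grid

    # Toy usage: a 20 m x 20 m grid at 0.1 m per cell, one circular scan 5 m around the robot.
    grid = np.zeros((200, 200), dtype=np.uint8)
    bearings = np.linspace(-np.pi, np.pi, 360, endpoint=False)
    update_grid(grid, 0.1, (-10.0, -10.0), (0.0, 0.0, 0.0), np.full(360, 5.0), bearings)

A full SLAM system additionally estimates the pose itself from the same observations; the sketch assumes that estimate is already available.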
Binarization: the process of setting the gray value of each pixel in an image to either 0 or 255, so that the whole image shows a clear black-and-white effect.
Fig. 1 is a schematic structural diagram of an apparatus 100 for deleting obstacles in the non-travelable area of a mobile carrier according to an embodiment of the present application. The mobile carrier is a carrier that moves on a moving plane at a constant speed. Optionally, the mobile carrier may be a vehicle or a cleaning robot; this embodiment is not limited in this respect. As shown in Fig. 1, the apparatus comprises at least a control component 110 and a sensing component 120 communicatively coupled to the control component 110.
The sensing component 120 is mounted on the mobile carrier. The sensing component 120 includes, but is not limited to, an odometer, a gyroscope, a laser detection assembly, and the like. Optionally, the laser detection assembly may be a lidar, a stereo camera, or a time-of-flight camera; this embodiment does not limit the type of laser detection assembly.
The sensing component 120 is configured to collect sensing data and transmit it to the control component 110. Accordingly, the control component 110 obtains the sensing data and, while the mobile carrier moves on the moving plane, constructs an environment map of the current moving environment of the mobile carrier based on the SLAM algorithm and the sensing data.
The laser detection assembly is configured to emit laser beams within the preset spatial range and to collect point cloud data of a reflector when a reflector is present. The preset spatial range comprises a spatial range in the traveling direction of the mobile carrier and a spatial range in the vertical direction perpendicular to the traveling direction.
The traveling direction may be the direction in which the mobile carrier is currently traveling; for example, if the mobile carrier moves backwards, its traveling direction is backwards. Alternatively, the traveling direction may be the direction in which a mobile carrier at rest is about to travel.
The vertical direction perpendicular to the traveling direction includes the vertical directions perpendicular to the traveling direction on the front side, the rear side, the left side and the right side of the mobile carrier.
A reflector is an object that reflects a laser beam back to the laser detection assembly. Taking a vehicle as the mobile carrier, the reflector may be a road edge, garbage, a stone, another vehicle traveling near the vehicle, and so on; this embodiment does not limit the type of reflector.
Point cloud data refers to the data set of the points reflected from the surface of a reflector after the laser detection assembly emits a plurality of laser beams. In this embodiment, the point cloud data comprises the three-dimensional coordinates of the sampling points, the sampling point density and the reflected signal intensity. The three-dimensional coordinates indicate the three-dimensional position of the corresponding sampling point relative to the laser detection assembly, and a sampling point is a point reflected from the surface of a reflector.
The control component 110 obtains the point cloud data collected by the laser detection assembly and generates a bird's-eye view from the point cloud data.
The bird's-eye view is a two-dimensional image formed by projecting the sampling points onto a two-dimensional plane according to their three-dimensional coordinates, with the height direction perpendicular to the moving plane as the projection direction (i.e. the coordinate value in the height direction indicated by the three-dimensional coordinates is set to 0).
In this embodiment, the control component 110 is configured to: while the mobile carrier moves on a moving plane, construct an environment map of the current moving environment of the mobile carrier based on a simultaneous localization and mapping (SLAM) algorithm, the environment map comprising initial positioning information of the mobile carrier in an initial state and initial orientation information of the mobile carrier; acquire real-time positioning information of the mobile carrier in real time based on the initial positioning information; acquire real-time orientation information of the mobile carrier in real time based on the initial orientation information; acquire the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information, the bird's-eye view representing the actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, the preset spatial range comprising a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction; and delete, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain a processed bird's-eye view.
Optionally, this embodiment is described by taking the case in which the control component 110 is installed in the control system of the mobile carrier (for example, a vehicle) as an example; in other implementations, the control component 110 may also be implemented in another device independent of the mobile carrier, and this embodiment does not limit the implementation of the control component 110.
When the current moving environment of the mobile carrier comprises both static obstacles and dynamic obstacles, the environment map determined based on the SLAM algorithm contains the static obstacles, whereas the corresponding bird's-eye view contains both the static obstacles and the dynamic obstacles. By deleting the environment information of the non-travelable area, the static obstacles can therefore be filtered out so that only the dynamic obstacles are displayed. Since a dynamic obstacle is usually another mobile carrier near the mobile carrier, dynamic obstacles can be displayed without using a high-precision map, which improves their display efficiency.
In this application, a static obstacle is an object that is stationary in the moving environment, such as a green belt or a road block in the moving environment of a vehicle; a dynamic obstacle is an object in the moving environment whose position changes over time, such as another vehicle in the moving environment of the vehicle.
Fig. 2 is a flowchart of a method for deleting obstacles in the non-travelable area of a mobile carrier according to an embodiment of the present application. This embodiment is described by taking the case in which the method is used in the apparatus 100 shown in Fig. 1 as an example, with the control component 110 of the apparatus 100 as the execution subject of each step. The method comprises at least the following steps:
step 201, in the process that the mobile carrier moves on the moving plane, an environment map of the current moving environment of the mobile carrier is constructed based on the SLAM algorithm, and the environment map comprises initial positioning information in the initial state of the mobile carrier and initial orientation information of the mobile carrier.
The orientation information may be a heading angle of the moving carrier.
Optionally, the mobile carrier determines whether to initiate a prune static barrier function; when determining to start the function of deleting the static obstacle, executing step 201; when it is determined that the delete static obstacle function is not activated, the process ends.
When the environment map is constructed, the mobile carrier inputs sensing data acquired by the sensing assembly into an SLAM algorithm to obtain the environment map.
Alternatively, after the environment map is obtained, binarization processing is performed on the environment map to obtain a binarized environment map (refer to fig. 3). The environment map after binarization comprises a first pixel area and a second pixel area, wherein the first pixel area is used for indicating a travelable area, and the second pixel area is used for indicating an unlawable area. In one example, the pixel value of the travelable region is 0; the pixel value of the no-travel region is 255.
Of course, the environment map may further include an area to be explored, and the embodiment does not limit the area division manner of the environment map.
In other examples, the environment map output by the SLAM algorithm may be a binarized environment map, at which time the step of binarizing the environment map need not be performed.
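As an illustration of this binarization step (an editorial sketch under the 0/255 convention above; the function name and the threshold value are assumptions), a grayscale occupancy map can be thresholded so that travelable cells become 0 and non-travelable cells become 255:

    import numpy as np

    def binarize_environment_map(gray_map, threshold=128):
        """gray_map: 2D uint8 array. Returns a map containing only 0 and 255."""
        binary = np.zeros_like(gray_map)
        binary[gray_map >= threshold] = 255   # second pixel area: non-travelable
        return binary                         # remaining 0 cells: travelable (first pixel area)

    # Example on a small synthetic map.
    gray = np.array([[ 10, 200,  30],
                     [250,  20, 180],
                     [ 40, 130,  70]], dtype=np.uint8)
    print(binarize_environment_map(gray))

In practice the threshold depends on how the SLAM back end encodes occupancy probabilities.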
Step 202: acquire real-time positioning information of the mobile carrier in real time based on the initial positioning information, and acquire real-time orientation information of the mobile carrier in real time based on the initial orientation information.
The mobile carrier can acquire the real-time orientation information based on the initial orientation information through a gyroscope, and can then calculate the real-time positioning information from the travel duration, the travel speed and the real-time orientation information.
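A minimal sketch of this dead-reckoning computation (an editorial illustration; the variable names and the fixed time step are assumptions) integrates the travel speed along the real-time heading, starting from the initial positioning information:

    import numpy as np

    def dead_reckon(initial_xy, speeds, headings, dt):
        """initial_xy: initial positioning information (x, y) in metres.
        speeds: per-step travel speeds (m/s); headings: per-step heading angles (rad),
        e.g. integrated from a gyroscope; dt: time between samples (s)."""
        x, y = initial_xy
        for v, theta in zip(speeds, headings):
            x += v * np.cos(theta) * dt
            y += v * np.sin(theta) * dt
        return x, y

    # Example: 1 m/s for 10 s while slowly turning left by 45 degrees in total.
    headings = np.linspace(0.0, np.pi / 4, 10)
    print(dead_reckon((0.0, 0.0), [1.0] * 10, headings, dt=1.0))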
Step 203: acquire the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information, the bird's-eye view representing the actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, the preset spatial range comprising a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction.
In one example, a laser detection assembly is mounted on the mobile carrier and is used for collecting point cloud data of reflectors within the preset spatial range. In this case, acquiring the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information comprises: acquiring the point cloud data collected by the laser detection assembly within the preset spatial range at the position corresponding to the real-time positioning information, the point cloud data comprising the three-dimensional coordinates of the sampling points, the sampling point density and the reflected signal intensity, the three-dimensional coordinates indicating the three-dimensional position of the corresponding sampling point relative to the laser detection assembly; and projecting the sampling points onto a two-dimensional plane according to the three-dimensional coordinates, with the height direction perpendicular to the moving plane as the projection direction, to obtain the bird's-eye view.
The projecting of the sampling points onto a two-dimensional plane according to the three-dimensional coordinates, with the height direction perpendicular to the moving plane as the projection direction, to obtain the bird's-eye view comprises: acquiring the relative positional relationship between the laser detection assembly and the mobile carrier; converting the three-dimensional coordinates of the sampling points into a common coordinate system based on the relative positional relationship to obtain converted three-dimensional coordinates, the common coordinate system being a coordinate system established based on the position of the mobile carrier; and projecting the converted three-dimensional coordinates onto the two-dimensional plane along the projection direction to obtain the bird's-eye view.
Optionally, the two-dimensional plane is parallel to the moving plane; alternatively, the two-dimensional plane is the moving plane.
The bird's-eye view is a two-dimensional image formed by projecting the sampling points onto the two-dimensional plane according to their three-dimensional coordinates, with the height direction perpendicular to the moving plane as the projection direction (i.e. the coordinate value in the height direction indicated by the three-dimensional coordinates is set to 0).
The relative positional relationship is the position of the laser detection assembly relative to the mobile carrier. In one example, the three-dimensional coordinates are coordinate values in a coordinate system established with the laser detection assembly as the origin, and the common coordinate system is a coordinate system established with the centre of the mobile carrier as the origin; the relative positional relationship is then the coordinate-system transformation between the coordinate system of the laser detection assembly and the common coordinate system.
Since the three-dimensional coordinates of a sampling point are coordinates relative to the laser detection assembly, i.e. coordinate values in a coordinate system established based on that laser detection assembly, this coordinate system is not applicable to the three-dimensional coordinates collected by other laser detection assemblies. Therefore, in this embodiment, the three-dimensional coordinates are converted into a common coordinate system that applies to the converted three-dimensional coordinates collected by every laser detection assembly, so that the three-dimensional coordinates collected by the individual laser detection assemblies can be combined into an overall three-dimensional image.
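The conversion into the common coordinate system and the subsequent projection can be sketched as follows (an editorial illustration; the mounting pose, grid resolution and all names are assumptions). Points expressed in the lidar frame are rotated and translated into a carrier-centred frame, the height coordinate is dropped, and the remaining (x, y) coordinates are rasterised into a bird's-eye-view grid:

    import numpy as np

    def to_common_frame(points_sensor, sensor_xyz, sensor_yaw):
        """points_sensor: (N, 3) points in the lidar frame; sensor_xyz: (x, y, z) mounting
        offset of the lidar relative to the carrier centre; sensor_yaw: mounting yaw (rad)."""
        c, s = np.cos(sensor_yaw), np.sin(sensor_yaw)
        R = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
        return points_sensor @ R.T + np.asarray(sensor_xyz)

    def project_to_bev(points_common, resolution=0.1, extent=20.0):
        """Drop the height coordinate and mark the hit cells (255) in a square
        bird's-eye-view image of side 2*extent metres centred on the carrier."""
        size = int(2 * extent / resolution)
        bev = np.zeros((size, size), dtype=np.uint8)
        cols = ((points_common[:, 0] + extent) / resolution).astype(int)
        rows = ((points_common[:, 1] + extent) / resolution).astype(int)
        ok = (rows >= 0) & (rows < size) & (cols >= 0) & (cols < size)
        bev[rows[ok], cols[ok]] = 255
        return bev

    # Example: a lidar mounted 1.2 m ahead of and 1.5 m above the carrier centre.
    pts = np.random.uniform(-10.0, 10.0, size=(1000, 3))
    bev = project_to_bev(to_common_frame(pts, (1.2, 0.0, 1.5), 0.0))

Points from several lidars would each be converted with their own mounting pose and then rasterised into the same grid.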
As can be seen from the way the bird's-eye view is acquired, when the moving environment contains obstacles (both static and dynamic), the mobile carrier collects point cloud data of those obstacles and generates a bird's-eye view that contains them. In this embodiment, the bird's-eye view also contains the position of the mobile carrier.
Step 204: delete, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain the processed bird's-eye view.
After the mobile carrier has acquired the environment map and the bird's-eye view corresponding to the positioning information and orientation information within that map, it crops the part of the environment map corresponding to the bird's-eye view to obtain the local map. For example, referring to Fig. 4, the portion 41 corresponding to the bird's-eye view is cropped out of the environment map (a) to obtain the local map (b).
Optionally, deleting the environment information of the non-travelable area in the local map from the bird's-eye view to obtain the processed bird's-eye view comprises: determining the image intersection between the bird's-eye view and the local map, and performing a logical AND operation on the bird's-eye view and the image intersection to obtain the processed bird's-eye view; or performing a logical AND operation on the bird's-eye view and the local map to obtain the processed bird's-eye view.
The logical AND operation means: 255 AND 0 gives 0; 255 AND 255 gives 255; 0 AND 0 gives 0.
Deleting the environment information of the non-travelable area in the local map from the bird's-eye view can be expressed as: processed bird's-eye view = bird's-eye view & (local map)', i.e. the bird's-eye view ANDed with the complement of the local map; or, equivalently, processed bird's-eye view = bird's-eye view - local map.
For example, referring to Fig. 5, the bird's-eye view 51 includes the mobile carrier 511, a green area 512 and a vehicle 513 (a dynamic obstacle) located in front of the mobile carrier 511, and the local map 52 includes the green area 521. After the area that is the same as in the local map 52 is deleted from the bird's-eye view 51, the resulting processed bird's-eye view 53 contains only the vehicle 513 and no longer contains the green area.
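A minimal sketch of this deletion step (an editorial illustration assuming 0/255 uint8 images, as in the binarized map described above; the function names are assumptions): the subtraction form clears every bird's-eye-view cell that the local map marks as non-travelable, while the AND form requires the travelable area of the mask to be encoded as 255:

    import numpy as np

    def delete_non_travelable(bev, local_map):
        """bev, local_map: 2D uint8 arrays of the same shape with values 0 or 255.
        In local_map, 255 marks the non-travelable area. Implements the
        'bird's-eye view minus local map' form."""
        processed = bev.copy()
        processed[local_map == 255] = 0
        return processed

    def and_with_travelable_mask(bev, travelable_mask):
        """travelable_mask: 255 where the local map is travelable, 0 elsewhere.
        Pixel-wise AND: 255&255=255, 255&0=0, 0&0=0."""
        return np.bitwise_and(bev, travelable_mask)

    # Toy example: the obstacle on the road (row 1) survives; the one on the verge
    # (rightmost column, marked non-travelable) is removed.
    bev = np.zeros((4, 4), dtype=np.uint8); bev[1, 1] = 255; bev[2, 3] = 255
    local = np.zeros((4, 4), dtype=np.uint8); local[:, 3] = 255
    print(delete_non_travelable(bev, local))
    print(and_with_travelable_mask(bev, 255 - local))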
Alternatively, the mobile carrier may display the processed bird's eye view.
Optionally, the mobile carrier may perform obstacle recognition on the processed bird's-eye view image to obtain a dynamic obstacle recognition result of the processed bird's-eye view image.
In summary, in the method for deleting obstacles in the non-travelable area of a mobile carrier provided by this embodiment, an environment map of the current moving environment of the mobile carrier is constructed based on a simultaneous localization and mapping (SLAM) algorithm while the mobile carrier moves on a moving plane, the environment map comprising initial positioning information of the mobile carrier in an initial state and initial orientation information of the mobile carrier; real-time positioning information of the mobile carrier is acquired in real time based on the initial positioning information; real-time orientation information of the mobile carrier is acquired in real time based on the initial orientation information; a bird's-eye view corresponding to the real-time positioning information and the real-time orientation information is acquired, the bird's-eye view representing the actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, the preset spatial range comprising a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction; and the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map is deleted from the bird's-eye view to obtain a processed bird's-eye view. The problem that displaying obstacles in the non-travelable area of the mobile carrier based on a high-precision map is inefficient can thus be solved, the deployment efficiency of the SLAM map in different scenes can be improved, and environment information in the non-travelable area can be deleted without using a high-precision map.
Fig. 6 is a block diagram of an apparatus for deleting obstacles in the non-travelable area of a mobile carrier according to an embodiment of the present application. This embodiment takes the case in which the apparatus is applied to the control component of the apparatus shown in Fig. 1 as an example. The apparatus comprises at least the following modules: a map building module 610, a real-time positioning module 620, a bird's-eye view acquisition module 630 and a bird's-eye view processing module 640.
The map building module 610 is configured to construct an environment map of the current moving environment of the mobile carrier based on a simultaneous localization and mapping (SLAM) algorithm while the mobile carrier moves on a moving plane, wherein the environment map comprises initial positioning information of the mobile carrier in an initial state and initial orientation information of the mobile carrier.
The real-time positioning module 620 is configured to acquire real-time positioning information of the mobile carrier in real time based on the initial positioning information, and to acquire real-time orientation information of the mobile carrier in real time based on the initial orientation information.
The bird's-eye view acquisition module 630 is configured to acquire the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information, wherein the bird's-eye view represents the actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, and the preset spatial range comprises a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction.
The bird's-eye view processing module 640 is configured to delete, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain the processed bird's-eye view.
For relevant details reference is made to the above-described method embodiments.
It should be noted that the apparatus for deleting obstacles in the non-travelable area of a mobile carrier provided in the above embodiment is described with the division into the above functional modules only as an example. In practical applications, the above functions may be assigned to different functional modules as required, i.e. the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus provided by the above embodiment and the method embodiment for deleting obstacles in the non-travelable area of a mobile carrier belong to the same concept; the specific implementation process is detailed in the method embodiment and is not repeated here.
Fig. 7 is a block diagram of an apparatus for deleting obstacles in the non-travelable area of a mobile carrier according to an embodiment of the present application. The apparatus may be a device containing the control component 110 of the apparatus 100 shown in Fig. 1, and comprises at least a processor 701 and a memory 702.
The processor 701 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 701 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array) and a PLA (Programmable Logic Array). The processor 701 may also include a main processor and a coprocessor, where the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, and the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 701 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be shown on the display screen. In some embodiments, the processor 701 may further include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
The memory 702 may include one or more computer-readable storage media, which may be non-transitory. The memory 702 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 702 stores at least one instruction, which is executed by the processor 701 to implement the method for deleting obstacles in the non-travelable area of a mobile carrier provided by the method embodiments of this application.
In some embodiments, the apparatus for deleting obstacles in the non-travelable area of a mobile carrier may further include a peripheral interface and at least one peripheral. The processor 701, the memory 702 and the peripheral interface may be connected by a bus or signal lines, and each peripheral may be connected to the peripheral interface via a bus, a signal line or a circuit board. Illustratively, peripherals include, but are not limited to, a radio frequency circuit, a touch display screen, an audio circuit and a power supply.
Of course, the apparatus for deleting obstacles in the non-travelable area of a mobile carrier may also include fewer or more components; this embodiment is not limited in this respect.
Optionally, the present application further provides a computer-readable storage medium in which a program is stored, the program being loaded and executed by a processor to implement the method for deleting obstacles in the non-travelable area of a mobile carrier of the above method embodiment.
Optionally, the present application further provides a computer program product comprising a computer-readable storage medium in which a program is stored, the program being loaded and executed by a processor to implement the method for deleting obstacles in the non-travelable area of a mobile carrier of the above method embodiment.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of these technical features contains no contradiction, it should be considered to fall within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method for deleting obstacles in the non-travelable area of a mobile carrier, the method comprising:
constructing, while the mobile carrier moves on a moving plane, an environment map of the current moving environment of the mobile carrier based on a simultaneous localization and mapping (SLAM) algorithm, wherein the environment map comprises initial positioning information of the mobile carrier in an initial state and initial orientation information of the mobile carrier;
acquiring real-time positioning information of the mobile carrier in real time based on the initial positioning information;
acquiring real-time orientation information of the mobile carrier in real time based on the initial orientation information;
acquiring a bird's-eye view corresponding to the real-time positioning information and the real-time orientation information, wherein the bird's-eye view represents actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, and the preset spatial range comprises a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction;
and deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain a processed bird's-eye view.
2. The method of claim 1, wherein the deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map to obtain a processed bird's-eye view comprises:
determining the image intersection between the bird's-eye view and the local map, and performing a logical AND operation on the bird's-eye view and the image intersection to obtain the processed bird's-eye view;
or, alternatively,
performing a logical AND operation on the bird's-eye view and the local map to obtain the processed bird's-eye view.
3. The method of claim 1, wherein before the deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map to obtain a processed bird's-eye view, the method further comprises:
performing binarization processing on the environment map to obtain a binarized environment map, wherein the binarized environment map comprises a first pixel area and a second pixel area, the first pixel area indicating a travelable area and the second pixel area indicating the non-travelable area.
4. The method of claim 1, further comprising:
determining whether to activate a delete-static-obstacle function;
and when it is determined that the delete-static-obstacle function is activated, triggering execution of the deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain a processed bird's-eye view.
5. The method of claim 1, wherein a laser detection assembly is mounted on the mobile carrier and is used for collecting point cloud data of reflectors within the preset spatial range, and the acquiring of the bird's-eye view corresponding to the real-time positioning information and the real-time orientation information comprises:
acquiring the point cloud data collected by the laser detection assembly at the position corresponding to the real-time positioning information, wherein the point cloud data comprises three-dimensional coordinates of sampling points, sampling point density and reflected signal intensity, the three-dimensional coordinates indicating the three-dimensional position of the corresponding sampling point relative to the laser detection assembly;
projecting the sampling points onto a two-dimensional plane according to the three-dimensional coordinates, with the height direction perpendicular to the moving plane as the projection direction, to obtain the bird's-eye view, wherein the two-dimensional plane is parallel to the moving plane or the two-dimensional plane is the moving plane.
6. The method of claim 5, wherein the projecting of the sampling points onto a two-dimensional plane according to the three-dimensional coordinates, with the height direction perpendicular to the moving plane as the projection direction, to obtain the bird's-eye view comprises:
acquiring the relative positional relationship between the laser detection assembly and the mobile carrier;
converting the three-dimensional coordinates of the sampling points into a common coordinate system based on the relative positional relationship to obtain converted three-dimensional coordinates, wherein the common coordinate system is a coordinate system established based on the position of the mobile carrier;
and projecting the converted three-dimensional coordinates onto the two-dimensional plane along the projection direction to obtain the bird's-eye view.
7. The method of claim 1, wherein after the deleting, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map to obtain a processed bird's-eye view, the method further comprises:
performing obstacle recognition on the processed bird's-eye view to obtain a dynamic obstacle recognition result for the processed bird's-eye view.
8. An apparatus for deleting obstacles in the non-travelable area of a mobile carrier, the apparatus comprising:
a map building module, configured to construct an environment map of the current moving environment of the mobile carrier based on a simultaneous localization and mapping (SLAM) algorithm while the mobile carrier moves on a moving plane, wherein the environment map comprises initial positioning information of the mobile carrier in an initial state and initial orientation information of the mobile carrier;
a real-time positioning module, configured to acquire real-time positioning information of the mobile carrier in real time based on the initial positioning information, and to acquire real-time orientation information of the mobile carrier in real time based on the initial orientation information;
a bird's-eye view acquisition module, configured to acquire a bird's-eye view corresponding to the real-time positioning information and the real-time orientation information, wherein the bird's-eye view represents actual environment information within a preset spatial range determined based on the real-time positioning information and the real-time orientation information, and the preset spatial range comprises a spatial range in the traveling direction indicated by the real-time orientation information and a spatial range in the vertical direction perpendicular to the traveling direction;
and a bird's-eye view processing module, configured to delete, from the bird's-eye view, the environment information of the non-travelable area in the local map corresponding to the bird's-eye view in the environment map, to obtain a processed bird's-eye view.
9. An apparatus for deleting obstacles in the non-travelable area of a mobile carrier, comprising a processor and a memory, wherein the memory stores a program that is loaded and executed by the processor to implement the method for deleting obstacles in the non-travelable area of a mobile carrier according to any one of claims 1 to 7.
10. A computer-readable storage medium, wherein the storage medium stores a program which, when executed by a processor, implements the method for deleting obstacles in the non-travelable area of a mobile carrier according to any one of claims 1 to 7.
CN202010685784.1A 2020-07-16 2020-07-16 Method, device and medium for deleting obstacles in non-driving area of mobile carrier Pending CN111912418A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010685784.1A CN111912418A (en) 2020-07-16 2020-07-16 Method, device and medium for deleting obstacles in non-driving area of mobile carrier


Publications (1)

Publication Number Publication Date
CN111912418A 2020-11-10

Family

ID=73280393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010685784.1A Pending CN111912418A (en) 2020-07-16 2020-07-16 Method, device and medium for deleting obstacles in non-driving area of mobile carrier

Country Status (1)

Country Link
CN (1) CN111912418A (en)



Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005227947A (en) * 2004-02-12 2005-08-25 Alpine Electronics Inc Navigation device and obstacle display method
US20150307024A1 (en) * 2014-04-25 2015-10-29 Hitachi Construction Machinery Co., Ltd. Vehicle peripheral obstacle notification system
US20190291748A1 (en) * 2016-10-18 2019-09-26 Honda Motor Co., Ltd. Vehicle control device
US20200137322A1 (en) * 2018-10-26 2020-04-30 Denso Corporation Image processing apparatus
US20200174492A1 (en) * 2018-11-29 2020-06-04 Electronics And Telecommunications Research Institute Autonomous driving method and system using road view or aerial view map information
CN110614992A (en) * 2018-12-29 2019-12-27 长城汽车股份有限公司 Method and system for avoiding obstacle during automatic driving of vehicle and vehicle
CN109828592A (en) * 2019-04-22 2019-05-31 深兰人工智能芯片研究院(江苏)有限公司 A kind of method and apparatus of detection of obstacles
CN111208839A (en) * 2020-04-24 2020-05-29 清华大学 Fusion method and system of real-time perception information and automatic driving map

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG GUOLIANG, YAO ERLIANG: "SLAM and VSLAM Methods for Mobile Robots" (移动机器人的SLAM与VSLAM方法), Xi'an Jiaotong University Press, 30 September 2018, pages 2-5 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112964263A (en) * 2021-02-01 2021-06-15 杭州唯实科技有限公司 Automatic drawing establishing method and device, mobile robot and readable storage medium
CN112964263B (en) * 2021-02-01 2022-11-01 杭州荷尖科技合伙企业(有限合伙) Automatic drawing establishing method and device, mobile robot and readable storage medium
CN115032995A (en) * 2022-06-17 2022-09-09 未岚大陆(北京)科技有限公司 Motion control method, motion control device, electronic equipment and computer storage medium
US11940809B2 (en) 2022-06-17 2024-03-26 Willand (Beijing) Technology Co., Ltd. Movement control method, electronic device, and computer storage medium

Similar Documents

Publication Publication Date Title
CN111598034B (en) Obstacle detection method, obstacle detection device and storage medium
US10534091B2 (en) Method and apparatus for generating road surface, method and apparatus for processing point cloud data, computer program, and computer readable recording medium
US11709058B2 (en) Path planning method and device and mobile device
Zermas et al. Fast segmentation of 3d point clouds: A paradigm on lidar data for autonomous vehicle applications
CN108629231B (en) Obstacle detection method, apparatus, device and storage medium
CN108509820B (en) Obstacle segmentation method and device, computer equipment and readable medium
CN111932943B (en) Dynamic target detection method and device, storage medium and roadbed monitoring equipment
US10909411B2 (en) Information processing apparatus, information processing method, and computer program product
CN108470174B (en) Obstacle segmentation method and device, computer equipment and readable medium
CN112258519B (en) Automatic extraction method and device for way-giving line of road in high-precision map making
WO2024012211A1 (en) Autonomous-driving environmental perception method, medium and vehicle
CN112904369B (en) Robot repositioning method, apparatus, robot, and computer-readable storage medium
CN111912418A (en) Method, device and medium for deleting obstacles in non-driving area of mobile carrier
CN111650626B (en) Road information acquisition method, device and storage medium
CN114556442A (en) Three-dimensional point cloud segmentation method and device and movable platform
Arora et al. Static map generation from 3D LiDAR point clouds exploiting ground segmentation
CN112789521A (en) Method and device for determining perception area, storage medium and vehicle
CN114694115A (en) Road obstacle detection method, device, equipment and storage medium
CN110174115B (en) Method and device for automatically generating high-precision positioning map based on perception data
CN110390252B (en) Obstacle detection method and device based on prior map information and storage medium
CN112639822A (en) Data processing method and device
CN115115597A (en) Target detection method, device, equipment and medium
CN112651405B (en) Target detection method and device
CN114549764A (en) Obstacle identification method, device, equipment and storage medium based on unmanned vehicle
CN112199459A (en) 3D point cloud segmentation method and segmentation device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 215123 g2-1901 / 1902 / 2002, No. 88, Jinjihu Avenue, Suzhou Industrial Park, Suzhou City, Jiangsu Province

Applicant after: Zhixing Automotive Technology (Suzhou) Co.,Ltd.

Address before: 215123 g2-1901 / 1902 / 2002, No. 88, Jinjihu Avenue, Suzhou Industrial Park, Suzhou City, Jiangsu Province

Applicant before: IMOTION AUTOMOTIVE TECHNOLOGY (SUZHOU) Co.,Ltd.