CN113109833A - Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar - Google Patents

Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar Download PDF

Info

Publication number
CN113109833A
CN113109833A (application CN202110362110.2A)
Authority
CN
China
Prior art keywords
resolution
imaging
dimensional
visible light
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110362110.2A
Other languages
Chinese (zh)
Inventor
唐鸣元
崔焕�
徐辰宇
李国梁
鲍春
张镐宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Liming Intelligent Technology Co ltd
Original Assignee
Beijing Liming Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Liming Intelligent Technology Co ltd filed Critical Beijing Liming Intelligent Technology Co ltd
Priority to CN202110362110.2A priority Critical patent/CN113109833A/en
Publication of CN113109833A publication Critical patent/CN113109833A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4007Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4053Scaling of whole images or parts thereof, e.g. expanding or contracting based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10052Images from lightfield camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the technical field of photoelectric imaging and discloses a bionic three-dimensional imaging system based on the fusion of visible light and laser radar, which comprises an acquisition module, a processing module and a display module, the acquisition module comprising a high-resolution visible light imaging module, a low-resolution visible light imaging module and a laser radar imaging module. A bionic three-dimensional imaging method based on the fusion of visible light and laser radar comprises the following steps: calibrating the parameters of the imaging modules in advance according to the target scene and actual requirements; designing a bionic structure model based on those parameters; and transmitting the finally obtained image data to an output device for real-time three-dimensional scene display. Through a spatially variable-resolution imaging mode designed around a composite bionic structure, the invention realizes central super-resolution and peripheral low-resolution imaging, compresses the data redundancy of peripheral regions, combines a large field of view, high resolution and real-time performance, and can sense the imaging range accurately, comprehensively and in real time.

Description

Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar
Technical Field
The invention relates to the technical field of photoelectric imaging, in particular to a bionic three-dimensional imaging system and method based on fusion of visible light and a laser radar.
Background
Optical imaging, particularly in the visible band, can reconstruct the two-dimensional plane information of a target by efficiently capturing the light reflected from its surface. However, without multi-view imaging or other auxiliary methods such as binocular vision, conventional optical imaging cannot acquire the depth or distance information of a target and therefore cannot produce a stereoscopic three-dimensional image of it. The laser radar is a common means of acquiring three-dimensional depth information: it obtains the three-dimensional structure of a target by analyzing the changes of the laser echo reflected from the target in three-dimensional space. Fusing conventional two-dimensional optical imaging with laser radar depth information therefore yields a high-precision, high-quality three-dimensional target imaging method. This technique has already found a variety of applications. In the field of robots, the three-dimensional information acquired by the laser radar gives a robot the ability to complete complex behaviours such as obstacle avoidance and tracking, while visible light imaging allows it to complete precise tasks such as target recognition and matching; through the fusion of visible light and laser radar, the robot can complete various tasks in complex scenes. In driver assistance, the laser radar acquires road information so that the driving state of the vehicle, the traffic conditions and the position of the automobile can be sensed in real time and possible driving problems fed back promptly, while visible light imaging of the surroundings of the automobile helps the driver control the driving situation, reduces visual blind areas and safeguards driving safety.
The fusion of visible light and laser radar requires a large amount of data computation, and the amount of data grows as the field of view increases, which places high demands on the memory and algorithmic efficiency of the data processor. At present there is no good solution that balances field of view against data volume; the data-processing time therefore increases and real-time operation becomes harder to guarantee, so reconciling a large field of view with real-time performance has become the key difficulty of visible light and laser radar fusion perception. In short, although the fusion of visible light and laser radar provides ample information for sensing the environment, it remains difficult to obtain a large field of view, high resolution and real-time performance at the same time.
Chinese patent CN108803228B discloses a bionic-camera three-dimensional imaging system comprising an imaging unit composed of lenses, photoreceptors and optical fibers arranged according to a preset rule. The imaging unit provides at least two imaging surfaces, acquires data on several planes simultaneously, and combines the images of these surfaces in three-dimensional space to realize three-dimensional imaging, thereby overcoming the limitation of existing imaging systems that can only image on one plane. However, that system still has limitations: it is difficult for it to support a large field of view, and it is difficult for it to achieve both high resolution and real-time performance.
An effective solution to the problems in the related art has not been proposed yet.
Disclosure of Invention
Aiming at the problems in the related art, the invention provides a bionic three-dimensional imaging system and method based on the fusion of visible light and laser radar, which address the current difficulty of reconciling a large field of view, high resolution and real-time performance.
The technical scheme of the invention is realized as follows:
in one aspect of the present invention, there is provided:
a bionic three-dimensional imaging system based on fusion of visible light and a laser radar comprises an acquisition module, a processing module and a display module, wherein the display module and the acquisition module are respectively connected with the processing module;
the acquisition module comprises a high-resolution visible light imaging module, a low-resolution visible light imaging module and a laser radar imaging module;
furthermore, the high-resolution visible light imaging modules are at least two groups, and the low-resolution visible light imaging modules are at least two groups.
Furthermore, the high-resolution visible light imaging modules are arranged on the two sides of the laser radar imaging module, and the low-resolution visible light imaging modules are located on the two sides of the high-resolution visible light imaging modules.
furthermore, the high-resolution visible light imaging module, the low-resolution visible light imaging module and the laser radar imaging module are arranged in a curved surface form.
Furthermore, the included angle α1 between the high-resolution visible light imaging module and the laser radar imaging module is 30°, and the included angle α2 between the high-resolution visible light imaging module and the low-resolution visible light imaging module is 30°.
In another aspect of the present invention, there is provided:
a bionic three-dimensional imaging method based on fusion of visible light and laser radar is used for an imaging method of a bionic three-dimensional imaging system based on fusion of visible light and laser radar, and comprises the following steps:
calibrating parameters of the imaging module based on a target scene and actual requirements in advance;
designing a bionic structure model based on the parameters of the imaging module;
matching laser radar imaging parameters and designing a corresponding laser radar system;
two-dimensional imaging with variable resolution and large field of view and acquiring depth information in the large field of view;
carrying out data processing through an FPGA (field programmable gate array) so as to carry out image two-dimensional and three-dimensional information fusion within a field range;
and transmitting the finally obtained image data to output equipment for real-time three-dimensional scene display.
The bionic structure model design based on the imaging module parameters comprises the following steps:
high-resolution cameras are symmetrically arranged on two sides of a central laser radar system in advance, and low-resolution cameras are arranged on two sides of the high-resolution cameras at the same time;
calibrating the value of the arrangement included angle α1 between a high-resolution camera and the central laser radar system;
calibrating the value of the included angle α2 between a high-resolution camera and a low-resolution camera.
The variable-resolution large-field two-dimensional imaging and the acquisition of depth information in the large field of view comprise the following steps:
acquiring target two-dimensional image information in a field of view, wherein the target two-dimensional image information comprises two high-resolution images and two low-resolution images which retain certain overlapping areas;
the relative positions of the four images are calculated to carry out image splicing, and the super-resolution calculation is carried out on the area in the overlapping part of the two high-resolution images by a bilinear interpolation method, so that the super-resolution imaging effect is achieved;
obtaining a variable-resolution two-dimensional imaging image meeting the requirement of a field range, and obtaining three-dimensional depth information in the field of view based on a finished laser radar imaging system;
the method for fusing two-dimensional and three-dimensional information of the image in the field of view comprises the following steps:
acquiring a two-dimensional image with space variable resolution and a three-dimensional image meeting the field of view requirement;
acquiring a relative position relation between two images, and finding three-dimensional depth information of each point in a two-dimensional image;
sampling the spatially variable-resolution large-field two-dimensional image by the annular sampling mode of the bionic visual mechanism, giving corresponding three-dimensional depth information to each sampled point, achieving the fusion of two-dimensional and three-dimensional information, and obtaining a variable-resolution large-field three-dimensional image with two-dimensional color information.
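Read together, the steps above form a single processing pipeline running from module calibration to real-time display. The Python sketch below is only a schematic of that data path: the parameter values echo the embodiment described later, and the four callables are placeholders for the acquisition, laser radar, fusion and display stages, which the text names but does not disclose as code.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BionicLayout:
    high_res: tuple = (1280, 960)    # example module resolutions from the embodiment
    low_res: tuple = (800, 600)
    alpha1_deg: float = 30.0         # high-resolution camera vs. central laser radar
    alpha2_deg: float = 30.0         # low-resolution vs. high-resolution camera

def run_frame(capture_2d, scan_lidar, fuse_2d_3d, display, layout=BionicLayout()):
    """One pass of the imaging pipeline; each callable stands in for a stage of the method."""
    image_2d = capture_2d(layout)            # variable-resolution, stitched 2D image
    depth_3d = scan_lidar(layout)            # depth information over the same field of view
    fused = fuse_2d_3d(image_2d, depth_3d)   # FPGA-side 2D/3D information fusion
    display(fused)                           # real-time three-dimensional scene display
    return fused
```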
The invention has the beneficial effects that:
1. The bionic three-dimensional imaging system and method based on the fusion of visible light and laser radar realize central super-resolution and peripheral low-resolution imaging through a spatially variable-resolution imaging mode designed around a composite bionic structure, compress the data redundancy of peripheral areas while keeping the central foveal target clearly imaged, combine a large field of view, high resolution and real-time performance, and can sense the imaging range accurately, comprehensively and in real time;
2. according to the bionic three-dimensional imaging system and method based on fusion of visible light and the laser radar, through the annular imaging characteristic of a bionic visual mechanism, the two-dimensional color information of the visible light wave band and the depth information of the laser radar are fused in a space variable resolution sampling mode, so that the target in an imaging range is quickly and accurately sensed, and the imaging target information is comprehensively acquired;
3. the bionic three-dimensional imaging system and method based on the fusion of the visible light and the laser radar have the advantages of small volume and large field of view compared with the traditional arrangement mode of the planar camera through the arrangement of the curved surface array, and can be flexibly applied to various complex scenes.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without creative efforts.
FIG. 1 is a schematic block diagram of a bionic three-dimensional imaging system based on visible light and laser radar fusion according to an embodiment of the invention;
in the figure: 1. an acquisition module; 2. a processing module; 3. a display module; 4. a high-resolution visible light imaging module; 5. a low-resolution visible light imaging module; 6. a laser radar imaging module.
FIG. 2 is a schematic flow chart of a bionic three-dimensional imaging method based on fusion of visible light and a laser radar according to an embodiment of the invention;
FIG. 3 is a first scene schematic diagram of a bionic three-dimensional imaging method based on the fusion of visible light and a laser radar according to an embodiment of the present invention;
in the figure: a. d is a background image acquired by the low-resolution visible light imaging module; b. c is a target image acquired by the high-resolution visible light imaging module; e is a visible light imaging module fused image; f is a depth map of the image acquired by the laser radar after preliminary processing.
Fig. 4 is a scene schematic diagram ii of a bionic three-dimensional imaging method based on fusion of visible light and a laser radar according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present invention.
According to one embodiment of the invention, a bionic three-dimensional imaging system based on fusion of visible light and laser radar is provided.
As shown in fig. 1, the bionic three-dimensional imaging system based on the fusion of visible light and laser radar according to the embodiment of the present invention includes an acquisition module 1, a processing module 2 and a display module 3, wherein the display module 3 and the acquisition module 1 are respectively connected to the processing module 2;
the acquisition module 1 comprises a high-resolution visible light imaging module 4, a low-resolution visible light imaging module 5 and a laser radar imaging module 6;
the high-resolution visible light imaging modules 4 are at least two groups, and the low-resolution visible light imaging modules 5 are at least two groups.
The high-resolution visible light imaging modules 4 are arranged on the two sides of the laser radar imaging module 6, and the low-resolution visible light imaging modules 5 are located on the two sides of the high-resolution visible light imaging modules 4.
The high-resolution visible light imaging modules 4, the low-resolution visible light imaging modules 5 and the laser radar imaging module 6 are arranged in a curved-surface form.
The included angle α1 between the high-resolution visible light imaging module 4 and the laser radar imaging module 6 is 30°; the included angle α2 between the high-resolution visible light imaging module 4 and the low-resolution visible light imaging module 5 is 30°.
In addition, the processing module 2 is a computing unit built around an FPGA: the data acquired by the acquisition module 1 are integrated by computation and transmitted promptly to the display module 3, whose main purpose is to display the finished image information in real time for the operator to observe.
In addition, the acquisition module 1 is the key imaging component. Following the bionic vision mechanism, the high-resolution visible light imaging modules 4 are arranged on the two sides of the laser radar imaging module 6, the low-resolution visible light imaging modules 5 are located on the two sides of the high-resolution visible light imaging modules 4, and all the modules are arranged in a curved-surface form. Super-resolution imaging is performed on the foveal region whose field of view coincides with that of the laser radar imaging module 6, achieving a spatially variable-resolution imaging effect with multiple resolutions. The laser radar imaging system is located at the center of the whole curved surface to acquire the three-dimensional information of the target scene, and the whole system takes the form of a symmetrically distributed curved-surface array.
Further, when α1 is increased, the overlapping area of the two high-resolution imaging modules is reduced, which suits situations where the central target is small but a large overall field of view is required; when α1 is decreased, the overlapping area of the two imaging modules is increased, which suits situations where the central target is large; when α2 is decreased, the inclination angle of the low-resolution imaging modules is reduced and their ability to complement the peripheral field of view diminishes, so the overall imaging field of view of the system shrinks. For a 120° imaging field of view, α1 and α2 are preferably 30° and 30°, respectively.
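The trade-off described in this paragraph can be made concrete for idealized cameras that approximately share a projection center: the angular overlap of the two high-resolution modules is their common field of view minus the angle between their optical axes. The per-module field of view used below is an assumed value, since the patent does not state it; the sketch only demonstrates the trend, not the exact design figures.

```python
FOV_HIGH_DEG = 60.0   # assumed horizontal field of view of one high-resolution module

def high_res_overlap_deg(alpha1_deg: float, fov_deg: float = FOV_HIGH_DEG) -> float:
    """Overlap of the two high-resolution cameras whose optical axes sit at +/- alpha1."""
    return max(0.0, fov_deg - 2.0 * alpha1_deg)

for a1 in (20.0, 30.0, 40.0):
    print(f"alpha1 = {a1:4.1f} deg -> central (super-resolved) overlap = {high_res_overlap_deg(a1):4.1f} deg")
# A larger alpha1 widens the covered field but shrinks the overlap available for
# super-resolution, which is exactly the trade-off stated above.
```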
By means of this technical scheme, the curved-surface array arrangement offers a smaller size and a larger field of view than the traditional planar camera arrangement and can be applied flexibly to various complex scenes; the spatially variable-resolution imaging mode designed around a composite bionic structure realizes central super-resolution and peripheral low-resolution imaging, compresses the data redundancy of peripheral areas while keeping the central foveal target clearly imaged, combines a large field of view, high resolution and real-time performance, and enables real-time, accurate and comprehensive perception of the imaging range.
According to another embodiment of the invention, a bionic three-dimensional imaging method based on fusion of visible light and laser radar is provided.
As shown in fig. 2, the bionic three-dimensional imaging method based on the fusion of visible light and lidar according to the embodiment of the present invention is an imaging method for a bionic three-dimensional imaging system based on the fusion of visible light and lidar, and includes the following steps:
calibrating parameters of the imaging module based on a target scene and actual requirements in advance;
designing a bionic structure model based on the parameters of the imaging module;
matching laser radar imaging parameters and designing a corresponding laser radar system;
two-dimensional imaging with variable resolution and large field of view and acquiring depth information in the large field of view;
carrying out data processing through an FPGA (field programmable gate array) so as to carry out image two-dimensional and three-dimensional information fusion within a field range;
and transmitting the finally obtained image data to output equipment for real-time three-dimensional scene display.
The bionic structure model design based on the imaging module parameters comprises the following steps:
high-resolution cameras are symmetrically arranged on two sides of a central laser radar system in advance, and low-resolution cameras are arranged on two sides of the high-resolution cameras at the same time;
calibrating the value of the arrangement included angle α1 between a high-resolution camera and the central laser radar system;
calibrating the value of the included angle α2 between a high-resolution camera and a low-resolution camera.
The variable-resolution large-field two-dimensional imaging and the acquisition of depth information in the large field of view comprise the following steps:
acquiring target two-dimensional image information in a field of view, wherein the target two-dimensional image information comprises two high-resolution images and two low-resolution images which retain certain overlapping areas;
the relative positions of the four images are calculated to carry out image splicing, and the super-resolution calculation is carried out on the area in the overlapping part of the two high-resolution images by a bilinear interpolation method, so that the super-resolution imaging effect is achieved;
obtaining a variable-resolution two-dimensional imaging image meeting the requirement of a field range, and obtaining three-dimensional depth information in the field of view based on a finished laser radar imaging system;
the method for fusing two-dimensional and three-dimensional information of the image in the field of view comprises the following steps:
acquiring a two-dimensional image with space variable resolution and a three-dimensional image meeting the field of view requirement;
acquiring a relative position relation between two images, and finding three-dimensional depth information of each point in a two-dimensional image;
sampling the spatially variable-resolution large-field two-dimensional image by the annular sampling mode of the bionic visual mechanism, giving corresponding three-dimensional depth information to each sampled point, achieving the fusion of two-dimensional and three-dimensional information, and obtaining a variable-resolution large-field three-dimensional image with two-dimensional color information.
Specifically, as shown in fig. 2 to 4, when applied, the method includes the following steps:
The method comprises the following steps. Step one: select the parameters of the imaging modules according to the target scene and the actual requirements so as to ensure timely and stable information transmission and data processing. For example, a high-resolution imaging module with a resolution of 1280 × 960, a focal length of 2.96 mm and a volume of 25 × 24 × 20 mm³ can be adopted; the low-resolution imaging module has a resolution of 800 × 600, a focal length of 4.2 mm and a volume of 32 × 32 × 30 mm³; data are transmitted over a gigabit network cable.
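To give a feel for the data volumes these example parameters imply, the sketch below tallies the raw frame size of the four visible light modules and checks it against the gigabit link mentioned above. It is only an illustrative calculation; the 8-bit RGB assumption and the Python container class are not part of the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CameraModule:
    resolution: tuple      # (width, height) in pixels
    focal_length_mm: float
    volume_mm: tuple       # (x, y, z) envelope

# Example values quoted in step one.
HIGH_RES = CameraModule((1280, 960), 2.96, (25, 24, 20))
LOW_RES = CameraModule((800, 600), 4.2, (32, 32, 30))

def frame_bytes(cam: CameraModule, bytes_per_pixel: int = 3) -> int:
    """Raw bytes per frame, assuming 8-bit RGB (an assumption, not stated in the text)."""
    w, h = cam.resolution
    return w * h * bytes_per_pixel

total = 2 * frame_bytes(HIGH_RES) + 2 * frame_bytes(LOW_RES)   # two modules of each kind
gigabit_bytes_per_s = 1e9 / 8
print(f"raw 2D data per capture: {total / 1e6:.1f} MB")                               # ~10.3 MB
print(f"upper bound on frame rate over gigabit Ethernet: {gigabit_bytes_per_s / total:.0f} fps")
```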
Step two: design the bionic structure model according to the selected imaging modules. First, the high-resolution cameras are arranged symmetrically about the central position, and the low-resolution cameras are arranged on the two sides, so that the imaging resolution in the middle is higher than at the edges. Because the cameras on the two sides are arranged symmetrically, the arrangement included angle α1 between a high-resolution camera and the central laser radar system and the included angle α2 between a high-resolution camera and a low-resolution camera can both be set to 30°, meeting the practical requirement of a 120° imaging field of view.
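The symmetric arrangement described in step two fixes the optical-axis direction of every module once α1 and α2 are chosen. The short sketch below simply enumerates those axis angles for α1 = α2 = 30°; the module labels follow the reference numerals of Fig. 1, and the listing itself is only illustrative.

```python
ALPHA1_DEG = 30.0   # high-resolution camera vs. central laser radar
ALPHA2_DEG = 30.0   # low-resolution camera vs. adjacent high-resolution camera

curved_array = [
    ("low-res camera (5)", -(ALPHA1_DEG + ALPHA2_DEG)),   # -60 deg
    ("high-res camera (4)", -ALPHA1_DEG),                  # -30 deg
    ("laser radar (6)", 0.0),                              # array center
    ("high-res camera (4)", +ALPHA1_DEG),                  # +30 deg
    ("low-res camera (5)", +(ALPHA1_DEG + ALPHA2_DEG)),    # +60 deg
]
for name, axis_deg in curved_array:
    print(f"{name:20s} optical axis at {axis_deg:+5.1f} deg")
```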
Step three: match the laser radar imaging parameters and design the corresponding laser radar system. A 4 × 4 area-array laser radar with a volume of 50 × 40 mm³ is designed according to the size requirements of the overall structural system and is placed at the central part of the whole imaging system. Meanwhile, according to the three-dimensional imaging parameters required, the required field of view of 120° × 40° is covered by line scanning.
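One way to picture the line scanning of step three is to count how many positions of the 4 × 4 area array are needed to tile the 120° × 40° field. The per-shot angular footprint used below is an assumed value (it is not given in the text), so the sketch only illustrates the book-keeping, not the disclosed scan plan.

```python
import math

SHOT_FOV_H_DEG, SHOT_FOV_V_DEG = 8.0, 8.0   # assumed footprint of one 4x4 flash (not from the patent)
FIELD_H_DEG, FIELD_V_DEG = 120.0, 40.0      # required field of view from step three

cols = math.ceil(FIELD_H_DEG / SHOT_FOV_H_DEG)   # scan positions per line
rows = math.ceil(FIELD_V_DEG / SHOT_FOV_V_DEG)   # number of scan lines
print(f"{rows} lines x {cols} positions = {rows * cols} shots, "
      f"{rows * cols * 16} range samples per sweep")   # 16 pixels per 4x4 shot
```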
Step four: perform variable-resolution large-field two-dimensional imaging and acquire depth information in the large field of view. Target two-dimensional image information in the field of view is acquired through the visible light imaging modules designed in step two; at this point, two high-resolution images and two low-resolution images that retain a certain overlapping area are obtained. Image splicing is carried out by calculating the relative positions of the four images, and super-resolution calculation is performed on the region where the two high-resolution images overlap by a bilinear interpolation method to achieve a super-resolution imaging effect. In this way, the variable-resolution two-dimensional image meeting the required field of view is obtained from the four visible light camera imaging modules. Meanwhile, the three-dimensional depth information in the field of view is obtained through the laser radar imaging system designed in step three.
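As a concrete illustration of the bilinear-interpolation super-resolution named in step four, the numpy sketch below upsamples the registered overlap crops of the two high-resolution images and averages them. It is a minimal sketch under assumed inputs: the crop coordinates are taken to be known from the image-splicing step, and averaging the two upsampled crops is one simple choice, not the patent's prescribed algorithm.

```python
import numpy as np

def bilinear_upsample(img: np.ndarray, scale: int = 2) -> np.ndarray:
    """Bilinear interpolation of an HxW or HxWxC image by an integer factor."""
    h, w = img.shape[:2]
    ys = np.linspace(0, h - 1, h * scale)
    xs = np.linspace(0, w - 1, w * scale)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    if img.ndim == 3:                       # broadcast the weights over color channels
        wy, wx = wy[..., None], wx[..., None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Usage on stand-in data: the registered overlap crops of the two high-resolution images.
left_overlap = np.random.rand(240, 320, 3)
right_overlap = np.random.rand(240, 320, 3)
fovea = 0.5 * (bilinear_upsample(left_overlap) + bilinear_upsample(right_overlap))
print(fovea.shape)   # (480, 640, 3): the super-resolved central (foveal) region
```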
Step five: perform data processing through the FPGA so as to fuse the two-dimensional and three-dimensional image information within the field of view. In step four, a two-dimensional image with spatially varying resolution and a three-dimensional image satisfying the field-of-view requirements were acquired. The relative position relation between the two images can be calculated by exploiting the salient features of the three-dimensional depth information, so that the three-dimensional depth information of each point in the two-dimensional image is found. The spatially variable-resolution large-field two-dimensional image is then sampled through the annular sampling mode of the bionic visual mechanism, corresponding three-dimensional depth information is given to each sampled point, the fusion of two-dimensional and three-dimensional information is achieved, and a variable-resolution large-field three-dimensional image carrying two-dimensional color information is formed.
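The sketch below illustrates the two operations described in step five under a standard pinhole camera model: projecting the laser radar points into the two-dimensional image to obtain per-pixel depth, and then sampling the image on rings whose spacing grows toward the periphery. The calibration matrices, ring parameters and synthetic point cloud are all assumptions made for the example; the patent does not disclose a specific implementation.

```python
import numpy as np

def project_lidar_to_image(points_xyz, K, R, t, image_shape):
    """Project lidar points (N, 3) through extrinsics (R, t) and intrinsics K
    into the camera, returning a sparse depth map aligned with the 2D image."""
    cam = points_xyz @ R.T + t                    # lidar frame -> camera frame
    cam = cam[cam[:, 2] > 0]                      # keep points in front of the camera
    uv = cam @ K.T
    uv = uv[:, :2] / uv[:, 2:3]                   # perspective division
    h, w = image_shape
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    depth = np.zeros((h, w))
    depth[v[ok], u[ok]] = cam[ok, 2]              # z in the camera frame
    return depth

def annular_samples(image_shape, n_rings=16, pts_per_ring=64, growth=1.25):
    """Foveated ring sampling: ring radius grows geometrically, so the sample
    density falls toward the periphery, loosely mimicking retinal sampling."""
    h, w = image_shape
    cy, cx = h / 2.0, w / 2.0
    radius, coords = 4.0, []
    for _ in range(n_rings):
        angles = np.linspace(0.0, 2.0 * np.pi, pts_per_ring, endpoint=False)
        ys = np.clip(cy + radius * np.sin(angles), 0, h - 1).astype(int)
        xs = np.clip(cx + radius * np.cos(angles), 0, w - 1).astype(int)
        coords.append(np.stack([ys, xs], axis=1))
        radius *= growth                          # wider ring spacing -> coarser periphery
    return np.concatenate(coords)

# Example with hypothetical calibration values and a synthetic point cloud.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
cloud = np.random.uniform([-2, -2, 1], [2, 2, 10], size=(500, 3))
sparse_depth = project_lidar_to_image(cloud, K, R, t, (480, 640))
samples = annular_samples((480, 640))
fused = [(y, x, sparse_depth[y, x]) for y, x in samples]   # (pixel, depth) triples
```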
Step six: and transmitting the finally obtained image data to output equipment for real-time three-dimensional scene display.
By means of this scheme, an image containing the fused two-dimensional and three-dimensional information of the target scene is obtained with a multi-camera curved-surface array; the data redundancy of the peripheral area is compressed while the central foveal target is clearly imaged; a large field of view, high resolution and real-time performance are achieved together, and the imaging range can be sensed accurately, comprehensively and in real time. The device is also small in volume, offers a large field of view, and can be applied flexibly to various complex scenes.
In summary, with the above technical solution of the present invention, the following effects can be achieved:
1. The bionic three-dimensional imaging system and method based on the fusion of visible light and laser radar realize central super-resolution and peripheral low-resolution imaging through a spatially variable-resolution imaging mode designed around a composite bionic structure, compress the data redundancy of peripheral areas while keeping the central foveal target clearly imaged, combine a large field of view, high resolution and real-time performance, and can sense the imaging range accurately, comprehensively and in real time;
2. according to the bionic three-dimensional imaging system and method based on fusion of visible light and the laser radar, through the annular imaging characteristic of a bionic visual mechanism, the two-dimensional color information of the visible light wave band and the depth information of the laser radar are fused in a space variable resolution sampling mode, so that the target in an imaging range is quickly and accurately sensed, and the imaging target information is comprehensively acquired;
3. the bionic three-dimensional imaging system and method based on the fusion of the visible light and the laser radar have the advantages of small volume and large field of view compared with the traditional arrangement mode of the planar camera through the arrangement of the curved surface array, and can be flexibly applied to various complex scenes.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. A bionic three-dimensional imaging system based on fusion of visible light and a laser radar is characterized by comprising an acquisition module (1), a processing module (2) and a display module (3), wherein the display module (3) and the acquisition module (1) are respectively connected with the processing module (2);
the acquisition module (1) comprises a high-resolution visible light imaging module (4), a low-resolution visible light imaging module (5) and a laser radar imaging module (6).
2. The bionic three-dimensional imaging system based on the fusion of visible light and laser radar as claimed in claim 1, wherein the high-resolution visible light imaging modules (4) are at least two groups, and the low-resolution visible light imaging modules (5) are at least two groups.
3. The bionic three-dimensional imaging system based on visible light and lidar fusion of claim 2, wherein the high-resolution visible light imaging modules (4) are arranged on two sides of the lidar imaging module (6), and the low-resolution visible light imaging modules (5) are arranged on two sides of the high-resolution visible light imaging modules (4).
4. The bionic three-dimensional imaging system based on the fusion of visible light and laser radar as claimed in claim 3, wherein the high-resolution visible light imaging module (4), the low-resolution visible light imaging module (5) and the laser radar imaging module (6) are arranged in a curved surface form.
5. The bionic three-dimensional imaging system based on the fusion of visible light and laser radar according to claim 4, characterized in that the included angle α1 between the high-resolution visible light imaging module (4) and the laser radar imaging module (6) is 30°, and the included angle α2 between the high-resolution visible light imaging module (4) and the low-resolution visible light imaging module (5) is 30°.
6. A bionic three-dimensional imaging method based on the fusion of visible light and laser radar, characterized in that the method is used for the bionic three-dimensional imaging system based on the fusion of visible light and laser radar according to any one of claims 1 to 5, and comprises the following steps:
calibrating parameters of the imaging module based on a target scene and actual requirements in advance;
designing a bionic structure model based on the parameters of the imaging module;
matching laser radar imaging parameters and designing a corresponding laser radar system;
two-dimensional imaging with variable resolution and large field of view and acquiring depth information in the large field of view;
carrying out data processing through an FPGA (field programmable gate array) so as to carry out image two-dimensional and three-dimensional information fusion within a field range;
and transmitting the finally obtained image data to output equipment for real-time three-dimensional scene display.
7. The bionic three-dimensional imaging method based on the fusion of the visible light and the laser radar, according to claim 6, is characterized in that the bionic structure model design based on the parameters of the imaging module comprises the following steps:
high-resolution cameras are symmetrically arranged on two sides of a central laser radar system in advance, and low-resolution cameras are arranged on two sides of the high-resolution cameras at the same time;
calibrating the value of the arrangement included angle α1 between a high-resolution camera and the central laser radar system;
calibrating the value of the included angle α2 between a high-resolution camera and a low-resolution camera.
8. The bionic three-dimensional imaging method based on the fusion of the visible light and the laser radar, according to claim 7, is characterized in that the variable-resolution large-field two-dimensional imaging and the acquisition of depth information in the large field of view comprise the following steps:
acquiring target two-dimensional image information in a field of view, wherein the target two-dimensional image information comprises two high-resolution images and two low-resolution images which retain certain overlapping areas;
the relative positions of the four images are calculated to carry out image splicing, and the super-resolution calculation is carried out on the area in the overlapping part of the two high-resolution images by a bilinear interpolation method, so that the super-resolution imaging effect is achieved;
and obtaining a variable-resolution two-dimensional imaging image meeting the requirement of the field range, and obtaining three-dimensional depth information in the field based on the finished laser radar imaging system.
9. The bionic three-dimensional imaging method based on the fusion of the visible light and the laser radar, according to claim 8, is characterized in that the two-dimensional and three-dimensional information fusion of the image in the field of view comprises the following steps:
acquiring a two-dimensional image with space variable resolution and a three-dimensional image meeting the field of view requirement;
acquiring a relative position relation between two images, and finding three-dimensional depth information of each point in a two-dimensional image;
sampling the spatially variable-resolution large-field two-dimensional image by the annular sampling mode of the bionic visual mechanism, giving corresponding three-dimensional depth information to each sampled point, achieving the fusion of two-dimensional and three-dimensional information, and obtaining a variable-resolution large-field three-dimensional image with two-dimensional color information.
CN202110362110.2A 2021-04-02 2021-04-02 Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar Pending CN113109833A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110362110.2A CN113109833A (en) 2021-04-02 2021-04-02 Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110362110.2A CN113109833A (en) 2021-04-02 2021-04-02 Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar

Publications (1)

Publication Number Publication Date
CN113109833A true CN113109833A (en) 2021-07-13

Family

ID=76713579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110362110.2A Pending CN113109833A (en) 2021-04-02 2021-04-02 Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar

Country Status (1)

Country Link
CN (1) CN113109833A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114638942A (en) * 2022-03-16 2022-06-17 北京理工大学 Heterogeneous variable-resolution-ratio point cloud imaging visualization method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104574338A (en) * 2015-01-26 2015-04-29 西安交通大学 Remote sensing image super-resolution reconstruction method based on multi-angle linear array CCD sensors
US20150281601A1 (en) * 2014-03-25 2015-10-01 INVIS Technologies Corporation Modular Packaging and Optical System for Multi-Aperture and Multi-Spectral Camera Core
CN108828606A (en) * 2018-03-22 2018-11-16 中国科学院西安光学精密机械研究所 Laser radar and binocular visible light camera-based combined measurement method
CN108932475A (en) * 2018-05-31 2018-12-04 中国科学院西安光学精密机械研究所 Three-dimensional target identification system and method based on laser radar and monocular vision
CN109375191A (en) * 2018-09-18 2019-02-22 南京航空航天大学 Concurrent irradiation source 3D laser radar and 2D detector superspace resolution information acquisition methods and device
CN110426762A (en) * 2019-08-02 2019-11-08 北京理工大学 A kind of parallel type bionic compound eyes nest area's imaging method and system
CN110595624A (en) * 2019-09-17 2019-12-20 北京理工大学 Cross-shaped four-aperture view field partially-overlapped heat-generation-simulating imaging system
CN110595625A (en) * 2019-09-17 2019-12-20 北京理工大学 Cross-shaped five-aperture view field partially-overlapped bionic thermal imaging system
CN111292376A (en) * 2020-02-13 2020-06-16 北京理工大学 Visual target tracking method of bionic retina
CN111928775A (en) * 2020-06-28 2020-11-13 深圳市今朝智能有限公司 Target tracking measurement method based on combination of camera and laser radar

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150281601A1 (en) * 2014-03-25 2015-10-01 INVIS Technologies Corporation Modular Packaging and Optical System for Multi-Aperture and Multi-Spectral Camera Core
CN104574338A (en) * 2015-01-26 2015-04-29 西安交通大学 Remote sensing image super-resolution reconstruction method based on multi-angle linear array CCD sensors
CN108828606A (en) * 2018-03-22 2018-11-16 中国科学院西安光学精密机械研究所 Laser radar and binocular visible light camera-based combined measurement method
CN108932475A (en) * 2018-05-31 2018-12-04 中国科学院西安光学精密机械研究所 Three-dimensional target identification system and method based on laser radar and monocular vision
CN109375191A (en) * 2018-09-18 2019-02-22 南京航空航天大学 Concurrent irradiation source 3D laser radar and 2D detector superspace resolution information acquisition methods and device
CN110426762A (en) * 2019-08-02 2019-11-08 北京理工大学 A kind of parallel type bionic compound eyes nest area's imaging method and system
CN110595624A (en) * 2019-09-17 2019-12-20 北京理工大学 Cross-shaped four-aperture view field partially-overlapped heat-generation-simulating imaging system
CN110595625A (en) * 2019-09-17 2019-12-20 北京理工大学 Cross-shaped five-aperture view field partially-overlapped bionic thermal imaging system
CN111292376A (en) * 2020-02-13 2020-06-16 北京理工大学 Visual target tracking method of bionic retina
CN111928775A (en) * 2020-06-28 2020-11-13 深圳市今朝智能有限公司 Target tracking measurement method based on combination of camera and laser radar

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
曹杰等: "曲面相机阵列多分辨成像方法", 《光子学报》 *
贾永红 等: "《数字图像处理技巧》", 31 January 2017 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114638942A (en) * 2022-03-16 2022-06-17 北京理工大学 Heterogeneous variable-resolution-ratio point cloud imaging visualization method

Similar Documents

Publication Publication Date Title
EP3144880B1 (en) A method and an apparatus for generating data representative of a light field
US20060018509A1 (en) Image generation device
US6304285B1 (en) Method and apparatus for omnidirectional imaging
US6744569B2 (en) Method and apparatus for omnidirectional three dimensional imaging
KR100882011B1 (en) Methods of obtaining panoramic images using rotationally symmetric wide-angle lenses and devices thereof
Varga et al. Super-sensor for 360-degree environment perception: Point cloud segmentation using image features
JP4825971B2 (en) Distance calculation device, distance calculation method, structure analysis device, and structure analysis method.
CN116685873A (en) Vehicle-road cooperation-oriented perception information fusion representation and target detection method
US20040125228A1 (en) Apparatus and method for determining the range of remote objects
CN103471715A (en) Common optical path combined optical field spectral imaging method and device
CN101546111A (en) Method for twin-lens wide baseline catadioptric omnidirectional stereo imaging by using single camera and device thereof
CN108881717B (en) Depth imaging method and system
CN108924408B (en) Depth imaging method and system
CN109087395B (en) Three-dimensional reconstruction method and system
JP2004258266A (en) Stereoscopic adapter and distance image input device using the same
DE112017003815T5 (en) IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD
CN104406539A (en) All-weather active type panoramic sensing device and 3D (three dimensional) panoramic modeling approach
EP0897163A1 (en) Stereovision method for producing cartographic data
CN108805921A (en) Image-taking system and method
CN113109833A (en) Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar
CN114659635A (en) Spectral depth imaging device and method based on image surface segmentation light field
JP2020150427A (en) Imaging device, imaging optical system, and moving object
JP6756898B2 (en) Distance measuring device, head-mounted display device, personal digital assistant, video display device, and peripheral monitoring system
CN117395485A (en) Integrated polarized light field depth perception imaging device and method adopting same
JP2006033282A (en) Image forming device and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210713