CN110740238A - light splitting HDR camera applied to mobile robot SLAM field - Google Patents

Light splitting HDR camera applied to mobile robot SLAM field

Info

Publication number
CN110740238A
Authority
CN
China
Prior art keywords
spectroscope
sub
light splitting
cmos chip
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911017598.4A
Other languages
Chinese (zh)
Other versions
CN110740238B (en)
Inventor
吕恩利
王飞仁
刘妍华
郭嘉明
李斌
苏秋双
赵伟伟
吴鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Agricultural University
Original Assignee
South China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Agricultural University filed Critical South China Agricultural University
Priority to CN201911017598.4A priority Critical patent/CN110740238B/en
Publication of CN110740238A publication Critical patent/CN110740238A/en
Application granted granted Critical
Publication of CN110740238B publication Critical patent/CN110740238B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/51 Housings
    • H04N 23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a light splitting HDR camera applied to the SLAM field of mobile robots, which comprises a shell and a lens arranged at the front end of the shell. A light splitting system is arranged in the shell and comprises a main spectroscope arranged right behind the lens, the splitting surface of the main spectroscope being inclined to the optical axis of the lens; a first sub-spectroscope is arranged behind the main spectroscope, and the splitting surface of the first sub-spectroscope is perpendicular to the splitting surface of the main spectroscope; a second sub-spectroscope is arranged above the main spectroscope, and the splitting surface of the second sub-spectroscope is parallel to the splitting surface of the main spectroscope. CMOS chips, each soldered on a PCB substrate, are arranged below and behind the first sub-spectroscope and behind and above the second sub-spectroscope, and all the PCB substrates are connected to an FPGA chip.

Description

Light splitting HDR camera applied to the mobile robot SLAM field
Technical Field
The invention relates to the field of mobile robot positioning and navigation, in particular to a high dynamic range (HDR) camera applied to the field of mobile robot simultaneous localization and mapping (SLAM).
Background
Image-based positioning is a hot research topic in the field of autonomous mobile robots and is a precondition for solving the motion planning and control of mobile robots. Simultaneous Localization and Mapping (SLAM) analyzes an image sequence in an unknown environment to estimate the pose of the robot while reconstructing a map of the environment. The technology has attracted wide attention at home and abroad because it relies only on the image sensor of the robot, requires no environment modification or manual marking, and uses low-cost cameras. Visual SLAM depends mainly on the camera, so the imaging quality of the camera is directly related to the positioning accuracy of SLAM. However, a traditional camera has limited exposure latitude: in environments with strong illumination contrast, such as sunlight penetrating through windows or shadows of objects, the acquired image suffers from local over-exposure and local under-exposure, and most of the detail in the bright or dark portions of the image is lost, which is not conducive to SLAM feature extraction. High dynamic range imaging can compensate for this, but a traditional camera in HDR mode synthesizes the image from several successive exposures, and because the camera on a mobile robot is in continuous motion, this introduces motion blur.
Disclosure of Invention
The invention aims to provide a light splitting HDR camera applied to the mobile robot SLAM field, which is mainly used for solving the problem that a traditional camera suffers from motion blur in HDR mode.
In order to realize the task, the invention adopts the following technical scheme:
A light splitting HDR camera applied to the mobile robot SLAM field comprises a shell and a lens installed at the front end of the shell, wherein a light splitting system is arranged in the shell, and the light splitting system comprises:
a main spectroscope arranged right behind the lens, the splitting surface of the main spectroscope being inclined to the optical axis of the lens; a first sub-spectroscope arranged behind the main spectroscope, the splitting surface of the first sub-spectroscope being perpendicular to the splitting surface of the main spectroscope; and a second sub-spectroscope arranged above the main spectroscope, the splitting surface of the second sub-spectroscope being parallel to the splitting surface of the main spectroscope;
the dynamic range image synthesis device comprises an th auxiliary spectroscope, a th CMOS chip is arranged below the 8926 th auxiliary spectroscope, a second CMOS chip is arranged behind the th auxiliary spectroscope, a third CMOS chip is arranged behind the second auxiliary spectroscope, a fourth CMOS chip is arranged above the second auxiliary spectroscope, each CMOS chips are respectively welded on PCB substrates, the PCB substrates corresponding to CMOS chips are connected to an FPGA chip, and the FPGA chip is used for synthesizing images acquired by each CMOS chips into images in a high dynamic range.
Further, the reflection-to-transmission ratios of the main spectroscope, the first sub-spectroscope and the second sub-spectroscope are all 1:1.
Further, the included angle between the splitting surface of the main spectroscope and the optical axis of the lens is 45 degrees.
Further, the main spectroscope, the first sub-spectroscope and the second sub-spectroscope are cube spectroscopes.
After being focused by the lens, the external light reaches the main spectroscope: 50% of it passes through the splitting surface of the main spectroscope into the first sub-spectroscope, and the remaining 50% is reflected by the splitting surface of the main spectroscope into the second sub-spectroscope;
of the 50% of the external light entering the first sub-spectroscope, half is reflected by the splitting surface of the first sub-spectroscope into the first CMOS chip, and the other half passes through the splitting surface of the first sub-spectroscope into the second CMOS chip;
of the remaining 50% of the external light entering the second sub-spectroscope, half is reflected by the splitting surface of the second sub-spectroscope into the third CMOS chip, and the other half passes through the splitting surface of the second sub-spectroscope into the fourth CMOS chip.
Further, the FPGA chip communicates with the outside through a data communication interface; when the FPGA chip receives an image request from the outside, exposure instructions with different exposure durations are issued simultaneously to the first through fourth CMOS chips, and the first through fourth CMOS chips each independently acquire an image through the corresponding splitting surface.
Further, the working modes of the light splitting HDR camera include a weak light mode and a strong light mode; the FPGA chip determines the light condition of the environment by analyzing the average brightness and exposure time of the normally exposed image among the images obtained by the CMOS chips each time, and then chooses whether to switch the working mode between the weak light mode and the strong light mode before the CMOS chips acquire images the next time.
Further, the FPGA chip synthesizes the images acquired by the CMOS chips into a high dynamic range image by means of coefficient fusion, including:
for the images obtained by the CMOS chips, assigning a fusion coefficient to each pixel of each image, and then obtaining the high dynamic range image by weighting the pixel values at the same position in the different images.
Further, in the weak light mode, the exposure instructions are: extend the exposure time by 2 steps, extend the exposure time by 1 step, expose normally, and reduce the exposure time by 1 step; 2 overexposed images, 1 normally exposed image and 1 underexposed image are thus obtained, and the light splitting system then focuses more on the image details of the dark parts;
in the strong light mode, the exposure instructions are: extend the exposure time by 1 step, expose normally, reduce the exposure time by 1 step, and reduce the exposure time by 2 steps; 1 overexposed image, 1 normally exposed image and 2 underexposed images are thus obtained, and the light splitting system then focuses more on the image details of the bright parts.
Further, a light splitting system support is arranged inside the shell; the light splitting system support comprises a pair of symmetrically arranged supporting plates, and L-shaped card slots for fixing the main spectroscope, the first sub-spectroscope and the second sub-spectroscope are symmetrically formed in the pair of supporting plates;
after the main spectroscope, the first sub-spectroscope and the second sub-spectroscope are installed in the card slots, a first fixing groove and a second fixing groove are symmetrically formed in the supporting plates below and behind the first sub-spectroscope and are used for fixedly mounting the PCB substrate corresponding to the first CMOS chip and the PCB substrate corresponding to the second CMOS chip respectively, and a third fixing groove and a fourth fixing groove are symmetrically formed in the supporting plates behind and above the second sub-spectroscope and are used for fixedly mounting the PCB substrate corresponding to the third CMOS chip and the PCB substrate corresponding to the fourth CMOS chip respectively.
Further, each PCB substrate is parallel to the end face of the adjacent sub-spectroscope and has the same area.
The invention has the following technical characteristics:
1. The camera uses a light splitting system structure and adopts a multi-CMOS simultaneous exposure mode to collect images under different exposure parameters and synthesize an HDR image. Compared with the traditional HDR imaging mode this reduces the overall exposure time and the image smear phenomenon, and because all exposures are taken at the same time and from the same position, the camera is suitable for both outdoor large scenes and indoor small scenes.
2. The HDR image is obtained by synthesis from several low-cost CMOS chips, which gives a large cost advantage compared with a single chip with high exposure latitude and helps reduce the cost of a mobile robot positioning system.
3. Compared with a common camera, the camera has a higher dynamic range, and is beneficial to improving the positioning and mapping accuracy of the SLAM algorithm of the mobile robot.
Drawings
FIG. 1 is a schematic view of the present invention with a portion of the housing removed;
FIG. 2 is a schematic view of a spectroscopy system mount of the present invention;
FIG. 3 is a schematic view of the overall structure of the present invention;
The reference numbers in the figures indicate: 1: lens; 2: shell; 3: fourth CMOS chip; 4: second sub-spectroscope; 5: FPGA chip; 6: circuit board; 7: light splitting system support; 71: supporting plate; 72: card slot; 73: first fixing groove; 74: second fixing groove; 75: third fixing groove; 76: fourth fixing groove; 77: fifth fixing groove; 8: third CMOS chip; 9: second CMOS chip; 10: first sub-spectroscope; 11: first CMOS chip; 12: main spectroscope.
Detailed Description
The invention discloses a light splitting HDR camera applied to the SLAM field of mobile robots, which comprises a shell 2 and a lens 1 arranged at the front end of the shell 2; the lens 1 can, for example, be fixed to the front end of the shell 2 by a threaded fit. A light splitting system is arranged in the shell 2, and the light splitting system comprises:
a main spectroscope 12 arranged right behind the lens 1, where "right behind" means that the center line of the main spectroscope 12 coincides with the optical axis of the lens 1; the splitting surface of the main spectroscope 12 is inclined to the optical axis of the lens 1. A first sub-spectroscope 10 is arranged behind the main spectroscope 12, and the splitting surface of the first sub-spectroscope 10 is perpendicular to the splitting surface of the main spectroscope 12. A second sub-spectroscope 4 is arranged above the main spectroscope 12, and the splitting surface of the second sub-spectroscope 4 is parallel to the splitting surface of the main spectroscope 12. In this embodiment, the main spectroscope 12, the first sub-spectroscope 10 and the second sub-spectroscope 4 are all cube spectroscopes of the same size; in each cube the splitting surface runs from the bottom edge of one side to the top edge of the opposite side, so the included angles between the splitting surface and the top and bottom faces of the cube are both 45 degrees. The reflection-to-transmission ratios of the main spectroscope 12, the first sub-spectroscope 10 and the second sub-spectroscope 4 are all 1:1, that is, of the light incident on a splitting surface, half passes through the splitting surface and half is reflected.
Optionally, the included angle between the splitting surface of the main spectroscope 12 and the optical axis of the lens 1 is 45° so that the light is split accurately; as shown in Fig. 1, the lower end of the splitting surface of the main spectroscope 12 is the end closer to the lens 1.
A first CMOS chip 11 is arranged below the first sub-spectroscope 10, a second CMOS chip 9 is arranged behind the first sub-spectroscope 10, a third CMOS chip 8 is arranged behind the second sub-spectroscope 4, and a fourth CMOS chip 3 is arranged above the second sub-spectroscope 4. Each CMOS chip (the first CMOS chip 11 through the fourth CMOS chip 3) is soldered on its own PCB substrate so as to receive the light passing through the corresponding sub-spectroscope, and the four PCB substrates corresponding to the CMOS chips are connected to the FPGA chip 5. The peripheral circuits of the CMOS chips are implemented on the PCB substrates. Each PCB substrate is parallel to the end face of the adjacent sub-spectroscope and has the same area; in this embodiment each PCB substrate is a square plate whose area equals that of the end faces of the first sub-spectroscope 10 and the second sub-spectroscope 4, and the center of each PCB substrate lies on the same axis as the center of the corresponding end face, so that each CMOS chip is aligned with the light emerging from its sub-spectroscope.
In this scheme, after being focused by the lens 1, the external light reaches the main spectroscope 12: 50% of it passes through the splitting surface of the main spectroscope 12 into the first sub-spectroscope 10, and the remaining 50% is reflected by the splitting surface of the main spectroscope 12 into the second sub-spectroscope 4. Of the 50% entering the first sub-spectroscope 10, half is reflected by the splitting surface of the first sub-spectroscope 10 into the first CMOS chip 11, and the other half passes through the splitting surface of the first sub-spectroscope 10 into the second CMOS chip 9. Of the remaining 50% entering the second sub-spectroscope 4, half is reflected by the splitting surface of the second sub-spectroscope 4 into the third CMOS chip 8, and the other half passes through the splitting surface of the second sub-spectroscope 4 into the fourth CMOS chip 3.
In this scheme, the FPGA chip 5 is fixed inside the shell 2 on the circuit board 6 and communicates with the outside through a data communication interface; each PCB substrate is connected to the circuit board 6 by a flat cable and thus communicates with the FPGA chip 5. The data interface can be USB 3.0, and the external device can be a computer, the controller of a mobile robot, or the like. The FPGA chip 5 is used for synthesizing an image with a high dynamic range from the under-exposed, normally exposed and over-exposed images acquired by the CMOS chips.
When the FPGA chip 5 receives an image request sent from the outside, exposure instructions with different exposure durations are sent simultaneously to the first CMOS chip 11 through the fourth CMOS chip 3, and the first CMOS chip 11 through the fourth CMOS chip 3 independently acquire images through the corresponding splitting surfaces: the first CMOS chip 11 and the second CMOS chip 9 independently acquire images via the reflection and transmission of the splitting surface of the first sub-spectroscope 10, and the third CMOS chip 8 and the fourth CMOS chip 3 independently acquire images via the reflection and transmission of the splitting surface of the second sub-spectroscope 4.
The working modes of the light splitting HDR camera comprise a weak light mode and a strong light mode. The FPGA chip 5 determines the light condition of the environment by analyzing the average brightness and the exposure time of the normally exposed image among the images obtained by the CMOS chips each time, and then chooses whether to switch the working mode to the weak light mode or the strong light mode before the CMOS chips acquire images the next time.
In the weak light mode, the exposure instructions are: extend the exposure time by 2 steps, extend the exposure time by 1 step, expose normally, and reduce the exposure time by 1 step. 2 overexposed images, 1 normally exposed image and 1 underexposed image are obtained, and in this case the light splitting system focuses more on the image details of the dark parts;
in the strong light mode, the exposure instructions are: extend the exposure time by 1 step, expose normally, reduce the exposure time by 1 step, and reduce the exposure time by 2 steps. 1 overexposed image, 1 normally exposed image and 2 underexposed images are obtained, and in this case the light splitting system focuses more on the image details of the bright parts.
Here, "1 step" and "2 steps" refer to the amount by which the exposure time is changed: for example, if extending the exposure time by 1 step means extending it by a time T, then extending it by 2 steps means extending it by 2T, and reducing the exposure time works in the same way; the specific step sizes can be set according to actual requirements.
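As an illustration only (not part of the patent), the sketch below expresses such a mode-dependent exposure schedule in Python; the step size T, the base exposure time and the function name are assumptions introduced for the example.

    # Illustrative sketch (not from the patent): build the four exposure commands
    # for one acquisition, given the current working mode.
    # T is the per-step exposure increment and t_normal the current normal
    # exposure time; both values are hypothetical.
    def exposure_schedule(mode, t_normal, T):
        """Return the four exposure times sent to the CMOS chips."""
        if mode == "weak_light":
            steps = [+2, +1, 0, -1]   # 2 overexposed, 1 normal, 1 underexposed
        elif mode == "strong_light":
            steps = [+1, 0, -1, -2]   # 1 overexposed, 1 normal, 2 underexposed
        else:
            raise ValueError("unknown mode: %s" % mode)
        # Never command a non-positive exposure time.
        return [max(t_normal + s * T, 1e-6) for s in steps]

    # Example: 10 ms normal exposure, 5 ms per step, weak light mode.
    # exposure_schedule("weak_light", 0.010, 0.005) -> [0.02, 0.015, 0.01, 0.005]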
The normal exposure time is determined according to the principle of maximizing the entropy of the image; the entropy of an image reflects the average amount of information it contains. After each normally exposed image is obtained, the FPGA chip 5 computes the current entropy of the image and then adjusts the normal exposure time for the next acquisition according to this entropy value:

H = −Σᵢ pᵢ · log₂(pᵢ)

where pᵢ is the probability that a pixel with gray value i appears in the image:

pᵢ = nᵢ / (M × N)

where M and N respectively denote the number of rows and columns of pixels in the image, and nᵢ denotes the number of pixels in the image with gray value i.
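For illustration (a minimal sketch, not part of the patent; numpy and the function name are assumptions), this entropy can be computed for an 8-bit grayscale image as follows:

    import numpy as np

    def image_entropy(img):
        """Information entropy of an 8-bit grayscale image (illustrative sketch).

        img: 2-D numpy array of gray values 0..255 with M rows and N columns.
        """
        counts = np.bincount(img.ravel(), minlength=256)  # n_i for each gray value i
        p = counts / img.size                             # p_i = n_i / (M * N)
        p = p[p > 0]                                      # 0 * log(0) is taken as 0
        return float(-np.sum(p * np.log2(p)))

    # A constant (fully over- or under-exposed) image has entropy 0;
    # a well-exposed image with a spread-out histogram has higher entropy.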
The FPGA chip 5 synthesizes the images acquired by the CMOS chips into an image with a high dynamic range by means of coefficient fusion: the fusion weights of the differently exposed images are estimated first, and weighted-average fusion is then carried out according to the following formula:

I_HDR(x, y) = Σₙ Wₙ(x, y) · Iₙ(x, y) / Σₙ Wₙ(x, y)

where Wₙ(x, y) is the fusion weight, Iₙ(x, y) denotes the different images, and n is the image index. That is, for the images obtained by the CMOS chips, a fusion coefficient is assigned to each pixel of each image, and the high dynamic range image is then obtained by weighting the pixel values at the same position in the different images.
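As a minimal sketch only (the patent does not specify how the weights Wₙ(x, y) are estimated; the Gaussian well-exposedness weight below is an assumption introduced for the example), the weighted-average fusion could be written as:

    import numpy as np

    def fuse_hdr(images):
        """Weighted-average fusion of differently exposed grayscale images.

        images: list of 2-D uint8 arrays of the same shape (one per CMOS chip).
        The weight function (a Gaussian centred on mid-gray, favouring
        well-exposed pixels) is an assumption, not the patent's method.
        """
        imgs = [img.astype(np.float64) / 255.0 for img in images]
        # W_n(x, y): large where the pixel is neither under- nor over-exposed.
        weights = [np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2)) for img in imgs]
        w_sum = np.sum(weights, axis=0) + 1e-12           # avoid division by zero
        fused = np.sum([w * i for w, i in zip(weights, imgs)], axis=0) / w_sum
        return (fused * 255.0).astype(np.uint8)

    # Example: fuse_hdr([img_plus2, img_plus1, img_normal, img_minus1])
    # returns one image whose pixels are the weighted averages I_HDR(x, y).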
As shown in Fig. 1, in order to facilitate the arrangement of the light splitting system, a light splitting system support 7 is disposed inside the shell 2. The light splitting system support 7 includes a pair of symmetrically disposed supporting plates 71 which are parallel to and face each other, and L-shaped card slots 72 for fixing the main spectroscope 12, the first sub-spectroscope 10 and the second sub-spectroscope 4 are symmetrically formed in the pair of supporting plates 71, as shown in Fig. 2. Since the first sub-spectroscope 10 and the second sub-spectroscope 4 are located behind and above the main spectroscope 12 respectively, the L-shaped card slots formed in the supporting plates 71 hold the spectroscopes well: after each spectroscope is placed in position, its two sides are snapped into the L-shaped card slots in the pair of supporting plates 71.
After the main spectroscope 12, the first sub-spectroscope 10 and the second sub-spectroscope 4 are installed in the card slots, a first fixing groove 73 and a second fixing groove 74 are symmetrically formed in the supporting plates 71 below and behind the first sub-spectroscope 10. As shown in Figs. 1 and 2, the first fixing groove 73 and the second fixing groove 74 are both strip-shaped grooves and are used for fixedly mounting the PCB substrate corresponding to the first CMOS chip and the PCB substrate corresponding to the second CMOS chip respectively; specifically, the two sides of each PCB substrate are clamped into the fixing grooves in the pair of supporting plates 71, so that the PCB substrate, and hence the CMOS chip on it, is effectively fixed and the CMOS chip can receive the light passing through the spectroscope at the set, accurate position. A third fixing groove 75 and a fourth fixing groove 76 are symmetrically formed in the supporting plates 71 behind and above the second sub-spectroscope 4 and are used for fixedly mounting the PCB substrate corresponding to the third CMOS chip and the PCB substrate corresponding to the fourth CMOS chip respectively.
Further, as shown in Fig. 2, a fifth fixing groove 77 is symmetrically formed in the supporting plates 71 to the right of the fourth fixing groove 76; in combination with Fig. 1, the fifth fixing groove 77 is used for fixedly mounting the circuit board 6 carrying the FPGA chip 5.
In this embodiment, the lens 1 is screwed to the shell 2 using the C-mount thread commonly used for industrial lenses, and the parameters of the lens 1 are as follows: focal length 5 mm, aperture 1.8, maximum compatible CMOS image plane 1/2", horizontal viewing angle 64 degrees, vertical viewing angle 50 degrees, distortion -0.33%, length 58 mm, maximum supported resolution 10 M, and a C-mount interface.
The first CMOS chip 11 through the fourth CMOS chip 3 are of the same type; in this embodiment they are ON Semiconductor MT9V034C12STM sensors with an image resolution of 752 H × 480 V, a maximum image plane of 1/3", a pixel size of 6.0 µm × 6.0 µm, grayscale imaging, a global shutter, a maximum frame rate of 60 Hz, 10-bit ADC resolution and a dynamic range greater than 55 dB. This CMOS is low in cost and has a relatively large pixel size, which helps improve the dynamic range of the image. The PCB substrates corresponding to the CMOS chips have the same specification, for example a thickness of 1.2 mm and a size of 20 mm × 20 mm.
The specifications of the main spectroscope 12, the first sub-spectroscope 10 and the second sub-spectroscope 4 are the same, and the preferred size is 20 mm × 20 mm × 20 mm.
The FPGA chip 5 is connected to and communicates with the outside through a USB 3.0 interface; it sends exposure instructions to each CMOS chip, receives, processes and synthesizes the images, and sends the final synthesized image to the outside through the USB 3.0 interface. For example, the FPGA chip 5 can be a Xilinx Zynq 7020 with 85K programmable logic cells, which provides sufficient computing resources to process the data of 4 CMOS chips; the chip also contains a dual-core ARM Cortex-A9 SoC computing unit that can perform general-purpose computation and is used for computing the image entropy, the exposure time and the like. Each time the FPGA chip 5 obtains image data from the CMOS chips, it meters the current environment: it counts the gray-level distribution of the normally exposed image, i.e. the probability of each gray value from 0 to 255, computes the information entropy of the normally exposed image, dynamically adjusts the normal exposure time of the next frame according to this entropy, and determines whether the current mode should be the weak light mode or the strong light mode.
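As a rough illustration only (the brightness threshold, the exposure-time threshold and the function name below are assumptions; the patent only states that the average brightness and exposure time of the normally exposed image are analyzed), the mode decision could be sketched as:

    import numpy as np

    def select_mode(normal_img, t_normal, bright_thresh=100.0, long_exposure=0.02):
        """Choose the working mode for the next acquisition (illustrative sketch).

        normal_img: normally exposed 8-bit grayscale frame; t_normal: its
        exposure time in seconds. Both thresholds are hypothetical values.
        """
        avg_brightness = float(np.mean(normal_img))
        if avg_brightness < bright_thresh or t_normal > long_exposure:
            return "weak_light"    # dark scene: favour detail in the shadows
        return "strong_light"      # bright scene: favour detail in the highlights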
In this embodiment, four cost-effective low-end CMOS chips are combined with the light splitting system to realize simultaneous exposure of the same scene from the same position, and an HDR image is obtained through image synthesis. This solves the problems of the insufficient exposure latitude of images collected by a traditional camera and of the motion blur of an ordinary camera in HDR mode, is suitable for both small indoor scenes and large outdoor scenes, has an outstanding cost advantage over high-end cameras, and improves the SLAM accuracy of the mobile robot.

Claims (10)

  1. A light splitting HDR camera applied to the SLAM field of mobile robots, comprising a shell (2) and a lens (1) arranged at the front end of the shell (2), wherein a light splitting system is arranged in the shell (2), characterized in that the light splitting system comprises:
    a main spectroscope (12) arranged right behind the lens (1), the splitting surface of the main spectroscope (12) being inclined to the optical axis of the lens (1); a first sub-spectroscope (10) arranged behind the main spectroscope (12), the splitting surface of the first sub-spectroscope (10) being perpendicular to the splitting surface of the main spectroscope (12); and a second sub-spectroscope (4) arranged above the main spectroscope (12), the splitting surface of the second sub-spectroscope (4) being parallel to the splitting surface of the main spectroscope (12);
    a first CMOS chip (11) arranged below the first sub-spectroscope (10), a second CMOS chip (9) arranged behind the first sub-spectroscope (10), a third CMOS chip (8) arranged behind the second sub-spectroscope (4), and a fourth CMOS chip (3) arranged above the second sub-spectroscope (4), each CMOS chip being soldered on its own PCB substrate; the PCB substrates corresponding to the CMOS chips are connected to an FPGA chip (5), and the FPGA chip (5) is used for synthesizing the images acquired by the CMOS chips into an image with a high dynamic range.
  2. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 1, wherein the reflection-to-transmission ratios of the main spectroscope (12), the first sub-spectroscope (10) and the second sub-spectroscope (4) are all 1:1.
  3. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 1, wherein the main spectroscope (12), the first sub-spectroscope (10) and the second sub-spectroscope (4) are cube spectroscopes.
  4. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 1, wherein the included angle between the splitting surface of the main spectroscope (12) and the optical axis of the lens (1) is 45°;
    after being focused by the lens (1), the external light reaches the main spectroscope (12): 50% of it passes through the splitting surface of the main spectroscope (12) into the first sub-spectroscope (10), and the remaining 50% is reflected by the splitting surface of the main spectroscope (12) into the second sub-spectroscope (4);
    of the 50% of the external light entering the first sub-spectroscope (10), half is reflected by the splitting surface of the first sub-spectroscope (10) into the first CMOS chip (11), and the other half passes through the splitting surface of the first sub-spectroscope (10) into the second CMOS chip (9);
    of the remaining 50% of the external light entering the second sub-spectroscope (4), half is reflected by the splitting surface of the second sub-spectroscope (4) into the third CMOS chip (8), and the other half passes through the splitting surface of the second sub-spectroscope (4) into the fourth CMOS chip (3).
  5. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 1, wherein the FPGA chip (5) communicates with the outside through a data communication interface; when the FPGA chip (5) receives an image request sent from the outside, exposure instructions with different exposure durations are issued simultaneously to the first CMOS chip (11) through the fourth CMOS chip (3), and the first CMOS chip (11) through the fourth CMOS chip (3) each independently acquire an image through the corresponding splitting surface.
  6. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 1, wherein the working modes of the light splitting HDR camera include a weak light mode and a strong light mode, wherein the FPGA chip (5) determines the light condition of the environment by analyzing the average brightness and exposure time of the normally exposed image among the images obtained by the CMOS chips each time, and then chooses whether to switch the working mode between the weak light mode and the strong light mode before the CMOS chips acquire images the next time.
  7. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 5, wherein in the weak light mode the exposure instructions are: extend the exposure time by 2 steps, extend the exposure time by 1 step, expose normally, and reduce the exposure time by 1 step; 2 overexposed images, 1 normally exposed image and 1 underexposed image are thus obtained, and the light splitting system focuses more on the image details of the dark parts;
    in the strong light mode the exposure instructions are: extend the exposure time by 1 step, expose normally, reduce the exposure time by 1 step, and reduce the exposure time by 2 steps; 1 overexposed image, 1 normally exposed image and 2 underexposed images are thus obtained, and the light splitting system focuses more on the image details of the bright parts.
  8. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 1, wherein the FPGA chip (5) synthesizes the images acquired by the CMOS chips into a high dynamic range image by means of coefficient fusion, comprising:
    for the images obtained by the CMOS chips, assigning a fusion coefficient to each pixel of each image, and then obtaining the high dynamic range image by weighting the pixel values at the same position in the different images.
  9. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 1, wherein a light splitting system support (7) is disposed inside the shell (2), the light splitting system support (7) comprises a pair of symmetrically disposed supporting plates (71), and L-shaped card slots for fixing the main spectroscope (12), the first sub-spectroscope (10) and the second sub-spectroscope (4) are symmetrically formed in the pair of supporting plates (71);
    after the main spectroscope (12), the first sub-spectroscope (10) and the second sub-spectroscope (4) are installed in the card slots, a first fixing groove (73) and a second fixing groove (74) are symmetrically formed in the supporting plates (71) below and behind the first sub-spectroscope (10) and are used for fixedly mounting the PCB substrate corresponding to the first CMOS chip (11) and the PCB substrate corresponding to the second CMOS chip (9) respectively, and a third fixing groove (75) and a fourth fixing groove (76) are symmetrically formed in the supporting plates (71) behind and above the second sub-spectroscope (4) and are used for fixedly mounting the PCB substrate corresponding to the third CMOS chip (8) and the PCB substrate corresponding to the fourth CMOS chip (3) respectively.
  10. The light splitting HDR camera applied to the SLAM field of mobile robots as claimed in claim 1, wherein each PCB substrate is parallel to the end face of the adjacent sub-spectroscope and has the same area.
CN201911017598.4A 2019-10-24 2019-10-24 Light splitting HDR camera applied to mobile robot SLAM field Active CN110740238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911017598.4A CN110740238B (en) 2019-10-24 2019-10-24 Light splitting HDR camera applied to mobile robot SLAM field

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911017598.4A CN110740238B (en) 2019-10-24 2019-10-24 Light splitting HDR camera applied to mobile robot SLAM field

Publications (2)

Publication Number Publication Date
CN110740238A true CN110740238A (en) 2020-01-31
CN110740238B CN110740238B (en) 2021-05-11

Family

ID=69271236

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911017598.4A Active CN110740238B (en) 2019-10-24 2019-10-24 Light splitting HDR camera applied to mobile robot SLAM field

Country Status (1)

Country Link
CN (1) CN110740238B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101631202A (en) * 2008-07-16 2010-01-20 肖长诗 Method for acquiring images with super-wide dynamic range
CN101888487A (en) * 2010-06-02 2010-11-17 中国科学院深圳先进技术研究院 High dynamic range video imaging system and image generating method
CN203414717U (en) * 2013-08-23 2014-01-29 重庆米森科技有限公司 Device for frequency-division processing of short-wave infrared and visible light of shooting/photographing equipment
CN105872311A (en) * 2016-05-30 2016-08-17 深圳Tcl数字技术有限公司 High-dynamic-range picture switching method and device
CN207036260U (en) * 2017-06-02 2018-02-23 苏州优函信息科技有限公司 Modularization push-broom type visible ray/near infrared imaging spectrometer
CN108280817A (en) * 2018-01-15 2018-07-13 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN109068067A (en) * 2018-08-22 2018-12-21 Oppo广东移动通信有限公司 Exposal control method, device and electronic equipment
CN109218628A (en) * 2018-09-20 2019-01-15 Oppo广东移动通信有限公司 Image processing method, device, electronic equipment and storage medium
CN208890910U (en) * 2018-10-26 2019-05-21 杭州海康威视数字技术股份有限公司 A kind of video camera

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115942072A (en) * 2022-11-21 2023-04-07 北京航空航天大学 Device and method for acquiring image with arbitrary exposure duration under dual-processor architecture
CN115942072B (en) * 2022-11-21 2024-07-16 北京航空航天大学 Image acquisition device and method for arbitrary exposure time under dual-processor architecture

Also Published As

Publication number Publication date
CN110740238B (en) 2021-05-11

Similar Documents

Publication Publication Date Title
CN107343130B (en) High dynamic imaging module based on DMD dynamic light splitting
CN103905731B (en) A kind of wide dynamic images acquisition method and system
CN107995396B (en) Two camera modules and terminal
US20110149100A1 (en) Image apparatus and imaging method
CN110740238B (en) Light splitting HDR camera applied to mobile robot SLAM field
CN113660415A (en) Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium
CN1787613A (en) Imaging apparatus, imaging method and imaging processing program
CN112305561A (en) Laser active illumination space target polarization imaging system and method
CN110798624B (en) HDR camera applied to outdoor SLAM field of mobile robot
CN110749550B (en) Astronomical spectrometer image quality compensation method and system based on deep learning
CN102096174B (en) System and method for executing automatic focusing in low-brightness scene
RU2707714C1 (en) Device for automatic acquisition and processing of images
CN111355896B (en) Method for acquiring automatic exposure parameters of all-day camera
Mannami et al. Adaptive dynamic range camera with reflective liquid crystal
CN211152092U (en) Lens detection device
CN114222047A (en) Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium
Ferrat et al. Ultra-miniature omni-directional camera for an autonomous flying micro-robot
CN110389088A (en) A kind of large caliber reflecting mirror surface particulate contamination object on-line monitoring method
CN111052721A (en) Sky monitoring system
CN110702099B (en) High dynamic range fixed star detection imaging method and star sensor
CN112040203B (en) Computer storage medium, terminal device, image processing method and device
US20230179873A1 (en) Apparatus, optical apparatus, image pickup method, and non-transitory computer-readable storage medium
CN220085063U (en) Binocular thunder all-in-one device
CN110913143B (en) Image processing method, image processing device, storage medium and electronic equipment
US10771675B2 (en) Imaging control apparatus and imaging control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant