WO2020207185A1 - Optical drone monitoring system based on three-dimensional light field technology - Google Patents

Optical drone monitoring system based on three-dimensional light field technology

Info

Publication number
WO2020207185A1
WO2020207185A1 (application PCT/CN2020/079089)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
light field
monitoring system
rotating platform
drone
Prior art date
Application number
PCT/CN2020/079089
Other languages
English (en)
French (fr)
Inventor
李应樵
李莉华
Original Assignee
深圳市视觉动力科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市视觉动力科技有限公司
Priority to US17/601,819 (published as US11978222B2)
Publication of WO2020207185A1

Classifications

    • H04N 13/232: Image signal generators using stereoscopic image cameras using a single 2D image sensor using fly-eye lenses, e.g. arrangements of circular lenses
    • G06T 7/292: Analysis of motion; multi-camera tracking
    • H04N 23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • F16M 11/10: Means for attachment of apparatus, allowing pivoting around a horizontal axis
    • F16M 11/18: Heads with mechanism for moving the apparatus relatively to the stand
    • F16M 11/2014: Undercarriages comprising means allowing pivoting adjustment around a vertical axis
    • F16M 13/02: Other supports for positioning apparatus or articles, for supporting on, or attaching to, an object
    • G06T 7/0008: Industrial image inspection checking presence/absence
    • G06T 7/557: Depth or shape recovery from multiple images, from light fields, e.g. from plenoptic cameras
    • G06T 7/70: Determining position or orientation of objects or cameras
    • H04N 23/51: Housings
    • H04N 23/531: Constructional details of electronic viewfinders, being rotatable or detachable
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N 23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N 23/957: Light-field or plenoptic cameras or camera modules
    • H04N 7/181: Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • G02B 3/0056: Microlens arrays arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses

Definitions

  • The invention belongs to the field of drone monitoring, and in particular relates to a drone monitoring system based on light field technology.
  • The purpose of the present invention is to provide a new high-resolution, stable monitoring system that obtains clear stereo images, thereby improving efficiency and accuracy when monitoring or detecting drones.
  • The present invention discloses a drone monitoring system based on light field technology, in which a first camera is used to continuously acquire image information within a monitoring area; a second camera, which is a light field camera containing a fly-eye lens, is used to acquire the light field information of a drone once the acquired image information is judged to be a drone; a vertical rotating platform and a horizontal rotating platform are placed perpendicular to each other, the vertical rotating platform controlling the first and second cameras to rotate in the direction perpendicular to the horizontal rotating platform, and the horizontal rotating platform controlling them to rotate in the horizontal direction, so that the first camera and the second camera rotate synchronously under the control of the two platforms; and a computer processor is used to analyze and judge whether the acquired image information in the monitoring area is a drone, and to calculate the drone's depth information from the acquired light field information so as to obtain the drone's position.
  • The vertical rotating platform limits the rotation of the first and second cameras to an elevation angle of 15-45 degrees.
  • The horizontal rotating platform limits the rotation of the first and second cameras to 0-120 degrees.
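The platform limits above can be sketched as a simple command clamp before angles are sent to the platform controllers; the function and constant names below are illustrative, not from the patent:

```python
# Sketch: clamping pan/tilt commands to the rotation ranges stated above
# (elevation 15-45 degrees on the vertical platform, azimuth 0-120 degrees
# on the horizontal platform). Names are illustrative assumptions.

ELEV_MIN, ELEV_MAX = 15.0, 45.0   # vertical platform limits (degrees)
AZ_MIN, AZ_MAX = 0.0, 120.0       # horizontal platform limits (degrees)

def clamp(value, lo, hi):
    """Limit a commanded angle to the platform's mechanical range."""
    return max(lo, min(hi, value))

def platform_command(elevation_deg, azimuth_deg):
    """Return the (elevation, azimuth) pair actually sent to the controllers."""
    return (clamp(elevation_deg, ELEV_MIN, ELEV_MAX),
            clamp(azimuth_deg, AZ_MIN, AZ_MAX))

print(platform_command(60.0, 130.0))  # -> (45.0, 120.0)
print(platform_command(20.0, 90.0))   # -> (20.0, 90.0)
```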
  • The second camera further includes a telephoto lens group with a diagonal field of view of about 1° and a camera photosensitive element; the fly-eye lens is arranged between the telephoto lens group and the photosensitive element, close to the photosensitive element.
  • The fly-eye lens is a microlens array.
  • The microlens array is arranged either linearly or hexagonally; in the hexagonal arrangement, each row is offset relative to the previous row.
  • Each microlens of the hexagonal microlens array is 60 μm wide, but there are two microlenses within every 90 μm.
  • The light field image I(x,y) can be expressed by the formula: I(x,y) = ∬ L_F(u,v,x,y) du dv (1), where L_F(u,v,x,y) denotes the light traveling along the ray that intersects the main lens at (u,v) and the microlens plane at (x,y), with the full aperture used.
  • Figure 3(d) is a schematic diagram of the principle of computing the refocused image by shifting the sub-aperture images of the light field imaging system of the present invention.
  • The refocused image is computed by shifting the sub-aperture images in the manner shown in Figure 3(d).
  • The optical drone monitoring system based on three-dimensional light field technology provided by the present invention can isolate vibration during monitoring, thereby improving efficiency and accuracy when monitoring or detecting drones.
  • Figure 1 is a schematic structural diagram of the UAV monitoring system of the present invention.
  • FIG. 2 is a schematic diagram of the structure of the second camera of the drone monitoring system of the present invention.
  • Figures 3(a) and 3(b) are schematic diagrams of the light field imaging system of the present invention.
  • Figure 3(c) is an example diagram of the processed light field image.
  • Figure 3(d) is a schematic diagram of computing a refocused image by shifting sub-aperture images in the light field imaging system of the present invention.
  • FIG. 4(a) is a microscope view of the hexagonal microlens array of the second camera 103 of the drone monitoring system of the present invention.
  • Figure 4(b) is a white light interferometer measurement of the hexagonal microlens array of the second camera 103 of the drone monitoring system of the present invention.
  • FIG. 5 is a working principle diagram of the UAV monitoring system of the present invention.
  • FIG. 1 is a schematic diagram of the structure of the UAV monitoring system 100 of the present invention.
  • The system 100 includes a chassis 101 for housing a processor and a rotating platform controller, a first camera 102, a second camera 103, a telephoto lens 104, a wide-angle lens 105, a vertical rotating platform 106, a horizontal rotating platform 107, a vertical support panel 108, and a horizontal support plate 109.
  • The first camera 102 and the second camera 103 are both high-resolution cameras; their axes are parallel to each other, and they are fixed on the horizontal support plate 109 by a first fixing rod 110. The vertical rotating platform 106 is fixed on one side of the horizontal support plate 109; its axle is fixed to the vertical support panel 108, its axis is horizontal, and the platform can rotate about this axis in the direction of arrow A or in the opposite direction.
  • Since the vertical rotating platform 106, the horizontal support plate 109 and the first fixing rod 110 are fixed to one another, when the vertical rotating platform 106 rotates about its axis it drives the first fixing rod 110 fixed on the horizontal support plate 109, and thus moves the first camera 102 and the second camera 103 at the same time; that is, the two cameras rotate synchronously about the vertical rotating platform in the direction of arrow A or the opposite direction.
  • The rotation range may be an elevation angle of 15-45 degrees.
  • The vertical support panel 108 is fixed on a second fixing rod 111, and the second fixing rod 111 is fixed on the horizontal rotating platform 107.
  • As shown in Figure 1, driven by the second fixing rod 111, the first camera 102, the second camera 103, the telephoto lens 104, the wide-angle lens 105, the vertical rotating platform 106, the vertical support panel 108 and the horizontal support plate 109 can all rotate about the axis of the horizontal rotating platform 107 in the direction of arrow B or the opposite direction; the axis of the horizontal rotating platform 107 is perpendicular to its plane of rotation.
  • Since the second fixing rod 111 and the horizontal rotating platform 107 are fixed to each other, when the second fixing rod 111 rotates on the horizontal rotating platform 107 in the direction of arrow B or the opposite direction, it drives the cameras with it.
  • In one embodiment, the rotation range may be an elevation angle of 15-45 degrees; in another embodiment, the rotation range may be the range allowed by the horizontal rotating platform, particularly preferably 0-120 degrees.
  • The horizontal rotating platform 107 is a vibration-free optical platform placed on the chassis 101, which houses the computer workstation including the processor and the rotating platform controller.
  • The horizontal rotating platform 107 provides a horizontal plane; the camera system fixed on it can track the unmanned aerial vehicle while remaining relatively fixed and undisturbed by vibration.
  • The two high-resolution cameras 102 and 103 may use the same camera body, for example an ultra-high-definition 4K (4096×2160 pixel) camera, or different camera bodies.
  • The first camera 102 is a wide-angle camera: it covers a wide shooting range, exaggerates the sense of distance, and has a wide focus range.
  • The wide angle also allows any point in the image to be brought to the most appropriate focal distance, making the picture clearer; this may be called full autofocus.
  • The wide-angle first camera 102 may use a camera whose individual pixel size on the photosensitive element is in the range of 1-8 microns, and it includes a wide-angle or diagonal lens.
  • The second camera 103 is an ultra-long-range camera: it tracks the flying drone with a super telephoto lens, which uses a lens in the 100-2200 mm focal length range.
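For intuition about the stated ~1° diagonal field of view, a small sketch relating focal length and sensor diagonal to angular FOV; the 38 mm sensor diagonal is an assumed value, since the patent states only the focal range and the ~1° figure:

```python
# Sketch: diagonal field of view of a rectilinear lens as a function of
# focal length. The sensor diagonal below is an assumption, not a value
# from the patent.
import math

def diagonal_fov_deg(focal_mm, sensor_diag_mm):
    """Diagonal field of view of a rectilinear lens, in degrees."""
    return math.degrees(2 * math.atan(sensor_diag_mm / (2 * focal_mm)))

# With an assumed ~38 mm sensor diagonal, the 2200 mm end of the stated
# 100-2200 mm focal range gives a diagonal FOV of about 1 degree,
# consistent with the text.
fov = diagonal_fov_deg(2200, 38.0)
print(round(fov, 2))  # 0.99
```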
  • The detailed structure of the second camera 103 is described below.
  • The computer workstation is located inside the chassis 101 of the optical drone monitoring system 100; it processes the collected information, monitors the flight of the drone, and gives timely warnings.
  • The chassis 101 mainly protects the optical drone monitoring system 100 outdoors, for example at an airport.
  • Figure 2 is a schematic diagram of the structure of the second camera 103 of the drone monitoring system of the present invention.
  • The microlens array 201 is placed between the photosensitive element 202 of the second camera 103 (i.e. the light field camera) and the camera lenses 204 and 205. When a picture captured by the second camera 103 is enlarged on a display, it can be seen to consist of circular sub-images equally spaced horizontally and vertically; each sub-image corresponds to one microlens.
  • The second camera 103 can be a light field camera.
  • The second camera 103 also includes a super telephoto lens group 204, 205 with a diagonal field of view of about 1°; the super telephoto lens group 204, 205 can use lenses in the 100-2200 mm focal length range.
  • The second camera 103 further includes a microlens array 201.
  • The presence of the microlens array 201 makes the second camera 103 a compound-eye camera, for example an ultra-high-definition (UHD) 4K compound-eye camera, in which the microlens array 201 is designed according to the image sensor specification and the microscope optical path.
  • A camera equipped with a compound-eye lens is operated in the same way as an ordinary camera; if the captured picture is enlarged before processing, it can be seen that the whole image is composed of the small images formed by each lenslet.
  • After processing by light field computation software, the image can be refocused onto different focal planes; because of this refocusing property, a light field image can yield the depth information of the captured scene. Once the compound-eye lens is mounted in front of the camera's photosensitive element, images recording light field information can be captured.
  • Figures 3(a) and 3(b) are schematic diagrams of the light field imaging system of the present invention.
  • They show the mechanism of a light field imaging system with a microlens array 302 in front of a CMOS sensor 301.
  • Figure 3(a): all rays passing through a pixel pass through its parent microlens and through the conjugate square (sub-aperture) on the main lens 303.
  • Figure 3(b): all rays passing through a sub-aperture are focused onto the corresponding pixels under the different microlenses; these pixels form the picture seen through that sub-aperture.
  • The light field image I(x,y) can be expressed by the formula: I(x,y) = ∬ L_F(u,v,x,y) du dv (1), where L_F(u,v,x,y) denotes the light traveling along the ray that intersects the main lens at (u,v) and the microlens plane at (x,y), with the full aperture used.
  • Figure 3(d) is a schematic diagram of the principle of computing the refocused image by shifting the sub-aperture images of the light field imaging system of the present invention.
  • The refocused image is computed by shifting the sub-aperture images in the manner shown in Figure 3(d).
  • Light field imaging technology makes it possible to refocus the image and to estimate a depth map of the scene.
  • A basic depth range is computed from the light field and combined with the position on the image to determine the position of the drone.
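The sub-aperture and refocusing description above can be sketched with a synthetic 4D light field: a raw plenoptic image is reshaped into L[u, v, x, y] (sub-aperture index (u, v), spatial index (x, y)); summing over (u, v) evaluates equation (1), and shifting each sub-aperture image before summing refocuses to another plane. Array sizes and helper names are illustrative, not from the patent:

```python
# Sketch: decode a raw plenoptic image into a 4D light field and perform
# shift-and-add refocusing over the sub-aperture images. Synthetic data;
# the 5x5-pixel-per-microlens layout is an assumption for illustration.
import numpy as np

def decode_raw(raw, n_u, n_v):
    """Split a raw image of (n_u*X, n_v*Y) pixels, one (n_u, n_v) pixel
    block per microlens, into a 4D light field L[u, v, x, y]."""
    X, Y = raw.shape[0] // n_u, raw.shape[1] // n_v
    lf = raw.reshape(X, n_u, Y, n_v)
    return lf.transpose(1, 3, 0, 2)  # -> (u, v, x, y)

def refocus(lf, alpha):
    """Shift-and-add refocusing: translate each sub-aperture image by an
    amount proportional to its (u, v) offset, then average."""
    n_u, n_v, X, Y = lf.shape
    out = np.zeros((X, Y))
    shift = 1.0 - 1.0 / alpha
    for u in range(n_u):
        for v in range(n_v):
            du = int(round((u - n_u // 2) * shift))
            dv = int(round((v - n_v // 2) * shift))
            out += np.roll(lf[u, v], (du, dv), axis=(0, 1))
    return out / (n_u * n_v)

raw = np.random.rand(5 * 32, 5 * 32)   # 32x32 microlenses, 5x5 px each
lf = decode_raw(raw, 5, 5)
img = refocus(lf, alpha=1.0)           # alpha=1: original focal plane
print(lf.shape, img.shape)             # (5, 5, 32, 32) (32, 32)
```

With alpha = 1 the shift is zero and refocusing reduces to averaging the sub-aperture images, i.e. evaluating equation (1) at the captured focal plane.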
  • Figure 3(c) is an example of a processed light field image.
  • A larger number (μm) in the positive direction means a virtual focal plane closer to the objective lens.
  • The focal plane at the surface of the objective lens is calibrated to 0 μm.
  • In the processed light field image, the upper-left image of Figure 3(c) is the top wire layer, the upper-right image is the middle layer, the lower-left image is the bottom metal layer, and the lower-right image is the all-in-focus image.
  • Autofocus software will be developed to capture all wire images without commanding any mechanical movement of the vertical axis.
  • Real-time AOI software will be developed and used together with the autofocus software.
  • The user interface will display the image taken by the camera and the all-in-focus image, and any detected defects will be marked.
  • Obtaining distance through depth information is simpler than the existing approach of obtaining distance with distance-calibration objects, and it yields clear images and accurate distance information.
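One way the refocusing property can yield depth, as the text describes, is a focus-stack sweep: refocus to several planes and take the plane with the highest sharpness as the depth label. A minimal sketch on synthetic data; the sharpness metric and plane count are assumptions:

```python
# Sketch: depth from a refocus stack. Each image in the stack stands in
# for one refocused plane; the sharpest plane index is the depth label.
import numpy as np

def sharpness(img):
    """Mean squared Laplacian, a simple focus measure."""
    lap = (-4 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return (lap ** 2).mean()

def best_plane(stack):
    """Index of the refocused plane with the highest global sharpness."""
    return int(np.argmax([sharpness(img) for img in stack]))

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))          # high-frequency "in focus" image
blurred = [sharp]
for _ in range(3):                    # progressively box-blur copies
    prev = blurred[-1]
    blurred.append((prev + np.roll(prev, 1, 0) + np.roll(prev, 1, 1)
                    + np.roll(prev, 1, (0, 1))) / 4)
stack = blurred[::-1]                 # sharpest image last
print(best_plane(stack))              # -> 3
```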
  • The microlens array 201 is installed only on the second camera 103 and not on the first camera 102.
  • In designing the parameters of the microlens array 201, both the pixel pitch and the sensor size are influencing factors.
  • The hexagonal microlens array differs slightly from the linear array.
  • Each microlens is 60 μm wide, but there are two microlenses within every 90 μm.
  • The second camera 103 also includes the camera photosensitive element 202 close to the microlens array 201.
  • An example of the photosensitive element 202 is a CMOS or CCD image sensor, which converts the sensed light signal into an electrical signal.
  • The subject passes through the super telephoto lens group 204, 205 to form a virtual image 203; after processing by the microlens array 201, the virtual image 203 forms a high-definition electrical image signal on the photosensitive element 202.
  • Figure 4(a) is a microscope image of the hexagonal microlens array of the second camera 103 of the drone monitoring system of the present invention.
  • The coordinate unit is microns.
  • Figure 4(b) is a white light interferometer measurement of the hexagonal microlens array of the second camera 103 of the drone monitoring system of the present invention.
  • Figures 4(a) and 4(b) measure the same microlens array.
  • In one example, the microlens array uses a linear arrangement; however, a linear arrangement leaves large gaps between the microlenses, and these gaps cannot produce an image.
  • Therefore, the microlens array can instead use a hexagonal arrangement: because each row of a hexagonal microlens array is offset relative to the previous row, the gaps between the microlenses are greatly reduced and the utilization efficiency of the image sensor behind the microlenses is improved.
  • When computing light field information, linear alignment is required to ensure that every sub-image is extracted by the program: horizontal and vertical reference lines are generated so that they pass through the centers of the horizontal and vertical sub-images, after which all sub-images can be captured and the depth information computed.
  • The offset distribution of the microlenses requires a new alignment method so that the software can accurately extract and comparatively analyze the image of each microlens.
  • The hexagonal microlens array pattern is much denser than the linear microlens array pattern.
  • Although the hexagonal microlens array is linearly aligned in the vertical direction, it is translated in the horizontal direction. The drone monitoring system of the present invention can therefore adopt the hexagonal microlens array mode, because the denser hexagonal pattern provides a higher-resolution light field imaging system.
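The hexagonal layout described above (60 μm lenses, each row offset by half a pitch from the previous one) can be sketched by generating lens-centre coordinates, e.g. as a starting point for the reference-line alignment; the ideal-hexagonal row spacing is an assumption, since the patent gives only the 60 μm width:

```python
# Sketch: microlens centre coordinates for a hexagonal array with 60 um
# pitch and half-pitch row offsets. Row spacing from ideal hexagonal
# packing is an assumption.
import math

PITCH_UM = 60.0                             # lens width / horizontal pitch
ROW_DY_UM = PITCH_UM * math.sqrt(3) / 2     # ideal hexagonal row spacing

def hex_centers(n_rows, n_cols):
    """(x, y) centres in micrometres; odd rows shifted by half a pitch."""
    centers = []
    for r in range(n_rows):
        x0 = (PITCH_UM / 2) if r % 2 else 0.0
        for c in range(n_cols):
            centers.append((x0 + c * PITCH_UM, r * ROW_DY_UM))
    return centers

pts = hex_centers(2, 3)
print(pts[:4])
# Two adjacent rows interleave: x positions 0, 60, 120 um in one row and
# 30, 90, 150 um in the next, so a 90 um horizontal window can cover
# centres from both rows, as the text's "two microlenses within 90 um"
# describes.
```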
  • Figure 5 is a working flowchart 500 of the UAV monitoring system of the present invention.
  • In step 501, the drone monitoring system is started and begins monitoring. In step 502, the first camera 102 works continuously and keeps sending the video image information acquired over a large area to the computer workstation; because the first camera 102 is a wide-angle camera, it can scan the protected area and capture high-definition 4K video.
  • Because the scanned area is large, the resolution of the acquired images is low and a clear UAV image cannot be obtained.
  • The video captured by the wide-angle first camera 102 is processed by the computer at all times.
  • When a drone enters the camera's field of view, the computer monitors it with a machine-learning-based monitoring algorithm; when the distance of the detected target from the center of the field of view exceeds a horizontal or vertical threshold, a control signal is generated and sent to the platform controller through a USB interface.
  • In step 503, by comparison with the database stored in the computer workstation, it is determined whether the acquired image matches the shape of a drone in the database; once the target matches, step 504 is entered.
  • When the detected drone is at the center of the camera's field of view, a signal is generated to control the horizontal rotating platform 107 and the vertical rotating platform 106 to rotate to a suitable position, and a photograph is taken with the second camera 103.
  • The photograph contains the light field image of the scene and is then handed to the light field processing software for information processing; that is, the UAV monitoring system 100 drives the second camera 103 with its telephoto lens focused on the locked target to track the suspicious target, thereby obtaining high-resolution images and light field information of the target UAV, and proceeds to step 504. If there is no match, the flow returns to step 502, and the video image information obtained by the first camera 102 is repeatedly sent to the computer workstation.
  • In step 504, after the UAV monitoring system 100 again uses the database to compute and verify this information, the depth and position information of the light field image is computed and a warning is issued to the user or the control tower.
  • Once the high-resolution image is highly similar to drone shape information in the database, an alarm signal is sent to the monitoring system, and the depth and position information of the drone is also sent back to the supervision center.
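The workflow of steps 501-504 above can be sketched as a loop with all hardware and detection functions stubbed out; every function name here is illustrative, not from the patent:

```python
# Sketch of the Figure 5 workflow: the wide-angle camera scans
# continuously; when a frame matches a drone shape in the database, the
# platforms rotate toward the target, the light field camera captures an
# image, depth/position are computed, and an alarm is raised. All
# callables are injected stubs (assumed names).

def monitor(frames, matches_drone, rotate_platforms, capture_light_field,
            depth_and_position, raise_alarm):
    """Run one monitoring pass over an iterable of wide-angle frames."""
    for frame in frames:                    # step 502: continuous scan
        target = matches_drone(frame)       # step 503: database comparison
        if target is None:
            continue                        # no match: keep scanning
        rotate_platforms(target)            # center the target in the view
        lf_image = capture_light_field()    # light field photo of target
        depth, pos = depth_and_position(lf_image)   # step 504
        raise_alarm(depth, pos)             # warn user / control tower

# Minimal stubbed run: the third frame "contains" a drone.
events = []
monitor(
    frames=[0, 1, 2],
    matches_drone=lambda f: ("drone",) if f == 2 else None,
    rotate_platforms=lambda t: events.append(("rotate", t)),
    capture_light_field=lambda: "lf",
    depth_and_position=lambda lf: (120.0, (10.0, 20.0)),
    raise_alarm=lambda d, p: events.append(("alarm", d, p)),
)
print(events)  # [('rotate', ('drone',)), ('alarm', 120.0, (10.0, 20.0))]
```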


Abstract

The invention discloses a drone monitoring system based on light field technology, comprising: a first camera for continuously acquiring image information within a monitoring area; a second camera, which is a light field camera containing a fly-eye lens, for acquiring the light field information of a drone once the acquired image information is judged to be a drone; a vertical rotating platform and a horizontal rotating platform placed perpendicular to each other, under whose control the first camera and the second camera can rotate synchronously; and a computer processor, which calculates the drone's depth information from the acquired light field information so as to obtain the drone's position. The optical drone monitoring system based on three-dimensional light field technology provided by the invention can isolate vibration during monitoring, thereby improving efficiency and accuracy when monitoring or detecting drones.

Description

Optical drone monitoring system based on three-dimensional light field technology

Technical field

The invention belongs to the field of drone monitoring, and in particular relates to a drone monitoring system based on light field technology.
Background

With the development of drone technology, there is broad demand for improved drone monitoring systems. The prior art mostly uses monitoring systems that combine radar and cameras; radar monitoring is easily deceived by stealth technology and performs poorly at low altitude, while cameras generally have low resolution. Chinese invention patent application 201810128587.2 discloses a drone monitoring system and a supervision method in which a software method scans images within an area, uses a first and a second camera to form stereo vision to judge whether a suspicious target exists in the image, calculates the exact position of the suspicious target, and tracks and photographs it; the improvement lies mainly in the software. Chinese invention patent application 201810396720.2 discloses a drone detection method, device and electronic equipment, likewise mainly from a software perspective: it controls the rotation of cameras on a detection platform by sending a rotation command to the turntable motor so that the motor rotates multiple cameras on the turntable by a preset angle; it sends a stop command to the motor so that the turntable stops after rotating the preset angle; once the cameras have been stopped for a preset time, it triggers the cameras to take one shot each to obtain multiple images; it performs image recognition on these images to determine whether a drone is present in the monitored area; and if no drone is present, it repeats the above steps.

A new high-resolution, stable monitoring system is needed that obtains clear stereo images, thereby improving efficiency and accuracy when monitoring or detecting drones.
Summary of the invention

The purpose of the invention is to provide a new high-resolution, stable monitoring system that obtains clear stereo images, thereby improving efficiency and accuracy when monitoring or detecting drones.

The invention discloses a drone monitoring system based on light field technology, in which a first camera is used to continuously acquire image information within a monitoring area; a second camera, which is a light field camera containing a fly-eye lens, is used to acquire the light field information of a drone once the acquired image information is judged to be a drone; a vertical rotating platform and a horizontal rotating platform are placed perpendicular to each other, the vertical rotating platform controlling the first and second cameras to rotate in the direction perpendicular to the horizontal rotating platform, and the horizontal rotating platform controlling them to rotate in the horizontal direction, so that the first camera and the second camera rotate synchronously under the control of the two platforms; and a computer processor is used to analyze and judge whether the acquired image information in the monitoring area is a drone, and to calculate the drone's depth information from the acquired light field information so as to obtain the drone's position.

In one aspect of the invention, the vertical rotating platform limits the rotation of the first and second cameras to an elevation angle of 15-45 degrees, and the horizontal rotating platform limits their rotation to 0-120 degrees. The second camera further includes a telephoto lens group with a diagonal field of view of about 1° and a camera photosensitive element; the fly-eye lens is arranged between the telephoto lens group and the photosensitive element, close to the photosensitive element. The fly-eye lens is a microlens array, arranged either linearly or hexagonally; in the hexagonal arrangement, each row is offset relative to the previous row. Each microlens of the hexagonal microlens array is 60 μm wide, but there are two microlenses within every 90 μm.

The light field image I(x,y) can be expressed by the formula:
I(x,y) = ∬ L_F(u,v,x,y) du dv       (1)
where L_F(u,v,x,y) denotes the light traveling along the ray that intersects the main lens at (u,v) and the microlens plane at (x,y), with the full aperture used. Figure 3(d) is a schematic diagram of how the light field imaging system of the invention computes the refocused image by shifting sub-aperture images; the refocused image is computed by shifting the sub-aperture images in the manner shown in Figure 3(d):
The shifted light field function can be expressed as:

L_{F'}(u,v,x,y) = L_F(u, v, u + (x-u)/α, v + (y-v)/α)       (2)
The optical drone monitoring system based on three-dimensional light field technology provided by the invention can isolate vibration during monitoring, thereby improving efficiency and accuracy when monitoring or detecting drones.
Brief description of the drawings

To explain the technical solutions in the embodiments of the invention more clearly, the drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below are only some examples of the invention; a person of ordinary skill in the art can obtain other drawings from them without inventive effort.
Figure 1 is a schematic structural diagram of the drone monitoring system of the invention.

Figure 2 is a schematic structural diagram of the second camera of the drone monitoring system of the invention.

Figures 3(a) and 3(b) are schematic diagrams of the light field imaging system of the invention.

Figure 3(c) is an example of a processed light field image.

Figure 3(d) is a schematic diagram of computing a refocused image by shifting sub-aperture images in the light field imaging system of the invention.

Figure 4(a) is a microscope image of the hexagonal microlens array of the second camera 103 of the drone monitoring system of the invention.

Figure 4(b) is a white light interferometer measurement of the hexagonal microlens array of the second camera 103 of the drone monitoring system of the invention.

Figure 5 is a working principle diagram of the drone monitoring system of the invention.
Detailed description

Specific embodiments of the invention are now described with reference to the corresponding drawings. The invention may, however, be implemented in many different forms and should not be construed as limited to the embodiments presented here; these embodiments are provided only so that the disclosure can be thorough and complete and so that the scope of the invention can be fully conveyed to those skilled in the art. The wording used in the detailed description of the embodiments illustrated in the drawings should not limit the invention.
Figure 1 is a schematic structural diagram of the drone monitoring system 100 of the invention. The system 100 includes a chassis 101 for housing a processor and a rotating platform controller, a first camera 102, a second camera 103, a telephoto lens 104, a wide-angle lens 105, a vertical rotating platform 106, a horizontal rotating platform 107, a vertical support panel 108, and a horizontal support plate 109. The first camera 102 and the second camera 103 are both high-resolution cameras; their axes are parallel to each other, and they are fixed on the horizontal support plate 109 by a first fixing rod 110. The vertical rotating platform 106 is fixed on one side of the horizontal support plate 109; its axle is fixed to the vertical support panel 108, its axis is horizontal, and the platform can rotate about this axis in the direction of arrow A or in the opposite direction. Since the vertical rotating platform 106, the horizontal support plate 109 and the first fixing rod 110 are fixed to one another, when the vertical rotating platform 106 rotates about its axis it drives the first fixing rod 110 fixed on the horizontal support plate 109, and thus moves the first camera 102 and the second camera 103 at the same time; that is, the two cameras rotate synchronously about the vertical rotating platform in the direction of arrow A or the opposite direction. In one embodiment, the rotation range may be an elevation angle of 15-45 degrees. Further, the vertical support panel 108 is fixed on a second fixing rod 111, and the second fixing rod 111 is fixed on the horizontal rotating platform 107. As shown in Figure 1, driven by the second fixing rod 111, the first camera 102, the second camera 103, the telephoto lens 104, the wide-angle lens 105, the vertical rotating platform 106, the vertical support panel 108 and the horizontal support plate 109 can all rotate about the axis of the horizontal rotating platform 107 in the direction of arrow B or the opposite direction; the axis of the horizontal rotating platform 107 is perpendicular to its plane of rotation. Likewise, since the second fixing rod 111 and the horizontal rotating platform 107 are fixed to each other, when the second fixing rod 111 rotates on the horizontal rotating platform 107 in the direction of arrow B or the opposite direction, it simultaneously drives the first camera 102 and the second camera 103, so that the two cameras rotate synchronously about the horizontal rotating platform in the direction of arrow B or the opposite direction. In one embodiment, the rotation range may be an elevation angle of 15-45 degrees; in another embodiment, the rotation range may be the range allowed by the horizontal rotating platform, particularly preferably 0-120 degrees.
The horizontal rotating platform 107 is a vibration-free optical platform placed on the chassis 101, which houses the computer workstation including the processor and the rotating platform controller. The horizontal rotating platform 107 provides a horizontal plane; the camera system fixed on it can track the drone while remaining relatively fixed and undisturbed by vibration.
The two high-resolution cameras 102 and 103 may use the same camera body, for example an ultra-high-definition 4K (4096×2160 pixel) camera, or different camera bodies. The first camera 102 is a wide-angle camera: it covers a wide shooting range, exaggerates the sense of distance, and has a wide focus range. With a wide angle, nearby objects appear larger and distant objects smaller, distorting the image toward the edges. The wide angle also allows any point in the image to be brought to the most appropriate focal distance, making the picture clearer; this may be called full autofocus. The wide-angle first camera 102 may use a camera whose individual pixel size on the photosensitive element is in the range of 1-8 microns, and it includes a wide-angle or diagonal lens. The second camera 103 is an ultra-long-range camera: it tracks the flying drone with a super telephoto lens, which uses a lens in the 100-2200 mm focal length range. The detailed structure of the second camera 103 is described below. The computer workstation is located inside the chassis 101 of the optical drone monitoring system 100; it processes the collected information, monitors the flight of the drone, and gives timely alarms. The chassis 101 mainly protects the optical drone monitoring system 100 outdoors, for example at an airport.
Figure 2 is a schematic structural diagram of the second camera 103 of the drone monitoring system of the invention. The microlens array 201 is placed between the photosensitive element 202 of the second camera 103 (i.e. the light field camera) and the camera lenses 204 and 205. When a picture captured by the second camera 103 is enlarged on a display, it can be seen to consist of circular sub-images equally spaced horizontally and vertically; each sub-image corresponds to one microlens. The second camera 103 may be a light field camera and further includes a super telephoto lens group 204, 205 with a diagonal field of view of about 1°; the super telephoto lens group 204, 205 may use lenses in the 100-2200 mm focal length range. In one embodiment, the second camera 103 further includes the microlens array 201, whose presence makes the second camera 103 a compound-eye camera, for example an ultra-high-definition (UHD) 4K compound-eye camera; the microlens array 201 is designed according to the image sensor specification and the microscope optical path. A camera equipped with a compound-eye lens is operated in the same way as an ordinary camera; if the captured picture is enlarged before processing, it can be seen that the whole image is composed of the small images formed by each lenslet. After processing by light field computation software, the picture can be refocused onto different focal planes; it is precisely this refocusing property that lets a light field image yield the depth information of the captured scene. Once the compound-eye lens is mounted in front of the camera's photosensitive element, images recording light field information can be captured.
FIGS. 3(a) and 3(b) are schematic diagrams of the light field imaging system of the present invention, showing the mechanism of a light field imaging system with a microlens array 302 in front of a CMOS sensor 301. In FIG. 3(a), all rays through a pixel pass through its parent microlens and through a conjugate square (sub-aperture) on the main lens 303. In FIG. 3(b), all rays through a sub-aperture are focused onto corresponding pixels under different microlenses; these pixels form the photograph seen through that sub-aperture.
The light field image I(x, y) can be expressed by the formula:
I(x,y) = ∫∫ L_F(u,v,x,y) du dv       (1)
where L_F(u, v, x, y) denotes the light travelling along the ray that intersects the main lens at (u, v) and the microlens plane at (x, y), the full aperture being used. FIG. 3(d) is a schematic diagram of computing a refocused image in the light field imaging system of the present invention by shifting sub-aperture images; the refocused image is computed by shifting the sub-aperture images in the manner shown in FIG. 3(d):
The shifted light field function can be expressed as
Figure PCTCN2020079089-appb-000002
Light field imaging technology allows the image to be refocused and the depth map of the scene to be estimated. A basic depth range is computed from the light field and combined with the position in the image to determine the position of the drone.
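The refocusing step described above can be sketched as shift-and-add over the sub-aperture views: each view (u, v) is shifted by (1 − 1/α)·(u, v) and the views are averaged. The sketch below, which is illustrative only and restricted to integer pixel shifts, follows that convention; α = 1 reduces to the plain sum of equation (1):

```python
import numpy as np

def refocus(subviews, alpha):
    """Shift-and-add refocusing of a light field (integer-shift sketch).

    subviews: dict mapping angular coordinates (u, v) -> 2-D sub-aperture image.
    alpha: relative depth of the virtual focal plane; each view is shifted
           by (1 - 1/alpha) * (u, v) before the views are averaged.
    """
    acc = None
    for (u, v), img in subviews.items():
        du = int(round((1 - 1 / alpha) * u))
        dv = int(round((1 - 1 / alpha) * v))
        shifted = np.roll(np.roll(img, du, axis=0), dv, axis=1)
        acc = shifted if acc is None else acc + shifted
    return acc / len(subviews)

# With alpha = 1 no view is shifted, so refocusing a constant light field
# returns the same constant image.
views = {(u, v): np.full((4, 4), 2.0) for u in (-1, 0, 1) for v in (-1, 0, 1)}
flat = refocus(views, 1.0)
```

Scene features at the depth matched by α line up across the shifted views and appear sharp; features at other depths are averaged into blur, which is what makes depth estimation from the refocused stack possible.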
For semiconductor manufacturing in chip-on-board applications, the compound eye can be used to find the maximum loop height of aluminium bonding wires, the first bond height on the chip and the second bond height on the substrate. FIG. 3(c) shows an example of a processed light field image. In FIG. 3(c), a larger positive value (μm) means a virtual focal plane closer to the objective lens; the focal plane at the objective surface is calibrated to 0 μm. The upper-left image of FIG. 3(c) is the top wire layer, the upper-right image is the middle layer, the lower-left image is the bottom metal layer, and the lower-right image is the all-in-focus image. Autofocus software will be developed to capture all wire images without commanding any mechanical motion of the vertical axis, and real-time AOI software will be developed and used together with the autofocus software. The user interface will display the image taken by the camera and the all-in-focus image, with any detected defects marked.
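Picking the bond-wire heights "without any mechanical motion of the vertical axis" amounts to depth-from-focus over the refocused stack: score each virtual focal plane with a sharpness measure and keep the best one. The sketch below is illustrative, using plain variance as the sharpness measure (production systems typically use a Laplacian-based metric):

```python
import numpy as np

def sharpest_plane(stack):
    """Return the focal-plane label whose refocused image has the highest
    contrast.

    stack: dict mapping a focal-plane label (e.g. micrometres, as in
           FIG. 3(c)) -> 2-D refocused image.
    """
    return max(stack, key=lambda k: float(np.var(stack[k])))

# A high-contrast checkerboard stands in for an in-focus plane,
# a flat image for an out-of-focus one.
sharp = np.indices((8, 8)).sum(axis=0) % 2
blurred = np.full((8, 8), 0.5)
stack = {0: blurred, 50: sharp, 100: blurred}
best = sharpest_plane(stack)
```

Running the same selection per image patch, rather than per whole image, yields a depth map such as the one used to separate the wire layers in FIG. 3(c).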
Compared with the existing approach of obtaining distance through calibration targets, obtaining distance through depth information is simpler and yields both a clear image and accurate distance information.
Accordingly, the microlens array 201 is mounted only in the second camera 103 and not in the first camera 102. In designing the parameters of the microlens array 201, both the pixel pitch and the sensor size are influencing factors. There are two microlens array alignment strategies: linear and hexagonal. A hexagonal microlens array differs slightly from a linear one: each microlens is 60 μm wide, but there are two microlenses within 90 μm. The second camera 103 further includes the camera photosensitive element 202 close to the microlens array 201. Examples of the photosensitive element 202 are a CMOS image sensor or a charge-coupled device (CCD) image sensor, which converts the sensed light signal into an electrical signal. When the second camera 103 detects a photographed object 206, the object forms a virtual image 203 through the super-telephoto lens group 204, 205; after processing by the microlens array 201, the virtual image 203 forms a high-definition image signal in the photosensitive element 202.
FIG. 4(a) is a microscope image of the hexagonal microlens array of the second camera 103 of the drone monitoring system of the present invention; the coordinate unit is micrometres. FIG. 4(b) is a white-light interferometer measurement of the same microlens array as FIG. 4(a). In one example, the microlens array adopts a linear arrangement; however, a linear arrangement leaves large gaps between microlenses, and these gaps produce no image. Therefore, in another example, the microlens array may adopt a hexagonal arrangement, in which each row is staggered relative to the previous row; this greatly reduces the gaps between microlenses and improves the utilization efficiency of the image sensor behind them. When computing light field information, linear alignment is needed to ensure that every sub-image is extracted by the program: horizontal and vertical reference lines are generated through the centres of the horizontal and vertical sub-images, after which all sub-images can be grabbed and the depth information computed. For the hexagonal microlens array shown in FIG. 4(a), the staggered distribution of the microlenses requires a new alignment method so that the software can accurately extract and comparatively analyse the image of each microlens. As shown in FIG. 4(a), the hexagonal microlens array pattern is much denser than the linear pattern; although the hexagonal array is linearly aligned vertically, it is translated horizontally. The drone monitoring system of the present invention may therefore adopt the hexagonal microlens array pattern, because the denser hexagonal pattern provides a higher-resolution light field imaging system.
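The staggered geometry described above can be sketched numerically: with a 60 μm pitch, odd rows are shifted by half a pitch, so lens centres from neighbouring rows fall 30 μm apart horizontally and two centres land within a 90 μm span. The sketch below is purely illustrative of that layout:

```python
def hex_centres(rows, cols, pitch=60.0):
    """Centre coordinates (x, y), in micrometres, of a hexagonally packed
    microlens array: every other row is shifted by half the pitch, and rows
    are spaced at pitch * sqrt(3) / 2, the standard hexagonal row spacing."""
    row_step = pitch * 3 ** 0.5 / 2
    centres = []
    for r in range(rows):
        x0 = pitch / 2 if r % 2 else 0.0   # stagger odd rows by half a pitch
        centres.extend((x0 + c * pitch, r * row_step) for c in range(cols))
    return centres

pts = hex_centres(2, 3)   # two rows of three lenses
```

Sub-image extraction software must use centres like these, rather than a single rectangular grid, which is why the hexagonal array needs the new alignment method mentioned above.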
FIG. 5 is a work flow chart 500 of the drone monitoring system of the present invention. In step 501, the drone monitoring system is started and monitoring begins. In step 502, the first camera 102 operates continuously and keeps sending the video image information acquired over a large area to the computer workstation; since the first camera 102 is a wide-angle camera, it can scan the protected area and capture high-definition 4K video. At the same time, because the scanned area is large, the acquired image resolution is low and a clear image of the drone cannot be obtained. The video captured by the wide-angle first camera 102 is processed continuously by the computer. When a drone enters the camera's field of view, the computer detects the suspected drone through machine-learning-based target detection, locates its position in the field of view, and compares it with the centre of the field of view; if the horizontal or vertical threshold is exceeded, a control signal is generated and delivered to the platform controller through a USB interface. In step 503, by comparison with the database stored in the computer workstation, it is determined whether the acquired image matches the shape of a drone in the database. Once the target matches, the process proceeds to step 504: when the monitored drone is at the centre of the camera's field of view, a signal is generated to rotate the horizontal rotating platform 107 and the vertical rotating platform 106 to the appropriate position, and a photograph is taken by the second camera 103; this photograph contains the light field image of the scene and is then handed to the light field processing software. That is, the drone monitoring system 100 drives the second camera 103, tracking the suspicious target with the telephoto lens focused on the locked target, thereby obtaining a high-resolution image and light field information of the target drone, and proceeds to step 504. If there is no match, the process returns to step 502 and the video image information obtained by the first camera 102 continues to be sent to the computer workstation. In step 504, after the drone monitoring system 100 again uses the database to compute and verify this information, it computes the depth and position information from the light field image and issues a warning to the user or the control tower. In other words, once the high-resolution image is highly similar to drone shape information in the database, an alarm signal is sent to the monitoring system, and the depth and position information of the drone is also sent back to the supervision centre.
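The centring decision in step 502 can be sketched as follows: compare the detected drone's centroid against the frame centre and emit a pan/tilt correction only when either offset exceeds a threshold. The threshold and the pixels-to-degrees gain below are illustrative assumptions, not values from the patent:

```python
def pan_tilt_command(target_xy, frame_wh, threshold_px=20, gain_deg_per_px=0.05):
    """Return (pan, tilt) corrections in degrees for the rotating platforms.

    A correction is issued on an axis only when the target's offset from the
    frame centre exceeds threshold_px on that axis; otherwise 0.0 is returned
    and the platform holds its position.
    """
    cx, cy = frame_wh[0] / 2, frame_wh[1] / 2
    dx, dy = target_xy[0] - cx, target_xy[1] - cy
    pan = gain_deg_per_px * dx if abs(dx) > threshold_px else 0.0
    tilt = gain_deg_per_px * dy if abs(dy) > threshold_px else 0.0
    return pan, tilt

# A drone well left of centre in a 4K frame, but vertically centred,
# yields a pan correction and no tilt correction.
cmd = pan_tilt_command((1400, 1085), (4096, 2160))
```

In a full controller the resulting angles would additionally be clamped to the platform limits stated above (elevation 15-45 degrees, horizontal 0-120 degrees).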
References herein to "one embodiment", "an embodiment" or "one or more embodiments" mean that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Note also that instances of the phrase "in one embodiment" herein do not necessarily all refer to the same embodiment.
The foregoing is intended only to illustrate the technical solution of the present invention; any person of ordinary skill in the art may modify and vary the above embodiments without departing from the spirit and scope of the present invention. The scope of protection of the present invention shall therefore be determined by the claims. The present invention has been described above with reference to examples; however, embodiments other than those described above are equally possible within the scope of this disclosure. The different features and steps of the invention may be combined in ways other than those described. The scope of the invention is limited only by the appended claims. More generally, a person of ordinary skill in the art will readily appreciate that all parameters, dimensions, materials and configurations described herein are for exemplary purposes, and that the actual parameters, dimensions, materials and/or configurations will depend on the specific application or applications for which the teachings of the invention are used.

Claims (8)

  1. A drone monitoring system based on light field technology, comprising:
    a first camera for continuously acquiring image information within a monitored area; and
    a second camera, the second camera being a light field camera containing a compound-eye lens, for acquiring light field information of a drone when the acquired image information is determined to be a drone; and
    a vertical rotating platform and a horizontal rotating platform placed perpendicular to each other;
    the vertical rotating platform being used to control the first camera and the second camera to rotate in a direction perpendicular to the horizontal rotating platform; and
    the horizontal rotating platform being used to control the first camera and the second camera to rotate in the horizontal direction;
    wherein the first camera and the second camera can rotate synchronously under the control of the vertical rotating platform and the horizontal rotating platform; and
    a computer processor for analysing whether the acquired image information of the monitored area shows a drone, and for computing depth information of the drone from the acquired light field information so as to obtain the position of the drone.
  2. The drone monitoring system of claim 1, wherein the vertical rotating platform controls the rotation of the first camera and the second camera within an elevation range of 15-45 degrees.
  3. The drone monitoring system of claim 1, wherein the horizontal rotating platform controls the rotation of the first camera and the second camera within a range of 0-120 degrees.
  4. The drone monitoring system of any one of claims 1-3, wherein the second camera further comprises a telephoto lens group with a diagonal of about 1° and a camera photosensitive element, the compound-eye lens being arranged between the telephoto lens group and the camera photosensitive element and close to the camera photosensitive element.
  5. The drone monitoring system of claim 4, wherein the compound-eye lens is a microlens array.
  6. The drone monitoring system of claim 5, wherein the microlens array is arranged linearly or hexagonally, each row of the hexagonally arranged microlens array being staggered relative to the previous row.
  7. The drone monitoring system of claim 6, wherein each microlens of the hexagonal microlens array is 60 μm wide, but there are two microlenses within 90 μm.
  8. The drone monitoring system of claim 1, wherein the light field information I(x, y) can be expressed by the formula:
    I(x,y) = ∫∫ L_F(u,v,x,y) du dv   (1)
    where L_F(u, v, x, y) denotes the light travelling along the ray that intersects the main lens at (u, v) and the microlens plane at (x, y), the full aperture being used; and a refocused image is computed by shifting the sub-aperture images,
    the shifted light field function being expressed as:
    Figure PCTCN2020079089-appb-100001
PCT/CN2020/079089 2019-04-08 2020-03-13 Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology WO2020207185A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/601,819 US11978222B2 (en) 2019-04-08 2020-03-13 Three-dimensional light field technology-based optical unmanned aerial vehicle monitoring system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910277372.1A CN111800588A (zh) 2019-04-08 2019-04-08 基于三维光场技术的光学无人机监测***
CN201910277372.1 2019-04-08

Publications (1)

Publication Number Publication Date
WO2020207185A1 true WO2020207185A1 (zh) 2020-10-15

Family

ID=72751533

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/079089 WO2020207185A1 (zh) 2019-04-08 2020-03-13 基于三维光场技术的光学无人机监测***

Country Status (3)

Country Link
US (1) US11978222B2 (zh)
CN (1) CN111800588A (zh)
WO (1) WO2020207185A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113280754A (zh) * 2021-07-22 2021-08-20 清华大学 High-precision depth computation device and method
US11954842B2 (en) * 2021-09-06 2024-04-09 Huaneng Nanjing Jinling Power Generation Co., Ltd. Powder leakage monitoring device and powder leakage monitoring method
CN114857456B (zh) * 2022-07-08 2022-10-04 北京神导科技股份有限公司 Concentric-shaft dual-axis gimbal system, its power supply link, and angle adjustment control method
CN116002093B (zh) * 2022-12-13 2024-02-20 自然资源部第二航测遥感院 Aerial photography platform for unmanned aerial vehicles

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101547344A (zh) * 2009-04-24 2009-09-30 清华大学深圳研究生院 Video surveillance device based on linked cameras and tracking and recording method thereof
CN102528821A (zh) * 2011-12-31 2012-07-04 北京工业大学 Bionic multi-eye vision physical platform based on multiple independent gimbals
CN105635530A (zh) * 2014-11-03 2016-06-01 北京蚁视科技有限公司 Light field imaging system
CN106707296A (zh) * 2017-01-09 2017-05-24 华中科技大学 UAV detection and recognition method based on a dual-aperture photoelectric imaging system
CN107578437A (zh) * 2017-08-31 2018-01-12 深圳岚锋创视网络科技有限公司 Light-field-camera-based depth estimation method, system and portable terminal
US20180259342A1 (en) * 2017-03-10 2018-09-13 Qualcomm Incorporated System and method for dead reckoning for a drone
CN210327775U (zh) * 2019-04-08 2020-04-14 深圳市视觉动力科技有限公司 Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201519634D0 (en) * 2015-11-06 2015-12-23 Squarehead Technology As UAV detection
US20180364741A1 (en) * 2016-03-11 2018-12-20 Raytheon Bbn Technologies Corp. Human indication of target drone for interception
CN107991838B (zh) * 2017-11-06 2020-10-23 万维科研有限公司 Adaptive three-dimensional stereo imaging system
CN108447075B (zh) * 2018-02-08 2020-06-23 烟台欣飞智能系统有限公司 UAV monitoring system and monitoring method thereof
US10609266B2 (en) * 2018-08-21 2020-03-31 Rockwell Automation Technologies, Inc. Camera for wide field of view with an arbitrary aspect ratio
US11237572B2 (en) * 2018-12-27 2022-02-01 Intel Corporation Collision avoidance system, depth imaging system, vehicle, map generator and methods thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIAOJIE XIE ET AL.: "A Target Depth Measurement Method Based on Light Field Imaging", 2016 9th International Symposium on Computational Intelligence and Design (ISCID), 26 January 2017, XP033049977, ISSN: 2473-3547 *

Also Published As

Publication number Publication date
US11978222B2 (en) 2024-05-07
US20220172380A1 (en) 2022-06-02
CN111800588A (zh) 2020-10-20

Similar Documents

Publication Publication Date Title
WO2020207185A1 (zh) Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology
CN105744163B (zh) Camera and imaging method with depth-information-based focus tracking
CN107659774B (zh) Video imaging system based on a multi-scale camera array and video processing method
CN106303228B (zh) Rendering method and system for a focused light field camera
US8049776B2 (en) Three-dimensional camcorder
WO2019113966A1 (zh) Obstacle avoidance method and apparatus, and unmanned aerial vehicle
JP2020506487A (ja) シーンから深度情報を取得するための装置および方法
CN110132226B (zh) System and method for measuring distance and azimuth for UAV line inspection
US20180184077A1 (en) Image processing apparatus, method, and storage medium
CN103971375A (zh) Spatial calibration method for a panoramic staring camera based on image stitching
US20190191082A1 (en) Image processing apparatus, image processing method, storage medium, system, and electronic apparatus
CN107578450B (zh) Method and system for calibrating panoramic camera assembly errors
CN104079916A (zh) Panoramic three-dimensional vision sensor and method of use
WO2020207172A1 (zh) Optical unmanned aerial vehicle monitoring method and system based on three-dimensional light field technology
US20210364288A1 (en) Optical measurement and calibration method for pose based on three linear array charge coupled devices (ccd) assisted by two area array ccds
KR20200093004A (ko) 웨어러블 디바이스를 테스트하기 위한 방법 및 시스템
CN110602376B (zh) Snapshot method and apparatus, and camera
CN102595178A (zh) Field-of-view-stitched three-dimensional display image correction system and correction method
CN210327775U (zh) Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology
CN115601437A (zh) Dynamic convergence binocular stereo vision system based on target recognition
Li et al. A cooperative camera surveillance method based on the principle of coarse-fine coupling boresight adjustment
WO2023098362A1 (zh) Target-area security and surveillance system based on a gigapixel camera
WO2023197841A1 (zh) Focusing method, imaging device, unmanned aerial vehicle and storage medium
CN113596441B (zh) Optical axis adjustment device, method and system, and readable storage medium
Lu et al. Image-based system for measuring objects on an oblique plane and its applications in 2-D localization

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20788005

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20788005

Country of ref document: EP

Kind code of ref document: A1