CN109255820A - Active perception apparatus and method based on an unmanned surface vessel - Google Patents

Active perception apparatus and method based on an unmanned surface vessel

Info

Publication number
CN109255820A
CN109255820A (application CN201811011740.XA)
Authority
CN
China
Prior art keywords
coordinate system
camera
unit
cell
video camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811011740.XA
Other languages
Chinese (zh)
Inventor
胡钊政
游继安
穆孟超
黄刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University of Technology WUT
Original Assignee
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University of Technology WUT filed Critical Wuhan University of Technology WUT
Priority to CN201811011740.XA priority Critical patent/CN109255820A/en
Publication of CN109255820A publication Critical patent/CN109255820A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions
    • G05D 1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D 1/0253 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions
    • G05D 1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0257 - Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions
    • G05D 1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0276 - Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30204 - Marker
    • G06T 2207/30208 - Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses an active perception apparatus and method based on an unmanned surface vessel (USV). The apparatus comprises: a USV navigation control unit, a shipborne lidar device, a camera calibration unit, a data acquisition and display unit, a data transmission unit, a shore-based server control unit, and a shore-based target detection unit. The shore-based target detection unit determines the position of a target after detecting it in a complex water area; the USV navigation control unit controls the USV to sail to the vicinity of the target to be observed; the shipborne lidar device detects the specific location of the target; the camera calibration unit calibrates all Cell groups with the camera; the data acquisition and display unit captures clear images of the target; and the shore-based server control unit stores the image data acquired by the data acquisition and display unit. The invention can greatly improve the clarity of captured images and video.

Description

Active perception apparatus and method based on an unmanned surface vessel
Technical field
The present invention relates to computer vision technology, and more particularly to an active perception apparatus and method based on an unmanned surface vessel.
Background art
With the continuous development of science and technology, the shipping industry plays an increasingly important role in the national economy, and recreational activity on rivers, lakes and seas is growing rapidly; as a result, the level of automated safety monitoring in complex water areas needs to be improved. The navigation environment keeps deteriorating and the safety of waterways is becoming more prominent, so improving the supervision and monitoring of complex water areas is particularly important. The common monitoring mode at present is fixed-point monitoring, in which monitoring stations are installed at fixed locations in the water area, mainly using video cameras, radar and lidar. Radar monitoring has a wide range of applications but suffers from an inherent shortcoming: it fails in rain, fog and haze; camera monitoring has similar problems. Lidar monitoring reaches less than 100 meters, and the monitoring distance of a video camera is only 2-4 kilometers; as the distance increases, the camera's monitoring effect degrades. Radar monitoring can only detect the motion track of a target and cannot obtain images or video of it, i.e. it cannot provide intuitive monitoring. Combining automated monitoring modes (camera, radar, infrared and AIS monitoring) with an unmanned surface vessel to realize autonomous monitoring of complex water areas can solve the above problems very well. Moreover, in dangerous environments where it is inconvenient for people to be present, such as rough seas or nuclear-contaminated areas, the advantages of USV monitoring can be fully exploited. USV monitoring not only saves labor costs but also increases the freedom of monitoring, enabling uninterrupted 24-hour monitoring; even when the target is relatively far away, clear images and video of the target can still be obtained.
At present, most video images on the market are captured passively, while active perception is attracting increasing attention. The authors of the present invention are the first to propose combining an active camera, a lidar and a USV to realize target monitoring in complex water areas. First, the viewing-angle scene of the lidar is divided into multiple Cell units, and every Cell group is calibrated with the active camera; second, the position of a target in the complex water area is detected with a high-power radar and AIS; the USV then sails to the vicinity of the target, and after the shipborne lidar detects the target appearing in a Cell unit, a clear image of the target is captured and transmitted back to the shore-based server.
At present, image-capturing methods and apparatus mainly include the following types:
A. A patent of Nantong Shipping Vocational and Technical College: an intelligent video monitoring system guided by navigation radar or AIS tracking parameters (publication No. CN104184990A). The disclosed system comprises: a radar or AIS signal receiving module, a data processing module, an automatic control module, a computer, and a video monitoring module. Based on a local station or a remote networked video station driven by radar system parameters, ARPA tracking parameters or AIS parameters, the invention realizes a joint monitoring method for maritime patrol and maritime administration. After receiving an order, each video station uses an attitude and heading reference system and an angular position sensor to feed back the optical-axis azimuth and pitch angle, realizing closed-loop control; the control method adopts a two-parameter PID closed-loop control of target angular velocity and angular position, and the compound control of parameter filtering, disturbance observer and feedback control in the algorithm further improves the smoothness of pan-tilt control. The invention first locates the target with radar or AIS and then tracks and photographs it with the video station equipment, the tracking device being mainly a camera with a pan-tilt. The advantage of this method is that radar or AIS data are combined with video monitoring for "joint shooting", so the coverage is wide; however, the camera is not used to track and photograph the target object itself, and neither the imaging parameters for the target object nor the algorithm for tracking the target's path are specified, so high-resolution images and video of the target object cannot be captured. Although the method shoots the object with an active camera, it only exploits the active camera's ability to change its shooting attitude; it does not subdivide the target area or calibrate it, so the internal and external parameters of the camera cannot be adjusted, clear images of a specific position of the target object cannot be captured, and that specific position cannot be tracked. Moreover, when this method is used to photograph a target, the shooting platform is fixed on shore and cannot move, so the target cannot be observed at close range.
B. A patent of a research institute of China Electronics Technology Group Corporation: a CCTV ship-video smooth tracking method (publication No. CN105430326A). The disclosed method acquires information such as the longitude and latitude, course, speed, length and height of the monitored ship and, combined with the installation longitude and latitude of the CCTV camera, its height above the water surface and its zero-azimuth angle, calculates the horizontal deflection angle, vertical deflection angle and focal length of the camera PTZ, rotating the platform at an appropriate speed along the projected direction of the ship's course. Each time AIS or radar data arrive, the relative bearing between the camera and the monitored ship is checked: if the camera is ahead it decelerates or stops, and if it lags it accelerates. This continuously rotating scheme keeps the pan-tilt rotating in the same direction during monitoring, reduces the jitter of the monitored picture, and prevents the loss of fast, close-range targets. It finally realizes tracking of the ship by the CCTV monitoring camera, keeps the ship at the same proportion of the monitored picture, and makes the tracking picture smooth with little shake. The feature of that invention is solving the jitter problem during shooting; however, it cannot capture high-definition images of the target, nor does it set the tracking speed of the video according to the speed of the target object. The method locates the target with sensors and then tracks it, but the shooting point is fixed, so clear images of the target cannot be captured.
Summary of the invention
The technical problem to be solved by the present invention is to provide an active perception apparatus and method based on an unmanned surface vessel, addressing the defects in the prior art.
The technical solution adopted by the present invention to solve the technical problem is as follows:
An active perception device based on an unmanned surface vessel, comprising: a USV navigation control unit, a shipborne lidar device, a camera calibration unit, a data acquisition and display unit, a data transmission unit, a shore-based server control unit, and a shore-based target detection unit;
The shore-based target detection unit determines the position of a target after detecting it in the complex water area;
The USV navigation control unit receives, according to the position information from the shore-based target detection unit, the control signal from the shore-based server control unit and controls the USV to sail to the vicinity of the target to be observed;
The shipborne lidar device detects the specific location of the target and determines the number of the Cell unit in which the target lies; the Cell units are the constituent units of the Cell area, the Cell area being the viewing-angle scene area of the lidar divided into n (n ≥ 2) Cell units; the area of each Cell unit is greater than or equal to the minimum observation range of the camera, and the Cell units are numbered;
The camera calibration unit calibrates all Cell groups with the shipborne active-perception PTZ (pan-tilt-zoom) camera mounted on a pan-tilt, and the calibration data are stored in the data acquisition and display unit on the USV; the calibration data include the relative position of the camera and the Cell group and the corresponding calibration parameters; a Cell group is any single Cell unit in the Cell area, or the combination of any Cell unit with its adjacent Cell units;
The data acquisition and display unit captures clear images of the target, stores the acquired images of the target in the unit's database, and also stores the calibration data of the camera calibration unit;
The shore-based server control unit communicates with the data acquisition and display unit through the data transmission unit, stores the image data acquired by the data acquisition and display unit, and also receives and transmits USV navigation control signals.
According to the above scheme, the calibration of the active camera in the camera calibration unit is specifically as follows:
Each camera in the camera unit is calibrated separately for each Cell group to obtain the intrinsic and extrinsic parameters of the camera. The attitude and angle values of the pan-tilt when each camera is aimed at each Cell group are recorded, and the camera is then calibrated with a chessboard; the calibration method is as follows:
1) Establish the camera coordinate system and the image coordinate system
Two basic coordinate systems are first established, the image coordinate system and the camera coordinate system. The image coordinate system is a rectangular coordinate system in pixels with its origin at the upper-left corner of the image; an image point is denoted [X_P, Y_P]^T. The camera coordinate system takes the camera optical center as its origin and the optical axis as its z-axis; its x- and y-axes are established along the x and y directions of the image coordinate system, the coordinate system obeys the right-hand rule, and a point is denoted [X_C, Y_C, Z_C]^T.
2) Establish the relationship between the two coordinate systems
Since the camera satisfies the pinhole model, the following relationship holds:
λ [X_P, Y_P, 1]^T = K [X_C, Y_C, Z_C]^T
where λ is a scale factor and K is the camera intrinsic matrix, obtained by Zhang Zhengyou's calibration method;
3) Establish the world coordinate system
For convenience of calculation, a world coordinate system is introduced and its relationship with the above two coordinate systems is established. In the world coordinate system, the x- and y-axes are set along the length and width directions of the chessboard, and the z-axis is perpendicular to the chessboard plane; a point in the world coordinate system is denoted [X_W, Y_W, Z_W]^T.
The relationship between the world coordinate system, the camera coordinate system and the image coordinate system is as follows:
λ [X_P, Y_P, 1]^T = K (R [X_W, Y_W, Z_W]^T + t)
4) Compute the rotation matrix R and translation matrix t from the chessboard to the camera
The coordinates of all chessboard corner points in the world coordinate system are obtained, the i-th corner being [X_Wi, Y_Wi, 0]^T, and the corresponding image point [X_Pi, Y_Pi]^T is found in the image coordinate system. Combined with the calibrated camera intrinsic matrix K, R and t are solved from the following formula:
λ [X_Pi, Y_Pi, 1]^T = K (R [X_Wi, Y_Wi, 0]^T + t)
where λ is a scale factor, R is a 3 × 3 matrix representing the rotation from the world coordinate system to the camera coordinate system, and t is a 3 × 1 matrix representing the translation from the world coordinate system to the camera coordinate system; R and t are obtained from the above formula, giving the optimal imaging parameters of the active camera unit;
The camera calibration results, including the intrinsic and extrinsic parameters of the camera, are stored in the database of the data acquisition and display unit so that they can be retrieved at any time.
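For illustration, the calibration described above can be sketched as follows in Python with OpenCV; this is a minimal sketch under assumptions (a 10 × 10 chessboard with 10 cm squares as in the embodiment, and OpenCV's implementations of Zhang Zhengyou calibration and PnP), not the patent's own implementation.

```python
import cv2
import numpy as np

# Chessboard geometry: 10 x 10 squares give 9 x 9 inner corners, 10 cm per square.
# These values follow the embodiment and are assumptions for illustration.
pattern = (9, 9)
square = 0.10  # meters
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

def calibrate_intrinsics(images):
    """Zhang Zhengyou calibration: estimate K and distortion from chessboard views."""
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
            size = gray.shape[::-1]
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K, dist

def solve_extrinsics(img, K, dist):
    """Solve R (3x3) and t (3x1) from world (chessboard) to camera coordinates."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        return None
    _, rvec, t = cv2.solvePnP(objp, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, t
```

In practice, the resulting K together with R, t and the recorded pan-tilt pose for each Cell group would be stored in the database of the data acquisition and display unit and retrieved when that Cell group is to be observed.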
An active perception method based on an unmanned surface vessel: the USV is equipped with a lidar and a group of active-perception PTZ (pan-tilt-zoom) cameras; the cameras and the lidar are fixed on the USV with brackets; the field of view of the lidar is fixed after installation, whereas the cameras, once installed, can continuously change their viewing angle under the drive of the pan-tilt;
Characterized by comprising the following steps:
S1: Take the viewing-angle scene area of the lidar as the Cell area and divide it into n (n ≥ 2) Cell units; the area of each Cell unit is greater than or equal to the minimum observation range of the camera, and the Cell units are numbered;
S2: Since the relative position of the lidar and the camera is fixed, the relative position of the camera and each Cell unit is also fixed; all Cell groups are calibrated with the camera, and the calibration data are stored in the data acquisition and display unit on the USV; the calibration data include the relative position of the camera and the Cell group and the corresponding calibration parameters; a Cell group is any single Cell unit in the Cell area, or the combination of any Cell unit with its adjacent Cell units;
S3: After the shore-based target detection unit detects a target in the complex water area, the position of the target is determined and the USV is controlled to sail to the vicinity of the target to be observed;
S4: The lidar detects the specific location of the target and the number(s) of the Cell unit(s) (one or more) in which it lies; the Cell group containing the target is then determined, the camera's calibration result for that Cell group is retrieved from the data acquisition and display unit, and the calibration parameters are input into the camera; the active camera's pan-tilt is driven to the relative position of the camera and the Cell group recorded in the calibration data so that the camera is aimed at the Cell group, the camera parameters are adjusted according to the pre-calibrated result, and a clear image of the target is captured (at night, the lighting is turned on); a sketch of this workflow is given after this list;
S5: The captured clear image is transmitted to the shore-based server control unit, and the USV is remotely piloted back to shore.
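A minimal sketch of steps S1 and S4 follows, written in Python; the grid layout of the Cell units, the data structures, and the pan-tilt command interface (move_to, set_parameters, capture) are illustrative assumptions and are not prescribed by the patent.

```python
from dataclasses import dataclass

@dataclass
class CellCalibration:
    pan: float     # recorded pan angle of the pan-tilt (degrees)
    tilt: float    # recorded tilt angle (degrees)
    zoom: float    # recorded zoom setting
    K: list        # 3x3 intrinsic matrix
    R: list        # 3x3 rotation, world -> camera
    t: list        # 3x1 translation, world -> camera

def divide_cells(fov_az, fov_el, rows, cols):
    """S1: split the lidar viewing-angle area into numbered Cell units (row-major, from 1)."""
    az0, az1 = fov_az
    el0, el1 = fov_el
    daz, delv = (az1 - az0) / cols, (el1 - el0) / rows
    cells = {}
    for r in range(rows):
        for c in range(cols):
            cells[r * cols + c + 1] = ((az0 + c * daz, az0 + (c + 1) * daz),
                                       (el0 + r * delv, el0 + (r + 1) * delv))
    return cells

def cell_of(az, el, cells):
    """Map a lidar detection (azimuth, elevation) to its Cell unit number."""
    for num, ((a0, a1), (e0, e1)) in cells.items():
        if a0 <= az < a1 and e0 <= el < e1:
            return num
    return None

def observe_target(az, el, cells, calib_db, camera):
    """S4: find the target's Cell group, load its calibration, aim the PTZ camera, shoot."""
    num = cell_of(az, el, cells)
    if num is None:
        return None
    cal = calib_db[num]                          # pre-calibrated data for this Cell group
    camera.move_to(cal.pan, cal.tilt, cal.zoom)  # drive the pan-tilt to the stored pose
    camera.set_parameters(cal.K)                 # apply the calibrated imaging parameters
    return camera.capture()                      # clear image of the target
```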
According to the above scheme, the calibration of the active camera in S2 is specifically as follows:
Each camera in the camera unit is calibrated separately for each Cell group to obtain the intrinsic and extrinsic parameters of the camera. The attitude and angle values of the pan-tilt when each camera is aimed at each Cell group are recorded, and the camera is then calibrated with a chessboard; the calibration method is as follows:
1) Establish the camera coordinate system and the image coordinate system
Two basic coordinate systems are first established, the image coordinate system and the camera coordinate system. The image coordinate system is a rectangular coordinate system in pixels with its origin at the upper-left corner of the image; an image point is denoted [X_P, Y_P]^T. The camera coordinate system takes the camera optical center as its origin and the optical axis as its z-axis; its x- and y-axes are established along the x and y directions of the image coordinate system, the coordinate system obeys the right-hand rule, and a point is denoted [X_C, Y_C, Z_C]^T.
2) Establish the relationship between the two coordinate systems
Since the camera satisfies the pinhole model, the following relationship holds:
λ [X_P, Y_P, 1]^T = K [X_C, Y_C, Z_C]^T
where λ is a scale factor and K is the camera intrinsic matrix, obtained by Zhang Zhengyou's calibration method;
3) Establish the world coordinate system
For convenience of calculation, a world coordinate system is introduced and its relationship with the above two coordinate systems is established. In the world coordinate system, the x- and y-axes are set along the length and width directions of the chessboard, and the z-axis is perpendicular to the chessboard plane; a point in the world coordinate system is denoted [X_W, Y_W, Z_W]^T.
The relationship between the world coordinate system, the camera coordinate system and the image coordinate system is as follows:
λ [X_P, Y_P, 1]^T = K (R [X_W, Y_W, Z_W]^T + t)
4) Compute the rotation matrix R and translation matrix t from the chessboard to the camera
The coordinates of all chessboard corner points in the world coordinate system are obtained, the i-th corner being [X_Wi, Y_Wi, 0]^T, and the corresponding image point [X_Pi, Y_Pi]^T is found in the image coordinate system. Combined with the calibrated camera intrinsic matrix K, R and t are solved from the following formula:
λ [X_Pi, Y_Pi, 1]^T = K (R [X_Wi, Y_Wi, 0]^T + t)
where λ is a scale factor, R is a 3 × 3 matrix representing the rotation from the world coordinate system to the camera coordinate system, and t is a 3 × 1 matrix representing the translation from the world coordinate system to the camera coordinate system; R and t are obtained from the above formula, giving the optimal imaging parameters of the active camera unit;
The camera calibration results, including the intrinsic and extrinsic parameters of the camera, are stored in the database of the data acquisition and display unit so that they can be retrieved at any time.
The beneficial effects of the present invention are as follows:
1. Compared with a fixed observation station, the present invention is mobile and has active perception capability, and its cameras have PTZ (pan, tilt, zoom) functions; because the apparatus contains multiple PTZ cameras, clear images and video of multiple targets can be captured simultaneously, which greatly improves the clarity of the captured images and video while reducing shooting time and improving efficiency.
2. The system can work not only at night and in darkness but also in severe weather such as dense fog, heavy rain and heavy snow, and in environments that are inconvenient for people to enter, such as rough seas or nuclear-leak areas; the invention neither tires nor goes on strike, which reduces the cost of observation.
3. The invention proposes a new imaging approach: first, a data acquisition unit is built with the lidar and the camera as its core, and the viewing-angle scene of the lidar is divided into n Cell units; then the target is detected with the lidar, which determines the target to be observed and the number of the Cell unit where it lies; finally, a corresponding number of cameras are aimed at the Cell unit where the target lies, the pre-computed calibration results for that Cell unit are looked up, the camera parameters are adjusted, and clear images of the target are captured.
Brief description of the drawings
The present invention will be further explained below with reference to the accompanying drawings and embodiments. In the drawings:
Fig. 1 is a schematic diagram of the Cell area in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the motion of the active camera in an embodiment of the present invention;
Fig. 3 is a flowchart of the method in an embodiment of the present invention.
Specific embodiment
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further elaborated below with reference to embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention and are not intended to limit it.
An active perception device based on an unmanned surface vessel, comprising: a USV navigation control unit, a shipborne lidar device, a camera calibration unit, a data acquisition and display unit, a data transmission unit, a shore-based server control unit, and a shore-based target detection unit;
The shore-based target detection unit determines the position of a target after detecting it in the complex water area;
The USV navigation control unit receives, according to the position information from the shore-based target detection unit, the control signal from the shore-based server control unit and controls the USV to sail to the vicinity of the target to be observed. The USV navigation control unit is arranged on the USV, which carries an onboard AIS device and a shipborne real-time navigation signal transmission device; the main function of this unit is, through continuous communication with the shore-based server control unit, to navigate the USV carrying the data acquisition and display unit to the vicinity of the target to be observed;
The shipborne lidar device detects the specific location of the target and determines the number of the Cell unit in which the target lies; the Cell units are the constituent units of the Cell area, the Cell area being the viewing-angle scene area of the lidar divided into n (n ≥ 2) Cell units; the area of each Cell unit is greater than or equal to the minimum observation range of the camera, and the Cell units are numbered;
The camera calibration unit calibrates all Cell groups with the shipborne active-perception PTZ (pan-tilt-zoom) camera mounted on a pan-tilt, and the calibration data are stored in the data acquisition and display unit on the USV; the calibration data include the relative position of the camera and the Cell group and the corresponding calibration parameters; a Cell group is any single Cell unit in the Cell area, or the combination of any Cell unit with its adjacent Cell units;
The calibration device used by the camera calibration unit is mainly a single-plane chessboard consisting of 100 (10 × 10) alternating black and white squares, each square having a side length of 10 cm;
The data acquisition and display unit captures clear images of the target, stores the acquired images of the target in the unit's database, and also stores the calibration data of the camera calibration unit. The data acquisition and display unit mainly includes a computer, a display, a group of active (PTZ) cameras on pan-tilts (the pan-tilt providing vertical, tilt and horizontal motion; the group consists of one or more active cameras), and a group of high-power lamps (turned on when taking photos at night) or photosensitive photography lamps;
The shore-based server control unit communicates with the data acquisition and display unit through the data transmission unit, stores the image data acquired by the data acquisition and display unit, and also receives and transmits USV navigation control signals.
The shore-based server control unit specifically includes: (1) a shore-based image data receiving module, which has the function of receiving image data over long distances and may consist of a 4G module (or an image transmission radio and a data radio); (2) an image data display device, mainly comprising a computer and a display, which contains an image database used to store the acquired image data and which also displays the acquired image data; (3) a USV position locating and display device, mainly a device that shows the USV's position information in real time and quickly, used to display the specific position of the USV; (4) a USV navigation control signal receiving and transmitting device, mainly a high-power transceiver, used to receive and send the signals that control the USV's navigation.
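As a rough illustration of the shore-based image data receiving module, a minimal receive-and-store loop is sketched below, assuming a plain TCP link carrying length-prefixed JPEG frames and a local directory standing in for the image database; the actual link (4G module or image/data radio) and storage scheme are not specified beyond the description above.

```python
import socket
import struct
import pathlib
import time

STORE = pathlib.Path("received_images")  # assumed local stand-in for the image database
STORE.mkdir(exist_ok=True)

def shore_server(host="0.0.0.0", port=9000):
    """Minimal receive-and-store loop for the shore-based image data receiving module."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    conn, _ = srv.accept()
    while True:
        header = conn.recv(4)                  # 4-byte big-endian image length
        if len(header) < 4:
            break
        (length,) = struct.unpack(">I", header)
        data = b""
        while len(data) < length:              # read the full JPEG payload
            chunk = conn.recv(length - len(data))
            if not chunk:
                return
            data += chunk
        name = STORE / f"target_{int(time.time())}.jpg"
        name.write_bytes(data)                 # store the received target image
```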
As shown in Fig. 3, an active perception method based on an unmanned surface vessel using the above apparatus is as follows.
The USV is equipped with a lidar and a group of active-perception PTZ (pan-tilt-zoom) cameras; the cameras and the lidar are fixed on the USV with brackets; the field of view of the lidar is fixed after installation, whereas the cameras, once installed, can continuously change their viewing angle under the drive of the pan-tilt. The method comprises the following steps:
S1: Take the viewing-angle scene area of the lidar as the Cell area and divide it into n (n ≥ 2) Cell units; the area of each Cell unit is greater than or equal to the minimum observation range of the camera, and the Cell units are numbered;
S2: Since the relative position of the lidar and the camera is fixed, the relative position of the camera and each Cell unit is also fixed; all Cell groups are calibrated with the camera, and the calibration data are stored in the data acquisition and display unit on the USV; the calibration data include the relative position of the camera and the Cell group and the corresponding calibration parameters; a Cell group is any single Cell unit in the Cell area, or the combination of any Cell unit with its adjacent Cell units;
The calibration of the active camera in S2 is specifically as follows:
Each camera in the camera unit is calibrated separately for each Cell group to obtain the intrinsic and extrinsic parameters of the camera. The attitude and angle values of the pan-tilt when each camera is aimed at each Cell group are recorded, and the camera is then calibrated with a chessboard; the calibration method is as follows:
1) Establish the camera coordinate system and the image coordinate system
Two basic coordinate systems are first established, the image coordinate system and the camera coordinate system. The image coordinate system is a rectangular coordinate system in pixels with its origin at the upper-left corner of the image; an image point is denoted [X_P, Y_P]^T. The camera coordinate system takes the camera optical center as its origin and the optical axis as its z-axis; its x- and y-axes are established along the x and y directions of the image coordinate system, the coordinate system obeys the right-hand rule, and a point is denoted [X_C, Y_C, Z_C]^T.
2) Establish the relationship between the two coordinate systems
Since the camera satisfies the pinhole model, the following relationship holds:
λ [X_P, Y_P, 1]^T = K [X_C, Y_C, Z_C]^T
where λ is a scale factor and K is the camera intrinsic matrix, obtained by Zhang Zhengyou's calibration method;
3) Establish the world coordinate system
For convenience of calculation, a world coordinate system is introduced and its relationship with the above two coordinate systems is established. In the world coordinate system, the x- and y-axes are set along the length and width directions of the chessboard, and the z-axis is perpendicular to the chessboard plane; a point in the world coordinate system is denoted [X_W, Y_W, Z_W]^T.
The relationship between the world coordinate system, the camera coordinate system and the image coordinate system is as follows:
λ [X_P, Y_P, 1]^T = K (R [X_W, Y_W, Z_W]^T + t)
4) Compute the rotation matrix R and translation matrix t from the chessboard to the camera
The coordinates of all chessboard corner points in the world coordinate system are obtained, the i-th corner being [X_Wi, Y_Wi, 0]^T, and the corresponding image point [X_Pi, Y_Pi]^T is found in the image coordinate system. Combined with the calibrated camera intrinsic matrix K, R and t are solved from the following formula:
λ [X_Pi, Y_Pi, 1]^T = K (R [X_Wi, Y_Wi, 0]^T + t)
where λ is a scale factor, R is a 3 × 3 matrix representing the rotation from the world coordinate system to the camera coordinate system, and t is a 3 × 1 matrix representing the translation from the world coordinate system to the camera coordinate system; R and t are obtained from the above formula, giving the optimal imaging parameters of the active camera unit;
The camera calibration results, including the intrinsic and extrinsic parameters of the camera, are stored in the database of the data acquisition and display unit so that they can be retrieved at any time.
S3: After the shore-based target detection unit detects a target in the complex water area, the position of the target is determined and the USV is controlled to sail to the vicinity of the target to be observed;
S4: The lidar detects the specific location of the target and the number(s) of the Cell unit(s) (one or more) in which it lies; the Cell group containing the target is then determined, the camera's calibration result for that Cell group is retrieved from the data acquisition and display unit, and the calibration parameters are input into the camera; the active camera's pan-tilt is driven to the relative position of the camera and the Cell group recorded in the calibration data so that the camera is aimed at the Cell group, the camera parameters are adjusted according to the pre-calibrated result, and a clear image of the target is captured (at night, the lighting is turned on);
S5: The captured clear image is transmitted to the shore-based server control unit, and the USV is remotely piloted back to shore.
A specific embodiment:
The active camera unit selected in this example contains multiple active cameras (see Fig. 2 for the active camera). The active camera unit is placed close to the radar, each active camera works independently, and a suitable number of active cameras can be selected according to the number of targets and their directions of travel to track and observe the targets;
As shown in Fig. 1, each camera is calibrated for each Cell unit and for each Cell group (a Cell group may contain 1, 2, 4, 6, 8, ..., n adjacent Cell units), and the results are stored in the database of the data acquisition and display unit. In Fig. 1, a Cell group may be the Cell unit numbered 5 alone, or the combination of Cell unit 5 with any adjacent Cell unit (5 and 2; 5 and 8; 5 and 6; 5 and 4), or the combination with several neighbouring Cell units (5 with 1, 2 and 4; 5 with 1, 2, 3, 4 and 6); the principle is that each Cell unit in a Cell group must share a common edge with another Cell unit in the group, as sketched in the example below;
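A minimal sketch of this grouping rule, assuming the Cell units are numbered row by row on a rows × cols grid (Fig. 1 appears to show a 3 × 3 grid); the helper names are assumptions for illustration.

```python
def neighbours(cell, rows, cols):
    """Edge-adjacent Cell units of `cell` on a rows x cols grid numbered row-major from 1."""
    r, c = divmod(cell - 1, cols)
    adj = []
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols:
            adj.append(rr * cols + cc + 1)
    return adj

def is_valid_cell_group(cells, rows, cols):
    """Every Cell unit in the group must share an edge with another unit in the group."""
    cells = set(cells)
    if len(cells) == 1:
        return True
    return all(any(n in cells for n in neighbours(c, rows, cols)) for c in cells)

# Examples on the assumed 3 x 3 grid of Fig. 1 (units numbered 1..9):
# is_valid_cell_group({5, 2}, 3, 3)       -> True
# is_valid_cell_group({5, 1}, 3, 3)       -> False (only corner contact, no common edge)
# is_valid_cell_group({5, 1, 2, 4}, 3, 3) -> True
```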
The vessel carrying the active camera unit and the lidar sails into the complex water area. When a target appears in the lidar's observation area, the Cell group where the target lies is determined; at the same time, an active camera is chosen and its angle is adjusted so that the active camera is aimed at the Cell group where the target lies, the corresponding pre-calibrated active-camera parameters are retrieved from the data acquisition and display unit and input into the selected active camera, and finally a clear image of the target is captured (when operating at night, the photography light is turned on).
If the target keeps moving, the two cameras of the camera unit are switched continuously, so that tracking video of the target object, rather than photos, is captured; a handoff sketch is given below.
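A rough sketch of such a two-camera handoff, assuming each camera object exposes covers(), aim() and grab_frame() methods; these interfaces are illustrative assumptions, not part of the patent.

```python
def track_with_handoff(frames, cam_a, cam_b, calib_db):
    """Alternate between two active cameras so that a moving target stays covered.

    `frames` yields the target's Cell group number per lidar update; each camera
    object is assumed to expose covers(), aim() and grab_frame().
    """
    active, standby = cam_a, cam_b
    video = []
    for cell_group in frames:
        if not active.covers(cell_group):       # target left the active camera's reach
            active, standby = standby, active   # hand over to the other camera
        active.aim(calib_db[cell_group])        # drive the pan-tilt to the calibrated pose
        video.append(active.grab_frame())       # accumulate tracking video frames
    return video
```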
It should be understood that those of ordinary skill in the art can make improvements or changes according to the above description, and all such improvements and changes shall fall within the protection scope of the appended claims of the present invention.

Claims (4)

1. An active perception device based on an unmanned surface vessel, comprising: a USV navigation control unit, a shipborne lidar device, a camera calibration unit, a data acquisition and display unit, a data transmission unit, a shore-based server control unit, and a shore-based target detection unit;
the shore-based target detection unit determines the position of a target after detecting it in the complex water area;
the USV navigation control unit receives, according to the position information from the shore-based target detection unit, the control signal from the shore-based server control unit and controls the USV to sail to the vicinity of the target to be observed;
the shipborne lidar device detects the specific location of the target and determines the number of the Cell unit in which the target lies; the Cell units are the constituent units of the Cell area, the Cell area being the viewing-angle scene area of the lidar divided into n Cell units; the area of each Cell unit is greater than or equal to the minimum observation range of the camera, and the Cell units are numbered;
the camera calibration unit calibrates all Cell groups with the shipborne active-perception PTZ camera mounted on a pan-tilt, and the calibration data are stored in the data acquisition and display unit on the USV; the calibration data include the relative position of the camera and the Cell group and the corresponding calibration parameters; a Cell group is any single Cell unit in the Cell area, or the combination of any Cell unit with its adjacent Cell units;
the data acquisition and display unit captures clear images of the target, stores the acquired images of the target in the unit's database, and also stores the calibration data of the camera calibration unit;
the shore-based server control unit communicates with the data acquisition and display unit through the data transmission unit, stores the image data acquired by the data acquisition and display unit, and also receives and transmits USV navigation control signals.
2. The active perception device based on an unmanned surface vessel according to claim 1, characterized in that the calibration of the active camera in the camera calibration unit is specifically as follows:
each camera in the camera unit is calibrated separately for each Cell group to obtain the intrinsic and extrinsic parameters of the camera; the attitude and angle values of the pan-tilt when each camera is aimed at each Cell group are recorded, and the camera is then calibrated with a chessboard; the calibration method is as follows:
1) Establish the camera coordinate system and the image coordinate system
Two basic coordinate systems are first established, the image coordinate system and the camera coordinate system. The image coordinate system is a rectangular coordinate system in pixels with its origin at the upper-left corner of the image; an image point is denoted [X_P, Y_P]^T. The camera coordinate system takes the camera optical center as its origin and the optical axis as its z-axis; its x- and y-axes are established along the x and y directions of the image coordinate system, the coordinate system obeys the right-hand rule, and a point is denoted [X_C, Y_C, Z_C]^T.
2) Establish the relationship between the two coordinate systems
Since the camera satisfies the pinhole model, the following relationship holds:
λ [X_P, Y_P, 1]^T = K [X_C, Y_C, Z_C]^T
where λ is a scale factor and K is the camera intrinsic matrix, obtained by Zhang Zhengyou's calibration method;
3) Establish the world coordinate system
For convenience of calculation, a world coordinate system is introduced and its relationship with the above two coordinate systems is established. In the world coordinate system, the x- and y-axes are set along the length and width directions of the chessboard, and the z-axis is perpendicular to the chessboard plane; a point in the world coordinate system is denoted [X_W, Y_W, Z_W]^T.
The relationship between the world coordinate system, the camera coordinate system and the image coordinate system is as follows:
λ [X_P, Y_P, 1]^T = K (R [X_W, Y_W, Z_W]^T + t)
4) Compute the rotation matrix R and translation matrix t from the chessboard to the camera
The coordinates of all chessboard corner points in the world coordinate system are obtained, the i-th corner being [X_Wi, Y_Wi, 0]^T, and the corresponding image point [X_Pi, Y_Pi]^T is found in the image coordinate system. Combined with the calibrated camera intrinsic matrix K, R and t are solved from the following formula:
λ [X_Pi, Y_Pi, 1]^T = K (R [X_Wi, Y_Wi, 0]^T + t)
where λ is a scale factor, R is a 3 × 3 matrix representing the rotation from the world coordinate system to the camera coordinate system, and t is a 3 × 1 matrix representing the translation from the world coordinate system to the camera coordinate system; R and t are obtained from the above formula, giving the optimal imaging parameters of the active camera unit;
the camera calibration results, including the intrinsic and extrinsic parameters of the camera, are stored in the database of the data acquisition and display unit so that they can be retrieved at any time.
3. An active perception method based on an unmanned surface vessel, the USV being equipped with a lidar and a group of active-perception PTZ cameras, the cameras and the lidar being fixed on the USV with brackets, the field of view of the lidar being fixed after installation, and the cameras, once installed, being able to continuously change their viewing angle under the drive of the pan-tilt;
characterized by comprising the following steps:
S1: taking the viewing-angle scene area of the lidar as the Cell area and dividing it into n Cell units, the area of each Cell unit being greater than or equal to the minimum observation range of the camera, and numbering the Cell units;
S2: since the relative position of the lidar and the camera is fixed, the relative position of the camera and each Cell unit is also fixed; calibrating all Cell groups with the camera and storing the calibration data in the data acquisition and display unit on the USV; the calibration data including the relative position of the camera and the Cell group and the corresponding calibration parameters; a Cell group being any single Cell unit in the Cell area, or the combination of any Cell unit with its adjacent Cell units;
S3: after the shore-based target detection unit detects a target in the complex water area, determining the position of the target and controlling the USV to sail to the vicinity of the target to be observed;
S4: detecting the specific location of the target and the number of the Cell unit in which it lies with the lidar, then determining the Cell group containing the target, retrieving from the data acquisition and display unit the camera's calibration result for that Cell group, inputting the calibration parameters into the camera, driving the active camera's pan-tilt to the relative position of the camera and the Cell group recorded in the calibration data so that the camera is aimed at the Cell group, adjusting the camera parameters according to the pre-calibrated result, and capturing a clear image of the target;
S5: transmitting the captured clear image to the shore-based server control unit, and remotely piloting the USV back to shore.
4. The active perception method based on an unmanned surface vessel according to claim 3, characterized in that the calibration of the active camera in S2 is specifically as follows:
each camera in the camera unit is calibrated separately for each Cell group to obtain the intrinsic and extrinsic parameters of the camera; the attitude and angle values of the pan-tilt when each camera is aimed at each Cell group are recorded, and the camera is then calibrated with a chessboard; the calibration method is as follows:
1) Establish the camera coordinate system and the image coordinate system
Two basic coordinate systems are first established, the image coordinate system and the camera coordinate system. The image coordinate system is a rectangular coordinate system in pixels with its origin at the upper-left corner of the image; an image point is denoted [X_P, Y_P]^T. The camera coordinate system takes the camera optical center as its origin and the optical axis as its z-axis; its x- and y-axes are established along the x and y directions of the image coordinate system, the coordinate system obeys the right-hand rule, and a point is denoted [X_C, Y_C, Z_C]^T.
2) Establish the relationship between the two coordinate systems
Since the camera satisfies the pinhole model, the following relationship holds:
λ [X_P, Y_P, 1]^T = K [X_C, Y_C, Z_C]^T
where λ is a scale factor and K is the camera intrinsic matrix, obtained by Zhang Zhengyou's calibration method;
3) Establish the world coordinate system
For convenience of calculation, a world coordinate system is introduced and its relationship with the above two coordinate systems is established. In the world coordinate system, the x- and y-axes are set along the length and width directions of the chessboard, and the z-axis is perpendicular to the chessboard plane; a point in the world coordinate system is denoted [X_W, Y_W, Z_W]^T.
The relationship between the world coordinate system, the camera coordinate system and the image coordinate system is as follows:
λ [X_P, Y_P, 1]^T = K (R [X_W, Y_W, Z_W]^T + t)
4) Compute the rotation matrix R and translation matrix t from the chessboard to the camera
The coordinates of all chessboard corner points in the world coordinate system are obtained, the i-th corner being [X_Wi, Y_Wi, 0]^T, and the corresponding image point [X_Pi, Y_Pi]^T is found in the image coordinate system. Combined with the calibrated camera intrinsic matrix K, R and t are solved from the following formula:
λ [X_Pi, Y_Pi, 1]^T = K (R [X_Wi, Y_Wi, 0]^T + t)
where λ is a scale factor, R is a 3 × 3 matrix representing the rotation from the world coordinate system to the camera coordinate system, and t is a 3 × 1 matrix representing the translation from the world coordinate system to the camera coordinate system; R and t are obtained from the above formula, giving the optimal imaging parameters of the active camera unit;
the camera calibration results, including the intrinsic and extrinsic parameters of the camera, are stored in the database of the data acquisition and display unit so that they can be retrieved at any time.
CN201811011740.XA 2018-08-31 2018-08-31 A kind of actively perceive apparatus and method based on unmanned boat Pending CN109255820A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811011740.XA CN109255820A (en) 2018-08-31 2018-08-31 A kind of actively perceive apparatus and method based on unmanned boat

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811011740.XA CN109255820A (en) 2018-08-31 2018-08-31 A kind of actively perceive apparatus and method based on unmanned boat

Publications (1)

Publication Number Publication Date
CN109255820A true CN109255820A (en) 2019-01-22

Family

ID=65049450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811011740.XA Pending CN109255820A (en) 2018-08-31 2018-08-31 A kind of actively perceive apparatus and method based on unmanned boat

Country Status (1)

Country Link
CN (1) CN109255820A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111812610A (en) * 2020-06-29 2020-10-23 珠海云洲智能科技有限公司 Overwater target supervision system and method, terminal device and storage medium
CN112305529A (en) * 2020-10-19 2021-02-02 杭州海康威视数字技术股份有限公司 Parameter calibration method, target object tracking method, device and system
CN112684469A (en) * 2021-01-14 2021-04-20 江苏恒澄交科信息科技股份有限公司 Channel characteristic direction identification method and system based on marine radar image
CN113483730A (en) * 2021-07-02 2021-10-08 迈润智能科技(上海)有限公司 Marine wave actual measurement device and method based on binocular stereo vision

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105242023A (en) * 2015-11-10 2016-01-13 四方继保(武汉)软件有限公司 Unmanned ship achieving multi-function monitoring of water area
CN205620814U (en) * 2016-05-12 2016-10-05 四方继保(武汉)软件有限公司 Unmanned ship cluster operating system based on miniature satellite
KR20170058719A (en) * 2015-11-19 2017-05-29 대우조선해양 주식회사 Control method for path following and obstacles collision avoidance in unmanned ship
CN106741782A (en) * 2016-12-27 2017-05-31 武汉理工大学 A kind of unmanned boat and its navigation control method driven based on wind energy
CN107452038A (en) * 2017-07-28 2017-12-08 武汉理工大学 Complex water areas method for tracking target based on AIS and active video camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105242023A (en) * 2015-11-10 2016-01-13 四方继保(武汉)软件有限公司 Unmanned ship achieving multi-function monitoring of water area
KR20170058719A (en) * 2015-11-19 2017-05-29 대우조선해양 주식회사 Control method for path following and obstacles collision avoidance in unmanned ship
CN205620814U (en) * 2016-05-12 2016-10-05 四方继保(武汉)软件有限公司 Unmanned ship cluster operating system based on miniature satellite
CN106741782A (en) * 2016-12-27 2017-05-31 武汉理工大学 A kind of unmanned boat and its navigation control method driven based on wind energy
CN107452038A (en) * 2017-07-28 2017-12-08 武汉理工大学 Complex water areas method for tracking target based on AIS and active video camera

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111812610A (en) * 2020-06-29 2020-10-23 珠海云洲智能科技有限公司 Overwater target supervision system and method, terminal device and storage medium
CN111812610B (en) * 2020-06-29 2023-09-29 珠海云洲智能科技股份有限公司 Water target supervision system, method, terminal equipment and storage medium
CN112305529A (en) * 2020-10-19 2021-02-02 杭州海康威视数字技术股份有限公司 Parameter calibration method, target object tracking method, device and system
CN112684469A (en) * 2021-01-14 2021-04-20 江苏恒澄交科信息科技股份有限公司 Channel characteristic direction identification method and system based on marine radar image
CN113483730A (en) * 2021-07-02 2021-10-08 迈润智能科技(上海)有限公司 Marine wave actual measurement device and method based on binocular stereo vision

Similar Documents

Publication Publication Date Title
CN109255820A (en) A kind of actively perceive apparatus and method based on unmanned boat
CN108965809A (en) The video linkage monitoring system and control method of radar vectoring
US11017228B2 (en) Method and arrangement for condition monitoring of an installation with operating means
CN104914863B (en) A kind of unmanned motion platform environment understanding system of integral type and its method of work
CN107247458A (en) UAV Video image object alignment system, localization method and cloud platform control method
CN106878687A (en) A kind of vehicle environment identifying system and omni-directional visual module based on multisensor
CN108227751A (en) The landing method and system of a kind of unmanned plane
CN108614273A (en) A kind of airborne two waveband photoelectricity wide area is scouted and tracks of device and method
CN201611930U (en) Ship-borne searching and evidence-obtaining integrated equipment
CN106797438A (en) Control device, control method and aircraft devices
CN108303078B (en) Omnidirectional ship anti-collision early warning and navigation system based on stereoscopic vision
CN108447075A (en) A kind of unmanned plane monitoring system and its monitoring method
CN206611521U (en) A kind of vehicle environment identifying system and omni-directional visual module based on multisensor
EP3881221A1 (en) System and method for measuring the distance to an object in water
CN101858743B (en) Distance measuring device and method based on large-field shooting and image processing
KR101508290B1 (en) Day-night vision machine and water monitoring system thereof
CN107452038A (en) Complex water areas method for tracking target based on AIS and active video camera
KR102129502B1 (en) Cadastral survey system and method using drone
CN108020831A (en) A kind of intelligence maritime affairs tracking radar
CN109597432A (en) A kind of unmanned plane landing monitoring method and system based on vehicle-mounted pick-up unit
CN105243364B (en) Photoelectric nacelle searching method, device and system
JP6482855B2 (en) Monitoring system
CN113654526B (en) Photoelectric nacelle scanning method under low-altitude rapid flight condition
KR101012281B1 (en) Optimal strongpoint image control system
CN108924494B (en) Aerial monitoring system based on ground

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190122

RJ01 Rejection of invention patent application after publication