CN118071842A - Ship identity recognition system based on camera calibration and deep learning algorithm - Google Patents

Ship identity recognition system based on camera calibration and deep learning algorithm

Info

Publication number
CN118071842A
Authority
CN
China
Prior art keywords
ship
module
unit
deep learning
learning algorithm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410493952.5A
Other languages
Chinese (zh)
Inventor
项赉
崔晓东
曹宇
朱秋伟
王亚雪
张飞虎
王佳苗
胡海洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University of Science and Technology
Original Assignee
Shandong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University of Science and Technology filed Critical Shandong University of Science and Technology
Priority to CN202410493952.5A priority Critical patent/CN118071842A/en
Publication of CN118071842A publication Critical patent/CN118071842A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of ship identification and discloses a ship identity recognition system based on camera calibration and a deep learning algorithm. The monitoring module monitors ships in the monitoring area and feeds back the results. The auxiliary module assists the monitoring module in monitoring ships. The identity recognition module recognizes the identity of the ships monitored by the monitoring module. The identity comparison identifier acquires the AIS coordinates of a ship and accesses the AIS to obtain the ship's identity information. The deep learning algorithm module identifies ships whose data have not been recorded. The invention improves the three-dimensional recognizability of ships, increases the accuracy of identifying and locating ship positions, assists in positioning ships, physically determines a ship's specific position and notifies the ship physically, and increases the recognizability of ship colors at the pixel level.

Description

Ship identity recognition system based on camera calibration and deep learning algorithm
Technical Field
The invention relates to the technical field of ship identity recognition, in particular to a ship identity recognition system based on camera calibration and a deep learning algorithm.
Background
Traditional maritime supervision relies on manned patrol boats randomly patrolling the water area, which consumes enormous manpower and material resources and makes it difficult to automatically and accurately identify ships in channel checkpoint waters. Video monitoring is an important information source in civil and military fields such as the coast guard, maritime affairs and ship management systems. China has therefore been committed to developing intelligent monitoring on water, vigorously building high-definition video monitoring projects on water and installing large numbers of CCTV devices along river waters. Extracting ship targets quickly and effectively from real-time video monitoring with integrated AIS sensors, automatically determining ship identity and raising alarms for abnormal ship behavior has become an important application field. Camera calibration often uses the background difference method and the inter-frame difference method.
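For orientation, the inter-frame difference method mentioned above can be sketched in a few lines. The following Python snippet (OpenCV and NumPy; the video file name, the difference threshold of 25 and the minimum contour area are illustrative assumptions, not parameters of the invention) detects moving targets by differencing consecutive frames; because the difference reacts to any brightness change, it also illustrates why the method is sensitive to ambient light.

```python
import cv2

# Minimal inter-frame difference sketch (assumed file name and thresholds).
cap = cv2.VideoCapture("channel_camera.mp4")   # hypothetical video source
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)                        # pixel-wise change between frames
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)  # 25 is an assumed threshold
    mask = cv2.dilate(mask, None, iterations=2)                # close small gaps in moving regions
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:                           # ignore ripples and sensor noise
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    prev_gray = gray
cap.release()
```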
Existing monitoring technology lacks accuracy and auxiliary devices. Because the background difference method and the inter-frame difference method are easily affected by changes in ambient light, the monitoring camera's ability to distinguish pixels is weak at night or in bad weather, making ships difficult to recognize. Meanwhile, existing cameras perform single-point identification, so ships are easily missed when occluded, and cross-referenced positioning is lacking. In addition, existing monitoring often relies on a single device, so failure of that device can disable the entire identification system; a backup auxiliary positioning device is therefore needed.
Disclosure of Invention
The technical problems solved by the invention are as follows: at night or in bad weather, the monitoring camera's pixel discrimination is weak and ships are difficult to recognize; meanwhile, existing cameras perform single-point identification, so ships are easily missed when occluded and cross-referenced positioning is lacking.
The aim of the invention can be achieved by the following technical scheme:
A ship identity recognition system based on camera calibration and a deep learning algorithm comprises a monitoring module, an auxiliary module, an identity recognition module and a deep learning algorithm module.
The monitoring module is used for monitoring and feeding back ships in the monitoring area and comprises a ship monitoring unit, an execution judging unit, a tracking and positioning unit, a feedback receiving unit and a site warning unit.
The auxiliary module is used for assisting the monitoring module to monitor the ship and comprises a satellite unit, a physical unit, a light unit and a signal notification unit.
The identity recognition module is used for recognizing the identity of the ship monitored by the monitoring module and comprises an identity comparison identifier. The identity comparison identifier is used for acquiring AIS coordinates of the ship and accessing AIS to acquire identity information of the ship.
The deep learning algorithm module is used for identifying ships whose data have not been recorded.
In one aspect of the invention: the vessel monitoring unit comprises at least two cameras. Two cameras are vertically arranged and are movably arranged in the horizontal direction, wherein one camera acquires a horizontal image and the other camera acquires a vertical image.
In one aspect of the invention: the execution judging unit is used for acquiring the transverse horizontal image and the longitudinal horizontal image and judging the position of the ship shot by the camera. The horizontal image and the vertical horizontal image have lattice-like pixel points.
In one aspect of the invention: the working mode of the execution judging unit comprises color distinction and contrast distinction.
Color distinction distinguishes the ship position by the adjacent colors between pixel points.
Contrast distinction rotates the transverse horizontal image and compares it with the longitudinal horizontal image to obtain the position of the ship and establish a three-dimensional model.
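A minimal NumPy sketch of the contrast-distinction idea, under the assumption that both views have already been resampled to the same pixel lattice; the 90° rotation, the difference threshold and the toy images are placeholders for illustration only.

```python
import numpy as np

def contrast_distinguish(transverse: np.ndarray, longitudinal: np.ndarray,
                         diff_threshold: float = 30.0) -> np.ndarray:
    """Rotate the transverse view onto the longitudinal grid and mark lattice cells
    where the two views agree on a colour block (assumed alignment and layout)."""
    rotated = np.rot90(transverse)                 # stand-in for the real view alignment
    if rotated.shape != longitudinal.shape:
        raise ValueError("both views must be resampled to the same lattice")
    agreement = np.abs(rotated.astype(float) - longitudinal.astype(float)) < diff_threshold
    return agreement                               # True where the two views agree on a block

# Toy usage: the longitudinal view is a perfectly consistent rotation of the transverse view.
rng = np.random.default_rng(0)
t_view = rng.integers(0, 255, size=(64, 64)).astype(np.uint8)
l_view = np.rot90(t_view)
print(contrast_distinguish(t_view, l_view).mean())  # 1.0, i.e. full agreement
```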
In one aspect of the invention: the tracking and positioning unit is used for continuously tracking and positioning the pixel points after the ship position is determined by the execution judging unit. The feedback receiving unit is used for feeding back and receiving signals to the identity recognition unit after the execution judging unit determines the ship position. The station warning unit is used for warning personnel after the identity recognition module cannot recognize the ship.
In one aspect of the invention: the satellite unit is used for judging the ship position through satellite overlooking, shooting a plurality of pixel layers through satellite interval overlooking, overlapping and comparing the pixel layers, comparing to obtain the ship pixel point position, transmitting the ship pixel point position to the monitoring module for comparison, and transmitting the ship pixel point position to the identity recognition module for recognition of the ship.
The physical unit physically determines the ship position. It comprises an unmanned aerial vehicle: when only one of the monitoring module and the satellite unit detects the ship and the identity recognition module cannot recognize it, the unmanned aerial vehicle approaches the ship, determines its specific position on site, notifies the ship physically, locates it and transmits the result to the monitoring module and the identity recognition module to assist recognition (a decision sketch follows this group of units).
The light unit assists when the color difference between the pixel points captured by the monitoring module is smaller than a preset value, amplifying the color difference.
The signal notification unit transmits the signals sent by the satellite unit, the physical unit and the light unit to the monitoring module for comparison.
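The dispatch rule described above for the physical unit reduces to a small decision function. The following sketch is illustrative only; the class, field names and boolean inputs are assumptions rather than an interface defined by the invention.

```python
from dataclasses import dataclass

@dataclass
class DetectionState:
    seen_by_camera: bool      # monitoring module detected the ship
    seen_by_satellite: bool   # satellite unit detected the ship
    identity_resolved: bool   # identity recognition module matched AIS data

def should_dispatch_drone(state: DetectionState) -> bool:
    """Dispatch the UAV when exactly one sensor sees the ship and identity is unresolved."""
    single_detection = state.seen_by_camera != state.seen_by_satellite
    return single_detection and not state.identity_resolved

# Example: the satellite sees a target the cameras missed and the AIS lookup failed.
print(should_dispatch_drone(DetectionState(False, True, False)))  # True
```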
In one aspect of the invention: the process of performing ship positioning comprises the following steps:
The elevations of the pseudo homonymous point pairs are interpolated from the filtered ground point cloud. The ALS and SLS feature point coordinates of a pseudo homonymous point pair are denoted [X_ALS Y_ALS Z_ALS]^T and [X_SLS Y_SLS Z_SLS]^T respectively; the coordinate conversion relationship is expressed by formulas A, B and C, and the coordinate conversion parameters are solved by least squares.
Formula A: X_ALS = λ(r11·X_SLS + r12·Y_SLS + r13·Z_SLS) + ΔX
Formula B: Y_ALS = λ(r21·X_SLS + r22·Y_SLS + r23·Z_SLS) + ΔY
Formula C: Z_ALS = λ(r31·X_SLS + r32·Y_SLS + r33·Z_SLS) + ΔZ
wherein r_ij are the elements of the rotation matrix R(κ, φ, ω); κ, φ and ω are the Euler rotation angles about the Z, Y and X axes, respectively; [ΔX ΔY ΔZ]^T represents the translation in the X, Y and Z directions; and λ represents the scale parameter. Initial registration of the ship and location of its position are performed through these formulas.
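One common way to solve this least-squares registration, assuming λ = 1 (a rigid transformation) and a set of matched pseudo homonymous point pairs, is the SVD-based (Kabsch) solution sketched below in Python with NumPy. The patent does not specify the solver, so the function is an illustrative assumption, not the claimed procedure.

```python
import numpy as np

def estimate_rigid_transform(sls_pts: np.ndarray, als_pts: np.ndarray):
    """Least-squares rigid transform (lambda = 1) mapping SLS points onto ALS points.

    sls_pts, als_pts: (N, 3) arrays of matched pseudo-homonymous point pairs.
    Returns (R, t) with als ~= R @ sls + t."""
    sls_c = sls_pts.mean(axis=0)
    als_c = als_pts.mean(axis=0)
    H = (sls_pts - sls_c).T @ (als_pts - als_c)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = als_c - R @ sls_c
    return R, t

# Toy check: recover a known rotation about Z and a translation.
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
sls = np.random.default_rng(1).normal(size=(50, 3))
als = sls @ R_true.T + np.array([5.0, -2.0, 0.3])
R_est, t_est = estimate_rigid_transform(sls, als)
print(np.allclose(R_est, R_true, atol=1e-6), np.round(t_est, 3))
```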
In one aspect of the invention: when the identity comparison identifier cannot identify the AIS coordinate signal of the ship, the deep learning algorithm module judges that the ship does not use the AIS, acquires the ship characteristic information through the monitoring module and the auxiliary module, inputs the ship characteristic information into the identity comparison identifier, and feeds back the ship characteristic information to personnel for signal communication.
In one aspect of the invention: the deep learning algorithm module comprises the following steps of:
And (3) satellite positioning, namely accelerating the speed of shooting the image layer at intervals of the satellite through a satellite unit, and continuously positioning the ship.
And (3) appearance collection, namely collecting a transverse horizontal image and a longitudinal horizontal image of the ship through a camera, and calculating the length, width, height, volume and hull characteristics of the ship according to the pixel points.
And acquiring the color of the ship body pixels of the ship through the execution judging unit.
And acquiring the entity, namely judging the specific position of the ship and informing the ship of the entity by the unmanned aerial vehicle approaching the ship.
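A minimal sketch of the appearance-collection step referenced in the list above, assuming binary ship masks from the two calibrated views and a known ground sampling distance (metres per pixel); the mask format, the GSD values and the bounding-box volume are illustrative assumptions.

```python
import numpy as np

def ship_dimensions(top_mask: np.ndarray, side_mask: np.ndarray,
                    gsd_top_m: float, gsd_side_m: float) -> dict:
    """Estimate length, width and height from binary ship masks in two views.

    top_mask: transverse (near-top-down) view mask; side_mask: longitudinal view mask.
    gsd_*: assumed metres-per-pixel of each calibrated view."""
    ys, xs = np.nonzero(top_mask)
    length_m = (xs.max() - xs.min() + 1) * gsd_top_m
    width_m = (ys.max() - ys.min() + 1) * gsd_top_m
    ys_s, _ = np.nonzero(side_mask)
    height_m = (ys_s.max() - ys_s.min() + 1) * gsd_side_m
    return {"length_m": length_m, "width_m": width_m, "height_m": height_m,
            "volume_m3": length_m * width_m * height_m}  # coarse bounding-box volume

# Toy masks: a 40x10 pixel hull seen from above, 40x6 pixels from the side, GSD 0.5 m.
top = np.zeros((64, 64), dtype=bool); top[20:30, 10:50] = True
side = np.zeros((64, 64), dtype=bool); side[30:36, 10:50] = True
print(ship_dimensions(top, side, 0.5, 0.5))
```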
In one aspect of the invention: the output end of the monitoring module is in signal connection with the input end of the identity recognition module, the input end of the monitoring module is in signal connection with the output end of the identity recognition module, the output end of the auxiliary module is in signal connection with the input end of the identity recognition module, the input end of the auxiliary module is in signal connection with the input end of the auxiliary module, the input end of the monitoring module is in signal connection with the output end of the auxiliary module, the output end of the identity recognition module is in signal connection with the input end of the deep learning algorithm module, and the input end of the identity recognition module is in signal connection with the output end of the deep learning algorithm module.
The invention has the beneficial effects that:
Through the combined arrangement of the monitoring module, the perpendicular cameras can monitor ships in a cross-referenced manner; the pixel points presented in the transverse and longitudinal horizontal images allow ships to be distinguished and positioned by color distinction and contrast distinction, and comparing color blocks from the transverse and longitudinal angles improves the three-dimensional recognizability of ships and increases the accuracy of identifying and locating ship positions.
Through the cooperation of the monitoring module and the auxiliary module, the satellite can capture a plurality of pixel layers at intervals from a top-down view; the layers are overlaid and compared to obtain the ship's pixel point position, assisting ship positioning. Meanwhile, the physical unit can determine the ship position: when the monitoring module, the satellite unit or the identity recognition module cannot identify the ship, the unmanned aerial vehicle closely approaches the ship, physically determines its specific position, notifies it physically and assists positioning. In addition, with the assistance of the light unit, when the color difference of the pixel points captured by the monitoring module at night is smaller than the preset value, a laser range finder sweeps the monitoring area; when the measured distance shortens, a ship has appeared, and the light unit illuminates that direction, increasing the recognizability of the ship's colors at the pixel level.
In summary, the invention improves the three-dimensional recognizability of ships, increases the accuracy of identifying and locating ship positions, assists ship positioning, physically determines a ship's specific position and notifies the ship physically, and increases the recognizability of ship colors at the pixel level.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and may be better understood from the following description of embodiments taken in conjunction with the accompanying drawings in which:
FIG. 1 is a schematic diagram of a ship identification system according to the present invention;
FIG. 2 is a schematic diagram of a monitoring module according to the present invention;
FIG. 3 is a schematic diagram of an auxiliary module according to the present invention;
FIG. 4 is a flowchart showing the distinguishing of the execution judging unit according to the present invention;
fig. 5 is a flowchart of the collection of the characteristic information of the ship by the deep learning algorithm module.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In the description of the present invention, it should be understood that references to orientation descriptions such as upper, lower, front, rear, left, right, etc. are based on the orientation or positional relationship shown in the drawings, are merely for convenience of description of the present invention and to simplify the description, and do not indicate or imply that the apparatus or elements referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus should not be construed as limiting the present invention.
In the description of the present invention, "several" means one or more and "a plurality" means two or more; "greater than", "less than", "exceeding", etc. are understood to exclude the stated number, while "above", "below", "within", etc. are understood to include it. The terms "first" and "second" are used only to distinguish technical features and should not be construed as indicating or implying relative importance, the number of technical features indicated, or the precedence of the technical features indicated.
Referring to fig. 1-5, the invention discloses a ship identity recognition system based on camera calibration and a deep learning algorithm, which comprises a monitoring module, an auxiliary module, an identity recognition module and a deep learning algorithm module. The output end of the monitoring module is in signal connection with the input end of the identity recognition module, the input end of the monitoring module is in signal connection with the output end of the identity recognition module, the output end of the auxiliary module is in signal connection with the input end of the identity recognition module, the input end of the auxiliary module is in signal connection with the output end of the identity recognition module, the output end of the monitoring module is in signal connection with the input end of the auxiliary module, the input end of the monitoring module is in signal connection with the output end of the auxiliary module, the output end of the identity recognition module is in signal connection with the input end of the deep learning algorithm module, and the input end of the identity recognition module is in signal connection with the output end of the deep learning algorithm module.
The monitoring module is used for monitoring and feeding back ships in the monitoring area and comprises a ship monitoring unit, an execution judging unit, a tracking and positioning unit, a feedback receiving unit and a site warning unit.
The auxiliary module is used for assisting the monitoring module to monitor the ship and comprises a satellite unit, a physical unit, a light unit and a signal notification unit.
The identity recognition module is used for recognizing the identity of the ship monitored by the monitoring module and comprises an identity comparison identifier. The identity comparison identifier is used for acquiring AIS coordinates of the ship and accessing AIS to acquire identity information of the ship.
The deep learning algorithm module is used for identifying ships whose data have not been recorded.
It should be noted that,
Referring to fig. 2, in the present invention, the ship monitoring unit includes at least two cameras. The two cameras are arranged perpendicular to each other and are movable in the horizontal direction, with one camera acquiring a transverse horizontal image and the other acquiring a longitudinal horizontal image.
In the present invention, the execution judging unit is used for acquiring the transverse horizontal image and the longitudinal horizontal image and judging the position of the ship captured by the cameras. The transverse horizontal image and the longitudinal horizontal image have lattice-like pixel points, obtained using airborne LiDAR (Airborne Laser Scanning, ALS) and shipborne LiDAR (Shipborne Laser Scanning, SLS).
In the present invention, the working mode of the execution judging unit includes color distinction and contrast distinction.
Color distinction uses the differences between adjacent pixel colors to distinguish the ship position.
The transverse horizontal image and the longitudinal horizontal image observe the same object; contrast distinction rotates the transverse horizontal image and compares it with the longitudinal horizontal image, as follows:
One point ap is selected in turn from the camera feature points, its k nearest feature points ap_n1, …, ap_nk belonging to the same reference ship are searched, and the center of gravity of these k points is calculated.
One point vp is selected in turn from the camera feature points, its k nearest feature points vp_n1, …, vp_nk belonging to the same reference ship are searched, and the center of gravity of these k points is calculated.
The correlation coefficients of the ALS and SLS feature point coordinate-difference sequences are then computed, where x_ai and y_ai are the horizontal and vertical coordinates of the i-th neighborhood point of the current ALS point, and x_vi and y_vi are the horizontal and vertical coordinates of the i-th neighborhood point of the current SLS point. The azimuth from ap to its center of gravity and from vp to its center of gravity is calculated, together with the Euclidean distance from ap to its center of gravity and from vp to its center of gravity and the difference between these Euclidean distances. The comparison yields the position of the ship, and a three-dimensional model is established.
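A compact sketch of the neighbourhood comparison just described: for a candidate ALS point and SLS point it computes the centroid of the k nearest neighbours, the azimuth and Euclidean distance to that centroid, and the correlation of the coordinate-difference sequences. The array shapes, the use of Pearson correlation and the toy data are assumptions for illustration.

```python
import numpy as np

def neighbourhood_descriptor(p: np.ndarray, neighbours: np.ndarray):
    """Return (azimuth to centroid, distance to centroid, coordinate differences)."""
    centroid = neighbours.mean(axis=0)
    vec = centroid - p
    azimuth = np.arctan2(vec[1], vec[0])          # angle from the point to its neighbourhood centroid
    distance = np.linalg.norm(vec)
    diffs = neighbours - p                        # (k, 2) coordinate-difference sequence
    return azimuth, distance, diffs

def match_score(ap, ap_nbrs, vp, vp_nbrs):
    """Correlation of difference sequences plus azimuth/distance gaps for one candidate pair."""
    az_a, d_a, da = neighbourhood_descriptor(ap, ap_nbrs)
    az_v, d_v, dv = neighbourhood_descriptor(vp, vp_nbrs)
    rho_x = np.corrcoef(da[:, 0], dv[:, 0])[0, 1]  # correlation of x-difference sequences
    rho_y = np.corrcoef(da[:, 1], dv[:, 1])[0, 1]  # correlation of y-difference sequences
    return rho_x, rho_y, abs(az_a - az_v), abs(d_a - d_v)

# Toy example: the SLS neighbourhood is a shifted copy of the ALS neighbourhood.
rng = np.random.default_rng(2)
ap = np.array([0.0, 0.0]); ap_nbrs = rng.normal(size=(6, 2))
vp = np.array([3.0, 1.0]); vp_nbrs = ap_nbrs + vp
print(match_score(ap, ap_nbrs, vp, vp_nbrs))       # rho_x = rho_y = 1.0, gaps = 0.0
```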
In the invention, the tracking and positioning unit is used for continuously tracking and positioning the pixel points after the execution judging unit determines the ship position. The feedback receiving unit is used for feeding back and receiving signals to the identity recognition module after the execution judging unit determines the ship position. The site warning unit is used for warning personnel when the identity recognition module cannot recognize the ship.
It should be noted that, referring to fig. 3, in the auxiliary module of the present invention:
The satellite unit determines the ship position from a satellite top-down view: it captures a plurality of pixel layers at intervals, overlays and compares the layers to obtain the ship's pixel point position, and transmits that position to the monitoring module for comparison and to the identity recognition module for ship recognition.
The physical unit physically determines the ship position. It comprises an unmanned aerial vehicle: when only one of the monitoring module and the satellite unit detects the ship and the identity recognition module cannot recognize it, the unmanned aerial vehicle approaches the ship, determines its specific position on site, notifies the ship physically, locates it and transmits the result to the monitoring module and the identity recognition module to assist recognition. The unmanned aerial vehicle is equipped with a camera, a sound-producing element, a light-emitting element and a positioning element.
The light unit assists when the color difference between the pixel points captured by the monitoring module is smaller than a preset value, amplifying the color difference.
The signal notification unit transmits the signals sent by the satellite unit, the physical unit and the light unit to the monitoring module for comparison.
The process of performing ship positioning comprises the following steps:
The elevations of the pseudo homonymous point pairs are interpolated from the filtered ground point cloud. The ALS and SLS feature point coordinates of a pseudo homonymous point pair are denoted [X_ALS Y_ALS Z_ALS]^T and [X_SLS Y_SLS Z_SLS]^T respectively; the coordinate conversion relationship is expressed by formulas A, B and C, and the coordinate conversion parameters are solved by least squares.
Formula A: X_ALS = λ(r11·X_SLS + r12·Y_SLS + r13·Z_SLS) + ΔX
Formula B: Y_ALS = λ(r21·X_SLS + r22·Y_SLS + r23·Z_SLS) + ΔY
Formula C: Z_ALS = λ(r31·X_SLS + r32·Y_SLS + r33·Z_SLS) + ΔZ
wherein r_ij are the elements of the rotation matrix R(κ, φ, ω); κ, φ and ω are the Euler rotation angles about the Z, Y and X axes, respectively; [ΔX ΔY ΔZ]^T represents the translation in the X, Y and Z directions; and λ represents the scale parameter. Since the ALS and SLS point clouds are registered by a rigid transformation, λ is set to 1. Initial registration of the ship and location of its position are performed through these formulas.
Preferably, when the identity comparison identifier cannot detect an AIS coordinate signal from the ship, the deep learning algorithm module judges that the ship is not using AIS, collects the ship's characteristic information through the monitoring module and the auxiliary module, inputs it into the identity comparison identifier and feeds it back to personnel for signal communication. The judging function L is calculated as follows:
The deep learning algorithm module collects the ship characteristic information through the following steps:
Satellite positioning: the satellite unit increases the rate at which the satellite captures top-down pixel layers at intervals and continuously positions the ship.
Appearance collection: the cameras collect the transverse horizontal image and the longitudinal horizontal image of the ship, and the length, width, height, volume and hull characteristics of the ship are calculated from the pixel points.
Color collection: the hull pixel colors of the ship are collected through the execution judging unit.
Physical collection: the unmanned aerial vehicle approaches the ship, determines its specific position on site and notifies the ship physically.
wherein L is the judging function, the two compared quantities are the pixel truth values of the transverse horizontal image and of the longitudinal horizontal image respectively, and n is the total number of pixel points; the judgment is also made through the difference of the horizontal and vertical coordinates;
Each feature point in the SLS point cloud is matched against the current ALS feature point. When the absolute values of the transverse and longitudinal correlation coefficients exceed their threshold, the difference between the azimuth from ap to its center of gravity and the azimuth from vp to its center of gravity is below a threshold, and the absolute difference between the Euclidean distance from ap to its center of gravity and the Euclidean distance from vp to its center of gravity is below a threshold, the candidate pair is stored in the initial matching set;
Within a threshold range, the differences between the horizontal and vertical coordinates of the other points and their absolute differences are counted; the set of pseudo homonymous point pairs is established from the matching point pairs with the greatest count, and the point pairs with the largest value of (ρ(x)+ρ(y)) are selected to ensure one-to-one matching, yielding the pseudo homonymous point pairs and allowing position difference correction and learning.
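The one-to-one selection of pseudo homonymous point pairs by the largest ρ(x)+ρ(y) can be sketched as a greedy pass over the thresholded candidates; the candidate tuple format (ALS index, SLS index, ρ(x)+ρ(y)) is an assumption.

```python
def select_one_to_one(candidates):
    """Greedy one-to-one matching: keep pairs with the largest rho(x)+rho(y) first.

    candidates: iterable of (als_idx, sls_idx, rho_sum) tuples that already passed
    the correlation / azimuth / distance thresholds."""
    used_als, used_sls, pairs = set(), set(), []
    for als_idx, sls_idx, rho_sum in sorted(candidates, key=lambda c: c[2], reverse=True):
        if als_idx in used_als or sls_idx in used_sls:
            continue                       # each point may appear in at most one pair
        used_als.add(als_idx)
        used_sls.add(sls_idx)
        pairs.append((als_idx, sls_idx, rho_sum))
    return pairs

# Example: ALS point 0 matches both SLS 0 and SLS 1; the higher rho(x)+rho(y) wins.
print(select_one_to_one([(0, 0, 1.8), (0, 1, 1.2), (1, 1, 1.6)]))
# [(0, 0, 1.8), (1, 1, 1.6)]
```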
In the ship identity recognition system based on camera calibration and a deep learning algorithm, through the combined arrangement of the monitoring module, the perpendicular cameras can monitor ships in a cross-referenced manner; the pixel points presented in the transverse and longitudinal horizontal images allow ships to be distinguished and positioned by color distinction and contrast distinction, and comparing color blocks from the transverse and longitudinal angles improves the three-dimensional recognizability of ships and the accuracy of identifying and locating ship positions.
Through the cooperation of the monitoring module and the auxiliary module, the satellite can capture a plurality of pixel layers at intervals from a top-down view; the layers are overlaid and compared to obtain the ship's pixel point position, assisting ship positioning. Meanwhile, the physical unit can determine the ship position: when the monitoring module, the satellite unit or the identity recognition module cannot identify the ship, the unmanned aerial vehicle closely approaches the ship, physically determines its specific position, notifies it physically and assists positioning. In addition, with the assistance of the light unit, when the color difference of the pixel points captured by the monitoring module at night is smaller than the preset value, a laser range finder sweeps the monitoring area; when the measured distance shortens, a ship has appeared, and the light unit illuminates that direction, increasing the recognizability of the ship's colors at the pixel level.
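The night-time assistance described above reduces to a simple trigger rule. The sketch below uses assumed threshold values and a toy distance history; it is not a driver for any particular range finder or lamp.

```python
def light_assist_needed(color_difference: float, distance_history_m: list[float],
                        color_threshold: float = 15.0, shrink_m: float = 20.0) -> bool:
    """Turn on the assist light when pixel colour contrast is low and the swept
    laser range suddenly shortens (something has entered the monitored area)."""
    if color_difference >= color_threshold or len(distance_history_m) < 2:
        return False
    return (distance_history_m[0] - distance_history_m[-1]) > shrink_m

# Example: night scene with weak contrast; the measured range drops from 500 m to 430 m.
print(light_assist_needed(6.0, [500.0, 480.0, 430.0]))  # True
```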
In summary, the invention improves the three-dimensional recognizability of ships, increases the accuracy of identifying and locating ship positions, assists ship positioning, physically determines a ship's specific position and notifies the ship physically, and increases the recognizability of ship colors at the pixel level.
The foregoing describes one embodiment of the present invention in detail, but the description is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention. All such equivalent changes and modifications as come within the scope of the following claims are intended to be embraced therein.

Claims (10)

1. A ship identity recognition system based on camera calibration and a deep learning algorithm, characterized by comprising a monitoring module, an auxiliary module, an identity recognition module and a deep learning algorithm module;
the monitoring module is used for monitoring and feeding back ships in the monitoring area and comprises a ship monitoring unit, an execution judging unit, a tracking and positioning unit, a feedback receiving unit and a site warning unit;
The auxiliary module is used for assisting the monitoring module to monitor the ship and comprises a satellite unit, a physical unit, a light unit and a signal notification unit;
The identity recognition module is used for recognizing the identity of the ship monitored by the monitoring module and comprises an identity comparison identifier; the identity comparison identifier is used for acquiring AIS coordinates of the ship, and accessing AIS to acquire identity information of the ship;
The deep learning algorithm module is used for identifying ships whose data have not been recorded.
2. The camera calibration and deep learning algorithm based ship identification system of claim 1, wherein the ship monitoring unit comprises at least two cameras; the two cameras are arranged perpendicular to each other and are movable in the horizontal direction, wherein one camera acquires a transverse horizontal image and the other camera acquires a longitudinal horizontal image.
3. The ship identity recognition system based on the camera calibration and deep learning algorithm according to claim 2, wherein the execution judging unit is used for acquiring the transverse horizontal image and the longitudinal horizontal image and judging the position of the ship captured by the cameras; the transverse horizontal image and the longitudinal horizontal image have lattice-like pixel points.
4. The ship identification system based on the camera calibration and deep learning algorithm according to claim 3, wherein the working mode of the execution judging unit comprises color distinction and contrast distinction;
the color distinction is carried out by distinguishing the ship position through the adjacent colors among the pixel points;
the contrast distinction is carried out by rotating the transverse horizontal image and comparing it with the longitudinal horizontal image to obtain the position of the ship, and a three-dimensional model is established.
5. The ship identity recognition system based on the camera calibration and deep learning algorithm according to claim 1, wherein the tracking and positioning unit is used for continuously tracking and positioning the pixel points after the execution judging unit determines the ship position; the feedback receiving unit is used for feeding back and receiving signals to the identity recognition module after the execution judging unit determines the ship position; the site warning unit is used for warning personnel when the identity recognition module cannot recognize the ship.
6. The ship identity recognition system based on the camera calibration and the deep learning algorithm according to claim 1, wherein the satellite unit is used for determining the ship position from a satellite top-down view, capturing a plurality of pixel layers at intervals, overlaying and comparing the pixel layers to obtain the ship pixel point position, and transmitting the ship pixel point position to the monitoring module for comparison and to the identity recognition module for ship recognition;
the physical unit is used for physically determining the ship position; the physical unit comprises an unmanned aerial vehicle, and when only one of the monitoring module and the satellite unit detects the ship and the identity recognition module cannot recognize the ship, the unmanned aerial vehicle approaches the ship, determines its specific position on site, notifies the ship physically, locates the ship and transmits the result to the monitoring module and the identity recognition module to assist in recognizing the ship;
The light unit is used for assisting when the color difference of the pixel points captured by the monitoring module is smaller than a preset value, amplifying the color difference;
The signal notification unit is used for transmitting the signals sent by the satellite unit, the physical unit and the light unit to the monitoring module for comparison.
7. The camera calibration and deep learning algorithm based ship identification system of claim 6, wherein the process of performing ship positioning comprises:
The elevations of the pseudo homonymous point pairs are interpolated from the filtered ground point cloud; the ALS and SLS feature point coordinates of a pseudo homonymous point pair are denoted [X_ALS Y_ALS Z_ALS]^T and [X_SLS Y_SLS Z_SLS]^T respectively; the coordinate conversion relationship is expressed by formulas A, B and C, and the coordinate conversion parameters are solved by least squares;
formula A: X_ALS = λ(r11·X_SLS + r12·Y_SLS + r13·Z_SLS) + ΔX;
formula B: Y_ALS = λ(r21·X_SLS + r22·Y_SLS + r23·Z_SLS) + ΔY;
formula C: Z_ALS = λ(r31·X_SLS + r32·Y_SLS + r33·Z_SLS) + ΔZ;
wherein r_ij are the elements of the rotation matrix R(κ, φ, ω); κ, φ and ω are the Euler rotation angles about the Z, Y and X axes, respectively; [ΔX ΔY ΔZ]^T represents the translation in the X, Y and Z directions; λ represents the scale parameter; and initial registration of the ship and location of its position are performed through the formulas.
8. The ship identity recognition system based on the camera calibration and the deep learning algorithm according to claim 2, wherein when the identity comparison identifier cannot detect an AIS coordinate signal from the ship, the deep learning algorithm module judges that the ship is not using AIS, collects the ship characteristic information through the monitoring module and the auxiliary module, inputs it into the identity comparison identifier and feeds it back to personnel for signal communication.
9. The ship identity recognition system based on the camera calibration and the deep learning algorithm according to claim 8, wherein, when collecting ship characteristic information, the deep learning algorithm module performs:
satellite positioning: the satellite unit increases the rate at which the satellite captures top-down pixel layers at intervals and continuously positions the ship;
appearance collection: the cameras collect the transverse horizontal image and the longitudinal horizontal image of the ship, and the length, width, height, volume and hull characteristics of the ship are calculated from the pixel points;
color collection: the hull pixel colors of the ship are collected through the execution judging unit;
physical collection: the physical unit approaches the ship, determines its specific position on site and notifies the ship physically.
10. The ship identity recognition system based on the camera calibration and deep learning algorithm according to claim 1, wherein an output end of the monitoring module is in signal connection with an input end of the identity recognition module, an input end of the monitoring module is in signal connection with an output end of the identity recognition module, an output end of the auxiliary module is in signal connection with an input end of the identity recognition module, an input end of the auxiliary module is in signal connection with an output end of the identity recognition module, an output end of the monitoring module is in signal connection with an input end of the auxiliary module, an input end of the monitoring module is in signal connection with an output end of the auxiliary module, an output end of the identity recognition module is in signal connection with an input end of the deep learning algorithm module, and an input end of the identity recognition module is in signal connection with an output end of the deep learning algorithm module.
CN202410493952.5A 2024-04-24 2024-04-24 Ship identity recognition system based on camera calibration and deep learning algorithm Pending CN118071842A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410493952.5A CN118071842A (en) 2024-04-24 2024-04-24 Ship identity recognition system based on camera calibration and deep learning algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410493952.5A CN118071842A (en) 2024-04-24 2024-04-24 Ship identity recognition system based on camera calibration and deep learning algorithm

Publications (1)

Publication Number Publication Date
CN118071842A true CN118071842A (en) 2024-05-24

Family

ID=91106112

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410493952.5A Pending CN118071842A (en) 2024-04-24 2024-04-24 Ship identity recognition system based on camera calibration and deep learning algorithm

Country Status (1)

Country Link
CN (1) CN118071842A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105575185A (en) * 2016-01-04 2016-05-11 上海海事大学 Water (marine) intelligent cruise system
CN111523465A (en) * 2020-04-23 2020-08-11 中船重工鹏力(南京)大气海洋信息***有限公司 Ship identity recognition system based on camera calibration and deep learning algorithm
CN113012206A (en) * 2021-02-07 2021-06-22 山东科技大学 Airborne and vehicle-mounted LiDAR point cloud registration method considering eave characteristics
CN113763484A (en) * 2021-09-17 2021-12-07 交通运输部水运科学研究所 Ship target positioning and speed estimation method based on video image analysis technology
CN114898594A (en) * 2022-04-22 2022-08-12 大连海事大学 General sensing calculation control integrated intelligent light boat control system capable of carrying unmanned aerial vehicle


Similar Documents

Publication Publication Date Title
CN111523465B (en) Ship identity recognition system based on camera calibration and deep learning algorithm
US11410002B2 (en) Ship identity recognition method based on fusion of AIS data and video data
CN111968046B (en) Target association fusion method for radar photoelectric sensor based on topological structure
CN112083437A (en) Marine laser radar and video combined target capturing system and method
CN108847026A (en) A method of it is converted based on matrix coordinate and realizes that data investigation is shown
CN112987751A (en) System and method for quickly detecting hidden sewage draining outlet in automatic cruising mode
CN109785562B (en) Vertical photoelectric ground threat alert system and suspicious target identification method
CN115909816A (en) Buoy collision early warning and recording system
CN115225865A (en) Video monitoring device is prevented to limit sea based on infrared thermal imaging
CN115880231A (en) Power transmission line hidden danger detection method and system based on deep learning
CN115909240A (en) Road congestion detection method based on lane line and vehicle identification
CN113965733A (en) Binocular video monitoring method, system, computer equipment and storage medium
KR102017154B1 (en) Marine Observation System Using Drone
CN118071842A (en) Ship identity recognition system based on camera calibration and deep learning algorithm
CN117347991A (en) Photoelectric target tracking system method based on radar and AIS fusion
CN117197779A (en) Track traffic foreign matter detection method, device and system based on binocular vision
CN116859948A (en) Autonomous navigation control method and system for unmanned ship for channel sweep based on target detection algorithm
CN111399014A (en) Local stereoscopic vision infrared camera system and method for monitoring wild animals
CN116469276A (en) Water area safety early warning method, device, equipment and storage medium
CN215622333U (en) Can independently detect album truck of trailing contained angle
CN115902926A (en) Forest sample plot investigation monitoring system based on unmanned aerial vehicle cluster carrying laser radar
CN111583336B (en) Robot and inspection method and device thereof
CN210466625U (en) Unmanned sea patrol ship
CN113938610A (en) Unmanned aerial vehicle supervision method and system
CN113239948B (en) Data fusion method and system for millimeter wave radar and video image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination