CN117542225A - Augmented reality ship auxiliary navigation system - Google Patents


Info

Publication number
CN117542225A
CN117542225A (application number CN202311364742.8A)
Authority
CN
China
Prior art keywords
ship
module
navigation
augmented reality
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311364742.8A
Other languages
Chinese (zh)
Inventor
邹迪
谭超
钱颖麒
韩燎星
李泉洲
边思远
王曰根
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 52 Research Institute
Original Assignee
CETC 52 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 52 Research Institute
Priority to CN202311364742.8A
Publication of CN117542225A
Legal status: Pending


Classifications

    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
                    • G01C21/005 with correlation of navigation data from several sources, e.g. map or contour matching
                    • G01C21/20 Instruments for performing navigational calculations
                        • G01C21/203 Specially adapted for sailing ships
            • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
                • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
                    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
                        • G01S13/867 Combination of radar systems with cameras
                    • G01S13/88 Radar or analogous systems specially adapted for specific applications
                        • G01S13/93 for anti-collision purposes
                            • G01S13/937 of marine craft
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T17/00 Three-dimensional [3D] modelling, e.g. data description of 3D objects
                    • G06T17/05 Geographic models
                • G06T19/00 Manipulating 3D models or images for computer graphics
                    • G06T19/006 Mixed reality
                • G06T2200/00 Indexing scheme for image data processing or generation, in general
                    • G06T2200/04 involving 3D image data
        • G08 SIGNALLING
            • G08G TRAFFIC CONTROL SYSTEMS
                • G08G3/00 Traffic control systems for marine craft
                    • G08G3/02 Anti-collision systems
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N5/00 Details of television systems
                    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
                        • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
                            • H04N5/265 Mixing
                • H04N7/00 Television systems
                    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
                        • H04N7/181 for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Ocean & Marine Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an augmented reality ship auxiliary navigation system comprising a ranging pan-tilt camera, a panoramic camera, a navigation radar, an AIS, an integrated processing unit and a client deployed on a ship. The system displays the video streams of the cameras on the client, uploads the fused heading, longitude and latitude coordinates and pitch value of the target ship to the client for overlay display, and builds a three-dimensional electronic chart system, so navigation information is presented intuitively on the client; this avoids the many hazards that arise in the prior art from projecting navigation element data into AR head-mounted equipment. A target recognition algorithm identifies the type and hull number of the target ship, so these features can be obtained even when the target's AIS is switched off, improving the accuracy of the acquired target-ship information.

Description

Augmented reality ship auxiliary navigation system
Technical Field
The invention belongs to the field of ship auxiliary navigation, and particularly relates to an augmented reality ship auxiliary navigation system.
Background
Information superposition technology: virtual-real visual fusion refers to overlaying virtual objects on the real environment according to defined rules so that the composite remains correct as the surroundings change. In an augmented reality system, the virtual object must be rendered by responding quickly to changes in the real scene and the field of view, sensing the relevant information and generating a realistic image in real time, achieving natural virtual-real fusion. Rendering efficiency must also be considered: drawing quality should be improved as far as possible while maintaining the frame rate required for comfortable user interaction. The quality of virtual-real fusion directly determines how convincing the system is; at the same time, given the real-time requirements of augmented reality, the footprint of the virtual objects should be kept as small as possible, otherwise fusion between the real environment and the virtual objects will lag.
Prior art CN110009731A describes a three-dimensional scene construction method for ship assisted driving that builds environmental information into a three-dimensional model, combines the model with the video captured by front-mounted AR head-mounted equipment, and projects navigation element data into the headset. In heavy seas, the ship's pitching causes severe shaking of the field of view, which easily makes the helmsman dizzy. The headset also isolates the wearer from the surroundings, so only the in-system view is available: this endangers the wearer's personal safety, hinders communication with nearby personnel, and, because the information can only be shown on the front-end AR device, prevents sharing and joint assessment by several people. In addition, existing ships lack visual recognition capability for targets at sea: the AIS (Automatic Identification System) and the navigation radar can only obtain information about surrounding ships whose AIS is switched on, so illegal or special vessels go undetected, and the ship information obtained by the navigation radar is very limited, revealing nothing about vessel type or model.
Disclosure of Invention
The invention aims to solve the problems in the background art and provides an augmented reality ship auxiliary navigation system.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
the invention provides an augmented reality ship auxiliary navigation system, which comprises a ranging pan-tilt camera, a panoramic camera, a navigation radar, an AIS, an integrated processing unit and a client deployed on the ship, a ship live-action augmented reality unit and a ship three-dimensional GIS auxiliary navigation unit deployed on the client, and a service unit and an electronic chart deployed on the integrated processing unit, wherein:
the ship live-action augmented reality unit comprises:
a visual monitoring module, used for feeding each camera's video to the client for image display;
a navigation element video superposition module, used for converting the spatial coordinates of the electronic chart information into video coordinates and superimposing them on the client's image;
a target video information superposition module, used for converting the fused longitude and latitude coordinates of the target ship into video coordinates and superimposing them on the client's image;
the ship three-dimensional GIS auxiliary navigation unit comprises:
a berthing guarantee module, used for updating the ship's position in real time and issuing collision early warnings;
an equipment control module, used for pan-tilt control, focusing and frequency adjustment of each camera;
a navigation element three-dimensional situation superposition module, used for superimposing the ship's three-dimensional model on the two-dimensional environment of the electronic chart to form a three-dimensional electronic chart system;
the service unit comprises:
a situation service module, comprising a situation fusion module and a threat assessment module, wherein the situation fusion module fuses the information collected by the navigation radar, the ranging pan-tilt camera and the panoramic camera into a comprehensive track, and the threat assessment module derives the threat index of a target ship relative to own ship from the situation elements provided by the navigation radar and the AIS and from the surrounding environment, issuing a predictive alarm if the threat index exceeds a preset alarm value;
a navigation data processing module, used for obtaining the fused heading, longitude and latitude coordinates and pitch value of the target ship from the situation elements and uploading them to the client for display.
Preferably, the electronic chart information includes various channel lines, anchorage points and shallow-water areas.
Preferably, the augmented reality ship auxiliary navigation system further comprises a laser rangefinder deployed on the ship's side; the berthing guarantee module measures the distance between the ship and the shore with the laser rangefinder and issues a collision early warning when the distance reaches a preset value.
Preferably, the fusing by the situation fusion module of the information collected by the navigation radar, the ranging pan-tilt camera and the panoramic camera into a comprehensive track comprises:
the situation fusion module sequentially applying space-time alignment, data filtering and feature association matching to fuse the collected information into an accurate comprehensive track.
Preferably, the deriving by the threat assessment module of the threat index of the target ship relative to own ship from the situation elements and the surrounding environment, and the issuing of a predictive alarm when the threat index exceeds a preset alarm value, comprise:
the threat assessment module performing situation element extraction, behavior intention recognition and threat assessment;
the situation elements comprising the target ship's speed, its distance and bearing from own ship, and its type, together with own ship's heading, speed, longitude and latitude coordinates and pitch value; the surrounding environment comprising isolines, obstacle areas, operation areas and navigation aids;
behavior intention recognition using GRU-network-based modeling of target behavior intention to judge and grade the threat level of the target ship's behavior, finally producing a quantified threat index;
an alarm being reported during threat assessment when the threat index exceeds a set threshold.
Preferably, the obtaining by the navigation data processing module of the fused heading, longitude and latitude coordinates and pitch value of the target ship from the situation elements, and their reporting to the threat assessment module for storage, comprise:
the navigation data processing module sequentially applying time alignment, spatial alignment, identity determination and an extended Kalman filter to the target ship's speed, distance, bearing and type and to own ship's heading, speed, longitude and latitude coordinates and pitch value, obtaining the fused heading, longitude and latitude coordinates and pitch value of the target ship.
Preferably, the service unit further comprises a video management module, which manages the access, forwarding and storage of each camera's video.
Preferably, the service unit further comprises an object access module, used for the unified management of the ranging pan-tilt camera, the panoramic camera, the navigation radar, the AIS and the laser rangefinder.
Compared with the prior art, the invention has the following beneficial effects:
The augmented reality ship auxiliary navigation system displays the video streams of the cameras on the client, uploads the fused heading, longitude and latitude coordinates and pitch value of the target ship to the client for overlay display, and builds a three-dimensional electronic chart system, so navigation information is presented intuitively on the client; this avoids the many hazards that arise in the prior art from projecting navigation element data into AR head-mounted equipment. The target recognition algorithm identifies the type and hull number of the target ship, so these features can be obtained even when the target's AIS is switched off, improving the accuracy of the acquired target-ship information.
Drawings
FIG. 1 is a schematic diagram of the augmented reality ship auxiliary navigation system of the present invention;
FIG. 2 is a module block diagram of the augmented reality ship auxiliary navigation system of the present invention.
Detailed Description
The following description of the technical solutions in the embodiments of the present application will be made clearly and completely with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
It will be understood that when an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
As shown in Figs. 1 and 2, the augmented reality ship auxiliary navigation system comprises a ranging pan-tilt camera, a panoramic camera, a navigation radar, an AIS, an integrated processing unit and a client deployed on the ship, a ship live-action augmented reality unit and a ship three-dimensional GIS auxiliary navigation unit deployed on the client, and a service unit and an electronic chart deployed on the integrated processing unit.
It should be noted that the ranging pan-tilt camera is mounted at the bow, while the panoramic camera, the navigation radar, the AIS, the integrated processing unit and the client are mounted at the ship's side. The ship live-action augmented reality unit and the ship three-dimensional GIS auxiliary navigation unit are two software units running on the client.
The ship live-action augmented reality unit comprises:
The visual monitoring module, used for feeding each camera's video to the client for image display.
it should be noted that the visual monitoring module may perform video preview, video playback, and video access.
The navigation element video superposition module, used for converting the spatial coordinates of the electronic chart information into video coordinates and superimposing them on the client's image.
Specifically, the electronic chart information comprises various channel lines, anchorage points and shallow-water areas; a coordinate conversion algorithm converts their spatial coordinates into video coordinates, which are then superimposed on the client's image.
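The patent does not spell out the coordinate conversion algorithm. As an illustration only, the sketch below assumes a forward-facing camera with a known heading and horizontal field of view; the function names (`geo_to_enu`, `enu_to_pixel`), the flat-earth approximation and the crude distance-based vertical placement are all assumptions, since a real implementation would use the camera's calibrated intrinsics, mounting height and the ship's attitude.

```python
import math

def geo_to_enu(lat, lon, ref_lat, ref_lon):
    """Flat-earth approximation: convert degrees to metres east/north of the camera."""
    r_earth = 6_371_000.0
    north = math.radians(lat - ref_lat) * r_earth
    east = math.radians(lon - ref_lon) * r_earth * math.cos(math.radians(ref_lat))
    return east, north

def enu_to_pixel(east, north, heading_deg, fov_deg=60.0,
                 width=1920, height=1080, horizon_y=540):
    """Project a ground point onto the video frame of a forward-facing camera.

    The bearing offset from the camera heading maps linearly onto the x axis
    across the horizontal field of view; the y placement is a coarse
    distance-based placeholder (a real system would use full projection).
    """
    bearing = math.degrees(math.atan2(east, north))           # bearing from the camera
    offset = (bearing - heading_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    if abs(offset) > fov_deg / 2:
        return None                                           # outside the frame
    x = width / 2 + offset / (fov_deg / 2) * (width / 2)
    dist = math.hypot(east, north)
    y = horizon_y + min(height - 1 - horizon_y, 50_000.0 / max(dist, 1.0))
    return int(x), int(y)
```

A chart feature converted this way can then be drawn at the returned pixel position on the client's video image.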
The target video information superposition module, used for converting the fused longitude and latitude coordinates of the target ship into video coordinates and superimposing them on the client's image.
Specifically, the fused longitude and latitude coordinates of the target ship are obtained by sequentially applying time alignment, spatial alignment, identity determination and an extended Kalman filter to the target ship's speed, distance, bearing and type and to own ship's heading, speed, longitude and latitude coordinates and pitch value; the fusion yields the target ship's heading, longitude and latitude coordinates and pitch value, from which the fused longitude and latitude coordinates are taken.
The ship three-dimensional GIS auxiliary navigation unit comprises:
The berthing guarantee module, used for updating the ship's position in real time and issuing collision early warnings.
Specifically, the augmented reality ship auxiliary navigation system further comprises a laser rangefinder deployed on the ship's side; the berthing guarantee module measures the distance between the ship and the shore with the laser rangefinder and issues a collision early warning when the distance reaches a preset value.
The equipment control module, used for pan-tilt control, focusing and frequency adjustment of each camera.
The navigation element three-dimensional situation superposition module, used for superimposing the ship's three-dimensional model on the two-dimensional environment of the electronic chart to form a three-dimensional electronic chart system.
Specifically, superimposing the ship's three-dimensional model on the two-dimensional environment of the electronic chart forms a three-dimensional electronic chart system that can be viewed both from above and at eye level.
The service unit comprises:
The situation service module, which comprises a situation fusion module and a threat assessment module; the situation fusion module fuses the information collected by the navigation radar, the ranging pan-tilt camera and the panoramic camera into a comprehensive track, and the threat assessment module derives the threat index of a target ship relative to own ship from the situation elements provided by the navigation radar and the AIS and from the surrounding environment, issuing a predictive alarm if the threat index exceeds a preset alarm value.
The situation fusion module fuses the collected information into an accurate comprehensive track by sequentially applying space-time alignment, data filtering and feature association matching. This process follows the multi-sensor data fusion method of the prior patent CN114202025A and is not detailed here.
The threat assessment module performs situation element extraction, behavior intention recognition and threat assessment.
The situation elements comprise the target ship's speed, distance and bearing relative to own ship (obtained from the sensors), the target ship's type (identified, together with the hull number, from the video by target detection algorithms such as R-CNN, Fast R-CNN, YOLOv3 and YOLOv4), and own ship's heading, speed, longitude and latitude coordinates and pitch value (obtained from the sensors); the surrounding environment comprises isolines, obstacle areas, operation areas and navigation aids.
the behavioral intention recognition adopts target behavioral intention recognition modeling based on GRU network to judge and divide the behavioral threat degree of the target ship, and finally forms quantitative threat indexes (the process belongs to the prior art, such as references Yang Tongyao, yang Fengbao and Ji Linna, the marine target dynamic threat assessment based on behavioral intention [ A ], university of North information and communication engineering institute, 2021), and the detailed description is not provided in the patent;
During threat assessment, an alarm is reported when the threat index exceeds a set threshold.
The navigation data processing module, used for obtaining the fused heading, longitude and latitude coordinates and pitch value of the target ship from the situation elements and uploading them to the client for display.
Specifically, the navigation data processing module sequentially applies time alignment, spatial alignment, identity determination and an extended Kalman filter to the target ship's speed, distance, bearing and type and to own ship's heading, speed, longitude and latitude coordinates and pitch value (this follows the multi-sensor data fusion method of the prior patent CN114202025A and is not detailed here), obtaining the fused heading, longitude and latitude coordinates and pitch value of the target ship, which are finally superimposed on the video image displayed by the client.
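The filtering step can be illustrated with one predict/update cycle of a constant-velocity track filter. With a purely linear position measurement this reduces to an ordinary Kalman filter; it stands in for the extended Kalman filter of CN114202025A, whose state and measurement models are not disclosed. The state layout and noise values are assumptions.

```python
import numpy as np

def kf_step(x, P, z, dt, q=0.1, r=25.0):
    """One predict/update cycle for a constant-velocity target track.

    State x = [east, north, v_east, v_north] in metres and m/s;
    z is a fused position fix [east, north]; P is the state covariance.
    q and r are illustrative process/measurement noise levels.
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)
    Q = q * np.eye(4)
    R = r * np.eye(2)
    x = F @ x                       # predict state forward by dt
    P = F @ P @ F.T + Q
    y = z - H @ x                   # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ y                   # corrected state
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

The corrected position would be converted back to longitude and latitude before being superimposed on the client's video image.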
In one embodiment, the service unit further includes a video management module, which manages the access, forwarding and storage of each camera's video.
Specifically, the video management module accesses each camera's video, forwards it to the client and stores it.
In one embodiment, the service unit further comprises an object access module for the unified management of the ranging pan-tilt camera, the panoramic camera, the navigation radar, the AIS and the laser rangefinder.
The augmented reality ship auxiliary navigation system displays the video streams of the cameras on the client, uploads the fused heading, longitude and latitude coordinates and pitch value of the target ship to the client for overlay display, and builds a three-dimensional electronic chart system, so navigation information is presented intuitively on the client; this avoids the many hazards that arise in the prior art from projecting navigation element data into AR head-mounted equipment. The target recognition algorithm identifies the type and hull number of the target ship, so these features can be obtained even when the target's AIS is switched off, improving the accuracy of the acquired target-ship information.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not every possible combination is described; nevertheless, any combination of these features that involves no contradiction should be considered within the scope of this description.
The above embodiments merely represent several specific and detailed implementations of the present application and are not to be construed as limiting the claims. Those of ordinary skill in the art can make various modifications and improvements without departing from the concept of the present application, and such modifications all fall within its scope of protection. The scope of protection of the present application is therefore determined by the appended claims.

Claims (8)

1. An augmented reality ship auxiliary navigation system, characterized in that: the system comprises a ranging pan-tilt camera, a panoramic camera, a navigation radar, an AIS, an integrated processing unit and a client deployed on a ship, a ship live-action augmented reality unit and a ship three-dimensional GIS auxiliary navigation unit deployed on the client, and a service unit and an electronic chart deployed on the integrated processing unit, wherein:
the ship live-action augmented reality unit comprises:
a visual monitoring module, used for feeding each camera's video to the client for image display;
a navigation element video superposition module, used for converting the spatial coordinates of the electronic chart information into video coordinates and superimposing them on the client's image;
a target video information superposition module, used for converting the fused longitude and latitude coordinates of the target ship into video coordinates and superimposing them on the client's image;
the ship three-dimensional GIS auxiliary navigation unit comprises:
a berthing guarantee module, used for updating the ship's position in real time and issuing collision early warnings;
an equipment control module, used for pan-tilt control, focusing and frequency adjustment of each camera;
a navigation element three-dimensional situation superposition module, used for superimposing the ship's three-dimensional model on the two-dimensional environment of the electronic chart to form a three-dimensional electronic chart system;
the service unit comprises:
a situation service module comprising a situation fusion module and a threat assessment module, wherein the situation fusion module fuses the information collected by the navigation radar, the ranging pan-tilt camera and the panoramic camera into a comprehensive track, and the threat assessment module derives the threat index of a target ship relative to own ship from the situation elements provided by the navigation radar and the AIS and from the surrounding environment, issuing a predictive alarm if the threat index exceeds a preset alarm value; and
a navigation data processing module, used for obtaining the fused heading, longitude and latitude coordinates and pitch value of the target ship from the situation elements and uploading them to the client for display.
2. The augmented reality ship auxiliary navigation system of claim 1, wherein: the electronic chart information comprises channel lines, anchorage points and shallow-water areas.
3. The augmented reality ship auxiliary navigation system of claim 1, wherein: the system further comprises a laser range finder deployed on the ship's side; the berthing guarantee module measures the distance between the ship and the shore with the laser range finder and issues a collision early warning when the distance reaches a preset value.
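The distance-threshold warning of claim 3 can be sketched as below; the 5 m threshold and the additional 30 s time-to-contact margin are assumed values for illustration, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class BerthingMonitor:
    """Collision early warning from shipboard laser range-finder readings.

    Illustrative sketch: warns when the hull-to-quay distance reaches the
    preset value, or when the closing rate would reach contact within an
    assumed time-to-contact margin.
    """
    warn_distance_m: float = 5.0   # assumed preset warning distance
    warn_ttc_s: float = 30.0       # assumed time-to-contact margin
    _last: tuple = None            # (timestamp_s, distance_m) of prior reading

    def update(self, timestamp_s: float, distance_m: float) -> bool:
        alert = distance_m <= self.warn_distance_m
        if self._last is not None:
            dt = timestamp_s - self._last[0]
            closing = (self._last[1] - distance_m) / dt if dt > 0 else 0.0
            if closing > 0 and distance_m / closing <= self.warn_ttc_s:
                alert = True  # closing fast enough to contact within margin
        self._last = (timestamp_s, distance_m)
        return alert
```

Feeding the monitor one range reading per second raises the alert either on proximity or on a dangerously fast approach.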
4. The augmented reality ship auxiliary navigation system of claim 1, wherein the situation fusion module fusing information acquired by the navigation radar, the ranging pan-tilt camera and the panoramic camera into a comprehensive track comprises:
the situation fusion module fuses the information acquired by the navigation radar, the ranging pan-tilt camera and the panoramic camera into an accurate comprehensive track by sequentially performing space-time alignment, data filtering and feature-association matching.
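The feature-association step of claim 4 can be illustrated with a greedy nearest-neighbour matcher over plots that have already been space-time aligned into a common metre grid; the 50 m association gate is an assumed parameter:

```python
import math

def associate_tracks(radar_plots, camera_plots, gate_m=50.0):
    """Greedy nearest-neighbour association of radar and camera plots.

    Each plot is an (x, y) position in metres in a shared local frame,
    i.e. after the space-time alignment step. Returns a list of
    (radar_index, camera_index) pairs whose separation is within the gate.
    """
    pairs, used = [], set()
    for i, (rx, ry) in enumerate(radar_plots):
        best, best_d = None, gate_m
        for j, (cx, cy) in enumerate(camera_plots):
            if j in used:
                continue  # each camera plot may fuse with one radar plot
            d = math.hypot(rx - cx, ry - cy)
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            pairs.append((i, best))
            used.add(best)
    return pairs
```

Plots that find no partner within the gate remain single-sensor tracks; a production system would typically use a globally optimal assignment rather than this greedy pass.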
5. The augmented reality ship auxiliary navigation system of claim 1, wherein the threat assessment module obtaining a threat index of the target ship relative to the own ship according to the situation elements and the surrounding environment, and issuing a predictive warning when the threat index exceeds a preset warning value, comprises:
the threat assessment module performs situation element extraction, behavior intention recognition and threat assessment;
the situation elements comprise the speed of the target ship, its distance and azimuth from the own ship, and the own ship's type, heading, speed, longitude and latitude coordinates and pitch value; the surrounding environment comprises depth contours, obstacle areas, operation areas and navigation aids;
the behavior intention recognition uses a GRU-network-based target behavior intention recognition model to judge and grade the threat level of the target ship's behavior, finally forming a quantized threat index;
and during threat assessment, an alarm is reported when the threat index exceeds a set threshold.
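One way to form the quantized threat index of claim 5 is a weighted sum of normalised situation elements; the weights, normalisation ranges and 0.7 alarm threshold below are assumptions for illustration, since the patent leaves the exact scoring model unspecified:

```python
def threat_index(distance_nm, closing_speed_kn, intent_score,
                 w_dist=0.4, w_speed=0.3, w_intent=0.3):
    """Fuse situation elements into a quantized threat index in [0, 1].

    Illustrative sketch: distance is normalised over an assumed 10 nm
    range, closing speed over 30 kn, and intent_score stands in for the
    GRU-based behavior intention recognition output.
    """
    d = max(0.0, min(1.0, 1.0 - distance_nm / 10.0))  # closer => higher
    s = max(0.0, min(1.0, closing_speed_kn / 30.0))   # faster => higher
    i = max(0.0, min(1.0, intent_score))              # model confidence
    return w_dist * d + w_speed * s + w_intent * i

def should_alarm(index, threshold=0.7):
    """Report an alarm when the threat index exceeds the set threshold."""
    return index > threshold
```

A close, fast-closing target flagged by the intent model scores near 1.0 and trips the alarm; a distant, slow target does not.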
6. The augmented reality ship auxiliary navigation system of claim 5, wherein the navigation data processing module obtaining the fused heading, longitude and latitude coordinates and pitch value of the target ship according to the situation elements, and reporting them to the threat assessment module for storage, comprises:
the navigation data processing module sequentially performs time alignment, space alignment, identity judgment and extended-Kalman-filter processing on the target ship's speed, distance, azimuth and type, and on the own ship's heading, speed, longitude and latitude coordinates and pitch value, to obtain the fused heading, longitude and latitude coordinates and pitch value of the target ship.
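The filtering stage of claim 6 can be illustrated with a minimal one-dimensional Kalman filter; the full extended Kalman filter over heading, position and pitch states is omitted here for brevity, so this sketch only conveys the predict-then-update structure:

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter standing in for the extended-Kalman-filter
    stage of claim 6 (static state model, scalar measurements)."""

    def __init__(self, x0, p0, q, r):
        self.x = x0  # state estimate
        self.p = p0  # estimate variance
        self.q = q   # process noise variance
        self.r = r   # measurement noise variance

    def step(self, z):
        # Predict: with a static model the state is unchanged,
        # but uncertainty grows by the process noise.
        self.p += self.q
        # Update: blend the prediction with measurement z.
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Repeated measurements of the same quantity pull the estimate toward the measurement while the variance shrinks, which is the smoothing behaviour the fused track relies on.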
7. The augmented reality ship auxiliary navigation system of claim 1, wherein: the service unit further comprises a video management module for access, forwarding and storage management of the videos of all cameras.
8. The augmented reality ship auxiliary navigation system of claim 1, wherein: the service unit further comprises a device access module for unified management of the ranging pan-tilt camera, the panoramic camera, the navigation radar, the AIS and the laser range finder.
CN202311364742.8A 2023-10-20 2023-10-20 Augmented reality ship auxiliary navigation system Pending CN117542225A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311364742.8A CN117542225A (en) 2023-10-20 2023-10-20 Augmented reality ship auxiliary navigation system

Publications (1)

Publication Number Publication Date
CN117542225A 2024-02-09

Family

ID=89785079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311364742.8A Pending CN117542225A (en) 2023-10-20 2023-10-20 Augmented reality ship auxiliary navigation system

Country Status (1)

Country Link
CN (1) CN117542225A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117972452A (en) * 2024-03-28 2024-05-03 交通运输部南海航海保障中心广州海事测绘中心 S-127-based marine ship behavior compliance recognition system and electronic medium


Similar Documents

Publication Publication Date Title
US10908678B2 (en) Video and image chart fusion systems and methods
US10942028B2 (en) Video sensor fusion and model based virtual and augmented reality systems and methods
US11328155B2 (en) Augmented reality labels systems and methods
US20200278433A1 (en) Real-time monitoring of surroundings of marine vessel
CN111524392B (en) Comprehensive system for assisting intelligent ship remote driving
KR102661363B1 (en) Device and method for monitoring a berthing
CN110580044A (en) unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing
EP3654233A1 (en) System and method for identifying an object in water
US20220172464A1 (en) Water non-water segmentation systems and methods
CN104184990A (en) Navigation radar or AIS tracking parameter booted intelligent video monitoring system
US20220392211A1 (en) Water non-water segmentation systems and methods
CN113050121A (en) Ship navigation system and ship navigation method
KR102231343B1 (en) Marine warning system for the protection of bridge facilities
CA2282064A1 (en) A system and method for use with a moveable platform
WO2021178603A1 (en) Water non-water segmentation systems and methods
CN113780127A (en) Ship positioning and monitoring system and method
KR100990764B1 (en) System and method of image monitoring harbor
Sorial et al. Towards a real time obstacle detection system for unmanned surface vehicles
CA2279165A1 (en) A system and method for use with a moveable platform
CN111710192A (en) Ship bridge collision accident early warning and recording method, device and system
CN117542225A (en) Augmented reality ship auxiliary navigation system
CN116309732A (en) Ship motion visualization method based on digital twinning
CN111258322A (en) Marine driving auxiliary device and method based on augmented reality technology
CN213748480U (en) Electronic sand table system
KR102660973B1 (en) Mobile hmd for providing integration voyage information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination