CN112665588A - Ship navigation situation sensing method based on augmented reality - Google Patents
- Publication number
- CN112665588A (application CN202011434833.0A)
- Authority
- CN
- China
- Prior art keywords
- ship
- target ship
- target
- virtual
- video image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses an augmented-reality-based ship navigation situation awareness method comprising the following steps: a network camera mounted on the bow transmits real-time video of the scene ahead of the ship's route to a computer; radar fusion identification information and chart marking information are delivered to the computer by multicast/broadcast; the computer parses the received video, uses machine learning to extract the virtual three-dimensional coordinates of the target ship from the parsed images, and compares those virtual coordinates with the actual three-dimensional coordinates in the radar fusion identification information to obtain a virtual-to-real coordinate mapping for the target ship; finally, a three-dimensional virtual model of the target ship is superimposed on the video image according to that mapping, and the chart marking information is superimposed as well. Based on a hybrid three-dimensional registration technique, the method overlays navigation data on the real scene and displays many kinds of information simultaneously and intuitively.
Description
Technical Field
The invention relates to the technical field of augmented reality for ship navigation, and in particular to a ship navigation situation awareness method based on augmented reality.
Background
Augmented Reality (AR) is a technology that computes the position and orientation of a camera in real time and overlays corresponding imagery, seamlessly blending real-world and virtual-world information. It fuses the virtual and real worlds interactively on a display screen and has three prominent characteristics: (1) integration of real-world and virtual-world information; (2) real-time interactivity; (3) virtual objects can be positioned within the three-dimensional space of the real scene. Augmented reality is now widely used in medical research and anatomical training, precision instrument manufacturing and maintenance, military aircraft navigation, engineering design, and remote robot control.
In the field of ship navigation, an augmented reality navigation system can display, on a tablet or screen, information about other ships along a planned route together with surrounding sea conditions (such as shallow-water areas), fusing Automatic Identification System (AIS) information, radar information, and real-time video from a bridge camera. Although China is a major shipping nation, domestic research and development on augmented reality for ship navigation has so far been blank; the present invention independently develops a ship augmented reality navigation system with complete intellectual property rights.
Disclosure of Invention
In view of the above, the present invention provides a method for sensing a ship navigation situation based on augmented reality, so as to solve the problems in the background art.
A ship navigation situation perception method based on augmented reality comprises the following steps:
S1, a network camera shoots video images of the scene ahead of the ship's route in real time and transmits them to a computer over the network;
S2, radar fusion identification information and chart marking information are acquired by multicast/broadcast and transmitted to the computer over the network;
S3, a video processing module of the computer parses the received video images, extracts the virtual three-dimensional coordinates of the target ship from the parsed images using machine learning, and compares them with the actual three-dimensional coordinates in the radar fusion identification information to obtain the virtual-to-real coordinate mapping for the target ship;
and S4, the three-dimensional virtual model of the target ship is mapped and superimposed on the video image according to the virtual-to-real coordinate mapping, and the chart marking information is superimposed on the video image as well.
Preferably, the specific steps of extracting the virtual three-dimensional coordinates of the target ship from the parsed video image using machine learning in step S3 are:
first, a trained convolutional neural network performs high-dimensional feature extraction on the video image, producing several feature maps at different scales;
next, a multi-scale branch network performs feature aggregation on the feature map at each corresponding scale and passes the processed image features to a detection classifier;
finally, the detection classifier predicts from the received image features, determines the position of the bounding box and the class of ship inside it, and obtains the virtual three-dimensional coordinates of the target ship.
Preferably, the radar fusion identification information comprises the distance from the target ship to the own ship, the target ship's speed over ground, the minimum encounter distance between the target ship and the own ship, the time for the target ship to reach the expected closest point of approach, the distance at which the target ship crosses the own ship's bow, the time at which the target ship crosses the own ship's bow, the target ship's longitude and latitude, and the target ship's AIS information.
Preferably, the AIS information of the target vessel includes its vessel type, MMSI, call sign and name, and length and width.
Preferably, the method further comprises step S5: when a target ship in the video image enters the own ship's guard zone, or its time to the expected closest point of approach falls below a set value, or its minimum encounter distance from the own ship falls below a set value, the computer issues an alarm command.
Preferably, the chart marking information includes the water depth of the current sea area, waypoints, non-navigable areas, navigation stations, buoys, safety boundary lines, coastlines, and landmarks.
The invention has the beneficial effects that:
the method of the invention fills in the research and development blank in the technical field of domestic ship navigation reality augmentation, uses the AR technology in ship navigation, carries out data processing on a large amount of information which is originally analyzed and processed by a ship driver through an AR analysis synthesis system, then fuses with background basic parameters, and augments and displays the processed result on a screen in front of the driver so as to realize simultaneous and intuitive augmented display of various information.
Meanwhile, the method superimposes navigation data on the real scene based on a hybrid three-dimensional registration technique and incorporates target recognition and intelligent collision avoidance data, achieving virtual-real fusion and improving the convenience and intuitiveness of observing the data.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a configuration diagram of a ship navigation situation sensing system.
FIG. 2 is a hierarchical diagram of a data processing system within a computer.
Fig. 3 is a schematic diagram of extracting virtual three-dimensional coordinates from a video image using a machine learning technique.
Detailed Description
For better understanding of the technical solutions of the present invention, the following detailed descriptions of the embodiments of the present invention are provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The present application is described in further detail below with reference to specific embodiments and with reference to the attached drawings.
The invention provides a ship navigation situation awareness method based on augmented reality, filling the domestic research and development gap in augmented reality for ship navigation.
Specifically, the ship navigation situation awareness system comprises a network high-definition camera and one or more AR display consoles (i.e., computers). The camera and the consoles communicate through an Ethernet switch, and the consoles also communicate with external data service units through the same switch.
The external data service units fall into four types: a target fusion identification information unit, an electronic chart marking unit, a ship sensor information unit, and a collision avoidance information unit. Each unit transmits information in a fixed format; beyond strict adherence to the agreed network protocol format, there is no requirement on the physical form of an information input unit, as long as it can send data onto the local area network by multicast or broadcast. All data are exchanged through the network switch, which acts as the data hub: it is the aggregation node for the data, information, and control flows of the whole ship navigation situation awareness system. Communication between the AR display console on one side and the camera and information input units on the other is one-way: the console is the receiver, and all the other units are senders.
The network camera is fixed-focus and fixed-mount, and acquires video of the scene ahead of the ship's route in real time. In this embodiment, a Hikvision PoE-powered network camera is used.
The AR display console (i.e., a computer) is responsible for overlay display and human-machine interaction. It has a large display screen on which it superimposes the camera's real-time video with the information sent by the various external data service units, and it has voice input/output devices supporting voice command control and alarm reminders. If several AR display consoles are deployed, they operate independently, with no master-slave relationship.
The AR display console mainly consists of the following parts:
1) Video processing module: communicates with the network camera and handles receiving and parsing the video stream.
2) Network information processing module: acquires the radar fusion identification results, chart marking information, ship sensor information, etc. by multicast/broadcast; classifies, decodes, and stores them according to the agreed format; and provides data support for the subsequent augmented reality overlay of camera video and navigation elements.
3) Augmented reality fusion and overlay module: the core of the system. It calibrates the three-dimensional coordinate system from the processed video and, applying a hybrid tracking registration technique together with ship attitude information from the sensors and the target identification results, integrates and completes the data. It then displays the video image in three-dimensional space along with the various navigation elements: collision avoidance information, ship sensor information, chart marking information, and target fusion identification information.
4) Voice control module: builds the command dictionary, trains the recognition model, and handles command interpretation and execution.
5) Auxiliary module: settings, alarm prompts, and other supporting components.
The augmented reality fusion and overlay module is responsible for organically fusing the camera video stream with targets, chart elements, collision avoidance data, and local sensor information in a three-dimensional coordinate system. Around this core function, the system divides broadly into two parts: an operator side and a data side.
The invention discloses a ship navigation situation perception method based on augmented reality, which comprises the following steps:
and S1, the network camera shoots the video image in front of the navigation route of the ship in real time and transmits the video image to the computer through the network.
And S2, acquiring the radar fusion identification information and the chart mark information by multicast/broadcast and transmitting the information to the computer through the network.
The radar fusion identification information comprises the range (RNG) from the target ship to the own ship, the target ship's speed over ground (SOG), the minimum encounter distance (CPA) between the target ship and the own ship, the time (TCPA) for the target ship to reach the expected closest point of approach, the bow crossing range (BCR) and bow crossing time (BCT) of the target ship relative to the own ship, the target ship's longitude and latitude, and the target ship's AIS information.
The AIS information of the target ship includes its vessel type, MMSI, call sign and name, and length and width.
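As a concrete illustration, one radar fusion identification record as described above can be modeled as a simple data structure. This is a sketch only: the patent fixes the content of the record but not its field names, units, or wire format, so everything below (names, units, sample values) is an assumption.

```python
from dataclasses import dataclass

@dataclass
class RadarFusionTarget:
    """One radar fusion identification record for a target ship (illustrative)."""
    rng_nm: float    # RNG: distance from target ship to own ship (nautical miles)
    sog_kn: float    # SOG: target ship's speed over ground (knots)
    cpa_nm: float    # CPA: minimum encounter distance with own ship
    tcpa_min: float  # TCPA: time to the expected closest point of approach (minutes)
    bcr_nm: float    # BCR: range at which the target crosses the own ship's bow
    bct_min: float   # BCT: time at which the target crosses the own ship's bow
    lat: float       # target ship latitude (degrees)
    lon: float       # target ship longitude (degrees)
    ship_type: str   # AIS: vessel type
    mmsi: str        # AIS: Maritime Mobile Service Identity
    callsign: str    # AIS: call sign
    name: str        # AIS: ship name
    length_m: float  # AIS: length (metres)
    beam_m: float    # AIS: width (metres)

# Sample record with made-up values
target = RadarFusionTarget(2.4, 11.5, 0.3, 6.2, 0.8, 5.0,
                           31.23, 121.47, "cargo", "413123456",
                           "BXYZ", "DEMO SHIP", 60.0, 10.0)
```

A record like this is what the network information processing module would decode and store for each tracked target.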
The chart marking information comprises the water depth of the current sea area, waypoints, non-navigable areas, navigation stations, buoys, safe boundary lines, coastlines and landmarks.
In this embodiment, all data except the camera video are transmitted on the local area network by multicast or broadcast, so the efficiency of receiving network packets deserves attention. Specifically, an ACE_Proactor framework receives the packets: a UDP datagram is read in the Event_Handler event object into an ACE_Message_Block message structure; the received message blocks are then processed in multiple threads through a thread pool (control class TP_UpdateVideo); the data are placed into derived classes of PackageDescriptor according to the protocol message type; and the records are sorted into the display Update Buffer via PackageDescriptor pointers.
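The receiving pipeline just described (proactor-style reception, parsing datagrams into message structures, sorting into an update buffer) can be sketched in Python with standard sockets standing in for ACE. The header layout, message-type codes, and multicast address below are illustrative assumptions, not the patent's actual wire format.

```python
import queue
import socket
import struct

# Hypothetical 8-byte header: message type (uint32) then payload length (uint32).
HEADER = struct.Struct("!II")
MSG_RADAR, MSG_CHART, MSG_SENSOR, MSG_COLLISION = 1, 2, 3, 4

update_buffer = queue.Queue()  # plays the role of the display Update Buffer

def parse_packet(data: bytes):
    """Split one datagram into (msg_type, payload) per the assumed header."""
    if len(data) < HEADER.size:
        raise ValueError("datagram shorter than header")
    msg_type, length = HEADER.unpack_from(data)
    return msg_type, data[HEADER.size:HEADER.size + length]

def run_receiver(group: str = "239.0.0.1", port: int = 5007) -> None:
    """Join a multicast group and push parsed packets onto the update buffer.

    Address and port are placeholders; the patent only states that the data
    arrive on the LAN by multicast/broadcast in an agreed format.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    mreq = struct.pack("4sl", socket.inet_aton(group), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, _addr = sock.recvfrom(65535)
        update_buffer.put(parse_packet(data))
```

In a fuller analogue of the described design, several worker threads would drain `update_buffer` and dispatch each `(msg_type, payload)` pair to the appropriate decoder.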
S3: the computer's video processing module parses the received video images and extracts the virtual three-dimensional coordinates of the target ship from them using machine learning.
The specific steps of extracting the virtual three-dimensional coordinates of the target ship from the parsed video image using machine learning are:
first, a trained convolutional neural network performs high-dimensional feature extraction on the video image, producing several feature maps at different scales;
next, a multi-scale branch network performs feature aggregation on the feature map at each corresponding scale and passes the processed image features to a detection classifier;
finally, the detection classifier predicts from the received image features, determines the position of the bounding box and the class of ship inside it, and obtains the virtual three-dimensional coordinates of the target ship.
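The three-stage pipeline above (multi-scale feature extraction, feature aggregation, detection) can be illustrated with a deliberately tiny, dependency-free sketch. Average pooling stands in for the trained CNN backbone, nearest-neighbour upsampling plus averaging stands in for the multi-scale branch network, and the "classifier" simply thresholds the fused response; a real implementation would of course use a learned detector.

```python
def avg_pool(img, k):
    """Downsample a 2-D grid by k x k average pooling (toy backbone stage)."""
    h, w = len(img) // k, len(img[0]) // k
    return [[sum(img[i * k + di][j * k + dj]
                 for di in range(k) for dj in range(k)) / (k * k)
             for j in range(w)]
            for i in range(h)]

def extract_feature_maps(img, ks=(2, 4)):
    """Toy stand-in for the CNN backbone: one pooled 'feature map' per scale."""
    return [avg_pool(img, k) for k in ks]

def aggregate(maps):
    """Toy multi-scale aggregation: upsample each map (nearest neighbour)
    to the finest grid and average cell by cell."""
    fh, fw = len(maps[0]), len(maps[0][0])   # finest map is listed first
    out = [[0.0] * fw for _ in range(fh)]
    for m in maps:
        ch, cw = len(m), len(m[0])
        for i in range(fh):
            for j in range(fw):
                out[i][j] += m[i * ch // fh][j * cw // fw]
    return [[v / len(maps) for v in row] for row in out]

def detect(img, ks=(2, 4), thresh=0.5):
    """Toy 'detection classifier': report the strongest fused cell as a
    ship bounding box if its response clears the threshold."""
    fused = aggregate(extract_feature_maps(img, ks))
    i, j = max(((i, j) for i in range(len(fused)) for j in range(len(fused[0]))),
               key=lambda ij: fused[ij[0]][ij[1]])
    score = fused[i][j]
    if score < thresh:
        return None
    k = ks[0]   # map the finest-grid cell back to pixel coordinates
    return {"cls": "ship", "score": score,
            "box": (j * k, i * k, (j + 1) * k, (i + 1) * k)}
```

Running `detect` on an 8x8 test "image" with a bright patch in the top-left corner returns a box over that patch, mirroring how the real classifier localizes a target before its image position is compared with the radar track.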
The virtual three-dimensional coordinates of the target ship are then compared with the actual three-dimensional coordinates in the radar fusion identification information to obtain the virtual-to-real coordinate mapping for the target ship.
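The patent does not specify how the virtual-to-real mapping is fitted from the compared coordinate pairs. One minimal, hedged possibility is an independent least-squares scale-and-offset fit per coordinate axis over matched detection/radar pairs; any names below are for illustration only.

```python
def fit_axis(us, xs):
    """Ordinary least-squares fit x ~ a*u + b for one coordinate axis."""
    n = len(us)
    su, sx = sum(us), sum(xs)
    suu = sum(u * u for u in us)
    sux = sum(u * x for u, x in zip(us, xs))
    a = (n * sux - su * sx) / (n * suu - su * su)
    b = (sx - a * su) / n
    return a, b

def fit_mapping(virtual_pts, real_pts):
    """Fit one (scale, offset) pair per axis from matched coordinates of
    detected targets (virtual) and radar tracks (real)."""
    axes = []
    for d in range(len(virtual_pts[0])):
        us = [p[d] for p in virtual_pts]
        xs = [p[d] for p in real_pts]
        axes.append(fit_axis(us, xs))
    return axes

def apply_mapping(axes, pt):
    """Map a virtual coordinate into the real frame using the fitted axes."""
    return tuple(a * u + b for (a, b), u in zip(axes, pt))

# Example: four matched virtual/real pairs with a known per-axis relation
virtual = [(0, 0, 0), (1, 2, 4), (2, 1, 8), (3, 3, 2)]
real = [(2 * u + 1, 3 * v - 1, 0.5 * w + 10) for u, v, w in virtual]
axes = fit_mapping(virtual, real)
```

A full solution would estimate a joint rigid or projective transform (rotation plus translation) rather than treating the axes independently; this sketch only shows the shape of the calibration step.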
S4: the three-dimensional model of the target ship is mapped and superimposed on the video image according to the virtual-to-real coordinate mapping, and the chart marking information is superimposed on the video image as well.
Modeling of the virtual imagery relies on the OpenSceneGraph (OSG, http://www.openscenegraph.org/) cross-platform open-source 3D scene graph application programming interface (API) together with osgEarth, which provide a complete set of geospatial reference systems, including geographic coordinate systems and projection transformations; the coordinate system and projection mode can be customized. In OSG/osgEarth, two concepts are central to organizing three-dimensional scene data: nodes and the scene tree. The basic unit of a 3D scene in OSG is the node, which is either a group node or a leaf node. A leaf node manages one or more renderables, which can be queried through its interface functions. The OSG rendering engine organizes spatial data in a top-down hierarchical tree structure (the scene tree), which greatly improves scene rendering efficiency.
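The node/scene-tree organization described for OSG can be mimicked in a few lines: group nodes own children, leaf nodes own renderables, and rendering is a top-down traversal. This is a conceptual sketch of the structure only, not OSG's actual C++ API; all names are illustrative.

```python
class Node:
    def __init__(self, name):
        self.name = name

class Group(Node):
    """Interior node of the scene tree; owns child nodes."""
    def __init__(self, name):
        super().__init__(name)
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

class Leaf(Node):
    """Leaf node managing one renderable (a drawable, in OSG terms)."""
    def __init__(self, name, drawable):
        super().__init__(name)
        self.drawable = drawable

def render(node, out):
    """Top-down traversal, as the rendering engine walks the scene tree."""
    if isinstance(node, Group):
        for child in node.children:
            render(child, out)
    else:
        out.append(node.drawable)
    return out

# A minimal AR scene: target-ship models plus a chart overlay
root = Group("scene")
targets = root.add(Group("targets"))
targets.add(Leaf("ship-413123456", "ship-model"))
root.add(Leaf("chart", "chart-overlay"))
```

Traversing `root` yields the renderables in tree order, which is the property the hierarchical organization exploits for efficient culling and drawing.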
The augmented-reality-based ship navigation situation awareness method further comprises step S5: when a target ship in the video image enters the own ship's guard zone, or its time to the expected closest point of approach falls below a set value, or its minimum encounter distance from the own ship falls below a set value, the computer issues an alarm command.
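The three alarm conditions of step S5 reduce to a simple predicate. The default limits below are illustrative; the patent leaves the set values to configuration.

```python
def should_alarm(in_guard_zone: bool, tcpa_min: float, cpa_nm: float,
                 tcpa_limit: float = 10.0, cpa_limit: float = 0.5) -> bool:
    """Step S5: alarm when the target is inside the own-ship guard zone, or
    its TCPA or minimum encounter distance falls below the set value.
    The default limits are illustrative, not values from the patent."""
    return in_guard_zone or tcpa_min < tcpa_limit or cpa_nm < cpa_limit

# e.g. a target 6 minutes from its closest point of approach triggers an alarm
assert should_alarm(False, 6.0, 1.2)
```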
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (6)
1. A ship navigation situation perception method based on augmented reality is characterized by comprising the following steps:
S1, a network camera shoots video images of the scene ahead of the ship's route in real time and transmits them to a computer over the network;
S2, radar fusion identification information and chart marking information are acquired by multicast/broadcast and transmitted to the computer over the network;
S3, a video processing module of the computer parses the received video images, extracts the virtual three-dimensional coordinates of the target ship from the parsed images using machine learning, and compares them with the actual three-dimensional coordinates in the radar fusion identification information to obtain the virtual-to-real coordinate mapping for the target ship;
and S4, the three-dimensional virtual model of the target ship is superimposed on the video image according to the virtual-to-real coordinate mapping, and at the same time the chart marking information is superimposed on the video image.
2. The method for sensing ship navigation situation based on augmented reality of claim 1, wherein the specific steps of extracting the virtual three-dimensional coordinates of the target ship from the analyzed video image by using the machine learning technology in step S3 are as follows:
firstly, carrying out high-dimensional feature extraction on a video image by using a trained convolutional neural network to obtain a plurality of feature maps with different scales;
then, using a multi-scale branch network to perform feature aggregation processing on the feature map under the corresponding scale and transmitting the processed image features to a detection classifier;
and the detection classifier predicts the received image characteristics, judges the position of the boundary box and the class of the ship in the boundary box and acquires the virtual three-dimensional coordinate of the target ship.
3. The augmented reality-based ship navigation situation awareness method according to claim 1, wherein the radar fusion identification information includes the distance from the target ship to the own ship, the target ship's speed over ground, the minimum encounter distance between the target ship and the own ship, the time for the target ship to reach the expected closest point of approach, the distance at which the target ship crosses the own ship's bow, the time at which the target ship crosses the own ship's bow, the target ship's longitude and latitude, and the target ship's AIS information.
4. The augmented reality-based vessel navigation situation awareness method according to claim 3, wherein the AIS information of the target vessel includes a model, MMSI, call sign and name, length and width of the target vessel.
5. The augmented reality-based ship navigation situation awareness method according to claim 3, further comprising a step S5 of the computer issuing an alarm command when the target ship in the video image enters the guard zone of the own ship, or the time for the target ship to reach the expected closest point of approach is less than a set value, or the minimum encounter distance between the target ship and the own ship is less than a set value.
6. The augmented reality-based ship navigation situation awareness method according to claim 1, wherein the chart marking information includes water depth of current sea area, waypoints, non-navigable area, navigation station, buoy, safe boundary line, coastline, landmark.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011434833.0A CN112665588A (en) | 2020-12-10 | 2020-12-10 | Ship navigation situation sensing method based on augmented reality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112665588A true CN112665588A (en) | 2021-04-16 |
Family
ID=75401723
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011434833.0A Pending CN112665588A (en) | 2020-12-10 | 2020-12-10 | Ship navigation situation sensing method based on augmented reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112665588A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6181302B1 (en) * | 1996-04-24 | 2001-01-30 | C. Macgill Lynde | Marine navigation binoculars with virtual display superimposing real world image |
KR101004126B1 (en) * | 2010-09-27 | 2010-12-27 | (주)에디넷 | 3-dimensional vessel traffic service system |
KR20180065411A (en) * | 2016-12-07 | 2018-06-18 | 한국해양과학기술원 | System and method for automatic tracking of marine objects |
CN108550281A (en) * | 2018-04-13 | 2018-09-18 | 武汉理工大学 | A kind of the ship DAS (Driver Assistant System) and method of view-based access control model AR |
CN109084747A (en) * | 2018-06-26 | 2018-12-25 | 武汉理工大学 | Water transportation panorama three-dimension navigation system and method based on general three-dimensional engine |
CN109766811A (en) * | 2018-12-31 | 2019-05-17 | 复旦大学 | The end-to-end detection and recognition methods of sea ship in a kind of satellite-borne SAR image |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6181302B1 (en) * | 1996-04-24 | 2001-01-30 | C. Macgill Lynde | Marine navigation binoculars with virtual display superimposing real world image |
KR101004126B1 (en) * | 2010-09-27 | 2010-12-27 | (주)에디넷 | 3-dimensional vessel traffic service system |
KR20180065411A (en) * | 2016-12-07 | 2018-06-18 | 한국해양과학기술원 | System and method for automatic tracking of marine objects |
CN108550281A (en) * | 2018-04-13 | 2018-09-18 | 武汉理工大学 | A kind of the ship DAS (Driver Assistant System) and method of view-based access control model AR |
CN109084747A (en) * | 2018-06-26 | 2018-12-25 | 武汉理工大学 | Water transportation panorama three-dimension navigation system and method based on general three-dimensional engine |
CN109766811A (en) * | 2018-12-31 | 2019-05-17 | 复旦大学 | The end-to-end detection and recognition methods of sea ship in a kind of satellite-borne SAR image |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113450597A (en) * | 2021-06-09 | 2021-09-28 | 浙江兆晟科技股份有限公司 | Ship auxiliary navigation method and system based on deep learning |
CN113450597B (en) * | 2021-06-09 | 2022-11-29 | 浙江兆晟科技股份有限公司 | Ship auxiliary navigation method and system based on deep learning |
CN114089299A (en) * | 2021-09-01 | 2022-02-25 | 中船航海科技有限责任公司 | Marine target detection and identification method based on situation awareness multi-source sensor linkage |
CN114954839A (en) * | 2022-05-31 | 2022-08-30 | 浙江省交通运输科学研究院 | Ship situation perception control method and system and vision processing chip |
CN114954839B (en) * | 2022-05-31 | 2023-08-18 | 浙江省交通运输科学研究院 | Ship situation awareness control method and system and vision processing chip |
CN115372911A (en) * | 2022-08-30 | 2022-11-22 | 中国船舶集团有限公司第七二三研究所 | Virtual scene and real test platform space position mapping conversion method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112665588A (en) | Ship navigation situation sensing method based on augmented reality | |
CN110221546B (en) | Virtual-real integrated ship intelligent control system test platform | |
CN111448476B (en) | Technique for sharing mapping data between unmanned aerial vehicle and ground vehicle | |
CN103941746B (en) | Image processing system and method is patrolled and examined without man-machine | |
CN111239790A (en) | Vehicle navigation system based on 5G network machine vision | |
CN109154499A (en) | System and method for enhancing stereoscopic display | |
CN107144281B (en) | Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision | |
CN106546245B (en) | Aircraft trace based on ADS-B data is inferred and smoothing method | |
CN109084747A (en) | Water transportation panorama three-dimension navigation system and method based on general three-dimensional engine | |
CN112785842B (en) | Online traffic flow simulation system | |
CN103714719A (en) | Navigation flight navigating system based on BeiDou satellite navigation | |
BR102019020832A2 (en) | AIRCRAFT FLIGHT INFORMATION SYSTEM AND METHOD | |
CN111223354A (en) | Unmanned trolley, and AR and AI technology-based unmanned trolley practical training platform and method | |
CN109656319B (en) | Method and equipment for presenting ground action auxiliary information | |
CN114295139A (en) | Cooperative sensing positioning method and system | |
CA3020190C (en) | Intelligent lighting system, intelligent vehicle and auxiliary vehicle driving system and method therefor | |
CN114485700A (en) | High-precision dynamic map generation method and device | |
CN114923477A (en) | Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology | |
CN115641426A (en) | Method and device for displaying environment information and computer readable storage medium | |
CN116243725A (en) | Substation unmanned aerial vehicle inspection method and system based on visual navigation | |
Ross et al. | Vision-based target geolocation and optimal surveillance on an unmanned aerial vehicle | |
Wu et al. | A method of information fusion for the civil aviation ASTERIX data and airport surface video surveillance | |
US11450216B2 (en) | Aircraft display systems and methods for identifying target traffic | |
CN207379510U (en) | Unmanned plane indoor locating system based on cooperative target and monocular vision | |
CN204881653U (en) | Outdoor scene video navigation of hi -Fix |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||