CN113179292A - Ship positioning method, system and device based on edge calculation and storage medium - Google Patents

Ship positioning method, system and device based on edge calculation and storage medium

Info

Publication number
CN113179292A
Authority
CN
China
Prior art keywords
position information
equipment
image
piloting
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110227242.4A
Other languages
Chinese (zh)
Inventor
马枫
王宁
刘佳仑
李诗杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University of Technology WUT
Original Assignee
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University of Technology WUT filed Critical Wuhan University of Technology WUT
Priority to CN202110227242.4A priority Critical patent/CN113179292A/en
Publication of CN113179292A publication Critical patent/CN113179292A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/10 Protocols in which an application is distributed across nodes in the network
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10712 Fixed beam scanning
    • G06K7/10722 Photodetector array or CCD scanning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 Methods for optical code recognition
    • G06K7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/1417 2D bar codes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/06 Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/52 Network services specially adapted for the location of the user terminal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20032 Median filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a ship positioning method, system, device, and storage medium based on edge computing. The method includes: capturing a device image of a piloting device with a camera and processing the image on an edge computing device to determine first position information of the piloting device; and acquiring camera parameters of the camera, determining second position information of the ship from the first position information and the camera parameters, and sending the second position information to the cloud. In the embodiments of the application, the edge computing device obtains the position information of the piloting device and from it determines the position information of the ship, which improves the accuracy of assisted positioning; in addition, because image processing and related steps are completed on the edge computing device, the problem of data transmission delay from the edge computing device to the terminal is alleviated to a certain extent.

Description

Ship positioning method, system and device based on edge calculation and storage medium
Technical Field
The present disclosure relates to the field of ship positioning, and more particularly, to a ship positioning method, system, apparatus, and storage medium based on edge computing.
Background
As science and technology advance, the domestic shipping industry faces growing demands for intelligent development. During ship navigation, in addition to satellite positioning with the Global Positioning System (GPS), fixed piloting devices such as buoys and shore marks are used to assist in positioning the ship and to confirm that it is running in the correct channel. In the past, a pilot on board often had to judge the relative distance between the ship and a fixed piloting device by visual observation, a method with low accuracy and poor assisted-positioning performance. As technology has developed, radar equipment has been increasingly adopted in the maritime field as a positioning aid. In inland waterway environments, however, narrow channels often have to be judged at short range, and a marine radar generally performs well only when the distance exceeds about ten times the ship length, so its short-range recognition performance is significantly degraded.
Disclosure of Invention
The present application aims to solve, at least to some extent, one of the problems in the related art, and to this end provides a ship positioning method, system, apparatus, and storage medium based on edge computing.
In a first aspect, an embodiment of the present application provides a ship positioning method based on edge computing, including: acquiring a device image of a piloting device; performing image processing on the device image and determining first position information of the piloting device; acquiring camera parameters of the camera that photographs the piloting device; determining second position information of the ship according to the first position information and the camera parameters; and sending the second position information to the cloud.
Optionally, performing image processing on the device image and determining the first position information of the piloting device includes: performing image processing on the device image to locate a two-dimensional code on the piloting device; and identifying the two-dimensional code and acquiring the first position information stored in the two-dimensional code.
Optionally, the camera parameters include a lens focal length, a target surface size height, and a lens height of the camera, and the determining the second position information of the ship according to the first position information and the camera parameters specifically includes: determining the distance between the lens of the camera and the piloting equipment according to the focal length of the lens, the size height of the target surface and the height of the lens; and determining the second position information according to the first position information and the distance between the lens of the camera and the piloting equipment.
Optionally, the method further comprises: when the distance between the camera lens and the piloting device falls below a preset safety threshold, determining a collision warning instruction.
Optionally, the method further comprises: acquiring environmental information of the ship in navigation, wherein the environmental information comprises temperature, humidity and wind speed; and sending the environment information to a cloud.
Optionally, acquiring a device image of a piloting device includes: acquiring an environment image around the ship; and performing image detection on the environment image and determining a device image containing a piloting device.
In a second aspect, an embodiment of the present application provides a ship positioning system based on edge computing, including a camera, an edge computing device, a cloud, and a terminal. The camera is used for photographing the piloting device to acquire a device image. The edge computing device is used for performing image processing on the acquired device image and determining first position information of the piloting device, estimating second position information of the ship according to the first position information and the camera parameters of the camera, and sending the second position information to the cloud. The cloud is used for receiving and storing the second position information, sending the second position information to the terminal, and receiving instructions sent by the terminal. The terminal is used for receiving the second position information sent by the cloud and sending instructions to the cloud.
Optionally, the edge computing device further comprises: an acquisition module; the acquisition module comprises a temperature and humidity sensor and an air speed sensor; the acquisition module is used for acquiring environmental information, and the environmental information comprises temperature, humidity and wind speed.
In a third aspect, an embodiment of the present application provides an apparatus, including: at least one processor; and at least one memory for storing at least one program; when the at least one program is executed by the at least one processor, the at least one processor is caused to implement the edge-computing-based ship positioning method according to the first aspect.
In a fourth aspect, the present application provides a storage medium in which a processor-executable program is stored; when executed by a processor, the program implements the edge-computing-based ship positioning method according to the first aspect.
The embodiments of the application have the following beneficial effects: a camera photographs device images of piloting devices such as shore marks and buoys, and an edge computing device performs image processing on the device images to determine first position information of the piloting device; camera parameters of the camera photographing the piloting device are acquired, second position information of the ship is determined from the first position information and the camera parameters, and the second position information is sent to the cloud. Because the device image is processed by the edge computing device, the position information of the piloting device is obtained, and the position information of the ship is determined from it, the accuracy of assisted positioning is improved; in addition, since image processing and related steps are completed on the edge computing device, there is no need to rely on the computing power of the cloud, and the problem of data transmission delay from the edge computing device to the terminal is alleviated to a certain extent.
Drawings
The accompanying drawings are included to provide a further understanding of the claimed subject matter and are incorporated in and constitute a part of this specification, illustrate embodiments of the subject matter and together with the description serve to explain the principles of the subject matter and not to limit the subject matter.
FIG. 1 is a schematic diagram of an edge computing based vessel positioning system provided by some embodiments of the present application;
FIG. 2 is a step diagram of an edge-computing-based ship positioning method provided by some embodiments of the present application;
FIG. 3 is a diagram illustrating the steps of OpenCV-based image processing of device images in some embodiments of the present application;
fig. 4 is a diagram illustrating steps of scanning and recognizing a two-dimensional code by using Zbar according to some embodiments of the present application;
fig. 5 is an apparatus provided in some embodiments of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It should be noted that although functional modules are divided in the system drawings and a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed with a different module division or in an order different from that in the flowcharts. The terms "first", "second", and the like in the description, the claims, and the drawings are used to distinguish similar objects and do not necessarily describe a particular sequence or chronological order.
As science and technology advance, the domestic shipping industry faces growing demands for intelligent development. During ship navigation, fixed piloting devices such as buoys and shore marks are commonly used to assist in positioning the ship. In the past, a pilot on board often had to judge the relative distance between the ship and a fixed piloting device by visual observation, a method with low accuracy and poor assisted-positioning performance. As technology has developed, radar equipment has also been applied to assisted ship positioning; however, because a marine radar performs well only when the distance exceeds about ten times the ship length, its short-range recognition is not good enough for inland waterway environments.
In view of these shortcomings of the related art, it is feasible to use computer vision techniques to recognize fixed piloting devices in inland waterway environments, perform image processing on an edge device, and extract position information to assist in positioning the ship. Moreover, by introducing an edge device with its own computing power, the embodiments of the present application avoid, to a certain extent, the drawback of having to process everything centrally at the terminal, which is of significance for the intelligent development of ships.
The embodiments of the present application will be further explained with reference to the drawings.
Referring to fig. 1, fig. 1 is a schematic diagram of a ship positioning system based on edge computing according to some embodiments of the present application, where the system 100 includes a camera 110, an edge computing device 120, a cloud 130, and a terminal 140, and the camera is configured to capture a piloting device and obtain a device image; the edge computing equipment is used for carrying out image processing on the acquired equipment image and determining first position information of the piloting equipment; estimating second position information of the ship according to the first position information and camera parameters of the camera; sending the second position information to the cloud end; the cloud end is used for receiving and storing the second position information, sending the second position information to the terminal and receiving an instruction sent by the terminal; the terminal is used for receiving the second position information sent by the cloud end and sending an instruction to the cloud end.
The edge computing device may specifically include a K-210 artificial intelligence chip based on the RISC-V architecture (reduced instruction set computer, fifth generation). The chip uses a RISC-V processor architecture with a dual-core 64-bit CPU and 1 TOPS of computing power, supports multimodal recognition for machine vision and machine hearing, and features low power consumption, easy expansion, and strong programmability, which meets the requirements of the edge computing device in the embodiments of the present application. When implementing the edge-computing-based ship positioning system, a development board carrying the K-210 chip is prepared and the development environment is configured with the matching toolchain; that is, MicroPython-based MaixPy firmware is flashed to the development board so that the program running on the board can call most common Python libraries, making subsequent operations easy to implement. The K-210 chip is then extended by connecting the camera used for shooting to the development board, so that the K-210 chip can process the images captured by the camera.
The terminal may be a PC at an onshore ship monitoring center. The PC communicates bidirectionally with the cloud, receiving the ship position information or the environmental information collected during navigation that the cloud sends; the PC can display this information, allowing the user to view it visually and issue instructions to the cloud. The terminal may also be a mobile phone; the present application does not limit the specific type of terminal.
The cloud communicates bidirectionally with the edge computing device, receiving the ship position information calculated by the edge computing device and forwarding the instructions issued by the terminal to the edge computing device. The cloud can also communicate bidirectionally with the terminal, sending the stored ship position information to the terminal and receiving the instructions issued by the terminal.
Optionally, the K-210 chip is further extended by connecting the development board to an acquisition module composed of several sensors, including but not limited to a temperature and humidity sensor and a wind speed sensor; the acquisition module is used to collect environmental information during navigation. After simple processing, the collected environmental information is packaged together with the first position information, the second position information, and other data into a complete JSON packet, and the K-210 communicates bidirectionally with the cloud over TCP to upload the JSON packet. The cloud stores the data after receiving it. The cloud runs in a multithreaded mode and also establishes bidirectional communication with the PC for data transmission: after the PC makes a data request, the cloud sends the data to the PC in sequence at a fixed frequency. After receiving the data, the PC calls the prepared visualization module to display it and can send commands to the K-210 through the cloud.
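As an illustration of this data flow, the sketch below (plain Python; the field names, host address, and port are assumptions made for the example and are not specified in this application) packages the position and environment readings into a JSON packet and uploads it to the cloud over TCP:

```python
import json
import socket
import time

CLOUD_HOST = "cloud.example.com"  # placeholder address (assumption)
CLOUD_PORT = 9000                 # placeholder port (assumption)

def build_packet(first_pos, second_pos, temperature, humidity, wind_speed):
    """Pack the piloting-device position, the estimated ship position and the environment data into one JSON packet."""
    return json.dumps({
        "timestamp": time.time(),
        "pilot_device_gps": first_pos,   # first position information (read from the two-dimensional code)
        "ship_gps": second_pos,          # second position information (estimated on the edge device)
        "environment": {
            "temperature": temperature,
            "humidity": humidity,
            "wind_speed": wind_speed,
        },
    }).encode("utf-8")

def upload(packet: bytes) -> None:
    """Send one packet to the cloud over TCP; messages are newline-delimited (a simple framing choice)."""
    with socket.create_connection((CLOUD_HOST, CLOUD_PORT), timeout=5) as sock:
        sock.sendall(packet + b"\n")

if __name__ == "__main__":
    upload(build_packet((114.30, 30.52), (114.31, 30.53), 21.5, 63.0, 3.2))
```

On the cloud side, a corresponding multithreaded TCP server would store each received packet and forward it to the PC on request, as described above.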
It should be noted that the piloting equipment mentioned in the embodiments of the present application is fixed-position piloting equipment such as buoys and shore marks.
Referring to fig. 2, fig. 2 is a step diagram of a ship positioning method based on edge computing according to some embodiments of the present application, where the method includes, but is not limited to, steps S200 to S240.
And step S200, acquiring an equipment image of the piloting equipment.
Specifically, during navigation, a camera arranged on the ship photographs the surrounding environment, acquires environment images, and performs image detection on them. In the embodiments of the present application, a target detection algorithm can be used for this detection. Popular target detection algorithms fall into two categories. The first category comprises region-proposal-based R-CNN algorithms, which are two-stage methods: a heuristic method such as Selective Search, or a CNN, is first used to generate region proposals, which are then classified and regressed. The second category comprises one-stage algorithms represented by YOLO, which use a CNN to directly predict the categories and locations of different targets. The first category is accurate but slow; the second is fast but somewhat less accurate. Considering the ship-positioning scenario of the embodiments of the present application, the YOLOv2 algorithm is used for image detection: environment images are captured continuously during navigation, and YOLOv2 is used to recognize piloting devices in them. Once a piloting device is determined to be in the current shooting direction, a clear device image of the piloting device can be obtained by adjusting parameters such as the camera focal length.
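As a sketch of what such a detection loop might look like on the MaixPy firmware mentioned above (the model flash address, anchor values, and thresholds are placeholders, and the KPU API shown here may differ between MaixPy versions), consider:

```python
# MaixPy (MicroPython on the K-210) sketch: run a YOLOv2 model on the KPU to spot piloting devices.
# The flash offset, anchors and thresholds below are illustrative placeholders only.
import sensor
import KPU as kpu

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.run(1)

task = kpu.load(0x300000)  # assumed flash offset of a trained piloting-device model
anchors = (1.08, 1.19, 3.42, 4.41, 6.63, 11.38, 9.42, 5.11, 16.62, 10.52)
kpu.init_yolo2(task, 0.5, 0.3, 5, anchors)  # score threshold, NMS threshold, number of anchors

while True:
    img = sensor.snapshot()
    objects = kpu.run_yolo2(task, img)
    if objects:
        # A piloting device (buoy / shore mark) is in the current shooting direction:
        # the focal length can now be adjusted and a clear device image captured.
        for obj in objects:
            img.draw_rectangle(obj.rect())
```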
Step S210, performing image processing on the device image, and determining first position information of the piloting device.
In this embodiment, the first position information is the GPS position of the piloting device. The piloting device carries a recognizable mark indicating its position. This mark can take various forms; for example, position information such as the GPS coordinates of the piloting device could be printed directly on the device and read from the device image using image recognition, thereby acquiring the first position information. In the embodiments of the present application, the position information of the piloting device is stored in a two-dimensional code. Specifically, a two-dimensional code of suitable size is attached to a fixed piloting device such as a buoy or shore mark; the appropriate size needs to be tested for each type of fixed piloting device, since a code that is too large or too small hinders detection. The information carried by the two-dimensional code should at least include the detailed information of the piloting device and the GPS coordinates of its location, and the GPS data may be kept to two decimal places. After the device image is acquired, the first position information of the piloting device can be obtained by recognizing the two-dimensional code.
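Purely as an illustration, a label of this kind could be produced ashore with the Python `qrcode` package before being fixed to the buoy or shore mark; the payload structure and field names below are assumptions for the example, not a format defined in this application:

```python
import json
import qrcode  # pip install qrcode[pil]

# Example payload: device details plus GPS coordinates kept to two decimal places.
payload = json.dumps({
    "device_id": "buoy-017",   # assumed identifier format
    "type": "buoy",
    "lon": round(114.3052, 2),
    "lat": round(30.5211, 2),
})

img = qrcode.make(payload)     # render the two-dimensional code
img.save("buoy-017.png")       # print at a size suited to the particular piloting device
```

On board the ship, the recognition of such a code from the device image then proceeds as described below.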
Specifically, the device image is processed with OpenCV to extract the two-dimensional code region, and the two-dimensional code is then scanned and decoded with ZBar to obtain the first position information of the piloting device.
Referring to fig. 3, fig. 3 is a diagram illustrating steps of image processing on a device image based on OpenCV in some embodiments of the present application, including, but not limited to, steps S300 to S360.
It should be noted that, in the steps in fig. 3, a plurality of processes are sequentially performed on the device image, and an image obtained in a previous process step is used as an input image of a next step, and for descriptive purposes, both the image before the process and the image after the process are referred to as the device image when describing the steps in fig. 3.
And step S300, performing gray level conversion processing on the device image.
Specifically, during image processing the original three-channel RGB image is converted into a single-channel grayscale image, which removes color interference and makes the image easier to process. The cvtColor() function provided by OpenCV performs color space conversion and can convert the original RGB image into a grayscale image, so the device image is converted to grayscale using cvtColor().
In step S310, gaussian smoothing filter processing is performed on the device image.
Specifically, Gaussian filtering is a linear smoothing filter that is commonly used to suppress Gaussian noise with good results. Therefore, the GaussianBlur() function in OpenCV is used to blur the grayscale device image obtained in step S300 with a Gaussian filter, thereby suppressing Gaussian noise.
Step S320, median filtering is performed on the device image.
Specifically, median filtering is a nonlinear filtering method that removes noise while preserving edge details of the image, which makes it suitable for extracting the region where the two-dimensional code is located. Therefore, the medianBlur() function in OpenCV is used to median-filter the device image.
Step S330, edge detection is performed on the device image.
Specifically, the purpose of edge detection is to extract the contour of the target region, which reduces the amount of image data to process, removes background information unrelated to the two-dimensional code, and keeps only the two-dimensional code region. The Sobel edge detection operator is used to compute the gradients of the device image in the X and Y directions separately, and the two gradient results are then superimposed.
In step S340, the device image is binarized.
Specifically, to facilitate further processing and avoid interference from excessive gray levels, the device image obtained in step S330 is binarized so that the image contains only black and white values, which makes the contour of the target region stand out; the binarization of the device image is implemented with the threshold() function in OpenCV.
Step S350, performing a closing operation on the device image.
Specifically, to close the small gaps within the two-dimensional code pattern and remove isolated interference points, a closing operation is applied to the device image, i.e., dilation followed by erosion; the closing operation can be implemented with the morphologyEx() function in OpenCV.
And step S360, acquiring a two-dimensional code area in the equipment image.
Specifically, a findContours () function is used for finding out a rectangular boundary of a two-dimensional code region in the equipment image, then finding out an outline with the largest area, cutting the two-dimensional code according to the region, storing the outline as a corresponding result for subsequent identification, and recording the size of the outline region of the two-dimensional code and the position of the center of the region.
Through steps S300 to S360, the device image may be subjected to image processing using various functions in OpenCV, so as to extract the two-dimensional code on the piloting device.
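Taken together, steps S300 to S360 correspond to a short OpenCV (Python) routine along the lines of the sketch below; the kernel sizes, thresholds, and structuring-element dimensions are illustrative assumptions rather than values specified in this application, and OpenCV 4.x is assumed:

```python
import cv2
import numpy as np

def extract_qr_region(device_image: np.ndarray) -> np.ndarray:
    """Locate and crop the two-dimensional code region of a device image (steps S300-S360)."""
    gray = cv2.cvtColor(device_image, cv2.COLOR_BGR2GRAY)            # S300 grayscale conversion
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)                       # S310 Gaussian smoothing
    denoised = cv2.medianBlur(blurred, 5)                             # S320 median filtering
    grad_x = cv2.Sobel(denoised, cv2.CV_16S, 1, 0)                    # S330 X-direction gradient
    grad_y = cv2.Sobel(denoised, cv2.CV_16S, 0, 1)                    #      Y-direction gradient
    edges = cv2.addWeighted(cv2.convertScaleAbs(grad_x), 0.5,
                            cv2.convertScaleAbs(grad_y), 0.5, 0)      #      superimpose gradients
    _, binary = cv2.threshold(edges, 0, 255,
                              cv2.THRESH_BINARY | cv2.THRESH_OTSU)    # S340 binarization
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (21, 7))
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)        # S350 closing (dilate then erode)
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)           # S360 contour search
    if not contours:
        return device_image                                           # nothing found; return unchanged
    largest = max(contours, key=cv2.contourArea)                      # keep the largest contour
    x, y, w, h = cv2.boundingRect(largest)
    return device_image[y:y + h, x:x + w]                             # cropped two-dimensional code region
```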
Referring to fig. 4, fig. 4 is a diagram of steps for performing scanning identification on a two-dimensional code by using Zbar according to some embodiments of the present application, where the process includes, but is not limited to, steps S400 to S420.
Step S400, initializing the scan object and performing parameter setting on the scan object.
Specifically, a ZBar ImageScanner object is first constructed, and the set_config() method is then used to set the corresponding parameters of the scanner object.
Step S410, acquiring the two-dimensional code and defining a scanning range.
Specifically, a two-dimensional code region obtained by the method steps in fig. 3 is acquired, and a scanning range of the two-dimensional code is defined.
Step S420, identify the two-dimensional code and obtain first position information.
Specifically, the scan() method of the image scanner is called to recognize the two-dimensional code image and acquire the first position information of the piloting device stored in the two-dimensional code.
Through the steps S400 to S420, the two-dimensional code on the piloting device is identified, and the first position information stored in the two-dimensional code is acquired.
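The same flow can be sketched with the pyzbar binding to the ZBar library (used here as a stand-in for the ImageScanner calls described above); the JSON field names match the hypothetical payload shown earlier and are assumptions:

```python
import json

import cv2
from pyzbar.pyzbar import decode  # pip install pyzbar (needs the ZBar shared library installed)

def read_first_position(qr_region):
    """Decode the cropped two-dimensional code region and return the piloting device's stored GPS position."""
    results = decode(qr_region)                 # ZBar scan of the image region
    if not results:
        return None                             # no code recognized in this region
    payload = json.loads(results[0].data.decode("utf-8"))
    return payload["lon"], payload["lat"]       # first position information

if __name__ == "__main__":
    region = cv2.imread("qr_region.png")        # e.g. the crop produced by extract_qr_region()
    print(read_first_position(region))
```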
In summary, the specific implementation of step S210 in fig. 2 has been described through the method steps in fig. 3 and fig. 4.
In step S220, camera parameters of a camera of the shooting pilot device are acquired.
Specifically, camera parameters of the camera are acquired, including the lens focal length, the target surface (sensor) size height, and the lens height. The camera lens used in the embodiments of the present application may be a fixed-focus or zoom lens; the lens height is the height of the lens in the shooting scene and is generally set to about twice the height of the object to be photographed.
And step S230, determining second position information of the ship according to the first position information and the camera parameters.
Specifically, using the lens imaging principle, the second position information of the vessel can be determined from the first position information and the camera parameters; the second position information is the current GPS position of the ship. Let F denote the lens focal length, D the distance between the lens and the piloting device, h the size height of the lens target surface (sensor), and H the lens height (for instance, by similar triangles in a simple pinhole model the distance can be estimated as D ≈ F·H/y, where y is the offset of the piloting device on the target surface; this relation is given here only as an illustration of the geometry). In addition, the bearing of the piloting device can be estimated by comparing the center of the two-dimensional code region with the center of the device image. The GPS coordinates of the piloting device in the first position information are converted into plane coordinates of the form (X, Y); from the acquired distance between the lens and the piloting device and the bearing of the piloting device, the coordinates of the ship's position can be estimated, also as plane coordinates of the form (X, Y); converting the ship coordinates back to longitude and latitude yields the GPS coordinates of the ship, i.e., the second position information is determined and the assisted positioning of the ship is completed.
Optionally, since the distance between the lens and the piloting device is approximately equal to the distance between the ship and the piloting device, a collision warning instruction is generated when this distance falls below a preset safety threshold. The collision warning instruction includes, but is not limited to, generating warning information. For example, when the distance between the camera lens and the piloting device drops below 200 meters, a warning message is generated to remind the pilot that the ship is too close to the piloting device and that the ship's course should be adjusted and its speed reduced.
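The geometry of steps S220-S230 and the warning check above can be illustrated with the sketch below; the monocular ranging relation, the equirectangular longitude/latitude conversion, and the 200 m threshold are simplifying assumptions made for the example, not formulas stated in this application:

```python
import math

EARTH_RADIUS_M = 6371000.0
SAFETY_THRESHOLD_M = 200.0   # assumed warning distance, matching the example above

def estimate_distance(focal_len_mm, lens_height_m, offset_on_sensor_mm):
    """Similar-triangle monocular ranging, D ≈ F * H / y (see the illustration in the text)."""
    return focal_len_mm * lens_height_m / offset_on_sensor_mm

def estimate_ship_gps(pilot_lon, pilot_lat, distance_m, bearing_rad):
    """Offset the piloting device's GPS by the measured distance along the ship-to-device bearing
    (equirectangular approximation, adequate over a few hundred metres)."""
    dx = distance_m * math.sin(bearing_rad)   # east component of the ship-to-device vector
    dy = distance_m * math.cos(bearing_rad)   # north component
    ship_lat = pilot_lat - math.degrees(dy / EARTH_RADIUS_M)
    ship_lon = pilot_lon - math.degrees(dx / (EARTH_RADIUS_M * math.cos(math.radians(pilot_lat))))
    return ship_lon, ship_lat                 # second position information

def collision_warning(distance_m):
    """Warn when the ship is closer to the piloting device than the safety threshold."""
    return distance_m < SAFETY_THRESHOLD_M

if __name__ == "__main__":
    d = estimate_distance(focal_len_mm=25.0, lens_height_m=6.0, offset_on_sensor_mm=0.9)
    print(d, estimate_ship_gps(114.31, 30.52, d, math.radians(20.0)), collision_warning(d))
```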
In step S240, the second location information is sent to the cloud.
Specifically, steps S200 to S230 are all completed at an edge node, which comprises the edge computing device and the camera. Because the ship sails on a river or at sea while the monitoring center that remotely supervises it is generally located on land and far away, data must be transmitted from the edge node to the terminal through a relay station, namely the cloud. In the related art, the image data acquired by the camera can be uploaded directly to the cloud, which then performs the image processing; although the computing power of cloud computing is strong enough, the problem of data transmission delay is unavoidable. The embodiments of the present application instead process the images acquired by the camera on the edge computing device, which alleviates the data transmission delay to a certain extent: when the edge node on the device side has its own computing power, it can perform data processing and simple decision-making independently, and the cloud only needs to act as a relay, receiving and storing the data produced by the edge computing device, sending it to the terminal, and receiving instructions from the terminal and forwarding them to the edge computing device, thereby increasing the data transmission speed in the ship positioning process.
When the edge computing device obtains the second position information of the ship, it sends the second position information to the cloud.
Optionally, the cloud sends the stored data, including the second position information and the environmental information, to the remote terminal; after receiving the data, the terminal displays it visually and can, for example, issue a warning instruction to the cloud according to the distance between the ship and the piloting device, whereupon the cloud forwards the warning instruction to the edge computing device and warning information is generated on the ship.
Through steps S200 to S240, a camera photographs device images of piloting devices such as shore marks and buoys, and the edge computing device performs image processing on the device images to determine first position information of the piloting device; camera parameters of the camera photographing the piloting device are acquired, second position information of the ship is determined from the first position information and the camera parameters, and the second position information is sent to the cloud. Because the device image is processed by the edge computing device, the position information of the piloting device is obtained, and the position information of the ship is determined from it, the accuracy of assisted positioning is improved; in addition, since image processing and related steps are completed on the edge computing device, there is no need to rely on the computing power of the cloud, and the problem of data transmission delay from the edge computing device to the terminal is alleviated to a certain extent.
Referring to fig. 5, fig. 5 illustrates an apparatus 500 according to some embodiments of the present application, the apparatus 500 including at least one processor 510 and at least one memory 520 for storing at least one program; in fig. 5, a processor and a memory are taken as an example.
The processor and memory may be connected by a bus or other means, such as by a bus in FIG. 5.
The memory, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs as well as non-transitory computer executable programs. Further, the memory may include high speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and these remote memories may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Another embodiment of the present application also provides an apparatus that may be used to perform the ship positioning method of any of the above embodiments, for example to perform the method steps of fig. 2 described above.
The above-described embodiments of the apparatus are merely illustrative, wherein the units illustrated as separate components may or may not be physically separate, i.e. may be located in one place, or may also be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
The embodiment of the application also discloses a storage medium in which a processor-executable program is stored; when executed by a processor, the program implements the edge-computing-based ship positioning method described above.
One of ordinary skill in the art will appreciate that all or some of the steps, systems, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, as hardware, or as an integrated circuit such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those of ordinary skill in the art, the term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer. In addition, as is well known to those skilled in the art, communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media.
While the preferred embodiments of the present invention have been described, the present invention is not limited to the above embodiments, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present invention, and such equivalent modifications or substitutions are included in the scope of the present invention defined by the claims.

Claims (10)

1. A ship positioning method based on edge computing, characterized by comprising the following steps:
acquiring an equipment image of piloting equipment;
performing image processing on the equipment image, and determining first position information of the piloting equipment;
acquiring camera parameters of a camera for shooting the piloting equipment;
determining second position information of the ship according to the first position information and the camera parameters;
and sending the second position information to a cloud.
2. The method of claim 1, wherein the image processing the device image to determine the first position information of the piloting device comprises:
performing image processing on the equipment image to determine a two-dimensional code on the piloting equipment;
and identifying the two-dimensional code and acquiring the first position information stored in the two-dimensional code.
3. The method according to claim 1, wherein the camera parameters include a lens focal length, a target surface size height, and a lens height of the camera, and the determining the second position information of the ship according to the first position information and the camera parameters includes:
determining the distance between the lens of the camera and the piloting equipment according to the focal length of the lens, the size height of the target surface and the height of the lens;
and determining the second position information according to the first position information and the distance between the lens of the camera and the piloting equipment.
4. The edge-computing-based vessel positioning method of claim 3, further comprising:
and when the distance between the lens of the camera and the piloting equipment exceeds a preset safety threshold value, determining a collision early warning instruction.
5. The edge-computing-based vessel positioning method of claim 1, further comprising:
acquiring environmental information of the ship in navigation, wherein the environmental information comprises temperature, humidity and wind speed;
and sending the environment information to a cloud.
6. The method of claim 1, wherein the acquiring of the device image of the piloting device comprises:
acquiring an environment image around the ship;
and carrying out image detection on the environment image, and determining an equipment image with a piloting device.
7. An edge-computing-based vessel positioning system, comprising: a camera, an edge computing device, a cloud;
the camera is used for shooting the piloting equipment to acquire an equipment image;
the edge computing device is used for performing image processing on the acquired device image and determining first position information of the piloting device; estimating second position information of the ship according to the first position information and the camera parameters of the camera; sending the second position information to a cloud end;
the cloud end is used for receiving and storing the second position information, sending the second position information to the terminal and receiving an instruction sent by the terminal;
the terminal is used for receiving the second position information sent by the cloud end and sending the instruction to the cloud end.
8. The edge computing-based vessel positioning system of claim 7, wherein the edge computing device further comprises: an acquisition module;
the acquisition module comprises a temperature and humidity sensor and an air speed sensor;
the acquisition module is used for acquiring environmental information, and the environmental information comprises temperature, humidity and wind speed.
9. An apparatus, comprising:
at least one processor;
at least one memory for storing at least one program;
when executed by the at least one processor, cause the at least one processor to implement the edge computing-based vessel positioning method of any one of claims 1-6.
10. A storage medium having stored therein a processor-executable program, wherein the processor-executable program, when executed by the processor, is for implementing the edge computing-based vessel localization method according to any one of claims 1-6.
CN202110227242.4A 2021-03-02 2021-03-02 Ship positioning method, system and device based on edge calculation and storage medium Pending CN113179292A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110227242.4A CN113179292A (en) 2021-03-02 2021-03-02 Ship positioning method, system and device based on edge calculation and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110227242.4A CN113179292A (en) 2021-03-02 2021-03-02 Ship positioning method, system and device based on edge calculation and storage medium

Publications (1)

Publication Number Publication Date
CN113179292A true CN113179292A (en) 2021-07-27

Family

ID=76921787

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110227242.4A Pending CN113179292A (en) 2021-03-02 2021-03-02 Ship positioning method, system and device based on edge calculation and storage medium

Country Status (1)

Country Link
CN (1) CN113179292A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005125209A1 (en) * 2004-06-22 2005-12-29 Stratech Systems Limited Method and system for surveillance of vessels
US8594866B1 (en) * 2010-04-16 2013-11-26 The Boeing Company Remote sensing and determination of tactical ship readiness
CN104142683A (en) * 2013-11-15 2014-11-12 上海快仓智能科技有限公司 Automated guided vehicle navigation method based on two-dimension code positioning
CN106093954A (en) * 2016-06-02 2016-11-09 邓湘 A kind of Quick Response Code laser ranging vehicle positioning method and equipment thereof
CN106989746A (en) * 2017-03-27 2017-07-28 远形时空科技(北京)有限公司 Air navigation aid and guider
CN109754034A (en) * 2019-01-08 2019-05-14 北京邮电大学 A kind of terminal device localization method and device based on two dimensional code
CN111610789A (en) * 2020-07-01 2020-09-01 上海船舶研究设计院(中国船舶工业集团公司第六0四研究院) Ship comprehensive management and control system and intelligent ship

Similar Documents

Publication Publication Date Title
CN110325818B (en) Joint 3D object detection and orientation estimation via multimodal fusion
CN107844750B (en) Water surface panoramic image target detection and identification method
CN109725310B (en) Ship positioning supervision system based on YOLO algorithm and shore-based radar system
CN111222395B (en) Target detection method and device and electronic equipment
US11948344B2 (en) Method, system, medium, equipment and terminal for inland vessel identification and depth estimation for smart maritime
CN110930459A (en) Vanishing point extraction method, camera calibration method and storage medium
CN107527368B (en) Three-dimensional space attitude positioning method and device based on two-dimensional code
CN110619328A (en) Intelligent ship water gauge reading identification method based on image processing and deep learning
CN111967396A (en) Processing method, device and equipment for obstacle detection and storage medium
CN109447902B (en) Image stitching method, device, storage medium and equipment
CN111259710B (en) Parking space structure detection model training method adopting parking space frame lines and end points
CN111767780A (en) AI and vision combined intelligent hub positioning method and system
CN111832760B (en) Automatic inspection method for well lid based on visual algorithm
CN112683228A (en) Monocular camera ranging method and device
CN111964680A (en) Real-time positioning method of inspection robot
CN114926726A (en) Unmanned ship sensing method based on multitask network and related equipment
Zhan et al. Effective waterline detection for unmanned surface vehicles in inland water
KR20210090573A (en) Method and device for monitoring harbor and ship considering sea level
CN109829421B (en) Method and device for vehicle detection and computer readable storage medium
CN112733678A (en) Ranging method, ranging device, computer equipment and storage medium
CN116682286A (en) Ship early warning method and system based on offshore target detection
Cafaro et al. Towards Enhanced Support for Ship Sailing
CN113179292A (en) Ship positioning method, system and device based on edge calculation and storage medium
CN116935369A (en) Ship water gauge reading method and system based on computer vision
CN114897999B (en) Object pose recognition method, electronic device, storage medium, and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210727