CN112347218A - Unmanned ship environment map generation method and unmanned ship sensing system - Google Patents

Unmanned ship environment map generation method and unmanned ship sensing system

Info

Publication number
CN112347218A
Authority
CN
China
Prior art keywords: data, unmanned ship, unmanned, internet, ship
Prior art date
Legal status
Granted
Application number
CN202011263337.3A
Other languages
Chinese (zh)
Other versions
CN112347218B (en)
Inventor
黄旭艳
余天亮
Current Assignee
Zhuhai Yunzhou Intelligence Technology Ltd
Original Assignee
Zhuhai Yunzhou Intelligence Technology Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Yunzhou Intelligence Technology Ltd filed Critical Zhuhai Yunzhou Intelligence Technology Ltd
Priority to CN202011263337.3A
Publication of CN112347218A
Application granted
Publication of CN112347218B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00 IoT characterised by the purpose of the information processing
    • G16Y40/60 Positioning; Navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of this application relate to the technical field of unmanned surface vessels and provide an environment map generation method for an unmanned boat and an unmanned boat sensing system. The method is applied to a cloud computing platform and comprises the following steps: receiving boat-end sensing data transmitted by an unmanned boat, the boat-end sensing data being acquired by various sensing devices carried on the unmanned boat; determining the current navigation position of the unmanned boat; extracting Internet of Things data within a preset range of the navigation position, the Internet of Things data being acquired by Internet of Things devices deployed on shore and along the unmanned boat's current navigation path; and generating an environment map for the unmanned boat from the boat-end sensing data and the Internet of Things data. This method improves the accuracy of the generated environment map.

Description

Unmanned ship environment map generation method and unmanned ship sensing system
Technical Field
This application belongs to the technical field of unmanned surface vessels and in particular relates to an environment map generation method for an unmanned boat and an unmanned boat sensing system.
Background
In the field of automated driving, systems are classified into different levels according to whether, and to what degree, manual control is required. At the highest level of autonomy, no human attention is needed and fully driverless operation is possible.
By applying automated driving technology to unmanned boats, autonomous navigation can be realized. For an unmanned boat to navigate autonomously at a high level of autonomy, a high-precision environment map must be constructed so that accurate real-time decisions and obstacle avoidance are possible.
At present, the environment map used during unmanned boat navigation mainly depends on data collected by the boat's own sensing system, including sensing devices such as navigation radar, lidar, vision equipment, and the Automatic Identification System (AIS) carried on the boat. By analysing and processing these data, an environment map of the navigation area can be constructed. However, this approach is limited by the size and carrying capacity of the boat and by the detection range of the onboard sensing equipment. The accuracy and timeliness of environment maps built with current techniques cannot support the precise obstacle avoidance an unmanned boat needs in high-speed or complex environments, which greatly restricts both the development of unmanned boat autonomy and the boat's decision-making capability in complex waters and complex environments.
Disclosure of Invention
In view of this, the embodiments of this application provide an environment map generation method for an unmanned boat and an unmanned boat sensing system, so as to address the problem that environment maps constructed with current techniques cannot meet the accuracy and real-time requirements of precise obstacle avoidance in high-speed or complex environments.
A first aspect of an embodiment of the present application provides an environment map generation method for an unmanned ship, which is applied to a cloud computing platform, and the method includes:
receiving boat-end sensing data transmitted by an unmanned boat, wherein the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat;
determining the current navigation position of the unmanned ship;
extracting Internet of Things data within a preset range of the navigation position, wherein the Internet of Things data is acquired by Internet of Things devices deployed on shore and along the current navigation path of the unmanned boat; and
and generating an environment map of the unmanned ship according to the ship-side sensing data and the Internet of things data.
A second aspect of the embodiments of the present application provides an environment map generation apparatus for an unmanned ship, which is applied to a cloud computing platform, and the apparatus includes:
the receiving module is used for receiving boat-end sensing data transmitted by the unmanned boat, and the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat;
the determining module is used for determining the current navigation position of the unmanned ship;
the extraction module is used for extracting the Internet of Things data within a preset range of the navigation position, the Internet of Things data being acquired by Internet of Things devices deployed on shore and along the current route of the unmanned boat; and
and the generating module is used for generating an environment map of the unmanned ship according to the ship side sensing data and the Internet of things data.
A third aspect of the embodiments of this application provides an unmanned boat sensing system, comprising an unmanned boat, multiple sensing devices carried on the unmanned boat, Internet of Things devices deployed on shore and along the unmanned boat's route, and a cloud computing platform communicatively connected to the unmanned boat, wherein the cloud computing platform includes:
the receiving module is used for receiving boat-end sensing data transmitted by the unmanned boat, and the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat;
the determining module is used for determining the current navigation position of the unmanned ship;
the extraction module is used for extracting the Internet of Things data within a preset range of the navigation position, the Internet of Things data being acquired by Internet of Things devices deployed on shore and along the current route of the unmanned boat; and
and the generating module is used for generating an environment map of the unmanned ship according to the ship side sensing data and the Internet of things data.
A fourth aspect of embodiments of the present application provides a cloud computing platform, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the unmanned ship environment map generation method according to the first aspect.
A fifth aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the unmanned boat environment map generation method according to the first aspect.
A sixth aspect of the embodiments of the present application provides a computer program product, which when running on a cloud computing platform, causes the cloud computing platform to execute the method for generating an environment map of an unmanned ship according to the first aspect.
Compared with the prior art, the embodiment of the application has the following advantages:
according to the embodiment of the application, the boat end sensing data collected by the various sensing devices configured on the unmanned boat are transmitted to the cloud computing platform and processed by the cloud computing platform, so that the computing resource consumption of the unmanned boat end controller can be effectively reduced, the shipborne energy is saved, and the cruising ability of the unmanned boat is improved. Secondly, according to the embodiment of the application, the data of the internet of things collected by the equipment of the internet of things on the shore base and the unmanned ship route are uploaded to the cloud computing platform, so that the problems that the equipment information on the shore base and the unmanned ship route is dispersed, a unified storage and calling platform is unavailable or accurate information construction is unavailable at present can be effectively solved, massive information related to water navigation can be networked and collected to the unified cloud computing platform, and the construction of an all-round time domain and space domain information grid is facilitated. Thirdly, the ship-side sensing data and the internet of things data are further fused at the cloud computing platform side, and a full-scene real-time navigation path high-precision digital environment map can be constructed by fully utilizing information obtained by detection of various devices, so that an accurate sensing information system is provided for implementing an autonomous obstacle avoidance decision of a higher intelligent level at a higher navigation speed for the unmanned ship, and the decision capability of the unmanned ship for dealing with complex water areas and complex environments and dealing with complex conditions is further improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the embodiments or the description of the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic flowchart illustrating steps of an environment map generating method for an unmanned ship according to an embodiment of the present application;
fig. 2 is a schematic flowchart illustrating steps of an environment map generation method for an unmanned ship according to an embodiment of the present application;
fig. 3 is a schematic diagram of information fusion of a cloud computing platform according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating a step of fusing boat-side sensing data according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an environment map provided by an embodiment of the present application;
fig. 6 is a schematic diagram of an environment map generating apparatus of an unmanned ship according to an embodiment of the present application;
fig. 7 is a schematic diagram of a cloud computing platform provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. However, it will be apparent to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The technical solution of the present application will be described below by way of specific examples.
Referring to fig. 1, a schematic flow chart illustrating steps of a method for generating an environment map of an unmanned ship according to an embodiment of the present application is shown, and the method may specifically include the following steps:
s101, receiving boat-end sensing data transmitted by the unmanned boat, wherein the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat.
The method can be applied to a cloud computing platform, and a high-precision environment map can be generated through processing of the cloud computing platform and used by the unmanned ship in the navigation process, so that autonomous navigation of the unmanned ship is realized.
The cloud computing platform in the embodiment of the present application may be a server cluster composed of a plurality of servers and other devices configured on a shore base, and the embodiment of the present application does not limit the form of the cloud computing platform.
In the embodiments of this application, various sensing devices may be carried on the unmanned boat, such as radar devices, vision devices, and Automatic Identification System (AIS) devices. The radar devices may include navigation radar, lidar, millimetre-wave radar, and the like, which the embodiments of this application do not limit.
During navigation, various types of data can be acquired through these sensing devices. For example, the navigation radar can scan dynamic and static targets within roughly 2 kilometres around the unmanned boat, the vision equipment can detect dynamic and static targets at 200-500 metres, the lidar can detect targets at 50-100 metres, and the AIS equipment can provide information about nearby vessels.
In this application embodiment, the boat-side sensing data transmitted by the unmanned boat to the cloud platform may be raw data acquired by the above-mentioned multiple sensing devices, or may be data obtained by performing preliminary processing on the raw data.
S102, determining the current navigation position of the unmanned ship.
In the embodiments of this application, the high-precision environment map that the cloud computing platform generates for autonomous navigation may be a three-dimensional map centred on the unmanned boat's current navigation position and updated dynamically as the boat sails.
Therefore, when the cloud computing platform processes the boat-side sensing data transmitted by the unmanned boat, the current navigation position of the unmanned boat can be determined at first. The navigation position may be automatically reported to the cloud computing platform by the unmanned ship, for example, the navigation position information of the unmanned ship may be transmitted to the cloud computing platform along with the ship-side sensing data. Or, the navigation position may also be acquired by the cloud computing platform from other systems or devices, for example, the cloud computing platform may be connected to the AIS system, and the real-time navigation position of the unmanned ship is obtained based on the AIS system.
S103, extracting the Internet of Things data within a preset range of the navigation position, wherein the Internet of Things data is acquired by Internet of Things devices deployed on shore and along the current navigation path of the unmanned boat.
In this application embodiment, the internet of things data in the preset range of unmanned ship navigation position can refer to the information collected by multiple internet of things devices in the preset range. The internet of things devices may include internet of things devices deployed on shore-based, and internet of things devices on the course of the unmanned boat. The internet of things data can include equipment information, signal information, hydrologic information, meteorological information, building information, wharf information, shoreline information, channel information and the like of various internet of things equipment, and the embodiment of the application does not limit the information.
In this embodiment of the application, the internet of things data may be acquired in advance through internet of things devices on shore-based and on air routes and stored in a database of the cloud computing platform, or may be acquired in real time through the internet of things devices when a high-precision environment map is generated, which is not limited in this embodiment of the application.
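The range-based extraction in S103 can be sketched in Python. This is an illustrative sketch, not from the patent: the record fields (`lat`, `lon`) and the 2 km default radius are assumptions, and a great-circle (haversine) distance stands in for whatever geodesic computation the platform actually uses.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def extract_in_range(position, iot_records, radius_m=2000.0):
    """Keep only the IoT records whose reported location lies within
    radius_m of the boat's current navigation position."""
    lat, lon = position
    return [rec for rec in iot_records
            if haversine_m(lat, lon, rec["lat"], rec["lon"]) <= radius_m]
```

A record one degree of latitude away (roughly 111 km) would be filtered out at the default 2 km radius, while a shore-based sensor beside the boat would be kept.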
In a possible implementation manner of the embodiment of the application, the data of the internet of things may include static quantity information and dynamic quantity information. The static quantity information may include a bridge position, a bridge pier size, a dock position, a shoreline coordinate, an anchor coordinate, a navigation mark, and the like, and the information may be regarded as the static quantity information due to an extremely low change frequency (for example, a bridge has a life of 50 years). The width, curvature, length, water depth and the like of the navigation channel are influenced by tide, flood season, drought season and the like and change every moment, so that the information can be regarded as dynamic quantity information. Static quantity information and dynamic quantity information acquired through the Internet of things equipment can be stored in a database of the cloud computing platform.
The stored static quantity information can be kept unchanged for a certain time, and the dynamic quantity information needs to be updated according to a certain frequency. The navigation light, the shore-based radar target information, the shore-based photoelectric information and the like belong to high-frequency dynamic variables, and the updating frequency of the dynamic variables is higher.
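The static/dynamic split above could be handled with per-class refresh intervals. The concrete numbers below are purely illustrative assumptions; the patent only says that static quantities change very rarely while dynamic quantities must be updated at a certain frequency and high-frequency dynamic quantities (navigation lights, shore-based radar targets) update fastest.

```python
import time

# Hypothetical refresh intervals in seconds, one per information class.
REFRESH_INTERVAL_S = {
    "bridge_position": 30 * 24 * 3600,   # static: revalidate about monthly
    "channel_depth": 3600,               # dynamic: hourly, tide-dependent
    "shore_radar_targets": 1,            # high-frequency dynamic
}

def is_stale(record, now=None):
    """True once a stored record's class-specific refresh interval has elapsed."""
    now = time.time() if now is None else now
    return now - record["updated_at"] > REFRESH_INTERVAL_S[record["kind"]]
```

The cloud platform's database could periodically scan stored records with such a predicate and re-poll only the stale ones, rather than refreshing everything at the highest rate.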
And S104, generating an environment map of the unmanned ship according to the ship-side sensing data and the Internet of things data.
In the embodiment of the application, the cloud computing platform can perform data fusion on the two data after receiving the boat-side sensing data and the internet of things data transmitted by the unmanned boat to generate the high-precision environment map.
In the embodiment of the application, when the cloud computing platform fuses the two data to generate the high-precision environment map, the cloud computing platform can directly perform fusion processing on the received data, and generate the corresponding environment map according to the result of the fusion processing. Or the cloud computing platform can also process the ship-side sensing data and the internet of things data respectively, generate a local grid map according to the ship-side sensing data, generate a global grid map according to the internet of things data, and then fuse the local grid map and the global grid map to obtain the high-precision environment map. The embodiment of the application does not limit the specific mode that the cloud computing platform generates the environment map by using the ship-side sensing data and the internet of things data.
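One possible form of the local/global grid-map fusion mentioned above is a cell-wise merge of occupancy values. The max() rule below (a cell is occupied if either source marks it) is an assumption chosen for illustration; the patent does not specify the fusion operator.

```python
def fuse_grids(local_grid, global_grid):
    """Cell-wise fusion of a boat-centred local occupancy grid with the
    corresponding window of the global grid: occupancy values are combined
    with max(), so obstacles seen by either source survive the merge."""
    rows, cols = len(local_grid), len(local_grid[0])
    return [[max(local_grid[r][c], global_grid[r][c]) for c in range(cols)]
            for r in range(rows)]
```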
In the embodiments of this application, the boat-end sensing data collected by the various sensing devices on the unmanned boat are transmitted to the cloud computing platform and processed there, which effectively reduces the computing-resource consumption of the boat-end controller, saves onboard energy, and improves the boat's endurance. Second, uploading the Internet of Things data collected by the shore-based and along-route Internet of Things devices to the cloud computing platform addresses the current problems that this device information is scattered and lacks a unified platform for storage, retrieval, and accurate information modelling; massive amounts of information relevant to navigation on water can be networked and gathered on one unified cloud platform, facilitating the construction of an all-round information grid across the time and space domains. Third, fusing the boat-end sensing data with the Internet of Things data on the cloud platform side makes full use of the information detected by all of these devices to construct a full-scene, real-time, high-precision digital environment map of the navigation path, providing an accurate perception basis for the unmanned boat to make autonomous obstacle-avoidance decisions at higher levels of autonomy and higher speeds, and further improving its decision-making capability in complex waters, complex environments, and complex situations.
Referring to fig. 2, a schematic flow chart illustrating steps of another unmanned ship environment map generation method provided in the embodiment of the present application is shown, where the method may be applied to a cloud computing platform, and specifically may include the following steps:
s201, receiving boat-end sensing data transmitted by the unmanned boat, wherein the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat.
In the embodiments of this application, the boat-end sensing data may be acquired by the various sensing devices on the unmanned boat. It may be the raw data collected by those devices, such as the dynamic and static targets within roughly 2 kilometres of the boat scanned by the navigation radar, or the dynamic and static targets within 200-500 metres detected by the vision equipment. Alternatively, the boat-end sensing data may be a local grid map obtained by fusing the raw data acquired by the various sensing devices.
If the boat-end sensing data transmitted to the cloud computing platform is the raw data acquired by the boat-end sensing devices, the raw data is processed by the cloud computing platform. This further reduces the processing load on the equipment carried by the unmanned boat, lowers the computing-resource consumption of the boat-end controller, saves onboard energy, and improves the boat's endurance.
For ease of understanding, the following description assumes that the boat-end sensing data transmitted to the cloud computing platform is the raw data acquired by the boat-end sensing devices. After receiving the raw data, the cloud computing platform may execute S202 and fuse the received raw data in a layered manner.
S202, fusing the original data acquired by the multiple sensing devices by adopting a layered data fusion mode to obtain a local grid map taking the unmanned ship as a center.
Fig. 3 is a schematic view illustrating information fusion of a cloud computing platform according to an embodiment of the present disclosure. According to the scheme shown in fig. 3, a vision device, a navigation radar, a laser radar, a millimeter wave radar and other radar devices can be configured at the unmanned ship end, and the ship end sensing data are collected through the devices and transmitted to the cloud computing platform. Of course, the sensing device configured at the boat end may also include other types of devices, such as AIS devices, and the like, which is not limited in this embodiment.
In the embodiments of this application, the raw data acquired by the different sensing devices differ in data format, feature length, and so on. Vision devices in particular produce unstructured video-image data. Before fusion, the video images must be further processed to extract target features (bearing, distance, size, etc.); the extracted target features are then mapped onto the three-dimensional point-cloud structured data obtained by the radar devices, and a local grid map centred on the unmanned boat is constructed.
In general, fusion of the boat-end sensing data can be carried out in three ways, namely:
1) Direct data fusion. Direct data fusion can be used when the various kinds of data are commensurable, for example when multiple image sensors or multiple acoustic sensors are used, and it involves classical estimation methods such as Kalman filtering; when the data are not commensurable, only feature fusion or decision fusion can be used;
2) Feature fusion. Feature vectors are extracted from the data, and fusion is performed on the basis of those feature vectors;
3) Decision fusion. The raw data acquired by each sensing device is processed and judged separately, and the resulting decisions are finally fused.
In the embodiment of the application, a layered data fusion mode can be adopted to fuse the original data acquired by the multiple sensing devices, so as to construct a local grid map with the unmanned ship as the center.
As shown in fig. 4, which is a schematic diagram of a step of fusing sensed data at a boat end according to an embodiment of the present application, the fusing process may include the following steps S2021 to S2024:
s2021, converting the original data collected by the radar equipment to the same coordinate system for data superposition to obtain first-layer fusion information.
In the embodiments of this application, for point-cloud data in similar formats acquired by devices such as the navigation radar and the lidar, direct data fusion can be used in the first-layer processing: the data from two or more devices are transformed into the same coordinate system, and a Kalman filtering algorithm is then used to superimpose them, yielding the first-layer fusion information.
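The first-layer superposition can be illustrated with a minimal sketch: a translation of sensor-local points into a common boat-centred frame, followed by the scalar Kalman measurement-update that optimally combines two noisy measurements of the same target coordinate. The translation-only transform and scalar state are deliberate simplifications; a real system would use full 2-D/3-D poses and covariance matrices.

```python
def to_boat_frame(point, sensor_offset):
    """Translate a sensor-local (x, y) point into the common boat-centred
    frame (translation only; a real mounting would also need a rotation)."""
    return (point[0] + sensor_offset[0], point[1] + sensor_offset[1])

def kalman_combine(z1, var1, z2, var2):
    """Minimum-variance (Kalman measurement-update) combination of two noisy
    measurements of the same coordinate from different radar devices."""
    k = var1 / (var1 + var2)      # gain: weights toward the less noisy sensor
    z = z1 + k * (z2 - z1)        # fused estimate
    var = (1.0 - k) * var1        # fused variance is below either input
    return z, var
```

With equal measurement variances the fused estimate is simply the mean, and the fused variance is half of each input's, which is why superimposing several radars sharpens the local grid map.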
S2022, determining a plurality of target objects to be fused from the raw data collected by the vision device, and extracting feature information of the plurality of target objects to be fused.
S2023, performing feature fusion on the feature information of the target objects to be fused and the first layer of fusion information to obtain second layer of fusion information.
In the embodiments of this application, because of the particular nature of the image data acquired by the vision device, the second-layer fusion uses feature fusion. After detection and recognition image-processing algorithms are applied to the images to be fused, features of each obstacle target, such as bearing, size, and distance, can be extracted to form a feature vector of obstacle information. A data-fusion feature-vector algorithm then merges these feature vectors with the first-layer fusion information to produce the second-layer fusion information.
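As a sketch of this second-layer step, the following associates each vision-derived obstacle feature vector with the nearest first-layer radar target. The nearest-distance gating rule and the field names are illustrative assumptions, not the patent's actual feature-vector algorithm.

```python
def associate_features(radar_targets, vision_features, gate=10.0):
    """Attach each vision-derived obstacle feature vector (bearing, distance,
    size) to the nearest first-layer radar target within a gating threshold;
    features with no target inside the gate remain unmatched."""
    fused = []
    for feat in vision_features:
        best, best_d = None, gate
        for tgt in radar_targets:
            d = abs(tgt["distance"] - feat["distance"])
            if d < best_d:
                best, best_d = tgt, d
        fused.append({**feat, "radar_match": best["id"] if best else None})
    return fused
```

An unmatched feature (no radar target within the gate) would typically be kept as a vision-only obstacle hypothesis rather than discarded.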
And S2024, performing decision fusion on the original data acquired by the automatic ship identification system equipment and the second layer of fusion information by adopting a neural network model to obtain a local grid map with the unmanned ship as a center.
In the embodiments of this application, for dynamic and static target information provided by the AIS system, chart systems, and the like, the third-layer fusion can use a decision-fusion algorithm based on a Deep Neural Network (DNN) learning mechanism, finally producing a local grid map centred on the unmanned boat.
It should be noted that because the vessel information broadcast by the AIS system can be uncertain or carry large errors, a confidence-based decision is needed when fusing this information into the whole.
S203, determining the current navigation position of the unmanned ship.
In the embodiment of the application, the cloud computing platform can determine the current navigation position of the unmanned ship by processing the received ship-side sensing data.
And S204, extracting the data of the Internet of things in the preset range of the navigation position, wherein the data of the Internet of things are acquired by the Internet of things equipment configured on a shore base and the current route of the unmanned ship.
In this application embodiment, the internet of things data in the preset range of unmanned ship navigation position can refer to the information collected by multiple internet of things devices in the preset range.
As shown in fig. 3, the data of the internet of things collected by the various internet of things devices may include shore-based radar information, shore-based photoelectric information, navigation mark information, bridge information, water flow information, ship information, wharf information, shoreline information, and the like. The information can be uniformly fused and processed by the cloud computing platform after being transmitted to the cloud computing platform by the Internet of things equipment.
In the embodiments of this application, the massive Internet of Things data come from different devices and different platforms; they have no direct associations, the different data sets are loosely coupled, and their formats are not necessarily consistent. To fuse these data together into a high-precision gridded digital map of the route, the massive data can first be classified according to their feature vectors.
In a specific implementation, the massive Internet of Things data can be classified with a decision tree model, selecting features of the target variable to perform an optimal attribute division. For example, the mass data uploaded to the cloud computing platform may first be divided, according to its characteristics, into shore-based data and water data. The shore-based data can be further divided into wharf information, shoreline information, bridge information, and the like; the water data can be further divided into navigation mark information, ship information, hydrological information, meteorological information, and the like. Further, the wharf information can be divided into wharf position coordinates, wharf berth size data, and the like; the bridge information can be divided into bridge height, navigable bridge opening position, navigable bridge opening width, and the like.
According to the decision tree model, after being uploaded to the cloud computing platform, the massive Internet of Things data can be continuously classified and grouped by its characteristic attributes and stored in the corresponding data sets, ready to be called during data fusion.
After grouping, the massive Internet of Things data comprises a plurality of data objects; each of the wharf information, the shoreline information, the bridge information, the wharf position coordinates, and the wharf berth size data can be regarded as one data object. Each data object contains data sets respectively acquired by a plurality of Internet of Things devices, that is, data collected by different Internet of Things devices.
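The grouping described above can be sketched as routing each incoming record down a fixed attribute path: first the domain (shore-based or water), then the data object, then the device that produced it. The field names and category sets below are illustrative, not taken from the patent.

```python
# Hypothetical sketch of attribute-based grouping: each IoT record is
# classified by its source attribute and appended to the data set of the
# device that produced it, giving the nested structure
#   grouped[domain][data_object][device_id] -> list of records.

from collections import defaultdict

grouped = defaultdict(lambda: defaultdict(lambda: defaultdict(list)))

SHORE_SOURCES = {"wharf", "shoreline", "bridge"}  # illustrative split

def classify(record):
    domain = "shore_based" if record["source"] in SHORE_SOURCES else "water"
    return domain, record["source"]

def ingest(record):
    domain, data_object = classify(record)
    grouped[domain][data_object][record["device_id"]].append(record)

ingest({"source": "bridge", "device_id": "cam-01", "height_m": 18.5})
ingest({"source": "buoy", "device_id": "aton-07", "position": (22.3, 113.5)})

print(sorted(grouped))                          # the two top-level domains
print(list(grouped["shore_based"]["bridge"]))   # devices under one object
```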
In the embodiment of the application, after the massive Internet of Things data transmitted to the cloud computing platform has been grouped and stored, when a high-precision environment map needs to be constructed for the unmanned ship, the cloud computing platform can extract the data within the preset range of the received current navigation position of the unmanned ship and execute S205, generating a global grid map containing the current route of the unmanned ship from the extracted Internet of Things data.
S205, generating a global grid map containing the current route of the unmanned ship according to the data of the Internet of things.
In the embodiment of the application, when the global grid map containing the current route of the unmanned ship is generated from the Internet of Things data, data fusion can be performed on the plurality of data sets contained in each data object to obtain the feature vector value of each data object, and the global grid map is then generated from each data object and its corresponding feature vector value.
In a specific implementation, based on the extracted Internet of Things data, association mining can be performed on the feature vectors of the data sets acquired by each Internet of Things device, object by object, so as to improve the confidence of the feature values output by those data sets. The association mining may adopt a data association algorithm based on deep learning (DNN); through continuous learning, the accuracy with which different data sets under the same Internet of Things device define and fuse the same feature value is steadily improved. The feature vector values generated after data fusion are used as the values of the corresponding grid cells on the global grid map, characterizing the complete three-dimensional environment information.
For example, in order to obtain the feature vector of a ship on the water (ship size, heading, speed, accurate position, etc.), multiple data sets, including shore-end photoelectric data, AIS data, and shore-based radar detection data, may be fused. These data sets contain all or part of the characteristic information of the ship. First, the characteristic values of the Internet of Things data from each type of device can be extracted (a shore-based radar can provide the azimuth, distance, size, and speed of a ship; photoelectric data can provide its position and size; AIS can provide its position and size) and feature vectors generated from them. After data fusion, a feature vector (azimuth, distance, size, heading, speed) containing all relevant information on the motion characteristics of the ship is formed. The fused ship feature vector values (azimuth, distance, size, heading, speed, etc.) are compared with standard values, and according to the errors and rules in the standard values, the ship feature vector values can be continuously corrected by a deep learning DNN algorithm, improving the accuracy of the motion feature vector values output from the classified data sets.
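As a minimal sketch of this per-object fusion, the example below merges partial feature vectors from hypothetical radar, photoelectric, and AIS data sets by averaging the fields they have in common; the patent instead refines the values with a DNN-based association algorithm, so this only illustrates the merging step, and all numbers and field names are invented.

```python
# Hypothetical sketch: each device contributes a partial feature vector for
# the same ship; fields reported by several devices are averaged, fields
# reported by one device are passed through unchanged.

def fuse_feature_vectors(partial_vectors):
    fused = {}
    for vec in partial_vectors:
        for key, value in vec.items():
            fused.setdefault(key, []).append(value)
    return {k: sum(v) / len(v) for k, v in fused.items()}

radar = {"azimuth": 41.0, "distance": 820.0, "size": 32.0, "speed": 6.0}
photoelectric = {"azimuth": 43.0, "size": 30.0}
ais = {"heading": 95.0, "speed": 6.4, "size": 31.0}

ship_vector = fuse_feature_vectors([radar, photoelectric, ais])
print(ship_vector["size"])  # 31.0, the mean of the three size estimates
```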
As another example, the main data source of the bridge information on the route of the unmanned boat is the building information about the bridge stored on the cloud computing platform. Therefore, the bridge information needed in the high-precision map, such as the navigable height of the bridge, the position of the navigable bridge opening, and the navigable width, can be directly retrieved from the characteristic values of the bridge stored on the cloud computing platform, without fusion.
The global grid map constructed from the shore-based and channel Internet of Things data is a gridded digital dynamic map that continuously changes in the time domain. It provides accurate background information for the autonomous navigation of the unmanned ship; after the boat-side sensing data of the unmanned ship is mapped onto this map, a complete environment map containing the positioning and attitude information of the unmanned ship itself can be constructed.
S206, performing data mapping on the local grid map and the global grid map to obtain an environment map of the unmanned ship.
The local grid map generated from the boat-side sensing data transmitted by the unmanned boat is a short-range gridded digital map in a coordinate system whose origin is the center point of the unmanned boat, and it changes dynamically and continuously in the time domain; the global grid map generated from the Internet of Things data acquired by the Internet of Things devices is a gridded digital map of the whole route in a global coordinate system.
After the local grid map and the global grid map are respectively generated, the cloud computing platform can perform data mapping on the two grid maps to obtain a high-precision environment map.
In the embodiment of the application, coordinate systems can be established in the local grid map and the global grid map respectively, taking the center of the unmanned boat as the reference origin of the coordinate system and the centerline of the heading of the unmanned boat as the positioning coordinate direction; data mapping is then performed between the local grid map and the global grid map based on these coordinate systems to obtain the environment map of the unmanned boat. The environment map is a new, integrated, and more detailed panoramic digital environment map that is both static and dynamic, both global and local.
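The mapping between the two coordinate systems can be sketched as a rotation by the boat heading followed by a translation to the boat's global position. A flat, metre-based global frame and illustrative coordinates are assumed for simplicity; the patent itself does not specify the transform in this form.

```python
# Hypothetical sketch: transform a cell of the boat-centered local grid
# (origin at the boat center, x-axis along the heading centerline) into the
# global grid frame via rotation + translation.

import math

def local_to_global(local_xy, boat_xy, heading_deg):
    """Map a local (x, y) point, in metres, into global coordinates."""
    lx, ly = local_xy
    theta = math.radians(heading_deg)
    gx = boat_xy[0] + lx * math.cos(theta) - ly * math.sin(theta)
    gy = boat_xy[1] + lx * math.sin(theta) + ly * math.cos(theta)
    return gx, gy

# A cell 100 m dead ahead of a boat at (5000, 3000) heading 90 degrees
# lands 100 m along the global y-axis from the boat position.
print(local_to_global((100.0, 0.0), (5000.0, 3000.0), 90.0))
```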
Fig. 5 is a schematic diagram of an environment map according to an embodiment of the present application. The environment map obtained after fusion is a high-precision three-dimensional map combining high-density overall static grid data with low-density local dynamic grid data. It provides an accurate source of environment information for the autonomous navigation of the unmanned ship, giving obstacle-avoidance decisions and navigation strategies at a higher intelligence level accurate sensing capability at higher sailing speeds, and greatly improving the ability of the unmanned ship to adapt to more complex waters.
It should be noted that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and inherent logic, and does not constitute any limitation on the implementation of the embodiments of the present application.
Referring to fig. 6, a schematic diagram of an environment map generating apparatus of an unmanned ship provided in an embodiment of the present application is shown, where the apparatus may be applied to a cloud computing platform, and specifically may include a receiving module 601, a determining module 602, an extracting module 603, and a generating module 604, where:
the receiving module is used for receiving boat-end sensing data transmitted by the unmanned boat, and the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat;
the determining module is used for determining the current navigation position of the unmanned ship;
the extraction module is used for extracting the data of the Internet of things in the preset range of the navigation position, and the data of the Internet of things are acquired by Internet of things equipment configured on a shore base and the current route of the unmanned ship;
and the generating module is used for generating an environment map of the unmanned ship according to the ship side sensing data and the Internet of things data.
In this embodiment, the boat-side sensing data includes raw data collected by the plurality of sensing devices, and the apparatus may further include:
and the boat-side perception data fusion module is used for fusing the original data acquired by the multiple perception devices by adopting a layered data fusion mode to obtain a local grid map taking the unmanned boat as a center.
In this embodiment of the present application, the boat-side sensing data fusion module may include the following sub-modules:
the first fusion submodule is used for converting the original data collected by the radar equipment to the same coordinate system for data superposition to obtain first-layer fusion information;
the characteristic information extraction submodule is used for determining a plurality of target objects to be fused from the original data acquired by the visual equipment and extracting the characteristic information of the plurality of target objects to be fused;
the second fusion submodule is used for performing feature fusion on the feature information of the target objects to be fused and the first layer of fusion information to obtain second layer of fusion information;
and the third fusion submodule is used for performing decision fusion on the original data acquired by the automatic ship identification system equipment and the second layer of fusion information by adopting a neural network model to obtain a local grid map taking the unmanned ship as a center.
In this embodiment of the application, the boat-side sensing data includes a local grid map obtained by fusing the original data acquired by the plurality of sensing devices.
In an embodiment of the present application, the generating module may include the following sub-modules:
the global grid map generation submodule is used for generating a global grid map containing the current route of the unmanned ship according to the data of the Internet of things;
and the environment map generation submodule is used for carrying out data mapping on the local grid map and the global grid map to obtain an environment map of the unmanned ship.
In this embodiment of the application, the data of the internet of things includes a plurality of data objects, each data object includes data sets respectively acquired by a plurality of devices of the internet of things, and the global grid map generation submodule may include the following units:
the internet of things data fusion unit is used for carrying out data fusion on a plurality of data groups contained in each data object to respectively obtain a characteristic vector value of each data object;
and the global grid map generating unit is used for generating the global grid map according to each data object and the corresponding characteristic vector value thereof.
In an embodiment of the present application, the environment map generation sub-module may include the following units:
a coordinate system establishing unit, configured to establish coordinate systems in the local grid map and the global grid map respectively, with a center of the unmanned ship as a reference origin of the coordinate system, and a center line of a heading position of the unmanned ship as a positioning coordinate direction of the coordinate system;
and the data mapping unit is used for carrying out data mapping on the local grid map and the global grid map based on the coordinate systems in the local grid map and the global grid map to obtain the environment map of the unmanned ship.
In the embodiment of the application, an unmanned ship sensing system is further provided, which comprises an unmanned ship, a plurality of sensing devices configured on the unmanned ship, Internet of Things devices configured on the shore base and the unmanned ship route, and a cloud computing platform in communication connection with the unmanned ship, wherein the cloud computing platform can comprise each module of the environment map generation device of the unmanned ship described above.
The sensing system is a high-precision dual time-domain and space-domain sensing system that fuses the Internet of Things data on the shore base and the channel with the boat-side sensing data at the unmanned boat end. It comprises the cloud computing platform, the sensing devices at the unmanned boat end, the boat-end information transmission devices, and the related integrated networking devices on the shore base and the channel, such as navigation marks, radars, photoelectric devices, flow-speed, water-depth, and wind-speed sensors, bridges, wharfs, and coastal buildings. When the system constructs the high-precision environment map, the boat-side sensing information is first preliminarily fused; the real-time integrated networking data of the water area within about 5 kilometres of the longitude and latitude of the unmanned boat is then called and mapped onto the grid map generated from the boat-side sensing data, forming a real-time grid map with higher precision and more comprehensive, more refined information. After being issued from the cloud computing platform to the controller at the unmanned boat end, the map can be used for autonomous obstacle-avoidance decisions of the unmanned boat. The system effectively relieves the current dependence on shipborne sensing equipment alone when the unmanned boat sails autonomously; it also turns part of the information sources into public information, so that the amount of sensing equipment the unmanned boat needs to carry is reduced, the cost is lowered, and the endurance of the unmanned boat is improved.
More importantly, a high-precision gridding environment map is constructed by fusing massive internet of things real-time data at the cloud, so that accurate environment information is provided for autonomous navigation of the unmanned ship entering a higher intelligent level.
For the apparatus embodiment, since it is substantially similar to the method embodiment, it is described relatively simply, and reference may be made to the description of the method embodiment section for relevant points.
Referring to fig. 7, a schematic diagram of a cloud computing platform provided in an embodiment of the present application is shown. As shown in fig. 7, the cloud computing platform 700 of the present embodiment includes: a processor 710, a memory 720, and a computer program 721 stored in the memory 720 and operable on the processor 710. When executing the computer program 721, the processor 710 implements the steps in the embodiments of the unmanned ship's environment map generation method described above, such as steps S101 to S104 shown in fig. 1. Alternatively, when executing the computer program 721, the processor 710 implements the functions of each module/unit in the device embodiments described above, for example, the functions of the modules 601 to 604 shown in fig. 6.
Illustratively, the computer program 721 may be divided into one or more modules/units, which are stored in the memory 720 and executed by the processor 710 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which may be used to describe the execution of the computer program 721 in the cloud computing platform 700. For example, the computer program 721 may be divided into a receiving module, a determining module, an extracting module and a generating module, each module having the following specific functions:
the receiving module is used for receiving boat-end sensing data transmitted by the unmanned boat, and the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat;
the determining module is used for determining the current navigation position of the unmanned ship;
the extraction module is used for extracting the data of the Internet of things in the preset range of the navigation position, and the data of the Internet of things are acquired by Internet of things equipment configured on a shore base and the current route of the unmanned ship;
and the generating module is used for generating an environment map of the unmanned ship according to the ship side sensing data and the Internet of things data.
The cloud computing platform 700 may be a server cluster composed of computing devices such as desktop computers, notebooks, palmtop computers, and cloud servers. The cloud computing platform 700 may include, but is not limited to, the processor 710 and the memory 720. Those skilled in the art will appreciate that fig. 7 is merely an example of the cloud computing platform 700 and does not limit it; the platform may include more or fewer components than illustrated, combine some components, or use different components. For example, the cloud computing platform 700 may also include input/output devices, network access devices, buses, and the like.
The processor 710 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 720 may be an internal storage unit of the cloud computing platform 700, such as a hard disk or a memory of the cloud computing platform 700. The memory 720 may also be an external storage device of the cloud computing platform 700, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like, provided on the cloud computing platform 700. Further, the memory 720 may also include both internal storage units and external storage devices of the cloud computing platform 700. The memory 720 is used to store the computer programs 721 and other programs and data required by the cloud computing platform 700. The memory 720 may also be used to temporarily store data that has been output or is to be output.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An environment map generation method of an unmanned ship is applied to a cloud computing platform, and comprises the following steps:
receiving boat-end sensing data transmitted by an unmanned boat, wherein the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat;
determining the current navigation position of the unmanned ship;
extracting Internet of things data in a preset range of the navigation position, wherein the Internet of things data is acquired by Internet of things equipment configured on a shore base and a current navigation path of the unmanned ship;
and generating an environment map of the unmanned ship according to the ship-side sensing data and the Internet of things data.
2. The unmanned boat environment map generation method of claim 1, wherein the boat-side awareness data comprises raw data collected by the plurality of awareness devices, the method further comprising:
and fusing the original data acquired by the multiple sensing devices by adopting a layered data fusion mode to obtain a local grid map taking the unmanned ship as a center.
3. The method according to claim 2, wherein the plurality of sensing devices include a radar device, a vision device and an automatic ship identification system device, and the fusing of the raw data collected by the plurality of sensing devices by using a hierarchical data fusion method to obtain the local grid map centered on the unmanned ship comprises:
converting the original data collected by the radar equipment to the same coordinate system for data superposition to obtain first-layer fusion information;
determining a plurality of target objects to be fused from original data acquired by the visual equipment, and extracting characteristic information of the plurality of target objects to be fused;
performing feature fusion on the feature information of the target objects to be fused and the first layer of fusion information to obtain second layer of fusion information;
and performing decision fusion on the original data acquired by the automatic ship identification system equipment and the second layer of fusion information by adopting a neural network model to obtain a local grid map with the unmanned ship as a center.
4. The unmanned ship environment map generation method of claim 1, wherein the ship-side sensing data comprises a local grid map obtained by fusing raw data collected by the plurality of sensing devices.
5. The unmanned ship environment map generation method according to any one of claims 2 to 4, wherein the generating the unmanned ship environment map according to the ship-side sensing data and the Internet of things data comprises:
generating a global grid map containing the current route of the unmanned ship according to the data of the Internet of things;
and carrying out data mapping on the local grid map and the global grid map to obtain an environment map of the unmanned ship.
6. The method according to claim 5, wherein the IOT data includes a plurality of data objects, each data object includes data sets respectively collected by a plurality of IOT devices, and the generating the global grid map including the current route of the unmanned ship according to the IOT data includes:
performing data fusion on a plurality of data groups contained in each data object to respectively obtain characteristic vector values of each data object;
and generating the global grid map according to each data object and the corresponding characteristic vector value thereof.
7. The method according to claim 5, wherein the data mapping of the local grid map and the global grid map to obtain the environment map of the unmanned ship comprises:
establishing a coordinate system in the local grid map and the global grid map respectively by taking the center of the unmanned ship as a reference origin of the coordinate system and the center line of the heading position of the unmanned ship as the positioning coordinate direction of the coordinate system;
and performing data mapping on the local grid map and the global grid map based on coordinate systems in the local grid map and the global grid map to obtain an environment map of the unmanned ship.
8. An unmanned ship sensing system, comprising an unmanned ship, a plurality of sensing devices configured on the unmanned ship, Internet of Things devices configured on a shore base and an unmanned ship route, and a cloud computing platform in communication connection with the unmanned ship, wherein the cloud computing platform comprises:
the receiving module is used for receiving boat-end sensing data transmitted by the unmanned boat, and the boat-end sensing data is acquired by various sensing devices configured on the unmanned boat;
the determining module is used for determining the current navigation position of the unmanned ship;
the extraction module is used for extracting the data of the Internet of things in the preset range of the navigation position, and the data of the Internet of things are acquired by Internet of things equipment configured on a shore base and the current route of the unmanned ship;
and the generating module is used for generating an environment map of the unmanned ship according to the ship side sensing data and the Internet of things data.
9. A cloud computing platform comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor when executing the computer program implements the unmanned boat environment map generation method of any of claims 1-7.
10. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, implements the unmanned boat environment map generation method according to any one of claims 1 to 7.
CN202011263337.3A 2020-11-12 2020-11-12 Unmanned ship environment map generation method and unmanned ship sensing system Active CN112347218B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011263337.3A CN112347218B (en) 2020-11-12 2020-11-12 Unmanned ship environment map generation method and unmanned ship sensing system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011263337.3A CN112347218B (en) 2020-11-12 2020-11-12 Unmanned ship environment map generation method and unmanned ship sensing system

Publications (2)

Publication Number Publication Date
CN112347218A true CN112347218A (en) 2021-02-09
CN112347218B CN112347218B (en) 2024-06-04

Family

ID=74362698

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011263337.3A Active CN112347218B (en) 2020-11-12 2020-11-12 Unmanned ship environment map generation method and unmanned ship sensing system

Country Status (1)

Country Link
CN (1) CN112347218B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112977165A (en) * 2021-03-23 2021-06-18 西安应用光学研究所 Power management system and method for unmanned naval vessel
CN113917930A (en) * 2021-11-11 2022-01-11 中国船舶重工集团公司第七一九研究所 Unmanned ship navigation state control method based on sensing data
CN113945219A (en) * 2021-09-28 2022-01-18 武汉万集光电技术有限公司 Dynamic map generation method, system, readable storage medium and terminal equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130093245A (en) * 2012-02-14 2013-08-22 (주)지엠티 Suspected smuggling vessel ais analysis system and it's analysis method on the basis of multi-sensors and sailing pattern analysis
US20160070265A1 (en) * 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Multi-sensor environmental mapping
CN106909145A (en) * 2017-02-22 2017-06-30 武汉理工大学 Unmanned hydrographical survey ship barrier real-time perception obstacle avoidance system and method
CN108230247A (en) * 2017-12-29 2018-06-29 达闼科技(北京)有限公司 Generation method, device, equipment and the application program of three-dimensional map based on high in the clouds
KR20190036405A (en) * 2017-09-27 2019-04-04 한국해양과학기술원 System and method for supporting ship entering and leaving port using 3d lidar mounted on unmanned aerial vehicle
CN111507429A (en) * 2020-05-29 2020-08-07 智慧航海(青岛)科技有限公司 Intelligent ship multi-source perception data ship-side fusion method and device and decision system
WO2020168464A1 (en) * 2019-02-19 2020-08-27 SZ DJI Technology Co., Ltd. Local sensing based autonomous navigation, and associated systems and methods

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130093245A (en) * 2012-02-14 2013-08-22 (주)지엠티 Suspected smuggling vessel ais analysis system and it's analysis method on the basis of multi-sensors and sailing pattern analysis
US20160070265A1 (en) * 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Multi-sensor environmental mapping
CN109388150A (en) * 2014-09-05 2019-02-26 深圳市大疆创新科技有限公司 Multi-sensor environment map structuring
CN106909145A (en) * 2017-02-22 2017-06-30 武汉理工大学 Unmanned hydrographical survey ship barrier real-time perception obstacle avoidance system and method
KR20190036405A (en) * 2017-09-27 2019-04-04 한국해양과학기술원 System and method for supporting ship entering and leaving port using 3d lidar mounted on unmanned aerial vehicle
CN108230247A (en) * 2017-12-29 2018-06-29 达闼科技(北京)有限公司 Generation method, device, equipment and the application program of three-dimensional map based on high in the clouds
WO2020168464A1 (en) * 2019-02-19 2020-08-27 SZ DJI Technology Co., Ltd. Local sensing based autonomous navigation, and associated systems and methods
CN111507429A (en) * 2020-05-29 2020-08-07 智慧航海(青岛)科技有限公司 Intelligent ship multi-source perception data ship-side fusion method and device and decision system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
李文华等: "水面自主船舶技术发展路径", 船舶工程, pages 64 - 73 *
苏士斌等: "无人驾驶运输船发展现状与关键技术", 船海工程, pages 56 - 59 *
贾冬青;赵福伟;孙超;: "船联网中的大数据融合管理体系研究", 舰船科学技术, no. 08, 23 April 2016 (2016-04-23), pages 155 - 157 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112977165A (en) * 2021-03-23 2021-06-18 西安应用光学研究所 Power management system and method for unmanned naval vessel
CN112977165B (en) * 2021-03-23 2022-09-06 西安应用光学研究所 Power management system and method for unmanned naval vessel
CN113945219A (en) * 2021-09-28 2022-01-18 武汉万集光电技术有限公司 Dynamic map generation method, system, readable storage medium and terminal equipment
CN113945219B (en) * 2021-09-28 2024-06-11 武汉万集光电技术有限公司 Dynamic map generation method, system, readable storage medium and terminal device
CN113917930A (en) * 2021-11-11 2022-01-11 中国船舶重工集团公司第七一九研究所 Unmanned ship navigation state control method based on sensing data

Also Published As

Publication number Publication date
CN112347218B (en) 2024-06-04

Similar Documents

Publication Publication Date Title
CN112347218B (en) Unmanned ship environment map generation method and unmanned ship sensing system
Han et al. Coastal SLAM with marine radar for USV operation in GPS-restricted situations
CN111738112B (en) Remote sensing ship image target detection method based on deep neural network and self-attention mechanism
CN111753677B (en) Multi-angle remote sensing ship image target detection method based on characteristic pyramid structure
Bovcon et al. A water-obstacle separation and refinement network for unmanned surface vehicles
CN113191372B (en) Construction method and application of ship target directional detection model
Yin et al. Improved PSPNet-based water shoreline detection in complex inland river scenarios
Wang et al. Research and implementation of global path planning for unmanned surface vehicle based on electronic chart
He et al. Mining channel water depth information from IoT-based big automated identification system data for safe waterway navigation
KR20240080189A (en) Distance measurement method and distance measurement device using the same
CN113920447A (en) Ship harbor detection method and device, computer equipment and storage medium
Wu et al. An overview of developments and challenges for unmanned surface vehicle autonomous berthing
CN113805178A (en) Method for detecting static obstacles on the water surface
CN111824357B (en) Test method, test device, electronic equipment and computer readable storage medium
CN108052629A (en) A fast sea-land determination method based on high-accuracy DEM data
Cafaro et al. Towards Enhanced Support for Ship Sailing
Zhou et al. A real-time algorithm for visual detection of high-speed unmanned surface vehicle based on deep learning
CN115587308A (en) Method and device for determining navigation channel, electronic equipment and storage medium
CN115439745A (en) Navigation mark carrying type monitoring system and method for ship image characteristics
CN114445572A (en) Deeplab V3+ based method for instantly positioning obstacles and constructing map in unfamiliar sea area
Bhaganagar et al. A novel machine-learning framework with a moving platform for maritime drift calculations
He et al. A recognition approach of radar blips based on improved fuzzy c means
Mehla et al. Object Detection in Autonomous Maritime Vehicles: Comparison Between YOLO V8 and EfficientDet
Sedova et al. Intelligent Collision Danger Assessment of Autonomous Unmanned Sea-Going Vessels
CN110895680A (en) Unmanned ship water surface target detection method based on regional suggestion network

Legal Events

Date Code Title Description
PB01 Publication
CB02 Change of applicant information

Address after: 519080, Rooms 311 and 312A, 3/F, Xiangshan Ocean Science and Technology Port, 3888 North Lovers Road, Tangjiawan Town, High-tech Zone, Zhuhai City, Guangdong Province

Applicant after: Zhuhai Yunzhou Intelligent Technology Co.,Ltd.

Address before: 519080, Rooms 311 and 312A, 3/F, Xiangshan Ocean Science and Technology Port, 3888 North Lovers Road, Tangjiawan Town, High-tech Zone, Zhuhai City, Guangdong Province

Applicant before: ZHUHAI YUNZHOU INTELLIGENCE TECHNOLOGY Ltd.

SE01 Entry into force of request for substantive examination
GR01 Patent grant