CN109459046A - Positioning and navigation method of suspension type underwater autonomous vehicle - Google Patents

Positioning and navigation method of suspension type underwater autonomous vehicle

Info

Publication number
CN109459046A
CN109459046A (application CN201811407006.5A)
Authority
CN
China
Prior art keywords
image
positioning
navigation device
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811407006.5A
Other languages
Chinese (zh)
Other versions
CN109459046B (en)
Inventor
史剑光
李芙蓉
于海滨
彭时林
刘敬彪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hanlu Marine Technology Co ltd
Original Assignee
Hangzhou Electronic Science and Technology University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Electronic Science and Technology University
Priority to CN201811407006.5A
Publication of CN109459046A
Application granted
Publication of CN109459046B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • G01C21/32Structuring or formatting of map data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a positioning and navigation method for a hovering underwater autonomous vehicle, comprising the following steps: S10, acquiring underwater images; S20, performing saliency analysis on the acquired images; S30, preprocessing the acquired images, extracting features and feeding them into a feature bag-of-words model; S40, performing feature matching, real-time multi-sensor positioning and thruster control; S50, building an image database from the outputs of S20-S40; S60, performing closed-loop detection on the images against the image database; S70, performing offline optimization on the image database. The invention provides a positioning and navigation method that fuses multi-sensor information for a hovering underwater autonomous vehicle operating around a seabed base station, and can effectively improve the underwater positioning accuracy of the vehicle.

Description

Positioning and navigation method of suspension type underwater autonomous vehicle
Technical field
The invention belongs to the field of underwater vehicles, and in particular relates to a positioning and navigation method for a hovering underwater autonomous vehicle.
Background art
A hovering underwater autonomous vehicle (Hover Autonomous Underwater Vehicle, abbreviated HAUV) is an autonomous underwater vehicle with strong maneuverability. An HAUV carries multiple thrusters and offers several advantages: it can hover within a water layer, has strong vertical maneuvering capability, and can navigate close to the seabed.
Traditional underwater positioning methods such as acoustic positioning and dead reckoning are costly to use on an HAUV, and they suffer from drawbacks such as the limited effective range of acoustic positioning and the error accumulation of dead reckoning. At present, machine vision based on seabed terrain is an effective means of solving the problem of high-precision underwater positioning and navigation for an HAUV.
Summary of the invention
The technical problem to be solved by the present invention is to provide a positioning and navigation method that fuses multi-sensor information for an HAUV operating around a seabed base station, which can effectively improve the underwater positioning accuracy of the HAUV.
To solve the above technical problem, the present invention adopts the following technical solution:
A positioning and navigation method for a hovering underwater autonomous vehicle, comprising the following steps:
S10, acquiring underwater images;
S20, performing saliency analysis on the acquired images;
S30, preprocessing the acquired images, extracting features and feeding them into a feature bag-of-words model;
S40, performing feature matching, real-time multi-sensor positioning and thruster control;
S50, building an image database from the outputs of S20-S40;
S60, performing closed-loop detection on the images against the image database;
S70, performing offline optimization on the image database.
Preferably, the real-time multi-sensor positioning comprises the following steps:
S41, let the pose at time k be (x, y, ψ); calculate the pose change (Δx_i, Δy_i, Δψ_i) from the inertial navigation unit and obtain the heading change Δψ_c from the electronic compass;
S42, if |Δψ_i − Δψ_c| is within the threshold, determine the heading change as Δψ = Δψ_c;
S43, if |Δψ_i − Δψ_c| exceeds the threshold, determine that the electronic compass is disturbed by an external magnetic field and take Δψ = Δψ_i;
S44, perform visual image matching;
S45, if the match succeeds, calculate the pose change (Δx_v, Δy_v, Δψ_v) from the homography matrix;
S46, if the match fails, declare a mismatch; the pose at time k+1 is (x + Δx_i, y + Δy_i, ψ + Δψ);
S47, if the position deviations |Δx_v − Δx_i| and |Δy_v − Δy_i| are within their thresholds,
S48, declare a correct match; the pose at time k+1 is (x + Δx_v, y + Δy_v, ψ + Δψ); otherwise go to S46.
Preferably, the saliency analysis is measured by information entropy.
Preferably, the feature matching detects feature points with the ORB feature operator and excludes invalid features with the RANSAC algorithm.
Preferably, the closed-loop detection comprises the following steps:
S61, finding, in the image database, the images whose distance from the current position is within a set value;
S62, comparing the cosine similarity between the current image and each of these images one by one;
S63, performing feature matching if the cosine similarity is higher than a set value.
Preferably, the offline optimization uses the BA (bundle adjustment) nonlinear method.
The present invention has the following beneficial effects:
1. The present invention uses closed-loop detection, with a cosine similarity above a set value as the criterion for deciding whether to perform feature matching between images. The camera position corresponding to the current image is computed from the homography matrix between the closed-loop images, thereby eliminating the accumulated error of the voyage segment before the closed-loop point. Through the relative positions computed between closed-loop images, the positions of subsequent images can also be corrected, improving the positional accuracy of the image sequence.
2. The sensors used for positioning in the present invention include an inertial navigation unit, an electronic compass, an altimeter and a camera. Positioning and navigation during the voyage of the hovering underwater autonomous vehicle use an online algorithm: machine vision is combined with altitude, inertial navigation and heading information to compute the current pose of the vehicle in real time, and the thrusters are controlled through pose feedback so that the robot navigates along the preset path.
3. The offline algorithm of the present invention recomputes the images and position information in the database to obtain a more accurate map.
Brief description of the drawings
Fig. 1 is a flowchart of the steps of an embodiment of the present invention;
Fig. 2 is a flowchart of the specific steps of multi-sensor fusion positioning in an embodiment of the present invention;
Fig. 3 is a flowchart of the specific steps of closed-loop detection in an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
The invention discloses a positioning and navigation method for a hovering underwater autonomous vehicle; a specific embodiment is described below with reference to Figs. 1-3.
S10, acquiring underwater images;
S20, performing saliency analysis on the acquired images;
S30, preprocessing the acquired images, extracting features and feeding them into a feature bag-of-words model;
S40, performing feature matching, real-time multi-sensor positioning and thruster control;
S50, building an image database from the outputs of S20-S40;
S60, performing closed-loop detection on the images against the image database;
S70, performing offline optimization on the image database.
The real-time multi-sensor positioning in S40 comprises the following steps:
S41, let the pose at time k be (x, y, ψ); calculate the pose change (Δx_i, Δy_i, Δψ_i) from the inertial navigation unit and obtain the heading change Δψ_c from the electronic compass;
S42, if |Δψ_i − Δψ_c| is within the threshold, determine the heading change as Δψ = Δψ_c;
S43, if |Δψ_i − Δψ_c| exceeds the threshold, determine that the electronic compass is disturbed by an external magnetic field and take Δψ = Δψ_i;
S44, perform visual image matching;
S45, if the match succeeds, calculate the pose change (Δx_v, Δy_v, Δψ_v) from the homography matrix;
S46, if the match fails, declare a mismatch; the pose at time k+1 is (x + Δx_i, y + Δy_i, ψ + Δψ);
S47, if the position deviations |Δx_v − Δx_i| and |Δy_v − Δy_i| are within their thresholds,
S48, declare a correct match; the pose at time k+1 is (x + Δx_v, y + Δy_v, ψ + Δψ); otherwise go to S46.
Specifically, in the above steps the data of the inertial navigation unit and the electronic compass are first compared to determine the heading change Δψ. The basis for this screening is that the electronic compass is sensitive to the surrounding electromagnetic environment, while the inertial navigation unit has lower precision and accumulates error but is stable and rarely disturbed by the surroundings. The validity of the electronic compass data is therefore judged against a threshold: if the data are valid, the heading measured by the electronic compass is used; if invalid, the heading measured by the inertial navigation unit is used.
Since the seabed environment is complex and changeable, image matching may fail; in that case the coordinates and heading measured by the inertial navigation unit are used to update the current pose. If image matching succeeds, the visual positioning result is checked against the inertial navigation data and the heading information: if the two differ only slightly, the image match is judged correct and the coordinates and heading measured by visual positioning are used to update the current pose; otherwise the image match is judged wrong and the inertial navigation data prevail.
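As an illustration of the decision logic just described, the following minimal Python sketch shows one way the threshold tests of S41-S48 could be wired together; the threshold values, function names and data layout are assumptions made for the example and are not specified in the patent.

```python
import math

HEADING_TOL = math.radians(5.0)   # assumed compass-vs-INS agreement threshold
POS_TOL = 0.5                     # assumed vision-vs-INS position deviation threshold (m)

def fuse_step(pose, ins_delta, compass_dpsi, vision_delta):
    """One update of the fusion logic (S41-S48).

    pose         : (x, y, psi) at time k
    ins_delta    : (dx_i, dy_i, dpsi_i) from the inertial navigation unit
    compass_dpsi : heading change measured by the electronic compass
    vision_delta : (dx_v, dy_v, dpsi_v) from image matching, or None if matching failed
    """
    x, y, psi = pose
    dx_i, dy_i, dpsi_i = ins_delta

    # S42/S43: trust the compass only if it agrees with the INS heading change
    if abs(dpsi_i - compass_dpsi) <= HEADING_TOL:
        dpsi = compass_dpsi
    else:
        dpsi = dpsi_i  # compass assumed disturbed by an external magnetic field

    # S44-S48: validate the visual estimate against the INS estimate
    if vision_delta is not None:
        dx_v, dy_v, _ = vision_delta
        if abs(dx_v - dx_i) <= POS_TOL and abs(dy_v - dy_i) <= POS_TOL:
            return (x + dx_v, y + dy_v, psi + dpsi)  # correct match
    # mismatch or no match: fall back to dead reckoning with the INS
    return (x + dx_i, y + dy_i, psi + dpsi)
```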
The machine vision positioning in the above steps is realized by image matching; the specific formula is as follows:
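(The expression below is a reconstruction: the original formula was published only as an image. Based on the variable definitions in the following paragraph, one plausible form, given here as an assumption rather than the patent's verbatim equation, is:)

\[
\begin{pmatrix}\Delta x\\ \Delta y\end{pmatrix}=T(\psi)\,K\begin{pmatrix}t_x\\ t_y\end{pmatrix},\qquad
K=\begin{pmatrix}h/f_x & 0\\ 0 & h/f_y\end{pmatrix},\qquad
P\,H\,P^{-1}\approx\begin{pmatrix}\cos\theta & -\sin\theta & t_x\\ \sin\theta & \cos\theta & t_y\\ 0 & 0 & 1\end{pmatrix},\qquad
P=\begin{pmatrix}1 & 0 & -W/2\\ 0 & 1 & -H/2\\ 0 & 0 & 1\end{pmatrix}
\]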
In above formula, (Δ x, Δ y) are changes in coordinates of the image center in world coordinate system, and T is navigation body coordinate system To the transition matrix of world coordinate system, it is assumed here that navigation body coordinate system coincides with picture centre coordinate system,It is current The course of moment aircraft, K is simplified camera parameter matrix, with camera heights h (being obtained by altitude measuring) with camera x, the direction y Pixel focal length fxAnd fyRatio reflect that imaging plane to the size conversion of seabed plane, has ignored camera imaging in matrix Distortion, H is homography matrix of the eve image to present image, since camera is towards substantially vertical seabed plane, and phase The height approximation of machine is constant, which can be approximated to be equilong transformation.H-matrix is solved by Feature Points Matching described below It arrives.θ indicates the relative rotation angle of image, tx、tyIndicate the relative displacement of image, unit is pixel.The effect of P matrix be by The coordinate origin of the plane of delineation is moved to picture centre, and W is picture traverse, and H is picture altitude, and unit is pixel.
Since the sensors in the above steps have measurement errors, the measured data are filtered with the EKF algorithm. To improve computational efficiency, the state transition is simplified: the EKF forms its state vector from the two-dimensional coordinate change (Δx, Δy) of the camera between two images, obtained by the fused-information positioning algorithm, and the current heading ψ of the HAUV, i.e. the state vector is (Δx, Δy, ψ).
Its state equation is x_k = f(x_{k−1}, υ_{k−1}) + w_{k−1}.
In the formula, f(·) denotes the kinematic model of the system and reflects the influence of the system input υ_{k−1} on the change of the system state, and w_{k−1} is process noise following a normal distribution. The input parameters of the present invention are the rotational speeds of the two thrusters; the influence of thruster speed on the forward speed and turning speed of the hovering underwater autonomous vehicle can be measured roughly by experiment, which completes the construction of the state equation.
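(As a concrete example of such a state equation; this particular differential-thrust form is an assumption, since the patent states only that the mapping from thruster speeds to forward speed and turning speed is identified experimentally:)

\[
f(x_{k-1},\,\upsilon_{k-1})=\begin{pmatrix} v(n_L,n_R)\,\Delta t\,\cos\psi_{k-1}\\ v(n_L,n_R)\,\Delta t\,\sin\psi_{k-1}\\ \psi_{k-1}+\omega(n_L,n_R)\,\Delta t\end{pmatrix}
\]

Here n_L and n_R are the two thruster speeds forming the input υ_{k−1}, and v(·) and ω(·) are the experimentally identified forward-speed and turn-rate functions.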
The measurement equation is z_k = h(x_k) + v_k.
In the formula, v_k is observation noise following a normal distribution, and h(·) denotes the transfer function from the state vector to the observation vector; in practice it reflects how a change in camera position manifests itself on the sensors.
After EKF processing, taking the starting point as the origin, the current pose of the image center can be expressed as the accumulation of the filtered increments, (ΣΔx, ΣΔy, ΣΔψ). Since the electronic compass measures an absolute heading ψ_c and therefore does not accumulate error, its reading is taken as the current heading whenever the compass data are normal, and the current pose is then expressed as (ΣΔx, ΣΔy, ψ_c).
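For readers who want the filter mechanics spelled out, below is a generic EKF predict/update skeleton in Python for the three-dimensional state (Δx, Δy, ψ); the motion model, measurement model, their Jacobians and the noise covariances are placeholders to be supplied by the caller (for example, the experimentally identified model mentioned above) and are not prescribed by the patent.

```python
import numpy as np

class SimpleEKF:
    """Generic EKF over the state (dx, dy, psi); the models are supplied by the caller."""

    def __init__(self, x0, P0, Q, R):
        self.x = np.asarray(x0, dtype=float)
        self.P = np.asarray(P0, dtype=float)
        self.Q = np.asarray(Q, dtype=float)   # process noise covariance
        self.R = np.asarray(R, dtype=float)   # observation noise covariance

    def predict(self, f, F, u):
        # f(x, u): state transition model; F(x, u): its Jacobian at the prior state
        Fk = F(self.x, u)
        self.x = f(self.x, u)
        self.P = Fk @ self.P @ Fk.T + self.Q

    def update(self, z, h, H):
        # h(x): measurement model; H(x): its Jacobian at the predicted state
        Hk = H(self.x)
        y = np.asarray(z, dtype=float) - h(self.x)   # innovation
        S = Hk @ self.P @ Hk.T + self.R              # innovation covariance
        K = self.P @ Hk.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(self.P.shape[0]) - K @ Hk) @ self.P
```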
In S20 the saliency analysis is measured by information entropy, and in S40 the feature matching detects feature points with the ORB feature operator and excludes invalid features with the RANSAC algorithm. Specifically:
Feature points are detected with the ORB feature operator, invalid features are excluded with the RANSAC algorithm, and the homography matrix is solved from the mapping of the feature points between the two images.
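A minimal sketch of this ORB-plus-RANSAC homography estimation, written here with OpenCV; the library choice and the parameter values are illustrative assumptions, as the patent does not prescribe an implementation.

```python
import cv2
import numpy as np

def estimate_homography(prev_gray, curr_gray, min_matches=10):
    """Estimate the homography mapping the previous image onto the current one."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None

    # Hamming-distance brute-force matching with cross-checking
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects mismatched (invalid) feature correspondences
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```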
The quality of an image is measured by its saliency. The image is first blurred, in order to exclude small and overly numerous feature points and retain only the larger features. Image saliency is measured by the information entropy E, which is calculated as follows:
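(The expression below is a reconstruction: the original formula was published only as an image. The standard information-entropy form consistent with the surrounding definitions is assumed.)

\[
E = -\sum_{k=1}^{n} p(w_k)\,\log p(w_k)
\]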
where w_k denotes a feature word in the image, n is the total number of feature words, and p(w_k) is the frequency with which the word w_k appears in the image. It can be seen from the formula that the more types of feature words there are and the more uniformly they are distributed, the larger the information entropy is and the better it reflects the saliency of the image.
The closed-loop detection in S60 comprises the following steps:
S61, finding, in the image database, the images whose distance from the current position is within a set value;
S62, comparing the cosine similarity between the current image and each of these images one by one;
S63, performing feature matching if the cosine similarity is higher than a set value.
Specifically, during map exploration the number of words in the feature dictionary keeps growing. An image is characterized by its feature-word histogram vector (n_{w1}, n_{w2}, ..., n_{wk}), where n_{wk} denotes the number of times the feature word w_k appears in the image. Suppose that at a given moment the feature-word histogram vector of the i-th image is x_i and that of the j-th image is x_j; the degree of similarity of the two images can then be expressed by their cosine similarity:
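(The expression below is a reconstruction: the original formula was published only as an image. The standard cosine-similarity form implied by the surrounding text is assumed.)

\[
\mathrm{sim}(x_i, x_j) = \frac{x_i \cdot x_j}{\lVert x_i \rVert\,\lVert x_j \rVert}
\]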
In the closed-loop detection process, the images whose distance from the current position is within a set value are first found in the database, and the cosine similarity between the current image and each of them is compared one by one; if the cosine similarity is higher than a set value, feature matching is carried out. If the feature matching succeeds, the camera position corresponding to the current image is computed from the homography matrix between the closed-loop images, thereby eliminating the accumulated error of the voyage segment before the closed-loop point. If more than one image is matched successfully, the image with the higher saliency is chosen for dead reckoning.
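The following short Python sketch illustrates this candidate screening by distance and cosine similarity; the thresholds and the database layout are assumptions made for the example.

```python
import numpy as np

def find_loop_candidates(db, cur_pos, cur_hist, dist_max=5.0, sim_min=0.8):
    """Return database entries likely to close a loop with the current image.

    db       : list of dicts with keys 'pos' (x, y) and 'hist' (bag-of-words histogram)
    cur_pos  : current (x, y) estimate
    cur_hist : bag-of-words histogram of the current image
    """
    cur_hist = np.asarray(cur_hist, dtype=float)
    candidates = []
    for entry in db:
        # S61: keep only images within a set distance of the current position
        if np.hypot(entry['pos'][0] - cur_pos[0], entry['pos'][1] - cur_pos[1]) > dist_max:
            continue
        # S62: cosine similarity between feature-word histograms
        h = np.asarray(entry['hist'], dtype=float)
        sim = float(np.dot(cur_hist, h) /
                    (np.linalg.norm(cur_hist) * np.linalg.norm(h) + 1e-12))
        # S63: only sufficiently similar images go on to feature matching
        if sim > sim_min:
            candidates.append((sim, entry))
    return sorted(candidates, key=lambda c: c[0], reverse=True)
```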
In S50, in order to manage the images effectively, to store the results of the image analysis and to facilitate the reuse of the images, an image database is established to manage them. The database is built synchronously while the HAUV navigates and stores, for each image, the outputs of steps S20-S40.
Regarding the offline optimization in S70: because the computing power of the robot's processor core is limited, the online algorithm trades some accuracy for execution efficiency. After the HAUV completes its navigation task, however, there is enough time to perform global optimization offline and obtain a higher-precision map for use in the next voyage. The goal of the offline optimization is therefore to obtain a high-precision map regardless of computational cost (while, of course, keeping the amount of computation as small as possible at equal accuracy). The present invention adopts the widely tested BA nonlinear optimization method. Since the seabed terrain is approximately planar and the camera moves in a plane, the traditional BA problem can be simplified with these two constraints, and the following cost function is constructed:
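(The expression below is a reconstruction: the original cost function was published only as an image. Based on the definitions in the next paragraph, a plausible form, given here as an assumption, is:)

\[
C=\sum_{(i,j)}\sum_{k}\bigl\lVert H_i^{m}\,p_i^{k}-H_j^{m}\,p_j^{k}\bigr\rVert^{2}
\]

where the outer sum runs over image pairs with matched points and the inner sum over the matched point pairs k.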
In the formula, H_i^m and H_j^m denote the homography matrices from the i-th image and the j-th image, respectively, to the common map plane (the imaginary plane onto which the images are stitched), and p_i^k and p_j^k are the matched points on the i-th image and the j-th image, respectively. The optimization objective is to minimize the distance between the projections of all matched points on the map plane. This least-squares problem is solved with the Levenberg-Marquardt method, which yields the optimal homographies between the terrain mosaic and the images, and in turn the optimized image-to-image homographies. Substituting these homographies into the image-matching formula given above yields the camera position corresponding to each image, thereby completing the global nonlinear optimization of the image data.
It should be understood that the exemplary embodiments described herein are illustrative and not restrictive. Although one or more embodiments of the present invention have been described with reference to the drawings, a person skilled in the art will understand that changes in form and detail may be made without departing from the spirit and scope of the present invention as defined by the appended claims.

Claims (6)

1. A positioning and navigation method for a hovering underwater autonomous vehicle, characterized by comprising the following steps:
S10, acquiring underwater images;
S20, performing saliency analysis on the acquired images;
S30, preprocessing the acquired images, extracting features and feeding them into a feature bag-of-words model;
S40, performing feature matching, real-time multi-sensor positioning and thruster control;
S50, building an image database from the outputs of S20-S40;
S60, performing closed-loop detection on the images against the image database;
S70, performing offline optimization on the image database.
2. The positioning and navigation method for a hovering underwater autonomous vehicle according to claim 1, characterized in that the real-time multi-sensor positioning comprises the following steps:
S41, letting the pose at time k be (x, y, ψ), calculating the pose change (Δx_i, Δy_i, Δψ_i) from the inertial navigation unit, and obtaining the heading change Δψ_c from the electronic compass;
S42, if |Δψ_i − Δψ_c| is within the threshold, determining the heading change as Δψ = Δψ_c;
S43, if |Δψ_i − Δψ_c| exceeds the threshold, determining that the electronic compass is disturbed by an external magnetic field and taking Δψ = Δψ_i;
S44, performing visual image matching;
S45, if the match succeeds, calculating the pose change (Δx_v, Δy_v, Δψ_v) from the homography matrix;
S46, if the match fails, declaring a mismatch, the pose at time k+1 being (x + Δx_i, y + Δy_i, ψ + Δψ);
S47, if the position deviations |Δx_v − Δx_i| and |Δy_v − Δy_i| are within their thresholds,
S48, declaring a correct match, the pose at time k+1 being (x + Δx_v, y + Δy_v, ψ + Δψ); otherwise going to S46.
3. The positioning and navigation method for a hovering underwater autonomous vehicle according to claim 1, characterized in that the saliency analysis is measured by information entropy.
4. The positioning and navigation method for a hovering underwater autonomous vehicle according to claim 1, characterized in that the feature matching detects feature points with the ORB feature operator and excludes invalid features with the RANSAC algorithm.
5. The positioning and navigation method for a hovering underwater autonomous vehicle according to claim 1, characterized in that the closed-loop detection comprises the following steps:
S61, finding, in the image database, the images whose distance from the current position is within a set value;
S62, comparing the cosine similarity between the current image and each of these images one by one;
S63, performing feature matching if the cosine similarity is higher than a set value.
6. The positioning and navigation method for a hovering underwater autonomous vehicle according to claim 1, characterized in that the offline optimization uses the BA nonlinear method.
CN201811407006.5A 2018-11-23 2018-11-23 Positioning and navigation method of suspension type underwater autonomous vehicle Active CN109459046B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811407006.5A CN109459046B (en) 2018-11-23 2018-11-23 Positioning and navigation method of suspension type underwater autonomous vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811407006.5A CN109459046B (en) 2018-11-23 2018-11-23 Positioning and navigation method of suspension type underwater autonomous vehicle

Publications (2)

Publication Number Publication Date
CN109459046A (en) 2019-03-12
CN109459046B CN109459046B (en) 2020-05-26

Family

ID=65611503

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811407006.5A Active CN109459046B (en) 2018-11-23 2018-11-23 Positioning and navigation method of suspension type underwater autonomous vehicle

Country Status (1)

Country Link
CN (1) CN109459046B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107690840B (en) * 2009-06-24 2013-07-31 中国科学院自动化研究所 Unmanned plane vision auxiliary navigation method and system
US20170050747A1 (en) * 2015-08-22 2017-02-23 Olaf Wessler Method for destination approach control of unmanned aerial vehicles
CN105208347A (en) * 2015-10-08 2015-12-30 成都时代星光科技有限公司 Automatic unmanned aerial vehicle aerial patrolling and real-time image collecting and transmitting monitoring device for railway line
CN105222760A (en) * 2015-10-22 2016-01-06 一飞智控(天津)科技有限公司 The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method
CN105825517A (en) * 2016-03-31 2016-08-03 湖北航天技术研究院总体设计所 Image correction method and system of navigation height errors
CN108592916A (en) * 2018-04-20 2018-09-28 杭州电子科技大学 The more flight number Orientation on map and air navigation aid of suspending underwater autonomous navigation device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI, Juan et al.: "Multi-AUV target search algorithm based on perception adaptation in unknown environments", 《***工程与电子技术》 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111951198A (en) * 2019-05-16 2020-11-17 杭州海康机器人技术有限公司 Unmanned aerial vehicle aerial image splicing optimization method and device and storage medium
CN111951198B (en) * 2019-05-16 2024-02-02 杭州海康威视数字技术股份有限公司 Unmanned aerial vehicle aerial image stitching optimization method, device and storage medium
CN111968035A (en) * 2020-08-05 2020-11-20 成都圭目机器人有限公司 Image relative rotation angle calculation method based on loss function
CN111968035B (en) * 2020-08-05 2023-06-20 成都圭目机器人有限公司 Image relative rotation angle calculation method based on loss function
CN118226884A (en) * 2024-05-22 2024-06-21 江苏园上园智能科技有限公司 Method for controlling precise cruising of flying robot in air

Also Published As

Publication number Publication date
CN109459046B (en) 2020-05-26

Similar Documents

Publication Publication Date Title
CN101435704B (en) Star tracking method of star sensor under high dynamic state
CN113052908B (en) Mobile robot pose estimation algorithm based on multi-sensor data fusion
US9355453B2 (en) Three-dimensional measurement apparatus, model generation apparatus, processing method thereof, and non-transitory computer-readable storage medium
US9990726B2 (en) Method of determining a position and orientation of a device associated with a capturing device for capturing at least one image
US7193636B2 (en) Image processing device and method therefor and program codes, storing medium
CN109676604A (en) Robot non-plane motion localization method and its motion locating system
CN106052691B (en) Close ring error correction method in the mobile drawing of laser ranging
CN109459046A Positioning and navigation method of suspension type underwater autonomous vehicle
WO2008024772A1 (en) Image-based system and method for vehicle guidance and navigation
CN112197765B (en) Method for realizing fine navigation of underwater robot
US20220390954A1 (en) Topology Processing for Waypoint-based Navigation Maps
Miao et al. UniVIO: Unified direct and feature-based underwater stereo visual-inertial odometry
CN110388919B (en) Three-dimensional model positioning method based on feature map and inertial measurement in augmented reality
CN114608554B (en) Handheld SLAM equipment and robot instant positioning and mapping method
CN107831515A (en) Underwater Navigation method and system
CN111665512A (en) Range finding and mapping based on fusion of 3D lidar and inertial measurement unit
CN114485613B (en) Positioning method for multi-information fusion underwater robot
Shmatko et al. Estimation of rotation measurement error of objects using computer simulation
CN115344033A (en) Monocular camera/IMU/DVL tight coupling-based unmanned ship navigation and positioning method
Palmer et al. Vision based localization system for AUV docking on subsea intervention panels
CN114842224A (en) Monocular unmanned aerial vehicle absolute vision matching positioning scheme based on geographical base map
CN112489118A (en) Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle
CN112214028A (en) Underwater robot pose control method based on OpenMV
JPH1137736A (en) Method and device for measuring 3-dimentional shape
JPWO2021111613A1 (en) 3D map creation device, 3D map creation method, and 3D map creation program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221020

Address after: Room c-1108, No. 198, Qidi Road, economic and Technological Development Zone, Xiaoshan District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou Hanlu Marine Technology Co.,Ltd.

Address before: 310018 No.1, No.2 street, Xiasha Higher Education Park, Hangzhou City, Zhejiang Province

Patentee before: HANGZHOU DIANZI University
