CN109146919B - Tracking and aiming system and method combining image recognition and laser guidance - Google Patents


Info

Publication number
CN109146919B
CN109146919B
Authority
CN
China
Prior art keywords
target object
laser
image
module
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810641748.8A
Other languages
Chinese (zh)
Other versions
CN109146919A (en
Inventor
梁欢
金科
杨天乙
黄辉
黄凤
李春龙
张冉
邓辉
黄莉
杨智豪
Current Assignee
Nanjing University of Aeronautics and Astronautics
State Grid Corp of China SGCC
State Grid Shanxi Electric Power Co Ltd
Global Energy Interconnection Research Institute
Economic and Technological Research Institute of State Grid Shanxi Electric Power Co Ltd
Original Assignee
Same as the current assignees listed above
Application filed by Nanjing University of Aeronautics and Astronautics, State Grid Corp of China SGCC, State Grid Shanxi Electric Power Co Ltd, Global Energy Interconnection Research Institute, Economic and Technological Research Institute of State Grid Shanxi Electric Power Co Ltd filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201810641748.8A priority Critical patent/CN109146919B/en
Publication of CN109146919A publication Critical patent/CN109146919A/en
Application granted granted Critical
Publication of CN109146919B publication Critical patent/CN109146919B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/251 - Analysis of motion using feature-based methods involving models
    • G06T7/60 - Analysis of geometric attributes
    • G06T7/66 - Analysis of geometric attributes of image moments or centre of gravity
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75 - Determining position or orientation of objects or cameras using feature-based methods involving models
    • H - ELECTRICITY
    • H02 - GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02J - CIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J50/00 - Circuit arrangements or systems for wireless supply or distribution of electric power
    • H02J50/30 - Circuit arrangements or systems for wireless supply or distribution of electric power using light, e.g. lasers
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Optics & Photonics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Power Engineering (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a tracking and aiming system and method combining image recognition and laser guidance. The system comprises a laser receiving device and a laser emitting device. Inspection is first carried out along a preset route; after a target object is judged to have entered the field of view, its position information is obtained and the tracking pan-tilt is adjusted so that the target object lies in the central area of the field of view, guiding the laser beam into the target area to complete coarse tracking. Once the target object is in the target area, the direction of the beacon light is finely adjusted according to the deviation between the beacon light and the reflected beacon light, achieving high-precision tracking and aiming.

Description

Tracking and aiming system and method combining image recognition and laser guidance
Technical Field
The invention relates to the technical field of laser wireless power transmission and image recognition, in particular to a tracking and aiming system and method combining image recognition and laser guidance.
Background
With the increasing adoption of electrical equipment, the traditional power supply mode based on contact conduction suffers from poor mobility, safety, and reliability. In particular, when the electrical equipment and the power supply system are far apart or move relative to each other, considerable additional equipment is needed to maintain physical contact, which is inconvenient in practice. Wireless laser power transmission, by contrast, offers long transmission distance and good directionality, and can be used for non-contact charging of equipment far above the ground, or of fast-moving targets such as unmanned aerial vehicles, tanks, and other vehicles.
The laser tracking and aiming technologies in common use mainly include GPS positioning and four-quadrant infrared detection. GPS positioning, however, suffers from low short-distance positioning precision and high cost, while four-quadrant infrared detection is only applicable when the initial position of the laser beam is already within the target area, and its search range is small.
Disclosure of Invention
Therefore, the tracking and aiming system and method combining image recognition and laser guidance provided by the invention overcome the defects of low positioning precision and high cost in the prior art.
The embodiment of the invention provides a laser emitting device with a tracking and aiming function, which comprises: an image processing module, a laser guide module, a laser transmitter and a photoelectric detector. The image processing module acquires an inspection image according to a preset route, extracts image information in the inspection image, compares the image information with preset template image information, and judges whether a target object has entered the field of view; when the target object is judged to have entered the field of view, the image processing module acquires the position information of the target object and adjusts the target object to the central area of the field of view according to that position information; the laser guide module drives the laser transmitter to transmit beacon light to the target object; the photoelectric detector collects a reflected light beam of the beacon light and determines the position information of the reflected light beam; the laser guide module adjusts the laser transmitter according to the position information of the reflected light beam and the position of the central point of the photoelectric detector, so that the reflected light beam of the beacon light is aligned with the central point of the photoelectric detector; and the laser transmitter transmits a main beam to the target object.
Further, the image processing module includes: an image acquisition sub-module, an image matching sub-module and a field-of-view area adjustment sub-module, wherein the image acquisition sub-module acquires the inspection image according to the preset route and extracts the image information in the inspection image; the image matching sub-module matches the image information of the target object with the image information in the inspection image and judges whether the target object has entered the field of view; and when the target object is judged to have entered the field of view, the field-of-view area adjustment sub-module acquires the position information of the target object and adjusts the target object to the central area of the field of view according to that position information.
Furthermore, the image processing module and the laser transmitter are arranged on a tracking pan-tilt, and the tracking pan-tilt drives the image processing module to collect the inspection image along the preset route and drives the laser transmitter to transmit the beacon light and the main light beam to the target object.
Further, the laser transmitter includes: the device comprises a laser, a transmitter and a laser power supply, wherein the laser power supply supplies electric energy to the laser; the laser excites photons to generate beacon light according to the electric energy provided by the laser power supply; the transmitter transmits the beacon light, generates a main light beam according to the shaping of the beacon light, and transmits the main light beam.
The embodiment of the invention also discloses a tracking and aiming system combining image recognition and laser guidance, which comprises: a laser receiving device and a laser emitting device, wherein the laser receiving device is arranged on the target object, and the laser emitting device comprises: an image processing module, a laser guide module, a laser transmitter and a photoelectric detector. The image processing module acquires an inspection image according to a preset route, extracts image information in the inspection image, compares the image information with preset template image information, and judges whether a target object has entered the field of view; when the target object is judged to have entered the field of view, the image processing module acquires the position information of the target object and adjusts the target object to the central area of the field of view according to that position information; the laser guide module drives the laser transmitter to transmit beacon light to the target object; the photoelectric detector collects the reflected light beam of the beacon light reflected by the laser receiving device and determines the position information of the reflected light beam; the laser guide module adjusts the laser transmitter according to the position information of the reflected light beam and the position of the central point of the photoelectric detector, so that the reflected light beam of the beacon light is aligned with the central point of the photoelectric detector; and the laser transmitter transmits a main beam to the target object.
Further, the laser light receiving apparatus includes: a plurality of marker lights for constructing a plurality of color feature points on the target object; the image processing module comprises an image acquisition sub-module and an image matching sub-module, wherein,
the image acquisition submodule acquires the inspection image according to a preset route and extracts image information in the inspection image; the image matching sub-module includes: the mass center coordinate acquisition sub-module is used for matching the patrol inspection image information with preset template image information and acquiring the mass center coordinate of a matching area with the highest matching degree with the target object; the first color characteristic point coordinate obtaining submodule is used for obtaining the coordinates of a plurality of color characteristic points arranged on the target object; the distance minimum value obtaining submodule is used for respectively calculating the distances between the centroid coordinates of the matching area and the coordinates of the plurality of color feature points and obtaining the minimum value of the distances; the first distance judgment submodule is used for judging whether the minimum value of the distance is smaller than a first preset value or not; and the object judgment sub-module judges that the target object enters the field of view when the minimum distance value is smaller than a first preset value.
Further, the image processing module further includes a field of view region adjustment sub-module, which includes: the first position centroid coordinate acquisition sub-module is used for matching the patrol inspection image information with preset template image information and acquiring a first position centroid coordinate of a matching area with the highest matching degree with the target object; the second color characteristic point coordinate obtaining submodule is used for obtaining the coordinates of a plurality of color characteristic points arranged on the target object; the second position centroid coordinate acquisition sub-module is used for acquiring a second position centroid coordinate of the target object according to the coordinates of the plurality of color feature points; a centroid coordinate distance submodule for calculating a distance between the first and second location centroid coordinates; the second distance judgment submodule is used for judging whether the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value or not; the target object position information determining submodule determines the first position centroid coordinate as the position information of the target object when the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value; and the target object adjusting submodule is used for adjusting the target object to the central area of the view field according to the position information of the target object.
Further, the laser receiving apparatus further includes: an optical assembly and a photovoltaic cell, wherein the optical assembly reflects the beacon light to produce the reflected light beam; the photovoltaic cell receives the main beam and converts the main beam into electric energy.
Further, the laser receiving apparatus further includes: the power converter is used for converting the power of the electric energy generated by the photovoltaic cell and then transmitting the electric energy to the target object and/or the storage battery.
The embodiment of the invention also provides a tracking and aiming method combining image recognition and laser guidance, which comprises the following steps: acquiring a patrol inspection image according to a preset route, and extracting image information in the patrol inspection image; comparing the image information with preset template image information, and judging whether the target object enters a view field or not; when the target object is judged to enter the field of view, acquiring the position information of the target object, and adjusting the target object to the central area of the field of view according to the position information of the target object; transmitting beacon light to the target object, collecting a reflected light beam of the beacon light, and determining position information of the reflected light beam; adjusting the beacon light according to the position information of the reflected light beam and the position of the central point of the photoelectric detector to enable the reflected light beam to be aligned with the central point of the photoelectric detector; a main beam is emitted toward the target object.
Further, the step of comparing the image information with preset template image information and judging whether the target object enters the field of view specifically includes: matching the inspection image information with preset template image information to obtain a centroid coordinate of a matching area with the highest matching degree with the target object; acquiring coordinates of a plurality of color feature points arranged on the target object; respectively calculating the distances between the centroid coordinates of the matching area and the coordinates of the plurality of color feature points, and acquiring the minimum value of the distances; judging whether the minimum value of the distance is smaller than a first preset value or not; and when the minimum distance value is smaller than a first preset value, judging that the target object enters a field of view.
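The entry-judgment steps above can be sketched in Python. This is an illustrative reconstruction, not code from the patent; the function name and the pixel value of the "first preset value" used below are assumptions.

```python
import math

def target_in_view(match_centroid, feature_coords, first_preset):
    """Judge whether the target object has entered the field of view.

    match_centroid: (x, y) centroid of the matching area with the highest
        matching degree, obtained by template matching.
    feature_coords: (x, y) coordinates of the colour feature points that
        the marker lights construct on the target object.
    first_preset: the "first preset value" (a distance threshold in pixels).
    """
    # distance from the matching-area centroid to every colour feature point
    distances = [math.dist(match_centroid, p) for p in feature_coords]
    # the target is judged in view when the minimum distance is below the threshold
    return min(distances) < first_preset
```

For example, a matching centroid at (100, 100) with a feature point at (103, 104) is within a 10-pixel threshold, so the target is judged to have entered the field of view.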
Further, the step of acquiring the position information of the target object and adjusting the target object to the central area of the field of view according to the position information of the target object specifically includes: matching the inspection image information with preset template image information to obtain a first position centroid coordinate of a matching area with the highest matching degree with the target object; acquiring coordinates of a plurality of color feature points arranged on the target object; acquiring a second position centroid coordinate of the target object according to the coordinates of the plurality of color feature points; calculating a distance between the first and second location centroid coordinates; judging whether the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value or not; when the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value, determining the first position centroid coordinate as the position information of the target object; and adjusting the target object to the central area of the field of view according to the position information of the target object.
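The position-confirmation step above can likewise be sketched. The names and tolerance are illustrative assumptions, with the second-position centroid taken here as the mean of the colour-feature-point coordinates.

```python
import math

def confirm_target_position(first_centroid, feature_coords, second_preset):
    """Return the target's position information when the template-matching
    centroid and the centroid of the colour feature points agree within the
    "second preset value"; otherwise return None."""
    # second-position centroid: mean of the colour feature point coordinates
    cx = sum(x for x, _ in feature_coords) / len(feature_coords)
    cy = sum(y for _, y in feature_coords) / len(feature_coords)
    if math.dist(first_centroid, (cx, cy)) < second_preset:
        return first_centroid  # accepted as the target's position information
    return None                # estimates disagree; do not adjust the pan-tilt
```

When the two centroid estimates disagree by more than the threshold, the function returns None, signalling that no pan-tilt adjustment should be made from this frame.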
Further, the step of adjusting the beacon light according to the position information of the reflected light beam and the position of the central point of the photodetector to align the reflected light beam with the central point of the photodetector specifically includes: acquiring the coordinate of the central point of the photoelectric detector and the coordinate of a reflected beam of the beacon light reflected by the target object; calculating a deviation value of the coordinate of the central point of the photoelectric detector and the coordinate of the reflected light beam; and adjusting the coordinates of the reflected light beam of the beacon light to be aligned with the central point of the photoelectric detector according to the deviation value.
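A minimal sketch of the fine-adjustment step above, assuming a simple proportional correction; the gain value and coordinate conventions are illustrative and not specified by the patent.

```python
def deviation(center, spot):
    """Deviation between the photodetector centre-point coordinate and the
    reflected-beam coordinate."""
    return (center[0] - spot[0], center[1] - spot[1])

def fine_adjust(spot, center=(0.0, 0.0), gain=0.5):
    """One proportional adjustment step moving the reflected beacon spot
    toward the detector centre point."""
    dx, dy = deviation(center, spot)
    return (spot[0] + gain * dx, spot[1] + gain * dy)
```

Iterating `fine_adjust` drives the reflected spot toward the centre point; for instance, a spot at (2, -4) moves to (1, -2) after one step with gain 0.5.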
The technical scheme of the invention has the following advantages:
With the laser emitting device with a tracking and aiming function and the tracking and aiming system and method combining image recognition and laser guidance provided by the embodiments of the invention, inspection is carried out along a preset route; after the target object is judged to have entered the field of view, its position information is obtained and the tracking pan-tilt is adjusted so that the target object lies in the central area of the field of view, guiding the laser beam into the target area to complete coarse tracking; once the target object is in the target area, the direction of the beacon light is finely adjusted according to the deviation between the beacon light and the reflected beacon light, achieving high-precision tracking and aiming.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic view of a specific example of a laser emitting apparatus with a tracking function provided in an embodiment of the present invention;
fig. 2 is a schematic view of another specific example of a laser emitting device with a tracking function provided in the embodiment of the present invention;
FIG. 3 is a schematic diagram of one specific example of a tracking and aiming system combining image recognition and laser guidance provided in an embodiment of the present invention;
FIG. 4 is a diagram illustrating one specific example of an image processing module in a tracking system incorporating image recognition and laser guidance provided in an embodiment of the present invention;
FIG. 5 is a diagram illustrating one specific example of a tracking and aiming method combining image recognition and laser guidance as provided in an embodiment of the present invention;
FIG. 6 is a diagram illustrating a specific example of step S2 in a method for tracking combined image recognition and laser guidance according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a specific example of step S3 in a method for tracking combined image recognition and laser guidance according to an embodiment of the present invention;
fig. 8 is a schematic diagram illustrating a specific example of acquiring light coordinates of a reflective beacon in a tracking method combining image recognition and laser guidance according to an embodiment of the present invention;
fig. 9 is a schematic diagram illustrating a specific example of step S5 in the method for tracking combined image recognition and laser guidance according to an embodiment of the present invention.
Reference numerals
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example 1
An embodiment of the present invention provides a laser emission device with a tracking function, as shown in fig. 1, including: an image processing module 1, a laser guide module 2, a laser emitter 3 and a photoelectric detector 4,
the image processing module 1 collects the inspection image according to a preset route, extracts image information in the inspection image, compares the image information with preset template image information and judges whether a target object enters a view field or not; when it is determined that the target object enters the field of view, the image processing module 1 acquires position information of the target object, and adjusts the target object to the central region of the field of view according to the position information of the target object.
The laser guide module 2 drives the laser transmitter 3 to transmit beacon light to the target object; the photoelectric detector 4 collects the reflected beam of the beacon light, the position information of the reflected beam is determined, and the laser guide module 2 adjusts the laser transmitter 3 according to the position information of the reflected beam and the central point position of the photoelectric detector 4, so that the reflected beam of the beacon light is aligned to the central point of the photoelectric detector 4; the laser transmitter 3 transmits a main beam to a target object. In an embodiment of the present invention, the photodetector 4 may be a four-quadrant photodetector.
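For a four-quadrant photodetector, the spot position is conventionally estimated from the four quadrant photocurrents with a sum-and-difference formula. The sketch below uses that standard formula; the quadrant labelling is an assumption, and the patent does not spell out this computation.

```python
def quadrant_spot_position(q1, q2, q3, q4):
    """Normalised (x, y) offset of the light spot from the detector centre,
    with q1..q4 the photocurrents of the upper-right, upper-left,
    lower-left and lower-right quadrants respectively."""
    total = q1 + q2 + q3 + q4
    if total == 0:
        raise ValueError("no light detected on any quadrant")
    x = ((q1 + q4) - (q2 + q3)) / total  # right minus left
    y = ((q1 + q2) - (q3 + q4)) / total  # top minus bottom
    return x, y
```

Equal photocurrents on all four quadrants give (0, 0), i.e. the reflected beam is aligned with the detector centre point.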
In the embodiment of the present invention, as shown in fig. 2, the image processing module 1 and the laser transmitter 3 are disposed on the tracking pan-tilt 5, and the tracking pan-tilt 5 drives the image processing module 1 to collect the inspection image according to the preset route and drives the laser transmitter 3 to transmit the beacon light and the main light beam to the target object. In the embodiment of the present invention, the tracking pan-tilt 5 may be a two-axis tracking pan-tilt.
In a preferred embodiment, as shown in fig. 2, the image processing module 1 includes: an image acquisition sub-module 11, an image matching sub-module 12 and a field of view region adjustment sub-module 13.
The image acquisition sub-module 11 acquires the inspection image according to the preset route and extracts the image information in the inspection image. In the embodiment of the invention, the image acquisition sub-module 11 may be a camera, and each captured frame is 640 × 480 pixels. The image matching sub-module 12 matches the image information of the target object with the image information in the inspection image according to a preset image processing algorithm, and judges whether the target object has entered the field of view. In the embodiment of the present invention, the image processing algorithm may be a template matching algorithm or the like, but the present invention is not limited thereto. When it is determined that the target object has entered the field of view, the field-of-view area adjustment sub-module 13 acquires the position information of the target object and adjusts the target object to the central area of the field of view according to that position information.
In a preferred embodiment, as shown in fig. 2, the laser transmitter 3 comprises: a laser 31, a transmitter 32, and a laser power supply 33, wherein,
the laser power supply 33 supplies power to the laser 31, and the laser 31 excites photons to generate beacon light in accordance with the power supplied from the laser power supply 33. The transmitter 32 emits beacon light, and generates a main beam according to shaping of the beacon light, and emits the main beam.
In the embodiment of the present invention, the laser power supply 33 may be a storage battery or mains power, and the laser 31 is a semiconductor laser in which photons are excited to generate laser light when current is injected from the laser power supply 33. The beacon light is of low power but good directivity and needs no shaping. The main light beam is an 808 nm laser beam, a wavelength with high transmission efficiency in the atmosphere and a good match to the photovoltaic cell, but it has a large divergence angle and an uneven spot, so it can only be used after being shaped by a laser collimation system. The transmitter 32 in the embodiment of the invention is essentially an integrated collimation system that shapes the laser light generated by the laser before transmitting it.
The laser emitting device with the tracking and aiming function provided by the embodiment of the invention firstly carries out routing inspection through a preset line, acquires the position information of a target object after judging that the target object enters a view field, adjusts the tracking and aiming holder to enable the target object to be in the central area of the view field, guides a laser beam to enter the target area to finish rough tracking, finely adjusts the direction of beacon light according to the deviation of the beacon light and reflected beacon light after entering the target area, and realizes high-precision tracking and aiming.
Example 2
An embodiment of the present invention provides a tracking and aiming system combining image recognition and laser guidance, as shown in fig. 3, including: a laser emitting device 6 and a laser receiving device 7, wherein the laser emitting device 6 includes: an image processing module 61, a laser guide module 62, a laser emitter 63 and a photoelectric detector 64. The image processing module 61 collects an inspection image according to a preset route, extracts image information in the inspection image, compares the image information with preset template image information, and judges whether a target object has entered the field of view; when it is determined that the target object has entered the field of view, the image processing module 61 acquires the position information of the target object and adjusts the target object to the central region of the field of view according to that position information. The laser guide module 62 drives the laser emitter 63 to transmit beacon light to the target object; the photoelectric detector 64 collects the reflected light beam of the beacon light reflected by the laser receiving device 7 and determines the position information of the reflected light beam; the laser guide module 62 adjusts the laser emitter 63 according to the position information of the reflected beam and the position of the center point of the photoelectric detector 64, so that the reflected beam of the beacon light is aligned with the center point of the photoelectric detector 64; and the laser emitter 63 transmits a main beam to the target object.
In the embodiment of the present invention, as shown in fig. 3, the laser receiving device 7 is disposed on the target object. The laser receiving device 7 comprises an optical assembly 71, a photovoltaic cell 72, a power converter 73, a storage battery 74, and a plurality of marker lights 75. The optical assembly 71 reflects the beacon light emitted by the laser emitter to generate the reflected light beam; the photovoltaic cell 72 receives the main light beam emitted by the laser emitter and converts it into electrical energy; the power converter 73 converts the power of the electrical energy generated by the photovoltaic cell 72 and transmits it to the target object and/or the storage battery 74; and the marker lights 75 are used to construct a plurality of color feature points on the target object.
In a preferred embodiment, as shown in fig. 4, the image processing module 61 includes an image collecting sub-module 611, an image matching sub-module 612, and a field of view area adjusting sub-module 613, wherein the image collecting sub-module 611 collects the inspection image according to the predetermined route and extracts the image information from the inspection image.
The image matching sub-module 612 determines whether the target has entered the field of view, that is, whether the target object has entered the shooting range of the camera. Each frame obtained by the camera is first filtered to remove noise that might cause interference, yielding a relatively smooth image; this image is then used as the inspection image to be examined for template matching and specific-color feature point processing.
The image matching sub-module 612 in the embodiment of the present invention includes: the centroid coordinate obtaining sub-module 6121 is configured to match the inspection image information with preset template image information, and obtain a centroid coordinate of a matching area with a highest matching degree with the target object.
In the embodiment of the invention, the template matching algorithm proceeds as follows. Gray-level processing is applied to the template image and to the inspection image to be examined to obtain standardized gray values. The processed template image is used as the template block to be matched; each position in the inspection image is traversed and its similarity to the template block is evaluated, that is, the standardized gray values of the template image and the inspection image are substituted into a similarity function to obtain their degree of matching. The centroid coordinate of the matching area with the highest matching degree is then obtained.
The first color feature point coordinate obtaining sub-module 6122 is configured to obtain the coordinates of a plurality of color feature points set on the target object. The color feature points set on the target object are specific-color feature points, which may be red in the embodiment of the present invention. First, the filtered inspection image is converted from the RGB (red, green, blue) model to the HSV (hue, saturation, value) model, the H component interval in which the specific color lies is determined, and suitable S and V component intervals are chosen; color detection is then performed on each pixel of the inspection image. When a pixel falls within the HSV intervals, its value is set to 255, otherwise to 0, and the result is stored in a result image. This yields a binary image highlighting the specific-color feature points. Contour detection is then performed on this image to obtain the coordinates of each color feature point.
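The per-pixel HSV thresholding described above can be sketched in Python with the standard-library colorsys module (the red H/S/V intervals used here are illustrative assumptions, not values given in this embodiment):

```python
import colorsys

def binarize_color(pixels, h_lo=0.95, h_hi=0.05, s_min=0.5, v_min=0.3):
    # Return a binary image (255/0) that highlights pixels whose HSV values
    # fall inside the specific-color intervals.  Red wraps around H = 0, so
    # the H test accepts hues above h_lo or below h_hi.
    result = []
    for row in pixels:
        out_row = []
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            in_h = h >= h_lo or h <= h_hi
            out_row.append(255 if in_h and s >= s_min and v >= v_min else 0)
        result.append(out_row)
    return result
```

A production implementation would operate on whole arrays (for example an in-range test in an imaging library) rather than pixel by pixel, and would be followed by contour detection to extract the feature point coordinates.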
The distance minimum obtaining submodule 6123 is configured to calculate distances between the centroid coordinate of the matching area and the coordinates of the multiple color feature points, and obtain a minimum value of the distances. The first distance determining submodule 6124 is configured to determine whether the minimum value of the distance is smaller than a first preset value. The object determination sub-module 6125 determines that the target object enters the field of view when the minimum distance value is smaller than the first preset value.
In the embodiment of the invention, when the target object is outside the field of view of the camera, the template matching algorithm still matches whichever area in the field of view most resembles the target. However, the probability that such an area also happens to contain the specific-color feature points, due merely to environmental interference, is extremely small, so the distance between the specific-color feature points and the template matching area is large; in that case the target is considered not to be in the field of view and the inspection process continues. When the target is in the field of view of the camera, the template matching area and the specific-color feature points lie in approximately the same area (namely the target area), the distance between them is very small and below the first preset value (which may be 10 pixels), and the target object is judged to have entered the field of view.
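The field-of-view decision above, which declares the target present only when the template-match centroid lies within the first preset value of some specific-color feature point, reduces to a minimum-distance test. A minimal sketch (the function name and threshold default are illustrative):

```python
import math

def target_in_view(match_centroid, feature_points, threshold=10.0):
    # The target object is judged to have entered the field of view when the
    # minimum distance between the template-matching centroid and any
    # specific-color feature point is below the preset pixel threshold.
    if not feature_points:
        return False
    d_min = min(math.dist(match_centroid, p) for p in feature_points)
    return d_min < threshold
```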
In an embodiment of the present invention, as shown in fig. 4, the view area adjusting submodule 613 includes:
the first position centroid coordinate obtaining sub-module 6131 is configured to match the inspection image information with the preset template image information and obtain the first position centroid coordinate of the matching area with the highest matching degree with the target object. In the embodiment of the invention, the template matching algorithm is the same as described above: gray-level processing is applied to the template image and the inspection image to obtain standardized gray values, the processed template image is used as the template block, each position in the inspection image is traversed and compared against the template block by substituting the standardized gray values into the similarity function, and the centroid coordinate of the matching area with the highest matching degree is obtained.
The second color feature point coordinate obtaining sub-module 6132 is configured to obtain coordinates of a plurality of color feature points set in the target object. The second position centroid coordinate obtaining sub-module 6133 is configured to obtain the second position centroid coordinate of the target object according to the coordinates of the multiple color feature points.
In the embodiment of the invention, the filtered inspection image is converted from the RGB model to the HSV model to obtain a binary image of the specific-color feature points, contour detection is performed on the binary image to obtain the coordinates of each color feature point, and the second position centroid coordinate of the target object is computed from those coordinates.
The centroid coordinate distance sub-module 6134 is configured to calculate the distance between the first position centroid coordinate and the second position centroid coordinate. The second distance determining sub-module 6135 is configured to judge whether this distance is smaller than a second preset value. When the distance is smaller than the second preset value, the target object position information determining sub-module 6136 adopts the first position centroid coordinate as the position information of the target object. In other words, the template matching algorithm (gray processing to obtain standardized gray values, then the similarity function between template and inspection image to obtain the matching value) locates the target when the matching value exceeds a preset value; the resulting centroid coordinate is then considered sufficiently accurate only when its deviation from the centroid coordinate obtained through the specific-color feature points is smaller than the second preset value. In the embodiment of the present invention, the second preset value may be 10 pixels, but is not limited thereto; in other embodiments it may be adjusted according to the size of the actually captured image.
The target object adjusting sub-module 6137 is configured to adjust the target object to the central area of the field of view according to the position information of the target object. In the embodiment of the invention, after the position information of the target object (i.e., the centroid coordinate of the target area) is acquired, the two-axis tracking pan-tilt head is controlled to rotate according to that position information until the target object lies in the central area of the field of view.
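Turning the target position into pan-tilt motion can be as simple as a proportional command on the pixel offset from the image centre. A sketch under assumed values (the gain, dead-band and default frame size are illustrative, not specified by this sub-module):

```python
def centering_command(target_xy, frame_w=640, frame_h=480, gain=0.05, dead_band=5):
    # Convert the pixel offset between the target centroid and the image
    # centre into (pan, tilt) rate commands for the two-axis tracking
    # pan-tilt head; (0.0, 0.0) inside the dead-band means "centred".
    cx, cy = frame_w / 2, frame_h / 2
    dx, dy = target_xy[0] - cx, target_xy[1] - cy
    pan = 0.0 if abs(dx) <= dead_band else gain * dx
    tilt = 0.0 if abs(dy) <= dead_band else gain * dy
    return pan, tilt
```

The dead-band prevents the pan-tilt head from hunting when the centroid estimate jitters by a few pixels around the centre.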
After the target object is adjusted to the central region of the field of view, the laser guide module 62 guides the beacon light emitted by the laser transmitter 63 into the target region.
The tracking and aiming system combining image recognition and laser guidance provided by the embodiment of the invention first patrols along a preset route; after judging that a target object has entered the field of view, it acquires the position information of the target object and adjusts the tracking pan-tilt head so that the target object lies in the central area of the field of view, then guides the laser beam into the target area to complete coarse tracking. After the beam enters the target area, the direction of the beacon light is finely adjusted according to the deviation between the beacon light and the reflected beacon light, achieving high-precision tracking and aiming.
Embodiment 3
The embodiment of the invention provides a tracking method combining image recognition and laser guidance, and as shown in fig. 5, the tracking method comprises the following steps:
step S1: acquiring the inspection image according to a preset route and extracting image information from the inspection image.
In the embodiment of the invention, a camera mounted on the two-axis tracking pan-tilt head patrols along the preset route, and each captured frame has a size of 640 × 480.
Step S2: comparing the image information with preset template image information and judging whether the target object has entered the field of view. In the embodiment of the invention, judging whether the target object has entered the field of view means judging whether it has entered the shooting range of the camera.
In a preferred embodiment, as shown in fig. 6, the step S2 of comparing the image information with the preset template image information to determine whether the target object enters the field of view includes:
step S21: matching the inspection image information with the preset template image information through a preset template matching algorithm to obtain the centroid coordinate of the matching area with the highest matching degree with the target object. In the embodiment of the invention, the template matching algorithm is as follows. First, the target object is photographed to obtain a template picture, which serves as the template block to be matched; each position in the inspection image is traversed and its similarity to the template block is evaluated, and when the similarity is high enough the target object is considered matched. A standard correlation matching method is selected, according to the specific conditions, to compare the similarity between the template block and the image. In the embodiment of the invention, after graying the template image and the inspection image to be examined, the mean gray value of the whole image is subtracted from each pixel of the template block and of the inspection image, and the result is then normalized, so that changes in the illumination brightness of the image or the template do not affect the calculation. The standardized gray values of the template and the inspection image are then substituted into the similarity function to obtain the matching degree. The correlation coefficient produced by the similarity function lies between -1 and 1: 1 means the two images are identical, -1 means their gray levels are opposite, and 0 means there is no linear relationship between them. In the embodiment of the invention, the closer the correlation coefficient is to 1, the higher the matching degree.
The calculation formula of the similarity function is as follows:
T′(x′, y′) = T(x′, y′) − (1 / (w·h)) · Σ_{x″,y″} T(x″, y″)

I′(x+x′, y+y′) = I(x+x′, y+y′) − (1 / (w·h)) · Σ_{x″,y″} I(x+x″, y+y″)

R(x, y) = Σ_{x′,y′} [T′(x′, y′) · I′(x+x′, y+y′)] / √( Σ_{x′,y′} T′(x′, y′)² · Σ_{x′,y′} I′(x+x′, y+y′)² )
where T(x, y) is the gray value at point (x, y) of the template and T′(x, y) is the standardized gray value at that point; I(x, y) is the gray value at point (x, y) of the inspection image to be examined and I′(x, y) is the corresponding standardized gray value; R(x, y) is the similarity function of the current matching region; and w and h respectively represent the width and height of the template image, which is traversed over the inspection image to be examined.
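A direct, unoptimized transcription of the similarity function above (pure Python on nested lists; a real implementation would use an optimized normalized-correlation routine from an imaging library):

```python
def ncc(template, patch):
    # Correlation coefficient R between a template T and an equally sized
    # image patch I, after subtracting each mean gray value as described in
    # the text.  R lies in [-1, 1]: 1 means identical, -1 means opposite
    # gray levels, 0 means no linear relationship.
    n = sum(len(row) for row in template)
    t_mean = sum(v for row in template for v in row) / n
    i_mean = sum(v for row in patch for v in row) / n
    num = den_t = den_i = 0.0
    for t_row, i_row in zip(template, patch):
        for t, i in zip(t_row, i_row):
            tp, ip = t - t_mean, i - i_mean
            num += tp * ip
            den_t += tp * tp
            den_i += ip * ip
    if den_t == 0 or den_i == 0:
        return 0.0  # flat template or patch: no linear relationship
    return num / (den_t * den_i) ** 0.5
```

Full template matching slides this score over every candidate position in the inspection image and keeps the centroid of the best-scoring region.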
Step S22: coordinates of a plurality of color feature points set on a target object are acquired.
Step S23: calculating the distances between the centroid coordinate of the matching area and the coordinates of the plurality of color feature points, and obtaining the minimum of these distances. In the embodiment of the present invention, the plurality of color feature points are processed as follows. First, the filtered inspection image is converted from the RGB (red, green, blue) model to the HSV (hue, saturation, value) model, so that the color description of the image is closer to human visual perception. In the HSV model, the H component approximately represents the color of an object, the S component represents how much that color is mixed with white, and the V component represents how much it is mixed with black. Therefore, when identifying the specific-color feature points, the H component interval in which the specific color lies is first determined, and suitable S and V component intervals are chosen; color detection is then performed on each pixel of the inspection image. When a pixel falls within the HSV intervals its value is set to 255, otherwise to 0, and the result is stored in a result image, yielding a binary image highlighting the specific-color feature points. Contour detection is performed on this image to obtain the coordinates of each feature point.
Step S24: judging whether the minimum value of the distance is smaller than a first preset value or not;
step S25: and when the minimum distance value is smaller than a first preset value, judging that the target object enters the field of view.
In the embodiment of the present invention, the first preset value may be 10 pixels, but is not limited thereto, and in other embodiments, the preset value may be adjusted according to the size of the actually captured image.
Step S3: when it is judged that the target object has entered the field of view, acquiring the position information of the target object and adjusting the target object to the central area of the field of view according to that position information.
The embodiment of the invention combines template matching with color feature point identification: only when the centroid coordinates determined by the matching area and by the specific-color feature points agree within the preset precision are the computed target position coordinate and displacement instruction sent to the two-axis tracking pan-tilt head, which greatly improves the accuracy and precision of image recognition.
In a preferred embodiment, as shown in fig. 7, step S3, the process of obtaining the position information of the target object and adjusting the target object to the central area of the field of view according to the position information of the target object includes:
step S31: matching the inspection image information with the preset template image information through the template matching algorithm to obtain the first position centroid coordinate of the matching area with the highest matching degree with the target object. The template matching algorithm used in this step is the same as in step S21 and is not repeated here.
Step S32: coordinates of a plurality of color feature points set on a target object are acquired. The coordinate algorithm for obtaining the plurality of color feature points in this step is the same as that of step S22, and is not described herein again.
Step S33: and acquiring second position centroid coordinates of the target object according to the coordinates of the plurality of color feature points.
Step S34: a distance between the first location centroid coordinates and the second location centroid coordinates is calculated.
Step S35: and judging whether the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value.
Step S36: and when the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value, determining the first position centroid coordinate as the position information of the target object.
In the embodiment of the present invention, the second preset value may be 10 pixels, but is not limited thereto, and in other embodiments, the preset value may be adjusted according to the size of the actually captured image.
Step S37: and adjusting the target object to the central area of the field of view according to the position information of the target object.
Step S4: emitting beacon light to the target object, collecting the reflected beam of the beacon light, and determining the position information of the reflected beam. In the embodiment of the present invention, as shown in fig. 8, after the beacon light emitted by the laser transmitter is reflected by the optical assembly of the laser receiving device, the coordinates of the reflected beam are obtained by the four-quadrant photodetector.
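A four-quadrant photodetector infers the spot position from its four quadrant signals. A common normalized estimate is sketched below; this embodiment does not spell out the exact formula, and the quadrant labelling here is an assumption:

```python
def quadrant_spot_position(q_a, q_b, q_c, q_d):
    # Estimate normalized (x, y) spot coordinates on a four-quadrant
    # detector, with quadrants labelled A = upper-right, B = upper-left,
    # C = lower-left, D = lower-right.  A centred spot gives (0, 0);
    # both coordinates lie in [-1, 1].
    total = q_a + q_b + q_c + q_d
    if total == 0:
        raise ValueError("no light on detector")
    x = ((q_a + q_d) - (q_b + q_c)) / total
    y = ((q_a + q_b) - (q_c + q_d)) / total
    return x, y
```

Dividing by the total signal makes the estimate insensitive to overall beam intensity, which varies with range and atmospheric conditions.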
Step S5: and adjusting the beacon light according to the position information of the reflected light beam and the position of the central point of the photoelectric detector to enable the reflected light beam to be aligned with the central point of the photoelectric detector.
In a preferred embodiment, as shown in fig. 9, the step S5 of adjusting the beacon light according to the position information of the reflected light beam and the position of the center point of the photodetector, so that the reflected light beam is aligned with the center point of the photodetector includes:
step S51: and acquiring the coordinates of the central point of the photoelectric detector and the coordinates of a reflected beam of the beacon light reflected by the target object.
Step S52: and calculating a deviation value of the coordinate of the central point of the photoelectric detector and the coordinate of the reflected light beam.
Step S53: and adjusting the coordinates of the reflected light beam of the beacon light to be aligned with the central point of the photoelectric detector according to the deviation value. In the embodiment of the invention, the angle of the laser emitter is adjusted according to the deviation value of the coordinate of the central point of the photoelectric detector and the coordinate of the reflected light beam until the deviation value is smaller than the fourth preset value. The fourth preset value can determine a specific value according to an actual application scene.
Step S6: a main beam is emitted toward a target object.
Finally, the laser beam is shaped and delivered to the photovoltaic cell of the laser receiving device, and the resulting electrical power is converted to supply the target object, realizing a non-contact power supply function for laser wireless power transmission.
The tracking and aiming method combining image recognition and laser guidance provided by the embodiment of the invention first patrols along a preset route; after judging that a target object has entered the field of view, it acquires the position information of the target object and adjusts the tracking pan-tilt head so that the target object lies in the central area of the field of view, then guides the laser beam into the target area to complete coarse tracking. After the beam enters the target area, the direction of the beacon light is finely adjusted according to the deviation between the beacon light and the reflected beacon light, achieving high-precision tracking and aiming.
It should be understood that the above examples are only for clarity of illustration and are not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaustively list all embodiments here, and obvious variations or modifications derived therefrom remain within the scope of the invention.

Claims (10)

1. A tracking system that combines image recognition and laser guidance, comprising: a laser receiving device and a laser emitting device, the laser receiving device is arranged on a target object,
the laser emitting device includes: an image processing module, a laser guide module, a laser emitter and a photoelectric detector,
the image processing module acquires an inspection image according to a preset route, extracts image information in the inspection image, compares the image information with preset template image information and judges whether a target object enters a view field or not; when the target object is judged to enter a field of view, the image processing module acquires the position information of the target object and adjusts the target object to the central area of the field of view according to the position information of the target object;
the laser guide module drives the laser transmitter to transmit beacon light to the target object;
the photoelectric detector collects a reflected light beam of the beacon light reflected by the laser receiving device and determines the position information of the reflected light beam;
the laser guiding module adjusts the laser transmitter to transmit according to the position information of the reflected light beam and the position of the central point of the photoelectric detector, so that the reflected light beam of the beacon light is aligned to the central point of the photoelectric detector;
the laser transmitter transmits a main beam to the target object;
the laser receiving apparatus includes: a plurality of marker lights for constructing a plurality of color feature points on the target object;
the image processing module comprises an image acquisition sub-module and an image matching sub-module, wherein,
the image acquisition submodule acquires the inspection image according to a preset route and extracts image information in the inspection image;
the image matching sub-module includes:
the mass center coordinate acquisition sub-module is used for matching the patrol inspection image information with preset template image information and acquiring the mass center coordinate of a matching area with the highest matching degree with the target object;
the first color characteristic point coordinate obtaining submodule is used for obtaining the coordinates of a plurality of color characteristic points arranged on the target object;
the distance minimum value obtaining submodule is used for respectively calculating the distances between the centroid coordinates of the matching area and the coordinates of the plurality of color feature points and obtaining the minimum value of the distances;
the first distance judgment submodule is used for judging whether the minimum value of the distance is smaller than a first preset value or not;
and the object judgment sub-module judges that the target object enters the field of view when the minimum distance value is smaller than a first preset value.
2. The combined image recognition and laser guided tracking system of claim 1, wherein the image processing module further comprises a field of view region adjustment sub-module, the field of view region adjustment sub-module comprising:
the first position centroid coordinate acquisition sub-module is used for matching the patrol inspection image information with preset template image information and acquiring a first position centroid coordinate of a matching area with the highest matching degree with the target object;
the second color characteristic point coordinate obtaining submodule is used for obtaining the coordinates of a plurality of color characteristic points arranged on the target object;
the second position centroid coordinate acquisition sub-module is used for acquiring a second position centroid coordinate of the target object according to the coordinates of the plurality of color feature points;
a centroid coordinate distance submodule for calculating a distance between the first and second location centroid coordinates;
the second distance judgment submodule is used for judging whether the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value or not;
the target object position information determining submodule determines the first position centroid coordinate as the position information of the target object when the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value;
and the target object adjusting submodule is used for adjusting the target object to the central area of the view field according to the position information of the target object.
3. The combined image recognition and laser guided tracking system of claim 1, wherein the laser receiving device further comprises: an optical assembly and a photovoltaic cell, wherein,
the optical assembly reflects the beacon light to generate the reflected light beam;
the photovoltaic cell receives the main beam and converts the main beam into electric energy.
4. The combined image recognition and laser guided tracking system of claim 3, wherein the laser receiving device further comprises: a power converter and a storage battery,
and the power converter converts the power of the electric energy generated by the photovoltaic cell and then transmits the electric energy to the target object and/or the storage battery.
5. The combined image recognition and laser-guided tracking system of claim 1, wherein the image processing module comprises: an image acquisition sub-module, an image matching sub-module and a field of view region adjustment sub-module, wherein,
the image acquisition sub-module acquires an inspection image according to a preset route and extracts image information in the inspection image;
the image matching sub-module matches the image information of the target object with the image information in the inspection image and judges whether the target object enters a view field or not;
when the target object is judged to enter the field of view, the field of view area adjusting submodule acquires the position information of the target object and adjusts the target object to the central area of the field of view according to the position information of the target object.
6. The tracking and aiming system combining image recognition and laser guidance according to claim 1 or 5, wherein the image processing module and the laser transmitter are arranged on a tracking and aiming holder, and the tracking and aiming holder drives the image processing module to collect a patrol inspection image according to a preset route and the laser transmitter to transmit beacon light and a main light beam to a target object.
7. The combined image recognition and laser-guided tracking system of claim 1, wherein the laser transmitter comprises: a laser, a transmitter and a laser power supply, wherein,
the laser power supply supplies electric energy to the laser;
the laser excites photons to generate beacon light according to the electric energy provided by the laser power supply;
the transmitter transmits the beacon light, generates a main light beam according to the shaping of the beacon light, and transmits the main light beam.
8. A tracking method combining image recognition and laser guidance is characterized by comprising the following steps:
acquiring a patrol inspection image according to a preset route, and extracting image information in the patrol inspection image;
comparing the image information with preset template image information, and judging whether the target object enters a view field or not;
when the target object is judged to enter the field of view, acquiring the position information of the target object, and adjusting the target object to the central area of the field of view according to the position information of the target object;
transmitting beacon light to the target object, collecting a reflected light beam of the beacon light, and determining position information of the reflected light beam;
adjusting the beacon light according to the position information of the reflected light beam and the position of the central point of the photoelectric detector to enable the reflected light beam to be aligned with the central point of the photoelectric detector;
emitting a main beam to the target object;
the step of comparing the image information with preset template image information and judging whether the target object enters the view field specifically comprises the following steps:
matching the inspection image information with preset template image information to obtain a centroid coordinate of a matching area with the highest matching degree with the target object;
acquiring coordinates of a plurality of color feature points arranged on the target object;
respectively calculating the distances between the centroid coordinates of the matching area and the coordinates of the plurality of color feature points, and acquiring the minimum value of the distances;
judging whether the minimum value of the distance is smaller than a first preset value or not;
and when the minimum distance value is smaller than a first preset value, judging that the target object enters a field of view.
9. The combined image recognition and laser-guided tracking method according to claim 8, wherein the step of acquiring the position information of the target object and adjusting the target object to the central area of the field of view according to the position information of the target object specifically comprises:
matching the inspection image information with preset template image information to obtain a first position centroid coordinate of a matching area with the highest matching degree with the target object;
acquiring coordinates of a plurality of color feature points arranged on the target object;
acquiring a second position centroid coordinate of the target object according to the coordinates of the plurality of color feature points;
calculating a distance between the first and second location centroid coordinates;
judging whether the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value or not;
when the distance between the first position centroid coordinate and the second position centroid coordinate is smaller than a second preset value, determining the first position centroid coordinate as the position information of the target object;
and adjusting the target object to the central area of the field of view according to the position information of the target object.
10. The combined image recognition and laser-guided tracking method according to claim 8, wherein the step of adjusting the beacon light according to the position information of the reflected light beam and the position of the center point of the photodetector to align the reflected light beam with the center point of the photodetector comprises:
acquiring the coordinate of the central point of the photoelectric detector and the coordinate of the reflected light beam of the beacon light reflected by the target object;
calculating a deviation value between the coordinate of the central point of the photoelectric detector and the coordinate of the reflected light beam;
and adjusting the coordinate of the reflected light beam of the beacon light to be aligned with the central point of the photoelectric detector according to the deviation value.
CN201810641748.8A 2018-06-21 2018-06-21 Tracking and aiming system and method combining image recognition and laser guidance Active CN109146919B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810641748.8A CN109146919B (en) 2018-06-21 2018-06-21 Tracking and aiming system and method combining image recognition and laser guidance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810641748.8A CN109146919B (en) 2018-06-21 2018-06-21 Tracking and aiming system and method combining image recognition and laser guidance

Publications (2)

Publication Number Publication Date
CN109146919A CN109146919A (en) 2019-01-04
CN109146919B true CN109146919B (en) 2020-08-04

Family

ID=64802160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810641748.8A Active CN109146919B (en) 2018-06-21 2018-06-21 Tracking and aiming system and method combining image recognition and laser guidance

Country Status (1)

Country Link
CN (1) CN109146919B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112117835B (en) * 2019-06-19 2022-06-28 华为技术有限公司 Laser alignment method and related device
CN112350775A (en) * 2019-08-06 2021-02-09 中车株洲电力机车研究所有限公司 FSO communication system and method based on machine vision
CN110531217B (en) * 2019-08-12 2021-12-10 深圳供电局有限公司 Line tracking device and tracking method thereof
CN110503687B (en) * 2019-08-12 2022-09-20 中国科学院光电技术研究所 Target positioning method for aerial photoelectric measurement platform
CN111739003B (en) * 2020-06-18 2022-11-18 上海电器科学研究所(集团)有限公司 Machine vision method for appearance detection
CN114447756A (en) * 2020-11-02 2022-05-06 华为技术有限公司 Laser emission device, laser emission method and laser wireless charging system
CN112731343B (en) * 2020-12-18 2023-12-12 福建汇川物联网技术科技股份有限公司 Target measurement method and device for measurement camera
CN113034842A (en) * 2020-12-30 2021-06-25 神思电子技术股份有限公司 Oil-gas pipeline safety protection method, device and system
CN113311861B (en) * 2021-05-14 2023-06-16 国家电投集团青海光伏产业创新中心有限公司 Automatic detection method and system for hidden crack characteristics of photovoltaic module
CN114587586B (en) * 2022-03-21 2022-11-11 黄伟 Noninvasive layered display equipment and system for dental injury and dental pulp injury
CN116054308B (en) * 2022-07-27 2023-10-24 荣耀终端有限公司 Wireless charging method, electronic device and readable medium
CN115265366A (en) * 2022-07-29 2022-11-01 华能澜沧江水电股份有限公司 Object deformation detection method and device, terminal equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102075243A (en) * 2010-12-28 2011-05-25 哈尔滨工业大学 Error detection device and control method for laser communication link beams
CN103078678A (en) * 2012-12-29 2013-05-01 中国航天科技集团公司第五研究院第五一三研究所 Satellite-borne laser wireless energy transmission system
CN103384172A (en) * 2013-06-28 2013-11-06 中国航天科技集团公司第五研究院第五一三研究所 Laser wireless energy transfer communication and tracking integrating system and method
EP2746806A1 (en) * 2012-12-21 2014-06-25 Leica Geosystems AG Self-calibrating laser tracker and auto calibration technique
CN104899590A (en) * 2015-05-21 2015-09-09 深圳大学 Visual target tracking method and system for unmanned aerial vehicle
CN105469400A (en) * 2015-11-23 2016-04-06 广州视源电子科技股份有限公司 Method and system for quickly identifying and marking polarity direction of electronic element
CN105811874A (en) * 2016-03-18 2016-07-27 南京航空航天大学 Optimal series-parallel method for photovoltaic arrays in laser wireless power transmission system
KR101660703B1 (en) * 2015-06-26 2016-09-28 주식회사 유진로봇 Visual homing system and method using stereo camera and active logo
CN106679504A (en) * 2017-01-09 2017-05-17 中国人民解放军武汉军械士官学校 Laser guidance simulation experimental method and system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102075243A (en) * 2010-12-28 2011-05-25 哈尔滨工业大学 Error detection device and control method for laser communication link beams
EP2746806A1 (en) * 2012-12-21 2014-06-25 Leica Geosystems AG Self-calibrating laser tracker and auto calibration technique
CN103078678A (en) * 2012-12-29 2013-05-01 中国航天科技集团公司第五研究院第五一三研究所 Satellite-borne laser wireless energy transmission system
CN103384172A (en) * 2013-06-28 2013-11-06 中国航天科技集团公司第五研究院第五一三研究所 Laser wireless energy transfer communication and tracking integrating system and method
CN104899590A (en) * 2015-05-21 2015-09-09 深圳大学 Visual target tracking method and system for unmanned aerial vehicle
KR101660703B1 (en) * 2015-06-26 2016-09-28 주식회사 유진로봇 Visual homing system and method using stereo camera and active logo
CN105469400A (en) * 2015-11-23 2016-04-06 广州视源电子科技股份有限公司 Method and system for quickly identifying and marking polarity direction of electronic element
CN105811874A (en) * 2016-03-18 2016-07-27 南京航空航天大学 Optimal series-parallel method for photovoltaic arrays in laser wireless power transmission system
CN106679504A (en) * 2017-01-09 2017-05-17 中国人民解放军武汉军械士官学校 Laser guidance simulation experimental method and system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A Survey on Intersatellite Laser Communication; Manjit Sandhu et al.; International Journal of Emerging Technologies in Engineering Research; May 2016; Vol. 4, No. 5; pp. 249-255 *
Research and Implementation of GPS-Based Initial Positioning Technology for Laser Communication; Liu Yang; China Master's Theses Full-text Database, Information Science & Technology; Apr 2011; No. 4; p. I136-522 *
Continuous Multi-Camera Target Tracking Based on Dominant Human-Body Color Features; Li Wencan et al.; Computer & Digital Engineering; Dec 2011; Vol. 39, No. 4; pp. 119-122, 166 *
Research on Real-Time Tracking Algorithm for Aerial Targets and *** Design; Tang Rensheng; China Master's Theses Full-text Database, Engineering Science & Technology II; Jan 2005; No. 1; p. C032-28, text pp. 4-5, 13-14 *

Also Published As

Publication number Publication date
CN109146919A (en) 2019-01-04

Similar Documents

Publication Publication Date Title
CN109146919B (en) Tracking and aiming system and method combining image recognition and laser guidance
US20230146379A1 (en) Multi-channel lidar sensor module
CN103778523B (en) Vertical take-off and landing unmanned aerial vehicle and precise positioning and obstacle avoidance method thereof
JP6540009B2 (en) Image processing apparatus, image processing method, program, image processing system
CN101171833B (en) Digital cameras with triangulation autofocus systems and related methods
CN109116298B (en) Positioning method, storage medium and positioning system
US11715293B2 (en) Methods for identifying charging device, mobile robots and systems for identifying charging device
CN109451233B (en) Device for collecting high-definition face image
CN111965625B (en) Correction method and device for laser radar and environment sensing system
CN111765974B (en) Wild animal observation system and method based on miniature refrigeration thermal infrared imager
CN114905512B (en) Panoramic tracking and obstacle avoidance method and system for intelligent inspection robot
CN111525957B (en) Machine vision-based visible light communication automatic capturing, tracking and aiming method and system
CN109246371B (en) Light spot capturing system and method
CN209991983U (en) Obstacle detection equipment and unmanned aerial vehicle
CN114020039A (en) Automatic focusing system and method for unmanned aerial vehicle inspection tower
US20240127576A1 (en) System and method for matching of image features with indistinguishable landmarks
CN109782811B (en) Automatic following control system and method for unmanned model vehicle
CN114973037B (en) Method for intelligently detecting and synchronously positioning multiple targets by unmanned aerial vehicle
JP2021099387A (en) Wireless camera system and drive method of wireless camera system
CN110111393B (en) Automobile panorama calibration method, device and system
KR102035414B1 (en) Drone with image correction function for managing yard and system for managing yard using the same
CN108663685B (en) Light supplement method, device and system
RU2792974C1 (en) Method and device for autonomous landing of unmanned aerial vehicle
RU2782702C1 (en) Device for supporting object positioning
CN115065409B (en) Visible light indoor communication and positioning integrated system based on wavelength division multiplexing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant