CN109782811B - Automatic following control system and method for unmanned model vehicle - Google Patents


Info

Publication number
CN109782811B
Authority
CN
China
Prior art keywords
image
matrix
following target
pixel
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910107843.4A
Other languages
Chinese (zh)
Other versions
CN109782811A (en)
Inventor
刘井莲
关闯
王春红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dragon Totem Technology Hefei Co ltd
Hefei Minglong Electronic Technology Co ltd
Original Assignee
Suihua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suihua University filed Critical Suihua University
Priority to CN201910107843.4A priority Critical patent/CN109782811B/en
Publication of CN109782811A publication Critical patent/CN109782811A/en
Application granted granted Critical
Publication of CN109782811B publication Critical patent/CN109782811B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An automatic following control system and method for an unmanned model vehicle comprise the following steps: acquiring images shot by a camera device at a preset time period; selecting an initial following target according to an image shot by the camera device; carrying out blocking processing on the image shot by the camera device; in each image area obtained by the blocking, selecting a central pixel as the pixel center of the image area and taking the other pixels of the image area as adjacent pixels; carrying out data standardization processing on the adjacent pixels of each image area to obtain the characteristic information of the image area; grouping the characteristic information of each image area into a combined group to obtain a characteristic image of the image; comparing the characteristic image of the image with the characteristic image of the previous frame of image, and judging whether the initial following target is the following target; if so, determining the initial following target as the following target; and acquiring the relative position of the following target and the center of the camera device according to the image, and adjusting the camera device.

Description

Automatic following control system and method for unmanned model vehicle
Technical Field
The invention relates to the technical field of intelligent control, in particular to an automatic following control system and method for an unmanned model vehicle.
Background
With the rapid development of the economy, science, and technology, every aspect of people's lives has changed. Unmanned model vehicles are increasingly applied in daily life and industrial production; for example, in carrying goods, an unmanned model vehicle can greatly reduce the workload of workers.
However, existing unmanned model vehicles all operate along manually set routes, which limits their range of motion and prevents them from automatically following a target.
Disclosure of Invention
In order to solve the above technical problems, the invention provides an automatic following control system and method for an unmanned model vehicle, which enable the unmanned model vehicle to follow a target automatically.
The embodiment of the invention provides an automatic following control method of an unmanned model vehicle, which comprises the following steps:
s101, acquiring images shot by a camera device in a preset time period;
s102, selecting an initial following target according to the image shot by the camera device;
s103, carrying out blocking processing on the image shot by the camera device;
s104, selecting a central pixel as the pixel center of the image area from each image area after the image is partitioned, and taking other pixels of the image area as adjacent pixels; the distances from the adjacent pixels to the centers of the pixels are equal;
s105, performing data standardization processing on adjacent pixels of each image area to acquire characteristic information of the image areas;
s106, classifying the characteristic information of each image area into a combined group to obtain a characteristic image of the image;
s107, comparing the characteristic image of the image with the characteristic image in the previous frame of image, and judging whether the initial following target is a following target;
s108, if so, determining the initial following target as a following target;
and S109, acquiring the relative position of the following target and the center of the camera device according to the image, and adjusting the camera device.
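For illustration only (not part of the patented disclosure), steps S103 to S106 can be sketched in Python as follows; the square block size, the use of the block border as the equidistant adjacent pixels, and the zero-mean/unit-variance standardization are assumptions, since the patent does not specify them:

```python
import numpy as np

def extract_feature_image(frame, block_size=8):
    """Illustrative sketch of steps S103-S106: block a grayscale frame, take each
    block's central pixel as the pixel centre, standardize the surrounding
    (adjacent) pixels, and keep one feature value per block; the grid of
    feature values plays the role of the frame's feature image."""
    frame = frame.astype(float)
    rows, cols = frame.shape[0] // block_size, frame.shape[1] // block_size
    feature_image = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = frame[r * block_size:(r + 1) * block_size,
                          c * block_size:(c + 1) * block_size]
            center = block[block_size // 2, block_size // 2]   # pixel centre
            # adjacent pixels: the block border, used here as an approximation
            # of a ring of pixels equidistant from the centre
            ring = np.concatenate([block[0, :], block[-1, :],
                                   block[1:-1, 0], block[1:-1, -1]])
            # data standardization of the adjacent pixels
            mean, std = ring.mean(), ring.std() + 1e-9
            # assumed feature: how far the centre deviates from its
            # standardized neighbourhood (one value per image area)
            feature_image[r, c] = (center - mean) / std
    return feature_image
```

Comparing two such feature images frame to frame (step S107) then reduces to comparing the two matrices element by element.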
Further, the preset time period is 60 ms; that is, the camera device captures one frame every 60 ms.
Further, before the step S102, the method further includes: processing the image shot by the camera device; the method comprises the following steps:
acquiring a light quantity value of each pixel in the image;
acquiring the ratio of the light quantity value of each pixel to a standard light quantity value acquired when the following target is irradiated;
comparing the ratio with a preset threshold; if the ratio is larger than the preset threshold value, judging the pixel as an abnormal pixel; if the ratio is smaller than the preset threshold value, judging the pixel to be a normal pixel;
and calculating a standard light value of the abnormal pixel according to the light values of the normal pixels adjacent to the abnormal pixel, and replacing the light value of the abnormal pixel with the standard light value.
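For illustration, the ratio test and the neighbour-based replacement might be sketched as follows; the threshold value of 1.5 and the 4-neighbour averaging are assumptions, since the patent leaves both unspecified.

```python
import numpy as np

def correct_abnormal_pixels(light, standard_light, ratio_threshold=1.5):
    """Flag pixels whose light-quantity ratio exceeds the threshold as abnormal
    and replace them with a value computed from adjacent normal pixels
    (here: the mean of the 4-neighbours that are normal)."""
    light = light.astype(float)
    abnormal = (light / standard_light) > ratio_threshold
    corrected = light.copy()
    h, w = light.shape
    for y, x in zip(*np.nonzero(abnormal)):
        # gather the light values of adjacent normal pixels
        neighbours = [light[ny, nx]
                      for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                      if 0 <= ny < h and 0 <= nx < w and not abnormal[ny, nx]]
        if neighbours:
            # standard light value computed from the normal neighbours
            corrected[y, x] = sum(neighbours) / len(neighbours)
    return corrected
```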
Further, before the step S109, the method further includes: measuring a distance between the camera device and the following target; the method specifically comprises the following steps:
transmitting a laser signal to the following target and receiving a reflected laser signal;
acquiring the time interval between the emission of the laser signal and the reception of the reflected laser signal;
calculating a sample value of the change time of the received reflected laser signal;
correcting the time interval according to the sample value of the change time;
according to the corrected time interval, obtaining the distance between the following target and the unmanned model vehicle;
comparing the distance with a set distance threshold; if the distance is smaller than or equal to the set threshold value, controlling the unmanned model vehicle to stop moving; if the distance is larger than the set threshold value, controlling the unmanned model vehicle to move;
wherein, the step of calculating the sample value of the change time of the received reflected laser signal further comprises: one parameter of the spectral width, wavelength and signal energy of the received laser signal is adjusted such that the sample value of the variation time is minimized.
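For illustration, a minimal Python sketch of the ranging and move/stop decision is given below; the subtraction-style correction of the time interval and the speed-of-light conversion are assumptions (the patent only states that the interval is corrected with the change-time sample value), and the 5 m threshold is the example value mentioned later in the description.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def laser_distance_decision(time_interval_s, change_time_sample_s,
                            distance_threshold_m=5.0):
    """Correct the emit-to-receive interval with the change-time sample value,
    convert it to a one-way distance, and decide whether the unmanned model
    vehicle should keep moving (assumed decision logic)."""
    corrected_interval = time_interval_s - change_time_sample_s
    distance_m = SPEED_OF_LIGHT * corrected_interval / 2.0  # round trip, hence / 2
    action = "stop" if distance_m <= distance_threshold_m else "move"
    return distance_m, action
```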
Further, the image shot by the camera is subjected to blocking processing according to the following formula:
[Blocking formula published as an image in the original document: the image matrix Q_i is partitioned into block matrices A_i11 … A_ipk.]
where Q_i is the image matrix of the i-th frame image taken by the camera device, Q_imn is the pixel in the m-th row and n-th column of the image matrix of the i-th frame image, and A_ipk is the block matrix in the p-th row and k-th column obtained by blocking the image matrix of the i-th frame image;
accordingly, the pixel containing the feature information in each block matrix obtained by the blocking is determined according to the following formula:
t_i11 ∈ A_i11; t_i12 ∈ A_i12; … t_i1k ∈ A_i1k; … t_ipk ∈ A_ipk
where t_i denotes the pixels containing feature information in the block matrices obtained by blocking the image matrix of the i-th frame image, and t_ipk is the feature-information pixel of the block matrix in the p-th row and k-th column; on the basis of t_i, the feature matrix of the i-th frame image is formed according to the following formula:
[Formula published as an image in the original document: the feature matrix T_i assembled from the elements t_i11 … t_ipk.]
where T_i is the feature matrix of the i-th frame image; the feature matrix of the i-th frame image is compared with the feature matrix of the previous frame using the following equations, so as to obtain the adjustment angle and distance of the camera device:
in order to enable the camera to follow the target accurately, the coordinates corresponding to the identical elements in the feature matrices of the current and previous frames are extracted, and the following calculation is performed:
[Formulas published as images in the original document: expressions for the adjustment angle θ and the adjustment distance L in terms of the matched coordinates (m_j,i, n_j,i), (a_j,i-1, b_j,i-1) and their total number u.]
where θ is the angle to be adjusted by the camera device, (m_j,i, n_j,i) are the coordinates of the j-th identical element in the feature matrix of the i-th frame image, (a_j,i-1, b_j,i-1) are the coordinates of the j-th identical element in the feature matrix of the (i-1)-th frame image, u is the total number of identical elements in the feature matrices, and L is the distance to be adjusted by the camera device.
Calculating the angle from the coordinates of the identical elements with the above formulas greatly improves the rotation accuracy of the camera; calculating the moving distance by combining the coordinates with the feature matrix as a whole reduces the camera's movement error, so that the camera moves more accurately.
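Since the θ and L expressions are published only as images, the following Python sketch is one hedged reading of the surrounding text rather than the patented formulas: it derives a turn angle and a travel distance from the mean displacement of the u matched feature coordinates between consecutive frames, with an assumed pixel-to-distance scale factor.

```python
import math

def camera_adjustment(curr_coords, prev_coords, pixels_per_metre=100.0):
    """Estimate the camera adjustment from matched feature coordinates.

    curr_coords: list of (m_ji, n_ji) from the i-th frame's feature matrix.
    prev_coords: list of (a_ji, b_ji) from the (i-1)-th frame, same order.
    The averaging scheme and the scale factor are assumptions, not the
    patented formulas."""
    u = len(curr_coords)
    dm = sum(m - a for (m, _), (a, _) in zip(curr_coords, prev_coords)) / u
    dn = sum(n - b for (_, n), (_, b) in zip(curr_coords, prev_coords)) / u
    theta = math.degrees(math.atan2(dn, dm))          # direction of the mean shift
    distance = math.hypot(dm, dn) / pixels_per_metre  # magnitude of the mean shift
    return theta, distance
```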
Corresponding to the method, the embodiment of the invention provides an automatic following control system of an unmanned model vehicle, which comprises a camera device, a processing module, a following target identification module and an adjusting module;
the camera device is used for shooting according to a preset time period and transmitting the acquired image to the processing module;
the processing module is used for processing the image to obtain the initial following target;
the following target identification module comprises a blocking unit, a feature extraction unit, a feature image synthesis unit and a feature comparison unit; the blocking unit is used for carrying out blocking processing on the image shot by the camera device; the feature extraction unit is configured to acquire adjacent pixels in each image region after the image is segmented, and perform normalization processing on the adjacent pixels to acquire feature information in the image region; the characteristic image synthesis unit is used for classifying the characteristic information of each image area into a synthesis group and acquiring the characteristic image of the image; the characteristic comparison unit is used for comparing the characteristic image of the image with the characteristic image in the previous frame of image and judging whether the initial following target is the following target or not;
the adjusting module is used for acquiring the relative position of the following target and the center of the camera device according to the image and adjusting the angle of the camera device.
Further, the blocking unit performs a blocking process on the image captured by the image capturing device according to the following formula:
[Blocking formula published as an image in the original document: the image matrix Q_i is partitioned into block matrices A_i11 … A_ipk.]
where Q_i is the image matrix of the i-th frame image taken by the camera device, Q_imn is the pixel in the m-th row and n-th column of the image matrix of the i-th frame image, and A_ipk is the block matrix in the p-th row and k-th column obtained by blocking the image matrix of the i-th frame image;
accordingly, the pixel containing the feature information in each block matrix obtained by the blocking is determined according to the following formula:
t_i11 ∈ A_i11; t_i12 ∈ A_i12; … t_i1k ∈ A_i1k; … t_ipk ∈ A_ipk
where t_i denotes the pixels containing feature information in the block matrices obtained by blocking the image matrix of the i-th frame image, and t_ipk is the feature-information pixel of the block matrix in the p-th row and k-th column; on the basis of t_i, the feature matrix of the i-th frame image is formed according to the following formula:
[Formula published as an image in the original document: the feature matrix T_i assembled from the elements t_i11 … t_ipk.]
where T_i is the feature matrix of the i-th frame image; the feature matrix of the i-th frame image is compared with the feature matrix of the previous frame using the following equations, so as to obtain the adjustment angle and distance of the camera device:
in order to enable the camera to follow the target accurately, the coordinates corresponding to the identical elements in the feature matrices of the current and previous frames are extracted, and the following calculation is performed:
[Formulas published as images in the original document: expressions for the adjustment angle θ and the adjustment distance L in terms of the matched coordinates (m_j,i, n_j,i), (a_j,i-1, b_j,i-1) and their total number u.]
where θ is the angle to be adjusted by the camera device, (m_j,i, n_j,i) are the coordinates of the j-th identical element in the feature matrix of the i-th frame image, (a_j,i-1, b_j,i-1) are the coordinates of the j-th identical element in the feature matrix of the (i-1)-th frame image, u is the total number of identical elements in the feature matrices, and L is the distance to be adjusted by the camera device.
Further, the camera device comprises a timing unit, a camera and a microcontroller; the microcontroller is electrically connected with the timing unit and the camera; the timing unit is used for timing according to the preset time period and transmitting a timing ending signal to the microcontroller; the microcontroller is used for controlling the camera to shoot when receiving the timing ending signal;
the preset time period can be set manually by the operator according to actual requirements; by default, it is set to 60 ms.
Further, the system also comprises an unmanned model vehicle;
the unmanned model vehicle comprises a driving device, a solar power supply device, a storage battery and a control unit; the control units are electrically connected with the driving device, the solar cell panel and the storage battery; the control unit is used for controlling the solar power supply device to receive solar light energy and convert the solar light energy into electric energy to be stored in the storage battery; the control unit is also used for controlling the storage battery to transmit electric energy to the driving device to drive the unmanned model vehicle to move;
the solar power supply device comprises: the solar energy collecting device comprises a direction angle adjusting device and a solar panel; the direction angle adjusting device comprises a base, a first linking device is arranged above the base, a first support frame perpendicular to the base is arranged at the upper end of the first linking device, a second support frame is arranged on the right side of the first support frame, two ends of the linking device are respectively linked with the lower sides of the first support frame and the second support frame, the linking device is connected with a circular arc-shaped adjusting device, the lower end of the adjusting device is fixedly arranged on the base, the upper end of the adjusting device is fixedly arranged on the first support frame, a sliding groove is arranged on the adjusting device, a double-headed bolt is connected in the sliding groove, one end of the double-headed bolt is linked with the linking device, and the other end of the double-headed bolt is connected with a first control device through the adjusting device; the upper end of the first support frame and the upper end of the second support frame are both connected with the second connecting device, a fixed flat plate is arranged on the second connecting device, concave parts are arranged at two ends of the flat plate, an arc-shaped regulator is arranged at the concave parts of the flat plate, a stud is arranged on the regulator, the upper end of the stud is arranged on a battery installation device, the lower end of the stud is connected with a second control device through the regulator, a spring is arranged on the stud, one end of the spring is connected with the battery installation device, and the other end of the spring is connected with the regulator; the solar cell panel is arranged on the cell mounting device.
Further, the system also comprises a laser scanning device;
the laser scanning device comprises a laser emitting subunit, a laser receiving subunit and a processing subunit;
the laser emission subunit is used for emitting a laser signal to the following target;
the laser receiving subunit is used for receiving the laser signal reflected back by the following target;
the processing subunit is configured to process the time interval between the laser signal emitted by the laser emitting subunit and the reflected laser signal received by the laser receiving subunit, so as to obtain the distance between the following target and the unmanned model vehicle.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a schematic structural diagram of an automatic following control method for an unmanned model vehicle according to the present invention;
FIG. 2 is a schematic structural diagram of an automatic following control system of an unmanned model vehicle according to the present invention;
fig. 3 is a schematic structural diagram of a solar power supply device of an automatic following control system of an unmanned model vehicle provided by the invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
The embodiment of the invention provides an automatic following control method of an unmanned model vehicle, which comprises the following steps of:
s101, acquiring images shot by a camera device in a preset time period;
s102, selecting an initial following target according to an image shot by the camera device;
s103, carrying out blocking processing on the image shot by the camera device;
s104, selecting a central pixel as the pixel center of the image area from each image area after image blocking, and taking other pixels of the image area as adjacent pixels; the distances from the adjacent pixels to the centers of the pixels are equal;
s105, performing data standardization processing on adjacent pixels of each image area to acquire characteristic information of the image areas;
s106, classifying the characteristic information of each image area into a combined group to obtain a characteristic image of the image;
s107, comparing the characteristic image of the image with the characteristic image in the previous frame of image, and judging whether the initial following target is the following target or not;
s108, if so, determining the initial following target as a following target;
and S109, acquiring the relative position of the following target and the center of the camera device according to the image, and adjusting the camera device.
The working principle of the method is as follows: the camera device shoots images at a preset time period, and an initial following target is selected from the shot images; the image shot by the camera device is subjected to blocking processing, the adjacent pixels of each image area in the image are acquired, and data standardization processing is carried out on the adjacent pixels to obtain the characteristic information of each image area; the characteristic information of each image area is put into a combined group to form the characteristic image of the image; the characteristic image of the image is compared with the characteristic image of the previous frame of image to judge whether the selected initial following target is the following target; if the initial following target is judged to be the following target, the relative position of the following target and the center of the camera device is acquired from the shot image, and the angle of the camera device is adjusted.
The method has the beneficial effects that: selecting an initial following target through an image shot by a camera device, realizing the initial selection of the following target by the unmanned model vehicle, carrying out block processing on the shot image to obtain adjacent pixels of each image area, carrying out data standardization processing and obtaining the characteristic information of each image area; the characteristic information of each image area is put into a combined group to form a characteristic image of the image; the extraction of the characteristic image of the shot image is realized; comparing the characteristic image of the acquired image with the characteristic image extracted from the previous frame of image, judging whether the image is the same following target or not, and if so, determining the initial following target as the following target; adjusting the shooting angle of the camera device according to the position of the following target in the image; the method realizes the identification of the initial following target in the image shot by the camera device, and judges and compares the obtained initial following target, thereby ensuring the uniqueness of the following target of the unmanned model vehicle; thereby realizing the automatic following function of the unmanned model vehicle.
In one embodiment, the preset time period is 60 ms; that is, the camera device captures one frame every 60 ms. In this scheme, a frame is shot every 60 ms and the following target is judged and compared every 60 ms, which makes the unmanned model vehicle's following of the target more accurate.
In one embodiment, before step S102, the method further comprises: processing an image shot by the camera device; the method comprises the following steps:
acquiring a light quantity value of each pixel in an image;
acquiring the ratio of the light quantity value of each pixel to a standard light quantity value acquired when the following target is irradiated;
comparing the ratio with a preset threshold; if the ratio is larger than a preset threshold value, judging the pixel as an abnormal pixel; if the ratio is smaller than a preset threshold value, the pixel is judged to be a normal pixel;
and calculating a standard light value of the obtained abnormal pixel according to the light values of the normal pixels adjacent to the abnormal pixel, and replacing the light value of the abnormal pixel with the standard light value.
According to this technical scheme, after the camera device shoots the image, the light quantity value of each pixel of the shot image is acquired and its ratio to the standard light quantity value is obtained; the ratio of each pixel is compared with the preset threshold to judge whether the pixel is an abnormal pixel. If a pixel is judged to be abnormal, a standard light quantity value is calculated from the normal pixels adjacent to it and used to replace the abnormal pixel's light quantity value. By detecting the light quantity values of the pixels of the shot image and correcting and replacing abnormal ones, an image of higher quality is obtained, which facilitates the subsequent processing steps.
In one embodiment, before step S109, the method further comprises: measuring the distance between the camera device and the following target; the method specifically comprises the following steps:
transmitting a laser signal to a following target and receiving a reflected laser signal;
acquiring the time interval between the emission of the laser signal and the reception of the reflected laser signal;
calculating a sample value of the change time of the received reflected laser signal;
correcting the time interval according to the sample value of the change time;
acquiring the distance between the following target and the unmanned model vehicle according to the corrected time interval;
comparing the distance with a set distance threshold; if the distance is smaller than or equal to the set threshold value, controlling the unmanned model vehicle to stop moving; if the distance is greater than the set threshold value, controlling the unmanned model vehicle to move;
wherein, the step of calculating the sample value of the change time of the received reflected laser signal further comprises: one parameter of the spectral width, wavelength and signal energy of the received laser signal is adjusted so that the sample value of the variation time is minimized.
According to this technical scheme, a laser signal is transmitted to the following target and the reflected laser signal is received; the time interval between transmitting the laser signal and receiving the reflected signal is acquired; the time interval is corrected, and the distance between the camera device and the following target is obtained from it; the obtained distance is compared with a set distance threshold (for example, 5 m) to decide whether the unmanned model vehicle should move along with the following target. This scheme realizes accurate distance measurement between the camera device and the following target, so that the unmanned model vehicle moves according to the measured distance and its distance to the following target is adjusted.
In one embodiment, the image captured by the image capture device may be subjected to a blocking process according to the following formula:
[Blocking formula published as an image in the original document: the image matrix Q_i is partitioned into block matrices A_i11 … A_ipk.]
where Q_i is the image matrix of the i-th frame image taken by the camera device, Q_imn is the pixel in the m-th row and n-th column of the image matrix of the i-th frame image, and A_ipk is the block matrix in the p-th row and k-th column obtained by blocking the image matrix of the i-th frame image;
accordingly, the pixel containing the feature information in each block matrix obtained by the blocking is determined according to the following formula:
t_i11 ∈ A_i11; t_i12 ∈ A_i12; … t_i1k ∈ A_i1k; … t_ipk ∈ A_ipk
where t_i denotes the pixels containing feature information in the block matrices obtained by blocking the image matrix of the i-th frame image, and t_ipk is the feature-information pixel of the block matrix in the p-th row and k-th column; on the basis of t_i, the feature matrix of the i-th frame image is formed according to the following formula:
[Formula published as an image in the original document: the feature matrix T_i assembled from the elements t_i11 … t_ipk.]
where T_i is the feature matrix of the i-th frame image; the feature matrix of the i-th frame image is compared with the feature matrix of the previous frame using the following equations, so as to obtain the adjustment angle and distance of the camera device:
in order to enable the camera to follow the target accurately, the coordinates corresponding to the identical elements in the feature matrices of the current and previous frames are extracted, and the following calculation is performed:
[Formulas published as images in the original document: expressions for the adjustment angle θ and the adjustment distance L in terms of the matched coordinates (m_j,i, n_j,i), (a_j,i-1, b_j,i-1) and their total number u.]
where θ is the angle to be adjusted by the camera device, (m_j,i, n_j,i) are the coordinates of the j-th identical element in the feature matrix of the i-th frame image, (a_j,i-1, b_j,i-1) are the coordinates of the j-th identical element in the feature matrix of the (i-1)-th frame image, u is the total number of identical elements in the feature matrices, and L is the distance to be adjusted by the camera device.
Calculating the angle from the coordinates of the identical elements with the above formulas greatly improves the rotation accuracy of the camera; calculating the moving distance by combining the coordinates with the feature matrix as a whole reduces the camera's movement error, so that the camera moves more accurately.
Corresponding to the foregoing method, an embodiment of the present invention further provides an automatic following control system for an unmanned model vehicle, as shown in fig. 2, including a camera 21, a processing module 22, a following target recognition module 23, and an adjusting module 24;
the camera device 21 is used for shooting according to a preset time period and transmitting the acquired image to the processing module;
the processing module 22 is used for processing the image and acquiring an initial following target;
the following target identification module 23 comprises a blocking unit, a feature extraction unit, a feature image synthesis unit and a feature comparison unit; the block unit is used for carrying out block processing on the image shot by the camera device; the characteristic extraction unit is used for acquiring adjacent pixels in each image area after the image is blocked, and carrying out standardization processing on the adjacent pixels to acquire characteristic information in the image area; the characteristic image synthesis unit is used for classifying the characteristic information of each image area into a synthesis group to obtain a characteristic image of the image; the characteristic comparison unit is used for comparing the characteristic image of the image with the characteristic image in the previous frame of image and judging whether the initial following target is the following target or not;
and the adjusting module 24 is configured to acquire, according to the image, the relative position between the following target and the center of the camera device, and adjust the angle of the camera device.
The working principle of the system is as follows: the camera device 21 shoots images at a preset time period, and the processing module 22 processes the obtained images to acquire an initial following target; the blocking unit in the following target identification module 23 performs blocking processing on the image, the feature extraction unit acquires the feature information of each image area produced by the blocking unit, and the feature image synthesis unit groups the feature information of each image area into a combined group to obtain the feature image of the image; the feature comparison unit compares the feature image of the image with the feature image of the previous frame of image and judges whether the initial following target is the following target; after the following target identification module determines the following target, the adjusting module 24 adjusts the angle of the camera device according to the position of the following target in the image.
In practical application, the image shot by the camera device is an image matrix composed of pixels. To carry out the blocking processing, i.e., to divide the image into block matrices and acquire the feature information of each image area so as to determine how the camera should move, a model is established as follows:
[Blocking formula published as an image in the original document: the image matrix Q_i is partitioned into block matrices A_i11 … A_ipk.]
where Q_i is the image matrix of the i-th frame image taken by the camera device, Q_imn is the pixel in the m-th row and n-th column of the image matrix of the i-th frame image, and A_ipk is the block matrix in the p-th row and k-th column obtained by blocking the image matrix of the i-th frame image;
accordingly, the pixel containing the feature information in each block matrix obtained by the blocking is determined according to the following formula:
t_i11 ∈ A_i11; t_i12 ∈ A_i12; … t_i1k ∈ A_i1k; … t_ipk ∈ A_ipk
where t_i denotes the pixels containing feature information in the block matrices obtained by blocking the image matrix of the i-th frame image, and t_ipk is the feature-information pixel of the block matrix in the p-th row and k-th column; on the basis of t_i, the feature matrix of the i-th frame image is formed according to the following formula:
[Formula published as an image in the original document: the feature matrix T_i assembled from the elements t_i11 … t_ipk.]
where T_i is the feature matrix of the i-th frame image; the feature matrix of the i-th frame image is compared with the feature matrix of the previous frame using the following equations, so as to obtain the adjustment angle and distance of the camera device:
in order to enable the camera to follow the target accurately, the coordinates corresponding to the identical elements in the feature matrices of the current and previous frames are extracted, and the following calculation is performed:
[Formulas published as images in the original document: expressions for the adjustment angle θ and the adjustment distance L in terms of the matched coordinates (m_j,i, n_j,i), (a_j,i-1, b_j,i-1) and their total number u.]
where θ is the angle to be adjusted by the camera device, (m_j,i, n_j,i) are the coordinates of the j-th identical element in the feature matrix of the i-th frame image, (a_j,i-1, b_j,i-1) are the coordinates of the j-th identical element in the feature matrix of the (i-1)-th frame image, u is the total number of identical elements in the feature matrices, and L is the distance to be adjusted by the camera device.
The beneficial effect of above-mentioned system lies in: the processing module selects an initial following target through an image shot by the camera device, so that the initial selection of the following target by the unmanned model vehicle is realized, and a blocking unit in the following target recognition module performs blocking processing on the shot image to divide the image into a plurality of image areas; the characteristic extraction unit is used for extracting characteristic information of each image area, and the characteristic information is classified into a combination group through the characteristic image combination unit to obtain a characteristic image of the image, so that the characteristic image of the image shot by the camera device is obtained; comparing the characteristic image of the image with the characteristic image in the previous frame of image through a characteristic comparison unit, judging whether the initial following target is the following target or not, if so, determining the initial following target as the following target, and adjusting the shooting angle of the camera device through an adjustment module according to the position of the following target in the image; the system realizes the identification of the initial following target in the image shot by the camera device through the processing module, and judges and compares the acquired initial following target through the following target identification module, thereby ensuring the following uniqueness of the unmanned model vehicle to the following target and realizing the automatic following function of the system.
In one embodiment, the camera device comprises a timing unit, a camera and a microcontroller; the microcontroller is electrically connected with the timing unit and the camera; the timing unit is used for timing according to a preset time period and transmitting a timing ending signal to the microcontroller; the microcontroller is used for controlling the camera to shoot when receiving the timing end signal;
the time period is preset, and workers can manually set the time period according to actual requirements; the preset time period is set to 60ms by default. According to the technical scheme, the timing unit, the camera and the microcontroller in the camera device are used for controlling the camera to perform timing shooting according to the preset time period in the timing unit through the microcontroller, so that the camera device can automatically perform timing shooting on the following target.
In one embodiment, the system further comprises an unmanned model vehicle;
the unmanned model vehicle comprises a driving device, a solar power supply device, a storage battery and a control unit; the control unit is electrically connected with the driving device, the solar cell panel and the storage battery; the control unit is used for controlling the solar power supply device to receive solar light energy and convert the solar light energy into electric energy to be stored in the storage battery; the control unit is also used for controlling the storage battery to transmit the electric energy to the driving device to drive the unmanned model vehicle to move;
solar power supply unit: as shown in fig. 3, the solar energy collecting device comprises a direction angle adjusting device 31 and a solar cell panel 32; the direction angle adjusting device 31 comprises a base 311, a first linking device 312 is arranged above the base 311, a first support frame 313 perpendicular to the base 311 is arranged at the upper end of the first linking device 312, a second support frame 314 is arranged on the right side of the first support frame 313, two ends of a linking device 315 are respectively linked with the lower sides of the first support frame 313 and the second support frame 314, the linking device 315 is connected with a circular arc-shaped adjusting device 316, the lower end of the adjusting device 316 is fixedly arranged on the base 311, the upper end of the adjusting device 316 is fixedly arranged on the first support frame 313, a sliding chute 3161 is arranged on the adjusting device 316, a stud 317 is connected in the sliding chute 3161, one end of the stud 317 is linked with the linking device 315, and the other end of the stud 317 is connected with a first control device 318 through the adjusting device 316; the upper end of the first support frame 313 and the upper end of the second support frame 314 are both connected with a second connecting device, the second connecting device is provided with a fixed flat plate 319, two ends of the flat plate 319 are provided with concave parts, the concave parts of the flat plate 319 are provided with arc-shaped regulators 3110, the regulators 3110 are provided with studs 3111, the upper ends of the studs 3111 are arranged on the battery installation device 3112, the lower ends of the studs 3111 are connected with a second control device 3113 through the regulators 3110, the studs 3111 are provided with springs 3114, one end of each spring 3114 is connected with the battery installation device 3112, and the other end of each spring 3110 is connected with the regulator 3110; the solar cell panel 32 is provided on the battery mounting apparatus 3112. In the above technical solution, the battery installation device 3112 is used for installing the solar panel 32, the battery installation device 3112, the first support frame 313, the second support frame 314 and the linkage device 315 form a parallelogram, and the linkage device 315 is used for adjusting the up-down angle of the battery installation device 3112; simultaneously through rotating second controlling means 3113, change between dull and stereotyped 319 and the battery installation device 3112 length of double-screw bolt 3111, alright realize battery installation device 3112 and control the rotation to realized the omnidirectional rotation of solar cell panel 32, above-mentioned technical scheme has realized the regulation to battery installation device 3112 angle, and then has improved solar cell panel 32's photoelectric conversion efficiency.
In one embodiment, the system further comprises a laser scanning device;
the laser scanning device comprises a laser emitting subunit, a laser receiving subunit and a processing subunit;
the laser emission subunit is used for emitting a laser signal to the following target;
the laser receiving subunit is used for receiving the laser signal reflected back by the following target;
and the processing subunit is used for processing the time interval between the laser signal emitted by the laser emitting subunit and the reflected laser signal received by the laser receiving subunit, so as to obtain the distance between the following target and the unmanned model vehicle. According to this technical scheme, the laser emission subunit in the laser scanning device emits a laser signal to the following target, the laser receiving subunit receives the laser signal reflected back by the following target, and the processing subunit acquires and processes the time interval between emitting the laser signal and receiving the reflected signal, thereby measuring the distance between the following target and the unmanned model vehicle.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (8)

1. An automatic following control method of an unmanned model vehicle, which is characterized by comprising the following steps:
s101, acquiring images shot by a camera device in a preset time period;
s102, selecting an initial following target according to the image shot by the camera device;
s103, carrying out blocking processing on the image shot by the camera device;
s104, selecting a central pixel as the pixel center of the image area from each image area after the image is partitioned, and taking other pixels of the image area as adjacent pixels; the distances from the adjacent pixels to the centers of the pixels are equal;
s105, performing data standardization processing on adjacent pixels of each image area to acquire characteristic information of the image areas;
s106, classifying the characteristic information of each image area into a combined group to obtain a characteristic image of the image;
s107, comparing the characteristic image of the image with the characteristic image in the previous frame of image, and judging whether the initial following target is a following target;
s108, if so, determining the initial following target as a following target;
s109, acquiring the relative position of the following target and the center of the camera device according to the image, and adjusting the camera device;
the formula corresponding to the blocking processing is as follows:
[Blocking formula published as an image in the original document: the image matrix Q_i is partitioned into block matrices A_i11 … A_ipk.]
where Q_i is the image matrix of the i-th frame image taken by the camera device, Q_imn is the pixel in the m-th row and n-th column of the image matrix of the i-th frame image, and A_ipk is the block matrix in the p-th row and k-th column obtained by blocking the image matrix of the i-th frame image;
accordingly, the pixel containing the feature information in each block matrix obtained by the blocking is determined according to the following formula:
t_i11 ∈ A_i11; t_i12 ∈ A_i12; … t_i1k ∈ A_i1k; … t_ipk ∈ A_ipk
where t_i denotes the pixels containing feature information in the block matrices obtained by blocking the image matrix of the i-th frame image, and t_ipk is the feature-information pixel of the block matrix in the p-th row and k-th column; on the basis of t_i, the feature matrix of the i-th frame image is formed according to the following formula:
[Formula published as an image in the original document: the feature matrix T_i assembled from the elements t_i11 … t_ipk.]
where T_i is the feature matrix of the i-th frame image; the feature matrix of the i-th frame image is compared with the feature matrix of the previous frame using the following equations, so as to obtain the adjustment angle and distance of the camera device:
in order to enable the camera to follow the target accurately, the coordinates corresponding to the identical elements in the feature matrices of the current and previous frames are extracted, and the following calculation is performed:
[Formulas published as images in the original document: expressions for the adjustment angle θ and the adjustment distance L in terms of the matched coordinates (m_j,i, n_j,i), (a_j,i-1, b_j,i-1) and their total number u.]
where θ is the angle to be adjusted by the camera device, (m_j,i, n_j,i) are the coordinates of the j-th identical element in the feature matrix of the i-th frame image, (a_j,i-1, b_j,i-1) are the coordinates of the j-th identical element in the feature matrix of the (i-1)-th frame image, u is the total number of identical elements in the feature matrices, and L is the distance to be adjusted by the camera device.
2. The method of claim 1,
the preset time period is 60 ms; that is, the camera device captures one frame every 60 ms.
3. The method of claim 1,
before the step S102, the method further includes: processing the image shot by the camera device; the method comprises the following steps:
acquiring a light quantity value of each pixel in the image;
acquiring the ratio of the light quantity value of each pixel to a standard light quantity value acquired when the following target is irradiated;
comparing the ratio with a preset threshold; if the ratio is larger than the preset threshold value, judging the pixel as an abnormal pixel; if the ratio is smaller than the preset threshold value, judging the pixel to be a normal pixel;
and calculating a standard light value of the abnormal pixel according to the light values of the normal pixels adjacent to the abnormal pixel, and replacing the light value of the abnormal pixel with the standard light value.
4. The method of claim 1,
before the step S109, the method further includes: measuring a distance between the camera device and the following target; the method specifically comprises the following steps:
transmitting a laser signal to the following target and receiving a reflected laser signal;
acquiring the time interval between the emission of the laser signal and the reception of the reflected laser signal;
calculating a sample value of the change time of the received reflected laser signal;
correcting the time interval according to the sample value of the change time;
according to the corrected time interval, obtaining the distance between the following target and the unmanned model vehicle;
comparing the distance with a set distance threshold; if the distance is smaller than or equal to the set threshold value, controlling the unmanned model vehicle to stop moving; if the distance is larger than the set threshold value, controlling the unmanned model vehicle to move;
wherein, the step of calculating the sample value of the change time of the received reflected laser signal further comprises: one parameter of the spectral width, wavelength and signal energy of the received laser signal is adjusted such that the sample value of the variation time is minimized.
5. An automatic following control system of an unmanned model vehicle is characterized by comprising a camera device, a processing module, a following target recognition module and an adjusting module;
the camera device is used for shooting according to a preset time period and transmitting the acquired image to the processing module;
the processing module is used for processing the image and acquiring an initial following target;
the following target identification module comprises a blocking unit, a feature extraction unit, a feature image synthesis unit and a feature comparison unit; the blocking unit is used for carrying out blocking processing on the image shot by the camera device; the feature extraction unit is configured to acquire adjacent pixels in each image region after the image is segmented, and perform normalization processing on the adjacent pixels to acquire feature information in the image region; the characteristic image synthesis unit is used for classifying the characteristic information of each image area into a synthesis group and acquiring the characteristic image of the image; the feature comparison unit is configured to compare the feature image of the image with a feature image in a previous frame of image, and determine whether the initial following target is a following target, where a blocking processing formula is as follows:
[Blocking formula published as an image in the original document: the image matrix Q_i is partitioned into block matrices A_i11 … A_ipk.]
where Q_i is the image matrix of the i-th frame image taken by the camera device, Q_imn is the pixel in the m-th row and n-th column of the image matrix of the i-th frame image, and A_ipk is the block matrix in the p-th row and k-th column obtained by blocking the image matrix of the i-th frame image;
accordingly, the pixel containing the feature information in each block matrix obtained by the blocking is determined according to the following formula:
t_i11 ∈ A_i11; t_i12 ∈ A_i12; … t_i1k ∈ A_i1k; … t_ipk ∈ A_ipk
where t_i denotes the pixels containing feature information in the block matrices obtained by blocking the image matrix of the i-th frame image, and t_ipk is the feature-information pixel of the block matrix in the p-th row and k-th column; on the basis of t_i, the feature matrix of the i-th frame image is formed according to the following formula:
[Formula published as an image in the original document: the feature matrix T_i assembled from the elements t_i11 … t_ipk.]
where T_i is the feature matrix of the i-th frame image; the feature matrix of the i-th frame image is compared with the feature matrix of the previous frame using the following equations, so as to obtain the adjustment angle and distance of the camera device:
in order to enable the camera to follow the target accurately, the coordinates corresponding to the identical elements in the feature matrices of the current and previous frames are extracted, and the following calculation is performed:
[Formulas published as images in the original document: expressions for the adjustment angle θ and the adjustment distance L in terms of the matched coordinates (m_j,i, n_j,i), (a_j,i-1, b_j,i-1) and their total number u.]
where θ is the angle to be adjusted by the camera device, (m_j,i, n_j,i) are the coordinates of the j-th identical element in the feature matrix of the i-th frame image, (a_j,i-1, b_j,i-1) are the coordinates of the j-th identical element in the feature matrix of the (i-1)-th frame image, u is the total number of identical elements in the feature matrices, and L is the distance to be adjusted by the camera device;
the adjusting module is used for acquiring the relative position of the following target and the center of the camera device according to the image and adjusting the angle of the camera device.
6. The system of claim 5,
the camera device comprises a timing unit, a camera and a microcontroller; the microcontroller is electrically connected with the timing unit and the camera; the timing unit is used for timing according to the preset time period and transmitting a timing ending signal to the microcontroller; the microcontroller is used for controlling the camera to shoot when receiving the timing ending signal;
the preset time period can be set manually by the operator according to actual requirements; by default, it is set to 60 ms.
7. The system of claim 6,
the system also comprises an unmanned model vehicle;
the unmanned model vehicle comprises a driving device, a solar power supply device, a storage battery and a control unit; the control unit is electrically connected with the driving device, the solar cell panel and the storage battery; the control unit is used for controlling the solar power supply device to receive solar light energy and convert it into electric energy to be stored in the storage battery; the control unit is also used for controlling the storage battery to transmit electric energy to the driving device to drive the unmanned model vehicle to move;
the solar power supply device comprises a direction angle adjusting device (31) and a solar panel (32); the direction angle adjusting device (31) comprises a base (311), a first connecting device (312) is arranged above the base, a first support frame (313) perpendicular to the base (311) is arranged at the upper end of the first connecting device (312), a second support frame (314) is arranged on the right side of the first support frame (313), two ends of a linkage device (315) are respectively connected with the lower sides of the first support frame (313) and the second support frame (314), the linkage device (315) is connected with a circular arc-shaped adjusting device (316), the lower end of the adjusting device (316) is fixedly arranged on the base (311), the upper end of the adjusting device (316) is fixedly arranged on the first support frame (313), a sliding groove (3161) is arranged on the adjusting device (316), a double-head bolt (317) is connected in the sliding groove (3161), and one end of the double-head bolt (317) is connected with the linkage device (315), the other end of the stud bolt (317) is connected with a first control device (318) through an adjusting device (316); the upper end of the first support frame (313) and the upper end of the second support frame (314) are connected with a second connecting device, a fixed flat plate (319) is arranged on the second connecting device, concave parts are arranged at two ends of the flat plate (319), an arc-shaped regulator (3110) is arranged at the concave part of the flat plate (319), a stud (3111) is arranged on the regulator (3110), the upper end of the stud (3111) is arranged on a battery installation device (3112), the lower end of the stud (3111) is connected with a second control device (3113) through the regulator (3110), a spring (3114) is arranged on the stud (3111), one end of the spring (3114) is connected with the battery installation device (3112), and the other end of the spring (3114) is connected with the regulator (3110); the solar cell panel (32) is provided on the battery mounting device (3112).
8. The system of claim 6,
the system further comprises a laser scanning device;
the laser scanning device comprises a laser emitting subunit, a laser receiving subunit and a processing subunit;
the laser emission subunit is used for emitting a laser signal to the following target;
the laser receiving subunit is used for receiving the laser signal reflected back by the following target;
the processing subunit is configured to process the time interval between the laser signal emitted by the laser emitting subunit and the reflected laser signal received by the laser receiving subunit, so as to obtain the distance between the following target and the unmanned model vehicle.
CN201910107843.4A 2019-02-02 2019-02-02 Automatic following control system and method for unmanned model vehicle Active CN109782811B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910107843.4A CN109782811B (en) 2019-02-02 2019-02-02 Automatic following control system and method for unmanned model vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910107843.4A CN109782811B (en) 2019-02-02 2019-02-02 Automatic following control system and method for unmanned model vehicle

Publications (2)

Publication Number Publication Date
CN109782811A CN109782811A (en) 2019-05-21
CN109782811B true CN109782811B (en) 2021-10-08

Family

ID=66503207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910107843.4A Active CN109782811B (en) 2019-02-02 2019-02-02 Automatic following control system and method for unmanned model vehicle

Country Status (1)

Country Link
CN (1) CN109782811B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110522408A (en) * 2019-07-25 2019-12-03 北京爱诺斯科技有限公司 A kind of eye eyesight based on eccentricity cycles technology judges system and method
CN111026162B (en) * 2019-12-10 2023-04-11 长沙中联重科环境产业有限公司 Self-following cleaning robot

Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012066201A1 (en) * 2010-11-19 2012-05-24 Gcmsd Solar-power generation facility having directable collectors
CN102867174A (en) * 2012-08-30 2013-01-09 中国科学技术大学 Method and device for positioning human face features
CN202721146U (en) * 2012-08-13 2013-02-06 郭熹 Linkage tracking light collecting apparatus
CN103325112A (en) * 2013-06-07 2013-09-25 中国民航大学 Quick detecting method for moving objects in dynamic scene
CN103727930A (en) * 2013-12-30 2014-04-16 浙江大学 Edge-matching-based relative pose calibration method of laser range finder and camera
CN104010116A (en) * 2013-02-22 2014-08-27 精工爱普生株式会社 Spectroscopic camera and spectroscopic image processing method
CN104301669A (en) * 2014-09-12 2015-01-21 重庆大学 Suspicious target detection tracking and recognition method based on dual-camera cooperation
WO2015017943A1 (en) * 2013-08-06 2015-02-12 Vergara Monsalve Miguel Solar generation systems having a common receiver bridge and collectors with multiple mobile webs
CN104376564A (en) * 2014-11-24 2015-02-25 西安工程大学 Method for extracting rough image edge based on anisotropism Gaussian directional derivative filter
KR20150003631U (en) * 2014-03-26 2015-10-06 (주) 그린솔루션 Inclination adjuster of the solar cell module
CN105095923A (en) * 2014-05-21 2015-11-25 华为技术有限公司 Image processing method and device
WO2016011204A1 (en) * 2014-07-15 2016-01-21 Face Checks Llc Multi-algorithm-based face recognition system and method with optimal dataset partitioning for a cloud environment
CN205116051U (en) * 2015-11-07 2016-03-30 陈显华 Denoter of construction of highway
CN105447465A (en) * 2015-11-25 2016-03-30 中山大学 Incomplete pedestrian matching method between non-overlapping vision field cameras based on fusion matching of local part and integral body of pedestrian
CN106131493A (en) * 2016-07-20 2016-11-16 绥化学院 Motion sensing control system of an intelligent fire-fighting robot based on virtual reality remote presence
CN106295498A (en) * 2016-07-20 2017-01-04 湖南大学 Remote sensing image target area detection apparatus and method
CN106651920A (en) * 2016-10-19 2017-05-10 北京邮电大学 Machine vision-based movement control method, device and system
WO2017218326A1 (en) * 2016-06-12 2017-12-21 Array Technologies, Inc. Clip-on mounting rails, mounting brackets, and methods of mounting solar modules
CN107681966A (en) * 2017-10-13 2018-02-09 东莞市天合机电开发有限公司 A kind of bracket adjustment systems of solar photovoltaic assembly
KR101838207B1 (en) * 2016-12-14 2018-03-13 재단법인대구경북과학기술원 Apparatus for distinguishing similarity, and calculation method for calculation metrix correlation distance
CN207504804U (en) * 2017-12-12 2018-06-15 湛江市农海科技有限公司 A kind of positioning device of solar energy photovoltaic panel
CN108263441A (en) * 2017-11-21 2018-07-10 中车长江车辆有限公司 A kind of pipeline transportation vehicle control
CN207603529U (en) * 2017-12-08 2018-07-10 沃玛新能源(江苏)有限公司 The adjustable support of photovoltaic module
CN108304834A (en) * 2018-02-27 2018-07-20 弗徕威智能机器人科技(上海)有限公司 A kind of object follower method
CN207884560U (en) * 2018-02-02 2018-09-18 东莞市杰阳太阳能科技有限公司 A kind of flexible thin-film solar cell
CN108664032A (en) * 2018-06-08 2018-10-16 东北大学秦皇岛分校 A kind of automatic control system and method for following carrier based on machine vision
CN108681721A (en) * 2018-05-22 2018-10-19 山东师范大学 Face identification method based on the linear correlation combiner of image segmentation two dimension bi-directional data
CN108875448A (en) * 2017-05-09 2018-11-23 上海荆虹电子科技有限公司 A pedestrian re-identification method and device
CN109087244A (en) * 2018-07-26 2018-12-25 贵州火星探索科技有限公司 A kind of Panorama Mosaic method, intelligent terminal and storage medium
CN109298806A (en) * 2018-09-21 2019-02-01 杨立群 A kind of long-range quick interface exchange method and device based on Object identifying

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9424470B1 (en) * 2014-08-22 2016-08-23 Google Inc. Systems and methods for scale invariant 3D object detection leveraging processor architecture
US10242581B2 (en) * 2016-10-11 2019-03-26 Insitu, Inc. Method and apparatus for target relative guidance

Patent Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012066201A1 (en) * 2010-11-19 2012-05-24 Gcmsd Solar-power generation facility having directable collectors
CN202721146U (en) * 2012-08-13 2013-02-06 郭熹 Linkage tracking light collecting apparatus
CN102867174A (en) * 2012-08-30 2013-01-09 中国科学技术大学 Method and device for positioning human face features
CN104010116A (en) * 2013-02-22 2014-08-27 精工爱普生株式会社 Spectroscopic camera and spectroscopic image processing method
CN103325112A (en) * 2013-06-07 2013-09-25 中国民航大学 Quick detecting method for moving objects in dynamic scene
WO2015017943A1 (en) * 2013-08-06 2015-02-12 Vergara Monsalve Miguel Solar generation systems having a common receiver bridge and collectors with multiple mobile webs
CN103727930A (en) * 2013-12-30 2014-04-16 浙江大学 Edge-matching-based relative pose calibration method of laser range finder and camera
KR20150003631U (en) * 2014-03-26 2015-10-06 (주) 그린솔루션 Inclination adjuster of the solar cell module
CN105095923A (en) * 2014-05-21 2015-11-25 华为技术有限公司 Image processing method and device
WO2016011204A1 (en) * 2014-07-15 2016-01-21 Face Checks Llc Multi-algorithm-based face recognition system and method with optimal dataset partitioning for a cloud environment
CN104301669A (en) * 2014-09-12 2015-01-21 重庆大学 Suspicious target detection tracking and recognition method based on dual-camera cooperation
CN104376564A (en) * 2014-11-24 2015-02-25 西安工程大学 Method for extracting rough image edge based on anisotropism Gaussian directional derivative filter
CN205116051U (en) * 2015-11-07 2016-03-30 陈显华 Denoter of construction of highway
CN105447465A (en) * 2015-11-25 2016-03-30 中山大学 Incomplete pedestrian matching method between non-overlapping vision field cameras based on fusion matching of local part and integral body of pedestrian
WO2017218326A1 (en) * 2016-06-12 2017-12-21 Array Technologies, Inc. Clip-on mounting rails, mounting brackets, and methods of mounting solar modules
CN106131493A (en) * 2016-07-20 2016-11-16 绥化学院 Motion sensing control system of an intelligent fire-fighting robot based on virtual reality remote presence
CN106295498A (en) * 2016-07-20 2017-01-04 湖南大学 Remote sensing image target area detection apparatus and method
CN106651920A (en) * 2016-10-19 2017-05-10 北京邮电大学 Machine vision-based movement control method, device and system
KR101838207B1 (en) * 2016-12-14 2018-03-13 재단법인대구경북과학기술원 Apparatus for distinguishing similarity, and calculation method for calculation metrix correlation distance
CN108875448A (en) * 2017-05-09 2018-11-23 上海荆虹电子科技有限公司 A pedestrian re-identification method and device
CN107681966A (en) * 2017-10-13 2018-02-09 东莞市天合机电开发有限公司 A kind of bracket adjustment systems of solar photovoltaic assembly
CN108263441A (en) * 2017-11-21 2018-07-10 中车长江车辆有限公司 A kind of pipeline transportation vehicle control
CN207603529U (en) * 2017-12-08 2018-07-10 沃玛新能源(江苏)有限公司 The adjustable support of photovoltaic module
CN207504804U (en) * 2017-12-12 2018-06-15 湛江市农海科技有限公司 A kind of positioning device of solar energy photovoltaic panel
CN207884560U (en) * 2018-02-02 2018-09-18 东莞市杰阳太阳能科技有限公司 A kind of flexible thin-film solar cell
CN108304834A (en) * 2018-02-27 2018-07-20 弗徕威智能机器人科技(上海)有限公司 A kind of object follower method
CN108681721A (en) * 2018-05-22 2018-10-19 山东师范大学 Face identification method based on the linear correlation combiner of image segmentation two dimension bi-directional data
CN108664032A (en) * 2018-06-08 2018-10-16 东北大学秦皇岛分校 A kind of automatic control system and method for following carrier based on machine vision
CN109087244A (en) * 2018-07-26 2018-12-25 贵州火星探索科技有限公司 A kind of Panorama Mosaic method, intelligent terminal and storage medium
CN109298806A (en) * 2018-09-21 2019-02-01 杨立群 A kind of long-range quick interface exchange method and device based on Object identifying

Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
Crowd Counting System by Facial Recognition using Histogram of Oriented Gradients, Completed Local Binary Pattern, Gray-Level Co-Occurrence Matrix and Unmanned Aerial Vehicle; Balbin, JR et al.; Third International Workshop on Pattern Recognition; 2018-12-31; 1-5 *
Research on Color Recognition of Urine Test Paper Based on Learning Vector Quantization (LVQ); Wang Chunhong et al.; 2012 Second International Conference on Instrumentation, Measurement, Computer, Communication and Control; 2012-12-31; 850-853 *
Simple Structure for Reactive Power Control of AC Photovoltaic Modules; Tran, VT et al.; 2015 Australasian Universities Power Engineering Conference; 2015-12-31; 1-6 *
Fast locking and tracking algorithm for moving targets in dynamic backgrounds; Li Jiang et al.; Computer Engineering and Applications; 2017-12-31; Vol. 53, No. 4; 214-222 *
Design of a PLC-based intelligent greenhouse environment control system; Wang Zhiguo et al.; Natural Science Journal of Harbin Normal University; 2013-12-31; Vol. 29, No. 3; 76-78 *
Research and application of face recognition algorithms based on feature fusion; Wei Yuena; China Master's Theses Full-text Database, Information Science and Technology; 2017-02-15, No. 02; I138-3493 *
Research on objective image quality evaluation methods based on visual characteristics; Qi Huan; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2018-06-15, No. 06; I138-60 *
Research and application of multi-scale representation and regularization methods in image recognition; Wu Meng; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2014-12-15, No. 12; I138-55 *
Design of a solar ray tracking system and research on optimization of its tracking method; Qiu Yang; China Master's Theses Full-text Database, Engineering Science and Technology II; 2017-06-15, No. 06; C041-4 *

Also Published As

Publication number Publication date
CN109782811A (en) 2019-05-21

Similar Documents

Publication Publication Date Title
CN109146919B (en) Tracking and aiming system and method combining image recognition and laser guidance
CN109782811B (en) Automatic following control system and method for unmanned model vehicle
US20120013727A1 (en) Cell characterization using multiple focus planes
US20030152251A1 (en) Method and apparartus for picking up object being authenticated
CN101216591B (en) Image gray scale based automatic focusing method and its system
US20180242558A1 (en) Multifunction Livestock Measurement Station
KR101974105B1 (en) Photographing system and method for increasing recognition rate of vehicle number
KR20130138392A (en) System for detecting unexpected accident
CN109116298B (en) Positioning method, storage medium and positioning system
CN109451233B (en) Device for collecting high-definition face image
CN114441149B (en) Micron light-emitting diode detection system and detection method
CN117250200B (en) Square pipe production quality detection system based on machine vision
CN110220481A (en) Hand-held visual detection equipment and its position and posture detection method
CN114265418A (en) Unmanned aerial vehicle inspection and defect positioning system and method for photovoltaic power station
CN104667510A (en) Human motion test system
CN109343573A (en) Power equipment inspection figure image collection processing system based on light field technique for taking
KR20130050649A (en) License plate recognizing apparatus of a good fare adjustment
US8531589B2 (en) Image pickup device and image pickup method
US20050072902A1 (en) Method and device for evaluating a parameter of a moving object
CN105987807A (en) Defective point detection system and method for vertical cavity surface emitting laser array
KR102010818B1 (en) Apparatus for capturing images of blood cell
CN116463687A (en) Abnormal monitoring method and system based on out-of-groove information detection
CN115452827B (en) Triaxial motion type full-automatic multichannel slide scanning imaging analysis system
JP2000322686A (en) Number plate recognizing device for vehicle
KR101397222B1 (en) Solar tracking method, solar tracking system and solar power generation system including the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231011

Address after: 230000 B-1015, wo Yuan Garden, 81 Ganquan Road, Shushan District, Hefei, Anhui.

Patentee after: HEFEI MINGLONG ELECTRONIC TECHNOLOGY Co.,Ltd.

Address before: 230000 floor 1, building 2, phase I, e-commerce Park, Jinggang Road, Shushan Economic Development Zone, Hefei City, Anhui Province

Patentee before: Dragon totem Technology (Hefei) Co.,Ltd.

Effective date of registration: 20231011

Address after: 230000 floor 1, building 2, phase I, e-commerce Park, Jinggang Road, Shushan Economic Development Zone, Hefei City, Anhui Province

Patentee after: Dragon totem Technology (Hefei) Co.,Ltd.

Address before: 152000 No.18, Huanghe South Road, Beilin District, Suihua City, Heilongjiang Province

Patentee before: SUIHUA University