CN208335208U - Image fusion acquisition system comprising meteorological parameters - Google Patents

Image fusion acquisition system comprising meteorological parameters

Info

Publication number
CN208335208U
CN208335208U CN201820313952.2U CN201820313952U CN 208335208 U CN 201820313952 U
Authority
CN
China
Prior art keywords
image
communication device
wireless communication
controller
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201820313952.2U
Other languages
Chinese (zh)
Inventor
阮驰
冯亚闯
陈小川
王允韬
马新旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XiAn Institute of Optics and Precision Mechanics of CAS
Original Assignee
XiAn Institute of Optics and Precision Mechanics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XiAn Institute of Optics and Precision Mechanics of CAS filed Critical XiAn Institute of Optics and Precision Mechanics of CAS
Priority to CN201820313952.2U priority Critical patent/CN208335208U/en
Application granted granted Critical
Publication of CN208335208U publication Critical patent/CN208335208U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Processing (AREA)

Abstract

The utility model relates to an image fusion acquisition system comprising meteorological parameters, which solves the problem that conventional image measurement depends on image luminance information while that luminance information is disturbed by the shooting environment. The system includes a fusion acquisition unit and a server unit. The server unit includes a first controller, a first wireless communication device and at least one database; the first wireless communication device and the database are each connected to the first controller. The fusion acquisition unit includes a second controller and, connected to it, a second wireless communication device, a storage device, a camera, a positioning device, a timing device and multiple meteorological sensors. The second wireless communication device communicates with the first wireless communication device.

Description

Image fusion acquisition system comprising meteorological parameters
Technical field
The utility model relates to the field of imaging, and in particular to an image fusion acquisition system comprising meteorological parameters.
Background art
At present, a traditional image is usually represented by its original luminance information or by features computed from that luminance, as in remote sensing image measurement, spectral image measurement and infrared image measurement. All of these measurements depend heavily on the luminance information of the image, yet that luminance is clearly disturbed by the shooting environment. The above features are built on visible light, and under different conditions the illumination of the same scene can differ considerably.
An image is generally characterized by one channel (a grayscale image), three channels (a color image) or multiple channels (a spectral image), and the feature representations used in computer vision applications are all computed from these image channels. However, the appearance of the same scene under different shooting conditions is usually different: as the light weakens, the image changes from bright to dark; as the humidity of the air increases, the spectral curves of ground objects in the image change noticeably; images taken at different temperatures also differ; and visibility directly affects image sharpness. Therefore, an image representation that relies only on the collected per-channel brightness values is incomplete: it lacks the environmental information at the time the image was taken and ignores the meteorological conditions of the shot, such as atmospheric temperature, humidity, visibility and illumination, which causes a loss of image information.
Utility model content
The purpose of the utility model is to solve the problem that conventional image measurement depends on image luminance information while that luminance information is disturbed by the shooting environment. It provides an image fusion acquisition system comprising meteorological parameters, which can store images containing multiple meteorological parameters and further improve the accuracy of the captured images.
The technical solution of the utility model is:
An image fusion acquisition system comprising meteorological parameters includes a fusion acquisition unit and a server unit. The server unit includes a first controller, a first wireless communication device and at least one database; the first wireless communication device and the database are each connected to the first controller. The fusion acquisition unit includes a second controller and, connected to it, a second wireless communication device, a storage device, a camera, a positioning device, a timing device and multiple meteorological sensors. The second wireless communication device communicates with the first wireless communication device. The database stores images in a specific format that contains the weather information. According to the location identifier obtained from the positioning device and the time identifier generated by the timing device, the second controller associates the image data with the meteorological data, sends them to the server unit in the specific format, and they are stored in the database of the server unit. The specific format consists of a file header and a file body: the file header includes a file registration code and image parameters, the image parameters including image format, image size, image channels, reserved information bits, a time identifier field and a location identifier field; the file body includes the meteorological data and the image data.
Further, the camera is a camera, video camera or pan-tilt (gimbal) camera for capturing images.
Further, the meteorological sensors include a visibility meter, an illuminance sensor, a PM2.5 sensor and/or an automatic weather station.
Further, the positioning device includes a BeiDou satellite positioning system or a GPS positioning system.
Further, the connection between the second wireless communication device and the first wireless communication device is a wired connection, a WiFi network connection or a cellular connection.
The utility model also provides an image storage method comprising meteorological parameters based on the above system, including the following steps:
1) Data acquisition;
Acquire the image and the meteorological parameters of the same environmental scene, and store the data in the storage device according to the location identifier and time identifier at the meteorological collection point;
2) Meteorological data processing and association;
According to the location identifier obtained from the positioning device and the time identifier generated by the timing device, the second controller associates the image data with the meteorological data and generates an image containing the meteorological parameters; the generated image is sent to the server unit in the specific format and stored in the database of the server unit. The specific format consists of a file header and a file body: the file header includes a file registration code and image parameters, the image parameters including image format, image size, image channels, reserved information bits, a time identifier field and a location identifier field; the file body includes the meteorological data and the image data;
3) Image processing and classification;
3.1) After standardizing the meteorological data, obtain the weather data feature F_wea using a fully-connected network;
3.2) Construct an adaptive convolutional neural network using the weather data feature F_wea obtained in step 3.1);
3.3) Extract the remote sensing image feature F_rgb using the adaptive convolutional neural network constructed in step 3.2), and classify it with a SoftMax classifier;
3.4) Train and test the adaptive convolutional neural network, and use the trained network to classify remote sensing images;
4) Storage;
Accumulate the images containing meteorological parameters generated in the specific format in the database.
Further, step 3.1) is specifically: let the initial weather feature vector be F_wea^1 and let the fully-connected network have L layers; the transformation from layer l to layer l+1 is

$$F_{wea}^{l+1} = \mathrm{sigmoid}\bigl(W^{l+1} F_{wea}^{l} + b^{l+1}\bigr) \tag{1}$$

where W^{l+1} is the weight of layer l+1, taken as a random initial value; b^{l+1} is the bias vector of layer l+1, taken as a random initial value; F_wea^l is the output of layer l; F_wea^{l+1} is the output of layer l+1; and sigmoid is the activation function;
Iterating formula (1) gives the output F_wea^{L+1} of layer L+1; this output is the final weather data feature F_wea.
Further, step 3.2) is specifically: let the convolution kernel parameters of layer l of the original convolutional neural network be W_conv^l; the weather data feature F_wea obtained in step 3.1) is used to weight the convolution kernel parameters, giving the new convolution kernel parameters Ŵ_conv^l as follows:

$$\hat{W}_{conv}^{l} = W_{conv}^{l} \odot \mathrm{reshape}\bigl(W_{transfer}\, F_{wea}\bigr) \tag{2}$$

where W_transfer is a transformation matrix, reshape is a reshaping function, and ⊙ denotes element-wise multiplication; the Ŵ_conv^l obtained from formula (2) is the adaptive version of the original convolution kernel parameters W_conv^l.
Further, step 3.3) is specifically: the adaptive convolutional neural network is a multi-layer structure in which each layer consists of three operations, convolution, activation and pooling; the computation from layer l to layer l+1 is obtained by

$$C_{k}^{l+1} = W_{k}^{l+1} * Z^{l} + b_{k}^{l+1} \tag{3}$$
$$A_{k}^{l+1} = \max\bigl(0,\; C_{k}^{l+1}\bigr) \tag{4}$$
$$Z^{l+1} = \mathrm{pooling}\bigl(A^{l+1}\bigr) \tag{5}$$

where formula (3) is the convolution operation, formula (4) is the activation operation and formula (5) is the pooling operation; in formula (3), C_k^{l+1} is the convolution output of the k-th filter in layer l+1, W_k^{l+1} is the k-th filter in layer l+1, b_k^{l+1} is the bias of the k-th filter in layer l+1 and Z^l is the output of layer l; in formula (4), A_k^{l+1} is the output of the activation operation in layer l+1, and max denotes taking the maximum; in formula (5), Z^{l+1} is the overall output of layer l+1 and pooling denotes the pooling operation;
The input of the first layer of the convolutional neural network is the RGB image I_rgb, so Z^1 = I_rgb, and the convolution kernel parameters Ŵ_conv^l of the adaptive layer l are those obtained in step 3.2). Through layer-by-layer forward propagation, the output Z^{L+1} of the last layer is obtained; this output is the final remote sensing image feature F_rgb, which is then classified with a SoftMax classifier.
Further, step 3.4) is specifically:
3.4a) Training: the parameters of the fully-connected network of step 3.1) and of the adaptive convolutional neural network of step 3.2) are trained on the collected data set; the training method is the error backpropagation algorithm, and the data set used is the training set;
3.4b) Testing: the pictures in the test set and the corresponding weather data are input into the trained overall network, and the classification accuracy of the overall network is computed from the difference between the predicted class and the actual class; if R is the number of correctly classified images and R_total is the total number of test samples, the classification accuracy is

$$\mathrm{accuracy} = \frac{R}{R_{total}} \times 100\% \tag{6}$$

3.4c) Classification: an arbitrary remote sensing image and the corresponding weather data are input into the network, which outputs the remote sensing scene class of that image.
The advantages of the utility model are as follows:
1. The utility model proposes a meteorological-parameter fused image method and system with time identifiers and location identifiers, which present the actual conditions of a scene more comprehensively and accurately, and allow the user to further process the image according to the shooting time, location and meteorological parameters.
2. While acquiring the grayscale information of a scene, the system and method of the utility model can also obtain all-round meteorological image information of the same scene, such as temperature, humidity, brightness and air pressure. Because the captured image data are affected by this weather information, an image captured together with it is a complete expression of the scene.
3. The image format of the utility model is a custom ZCP format that contains the time identifier, location identifier, image data, meteorological parameter data and so on; it enables a vehicle-mounted meteorological image fusion acquisition unit and lets the user obtain comprehensive scene information quickly and conveniently.
4. The utility model constructs a convolutional neural network whose parameters are adaptively adjusted according to the weather features. By using the weather features together with the image features, it overcomes the drawback that conventional methods are limited by environmental influences such as illumination, so that the expression of the scene is more refined, the learned features generalize better and the accuracy of scene classification is improved.
5. Besides the brightness values of the collected scene image, the utility model also takes into account the environmental information at the time the image was taken; in this way the ambiguity problems of scene perception and understanding can be effectively avoided.
6. The utility model breaks through the limitation of the image information expression modes in existing methods. Through multi-feature fusion it obtains a correct expression of the image scene and overcomes the difficulties that remote sensing ground objects are complex and inter-class similarity is high; it can be used for geographic conditions surveying, military reconnaissance, environmental monitoring and so on.
Description of the drawings
Fig. 1 is a structural diagram of the system of the utility model;
Fig. 2 is a flow chart of the method of the utility model for generating an image containing meteorological parameters;
Fig. 3 is a structural diagram of the ZCP storage format of the utility model for images containing meteorological parameters;
Fig. 4 is a process frame diagram of the data association method of the utility model;
Fig. 5 is a schematic diagram of the utility model extracting weather features with a fully-connected neural network;
Fig. 6 is a schematic diagram of the utility model constructing an adaptive convolutional neural network with the weather features;
Fig. 7 is a schematic diagram of the utility model extracting image features with the adaptive convolutional neural network.
Specific embodiment
The technical solution of the utility model is described clearly and completely below with reference to the accompanying drawings of the specification.
The utility model provides an image fusion acquisition system and an image storage method comprising meteorological parameters, which record the image information and at the same time obtain the weather information of the same scene at the moment of shooting, such as temperature, humidity, air pressure, rainfall, wind speed, wind direction, visibility and illuminance, together with the grayscale/spectral information of the photographed scene. Since the image data are affected by this weather information, only an image that contains the environmental weather information is a complete image expression. The utility model applies research with interdisciplinary, high-resolution, fast and non-destructive characteristics to computer vision tasks and changes the traditional understanding of images by improving the way images are expressed. On the one hand, the utility model can serve the design of spectral imaging systems; on the other hand, it can promote the development of computer vision technology. It can be applied in fields such as precision agriculture and intelligent transportation, and can drive the development of disciplines such as optical imaging systems, computer vision, intelligent driving and robotics; it therefore has great scientific and economic value.
The utility model provides an image fusion acquisition system comprising meteorological parameters, which can be deployed in a monitoring system for any environment, such as outdoor remote monitoring systems for road traffic, construction sites or forest parks. The system monitors the weather information and image information of the outdoor scene, and the server unit acquires and processes the data, improving the accuracy and completeness of the monitoring images.
As shown in Fig. 1, the image fusion acquisition system comprising meteorological parameters includes a server unit and a fusion acquisition unit. The server unit includes a first controller, a first wireless communication device and multiple databases; the first wireless communication device and the databases are each connected to the first controller, and the databases store images in the specific format containing the weather information. The fusion acquisition unit includes a second controller and, connected to it, a second wireless communication device, a storage device, a camera, multiple meteorological sensors, a positioning device and a timing device; the second wireless communication device communicates with the first wireless communication device. The fusion acquisition unit captures image data of the environmental scene and acquires environmental weather data, associates the image data with the meteorological data according to the obtained location identifier and the generated time identifier, and sends them to the server unit in the specific format. The fusion acquisition unit can classify the environmental weather data and image data by location and time: specifically, meteorological data within a predetermined distance range and a predetermined period are associated with the image data, the associated data are turned into the image format containing meteorological parameters according to the specific format, and the result is sent to the first wireless communication device through the second wireless communication device. The first controller stores the received images in the corresponding database according to their geographic location or time relationship.
The camera is a camera, video camera or pan-tilt (gimbal) camera for capturing images within a particular range; it can shoot images continuously or at specific time intervals. The meteorological sensors are sensors for visibility, illuminance, wind speed, wind direction, PM2.5, temperature, humidity, atmospheric pressure, ultraviolet light or rainfall, specifically a visibility meter, an illuminance sensor, a PM2.5 sensor or an automatic weather station. The positioning device can provide BeiDou satellite positioning or GPS positioning. The connection between the second wireless communication device and the first wireless communication device is a wired connection, a WiFi network connection or a cellular connection. Within the fusion acquisition unit, the components exchange data over a bus or a serial interface such as RS232 or RS485.
The second controller can parse and process the meteorological-format data sent by the automatic weather station or other meteorological acquisition systems, and at the same time controls the sensor states, the camera state, and the operating modes of the second wireless communication device, the positioning device and the timing device. The second controller converts the meteorological parameters acquired by the meteorological sensors into a group of meteorological data in the specific format, processes the images captured by the camera into the required formats such as jpg or bmp, converts the time information provided by the timing device into a time identifier and the location information provided by the positioning device into a location identifier, and then converts the meteorological data and the image data into the specific format according to the time identifier and the location identifier.
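As an illustration of the association just described, the following Python sketch shows one possible way a second controller could convert a GPS fix and a timestamp into 8-byte identifiers and keep only the weather records that fall within a predetermined distance range and period around an image. Every name and threshold here (WeatherRecord, associate, the 1 km and 10 minute limits, the identifier encodings) is a hypothetical assumption for illustration, not something specified in the patent.

```python
# Hypothetical sketch of the association step; not the patent's actual firmware.
import struct
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

@dataclass
class WeatherRecord:
    time: datetime
    lat: float
    lon: float
    values: dict          # e.g. {"temperature": 21.5, "humidity": 0.63, ...}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def associate(image_time, image_lat, image_lon, records,
              max_km=1.0, max_dt=timedelta(minutes=10)):
    """Keep only the weather records within the assumed distance and time window."""
    return [r for r in records
            if haversine_km(image_lat, image_lon, r.lat, r.lon) <= max_km
            and abs(r.time - image_time) <= max_dt]

def time_identifier(t: datetime) -> bytes:
    """8-byte time identifier (encoding assumed: Unix seconds, little-endian)."""
    return struct.pack("<q", int(t.timestamp()))

def location_identifier(lat: float, lon: float) -> bytes:
    """8-byte location identifier (encoding assumed: two 32-bit floats)."""
    return struct.pack("<ff", lat, lon)
```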
The processing functions of the second controller can also be performed on the server unit: the server unit can be configured to acquire meteorological data from the meteorological acquisition devices and positioning and image data from intelligent terminals at specific periods. In the server unit, the images from the intelligent terminals accumulated within the monitoring range of each meteorological acquisition device are associated and stored in the database.
An image storage method comprising meteorological parameters, as shown in Fig. 2, includes the following steps:
1) Data acquisition;
Acquire the image and the meteorological parameters of the same environmental scene, and store the data in the storage device according to the location identifier and time identifier at the meteorological collection point;
2) Meteorological data processing and association;
According to the location identifier obtained from the positioning device and the time identifier generated by the timing device, the second controller associates the image data with the meteorological data and generates an image containing the meteorological parameters; the generated image is sent to the server unit in the specific format and stored in the database of the server unit. The specific format consists of a file header and a file body: the file header includes a file registration code and image parameters, the image parameters including image format, image size, image channels, reserved information bits, a time identifier field and a location identifier field; the file body includes the meteorological data and the image data;
3) Image processing and classification;
3.1) After standardizing the meteorological data, obtain the weather data feature F_wea using a fully-connected network;
3.2) Construct an adaptive convolutional neural network using the weather data feature F_wea obtained in step 3.1);
3.3) Extract the remote sensing image feature F_rgb using the adaptive convolutional neural network constructed in step 3.2), and classify it with a SoftMax classifier;
3.4) Train and test the adaptive convolutional neural network, and use the trained network to classify remote sensing images;
4) Storage;
Accumulate the images containing meteorological parameters generated in the specific format in the database.
The data association of the utility model is implemented in the following specific steps:
Step 3.1: extract the weather data feature using a fully-connected network;
As shown in Fig. 5, a total of 34 kinds of weather conditions are collected in the utility model.
The initial weather feature is therefore a 34-dimensional vector in which each element is 1 or 0, indicating whether the corresponding weather condition is present. Because there is strong correlation between the various weather conditions, the utility model feeds the initial weather feature into a fully-connected network to obtain the final weather feature. Let the initial weather feature vector be F_wea^1 ∈ R^34 (R denotes the real numbers, so F_wea^1 is a 34-dimensional real vector) and let the fully-connected network have L layers; then the transformation from layer l to layer l+1 is

$$F_{wea}^{l+1} = \mathrm{sigmoid}\bigl(W^{l+1} F_{wea}^{l} + b^{l+1}\bigr) \tag{1}$$

where W^{l+1} is the weight of layer l+1 and b^{l+1} is the bias vector of layer l+1, both taken as random initial values; F_wea^l is the output of layer l; F_wea^{l+1} is the output of layer l+1; and sigmoid denotes the activation function;
Repeating the above process L times gives the output F_wea^{L+1} of layer L+1; this output is the final network output F_wea. The parameters W and b are initialized randomly;
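A minimal NumPy sketch of formula (1) as reconstructed above, assuming a small two-layer network, random initial weights and a 64-dimensional output; the layer sizes and names are illustrative assumptions only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Layer widths: 34-dimensional 0/1 weather vector in, 64-dimensional feature out (assumed).
layer_sizes = [34, 64, 64]                     # L = 2 fully-connected layers (assumed)
W = [rng.standard_normal((m, n)) * 0.1 for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]
b = [np.zeros(m) for m in layer_sizes[1:]]

def weather_feature(f0):
    """Apply formula (1) layer by layer: F^{l+1} = sigmoid(W^{l+1} F^l + b^{l+1})."""
    f = f0
    for Wl, bl in zip(W, b):
        f = sigmoid(Wl @ f + bl)
    return f                                    # final weather feature F_wea

f0 = rng.integers(0, 2, size=34).astype(float)  # multi-hot vector of present weather conditions
F_wea = weather_feature(f0)
```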
Step 3.2: construct the adaptive convolutional neural network using the weather data feature generated in step 3.1;
As shown in Fig. 6, a convolutional neural network is a multi-layer structure in which each layer consists of three operations: convolution, activation and pooling. The parameters involved are the convolution kernels W_conv and the biases b_conv, whose initial values are generated randomly. Let the convolution kernel parameters of layer l of the original convolutional neural network be W_conv^l.
The weather feature F_wea generated in step 3.1 can then be used to weight the convolution kernel parameters to obtain the new convolution kernel parameters Ŵ_conv^l as follows:

$$\hat{W}_{conv}^{l} = W_{conv}^{l} \odot \mathrm{reshape}\bigl(W_{transfer}\, F_{wea}\bigr) \tag{2}$$

where W_transfer is a transformation matrix: because the dimension of F_wea is generally different from that of W_conv^l, and element-wise multiplication requires the two operands to have the same dimension, the transformation matrix and the reshape function are introduced together to bring them to the same dimension. reshape is the reshaping function; the joint effect of the transformation matrix and the reshaping function is to transform the weather feature vector F_wea into a matrix whose dimension matches that of W_conv^l, and ⊙ denotes element-wise multiplication. The Ŵ_conv^l obtained from formula (2) is exactly the adaptive version of the original convolution kernel parameters W_conv^l; compared with the original kernel, the new kernel can effectively incorporate the weather information and extract features with more semantics from the image;
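The following NumPy sketch illustrates formula (2): a transformation matrix maps the weather feature onto one value per kernel entry, reshape matches the kernel's shape, and element-wise multiplication yields the adaptive kernel. The kernel and feature sizes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

F_wea = rng.random(64)                        # weather feature from step 3.1 (random stand-in)
W_conv = rng.standard_normal((16, 3, 3, 3))   # layer-l kernels: 16 filters, 3 channels, 3x3 (assumed)

# W_transfer maps the weather feature onto one value per kernel entry.
W_transfer = rng.standard_normal((W_conv.size, F_wea.size)) * 0.01

# Formula (2): W_hat = W_conv ⊙ reshape(W_transfer · F_wea)
W_hat = W_conv * (W_transfer @ F_wea).reshape(W_conv.shape)
```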
Step 3.3: extract the deep features of the remote sensing image using the adaptive convolutional neural network constructed in step 3.2;
As shown in Fig. 7, the adaptive convolutional neural network is a multi-layer structure in which each layer consists of three operations, convolution, activation and pooling; the computation from layer l to layer l+1 is obtained by

$$C_{k}^{l+1} = W_{k}^{l+1} * Z^{l} + b_{k}^{l+1} \tag{3}$$
$$A_{k}^{l+1} = \max\bigl(0,\; C_{k}^{l+1}\bigr) \tag{4}$$
$$Z^{l+1} = \mathrm{pooling}\bigl(A^{l+1}\bigr) \tag{5}$$

where formula (3) is the convolution operation, formula (4) is the activation operation and formula (5) is the pooling operation. In formula (3), C_k^{l+1} is the convolution output of the k-th filter in layer l+1, W_k^{l+1} is the k-th filter in layer l+1, b_k^{l+1} is the bias of the k-th filter in layer l+1 and Z^l is the output of layer l. In formula (4), A_k^{l+1} is the output of the activation operation in layer l+1 and max denotes taking the maximum. In formula (5), Z^{l+1} is the overall output of layer l+1 and pooling denotes the pooling operation; since the image feature finally obtained in the utility model should be a feature vector, the pooling operation of the last layer of the convolutional neural network is global average pooling.
The input of the first layer of this convolutional neural network is the RGB image I_rgb, so Z^1 = I_rgb; the convolution kernel parameters Ŵ_conv^l of the adaptive layer l are those obtained in step 3.2. Through layer-by-layer forward propagation (L layers in total), the output Z^{L+1} of the last layer is obtained; this output is the final remote sensing image feature F_rgb of the utility model, which is then classified with a SoftMax classifier, thereby achieving the purpose of classifying the remote sensing image.
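Below is a compact NumPy sketch of one adaptive layer following formulas (3)-(5): convolution with weather-weighted kernels, ReLU activation (the max in formula (4)), 2x2 max pooling, global average pooling to obtain the feature vector, and a SoftMax classifier. The shapes, the single-layer depth and the 10-class output are illustrative assumptions, not the patent's actual architecture.

```python
import numpy as np

def conv2d(Z, W, b):
    """Formula (3): valid cross-correlation of input Z (C,H,W) with kernels W (K,C,kh,kw)."""
    K, C, kh, kw = W.shape
    _, H, Wd = Z.shape
    out = np.zeros((K, H - kh + 1, Wd - kw + 1))
    for k in range(K):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[k, i, j] = np.sum(W[k] * Z[:, i:i + kh, j:j + kw]) + b[k]
    return out

def relu(C):                       # formula (4): A = max(0, C)
    return np.maximum(0.0, C)

def max_pool2(A):                  # formula (5): 2x2 max pooling
    K, H, W = A.shape
    A = A[:, :H - H % 2, :W - W % 2]
    return A.reshape(K, H // 2, 2, W // 2, 2).max(axis=(2, 4))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(2)
I_rgb = rng.random((3, 32, 32))                       # Z^1 = input RGB image (assumed 32x32)
W_hat = rng.standard_normal((16, 3, 3, 3))            # weather-adapted kernels from step 3.2 (stand-in)
b_conv = np.zeros(16)

Z2 = max_pool2(relu(conv2d(I_rgb, W_hat, b_conv)))    # one adaptive layer
F_rgb = Z2.mean(axis=(1, 2))                          # global average pooling -> feature vector
W_cls = rng.standard_normal((10, F_rgb.size)) * 0.1   # SoftMax classifier, 10 assumed scene classes
probs = softmax(W_cls @ F_rgb)
```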
Step 3.4: train and test the adaptive convolutional neural network, and classify remote sensing images with the trained network. Each collected scene image has corresponding weather data and a scene class label; the collected data are first divided into two parts, a training set and a test set;
(3.4a) Training: the network consists of two sub-network modules, the fully-connected network module of step 3.1 and the adaptive convolutional neural network module of step 3.2. The parameters of both modules are trained on the data set collected by the utility model; the training method is the error backpropagation algorithm, and the data set used is the training set;
(3.4b) Testing: the pictures in the test set and the corresponding weather data are input into the trained overall network, and the classification accuracy of the overall network is computed from the difference between the predicted class and the actual class. If R is the number of correctly classified images and R_total is the total number of test samples, the classification accuracy is (see the short sketch after step 3.4c below)

$$\mathrm{accuracy} = \frac{R}{R_{total}} \times 100\% \tag{6}$$
(3.4c) Classification: an arbitrary remote sensing image and the corresponding weather data are input into the network, which outputs the remote sensing scene class of that image.
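Formula (6) in code, assuming a hypothetical network_predict function that returns the predicted class for an image and its weather data:

```python
def accuracy(network_predict, test_set):
    """test_set: list of (image, weather, true_label) tuples; network_predict is assumed."""
    correct = sum(1 for img, wea, label in test_set if network_predict(img, wea) == label)
    return correct / len(test_set) * 100.0      # accuracy = R / R_total x 100%
```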
Fig. 3 shows the structure of the ZCP storage format of the utility model for images containing meteorological parameters. The ZCP storage file format includes header information, stores the image data captured by the camera in the chosen image file format, and records the environmental weather parameters as well as the shooting location and time information of the captured image. Specifically, the specific format consists of a file header and a file body. The file header is 100 bytes, divided into the file registration code (bytes 0-19), the image parameters including image format, image size and image channels (bytes 20-31), reserved information bits (bytes 32-51), the time identifier field (bytes 52-59) and the location identifier field (bytes 60-67). The file body is divided into meteorological data and image data: the meteorological data, including visibility, temperature, humidity, wind speed, wind direction, illuminance, atmospheric pressure and so on, occupy bytes 68-99, and the image data are stored in binary format from byte 100 to the end of the file.
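Based on the byte layout described above (registration code 0-19, image parameters 20-31, reserved bits 32-51, time identifier 52-59, location identifier 60-67, meteorological data 68-99, image data from byte 100), the following sketch packs a ZCP-style record with Python's struct module. The concrete field encodings (the "ZCP1" registration string, float widths, endianness) are not specified in the patent text and are assumptions.

```python
import struct
from datetime import datetime, timezone

def pack_zcp(jpeg_bytes: bytes, width: int, height: int, channels: int,
             shot_time: datetime, lat: float, lon: float,
             visibility: float, temperature: float, humidity: float,
             wind_speed: float, wind_dir: float, illuminance: float,
             pressure: float) -> bytes:
    header = bytearray(100)
    header[0:20] = b"ZCP1".ljust(20, b"\x00")                      # file registration code (0-19), value assumed
    header[20:32] = struct.pack("<4sHHHxx", b"JPG\x00",            # image parameters (20-31):
                                width, height, channels)           # format, size, channels
    # bytes 32-51 stay zero: reserved information bits
    header[52:60] = struct.pack("<q", int(shot_time.timestamp()))  # time identifier (52-59)
    header[60:68] = struct.pack("<ff", lat, lon)                   # location identifier (60-67)
    header[68:96] = struct.pack("<7f", visibility, temperature, humidity,
                                wind_speed, wind_dir, illuminance, pressure)  # meteorological data (68-99)
    # bytes 96-99 stay zero: padding inside the meteorological block
    return bytes(header) + jpeg_bytes                              # image data from byte 100 to end of file

record = pack_zcp(b"\xff\xd8...example...\xff\xd9", 1920, 1080, 3,
                  datetime.now(timezone.utc), 34.23, 108.91,
                  visibility=9.8, temperature=21.5, humidity=0.63,
                  wind_speed=3.2, wind_dir=270.0, illuminance=5300.0, pressure=1012.6)
```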
The program of the utility model can be stored on any kind of non-transitory computer-readable medium and provided to a computer, and the program can be carried by electronic signals, optical signals, radio signals or a computer-readable storage medium. Non-transitory computer-readable media include any kind of tangible storage medium, such as magnetic storage media (floppy disks, magnetic tape, hard disk drives, etc.), magneto-optical storage media (e.g. magneto-optical disks), compact disc read-only memory (CD-ROM), CD-R, CD-R/W, and semiconductor memories (mask read-only memory, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), flash ROM, random access memory (RAM), etc.). The program may also be supplied to a computer via any kind of transitory computer-readable medium, including electric signals, optical signals and electromagnetic waves, over a wired communication line (e.g. electric wires or optical fibre) or a wireless communication line.
The computed meteorological parameters of the scene are accumulated in a meteorological database, the meteorological parameters are associated with the captured image, and the image is combined with the meteorological parameters, the time identifier and the location identifier and stored together in the specific format for later viewing or processing. The system includes a meteorological acquisition device containing a processor and a controller with instruction storage; when the instructions are executed by the controller, multiple sensor data are acquired in real time and the multiple meteorological parameters are processed into a meteorological data group with a time identifier. The controller processes the collected information into data in the specific format. The communication device is configured to communicate with the server unit via any one of near-field communication (NFC), Bluetooth, radio frequency identification (RFID) and WiFi, to transmit data and instructions in the specific format.
In an embodiment in which the fusion acquisition system of the utility model is applied to a vehicle-mounted device and an intelligent terminal, the vehicle-mounted acquisition device can monitor visibility, temperature, humidity, wind speed and wind direction at any time and combine them with the images taken by the user through the intelligent terminal, so that the user grasps the information of the photographed scene more comprehensively and can further process the image in combination with the meteorological parameters.

Claims (5)

1. An image fusion acquisition system comprising meteorological parameters, characterized in that it includes a fusion acquisition unit and a server unit;
The server unit includes a first controller, a first wireless communication device and at least one database; the first wireless communication device and the database are each connected to the first controller;
The fusion acquisition unit includes a second controller and, connected to it, a second wireless communication device, a storage device, a camera, a positioning device, a timing device and multiple meteorological sensors; the second wireless communication device communicates with the first wireless communication device;
The database stores images in a specific format containing the weather information; the second controller sends data to the server unit in the specific format, where they are stored in the database of the server unit;
The specific format consists of a file header and a file body; the file header includes a file registration code and image parameters, the image parameters including image format, image size, image channels, reserved information bits, a time identifier field and a location identifier field; the file body includes meteorological data and image data.
2. The image fusion acquisition system comprising meteorological parameters according to claim 1, characterized in that the camera is a camera, video camera or pan-tilt (gimbal) camera for capturing images.
3. The image fusion acquisition system comprising meteorological parameters according to claim 1 or 2, characterized in that the meteorological sensors include a visibility meter, an illuminance sensor, a PM2.5 sensor and/or an automatic weather station.
4. The image fusion acquisition system comprising meteorological parameters according to claim 3, characterized in that the positioning device includes a BeiDou satellite positioning system or a GPS positioning system.
5. The image fusion acquisition system comprising meteorological parameters according to claim 4, characterized in that the connection between the second wireless communication device and the first wireless communication device is a wired connection, a WiFi network connection or a cellular connection.
CN201820313952.2U 2018-03-07 2018-03-07 Image fusion acquisition system comprising meteorological parameters Active CN208335208U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201820313952.2U CN208335208U (en) 2018-03-07 2018-03-07 Image fusion acquisition system comprising meteorological parameters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201820313952.2U CN208335208U (en) 2018-03-07 2018-03-07 Image fusion acquisition system comprising meteorological parameters

Publications (1)

Publication Number Publication Date
CN208335208U true CN208335208U (en) 2019-01-04

Family

ID=64782675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201820313952.2U Active CN208335208U (en) 2018-03-07 2018-03-07 Image fusion acquisition system comprising meteorological parameters

Country Status (1)

Country Link
CN (1) CN208335208U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111209980A (en) * 2019-12-25 2020-05-29 深圳供电局有限公司 Environment detection method and device, electronic equipment and computer readable storage medium
CN111209980B (en) * 2019-12-25 2024-02-09 深圳供电局有限公司 Environment detection method and device, electronic equipment and computer readable storage medium
CN112649900A (en) * 2020-11-27 2021-04-13 上海眼控科技股份有限公司 Visibility monitoring method, device, equipment, system and medium
CN115290526A (en) * 2022-09-29 2022-11-04 南通炜秀环境技术服务有限公司 Air pollutant concentration detection method based on data analysis

Similar Documents

Publication Publication Date Title
CN108537122B (en) Image fusion acquisition system containing meteorological parameters and image storage method
CN112990262B (en) Integrated solution system for monitoring and intelligent decision of grassland ecological data
US10817731B2 (en) Image-based pedestrian detection
Corcoran et al. Automated detection of koalas using low-level aerial surveillance and machine learning
CN108037770B (en) Unmanned aerial vehicle power transmission line inspection system and method based on artificial intelligence
US10922585B2 (en) Deterministic labeled data generation and artificial intelligence training pipeline
US10255525B1 (en) FPGA device for image classification
CN208335208U (en) Image fusion acquisition system comprising meteorological parameters
US11256926B2 (en) Method and system for analyzing the movement of bodies in a traffic system
CN106462737A (en) Systems and methods for haziness detection
CN114912707B (en) Air quality prediction system and prediction method based on multi-mode fusion
CN109376660A (en) A kind of target monitoring method, apparatus and system
CN106295605A (en) Traffic lights detection and recognition methods
CN112649900A (en) Visibility monitoring method, device, equipment, system and medium
CN112528912A (en) Crop growth monitoring embedded system and method based on edge calculation
CN106453523A (en) Intelligent weather identification system and method
CN112613438A (en) Portable online citrus yield measuring instrument
CN111369760A (en) Night pedestrian safety early warning device and method based on unmanned aerial vehicle
CN108932474A (en) A kind of remote sensing image cloud based on full convolutional neural networks compound characteristics sentences method
CN111929672A (en) Method and device for determining movement track, storage medium and electronic device
WO2022107620A1 (en) Data analysis device and method, and program
US10489923B2 (en) Estimating conditions from observations of one instrument based on training from observations of another instrument
CN117423077A (en) BEV perception model, construction method, device, equipment, vehicle and storage medium
CN114782903A (en) ESN-based highway river-crossing grand bridge group fog recognition and early warning method
CN116486635A (en) Road dough fog detection and early warning method, system, storage medium and terminal

Legal Events

Date Code Title Description
GR01 Patent grant