WO2014056473A2 - Method for image processing, method for automatic object recognition performable therewith, and observation device and method for highly accurate trajectory tracking of launching rockets at great distances - Google Patents
- Publication number
- WO2014056473A2 (PCT/DE2013/000569)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- raster
- image information
- scene
- image processing
- Prior art date
Classifications
- G06T5/70—Denoising; Smoothing
- G06T5/20—Image enhancement or restoration using local operators
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/90—Determination of colour characteristics
- G06V10/255—Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
- G06V20/13—Satellite images
- G06V20/653—Three-dimensional objects by matching three-dimensional models, e.g. conformal mapping of Riemann surfaces
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/951—Computational photography systems by using two or more images to influence resolution, frame rate or aspect ratio
- G06T2200/28—Indexing scheme involving image processing hardware
- G06T2207/10004—Still image; Photographic image
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10036—Multispectral image; Hyperspectral image
- G06T2207/30212—Military
- G06V10/243—Aligning, centring, orientation detection or correction of the image by compensating for image skew or non-uniform image deformations
Definitions
- the present invention relates to a method for image processing according to the preamble of patent claim 1. It further relates to a method for automatic object recognition according to the preamble of patent claim 4 and observation devices according to claim 6 and claim 7.
- The trajectory of the object (for example a rocket) can be measured so accurately that stable tracking by the observation device is possible and that the flying object can be engaged on its trajectory.
- Observation stations must be stationed close enough to the rocket's flight route that they always have the flying missile in view above the horizon. This not only requires great effort, but is often not sufficiently possible for political reasons.
- Radar stations can determine the position of targets at a distance of 1000 km only with an accuracy of several kilometers transversely to the viewing direction and can measure their radar cross-section, but they cannot make an accurate identification.
- Decoys are usually not distinguishable from real warheads by radar, which causes great problems.
- Satellite systems in orbit are known that can detect launching missiles in the mid-infrared range with telescopes. Since the detection must be made from above against the warm earth background, with a multitude of hard-to-detect false targets, these systems have to contend with comparatively low sensitivity. Owing to the observation from above, the sensor also sees only the less bright and strongly fluctuating part of the engine jet. As a result, their measuring accuracy is limited to a few hundred meters and cannot be significantly improved for systemic reasons. From the unpublished DE 10 2011 010 337, a camera system is known for detecting and tracking moving objects located at a great distance, operating from a high-flying aircraft above the dense atmosphere. This system has the great advantage of observing launching missiles practically from outside the atmosphere, from below against the background of cold space.
- The core area of the engine jet has a hundred times higher luminance than the more distant jet and is fixed at the nozzle exit, so it does not fluctuate. Thus an ideal, extremely bright (1 megawatt/m²) point light source with a diameter of only a few meters is available for track tracing, one that accurately and steadily follows the trajectory of the rocket.
- One object of the invention is to provide a sensor that can locate and track this point light source over a distance of up to 1500 km with an accuracy of a few meters.
- The multispectral camera can sequentially acquire multispectral images (e.g., at 700 nm, 800 nm, and 950 nm) of a scene by means of a motor-driven filter wheel having at least three narrowband (e.g., 20 nm) transmission filters. From these, by conversion according to the blackbody radiation laws, a temperature image of the scene with a resolution of, e.g., 50 K can be calculated.
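The conversion from narrowband radiances to a temperature image can be sketched with two-colour ratio pyrometry under the Wien approximation of Planck's law. This is an illustrative reconstruction, not the patented algorithm; the function name and the use of only two of the three bands are assumptions.

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant c2 = h*c/k, in m*K

def temperature_from_ratio(L1, L2, lam1, lam2):
    """Estimate a blackbody temperature from radiances L1, L2 measured in
    two narrow bands at wavelengths lam1, lam2 (in meters).

    Wien approximation: L(lam, T) ~ lam**-5 * exp(-C2 / (lam * T)), so the
    band ratio L1/L2 can be solved for T in closed form.
    """
    ratio = L1 / L2
    num = C2 * (1.0 / lam2 - 1.0 / lam1)
    den = np.log(ratio) + 5.0 * np.log(lam1 / lam2)
    return num / den
```

With a third band (e.g. 950 nm) the same relation gives a second independent estimate, which can be used as a consistency check against non-blackbody emitters.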
- The approximately 2300 K hot core area of a solid rocket, with its characteristic shape, can thus clearly be distinguished from the 2100 K hot core area of a liquid rocket with a different shape; with sufficient optical resolution of the camera (1 m to 2 m at 1000 km distance), the size of the core area and its temperature distribution can also be measured.
- Military solid rockets can thereby be distinguished from civil liquid rockets, and different rocket types can be distinguished by the size, number and arrangement of the engines.
- This camera system has a camera provided with a long-focal-length camera lens, which is arranged on a position-stabilized platform.
- This camera is equipped with a high-speed shutter and a first and a second image sensor.
- The light radiation captured by the camera optics can be directed either to the first or the second image sensor, one of the image sensors being associated with an additional telephoto optic.
- the camera optics further comprises a pivotable mirror, with which it is possible to scan an area line by line by pivoting the mirror, wherein the captured image signal is fed to one of the image sensors. If a target object is detected during this scanning process, the light beam is deflected to the other image sensor, which is then used for object identification and, if appropriate, for target tracking.
- The object of the present invention is to provide a method for image processing that makes reliable imaging possible even over long distances. A further object is to use this image processing method to perform automatic object recognition.
- The inventive method designed in this way improves the signal-to-noise ratio of the image information in the image processing device, the processing being carried out in the following sub-steps:
- step b2) covering each raster pixel with the central raster filter element of a raster filter having an odd number of rows and an odd number of columns;
- step b3) determining the brightness values behind each raster filter element, each raster filter element having an individual light attenuation property;
- step b4) summing the brightness values determined in step b3) into a sum brightness value and associating this sum brightness value with the raster pixel covered by the central raster filter element;
- step c) generating a result image with the same screening as the raw image from the individual sum brightness values of the raster image elements obtained in step b).
- In this way, the brightness gradient of the raw image is smoothed. Furthermore, noise pixels that stand out from the background of the raw image are removed, which smooths the image further.
- Compared with the raw image, the brightness curve in the result image is continuous and differentiable, and the image contrast is improved, so that an object contained in the raw image emerges more clearly in the result image.
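Steps b2) to b4) amount to a weighted neighborhood summation with an odd-sized attenuation kernel. The following is a minimal sketch; the 5 × 5 kernel with full transmission at the centre and 21 % at the other elements uses illustrative values, not the patented matrix.

```python
import numpy as np

def raster_filter(raw, kernel):
    """Steps b2)-b4): for every raster pixel, sum the brightnesses of its
    neighbourhood weighted by the raster-filter transmittances, and assign
    the sum to the central pixel. The kernel must have odd rows/columns."""
    kr, kc = kernel.shape
    assert kr % 2 == 1 and kc % 2 == 1
    pr, pc = kr // 2, kc // 2
    # edge padding so border pixels also get a full neighbourhood
    padded = np.pad(raw, ((pr, pr), (pc, pc)), mode="edge")
    out = np.zeros_like(raw, dtype=float)
    for i in range(raw.shape[0]):
        for j in range(raw.shape[1]):
            out[i, j] = np.sum(padded[i:i + kr, j:j + kc] * kernel)
    return out

# Illustrative 5x5 attenuation matrix: transmittance 1.00 at the centre,
# 0.21 elsewhere (assumed values).
kernel = np.full((5, 5), 0.21)
kernel[2, 2] = 1.00
```

Step c) is implicit: the output array has the same rastering as the raw image, each element holding the sum brightness value of its raster pixel.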
- Image processing is characterized in that in step a) the image information of the scene is detected in more than one electromagnetic wavelength range in order to obtain raw images of the scene in different spectral ranges; that the steps b) and c) are performed for all raw images of the scene in order to obtain result images of different spectral regions, and that the result images of the different spectral regions are combined by superimposition into a multispectral result image.
- A multispectral image with filters selected to match the temperature of the observed body (e.g., 2300 K), on the short-wave flank of the blackbody radiation curve, can be used to convert the multispectral color image into a temperature image.
- This temperature image makes it possible to find and track a small, stable core temperature range within a much larger, possibly locally brighter, strongly fluctuating background brightness field, such as the fire tail of a missile.
- For the automatic object recognition, the following steps are performed in addition to the image processing: the detection of the scene in step a) is carried out at different angles of rotation about the optical axis of the optical device;
- the individual result images are compared with pattern images of individual objects stored in an object database;
- the pattern image with the least deviation from one or more of the result images identifies the object contained in the scene and determines the position of the object in the result image.
- This enables reliable object identification. Furthermore, by means of this automatic object recognition method, the position of the object in the result image can be determined, and thus a directional vector of the movement of the object (e.g., a rocket) can be predicted with greater accuracy than in the prior art from a single detected and analyzed scene.
- The comparison is performed between raster elements of the result image and corresponding raster elements of the pattern image.
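The rotation-and-compare search can be sketched as follows. The twelve rotational positions and the smallest-deviation criterion follow the text; the nearest-neighbour rotation, the function name, and the dictionary of reference images are assumptions for illustration.

```python
import numpy as np

def best_match(result_img, references, n_rot=12):
    """Compare a result image, at n_rot rotation angles about the image
    centre, against stored pattern images; return (name, angle_deg, score)
    for the smallest raster-element difference sum. Nearest-neighbour
    rotation keeps the sketch dependency-free; a real system would
    interpolate."""
    h, w = result_img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    best = (None, None, float("inf"))
    for k in range(n_rot):
        ang = 2 * np.pi * k / n_rot
        ys, xs = np.mgrid[0:h, 0:w]
        # rotate the sampling coordinates about the image centre
        ry = cy + (ys - cy) * np.cos(ang) - (xs - cx) * np.sin(ang)
        rx = cx + (ys - cy) * np.sin(ang) + (xs - cx) * np.cos(ang)
        ry = np.clip(np.round(ry).astype(int), 0, h - 1)
        rx = np.clip(np.round(rx).astype(int), 0, w - 1)
        rotated = result_img[ry, rx]
        for name, ref in references.items():
            score = np.abs(rotated - ref).sum()
            if score < best[2]:
                best = (name, 360.0 * k / n_rot, score)
    return best
```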
- The part of the object relating to the observation device is achieved by the observation device with the features of claim 6 and also by the observation device with the features of claim 7. While the observation device with the features of claim 6 is designed to realize the inventive method for image processing, the observation device according to claim 7 is adapted to perform the inventive method for automatic object recognition using the inventive image processing method.
- It is advantageous if the image processing device has an image rasterization module and a raster filter module.
- The image rasterization module has a matrix-like arrangement of light-guiding elements arranged between the optical device and a sensor sensitive to the detected radiation. At least some of the light guides are each associated with a brightness-reducing raster filter element of the raster filter module.
- The optical device is designed such that it images the acquired image information as a raw image in an entrance plane of the image rasterization module, and it is further configured such that the raw image is displaceable within this entrance plane relative to the image rasterization module.
- A computer unit is provided which receives a brightness signal from the sensor and on which software runs that implements method step c) of claim 1 and preferably the method steps of one of claims 2 to 5.
- This advantageous embodiment of the observation device implements the method steps according to the invention in an opto-mechanical manner. Whenever "brightness" is used in this document, it is not limited to the spectrum of visible light, but also includes the intensity of the radiation in a non-visible spectrum such as in the infrared spectrum or in the ultraviolet spectrum, but without being limited thereto.
- The method steps according to the invention can also be implemented in software. For this purpose, the suitable observation device is characterized in that an image sensor is arranged downstream of the optical device, that the optical device is designed such that it images the acquired image information onto a sensor plane of the image sensor, that a computer unit is provided which receives an image signal from the image sensor, and that software runs in the computer unit which implements method steps b) and c) of claim 1 and preferably the method steps of one of claims 2 to 5, the image rasterization module and the raster filter module being designed as subroutines of the software.
- This advantageous embodiment of the observation device implements the method steps according to the invention in an optoelectronic manner.
- The composite multispectral images are produced, after the preparation of the raw single images, from a larger number of superimposed individual images. With a sufficient number of superimposed individual images, preferably more than 100, the multispectral image then has a much better signal-to-noise ratio than the raw images, owing to the averaging over the many frames.
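The superposition step can be sketched as a simple frame average; for uncorrelated sensor noise, averaging N registered frames improves the signal-to-noise ratio by roughly √N (so more than 100 frames gives about a tenfold improvement). The function name is an assumption.

```python
import numpy as np

def average_frames(frames):
    """Superimpose a list of registered single images (2-D arrays of equal
    shape) by averaging. Uncorrelated, zero-mean noise shrinks by about
    sqrt(N) for N frames, while the static scene content is preserved."""
    return np.mean(np.stack(frames), axis=0)
```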
- The composite multispectral images are preferably evaluated with a multispectral image evaluation and identification method in the image evaluation device 25, 125.
- An observation of the target behavior is preferably carried out first, and in particular the number and the trajectories of the visible objects are determined. A file of all flying objects and their trajectories is then created.
- the composite multispectral images can also be used to observe and analyze the behavior of the objects (separation of a missile upper stage, ejection of decoys, flight maneuvers of a warhead) over time.
- The composite multispectral images are further subjected to target image recognition, which can recognize imaged targets as target objects of a certain type.
- The target image recognition can work more reliably and with sharper discrimination when the composite multispectral images are subjected to multi-stage preparation prior to processing. For this, the composite multispectral images are first converted into a normalized form by forming, for each pixel, the orthogonally (vectorially) added total brightness as the brightness value, and then normalizing all the color components with this total brightness. The entire color vector then consists of the brightness and the normalized color values. In this way, a color coordinate system with any number of spectral components can be defined, and in this system all color operations that are defined in the RGB system for only three colors can be performed multispectrally.
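The normalization just described can be sketched for an N-band image as follows; "orthogonally added" is read here as the Euclidean vector norm of the spectral components, which is an interpretation, not a quote from the patent.

```python
import numpy as np

def normalize_colour(pixels):
    """Convert an (H, W, N)-band multispectral image into a total
    brightness per pixel (Euclidean norm of the N spectral components)
    plus the components normalised by that brightness, so the full colour
    vector is (brightness, normalised components)."""
    brightness = np.sqrt(np.sum(pixels ** 2, axis=-1))
    safe = np.where(brightness > 0, brightness, 1.0)  # avoid 0/0
    normalised = pixels / safe[..., None]
    return brightness, normalised
```

The normalised components form a unit vector per pixel, so colour comparisons become independent of overall intensity, with any number of spectral bands rather than just three.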
- For all color components and the brightness of each image pixel, an operation (FIG. 2) is performed in which the image is simultaneously smoothed and differentiated, in which interference pixels still remaining after the averaging over a plurality of images are removed, and in which brightness transitions and edges are accentuated.
- The image is also subjected to an affine color transformation in which the spectral components characterizing the target are stretched, and thus better evaluated, while atypical spectral components are compressed. This gives the correlation of the result images with the real target objects hidden in the images a higher selectivity between real targets and false targets that could be confused with them than without this image preparation.
- The multispectral image recognition can be used either with the opto-mechanically based device (FIG. 3) or with the digitally based device (FIG. 4).
- The telescope 110 generates, via the deflection mirror 112, a real 25 × 25 pixel target image of a distant target in the plane 121 of the front surface of the 5 × 5 optical fiber bundle of the image capture device 120, which occupies the same area as 5 × 5 pixels of the real target image.
- The scanning mirror 112 deflects the target image horizontally and vertically so that each center pixel of each 5 × 5 pixel block sweeps sequentially over the central light guide of the 5 × 5 optical fiber bundle.
- The twenty-five values for the 5 × 5 pixel blocks of each 25 × 25 pixel image are stored in the computing unit 126 for all spectral regions. This is repeated for twelve rotational positions over 360° of the 25 × 25 pixel image. Search areas of 15 × 15 pixel blocks from the target images are compared, for the value of each center pixel of each 5 × 5 pixel block, with the sought 15 × 15 pixel reference images, the differences of the nine coefficient values of the input-image search area and the current reference image being formed in each case.
- the position resolution of the image recognition is five pixels horizontally and vertically.
- In the digitally based device (FIG. 4), the telescope 110 generates, via the deflection mirror 112, a real 25 × 25 pixel target image of a remote illuminated target in the image plane of the image capture device 120, which comprises, for example, an NIR (near-infrared) camera.
- the camera converts the light signal into a digital multispectral image with high resolution.
- characteristic values of a weighting function according to the invention are calculated in the search image (25 ⁇ 25 pixels in size) for each search pixel position, as described above. Due to the described design of the evaluation function, the number of rotational positions to be examined can be limited to twelve without loss of selectivity.
- target detection, trajectory tracking, trajectory measurement and target observation and target identification of launching missiles can be performed even after engine burnout at distances up to 500 km.
- FIG. 1A shows a raster filter matrix of the observation device according to FIG. 1;
- FIG. 2 shows a second embodiment of an observation device for carrying out the image processing method according to the invention;
- FIG. 2A shows a raster filter matrix of the observation device according to FIG. 2;
- Fig. 3 shows a first embodiment of an observation device for carrying out the automatic object recognition method according to the invention;
- Fig. 4 shows a second embodiment of an observation device for carrying out the automatic object recognition method according to the invention.
- Fig. 1 shows an observation device according to the invention with an optical device 1 and an image processing device 2.
- the optical device 1 has a telescope unit 10 with a long-focal-length objective, which is shown only schematically in the figure.
- The telescope unit 10 captures electromagnetic radiation S of an observed scene both in the visible spectrum of light and outside it, for example infrared and/or ultraviolet radiation.
- The radiation S captured by the telescope 10 is passed to a deflection mirror 12, which is driven by a drive 14, shown only schematically in Fig. 1, to perform a two-dimensional scanning movement.
- The deflection of the deflection mirror 12 takes place in an angular range, which is defined in Fig. 1 by way of example by the upper boundary line a and the lower boundary line a' and by a first lateral boundary line b and a second lateral boundary line b' of the central beam SM.
- The deflection mirror 12 performs line-by-line scanning of the radiation S captured by the telescope 10 and in each case projects a portion of the target image Z', determined by the area Z, onto an entrance plane E of a rasterization module 20 of the image processing device 2.
- the image screening module 20 has a matrix-like arrangement of light-guiding elements, not shown in detail in FIG. 1, which open at one end in the input plane E.
- a likewise matrix-type raster filter 22 is provided, which is shown only schematically in FIG. 1 and which is reproduced in cross-section in FIG. 1A as a filter matrix.
- This raster filter 22 contains a central raster filter element 22 "and a plurality of further individual raster filter elements 22 'surrounding it, which are each assigned to one of the further light-guiding elements surrounding the central light-guiding element.
- The rasterization module 20 thus consists of five by five (i.e., twenty-five) light guides, to each of which a raster filter element 22', 22" is assigned.
- Each further raster filter element 22' is assigned a factor which reflects the light transmittance of that individual raster filter element.
- the central raster filter element 22 " which is assigned to the central light-guiding element, has a light transmittance of 1.00, which corresponds to 100% .
- the light guided through the central light-guiding element is thus not attenuated by the raster filter 22.
- The light guided through the elements surrounding the central light-guiding element is attenuated by the raster filter; a factor of, for example, 0.21 corresponds to 21% light transmission.
- The light emitted from the individual light-guiding elements of the image rasterization module 20 through the raster filter 22 impinges on a sensor 24, formed for example by a photomultiplier tube, which forms the sum signal of the light radiation emerging through the individual raster filter elements.
- This brightness or radiation intensity sum signal Ss is forwarded by the sensor 24 to a computer unit 26 of the image processing device 2.
- the sum of the radiation intensities of the neighboring pixels of the matrix shown in FIG. 1A is detected by the sensor 24 for each target image pixel, wherein the respective target image pixel forms the central element of the matrix.
- a target image pixel corresponds to a grid element of the grid of the light guide elements
- Each target image pixel thus in turn becomes the central element of the matrix, and the luminances of the pixels surrounding it are summed according to the matrix weights and assigned to it, so that a sum brightness value results for each target image pixel.
- In the computer unit 26, after a complete scan of the deflection mirror 12, all stored sum brightness values of the individual target image pixels are combined again into a result image, which is then output via an output interface 27, for example on a display device 28.
- the formation of the result image from the stored sum brightness values of the individual target image pixels takes place in such a way that the stored sum brightness value of a target image pixel is assigned to the same position in the result image that the target image pixel assumed in the target image.
- Owing to the light amplification described, this output result image is not only brighter than the target image Z' originally projected onto the entrance plane E; due to the different weighting of the respective neighboring pixels according to the matrix of FIG. 1A, the result image is also smoothed and no longer contains noise pixels that emerge from the background.
- the brightness curve of the result image is continuous, so that the result image is differentiable.
- the image contrast is also improved by the described method.
- Although the brightness filter matrix (raster filter 22) is given here with 5 × 5 raster filter elements and the target image Z' spanned by the scan area is assumed to be 25 × 25 pixels, the invention can also be realized for all other resolutions of the target image and of the brightness filter matrix or the matrix of light-guiding elements.
- FIG. 2 shows a modified variant of the observation device shown in FIG. 1.
- The radiation S received from the scene to be observed is captured by the telescope unit 110 of the optical device 101 and guided by means of a deflection mirror 112, which in contrast to FIG. 1 is not movable, onto an image sensor 121 (for example a CCD sensor) of an image capture device 120, which is part of the image processing device 102; the target image Z' of the observed scene is mapped completely onto the image sensor 121.
- the image capture device 120 converts the optical target image Z 'projected onto the image sensor 121 into a digital image, which is forwarded in the form of an image file to a computer unit 126.
- On the computer unit 126 runs software that prepares the obtained digital image analogously to the optical image processing method described in connection with Fig. 1.
- The recorded target image Z' is converted into a raster image and processed raster element by raster element (pixel by pixel), the brightness value of each target pixel being combined with those of its neighboring pixels according to the filter matrix shown in Fig. 2A.
- Result images of different spectral ranges are superimposed in the computer unit 26, 126 to form a multi-spectral image, which is output via the output interface 27, 127.
- Fig. 3 shows an observation device as already described with reference to Fig. 1.
- The image rasterization module 20 with the light-guiding elements is rotatable about an axis of rotation X parallel to the longitudinal direction of the light-guiding elements in predetermined angular increments (for example, 12 angular steps over 360°), as symbolically represented by the arrow 23.
- The telescope 10 generates via the deflection mirror 12, for example, a 25 × 25 pixel target image Z', which is shown in Fig. 3 as a matrix with 5 × 5 fields, each field having the size of 5 × 5 pixels or raster elements and thus the same size and number of raster elements as the raster filter 22.
- The deflection mirror 12 deflects the radiation captured by the telescope 10 horizontally and vertically such that each center pixel of each 5 × 5 pixel block is guided sequentially over the central light-guiding element of the image rasterization module 20.
- the brightness values for the twenty-five target picture elements z, which result in accordance with the image processing described in connection with FIG. 1, are stored in the computer unit 26.
- the image processing is preferably carried out not only in the visible light range, but also analogously for other spectral ranges.
- the result values for the other spectral ranges are also stored in the computer unit 26.
- the rasterization module 20 is rotated further by one angular step and the described processing steps are repeated. When this procedure has been carried out for all angular steps, i.e. for a complete revolution of the rasterization module 20, the target image, reduced to the elements z m of the defined search image area, is compared with reference images of the same size and the same resolution stored in a memory device 29.
- the position and rotational position at which the smallest difference between the search image and a stored reference image occurs is registered as the position and rotational position of a target of a reference image class contained in the monitored scene.
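The difference comparison between the reduced search image and the stored reference images, over all positions and rotational steps, amounts to keeping the candidate with the smallest sum of absolute differences. A minimal sketch of that principle, with all data, names, and the candidate dictionary assumed for illustration:

```python
def sad(a, b):
    # Sum of absolute differences between two equally sized 2-D grids.
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def best_match(search_images, reference_image):
    # search_images: {(position, angular_step): grid} gathered over a
    # full revolution of the rasterization module (illustrative data).
    best_key, best_diff = None, float("inf")
    for key, img in search_images.items():
        d = sad(img, reference_image)
        if d < best_diff:
            best_key, best_diff = key, d
    return best_key, best_diff

ref = [[1, 2], [3, 4]]
candidates = {
    ((0, 0), 0): [[9, 9], [9, 9]],
    ((1, 2), 3): [[1, 2], [3, 5]],   # differs from ref by 1 in one pixel
}
pos_rot, diff = best_match(candidates, ref)
```

The registered result is the key (position, rotational step) of the candidate with the minimal difference sum.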
- An imaging process is thereby created. Fig. 4 shows a further variant of an observation device according to the invention.
- It corresponds in its construction to the observation device described in connection with Fig. 2, so that the same reference numerals in Fig. 4 denote the same components as in Fig. 2.
- In addition, the embodiment of Fig. 4 has a reference image memory in the memory device 129, which communicates with the computer unit 126 for data exchange.
- the computer unit 26, 126 and the memory device 29, 129 form an image evaluation device 25 within the image processing device 2.
- the following characteristic values are calculated in the target image Z '(25 ⁇ 25 pixels) for each search pixel position Pz:
- the average of the individual normalized spectral components and of the total brightness, weighted according to the matrix of the raster filter 122, is calculated for each of the nine marked 5 × 5 pixel blocks z m and for all four central pixels 123 of the corner pixel blocks. In addition, for each corner pixel block, the average of the average values of the eight 5 × 5 pixel blocks arranged in a ring around it is calculated.
- These characteristic values are compared by difference formation with the values for the searched reference target image, and the value set with the smallest absolute difference sum is registered as representative of this search pixel.
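The per-block characteristic value (spectral components and total brightness averaged with the raster-filter weights) can be illustrated as a weighted mean over a 5 × 5 block; the uniform weight matrix used here is an assumption for demonstration only, not the actual matrix of the raster filter 122.

```python
def weighted_block_average(block, filter_matrix):
    # Weighted mean brightness of a 5 x 5 pixel block; the weights stand in
    # for a raster-filter matrix (concrete values assumed for illustration).
    num = sum(filter_matrix[r][c] * block[r][c]
              for r in range(5) for c in range(5))
    den = sum(filter_matrix[r][c] for r in range(5) for c in range(5))
    return num / den

uniform = [[1] * 5 for _ in range(5)]
block = [[2] * 5 for _ in range(5)]
avg = weighted_block_average(block, uniform)  # uniform weights -> plain mean
```

The same computation would be repeated per spectral range, yielding one value set per search pixel position.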
- the target image is now decomposed into smaller subareas and, in each subarea, the search pixel with the smallest difference sum is determined.
- the value set of the search pixel with the smallest difference sum is interpreted as a recognized target image, and the considered search pixel position, with pixel resolution, together with the rotational position is registered as the discovered target.
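The two-stage search described above, decomposing the target image into subareas and keeping the search pixel with the smallest difference sum in each, might look like the following sketch; the grid of difference sums and the subarea size are assumed values.

```python
def min_per_subarea(diff_sums, sub_size):
    # diff_sums: 2-D grid of difference sums, one per search pixel position.
    # Returns, for each sub_size x sub_size subarea, the position of the
    # search pixel with the smallest difference sum (sketch only).
    rows, cols = len(diff_sums), len(diff_sums[0])
    results = []
    for r0 in range(0, rows, sub_size):
        for c0 in range(0, cols, sub_size):
            best = min((diff_sums[r][c], (r, c))
                       for r in range(r0, min(r0 + sub_size, rows))
                       for c in range(c0, min(c0 + sub_size, cols)))
            results.append(best[1])
    return results

grid = [[5, 4, 9, 9],
        [6, 7, 1, 9],
        [9, 9, 9, 0],
        [9, 9, 9, 9]]
mins = min_per_subarea(grid, 2)  # one minimum position per 2 x 2 subarea
```

Each returned position marks the candidate target location within its subarea; the global minimum across all subareas would then be taken as the recognized target.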
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Remote Sensing (AREA)
- Astronomy & Astrophysics (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020157012406A KR20150072423A (ko) | 2012-10-12 | 2013-10-08 | 이미지 프로세싱 방법 및 발사된 로켓의 경로를 추적하는 방법 |
JP2015535987A JP6285940B2 (ja) | 2012-10-12 | 2013-10-08 | 画像処理方法、該画像処理方法と共に実行される対象物自動検出方法、観測装置、及び、発射されたロケットの飛翔経路を遠距離から高精度で追跡する方法 |
EP13794796.6A EP2907105A2 (de) | 2012-10-12 | 2013-10-08 | Verfahren zur bildaufbereitung und verfahren zur bahnverfolgung von raketen |
US14/435,020 US9300866B2 (en) | 2012-10-12 | 2013-10-08 | Method for image processing and method that can be performed therewith for the automatic detection of objects, observation device and method for high-precision tracking of the course followed by launched rockets over large distances |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102012020104 | 2012-10-12 | ||
DE102012020104.4 | 2012-10-12 | ||
DE102012022045.6A DE102012022045A1 (de) | 2012-10-12 | 2012-11-09 | Verfahren zur Bildaufbereitung und damit durchführbares Verfahren zur automatischen Objekterkennung sowie Beobachtungsvorrichtung und Verfahren zur hochgenauen Bahn-Verfolgung startender Raketen auf große Entfernungen |
DE102012022045.6 | 2012-11-09 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2014056473A2 true WO2014056473A2 (de) | 2014-04-17 |
WO2014056473A3 WO2014056473A3 (de) | 2014-07-17 |
Family
ID=50383016
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE2013/000569 WO2014056473A2 (de) | 2012-10-12 | 2013-10-08 | Verfahren zur bildaufbereitung und damit durchführbares verfahren zur automatischen objekterkennung sowie beobachtungsvorrichtung und verfahren zur hochgenauen bahnverfolgung startender raketen auf grosse entfernungen |
Country Status (6)
Country | Link |
---|---|
US (1) | US9300866B2 (de) |
EP (1) | EP2907105A2 (de) |
JP (1) | JP6285940B2 (de) |
KR (1) | KR20150072423A (de) |
DE (1) | DE102012022045A1 (de) |
WO (1) | WO2014056473A2 (de) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111797872A (zh) * | 2019-04-09 | 2020-10-20 | 深圳市家家分类科技有限公司 | 控制方法、电子装置、计算机可读存储介质及降解设备 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015019474A1 (ja) * | 2013-08-08 | 2015-02-12 | 株式会社島津製作所 | 画像処理装置 |
US11587323B2 (en) * | 2019-06-28 | 2023-02-21 | Raytheon Company | Target model broker |
CN115147313B (zh) * | 2022-09-01 | 2022-12-30 | 中国科学院空天信息创新研究院 | 椭圆轨道遥感图像的几何校正方法、装置、设备及介质 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011010337A1 (de) | 2011-02-04 | 2012-08-09 | Eads Deutschland Gmbh | Kamerasystem zur Erfassung und Bahnverfolgung von in großer Entfernung befindlichen bewegten Objekten |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4686646A (en) * | 1985-05-01 | 1987-08-11 | Westinghouse Electric Corp. | Binary space-integrating acousto-optic processor for vector-matrix multiplication |
JPH04164275A (ja) * | 1990-10-26 | 1992-06-09 | Hamamatsu Photonics Kk | 遠隔位置の光学観測装置 |
US5299275A (en) * | 1993-03-31 | 1994-03-29 | Eastman Kodak Company | Optical fiber filter for reducing artifacts in imaging apparatus |
US6965685B1 (en) * | 2001-09-04 | 2005-11-15 | Hewlett-Packard Development Company, Lp. | Biometric sensor |
DE102011010339A1 (de) | 2011-02-04 | 2012-08-09 | Eads Deutschland Gmbh | Luftraumüberwachungssystem zur Erfassung von innnerhalb eines zu überwachenden Gebiets startenden Raketen sowie Verfahren zu Luftraumüberwachung |
2012
- 2012-11-09 DE DE102012022045.6A patent/DE102012022045A1/de not_active Withdrawn
2013
- 2013-10-08 WO PCT/DE2013/000569 patent/WO2014056473A2/de active Application Filing
- 2013-10-08 JP JP2015535987A patent/JP6285940B2/ja active Active
- 2013-10-08 US US14/435,020 patent/US9300866B2/en active Active
- 2013-10-08 EP EP13794796.6A patent/EP2907105A2/de not_active Withdrawn
- 2013-10-08 KR KR1020157012406A patent/KR20150072423A/ko not_active Application Discontinuation
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011010337A1 (de) | 2011-02-04 | 2012-08-09 | Eads Deutschland Gmbh | Kamerasystem zur Erfassung und Bahnverfolgung von in großer Entfernung befindlichen bewegten Objekten |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111797872A (zh) * | 2019-04-09 | 2020-10-20 | 深圳市家家分类科技有限公司 | 控制方法、电子装置、计算机可读存储介质及降解设备 |
CN111797872B (zh) * | 2019-04-09 | 2023-08-01 | 深圳市家家分类科技有限公司 | 控制方法、电子装置、计算机可读存储介质及降解设备 |
Also Published As
Publication number | Publication date |
---|---|
US20150281572A1 (en) | 2015-10-01 |
WO2014056473A3 (de) | 2014-07-17 |
US9300866B2 (en) | 2016-03-29 |
JP2015537290A (ja) | 2015-12-24 |
JP6285940B2 (ja) | 2018-02-28 |
KR20150072423A (ko) | 2015-06-29 |
DE102012022045A1 (de) | 2014-04-17 |
EP2907105A2 (de) | 2015-08-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2002103385A1 (de) | Verfahren zur bereitstellung von bildinformationen | |
DE102004018813A1 (de) | Verfahren zur Erkennung und/oder Verfolgung von Objekten | |
EP0355310A2 (de) | System zum Erkennen des Starts und des Anflugs von Objekten | |
DE3905591A1 (de) | Vorrichtung zur gewinnung kontrastreicher bilder | |
EP1460454B1 (de) | Verfahren zur gemeinsamen Verarbeitung von tiefenaufgelösten Bildern und Videobildern | |
DE202016007867U1 (de) | Steuerung des Sichtlinienwinkels einer Bildverarbeitungsplattform | |
WO2014056473A2 (de) | Verfahren zur bildaufbereitung und damit durchführbares verfahren zur automatischen objekterkennung sowie beobachtungsvorrichtung und verfahren zur hochgenauen bahnverfolgung startender raketen auf grosse entfernungen | |
DE10154861A1 (de) | Verfahren zur Bereitstellung von Bildinformationen | |
DE102018108936A1 (de) | Formmesssystem und Formmessverfahren | |
DE102005055879A1 (de) | Flugverkehr-Leiteinrichtung | |
DE102019008472B4 (de) | Multilinsen-Kamerasystem und Verfahren zur hyperspektralen Aufnahme von Bildern | |
DE102012020093A1 (de) | Anordnung zur Ortung, Erfassung und Überwachung von Eisbergen sowie Verfahren zur Bestimmung eines von treibenden Eisbergen ausgehenden Gefährdungspotentials für stationäre oder schwimmende Meeresbauwerke | |
CN108257090B (zh) | 一种面向机载行扫相机的高动态图像拼接方法 | |
DE102008015979A1 (de) | Bewegt-Bild-Verarbeitungssystem und Bewegt-Bild-Verarbeitungsverfahren | |
DE4205056A1 (de) | Verfahren und system zum abtasten einer szene zu dem zweck des klassifizierens einer mehrzahl von sich bewegenden objekten | |
DE102016102610A1 (de) | Laserscanner | |
DE102012111199A1 (de) | Optische Vorrichtung mit multifokaler Bilderfassung | |
DE102006060612B4 (de) | Verfahren zur Überwachung von Zielobjekten und Multispektralkamera dazu | |
DE102017117212A9 (de) | System und Verfahren zur Stereotriangulation | |
EP3200149B1 (de) | Verfahren zum erkennen eines objekts in einem suchbild | |
DE102012020922A1 (de) | Laserscanner | |
DE102017006877A1 (de) | Vorrichtung und Verfahren zum Erfassen von Flugkörpern mit einem Stereokamerasystem und einem Hochfrequenzscanner | |
DE102021203812B4 (de) | Optische Messvorrichtung und Verfahren zum Bestimmen eines mehrdimensionalen Oberflächenmodells | |
DE102016125372A1 (de) | Verfahren zum Bestimmen der Position und/oder Orientierung durch gleichzeitige Aufnahme von Objektrauminformation aus erheblich unterschiedlichen Wellenlängenbereichen in einem Bild, digitales Kamerasystem und mobile elektronische Einrichtung | |
US20230370563A1 (en) | Multi-spectral and panchromatic imaging apparatus and associated system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13794796 Country of ref document: EP Kind code of ref document: A2 |
ENP | Entry into the national phase |
Ref document number: 2015535987 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14435020 Country of ref document: US |
REEP | Request for entry into the european phase |
Ref document number: 2013794796 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2013794796 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 20157012406 Country of ref document: KR Kind code of ref document: A |