GB2260188A - Target acquisition training apparatus - Google Patents

Target acquisition training apparatus

Info

Publication number
GB2260188A
GB2260188A
Authority
GB
United Kingdom
Prior art keywords
target
optical image
aimpoint
assessment
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB9220650A
Other versions
GB9220650D0 (en)
Inventor
Mark Tweedie
Thomas Samuel John Harvey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Short Brothers PLC
Original Assignee
Short Brothers PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Short Brothers PLC filed Critical Short Brothers PLC
Publication of GB9220650D0
Publication of GB2260188A
Legal status: Withdrawn

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 - Simulators for teaching or training purposes
    • G09B9/003 - Simulators for teaching or training purposes for military purposes and tactics
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41G - WEAPON SIGHTS; AIMING
    • F41G3/00 - Aiming or laying means
    • F41G3/26 - Teaching or practice apparatus for gun-aiming or gun-laying
    • F41G3/2605 - Teaching or practice apparatus for gun-aiming or gun-laying using a view recording device cosighted with the gun
    • F41G3/2611 - Teaching or practice apparatus for gun-aiming or gun-laying using a view recording device cosighted with the gun coacting with a TV-monitor
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41G - WEAPON SIGHTS; AIMING
    • F41G3/00 - Aiming or laying means
    • F41G3/26 - Teaching or practice apparatus for gun-aiming or gun-laying
    • F41G3/2616 - Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device
    • F41G3/2622 - Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device for simulating the firing of a gun or the trajectory of a projectile
    • F41G3/2627 - Cooperating with a motion picture projector
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782 - Systems for determining direction or deviation from predetermined direction
    • G01S3/785 - Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786 - Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system, the desired condition being maintained automatically
    • G01S3/7864 - T.V. type tracking systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Apparatus for producing aimpoint assessment data in target acquisition training comprises a projector (17) which projects on to a screen (16) an optical image which includes a target (15) and which is generated from picture elements derived from a video signal source, a weapon or simulated weapon (14) provided for an aimer (11) who aims the weapon at the target (15), and a video camera (13) on the weapon which generates signals representative of a field of view about the aimpoint axis of the weapon (14). The projected image includes high visibility bar patterns (27, 28) which extend transverse to the scanning lines of the image on each side of the target and present a plurality of high visibility alternating white and black zones. A microprocessor computes from the camera video signals aimpoint assessment outputs measured with respect to the centroids of the high visibility zones.

Description

Target Acquisition Training Apparatus

The present invention relates to target acquisition training apparatus and is particularly, although not exclusively, concerned with weapon-fire training apparatus in which an optical image including one or more target images is projected on to a screen and in which the optical image is derived from a video signal source, that is to say, is built up from picture elements (pixels) produced by sequential scanning of a field along scanning lines which provide coverage of the field, the scanning of each line producing along the line, throughout one dimension of the field, a succession of picture elements which are derived from successive signal values of an optical image generating video signal.
In conventional multi-arms trainers, using optical images projected from video sources on to a wall or screen, the aimpoint accuracy achieved is usually limited by the pixel stability of the projected optical image.
Substantial pixel jitter can occur along the scanning lines in such systems, so that the accuracy achieved, when measured on a single video line, may be too low for effective aimpoint assessment in training. Furthermore, the projected optical image can be subject to substantial line jitter.
It is one object of the present invention to provide in a weapon-fire training apparatus in which the optical image displaying one or more target images is derived from a video signal source, means whereby the adverse effects of pixel instability in the projected image can be reduced to provide for greater aimpoint assessment accuracy.
According to a first aspect of the present invention there is provided target acquisition training apparatus comprising projection means for projecting an optical image including one or more targets, the projection means including optical image generating means which generates the optical image from picture elements derived in succession from successive signal values of an optical image generating video signal, target acquisition means for simulated acquisition of a target by an aimer who brings an aimpoint axis of the target acquisition means on to the target, the target acquisition means having associated therewith a video camera which generates camera video signals representative of a field of view about the aimpoint axis, and computing means to produce from the camera video signals an output enabling an assessment to be made of the target acquisition skill of the aimer, characterised in that the optical image generating means provides for the inclusion in the optical image of a plurality of high visibility discrete zones spaced about the target, targets or each target and that the computing means includes aimpoint assessment output means responsive to the camera video signals representing the high visibility zones to produce aimpoint assessment outputs measured with respect to optical image reference coordinates which are based on the coordinates of centroids of the high visibility zones.
By target acquisition is meant the bringing of an aimpoint of a device such as a weapon or simulated weapon on to a target by an aimer of the device, or his attempts to do so, and may, but need not, include the firing or simulated firing of the device at the target.
In a preferred embodiment of the invention hereinafter to be described the aimpoint assessment output means produces aimpoint assessment outputs measured with respect to optical image reference coordinates which correspond to or are based on the average value of the coordinates of the centroids of the high visibility zones.
In an embodiment of the invention hereinafter to be described, picture elements which form the projected optical image are built up by sequential scanning of a field along scanning lines which provide coverage of the field with the scanning of each line producing along the line throughout one dimension of the field a succession of picture elements which are derived from successive signal values of the optical image generating video signal, wherein the high visibility zones are formed in the optical image on each side of the target, targets or each target in a high visibility bar pattern which extends transverse to the scanning lines, the high visibility discrete zones being formed as white sections of each bar pattern which alternate with black sections thereof and wherein the aimpoint assessment output means is responsive to the camera video signals to select from each scanning line in each white section a picture element or elements having a predetermined characteristic and a corresponding picture element or elements from each of the other lines in the white section to produce a centroid coordinate of the selected picture elements for each white section and to average the centroid coordinates to produce an optical image reference coordinate.
The aimpoint assessment output means may also be made responsive to the camera video signals to select predetermined lines of picture elements in each white section to produce a line centroid coordinate for the selected lines in each white section and to average the line centroid coordinates of the white sections to produce a further optical image reference coordinate.
In an embodiment of the invention hereinafter to be described the bar patterns are arranged on each side of the target, targets or each target, parallel to each other and at a fixed distance apart, and the computing means includes range aimpoint assessment output means which is responsive to the camera video signals representing the bar patterns to produce, from data representing the fixed distance apart of the two bar patterns, a range output representing the range of the aimer from the projected optical image.
The computing means may also include parallax error correction means responsive to the range output to calculate the parallax error and to produce corrected aimpoint assessment outputs to offset the parallax error.
The bar patterns may further be arranged parallel to each other, and the computing means may then include cant angle measuring means responsive to the camera generated video signals representing the bar patterns to produce a cant angle output representing the cant angle of the target acquisition means.
In the embodiment of the invention hereinafter to be described the target acquisition means takes the form of a weapon or simulated weapon which will usually include a sight for use by the aimer for simulated acquisition of a target.
According to a second aspect of the present invention there is provided a method of producing aimpoint assessment output data in target acquisition training comprising the steps of projecting an optical image which includes one or more targets and in which the image is generated from picture elements derived in succession from successive signal values of an optical image generating video signal, providing an aimer with target acquisition means for simulated acquisition of a target by the aimer who brings an aimpoint axis of the target acquisition means on to the target, providing in association with the target acquisition means a video camera which generates camera video signals representative of a field of view about the aimpoint axis of the target acquisition means and computing from the camera video signals an output enabling an assessment to be made of the target acquisition skill of the aimer, characterised by the steps of including in the optical image a plurality of high visibility discrete zones spaced about the target, targets or each target and computing from the camera video signals representing the high visibility zones aimpoint assessment outputs measured with respect to optical image reference coordinates which are based on the coordinates of centroids of the high visibility zones.
Preferably, the aimpoint assessment outputs are measured with respect to optical image reference coordinates which correspond to or are based on the average value of the coordinates of the centroids of the high visibility zones.
One embodiment of the invention will now be described by way of example with reference to the accompanying drawings, in which:
Fig 1 is a schematic perspective view of part of a multi-arms multi-lane weapon-fire training range embodying target acquisition training apparatus according to the invention;
Fig 2 is a schematic elevation of an optical image projected on to a projection screen of the range illustrated in Fig 1, including a plurality of targets for use in target acquisition training of a marksman utilising one of four firing lanes in the range, and including a high visibility bar pattern on each side of the targets for use in accordance with the invention;
Fig 3 is a block schematic diagram of the range illustrated in Fig 1, showing sub-assemblies of the target acquisition training apparatus according to the invention and their interconnections;
Fig 4 is a schematic side elevation of part of a weapon with a camera mounted on it for use in the target acquisition training apparatus illustrated in Figs 1 and 3;
Figs 5 and 6 are schematic plan views of displays provided by the target acquisition training apparatus shown in Figs 1 and 3 for target acquisition assessment monitoring; and
Fig 7 is a schematic diagram of waveforms of signals processed in the target acquisition training apparatus illustrated in Figs 1 to 6.
Referring first to Fig 1, the multi-lane multi-arms weapon-fire training range shown is an indoor 4-lane range for use by marksmen, only two of whom are schematically illustrated, indicated by reference numerals 11 and 12. As will be seen, the marksman 11 occupies an end lane L1 and is provided with a simulated weapon 14 which he directs at target images 15 projected on to a screen 16 by a video projector sub-assembly 17. The projector 17 provides full coverage of the screen 16 and presents target images 15 for each of the other lanes.
The marksman 12 is provided with a simulated weapon 18 for use against target images 15 displayed on the screen 16 in lane L2 by the projector 17. It is to be noted that the target acquisition training of the two marksmen 11 and 12 is provided in respect of weapons of different types, and the other lanes (not shown) can if desired be used by other marksmen training in target acquisition using simulated weapons of the same or other types.
The training range shown in Fig 1 is placed under the control of a controller 19 who is provided with target assessment displays on a monitor screen 20 of a master console sub-assembly 21. In addition each of the marksmen is provided with floor box sub-assemblies 22 and 23 which provide on monitors 24 and 25 information as to their own target acquisition performance.
Each of the simulated weapons 14 and 18 has mounted thereon a video camera 13 which generates camera video signals representative of a field of view about the aimpoint axis of the simulated weapon. Each of the floor box sub-assemblies 22 and 23, and each of the other two floor box sub-assemblies, includes a microprocessor for processing the camera video signals from the associated weapon to provide aimpoint assessment outputs to the monitors 24 and 25 and to the monitor 20 of the range controller's console sub-assembly 21, in accordance with the invention and as hereinafter described.
The target images 15 as displayed on the screen 16 in each of the lanes are part of an optical image field projected on to the screen by the video projector 17.
The image field is built up by sequential scanning by the projector 17 along horizontal scanning lines which provide coverage of the screen 16, the scanning of each line producing along the line, throughout the horizontal dimension of the screen, a succession of picture elements which are derived from successive signal values of the video signal of the projector 17. Such images are subject to pixel and line jitter as hereinbefore described.
Referring now to Fig 2, a part of the optical image field projected on to the screen 16 in the region of the target images 15 for lane L1 is shown. The target images 15 comprise outline target images 15A, 15B and 15C representing targets at successively increasing ranges, and at which the marksman 11 can aim using his simulated weapon 14 and make a strike at the target by simulated firing of the weapon 14. Included in the projected optical image field at the sides of the target images 15 are high visibility bar patterns 27 and 28 which extend vertically on the screen and therefore transverse to the horizontal scanning lines of the projected optical image field. Each of the bar patterns 27 and 28 is formed by white sections 29 alternating with black sections 30.
The white sections 29 form high visibility discrete zones which are formed by a succession of picture elements along successive scanning lines passing through the zone.
The video camera mounted on the simulated weapon 14 provides camera video output signals representative of a field of view about the aimpoint axis of the weapon 14 and views not only the target images 15, when the marksman brings the weapon sight on to the target images, but also the bar patterns 27 and 28. As a result, the camera video signals generated by the weapon video camera include not only signals representative of the target images 15 but also signals representative of the high visibility zones provided by the white sections 29 of the bar patterns 27 and 28.
The camera video signals from the camera mounted on the weapon 14 are transmitted for processing by a microprocessor in the floor box sub-assembly 22, which analyzes the camera video signals, detects the relative positions of the bar patterns 27 and 28 within the camera field of view and, from stored information of the target positions between the bar patterns, calculates the marksman's aimpoint with respect to the target images.
In particular, the microprocessor detects the location of peaks in the video signals corresponding to the white sections of the bar patterns 27 and 28, averages the peak coordinates per white section to calculate the centroid coordinates of each white section, and then averages the values of the coordinates for all the white sections to produce reference coordinates which are converted to an aimpoint assessment output. This averaging process, performed per video frame, dramatically reduces the adverse effects of pixel jitter on aimpoint measurement and thereby significantly increases the measurement accuracy.
Referring now to Fig 3, the sub-assemblies of the target acquisition training apparatus shown in Fig 1 are illustrated in block diagram form and include a screen sub-assembly 31 carrying the screen 16 and provided with speakers 32 and 33, the projector sub-assembly 17, weapon sub-assemblies 14 and 18 together with two further weapon sub-assemblies 34 and 35 for use by marksmen in the other two lanes of the range, floor box sub-assemblies 22 and 23 together with further floor box sub-assemblies 36 and 37, the electronics rack sub-assembly 26, the master control sub-assembly 21 and a compressor sub-assembly 38 which provides for the application to the simulated weapons of the weapon sub-assemblies 14, 18, 34 and 35 pneumatic pulses providing recoil of the simulated weapons when fired by the marksmen.
Pixel jitter can be regarded as contributing noise to the aimpoint measurement process, which shows up as a distribution of signals representing the optical image reference coordinates, per video frame. This distribution of signals is then averaged to reduce the noise. For instance, if the microprocessor selects a picture element having a predetermined peak characteristic in, say, a total of 10 consecutive camera video lines per "white" section of the bar pattern, and there are 4 "white" sections in each bar pattern, then the processor will calculate 40 x-coordinates per bar pattern. The pixel jitter will produce, for example in the case of a perfectly vertical bar pattern image, a distribution in the 40 x-coordinates for each bar pattern. However, the numerical averaging process over, for example, the above 40 x-coordinates will result in the pixel jitter induced noise in the x-direction being reduced from, eg, ±N pixels as measured on 1 camera video line, to ±N/√40 as measured over 40 lines in 1 complete video frame. The noise in the aimpoint assessment output is thus reduced by the square root of the number of samples in each video frame.
Averaging the 40 y-coordinates together would not, however, reduce jitter, since the highest and lowest y-coordinates being averaged would themselves be subject to jitter. Therefore, if there are 4 "white" sections in the example, the middle y-coordinate of each section is obtained (by averaging) and these centre y-coordinates are then averaged. This results in a jitter noise reduction in the y-coordinate of the aimpoint assessment output equal to the square root of twice the number of "white" sections, ie the y-coordinate jitter noise is reduced by a factor of about 2.8 in this example.
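Purely as an illustration of this averaging scheme (the patent describes hardware and a microcontroller, not any particular software), the following Python sketch computes the reference coordinates from assumed per-line peak coordinates; the section count, usable lines per section and jitter amplitude are assumed example values:

```python
# Illustrative sketch (not from the patent text) of the per-frame
# averaging that suppresses pixel jitter in the reference coordinates.
import random

def reference_coords(sections):
    """sections: list of per-white-section lists of (x, y) peak
    coordinates, one (x, y) per camera video line."""
    # x reference: centroid of each section's x values, then the
    # average of the section centroids.
    xs = [sum(x for x, _ in s) / len(s) for s in sections]
    # y reference: centre of each section, ie the mean of its top and
    # bottom line y-coordinates, then the average of those centres.
    ys = [(min(y for _, y in s) + max(y for _, y in s)) / 2 for s in sections]
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Simulate one bar pattern: 4 white sections, 10 usable lines each,
# with +/- 1 pixel of jitter on the per-line peak positions.
random.seed(0)
true_x = 250.0
sections = []
for sec in range(4):
    base_y = sec * 27  # sections spaced down the sensor
    sections.append([(true_x + random.uniform(-1, 1), base_y + line)
                     for line in range(10)])
x_ref, y_ref = reference_coords(sections)
print(f"x reference {x_ref:.2f} (true {true_x}), error {abs(x_ref - true_x):.3f} px")
```

Running this shows the averaged x reference sitting well within a tenth of a pixel of the true bar centre, even though each individual line sample jitters by up to a whole pixel.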
The microprocessor never actually measures the pixel and line jitter directly, but rather significantly reduces its adverse effect on aimpoint assessment accuracy, by producing reference coordinates per video frame, averaged over many video line samples of peaks in the white sections.
A typical optical sight used on a weapon would have a magnification factor of about 4x, and therefore a field of view of about plus and minus 80 mrads.
In the embodiment of the invention illustrated in Fig 1, it will be seen that the simulated weapon 14 is shown to represent a self-loading rifle, with the video camera 13 mounted on the rifle barrel. The mounting of the video camera 13 is shown more fully in Fig 4, to which reference is now made. As will be seen, the forward end of the rifle barrel 39 carries a combined camera and lens housing 40 supported on the barrel 39 by brackets 41 and 42. The housing includes a video camera unit 43 and a lens body 44 which presents to the camera 43 an image of a field of view about the optical axis 46 of the lens body 44, which is arranged to converge with the optical sight of the weapon 14 at a predetermined range of, say, 8 m of the marksman 11 from the screen 16.
It will be appreciated that to obtain the necessary measurement accuracy from the video camera 13, its field of view is necessarily much smaller than that of the optical sight, and represents only a portion of the aimer's normal field of view through an optical sight.
An example of a typical field of view and measurement accuracy for the weapon-mounted camera will hereinafter be discussed.
A high-resolution video/graphics projector 17 is used to project an optical video image on to the screen 16. The optical video image is derived from a video digitisation and storage board, which allows for superposition of representative target images 15 and bar patterns 27 and 28 on top of the background range image. This also allows the target images 15 to be positioned accurately at pre-designated points between the bar patterns. The measurement of the position of the bar patterns within the weapon camera's field of view thus allows the aimpoint with respect to the target to be determined, since the target images 15 are already at a known location between the bar patterns 27 and 28.
The projector 17 in the embodiment illustrated produces a 4-lane range video image on a 4 m wide by 2 m high screen, with the marksman positioned at, for example, 8 m from the screen 16. The projected image would typically be digitised to 1688 pixels horizontally by 928 lines vertically. Thus one pixel in the projected image would be nominally 2.37 mm wide on the screen and would subtend an angle of approximately 0.3 mrads at the marksman's position, 8 m from the screen. However, the required accuracy for good marksmanship training is about 0.1 mrads from point to point, or an error of plus or minus 0.05 mrads. Thus the projected image must be sampled only over a small area, with a high resolution video camera on the marksman's weapon, to produce the necessary measurement accuracy.
For example, if 0.1 mrads point-to-point measurement accuracy is required, using a CCD array sensor camera with 500 pixels horizontally, then (assuming sampling at the pixel frequency of the image produced by the CCD camera) the lens used must give a horizontal field of view of about 50 mrads. At 8 m, therefore, the camera must view a portion of the 4 m x 2 m video screen 16 of dimensions 400 mm horizontally x 300 mm vertically, taking account of the visual 4:3 aspect ratio of CCD sensors designed for use in normal TV cameras. For a 1/2" format CCD sensor, of dimensions 6.4 mm (H) by 4.8 mm (V), the required lens focal length is 125 mm.
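The arithmetic in the two paragraphs above can be checked with a short script. This is purely illustrative (no software of this kind appears in the patent); all figures are the example values quoted, and the simple similar-triangles focal-length estimate lands near, though not exactly on, the 125 mm stated:

```python
# Worked check of the quoted example figures. The focal-length formula
# is a similar-triangles estimate, so it gives ~128 mm against the
# 125 mm quoted in the text.
screen_w_mm = 4000.0
proj_px_h = 1688
eye_range_mm = 8000.0

pixel_w_mm = screen_w_mm / proj_px_h                 # ~2.37 mm per projected pixel
pixel_mrad = pixel_w_mm / eye_range_mm * 1000.0      # ~0.30 mrad subtense at 8 m

cam_px_h = 500                                       # CCD pixels horizontally
fov_mrad = 0.1 * cam_px_h                            # 0.1 mrad/pixel -> ~50 mrad FOV
fov_w_mm = fov_mrad / 1000.0 * eye_range_mm          # ~400 mm viewed on screen
sensor_w_mm = 6.4                                    # 1/2" format CCD, horizontal
focal_mm = sensor_w_mm * eye_range_mm / fov_w_mm     # ~128 mm (text quotes 125 mm)

print(f"{pixel_w_mm:.2f} mm/pixel, {pixel_mrad:.2f} mrad, "
      f"FOV {fov_w_mm:.0f} mm, focal length {focal_mm:.0f} mm")
```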
It is to be noted that the above assumes use of a pixel clock that corresponds exactly to the number of pixels on the CCD array. The pixel clock is used to count time along the video lines output from the camera, and thus pixel position. The peaks in the video image may thus be located as spatial coordinates on the CCD sensor, based on the timing measurements from the pixel clock. It is possible to increase the measurement accuracy by use of a stable pixel clock that is oversampled compared to the number of pixels on the CCD sensor, or alternatively to maintain the same angular accuracy but increase the camera's field of view in conjunction with the increased pixel clock. Essentially, for a fixed angular measurement accuracy, one can trade off pixel clock frequency against the camera lens focal length/field of view, so that the final system is not restricted to the figures given in the examples.
A typical example of bar pattern dimensions may be taken to be about 100 mm high on screen, spaced 200 mm apart, thus allowing plus or minus about 100 mm movement of the camera (for detection of both bar patterns in one image) to either side of the target, or an angle of plus or minus 12.5 mrads at 8 m from the screen. To put this in context, a typical standing man target at 100 m would subtend 4 mrads horizontally by 17 mrads vertically, or be displayed as 32 mm (H) by 136 mm (V) in this example, allowing the aimpoint to be detected over a considerable region round the target. This is a system requirement, to deal with weapon handling, trigger snatch, poor aiming, sight zeroing, simulated wind effects and the like.
Each of the bar patterns 27 and 28 may, for example, measure 100 mm (V) by 10 mm (H) on screen, and be composed of 4 white sections and 3 black sections, so that each white section would be a nominal 14 mm in height. If the weapon camera's vertical field of view is 300 mm at 8 m, then, with 585 active video lines for a conventional TV-type CCD sensor, each video line on the CCD sensor subtends 300/585 = 0.51 mm at 8 m. Therefore the white sections of the bar pattern would cover about 14 mm/0.51 mm = 27 video lines, as seen by the CCD sensor. It is to be noted that the video projector, with 928 lines in 2 m, or 2.16 mm per projected video line, would use 6 or 7 projector lines in representing the same white section of the bar pattern. The distinction between the pixels and lines used by the video projector and those used by the weapon-mounted camera is also to be noted.
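Again purely as an illustrative check, assuming the example dimensions just quoted, the line counts follow directly:

```python
# How many camera video lines and projector lines one white section of
# the bar pattern spans, using the assumed example figures above.
cam_fov_v_mm = 300.0                      # camera vertical FOV at 8 m
cam_lines = 585                           # active lines, TV-type CCD sensor
white_h_mm = 100.0 / 7                    # 4 white + 3 black sections in 100 mm

mm_per_cam_line = cam_fov_v_mm / cam_lines        # ~0.51 mm per camera line
print(white_h_mm / mm_per_cam_line)               # ~27-28 camera lines

mm_per_proj_line = 2000.0 / 928                   # ~2.16 mm per projector line
print(white_h_mm / mm_per_proj_line)              # ~6.6, ie 6 or 7 projector lines
```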
The microprocessor, using the CCD camera video output, attempts to find the peak coordinates, per camera video line, of each white section 29 of the bar pattern, per video frame. In this example, it could find a maximum of 27 peak coordinates per white section 29, if one peak position is detected on each of the 27 video lines that the white section subtends vertically on the CCD sensor.
However, the nature of the video projector will result in a fall-off in intensity (due to projector lens blurring effects) at the top and bottom of each white section, with the result that a peak may only be detected on, eg, 10 CCD lines. The use of 4 white sections then gives a total of about 40 peak coordinates detected per bar pattern.
The centroid of each white section 29 is computed, and the 4 centroids are averaged, to give a single averaged (x,y) coordinate per bar pattern. There are, for example, 40 coordinates averaged in the x (horizontal) direction and 8 coordinates in the y (vertical) direction. (There are two y-coordinates, top and bottom, per white section.) This means that any random oscillations of pixels that occur in the projected video image, due to projector electrical instabilities, can be averaged out, and their effect on aimpoint measurement dramatically reduced.
Consider, for example, a pixel in the projected video image, nominally 2.37 mm (H) and 2.16 mm (V) on the video screen, oscillating at worst by plus and minus its width, and plus and minus 0.5 times its height; the projector scanning lines are measurably much more stable vertically than the pixels are horizontally. At 8 m this gives oscillations of plus and minus 0.3 mrads (H) and 0.14 mrads (V). Clearly this would give unacceptable aimpoint measurement errors, compared to the necessary plus or minus 0.05 mrads, if measured from a single projector pixel. However, because of the averaging process, the aimpoint average oscillation is reduced by √40 (H) and √8 (V). The overall measurement error due to random projector pixel oscillation is then plus or minus 0.047 mrads (H) and 0.049 mrads (V), per video frame, compared to the desired plus and minus 0.05 mrads.
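These figures can be reproduced with a few lines (illustrative only; the small discrepancy in the vertical figure comes from the text's rounding of 0.135 mrad to 0.14):

```python
import math

# Worst-case single-pixel oscillation, from the quoted pixel size at 8 m
osc_h_mrad = 2.37 * 1.0 / 8000.0 * 1000.0   # +/- one pixel width   ~0.30 mrad
osc_v_mrad = 2.16 * 0.5 / 8000.0 * 1000.0   # +/- half a line pitch ~0.14 mrad

print(osc_h_mrad / math.sqrt(40))   # ~0.047 mrad (H), 40 x-samples averaged
print(osc_v_mrad / math.sqrt(8))    # ~0.048 mrad (V), 8 y-samples averaged
```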
In the above example, the worst observed projector pixel oscillations have been quoted, and an explanation given as to how the averaging process enables their adverse effects to be countered. In practice, modern projectors are somewhat more stable, and therefore have slightly lower levels of pixel oscillation. However, some random video noise also occurs in the CCD sensor video output, and this in turn also causes the peak positions, per white section, to oscillate slightly, thus contributing to the aimpoint measurement noise. Once again, however, the averaging process employed reduces this noise to an acceptable level, so that the specified aimpoint assessment accuracy can still be achieved.
The sensor used must be such as to ensure adequate signal to noise ratio for the external processing circuitry.
Typically, one might use a 1/2" format interline transfer CCD sensor array with a peak response in the visible region, preferably at about 550 nm, near the video projector's peak output wavelength; a minimum of 500 pixels horizontally should suffice, and the sensor would need to be of a type having an integral microlens array on its surface, for maximum sensitivity. A lens aperture of F/4 to F/2 should provide sufficient light throughput.
The camera processing boards would probably need to be set up with a maximum gain of +24 dB, to provide sufficient signal, and an automatic gain control circuit to cope with variations in projector output levels. Each video frame, output at 40 ms intervals, may be analysed in hardware, in conjunction with a pixel clock, to locate the white section peak coordinates per video line. These coordinates may then be transferred to another processor for data validation and then averaging to give aimpoint assessment data. The aimpoint assessment data is then transmitted to the computer monitor 24 beside the marksman 11, and shows his aimpoint tracking as illustrated in Fig 5, plus, where appropriate, the bullet strike point, with ballistics, wind and cant effects taken into account, at the correct position on or near the target at which he is aiming.
The lay-out of the bar patterns inherently gives information on range changes of the weapon with respect to the screen, as required for parallax error correction, and on the cant angle of the weapon.
It is to be noted that measurements based on a single pixel in the projected video image cannot contain enough information to enable these parameters to be determined, and additional sensors would be needed in that case, notwithstanding that the measurement accuracy would in any event be poor due to pixel oscillation.
If one considers the weapon-mounted camera and optical sight to be coincident at 8 m, when correctly zeroed, then with a vertical distance of 80 mm between the sight's and camera's optical axes at the weapon, the convergence angle is approximately 10 mrads. Now if the weapon is moved 1 m further back, to 9 m, the sight and camera converge at 1 m in front of the screen, which means that the camera aimpoint on the screen is 10 mm high with respect to the sight's aimpoint. This 10 mm error at 8 m is equivalent to an aimpoint sensing error of 1.25 mrads, or for example 125 mm at 100 m; it is to be remembered that the desired system accuracy is plus or minus 0.05 mrads, so the above error must be detected and corrected.
Fortunately, with 2 vertical bar patterns in the camera field of view, range changes will show up as distance changes between the bar patterns. Indeed, a less accurate but still workable solution can be achieved if only one bar pattern exists within the camera's field of view, since the vertical separation between the 4 white sections will also change with range. In any event, if the camera's field of view at 8 m is taken to be 400 mm (H) by 300 mm (V), with 500 pixels (H) and a bar pattern separation on-screen of 200 mm, corresponding therefore to 250 CCD pixels, it is clear that range changes of 1 part in 250 will be detectable. Thus, if the camera and sight are zeroed to converge at a nominal 8 m, and the bar pattern separation as sensed by the CCD camera, in terms of pixels, is stored during the zeroing process, future range changes can be computed by comparison with the stored bar pattern separation on zeroing, and hence the above parallax error can be computed and corrected.
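A minimal sketch of this range and parallax computation, assuming the example zeroing figures above (250 pixels separation at 8 m, 80 mm sight-to-camera offset); the function names are illustrative, not from the patent:

```python
# Range recovery and parallax correction from the bar-pattern
# separation, under the assumed example zeroing values.
def estimate_range_mm(sep_px_now, sep_px_zero=250.0, range_zero_mm=8000.0):
    # The sensed separation in CCD pixels scales inversely with range.
    return range_zero_mm * sep_px_zero / sep_px_now

def parallax_offset_mm(range_mm, range_zero_mm=8000.0, sight_cam_gap_mm=80.0):
    # Sight and camera converge on the screen only at the zeroing range;
    # elsewhere the camera aimpoint is displaced by the misconvergence.
    conv_mrad = sight_cam_gap_mm / range_zero_mm * 1000.0   # ~10 mrads
    return conv_mrad / 1000.0 * (range_mm - range_zero_mm)

r = estimate_range_mm(sep_px_now=222.2)   # marksman has stepped back ~1 m
print(r, parallax_offset_mm(r))           # ~9001 mm range, ~10 mm offset
```

With the marksman at 9 m the sketch reproduces the 10 mm vertical offset worked through in the preceding paragraph, which the processor can then subtract from the measured aimpoint.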
It is to be noted that no external range measuring devices are required.
Reference has been made in the description to the measurement of white section peak coordinates per video line and the averaging of these coordinates to provide centroid coordinates for each white section. It will however be appreciated that a simple choice of peaks per video line assumes a symmetrical intensity distribution within each white section which of course does not occur in practice. The peaks per video line are therefore not detected directly but rather by detecting the rising and falling edges of the white section in each video line through it. The rising and falling edges, thus detected, are spaced symmetrically about the peak and therefore can be taken to represent the central coordinates of the white section per video line on which it occurs.
Detection of the rising and falling edges within the video waveform tends to give more robust results than direct peak detection, since it accurately locates central coordinates for waveforms which are flat at the top and which would be difficult to locate accurately by direct peak detection.
As previously described, the external video microprocessor is housed in the floor box sub-assembly 22 at the marksman's firing position, along with the feedback monitor 24. The video processor PCB has a low pass filter input with a cut-off frequency of 500 kHz. This corresponds to a sinusoid of period slightly less than twice the width of the video pulse from the camera caused by a white section of a bar pattern. The filtered pulse then passes through an automatic gain control stage and has automatic clamping of the video black level. The next section of the circuit, an edge detector, splits the signal, passes one copy through a low pass filter and adds a voltage offset to it. The waveforms are then compared to the original using a comparator and, if the original signal is above the filtered offset signal, a TTL pulse is output to the next section. The waveforms are illustrated in Figure 7. The TTL pulse is then input into a digital IC along with a pixel clock and camera line sync, so that x- and y-coordinates for pulses are output. These coordinates are logged into memory and a microcontroller performs the x and y averaging as explained earlier. The calculated centres of each bar pattern are then passed via an RS232 link for computer calculation of cant angle, parallax correction and aimpoint in real time.
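The following is a software analogue, written purely for illustration (the patent implements this in analog circuitry and a digital IC, not in code), of the comparator scheme just described: the line signal is compared with a smoothed, offset copy of itself, and the midpoint of the first rising and last falling edges of the resulting gate is taken as the white section's centre on that line. A crude moving average stands in for the analog low pass stage:

```python
# Software stand-in for the edge-detector circuit: compare the signal
# against a low-pass-filtered, offset copy of itself, then take the
# midpoint of the first rising and last falling edges of the gate.
def line_centre(signal, kernel=9, offset=0.1):
    half = kernel // 2
    smoothed = [sum(signal[max(0, i - half):i + half + 1]) /
                len(signal[max(0, i - half):i + half + 1])
                for i in range(len(signal))]
    gate = [s > f + offset for s, f in zip(signal, smoothed)]
    rising = next(i for i, g in enumerate(gate) if g)    # first rising edge
    falling = max(i for i, g in enumerate(gate) if g)    # last falling edge
    return (rising + falling) / 2.0   # edges lie symmetrically about the peak

# A flat-topped pulse whose true centre is pixel 20: direct peak-picking
# would be ambiguous here, but the edge midpoint recovers it exactly.
sig = [0.0] * 15 + [1.0] * 11 + [0.0] * 15
print(line_centre(sig))   # 20.0
```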
For moving targets, it is recognised that moving bar patterns would be distracting. The aimpoint assessment output may in these circumstances be produced only when the trigger of the weapon is pulled, so as to hide the high visibility bar patterns from the marksman, and only to flash them on to the projected video image on trigger pull, for a couple of frames.
However, the most common use of the bar patterns involves making them a feature of the displayed scene, for example as lane marker posts, as usually seen on a traditional marksmanship range. They are then displayed continuously, and so the weapon aimpoint is measured continuously, both before and after trigger pull. It is this sequence of aimpoint measurements which gives vital information on the weapon handling capabilities of the marksman, both before and after trigger pull.
While target acquisition training apparatus according to the invention may be used both for high accuracy range marksmanship training and for lower accuracy combat practice skills on rapidly moving targets, it is more advantageous to use it to provide the high accuracy range-type marksmanship training as hereinbefore described.
The invention hereinbefore described provides a system, primarily for marksmanship simulation and training on range type targets, using a weapon-mounted camera coupled to an external processor which interrogates the scene projected on to a large screen by a video/graphics projector, displaying multi-lane range type marksmanship targets for multiple marksmen, and calculates the relative aimpoint for each marksman, with respect to the target displayed in his lane, to an accuracy significantly greater than that possible using "light pen" - type measurements on the large screen projected video.
Systems hitherto proposed which used "light pen" type measurements on single pixels in the projected video image, were either limited in accuracy when displayed on a large screen, or required a single screen per marksman to achieve the required accuracy, in the region of 0.1 mrads from point to point.
On the other hand apparatus according to the embodiment of the invention hereinbefore described achieves the required accuracy on a large screen display, of lower video resolution than the measurement accuracy, thus allowing a plurality of marksmen to engage separate targets simultaneously, on the same large screen display.
The result is a much more realistic simulation of a marksmanship range for weaponry training.
An essential feature of the system is the provision of high visibility discrete zones in bar patterns in the projected video image. These are conveniently disguised as range type lane marker posts, overlaid on the background range image, with the targets displayed at known relative locations between the lane marker posts.
The processor analyzing the video signal from the weapon-mounted camera detects the relative positions of the high visibility zones within the camera field of view and, from stored information of target position relative to the zones, calculates the marksman's aimpoint with respect to the target. The bar patterns therefore form the basis of the aimpoint measurement method employed and, in addition, when disguised as described, are highly acceptable to the user.
The external processor analyzing the video signal from the weapon-mounted camera detects the location of the peaks in the video signal corresponding to the white sections of the bar patterns. The peak positions per white section are then averaged to calculate the centroid coordinates of each white section, which are in turn averaged to produce reference coordinates for producing an aimpoint assessment output. This averaging process, performed per video frame, dramatically reduces the effect of random noise sources on the aimpoint measurement and thereby significantly increases the measurement accuracy.
The two main noise sources which would cause aimpoint errors, but which are now averaged down to a much lower level, are pixel jitter oscillations (horizontal and vertical) in the projected video graphics image, and random noise in the CCD camera used to view the projected image.
The bar patterns are, furthermore, spatially extended within the camera's field of view, by necessity, which means that their image size changes measurably as the weapon to screen range changes. Thus range changes can be measured from the processing results, and this allows parallax errors (arising from the fact that the weapon sight and camera only converge on the screen at one range and thus are misconverged elsewhere) to be computed and corrected for; this arises directly from the implementation of the bar patterns described and requires no external range measuring devices.
The vertical extension of the bar patterns, and the fact that centroids are calculated, one for each white section (for example, 4 white sections per bar pattern), also allows the weapon cant angle from the vertical to be determined by the processor, without additional external measurement devices. If, for example, the first bar pattern's coordinates are (x1, y1) and the other bar pattern's corresponding coordinates are (x2, y2), the cant angle is tan⁻¹((y2 - y1)/(x2 - x1)). Cant angle from the vertical can conveniently be displayed on the marksman's monitor as a clock face, as illustrated in Fig 6 at the top right-hand corner of the display where indicated by the reference numeral 48, a strike on the target being indicated by the reference numeral 49.
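As an illustrative rendering of this expression (not code from the patent; atan2 is used so the no-cant and vertical-bars cases are handled without division by zero):

```python
import math

# Cant angle from the averaged (x, y) centres of the two bar patterns,
# per the expression above.
def cant_angle_deg(x1, y1, x2, y2):
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

print(cant_angle_deg(100.0, 200.0, 350.0, 200.0))   # 0.0 deg: no cant
print(cant_angle_deg(100.0, 200.0, 350.0, 213.1))   # ~3.0 deg of cant
```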
Where the simulated weapon is a hand gun, as represented in Fig 1 by the weapon 18 held by the marksman 12, the video camera mounted on it may conveniently be in the form of a simulated barrel of the gun.

Claims (18)

1. Target acquisition training apparatus comprising projection means for projecting an optical image including one or more targets, the projection means including optical image generating means which generates the optical image from picture elements derived in succession from successive signal values of an optical image generating video signal, target acquisition means for simulated acquisition of a target by an aimer who brings an aimpoint axis of the target acquisition means on to the target, the target acquisition means having associated therewith a video camera which generates camera video signals representative of a field of view about the aimpoint axis, and computing means to produce from the camera video signals an output enabling an assessment to be made of the target acquisition skill of the aimer, characterised in that the optical image generating means provides for the inclusion in the optical image of a plurality of high visibility discrete zones spaced about the target, targets or each target and that the computing means includes aimpoint assessment output means responsive to the camera video signals representing the high visibility zones to produce aimpoint assessment outputs measured with respect to optical image reference coordinates which are based on the coordinates of centroids of the high visibility zones.
2. Apparatus according to claim 1, wherein the aimpoint assessment output means produces aimpoint assessment outputs measured with respect to optical image reference coordinates which correspond to or are based on the average value of the coordinates of the centroids of the high visibility zones.
3. Apparatus according to claim 2 wherein picture elements which form the projected optical image are built up by sequential scanning of a field along scanning lines which provide coverage of the field with the scanning of each line producing along the line throughout one dimension of the field a succession of picture elements which are derived from successive signal values of the optical image generating video signal, wherein the high visibility zones are formed in the optical image on each side of the target, targets or each target in a high visibility bar pattern which extends transverse to the scanning lines, the high visibility discrete zones being formed as white sections of each bar pattern which alternate with black sections thereof and wherein the aimpoint assessment output means is responsive to the camera video signals to select from each scanning line in each white section a picture element or elements having a predetermined characteristic and a corresponding picture element or elements from each of the other lines in the white section to produce a centroid coordinate of the selected picture elements for each white section and to average the centroid coordinates to produce an optical image reference coordinate.
4. Apparatus according to claim 3 wherein the aimpoint assessment output means is responsive to the camera video signals to select predetermined lines of picture elements in each white section to produce a line centroid coordinate for the selected lines in each white section and to average the line centroid coordinates of the white sections to produce a further optical image reference coordinate.
5. Apparatus according to claim 3 or 4 wherein the bar patterns are arranged on each side of the target, targets or each target and are arranged parallel to each other and at a fixed distance apart and wherein the computing means includes range aimpoint assessment output means which is responsive to the camera video signals representing the bar patterns to produce from data representing the fixed distance apart of the two bar patterns a range output representing the range of the aimer from the projected optical image.
6. Apparatus according to claim 5 wherein the computing means includes parallax error correction means responsive to the range output to calculate the parallax error and to produce corrected aimpoint assessment outputs to offset the parallax error.
7. Apparatus according to claim 3 or 4 wherein the bar patterns in the optical image are arranged on each side of the target, targets or each target, parallel to each other, and wherein the computing means includes cant angle measuring means responsive to the camera generated video signals representing the bar patterns to produce a cant angle output representing the cant angle of the target acquisition means.
8. Apparatus according to any of claims 1 to 7, wherein the target acquisition means comprises a weapon or simulated weapon.
9. Apparatus according to claim 8, wherein the weapon includes a sight for use by the aimer for simulated acquisition of a target.
10. A method of producing aimpoint assessment output data in target acquisition training comprising the steps of projecting an optical image which includes one or more targets and in which the image is generated from picture elements derived in succession from successive signal values of an optical image generating video signal, providing an aimer with target acquisition means for simulated acquisition of a target by the aimer who brings an aimpoint axis of the target acquisition means on to the target, providing in association with the target acquisition means a video camera which generates camera video signals representative of a field of view about the aimpoint axis of the target acquisition means and computing from the camera video signals an output enabling an assessment to be made of the target acquisition skill of the aimer, characterised by the steps of including in the optical image a plurality of high visibility discrete zones spaced about the target, targets or each target and computing from the camera video signals representing the high visibility zones aimpoint assessment outputs measured with respect to optical image reference coordinates which are based on the coordinates of centroids of the high visibility zones.
11. A method according to claim 10 wherein the aimpoint assessment outputs are measured with respect to optical image reference coordinates which correspond to or are based on the average value of the coordinates of the centroids of the high visibility zones.
12. A method according to claim 11 comprising the steps of building up the picture elements which form the projected optical image by sequential scanning of a field along scanning lines which provide coverage of the field with the scanning of each line producing along the line throughout one dimension of the field a succession of picture elements which are derived from successive signal values of the optical image generating video signal, forming the high visibility zones in the optical image on each side of the target, targets or each target in a high visibility bar pattern which extends transverse to the scanning lines with the high visibility discrete zones being formed as white sections of each bar pattern which alternate with black sections thereof, selecting camera video signals representing a picture element or elements having a predetermined characteristic in each scanning line in each white section and camera video signals representing a corresponding picture element or elements in each of the other lines in the white section to produce a centroid coordinate of the selected picture elements for each white section and averaging the centroid coordinates to produce an optical image reference coordinate.
13. A method according to claim 12 comprising the steps of selecting camera video signals representing predetermined lines of picture elements in each white section, computing a line centroid coordinate for the selected lines in each white section and averaging the line centroid coordinates of the white sections to produce a further optical image reference coordinate.
14. A method according to claim 12 or 13 comprising the steps of arranging the bar patterns on each side of the target, targets or each target parallel to each other and at a fixed distance apart and computing from the camera video signals and data representing the distance apart of the two bar patterns a range output representing the range of the aimer from the projected optical image.
15. A method according to claim 14 comprising the step of calculating the parallax error from the range output and producing corrected aimpoint assessment outputs to offset the parallax error.
16. A method according to claim 12 or 13 comprising the steps of arranging the bar patterns on each side of the target, targets or each target parallel to each other and computing from the camera video signals representing the white sections of the bar patterns the cant angle of the target acquisition means with respect to the bar patterns.
17. Target acquisition training apparatus substantially as hereinbefore described with reference to the accompanying drawings.
18. A method of producing aimpoint assessment output data in target acquisition training substantially as hereinbefore described with reference to the accompanying drawings.
GB9220650A 1991-10-02 1992-09-30 Target acquisition training apparatus Withdrawn GB2260188A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB919120930A GB9120930D0 (en) 1991-10-02 1991-10-02 Target acquisition training apparatus

Publications (2)

Publication Number Publication Date
GB9220650D0 GB9220650D0 (en) 1992-11-11
GB2260188A true GB2260188A (en) 1993-04-07

Family

ID=10702312

Family Applications (2)

Application Number Title Priority Date Filing Date
GB919120930A Pending GB9120930D0 (en) 1991-10-02 1991-10-02 Target acquisition training apparatus
GB9220650A Withdrawn GB2260188A (en) 1991-10-02 1992-09-30 Target acquisition training apparatus

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB919120930A Pending GB9120930D0 (en) 1991-10-02 1991-10-02 Target acquisition training apparatus

Country Status (2)

Country Link
GB (2) GB9120930D0 (en)
WO (1) WO1993007437A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2268252A (en) * 1992-06-30 1994-01-05 British Aerospace Simulation L Weapon training
FR2772908A1 (en) * 1997-12-24 1999-06-25 Aerospatiale Missile firing simulator system
WO2006019974A2 (en) * 2004-07-15 2006-02-23 Cubic Corporation Enhancement of aimpoint in simulated training systems
EP1837743A2 (en) * 2006-03-20 2007-09-26 Samsung Electronics Co., Ltd. Pointing input device, method, and system using image pattern
US7329127B2 (en) * 2001-06-08 2008-02-12 L-3 Communications Corporation Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control
US10670373B2 (en) 2017-11-28 2020-06-02 Modular High-End Ltd. Firearm training system

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2681454B2 (en) * 1995-02-21 1997-11-26 コナミ株式会社 Shooting game device
JP2668343B2 (en) * 1995-02-21 1997-10-27 コナミ株式会社 Competition game equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2046410A (en) * 1979-04-04 1980-11-12 Detras Training Aids Ltd Target apparatus
GB2152645A (en) * 1984-01-04 1985-08-07 Hendry Electronics Ltd D Target trainer
GB2161251A (en) * 1984-07-07 1986-01-08 Ferranti Plc Weapon training apparatus

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3945133A (en) * 1975-06-20 1976-03-23 The United States Of America As Represented By The Secretary Of The Navy Weapons training simulator utilizing polarized light
US4290757A (en) * 1980-06-09 1981-09-22 The United States Of America As Represented By The Secretary Of The Navy Burst on target simulation device for training with rockets
GB2160298B (en) * 1984-06-14 1987-07-15 Ferranti Plc Weapon aim-training apparatus
US4824374A (en) * 1986-08-04 1989-04-25 Hendry Dennis J Target trainer

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2046410A (en) * 1979-04-04 1980-11-12 Detras Training Aids Ltd Target apparatus
GB2152645A (en) * 1984-01-04 1985-08-07 Hendry Electronics Ltd D Target trainer
GB2161251A (en) * 1984-07-07 1986-01-08 Ferranti Plc Weapon training apparatus

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2268252A (en) * 1992-06-30 1994-01-05 British Aerospace Simulation L Weapon training
US6296486B1 (en) 1997-12-23 2001-10-02 Aerospatiale Societe Nationale Industrielle Missile firing simulator with the gunner immersed in a virtual space
FR2772908A1 (en) * 1997-12-24 1999-06-25 Aerospatiale Missile firing simulator system
WO1999034163A1 (en) * 1997-12-24 1999-07-08 Aerospatiale Societe Nationale Industrielle Missile firing simulator with the gunner immersed in a virtual space
US7329127B2 (en) * 2001-06-08 2008-02-12 L-3 Communications Corporation Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control
US7687751B2 (en) 2004-07-15 2010-03-30 Cubic Corporation Enhancement of aimpoint in simulated training systems
WO2006019974A3 (en) * 2004-07-15 2006-05-04 Cubic Corp Enhancement of aimpoint in simulated training systems
US7345265B2 (en) 2004-07-15 2008-03-18 Cubic Corporation Enhancement of aimpoint in simulated training systems
WO2006019974A2 (en) * 2004-07-15 2006-02-23 Cubic Corporation Enhancement of aimpoint in simulated training systems
EP1837743A2 (en) * 2006-03-20 2007-09-26 Samsung Electronics Co., Ltd. Pointing input device, method, and system using image pattern
EP1837743A3 (en) * 2006-03-20 2011-10-26 Samsung Electronics Co., Ltd. Pointing input device, method, and system using image pattern
US8106884B2 (en) 2006-03-20 2012-01-31 Samsung Electronics Co., Ltd. Pointing input device, method, and system using image pattern
US10670373B2 (en) 2017-11-28 2020-06-02 Modular High-End Ltd. Firearm training system

Also Published As

Publication number Publication date
GB9220650D0 (en) 1992-11-11
GB9120930D0 (en) 1991-11-27
WO1993007437A1 (en) 1993-04-15

Similar Documents

Publication Publication Date Title
EP0852961B1 (en) Shooting video game machine
JP3748271B2 (en) Shooting game equipment
US6540607B2 (en) Video game position and orientation detection system
EP0728503B1 (en) A shooting game machine
US4619616A (en) Weapon aim-training apparatus
US4164081A (en) Remote target hit monitoring system
US5208417A (en) Method and system for aiming a small caliber weapon
US4923402A (en) Marksmanship expert trainer
US4446480A (en) Head position and orientation sensor
US6997716B2 (en) Continuous aimpoint tracking system
US5991043A (en) Impact position marker for ordinary or simulated shooting
US20090091623A1 (en) Method and device for use in calibration of a projector image display towards a display screen, and a display screen for such use
US20070190495A1 (en) Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios
US9910507B2 (en) Image display apparatus and pointing method for same
CN110836616B (en) Image correction detection method for accurately positioning laser simulated shooting impact point
US20110053120A1 (en) Marksmanship training device
SE463229B (en) PROCEDURES FOR ANALYZING THE SHOOTING PROCESS FOR SHOOTING EXERCISES
US20100092925A1 (en) Training simulator for sharp shooting
GB2260188A (en) Target acquisition training apparatus
WO1994015165A1 (en) Target acquisition training apparatus and method of training in target acquisition
KR100751503B1 (en) Target practice device
US6663391B1 (en) Spotlighted position detection system and simulator
KR20020007880A (en) Method for finding the position of virtual impact point at a virtual firing range by using infrared
JPS6232987A (en) Laser gun game apparatus and detection of hit in said game apparatus
US6964607B2 (en) Game system and game method

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)