EP2802839B1 - Systems and methods for arranging firearms training scenarios - Google Patents


Info

Publication number
EP2802839B1
Authority
EP
European Patent Office
Prior art keywords
target
robotic
training
operations
operations data
Prior art date
Legal status
Active
Application number
EP13751094.7A
Other languages
German (de)
French (fr)
Other versions
EP2802839A1 (en)
EP2802839A4 (en)
Inventor
Alex Brooks
Tobias Kaupp
Alexei Makarenko
Current Assignee
Marathon Robotics Pty Ltd
Original Assignee
Marathon Robotics Pty Ltd
Priority date
Filing date
Publication date
Priority claimed from AU2012900675A0
Application filed by Marathon Robotics Pty Ltd filed Critical Marathon Robotics Pty Ltd
Publication of EP2802839A1 publication Critical patent/EP2802839A1/en
Publication of EP2802839A4 publication Critical patent/EP2802839A4/en
Application granted granted Critical
Publication of EP2802839B1 publication Critical patent/EP2802839B1/en

Classifications

    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41G - WEAPON SIGHTS; AIMING
    • F41G 3/00 - Aiming or laying means
    • F41G 3/26 - Teaching or practice apparatus for gun-aiming or gun-laying
    • F41A - FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALLARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALLARMS OR ORDNANCE
    • F41A 33/00 - Adaptations for training; Gun simulators
    • F41J - TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J 11/00 - Target ranges
    • F41J 9/00 - Moving targets, i.e. moving when fired at
    • F41J 9/02 - Land-based targets, e.g. inflatable targets supported by fluid pressure
    • F41J 5/00 - Target indicating systems; Target-hit or score detecting systems
    • F41J 5/24 - Targets producing a particular effect when hit, e.g. detonation of pyrotechnic charge, bell ring, photograph


Description

    Technical Field
  • The present invention relates to systems and methods for arranging firearms training scenarios and particularly relates to firearms scenarios utilising robotic mobile targets.
  • Background to the Invention
  • Armed personnel such as soldiers typically receive training to assist them in dealing with armed combat situations that they might encounter during their active duties. Such training can include training exercises using live ammunition such as practice in shooting at targets. Such training is crucial to the personnel's performance and safety in real life situations. There remains a need for improved systems and methods for training armed personnel.
  • To date, such training has involved the use of static shooting targets, pop-up targets, and targets moved on tracks. For targets on tracks, the routes are defined by the tracks and the motion along those routes is controlled directly in real-time or is pre-defined on a computer screen.
  • In some cases, mobile targets have been used in the form of a mannequin or the like mounted on a moveable platform on wheels. These may be directly radio-controlled by a human operator during a training exercise. This adds a significant workload to training exercises, particularly when multiple moving targets are required, and it is difficult to present multiple trainees with identical training scenarios.
  • In some cases, these mobile targets have been programmed to move along a preprogrammed route in a training area to simulate persons moving about, and the personnel being trained must attempt to hit the mannequins. Route definition is performed on a computer screen. In other cases, the mobile targets are autonomous and the target's onboard computer generates the route for the target to follow according to constraints pre-defined on the computer screen. An example of such a system is described in the present applicant's International Patent application no. PCT/AU2010/001165, published as WO 2011/035363.
  • In all cases, the intended outcome is to present targets to a trainee in some desired fashion. When presenting moving targets along tracks, considerable thought should be put into the routes of the tracks, since they are difficult to move subsequently. With the advent of trackless targets that can move along any route, novel methods of defining the routes are required to facilitate quick, easy, and intuitive generation of new routes.
  • Problems with definition of routes on a computer screen include:
    1. When looking at the computer screen, the operator has to imagine what the trainee will see from their perspective, what angles and openings will be visible from a certain vantage point, etc. This is especially difficult when there are elevation changes within the training range, so the operator has to think in three dimensions while plotting target trajectories on a two-dimensional screen. As a result, creating a route for a mobile target may involve iterating between defining the route on the screen, watching the target move from the trainee's intended vantage point, modifying the route on the screen, etc. (a potentially cumbersome process).
    2. Defining a route on a computer screen requires that the route be defined relative to something meaningful that can be displayed on the screen, i.e. a map of some kind. This mandates an extra step before a mobile trackless target can be used in a new training range: that map must first be generated. Even if the map is very simple, e.g. an aerial photograph of the training range geo-referenced in a GPS coordinate system, it is still an extra step and may require additional resources such as an internet connection to download the aerial photograph.
  • Frost, Roger: "Robbie" discloses a target embodied as a three-dimensional human form that can be used as a target for small-arms fire, and forms a starting point for independent claims 1 and 14. JP-A-2008267710 discloses a moving target system capable of arbitrarily moving a target in a range and performing practical and advanced shooting training.
  • Summary of the Invention
  • In accordance with independent claim 1, in a first aspect the present invention provides a method of arranging a firearms training scenario utilising at least one robotic mobile target in a training area, the method including the steps of: sending commands to at least one robotic target in a training area to cause the target to operate in the training area; recording operations data representative of the operations carried out by the at least one robotic target; and subsequently conducting a training scenario in the training area wherein the at least one robotic target bases its actions at least partially on the previously recorded operations data.
  • The operations data may include command data representative of at least some of the commands sent to the at least one robotic target.
  • The operations data may include actions data representative of at least some of the actions carried out by the at least one robotic target in reacting to the commands.
  • The operations data may include outcome data representative of at least some of the outcomes of executing the commands.
  • The step of sending commands to the at least one robotic target may be carried out by a human operator using a remote control input device.
  • The step of sending commands may be carried out whilst the human operator is situated at a location in the training area where at least one of the trainees will be situated during the step of conducting the training scenario.
  • The operations data may be recorded by the at least one robotic target. The operations data may include data representative of the location, orientation or velocity of the at least one robotic target in the training area.
  • The operations data may include data representative of any of sounds produced by the at least one robotic target, raising or lowering of simulated weapons, deployment of special effects by the at least one robotic target or at least one robotic target remaining static.
  • During the step of conducting the training scenario, the at least one robotic target may intentionally deviate from the operations data.
  • The at least one robotic target may deviate from the operations data to avoid an obstacle.
  • The at least one robotic target may randomly deviate from the operations data.
  • The scenario may utilise more than one robotic target, each basing its operations on its own set of operations data.
  • The at least one robotic target may commence operations in the training scenario following the elapsing of a pre-determined interval of time, or in response to detecting personnel in the training area, or in response to detecting movement of another target in the training area.
  • In accordance with independent claim 14, in a second aspect the present invention provides a system for use in conducting a firearms training scenario utilising at least one robotic mobile target in a training area, the system including: sending means for sending commands to at least one robotic target in a training area to cause the target to operate in the training area; recording means for recording operations data representative of the operations carried out by the at least one robotic target; the at least one robotic target is arranged to participate in a firearms training scenario in the training area; and wherein the at least one robotic target is arranged to base its actions at least partially on recorded operations data.
  • The operations data may include command data representative of commands sent to the at least one robotic target.
  • The operations data may include actions data representative of actions carried out by the at least one robotic target in reacting to commands.
  • The operations data may include outcome data representative of outcomes of executing the commands.
  • The sending means may include a remote control input device.
  • The recording means may be embodied in the at least one robotic target.
  • The operations data may include data representative of the location, orientation or velocity of the at least one robotic target in the training area.
  • The operations data may include data representative of any of sounds produced by the at least one robotic target, raising or lowering of simulated weapons, deployment of special effects by the at least one robotic target or at least one robotic target remaining static.
  • The at least one robotic target may be arranged to intentionally deviate from the operations data.
  • The at least one robotic target may be arranged to deviate from the operations data to avoid an obstacle.
  • The at least one robotic target may be arranged to randomly deviate from the operations data.
  • The system may include more than one robotic target.
  • The at least one robotic target may be arranged to commence actions following the elapsing of a pre-determined interval of time, or in response to detecting personnel in the training area, or in response to detecting movement of another target in the training area.
  • In this specification the following terms have the following intended meanings (an illustrative data-structure sketch follows the list):
    • "commands" : instructions sent by an operator to a robot during the recording session.
    • "outcomes" : all aspects of target's performance which, when taken in aggregate, define how the target is presented to the trainees.
    • "operations data" : the persistent record created during the recording session which may include a combination of commands, actions and outcomes.
    • "actions" : operation steps planned and executed by the robot. During the recording session, the actions are generated in response to the operator commands. During the replay session, the actions are generated based on the operations data and real time sensor data with the objective to recreate the operations carried out during the recording session as faithfully as possible.
  • In embodiments of the invention, a human operator manually controls the operations of one target in a recording session. This can be achieved through the use of a remote user interface. The target records its operations. The operator later commands the target to replay the operations any number of times, for the benefit of the same or different trainees.
  • The operations of the mobile units may include any of: sounds produced by the mobile units, movements of the mobile units, raising or lowering of simulated weapons, deployment of special effects by the mobile units, changes in velocity or direction of the mobile units or mobile units remaining static.
  • The target may be unable to faithfully replay the previously recorded operations. It may happen for example if it encounters an obstacle which was not in the training area at the time of the recording. In this case the target may use its sensors to detect the obstacle and navigate safely around it while attempting to return to the original path as soon as practicable.
  • Instead of faithfully replaying the original sequence of operations, the target may be instructed to alter some of the parameters during replay. The change in the parameters may be random or repeatable, or a combination of the two. Random changes make the actions of the robots more unpredictable, and therefore, more challenging for the trainees. Repeatable changes allow the instructor to fine-tune the scenario to the training needs of a particular trainee. Repeatable changes are also well-suited for firearms training courses where it is desirable that each trainee faces essentially the same training scenario.
  • The replay of recorded operations may be triggered manually by the instructor or automatically, based on a timer, or actions of other targets, or sensed actions of human participants in the exercise.
  • Operations of multiple targets may also be recorded and replayed using the described approach. The recording can be achieved by multiple instructors controlling multiple targets simultaneously, or by one instructor controlling one target at a time.
  • Brief Description of the Drawings
  • An embodiment of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
    • Figure 1 is a schematic representation of a human-shaped robot used in embodiments of the invention;
    • Figure 2 is a schematic bird's eye view of a training area in which recording of a training exercise is taking place using a robot according to figure 1;
    • Figure 3 shows the training area of figure 2 in which a replay of the training exercise of figure 2 is taking place;
    • Figure 4 shows the training area of figure 2 in which another replay of the training exercise of figure 2 is taking place;
    • Figure 5 shows the training area of figure 2 in which yet another replay of the training exercise of figure 2 is taking place;
    • Figure 6 shows the training area of figure 2 in which recording of another training exercise is taking place;
    • Figure 7 shows the training area of figure 2 in which a replay of the training exercise of figure 6 is taking place; and
    • Figure 8 is a flow chart illustrating the steps of collecting and using operations data in the recording and replay sessions of a training scenario.
    Detailed Description of the Preferred Embodiment
  • Referring to figure 1, an embodiment of a robotic mobile target is shown in the form of human-shaped robot 100. Robot 100 has a motorised wheeled base 1. On the base 1 is mounted a mannequin 6 shaped like a human torso. Robot 100 is controlled by an on board computer 2, configured with software, which is mounted on the base 1 and protected by an armoured cover 3 from bullet strikes. Robot 100 includes wireless communication means 4 such as wifi to enable sending and receiving of information to and from a human operator (not shown), or to and from other robots, or to and from a control base station (not shown). Robot 100 includes a GPS receiver 12 to determine its own position.
  • Robot 100 includes a laser rangefinder 13 to enable it to detect features in the local environment and thereby "see" its surroundings. Fixed and moving obstacles are detected by analysing each laser scan. When an obstacle is detected in the robot's intended motion path, the motion plan is modified to safely navigate around it.
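  • As a rough illustration of this kind of check (not the patent's algorithm), a replanning trigger can test whether any return in a laser scan falls inside the corridor the robot is about to drive through; the scan format below is an assumption of this sketch:

```python
import math

def scan_blocks_path(scan, corridor_half_width_m=0.5, lookahead_m=5.0):
    """Return True if any laser return lies inside the corridor ahead.
    `scan` is an iterable of (range_m, bearing_rad) pairs, with bearing
    0 pointing straight along the intended motion path."""
    for range_m, bearing_rad in scan:
        ahead = range_m * math.cos(bearing_rad)    # distance along the path
        lateral = range_m * math.sin(bearing_rad)  # offset from the centreline
        if 0.0 < ahead < lookahead_m and abs(lateral) < corridor_half_width_m:
            return True
    return False
```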
  • Figures 2 to 7 depict preparation and execution of firearms training exercises carried out in a training area using one or several robots 100 of figure 1.
  • Referring to figure 2, a training area 10 is shown in which are located high walls 15, 16, 17 and a barrel 18. At the south edge of the training area is a firearms instructor 41 arranging a firearms training exercise. In the training area is a mobile unit in the form of human-shaped robot 31. This robot 31 is of the type of robot 100 shown in figure 1. The robot is arranged to execute and record operations based on remote commands of the human instructor, as will now be described.
  • Firearms instructor 41 positions himself at the south edge of the exercise area to observe the area from the position where the trainee(s) will later be situated. The instructor sends a sequence of remote commands 51 to the target 31 using a specialised remote control hand-held device. The device includes a joystick for inputting directional commands, along with other buttons for sending commands to carry out other types of operations, such as deploying special effects, as will be described later. The desired speed of movement of the target in any direction is indicated by the degree of deflection applied to the joystick. The remote control device communicates with the target 31 by radio communication.
  • The instructor can issue the following commands to the target (a sketch of one possible command encoding follows the list):
    1. Motion control commands by joystick input, i.e. turn left, turn right, straight, move faster, slower, reverse direction, or stop moving;
    2. Raise or lower arms holding objects or simulated weapons;
    3. Create audio effects from an onboard speaker;
    4. Create light effects to illuminate the target itself or the ground around it;
    5. Create other effects such as simulated gunfire, pyrotechnics, explosions, or smoke.
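  • A minimal sketch of how such a hand-held device might encode a joystick deflection as a motion command. The JSON message format and the top speed are purely illustrative assumptions, not part of the described device:

```python
import json
import math

MAX_SPEED_MPS = 3.0  # assumed top speed of the wheeled base, for illustration only

def joystick_to_motion_command(x: float, y: float) -> str:
    """Translate a joystick deflection (x, y each in [-1, 1]) into a motion
    command message: the degree of deflection sets the desired speed and the
    stick direction sets the desired heading (0 degrees = straight ahead)."""
    deflection = min(1.0, math.hypot(x, y))
    heading_deg = math.degrees(math.atan2(x, y)) % 360.0
    return json.dumps({"kind": "motion",
                       "speed_mps": round(deflection * MAX_SPEED_MPS, 2),
                       "heading_deg": round(heading_deg, 1)})

print(joystick_to_motion_command(0.5, 0.5))  # half deflection, forward-right
```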
  • The target 31 operates in the training area in response to the commands it receives. The target also records operations data representative of the operations that are carried out. The operations data recorded includes data representative of the commands issued and also data indicative of the operation steps carried out in response to the commands.
  • For example, if the target reacts to directional commands to move between certain positions in the training area, then it records these operations in the form of positional outcomes of executing these commands, by storing GPS coordinate data of the points that it moved between in the form of waypoints. This ensures that the movements made subsequently by the target during the replay of a training scenario are a faithful reproduction of the movements witnessed by the instructor at the time of recording the scenario. The recorded operations data enables compensation for minor variations in conditions, such as increased wheel slippage in wet weather.
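  • One plausible way to store such positional outcomes, sketched under the assumption that a waypoint is logged whenever the target has moved a minimum distance from the last stored waypoint (the spacing is an illustrative parameter):

```python
import math

def distance_m(p, q):
    """Approximate planar distance in metres between two (lat, lon) fixes."""
    dlat_m = (q[0] - p[0]) * 111_320.0
    dlon_m = (q[1] - p[1]) * 111_320.0 * math.cos(math.radians(p[0]))
    return math.hypot(dlat_m, dlon_m)

class WaypointRecorder:
    """Store a GPS fix as a new waypoint whenever the target has moved
    at least `spacing_m` metres from the previously stored waypoint."""
    def __init__(self, spacing_m: float = 1.0):
        self.spacing_m = spacing_m
        self.waypoints: list[tuple[float, float]] = []

    def on_gps_fix(self, lat: float, lon: float) -> None:
        fix = (lat, lon)
        if not self.waypoints or distance_m(self.waypoints[-1], fix) >= self.spacing_m:
            self.waypoints.append(fix)
```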
  • Target 31 may record the following outcomes resulting from executing operator commands:
    1. Positional outcomes such as the target's coordinates and orientation;
    2. Movement outcomes such as the target's linear and rotational velocities, as well as linear and angular accelerations;
    3. Raising or lowering arms holding objects or simulated weapons;
    4. Creation of audio effects from an onboard speaker;
    5. Creation of light effects to illuminate the target itself or the ground around it;
    6. Creation of other effects such as simulated gunfire, pyrotechnics, explosions, or smoke.
  • The instructor commands the target to move along the path 36 from position 71 behind the wall 15, out into the open area, in front of and around the barrel 18, and to its final position 72 behind the wall 17. Target 31 operates in the training area by executing the commands received from the instructor. The target 31 records its operations in the form of operations data which includes data representative of the commands and also data representative of the actions taken in reacting to the commands.
  • The instructor also provides information to the target as to the future intended location of trainees in the training exercise. The remote control device includes its own GPS positioning capability and a button which indicates "I'm at the Trainee Location". The remote localises itself and sends the location to the robotic target which saves it for future use. Alternatively, the instructor drives the target to the intended trainee location by way of joystick control and pushes a button which indicates "You're at the Trainee Location". The robot uses its own GPS positioning system to determine the location and saves it for future use.
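  • The two registration methods can be sketched as follows; the method names and the idea of a shared registry object are assumptions made for illustration:

```python
class TraineeLocationRegistry:
    """Holds the intended trainee location for later use by the target."""

    def __init__(self):
        self.trainee_location = None  # (lat, lon) or None

    def im_at_trainee_location(self, remote_gps_fix):
        """Operator presses "I'm at the Trainee Location": the remote
        control localises itself and sends its own fix to the target."""
        self.trainee_location = remote_gps_fix

    def youre_at_trainee_location(self, robot_gps_fix):
        """Operator drives the target to the spot and presses "You're at
        the Trainee Location": the robot saves its own GPS fix."""
        self.trainee_location = robot_gps_fix
```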
  • Referring to figure 3, a training exercise is being carried out in the training area. The actions of the robot 31 are based on the operations data that was previously recorded in figure 2. For the purpose of the training exercise, the armed personnel 21 are the "blue" force (friendly), and the robot 31 is the "red" force (enemy). In this exercise, it is imagined that the red force has occupied the training area; the blue force must clear the area of the red force. The armed personnel 21 are entering the training area from the south. The firearms instructor 41 initiates the previously recorded exercise, causing the target 31 to start moving along the path 36. Armed person 21 takes note of target 31, takes aim and shoots.
  • Referring to figure 4, the exercise proceeds as in figure 3 but there is now a barrel 19 which was not there at the time when the exercise was recorded. Based on the continuous analysis of the output of the laser rangefinder mounted on robot 31, the onboard computer determines that there is an obstacle which prevents it from following the pre-recorded path 36. The onboard computer calculates a new path 37 which allows it to navigate safely around the obstacle and return to the pre-recorded path 36 as soon as it is practical.
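  • A sketch of the rejoin decision, assuming the pre-recorded path is a list of waypoints and that the obstacle-avoidance layer exposes a reachability query (`is_clear` below is an assumed callback, not an API from the patent):

```python
def rejoin_index(path, position, last_index, is_clear):
    """Choose the earliest not-yet-passed waypoint on the pre-recorded
    path whose approach the local planner reports as clear, so that the
    target returns to the original path as soon as practicable."""
    for i in range(last_index + 1, len(path)):
        if is_clear(position, path[i]):
            return i
    return len(path) - 1  # fall back to the final recorded waypoint
```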
  • In figure 5 the instructor 41 has commanded the target to execute the scenario recorded in figure 2 with an increased level of difficulty for armed personnel 21. Based on its analysis of the training area, the shape of the pre-recorded path 36, and the location of personnel 21, the onboard computer calculates a new path 38 which takes the target behind the barrel 18. The barrel partially obscures the target, making it more difficult to observe and to shoot.
  • In figure 6, two firearms instructors 41 and 42 are recording another firearms training exercise. The instructors position themselves inside the training area in order to better observe the targets 31, 32 and the high walls 15, 17. Instructor 41 sends a sequence of remote commands 51 to target 31 while instructor 42 sends a sequence of remote commands 52 to target 32. Target 31 is commanded to move along path 38, from position 73, around the western end of high wall 15; simulate loud human speech when it reaches position 74; and proceed south to position 75. Target 32 is commanded to move along path 39 from position 76, around the western end of high wall 17; simulate multiple shots 77; and proceed south to position 78.
  • Referring to figure 7, the training exercise recorded in figure 6 is being replayed. The armed personnel 21 are again entering the training area from the south. The firearms instructor 42 initiates the previously recorded exercise, causing the targets 31 and 32 to start moving along the paths 38 and 39. The timing of the two targets' actions is arranged such that target 32 waits until target 31 has produced simulated speech as a trigger for emerging from behind wall 17. Therefore, the simulated speech occurs before target 32 exposes itself from behind high wall 17. Armed person 21 is challenged to shoot and hit target 32 before it simulates firing shots, despite the distraction from target 31.
  • The record of changes in the target's position over time forms the target's trajectory. The record of other operations, e.g. audio effects, may be correlated to the recorded trajectory. This type of geo-referencing enables more faithful reproduction of the original target presentation. For example, the audio effect was intended to be played by target 31 at position 74 and not simply 15 seconds after the start of motion.
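  • A sketch of such geo-referenced triggering, using planar range coordinates and an assumed `play` callback into the effect actuators; the radius parameter is an illustrative tolerance:

```python
import math

def maybe_trigger_effects(recorded, position, fired, play, radius_m=1.0):
    """Trigger each recorded effect (speech, shots, ...) once, when the
    target comes within `radius_m` of the position at which the effect
    was recorded, rather than at a fixed time offset from the start.
    `recorded` is a list of (x, y, effect_or_None) tuples."""
    for i, (x, y, effect) in enumerate(recorded):
        if effect is not None and i not in fired:
            if math.hypot(x - position[0], y - position[1]) <= radius_m:
                fired.add(i)
                play(effect)
```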
  • Referring to figure 8, the steps in a recording session and replay session are illustrated. In the recording session an operator issues commands to a robotic target. The robot performs actions by way of utilising its various actuators. The results of its actions are referred to as outcomes. During the recording session, data relating to commands, actions and outcomes is recorded and referred to as operations data.
  • In the replay session, the robotic target uses the previously recorded operations data to plan and carry out actions by way of its various actuators in an attempt to reproduce the outcomes of the recording session.
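  • The figure 8 flow can be summarised in a small harness; `execute`, `observe` and `reproduce` stand in for the robot's actuator and sensor interfaces and are assumptions of this sketch:

```python
def record_session(command_stream, execute, observe):
    """Recording session: execute each operator command and log the command
    together with the outcome observed after executing it."""
    operations_data = []
    for command in command_stream:
        execute(command)                 # the robot acts via its actuators
        operations_data.append((command, observe()))
    return operations_data

def replay_session(operations_data, reproduce):
    """Replay session: aim to reproduce the recorded outcomes rather than
    blindly re-issuing the raw commands, which tolerates changed conditions
    such as wheel slip in wet weather."""
    for _command, outcome in operations_data:
        reproduce(outcome)
```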
  • The ability of the robots to maintain estimates of their own positions within the training area is important for their ability to repeat the operations that they took in response to the commands. In the embodiments described above, the robots 100 carried GPS receivers to localise themselves within the training range. In other embodiments the robots may localise themselves by way of any of many methods described in the literature, e.g. tracking range and bearing to laser reflecting beacons, measuring signal strength of radio beacons, or detecting buried magnets.
  • In the embodiments described above, the robots 100 carried laser rangefinders to sense objects and movements of objects in front of them. In other embodiments the robots may sense objects and movements of objects by way of other sensors such as cameras, radars or sonars. After the obstacles in the robot's vicinity are detected, one of many well-known obstacle avoidance algorithms may be employed to calculate a safe motion plan which avoids collision with the obstacles.
  • In various scenarios, the robots might perform the following variations to the previously recorded operations, or a combination of these variations (a sketch of the first two follows the list):
    1. Deviate from the recorded velocity profile, i.e. faster, slower, pause, skip a pause, change pause duration;
    2. Make small deviations from the recorded path, i.e. to the left or to the right;
    3. Make significant deviations from the recorded path in order to use the cover of natural or man-made obstacles;
    4. Make more or fewer sound-effects or other actions.
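  • A sketch of variations 1 and 2 above; note that passing a seeded random.Random yields repeatable changes while passing None yields random ones, matching the random/repeatable distinction discussed earlier (the sideways-offset model here is a simplification):

```python
import random

def vary_replay(waypoints, speed_mps, rng=None, max_offset_m=0.5,
                speed_factors=(0.8, 1.25)):
    """Scale the recorded speed and nudge each waypoint sideways a little
    (simplified to an x-offset in range coordinates)."""
    rng = rng or random.Random()
    factor = rng.uniform(*speed_factors)
    varied = [(x + rng.uniform(-max_offset_m, max_offset_m), y)
              for (x, y) in waypoints]
    return varied, speed_mps * factor

# Repeatable variation: every trainee faces essentially the same scenario.
path, speed = vary_replay([(0.0, 0.0), (5.0, 2.0)], 1.5,
                          rng=random.Random(42))
```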
  • In the embodiments described above, the replay of recorded operations was triggered manually by the instructor. In other embodiments it may be triggered automatically, based on a timer, or actions of other targets, or sensed actions of human participants in the exercise. With a user interface, the operator may also want to pause the replay somewhere in the middle, or to begin replay part-way through the activity sequence.
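  • The automatic triggers can be sketched as a simple polling wait; the two predicates are assumed callbacks supplied by the sensing layer, not interfaces described in the patent:

```python
import time

def wait_for_trigger(timer_s=None, other_target_acted=None,
                     personnel_detected=None, poll_s=0.1):
    """Block until one of the replay triggers occurs: a pre-determined
    interval elapses, another target's action is observed, or personnel
    are sensed in the training area. Returns which trigger fired."""
    start = time.monotonic()
    while True:
        if timer_s is not None and time.monotonic() - start >= timer_s:
            return "timer"
        if other_target_acted is not None and other_target_acted():
            return "other_target"
        if personnel_detected is not None and personnel_detected():
            return "personnel"
        time.sleep(poll_s)
```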
  • Operations of multiple targets may also be recorded and replayed. In the embodiments described above, the operations of multiple targets were recorded in parallel, i.e. multiple operators controlling multiple targets simultaneously. With this approach, the timing of the targets' actions relative to one another is also captured. In other embodiments the operations of multiple targets may be recorded in series, i.e. a single operator controls the targets one after another, and then assembles the individual activities into a coordinated scenario.
  • During replay of multi-target recordings, the targets begin their activities on one of the triggers listed above (the two simplest approaches being that all activities begin simultaneously, or that each activity is triggered independently by the operator). Some form of dynamic obstacle avoidance may be needed when multiple robots operate in close proximity to one another.
  • The operator can stand anywhere while recording the target's activity, but there are two advantageous locations:
    1. The operator can follow along behind the target. This addresses two general problems encountered when controlling a robot from a distance:
      • obstacles are difficult to avoid due to poor depth perception at long range; and
      • when a robot passes behind an obstacle and line-of-sight is lost, situation awareness suffers.
    2. The operator can stand exactly where the trainees will be when they are conducting the training scenario. This eliminates the problems associated with defining routes via the abstraction of a computer screen: the operator can see exactly what the trainees will see, and can define the route accordingly. Providing feedback to the operator, e.g. in the form of a live video feed, can mitigate the problems associated with controlling a robot at a distance or out of line-of-sight.
  • In the embodiments described above, the remote commands are sent to the robot using a specialised hand-held device. In other embodiments the remote commands could be sent using a computer, phone, gaming device, etc.
  • In the embodiments described above, the remote commands are sent to the robot using a wifi connection. In other embodiments the remote commands could be sent using any radio or a wired link.
  • In the embodiments described above, the firearms training exercises were carried out using live ammunition. In other embodiments the ammunition used could be simunition (simulated ammunition) or the firearms may be replaced by or augmented with lasers and laser targets to simulate ammunition.
  • In the embodiment described above, the armed personnel taking part in the training exercise were soldiers. However, embodiments of the invention also have application in training other types of people such as security guards, members of private military companies, law enforcement officers, and private citizens who may be members of a gun club or shooting academy.
  • It can be seen that embodiments of the invention have at least one of the following advantages:
    • saving labour by allowing the scenario to be recorded once and replayed many times later, possibly with programmed or random variations. In the case of multi-target recording, a single instructor can operate multiple targets.
    • eliminating the requirement for a pre-existing map of the training range. The routes are defined relative to what the operator sees, and may be stored e.g. as a path in a GPS coordinate system.
    • providing an intuitive, "what-you-see-is-what-you-get" interface for describing the scenario without at any point needing an abstract visual representation of the training range that can be shown to an operator on a computer screen.

Claims (20)

  1. A method of arranging a firearms training scenario utilising at least one robotic mobile target (31) in a training area, the method including the steps of:
    sending commands to at least one robotic target (31) in a training area (10) to cause the target to operate in the training area;
    recording operations data representative of the operations carried out by the at least one robotic target; and
    subsequently conducting a training scenario in the training area wherein the at least one robotic target bases its actions at least partially on the previously recorded operations data.
  2. A method according to claim 1 wherein the operations data includes command data representative of at least some of the commands sent to the at least one robotic target.
  3. A method according to either of claim 1 or claim 2 wherein the operations data includes actions data representative of at least some of the actions carried out by the at least one robotic target in reacting to the commands.
  4. A method according to any one of claims 1 to 3 wherein the operations data includes outcome data representative of at least some of the outcomes of executing the commands.
  5. A method according to any preceding claim wherein the step of sending commands (51) to the at least one robotic target is carried out by a human operator (41) using a remote control input device.
  6. A method according to claim 5 wherein the step of sending commands is carried out whilst the human operator (41) is situated at a location in the training area where at least one of the trainees (21) will be situated during the step of conducting the training scenario.
  7. A method according to any preceding claim wherein the operations data is recorded by the at least one robotic target (31).
  8. A method according to any preceding claim wherein the operations data includes data representative of the location, orientation or velocity of the at least one robotic target in the training area.
  9. A method according to any preceding claim wherein the operations data includes data representative of any of sounds produced by the at least one robotic target, raising or lowering of simulated weapons, deployment of special effects by the at least one robotic target or at least one robotic target remaining static.
  10. A method according to any preceding claim wherein, during the step of conducting the training scenario, the at least one robotic target intentionally deviates from the operations data.
  11. A method according to claim 10 wherein the at least one robotic target deviates from the operations data to avoid an obstacle or the at least one robotic target randomly deviates from the operations data.
  12. A method according to any preceding claim wherein the scenario utilises more than one robotic target and each bases its operations on its own set of operations data.
  13. A method according to any preceding claim wherein the at least one robotic target commences operations in the training scenario following the elapsing of a pre-determined interval of time, or in response to detecting personnel in the training area, or in response to detecting movement of another target in the training area.
  14. A system for use in conducting a firearms training scenario arranged according to any of claims 1-13 utilising at least one robotic mobile target (31) in a training area (10), the system including:
    sending means (4) for sending commands to at least one robotic target in a training area to cause the target to operate in the training area;
    recording means for recording operations data representative of the operations carried out by the at least one robotic target;
    the at least one robotic target (31) being arranged to participate in a firearms training scenario in the training area; and
    wherein the at least one robotic target is arranged to base its actions at least partially on recorded operations data.
  15. A system according to claim 14 wherein the sending means includes a remote control input device.
  16. A system according to claim 14 or 15 wherein the recording means is embodied in the at least one robotic target.
  17. A system according to any one of claims 14 to 16 wherein the at least one robotic target is arranged to intentionally deviate from the operations data.
  18. A system according to claim 17 wherein the at least one robotic target is arranged to deviate from the operations data to avoid an obstacle or the at least one robotic target is arranged to randomly deviate from the operations data.
  19. A system according to any one of claims 14 to 18 including more than one robotic target (31).
  20. A system according to any one of claims 14 to 19 wherein the at least one robotic target is arranged to commence actions following the elapsing of a pre-determined interval of time, or in response to detecting personnel in the training area (10), or in response to detecting movement of another target in the training area.
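
The claims above describe a record-and-replay workflow: a rehearsal pass in which the commands sent to a robotic target, the actions it carries out and their outcomes are logged as operations data; a replay pass in which the target re-enacts that log, optionally deviating to avoid obstacles or at random; and start triggers such as a pre-determined delay or the detection of personnel. As a rough illustration only (this is not the patented implementation, and all names such as OperationsRecord, RobotTarget and wait_for_start are invented for this sketch), the idea can be outlined in Python:

    # Illustrative sketch only; not the patent's implementation. All names here
    # (OperationsRecord, RobotTarget, wait_for_start) are invented for this example.
    import random
    import time
    from dataclasses import dataclass, field
    from typing import Callable, List, Optional

    @dataclass
    class OperationsRecord:
        """One timestamped entry of operations data (cf. claims 2-4, 8-9)."""
        t: float          # seconds from the start of the rehearsal
        command: str      # command sent to the target, e.g. "move_to(door)"
        action: str       # action carried out in reaction to the command
        outcome: str      # outcome of executing the command

    @dataclass
    class RobotTarget:
        """Toy robotic target that records a rehearsal and later replays it."""
        log: List[OperationsRecord] = field(default_factory=list)

        def record(self, t: float, command: str, action: str, outcome: str) -> None:
            # Claim 7 contemplates the recording living on the target itself.
            self.log.append(OperationsRecord(t, command, action, outcome))

        def replay(self, blocked: Optional[str] = None, jitter: float = 0.0) -> None:
            # Re-run the recorded operations, deviating intentionally where needed:
            # 'blocked' marks an action whose path is now obstructed (obstacle
            # avoidance, cf. claim 11); 'jitter' adds random timing deviation.
            for entry in self.log:
                if entry.action == blocked:
                    print(f"t={entry.t:5.1f}s  deviating around obstacle, skipping {entry.action!r}")
                    continue
                t_exec = max(0.0, entry.t + random.uniform(-jitter, jitter))
                print(f"t={t_exec:5.1f}s  executing {entry.action!r} (recorded outcome: {entry.outcome!r})")

    def wait_for_start(delay: Optional[float] = None,
                       personnel_detected: Callable[[], bool] = lambda: False) -> str:
        # Block until one of the claim-13 start conditions holds (sketch only).
        t0 = time.monotonic()
        while True:
            if delay is not None and time.monotonic() - t0 >= delay:
                return "pre-determined interval elapsed"
            if personnel_detected():
                return "personnel detected in training area"
            time.sleep(0.05)

    if __name__ == "__main__":
        # Rehearsal: an operator drives the target while its operations are logged.
        target = RobotTarget()
        target.record(0.0, "move_to(door)", "move_to(door)", "reached waypoint")
        target.record(3.5, "raise_weapon", "raise_weapon", "weapon raised")
        target.record(6.0, "play_sound(shout)", "play_sound(shout)", "sound played")

        # Live scenario: wait for a start trigger, then replay with deviation.
        print(wait_for_start(delay=0.1))
        target.replay(blocked="raise_weapon", jitter=0.5)

With more than one target (claim 12), each target would keep and replay its own log of operations data independently.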
EP13751094.7A 2012-02-23 2013-01-23 Systems and methods for arranging firearms training scenarios Active EP2802839B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2012900675A AU2012900675A0 (en) 2012-02-23 Systems and methods for arranging firearms training scenarios
PCT/AU2013/000051 WO2013123547A1 (en) 2012-02-23 2013-01-23 Systems and methods for arranging firearms training scenarios

Publications (3)

Publication Number Publication Date
EP2802839A1 EP2802839A1 (en) 2014-11-19
EP2802839A4 EP2802839A4 (en) 2015-07-15
EP2802839B1 true EP2802839B1 (en) 2017-07-12

Family

ID=49028672

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13751094.7A Active EP2802839B1 (en) 2012-02-23 2013-01-23 Systems and methods for arranging firearms training scenarios

Country Status (3)

Country Link
US (1) US20140356817A1 (en)
EP (1) EP2802839B1 (en)
WO (1) WO2013123547A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8791911B2 (en) 2011-02-09 2014-07-29 Robotzone, Llc Multichannel controller
US9390617B2 (en) 2011-06-10 2016-07-12 Robotzone, Llc Camera motion control system with variable autonomy
US9726463B2 (en) * 2014-07-16 2017-08-08 Robtozone, LLC Multichannel controller for target shooting range
EP3262371B1 (en) * 2015-02-23 2020-04-29 Marathon Robotics Pty Ltd A method of providing a live fire training environment and a moveable target for use with such a method
KR102042390B1 (en) 2016-09-27 2019-11-07 탁티칼트림 이.케이. target
RU2667132C1 (en) * 2017-03-06 2018-09-14 Открытое акционерное общество "Завод им. В.А. Дегтярева" Robotic modular self-contained range equipment
RU2670395C1 (en) * 2017-06-07 2018-10-22 Открытое акционерное общество "Завод им. В.А. Дегтярева" Control module for range facilities
US11143479B2 (en) * 2018-06-12 2021-10-12 Lei He Artificial and intelligent anti-terrorism device for stopping ongoing crime
RU2722515C1 (en) * 2019-07-17 2020-06-01 Акционерное общество "Концерн "Калашников" Target complex and method of control thereof
CN110360878A (en) * 2019-08-13 2019-10-22 苏州融萃特种机器人有限公司 A kind of man-machine coordination simulative training system and its method
US11958183B2 (en) 2019-09-19 2024-04-16 The Research Foundation For The State University Of New York Negotiation-based human-robot collaboration via augmented reality

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU6506098A (en) * 1997-03-11 1998-09-29 Octavia Design Limited Combat simulator
US6283756B1 (en) * 2000-01-20 2001-09-04 The B.F. Goodrich Company Maneuver training system using global positioning satellites, RF transceiver, and laser-based rangefinder and warning receiver
US6845297B2 (en) * 2000-05-01 2005-01-18 Irobot Corporation Method and system for remote control of mobile robot
US20070105070A1 (en) * 2005-11-08 2007-05-10 Luther Trawick Electromechanical robotic soldier
JP2008267710A (en) * 2007-04-20 2008-11-06 Kyosan Electric Mfg Co Ltd Moving target system
US8655257B2 (en) * 2009-08-24 2014-02-18 Daniel Spychaiski Radio controlled combat training device and method of using the same
AU2010300068C1 (en) * 2009-09-23 2021-01-14 Marathon Robotics Pty Ltd Methods and systems for use in training armed personnel
US20110111385A1 (en) * 2009-11-06 2011-05-12 Honeywell International Inc. Automated training system and method based on performance evaluation
US8712602B1 (en) * 2011-05-24 2014-04-29 Timothy S. Oliver Mobile target system
US20130341869A1 (en) * 2012-01-18 2013-12-26 Jonathan D. Lenoff Target Shot Placement Apparatus and Method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
EP2802839A1 (en) 2014-11-19
WO2013123547A1 (en) 2013-08-29
US20140356817A1 (en) 2014-12-04
EP2802839A4 (en) 2015-07-15

Similar Documents

Publication Publication Date Title
EP2802839B1 (en) Systems and methods for arranging firearms training scenarios
EP2480857B1 (en) Methods and systems for use in training armed personnel
US9132342B2 (en) Dynamic environment and location based augmented reality (AR) systems
US8770977B2 (en) Instructor-lead training environment and interfaces therewith
US20120274922A1 (en) Lidar methods and apparatus
US20030195022A1 (en) System and method for player tracking
US20190244536A1 (en) Intelligent tactical engagement trainer
KR101470805B1 (en) Simulation training system for curved trajectory firearms marksmanship in interior and control method thereof
CN113834373A (en) Real person deduction virtual reality indoor and outdoor attack and defense fight training system and method
AU2013201379B8 (en) Systems and methods for arranging firearms training scenarios
KR102117862B1 (en) The combat simulation trainig thereof method using AR VR MR
CN109453525B (en) Entertainment interaction system and method based on immersive robot
US11662178B1 (en) System and method of marksmanship training utilizing a drone and an optical system
KR102279384B1 (en) A multi-access multiple cooperation military education training system
CN116798288A (en) Sentry terminal simulator and military duty training assessment simulation equipment
CN105403100A (en) Laser simulated shooting counter-training system
Martin Army Research Institute Virtual Environment Research Testbed
CN105403099A (en) Actual combat shooting training system
CN105403098A (en) Laser simulation actual combat shooting training system
CA3198008A1 (en) Training apparatus including a weapon
CN105423809A (en) Laser simulated practical shooting training system
CN105403097A (en) Laser simulation shooting counter training system

Legal Events

Code  Title / Description
PUAI  Public reference made under article 153(3) EPC to a published international application that has entered the European phase (ORIGINAL CODE: 0009012)
17P   Request for examination filed; effective date: 20140731
AK    Designated contracting states; kind code of ref document: A1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
DAX   Request for extension of the European patent (deleted)
RA4   Supplementary search report drawn up and despatched (corrected); effective date: 20150617
RIC1  Information provided on IPC code assigned before grant; Ipc: F41J 5/24 20060101ALN20150611BHEP; F41J 11/00 20090101ALI20150611BHEP; F41J 9/02 20060101AFI20150611BHEP
GRAP  Despatch of communication of intention to grant a patent (ORIGINAL CODE: EPIDOSNIGR1)
STAA  Information on the status of an EP patent application or granted EP patent; STATUS: GRANT OF PATENT IS INTENDED
RIC1  Information provided on IPC code assigned before grant; Ipc: F41J 11/00 20090101ALI20170405BHEP; F41J 5/24 20060101ALN20170405BHEP; F41J 9/02 20060101AFI20170405BHEP
INTG  Intention to grant announced; effective date: 20170504
RAP1  Party data changed (applicant data changed or rights of an application transferred); owner name: MARATHON ROBOTICS PTY LTD
GRAS  Grant fee paid (ORIGINAL CODE: EPIDOSNIGR3)
GRAA  (Expected) grant (ORIGINAL CODE: 0009210)
STAA  Information on the status of an EP patent application or granted EP patent; STATUS: THE PATENT HAS BEEN GRANTED
AK    Designated contracting states; kind code of ref document: B1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG   Reference to a national code; GB: FG4D
REG   Reference to a national code; CH: EP
REG   Reference to a national code; AT: REF; ref document number: 908762; kind code: T; effective date: 20170715
REG   Reference to a national code; IE: FG4D
REG   Reference to a national code; DE: R096; ref document number: 602013023444
REG   Reference to a national code; NL: MP; effective date: 20170712
REG   Reference to a national code; LT: MG4D
REG   Reference to a national code; AT: MK05; ref document number: 908762; kind code: T; effective date: 20170712
REG   Reference to a national code; FR: PLFP; year of fee payment: 6
PG25  Lapsed in a contracting state [announced via postgrant information from national office to EPO]; lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: HR, AT, FI, LT, SE, NL (effective 20170712); NO (effective 20171012)
PG25  Lapsed in a contracting state, same ground: GR (20171013); IS (20171112); PL, RS, LV, ES (20170712); BG (20171012)
REG   Reference to a national code; DE: R097; ref document number: 602013023444
PG25  Lapsed in a contracting state, same ground: CZ, DK, RO (20170712)
PLBE  No opposition filed within time limit (ORIGINAL CODE: 0009261)
STAA  Information on the status of an EP patent application or granted EP patent; STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT
PG25  Lapsed in a contracting state, same ground: SM, SK, EE, IT (20170712)
26N   No opposition filed; effective date: 20180413
REG   Reference to a national code; DE: R119; ref document number: 602013023444
PG25  Lapsed in a contracting state, same ground: SI (20170712)
REG   Reference to a national code; CH: PL
PG25  Lapsed in a contracting state, lapse because of non-payment of due fees: LU (20180123); DE (20180801)
REG   Reference to a national code; IE: MM4A
REG   Reference to a national code; BE: MM; effective date: 20180131
PG25  Lapsed in a contracting state, non-payment of due fees: CH, LI, BE (20180131)
PG25  Lapsed in a contracting state, non-payment of due fees: IE (20180123)
PG25  Lapsed in a contracting state, failure to submit a translation or pay the fee: MC (20170712)
PG25  Lapsed in a contracting state, non-payment of due fees: MT (20180123)
PG25  Lapsed in a contracting state, failure to submit a translation or pay the fee: TR (20170712)
PG25  Lapsed in a contracting state, failure to submit a translation or pay the fee: PT (20170712); HU (20130123, invalid ab initio)
PG25  Lapsed in a contracting state: CY (translation/fee ground, 20170712); MK (non-payment of due fees, 20170712)
PG25  Lapsed in a contracting state, failure to submit a translation or pay the fee: AL (20170712)
PGFP  Annual fee paid to national office [announced via postgrant information from national office to EPO]; GB: payment date 20231213, year of fee payment 12
PGFP  Annual fee paid to national office; FR: payment date 20240117, year of fee payment 12